I have understood that correlation means how closely a certain data set fits a straight line. This would mean that a parabola, for example, would have a very low R^2. But it doesn't! If you choose different trendlines, trying to find the function that best fits the data, it seems R^2 indeed approaches 1 the better the points fit the chosen function, so it doesn't seem to measure linearity at all. Can you please explain this to me?
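Here is a minimal sketch of the kind of thing I mean (I'm assuming Python with numpy here; I expect the trendline feature in a spreadsheet does something equivalent). It fits both a straight line and a quadratic to parabolic data and computes R^2 = 1 - SS_res / SS_tot for each fit:

```python
import numpy as np

# Parabolic data: y = x^2 plus a little noise
rng = np.random.default_rng(0)
x = np.linspace(-5, 5, 50)
y = x**2 + rng.normal(0, 1, x.size)

def r_squared(y, y_fit):
    # R^2 = 1 - SS_res / SS_tot
    ss_res = np.sum((y - y_fit)**2)
    ss_tot = np.sum((y - y.mean())**2)
    return 1 - ss_res / ss_tot

# Linear trendline (degree 1): poor fit to a parabola
lin = np.polyval(np.polyfit(x, y, 1), x)
# Quadratic trendline (degree 2): matches the shape of the data
quad = np.polyval(np.polyfit(x, y, 2), x)

print("R^2 linear:   ", r_squared(y, lin))   # close to 0
print("R^2 quadratic:", r_squared(y, quad))  # close to 1
```

So the R^2 reported for the quadratic trendline is nearly 1 even though the data are not linear at all, which is exactly what confuses me.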