What Is R Squared And Negative R Squared

R Squared – A Way Of Evaluating Regression

Regression is a way of fitting a function to a set of data.  For instance, maybe you have been using satellites to count the number of cars in the parking lots of a bunch of Walmart stores for the past couple of years.  You also know the quarterly sales that Walmart reported during that time frame in their earnings reports.  You want to find a function that relates the two, so that you can use your satellites to count the number of cars and predict Walmart's quarterly sales (in order to get an advantage in the stock market).

In order to generate that function you would use regression analysis.  But after you have generated the car-to-sales function, how can you tell if it is any good?  After all, if you are using it to try to predict the stock market, you will be betting real money on it.  You need to know: is your model a good fit?  A bad fit?  Mediocre?  One commonly used metric for determining goodness of fit is R squared, or more formally, the coefficient of determination.

This post goes over R2, and by the end you will understand what it is and how to calculate it.  Unfortunately, you won't come away with a universal rule of thumb for what R2 value is good enough for your analysis, because that is entirely problem dependent.  Any R squared value greater than zero means that the regression analysis did better than just using a horizontal line through the mean value.  In the rare cases where you get a negative R squared value, you should probably rethink your regression analysis, especially if you are forcing an intercept.

 

What Is R Squared?

We will get into the equation for R2 in a little bit, but first, what is R squared?   It is a measure of how much better your regression line is than a simple horizontal line through the mean of the data.  In the plot below, the blue line is the data that we are trying to fit a regression to, and the horizontal red line is the average of that data.

[Figure: r squared is comparing against a horizontal line]

The red line is the value that gives the lowest summed squared error against the blue data points, assuming you have no information about the blue data points other than their y values.  This is shown in the plot below; for the blue data points, only the y values are available, and you don't know anything else about them.  You can download the Excel file I used to generate these plots and tables here.

[Figure: the best you can do is the mean value if you have no other information]

If you want to select a single value that gives you the lowest summed squared error, the value to select is the mean, shown as the red triangle.  A different way to think about that assertion is this: if I took all 7 of the y points (0, 1, 4, 9, 16, 25, 36), rearranged them into a random order, and made you guess the value of all 7 points before revealing any of them, what strategy would give you the minimum summed squared error?   That strategy is to guess the mean value for every point.
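If you want to sanity check that claim numerically, here is a tiny Python sketch (the names are my own) that scans constant guesses for those seven y values and confirms that the mean wins:

```python
# For a constant guess g, the summed squared error is sum((y - g)^2).
# Scanning integer guesses is enough here because the mean is exactly 13.
y = [0, 1, 4, 9, 16, 25, 36]

def sse(guess):
    return sum((yi - guess) ** 2 for yi in y)

best = min(range(40), key=sse)
print(best, sse(best))  # 13 1092
```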

With regression, the question is: now that you have more information (the x values in this case), can you make a better approximation than just guessing the mean value?  The R squared value answers the question of how much better you did.

That is actually a pretty intuitive way to understand it.  First calculate how much error you would have if you didn't even try to do regression, and instead just guessed the mean of all the values.  That is the total error.  It could be low if all the data is clustered together, or it could be high if the data is spread out.

SS stands for summed squared error, which is how the error is calculated.  To get the total summed squared error you:

  • Start with the mean value
  • For every data point, subtract that mean value from the data point value
  • Square that difference
  • Add up all of the squares

The result is the total summed squared error.

[Figure: how to calculate sum squared error]
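Those steps translate directly into code.  Here is a minimal Python sketch, using the same y values as the worked example later in this post:

```python
# The four steps for total summed squared error, written out literally
y = [0, 1, 4, 9, 16, 25, 36]

mean_y = sum(y) / len(y)            # start with the mean value (13.0 here)
diffs = [yi - mean_y for yi in y]   # subtract the mean from each data point
squares = [d ** 2 for d in diffs]   # square each difference
ss_total = sum(squares)             # add up all of the squares

print(ss_total)  # 1092.0
```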

 

Get The Regression Error

Next, calculate the error of your regression values against the true values.  This is your regression error.  Ideally the regression error is very low, near zero.

[Figure: sum squared error for a linear regression]

The ratio of the regression error to the total error tells you how much of the total error remains in your regression model.  Subtracting that ratio from 1.0 gives how much of the error you removed using the regression analysis.  That is R2.

In actual equation form it is

$$R^2 = 1 - \frac{SS_{regression}}{SS_{total}}$$

To get the total error, you subtract the mean value from each data point, square the results, and sum them all up

$$SS_{total} = \sum_i (y_i - \bar{y})^2$$

For the sum squared regression error, the equation is the same except you use the regression prediction instead of the mean value

$$SS_{regression} = \sum_i (y_i - \hat{y}_i)^2$$
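Combining those equations, a minimal Python sketch of the whole calculation (the function and variable names are my own) might look like this:

```python
def r_squared(y_true, y_pred):
    """Coefficient of determination: 1 - SS_regression / SS_total."""
    mean_y = sum(y_true) / len(y_true)
    ss_total = sum((yt - mean_y) ** 2 for yt in y_true)
    ss_regression = sum((yt - yp) ** 2 for yt, yp in zip(y_true, y_pred))
    return 1.0 - ss_regression / ss_total
```

Nothing in this formula stops ss_regression from being larger than ss_total, which is exactly how the negative values discussed below arise.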

What is a Good R Squared Value?

In most statistics books, you will see that an R squared value is always between 0 and 1, and that the best value is 1.0.   That is only partially true. The lower the error in your regression analysis relative to total error, the higher the R2 value will be.  The best R2 value is 1.0.  To get that value you have to have zero error in your regression analysis.

[Figure: r squared value of zero]

However R2 is not truly limited to a lower bound of zero.  You can get a negative r squared value.

What Does A Negative R Squared Value Mean?

For practical purposes, the lowest R2 you can get is zero, but only because the assumption is that if your regression line is not better than using the mean, you will just use the mean value.  However, if your regression line is worse than using the mean value, the R squared value that you calculate will be negative.

For instance, let's say that you wanted to make a prediction of the population of one of the states in the United States.   I am not giving you any information other than the populations of all 50 states, based on the 2010 census.  That is, I am not telling you the name of the state you are trying to make the prediction on; you just have to guess the populations (in millions) of all the states in a random order.   The best you could do here is guess the mean value every time.  Your total squared error would be 2298.2

If you used anything other than the mean, for instance the median, the summed squared error would necessarily come out higher.

Which when converted into R2 is

$$R^2 = 1 - \frac{SS_{median}}{SS_{total}} < 0$$

And you get a negative R2 number.
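The census numbers aren't reproduced here, but the effect is easy to demonstrate with any skewed data set.  The values below are made up for illustration (loosely modeled on state populations in millions):

```python
import statistics

# Made-up skewed data, loosely modeled on state populations in millions
populations = [0.6, 0.9, 1.1, 1.8, 2.7, 4.4, 6.5, 9.9, 19.4, 37.3]

mean_guess = statistics.mean(populations)
median_guess = statistics.median(populations)

ss_mean = sum((p - mean_guess) ** 2 for p in populations)      # total error
ss_median = sum((p - median_guess) ** 2 for p in populations)  # error of the median guess

print(1 - ss_median / ss_mean)  # negative: the mean minimizes squared error
```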

 

Another Way To Get A Negative R Squared Value

The most common way to end up with a negative R squared value is to force your regression line through a specific point, typically by setting the intercept.  When you do that, the ordinary least squares equations still find the line with the lowest possible summed squared error, but only among the lines that pass through the specified point.

If the point that is chosen is the mean value of x and y, the resulting line will have the lowest possible summed squared error overall, and the highest possible R squared value.  If you choose the mean x and y, you cannot get a negative R squared value.

However, if you specify a different point for the regression line to go through, you will still get the line that generates the lowest summed squared error through that point, but that doesn't mean the line is good.  For instance, in this chart from Excel

[Figure: how you can get a negative r squared value by specifying an intercept]

The intercept for both of the regression lines was set at zero.  For the regression line for the blue points, that is not too far off of the best possible regression line, so the resulting R2 value is positive.  However, for the regression line for the red points, the true intercept should be around 120, so setting the intercept to 0 forces the regression line far away from where it should be.  The result is that the regression summed squared error is greater than if you had used the mean value, and hence the R squared value comes out negative.
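A quick numpy sketch of that situation; the data here is made up, with a true intercept around 120 like the red points above:

```python
import numpy as np

# Made-up data with a true intercept of 120, similar to the red points
x = np.arange(11, dtype=float)
y = 120.0 - 2.0 * x

# Fit forced through the origin: solve y = b * x by least squares
b = np.linalg.lstsq(x.reshape(-1, 1), y, rcond=None)[0][0]

ss_total = np.sum((y - y.mean()) ** 2)
ss_forced = np.sum((y - b * x) ** 2)

print(1 - ss_forced / ss_total)  # far below zero: worse than guessing the mean
```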

The assertion that the R squared value has to be greater than or equal to zero is based on the assumption that if you get a negative R squared value, you will dump whatever regression calculation you are using and just go with the mean value.

The takeaways for R2 are

  • An R2 of 1.0 is the best. It means you have no error in your regression.
  • An R2 of 0 means your regression is no better than taking the mean value, i.e. you are not using any information from the other variables.
  • A negative R2 means you are doing worse than the mean value. However, maybe summed squared error isn't the metric that matters most to you, and this is OK.  (For instance, maybe you care most about mean absolute error instead.)

R Squared Example

As an example of how to calculate R squared, let’s look at this data

[Figure: example data for linear regression]

 

This data is just the numbers 0 through 6, with the y value being the square of those numbers.  The linear regression equation for this data is

y = 6x-5

and is plotted on the graph below

[Figure: linear regression line]
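If you want to check that fit outside of Excel, numpy's polyfit (just one of many ways to do this) returns the same coefficients:

```python
import numpy as np

x = np.arange(7)   # 0 through 6
y = x ** 2         # 0, 1, 4, 9, 16, 25, 36

slope, intercept = np.polyfit(x, y, 1)
print(slope, intercept)  # approximately 6.0 and -5.0
```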

Excel has calculated the R2 of this equation to be 0.9231.  How can we duplicate that manually?

Well the equation is

$$R^2 = 1 - \frac{SS_{regression}}{SS_{total}}$$

So we need to find the total summed squared error (based on the mean) and the summed squared error based on the regression line.

The mean value of the y values of the data (0, 1, 4, 9, 16, 25, 36) is 13

[Table: linear regression sample data]

To find the total summed squared error, we will subtract 13 from each of the y values, square that result, and add up all of the squares.  Graphically, this is shown below.  At every data point, the distance between the red line and the blue line is squared, and then all of those squares are summed up.

[Figure: how to calculate sum squared error]

The total summed squared error is 1092, with most of the error coming from the edges of the chart, where the mean is farthest away from the true value.

[Table: sum squared error]

Now we need to find the values that our regression line of y = 6x - 5 predicts, and get the summed squared error of those.  For the summed squared error, we will subtract each regression value from the true value, square the difference, and sum up all of the squares.

[Table: regression error]

So the summed squared error of the linear regression is 84, while the total summed squared error based on the mean value is 1092.

Plugging these numbers into the R2 equation we get

$$R^2 = 1 - \frac{84}{1092} = 0.9231$$

Which is the same value that Excel calculated.
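The same arithmetic in a few lines of Python, reusing the numbers above:

```python
y_true = [0, 1, 4, 9, 16, 25, 36]
y_pred = [6 * x - 5 for x in range(7)]  # the regression line y = 6x - 5

mean_y = sum(y_true) / len(y_true)  # 13.0
ss_total = sum((yt - mean_y) ** 2 for yt in y_true)                    # 1092.0
ss_regression = sum((yt - yp) ** 2 for yt, yp in zip(y_true, y_pred))  # 84.0

print(1 - ss_regression / ss_total)  # 0.9231 (rounded)
```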

A different way to think about the same result is that we have 84/1092 = 7.69% of the total error remaining.  Basically, if someone had just given us the y values, and told us that they were going to present those y values in a random order and we had to guess what each one was, the best guess we could have made every time was the mean.   But if they now give us the x value, and tell us to try to guess the y value, we can use the linear regression line and remove 92.31% of the error from our guess.

 

But Wait, Can’t We Do Better?

We just showed a linear regression line that produced an R squared value of 0.9231 and said that it was the best linear fit we could make based on the summed squared error metric.  But couldn't we do better with a different regression fit?

Well, the answer is yes, of course we could.  We used a linear regression, i.e. a straight line, on this data.  However, the data itself wasn't linear; y is the square of x with this data.  So if we use a second order regression, and in fact just use the equation y = x^2, we get a much better fit, which is shown below.

[Figure: x squared regression line]

Here we have an R2 of 1.0, because the regression line exactly matches the data, and there is no remaining error.   However the fact that we were able to do this is somewhat beside the point for this R2 explanation.  We were able to find an exact match for this data only because it is a toy data set.  The Y values were built as the square of the X values, so it is no surprise that making a regression that utilized that fact gave a good match.
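For completeness, here is the same check with a second order fit, again sketched with np.polyfit:

```python
import numpy as np

x = np.arange(7)
y = x ** 2

coeffs = np.polyfit(x, y, 2)   # second order fit; approximately [1, 0, 0] here
y_pred = np.polyval(coeffs, x)

ss_total = np.sum((y - y.mean()) ** 2)
ss_regression = np.sum((y - y_pred) ** 2)
print(1 - ss_regression / ss_total)  # 1.0 (to within floating point noise)
```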

For most data sets, an exact match cannot be generated, because the data will be noisy and will not follow a simple equation.  For instance, an economist might be doing a study to determine what attributes of a person correlate with their adult profession and income.  Some of those attributes could be height, childhood interests, parents' incomes, school grades, SAT scores, etc.   It is unlikely that any of those will have a perfect R2 value; in fact, the R squared values of some of them might be quite low.  But there are times when even a low R2 value could be of interest.

Any R2 value above 0.0 indicates that there could be some correlation between the variable and the result, although very low values are likely just random noise.

 

 

An Odd Special Case For R Squared

Just for fun, what do you think the R2 of this linear regression line for this data is?

[Figure: r squared for a horizontal line]

Here we have a pure horizontal line; all the data values are 5.0.

As it turns out, the R squared value of a linear regression on this data is undefined.   Excel will display the value as N/A.

[Figure: r squared undefined]

What has happened in this example is that the total summed squared error is equal to zero.   All the data values exactly equal the mean value.  So there is zero error if you just estimate the mean value.

Of course, there is also zero error for the regression line.  You end up with a zero divided by zero term in the R squared equation, which is undefined.

$$R^2 = 1 - \frac{SS_{regression}}{SS_{total}} = 1 - \frac{0}{0} \quad \text{(undefined)}$$
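The same thing happens in code.  Using the summed squared error formulas from earlier (this sketch is mine, not from the post):

```python
y = [5.0] * 7             # every point equals the mean
mean_y = sum(y) / len(y)  # 5.0

ss_total = sum((yi - mean_y) ** 2 for yi in y)  # 0.0
ss_regression = 0.0       # a flat fit through 5.0 also has zero error

print(1 - ss_regression / ss_total)  # raises ZeroDivisionError: 0/0 is undefined
```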

2 thoughts on “What Is R Squared And Negative R Squared”

  1. ali

    Hi,

Thanks for this very simple and informative post! I am trying to model stock market time series data via LSTM. I have observed that my RMSEs on both train and test sets are almost identical, in addition to a positive correlation between the predictions and the original values in the test set. However, I got a negative r squared value. I was wondering if I could still rely on my predictions? And what would be the logical explanation of a negative r squared, if any, for time series tasks?

    Thanks in advance!

    1. Fairly Nerdy Post author

      Ali,

      I’ll start by saying that I am personally an indexer with respect to the market, and just buy a mix of total market, international total market, and bonds according to my asset allocation plan. So I can’t really comment on which strategies, if any, might be successful in making market predictions.

The negative R squared value means that your prediction tends to be less accurate than the average value of the data set over time. It doesn't shock me that this could happen, but it does surprise me given how you describe your analysis setup. If that is occurring, it shouldn't be too difficult to detect just by plotting the true data, your predicted data, and a simple average. If your predicted results are actually that bad, they should be visually well off from the true data.

Frequently the cause of a negative R squared for linear regression is that the line is forced to go through the origin. In the case of the stock market, if for instance you wanted a linear regression of the S&P 500 from 2010 to 2018, where it went from ~1100 to ~2800, a linear regression of that would not be terrible (in hindsight). However, if you constrained your regression model to pass through zero (which is an option in many software tools), it would get a lot worse.

