Coefficient of Determination Calculator (R-squared)

Measure how well your regression model explains the variation in your data.


📖 What is R² (Coefficient of Determination)?

The coefficient of determination, denoted R², is the proportion of variance in the dependent variable (Y) that is predictable from the independent variable(s) (X). It is one of the most widely used statistics in regression analysis, providing an intuitive measure of how well the model fits the data.

R² ranges from 0 to 1 for ordinary linear regression with an intercept. An R² of 0 means the model explains none of the variability - you would do just as well by predicting ȳ (the mean) for every observation. An R² of 1 means the model explains all the variability - every data point lies exactly on the regression line.

R² is calculated from the sum-of-squares decomposition. The total sum of squares (SS_tot) splits into an explained part (SS_reg) and an unexplained part (SS_res): SS_tot = SS_reg + SS_res, so R² = SS_reg/SS_tot = 1 − SS_res/SS_tot.

For simple linear regression (one predictor), R² equals the square of the Pearson correlation coefficient r: R² = r². For multiple regression, R² is the squared multiple correlation coefficient and cannot be computed from a single pairwise r.
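Both routes to R² can be checked numerically. A minimal sketch in Python (the function name `r_squared` is ours, not part of the calculator) that fits a simple least-squares line, computes R² from the sum-of-squares decomposition, and confirms it equals r²:

```python
import math

def r_squared(x, y):
    """R² for simple linear regression, computed two equivalent ways."""
    n = len(x)
    x_bar, y_bar = sum(x) / n, sum(y) / n
    # Least-squares slope and intercept for ŷ = a + b·x
    sxx = sum((xi - x_bar) ** 2 for xi in x)
    sxy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
    b = sxy / sxx
    a = y_bar - b * x_bar
    y_hat = [a + b * xi for xi in x]
    ss_tot = sum((yi - y_bar) ** 2 for yi in y)
    ss_res = sum((yi - yh) ** 2 for yi, yh in zip(y, y_hat))
    r2 = 1 - ss_res / ss_tot
    # With one predictor, R² equals the squared Pearson correlation r²
    r = sxy / math.sqrt(sxx * ss_tot)
    assert abs(r2 - r ** 2) < 1e-12
    return r2
```

The assertion holds only for the one-predictor case; with multiple predictors, R² must come from the sum-of-squares decomposition itself.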

📐 Formulas

R² = 1 − SS_res / SS_tot = SS_reg / SS_tot

SS_tot = Σ(yᵢ − ȳ)² (total sum of squares)

SS_res = Σ(yᵢ − ŷᵢ)² (residual sum of squares)

SS_reg = Σ(ŷᵢ − ȳ)² = SS_tot − SS_res (regression sum of squares)

From r: R² = r² (simple linear regression only)

Adjusted R²: R²_adj = 1 − [(1−R²)(n−1) / (n−k−1)]

where n = sample size, k = number of predictors (independent variables)

F-statistic from R²: F = [R²/k] / [(1−R²)/(n−k−1)] - tests overall model significance
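The adjusted R² and F-statistic formulas above translate directly into code. A short sketch (function names are ours, chosen for illustration):

```python
def adjusted_r2(r2, n, k):
    # Penalise R² for the number of predictors k; n = sample size
    return 1 - (1 - r2) * (n - 1) / (n - k - 1)

def f_statistic(r2, n, k):
    # Overall model F-test with k and (n - k - 1) degrees of freedom
    return (r2 / k) / ((1 - r2) / (n - k - 1))
```

For example, `adjusted_r2(0.82, 30, 5)` returns 0.7825, matching Example 3 below.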

📖 How to Use This Calculator

1. Choose the input mode. From Data: enter your X/Y pairs and the calculator does the regression. From SS: enter precomputed SS values. From r: enter the correlation coefficient.
2. Enter the number of predictors k (default = 1 for simple regression). This is needed to compute adjusted R².
3. Click Calculate R². Results include R², adjusted R², SS values, and an interpretation.

📝 Example Calculations

Example 1 - From Data

X: 1,2,3,4,5. Y: 2,4,5,4,5. ȳ = 4. SS_tot = 6. The least-squares line is ŷ = 2.2 + 0.6x, giving SS_res = 2.4. R² = 1 − 2.4/6 = 0.6

The linear model explains 60% of Y's variance.

R² = 0.6
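The arithmetic for this example can be verified in a few lines of Python:

```python
# Example data: fit ŷ = a + b·x by least squares, then compute R²
x = [1, 2, 3, 4, 5]
y = [2, 4, 5, 4, 5]
n = len(x)
x_bar, y_bar = sum(x) / n, sum(y) / n                  # 3, 4
b = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) \
    / sum((xi - x_bar) ** 2 for xi in x)               # slope 0.6
a = y_bar - b * x_bar                                  # intercept 2.2
ss_tot = sum((yi - y_bar) ** 2 for yi in y)            # 6.0
ss_res = sum((yi - (a + b * xi)) ** 2
             for xi, yi in zip(x, y))                  # 2.4
r2 = 1 - ss_res / ss_tot                               # 0.6
```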

Example 2 - From Correlation

r = 0.95. R² = 0.95² = 0.9025. The predictor explains 90.25% of the variance in Y.

R² = 0.9025

Example 3 - Adjusted R² (Multiple Regression)

R² = 0.82, n = 30, k = 5 predictors. Adj R² = 1 − (1−0.82)×29/24 = 1 − (0.18 × 29)/24 = 1 − 0.2175 = 0.7825

Adjusted R² is lower than R² because it penalises the model for using 5 predictors.

Adjusted R² = 0.7825
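Plugging the numbers into the adjusted R² formula directly:

```python
# Example 3: penalise R² = 0.82 for k = 5 predictors with n = 30 samples
r2, n, k = 0.82, 30, 5
adj = 1 - (1 - r2) * (n - 1) / (n - k - 1)   # 1 - 0.18*29/24 = 0.7825
```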

Example 4 - Poor vs Good Fit Comparison

Model A: R² = 0.95. Model B (adds 3 noisy predictors): R² = 0.96, Adj R² = 0.93. Adjusted R² reveals that the extra predictors added little value.

Model A R² = 0.95 vs Model B Adj R² = 0.93

❓ Frequently Asked Questions

What is R-squared (R²)?+
R² (the coefficient of determination) measures the proportion of total variance in the dependent variable Y that is explained by the regression model. R² = 0 means the model explains nothing - the regression is no better than just predicting the mean ȳ for every observation. R² = 1 means the model explains all variation - every data point falls exactly on the regression line.
How do you interpret R²?+
R² = 0.85 means 85% of the variance in Y is explained by the predictor(s). The remaining 15% is unexplained variability (residual). Benchmarks vary by field: R² > 0.90 is excellent in many engineering contexts; R² > 0.70 is acceptable in social sciences where human behaviour is inherently variable.
What is the difference between R² and adjusted R²?+
R² always increases (or stays the same) when you add more predictors, even if they are useless. Adjusted R² penalises for adding predictors that don't improve the model: Adj R² = 1 − [(1−R²)(n−1)/(n−k−1)] where k is the number of predictors. Use adjusted R² to compare models with different numbers of variables.
Can R² be negative?+
R² as defined (1 − SS_res/SS_tot) can be negative for non-linear models when the model performs worse than simply predicting the mean for every observation. For ordinary linear regression with an intercept, R² is always between 0 and 1.
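A contrived illustration of this: if a model's predictions are worse than simply predicting the mean, 1 − SS_res/SS_tot goes negative (the data and predictions here are made up for demonstration):

```python
# A model that predicts worse than ȳ yields negative R²
y     = [1.0, 2.0, 3.0, 4.0]
y_hat = [4.0, 3.0, 2.0, 1.0]   # deliberately bad predictions
y_bar = sum(y) / len(y)
ss_tot = sum((yi - y_bar) ** 2 for yi in y)               # 5.0
ss_res = sum((yi - yh) ** 2 for yi, yh in zip(y, y_hat))  # 20.0
r2 = 1 - ss_res / ss_tot                                  # -3.0
```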
What is SS_tot, SS_res, and SS_reg?+
SS_tot (total sum of squares) = Σ(yᵢ − ȳ)² measures total variance. SS_res (residual sum of squares) = Σ(yᵢ − ŷᵢ)² measures unexplained variance. SS_reg (regression sum of squares) = Σ(ŷᵢ − ȳ)² measures explained variance. They are related: SS_tot = SS_reg + SS_res. R² = SS_reg/SS_tot.