Quadratic Regression Calculator

Fit a parabolic curve to your data using least squares and get the quadratic equation, R², and predictions instantly.


📖 What is Quadratic Regression?

Quadratic regression is a form of polynomial regression that fits a second-degree polynomial - y = ax² + bx + c, a parabola - to a set of observed data points using the method of least squares. While linear regression assumes a straight-line relationship between X and Y, quadratic regression accommodates situations where the relationship is curved, rising to a peak and then falling (or falling then rising).

The three coefficients a, b, and c are determined simultaneously by minimising the total sum of squared residuals: SS_res = Σ(yᵢ − axᵢ² − bxᵢ − c)². Setting the partial derivatives with respect to each coefficient to zero produces a 3×3 system of normal equations, solved here using Gaussian elimination with partial pivoting for numerical stability.

Quadratic regression appears throughout science and engineering. In physics, a thrown ball follows a parabolic trajectory - height versus time is exactly quadratic. In pharmacology, dose-response curves often rise steeply then plateau or decline, fitting a quadratic. In economics, a firm's profit function is classically quadratic in price or output, reflecting diminishing marginal returns. In sports analytics, a player's performance metric may peak at a certain age and decline on either side.

This calculator handles the full 3×3 least-squares solution, reports R², and includes a prediction field so you can evaluate the fitted parabola at any X value.

📐 Formulas

ŷ = ax² + bx + c

The coefficients a, b, c satisfy the normal equations (derived by setting ∂SS/∂a = ∂SS/∂b = ∂SS/∂c = 0):

Normal equations (3×3 system):

c·n + b·Σx + a·Σx² = Σy

c·Σx + b·Σx² + a·Σx³ = Σxy

c·Σx² + b·Σx³ + a·Σx⁴ = Σx²y

Solved simultaneously for a, b, c using Gaussian elimination with partial pivoting.
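The solve step can be reproduced in a few lines. The sketch below (illustrative Python, not this calculator's actual source) builds the three normal equations above and applies Gaussian elimination with partial pivoting:

```python
# Fit y = ax^2 + bx + c by least squares: build the 3x3 normal equations
# and solve them with Gaussian elimination with partial pivoting.

def quad_fit(xs, ys):
    n = len(xs)
    Sx  = sum(xs)
    Sx2 = sum(x**2 for x in xs)
    Sx3 = sum(x**3 for x in xs)
    Sx4 = sum(x**4 for x in xs)
    Sy   = sum(ys)
    Sxy  = sum(x * y for x, y in zip(xs, ys))
    Sx2y = sum(x * x * y for x, y in zip(xs, ys))

    # Augmented matrix; unknown order per column is (c, b, a).
    M = [[n,   Sx,  Sx2, Sy],
         [Sx,  Sx2, Sx3, Sxy],
         [Sx2, Sx3, Sx4, Sx2y]]

    # Forward elimination with partial pivoting (swap in the row whose
    # entry in the pivot column has the largest magnitude).
    for col in range(3):
        pivot = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for k in range(col, 4):
                M[r][k] -= f * M[col][k]

    # Back-substitution.
    sol = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        sol[r] = (M[r][3] - sum(M[r][k] * sol[k] for k in range(r + 1, 3))) / M[r][r]
    c, b, a = sol
    return a, b, c
```

For the projectile data in Example 1 below, quad_fit returns a = −5, b = 20, c = 0 up to floating-point rounding. (A production solver would also guard against a singular system, e.g. all X values identical.)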

R² (goodness of fit): R² = 1 − SS_res / SS_tot

where SS_res = Σ(yᵢ − ŷᵢ)² and SS_tot = Σ(yᵢ − ȳ)²

Vertex (turning point): x* = −b / (2a), y* = ax*² + bx* + c

Prediction: ŷ = ax² + bx + c evaluated at any chosen X.
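Given fitted coefficients a, b, c, the remaining outputs are direct evaluations of the formulas above. A short Python sketch (the helper names are illustrative, not the calculator's internals):

```python
# Evaluate the fitted parabola, its goodness of fit, and its turning point.

def predict(a, b, c, x):
    return a * x * x + b * x + c

def r_squared(a, b, c, xs, ys):
    # R^2 = 1 - SS_res / SS_tot
    y_bar = sum(ys) / len(ys)
    ss_res = sum((y - predict(a, b, c, x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - y_bar) ** 2 for y in ys)
    return 1 - ss_res / ss_tot

def vertex(a, b, c):
    x_star = -b / (2 * a)   # undefined if a == 0 (degenerate, non-curved fit)
    return x_star, predict(a, b, c, x_star)
```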

📖 How to Use This Calculator

1
Enter the X values (independent variable) as a comma-separated list. Aim for at least 5–6 points so R² is meaningful.
2
Enter the Y values (dependent variable) in the same order. Both lists must have identical lengths.
3
Click Calculate Quadratic Regression. Coefficients a, b, c, R², and the fitted equation appear instantly.
4
Use the Prediction field to evaluate ŷ = ax² + bx + c at any X - for example, to forecast a value outside the observed range.
5
Check R²: values above 0.90 indicate the parabola fits well. If R² is low, try a different model (exponential, linear, or higher-degree polynomial).
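The five steps above can be scripted end-to-end. This sketch uses NumPy's polyfit in place of the calculator's own solver (an assumption, chosen for brevity), with the projectile data from Example 1:

```python
# End-to-end sketch of the five steps above.
import numpy as np

x_input = "0, 1, 2, 3, 4"       # step 1: X values, comma-separated
y_input = "0, 15, 20, 15, 0"    # step 2: Y values, in the same order

xs = np.array([float(v) for v in x_input.split(",")])
ys = np.array([float(v) for v in y_input.split(",")])
assert xs.size == ys.size, "X and Y lists must have identical lengths"

coeffs = np.polyfit(xs, ys, 2)        # step 3: a, b, c (highest power first)
y_at_5 = np.polyval(coeffs, 5.0)      # step 4: predict at any chosen X

res = ys - np.polyval(coeffs, xs)     # step 5: check the fit via R^2
r2 = 1 - (res @ res) / ((ys - ys.mean()) @ (ys - ys.mean()))
```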

📝 Example Calculations

Example 1 - Projectile Height vs Time

A ball is thrown upward. Time (s): 0, 1, 2, 3, 4. Height (m): 0, 15, 20, 15, 0.

Fitting y = at² + bt + c: the normal equations yield a ≈ −5.00, b ≈ 20.00, c ≈ 0.00.

Equation: ŷ = −5.00t² + 20.00t. R² = 1.000 - perfect fit (as expected from physics).

Vertex at t = −20/(2×−5) = 2.00 s, maximum height = 20 m. Matches the observed peak.

Result = ŷ = −5.00x² + 20.00x, R² = 1.000

Example 2 - Dose-Response Curve

Drug dose (mg): 5, 10, 20, 40, 80, 160. Response (%): 10, 35, 70, 90, 75, 40.

Fitting gives a ≈ −0.0101, b ≈ 1.71, c ≈ 20.7.

Equation: ŷ = −0.0101x² + 1.71x + 20.7. R² ≈ 0.70 - a moderate fit: because the doses are log-spaced, a single parabola cannot track both the steep rise at low doses and the decline at high doses.

Optimal dose (vertex): x* = −1.71 / (2 × −0.0101) ≈ 85 mg - the dose yielding peak predicted response.

Result = ŷ = −0.0101x² + 1.71x + 20.7, R² ≈ 0.70

Example 3 - Price vs Demand Profit

Price (£): 10, 20, 30, 40, 50. Profit (£000): 5, 18, 25, 24, 15.

Fit: a ≈ −0.0371, b ≈ 2.489, c ≈ −16.40. R² ≈ 0.998.

Equation: ŷ = −0.0371x² + 2.489x − 16.40. Vertex at x* = −2.489/(2×−0.0371) ≈ £33.50 - the profit-maximising price.

Maximum expected profit: ŷ(33.5) = −0.0371(1122.25) + 2.489(33.5) − 16.40 ≈ −41.64 + 83.38 − 16.40 ≈ 25.3 (£000), about £25,300.

Result = ŷ = −0.0371x² + 2.489x − 16.40, R² ≈ 0.998

Example 4 - Athlete Performance by Age

Age (years): 18, 22, 26, 30, 34, 38. Sprint time (s): 10.8, 10.2, 10.0, 10.3, 10.9, 11.6.

Fit: a ≈ 0.0108, b ≈ −0.560, c ≈ 17.34. R² ≈ 0.984.

Vertex (peak performance - minimum time) at age x* = 0.560/(2×0.0108) ≈ 25.9 years.

At age 25.9: ŷ = 0.0108(670.8) − 0.560(25.9) + 17.34 ≈ 7.24 − 14.50 + 17.34 ≈ 10.08 s.

Result = Peak at age ≈ 25.9 years, ŷ ≈ 10.08 s, R² ≈ 0.984

Example 5 - Revenue vs Units Sold

Units (000s): 1, 2, 3, 4, 5, 6, 7. Revenue (£000s): 8, 20, 36, 46, 50, 48, 38.

Fit: a ≈ −2.524, b ≈ 25.905, c ≈ −18.00. R² ≈ 0.980.

Vertex at x* = 25.905/(2×2.524) ≈ 5.13 (000 units) - the revenue-maximising output. Max predicted revenue: ŷ(5.13) ≈ £48,500.

Prediction at 8 (000 units): ŷ = −2.524(64) + 25.905(8) − 18.00 = −161.54 + 207.24 − 18.00 ≈ 27.7 (£000s), about £27,700 - well below the peak, showing predicted revenue falls away sharply at excessive volumes.

Result = ŷ = −2.524x² + 25.905x − 18.00, R² ≈ 0.980

❓ Frequently Asked Questions

What is quadratic regression?
Quadratic regression fits a parabola y = ax² + bx + c to a set of data points using the method of least squares. Unlike linear regression which fits a straight line, quadratic regression captures curved relationships where Y first increases then decreases (or vice versa). The three coefficients a, b, c are found by solving a 3×3 system of normal equations derived from minimising the sum of squared residuals.
When should I use quadratic instead of linear regression?
Use quadratic regression when your data shows a clear curved pattern - for example, projectile motion (parabolic arc), dose-response curves with a plateau, profit functions with diminishing returns, or any data that rises then falls (or falls then rises). If a scatter plot of your data shows a U-shape or inverted-U-shape, quadratic regression is appropriate. A residual plot from linear regression that shows a systematic curve also signals the need for a quadratic term.
How is quadratic regression calculated?
Quadratic regression solves the system of normal equations obtained by setting the partial derivatives of SS_res = Σ(yᵢ − axᵢ² − bxᵢ − c)² to zero with respect to a, b, and c. This yields a 3×3 linear system involving the sums Σ1, Σx, Σx², Σx³, Σx⁴, Σy, Σxy, Σx²y. Solving this system - using Gaussian elimination - gives the least-squares coefficients.
What does R-squared mean in quadratic regression?
R² measures the proportion of variance in Y explained by the quadratic model: R² = 1 − SS_res/SS_tot, where SS_res = Σ(yᵢ−ŷᵢ)² and SS_tot = Σ(yᵢ−ȳ)². R² = 0.95 means the parabola accounts for 95% of the variation in Y. Note that adding more terms always increases R², so compare models using adjusted R² or an F-test when deciding whether the quadratic term genuinely improves the fit.
What does the coefficient 'a' tell me?
The coefficient a determines the curvature and direction of the parabola. If a > 0, the parabola opens upward with a minimum at x = −b/(2a). If a < 0, it opens downward with a maximum at x = −b/(2a). The magnitude of a controls how 'wide' or 'narrow' the parabola is - a large |a| means a sharper curve.
How do I find the vertex of the fitted parabola?
The vertex (turning point) is at x* = −b/(2a) and y* = a(x*)² + b(x*) + c. This is the minimum of the parabola if a > 0 or the maximum if a < 0. For example, if the parabola models profit as a function of price, the vertex x* gives the optimal price that maximises profit.
What is the minimum number of points needed for quadratic regression?
You need at least 3 data points to fit a quadratic (3 coefficients). However, with exactly 3 points the parabola passes through all three exactly and R² = 1 by definition, which is not a meaningful goodness-of-fit. Use at least 5–6 points for a reliable R² and to detect whether the quadratic model is genuinely appropriate.
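To illustrate this point (assuming NumPy is available): fit a parabola through exactly three points and the residuals vanish, so R² = 1 regardless of the data.

```python
# With exactly 3 distinct x-values, the least-squares parabola simply
# interpolates the points: every residual is ~0 and R^2 = 1 by construction.
import numpy as np

xs = np.array([1.0, 2.0, 4.0])
ys = np.array([3.0, 7.0, 1.0])   # arbitrary illustrative points
coeffs = np.polyfit(xs, ys, 2)
fitted = np.polyval(coeffs, xs)
# fitted matches ys to floating-point precision, so SS_res ≈ 0.
```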
How is quadratic regression different from quadratic interpolation?
Quadratic interpolation (e.g. Lagrange interpolation through 3 points) passes the curve exactly through 3 specific points. Quadratic regression with n > 3 points finds the best-fit parabola that minimises the sum of squared errors across all points - it will not generally pass through any of them exactly. Regression is preferred for noisy real-world data; interpolation is for when you trust each data point exactly.