Percent Error Calculator
Find how far off an experimental measurement is from the true value — as a percentage.
What is Percent Error?
Percent error (also called percentage error) measures how far an experimental or measured value deviates from the true, accepted, or theoretical value — expressed as a percentage of the true value. The formula is: Percent Error = |Measured − Actual| / |Actual| × 100. It is one of the most common accuracy metrics in science, engineering, and quality control.
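The formula above translates directly into a few lines of code. This is a minimal sketch (the function name and the guard against a zero actual value are my own choices, not part of the source):

```python
def percent_error(measured, actual):
    """Unsigned percent error: |measured - actual| / |actual| * 100."""
    if actual == 0:
        # The formula divides by the actual value, so it is undefined at 0.
        raise ValueError("Percent error is undefined when the actual value is 0.")
    return abs(measured - actual) / abs(actual) * 100

# Example: measured g = 9.65 m/s^2 against the accepted 9.81 m/s^2
print(round(percent_error(9.65, 9.81), 2))  # -> 1.63
```

Note the zero check: if the accepted value is 0, percent error is mathematically undefined, which is one reason other metrics are preferred near zero.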
The absolute value in the numerator ensures percent error is always non-negative — it measures the magnitude of the error, not its direction. If you also need to know whether you overestimated or underestimated, compute the signed percent error without the absolute value: (Measured − Actual) / |Actual| × 100. A positive signed error means the measured value is too high (overestimate); negative means too low (underestimate).
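The signed variant is the same computation without the absolute value in the numerator; a sketch (names are illustrative):

```python
def signed_percent_error(measured, actual):
    """Signed percent error: positive = overestimate, negative = underestimate."""
    if actual == 0:
        raise ValueError("Signed percent error is undefined when the actual value is 0.")
    return (measured - actual) / abs(actual) * 100

print(signed_percent_error(105, 100))  # -> 5.0  (overestimate)
print(signed_percent_error(95, 100))   # -> -5.0 (underestimate)
```

Taking the absolute value of the signed result recovers the unsigned percent error from the main formula.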
Percent error is different from absolute error (|Measured − Actual|, in the original units) and from percentage difference (which divides by the average of both values, for comparing two equal-standing measurements). Percent error always uses the accepted/actual value as the denominator because you are measuring accuracy relative to the truth.
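The distinction between the two denominators matters in practice. A short sketch comparing them on the same pair of numbers (function names are my own):

```python
def percent_error(measured, actual):
    """Divides by the actual value: accuracy relative to the truth."""
    return abs(measured - actual) / abs(actual) * 100

def percentage_difference(a, b):
    """Divides by the average of the two values: neither is 'the truth'."""
    return abs(a - b) / ((a + b) / 2) * 100

# Same pair of numbers, different answers:
print(round(percent_error(110, 100), 2))         # -> 10.0
print(round(percentage_difference(110, 100), 2)) # -> 9.52
```

Because percentage difference divides by the average (here 105 rather than 100), it gives a smaller number whenever the measured value exceeds the actual, and it is symmetric: swapping the two arguments does not change the result, whereas percent error changes when you swap measured and actual.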
Common applications include: chemistry lab reports (how close is your experimental yield to theoretical?), physics measurements (how close is your measured g to 9.81 m/s²?), calibration of instruments, quality control (how close is the manufactured dimension to specification?), and machine-learning model validation (though that usually uses RMSE or MAE rather than a single percent error).