In everyday language, we use the terms accuracy and precision interchangeably. In mathematical terminology, however, the accuracy of a measure (an approximate number) is based on the ratio of the maximum possible error to the size of the number itself. This ratio is called the relative error. We express the accuracy as a percent by subtracting the relative error from 1 and writing the resulting decimal as a percent. The smaller the relative error, the more accurate the measure.
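To make these definitions concrete, here is a minimal Python sketch; the function names `relative_error` and `accuracy_percent` are my own, not from the text:

```python
def relative_error(max_error: float, measure: float) -> float:
    """Ratio of the maximum possible error to the size of the measure."""
    return max_error / measure

def accuracy_percent(max_error: float, measure: float) -> float:
    """Accuracy as a percent: subtract the relative error from 1, then scale to 100."""
    return (1 - relative_error(max_error, measure)) * 100
```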
Here's how it works. Suppose we measure two different items to the nearest centimeter: a desk and a notepad. The desk is 120 cm long, and the notepad is 12 cm long. The maximum possible error in each case is 0.5 cm. Both measures are at the same level of precision, but the relative error differs:

$$\text{relative error} = \frac{\text{maximum possible error}}{\text{size of the measure}}$$

So the relative errors for the desk and the notepad are

$$\frac{0.5}{120} \approx 0.0042 \qquad \text{and} \qquad \frac{0.5}{12} \approx 0.042,$$

respectively.
Therefore, although the measurements of 120 cm and 12 cm are equally precise, they are not equally accurate: the measurement of 120 cm is more accurate because it has the smaller relative error. Subtracting each relative error from 1 gives the accuracy. For the desk, the accuracy is 1 - 0.0042 = 0.9958, or 99.58%; for the notepad, it is 1 - 0.042 = 0.958, or 95.8%. Clearly, the accuracy of the two measurements differs significantly.
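A short, self-contained Python check of this arithmetic (the 0.5 cm maximum possible error comes from measuring to the nearest centimeter):

```python
max_error = 0.5  # cm: both items are measured to the nearest centimeter

for name, length_cm in [("desk", 120), ("notepad", 12)]:
    rel = max_error / length_cm   # relative error
    acc = (1 - rel) * 100         # accuracy as a percent
    print(f"{name}: relative error {rel:.4f}, accuracy {acc:.2f}%")

# Output (matching the rounded figures above):
# desk: relative error 0.0042, accuracy 99.58%
# notepad: relative error 0.0417, accuracy 95.83%
```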