We have learned that physical measurement involves error and that every physical measurement is an approximation. This leads us to a new question: How much error is involved in any given measurement? The terms precision and accuracy relate to how good an approximation is. For example, how precise were our measurements of the sides of the right triangles, and how accurate were our measurements of the distance from Mars to the Sun?
Since measurements are approximate, the most meaningful way of interpreting a measurement is as an interval with a lower bound and an upper bound. Imagine that we have measured a line segment, using a ruler divided into centimeters, and found the length to be 5 cm. To be more precise, we can state the measure as an interval -- either in words, 5 cm to the nearest centimeter, or using notation, such as 5 cm ± 0.5 cm (read "5 cm plus or minus 0.5 cm"). Either presentation gives the center of the interval and the distance of the upper and lower bounds from this center (5 ± 0.5 implies a lower bound of 4.5 and an upper bound of 5.5). We can also state that the maximum possible error for this measure is 0.5 cm (which is half the size of the measurement unit). Note 12
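The interval interpretation above can be sketched in a short computation. This is an illustrative helper (the function name `measurement_interval` is our own, not from the text), assuming the maximum possible error is half the smallest measuring unit:

```python
def measurement_interval(reading, unit):
    """Interpret a reading taken with a given smallest measuring unit
    as an interval: the maximum possible error is half the unit, so the
    true length lies between reading - unit/2 and reading + unit/2."""
    max_error = unit / 2
    return reading - max_error, reading + max_error

# A length read as 5 cm on a ruler divided into centimeters:
low, high = measurement_interval(5.0, 1.0)
print(f"5 cm \u00b1 0.5 cm means between {low} cm and {high} cm")
# prints "5 cm ± 0.5 cm means between 4.5 cm and 5.5 cm"
```

The returned pair is exactly the lower and upper bounds described above: 4.5 cm and 5.5 cm.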
In summary, the precision of a measurement depends on the size of the smallest measuring unit -- whether the measurement is, for example, to the nearest 10 feet, to the nearest foot, or to the nearest tenth of a foot. The smaller the interval, the more we have "narrowed it down," and thus the more precise the measurement.
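To see how the size of the measuring unit drives precision, we can compare the maximum possible error for each unit mentioned above. This is a hypothetical sketch (the helper `max_possible_error` is our own name), again assuming the half-unit rule:

```python
def max_possible_error(unit):
    # The maximum possible error is half the smallest measuring unit.
    return unit / 2

# Nearest 10 feet, nearest foot, nearest tenth of a foot:
for unit in (10.0, 1.0, 0.1):
    err = max_possible_error(unit)
    print(f"to the nearest {unit} ft -> max possible error {err} ft")
```

The smaller the unit, the smaller the maximum possible error, and hence the narrower the interval: measuring to the nearest tenth of a foot pins the true length down to within 0.05 ft, while measuring to the nearest 10 feet only pins it down to within 5 ft.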