A radar gun measured the speed of a baseball at 103 miles per hour. If the baseball was actually going 102.8 miles per hour, about what was the percent error in this measurement?


To calculate the percent error in the measurement, you can use the following formula:

\[ \text{Percent Error} = \left( \frac{|\text{Measured Value} - \text{Actual Value}|}{\text{Actual Value}} \right) \times 100\% \]

In this case, the measured value is 103 miles per hour and the actual value is 102.8 miles per hour.

  1. Calculate the absolute error: \[ |\text{Measured Value} - \text{Actual Value}| = |103 - 102.8| = 0.2 \text{ miles per hour} \]

  2. Plug the values into the percent error formula: \[ \text{Percent Error} = \left( \frac{0.2}{102.8} \right) \times 100\% \]

  3. Calculate: \[ \frac{0.2}{102.8} \approx 0.00195 \] \[ 0.00195 \times 100\% \approx 0.195\% \]

Thus, the percent error in the measurement is approximately 0.195%, or about 0.2%.
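If you want to verify the arithmetic, here is a minimal Python sketch of the same formula (the function name `percent_error` is just illustrative, not from the original answer):

```python
def percent_error(measured, actual):
    """Percent error = |measured - actual| / actual * 100."""
    return abs(measured - actual) / actual * 100

# Radar gun reading vs. the ball's actual speed, both in miles per hour
print(percent_error(103, 102.8))  # ~0.1946, i.e. about 0.2%
```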