A tech company develops a navigation app for smartphones that can estimate how long it usually takes to get from one location to another. The company collects location data from 100 smartphones to determine how long it takes to drive from Cleveland, Ohio, to Detroit, Michigan.
The company finds that it takes an average of 2.78 hours to drive this route, with a standard deviation of 0.06 hours. The driving times appear to be normally distributed.
The company wants to provide an estimate of a range of driving times that includes the driving times of 95% of users.
What would this range be?
a) 0 to 2.66 hours
b) 2.60 to 2.96 hours
c) 1.82 to 3.74 hours
d) 2.66 to 2.90 hours
For a normal distribution, about 95% of the population falls within 2 standard deviations (s.d.) of the mean.
so which one? a, b, c, or d?
start with the mean
add 2 s.d. for the max
subtract 2 s.d. for the min
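Here's a minimal sketch of that arithmetic in Python, using the numbers from the problem (the variable names are just illustrative):

```python
# Empirical rule: ~95% of a normal population lies within
# 2 standard deviations of the mean.
mean_hours = 2.78  # average driving time from the problem
sd_hours = 0.06    # standard deviation from the problem

low = mean_hours - 2 * sd_hours   # 2.78 - 0.12 = 2.66
high = mean_hours + 2 * sd_hours  # 2.78 + 0.12 = 2.90

print(f"95% range: {low:.2f} to {high:.2f} hours")
# -> 95% range: 2.66 to 2.90 hours, which matches choice d
```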
Hmmmmmmm, I'm just as confused as Mrs. Jessica.
The answer is D for people in the future :)