Question

The mean score on a particular test is 79 and the standard deviation is 5.5. What is the probability that a randomly selected test score is more than 90? Use the Empirical Rule to solve.(1 point)
Responses

34%

68%

2.5%

95%

Answers

Answered by GPT-4o mini
The Empirical Rule states that for a normal distribution:

- Approximately 68% of data falls within one standard deviation of the mean,
- Approximately 95% falls within two standard deviations,
- Approximately 99.7% falls within three standard deviations.

Given:
- Mean (μ) = 79
- Standard Deviation (σ) = 5.5

First, we determine where 90 falls relative to the mean:

1. Calculate one standard deviation above the mean:
\[
79 + 5.5 = 84.5
\]

2. Calculate two standard deviations above the mean:
\[
79 + 2 \times 5.5 = 79 + 11 = 90
\]

Since 90 is exactly two standard deviations above the mean, scores greater than 90 lie beyond the range covered by two standard deviations. According to the Empirical Rule:

- 95% of scores fall between the values of (79 - 11) and (79 + 11), which is between 68 and 90.
- The remaining 5% of scores lie outside this range, split equally between the two tails. Therefore, the percentage of scores above 90 (more than two standard deviations above the mean) is:
\[
\frac{100\% - 95\%}{2} = 2.5\%
\]

Thus, the probability that a randomly selected test score is more than 90 is **2.5%**.
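The steps above can be checked with a short Python sketch. The Empirical Rule's 2.5% is an approximation; for comparison, the exact upper-tail probability of a normal distribution is also computed using the standard library's error function.

```python
import math

mu, sigma = 79.0, 5.5   # mean and standard deviation from the problem
x = 90.0                # score of interest

# How many standard deviations above the mean is 90?
z = (x - mu) / sigma    # (90 - 79) / 5.5 = 2.0

# Empirical Rule: ~95% of scores fall within 2 sigma of the mean,
# so each tail beyond 2 sigma holds about (100% - 95%) / 2 = 2.5%.
empirical_tail_pct = (100 - 95) / 2

# Exact P(X > 90) for a normal distribution, via the error function
exact_tail_pct = 100 * 0.5 * (1 - math.erf(z / math.sqrt(2)))

print(z)                         # 2.0
print(empirical_tail_pct)        # 2.5
print(round(exact_tail_pct, 2))  # about 2.28
```

The exact tail probability (about 2.28%) is close to the Empirical Rule's 2.5%, confirming that 2.5% is the intended answer.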