Asked by virginia
Assume that a standardized test is designed to have a mean score of 100 and a standard deviation of 15. At the 95% confidence level, how large does the sample size have to be if the margin of error is to be 3 points?
Answers
Answered by
MathGuru
Formula:
n = {[(z-value) * sd]/E}^2
...where n = sample size, sd = standard deviation, E = maximum error, and ^2 means squared.
Using the values you have in your problem:
n = {[(1.96) * 15]/3}^2
Solve for the sample size, then round up to the next whole number (sample sizes are always rounded up).
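Here is a minimal Python sketch of the same calculation, using the variable names from the formula above (z, sd, E are the values given in the problem; rounding up with math.ceil reflects the usual convention for sample sizes):

```python
import math

# Sample size for estimating a mean with a given margin of error:
# n = ((z * sd) / E) ** 2, rounded up to the next whole number.

z = 1.96   # critical z-value for 95% confidence
sd = 15    # population standard deviation
E = 3      # desired margin of error

n = ((z * sd) / E) ** 2
print(n)             # 96.04
print(math.ceil(n))  # 97 -- sample size is rounded up
```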