Asked by tristen
1. A random sample of 100 computers showed a mean of 115 gigabytes used with a standard deviation of 20 gigabytes. What is the standard error of the mean?
Answers
Answered by
MathGuru
Standard error of the mean:
Standard deviation divided by the square root of the sample size.
Here, SEM = s / √n = 20 / √100 = 20 / 10 = 2 gigabytes.
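If you want to check the arithmetic yourself, here is a minimal Python sketch (the variable names are just for illustration, not from the original question):

import math

# Given values from the question
sample_size = 100   # n: number of computers sampled
std_dev = 20.0      # s: sample standard deviation in gigabytes

# Standard error of the mean: s / sqrt(n)
sem = std_dev / math.sqrt(sample_size)
print(sem)  # prints 2.0 (gigabytes)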