Asked by tristen

1. A random sample of 100 computers showed a mean of 115 gigabytes used with a standard deviation of 20 gigabytes. What is the standard error of the mean?

Answers

Answered by MathGuru
The standard error of the mean is the standard deviation divided by the square root of the sample size. Here that is 20 divided by the square root of 100, which is 20/10 = 2 gigabytes.
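If you want to verify the arithmetic, here is a minimal Python sketch of the same calculation (the variable names are just illustrative, not from the original problem):

import math

sample_sd = 20      # sample standard deviation, in gigabytes
sample_size = 100   # number of computers sampled

# standard error of the mean = s / sqrt(n)
standard_error = sample_sd / math.sqrt(sample_size)
print(standard_error)  # prints 2.0 (gigabytes)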