A university has 1,000 computers available for students to use. Each computer has a 250-gigabyte hard drive. The university wants to estimate the space occupied on the hard drives. A random sample of 100 computers showed a mean of 115 gigabytes used with a standard deviation of 20 gigabytes. What is the standard error of the mean?

1 answer

Standard error of the mean = s/sqrt(n) = 20/sqrt(100) = 2 gigabytes.
That is, the sample mean is within ±2 gigabytes of the true mean about 68 percent of the time.
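
A quick sketch of the arithmetic in Python, using only the numbers given in the question (variable names are my own):

import math

s = 20        # sample standard deviation, gigabytes
n = 100       # sample size
x_bar = 115   # sample mean, gigabytes

# Standard error of the mean: s / sqrt(n)
se = s / math.sqrt(n)
print(f"standard error = {se:.1f} GB")  # 2.0 GB

# About 68% of the time the sample mean lies within one standard error of the true mean
print(f"68% interval: {x_bar - se:.1f} to {x_bar + se:.1f} GB")  # 113.0 to 117.0 GB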