Ask a New Question

Asked by Los

If a distribution has a mean of 50 and a standard deviation of 5, what value would be -1 standard deviation from the mean?
5 years ago

Answers

Answered by oobleck
surely, that would be 45, right?
5 years ago
Answered by Damon
Unless there is some very strange trick :)
5 years ago
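For anyone who wants to check the arithmetic, a value z standard deviations from the mean is mean + z * sd, so here it's 50 + (-1) * 5. A quick sketch in Python (variable names are just illustrative):

```python
mean = 50
sd = 5
z = -1  # one standard deviation below the mean

# value at a given z-score: mean + z * sd
value = mean + z * sd
print(value)  # 45
```

The same formula gives any other point on the distribution, e.g. z = +2 would be 50 + 2 * 5 = 60.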
