Asked by Crystal

If a distribution of IQ test scores has a mean of 100 and a standard deviation of 16, what would be the result if we first subtract the mean (100) from each score, and then divide each of those resulting scores by the standard deviation (16)? 

14 years ago

Answers

Answered by PsyDAG
Z = (score-mean)/SD

You would have the z-score for each raw score, expressing its value in terms of standard deviations from the mean. The transformed distribution has a mean of 0 and a standard deviation of 1.
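A minimal sketch of this standardization in Python, using a few hypothetical IQ scores (not from the question) and the mean and standard deviation given above:

```python
# Hypothetical IQ scores chosen for illustration only
scores = [84, 100, 116, 132]
mean, sd = 100, 16  # values given in the question

# Subtract the mean, then divide by the standard deviation: z = (score - mean) / SD
z_scores = [(x - mean) / sd for x in scores]

print(z_scores)  # -> [-1.0, 0.0, 1.0, 2.0]
```

Each output value tells you how many standard deviations the raw score sits above or below the mean, e.g. 132 is exactly two standard deviations above 100.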
14 years ago
