Asked by Crystal
If a distribution of IQ test scores has a mean of 100 and a standard deviation of 16, what would be the result if we first subtract the mean (100) from each score, and then divide each of those resulting scores by the standard deviation (16)?
Answers
Answered by
PsyDAG
Z = (score - mean) / SD
You would have the Z score for each raw score, expressing its value in terms of standard deviations from the mean.
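As a quick illustration, here is a minimal Python sketch of that transformation (the raw scores below are hypothetical, chosen only for the example):

```python
# Converting raw IQ scores to Z scores for a distribution
# with mean 100 and standard deviation 16.

mean = 100
sd = 16

raw_scores = [84, 100, 116, 132]  # hypothetical raw IQ scores

# Z = (score - mean) / SD
z_scores = [(score - mean) / sd for score in raw_scores]

print(z_scores)  # [-1.0, 0.0, 1.0, 2.0]
```

After this transformation, the distribution as a whole has a mean of 0 and a standard deviation of 1, so each Z score directly tells you how many standard deviations a raw score lies above or below the mean.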