Why must you square the deviation scores when computing a standard deviation using the definitional formula?

Why is the computational formula easier to use when computing variance and standard deviation?

Answer:

The sum of the unsquared deviations from the mean is always zero, since the mean serves as a fulcrum (balance point) for the distribution. Squaring each deviation keeps the positive and negative deviations from cancelling out, which is why the definitional formula requires it.
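In symbols (using X for a score, M for the mean, and N for the number of scores), the deviations cancel, so the definitional formula squares them before summing:

$$\sum (X - M) = 0, \qquad SS = \sum (X - M)^2, \qquad s^2 = \frac{SS}{N}, \qquad s = \sqrt{s^2}$$

(Divide by N for a population, or by N - 1 when estimating from a sample.)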

With the computational formula you square the sum of the scores rather than squaring each score's deviation from the mean and summing those squared deviations. Because the deviations themselves never have to be calculated, there are fewer steps and fewer messy decimal values to carry through the arithmetic.
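In the same notation, the computational formula obtains the sum of squares directly from the raw scores:

$$SS = \sum X^2 - \frac{\left(\sum X\right)^2}{N}$$

Only two running totals are needed, the sum of the scores and the sum of the squared scores, and the variance and standard deviation then follow from SS exactly as above.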