To find the slope and intercept of the least-squares regression line, we can use the formula:
slope (b) = r * (SDy / SDx)
intercept (a) = mean of y - (b * mean of x)
where r is the correlation coefficient, SDx and mean of x are the standard deviation and mean of the x variable (pre-exam total scores), and SDy and mean of y are the standard deviation and mean of the y variable (final exam scores).
Given:
r = 0.9
SDx = 50
mean of x = 275
SDy = 10
mean of y = 70
4-1) To find the slope:
slope (b) = 0.9 * (10/50) = 0.9 * 0.2 = 0.18
To find the intercept:
intercept (a) = 70 - (0.18 * 275) = 70 - 49.5 = 20.5
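The slope and intercept computation above can be sketched in a few lines of Python (plain arithmetic, no external libraries; the numbers are the summary statistics given in the problem):

```python
r = 0.9        # correlation coefficient
sd_x = 50      # SD of pre-exam total scores (x)
mean_x = 275   # mean of pre-exam total scores
sd_y = 10      # SD of final exam scores (y)
mean_y = 70    # mean of final exam scores

# regression of y on x
slope = r * (sd_y / sd_x)            # ≈ 0.18
intercept = mean_y - slope * mean_x  # ≈ 20.5
print(slope, intercept)
```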
4-2) To predict Mary's pre-exam total score from her final exam score, we cannot plug y into the line from 4-1, which predicts y from x. We need the regression line of x on y:
slope (b') = r * (SDx / SDy) = 0.9 * (50/10) = 0.9 * 5 = 4.5
intercept (a') = mean of x - (b' * mean of y) = 275 - (4.5 * 70) = 275 - 315 = -40
Mary's final exam score (y) = 81
predicted pre-exam total score (x) = a' + b' * y
= -40 + 4.5 * 81
= -40 + 364.5
= 324.5
Therefore, the predicted pre-exam total score for Mary is 324.5.
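A quick numeric check of this step (a sketch, assuming the x-on-y regression is the intended method for predicting x from y; an equivalent form is mean of x + b' * (y - mean of y)):

```python
r, sd_x, sd_y = 0.9, 50, 10
mean_x, mean_y = 275, 70
y_mary = 81  # Mary's final exam score

# regression of x on y: slope uses SDx / SDy, not SDy / SDx
slope_xy = r * (sd_x / sd_y)                    # 4.5
x_pred = mean_x + slope_xy * (y_mary - mean_y)  # 275 + 4.5 * 11 = 324.5
print(x_pred)
```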
4-3) To calculate r^2, which represents the proportion of the total variation in y that can be explained by the regression model:
Given that r = 0.9, r^2 = (0.9)^2 = 0.81
This means that 81% of the variation in the final exam scores can be explained by the pre-exam total scores.
Since r^2 = 0.81, 81% of the overall variation in final exam scores across students is explained by pre-exam total scores; the remaining 19% is unexplained. Note that r^2 describes the scores as a whole, not any one student.
Thus, the regression prediction for an individual is only an estimate: Mary's actual pre-exam score could be higher or lower than the predicted value, because some of the variation is not captured by the regression model.
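One common way to quantify that unexplained spread (an addition here, not part of the original answer) is the RMS error of the regression prediction, SD * sqrt(1 - r^2); for predicting the pre-exam score it uses SDx:

```python
import math

r, sd_x = 0.9, 50
r_squared = r ** 2                             # 0.81
# typical size of the prediction error for x
rms_error_x = sd_x * math.sqrt(1 - r_squared)  # ≈ 21.8 points
print(r_squared, rms_error_x)
```

So an individual's actual pre-exam score would typically land within roughly 22 points of the regression prediction.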