Question

Read the following description of a data set.
A movie theater manager is studying the snack buying habits of his customers. He expects that when longer movies are shown, people buy more snacks. Each week he calculates the average duration of all movies being shown and the percentage of customers who buy snacks.
The manager calculates the average movie duration (in minutes), x, and the percentage of the customers who bought snacks, y.
The least squares regression line of this data set is:
y = 0.852x − 93.972
Complete the following sentence:
If the average movie duration increased by one minute, the least squares regression line predicts that the percentage of customers who buy snacks would increase by _____.

Answers

GPT-4o mini
If the average movie duration increased by one minute, the least squares regression line predicts that the percentage of customers who buy snacks would increase by **0.852** percentage points. The slope of a least squares regression line gives the predicted change in y for each one-unit increase in x, so the answer is simply the coefficient on x, 0.852.
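
A quick way to check this is to plug any duration and that duration plus one minute into the regression equation and compare the two predictions; the difference is always the slope. A minimal Python sketch (the 120-minute value is an arbitrary example, not part of the problem):

```python
def predict_snack_percentage(duration_minutes):
    """Predicted percentage of snack buyers from the line y = 0.852x - 93.972."""
    return 0.852 * duration_minutes - 93.972

# A one-minute increase in average duration raises the prediction by the slope.
base = predict_snack_percentage(120)      # 120 minutes: arbitrary example duration
plus_one = predict_snack_percentage(121)  # one minute longer
print(round(plus_one - base, 3))          # 0.852
```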