While hiking, Marek throws a rock off a cliff that is roughly 10 meters above a lake. This situation is modeled by the equation h = -4.9t^2 + 10, with h representing the height in meters and t the time in seconds. If you were to set up a table to determine when the rock hits the water, what would be a reasonable scale for the range? (1 point)
5
1/10
2
10


To determine when the rock hits the water, we need to find the time \( t \) when the height \( h \) is 0 (meaning the rock has reached the surface of the lake). The equation given is:

\[ h = -4.9t^2 + 10 \]

Setting \( h = 0 \):

\[ 0 = -4.9t^2 + 10 \]

To solve for \( t \):

\[ 4.9t^2 = 10 \]

\[ t^2 = \frac{10}{4.9} \]

\[ t^2 \approx 2.0408 \]

\[ t \approx \sqrt{2.0408} \]

\[ t \approx 1.43 \text{ seconds} \]
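As a quick sanity check, the same computation can be done in a few lines of Python (a hypothetical sketch, not part of the original problem):

```python
import math

# Height model from the problem: h = -4.9 t^2 + 10 (h in meters, t in seconds)
def height(t):
    return -4.9 * t**2 + 10

# Solving 0 = -4.9 t^2 + 10 gives t = sqrt(10 / 4.9)
t_impact = math.sqrt(10 / 4.9)
print(f"t_impact = {t_impact:.2f} s")          # ~1.43 s
print(f"h(t_impact) = {height(t_impact):.2f} m")  # ~0.00 m, i.e. the water's surface
```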

Now, to set up a table that calculates the height of the rock over time, we need to consider a range for \( t \) that includes values from 0 to slightly higher than 1.43 seconds.

A reasonable scale (given that the rock hits the water at about 1.43 seconds) would be increments of 0.1 seconds: steps of 2, 5, or 10 would jump past the impact after at most one row, while 0.1-second steps provide a detailed view of the rock's height as it falls.

Thus, based on the given options, 1/10 (0.1 seconds) would be the most reasonable scale for the range when setting up a table to determine when the rock hits the water.
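For illustration, a minimal Python sketch (assuming the 0.1-second increments chosen above) that generates such a table might look like this:

```python
# Tabulate the height every 0.1 s from t = 0.0 s to t = 1.5 s.
# The row where h first becomes negative brackets the impact time.
for i in range(16):
    t = i / 10                 # 0.0, 0.1, ..., 1.5 seconds
    h = -4.9 * t**2 + 10       # height in meters
    print(f"t = {t:.1f} s   h = {h:6.2f} m")
```

Running this shows h dropping from 10.00 m at t = 0.0 s to a negative value between t = 1.4 s and t = 1.5 s, consistent with the rock hitting the water at roughly 1.43 seconds.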