While hiking, Marek throws a rock off a cliff that is roughly 10 meters above a lake. This situation is modeled by the equation \( h = -4.9t^2 + 10 \), with h representing the height in meters and t the time in seconds. If you were to set up a table to determine when the rock hits the water, what would be a reasonable scale for the range? (1 point)
Responses

2

10

1/10 (1 tenth)

5

1 answer

To determine when the rock hits the water, we need to find the time \( t \) when the height \( h \) is 0 (when the rock reaches the surface of the lake).

Using the equation given:

\[ h = -4.9t^2 + 10 \]

We set \( h = 0 \):

\[ 0 = -4.9t^2 + 10 \]

Solving for \( t \):

\[ 4.9t^2 = 10 \]

\[ t^2 = \frac{10}{4.9} \]

\[ t^2 \approx 2.04 \]

Taking the square root:

\[ t \approx \sqrt{2.04} \approx 1.43 \text{ seconds} \]
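The arithmetic above can be checked with a quick Python sketch (not part of the original problem):

```python
import math

# Solve 0 = -4.9 t^2 + 10 for the positive root t.
t_impact = math.sqrt(10 / 4.9)
print(round(t_impact, 2))  # ≈ 1.43 seconds
```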

Since we are choosing a reasonable scale for the table, the key fact is that the rock hits the water after approximately 1.43 seconds.

Given the options provided, a reasonable scale for the range in a table designed to show when the rock hits the water would be 1 tenth (0.1 seconds). This allows for sufficient granularity in tracking the time progression leading up to the rock hitting the water.
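To see why a step of 0.1 works well, here is a small Python sketch (an illustration, not part of the original problem) that tabulates the height in 0.1-second increments until it crosses 0:

```python
# Tabulate h = -4.9 t^2 + 10 in steps of 0.1 s until the rock reaches the water.
step = 0.1
t = 0.0
while True:
    h = -4.9 * t**2 + 10
    print(f"t = {t:.1f} s, h = {h:6.2f} m")
    if h <= 0:
        break  # height has reached (or passed) the lake surface
    t += step
```

With this step size, the table shows h changing sign between t = 1.4 s and t = 1.5 s, which brackets the impact time of about 1.43 seconds; a coarser scale such as 2, 5, or 10 would skip past it entirely.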

So the final answer is:

1 tenth