While hiking, Marek throws a rock off a cliff that is roughly 10 meters above a lake. This situation is modeled by the equation \( h = -4.9t^2 + 10 \), with h representing the height in meters and t the time in seconds. If you were to set up a table to determine when the rock hits the water, what would be a reasonable scale for the range? (1 point)
Responses

10

5

1/10

2

1 answer

To determine when the rock hits the water, we need to find the time \( t \) when the height \( h \) is equal to 0 (i.e., when the rock reaches the lake). The equation we have is:

\[ h = -4.9t^2 + 10 \]

Setting \( h \) to 0:

\[ 0 = -4.9t^2 + 10 \]

Rearranging gives:

\[ 4.9t^2 = 10 \]

\[ t^2 = \frac{10}{4.9} \]

\[ t^2 \approx 2.04 \]

Taking the square root:

\[ t \approx \sqrt{2.04} \approx 1.43 \text{ seconds} \]

Since the rock hits the water at about \( t \approx 1.43 \) seconds, the table only needs to cover times from \( t = 0 \) up to roughly \( t = 1.5 \) seconds.

Over such a short interval, increments of 0.1 seconds are a reasonable scale for the table: they produce enough rows to show the height changing sign as time approaches 1.43 seconds, which is when the rock reaches the water.
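As a rough check, here is a short Python sketch (not part of the original problem) that builds such a table in 0.1-second steps; the helper name `height` is just an illustrative choice:

```python
# Tabulate h = -4.9 t^2 + 10 in 0.1 s steps to see where h crosses zero.
def height(t):
    """Height of the rock (meters) at time t (seconds)."""
    return -4.9 * t ** 2 + 10

for i in range(16):
    t = i / 10  # 0.0, 0.1, ..., 1.5 seconds
    print(f"t = {t:.1f} s   h = {height(t):7.3f} m")
```

Running this shows the height still positive at t = 1.4 s but negative at t = 1.5 s, so the sign change brackets the impact time of about 1.43 seconds.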

So, the best option among the scale choices provided is:

1 tenth