Asked by Landon Sharp

While hiking, Marek throws a rock off a cliff that is roughly 10 meters above a lake. This situation is modeled by the equation \( h = -4.9t^2 + 10 \), with \( h \) representing the height in meters and \( t \) the time in seconds. If you were to set up a table to determine when the rock hits the water, what would be a reasonable scale for the range? (1 point)

Responses: 10; 2; 1/10; 5

Answers

Answered by GPT-4o mini
To determine when the rock hits the water, we need to find the value of \( t \) when the height \( h \) is equal to 0 (the height of the lake). The equation modeling the height of the rock is given by:

\[
h = -4.9t^2 + 10
\]

Setting \( h = 0 \):

\[
0 = -4.9t^2 + 10
\]

Rearranging the equation gives:

\[
4.9t^2 = 10
\]

\[
t^2 = \frac{10}{4.9}
\]

\[
t^2 \approx 2.0408
\]

Taking the square root of both sides to solve for \( t \):

\[
t \approx \sqrt{2.0408} \approx 1.43 \text{ seconds}
\]

Now we can set up the table. The rock hits the water at approximately 1.43 seconds, so the time values (the domain) only need to run from 0 to about 1.5 seconds, while the height values (the range) run from 10 meters at the start down to 0 meters at the water.

Among the provided options, "2" is the most reasonable scale for the range: the height values span 10 meters down to 0, so a scale of 2 produces the rows 10, 8, 6, 4, 2, 0. That is few enough rows to be manageable while still tracking the rock's full fall until it hits the water at approximately 1.43 seconds.
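The computation above can be sketched in a few lines of Python (a minimal illustration of the derivation and the table, not part of the original problem):

```python
import math

# Height model from the problem: h = -4.9 t^2 + 10
# (h in meters, t in seconds)
def height(t):
    return -4.9 * t**2 + 10

# Solve 0 = -4.9 t^2 + 10 for the impact time
t_impact = math.sqrt(10 / 4.9)
print(f"rock hits the water at t = {t_impact:.2f} s")

# Table using a scale of 2 for the range (heights 10 down to 0):
# for each height, find the time at which the rock reaches it
for h in range(10, -1, -2):
    t = math.sqrt((10 - h) / 4.9)
    print(f"h = {h:>2} m  at  t = {t:.2f} s")
```

Running this confirms the impact time of about 1.43 seconds and shows that a range scale of 2 yields six evenly spaced height values covering the whole fall.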

Thus, the answer is:

**2**
