While hiking, Marek throws a rock off a cliff that is roughly 10 meters above a lake. The situation is modeled by the equation h = -4.9t² + 10, with h representing the height in meters and t the time in seconds. If you were to set up a table to determine when the rock hits the water, what would be a reasonable scale for the range?

1 answer

To determine when the rock hits the water, we need to find out when the height \( h \) becomes 0 (i.e., when it reaches the lake). We have the equation:

\[ h = -4.9t^2 + 10 \]

Setting \( h = 0 \):

\[ 0 = -4.9t^2 + 10 \]

Now, we can solve for \( t \):

\[ 4.9t^2 = 10 \]

\[ t^2 = \frac{10}{4.9} \]

\[ t^2 \approx 2.04 \]

Taking the square root:

\[ t \approx \sqrt{2.04} \approx 1.43 \text{ seconds} \]

Therefore, the rock will hit the water approximately 1.43 seconds after being thrown.
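As a quick numerical check, here is a minimal Python sketch that solves for the positive root (the `height` function name is just illustrative, not from the problem):

```python
import math

# Height model from the problem: h = -4.9 t^2 + 10
def height(t: float) -> float:
    return -4.9 * t**2 + 10

# Setting h = 0 and solving for the positive root of t
t_impact = math.sqrt(10 / 4.9)
print(f"Impact time: t ≈ {t_impact:.2f} s")                           # ≈ 1.43 s
print(f"Height check: h({t_impact:.2f}) ≈ {height(t_impact):.4f} m")  # ≈ 0
```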

For your table setup, you want to establish a reasonable time range that includes the moment the rock hits the water:

  • Start at \( t = 0 \) seconds (the moment the rock is thrown).
  • Since the rock hits the water at about 1.43 seconds, extend the range slightly beyond that, to 2 or even 3 seconds, so the table comfortably captures the impact.

Thus, a reasonable scale for your table would be:

  • Time values: \( t = 0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0 \) (steps of 0.5 seconds)
  • Heights (the range): compute \( h \) at each of these times; the values fall from 10 m at \( t = 0 \) toward 0 m at impact, so a scale of 0 to 10 meters works well (see the sketch after this list).
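Here is a short sketch, under the same model, of how that table could be generated (negative model values are clamped to 0, since the rock stops at the lake surface):

```python
# Tabulate height every 0.5 s from t = 0 to t = 3.0
for step in range(7):
    t = 0.5 * step
    h = -4.9 * t**2 + 10
    print(f"t = {t:.1f} s  ->  h = {max(h, 0.0):.2f} m")
```

The computed heights change sign between \( t = 1.0 \) and \( t = 1.5 \), which brackets the impact time of about 1.43 seconds.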

With this setup, you can track the rock's height over time and clearly see when it reaches the lake.