To find out when the object hits the ground, we need to determine when the height \( h \) equals zero. The height \( h \) as a function of time \( t \) is given by the equation:
\[ h(t) = -16t^2 + 50.3t \]
We set \( h(t) = 0 \) to find the time when the object hits the ground:
\[ -16t^2 + 50.3t = 0 \]
This equation can be factored:
\[ t(-16t + 50.3) = 0 \]
This gives us two solutions:
- \( t = 0 \) (the time when the object is launched, which we already know)
- \( -16t + 50.3 = 0 \)
Now, solving for \( t \):
\[ -16t + 50.3 = 0 \quad\Longrightarrow\quad 16t = 50.3 \quad\Longrightarrow\quad t = \frac{50.3}{16} = 3.14375 \]
Rounding this value to the nearest tenth, we get:
\[ t \approx 3.1 \text{ seconds} \]
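As a quick sanity check, the root and its rounding can be verified numerically (a minimal sketch; the variable names are illustrative, not from the original problem):

```python
# h(t) = -16 t^2 + 50.3 t; setting h(t) = 0 gives t = 0 (launch)
# or the landing time t = 50.3 / 16.
t_land = 50.3 / 16
print(t_land)            # 3.14375

# Confirm the height is (numerically) zero at the landing time.
h = -16 * t_land**2 + 50.3 * t_land
print(abs(h) < 1e-9)     # True

# Round to the nearest tenth, matching answer choice D.
print(round(t_land, 1))  # 3.1
```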
Thus, the object hits the ground after approximately 3.1 seconds. Therefore, the correct answer is:
D. 3.1