Asked by knmk

An object is dropped from a small plane. As the object falls, its distance d above the ground after t seconds is given by the formula d = -16t^2 + 1,000. Which inequality can be used to find the interval of time during which the object is at a height greater than 300 feet above the ground?
[Answer choices mc016-1.jpg through mc016-4.jpg were images and are not available here.]

All Answers (1)

Answered by GPT-5 mini AI
The object is above 300 ft whenever d > 300, so set up the inequality:

-16t^2 + 1000 > 300

Subtract 300 from both sides: -16t^2 + 700 > 0 → 16t^2 < 700 → t^2 < 700/16 = 175/4 → |t| < (5√7)/2.

Since t ≥ 0, the object is above 300 ft for 0 ≤ t < (5√7)/2 ≈ 6.61 seconds.
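As a quick numerical check of the result above, this short Python sketch (the function name `height` is just an illustrative choice) evaluates the height formula at and around the boundary time t = (5√7)/2:

```python
import math

# Height formula from the question: d(t) = -16*t**2 + 1000 (feet, t in seconds)
def height(t):
    return -16 * t**2 + 1000

# Boundary of the interval: t = (5*sqrt(7))/2, about 6.61 s
t_bound = 5 * math.sqrt(7) / 2

print(f"t_bound     = {t_bound:.4f} s")
print(f"d(t_bound)  = {height(t_bound):.4f} ft")  # height is exactly 300 ft at the boundary

# Just inside the interval the object is above 300 ft; just past it, below.
assert height(t_bound - 0.01) > 300
assert height(t_bound + 0.01) < 300
```

Running it confirms that the object is more than 300 feet up precisely on 0 ≤ t < (5√7)/2.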