Suppose an object is dropped from a height h0 above the ground. Then its height after t seconds is given by h=−16t^2+h0, where h is measured in feet. If a ball is dropped from 48 feet above the ground, how long does it take to reach ground level?

1 answer

Plug in h_0 = 48 and solve for t when h(t) = 0:

0 = −16t^2 + 48
16t^2 = 48
t^2 = 3
t = √3 ≈ 1.73 seconds

(Discard the negative root, since time can't be negative.) Just the usual quadratic stuff.
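If you want to check the algebra numerically, here is a minimal Python sketch (the function name time_to_ground is just for illustration):

import math

def time_to_ground(h0):
    # Solve 0 = -16*t**2 + h0 for the nonnegative root t.
    # h0 is the initial height in feet; the result is in seconds.
    return math.sqrt(h0 / 16)

print(time_to_ground(48))   # ≈ 1.732, i.e. √3 seconds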