A golf ball is hit off the top of a cliff that is 75 feet tall at an angle of 45° to the horizontal with an initial velocity of 80 feet per second. The quadratic equation shown below models the height, h(x), of the ball when it is x feet from the cliff's edge. How far will the ball travel before it hits the ground? Round your answer to the nearest hundredth of a foot.

1 answer

Horizontal speed (constant for the whole flight): u = 80 cos 45° ≈ 56.57 ft/s

Initial vertical speed: Vi = 80 sin 45° ≈ 56.57 ft/s
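As a quick sanity check on those components (this snippet is mine, not part of the original problem), here is the same arithmetic in Python:

```python
import math

v0 = 80.0                  # initial speed, ft/s
theta = math.radians(45)   # launch angle

u  = v0 * math.cos(theta)  # horizontal speed, constant in flight
vi = v0 * math.sin(theta)  # initial vertical speed

print(u, vi)               # both 56.5685... ft/s, i.e. exactly 40*sqrt(2)
```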

You did not include the quadratic, so I will derive it from the physics.

v = Vi - g t
where g ≈ 32 ft/s^2 in these obsolete units
then
h = Hi + Vi t - (g/2) t^2 = Hi + Vi t - 16 t^2
so
0 = 75 + 56.57 t - 16 t^2
but d = u t = 56.57 t
so t = d/56.57
and
0 = 75 + d - 16 (d^2/56.57^2)

Since u^2 = (40 sqrt 2)^2 = 3200 exactly, 16/3200 = 0.005, so

0 = 75 + d - 0.005 d^2

0.005 d^2 - d - 75 = 0

Multiply through by 200:

d^2 - 200 d - 15,000 = 0

d = [ 200 +/- sqrt ( 40,000 + 60,000 ) ]/2 = [ 200 +/- sqrt ( 100,000 ) ]/2

d = [ 200 +/- 316.23 ]/2

d ≈ 258.11 feet

(Ignore the negative root; that is where the ball would have had to start from the ground if there were no cliff :)
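To double-check the arithmetic, here is a minimal Python sketch (variable names are mine; it just redoes the quadratic formula above and recovers the flight time):

```python
import math

# Quadratic in the distance d after eliminating t:  0.005 d^2 - d - 75 = 0
a, b, c = 0.005, -1.0, -75.0
disc = b * b - 4 * a * c                   # 1 + 1.5 = 2.5
d = (-b + math.sqrt(disc)) / (2 * a)       # keep only the positive root
print(round(d, 2))                         # 258.11 (feet)

# Cross-check: recover the flight time and confirm the ball is at ground level
u = 40 * math.sqrt(2)                      # horizontal speed, ft/s
t = d / u                                  # ~4.56 s in the air
print(round(75 + u * t - 16 * t * t, 6))   # ~0, since Vi = u at 45°
```

Taking only the positive root matches the note above about ignoring the negative distance.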