A rock is thrown upward with a speed of 48 feet per second from the edge of a cliff 400 feet above the ground. What is the speed of the rock when it hits the ground? Use acceleration due to gravity as -32 feet per second squared and approximate your answer to 3 decimal places.

I don't understand exactly what this question is asking. I know velocity is the integral of acceleration, but how would I get the acceleration equation?

1 answer

a(t) = -32 ft/sec^2 , given
v(t) = -32t + c
when t = 0, v(0) = 48 ft/sec, given
48 = 0 + c ---> c = 48
v(t) = -32t + 48

s(t) = -16t^2 + 48t + k
when t = 0 , s = 400 ft
s(t) = -16t^2 + 48t + 400
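
If you want to verify the two antiderivatives, here is a minimal sketch in Python using sympy (assuming it is installed); the added constants come straight from v(0) = 48 and s(0) = 400:

    import sympy as sp

    t = sp.symbols('t')
    v = sp.integrate(-32, t) + 48   # integrate a(t) = -32; v(0) = 48 fixes the constant
    s = sp.integrate(v, t) + 400    # integrate v(t); s(0) = 400 fixes the constant
    print(v)                        # -32*t + 48
    print(s)                        # -16*t**2 + 48*t + 400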

After doing a few of these, you should be able to write that last equation directly.
when it hits the ground, s(t) = 0
-16t^2 + 48t + 400 = 0
divide each term by -8
2t^2 - 6t - 50 = 0
Use the quadratic formula to solve for t; one of the roots will be negative, so keep only the positive root, t = (6 + √436)/4 ≈ 6.720 s.
Finally, substitute that t back into v(t) = -32t + 48 to get the velocity at impact; its magnitude is the speed, about 167.045 ft/sec.
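
A quick numeric check of those last two steps in Python (variable names are mine):

    import math

    # solve 2t^2 - 6t - 50 = 0 with the quadratic formula (a=2, b=-6, c=-50)
    a, b, c = 2, -6, -50
    disc = b**2 - 4*a*c                     # 36 + 400 = 436
    t_hit = (-b + math.sqrt(disc)) / (2*a)  # positive root only, ~6.720 s
    v_hit = -32 * t_hit + 48                # velocity at impact (negative = downward)
    print(round(t_hit, 3))                  # 6.72
    print(round(abs(v_hit), 3))             # 167.045 ft/sec, the speed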