A rock is thrown upward with a speed of 48 feet per second from the edge of a cliff 400 feet above the ground. What is the speed of the rock when it hits the ground? Use acceleration due to gravity as –32 feet per second squared and approximate your answer to 3 decimal places.

1 answer

a=dv/dt
v = INT a dt = -32t + k
but at t = 0, v = 48, so k = 48
v = -32t + 48
h = INT v dt = INT (-32t + 48) dt
h = -16t^2 + 48t + k2
but h = 400 at t = 0, so k2 = 400

h = -16t^2 + 48t + 400
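
As a quick sanity check of the two integrations, here is a small sketch using SymPy (the setup is my own, not part of the original solution):

    import sympy as sp

    t = sp.symbols('t')
    v = sp.integrate(-32, t) + 48    # v(0) = 48 fixes the first constant
    h = sp.integrate(v, t) + 400     # h(0) = 400 fixes the second constant
    print(v)   # should give -32*t + 48
    print(h)   # should give -16*t**2 + 48*t + 400
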
when h = 0, find t:
16t^2 - 48t - 400 = 0
t^2 - 3t - 25 = 0

t = (3 +- sqrt(9 + 100))/2 = (3 +- sqrt(109))/2
take the positive root: t = (3 + sqrt(109))/2, which is about 6.720 seconds
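
A quick numerical check of that root (a minimal Python sketch; the name t_hit is mine):

    import math

    # positive root of t^2 - 3t - 25 = 0
    t_hit = (3 + math.sqrt(109)) / 2
    print(f"{t_hit:.3f}")   # 6.720
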
now find v at that time:
v = -32t + 48 = 48 - 16(3 + sqrt(109)) = -16 sqrt(109) ≈ -167.045
so the speed of the rock when it hits the ground is about 167.045 ft/s (the negative sign just means the rock is moving downward)
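
To double-check the final number (a short Python sketch; the variable names are my own):

    import math

    t_hit = (3 + math.sqrt(109)) / 2          # impact time found above
    v_hit = -32 * t_hit + 48                  # velocity at impact
    print(f"{v_hit:.3f}")                     # -167.045 ft/s

    # independent check without solving for t: v^2 = v0^2 + 2*(32)*(400)
    print(f"{math.sqrt(48**2 + 2*32*400):.3f}")   # 167.045 ft/s

The second print uses the kinematics identity v^2 = v0^2 + 2*a*(change in height), which gives the impact speed directly without finding the impact time first.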