A ball is thrown horizontally from the top of a cliff 20 m high. If the initial horizontal velocity is 8.0 m/s, find how long it takes to reach the horizontal plane at the foot of the cliff.

3 answers

Let u be the horizontal speed and v the vertical.
u(0) = 8.0 m/s, v(0) = 0 m/s
Let x be the horizontal position and y the vertical.
x(0) = 0 m, y(0) = 20 m
y(t) = y(0) + v(0) t - (9.81/2) t^2
Find t when y(t) = 0 m.
So: 0 m = 20 m - 4.905 (m/s^2) t^2, giving t^2 = 20/4.905 ≈ 4.08 s^2, so t ≈ 2.0 s.
(The horizontal velocity does not affect the fall time: horizontal and vertical motion are independent.)
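A quick numerical check of this setup (a minimal sketch; the variable names are illustrative, not from the answer above):

```python
import math

g = 9.81   # gravitational acceleration, m/s^2
y0 = 20.0  # initial height, m
v0 = 0.0   # initial vertical velocity (thrown horizontally), m/s

# Solve y(t) = y0 + v0*t - (g/2)*t^2 = 0 for t.
# With v0 = 0 this reduces to t = sqrt(2*y0/g).
t = math.sqrt(2 * y0 / g)
print(f"t = {t:.2f} s")  # prints: t = 2.02 s
```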
From the equation of motion v^2 = u^2 + 2gh, find the final vertical velocity. Vertically the initial velocity is u = 0 (the 8.0 m/s given is horizontal), so applying the parameters gives v = sqrt(2 × 9.81 × 20) ≈ 19.8 m/s. Then from v = u + gt, substitute v, u, and g to find t: t = (v − u)/g ≈ 19.8/9.81 ≈ 2.0 s.
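The same two-step route in code (a sketch under the same assumptions; the names are mine):

```python
import math

g = 9.81  # m/s^2
h = 20.0  # drop height, m
u = 0.0   # initial vertical velocity, m/s

# Step 1: v^2 = u^2 + 2*g*h  ->  final vertical velocity
v = math.sqrt(u**2 + 2 * g * h)
# Step 2: v = u + g*t  ->  t = (v - u) / g
t = (v - u) / g
print(f"v = {v:.1f} m/s, t = {t:.2f} s")  # v = 19.8 m/s, t = 2.02 s
```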
Formula: v = u ± gt
You drop u because the initial vertical velocity is zero; the 8.0 m/s you are given is horizontal, not vertical.
So you have v = gt.
Because you are looking for t,
you make it the subject of the formula: t = v ÷ g.
You still need the final vertical velocity v, which comes from v^2 = 2gh: v = √(2 × 10 × 20) = 20 m/s (taking g ≈ 10 m/s²).
Then you apply the numbers:
t = 20 ÷ 10
= 2.0 s (seconds)
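The same rearrangement in code (a sketch using this answer's rounded g ≈ 10 m/s²; the names are illustrative):

```python
import math

g = 10.0  # m/s^2, rounded as in this answer
h = 20.0  # cliff height, m

# Final vertical velocity from v^2 = 2*g*h (initial vertical velocity is zero)
v = math.sqrt(2 * g * h)  # 20.0 m/s
# Rearranged formula: v = g*t  =>  t = v / g
t = v / g
print(f"t = {t:.1f} s")  # prints: t = 2.0 s
```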