A ball rolls off a table moving at 3.6 m/s in the horizontal direction. The table is 1.2 m high. Find the amount of time needed for the ball to hit the ground.

1 answer

Kinematic equation:
d = vi*t + (1/2)a*t^2

Since the 3.6 m/s is entirely horizontal, it has no effect on the vertical motion and can be disregarded. Hence the initial vertical velocity is 0.

1.2 m = 0 + (1/2)(9.81 m/s^2) * t^2
1.2 m / ((1/2)(9.81 m/s^2)) = t^2
(1.2 m / ((1/2)(9.81 m/s^2)))^(1/2) = t

t = 0.49 s
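As a quick sanity check, the same calculation can be done in a few lines of Python (the function name `fall_time` is just an illustrative helper, not from the problem):

```python
import math

def fall_time(h, g=9.81):
    """Time for an object to fall height h from rest,
    assuming constant downward acceleration g.
    From d = (1/2) g t^2, solving for t gives t = sqrt(2 d / g)."""
    return math.sqrt(2 * h / g)

t = fall_time(1.2)   # table height of 1.2 m
print(round(t, 2))   # 0.49
```

This confirms the answer above: the horizontal speed never enters the calculation, because horizontal and vertical motion are independent.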