Asked by Anonymous

A rock is thrown from the top of a cliff. It lands 2.33 seconds later 17.3 meters from the base of the cliff.

How far did it drop vertically?

What is the speed of the rock when it lands?

Answers

Answered by Steve
Assuming it was thrown horizontally (no initial vertical speed) and taking g = 9.8 m/s^2, the vertical drop is

h = (1/2)*g*t^2 = 4.9*2.33^2 = 26.6 meters

The vertical speed at landing is vy = g*t = 9.8*2.33 = 22.83 m/s
The horizontal speed is constant: vx = 17.3/2.33 = 7.42 m/s

The landing speed is therefore √(vx^2+vy^2) = √(7.42^2+22.83^2) ≈ 24.0 m/s
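If you want to check the arithmetic, here is a minimal Python sketch of the same calculation; the variable names are mine and assume the horizontal-throw setup above.

import math

g = 9.8    # gravitational acceleration, m/s^2
t = 2.33   # time of flight, s
x = 17.3   # horizontal distance from the cliff base, m

h = 0.5 * g * t**2      # vertical drop, no initial vertical speed
vy = g * t              # vertical speed at landing
vx = x / t              # horizontal speed (constant during the flight)
v = math.hypot(vx, vy)  # landing speed

print(f"drop height   h  = {h:.2f} m")    # ~26.60 m
print(f"vertical      vy = {vy:.2f} m/s") # ~22.83 m/s
print(f"horizontal    vx = {vx:.2f} m/s") # ~7.42 m/s
print(f"landing speed v  = {v:.2f} m/s")  # ~24.0 m/s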