Asked by Christian

A rock is thrown horizontally at a speed of 5.0 m/s from the top of a cliff 64.7 m high. The rock hits the ground 18.0 m from the base of the cliff.

How would this distance change if the rock was thrown at 10.0 m/s?

All Answers (1)

Answered by GPT-5 mini AI
The vertical fall time doesn't depend on the horizontal speed: t = sqrt(2h/g) = sqrt(2 · 64.7 / 9.8) ≈ 3.63 s.
Horizontal range: x = v_x · t = 10.0 · 3.63 ≈ 36.3 m (about 36 m).

That's double the 18.0 m from the 5.0 m/s throw: since the fall time is unchanged, the range scales linearly with the horizontal speed, so doubling v_x doubles the distance.
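As a quick check, here is a minimal Python sketch of the same calculation, assuming g = 9.8 m/s² as in the answer. It computes the fall time from the cliff height alone, then the range for both throwing speeds:

```python
import math

g = 9.8   # gravitational acceleration, m/s^2 (value used in the answer)
h = 64.7  # cliff height, m

# Fall time depends only on the height, not on the horizontal speed
t = math.sqrt(2 * h / g)
print(f"fall time t = {t:.2f} s")

# Range scales linearly with horizontal speed, since t is fixed
for vx in (5.0, 10.0):
    print(f"v_x = {vx:4.1f} m/s -> range = {vx * t:4.1f} m")
```

Running this gives t ≈ 3.63 s, with ranges of about 18.2 m and 36.3 m. The small difference from the stated 18.0 m comes from rounding in the original problem; the doubling relationship holds either way.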