A rock is thrown horizontally with a speed of 20 m/s from a vertical cliff of height 25 m. How far will it land from the base of the cliff?

To solve this problem, we first need to determine the time it takes for the rock to hit the ground after being thrown.
We can use the following kinematic equation to find the time it takes for the rock to fall from 25 m:
y = v0*t + (1/2)*a*t^2
where, taking the downward direction as positive:
y = vertical distance fallen (25 m)
v0 = initial vertical velocity (0 m/s, since the rock is thrown horizontally)
a = acceleration due to gravity (9.8 m/s^2)
t = time
25 = 0*t + (1/2)*(9.8)*t^2
25 = 4.9t^2
t^2 = 25/4.9
t = sqrt(25/4.9)
t ≈ 2.26 seconds
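As a quick numerical check, here is a minimal Python sketch of the fall-time calculation (the variable names are just illustrative):

```python
import math

# Vertical motion only: starting from rest, with downward taken as positive,
# h = (1/2) * g * t^2, so t = sqrt(2 * h / g).
h = 25.0   # height of the cliff in metres
g = 9.8    # acceleration due to gravity in m/s^2

t = math.sqrt(2 * h / g)
print(f"fall time t ≈ {t:.2f} s")   # prints approximately 2.26 s
```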
Now that we know the fall time, we can use the horizontal component of the rock's motion to find how far it lands from the base of the cliff.
The rock is thrown horizontally at 20 m/s and there is no horizontal acceleration, so the horizontal distance it travels is:
x = v*t
where:
x = horizontal distance
v = horizontal velocity (20 m/s)
t = time (2.26 seconds)
x = 20*2.26
x ≈ 45.2 meters
Therefore, the rock will land approximately 45.2 meters from the base of the cliff.
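Putting both components together, a small self-contained Python sketch of the whole calculation might look like this (again, the names are illustrative):

```python
import math

h = 25.0   # cliff height in metres
g = 9.8    # acceleration due to gravity in m/s^2
v = 20.0   # horizontal launch speed in m/s

t = math.sqrt(2 * h / g)   # time to fall 25 m from rest
x = v * t                  # horizontal distance covered in that time

print(f"t ≈ {t:.2f} s, x ≈ {x:.1f} m")   # roughly t ≈ 2.26 s, x ≈ 45.2 m
```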