Question: A rock is thrown horizontally off a 100 m cliff, making an angle of 50.8 degrees. It lands 95 m away. At what speed was it thrown?

To find the speed at which the rock was thrown, we can use the range equation for projectile motion:
d = (v^2 * sin(2θ)) / g
(Strictly, this equation applies when the projectile lands at the same height it was launched from, so it neglects the 100 m drop; a cross-check that accounts for the cliff is given at the end.)
Where:
- d is the horizontal distance traveled (95 m)
- v is the initial speed of the rock (the quantity we want)
- θ is the angle of the throw (50.8 degrees)
- g is the acceleration due to gravity (9.8 m/s^2)
Rearranging the equation to solve for v:
v^2 = (d * g) / sin(2θ)
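As a quick sanity check, here is a minimal Python sketch of that rearranged formula (the one trap is that math.sin works in radians, so the angle must be converted first):

```python
import math

d = 95.0                    # horizontal distance traveled, m
theta = math.radians(50.8)  # launch angle; math.sin expects radians
g = 9.8                     # acceleration due to gravity, m/s^2

# v^2 = d * g / sin(2*theta), so:
v = math.sqrt(d * g / math.sin(2 * theta))
print(f"v ≈ {v:.1f} m/s")   # prints: v ≈ 30.8 m/s
```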
Plugging in the given values:
v^2 = (95 * 9.8) / sin(2 * 50.8°)
v^2 = 931 / sin(101.6°)
Using a calculator (sin(101.6°) ≈ 0.980):
v^2 ≈ 950.4
Taking the square root of both sides:
v ≈ √950.4
v ≈ 30.8 m/s
Therefore, under the level-ground approximation, the rock was thrown at approximately 30.8 m/s.
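Because the range equation ignores the 100 m drop, it is worth cross-checking with the full kinematic equations. Assuming the rock leaves the cliff top at 50.8° above the horizontal and lands 100 m below the launch point (an assumption: the question also says "horizontally", which contradicts the stated angle), eliminating time from x = v*cos(θ)*t and y = v*sin(θ)*t − (1/2)*g*t^2 gives a sketch like:

```python
import math

# Cross-check that keeps the 100 m cliff in the picture, assuming the
# rock is launched 50.8° above the horizontal from the cliff top and
# lands 100 m below the launch point.
d = 95.0                    # horizontal distance to the landing point, m
h = 100.0                   # height of the cliff, m
theta = math.radians(50.8)  # launch angle above horizontal
g = 9.8                     # acceleration due to gravity, m/s^2

# Eliminating t from x = v*cos(theta)*t and y = v*sin(theta)*t - g*t^2/2,
# then setting y = -h at x = d, gives:
#   -h = d*tan(theta) - g*d^2 / (2*v^2*cos(theta)^2)
# Solving for v:
v = math.sqrt(g * d**2 / (2 * math.cos(theta)**2 * (d * math.tan(theta) + h)))
print(f"v ≈ {v:.1f} m/s")   # prints: v ≈ 22.6 m/s
```

Under that assumption the drop matters: with the extra fall time, an initial speed of roughly 22.6 m/s already carries the rock 95 m, so the level-ground figure of 30.8 m/s should be treated as an upper estimate.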