A rescue plane drops a package of emergency supplies to a stranded explorer. The plane is travelling at 40 m/s at a height of 100 m above the ground. Where does the package hit the ground relative to the point it was released?

We can solve this with kinematic equations by treating the vertical and horizontal motions independently. Since the package is simply released, it starts with zero vertical velocity:
Initial vertical velocity of the package (u) = 0 m/s
Horizontal velocity of the package (v) = 40 m/s (the speed of the plane)
Acceleration due to gravity (a) = 9.81 m/s^2 (assuming no air resistance)
Height of the package above the ground (h) = 100 m
Using the equation of motion h = ut + 1/2(at^2) for the vertical drop, we can find the time taken for the package to hit the ground:
100 = (0)t + 1/2(9.81)t^2
Simplifying,
4.905t^2 = 100
t^2 = 20.39
t = 4.52 seconds (rounded to 2 decimal places)
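As a quick numerical check of this step, here is a minimal Python sketch (the variable names g, h, t are illustrative, not from the original text):

```python
import math

g = 9.81   # acceleration due to gravity, m/s^2
h = 100.0  # drop height, m

# Solve h = (1/2) g t^2 for t (initial vertical velocity is zero)
t = math.sqrt(2 * h / g)
print(f"fall time: {t:.2f} s")  # -> fall time: 4.52 s
```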
Now we treat the horizontal motion. There is no horizontal acceleration, so the same equation of motion, s = vt + 1/2(at^2) with a = 0, reduces to s = vt. Using the unrounded time t = 4.515 s:
s = 40 x 4.515
s ≈ 180.6 meters
Therefore, the package lands about 180.6 meters from the point of release, measured horizontally in the direction the plane was flying.
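Putting the two steps together, a short self-contained Python sketch (same illustrative variable names as above, air resistance ignored) reproduces the full answer:

```python
import math

g = 9.81   # acceleration due to gravity, m/s^2
h = 100.0  # release height above the ground, m
v = 40.0   # horizontal speed of the plane (and the package), m/s

t = math.sqrt(2 * h / g)  # fall time, from the vertical motion
s = v * t                 # horizontal distance, constant-velocity motion

print(f"t = {t:.2f} s, s = {s:.1f} m")  # -> t = 4.52 s, s = 180.6 m
```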