An arrow with an initial speed of 40 m/s is aimed at a target level with the point of projection, at a distance of 100 m. Find the least time of flight for the arrow to hit the target.
Dividing the distance by the speed (100/40 = 2.5 s) would only be correct if the arrow travelled in a straight horizontal line, but gravity makes it a projectile. For a target level with the launch point, the range equation R = v^2 sin(2θ)/g gives

sin(2θ) = Rg/v^2 = (100 × 9.8)/40^2 = 0.6125,

so 2θ ≈ 37.8° or 142.2°, i.e. θ ≈ 18.9° or 71.1°. Both angles hit the target, and the time of flight on level ground is t = 2v sin(θ)/g, which is smallest for the lower angle:

t = (2 × 40 × sin 18.9°)/9.8 ≈ 2.64 s.
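As a quick check, the range equation and both times of flight can be computed numerically (a minimal sketch, taking g = 9.8 m/s^2):

```python
import math

g = 9.8      # m/s^2, acceleration due to gravity (assumed value)
v = 40.0     # m/s, initial speed
R = 100.0    # m, horizontal distance to the level target

# Level-ground range equation: R = v^2 * sin(2*theta) / g
sin_2theta = R * g / v**2            # = 0.6125
two_theta = math.asin(sin_2theta)    # principal solution, in radians

# Two launch angles give the same range: theta and (90 deg - theta)
theta_low = two_theta / 2
theta_high = math.pi / 2 - theta_low

# Time of flight on level ground: t = 2 * v * sin(theta) / g
t_low = 2 * v * math.sin(theta_low) / g
t_high = 2 * v * math.sin(theta_high) / g

print(f"low angle:  {math.degrees(theta_low):.1f} deg, t = {t_low:.2f} s")
print(f"high angle: {math.degrees(theta_high):.1f} deg, t = {t_high:.2f} s")
print(f"least time of flight: {min(t_low, t_high):.2f} s")
```

The lower trajectory gives the least time (about 2.64 s); the higher one takes roughly three times as long to cover the same 100 m.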