One of the fastest recorded pitches in major-league baseball, thrown by Billy Wagner in 2003, was clocked at 101.0 mi/h. If a pitch were thrown horizontally with this velocity, how far would the ball fall vertically by the time it reached home plate, 60.5 ft away?

I tried to convert from miles to feet, but I got huge numbers that I couldn't work with. I also tried to find the time it takes to reach the plate so that I could find the vertical distance, but to no avail. I have no idea how to start this problem. It seems really simple, though.

3 answers

Long ago I memorized a conversion factor: 60 mph = 88 ft/s.

Time to home plate = 60.5 ft / velocity.

Now put that time into

distance = (1/2) g t^2, where g is 32 ft/s^2.

I get roughly 2.7 feet.
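
(Just to illustrate those steps, here is a minimal Python sketch; the function name vertical_drop_ft and its defaults are my own, and it assumes g = 32 ft/s^2.)

# Sketch of the method above: convert mph to ft/s with the 60 mph = 88 ft/s
# factor, find the time to the plate, then apply distance = (1/2) g t^2.

def vertical_drop_ft(speed_mph, plate_distance_ft=60.5, g_fps2=32.0):
    speed_fps = speed_mph * 88.0 / 60.0      # mph -> ft/s
    t = plate_distance_ft / speed_fps        # time to reach home plate, s
    return 0.5 * g_fps2 * t ** 2             # vertical drop, ft

print(vertical_drop_ft(101.0))  # roughly 2.67 ft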
I get 2.64 feet.
You are right, Elisa. I get 2.66 ft.

101 mph = (101/60)(88) = 148.13 ft/s.

60.5/148.13 = 0.408 s.

h = 16t^2 = 16(0.408^2) = 2.66 feet.
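
(Those intermediate values are easy to check in Python, again taking g/2 = 16 ft/s^2:)

speed_fps = 101.0 * 88.0 / 60.0   # 148.13 ft/s
t = 60.5 / speed_fps              # 0.408 s
drop = 16.0 * t ** 2              # h = 16 t^2, about 2.67 ft
print(speed_fps, t, drop)

Carrying full precision gives about 2.67 ft; rounding t to 0.408 first gives the 2.66 ft above.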