Asked by Elisa
One of the fastest recorded pitches in major-league baseball, thrown by Billy Wagner in 2003, was clocked at 101.0 mi/h. If a pitch were thrown horizontally with this velocity, how far would the ball fall vertically by the time it reached home plate, 60.5 ft away?
I tried to convert from miles to feet, but I got huge numbers that I couldn't work with. I also tried to find the time it fell so that I could find the vertical distance, but to no avail. I have no idea how to start or do this problem. It seems really simple, though.
Answers
Answered by
bobpursley
Long ago I memorized a conversion factor: 60 mph = 88 ft/sec.
time to home plate = 60.5 ft / velocity
Now put that time into
distance = (1/2) g t^2, where g is 32 ft/s^2
I get about six to seven feet.
Answered by
Elisa
I get 2.64 feet.
Answered by
tchrwill
You are right, Elisa. I get 2.66 ft.
101 mph = (101/60)(88) = 148.13 ft/s.
60.5/148.13 = 0.408 sec.
h = 16t^2 = 16(0.408^2) = 2.66 feet.
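
For anyone who wants to double-check the arithmetic, here is a minimal Python sketch of the same calculation. The variable names are mine, and g = 32 ft/s^2 is an assumption that matches the h = 16t^2 formula used above.

# A quick check of the calculation above, assuming g = 32 ft/s^2
# (the value implied by the h = 16 t^2 formula).

MPH_TO_FPS = 88.0 / 60.0      # 60 mph = 88 ft/sec, the factor quoted above

speed_mph = 101.0             # pitch speed
plate_distance_ft = 60.5      # distance to home plate
g_ft_per_s2 = 32.0            # gravitational acceleration

speed_fps = speed_mph * MPH_TO_FPS            # horizontal speed in ft/s
time_s = plate_distance_ft / speed_fps        # time to reach the plate
drop_ft = 0.5 * g_ft_per_s2 * time_s ** 2     # vertical drop in that time

print(f"speed: {speed_fps:.2f} ft/s")   # about 148.13 ft/s
print(f"time:  {time_s:.3f} s")         # about 0.408 s
print(f"drop:  {drop_ft:.2f} ft")       # about 2.67 ft

With full precision this prints about 2.67 ft; rounding the time to 0.408 s before squaring gives the 2.66 ft quoted above.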