Asked by Larry

HELP! I can't figure out the formula for this problem. A pitcher threw a baseball clocked at 90 mph. The pitcher's mound is 60.5 feet from home plate. How long did it take, in seconds, for the ball to travel from the pitcher's mound to home plate?


Answers

Answered by Reiny
time = distance/rate

90 miles/hour
= 90(5280) ft / 3600 seconds
= 132 ft/s

so time = 60.5 ft / 132 ft/s
= 0.45833... seconds

= approx. 0.46 seconds
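
The same conversion and time = distance/rate step can be written out in a short Python sketch (the variable names are just illustrative, and the numbers are the ones from the problem):

speed_mph = 90          # pitch speed, miles per hour
distance_ft = 60.5      # pitcher's mound to home plate, feet

# Convert mph to ft/s: 5280 feet per mile, 3600 seconds per hour.
speed_fps = speed_mph * 5280 / 3600    # 132 ft/s

# time = distance / rate
time_s = distance_ft / speed_fps       # 0.45833... seconds

print(round(time_s, 2))                # 0.46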
Answered by Larry
I was close! Thank you!