HELP! I can't figure out the formula for this problem. A pitcher threw a baseball clocked at 90 mph. The pitcher's mound is 60.5 feet from home plate. How long did it take, in seconds, for the ball to travel from the pitcher's mound to home plate?

2 answers

time = distance/rate

90 miles/hour
= 90(5280) ft / 3600 seconds
= 132 ft/s

so time = 60.5 ft / (132 ft/s)
= 0.45833... seconds

≈ 0.46 seconds
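
If you want to double-check the arithmetic, here is a minimal Python sketch of the same two steps (the conversion constants and variable names are just illustrative, not from the original problem):

```python
# Step 1: convert 90 mph to feet per second.
# Step 2: apply time = distance / rate.
MPH = 90
FEET_PER_MILE = 5280
SECONDS_PER_HOUR = 3600
DISTANCE_FT = 60.5  # pitcher's mound to home plate

rate_fps = MPH * FEET_PER_MILE / SECONDS_PER_HOUR  # 132.0 ft/s
time_s = DISTANCE_FT / rate_fps                    # 0.45833... s

print(f"{rate_fps} ft/s, about {time_s:.2f} s")    # 132.0 ft/s, about 0.46 s
```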
I was close! Thank you!