A pitcher throws a baseball horizontally from the mound to home plate. The ball falls 0.868 m (2.85 ft) by the time it reaches home plate 18.3 m (60 ft) away. How fast was the pitcher's pitch?

1 answer

Answering my own question: since the ball is thrown horizontally, its initial vertical velocity is zero, so the vertical drop equation d = vi*t + (1/2)*a*t^2 reduces to d = (1/2)*g*t^2. Solving for the flight time gives t = sqrt(2d/g) = sqrt(2 * 0.868 / 9.8) ≈ 0.421 s. The horizontal motion is at constant speed, so d = vxo*t gives vxo = 18.3 / 0.421 ≈ 43.5 m/s (about 97 mph).
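The two-step calculation above (find the fall time from the vertical drop, then divide the horizontal distance by that time) can be sketched in Python; the value g = 9.8 m/s^2 is an assumed constant not stated in the problem:

```python
import math

g = 9.8          # gravitational acceleration, m/s^2 (assumed)
drop = 0.868     # vertical drop, m
distance = 18.3  # horizontal distance to home plate, m

# Vertical motion: drop = (1/2) * g * t^2 (initial vertical velocity is zero)
t = math.sqrt(2 * drop / g)

# Horizontal motion is at constant speed, so v = distance / t
v = distance / t

print(f"flight time: {t:.3f} s")    # ~0.421 s
print(f"pitch speed: {v:.1f} m/s")  # ~43.5 m/s, roughly 97 mph
```

Note that the pitch speed does not depend on the ball's mass; only the drop, the distance, and g enter the calculation.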