To find the distance the shortstop needs to throw the ball, we can use the Pythagorean Theorem, which states that a^2 + b^2 = c^2, where a and b are the legs of a right triangle and c is the hypotenuse.
In this case, the bases of a baseball diamond form a square with 90-foot sides, so first base is 90 feet from second base. The shortstop stands on the baseline between second and third base, 30 feet from second base, so the two legs of the right triangle are 30 feet and 90 feet, and the distance from the shortstop to first base is the hypotenuse, c.
Using the Pythagorean Theorem:
30^2 + 90^2 = c^2
900 + 8100 = c^2
c^2 = 9000
c = √9000
c ≈ 94.9 feet
Therefore, the shortstop needs to throw the ball approximately 94.9 feet to reach first base.
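If you want to double-check the arithmetic, here is a quick sanity check in Python (a minimal sketch using only the standard library; the variable name is just for illustration):

import math

# Right-triangle legs: 30 ft (shortstop to second base)
# and 90 ft (second base to first base).
throw_distance = math.hypot(30, 90)
print(f"{throw_distance:.1f} feet")  # prints: 94.9 feet

math.hypot computes √(30^2 + 90^2) directly, which matches the hand calculation above.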