A baseball diamond is a square that is 90 feet on each side. A player threw the ball from 2nd base to home. How far did the ball travel? Round to the nearest tenth.

The distance from 2nd base to home plate is the hypotenuse of a right triangle whose two legs are sides of the square, each 90 feet long. Using the Pythagorean theorem, we can calculate the distance the ball traveled.
a^2 + b^2 = c^2
90^2 + 90^2 = c^2
8100 + 8100 = c^2
16200 = c^2
c = sqrt(16200)
c ≈ 127.3 feet
Therefore, the ball traveled approximately 127.3 feet from 2nd base to home plate.
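If you want to double-check the arithmetic, here is a minimal Python sketch (the names side and distance are just illustrative, not from the original problem):

import math

side = 90.0                        # feet, length of each side of the diamond
distance = math.hypot(side, side)  # hypotenuse: sqrt(90^2 + 90^2)
print(round(distance, 1))          # prints 127.3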