What is the approximate distance from one corner to the opposite corner of a rectangle when one side is 60 yards and the other side is 100 yards?

1 answer

To find the distance from one corner to the opposite corner of a rectangle, you can use the Pythagorean theorem. The 60 yards and 100 yards are the lengths of the two sides of the rectangle, and the diagonal is the hypotenuse of the right triangle they form.

The Pythagorean theorem states:

\[ c = \sqrt{a^2 + b^2} \]

where \( c \) is the distance between the corners (the hypotenuse), and \( a \) and \( b \) are the lengths of the other two sides.

In this case:

\[ a = 60 \text{ yards}, \quad b = 100 \text{ yards} \]

Plugging the values into the formula:

\[ c = \sqrt{60^2 + 100^2} = \sqrt{3600 + 10000} = \sqrt{13600} \approx 116.62 \text{ yards} \]

So, the approximate distance from one corner to the opposite corner is about 116.62 yards.
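
If you want to double-check the arithmetic yourself, here is a minimal Python sketch (the side lengths are the values from the question; `math.hypot` computes the same square root as the formula above):

```python
import math

# Side lengths of the rectangle, in yards (values from the question)
a = 60
b = 100

# math.hypot(a, b) returns sqrt(a**2 + b**2), i.e. the diagonal
diagonal = math.hypot(a, b)

print(f"Diagonal: {diagonal:.2f} yards")  # Diagonal: 116.62 yards
```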