To find the distance between opposite corners of the field, we can use the Pythagorean theorem. The width and length of the field represent the two legs of a right triangle, where the distance between the opposite corners is the hypotenuse.
Given:
- Width (one leg) = 60 yards
- Length (the other leg) = 80 yards
Let \( a = 60 \) yards, and \( b = 80 \) yards. According to the Pythagorean theorem:
\[ c^2 = a^2 + b^2 \]
Substituting the values:
\[ c^2 = 60^2 + 80^2 \]
Calculating \( 60^2 \) and \( 80^2 \):
\[ 60^2 = 3600, \qquad 80^2 = 6400 \]
Now, adding these:
\[ c^2 = 3600 + 6400 = 10000 \]
Taking the square root to find \( c \):
\[ c = \sqrt{10000} = 100 \]
Thus, the distance between the opposite corners of the field is
\[ \boxed{100} \text{ yards.} \]
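As an optional sanity check (a minimal sketch, assuming a Python environment is available), the same result follows from `math.hypot`, which computes the hypotenuse directly from the two legs:

```python
import math

# Legs of the right triangle formed by the field's width and length (yards)
width = 60
length = 80

# math.hypot(a, b) returns sqrt(a**2 + b**2), i.e. the diagonal of the field
diagonal = math.hypot(width, length)
print(diagonal)  # 100.0
```

Equivalently, 60-80-100 is the classic 3-4-5 right triangle scaled by a factor of 20, which confirms the result without any computation.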