True.
To determine the distance the shortstop needs to throw the ball to first base, let's visualize the situation. In a standard baseball field:
- The distance between bases (from first to second, second to third, and third to home) is 90 feet.
- The shortstop is positioned 30 feet from second base, along the baseline toward third base.
Her position, second base, and first base form a right triangle: one leg is the 90-foot baseline from second base to first base, and the other leg is the 30 feet from her position to second base. Because the baselines meet at a right angle at second base, the throw to first base is the hypotenuse.
Using the Pythagorean theorem:
\[ d = \sqrt{90^2 + 30^2} \] \[ d = \sqrt{8100 + 900} \] \[ d = \sqrt{9000} \] \[ d \approx 94.87 \text{ feet} \]
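As a quick sanity check, here is a minimal Python snippet (not part of the original solution) that evaluates the same Pythagorean computation:

```python
import math

# Legs of the right triangle:
# 90 ft from second base to first base (along the baseline),
# 30 ft from the shortstop to second base.
leg_to_first = 90.0
leg_to_shortstop = 30.0

# The throw distance is the hypotenuse.
distance = math.hypot(leg_to_first, leg_to_shortstop)
print(f"{distance:.2f} feet")  # 94.87 feet
```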
Thus, she needs to throw approximately 94.87 feet to first base, which rounds to 94.9 feet.
Therefore, the statement is True.