10.

Radio signals travel at a rate of 3 × 10⁸ meters per second. How many seconds will it take for a radio signal to travel from a satellite to the surface of the Earth if the satellite is orbiting at a height of 7.5 × 10⁶ meters?

2.5 × 10² seconds

6 seconds

2.5 × 10⁻² seconds

4.0 × 10³ seconds


To determine how long it takes for a radio signal to travel from a satellite to the surface of the Earth, we need to use the formula:

\[ \text{Time} = \frac{\text{Distance}}{\text{Speed}} \]

Given:
- Distance (height of the satellite) = \( 7.5 \times 10^6 \) meters
- Speed of radio signals = \( 3 \times 10^8 \) meters per second

Plugging in the values, we get:

\[ \text{Time} = \frac{7.5 \times 10^6 \, \text{meters}}{3 \times 10^8 \, \text{meters/second}} \]

\[ \text{Time} = \frac{7.5}{3} \times \frac{10^6}{10^8} \, \text{seconds} \]

\[ \text{Time} = 2.5 \times 10^{-2} \, \text{seconds} \]

Therefore, the correct answer is:

\[ 2.5 \times 10^{-2} \, \text{seconds} \]
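
As a quick sanity check, the same division can be re-done numerically. The short Python snippet below is a minimal sketch that simply evaluates distance divided by speed with the values given in the problem.

```python
# Quick numerical check of time = distance / speed
distance_m = 7.5e6       # height of the satellite in meters
speed_m_per_s = 3e8      # speed of a radio signal in meters per second

time_s = distance_m / speed_m_per_s
print(f"{time_s:.1e} seconds")  # prints 2.5e-02 seconds
```

Running it prints 2.5e-02 seconds, matching the result obtained above.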