7.

Radio signals travel at a speed of 3 × 10⁸ meters per second. How many seconds will it take for a radio signal to travel from a satellite to the surface of the Earth if the satellite is orbiting at a height of 7.5 × 10⁶ meters?

4.0 × 10³ seconds

6 seconds

2.5 × 10² seconds

2.5 × 10⁻² seconds


To determine how many seconds it will take for a radio signal to travel from a satellite to the surface of the Earth, given that the satellite is orbiting at a height of \(7.5 \times 10^6\) meters and the speed of radio signals is \(3 \times 10^8\) meters per second, you can use the formula:

\[ \text{Time} = \frac{\text{Distance}}{\text{Speed}} \]

Plug in the given values:

\[ \text{Time} = \frac{7.5 \times 10^6 \text{ meters}}{3 \times 10^8 \text{ meters/second}} \]

Now, perform the division:

\[ \text{Time} = \frac{7.5}{3} \times \frac{10^6}{10^8} \]

Simplify each factor: \(\frac{7.5}{3} = 2.5\) and \(\frac{10^6}{10^8} = 10^{6-8} = 10^{-2}\), so

\[ \text{Time} = 2.5 \times 10^{-2} \text{ seconds} \]

Therefore, the time it takes the radio signal to travel from the satellite to the surface of the Earth is

\[ 2.5 \times 10^{-2} \text{ seconds} \]

so the correct answer choice is 2.5 × 10⁻² seconds.
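
As a quick sanity check, the same arithmetic can be done in a couple of lines of Python (a minimal sketch; the variable names are my own):

```python
# Verify the travel-time calculation: time = distance / speed
speed = 3e8        # speed of a radio signal, in meters per second
distance = 7.5e6   # height of the satellite above Earth's surface, in meters

time = distance / speed
print(f"{time:.1e} seconds")  # prints 2.5e-02 seconds
```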