8.

Radio signals travel at a speed of 3 × 10^8 meters per second. How many seconds will it take for a radio signal to travel from a satellite to the surface of the Earth if the satellite is orbiting at a height of 3.6 × 10^7 meters?

1.08 × 10^16 seconds

10.8 × 10^15 seconds

8.3 seconds

1.2 × 10^-1 seconds


To determine how long it takes for a radio signal to travel from a satellite to the surface of the Earth, we can use the formula:

\[ \text{time} = \frac{\text{distance}}{\text{speed}} \]

Here, the distance is the height of the satellite's orbit (3.6 × 10^7 meters) and the speed is the speed of radio signals (3 × 10^8 meters per second).

Plugging in the values:

\[ \text{time} = \frac{3.6 \times 10^7 \text{ meters}}{3 \times 10^8 \text{ meters per second}} \]

\[ \text{time} = \frac{3.6}{3} \times \frac{10^7}{10^8} = 1.2 \times 10^{7-8} \]

\[ \text{time} = 1.2 \times 10^{-1} \text{ seconds} \]
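
As a quick sanity check, here is a minimal Python snippet that performs the same division using the two values given in the problem (satellite altitude and radio signal speed):

```python
# Values taken directly from the problem statement
distance_m = 3.6e7    # height of the satellite's orbit, in meters
speed_m_per_s = 3e8   # speed of a radio signal (speed of light), in m/s

# time = distance / speed
time_s = distance_m / speed_m_per_s
print(f"{time_s:.1e} seconds")  # prints 1.2e-01 seconds
```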

Therefore, it will take **1.2 × 10^-1 seconds** for the radio signal to travel from the satellite to the surface of the Earth. The correct answer is:

1.2 × 10^-1 seconds