1.2 × 10⁻¹ seconds

It would take 1.2 × 10⁻¹ seconds for a radio signal to travel from a satellite to the surface of Earth if the satellite is orbiting at a height of 3.6 × 10⁷ meters.
Radio signals travel at a rate of 3 × 10⁸ meters per second. How many seconds would it take for a radio signal to travel from a satellite to the surface of Earth if the satellite is orbiting at a height of 3.6 × 10⁷ meters? (Hint: Time is distance divided by rate.)
(1 point)
Responses
8.3 seconds
1.2 × 10⁻¹ seconds
1.08 × 10¹⁶ seconds
10.8 × 10¹⁵ seconds
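The correct choice follows directly from the hint, time = distance ÷ rate. A quick sketch of the arithmetic (values taken from the question):

```python
# Time for the radio signal to reach Earth's surface: t = distance / rate
distance = 3.6e7  # satellite altitude in meters (3.6 × 10⁷ m)
rate = 3e8        # speed of a radio signal in meters per second (3 × 10⁸ m/s)

t = distance / rate
print(t)  # 0.12, i.e. 1.2 × 10⁻¹ seconds
```

Note how the distractors arise: multiplying instead of dividing gives 10.8 × 10¹⁵ = 1.08 × 10¹⁶, and dividing rate by distance gives roughly 8.3.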