Radio signals travel at a rate of 5 × 10^5 meters per second. How many seconds would it take for a radio signal to travel from a satellite to the surface of Earth if the satellite is orbiting at a height of 4.2 × 10^7 meters? (Hint: Time is distance divided by speed.)

1 answer

Time = Distance / Speed
Distance = 4.2 × 10^7 meters
Speed = 5 × 10^5 meters per second
Time = (4.2 × 10^7) / (5 × 10^5)
Time = (4.2 / 5) × 10^2 seconds
Time = 84 seconds
Therefore, a radio signal would take 84 seconds to travel from a satellite orbiting at a height of 4.2 × 10^7 meters to the surface of Earth.
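For a quick sanity check, the same arithmetic can be run as a short Python sketch. The variable names are chosen here only for illustration; the values are the ones given in the problem.

```python
# Check: time = distance / speed, using the values from the problem.
distance_m = 4.2e7       # satellite altitude in meters (4.2 x 10^7 m)
speed_m_per_s = 5e5      # signal speed stated in the problem (5 x 10^5 m/s)

time_s = distance_m / speed_m_per_s
print(time_s)            # prints 84.0
```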