We can use the formula:
distance = speed x time
Here, the distance is the satellite's height above the Earth's surface, 3.54 x 10^7 meters, and the speed is the speed of light, 3.00 x 10^8 meters per second. Since we want the time it takes for the signal to travel from the satellite to the surface, we rearrange the formula to:
time = distance / speed
Plugging in the values, we get:
time = 3.54 x 10^7 meters / (3.00 x 10^8 meters per second)
time = 0.118 seconds
Therefore, it takes approximately 0.118 seconds for a radio signal to travel from a satellite orbiting at a height of 3.54 x 10^7 meters to the surface of the Earth.
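The arithmetic above can be checked with a short script (the variable names are illustrative, not from the original):

```python
# Travel time for a radio signal from the satellite to Earth's surface,
# using time = distance / speed.
distance_m = 3.54e7          # satellite altitude in meters
speed_of_light_m_s = 3.00e8  # speed of light in meters per second

time_s = distance_m / speed_of_light_m_s
print(f"{time_s:.3f} seconds")  # prints "0.118 seconds"
```

This confirms the one-way signal delay of roughly a tenth of a second, which is why such satellite links have a noticeable latency.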