A radio signal travels at 3.00⋅10^8 meters per second. How many seconds will it take for a radio signal to travel from a satellite to the surface of Earth if the satellite is orbiting at a height of 3.54⋅10^7 meters? Show your work.

2 answers

t = 3.54⋅10^7 m ÷ (3.00⋅10^8 m/s)
= (3.54/3.00) ⋅ (10^7/10^8)
= 1.18 ⋅ 10^-1
= 0.118 seconds
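
If you want to double-check the arithmetic, here is a minimal Python sketch; the variable names distance and speed are just illustrative:

```python
# Quick numeric check of the division above.
distance = 3.54e7  # satellite altitude in meters
speed = 3.00e8     # radio signal speed in meters per second

t = distance / speed
print(t)  # 0.118 (seconds)
```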
v = s / t

t = s / v

v = velocity

s = distance

t = time

In this case:

v = 3 ⋅ 10⁸ m / s

s = 3.54 ⋅ 10⁷ m

t = s / v

t = 3.54 ⋅ 10⁷ m / (3 ⋅ 10⁸ m/s)

t = (3.54 / 3) ⋅ (10⁷ / 10⁸)

Similarly, just as 10⁸ / 10⁷ = 10,

10⁷ / 10⁸ = 1 / 10

t = 1.18 ⋅ (1 / 10)

t = 1.18 / 10 = 0.118 s
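
As a sketch of the same rearranged formula t = s / v in Python (the function name travel_time is hypothetical; it assumes SI units throughout):

```python
def travel_time(s, v):
    """Time t = s / v: distance s in meters, speed v in m/s, result in seconds."""
    return s / v

# Satellite example from the question:
print(travel_time(3.54e7, 3.00e8))  # 0.118 seconds
```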