Asked by Valhalla

Radio signals travel at a rate of 3x10^8 meters per second. How many seconds will it take for a radio signal to travel from a satellite to the surface of Earth if the satellite is orbiting at a height of 4.2x10^7 meters? (Hint: Time is distance divided by rate.)

A. 1.26 x 10^16 seconds
B. 1.26 x 10^15 seconds
C. 1.4 x 10^1 seconds
D. 1.4 x 10^-1 seconds

Answers

Answered by Steve
time = distance/speed ... so plug in your numbers
Answered by Valhalla
1.26 x 10^16???
Answered by Steve
4.2*10^7 / 3*10^8

you multiplied!!!
think back to the 3rd grade math you evidently have forgotten.
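To see the difference, here's a quick Python sketch (purely illustrative, using the numbers from the problem) comparing what multiplying gives against what the hint's formula gives:

```python
# Numbers from the problem statement
rate = 3e8        # radio signal speed, meters per second
distance = 4.2e7  # satellite altitude, meters

multiplied = distance * rate  # the 1.26 x 10^16 attempt -- units come out as m^2/s, not seconds
divided = distance / rate     # time = distance / rate, per the hint

print(f"{multiplied:.2e}")  # 1.26e+16
print(f"{divided:.1e} s")   # 1.4e-01 s
```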
Answered by ¯\_('-')_/¯
Geez Steve, why you gotta be so rude
Answered by xxsupreme_masterxx
aight
Answered by Yeet
So what's the answer?
Answered by wat i think
like can we just get the answer? btw the answer for the question before this is 750 np
Answered by wtf
how the fRICK would the answer be 750, do you even math
Answered by wtf
wait im stupid
Answered by BlueFox
Doing the math, the answer is D: 4.2x10^7 / 3x10^8 = 1.4 x 10^-1 seconds.
Answered by Temporary
Thank you BlueFox. Nobody asked HOW to do it. We put the answer choices there for a reason, just give us the answer STEVE
Answered by Bot
The answer is D. It will take 1.4 x 10^-1 seconds (0.14 seconds) for a radio signal to travel from a satellite orbiting at a height of 4.2x10^7 meters to the surface of Earth at a rate of 3x10^8 meters per second.
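As a sanity check, here is a minimal Python version of that calculation (variable names are illustrative):

```python
altitude = 4.2e7  # meters, satellite height above the surface
speed = 3e8       # meters per second, radio signal speed

travel_time = altitude / speed  # time = distance / rate
print(travel_time)              # 0.14, i.e. 1.4 x 10^-1 seconds (choice D)
```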
