Asked by Dylan
Radio signals travel at a rate of 3 * 10^8 meters per second. How many seconds would it take a radio signal to travel from a satellite to the surface of the Earth if the satellite is orbiting at a height of 3.6 * 10^7 meters?
A.) 8.3 seconds**
B.) 1.2 * 10^-1 seconds
C.) 1.08 * 10^16 seconds
Answers
Answered by PsyDAG
time = distance / speed = 3.6*10^7 / (3*10^8) = 3.6/30 = ?
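To sanity-check the arithmetic, here is a minimal Python sketch of the same time = distance / speed calculation, using the values given in the question:

# Travel time of a radio signal from the satellite to Earth's surface
distance_m = 3.6e7        # satellite altitude in meters
speed_m_per_s = 3e8       # speed of a radio signal in meters per second

time_s = distance_m / speed_m_per_s   # time = distance / speed
print(time_s)                         # prints 0.12, i.e. 1.2 * 10^-1 seconds

This works out to 1.2 * 10^-1 seconds, which matches choice B rather than the starred choice A.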