Asked by Kaai97
Radio signals travel at a rate of 3*10^8 meters per second. How many seconds would it take for a radio signal to travel from a satellite to the surface of Earth if the satellite is orbiting at a height of 4.2*10^7 meters? (Hint: Time is distance divided by speed.)
a. 1.26*10^16 seconds
b. 1.26*10^15 seconds***
c. 1.4*10^1 seconds
d. 1.4*10^-1 seconds
Answers
Answered by Steve
a and b are clearly wrong. You have multiplied instead of dividing. Read the hint.
When you divide, you subtract exponents, not add.
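A quick way to see the difference (a minimal Python sketch, not part of the original thread):

# Multiplying powers of ten adds the exponents; dividing subtracts them.
print(10**7 * 10**8)   # 10**15 -> the mistake behind choices a and b
print(10**7 / 10**8)   # 10**-1 = 0.1 -> the exponent you actually want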
Answered by Anna
The exponent would be -1, because you subtract 8 from 7 (and 4.2/3 = 1.4), so the answer is d: 1.4*10^-1 seconds.
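Putting the whole hint into one short Python check (a sketch; the variable names are just for illustration):

distance = 4.2e7          # satellite height above the surface, in meters
speed = 3e8               # speed of a radio signal, in meters per second
print(distance / speed)   # 0.14, i.e. 1.4*10^-1 seconds -> choice d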