Asked by Leapordclaw
A radio signal travels at 3.00 x 10^8 meters per second. How many seconds will it take for a radio signal to travel from a satellite to the surface of Earth if the satellite is orbiting at a height of 3.54 x 10^7 meters? Show your work.
Answers
Answered by
oobleck
time = distance/speed
so, just plug in your numbers (whatever the heck they are -- can't read 'em)
Answered by
Reiny
Why are you typing in such a strange way?
Is this what you mean?
A radio signal travels at 3.00 x 10^8 meters per second. How many seconds will it take for a radio signal to travel from a satellite to the surface of Earth if the satellite is orbiting at a height of 3.54 x 10^7 meters? Show your work.
time = distance/rate = (3.54 x 10^7)/(3.00 x 10^8) seconds
= 1.18 x 10^-1
= 0.118 seconds
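For anyone who wants to double-check the arithmetic, here is a minimal Python sketch of the same time = distance/rate calculation (the variable names are just illustrative, not part of the original problem):

# speed of the radio signal, in meters per second
speed = 3.00e8
# height of the satellite above Earth's surface, in meters
height = 3.54e7

# time = distance / rate
time = height / speed
print(time)  # prints 0.118 (seconds)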
Answered by
Lazzy
Thank you Reiny. Your explanation really helped me understand the question.
Answered by
TYTYM
thank you Reiny
Answered by
Pirates
2022 anyone?
Answered by
burger
this is my last question of 6th grade lol
Answered by
Bella
Thanks Reiny, here in May 2022