Question
Radio signals travel at a rate of 3*10^8 meters per second. How many seconds would it take for a radio signal to travel from a satellite to the surface of Earth if the satellite is orbiting at a height of 4.2*10^7 meters? (Hint: Time is distance divided by speed.)
a. 1.26*10^16 seconds
b. 1.26*10^15 seconds***
c. 1.4*10^1 seconds
d. 1.4*10^-1 seconds
Answers
Steve
a and b are clearly wrong. You have multiplied instead of dividing. Read the hint.
When you divide, you subtract exponents, not add.
Anna
The exponent would be -1, because you subtract 8 from 7, so the answer is D.