Asked by Cow

A radio signal travels at
3.00 ⋅ 10^8 meters per second. How many seconds will it take for a radio signal to travel from a satellite to the surface of Earth if the satellite is orbiting at a height of 3.54 ⋅ 10^7 meters? Show your work.

This is my question in Algebra, and I was wondering how to do it: do I multiply them, divide them, subtract, or add?

Answers

Answered by oobleck
just remember your units.
distance = speed * time, so

time = distance/speed
Answered by Cow
So it would be 3.00*10^8 * 3.54*10^7 to get the answer?
Answered by oobleck
excuse me? You want the time, right? They already gave you the distance

time(s) = distance (m) ÷ speed (m/s)

If you include your units, you will never be confused about what gets divided or multiplied. Your final answer must be in the correct units.

You need to
(a) read the question carefully
(b) pay attention when help is given, not grab the first thing you see.
Answered by Cow
Sorry about that:( So instead of multiplying I divide?
Answered by dude
yes bro
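For anyone finding this thread later, here is oobleck's formula worked out with the numbers from the question, as a quick Python sketch (the variable names are my own, not from the thread):

```python
# Values given in the question
speed = 3.00e8     # speed of a radio signal, in meters per second (m/s)
distance = 3.54e7  # height of the satellite's orbit, in meters (m)

# time (s) = distance (m) / speed (m/s) -- the meters cancel, leaving seconds
time = distance / speed
print(time)  # 0.118
```

So the signal takes 0.118 seconds to reach the surface. Notice the units: dividing meters by meters-per-second leaves seconds, which is exactly what the question asks for.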
