If a satellite completes an orbit 820 miles above the earth in 11 hours at a velocity of 22,000 mph, how long would it take a satellite to complete an orbit at 1400 miles above the earth at a velocity of 36,000 mph? (Use 36,000 miles as the radius of the earth.)


The answer should be 7.54 hours. What is the formula I need to get this answer?
The formula that will clear up your problem is

Vc = sqrt(µ/r)

where Vc = the velocity required to keep a body in a circular orbit, in ft/sec; r = the orbital radius in feet; and µ = the Earth's gravitational constant, 1.407974x10^16 ft^3/sec^2.
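If it helps to check numbers, here is a minimal Python sketch of that formula (the script and its function name are mine, for illustration only; the constants are the values quoted in this thread):

```python
import math

MU = 1.407974e16      # Earth's gravitational constant, ft^3/sec^2 (value quoted above)
R_EARTH_MI = 3963     # Earth's radius in miles
FT_PER_MI = 5280      # feet per mile

def circular_velocity_fps(altitude_mi):
    """Velocity (ft/sec) required to hold a circular orbit at the given altitude (miles)."""
    r_ft = (R_EARTH_MI + altitude_mi) * FT_PER_MI
    return math.sqrt(MU / r_ft)

v = circular_velocity_fps(820)
print(round(v), round(v * 3600 / FT_PER_MI))   # ~23612 fps, ~16099 mph
```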

The numbers you offer are inconsistent and unrealistic. Also, the radius of the earth is 3963 miles (20,924,640 feet), not 36,000 miles.

The actual circular velocity for an orbit at 820 miles altitude is
Vc = sqrt(1.407974x10^16/((3963+820)(5280)))
Vc = 23,612 fps, or 16,099 mph.

The orbital period is therefore
Tc = (2)(3.14)(3963+820)(5280)/23,612 sec
Tc = 6720 sec, or 112 min.

The orbital period may also be derived from
Tc = 2(Pi)sqrt(r^3/µ)
Tc = 6.28 sqrt(25,254,240^3 / 1.407974x10^16)
Tc = 6720 sec, or 1.866 hr, or 112 min.
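Continuing the sketch above (same assumed constants and function), both routes to the period agree:

```python
def period_min(altitude_mi):
    """Orbital period in minutes, from Tc = 2*pi*sqrt(r^3/mu)."""
    r_ft = (R_EARTH_MI + altitude_mi) * FT_PER_MI
    return 2 * math.pi * math.sqrt(r_ft**3 / MU) / 60

print(round(period_min(820), 1))   # ~112.0 min

# Same result from circumference / velocity:
r_ft = (R_EARTH_MI + 820) * FT_PER_MI
print(round(2 * math.pi * r_ft / circular_velocity_fps(820) / 60, 1))   # ~112.0 min
```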

For the 1400-mile altitude,
Vc = 15,203 mph and Tc = 132.98 min.
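Plugging 1400 miles into the same sketch reproduces those figures:

```python
v = circular_velocity_fps(1400)
print(round(v * 3600 / FT_PER_MI))   # ~15204 mph (the 15,203 above, to rounding)
print(round(period_min(1400), 2))    # ~132.98 min
```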

As a check, the two periods are consistent with Kepler's third law:
Tc(1400)/Tc(820) = r(1400)^(3/2)/r(820)^(3/2) = 1.187
which matches 132.98/112 = 1.187.
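The same ratio falls out of the sketch numerically:

```python
r820 = (R_EARTH_MI + 820) * FT_PER_MI
r1400 = (R_EARTH_MI + 1400) * FT_PER_MI
print(round((r1400 / r820) ** 1.5, 3))               # ~1.187
print(round(period_min(1400) / period_min(820), 3))  # ~1.187, same ratio
```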