The time in hours it takes a satellite to complete an orbit around the earth varies directly as the radius of the orbit (from the center of the earth) and inversely as the orbital velocity. If a satellite completes an orbit 860 miles above the earth in 12 hours at a velocity of 34,000 mph, how long would it take a satellite to complete an orbit if it is at 1700 miles above the earth at a velocity of 27,000 mph? (Use 3960 miles as the radius of the earth.)
2 answers
Since T varies directly as r and inversely as V, T = kr/V, so
T2 = T1 * (r2/r1) * (V1/V2)
T2 = 12 * (1700+3960)/(860+3960) * (34000/27000) ≈ 17.74 hours
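As a quick sanity check on that arithmetic, here is a minimal Python sketch of the same proportion (the variable names are mine):

# T varies directly as orbital radius r and inversely as velocity V: T = k*r/V,
# so T2 = T1 * (r2/r1) * (V1/V2).
R_EARTH = 3960.0                       # radius of the earth, miles
r1 = 860.0 + R_EARTH                   # orbital radius of first satellite, miles
r2 = 1700.0 + R_EARTH                  # orbital radius of second satellite, miles
T1, V1, V2 = 12.0, 34000.0, 27000.0    # hours, mph, mph
T2 = T1 * (r2 / r1) * (V1 / V2)
print(f"T2 = {T2:.2f} hours")          # -> T2 = 17.74 hours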
Note that the problem's numbers are not physically realistic. The actual velocity of a satellite in a circular orbit 860 miles above the earth is about 23,518 fps, and its period is nowhere near 12 hours.
The period derives from
T = 2π sqrt(r^3/µ) = 113.3 minutes.
The actual velocity of a satellite orbiting at 1700 miles altitude is 21,705 fps.
The period derives from
T = 2π sqrt(r^3/µ) = 144.18 minutes.
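For reference, a short Python sketch of both computations; the gravitational parameter µ ≈ 1.40765e16 ft^3/s^2 is the standard Earth value I'm assuming, so the results differ from the quoted figures by a few fps:

import math

MU = 1.40765e16             # Earth's gravitational parameter, ft^3/s^2 (assumed)
FT_PER_MILE = 5280.0

def circular_orbit(altitude_mi, r_earth_mi=3960.0):
    """Return (velocity in fps, period in minutes) for a circular orbit."""
    r = (altitude_mi + r_earth_mi) * FT_PER_MILE  # orbital radius, ft
    v = math.sqrt(MU / r)                         # circular velocity, fps
    T = 2 * math.pi * math.sqrt(r**3 / MU) / 60   # period, minutes
    return v, T

for alt in (860.0, 1700.0):
    v, T = circular_orbit(alt)
    print(f"{alt:6.0f} mi: v = {v:,.0f} fps, T = {T:.1f} min")
# ->    860 mi: v = 23,518 fps, T = 113.3 min
# ->   1700 mi: v = 21,703 fps, T = 144.2 min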
Alternatively,
T2/T1 = sqrt[(r2)^3/(r1)^3]
or T2/T1 = sqrt[(5660)^3/(4820)^3] = 1.272489
T2 = 113.3(1.272489) ≈ 144.2 minutes
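The same ratio in Python (reusing the radii in miles from above):

import math
T1 = 113.3                              # period at 860 mi altitude, minutes
ratio = math.sqrt(5660**3 / 4820**3)    # Kepler's third law: T2/T1 = sqrt(r2^3/r1^3)
print(f"ratio = {ratio:.6f}, T2 = {T1 * ratio:.1f} min")
# -> ratio = 1.272489, T2 = 144.2 min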