To solve this problem, we can use right-triangle trigonometry.
First, set up the triangle. The plane's altitude of 3,760 feet is the vertical leg, the ground distance to the runway is the horizontal leg, and the flight path to the touchdown point is the hypotenuse. Because the 20° descent angle is measured between the flight path and the horizontal, the altitude is the side opposite the 20° angle.
Next, we calculate the horizontal distance the plane will travel during this descent. The altitude is opposite the 20° angle and the horizontal leg is adjacent to it, so:
Horizontal distance = 3,760 ft / tan(20°)
Horizontal distance = 3,760 ft / 0.3640
Horizontal distance ≈ 10,330 ft
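As a quick numerical check of this step, here is a minimal Python sketch (the variable names are mine, not part of the problem):

    import math

    altitude_ft = 3760   # vertical leg, opposite the 20° angle
    descent_deg = 20     # descent angle below the horizontal

    # adjacent leg = opposite / tan(angle)
    horizontal_ft = altitude_ft / math.tan(math.radians(descent_deg))
    print(round(horizontal_ft))  # 10330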
To find the distance from the runway to the plane at the start of the approach (the hypotenuse), we can use the Pythagorean theorem:
Distance = √(Horizontal distance^2 + Altitude^2)
Distance = √(10,330^2 + 3,760^2)
Distance = √(106,708,900 + 14,137,600)
Distance = √120,846,500
Distance ≈ 10,993 ft
Equivalently, since the altitude is opposite the 20° angle, Distance = 3,760 ft / sin(20°) ≈ 3,760 ft / 0.3420 ≈ 10,994 ft, which agrees to within rounding.
Converting this distance to miles, we have:
10,993 ft ÷ 5,280 ft/mile ≈ 2.08 miles
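The full computation, including the mile conversion, can be verified with the same kind of sketch (math.hypot gives the hypotenuse directly, and the sine shortcut should match it):

    import math

    altitude_ft = 3760
    descent_deg = 20
    horizontal_ft = altitude_ft / math.tan(math.radians(descent_deg))

    # hypotenuse two ways: Pythagorean theorem vs. altitude / sin(angle)
    slant_pyth_ft = math.hypot(horizontal_ft, altitude_ft)
    slant_sine_ft = altitude_ft / math.sin(math.radians(descent_deg))
    print(round(slant_pyth_ft, 1), round(slant_sine_ft, 1))  # 10993.6 10993.6

    print(round(slant_pyth_ft / 5280, 2))  # 2.08 miles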
Therefore, the airplane is approximately 2.1 miles (about 10,993 ft) from the runway at the start of the approach.