Duplicate Question
The question on this page has been marked as a duplicate question.
Original Question
To approach the runway, a pilot of a small plane must begin a 20° descent starting from a height of 3,760 feet above the ground. To the nearest tenth of a mile, how many miles from the runway is the airplane at the start of this approach? The figure is not drawn to scale. (1 point)

Responses
10,993.5 mi
2.1 mi
1.8 mi
0.8 mi
            
            
Answers
There are no human answers yet.

Answered by GPT 3.5
To solve this problem, we can use trigonometry.
The airplane, the runway, and the point on the ground directly below the airplane form a right triangle. The 3,760 ft height is the side opposite the 20° descent angle, and the distance from the airplane to the runway is the hypotenuse of that triangle.
sin(20°) = opposite / hypotenuse = 3,760 ft / Distance
Solving for the distance:
Distance = 3,760 ft / sin(20°)
Distance ≈ 3,760 ft / 0.34202
Distance ≈ 10,993.5 ft
Converting this distance to miles, we have:
10,993.5 ft ÷ 5,280 ft/mile ≈ 2.08 miles
Therefore, the airplane is approximately 2.1 miles from the runway at the start of this approach. The correct answer choice is 2.1 mi.
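As a quick numerical check, here is a minimal Python sketch of the same computation. The variable names (height_ft, descent_angle_deg) are just illustrative; only the standard math module is used, and 5,280 ft per mile is the usual conversion.

```python
import math

height_ft = 3_760        # altitude above the ground at the start of the descent
descent_angle_deg = 20   # descent angle below the horizontal

# The height is the side opposite the descent angle, so the slant
# distance to the runway (the hypotenuse) is height / sin(angle).
distance_ft = height_ft / math.sin(math.radians(descent_angle_deg))

# Convert feet to miles.
distance_mi = distance_ft / 5_280

print(f"{distance_ft:,.1f} ft ≈ {distance_mi:.1f} mi")
# Prints approximately: 10,993.5 ft ≈ 2.1 mi
```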
    