To solve this problem, we can break it down into two right-angled triangles:
1. In the first triangle, airport A is the starting point, and airport B is the final destination. The aircraft has traveled 500km on a bearing of 070 degrees. Let's label the distance from A to B as *AB*.
Using trigonometry, we can calculate the northward component of *AB*. Because bearings are measured clockwise from north, the northward component of a leg is its distance multiplied by the cosine of the bearing:
*AB northward component = 500km * cos(70°)*
*AB northward component = 500km * 0.3420*
*AB northward component = 171.01km*
2. In the second triangle, airport B is the starting point, and airport C is the final destination. The aircraft has traveled 800km on a bearing of 035 degrees. Let's label the distance from B to C as *BC*.
Using trigonometry, we can calculate the northward component of *BC* in the same way, using the cosine of the bearing:
*BC northward component = 800km * cos(35°)*
*BC northward component = 800km * 0.8192*
*BC northward component = 655.32km*
To find how far north airport C is from airport A, we add the northward components of *AB* and *BC*:
*Distance north from A to C = AB northward component + BC northward component*
*Distance north from A to C = 171.01km + 655.32km*
*Distance north from A to C = 826.33km*
Therefore, airport C is approximately 826.33km north of airport A.
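The two-leg calculation above can be checked with a short Python sketch (the helper name `north_component` is ours, not from the question):

```python
import math

def north_component(distance_km, bearing_deg):
    """Northward displacement of a leg flown on a compass bearing.

    Bearings are measured clockwise from north, so the northward
    component is distance * cos(bearing).
    """
    return distance_km * math.cos(math.radians(bearing_deg))

# Leg A -> B: 500 km on a bearing of 070 degrees
ab_north = north_component(500, 70)   # ~171.01 km

# Leg B -> C: 800 km on a bearing of 035 degrees
bc_north = north_component(800, 35)   # ~655.32 km

# Total northward displacement from A to C
total_north = ab_north + bc_north
print(round(total_north, 2))  # ~826.33 km
```

Using cosine (not sine) here is the key point: sine would give the eastward component instead.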
An aircraft flies 500km from an airport A on a bearing of 070 degrees to arrive at airport B. The plane then flies 800km from airport B on a bearing of 035 degrees to arrive at airport C. Calculate how far north C is from A.