To find the distance of the airplane from the airport, we can use the Pythagorean theorem.
First, let's find the airplane's total displacement in the north-south and east-west directions separately.
The first leg is 170 miles due west, so it contributes 170 miles to the east-west displacement and nothing to the north-south displacement.
Next, we need to resolve the second leg of 240 miles into its north-south and east-west components, since a bearing of 200°30' points partly south and partly west.
The direction is given as 200°30', which means 200 degrees and 30 minutes measured clockwise from north. To convert minutes to degrees, we divide by 60: 30 minutes is 30/60 = 0.5 degrees, so the bearing is 200.5°.
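As a quick illustration, here is a minimal Python sketch of the minutes-to-degrees conversion (the variable names are illustrative only):

```python
# Convert a bearing given in degrees and minutes to decimal degrees.
degrees, minutes = 200, 30
bearing_deg = degrees + minutes / 60  # 30' = 0.5 degrees
print(bearing_deg)  # 200.5
```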
Now we can use trigonometry to break the 240 miles into its components. For a bearing measured clockwise from north, the north-south component is given by the cosine and the east-west component by the sine:
north-south component = 240 * cos(200°30') ≈ 240 * (-0.9367) ≈ -224.8 miles
east-west component = 240 * sin(200°30') ≈ 240 * (-0.3502) ≈ -84.0 miles
Both components are negative because a bearing of 200°30' lies between 180° and 270°, so the airplane is heading south and west: about 224.8 miles south and 84.0 miles west.
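Here is a minimal Python sketch of this component calculation, assuming the standard convention that bearings are measured clockwise from north (negative results indicate south and west):

```python
import math

bearing = 200.5  # bearing in decimal degrees, clockwise from north
leg = 240.0      # length of the second leg in miles

# North-south component uses cosine, east-west component uses sine.
north_component = leg * math.cos(math.radians(bearing))  # about -224.8 (south)
east_component = leg * math.sin(math.radians(bearing))   # about -84.0 (west)
print(north_component, east_component)
```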
Now we can combine the two legs and use the Pythagorean theorem to find the distance from the airport. The total westward displacement is 170 + 84.0 = 254.0 miles, and the total southward displacement is 224.8 miles:
Distance from the airport = √(254.0^2 + 224.8^2)
Using a calculator, we find that √(254.0^2 + 224.8^2) is approximately 339.2 miles.
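As a check on the arithmetic, here is a short Python sketch that puts the whole calculation together, assuming (as stated above) that the first leg is 170 miles due west and the second leg is 240 miles on a bearing of 200°30':

```python
import math

bearing = math.radians(200 + 30 / 60)  # 200°30' as decimal degrees, then radians

# For a bearing between 180° and 270°, both sine and cosine are negative,
# so the absolute values give the westward and southward magnitudes.
west = 170 + 240 * abs(math.sin(bearing))   # 170 + 84.0 ≈ 254.0 miles west
south = 240 * abs(math.cos(bearing))        # ≈ 224.8 miles south

distance = math.hypot(west, south)
print(round(distance))  # ≈ 339
```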
Therefore, the airplane is approximately 339 miles from the airport at this time (to the nearest mile).