Two jets left an airport at the same time. One traveled east at 300 miles per hour; the other traveled south at 400 miles per hour. How far apart were the jets at the end of the hour?
To solve this problem, we can use the Pythagorean theorem.
In one hour, the eastbound jet travels 300 miles and the southbound jet travels 400 miles (distance = speed × time). Because the jets fly at right angles to each other, these two distances are the legs of a right triangle, and the distance between the jets is the hypotenuse.
According to the Pythagorean theorem, the square of the hypotenuse (the distance d between the two jets) is equal to the sum of the squares of the other two sides.
So, the equation becomes:
d^2 = (300^2) + (400^2)
Simplifying this equation:
d^2 = 90000 + 160000
d^2 = 250000
Taking the square root of both sides:
d = √250000
d = 500 miles
Thus, the distance between the two jets at the end of the hour is exactly 500 miles.
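If you want to double-check the arithmetic, here is a minimal Python sketch (the variable names are just illustrative, not part of the original problem):

import math

# Distances flown in one hour (speed in mph multiplied by 1 hour)
east_distance = 300 * 1
south_distance = 400 * 1

# The paths are perpendicular, so the separation is the hypotenuse
# of a right triangle with these two distances as its legs.
separation = math.hypot(east_distance, south_distance)
print(separation)  # prints 500.0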