A plane flying horizontally at an altitude of 1 mi and a speed of 500 mi/h passes directly over a radar station. Find the rate at which the distance from the plane to the station is increasing when it is 2 mi away from the station.
I drew a diagram and figured out that I need to find dd/dt, the rate at which the distance d from the plane to the radar station changes over time.
I took y to be the altitude of the plane (1 mi),
and x to be the plane's distance from the radar station (2 mi).
dx/dt = 500 mi/h
I noticed I could use the Pythagorean theorem here.
d^2 = x^2 + y^2
I solved for d: d = sqrt(2^2 + 1^2) = sqrt(5)
and I differentiated the equation to get 2d(dd/dt) = 2x(dx/dt) + 2y(dy/dt).
Since y is constant, dy/dt = 0.
I evaluated it to get dd/dt = [2(2)(500)] / (2 sqrt(5)) = 1000/sqrt(5),
which is approximately 447 mi/h.
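To double-check the arithmetic, here is a quick sympy sketch of my computation (a minimal check of the numbers above; the variable names are mine):

```python
import sympy as sp

# My setup: x = 2 (how I read "2 mi away"), y = 1 (altitude, constant)
x, y = 2, 1
dxdt = 500                   # plane's speed, mi/h
d = sp.sqrt(x**2 + y**2)     # d = sqrt(5), from d^2 = x^2 + y^2

# 2d(dd/dt) = 2x(dx/dt) + 2y(dy/dt) with dy/dt = 0 gives:
dddt = x * dxdt / d          # dd/dt = 1000/sqrt(5)
print(dddt, sp.N(dddt))      # 200*sqrt(5), about 447.21
```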
My problem is that the expected answer is 250 sqrt(3), which is about 433 mi/h.
What am I doing wrong?
2 Answers

Apparently the 2 miles is the straight-line (hypotenuse) distance from the plane to the station, so d = 2, and from d^2 = x^2 + y^2 you get x = sqrt(2^2 - 1^2) = sqrt(3). Since dy/dt = 0,
2d (dd/dt) = 2x (dx/dt)
2(2) (dd/dt) = 2 sqrt(3) (500)
dd/dt = 250 sqrt(3)

I'm confused as to how you got d = 2 and x = sqrt(3)
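For a quick numeric check of that reading, here is a small sympy sketch (my own, under the assumption that the given 2 mi is the hypotenuse d):

```python
import sympy as sp

# Reading "2 mi away" as the hypotenuse: d = 2, y = 1 (altitude)
d, y = 2, 1
x = sp.sqrt(d**2 - y**2)     # x = sqrt(3), from d^2 = x^2 + y^2
dxdt = 500                   # plane's speed, mi/h

# 2d(dd/dt) = 2x(dx/dt) with dy/dt = 0 gives:
dddt = x * dxdt / d          # dd/dt = 250*sqrt(3)
print(dddt, sp.N(dddt))      # 250*sqrt(3), about 433.01
```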