An airplane has a speed of 290 km/h and is diving at an angle of 30 degrees below the horizontal when the pilot releases a radar decoy. The horizontal distance between the release point and the point where the decoy hits the ground is 700 m. a) How long is the decoy in the air? b) How high was the release point?

My teacher gave us the answers since we ran out of time. He said a) is 10 seconds and b) is 879 m. Can you explain how I can get these answers? We have to know how to solve this type of problem for a quiz soon.

Figure out the horizontal and vertical components of the airplane's velocity. The vertical component is the decoy's initial vertical speed; since it points downward, take it as negative.

The horizontal component is the decoy's initial horizontal speed.
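A minimal worked sketch of that step, assuming the usual conversion of 290 km/h to m/s (divide by 3.6) and rounding to three figures:

$$v_0 = \frac{290}{3.6}\ \text{m/s} \approx 80.6\ \text{m/s}, \qquad v_x = v_0 \cos 30^\circ \approx 69.8\ \text{m/s}, \qquad v_y = -v_0 \sin 30^\circ \approx -40.3\ \text{m/s}$$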

$$x = v_x\, t$$

Solve this for the time t (here x is the 700 m horizontal distance).
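With the numbers above, the arithmetic sketch is:

$$t = \frac{x}{v_x} = \frac{700\ \text{m}}{69.8\ \text{m/s}} \approx 10.0\ \text{s}$$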

Now, having that time,
$$\Delta y = v_y\, t - \tfrac{1}{2}\, g\, t^2$$

Evaluate this for the vertical displacement; it comes out negative, and its magnitude is the height of the release point above the ground.
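Plugging in the values from above, and assuming g ≈ 9.8 m/s² (keep the unrounded t ≈ 10.03 s for the arithmetic):

$$\Delta y \approx (-40.3\ \text{m/s})(10.03\ \text{s}) - \tfrac{1}{2}(9.8\ \text{m/s}^2)(10.03\ \text{s})^2 \approx -897\ \text{m}$$

So the release point was roughly 897 m above the ground (about 893 m if you round t to exactly 10 s). That agrees with the 10 s for part (a); if you can't reproduce the quoted 879 m, it may simply be a transposition of 897 m, so check it against your teacher's worked solution.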