She would have to fly faster than the speed of light; in other words, it can't be done.
I want your explanation of why I said that.
A jet pilot is making a 1000 mile trip in her military Tomcat. She flies 800 mph for one hour and decides she wants to finish the trip so that she will have flown at an average of 1000 mph. How fast should she go to finish the trip and meet her goal? (Hint: Think carefully about what is meant by "average speed".)
3 answers
Well, I subtracted 1000 - 800 = 200 miles, which was my remaining distance. She wants to finish at an average of 1000 mph, so I took the speed to be 1000 mph and plugged these values into the formula average speed = distance/time:
1000 = 200/time
time = 200/1000, and I got 1/5, which is 1/5 of an hour...
No. It said that she flew at 800 mph for one hour, so she has already used up 1 hour of flying time.
To average 1000 mph in flying 1000 miles would take exactly one hour, which she has already spent flying.
So she must do the remaining 200 miles in zero time.
That is why it is not possible.
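The arithmetic behind that answer can be sketched out step by step. This is just an illustration of the reasoning above; the variable names are my own:

```python
import math

total_distance = 1000  # miles for the whole trip
target_avg = 1000      # desired average speed, mph
leg1_speed = 800       # speed flown so far, mph
leg1_time = 1          # hours already flown

# Averaging 1000 mph over 1000 miles allows exactly 1 hour total.
total_time_allowed = total_distance / target_avg          # 1.0 hour
time_remaining = total_time_allowed - leg1_time           # 0.0 hours left
distance_remaining = total_distance - leg1_speed * leg1_time  # 200 miles left

# Covering 200 miles in zero time would require infinite speed.
required_speed = (math.inf if time_remaining == 0
                  else distance_remaining / time_remaining)
print(required_speed)  # inf
```

The key point is that the target average fixes the total time, not the remaining speed, which is why the first answer's approach of plugging 1000 mph into the remaining leg goes wrong.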