An airplane lands with a velocity of 100 m/s. The maximum rate at which it can slow down is 5.0 m/s^2.

1. What is the minimum time needed to come to rest, starting from when it first touches the ground?

2. Can this plane safely land at a small airport where the landing strip is 0.80 km long?


I assume you mean Vi = 100 m/s and a = -5 m/s^2.

v = Vi + a t
0 = 100 - 5 t
t = 20 seconds to go from 100 m/s to 0 at -5 m/s^2
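
As a quick numeric check of this step, here is a minimal Python sketch (Python and the variable names are my own choices, not part of the original problem):

# Time to stop: solve v = Vi + a*t for t when v = 0
Vi = 100.0          # initial speed on touchdown, m/s
a = -5.0            # maximum deceleration, m/s^2
t_stop = -Vi / a    # t = (0 - Vi) / a
print(t_stop)       # -> 20.0 seconds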

d = Vi t + (1/2) a t^2
d = 100(20) - (1/2)(5)(20)^2
= 2000 - 1000
= 1000 meters to stop, so the plane runs 200 meters beyond the end of the runway into the harbor, where it strikes a passenger ferry.
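
The distance step can be checked the same way; a minimal Python sketch under the same assumptions (variable names are mine):

# Stopping distance: d = Vi*t + (1/2)*a*t^2, with t = 20 s from above
Vi = 100.0           # initial speed, m/s
a = -5.0             # maximum deceleration, m/s^2
t_stop = 20.0        # time to stop, s
d = Vi * t_stop + 0.5 * a * t_stop**2
print(d)             # -> 1000.0 meters needed to stop
print(d - 800.0)     # -> 200.0 meters past the 0.80 km strip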
Another approach for question #2 (aside from Damon's answer) is to find the plane's velocity at the end of the given 800-meter landing strip: if that final velocity is positive rather than zero, the plane cannot safely land at that small airport. So,

Vi = 100 m/s
a = -5 m/s^2
x = 800 meters

Vf^2=Vi^2+2ax
Vf^2=(100)^2+2(-5)(800)
Vf^2=10000-8000=2000
Vf=44.72 m/s

This means that if the plane were to land on the 800-meter strip, it would still be moving at 44.72 m/s at the end of the strip (not stopped), so it cannot safely land at the given landing strip/airport.
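
For completeness, the same check can be done numerically; a minimal Python sketch (math.sqrt is from the standard library; the names are my own):

import math

# Speed at the end of the 800 m strip: Vf = sqrt(Vi^2 + 2*a*x)
Vi = 100.0
a = -5.0
x = 800.0
Vf = math.sqrt(Vi**2 + 2 * a * x)
print(Vf)   # -> about 44.72 m/s: still moving, so the landing is unsafe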