A brick is thrown vertically upward with an initial speed of 5.00 m/s from the roof of a building. If the building is 112.0 m tall, how much time passes before the brick lands on the ground?


Take up as positive, so a = -g = -9.8 m/s^2

v = v_initial + a t = 5.00 - 9.8 t (velocity in m/s)

h = h_initial + v_initial t + (1/2) a t^2 = 112 + 5.00 t - 4.9 t^2, and h = 0 when the brick reaches the ground

so
112 + 5.00 t - 4.9 t^2 = 0

4.9 t^2 - 5.00 t - 112 = 0

Solve the quadratic and keep the positive root:

t = [5.00 + sqrt(5.00^2 + 4(4.9)(112))] / (2 * 4.9) ≈ 5.32 s
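If you want to check the arithmetic, here is a quick Python sketch (not part of the original answer) that solves the same quadratic with the standard formula; the variable names are just illustrative.

```python
import math

# Given values from the problem (SI units)
g = 9.8      # magnitude of gravitational acceleration, m/s^2
v0 = 5.00    # initial upward speed, m/s
h0 = 112.0   # initial height above the ground, m

# Height above ground: h(t) = h0 + v0*t - 0.5*g*t^2 = 0 when the brick lands.
# Rearranged: 0.5*g*t^2 - v0*t - h0 = 0, i.e. a*t^2 + b*t + c = 0 with:
a = 0.5 * g
b = -v0
c = -h0

# Quadratic formula; only the positive root is physical.
disc = b**2 - 4 * a * c
t_pos = (-b + math.sqrt(disc)) / (2 * a)
t_neg = (-b - math.sqrt(disc)) / (2 * a)

print(f"roots: {t_pos:.3f} s and {t_neg:.3f} s")
print(f"time to land: {t_pos:.2f} s")   # about 5.32 s
```

Using g = 9.81 m/s^2 instead only changes the answer in the third digit, so t ≈ 5.32 s either way.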