A brick is thrown vertically upward with an initial speed of 3.00 m/s from the roof of a building. If the building is 78.4 m tall, how much time passes before the brick lands on the ground? I think I've gotten partway there by calculating how long it takes the brick to come back down to the level it was thrown from, but the rest of the process (figuring out how long it takes to hit the ground from that point) is really puzzling me. Any help would be great!
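For reference, the whole flight can be set up in a single equation rather than two separate phases (a sketch only, taking up as positive, g = 9.8 m/s^2, and the ground 78.4 m below the launch point, so the net displacement is -78.4 m):

\[
y = v_0 t - \tfrac{1}{2} g t^2
\quad\Longrightarrow\quad
-78.4\ \mathrm{m} = (3.00\ \mathrm{m/s})\,t - (4.9\ \mathrm{m/s^2})\,t^2 .
\]

Solving this quadratic and keeping the positive root gives the total time in the air, with no need to split the motion at the roof level.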
1 answer
Never mind. I got something close to what I think I'm supposed to get (I calculated 3.71 s). The only thing is, when I plug my time back into the equation to check it, I end up with negative 78.4 m, which obviously doesn't equal positive 78.4 m. Any thoughts on what I might have done wrong?
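In case it helps, here is a minimal numeric check of that setup, a sketch assuming up is positive, g = 9.8 m/s^2, an initial velocity of +3.00 m/s, and a displacement of -78.4 m from roof to ground (the variable names g, v0, y are just illustrative choices):

import math

# Sign convention: up is positive, so the upward throw is +3.00 m/s and the
# ground, 78.4 m below the roof, sits at a displacement of -78.4 m.
g = 9.8       # m/s^2, magnitude of gravitational acceleration
v0 = 3.00     # m/s, initial velocity (upward)
y = -78.4     # m, displacement from roof to ground (downward)

# y = v0*t - 0.5*g*t**2  rearranges to  0.5*g*t**2 - v0*t + y = 0
a, b, c = 0.5 * g, -v0, y
t = (-b + math.sqrt(b * b - 4 * a * c)) / (2 * a)   # keep the positive root
print(f"time in the air: {t:.2f} s")                # roughly 4.3 s here

# Plugging t back in reproduces -78.4 m, not +78.4 m: the minus sign just
# means the brick ends up 78.4 m below where it started, which is correct.
print(f"displacement check: {v0 * t - 0.5 * g * t**2:.1f} m")

Under that convention, getting -78.4 m back from the check is exactly what you want; it only looks wrong if the 78.4 m was plugged in as positive somewhere else in the working.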