A ball is thrown upward with an initial velocity of 35 meters per second from a cliff that is 140 meters high. The height of the ball is given by the quadratic equation h = -4.9t^2 + 35t + 140, where h is the height in meters and t is the time in seconds since the ball was thrown. Find the time at which the ball will be 60 meters above the ground. Round your answer to the nearest tenth of a second.

PS: I know the answer is 9.0 seconds because this is a test question I got wrong; I need the steps if anyone can help :)
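
For reference, here is a minimal sketch of how the stated answer can be checked numerically: set the height equation equal to 60, move everything to one side, and apply the quadratic formula. The script and its variable names are just for illustration, not part of the original problem.

```python
import math

# From -4.9t^2 + 35t + 140 = 60, subtract 60 from both sides:
# -4.9t^2 + 35t + 80 = 0, so the coefficients are:
a, b, c = -4.9, 35.0, 80.0

# Quadratic formula: t = (-b +/- sqrt(b^2 - 4ac)) / (2a)
disc = b**2 - 4*a*c
t1 = (-b + math.sqrt(disc)) / (2*a)
t2 = (-b - math.sqrt(disc)) / (2*a)

# Keep only the physically meaningful (non-negative) time
times = [t for t in (t1, t2) if t >= 0]
print([round(t, 1) for t in times])  # prints [9.0]
```

One root comes out negative (before the throw) and is discarded; the positive root is about 8.96 s, which rounds to 9.0 seconds as stated above.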