Asked by Jon

A ball is thrown vertically upward with a speed of 1.53 m/s from a point 4.21 m above the ground. Calculate the time at which the ball reaches the ground.

I'm figuring I have to do something with 9.80 m/s^2, which is gravity, but I don't know what.

Answers

Answered by bobpursley
height_final = height_initial + Vi*time - (1/2)*g*time^2

Set height_final = 0 (ground level) and solve for time using the quadratic formula.
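
For anyone following along, here is a minimal Python sketch of that approach (the variable names and the print statement are my own, not part of the original answer); it sets the final height to zero at ground level and keeps only the positive root of the quadratic:

    import math

    # Given values from the question
    v_i = 1.53   # initial upward speed, m/s
    h_i = 4.21   # initial height above the ground, m
    g = 9.80     # magnitude of gravitational acceleration, m/s^2

    # 0 = h_i + v_i*t - (1/2)*g*t^2  ->  (g/2)*t^2 - v_i*t - h_i = 0
    a = g / 2.0
    b = -v_i
    c = -h_i

    # Quadratic formula; only the positive root is physically meaningful
    t = (-b + math.sqrt(b**2 - 4*a*c)) / (2*a)
    print(f"time to reach the ground: {t:.2f} s")   # roughly 1.10 s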
Answered by drwls
Calculate the time T1 it takes to rise to maximum altitude H. Velocity will be zero there. This time can be obtained from
g T1 = 1.53 m/s

Calculate the height at that time from

H = 4.21 + 1.53 T1 - (1/2) g T1^2

Then calculate the time T2 it takes to fall to the ground from that height H. Get that by solving

H = (1/2) g T2^2

The total time of flight is T1 + T2.
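
A minimal Python sketch of this two-step approach (again with my own variable names, using the 9.80 m/s^2 mentioned in the question):

    import math

    v_i = 1.53   # initial upward speed, m/s
    h_i = 4.21   # initial height above the ground, m
    g = 9.80     # magnitude of gravitational acceleration, m/s^2

    # Step 1: time to reach maximum altitude, where velocity is zero
    T1 = v_i / g

    # Step 2: height above the ground at that moment
    H = h_i + v_i*T1 - 0.5*g*T1**2

    # Step 3: time to fall from rest through the height H
    T2 = math.sqrt(2*H / g)

    print(f"T1 = {T1:.2f} s, T2 = {T2:.2f} s, total = {T1 + T2:.2f} s")

Both sketches give the same total time, which is a useful cross-check on whichever method you use.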
Answered by Jon
I'm a little confused.

On the very first part, do I multiply 9.80 * 1.53, or is the time I need 1.53?
Answered by bobpursley
You are confused. In my equation, you solve for time. It is a quadratic. DrWls gave an alternate approach, but you are still solving for time. In his approach, the total time is the sum of the time to rise to maximum altitude and the time to fall to the ground.