When a ball is thrown in the air at 15 m/s from a height of 1 ft, how long does it take for the ball to hit the ground?

mahesh answered
11 years ago

v=u-gt, or 0=15-9.81t, since the velocity at the top of the rise is 0 and the ball is moving against gravity. Thus t=1.53 sec.

s=ut-.5gt^2=15x1.53-.5x9.81x(1.53)^2
s=22.95-11.48=11.47 m, which is the height the ball rises above the launch point. From there it falls freely through a height of 11.47+0.3=11.77 m (taking 1 ft as about 0.3 m).
s=0xt+.5x9.81xt^2, so t=sqrt(11.77x2/9.81)=1.55 secs.
Total time to hit the ground = 1.53+1.55 = 3.08 sec, approximately.
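
For anyone who wants to check the arithmetic, here is a minimal Python sketch of this two-phase approach (rise time from v = u - gt, then free-fall time from the peak); the 0.3 m value for the 1 ft launch height is an assumption, matching the conversion used in the next answer:

import math

u, g, h0 = 15.0, 9.81, 0.3                 # launch speed (m/s), gravity (m/s^2), launch height (m)
t_up = u / g                               # time to reach the top, where the velocity is 0
h_rise = u * t_up - 0.5 * g * t_up**2      # height gained above the launch point
t_down = math.sqrt(2 * (h_rise + h0) / g)  # free fall from the peak all the way to the ground
print(round(t_up, 2), round(t_down, 2), round(t_up + t_down, 2))   # 1.53 1.55 3.08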

Steve answered
11 years ago

since 1 ft ≈ .3 m,

s(t) = .3 + 15t - 4.9t^2
it hits the ground when s=0, so

.3 + 15t - 4.9t^2 = 0
t = 3.08 sec
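
A quick numerical check of this quadratic, as a minimal Python sketch (the coefficients come directly from s(t) above; the variable names are just illustrative):

import math

h0, v0, g = 0.3, 15.0, 9.8            # same numbers as in s(t) = .3 + 15t - 4.9t^2
disc = v0**2 + 2 * g * h0             # b^2 - 4ac for -4.9t^2 + 15t + 0.3 = 0
t_hit = (v0 + math.sqrt(disc)) / g    # only the positive root is physical
print(round(t_hit, 2))                # 3.08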

Explain Bot answered
11 months ago

To determine the time it takes for the ball to hit the ground, we can use the equation of motion for the height of the ball above the ground:

h(t) = h0 + v0 * t - (1/2) * g * t^2

Where:
h0 = initial height (1 ft ≈ 0.3 m)
v0 = initial velocity (15 m/s, upward)
g = acceleration due to gravity (approximately 9.8 m/s^2)
t = time

The minus sign on the gravity term is there because gravity acts downward while the ball is thrown upward.

We want to find the time at which the ball hits the ground, so we set h(t) equal to zero:

0 = 0.3 + 15t - (1/2)(9.8)t^2

Rearranging the equation, we get the standard quadratic form:

4.9t^2 - 15t - 0.3 = 0

Now, we can solve this quadratic equation for t using the quadratic formula with a = 4.9, b = -15 and c = -0.3:

t = (15 ± √((-15)^2 - 4 * 4.9 * (-0.3))) / (2 * 4.9)

Simplifying this expression:

t = (15 ± √(225 + 5.88)) / 9.8
t = (15 ± √230.88) / 9.8
t = (15 ± 15.19) / 9.8

The minus sign gives a negative time, which is not physical, so we take the positive sign before the square root:

t = (15 + 15.19) / 9.8
t = 30.19 / 9.8
t ≈ 3.08

Therefore, the time it takes for the ball to hit the ground is approximately 3.08 seconds, in agreement with the earlier answers.
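
As an independent sanity check on the closed-form answer, here is a rough step-by-step simulation sketch in Python (simple Euler integration with a 1 ms step; g = 9.8 m/s^2 and the 0.3 m launch height are carried over from the answers above):

dt = 0.001                    # time step in seconds
t, y, v = 0.0, 0.3, 15.0      # elapsed time, height above ground (m), upward velocity (m/s)
g = 9.8
while y > 0:                  # step until the ball reaches the ground
    v -= g * dt               # gravity reduces the upward velocity each step
    y += v * dt               # then the position is advanced with the updated velocity
    t += dt
print(round(t, 2))            # roughly 3.08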