A ball is thrown upward with a speed of 40 feet per second from the edge of a cliff 500 feet above the ground. What is the speed of the ball when it hits the ground? Use acceleration due to gravity as –32 feet per second squared and approximate your answer to 3 decimal places.

Answer:

v₀ = 40 ft / s

h₀ = 500 ft

a = - 32 ft / s²

a = dv(t) / dt

dv(t) = a ∙ dt

v(t) = ∫ a ∙ dt

v(t) = ∫ - 32 ∙ dt

v(t) = - ∫ 32 ∙ dt

v(t) = - 32 t + v₀

v(t) = - 32 t + 40
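
To double-check that antiderivative, here is a minimal sketch assuming SymPy is available; the names t, a, v0 are just local choices for this check, not part of the problem statement:

```python
# Sketch: verify v(t) by integrating the constant acceleration symbolically.
import sympy as sp

t = sp.symbols('t')
a = -32          # ft/s^2, constant acceleration due to gravity
v0 = 40          # ft/s, initial (upward) velocity

# v(t) = integral of a dt, with the constant of integration fixed by v(0) = v0
v = sp.integrate(a, t) + v0
print(v)         # v(t) = -32*t + 40 (term order may vary in the printout)
```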

v(t) = dh(t) / dt

dh(t) = v(t) ∙ dt

h(t) = ∫ v(t) ∙ dt

h(t) = ∫ ( - 32 t + 40) dt

h(t) = - ∫ 32 t dt + ∫ 40 dt

h(t) = - 32 ∫ t dt + ∫ 40 dt

h(t) = - 32 ∙ t² / 2 + 40 t + h₀

h(t) = - 16 t² + 40 t + h₀

h(t) = - 16 t² + 40 t + 500
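
The same kind of symbolic check works for the height function, again assuming SymPy; v is the velocity found above and h0 is the cliff height:

```python
# Sketch: integrate v(t) to recover the height function,
# fixing the constant of integration with h(0) = h0 = 500 ft.
import sympy as sp

t = sp.symbols('t')
v = -32*t + 40   # velocity from the previous step
h0 = 500         # ft, initial height at the edge of the cliff

h = sp.integrate(v, t) + h0
print(h)         # -16*t**2 + 40*t + 500
```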

The ball hits the ground when its height is 0:

- 16 t² + 40 t + 500 = 0

The solutions are:

t = ( 5 / 4 ) ( 1 - √ 21 ) ≈ - 4.47822 s

and

t = ( 5 / 4 ) ( 1 + √ 21 ) ≈ 6.97822 s

Time can't be negative, so t ≈ 6.97822 s.
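
If you prefer to solve the quadratic numerically, a short sketch with the standard quadratic formula (assuming plain Python with the math module) reproduces both roots:

```python
# Solve -16 t^2 + 40 t + 500 = 0 for the impact time; only the positive
# root is physically meaningful.
import math

a, b, c = -16.0, 40.0, 500.0
disc = b*b - 4*a*c                      # discriminant = 33600
roots = [(-b + s*math.sqrt(disc)) / (2*a) for s in (+1, -1)]
print(roots)                            # approx [-4.47822, 6.97822]

t_impact = max(roots)                   # keep the positive root
print(round(t_impact, 5))               # 6.97822
```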

v = | v(t) | = | - 32 t + 40 | =

| - 32 ∙ 6.978220 + 40 | =

| - 183.303040 | = 183.303040 ft / s

v ≈ 183.303 ft / s

rounded to 3 decimal places.
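
As a final sanity check, here is a small sketch (assuming plain Python) that evaluates |v(t)| at the impact time and compares it with the time-free kinematic relation v² = v₀² + 2 g h₀, which applies here because the ball ends up 500 ft below its launch point:

```python
# Cross-check the final speed two ways.
import math

t_impact = (5 / 4) * (1 + math.sqrt(21))        # positive root found above
speed = abs(-32 * t_impact + 40)                # |v(t)| at impact
print(round(speed, 3))                          # 183.303

# v_final^2 = v0^2 + 2 * g * h0, with g = 32 ft/s^2 and a 500 ft drop
speed_check = math.sqrt(40**2 + 2 * 32 * 500)   # sqrt(33600) = 40*sqrt(21)
print(round(speed_check, 3))                    # 183.303
```

Algebraically, - 32 ∙ ( 5 / 4 ) ( 1 + √ 21 ) + 40 = - 40 √ 21, so the exact speed is 40 √ 21 ft / s ≈ 183.303 ft / s.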