Asked by A.S.
Suppose a scientist on Earth throws a baseball upward. The scientist lets go of the ball 2 meters above the ground with an initial velocity of 10 meters per second. How long does it take for the ball to hit the ground (h = 0)? (Use the quadratic formula to solve this problem; show all work.) Write a statement to interpret your results.
Answers
Answered by Damon
h = h0 + Vi t - (g/2) t^2, assuming g = 9.8 m/s^2
0 = 2 + 10 t - 4.9 t^2
4.9 t^2 - 10 t - 2 = 0
t = [10 +/- sqrt(10^2 + 4(4.9)(2))] / (2 * 4.9)
t = [10 +/- sqrt(139.2)] / 9.8
t = [10 +/- 11.8] / 9.8
t = 2.22 s or t = -0.18 s (the negative root is a time before the ball was thrown, so discard it)
Interpretation: the ball hits the ground about 2.22 seconds after it is released.
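If you want to double-check the arithmetic, here is a minimal Python sketch (not part of the original answer; the variable names g, h0, and vi are my own) that applies the quadratic formula to the same equation and keeps the positive root:

import math

g = 9.8    # gravitational acceleration, m/s^2
h0 = 2.0   # release height, m
vi = 10.0  # initial upward velocity, m/s

# 0 = h0 + vi*t - (g/2)*t^2  rearranged to  a*t^2 + b*t + c = 0
a = g / 2   # 4.9
b = -vi     # -10
c = -h0     # -2

disc = b**2 - 4 * a * c                     # 100 + 39.2 = 139.2
roots = [(-b + s * math.sqrt(disc)) / (2 * a) for s in (1, -1)]

t_hit = max(roots)  # the negative root is before the throw
print(f"roots: {roots[0]:.3f} s and {roots[1]:.3f} s")
print(f"ball hits the ground at t = {t_hit:.2f} s")

Running this prints roots of about 2.224 s and -0.183 s, which agrees with the hand calculation above.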