A baseball is hit with a speed of 27.0 m/s at an angle of 42.0° above the horizontal. It lands on the flat roof of a nearby building that is 10.0 m tall.

If the ball was hit when it was 1.0 m above the ground, what horizontal distance does it travel before it lands on the building?

- - - - - - - - - - - - - - - - - - - - - - - - - - -

I tried to get the time it took the ball to land on the roof.

y = y_0 + v_0y*t - 0.5*g*t^2
10.0 = 1.0 + 0 - 0.5(9.8)t^2

I got t to be 1.35 s.

And I plugged that into this equation: x = v_0 * cos(alpha) * t

I got x to be 27.193 m.

They want the answer to two significant figures, so 2.7*10^1 m.

This is actually wrong. I'm not sure where I screwed up in my work.
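(As a sanity check on the arithmetic above — assuming g = 9.8 m/s^2 — here is the same computation in Python; it reproduces the numbers, so the problem is in the setup rather than the calculation:)

    import math

    g = 9.8                                   # m/s^2
    v0, alpha = 27.0, math.radians(42.0)

    # Reproduces the attempt above, which drops the vertical launch speed:
    t = math.sqrt((10.0 - 1.0) / (0.5 * g))   # solves 9.0 = 0.5*g*t^2
    x = v0 * math.cos(alpha) * t
    print(t, x)                               # ~1.36 s, ~27.2 m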

1 answer

Vo = 27 m/s at 42°.
Vo(h) = 27 cos42° = 20.1 m/s.
Vo(v) = 27 sin42° = 18.1 m/s.
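In Python (a minimal sketch; math.cos and math.sin take radians, hence the conversion):

    import math

    v0 = 27.0                   # launch speed, m/s
    a = math.radians(42.0)      # launch angle
    vh = v0 * math.cos(a)       # horizontal component, ~20.1 m/s
    vv = v0 * math.sin(a)       # vertical component, ~18.1 m/s
    print(vh, vv)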

Vf = Vv + gt (taking g = -9.8 m/s^2 on the way up),
t(up) = (Vf - Vv) / g =
(0 - 18.1) / (-9.8) = 1.85 s.

d(up) = Vv*t + 0.5at^2,
d(up) = 18.1(1.85) + 0.5(-9.8)(1.85)^2,
d(up) = 33.49 - 16.77 = 16.72 m.
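The same two steps as a Python sketch (using the rounded Vv = 18.1 m/s from above):

    vv, g = 18.1, 9.8
    t_up = vv / g                          # vertical speed reaches 0 at apex, ~1.85 s
    d_up = vv * t_up - 0.5 * g * t_up**2   # rise above the launch height, ~16.7 m
    print(t_up, d_up)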

The apex is 16.72 m above the launch point, which was 1.0 m above the ground, so the drop from the apex down to the 10 m roof is: d(dn) = (16.72 + 1.0) - 10 = 7.7 m.

d(dn) = Vv*t + 0.5gt^2 = 7.7 m, with Vv = 0 at the apex:
0 + 0.5(9.8)t^2 = 7.7,
4.9t^2 = 7.7,
t^2 = 1.57,
t(dn) = 1.25 s.
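Or in Python, from 0.5*g*t^2 = d(dn) with zero vertical speed at the apex:

    import math

    g = 9.8
    d_dn = (16.72 + 1.0) - 10.0        # apex-to-roof drop, 7.7 m
    t_dn = math.sqrt(2.0 * d_dn / g)   # ~1.25 s
    print(t_dn)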

T(tot) = t(up) + t(dn) = 1.85 + 1.25 = 3.10 s.

d(h) = Vh * T = 20.1 m/s * 3.10 s = 62 m
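As a cross-check, you can skip the up/down split and solve the asker's original equation with the vertical launch speed included: 10.0 = 1.0 + Vv*t - 4.9t^2 is a quadratic in t, and the larger root gives the same 62 m. A minimal Python sketch:

    import math

    g, v0, alpha = 9.8, 27.0, math.radians(42.0)
    y0, y_roof = 1.0, 10.0
    vh = v0 * math.cos(alpha)
    vv = v0 * math.sin(alpha)

    # y_roof = y0 + vv*t - 0.5*g*t^2  =>  0.5*g*t^2 - vv*t + (y_roof - y0) = 0
    a, b, c = 0.5 * g, -vv, y_roof - y0
    t = (-b + math.sqrt(b*b - 4*a*c)) / (2*a)   # larger root: ball on the way down
    print(t, vh * t)                            # ~3.09 s, ~62 m

The smaller root is the moment the ball first passes the 10 m height on the way up; the ball lands on the roof on the way down, which is why the larger root is the one to keep.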