A ball is thrown at an angle of 45 degrees to the ground. If the ball lands 90 m away, what is the initial speed of the ball? I know you use x(t)=(v*cos(alpha))t and y(t)=(v*sin(alpha))t-(g/2)t^2, but I'm still really confused about what to do. Please show all work. Thank you.

1 answer

The question gives everything you need, since the launch angle α = 45° is specified.

Still, it is worth solving for the initial velocity in terms of a general angle α first, because by choosing different initial velocities the ball can land 90 m away at virtually any launch angle, and it turns out that α = 45° is exactly the angle that requires the MINIMUM initial velocity for the ball to land 90 m away.

The key is to recognize that, at the moment the ball lands,
x(t) = D = 90 m, and
y(t) = 0
(i.e. the ball has travelled D = 90 m horizontally and is back at ground level).

Setting y(t) = 0 in the second formula gives
0 = (v*sin(α))t - (g/2)t², or
(v*sin(α))t = (g/2)t²

Cancel one factor of t on each side (discarding the t = 0 solution, which is just the launch instant) and solve for the time of flight:
t = 2v*sin(α)/g
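
If it helps to see this formula numerically, here is a minimal Python sketch. The function name flight_time and the value g = 9.81 m/s² are my own choices for illustration, not part of the problem.

import math

g = 9.81  # gravitational acceleration in m/s^2 (assumed standard value)

def flight_time(v, alpha_deg):
    # time for the ball to return to ground level: t = 2*v*sin(alpha)/g
    alpha = math.radians(alpha_deg)
    return 2 * v * math.sin(alpha) / g

print(flight_time(30, 45))  # e.g. a 30 m/s throw at 45 degrees stays up about 4.3 s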

Substitute this into the first equation, x(t) = D, to solve for v:
D = v*cos(α)*(2v*sin(α)/g)
Using the identity
sin(2α) = 2*sin(α)*cos(α)
this becomes
D = v²*sin(2α)/g
or
v² = Dg/sin(2α)
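
As a quick numeric illustration of this formula, here is a short Python helper. The name launch_speed is my own, and it assumes D = 90 m and g = 9.81 m/s².

import math

def launch_speed(D, alpha_deg, g=9.81):
    # initial speed needed to land a distance D away at launch angle alpha:
    # v = sqrt(D*g / sin(2*alpha))
    alpha = math.radians(alpha_deg)
    return math.sqrt(D * g / math.sin(2 * alpha))

print(launch_speed(90, 30))  # about 31.9 m/s
print(launch_speed(90, 45))  # about 29.7 m/s  <- the smallest of the three
print(launch_speed(90, 60))  # about 31.9 m/s

Notice that angles on either side of 45° need a larger initial speed, which previews the minimum-velocity argument below.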

(This is the only place where calculus comes in: v² = Dg/sin(2α) is minimized when sin(2α) is maximized, and setting d/dα[sin(2α)] = 2cos(2α) = 0 gives 2α = 90°. Equivalently, just note that sin(2α) ≤ 1.)

The minimum initial velocity that can give the distance D is therefore obtained when 2α = 90°, i.e. when the ball is thrown at α = 45° to the horizontal, which is exactly the angle given in the question.

In this case sin(2α) = sin(90°) = 1, so the (minimum) initial velocity is
v = sqrt(Dg) = sqrt(90*9.81) ≈ 29.71 m/s
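
Finally, a short Python check (again assuming g = 9.81 m/s²) confirms that this speed at 45° really does carry the ball 90 m:

import math

g = 9.81
v = math.sqrt(90 * g)                  # minimum initial speed, about 29.71 m/s
alpha = math.radians(45)

t_land = 2 * v * math.sin(alpha) / g   # time of flight, about 4.28 s
x_land = v * math.cos(alpha) * t_land  # horizontal distance at landing
print(v, t_land, x_land)               # roughly 29.71, 4.28, 90.0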