When a ball is thrown straight down from the top of a tall building with initial velocity 30 ft/sec, its distance (in feet) from the release point at time t (in seconds) is given by s(t) = 16t^2 + 30t.

If the release point is 300 ft above ground, what is the velocity of the ball at the time it hits the ground?

I think you would have to average the feet per second, but I'm not sure.
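Averaging isn't needed here: the instantaneous velocity is the derivative of the position function, v(t) = s'(t) = 32t + 30. The ball hits the ground when s(t) = 300, so one can solve 16t^2 + 30t = 300 for the positive root and evaluate v there. A minimal sketch of that calculation (variable names are my own):

```python
import math

# s(t) = 16 t^2 + 30 t  (feet fallen), so v(t) = s'(t) = 32 t + 30.
# Impact occurs when s(t) = 300, i.e. 16 t^2 + 30 t - 300 = 0.
a, b, c = 16.0, 30.0, -300.0
t_hit = (-b + math.sqrt(b * b - 4 * a * c)) / (2 * a)  # positive root of the quadratic
v_hit = 32.0 * t_hit + 30.0                            # derivative evaluated at impact

print(t_hit)  # time of impact, roughly 3.49 seconds
print(v_hit)  # impact velocity, roughly 141.8 ft/sec
```

Note that v_hit equals sqrt(30^2 + 2*32*300) = sqrt(20100), the same result the energy-style shortcut v^2 = v0^2 + 2as gives.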