Tom was playing fetch in the park with his Jack Russell terrier. On one throw, the ball landed 90 meters from the terrier, and the terrier started running toward it at 11 meters per second. At the same time, a greyhound that was 140 meters away from the ball started running toward it at a speed of 19 meters per second.

If each dog kept a constant speed, how long did it take for both dogs to be the same distance from the ball?
Simplify any fractions.

1 answer

Let's denote the time it takes for both dogs to be the same distance from the ball as \( t \) seconds.

For the Jack Russell terrier:

  • Initial distance from the ball: 90 meters
  • Speed: 11 meters per second
  • Distance from the ball after \( t \) seconds: \[ \text{Distance}_{\text{terrier}} = 90 - 11t \]

For the Greyhound:

  • Initial distance from the ball: 140 meters
  • Speed: 19 meters per second
  • Distance from the ball after \( t \) seconds: \[ \text{Distance}_{\text{greyhound}} = 140 - 19t \]
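For example, after \( t = 2 \) seconds the terrier is \( 90 - 11(2) = 68 \) meters from the ball and the greyhound is \( 140 - 19(2) = 102 \) meters from the ball, so the greyhound is still farther away at that point.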

Setting their distances equal:

We want to find when both distances from the ball are the same: \[ 90 - 11t = 140 - 19t \]

Solve for \( t \):

  1. Add \( 19t \) to both sides to collect the \( t \) terms: \[ 90 - 11t + 19t = 140 \] \[ 90 + 8t = 140 \]

  2. Subtract 90 from both sides: \[ 8t = 140 - 90 \] \[ 8t = 50 \]

  3. Divide both sides by 8: \[ t = \frac{50}{8} = \frac{25}{4} = 6.25 \text{ seconds} \]
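If you want to double-check the algebra with code, here is a minimal Python sketch (using the sympy library, which is an assumption and not part of the original problem) that solves the same equation:

```python
from sympy import Eq, solve, symbols

t = symbols('t', positive=True)

# Distance of each dog from the ball after t seconds
terrier = 90 - 11 * t     # Jack Russell terrier: starts 90 m away, runs at 11 m/s
greyhound = 140 - 19 * t  # greyhound: starts 140 m away, runs at 19 m/s

# Solve 90 - 11t = 140 - 19t for t
print(solve(Eq(terrier, greyhound), t))  # [25/4]  -> 6.25 seconds
```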

Conclusion:

It takes \( \frac{25}{4} = 6.25 \) seconds for both dogs to be the same distance from the ball.
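As a quick check, plugging \( t = 6.25 \) back into both expressions gives \( 90 - 11(6.25) = 21.25 \) meters and \( 140 - 19(6.25) = 21.25 \) meters, so both dogs are indeed the same distance from the ball at that moment.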