A dog starts chasing a cat when they are 12 meters apart. For every 7 meters that the dog runs, the cat runs 4 meters. How far does the dog have to run to catch up to the cat?

4 answers

The cat runs 4/7 as fast as the dog, so if the dog runs x meters to catch up, the cat will have run (4/7)x meters in that time.

x = 12 + (4/7)x
(3/7)x = 12
x = 28

That is, the cat runs 16 meters while the dog runs 28, which exactly closes the cat's 12-meter head start (28 = 16 + 12).
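
A minimal sketch in Python to check the algebra; the use of `fractions` and the variable names are my own illustration, not part of the problem:

```python
from fractions import Fraction

# Solve x = 12 + (4/7)x exactly: x(1 - 4/7) = 12.
head_start = 12                 # meters the cat starts ahead
speed_ratio = Fraction(4, 7)    # cat's speed as a fraction of the dog's

x = head_start / (1 - speed_ratio)  # meters the dog must run
print(x)                # 28
print(speed_ratio * x)  # 16 -- meters the cat runs in the same time
```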
I don't know what Steve said, but you can make a table model like this:
----------------------------------
| Dog | Cat | Gap (12 + cat - dog) |
----------------------------------
|   7 |   4 |   9 |
|  14 |   8 |   6 |
|  21 |  12 |   3 |
|  28 |  16 |   0 |
----------------------------------

The gap closes at the fourth step: the dog has run 28 meters and the cat 16, so 28 = 16 + 12 and the dog catches up after running 28 meters.
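
The same table as a short loop, a sketch assuming one row per equal time interval (the variable names are illustrative):

```python
# Walk the table row by row: each interval the dog covers 7 m, the cat 4 m.
# The dog catches up when its distance equals the cat's distance plus the
# 12 m head start.
dog_run, cat_run = 0, 0
while dog_run < cat_run + 12:
    dog_run += 7
    cat_run += 4
    print(dog_run, cat_run, 12 + cat_run - dog_run)
# Output ends at: 28 16 0 -- the dog catches the cat after 28 m
```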
It can't, because by the time the dog reaches the cat's position, the cat is further away; when the dog reaches that new location, the cat is further away again, and the sequence repeats. (This is Zeno's paradox: the steps repeat forever, but each one is 4/7 the length of the last, so they sum to a finite distance and the dog does catch the cat.)
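
Summing those shrinking catch-up steps, starting from the 12-meter gap, recovers the same answer:

total = 12 + 12(4/7) + 12(4/7)^2 + ...
      = 12 / (1 - 4/7)
      = 28 meters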