At 8 A.M., Dylan and his neighbor, Mickey, drove in their cars to a city that was 240 miles away from their neighborhood. When Dylan reached the city, Mickey still had 40 miles to go. Mickey finally completed the trip 48 minutes later. (a) How long did it take Mickey to drive to the city? (b) What was Dylan's driving speed in miles per hour? (There were 3 similar problems posted on this website, but none of them seemed to help me, so I'm looking for someone who could. Thanks! :) )

An assumption is made that each driver travels at some constant speed.

Mickey ... 40 mi in 48 min = 4/5 hr ... 40 / (4/5) = 50 mph

Dylan reached the city when Mickey had driven 240 - 40 = 200 mi ... 200 / 50 = 4 hr at 50 mph

Dylan drove 240 mi in 4 hr ... speed = distance / time = 240 / 4 = 60 mph. And Mickey's whole trip took 240 / 50 = 4.8 hr = 4 hr 48 min.
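
If it helps, here's a quick numeric check of the reasoning above, as a sketch in Python (the variable names are mine, not from the problem):

```python
total_miles = 240    # distance each driver covers
gap_miles = 40       # miles Mickey still had left when Dylan arrived
gap_hours = 48 / 60  # the 48-minute gap, in hours

mickey_speed = gap_miles / gap_hours                   # 40 / 0.8 = 50 mph
dylan_time = (total_miles - gap_miles) / mickey_speed  # 200 / 50 = 4 h
dylan_speed = total_miles / dylan_time                 # (b) 240 / 4 = 60 mph
mickey_time = total_miles / mickey_speed               # (a) 240 / 50 = 4.8 h

print(mickey_time, dylan_speed)  # 4.8 60.0
```

The key step is `dylan_time`: both drivers left at 8 A.M., so when Dylan arrived, Mickey had been driving for exactly Dylan's trip time and had covered 200 mi at 50 mph.
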
a. r = 40/(48/60) = 50 mi/h = Mickey's speed.
r * t = 240,
50t = 240,
t = 4.8 hours = Mickey's time (4 hours 48 minutes).
b. 4.8 hours = Mickey's time.
4.8 - 48/60 = 4 hours = Dylan's time.
r * t = 240,
r * 4 = 240,
r = 60 mi/h = Dylan's speed.
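
To tie the two answers together, here are parts (a) and (b) as one chain of equations (the notation is mine):

```latex
% Notation: v = speed, t = time; subscripts M (Mickey), D (Dylan).
% (a) Mickey's speed from the last 40 mi, then his total time:
v_M = \frac{40}{48/60} = 50 \text{ mph}, \qquad
t_M = \frac{240}{50} = 4.8 \text{ h} = 4 \text{ h } 48 \text{ min}
% (b) Dylan finished 48 min earlier, so:
t_D = 4.8 - \frac{48}{60} = 4 \text{ h}, \qquad
v_D = \frac{240}{4} = 60 \text{ mph}
```
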
This doesn't make sense.
Yes, it does.