The time t required to drive a fixed distance varies inversely as the speed r. It takes 5 hours at 60 mph to drive the fixed distance. How long would it take to drive the fixed distance at 40 mph?

t = k/r, where k is a constant
Given: t = 5, r = 60
5 = k/60
k = 5 · 60 = 300
t = 300/r
When r = 40:
t = 300/40 = 7.5 hours
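As a quick sanity check, here is a minimal Python sketch of the same inverse-variation computation; the function name and parameters are illustrative, not part of the original post.

```python
# Inverse variation: t = k / r, where k is fixed by one known (time, speed) pair.
def time_at_speed(known_time, known_speed, new_speed):
    k = known_time * known_speed  # k = t * r = 5 * 60 = 300
    return k / new_speed

print(time_at_speed(5, 60, 40))  # -> 7.5 (hours)
```

Running it prints 7.5, matching the answer worked out above.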