To determine Curtis's speed in miles per hour, we need to convert the given information into appropriate units.
First, we need to convert the distance from feet to miles. Since there are 5,280 feet in a mile, we can calculate:
20 feet * (1 mile / 5,280 feet) ≈ 0.00379 miles
Next, we need to determine how many of these 3-second intervals fit in one hour. Since there are 60 minutes in an hour and 60 seconds in a minute, we have:
60 minutes/hour * 60 seconds/minute = 3,600 seconds/hour
Then, we can find the number of 3-second intervals in one hour:
3,600 seconds/hour / 3 seconds/interval = 1,200 intervals/hour
Finally, we multiply the distance covered in each interval (in miles) by the number of intervals in one hour to find the speed:
0.00379 miles/interval * 1,200 intervals/hour ≈ 4.55 miles per hour
Therefore, Curtis can go approximately 4.55 miles per hour.
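The conversion above can be checked with a short Python sketch (the variable names here are just illustrative):

```python
FEET_PER_MILE = 5280
SECONDS_PER_HOUR = 60 * 60  # 3,600 seconds in an hour

distance_feet = 20   # feet covered per interval
time_seconds = 3     # seconds per interval

# Convert the interval distance from feet to miles
distance_miles = distance_feet / FEET_PER_MILE  # ≈ 0.00379 miles

# Count how many 3-second intervals fit in one hour
intervals_per_hour = SECONDS_PER_HOUR / time_seconds  # 1,200 intervals

# Multiply distance per interval by intervals per hour to get speed
speed_mph = distance_miles * intervals_per_hour
print(round(speed_mph, 2))  # prints 4.55
```

The same result follows directly from 20 ft / 3 s = 24,000 ft/hour, and 24,000 / 5,280 ≈ 4.55 mph.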
If it takes Curtis three seconds to go 20 feet, how fast can he go in miles per hour?