The rate at which an object cools is directly proportional to the difference between its temperature (in kelvins) at that time and the surrounding temperature (in kelvins). If an object is initially at 35 K, and the surrounding temperature remains constant at 10 K, it takes 5 minutes for the object to cool to 25 K. How long will it take for the object to cool to 20 K?

1 answer

You are assuming Newton's law of cooling. This leads to a differential equation for T(t) whose solution is a simple exponential decay; I will skip the derivation.

Let k be the cooling rate constant, T' the surrounding temperature (10 K in this case), and t the time in minutes.

The temperature decay equation is

T - T' = 25 e^(-kt)

where 25 K is the initial temperature difference (35 K - 10 K) relative to the surroundings.

Note that dT/dt = -k*(T - T'), as the cooling law requires, and that T(t=0) = 25 + T' = 35 K, matching the given initial temperature.
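
As a quick sanity check (my addition, not part of the original answer), here is a minimal Python sketch that confirms numerically that this T(t) satisfies both conditions; the name Tp stands in for T':

```python
import math

k, Tp = 0.102, 10.0  # rate constant (min^-1) and surrounding temperature (K)

def T(t):
    """Proposed solution: T(t) = Tp + 25*exp(-k*t)."""
    return Tp + 25.0 * math.exp(-k * t)

# dT/dt should equal -k*(T - Tp); compare a centered difference at t = 3 min
t0, h = 3.0, 1e-6
dTdt = (T(t0 + h) - T(t0 - h)) / (2 * h)
print(dTdt, -k * (T(t0) - Tp))  # the two values should agree closely
print(T(0.0))                   # 35.0 K, the given initial temperature
```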

15 = 25*e^(-5k)
-5k = ln(0.6)
k = 0.102 min^-1
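
In Python, the same arithmetic (a sketch, keeping the exact value of k rather than the rounded 0.102):

```python
import math

# Solve 15 = 25*e^(-5k)  =>  -5k = ln(0.6)
k = -math.log(0.6) / 5.0
print(f"k = {k:.4f} min^-1")  # k = 0.1022 min^-1
```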

When T = 20 K,
10 = 25*e^(-0.102 t)
0.4 = e^(-0.102 t)
-0.102 t = ln(0.4)
t = 9.0 minutes
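
And the same sketch carried through to the final step, reusing the exact k so rounding error does not accumulate:

```python
import math

k = -math.log(0.6) / 5.0       # exact rate constant from the first data point
# Solve 10 = 25*e^(-k*t)  =>  -k*t = ln(0.4)
t = -math.log(0.4) / k
print(f"t = {t:.1f} minutes")  # t = 9.0 minutes
```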