Thermodynamics is Easier Than I Thought

Actually, thermodynamics is hard and I don’t understand it.  But even without totally understanding thermodynamics, it turns out it’s possible to do a surprising number of useful calculations with just a couple of simple rules about entropy.

The setup is as follows: Imagine that there is some set of states of the world, called the macrostates, that we humans can distinguish.  To each of these macrostates is associated some large number of microstates, where a microstate is a complete specification of all information about all the particles in a system.

For example, given a container of gas, different macrostates would correspond to different pressures and temperatures of the gas, since we can determine those with macroscopic measurements.  Microstates would correspond to complete information about how all particles in the gas are moving.

Every macrostate has an associated quantity called its entropy, written with an $S$.  The entropy of a macrostate obeys the following rules:

1. The entropy is equal to Boltzmann’s constant, $k$, times the logarithm of the number of associated microstates: $S = k\log W$.
2. If a system is at a temperature $T$, and you heat it by adding energy $E$ to it (while keeping it at temperature $T$, by allowing it to expand, say), then its entropy increases by $E/T$.
3. The total entropy of the universe always increases.  (This is the second law of thermodynamics.)
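Rule #1 is easy to play with numerically.  Here’s a minimal Python sketch (the function name `entropy` is just for illustration):

```python
import math

k = 1.380649e-23  # Boltzmann's constant, in J/K

def entropy(W):
    """Rule #1: S = k * log(W), where W is the number of microstates."""
    return k * math.log(W)

# Doubling the number of microstates adds exactly k*log(2) of entropy,
# no matter how many microstates you started with.
print(entropy(2_000_000) - entropy(1_000_000))  # ~ 9.57e-24 J/K
```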

These rules alone let you do a surprising number of useful calculations:

Doubling the Volume of a Gas

Suppose you have a gas occupying a volume $V$ at temperature $T$ with total number of molecules $N$.  How much energy do you have to add to it to double its volume while keeping it at the same temperature?

With a doubled volume, imagine that the gas occupies two spaces of volume $V$.  Then the number of microstates of the system gets multiplied by $2^N$: for each microstate of the original gas, there are $2^N$ new microstates corresponding to the choices of which of the two spaces of volume $V$ each molecule is in.

That means the entropy must have gone up by $k \log(2^N) = k N \log 2$ by Rule #1.  By Rule #2, to achieve this entropy increase by heating, we must have added $E = T\,\Delta S = k N T \log 2$ of energy to the system.
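Plugging in numbers makes this concrete.  As a hypothetical worked example (not from the text above), take one mole of gas at room temperature:

```python
import math

k = 1.380649e-23     # Boltzmann's constant, J/K
N_A = 6.02214076e23  # Avogadro's number

# Energy needed to double the volume of one mole of gas while holding
# its temperature fixed at about 20 C:  E = k * N * T * log(2).
N = N_A
T = 293.0  # K
E = k * N * T * math.log(2)
print(f"{E / 1000:.2f} kJ")  # roughly 1.69 kJ for one mole
```

Note that $k N_A$ is just the gas constant $R \approx 8.314\ \mathrm{J/(mol\,K)}$, so this is $RT\log 2$ per mole.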

Efficient Refrigeration

Generally, when hot things and cold things touch, the hot things get cooler and the cold things get hotter.  This is because hot things that are getting cooler are losing entropy, and cold things that are getting hotter are gaining entropy, but the entropy being gained is greater than the entropy being lost.

How much energy does it take to reverse the process?  Suppose you have a refrigerator with 10 kg of food that’s currently at room temperature (say 20°C) and you want to lower it to 0°C.  Suppose the specific heat of the food is 3.5 kJ/kg°C.  That means that a total of $(3.5 \mathrm{kJ}/\mathrm{kg}^\circ\mathrm{C})\times 10\mathrm{kg}\times 20^\circ\mathrm{C} = 700\mathrm{kJ}$ can be extracted from the food as it cools.

When the food has lost energy $E$, its temperature must be:

$\displaystyle{20^\circ\mathrm{C}-\frac{E}{ (3.5 \mathrm{kJ}/\mathrm{kg}^\circ\mathrm{C})\times 10\mathrm{kg}} = 20^\circ\mathrm{C}-\frac{E}{ (35 \mathrm{kJ}/{}^\circ\mathrm{C})} = 293\mathrm{K}-\frac{E}{ (35 \mathrm{kJ}/{}^\circ\mathrm{C})}}$

That means the entropy lost by the food as it cools is

$\displaystyle{\int_0^{700\mathrm{kJ}}\frac{dE}{293\mathrm{K}-E/(35 \mathrm{kJ}/{}^\circ\mathrm{C})}}$

This integral evaluates to $(35 \mathrm{kJ}/{}^\circ\mathrm{C})\log(293/273) \approx 2.47\ \mathrm{kJ}/{}^\circ\mathrm{C}$.  Whatever process is used for refrigeration, it must obey Rule #3, and thus increase entropy somewhere else.

If you increase entropy by exhausting heat into the room, which is at 20°C = 293K, then you’ll have to exhaust at least $(2.47 \mathrm{kJ}/{}^\circ\mathrm{C})\times 293\mathrm{K} \approx 725 \mathrm{kJ}$ of energy.  You can get 700kJ of that from the food, but you still need an extra 25 kJ or so of energy, which is why refrigerators have to be plugged in and don’t work on their own.
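The whole refrigerator calculation fits in a few lines of Python.  This is a sketch of the same arithmetic, carrying the exact logarithm through instead of rounding intermediate values:

```python
import math

# Cool 10 kg of food from 20 C (293 K) to 0 C (273 K),
# with specific heat 3.5 kJ/(kg*K).
C = 3.5 * 10                   # total heat capacity of the food, kJ/K
T_hot, T_cold = 293.0, 273.0   # room and target temperatures, K

E_food = C * (T_hot - T_cold)          # heat extracted from the food: 700 kJ
S_food = C * math.log(T_hot / T_cold)  # entropy the food loses, ~2.47 kJ/K
E_exhaust = S_food * T_hot             # minimum heat dumped into the room
work = E_exhaust - E_food              # extra energy the fridge must supply

print(f"exhaust {E_exhaust:.0f} kJ, extra work {work:.0f} kJ")
# prints: exhaust 725 kJ, extra work 25 kJ
```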

Deleting a Bit

How much energy does it take to delete a bit of information in a computer?  A bit could be in state 0 or state 1, and after deleting it, it will be in state 0 (say).  That means that the process of deleting the bit maps all the microstates of the 0 macrostate and all the microstates of the 1 macrostate onto microstates of the 0 macrostate.  That halves the number of microstates, which subtracts $k\log 2$ from the entropy.

In order to obey Rule #3, you therefore have to exhaust a minimum of $kT\log 2$ energy as waste heat, where $T$ is the ambient temperature.  This is called Landauer’s principle.
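At room temperature the Landauer bound is tiny; a quick sketch:

```python
import math

k = 1.380649e-23  # Boltzmann's constant, J/K
T = 293.0         # ambient temperature, K (about 20 C)

# Landauer's principle: erasing one bit must exhaust at least
# k * T * log(2) of energy as waste heat.
E_min = k * T * math.log(2)
print(E_min)  # ~ 2.8e-21 J
```

Real chips dissipate far more energy than this per bit operation, so the Landauer limit is a theoretical floor rather than a practical constraint today.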