Actually, thermodynamics is hard and I don’t understand it. But even without totally understanding thermodynamics, it turns out it’s possible to do a surprising number of useful calculations with just a couple of simple rules about entropy.
The setup is as follows: Imagine that there is some set of states of the world, called the macrostates, that we humans can distinguish. To each of these macrostates is associated some large number of microstates, where a microstate is a complete specification of all information about all the particles in a system.
For example, given a container of gas, different macrostates would correspond to different pressures and temperatures of the gas, since we can determine those with macroscopic measurements. Microstates would correspond to complete information about how all particles in the gas are moving.
Every macrostate has an associated quantity called its entropy, written with an $S$. The entropy of a macrostate obeys the following rules:
- The entropy is equal to Boltzmann’s constant, $k_B$, times the logarithm of the number of associated microstates: $S = k_B \ln W$.
- If a system is at a temperature $T$, and you heat it by adding energy $Q$ to it (while keeping it at temperature $T$, by allowing it to expand, say), then its entropy increases by $\Delta S = Q/T$.
- The total entropy of the universe always increases. (This is the second law of thermodynamics.)
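As a minimal sketch of how Rules #1 and #2 turn into numbers (the helper names below are my own, not anything standard):

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant, in J/K

def entropy_from_microstates(log_num_microstates: float) -> float:
    """Rule #1: S = k_B * ln(W). Takes ln(W) directly, since W itself is
    astronomically large (ln(W) grows by N*ln(2) when W is multiplied by 2^N)."""
    return K_B * log_num_microstates

def entropy_from_heating(heat_added_joules: float, temperature_kelvin: float) -> float:
    """Rule #2: adding heat Q at constant temperature T raises entropy by Q / T."""
    return heat_added_joules / temperature_kelvin

# Rule #3 is a constraint rather than a formula: summed over everything involved,
# the entropy changes you compute must come out >= 0.
```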
These rules alone let you do a surprising number of useful calculations:
Doubling the Volume of a Gas
Suppose you have a gas occupying a volume $V$ at temperature $T$ with total number of molecules $N$. How much energy do you have to add to it to double its volume while keeping it at the same temperature?
With a doubled volume, imagine that the gas occupies two spaces of volume $V$. Then the number of microstates of the system gets multiplied by $2^N$, since, for each microstate of the original gas, there are $2^N$ new microstates corresponding to which space of volume $V$ each molecule is in.
That means that the entropy must have gone up by $\Delta S = k_B \ln(2^N) = N k_B \ln 2$ by Rule #1. But by Rule #2, to achieve this entropy increase by heating, we must have added $Q = T \Delta S = N k_B T \ln 2$ of energy to the system.
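To put numbers on it, here’s the same calculation in Python for an assumed sample of one mole of gas at 300 K (both values are just illustrative choices, not from the text):

```python
import math

K_B = 1.380649e-23   # Boltzmann's constant, J/K
N = 6.02214076e23    # number of molecules: one mole (assumed sample size)
T = 300.0            # temperature in kelvin (assumed)

# Rule #1: the microstate count is multiplied by 2^N, so
# delta_S = k_B * ln(2^N) = N * k_B * ln(2)
delta_S = N * K_B * math.log(2)

# Rule #2: Q = T * delta_S is the heat needed to produce that entropy increase
Q = T * delta_S

print(f"Entropy increase: {delta_S:.3f} J/K")  # ~5.76 J/K
print(f"Heat required:    {Q:.0f} J")          # ~1730 J
```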
Efficient Refrigeration
Generally, when hot things and cold things touch, the hot things get cooler and the cold things get hotter. This is because hot things that are getting cooler are losing entropy, and cold things that are getting hotter are gaining entropy, but the entropy being gained is greater than the entropy being lost.
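Rule #2 makes this concrete: the same parcel of heat counts for more entropy at a lower temperature. For instance, with made-up temperatures of 400 K (hot) and 300 K (cold):

```python
T_hot, T_cold = 400.0, 300.0  # assumed temperatures, in kelvin
Q = 1000.0                    # heat flowing from hot to cold, in joules (assumed)

dS_hot = -Q / T_hot           # hot object loses entropy (Rule #2, heat leaving)
dS_cold = Q / T_cold          # cold object gains entropy

print(dS_hot + dS_cold)       # ~ +0.83 J/K > 0, consistent with Rule #3
```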
How much energy does it take to reverse the process? Suppose you have a refrigerator with 10 kg of food that’s currently at room temperature (say 20°C) and you want to lower it to 0°C. Suppose the specific heat of the food is 3.5 kJ/kg°C. That means that a total of $10\,\text{kg} \times 3.5\,\text{kJ/kg°C} \times 20\,\text{°C} = 700\,\text{kJ}$ can be extracted from the food as it cools.
When the food has lost energy $E$, that must mean its temperature is:

$$T(E) = 293\,\text{K} - \frac{E}{35\,\text{kJ/K}}$$

(its heat capacity is $10\,\text{kg} \times 3.5\,\text{kJ/kg°C} = 35\,\text{kJ/K}$). That means the entropy lost by the food as it’s cooled is

$$\Delta S = \int_0^{700\,\text{kJ}} \frac{dE}{T(E)} = 35\,\text{kJ/K} \times \ln\frac{293\,\text{K}}{273\,\text{K}}$$

This is about 2.5 kJ/K. Whatever process is used for refrigeration, it must obey Rule #3, and thus increase entropy somewhere else.
If you increase entropy by exhausting heat into the room, which is at 20°C = 293 K, then you’ll have to exhaust at least $293\,\text{K} \times 2.5\,\text{kJ/K} = 732.5\,\text{kJ}$ of energy. You can get 700 kJ of that from the food, but you still need an extra 32.5 kJ of energy, which is why refrigerators have to be plugged in and don’t work on their own.
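Here’s the whole refrigerator calculation done numerically. It keeps full precision for the entropy term, so the final figures come out a few kJ smaller than the rounded 2.5 kJ/K used above; the structure of the argument is the same.

```python
import math

mass = 10.0       # kg of food
c = 3.5           # specific heat, kJ/(kg*K)
T_start = 293.0   # 20 C, rounded to whole kelvins as in the text
T_end = 273.0     # 0 C
T_room = 293.0    # the room the heat gets exhausted into

heat_capacity = mass * c                    # 35 kJ/K
Q_food = heat_capacity * (T_start - T_end)  # 700 kJ extracted from the food

# Entropy lost by the food: integral of dE/T(E), which works out to C * ln(T_start/T_end)
dS_food = heat_capacity * math.log(T_start / T_end)  # ~2.47 kJ/K

# Rule #3: the room must gain at least that much entropy, so at T_room you must
# exhaust at least T_room * dS_food of heat into it.
Q_exhaust = T_room * dS_food       # ~725 kJ
work_needed = Q_exhaust - Q_food   # ~25 kJ of electricity, minimum

print(f"Heat pulled from food: {Q_food:.0f} kJ")
print(f"Entropy lost by food:  {dS_food:.2f} kJ/K")
print(f"Heat exhausted:        {Q_exhaust:.0f} kJ")
print(f"Extra energy needed:   {work_needed:.0f} kJ")
```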
Deleting a Bit
How much energy does it take to delete a bit of information in a computer? A bit could be in state 0 or state 1, and after deleting it, it will be in state 0 (say). That means that the process of deleting the bit takes all the microstates of the 0 macrostate and all the microstates of the 1 macrostate to microstates of the 0 macrostate. That halves the number of microstates, or subtracts $k_B \ln 2$ from the entropy.
In order to obey Rule #3, you therefore have to exhaust a minimum of $k_B T \ln 2$ of energy as waste heat, where $T$ is the ambient temperature. This is called Landauer’s principle.
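At an assumed ambient temperature of 300 K, that limit is tiny:

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant, J/K
T = 300.0           # ambient temperature in kelvin (assumed)

landauer_limit = K_B * T * math.log(2)
print(f"{landauer_limit:.2e} J per bit erased")  # ~2.87e-21 J
```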