Mark Eichenlaub explains entropy

“Entropy is how much information you’re missing.”

“Entropy is not a property of the system. It’s a property of how we describe the system.”

Mark Eichenlaub is consistently great at explaining physics-related notions in ways that are actually illuminating, which is why I keep quoting him at length here (mostly for my own future reference). Here’s one more: his entire answer to “what’s an intuitive way to understand entropy”:

Entropy is how much information you’re missing. For example, if you want to know where I live and I tell you it’s in the United States, that’s high entropy because the US is a large country. It would still take quite a bit of information to pin it down. If instead I said I live in zip code 21218, the entropy is lower because you have more information. The more information you have, the less entropy.

A zip code is a 5-digit number, so I’ve given you 5 digits of information. Your entropy regarding where I live has gone down by about 5 digits. (This isn’t exact because zip codes aren’t all the same size. Because I live in a city, my zip code is fairly small and you get more information out of it than if I lived in rural Montana.)

As another toy example, suppose I roll ten dice and tell you that the sum is 30. You can’t tell from that what the exact numbers on each die are, so you have entropy – you’re missing information. Those exact numbers are called a microstate. The sum is called a macrostate. It turns out there are 2,930,455 microstates that have a sum of 30, so you have about 6.5 digits of entropy. (It’s a seven digit number, but you don’t have seven full digits of entropy because not all 10 million seven-digit numbers are possible, only the first 2.9 million of them.)

What if I said the sum of the ten dice was 59? There are only 10 microstates for that, so now you only have one digit of entropy. Different macrostates have different entropy.

If I tell you that the first five dice add to 13 and the second five add to 17, you know that the sum is 30, like before, but you have extra information this time, since you know the intermediate sums as well. There are 420 microstates for the first five dice and 780 for the second five, making 420*780 = 327,600 microstates in all. This is just about one digit less entropy than when I simply told you the sum, so by telling you the partial sums, I gave you about one digit of information.
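
(A quick sanity check, if you like code: the little Python sketch below reproduces these microstate counts and converts them to digits of entropy. The counting function is mine, not part of Mark’s answer.)

```python
from math import log10

def dice_sum_counts(n_dice, sides=6):
    """Map each possible sum of n_dice dice to its number of microstates."""
    counts = {0: 1}
    for _ in range(n_dice):
        new = {}
        for total, ways in counts.items():
            for face in range(1, sides + 1):
                new[total + face] = new.get(total + face, 0) + ways
        counts = new
    return counts

ten = dice_sum_counts(10)
five = dice_sum_counts(5)

print(ten[30], round(log10(ten[30]), 2))   # 2930455 microstates, about 6.47 digits
print(ten[59], round(log10(ten[59]), 2))   # 10 microstates, 1 digit
print(five[13] * five[17])                 # 420 * 780 = 327600 microstates
print(round(log10(ten[30] / (five[13] * five[17])), 2))   # ~0.95 digits of info gained
```

The last line confirms the roughly one-digit gap between knowing only the grand total and knowing the two partial sums.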

We’ve been measuring entropy by how many digits the number of microstates has. The number of digits of a number is, roughly, its logarithm (base ten). The entropy is denoted by S and the number of microstates by \Omega, so

S = \log \Omega

This is Boltzmann’s formula for entropy. (Physicists conventionally write it as S = k_B \ln \Omega, with a natural log and Boltzmann’s constant out front, but that only changes the units entropy is measured in.) If there’s only 1 microstate, this formula says that we know everything and the entropy is zero. It also says that if you have two independent systems, your total entropy is the sum of your entropy for each system individually because \log(AB) = \log A + \log B.
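
(A tiny numerical check of the additivity claim, reusing the two dice macrostates from above as stand-ins for two independent systems:)

```python
from math import log10

# Two independent systems: microstate counts multiply, so entropies add.
# These counts are the two dice macrostates from above (sum 30 and sum 59).
omega_A, omega_B = 2930455, 10
print(log10(omega_A * omega_B))            # entropy of the combined description
print(log10(omega_A) + log10(omega_B))     # the same number: log(AB) = log A + log B
```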

In physics, students are often confused about entropy, and I think that’s partially because they try to think of it as being a property of the system. Systems have a certain amount of energy, momentum, and charge, but they don’t have a certain amount of entropy. (Recall that the ten dice have a different entropy if I tell you the sum is 30 than if I tell you the two partial sums are 13 and 17, even though it’s the same dice.)

Instead, entropy is a property of the way we describe the system. This is quite different from most other quantities we manipulate in physics classes, which leads to some of the confusion on it.

A canonical physics example is a piston of gas. The microstate is the position and momentum of every gas molecule. This is the equivalent of knowing each die roll. The usual macrostate we give is the pressure, density, volume, and chemical composition. This is like knowing the sum of the dice.

Once we have this macrostate, we can use an “equation of state” to figure out what will happen to the gas if we make the piston bigger or heat it up, etc. The equation of state for an ideal gas is P = \rho T: pressure is density times temperature. (You might be more familiar with PV = nRT, which is the same equation with extra constants thrown in to be more confusing.) Using the equation of state generally gives us very accurate results. Even though the real physics going on underneath is the random motions of some 10^{23} molecules, averaging out over all of them works okay.
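
(To see the equation of state with actual numbers: the sketch below, which is not part of the original answer, plugs in illustrative values for dry air at room temperature, with the specific gas constant restoring ordinary SI units.)

```python
# Ideal-gas equation of state, P = rho * R * T, with the specific gas constant R
# making the units work out. All values are illustrative assumptions (dry air).
R_air = 287.05      # J/(kg*K), specific gas constant of dry air
rho = 1.2           # kg/m^3, a typical sea-level air density
T = 293.0           # K, about room temperature
P = rho * R_air * T
print(P)            # ~1.0e5 Pa, i.e. roughly atmospheric pressure
```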

Even though we can get a lot done using average properties like pressure, the microstate of the gas is very complicated, with all the different molecules moving in different ways. The amount of information we’d need to receive in order to know this complete state is called the entropy.

If you heat the gas up, there’s now more uncertainty about each molecule’s speed, so the entropy is higher. If you suddenly make the piston larger, there’s more uncertainty about each molecule’s position, so the entropy goes up again.
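
(You can put numbers on this using the standard result for a monatomic ideal gas, \Delta S = N k_B (\tfrac{3}{2}\ln(T_2/T_1) + \ln(V_2/V_1)); the particle count and the doublings below are made-up illustrative values.)

```python
from math import log

# Entropy change of a monatomic ideal gas when heated or expanded:
# dS = N * k_B * (3/2 * ln(T2/T1) + ln(V2/V1)). N is an illustrative assumption.
k_B = 1.380649e-23          # J/K
N = 6.022e23                # roughly one mole of molecules

def delta_S(T1, T2, V1, V2):
    return N * k_B * (1.5 * log(T2 / T1) + log(V2 / V1))

print(delta_S(300, 600, 1.0, 1.0))   # heat it up (double T): entropy goes up
print(delta_S(300, 300, 1.0, 2.0))   # make the piston bigger (double V): entropy goes up again
```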

We’ve been discussing the thermal energy of the gas as something with high entropy. By contrast, the gravitational potential energy of a lump of clay you’re holding up in the air has low entropy. For the gas, you’d need a long, long list of each molecule’s momentum and position to give the entire microstate. For the clay, the macrostate is the mass and height of the clay. Again, you’d need a list of each molecule’s gravitational potential energy to describe the clay’s microstate. However, all the molecules in the clay are essentially the same. The first one is about 2m high, the second one is about 2m high, etc. This is very different from the gas, where the molecules all have different energies and you don’t know what they are. Because knowing the macrostate of the clay tells you pretty much everything about its microstate, there is very little entropy. Another way of saying this is that the thermal energy is stored in microscopic degrees of freedom while the potential energy is stored in a single macroscopic degree of freedom.

When you drop the clay, its potential energy gets converted to kinetic energy as it falls. Because all the molecules are falling with the same velocity, knowing the macrostate (the velocity of the clay) tells us the microstate (the part of the velocity of the molecules that comes from the clay’s motion), and the entropy hasn’t increased much. But when the clay hits the ground, the kinetic energy of all the molecules moving together becomes thermal energy of all the molecules moving randomly. We used to know what each molecule was doing, but now we don’t. Our ignorance about the microstate has increased; the entropy has gone up.

The second law of thermodynamics says that entropy always increases. We now understand what this means: you can’t just spontaneously learn more about the microstate. Once you lose information about the microstate (for example by dropping the clay so that the molecules all start moving randomly), you don’t just get it back again.

Go back to the die rolls. Recall that ten dice with a sum of 59 is very low entropy. If you start grabbing the dice and re-rolling them at random, the sum will start going down. If the sum started out at 13 (also very low entropy), it will increase. In general, random interactions with the dice will move you towards the most-likely sum, which is 35. That’s the state of highest entropy. Any sort of random interaction tends to increase the entropy, at least until it reaches the maximum.
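
(Here’s a toy simulation of that drift, with a setup I made up: start the ten dice at a sum of 59 and re-roll one randomly chosen die at a time, tracking the entropy of the current sum.)

```python
import random
from math import log10

def dice_sum_counts(n_dice, sides=6):
    """Map each possible sum of n_dice dice to its number of microstates."""
    counts = {0: 1}
    for _ in range(n_dice):
        new = {}
        for total, ways in counts.items():
            for face in range(1, sides + 1):
                new[total + face] = new.get(total + face, 0) + ways
        counts = new
    return counts

COUNTS = dice_sum_counts(10)
dice = [6] * 9 + [5]                       # sum 59: a very low-entropy macrostate
for step in range(1, 61):
    dice[random.randrange(10)] = random.randint(1, 6)   # a random interaction
    total = sum(dice)
    if step % 15 == 0:
        # The sum drifts toward 35 and the entropy climbs toward its maximum.
        print(step, total, round(log10(COUNTS[total]), 2))
```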

Just so with the lump of clay. Once it hits the floor, the random interactions of the clay with the floor tend to maximize the entropy of the clay. You could randomly roll a bunch of 6’s and get to a low-entropy state of the dice, and you could randomly have all the clay molecules bounce in the same direction, but it’s extremely unlikely.

An example that helps drive all this home is mixing two gases. Suppose you have two gases in a container. There’s a partition in the middle of the container, and all of one type of gas (which we’ll call red) is on the left and the other type (blue) is on the right.

If you open the partition, the gases will mix because there are many more microstates where the gases are mixed than where they are separated, just like there are many more ways for dice to sum to 35 than to 59. The amount of information you lose is, for each molecule, which side of the container it’s on. So for N molecules you lose N bits of information. (Bits and digits are basically the same; you just need to multiply by a constant to convert between them.)
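
(The conversion constant is \log_{10} 2 \approx 0.301; the quick check below uses an illustrative N of roughly a mole of molecules.)

```python
from math import log10

# One bit of missing information per molecule: which side of the container it's on.
# N is an illustrative number, about a mole of molecules.
N = 6.022e23
bits_lost = N
digits_lost = N * log10(2)        # the same information, measured in digits
print(bits_lost, digits_lost)     # conversion factor: log10(2) ~ 0.301
```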

There is an old paradox called Maxwell’s demon. It says that after the red and blue gases mix, you put the partition back. A demon with a trap door sits there and opens the door whenever a red molecule comes up from the left or a blue molecule comes up from the right, but closes the door otherwise. In this way, the red molecules all accumulate on the right hand side and the blue molecules accumulate on the left.


We know that this unmixed state has lower entropy than when the molecules are mixed, so the entropy has gone down. We can make opening and closing the trap door an arbitrarily-efficient process, so entropy is decreasing in a closed system. This violates the second law!

The resolution is simple, though. Entropy is not a property of the system. The mixed and separated molecules don’t have an innate amount of entropy to them. Entropy is how much information you’re missing. If we want to know if the demon reduces the entropy, we need to know if it’s missing more information before or after the unmixing process.

If the demon knows nothing about the microstate, it can’t make its scheme work because it doesn’t know when to open the trap door. On the other hand, if it knows everything about the microstate so that it can easily tell when to open the trap door, then there is no entropy to begin with because there was no missing information. The entropy can’t go down if it started out at zero.

Suppose the demon has the minimum amount of information. This would mean that for every molecule that comes up to the trap door, the demon knows whether it’s red or blue, so it knows whether to open the door. That’s one bit per molecule. So the demon needs N bits of additional information. But that’s exactly how much we would naively say the entropy goes down when you unmix the gas! Hence, the change in entropy of the gas as you unmix it is exactly equal to the information you need to carry out the unmixing process. This example was historically very useful in helping us understand that entropy is just however much information you don’t have.
