Entropy is not disorder

Top-left: a low-entropy painting by Piet Mondrian. Bottom-right: a high-entropy painting by Jackson Pollock.

Entropy is a fundamental concept, spanning chemistry, physics, mathematics and computer science, but it is widely misunderstood. It is often described as “the degree of disorder” of a system, but it has more to do with counting possibilities than messiness.

Entropy measures the number of configurations of a system that are consistent with some constraint. In thermodynamics, the study of heat, this constraint is the total heat energy of the system, and the configurations are arrangements of atoms and molecules. For instance, the molecules of water in an insulated thermos are always moving, colliding, changing positions and speeds, but in a way that keeps their total heat energy fixed. The number of molecular configurations is so large that it would be an understatement to call it “astronomical.” To make this number manageable, entropy is expressed as the number of zeros in the count of configurations, that is, its logarithm: a million is 6, ten million is 7, a hundred million is 8, and so on.
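To get a feel for this zero counting, here is a small sketch in Python. It assumes a deliberately simplified system in which each molecule has only two possible states, so a collection of N molecules has 2 to the power N configurations; real water molecules have far more options, but the arithmetic works the same way.

    import math

    # Toy model: N molecules, each with two possible states, give 2**N
    # configurations in total. The entropy, in the zero-counting sense above,
    # is the logarithm of that count: log10(2**N) = N * log10(2).
    for n_molecules in (10, 1_000, 602_000_000_000_000_000_000_000):  # last one: roughly a mole
        entropy = n_molecules * math.log10(2)
        print(f"{n_molecules:.2e} molecules -> entropy of roughly {entropy:.3g}")

Even after taking the logarithm, a mole's worth of two-state molecules still has an entropy of about 10 to the 23rd power, which is why the raw count of configurations is hopeless to write down.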

The association with disorder comes from the fact that we often call systems with many possible configurations “messy” and more constrained systems “clean,” but this need not be the case. The picture above compares the artwork of Piet Mondrian with that of Jackson Pollock. We could say that Pollock’s painting has more entropy, not because we subjectively think it’s messier, but because there are more possible configurations consistent with the artist’s intent. Move a single drop of paint a few inches and it’s still essentially the same painting. A stray drop on Mondrian’s painting would ruin it. In this case, the constraint is the artist’s vision and the entropy is the number of possible ways to realize it. We could call Pollock messy, but we could also call him open-minded.

In computer science, information entropy is measured in bits and bytes, the same units that quantify the size of a file on disk. Data in a computer is a pattern of zeros and ones, and the number of possible patterns can be counted just like the number of configurations of molecules in a thermos or paint drops on a canvas. In this context, one wouldn’t think of the size of a data set as its degree of disorder.
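As a rough sketch of what counting those patterns looks like, the Python snippet below estimates information entropy from how often each byte value appears in a piece of data; the function name and the two sample strings are invented for this illustration.

    import math
    from collections import Counter

    def entropy_bits_per_byte(data: bytes) -> float:
        # Shannon entropy: -sum(p * log2(p)) over the observed byte values,
        # giving the average number of bits needed per byte of data.
        counts = Counter(data)
        total = len(data)
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    varied = bytes(range(256)) * 10    # every byte value equally common
    repetitive = b"A" * 2560           # a single byte value repeated
    print(entropy_bits_per_byte(varied))      # close to 8 bits per byte
    print(entropy_bits_per_byte(repetitive))  # 0.0: no surprise, no information

A long run of identical bytes carries no surprise and compresses to almost nothing, while data that uses all byte values evenly needs the full eight bits per byte.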

When two particles collide, the collision debris has only a tiny amount of thermodynamic entropy, despite how messy it may look on the monitor. Lead ions, each consisting of 208 protons and neutrons, produce debris with more entropy than single-proton collisions do, and this entropy is relevant in some studies of lead-ion collisions. Information entropy was also important in the search for the Higgs boson: the algorithms used to search for this rare particle were designed to minimize the entropy of mixing between Higgs-like events and non-Higgs events, so that the Higgs would stand out more clearly against the noise.

Although entropy has different meanings in different contexts, it has one profound implication: if all configurations are equally likely, the total entropy can increase but not decrease. The reason is mathematical: a state that can be realized by a larger number of configurations is more probable than one that can be realized by fewer, so a system shuffling randomly among its configurations drifts toward the states with the most of them. When applied to heat, this is called the Second Law of Thermodynamics, even though it is more a consequence of counting and probabilities than a law of nature.
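The counting argument can be made concrete with a toy simulation, again invented for illustration: treat N coins as the system, the number of heads as the coarse description, and each particular arrangement of heads and tails as a configuration. Arrangements with about half heads vastly outnumber the rest, so randomly re-flipping coins drives an all-tails start toward half heads and essentially never back.

    import math
    import random

    N = 100
    rng = random.Random(0)
    coins = [0] * N                    # start in a low-entropy state: all tails
    for _ in range(5000):
        coins[rng.randrange(N)] = rng.randint(0, 1)   # re-randomize one coin

    heads = sum(coins)
    print(f"heads after shuffling: {heads} (near N/2 = {N // 2})")
    print(f"arrangements with 0 heads: {math.comb(N, 0)}")
    print(f"arrangements with {heads} heads: {math.comb(N, heads):.2e}")

There is exactly one all-tails arrangement, but roughly 10 to the 29th power arrangements with about fifty heads, so the coins end up near half heads simply because there are overwhelmingly more ways to be there.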

Jim Pivarski

Want a phrase defined? Have a question? E-mail today@fnal.gov.