A Description of Entropy

[Edited continually.]

Abstract: This is a layman’s description of entropy as understood in thermodynamics. There is nothing revolutionary in here. I’m not forwarding a new understanding of entropy; everything here can also be found in a thermodynamics textbook. The purpose of this article is not to contradict the scientific understanding of entropy, but to attempt to elaborate it in layman’s terms. Entropy is not “That thing which results in the heat death of the universe,” which is the way some people, particularly science fiction authors and readers, tend to regard it. Entropy, though it is an abstract description of underlying processes, is, understood as those processes, just as necessary to our existence as gravity. Clearing up that particular misunderstanding, which is so pervasive in some circles, is the purpose of this article.

On TimS’s suggestion, I’ll frame this description around the basic understanding most people have about entropy: it’s that law of thermodynamics which prevents perpetual motion machines from happening. So we’ll consider three universes: one with backwards entropy, one with no entropy, and finally the universe we do live in.

To start with, imagine, for a moment, that entropy were reversed; that the amount of entropy in the universe were constantly -decreasing-.

What would this imply for us?

Approximately the same thing that increasing entropy implies: the amount of work we can extract from the universe is finite, dictated not by increasing energy homogeneity, but by decreasing homogeneity. In a universe in which entropy is reversed, rivers would flow uphill—planets would slowly disintegrate, in point of fact, into galactic clouds of gas. We could extract work from this process, while it lasted, provided we could survive in such an environment to begin with. (We’ve evolved to extract work in one direction. Our cells would be no more capable of producing work in the other direction than a steam engine would be of running in reverse, taking in carbon dioxide deliberately fed to it and dividing it back into oxygen and carbon.)

If all of this seems absurd, limit our consideration to a simple thing—Newtonian gravity. Consider a closed system containing two particles, five meters apart, as our initial condition. This is a higher potential energy state—a lower entropy state—than the final condition, in which those particles have collided and the energy has been re-expressed as atomic vibrations: heat. If entropy were reversed, those particles would never be permitted to collide; the laws of physics would forbid it, because a collision would increase the entropy, in violation of our reversed law of entropy.
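To put rough numbers on that picture, here is a minimal sketch of the bookkeeping in Python. The masses and the contact distance below are assumptions of mine for illustration; the example above specifies only the five-meter separation.

```python
# Toy calculation: energy released when two particles fall together under
# Newtonian gravity. The masses and contact distance are assumed values,
# chosen only for illustration; the example above leaves them unspecified.
G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

m1 = m2 = 1.0     # assumed masses, kg
r_initial = 5.0   # initial separation, m (from the example above)
r_contact = 0.01  # assumed separation at the moment of collision, m

def potential_energy(r):
    """Gravitational potential energy of the pair at separation r."""
    return -G * m1 * m2 / r

# The energy given up as the pair falls from r_initial to r_contact is the
# energy that ends up as atomic vibrations (heat) after the collision.
released = potential_energy(r_initial) - potential_energy(r_contact)
print(f"Energy released as heat: {released:.3e} J")
```

For kilogram-scale particles the number is tiny, but the size isn’t the point; the point is the direction of the conversion, which is exactly what the sign of the entropy law dictates.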

In the current universe, the work we can extract from matter is not merely limited by entropy; it is in fact permitted by it. It is the irreversibility of thermodynamic processes which permits machines to work to begin with. A steam engine in which steam is as likely to contribute its energy to deoxidizing carbon and forming it into coal as it is to push a turbine is one in which the turbine moves fitfully, unpredictably, and in no particular direction. The arrow of entropy is not merely the limit on how much work can happen; it is also the mechanism by which work happens. It does not, in fact, matter which direction it goes; as long as it is predictable, work could be extracted. Rivers wouldn’t flow without entropy.

Entropy is not a property of a system, but a property of forces; the law of entropy can be restated as “Forces do what forces do.” We measure entropy strictly in terms of things meaningful to us; it cannot be directly measured, because it doesn’t exist.

Entropy is not an increase in homogeneity. It’s not an increase in the number of microstates. These are products of forces.

And forces can work in opposing directions. The number of -macrostates- is in constant decline; this is also a product of forces. In macroscopic terms, heterogeneity, not homogeneity, is on the rise; consider that, according to Big Bang Theory, an early state of the universe was a nearly uniform cloud of gas. Compare that to the macroscopic state of the universe today, a heterogeneous mess. (It doesn’t matter for this argument whether Big Bang Theory is true or not; if true, this behavior is exactly what physics would predict.)

Statistical mechanics doesn’t contradict this, but frames it in terms of probability; when dealing with statistical distributions, i.e., a cloud of gas, it’s a way of expressing mathematically what is happening on an individual basis to each of the atoms. If you were capable of modeling each individual atom in the cloud of gas, you would arrive at the same conclusions. Entropy isn’t necessarily related to information, although it can be modeled that way very easily in statistical mechanics, because information about a statistical representation of an entropic process does vary in relation to the entropy. (Which means that information-theoretic models can still model entropy for a statistical system.)

The mathematical models for statistical mechanics entropy and informational entropy are very similar, which has led some people, including myself, to an initial misapprehension that they were describing similar processes; one of my early understandings of entropy was that information about the universe was being encoded into the universe, and that quantum uncertainty had to increase elsewhere in order to accommodate this certainty. I will provide an armchair logic proof of why this isn’t necessary below. First, as to why they are similar—this is because they are both modeling, and measuring, uncertainty about particular variables within the system.

Shannon Entropy—the measurement of entropy in information systems—is a measure of the information which can be encoded in a given variable, and thus of the information which is lost when that variable is lost. This ties into conventional entropy because of statistical mechanics, which is framed around the concept of a “microstate”—that is, a configuration of particles which can result in a “macrostate,” which is an observable macroscopic state. (A forest is still a forest if a tree is two inches further to the left; the forest is the macrostate, the state of each individual tree is a microstate. From an airplane, the precise position of an individual tree doesn’t matter; the macrostate is the same. A particular macrostate is, loosely speaking, a collection of microstates such that it is impossible, in practice, to identify which exact microstate the system is in.)
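To make the microstate/macrostate distinction concrete, here is a toy sketch in Python using coin flips rather than gas particles; the coin analogy is my own substitution, chosen because the microstates can actually be enumerated. The exact sequence of flips is the microstate, the total number of heads is the macrostate, and the logarithm of the number of sequences consistent with a given total is the Shannon information you give up by knowing only the total.

```python
# Toy illustration of macrostates vs. microstates with coin flips (an
# analogy, not the gas cloud from the text). The exact sequence of flips
# is the microstate; the total number of heads is the macrostate.
from itertools import product
from collections import Counter
from math import log2

N = 10  # number of coins

# Every possible sequence of N flips is one microstate.
microstates = list(product("HT", repeat=N))

# Group the microstates by their macrostate (the count of heads).
macrostates = Counter(seq.count("H") for seq in microstates)

for heads, count in sorted(macrostates.items()):
    # log2(count) is the Shannon information, in bits, hidden inside the
    # macrostate: what you would learn by reading the exact microstate.
    print(f"{heads:2d} heads: {count:4d} microstates, {log2(count):5.2f} bits")
```

The middling macrostates hide the most information, which is another way of saying they are realized by the most microstates; that is the point the next paragraph turns on.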

A given macrostate of a cloud of gas can encode—provided you could read its microstate—a vast amount of information. The exact amount is not important; “vast” is all that you need to know. Statistical mechanics asserts that the most likely macrostate is the one which is described by the most possible microstates, or, to phrase this somewhat differently, the most likely state of matter is the one in which, if you could read and write to the microstate, the most information could be encoded. This may sound like a remarkable claim up until you revisit the definition of a macrostate and one particular word—“possible.” (Note that statistical mechanics makes a justified assumption that all possible microstates are equally likely; Liouville’s theorem proves this for the case that all possible microstates were at some point in the past equally likely. Or, in other words, all microstates are equally likely, provided you have not actually read the microstate.)
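The counting argument behind “most likely” can be sketched directly, under the equal-likelihood assumption just described. The model below, particles sorted into the left or right half of a box, is a standard textbook toy rather than anything specified above.

```python
# Sketch of the counting argument, assuming every microstate is equally
# likely. Model: N gas particles in a box, each in the left or right half.
# The macrostate "k particles on the left" is realized by C(N, k) microstates.
from math import comb

for N in (10, 100, 1000):
    total = 2 ** N           # total number of microstates
    even = comb(N, N // 2)   # microstates realizing the 50/50 macrostate
    lopsided = comb(N, 0)    # microstates realizing "everything on one side"
    print(f"N={N:4d}: the even split is realized by {even / lopsided:.3g} times "
          f"as many microstates as the lopsided one "
          f"({even / total:.3g} of all microstates)")
```

Even at a thousand particles the even split outweighs the one-sided arrangement by a factor of roughly 10^299; at the 10^23 or so particles in an everyday sample of gas, “most likely” becomes indistinguishable from “certain.”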

That “possible” is pretty important. Absent quantum fluctuations, which throw a wrench into the layman description (I’m sticking to classical mechanics because I’m judging that introducing the necessary “degrees of freedom” to explain behavior there is too messy for a layman), the particles in a closed system can’t all simultaneously adopt a leftward velocity in any microstate; that state would violate the conservation of momentum. Entropy in statistical mechanics is a different way of -measuring- entropy, but that entropy must still be the product of the laws of physics operating on individual particles. Between a point A and a later point B in time for a closed system X, entropy will -never- be less at point B than at point A. The “statistical” in “statistical mechanics” doesn’t grant the possibility that anything can happen; none of the possible microstates include “lower (mechanical) entropy over time.” (Again, classical mechanics. For those interested in the quantum version of all this, I’ll have to refer you to the concepts “degrees of freedom” and then “gauge theory”. If somebody can come up with a layman’s description of that, by all means.)
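A minimal sketch of that constraint, in one dimension: the masses and velocities below are arbitrary assumptions, and momentum is only one of the conserved quantities (energy prunes the list of possible microstates in the same way).

```python
# Minimal sketch: in a closed system, total momentum is conserved, so any
# candidate microstate whose total momentum differs from the conserved value
# is simply not a possible microstate. Masses and velocities are assumed.
masses = [1.0, 1.0, 2.0]   # kg
conserved_momentum = 0.0   # the system's fixed total momentum, kg*m/s

def is_possible(velocities, tol=1e-9):
    """A candidate microstate is possible only if it conserves momentum."""
    total = sum(m * v for m, v in zip(masses, velocities))
    return abs(total - conserved_momentum) < tol

print(is_possible([-1.0, -1.0, -0.5]))  # everything drifting left: False
print(is_possible([-1.0, +3.0, -1.0]))  # the momenta cancel out:   True
```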

[Draft section; will continue pending further review.]

Earlier I promised you an armchair proof that entropy is not necessarily information. The proof that entropy is not necessarily related to a universal concept of information, or to the level of quantum uncertainty in a system, is relatively simple: entropy still increases in a modeled system with 100% information about what is going on within it and no quantum uncertainty. This is not to say that entropy -doesn’t- represent system information or total quantum uncertainty, only that these concepts aren’t necessary to entropy. Nor is it to say that entropic calculations do or do not directly represent -internal- uncertainty; entropy certainly limits the -amount- of information which can be represented in a universe, consistent with the statistical mechanics interpretation of entropy. An armchair proof that entropy bounds the internal information storage capacity of a system is also relatively trivial: the process of information binding requires work, and entropy limits the amount of work that can be done within a system.
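Here is a toy version of that first armchair proof in Python: a purely classical, purely deterministic system about which we have complete information, whose coarse-grained entropy climbs anyway. The particular model, free particles bouncing around a one-dimensional box with the entropy measured as the Shannon entropy of a ten-bin position histogram, is my own choice of illustration, not anything specified above.

```python
# Deterministic toy model: N free particles start bunched in the left end of
# a 1-D box with reflecting walls. We know every position and velocity
# exactly at every step (100% information, no quantum uncertainty), yet the
# coarse-grained entropy of a 10-bin position histogram still rises.
from math import log

N, BINS, L = 1000, 10, 1.0

# Deterministic initial microstate: everything in the leftmost 10% of the
# box, with evenly spread rightward speeds. Nothing here is random.
positions = [0.1 * L * i / N for i in range(N)]
velocities = [0.5 + 1.5 * i / N for i in range(N)]

def coarse_entropy(xs):
    """Shannon entropy (in nats) of the occupation of BINS equal position bins."""
    counts = [0] * BINS
    for x in xs:
        counts[min(int(x / L * BINS), BINS - 1)] += 1
    return sum(-(c / N) * log(c / N) for c in counts if c)

def step(dt=0.01):
    """Advance every particle by dt, reflecting off the walls at 0 and L."""
    for i in range(N):
        x = positions[i] + velocities[i] * dt
        while not (0.0 <= x <= L):
            x = -x if x < 0.0 else 2 * L - x
            velocities[i] = -velocities[i]
        positions[i] = x

for t in range(501):
    if t % 100 == 0:
        print(f"t={t:3d}  coarse-grained entropy = {coarse_entropy(positions):.3f}")
    step()
```

Nothing random happens anywhere in that loop, and nothing is hidden from us, but the entropy of the coarse-grained description climbs from zero toward its maximum all the same.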

Entropy is, broadly speaking, a statement of the irreversibility of forces. In a closed solar system, gravity means that eventually everything will be in a stable configuration: a single concentrated ball, or dead objects orbiting each other. There are several stable configurations, but each is a local entropic maximum. The irreversibility of entropy, in a final description, can also be treated as an expression of the fact that stability persists and instability ends—and, importantly, that this is true at every scale.

It’s not disorder. It’s not homogeneity. It’s not the number of states. These expressions of entropy are expressions of particular forces.