The Two Things You Didn’t Realize Govern Everything

These two quantities pervade dialogue yet evade understanding. Their relationship governs everything.

Photo by Daniele Levis Pelusi on Unsplash

There are two quantities of relevance to everything that happens in the universe. They are quantities that pervade dialogue yet evade understanding: energy and entropy. We shall see how these quantities guide the evolution of systems, but first we must lay the foundation with some definitions.

Energy

Most of us have an intuitive notion of the meaning of energy. Energy is the currency of action. It is the capacity for something to change the state of something else. Energy is how we distinguish between “something” and “nothing.”

In the material world, energy comes in basically two types:

Things have energy due to their very existence. This is called mass energy. Things can have additional energy either by moving (giving them kinetic energy) or by being in the field of something else (giving them potential energy). Temperature and speed are two forms of kinetic energy we can easily observe and understand. Gravity is a potential energy field created by masses which gives energy to other masses.

Energy, denoted E, seems to be conserved in the universe. That means that if something gains energy, something else has lost it.

How Energy Flows

Energy transmutes from one entity to another and from one form to another in order to spread itself out. Nature abhors an energy gradient. Energy tries to equalize itself through high-energy things releasing their energy and entering a lower energy state. This is why material flows from the top of a mountain range (where it has a high potential energy) to the bottom (where it has lower potential energy). Rivers flow downhill. Erosion brings stone and silt to the valley below.

Photo by Daniele Levis Pelusi on Unsplash

It is natural to ask: if the universe has been around for many billions of years, why is energy still moving around? Wouldn’t it all have equalized by now? How come all of the mountains haven’t been flattened?

The answer lies in energy’s creative twin, entropy. Energy and entropy are engaged in a perpetual dance.

Entropy

Entropy is perhaps the most misunderstood basic quantity in all of science. Among the many attempts to convey its meaning in common language are “disorder” and “lost energy.” Google defines entropy as “representing the unavailability of a system’s thermal energy . . . often interpreted as the degree of disorder or randomness in the system.”

While entropy may represent energy loss or be interpreted as disorder, these attributes do not define entropy. I believe this confusion between interpretation and definition is a big part of why many miss out on the fundamental character of entropy.

So let's look at the definition of entropy, which is, quite simply:

The Boltzmann Equation for Entropy

S = k ln Ω

The k out front is just a constant used to make the units work out. ln is the natural logarithm, which scales Ω (Omega).

Lastly, Omega is the only quantity in the definition of entropy which has physical meaning. Omega, and by extension entropy, is the number of states that the elements of a system can inhabit.
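Boltzmann’s formula is easy to evaluate directly. A minimal Python sketch (the function name is mine; k is the CODATA value of Boltzmann’s constant):

```python
import math

# Boltzmann's constant in joules per kelvin (CODATA 2018 value).
k = 1.380649e-23

def boltzmann_entropy(omega: int) -> float:
    """Entropy S = k * ln(Omega) for a system with omega available microstates."""
    return k * math.log(omega)

# A single microstate means zero entropy: there is only one way to be.
print(boltzmann_entropy(1))  # 0.0

# Doubling the number of microstates always adds the same k * ln(2),
# because the logarithm turns ratios of counts into differences of entropy.
print(boltzmann_entropy(2000) - boltzmann_entropy(1000))
print(boltzmann_entropy(4000) - boltzmann_entropy(2000))
```

The logarithm is why entropies add when systems combine: multiplying the microstate counts of two independent systems sums their entropies.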

Entropy, like energy, is most easily understood through comparison. To compare the entropy of different arrangements, we define macrostates and microstates:

Macrostates are the high-level properties of a system which are of interest to us. Microstates are all the different configurations of the system’s elements which can result in a given macrostate.

The entropy of two macrostates can be compared by counting the number of microstates they have. The macrostates with more microstates are more likely, because there are more ways for them to occur. Thus, entropy is unique amongst physical quantities in that it tends to increase in a closed system over time.

The fact that entropy increases over time means that systems tend toward the outcomes that can be realized in the greatest number of ways.

Entropy’s Dice

Photo by Erik Mclean on Unsplash

Consider the roll of a single die. We can define six possible macrostates, one corresponding to each possible outcome of the roll. For our purposes, each of these macrostates has exactly one microstate, namely the number on top of the die at the end of the roll. This even distribution of microstates is what makes each outcome equally likely. There’s the same number of ways for each of them to occur.

When we roll two dice, the relative number of microstates for each macrostate changes, making some macrostates more likely than others. There are 11 macrostates, given by the outcomes 2–12. The outcomes of 2 and 12 each have only one microstate: rolling double 1’s or double 6’s, respectively. The outcome of 7, meanwhile, has six microstates, because there are six arrangements of the two dice which result in a total of 7. That is why 7 is the most likely outcome of the roll of two dice. It has the greatest number of microstates, and thus the highest entropy. Yet, contrary to the popular notion of entropy, we would hardly say that 7 is the most chaotic or disordered of the numbers 2–12.
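The counting argument above is easy to verify by brute force. A short Python sketch that enumerates all 36 ordered rolls of two dice:

```python
from collections import Counter
from itertools import product

# Each ordered pair of faces is one microstate; the total is the macrostate.
microstates = Counter(a + b for a, b in product(range(1, 7), repeat=2))

for total in range(2, 13):
    print(total, microstates[total])

# 7 is realized by 6 of the 36 microstates -- more than any other total --
# so it is the most likely, highest-entropy macrostate.
```

Running this prints one microstate each for 2 and 12, rising to six for 7.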

Graphic by the author.

Comparing the entropy of different macrostates is, in my view, one of its most useful applications. It allows us to gain insightful answers to questions from “why does life exist?” to “what does the future hold for financial markets?”

To help practice comparing the entropy of two different states, consider the following example:

Changes in Entropy of Different States of Matter

Remember from high school chemistry that the state of matter of something describes whether it is a solid, liquid or gas. These states are considered macrostates from the perspective of entropy. The entropy of a material increases as it progresses through the phases from solid to gas, because the particles of the material have more freedom to explore their environment.

In a solid, particles are confined within the object's boundary. In a liquid, particles are constrained to the bottom of their container. In a gas, particles are constrained only by the boundaries of their environment. The greater the volume the particles can occupy, the greater the material's entropy. The gas macrostate therefore has the greatest number of microstates, followed by the liquid and then the solid.

Photo by Richard Price on Unsplash

We are now prepared to examine how energy and entropy interact with one another. This is the beginning of understanding the physics of relationship.

The Dance of Energy and Entropy

Ok, so we know that energy likes to smooth itself out across the universe, flowing from high-energy things to low-energy things. And we know that systems favor the states they can inhabit in the greatest number of ways.

Do you see the conflict here?

The conflict, which is the core driving force of the universe, is that having more energy gives things more ways to arrange themselves. So higher energy states are generally higher entropy as well.

Imagine a ball going back and forth in a simple U-shaped ramp. The ball will have higher energy the faster it is traveling at the bottom. The faster it travels at the bottom, the higher it can reach in the ramp and the more entropy it has from its ability to occupy more locations in space. Therefore, the higher energy the ball, the greater its entropy.

Graphic by the author.

The push-pull relationship of energy and entropy is encapsulated by the Gibbs Free Energy:

The Gibbs Free Energy

The Gibbs Free Energy is a quantity that takes into account the relative influence of energy and entropy on the behavior of a system. It is defined as:

The Gibbs Free Energy Equation

△G = △E − T△S

The triangles, △ (delta), indicate that we are looking at changes between two states. G is the Gibbs free energy, E is energy, and S is entropy; we will come to T shortly.

Since △E wants to be negative and △S wants to be positive, a negative △G indicates that the change in the system’s state is likely to be spontaneous. The more negative △G, the more likely the change.

When △E and △S are the same sign (both negative or both positive), the outcome for △G depends on their relative sizes, and on the third quantity, T. T is the temperature of the environment in which the change is taking place.

The temperature is simply a measurement of the ambient energy of the environment. To illustrate the possibilities for △G, we can lay out the four basic sign combinations in a table (excluding △E = 0 or △S = 0, since these possibilities are more boring):

△E < 0, △S > 0 → △G < 0 at any temperature
△E > 0, △S < 0 → △G > 0 at any temperature
△E > 0, △S > 0 → △G < 0 only when T is large
△E < 0, △S < 0 → △G < 0 only when T is small
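These sign combinations can be checked numerically. A sketch evaluating △G = △E − T△S in each of the four cases, with made-up values of △E and △S in arbitrary units:

```python
def delta_g(delta_e: float, delta_s: float, temperature: float) -> float:
    """Change in Gibbs free energy: dG = dE - T * dS."""
    return delta_e - temperature * delta_s

# Fire type (dE < 0, dS > 0): dG is negative at any temperature.
print(delta_g(-10.0, +0.5, 300.0))   # negative

# Photosynthesis type (dE > 0, dS < 0): dG is positive at any temperature.
print(delta_g(+10.0, -0.5, 300.0))   # positive

# Evaporation type (dE > 0, dS > 0): the sign flips at the crossover T = dE / dS.
print(delta_g(+10.0, +0.5, 10.0))    # positive: too cold, stays liquid
print(delta_g(+10.0, +0.5, 300.0))   # negative: hot enough to evaporate

# Condensation type (dE < 0, dS < 0): the mirror image of evaporation.
print(delta_g(-10.0, -0.5, 10.0))    # negative: cold enough to condense
print(delta_g(-10.0, -0.5, 300.0))   # positive: too hot, stays gas
```

The archetype names follow the sections below; only the last two cases depend on the temperature.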

Now let’s take each of the four possibilities in turn, with physical examples. Each possibility is given an archetypal name to aid in understanding.

Gibbs Archetype One: △E < 0 & △S > 0

Archetype: Fire

△G: < 0

Photo by Peter John Maridable on Unsplash

When △E is negative and △S is positive, the system has released energy to its surroundings and at the same time increased the number of possibilities open to it. The most common example of this process in nature is a combustion process, such as fire or digestion.

During combustion, long, complex molecules with energy stored in their bonds break apart into tiny pieces. In so doing, the energy in their bonds is released to their surroundings (the fire is hot), and the atoms in the molecules become liberated to the atmosphere (fire emits carbon dioxide and water as gases).

Gibbs Archetype Two: △E > 0 & △S < 0

Archetype: Photosynthesis

△G: > 0

Photo by Stefan Steinbauer on Unsplash

Archetype two is photosynthesis. In photosynthesis, materials have an increase in energy and a decrease in entropy. Energy from sunlight is captured through the formation of molecular bonds, which chain gas molecules to one another, limiting their movement. The photosynthesis archetype is the inverse of the fire archetype. In nature, these two processes cycle back and forth.

Since △G is always positive for photosynthesis-type processes, we might expect that they never occur at all. Clearly, there are other factors at play. Photosynthesis is able to occur because:

While the materials directly participating in photosynthesis have reduced entropy, they increase the entropy of their surroundings.

Gibbs energies give us probabilities, not absolute certainties. There is always a chance of a process occurring.

Gibbs Archetype Three: △E > 0 & △S > 0

Archetype: Evaporation

△G: > 0 when T is small, △G: < 0 when T is large

Photo by 🇮🇳Saif Ali on Unsplash

Evaporation-archetype processes occur spontaneously at high temperatures.

Archetype three is evaporation. During evaporation, molecules of liquid gain energy, using this energy to break free from their liquid confinement and enter a higher-entropy gas state.

Other examples of the evaporation archetype are melting and expansion.

Gibbs Archetype Four: △E < 0 & △S < 0

Archetype: Condensation

△G: > 0 when T is large, △G: < 0 when T is small

Photo by Osman Rana on Unsplash

Condensation-archetype processes occur spontaneously at low temperatures.

The condensation archetype encompasses processes whereby energy is released and entropy falls. This is, of course, the partner to the evaporation archetype. Each process from the condensation or evaporation archetype has an analogous reverse process from the other.

Other examples of the condensation process include crystallization and contraction.

The “Value” of Energy

We know that processes from the evaporation or condensation archetypes will occur spontaneously depending on the temperature. A substance that undergoes a condensation-type process at a low temperature will switch to an evaporation-type process at a high temperature. The boundary between these two regimes is set by the entropy-value of the substance's energy.

In short, the substance will release energy to its surroundings if that energy causes a greater increase in entropy for the surroundings than it does for the substance. By the same token, a substance will absorb energy from its surroundings when the energy gives it more entropy than the energy would give its surroundings.

What the system does is based on the answer to this question: if it takes energy, will that energy give it more additional microstates than the energy currently gives its source? If so, thermodynamics urges the system to take the energy.
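That decision rule can be sketched directly. Assuming, for illustration, that the surroundings act as a large reservoir at temperature T, so that giving up energy q costs them entropy q / T (the function and parameter names are mine):

```python
def transfer_favored(q: float, delta_s_system: float, t_surroundings: float) -> bool:
    """Is it entropically favorable for the system to absorb energy q?

    The reservoir loses entropy q / T; the system gains delta_s_system.
    The transfer is favored when the total entropy goes up.
    """
    delta_s_surroundings = -q / t_surroundings
    return delta_s_system + delta_s_surroundings > 0

# The same parcel of energy buys the system more entropy than it costs
# the surroundings, so thermodynamics urges the system to take it:
print(transfer_favored(q=100.0, delta_s_system=1.0, t_surroundings=300.0))   # True
# Here the surroundings would lose more entropy than the system gains:
print(transfer_favored(q=100.0, delta_s_system=0.1, t_surroundings=300.0))   # False
```

The crossover sits exactly where the system's entropy gain equals q / T, which is the same boundary the Gibbs free energy captures.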

Let’s explore this dynamic by example, this time straying into the social realm.

Income Redistribution

Consider money as a form of energy that can be held by an individual.

Photo by rupixen.com on Unsplash

Like other forms of energy, having more money gives its holder greater entropy, because the holder can occupy more microstates. There are more options available to someone with lots of money than there are to someone with less money. Disregarding for a moment the entropy of conscious experiences, more money allows a person to travel to more places than someone with less money. Fancy restaurants, transcontinental flights, and so on.

However, the amount of entropy that money gives an individual is not simply linear. One quantity of money will create a different number of options for someone with lots of money than it will for someone with very little money. Consider $1,000,000. To someone who has $1,000, the addition of $1,000,000 gives them a huge increase in opportunity. However, to someone who has $1,000,000,000, the addition of $1,000,000 gives them very few options that were not already available to them.
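One toy way to model this diminishing return is to treat the number of "microstates" money opens up as roughly proportional to wealth, so the entropy of wealth grows like its logarithm. The proportionality is an illustrative assumption of mine, not a claim from economics:

```python
import math

def option_entropy(wealth: float) -> float:
    """Toy model: entropy of wealth ~ log of the number of affordable
    options, with the option count assumed proportional to wealth."""
    return math.log(wealth)

WINDFALL = 1_000_000

# The same $1,000,000 windfall, received at two very different wealth levels.
gain_modest = option_entropy(1_000 + WINDFALL) - option_entropy(1_000)
gain_billionaire = option_entropy(1_000_000_000 + WINDFALL) - option_entropy(1_000_000_000)

print(gain_modest)       # a large jump in available options
print(gain_billionaire)  # barely noticeable
```

In this model the entropy gained from a fixed sum shrinks as wealth grows, which is exactly the nonlinearity described above.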

If the economy were guided by the laws of thermodynamics, wealth redistribution would be an inevitable consequence of this difference in entropy given. Those with less money would be considered “cold” while those with more money would be considered “hot.” Eventually, just like hot coffee in a cold room, thermodynamics would dictate that the temperature evens out.

From the perspective of the cold people, the ambient temperature is high, since the average energy of the whole economy is greater than their own energy. Thus, they would be compelled to undergo an evaporation process, whereby they take energy from the environment to increase their entropy.

Photo by Hermes Rivera on Unsplash

From the perspective of the hot people, the ambient temperature is low, since the average energy of the whole economy is smaller than their own energy. Thus, they would be compelled to undergo a condensation process, whereby they release energy to the environment, increasing the entropy of their surroundings.

The greater the difference in energy, the more likely it is to become equalized. This is because, as the energy divide widens, the difference in the entropy-value of that energy also widens.

This same consideration is what determines whether water will condense or evaporate. Water will condense, releasing its energy to the environment, when its energy would produce more possibilities for its environment than it does for itself. On the other hand, water will evaporate when the energy of the environment produces more possibilities for the water than it does for itself.

Conclusion

Thermodynamics, the physics of relationship, is an insightful tool for determining the expected behavior of systems. To utilize its methods, we need ways to measure the energy and entropy of a system. In the physical sciences, these measurements are possible through a combination of calculation and counting.

In the social sciences and in psychology, the laws of thermodynamics are expected to be equally powerful, but extreme difficulty is encountered in determining the energies and entropies of systems. The central challenge is that these quantities are subjective. What carries “energy” in one mind or one culture may be very different from what carries energy in another. Even more difficult is the ascribing of social or mental “entropy,” which would describe the number of mental or social states available.

Photo by pawel szvmanski on Unsplash

Nevertheless, the urge to discover suitable solutions to “the hard problem,” governance, ethics, and our spiritual needs demands that we venture into these murky and uncertain waters. I believe we will find that thermodynamics offers a hugely undertapped resource in these efforts.