The transfer of energy dictates everything on Earth, from the movement of atoms to the global economy. In high school or first-year chemistry we learn that the rules governing the movement of energy are defined by just three laws of thermodynamics (four if you count the zeroth law). Yet this simplicity can be misleading, as demonstrated by how often the second law is misunderstood, misused, and abused. The second law states:

The entropy of an isolated system undergoing a real (irreversible) process must increase.

For some people the second law translates to “everything progresses from order to disorder” or “it is impossible for complexity to arise from randomness.” The biggest promoters of this misguided interpretation are advocates of intelligent design and/or irreducible complexity, which are thinly veiled pseudonyms for creationism. They argue that structures like the bacterial flagellum or the human eye could not evolve spontaneously simply because they are complex, a logically precarious stance given that these claims have been thoroughly debunked by evolutionary biologists.1

A quick bit of reflection on our day-to-day lives produces examples of complexity arising from less complex components. Ants, neurons, and transistors are just some examples of small building blocks that combine, under the right circumstances, into vastly more complex systems.

It is easy to argue that the above examples are the result of agency, but there are also many examples of objects naturally arranging themselves into complex structures. In fact, the natural world is very good at arranging atoms. Diamonds, ice crystals, and polycyclic aromatic hydrocarbons are just a few examples. Amphiphilic molecules (those with both water-loving and water-fearing parts) are an especially useful illustration of this tendency. When these molecules come into contact with water they spontaneously form beautiful monolayers, bilayers, micelles, and other structures.

So, returning to the second law of thermodynamics, the correct interpretation is that complex structures (like those listed above) are possible, but only at the cost of increased entropy in the surrounding environment: the total change, ΔS(system) + ΔS(surroundings), must still be greater than or equal to zero, so local order is paid for with disorder elsewhere.2

The tendency for a system to self-organize, when given the right circumstances and some energy from the surrounding environment, is one of the most important phenomena we observe. Yet this transition from energy to order is not obvious when looking at our current laws of thermodynamics. This has led some researchers to suggest that it may be possible to formalize a fourth law of thermodynamics, one that describes how complex systems arise.

Robert Hazen, a geologist at George Mason University, and others have hypothesized that this new law would need to encompass the following four components:

1. the number of units/elements/parts
2. the degree/strength of the interaction between the units
3. how energy flowing into the system affects the units
4. the changes in energy input

The right combination of these variables will result in a product that is greater than the sum of its parts. The goal for researchers is to produce a model that predicts the emergent structure, given n units in a volume x with a degree of interaction y and energy input z.
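To make that goal concrete, here is a deliberately crude toy model in Python. Everything in it (the function name, the parameters, the update rule) is invented for this sketch and is not taken from Hazen's work; it only illustrates the flavor of the n/y/z bookkeeping such a model would do. A population of ±1 units interacts with strength `coupling` and receives random "energy input" of size `noise`; with strong coupling, a global order emerges that no single unit contains.

```python
import random

def emergent_order(n, coupling, noise, steps=2000, seed=0):
    """Toy model of n units under interaction and energy input.
    Each unit is +1 or -1. On each update a random unit aligns with
    the coupling-weighted average of the population plus Gaussian
    'energy' noise. Returns the mean state: near 0 means disorder,
    near +/-1 means an emergent global order."""
    rng = random.Random(seed)
    units = [rng.choice((-1, 1)) for _ in range(n)]
    for _ in range(steps):
        i = rng.randrange(n)
        field = coupling * sum(units) / n + rng.gauss(0, noise)
        units[i] = 1 if field >= 0 else -1
    return sum(units) / n

# Strong interactions: the units spontaneously align.
ordered = emergent_order(n=100, coupling=5.0, noise=0.1)
# No interactions: the same units stay uncorrelated.
disordered = emergent_order(n=100, coupling=0.0, noise=0.1)
```

Tuning `coupling` and `noise` against each other is the whole game: the interesting structures live between total order and total randomness.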

Building a model like this is not a trivial task. There are a number of researchers, including physicists, chemists, and information theorists, currently attempting this feat.

Some of my favorite research on emergence is conducted by computer scientists. There are computer programs where users provide simple instructions to small, randomly organized subunits and then watch for spontaneous generations of complexity. One of the most famous of these programs is John Conway’s Game of Life, in which squares on a grid switch between black and white based on the states of the adjacent squares. These seemingly simple interactions can result in simple species like “gliders” or complex systems like the ones in the video below:





These little black-and-white creatures do not feed, but they do exhibit many other characteristics necessary for evolution. Their collisions (selection events) can lead to replication, death, an entirely new creature, or nothing at all. The study of these programs is an example of bottom-up research into the emergence of complexity.3
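The rules behind all of this are compact enough to sketch in a few lines of Python. This is a minimal implementation of Conway's rules on an unbounded grid, written for illustration; it tracks a glider, the simplest "species," which re-forms one square diagonally every four generations.

```python
from collections import Counter

def step(live):
    """Advance one generation of Conway's Game of Life.
    `live` is a set of (row, col) tuples for live cells on an
    unbounded grid."""
    # Count the live neighbours of every cell adjacent to a live cell.
    neighbour_counts = Counter(
        (r + dr, c + dc)
        for (r, c) in live
        for dr in (-1, 0, 1)
        for dc in (-1, 0, 1)
        if (dr, dc) != (0, 0)
    )
    # A cell is alive next generation if it has exactly 3 live
    # neighbours, or has 2 and is already alive.
    return {
        cell
        for cell, n in neighbour_counts.items()
        if n == 3 or (n == 2 and cell in live)
    }

# A glider: five cells that travel one square down-and-right
# every four generations.
glider = {(0, 1), (1, 2), (2, 0), (2, 1), (2, 2)}
state = glider
for _ in range(4):
    state = step(state)
assert state == {(r + 1, c + 1) for (r, c) in glider}
```

Nothing in `step` mentions gliders, guns, or replicators; all of that behavior emerges from the two-condition survival rule.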

Top-down research is also underway by, for example, physicists who study the formation of sand ripples under water waves (pdf). By manipulating the variables (amount of sand, frequency of waves, etc.) they are attempting to uncover the equations that predict ripple emergence.

There is a small subset of chemists who are particularly interested in emergent phenomena: those who are trying to understand/recreate abiogenesis, the generation of living organisms from non-living building blocks. Abiogenesis is dependent on a series of emergent events. These include cell membrane formation, self-replicator formation, protein/RNA/DNA folding, and a myriad of other emergent events that have defined life on earth.

Recently, the number of chemists researching self-assembly and nano- to micrometer-sized formations (also emergent phenomena) has significantly increased due to the impending end of Moore’s law and the promise of nanotechnology. These phenomena usually involve the spontaneous emergence of structures through the control of reaction conditions.

I feel confident that chemists are particularly well suited to study emergence and contribute to the formulation of a fourth law of thermodynamics. We know how to control the concentration of subunits, the interactions between these subunits based on molecular structure or nanoparticle surfactants, the polarity of the solvent, and other variables in order to achieve specific emergent phenomena. Note the similarities between this list of variables and Professor Hazen’s list (1-4 above). Through the sheer force of combinatorial chemistry we are already inadvertently making progress towards this goal. It may just be that we need to put all of these pieces together in one model.

There is one significant caveat I have to mention when talking about emergent phenomena: I have used the word complexity several times, relying on an intuitive sense of what it means. Yet there is no universally accepted definition of complexity. How do we quantify this term in a way that would allow us to compare the complexity of a micelle to the complexity of the human brain?
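One way to feel the difficulty: the most obvious candidate metric, Shannon entropy, is zero for perfect order and maximal for pure noise, yet neither extreme is what we intuitively mean by complex; a micelle and a brain both live somewhere in between. A small sketch (my own illustration, not a measure anyone in this post proposes):

```python
import math
import random
from collections import Counter

def shannon_entropy(s):
    """Shannon entropy of a string, in bits per symbol."""
    n = len(s)
    return -sum(
        count / n * math.log2(count / n)
        for count in Counter(s).values()
    )

perfect_order = "a" * 1000  # a "crystal": totally predictable
rng = random.Random(0)
pure_noise = "".join(rng.choice("abcd") for _ in range(1000))  # a "gas"

# Entropy is 0 bits for the crystal and about 2 bits for the noise,
# so by this metric random noise outranks any organized structure;
# that is not what we mean by complexity at all.
low = shannon_entropy(perfect_order)
high = shannon_entropy(pure_noise)
```

Whatever a fourth law quantifies, it cannot simply be entropy; it needs a quantity that peaks between the crystal and the gas.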

The unfortunate answer, as pointed out by Hazen in Genesis: The Scientific Quest for Life’s Origin, is that we may simply not have the tools or language necessary to define complexity yet. Hazen likens it to the effort to define water before understanding atoms. We now know it as H₂O, but before identifying water’s atomic constituents there was no unifying theme. Is it a solid, liquid, or gas? The answer is ‘yes’. A similar conundrum faced scientists trying to classify the relationships between animals before the theory of evolution. Do we group them by size, shape, color…?4

I really hope that a new thermodynamic law describing emergence is on the horizon. I find satisfaction in unifying theories that, like evolution, are simple, elegant, and obvious in retrospect.

References:

[1] For an entertaining/informative decimation of intelligent design, watch this lecture by Prof. Kenneth Miller (one-hour lecture).

[2] For a particularly thought-provoking type of emergent complexity read up on Boltzmann Brains.

[3] Here are some more interesting results from John Conway’s Game of Life. One of my favorites demonstrates a dynamic Droste effect (take note of the beginning shapes before it zooms out).

[4] Many of the examples I used in the above post were taken from “Genesis: The Scientific Quest for Life’s Origin” by Robert M. Hazen and “Emergence: From Chaos to Order” by John H. Holland.