Part 3: Entropy, Energy and Order (excerpt from “Origin of Mind: a history of systems”)

Another problem with measuring complexity can be approached through a related concept: entropy. Entropy was first defined by Rudolf Clausius. He was interested in generating work from steam engines and found that "Heat can never pass from a colder to a warmer body without some other change …" (Clausius, 1867). Over time this principle was found to be universal, and it became known as the second law of thermodynamics. The second law states that in closed systems, randomness always increases. As a result, many theorists think of entropy as the inevitable loss of order (Carroll, 2010, p. 30; Kauffman, 2008, p. 13; McFadden, 2000, p. 132; Lederman & Hill, 2004, p. 183). Everything just naturally falls apart.

If a fundamental law of this universe is that things fall apart, then how does this apply to the Big Bang? What fell apart, what was 'lost,' between then and now? The second law is often interpreted as saying that 'order' was lost. But how does the loss of order lead to humanity? To know this we need to examine what 'order' means in a thermodynamic context. Thermodynamics is the study of heat exchange (thermo = heat, dynamic = change). Thermodynamic 'order' is a measure of heat differentials. This makes sense because the founders of thermodynamics were interested in producing work through heat exchange. The problem is that the heat differentials necessary to do work can never be permanent. Entropy reduces the heat differential over time, bringing the system to equilibrium; hence the impossibility of a perpetual motion machine. The second law states that order in the form of heat differentials is always lost.
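For reference, the standard textbook form of these ideas (conventional notation, not the book's own) is compact: Clausius's entropy tracks reversible heat flow per unit temperature, and the second law bounds its change in an isolated system,

$$dS = \frac{\delta Q_{\mathrm{rev}}}{T}, \qquad \Delta S_{\mathrm{isolated}} \ge 0.$$

Extracting work requires a temperature differential; as entropy climbs toward its maximum, the differential, and with it the capacity for work, disappears.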

So what is heat? Heat is a macroscopic measurement of microscopic motion. Groups of atoms and molecules that vibrate more are hotter. When they vibrate they move, but relative to what? They move relative to each other. Since Einstein, there has been no fixed point from which to judge motion. Matter in motion is only 'in motion' relative to other matter. If motion is relative then heat is relative. Relative to what? Relative to everything else in the closed system, in the box. This, of course, raises the question of how finely grained you choose to measure differences inside your box, or between the box and its environment, its 'boundary.' Something can only be hot if something else is cold. There must be a relative differential to make both heat and motion meaningful. More relative motion in the box is 'hotter' only at the level of fine-graining the box defines, and only if there is a 'cooler' exterior. There is neither heat nor motion if there is no measurable difference created by delineating a boundary that defines this difference. Heat and motion are relative to how you fine-grain the measurements that define them. If everything is the same temperature then everything is moving at the same speed relative to everything else. This is called equilibrium. It is the state towards which entropy relentlessly moves. But where do we stop counting what is counted as 'everything'? Where do we draw the fine-graining line?
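The identification of heat with microscopic motion has a standard quantitative form, stated here as background rather than anything the excerpt derives: for a monatomic ideal gas, temperature is proportional to the mean kinetic energy of the particles' motion relative to one another,

$$\langle E_k \rangle = \tfrac{3}{2} k_B T,$$

where $k_B$ is Boltzmann's constant. Note that the average is taken over relative motions within a chosen volume, which is exactly where the 'relative to what?' question bites.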

In a nutshell, if both motion and heat are relative measures, then the only really important measurement in entropy is difference. Entropy inexorably lowers heat/motion differentials. This puts us in a strange position for using entropy to describe the difference between the Big Bang and now. The initial state was a tiny 'hot' soup of energy. There were no heat differentials in the initial state. Everything was moving equally fast. Almost fourteen billion years later there are heat differentials everywhere. My coffee is moving faster than this sofa, the sun is moving faster than the Earth, and your brain is hopefully moving faster than your toes. Differences are everywhere now. The question is, if there were no differences at the beginning and differences in temperature are always lost as per the second law, why do we have more differences now? Why isn't the universe one big cooling gas cloud or crystal?

Scientists like Eric Chaisson and Fred Spier talk about the rise of complexity in the universe (Chaisson, 2001; Spier, 2010). Other scientists like Sean Carroll talk about the loss of order (Carroll, 2010). No one denies the second law, but how it applies to a universe that appears to be 'complexifying' remains mysterious. This problem is often related to the 'arrow of time' and it remains one of the great intractable problems in science. Many theorists try to get around it by saying that complexity (gain of order) is built up in some places only by displacing masses of entropy (loss of order) elsewhere, so that the overall net loss of order fits the second law. This solution has two flaws.

First, it presumes that the universe is moving into empty space and that empty space can be used as an entropy 'garbage can.' The problem is that the universe is not like a cupcake with a firecracker in it, where parts of the cupcake 'move into' the air around it. The universe is not moving into an absolute Newtonian empty space, away from a fixed point source. The universe is expanding, growing, everything relative to everything else. Everything is receding from everything else at a rate proportional to its distance. The universe is like bread dough that is expanding to become a loaf. There is no empty space that can be filled with disordered matter-energy. Everything around us, including the 'space,' was all here at the beginning; it has simply been expanding ever since. Empty space isn't rushing into our universe to fill the ever-widening gaps. Where would it come from?
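That proportionality is the standard Hubble relation (conventional form, included here only as a gloss on the bread-dough image),

$$v = H_0 d,$$

where $v$ is the recession velocity between two points, $d$ the distance between them, and $H_0$ the Hubble constant. A uniformly stretching dough gives exactly this law from every raisin's point of view, which is why no raisin, and no galaxy, can claim to be the centre.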

The second problem with displacing entropy to explain the growth of complex order is even harder to explain away. No matter how you cut the cake, the universe moved from a thermodynamically undifferentiated state to a thermodynamically differentiated state. According to the second law of thermodynamics, this should never happen. Correction: this should never happen in a 'closed' system.

But what if the universe were scale-free? What if the only box put around the universe was the illusory one we imagine by stopping our measurements at certain scale-levels? What if the only thing that ‘closes’ the universe system is the limits of our tests?

When Claude Shannon was developing information theory he was looking for a term for the measure of uncertainty in a message. He was going to call this measure the 'uncertainty function' but John von Neumann recommended otherwise:

“You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one really knows what entropy really is, so in a debate you will always have the advantage” (Tribus & McIrvine, 1971).

The problem has always been that concepts like entropy, complexity and order have been hard to define. The result has been unclear concepts used to describe theories that are unclear about the big picture. We have no theory of everything. To remedy this problem, this investigation works towards a redefinition of complexity. As with 'entropy,' we are looking for some way to measure the change from the Big Bang until now. This is extremely difficult and it is, to a deep extent, the impetus behind this investigation.

One thing that does seem likely is that information will play a key role. Shannon's information theory is the only known metric for dynamic change. Canonically, Shannon information is understood as a measure of the reduction of uncertainty at a receiver. Where entropy is the increase in uncertainty, or randomness, or noise within a spectrum of interaction, information is the increase in certainty, or regularity, or predictability within a spectrum of interaction. Information is the regularities within a less regular (i.e. noisy) environment (Deacon, 2008). This investigation tries to forge a bond between complexity and information so that we can understand the evolution of the universe not just as a loss of order, but also as a gain of 'order,' as defined by complexity and information. It casts entropy and the loss of order as just one side of the coin the universe flips. The other side of entropy is positive symmetry and the gain of order. Both entropy and positive symmetry are universal, and both are statistical steps into an unknown future.
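To make the Shannon side of this concrete, here is a minimal sketch in Python (the strings are illustrative choices of mine, and the formula is Shannon's standard per-symbol H, which measures the spread of a symbol distribution rather than sequential structure):

```python
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    """Shannon's H: average uncertainty per symbol, in bits."""
    counts = Counter(message)
    total = len(message)
    return -sum((n / total) * log2(n / total) for n in counts.values())

print(shannon_entropy("aaaaaaaaaaaaaaaa"))  # 0.0: one symbol, total certainty
print(shannon_entropy("aabbaabbaabbaabb"))  # 1.0: two equally likely symbols
print(shannon_entropy("abcdefghijklmnop"))  # 4.0: sixteen equally likely symbols
```

Uncertainty here is a number defined only relative to a set of possible states, which is exactly the relativity the rest of this section presses on.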

So now back to the question. How do we measure the change in the universe between the Big Bang and now? Is the universe more complex or less complex now? The answer depends on scale. If you presume that there are no scale levels smaller than quanta, then you can presume the initial state of the Big Bang was a uniform mix of energy, a point source. You simply say there cannot be any differences between the things in this point source because there are no different locations that can maintain different states within it. Everything is the same because it is all in the same place. The amount of information needed to describe that point source is minimal. That is what a point source is: simple, easy to describe, containing just one bit of information. Unfortunately, information can only exist as a relative measure between existing states and possible states. It only exists as measured against a noisy environment. What is the environment for the universe? The universe was a 'point source' relative to what?
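The claim that a point source is 'easy to describe' can be illustrated with description length, approximated here by off-the-shelf compression (a rough proxy of my own choosing; the byte counts are approximate and machine-dependent):

```python
import os
import zlib

# Compressed size as a crude proxy for how much there is to say about a state.
uniform = bytes(1_000_000)         # one repeated value: a featureless 'point source'
scrambled = os.urandom(1_000_000)  # maximal differentiation at this grain

print(len(zlib.compress(uniform)))    # roughly 1 KB: almost nothing to specify
print(len(zlib.compress(scrambled)))  # roughly 1 MB: every detail must be spelled out
```

Notice that even this sketch smuggles in a grain: a 'byte' is already a choice of fine-graining, which is the point the next paragraphs press.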

Supposedly, this simple Big Bang 'bit' contained the whole universe. The question is, can we really believe that the entire universe was simple, that it had no fine-grained structure? This investigation presumes that if we were the size of a quark component and looked into the initial state, we would find a whole universe of pattern. This is what a scale-free universe means: pattern at all levels. The result of this point of view is that there was an unknown level of differentiation packed into that tiny Big Bang. This unknown level of differentiation has changed into the current, known level of differentiation over the last fourteen billion years. Fortunately, this presumption ends entropy's exclusivity.

Physical entropy is a one-way process. It goes from heat difference to heat equality, from order to randomness, from information to noise. If the sender were the Big Bang, the current universe were the receiver, and entropy were the only rule, then the initial state should have been highly differentiated and rich in information. Over the last fourteen billion years we should have lost that information. We should be noise. We should be a gas cloud. Obviously, we are not.

The problem of defining complexity is the problem of 'compared to what?' At what scale level do we arbitrarily stop counting differences, and how do we choose to box, or limit, what we want to measure? Just because we cannot see fine-grained pattern yet does not mean it does not exist. To presume that it does not exist because our current technology has limits is to repeat the prejudice of previous centuries.

Was the initial state more complex than the current state? We don't know. It didn't have more matter-energy; it had exactly the same amount of matter-energy. Did this matter-energy have more different relationships than current matter-energy? This depends on the scale-level we measure those differences on. The bottom line is that a scale-free universe makes measures of complexity relative, not absolute. My ninety-kilogram body may be considered more complex than a ninety-kilogram rock because I contain more movement differentials. However, it may be considered less complex than ninety kilograms of vaporised rock, because as this vapour disperses it will enter billions of increasingly randomly differentiated heat states. My body maintains a relatively limited, predictable and repetitive number of heat states. Does this mean that vapour is more complex than I am? When I imagine the pattern that constitutes me, it is not Grassberger's perfectly ordered checkerboard, nor is it an image of maximum randomness. It is somewhere in between.
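That 'in between' can be given a toy form. The sketch below is my own construction, not Grassberger's actual measure: it scores a perfect checkerboard, a purely random grid, and a part-ordered grid by the Shannon entropy of their 2×2 blocks, and the mixed pattern lands between the two extremes.

```python
import random
from collections import Counter
from math import log2

def block_entropy(grid, side, k=2):
    """Shannon entropy (bits) of the distribution of k-by-k blocks in a square grid."""
    blocks = Counter(
        tuple(grid[(y + dy) * side + (x + dx)] for dy in range(k) for dx in range(k))
        for y in range(side - k + 1) for x in range(side - k + 1)
    )
    total = sum(blocks.values())
    return -sum((n / total) * log2(n / total) for n in blocks.values())

side = 64
checker = [(x + y) % 2 for y in range(side) for x in range(side)]
noise = [random.randint(0, 1) for _ in range(side * side)]
# Ordered rows interrupted by random rows: structure with variation.
mixed = [((x + y) % 2) if y % 3 else random.randint(0, 1)
         for y in range(side) for x in range(side)]

for name, grid in [("checkerboard", checker), ("mixed", mixed), ("random", noise)]:
    print(f"{name}: {block_entropy(grid, side):.2f} bits per 2x2 block")
```

On this one-dimensional score the random grid still comes out 'richest,' which is precisely the complaint against using raw entropy as a complexity measure: a good measure should peak in between, not at the noisy end.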

This investigation denies both the simplistic point of view of 'the growth of entropy,' represented by cosmologists like Carroll, and that of 'the growth of complexity,' represented by Spier and Chaisson. In their place it takes a new look at how the relative differences that define our current universe came to exist. Entropy is only part of this story. Complexity is only part of this story. This investigation introduces new ways to think about these concepts.

One of the keys to this new approach is reimagining 'order.' Order is not some absolute measure of heat difference or 'complexity' or randomness or information. Order is also relative. In this investigation 'order' is defined as the kind of persistent differences that have appeared since the Big Bang, absent any fatuous comparisons with an unknown 'initial state.' Order here means those regularities in relative motion differentials that have accrued since shortly after the 'initial state.' At some point shortly after that state, there began to persist certain regularities of relative motion, the regularities that constitute matter as we know it. These regularities did not exist before this point and they have persisted ever since. They are what this investigation calls 'order.' The addition of billions of emergent types of regularities to this initial 'order from the unknown' is the story of humanity.
