Part 3: Entropy, Energy and Order (excerpt from “Origin of Mind: a history of systems”)

Another problem with measuring complexity can be approached through a related concept, entropy. Entropy was first defined by Rudolf Clausius. He was interested in generating work from steam engines and found that "Heat can never pass from a colder to a warmer body without some other change …" (Clausius, 1867). Over time this principle was found to be universal and became known as the second law of thermodynamics. The second law states that in a closed system randomness, measured as entropy, never decreases. As a result many theorists think of entropy as the inevitable loss of order (Carroll, 2010, p. 30; Kauffman, 2008, p. 13; McFadden, 2000, p. 132; Lederman & Hill, 2004, p. 183). Everything just naturally falls apart.

If a fundamental law of this universe is that things fall apart, then how does this apply to the Big Bang? What fell apart between then and now? What was 'lost' between the Big Bang and the present? The second law is often interpreted as saying that 'order' was lost. But how does the loss of order lead to humanity? To answer this we need to examine what 'order' means in a thermodynamic context. Thermodynamics is the study of heat exchange (thermo = heat, dynamics = power). Thermodynamic 'order' amounts to a usable heat differential. This makes sense because the founders of thermodynamics were interested in producing work through heat exchange. The problem is that the heat differentials necessary to produce work can never be permanent. Entropy reduces any heat differential over time, bringing the system to equilibrium; hence the impossibility of a perpetual motion machine. The second law states that order, in the form of heat differentials, is always lost.
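
To see the arithmetic behind this claim, consider a rough sketch in code (the temperatures and the quantity of heat below are invented purely for illustration). Using Clausius's definition, the entropy change of a body is the heat it gains divided by its temperature, so whenever heat flows from a hotter body to a colder one the total entropy of the pair rises, which is just another way of saying the usable differential is being spent.

    # Clausius entropy bookkeeping for heat flowing between two reservoirs.
    # All numbers are invented for illustration.
    Q = 1000.0       # joules of heat transferred from hot to cold
    T_hot = 500.0    # kelvin
    T_cold = 300.0   # kelvin

    dS_hot = -Q / T_hot    # entropy lost by the hotter body
    dS_cold = Q / T_cold   # entropy gained by the colder body
    dS_total = dS_hot + dS_cold

    print(f"hot body:  {dS_hot:+.2f} J/K")
    print(f"cold body: {dS_cold:+.2f} J/K")
    print(f"total:     {dS_total:+.2f} J/K (positive: the differential is being spent)")

Run in reverse, with heat flowing from cold to hot, the total would come out negative, which is exactly what the second law forbids.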

So what is heat? Heat is a macroscopic measurement of microscopic motion. Groups of atoms and molecules that vibrate more are hotter. When they vibrate they move, but relative to what? They move relative to each other. Since Einstein there has been no fixed point from which to judge motion. Matter in motion is only 'in motion' relative to other matter. If motion is relative, then heat is relative. Relative to what? Relative to everything else in the closed system, in the box. This, of course, raises the question of how finely you choose to measure differences inside your box, or between the box and its environment, its 'boundary.' Something can only be hot if something else is cold. There must be a relative differential to make both heat and motion meaningful. More relative motion in the box is 'hotter' only at the level of fine-graining the box defines, and only if there is a 'cooler' exterior. There is neither heat nor motion if there is no measurable difference created by delineating a boundary that defines that difference. Heat and motion are relative to how finely you grain the measurements that define them. If everything is the same temperature, then everything is, on average, moving at the same speed relative to everything else. This is called equilibrium. It is the state towards which entropy relentlessly moves. But where do we stop counting what counts as 'everything'? Where do we draw the fine-graining line?

In a nutshell, if both motion and heat are relative measures then the only really important measurement in entropy is difference. Entropy inexorably lowers heat/motion differentials. This puts us in a strange position for using entropy to describe the difference between the Big Bang and now. The initial state was a tiny ‘hot’ soup of energy. There were no heat differentials in the initial state. Everything was moving equally fast. Almost fourteen billion years later there are heat differentials everywhere. My coffee is moving faster than this sofa, the sun is moving faster than the Earth and your brain is hopefully moving faster than your toes. Differences are everywhere now. The question is, if there were no differences at the beginning and differences in temperature are always lost as per the second law, why do we have more differences now? Why isn’t the universe one big cooling gas cloud or crystal?

Scientists like Eric Chaisson and Fred Spier talk about the rise of complexity in the universe (Chaisson, 2001; Spier, 2010). Other scientists, like Sean Carroll, talk about the loss of order (Carroll, 2010). No one denies the second law, but how it applies to a universe that appears to be 'complexifying' remains mysterious. This problem is often related to the 'arrow of time' and it remains one of the great intractable problems in science. Many theorists try to get around it by saying that complexity (gain of order) is built up in some places only by displacing masses of entropy (loss of order), so that the overall net loss of order fits the second law. This solution has two flaws.

First, it presumes that the universe is moving into empty space and that empty space can be used as an entropy 'garbage can.' The problem is that the universe is not like a cupcake with a firecracker in it, where parts of the cupcake 'move into' the air around it. The universe is not expanding into an absolute, Newtonian empty space, away from a fixed point source. The universe is expanding, it is growing, everything relative to everything else. Everything is receding from everything else at a rate proportional to its distance. The universe is like bread dough rising to become a loaf. There is no empty space out there that can be filled with disordered matter-energy. Everything around us, including the 'space,' was all here at the beginning; it has simply been expanding ever since. Empty space is not rushing into our universe to fill the ever-widening gaps. Where would it come from?

The second problem with displacing entropy to explain the growth of complex order is even harder to explain away. No matter how you cut the cake, the result is that the universe moved from a thermodynamically undifferentiated state to a thermodynamically differentiated state. According to the second law of thermodynamics, the law of entropy, this should never happen. Correction: it should never happen in a 'closed' space.

But what if the universe were scale-free? What if the only box put around the universe was the illusory one we imagine by stopping our measurements at certain scale-levels? What if the only thing that ‘closes’ the universe system is the limits of our tests?

When Claude Shannon was developing information theory he was looking for a term for the measure of uncertainty in a message. He was going to call this measure the 'uncertainty function,' but John von Neumann recommended otherwise:

“You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one really knows what entropy really is, so in a debate you will always have the advantage” (Tribus & McIrvine, 1971).

The problem has always been that concepts like entropy, complexity and order are hard to define. The result has been unclear concepts used to describe theories that are themselves unclear about the big picture. We have no theory of everything. To remedy this, the present investigation works towards redefining complexity. Like 'entropy,' the new definition must offer some way to measure the change from the Big Bang until now. This is extremely difficult and it is, to a deep extent, the impetus behind this investigation.

One thing that does seem likely is that information will play a key role. Shannon's information theory is the only known metric for dynamic change. Canonically, Shannon information is understood as the measure of the reduction of uncertainty at a receiver. Where entropy is the increase in uncertainty, randomness or noise within a spectrum of interaction, information is the increase in certainty, regularity or predictability within that spectrum. Information consists of regularities within a less regular (i.e. noisy) environment (Deacon, 2008). This investigation tries to forge a bond between complexity and information so that we can understand the evolution of the universe not just as a loss of order, but also as a gain of 'order,' as defined by complexity and information. It casts entropy and the loss of order as just one side of the coin the universe flips. The other side of entropy is positive symmetry and the gain of order. Both entropy and positive symmetry are universal and both are statistical steps into an unknown future.

So now back to the question. How do we measure the change in the universe between the Big Bang and now? Is the universe more complex or less complex now? The answer depends on scale. If you presume that there are no scale levels smaller than quanta, then you can presume that the initial state of the Big Bang was a uniform mix of energy, a point source. You simply say that there cannot be any differences between the things in this point source because there are no different locations within it that could maintain different states. Everything is the same because it is all in the same place. The amount of information needed to describe that point source is minimal. That is what a point source is: simple, easy to describe, containing just one bit of information. Unfortunately, information can only exist as a relative measure between existing states and possible states. It only exists as measured against a noisy environment. What is the environment for the universe? The universe was a 'point source' relative to what?

Supposedly, this simple Big Bang 'bit' contained the whole universe. The question is, can we really believe that the entire universe was simple, that it had no fine-grained structure? This investigation presumes that if we were the size of a quark component and looked into the initial state, we would find a whole universe of pattern. This is what a scale-free universe means: pattern at all levels. The result of this point of view is that there was an unknown level of differentiation packed into that tiny Big Bang. This unknown level of differentiation has changed into the current, known level of differentiation over the last fourteen billion years. Fortunately, this presumption ends entropy's exclusivity.

Physical entropy is a one-way process. It goes from heat difference to heat equality, from order to randomness, from information to noise. If the sender were the Big Bang, the current universe were the receiver, and entropy were the only rule, then the initial state should have been highly differentiated and rich in information. Over the last fourteen billion years we should have lost that information. We should be noise. We should be a gas cloud. Obviously, we are not.

The problem of defining complexity is the problem of 'compared to what?' At what scale level do we arbitrarily stop counting differences, and how do we choose to box, or limit, what we want to measure? Just because we cannot see fine-grained pattern yet does not mean it does not exist. To presume that it does not exist because our current technology has limits is to repeat the prejudice of previous centuries.
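
The dependence on graining can be made concrete with a small sketch (the toy 'field' and the block sizes below are invented for illustration): the same sequence of values yields a different count of distinguishable states depending on how coarsely we average it before we look.

    # The count of 'differences' depends on the grain chosen for the measurement.
    # The field is a made-up example: a fine alternation riding on a coarse one.
    field = [(i // 16) % 2 + 0.1 * (i % 2) for i in range(128)]

    def coarse_grain(values, block):
        """Average the field over non-overlapping blocks of the given size."""
        return [sum(values[i:i + block]) / block
                for i in range(0, len(values), block)]

    for block in (1, 2, 16, 128):
        grained = coarse_grain(field, block)
        distinct = len({round(v, 6) for v in grained})
        print(f"block size {block:3d}: {distinct} distinguishable states")

At the finest grain the sketch reports four distinguishable states, at middling grains two, and at the coarsest only one; none of these counts is 'the' differentiation of the field, they are artefacts of where the fine-graining line was drawn.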

Was the initial state more complex than the current state? We don't know. It didn't have more matter-energy. It had exactly the same amount of matter-energy. Did this matter-energy have more different relationships than current matter-energy? That depends on the scale level we measure those differences at. The bottom line is that a scale-free universe makes measures of complexity relative, not absolute. My ninety-kilogram body may be considered more complex than a ninety-kilogram rock because I contain more movement differentials. However, it may be considered less complex than ninety kilograms of vapourized rock, because as that vapour disperses it will enter billions of increasingly randomly differentiated heat states. My body maintains a relatively limited, predictable and repetitive number of heat states. Does this mean that vapour is more complex than I am? When I imagine the pattern that constitutes me, it is not Grassberger's perfect checkerboard image to the left, nor is it the maximum randomness image to the right. It is somewhere in between.

This investigation rejects both simplistic points of view: 'the growth of entropy' as represented by cosmologists like Carroll, and 'the growth of complexity' as represented by Spier and Chaisson. In their place it takes a new look at how the relative differences that define our current universe came to exist. Entropy is only part of this story. Complexity is only part of this story. This investigation introduces new ways to think about these concepts.

One of the keys to this new approach is reimagining 'order.' Order is not some absolute measure of heat difference or 'complexity' or randomness or information. Order is also relative. In this investigation 'order' is defined as the kinds of persistent differences that have appeared since the Big Bang, absent any fatuous comparison with an unknown 'initial state.' Order here means those regularities in relative motion differentials that have accrued since shortly after the 'initial state.' At some point shortly after that state there began to persist certain regularities of relative motion that constitute matter as we know it. These regularities did not exist before this point and they have continued to exist ever since. They are what this investigation calls 'order.' The addition of billions of emergent types of regularity to this initial 'order from the unknown' is the story of humanity.

Part 2: Complexity and Information (excerpt from “Origin of Mind: a history of systems”)

So, how does transformation occur in a scale-free universe? How do we talk about 'what' caused 'what,' and 'when,' if everything we currently know is made of some unknown level of fine-grained energy relationships? We cannot. Until we develop better measurement technology we will not know for sure what is down there. What we can say for sure is that the billions of different 'things' that we currently know constitute our universe are all composed of what looks like the same matter-energy. We all know that a teaspoon is different from a bluebird, but we also know that the two are made of exactly the same matter-energy. What makes the bluebird different from the teaspoon is not different elementary particles; it is how the very same elementary particles maintain different energy relationships. In fact, different energy relationships are what constitute the different forms of matter.

This investigation pursues how these different relationships were sampled, maintained and reproduced from an apparently undifferentiated Big Bang to the vastly differentiated universe of today. However, presuming a scale-free universe has important ramifications for how we choose to measure this differentiation.

How can we measure the accrued differentiation of the universe? Can we call this measure complexity, and if so, is the universe getting more or less complex over time? It started with a tiny, uniform 'hot soup' of matter-energy and, "from so simple a beginning endless forms most beautiful and most wonderful have been, and are being, evolved" (Darwin, n.d.). Obviously it is getting more complex – isn't it? Well, that depends on how we define 'complexity.'

Common sense definitions of complexity usually rely on measures of 'more' and 'different.' The more elements a system has, and the more different they are, the more complex it is. Biologist Stephen Jay Gould refers to this measure when he describes complexity as the "number and form of components" (Gould, 2002, p. 1264; Gould, 1996). By his definition, the more different components an organism has, the more complex it is.
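
A minimal sketch of that definition (the part inventories below are invented toy examples, not Gould's own data): a Gould-style measure simply counts how many different kinds of component an entity has.

    # Gould-style complexity: the number of different kinds of components.
    # Both inventories are invented toy examples.
    def gould_complexity(parts):
        return len(set(parts))

    bacterium = ["membrane", "ribosome", "ribosome", "flagellum", "chromosome"]
    mammal = ["membrane", "ribosome", "chromosome", "nucleus", "mitochondrion",
              "neuron", "muscle cell", "liver cell", "bone cell"]

    print(f"bacterium: {gould_complexity(bacterium)} kinds of parts")
    print(f"mammal:    {gould_complexity(mammal)} kinds of parts")

The number such a sketch prints depends entirely on which parts we decided were worth listing, a point taken up again below.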

On the other hand, physics and computer science definitions of complexity often refer either to information theory or algorithmic complexity. Peter Grünwald and Paul Vitanyi summarize the relationship between the two nicely:

“Both theories aim at providing a means for measuring ‘information.’ They use the same unit to do this: the bit. In both cases, the amount of information in an object may be interpreted as the length of a description of the object. In the Shannon approach, however, the method of encoding objects is based on the presupposition that the objects to be encoded are outcomes of a known random source—it is only the characteristics of that random source that determine the encoding, not the characteristics of the objects that are its outcomes. In the Kolmogorov complexity approach we consider the individual objects themselves, in isolation so-to-speak, and the encoding of an object is a short computer program (compressed version of the object) that generates it and then halts” (Grünwald & Vitanyi, 2004).

Note first that the terms 'information' and 'complexity' are nearly interchangeable here. Both theories measure information, even though Kolmogorov calls his measure complexity. Shannon developed his definition first, and his solution was to measure information as a relationship between the outcome that does occur and the probability distribution of what could have occurred. He called this statistical relationship 'entropy' because it has the same mathematical form as the statistical entropy Boltzmann had already defined over the probabilities of possible states. What Shannon was essentially able to do was show how any system in the universe could be described as a communication process: the selection of a signal (what does occur) from a limited set of noise (what could occur).
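
Shannon's measure can be written down in a few lines (the two source distributions below are invented for illustration): the entropy of a source is computed from the probabilities of what could occur, so a maximally uncertain source carries more bits per selection than a mostly predictable one.

    import math

    def shannon_entropy(probs):
        """H = -sum(p * log2(p)): the average uncertainty of a source, in bits."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # Two invented sources over the same four possible signals.
    uniform = [0.25, 0.25, 0.25, 0.25]   # any signal could occur equally
    skewed  = [0.85, 0.05, 0.05, 0.05]   # one signal almost always occurs

    print(f"uniform source: {shannon_entropy(uniform):.3f} bits per selection")
    print(f"skewed source:  {shannon_entropy(skewed):.3f} bits per selection")

Note that nothing in the calculation describes any particular signal; the whole measure lives in the assumed distribution of what could have occurred.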

Kolmogorov was unsatisfied with this solution. He was a mathematician who wanted an absolute measurement of the object itself, not a probability. He got it by limiting the definition of the system to what was inside it. For him, any system, or entity, is only as complex as the length of the shortest description, the minimum number of bits, needed to reproduce it. By eliminating the need to measure a system's relationship to its environment, Kolmogorov invented a way to count, in effect, what something is. The result is that Kolmogorov complexity is essentially a measure of how random something is. This is because Kolmogorov complexity does not count the regularities, which are compressed away; his measure is mostly a count of the random irregularities within a system.
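
Kolmogorov complexity cannot be computed exactly, but compressed length is a common practical stand-in, and a hedged sketch conveys the flavour of the measure (the two strings below are invented): a perfectly periodic string compresses to almost nothing, while a pseudo-random string of the same length barely compresses at all, so the compressor's output is mostly a tally of the irregularities.

    import random
    import zlib

    def compressed_length(data):
        """Length in bytes after zlib compression: a rough, computable
        stand-in for the (uncomputable) Kolmogorov complexity."""
        return len(zlib.compress(data, 9))

    random.seed(0)
    regular = b"01" * 5000                                      # pure regularity
    noisy = bytes(random.getrandbits(8) for _ in range(10000))  # pseudo-random bytes

    print(f"regular: {len(regular)} bytes -> {compressed_length(regular)} compressed")
    print(f"random:  {len(noisy)} bytes -> {compressed_length(noisy)} compressed")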

Kolmogorov’s and Gould’s ideas of complexity are both similar in that they say nothing about how a system came to be structured as it is, but once it exists, they can measure its complexity. They both measure a static amount of structure. Shannon does not treat system structure as static, but as a dynamic relationship that unfolds over time. Unfortunately, Shannon’s solution just pushes the static assumption a little further down the road by assuming the ‘known random source” from which the outcome can be selected. To define Shannon information one needs to have absolute statistical knowledge of the closed system within which the signal is selected, its ‘environmental’ or ‘noise’ or ‘entropic’ potential. Kolmogorov and Gould count static amounts. Shannon counts statistical potential within a static limit.

Now consider how complex complexity can get. The universe contains no closed systems and no system in the universe is static. So how does one use such metrics to count what actually occurs? It turns out that both Gould's and Kolmogorov's metrics are good for measuring systems that already exist and can easily be idealized as static structures. For these measurements one just needs to stop the universe (i.e. make it absolutely static), draw a box around what one takes the system to be, and then count. With Shannon's metric one can measure dynamic transformations, but only in limited contexts (i.e. by making the system's potential to interact with its environment statistically static). In Shannon's metric the box is simply extended so that it is arbitrarily drawn around the environment the system is contained within.

Now, before we consider the limits of static measurements in a dynamic universe, let us briefly ask how to decide what to count. A deep flaw in all of these measurements is that they require a human to decide what is countable and what is not. Do we need to count gluon states to determine the complexity of a one-kilogram rabbit? If we did, then a one-kilogram rock could be 'more complex' than our rabbit. It could have more and different quark and gluon relationships. And what about one kilogram of nebular vapour?

Presumably, when Gould speaks of organismic complexity as a count of more and different parts he is only considering biological parts: organelles, cells, organs, etc. If Gould counts detail down to and including organelles, he chooses to ignore the molecular details within each organelle that make it act uniquely in some situations. Gould is choosing to ignore fine-grained complexity. Kolmogorov complexity likewise depends on an arbitrary human decision about how to represent the minimum constitutive part of the system to be computed. What exactly does each bit include, or, more interestingly, not include, and how could that fine-grained and chaotic complexity influence the future?

This brings us back to the inability of static measures of complexity or information to capture a dynamic universe. Both Gould's and Kolmogorov's complexity metrics are fundamentally limited to measuring structure that has already evolved and is arbitrarily 'bounded.' They contribute little to predicting future structure. Shannon information can be used to make limited predictions because it describes how statistical relationships unfold over time. Numerous attempts have been made to use Shannon information to predict the kind of transformational structures that actually populate our universe, but little consensus has been reached (Grassberger, 1986; Prokopenko, Boschetti, & Ryan, 2009).

Peter Grassberger does a wonderful job of illustrating the problem of defining information complexity with the set of three images at the top of this post. Here is what he says about them:

“Compare now the three patterns shown … [above]. Fig. 1c is made by using a random number generator. Kolmogorov complexity and Shannon entropy are biggest for it, and smallest for Fig. 1a. On the other hand, most people will intuitively call Fig. 1b the most complex, since it seems to have more “structure.” Thus, complexity in the intuitive sense is not monotonically increasing with entropy or “disorder.” Instead, it is small for completely ordered and for completely disordered patterns, and has a maximum in between (Hogg and Huberman 1985). This agrees with the notion that living matter should be more complex than both perfect crystals and random glasses, say.

The solution of this puzzle is the well-known ability of humans to make abstractions, i.e., to distinguish intuitively between “important” and “unimportant” features. For instance, when one is shown pictures of animals, one immediately recognizes the concepts “dog,” “cat,” etc., although the individual pictures showing dogs might in other respects be very different. So one immediately classifies the pictures into sets, with pictures within one set considered as equivalent. Moreover, these sets carry probability measures (since one expects not all kinds of dogs to appear equally often, and to be seen equally likely from all angles). Thus, one actually has ensembles: when calling a random pattern complex or not, one actually means that the ensemble of all “similar” patterns (whatever that means in detail) is complex or not complex. After all, if the pattern in Fig. 1c were made with a good random number generator, the chance of producing precisely Fig. 1c would be exactly the same as that to produce Figs. 1a or 1b (namely 2^-N, where N is the total number of pixels). If we call the latter more “complex,” it really means that we consider it implicitly to belong to a different ensemble, and it is this ensemble that has different complexity” (Grassberger, 1986).

Clearly, measuring complexity and information is an unresolved problem. Has the universe gained complexity in the sense of Kolmogorov randomness or in the sense of Gouldian numbers of different parts? And if the universe provides so many natural joints at which to cut it into understandable pieces, how do we choose which joints to use in which situations? Who gets to choose? Perhaps Shannon information allows for a prediction of what might emerge, but it too depends on what we choose to look for and, even more importantly, it still confines its predictive power to the known limits of the chosen system. Clearly, we do not yet know how to measure change in this universe.

The normal procedure in such a case is to choose a definition of complexity and then move on. This investigation instead proposes to redefine complexity as a relative, not absolute, measurement. Essentially, complexity is a measure of difference between systems, but systems are scale-free, so what is environment to one system is component to another. What is a regularity (a signal) in one system can be noise at a smaller scale level. A cell is a regular component within your body, which is itself made of many different fluids, boundaries and cells doing many different 'noisy' activities. That same cell is a noisy environment for its own regular organelles, which are in turn noisy environments for regular proteins, and so on down to molecules, atoms, protons and quarks. The only rule of relative complexity is that environments are always noisier than the systems that are maintained and reproduced within them. Outside is noisier than inside.

This definition is not, in fact, a definition. This kind of proposal is a way to start exploring what a new definition could be. Much of this investigation explores a different way of understanding relative change. It explores how to compare change at different scale levels. However, this exploration is descriptive, not deterministic. This investigation argues for a new way to understand universal evolution by showing how it evolved, by showing how atoms, prokaryotes and minds were selected for persistence, not by a deterministic algorithm or formula that proves it.

As mentioned, this is just a beginning. It is important to impress upon the reader the faults of the current framework and the need for an improved one. To lay this foundation of doubt we need to take one more short peek at another aspect of the complexity of 'complexity.'