The Fourth Law, Part 2. [5/24/03]




My outback epiphany soon yielded to doubts concerning the possibility of an aging surfer discovering a new law of nature.  The next day it occurred to me that the Fourth Law could in all likelihood be derived from the First Law, that is, the Principle of Maximum Efficiency was probably a consequence of the law of conservation of energy.  I deduced this from the simple example of an object falling in a gravitational field, since it appeared obvious that if the object deviated from the most direct path, kinetic energy would be produced in excess of the reduction in potential energy, which would remain the same no matter what path the object took.  (Put another way, if energy were conserved when the object took the most direct path, any less direct path would result in excess kinetic energy and violate the conservation law.)
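The bookkeeping behind the falling-object example can be written out.  For a mass $m$ dropping through a height $h$ in a uniform gravitational field, the reduction in potential energy is fixed by the endpoints, and conservation of energy requires the kinetic energy gained to match it exactly:

```latex
% Potential energy lost in a drop through height h:
\Delta U = -mgh
% If energy is conserved, the kinetic energy gained must equal the loss:
\Delta K = \tfrac{1}{2}mv^{2} = mgh
% giving a final speed of
v = \sqrt{2gh}
```

Any kinetic energy in excess of $mgh$ would therefore be energy created from nothing, which is the violation of the conservation law the argument turns on.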


Upon my return from Australia, I directed my reading to the subjects of biology, ecology, complexity and thermodynamics in an effort to determine if my ideas were truly original.  Except for the venerable and largely forgotten Maupertuis and the elusive Chris Davis,1 I did not come across anyone who professed a similar viewpoint.


Although my designation of the Principle of Maximum Efficiency as the "Fourth Law" was originally a tongue-in-cheek dig at Kauffman et al., it also seemed like a catchy book title, along the lines of Paul Davies’ "The Fifth Miracle".  However, on further consideration, it appeared that the Fourth Law provided a missing link by directly addressing the dynamics of nonequilibrium thermodynamics.  The possibility that the First Law renders the Fourth redundant in the technical sense does not, in my view, negate the usefulness of the Fourth Law, since the redundancy is neither obvious nor useful unless restated in the form I have proposed.


Complexity and Thermodynamics


I will now summarize my hypothesis.  My logic is based on the guiding principles of parsimony and universality.  The problem can be stated as follows:  How is it possible for complex structures (biological or inanimate) to form in the face of the Second Law of Thermodynamics?


The obvious answer is, of course, that nothing in the Second Law prohibits the creation of local order, as long as the entropy of the universe increases in the process.  However, one has the sense that this is not the whole story, since there seems to be some positive force which encourages complexity.


The simplest accommodation to this concern is to interpret the Second Law in its “strong” form, such that entropy is not only inclined to increase, but must increase to the maximum allowed by the constraints imposed on the system.  Entropy thereby becomes a “force” in the same sense as gravity.2  Increasing entropy can be thought of as providing the engine for the generation of order.


But how is this engine to operate?  There is no apparent connection between (universal) increases in entropy and (local) increases in order.  Perhaps there is another law of nature at work.  One possibility is that there is a specific law of complexity that operates to generate order.  In addition to the circularity implicit in such a law, the following objections come to mind.


1.      Why postulate a new law if the old one(s) will suffice?  It makes sense to make sure that the implications of the current laws of thermodynamics are thoroughly understood before inventing new ones.  The thermodynamicists I have read (Haynie, Prigogine, Jaynes, etc.) are unanimous in decrying the widespread lack of understanding of the basic principles of thermodynamics, particularly among biologists.  If it ain’t broke, why fix it?


2.      Is the law universal or does it only apply to complex (biological) systems?  The laws of complexity that I’ve seen proposed seem to be tailored to describe biological complexity.  By contrast, astrophysicists do not seem inclined to appeal to a law of complexity to explain galaxy formation, etc., since the existing laws of physics appear to suffice.


3.      Does the law betray a teleological bias in favor of complexity?  This bias can take many forms, from the hand of God to the anthropic principle.  Not only does teleology run counter to the principle of parsimony, it violates the First Law.3


As an alternative to the above, I propose the following, which I call The Fourth Law of Thermodynamics:  The flow of energy through a system is such that the thermodynamic efficiency of the process is maximized, given the structural constraints on the system.  An alternate and perhaps more direct way of stating the same thing would be:  Energy flows through a system such that the heat dissipated by the process is minimized.
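The equivalence of the two formulations follows directly from the First Law.  Writing $Q_{\mathrm{in}}$ for the energy flowing into the process, $W$ for the work extracted, and $Q_{\mathrm{diss}}$ for the heat dissipated:

```latex
% First Law: energy in equals work out plus heat dissipated,
W = Q_{\mathrm{in}} - Q_{\mathrm{diss}}
% so the thermodynamic efficiency is
\eta = \frac{W}{Q_{\mathrm{in}}}
     = 1 - \frac{Q_{\mathrm{diss}}}{Q_{\mathrm{in}}}
```

For a given energy throughput $Q_{\mathrm{in}}$, maximizing $\eta$ and minimizing $Q_{\mathrm{diss}}$ are the same requirement.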


The Fourth Law possesses the following attractive features.  It is parsimonious, since it extends a known physical law (the Principle of Least Action); it is universal, since it applies to all natural phenomena, whether physical, biological or cultural; and it is unbiased, since there is no a priori reason to believe that complexity will be selected over simplicity in general, the actual outcome being contingent on circumstance.


The importance of the Fourth Law stems from its role as a selection mechanism.  At first it might appear that if energy is always flowing through a system in the most efficient manner, there is nothing left to optimize.  However, this would be true only if the system were in static or dynamic (steady state) equilibrium.  Since neither state is common in nature, it is also unlikely that a given structure is optimal at any point in time.  Instead, the goal of optimal efficiency acts as an attractor which is constantly shifting position due to the constantly changing environment.


While the above may seem to imply the futility of an analytical approach, it is possible to assume stability in the shorter term or to consider the effects of greater or lesser rates of environmental change.  In an ecosystem, stability should encourage greater biodiversity, as finer and finer niches are created and filled.  From an evolutionary perspective, species will emerge which are more and more specialized.  In contrast, ecosystems that are frequently disturbed should have less biodiversity and be populated by more generalist species.


The Fourth Law does not generate complexity directly, but causes it to be “fixed” as a consequence of selection based on efficiency maximization.  This process is unbiased relative to increases in complexity, since the environment may favor evolution toward simplicity.  There is an optimal level of disturbance (or stability) in the environment for the evolution of increasing complexity, as well as a level of disturbance beyond which simplicity is favored.




What difficulties are likely to be encountered in the above theory?  First, there is the question of the equivalence of efficiency maximization to entropy minimization, which is true in isothermal systems but not in general.  Second, a strong form of the Second Law would dictate that the change in entropy is maximized as energy flows through a system.  This appears to be the basis of Jaynes' principle of Maximum Entropy Production (MEP).4  However, since the increase in entropy under MEP also results in an increase in the available work done by the system, there should be no conflict with the Principle of Maximum Efficiency.
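The isothermal restriction in the first difficulty can be made precise:

```latex
% At constant temperature T, entropy production tracks dissipated heat
% directly, so minimizing one minimizes the other:
\Delta S = \frac{Q_{\mathrm{diss}}}{T}
% But when heat is rejected at several temperatures T_i, the two
% criteria come apart:
\Delta S = \sum_i \frac{Q_i}{T_i}
```

In the general case a process can dissipate less total heat yet produce more entropy, if the heat it does dissipate is rejected at lower temperatures.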


An unavoidable issue that is raised by MEP and elsewhere is the relationship between thermodynamics and information theory.  Jaynes notes the confusion created by Shannon when he “appropriated” the term entropy “for a new set of meanings”.5  The looseness of the original analogy between information theory and thermodynamics seems to have been forgotten with time, with the equation of entropy with disorder having taken over from its original meaning of irreversibility.


In addition, there is confusion within information theory over the concept of entropy.  For instance, there is the quandary over whether a random signal contains the least amount of information (because it’s meaningless) or the most (because no algorithm shorter than the signal itself can reproduce it).  This confusion is the result of a lack of connection between information theory and the real world, with information viewed as ethereal and nonmaterial.
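The quandary can be made concrete with a toy computation (a sketch of my own, not drawn from the sources cited here).  Shannon's measure assigns a random signal the most bits per symbol, while a general-purpose compressor, a crude stand-in for algorithmic description length, finds almost nothing to remove from it:

```python
import math
import random
import zlib

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy, in bits per symbol, of a byte string."""
    counts = {}
    for b in data:
        counts[b] = counts.get(b, 0) + 1
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

random.seed(0)
ordered = b"ab" * 5000                                       # highly regular signal
noisy = bytes(random.randrange(256) for _ in range(10000))   # "random" signal

# Shannon's measure: the noisy signal carries more bits per symbol.
print(shannon_entropy(ordered))   # exactly 1.0 (two equiprobable symbols)
print(shannon_entropy(noisy))     # close to the maximum of 8.0

# Algorithmic view, with zlib as a rough proxy for description length:
# the regular signal collapses to a few bytes, the noisy one barely shrinks.
print(len(zlib.compress(ordered)))
print(len(zlib.compress(noisy)))
```

Both answers are "correct" on their own terms, which is exactly the confusion: the two formalisms measure different things while sharing the word entropy.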


In comparing information theory with thermodynamics, the most obvious difference is the lack of a conservation law for information.  One is not surprised by the statements, “Additional information has been generated” or “The information was destroyed”, used either in a colloquial or a technical sense.  This is in stark contrast to the First Law of Thermodynamics.


The best-known case of perplexity in relating information to thermodynamics is, of course, Maxwell’s Demon.  The original speculations ignored the cost of information, prompting the question, “What does the demon eat?”  (Schrödinger's cat?)  The assumption that information is nonmaterial is an example of the persistence of dualism in modern scientific thought.  Even Jaynes, a self-professed materialist, perpetuates the error by making the distinction between information and the “physical” in his discussions of the “subjective” nature of entropy.  However, as the Buddha observed 2500 years ago, it’s all material; there’s nothing else.


Information must be material in the same sense that energy is material.  Therefore, information can only exist as stored potentials and has no ontological reality without a material device to process the potentials.  For example, a bit is stored in computer memory by placing an electromagnetic potential in a particular physical location.  The bit constitutes potential energy in exactly the same way as water stored in a tank on a hill.  In either case, utilizing the potential energy requires a machine to convert the energy to work, increasing entropy in the process.
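A back-of-envelope comparison makes the water-tank analogy quantitative.  (The Landauer bound $kT\ln 2$, the minimum heat dissipated in erasing one bit, is not discussed above, but it puts a number on the same claim that processing a stored bit necessarily generates entropy; the figures below are my own illustration.)

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact in SI since 2019)

def landauer_limit(temperature_k: float) -> float:
    """Minimum heat dissipated when one bit of stored potential is erased."""
    return K_B * temperature_k * math.log(2)

def water_tank_energy(mass_kg: float, height_m: float, g: float = 9.81) -> float:
    """Macroscopic stored potential: water of mass m held at height h."""
    return mass_kg * g * height_m

bit_cost = landauer_limit(300.0)        # ~3e-21 J per bit at room temperature
tank = water_tank_energy(1000.0, 10.0)  # ~1e5 J for a tonne of water, 10 m up

print(bit_cost, tank)
```

The two potentials differ by some 25 orders of magnitude, but not in kind: in either case a machine must convert the stored potential to work, and entropy increases in the process.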


Information cannot be considered in isolation any more than the “complexity” of life can.  Information exists in open systems, which require a full accounting of the amount of energy required to operate and build the device which processes the information, not to mention the thermodynamic processes involved in the evolution of the beings that built the device, and so on, along with the entropy generated by these processes.  There is no free lunch and no “order for free”.


Information theory is an incomplete model that has taken on a life of its own due to what Jaynes calls the Mind Projection Fallacy.6  Prigogine sees the same confusion of the real with the ideal as the root of the “time paradox” of modern science.7  Like both, I see this tendency to mistake the model for the world as pervasive in science, although I don’t consider this realization as startling as they seem to.



Footnotes and References:


1 At the time I could find no address, phone number or affiliation anywhere on his web site and an email address I eventually located was no longer active.

2 It seems possible that entropy (or more correctly, the entropy gradient) is to time as gravity is to acceleration.

3 It appears, strangely, that biologists are much more resistant to teleological arguments than physicists.  Perhaps this is because biologists have been in hand-to-hand combat over the issue for so long.  Physicists seem much more relaxed about incorporating New Age cosmology into their theories, especially when they expound on the subjects of biological evolution or consciousness.

4 E. T. Jaynes, "The Minimum Entropy Production Principle", Ann. Rev. Phys. Chem. 31 (1980): 579-601.

5 E. T. Jaynes, "The Minimum Entropy Production Principle", Ann. Rev. Phys. Chem. 31 (1980): 583.

6 E. T. Jaynes, "Clearing Up Mysteries – The Original Goal", in Maximum Entropy and Bayesian Methods, J. Skilling, ed., Kluwer Academic Publishers, Dordrecht, 1989, p. 7.

7 Ilya Prigogine, The End of Certainty, The Free Press, 1997.