**The Time Paradox**

One of the more perplexing problems in physics is the time
paradox, which involves the inconsistency between the reversibility of the laws
of motion and the second law of thermodynamics. Virtually every writer who
touches on the subject of thermodynamics raises this issue.^{1,2,3} While various authors have
attempted to resolve this contradiction, no one appears to have come up with a
straightforward solution. A typical explanation is provided by Richard
Feynman in his book *The Character of Physical Law*, where he says that
"all there is to it is that the irreversibility is caused by the general
accidents of life", reflecting the probabilistic view favored since Boltzmann's
time.^{4} The most determined effort so far to resolve this
paradox is presented by Ilya Prigogine in his book *The End of Certainty*, where he attacks the problem
with a daunting arsenal of elaborate mathematics and esoteric philosophy.

However, it seems to me that the resolution of this paradox is both simple and obvious, once one takes the proper perspective. First, it is important to acknowledge that all mathematical descriptions of reality are only approximations or models of reality. For instance, while Newton’s law of gravitation between two bodies may provide useful predictions of the motion of the Moon around the Earth, the actual motions are complicated by the effects of other planetary and celestial bodies, friction, radiation and so on, such that calculations of the exact motions would prove intractable even if the mathematical laws were precisely accurate.

With the above in mind, we can envision the simplest dynamical system, consisting of a single object in space. It is immediately clear that dynamical laws are meaningless in this context, since position and motion can only be measured relative to another object. If we move to a universe containing two objects, we can then determine positions, motions and forces using the mathematics of Newton or Einstein. And in fact the laws of motion are usually presented in this two-body form. When applied to the real world, this of course neglects the effects of the other objects in the universe.
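
For instance, the two-body form of Newton's law of gravitation, F = G·m1·m2/r^2, can be evaluated directly for the Earth and Moon. The sketch below uses standard rounded values for the constants (these numbers are mine, not the text's):

```python
# Newton's two-body gravitation, F = G*m1*m2/r^2, for the Earth-Moon pair.
# Constants are standard rounded textbook values, not taken from the text.
G = 6.674e-11        # gravitational constant, N m^2 / kg^2
m_earth = 5.972e24   # kg
m_moon = 7.35e22     # kg
r = 3.844e8          # mean Earth-Moon distance, m

F = G * m_earth * m_moon / r**2
print(F)             # roughly 2e20 newtons
```

Even this "exact" number is a model output: it ignores the Sun, the other planets, and the non-spherical shapes of both bodies, as the paragraph above notes.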

Now the time paradox results from the fact that these equations are symmetrical with respect to time, while our everyday experience confirms that time moves in one direction only. So is time a subjective illusion, as some physicists have asserted, or are the laws of motion incorrect or incomplete? If the laws of motion are incomplete, what is missing? More specifically, where can the second law of thermodynamics be found in dynamics?

The reconciliation between dynamics and thermodynamics rests on the recognition that the second law is a direct consequence of the law of inertia. However, this only becomes obvious when one realizes the hidden assumption embedded in the dynamical approach. This unspoken assumption of dynamics is to disregard any prior information on initial conditions, in particular regarding the relative proximity of objects in space. One might immediately object that the laws of motion are valid anywhere and everywhere in space and that the initial positions are therefore irrelevant, but this argument reveals the source of the problem, for in fact the implicit assumption is made that the objects are in proximity to one another relative to the surrounding expanse of space.

So if we imagine two objects in the vicinity of one another in constant relative motion with no forces in play, by the law of inertia they will either be approaching one another or moving farther apart. As has been pointed out by innumerable writers, in watching a short movie of this motion, one would not be able to tell whether the movie were being run forward or in reverse. However, if we were to watch the movie long enough in the forward mode, we would notice that the objects at some point would be moving farther apart and would then continue to do so. By the same token, if we ran the long movie in reverse the objects would generally move closer, except perhaps at the “end” where they might move farther apart for a while.

So we can see that even in the case of a two-body universe, we would be able to identify a direction of time, based only on the law of inertia and the assumption that the two bodies started off in close proximity relative to the size of the universe. This example becomes even more intuitive if we take three objects instead of two. If these objects were clumped together to start with, we would intuitively interpret the movie in which they dispersed as the forward direction and that for which they converged from outer space as the reverse. This experiment is obviously similar to the standard demonstration of the second law in which the partition between two compartments is removed, allowing a gas initially trapped in one to expand into the other so that the distribution becomes uniform.
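
The movie experiment above can be sketched numerically: a few objects start clumped together, each moving with a constant velocity (inertia alone, no forces), and the average pairwise separation eventually grows without bound, marking the forward direction. The initial positions and velocities below are assumed values for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: five particles start clumped near the origin of a
# large empty space, each with a constant random velocity (inertia only).
n = 5
pos0 = rng.uniform(-1.0, 1.0, size=(n, 3))   # initial clump, ~unit scale
vel = rng.normal(0.0, 1.0, size=(n, 3))      # constant velocities, no forces

def mean_separation(t):
    """Average pairwise distance at time t under pure inertial motion."""
    pos = pos0 + vel * t
    dists = [np.linalg.norm(pos[i] - pos[j])
             for i in range(n) for j in range(i + 1, n)]
    return sum(dists) / len(dists)

# Sampled at increasing times, the mean separation grows without bound:
# the direction of increasing dispersal is the "forward" run of the movie.
samples = [mean_separation(t) for t in (0.0, 10.0, 100.0, 1000.0)]
print(samples)
```

A short enough clip looks reversible; only the long run, combined with the clumped start, picks out a direction.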

In summary, the “contradiction” between the reversibility of the laws of motion and the second law results from the failure to define initial conditions based on prior information. The minimum condition for describing motion requires two objects located in space. The hidden assumption underlying the prior information is that the initial distance between these objects is small relative to the size of the space containing these objects. If this hidden assumption is taken into account then the second law can be seen as a direct consequence of the law of inertia.

This is an example of the resolution of a “paradox” by
Bayesian inference as described by E. T. Jaynes.^{5} Richard Feynman^{6}
easily derives the diffusion equation by applying similar logic which takes into
account that the initial concentration is non-uniform, while Jaynes^{7}
explicitly demonstrates the necessity of applying Bayesian logic to the same
problem, by showing that neglect of the initial conditions leads to the time
paradox with respect to diffusion. Prigogine, on the other hand, takes a whole
book and a dizzying array of higher math to unify dynamics and thermodynamics,
providing a classic example of the relative simplicity of Jaynes’ Bayesian
approach.

One might ask, what happens if we start with our objects at the edge of the universe? Won’t we then see the second law run in reverse, with the objects inexorably moving closer together? Not necessarily. In an expanding universe, distant objects are likely to be moving apart and in fact it appears that more distant objects recede proportionately faster than nearer ones (hence the “red shift”).
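
The proportionality between distance and recession speed can be illustrated with a toy uniform-expansion model (my construction, not part of the argument above): if every comoving coordinate is scaled by a common factor a(t), the relative recession speed of any pair of objects, divided by their current separation, is the same constant for all pairs.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy uniform expansion: physical position = a(t) * x for comoving x.
# Velocity is then a'(t) * x, so recession speed is proportional to distance.
x = rng.uniform(-1, 1, size=(5, 3))  # comoving positions of a few objects
a, a_dot = 2.0, 0.1                  # scale factor and its growth rate
H = a_dot / a                        # Hubble-like parameter, here 0.05

pos = a * x
vel = a_dot * x

# Recession speed of every pair divided by that pair's separation:
ratios = []
for i in range(5):
    for j in range(i + 1, 5):
        sep = np.linalg.norm(pos[i] - pos[j])
        rec = np.linalg.norm(vel[i] - vel[j])
        ratios.append(rec / sep)
print(ratios)  # every ratio equals H: farther objects recede faster
```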

However, a more pertinent point is that objects widely separated in the universe will have negligible influence on each other relative to the myriad of objects interspersed between them. Since the laws of motion are statements about the nature of physical reality, they are only applicable under conditions where the motions being described relate to objects that are sufficiently large and/or proximate to render the effects of smaller and more distant objects negligible (for example, calculations of planetary motion or the velocity of baseballs), or on such small scales that only a few objects are in the vicinity (such as atoms in a gas or particles in a cloud chamber). Implicit in these scenarios is the relative proximity of the objects under observation in relation to the vast universe. So the condition of proximity is a crucial component of the mathematical description of physical reality and is empirically supported by observation and experiment in the same way as the other mathematical relations which constitute physical laws.

In the two-body scenario, one might accept that entropy is increasing as the objects are moving apart, but what about the early stage when they might be moving closer? Does this correspond to decreasing entropy? First, it should be noted that if we have no information on the direction of the motion, it is as likely that the objects start out moving apart as together, so on average the entropy change would be zero until the objects passed one another. This period of zero expected entropy would have a duration determined by the distance between the objects divided by the relative velocity. If additional objects were added to the cluster bounded by the same initial separation, the period of zero expected entropy would decrease and approach zero as the number of objects increased.

To illustrate the above, imagine a one-dimensional universe where two identical objects are initially positioned a given distance apart with a known relative velocity, but with the direction unknown. Assuming that each direction is equally likely, the average entropy will remain zero until the moment at which the objects would collide had they started off moving towards each other; this duration equals the initial distance divided by the relative velocity. Now suppose we add another object, initially placed between the previous two objects, with the three objects contained within the same initial distance as before. The duration of the period of zero average entropy will now be one-half of that with only two objects. With four objects the duration will be one-third of the two-body case, and so on. With a large number of objects, the duration will approach zero and entropy will begin to increase immediately. This is what is observed in the thermodynamic realm when generalized to three dimensions.
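
The plateau described above follows directly from averaging over the two equally likely directions. A short calculation with assumed values of the distance and speed:

```python
# Two objects on a line, initial separation d, relative speed v, direction
# unknown (each sign with probability 1/2). The expected separation is
#   E(t) = 0.5*(d + v*t) + 0.5*abs(d - v*t)
# which stays flat at d until t = d/v, then grows as v*t -- the
# "zero average entropy" period described above. d and v are assumed values.
d, v = 10.0, 2.0

def expected_separation(t):
    return 0.5 * (d + v * t) + 0.5 * abs(d - v * t)

plateau = [expected_separation(t) for t in (0.0, 1.0, 3.0, 5.0)]   # t <= d/v = 5
growth  = [expected_separation(t) for t in (6.0, 8.0, 10.0)]       # t >  d/v
print(plateau, growth)  # flat at 10.0, then 12.0, 16.0, 20.0 (= v*t)
```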

Many authors have represented entropy as a physical entity, using descriptors such as flux, flow and gradient. It is also commonly described as a force. Since these terms are usually applied to matter or energy, their use paints a misleading picture. It should be clear from the above that entropy is not a physical entity, but a statement about the degree of dispersal of physical objects in the same way that distance describes the separation of physical objects. Feynman clearly recognizes entropy as a measure of dispersal, which is consistent with the above formulation.

Jaynes, going further, writes that entropy is a measure of
our knowledge of the state of a system and is therefore solely epistemological.
Relying on Boltzmann’s formulation defining entropy as a measure of probability,
he declares that the second law “cannot be an ontological statement … because
the mere calculation of [probability] makes no use of the equations of motion.”^{8}
However, an approach derived directly from the law of inertia, as presented
herein, circumvents this objection and establishes an ontological basis for
entropy.

Entropy is a consequence of inertia; it is not a force but results from the absence of force. To counter entropy requires force applied over distance, or work. Particles will naturally disperse as a consequence of inertia, but reversing this dispersal requires work to direct particles to the particular location from which they started. This asymmetry applies to dynamics as well as thermodynamics, as the above makes clear.

7/5/04

Entropy can be defined as a function of the aggregate distance between particles, with maximum entropy corresponding to the maximum aggregate (or average) separation given the spatial constraints on the system. This is a corollary to Boltzmann’s equation based on probability. The difficulty with Boltzmann’s formulation (and the reason for the widespread resistance to his ideas, which reportedly led to his suicide) is that while his formula provides a quantitative link between macroscopic entropy and microstates, it fails to establish a causal link between dynamics and the second law. The answer to the question of why entropy can only increase in an isolated system remains unexplained and inconsistent with the accepted (incomplete) laws of motion. His assertion that an isolated system always moves from a less probable state to one of higher probability can simply be taken as another way of stating that the entropy increases, without addressing the problem of why this is so.
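
The proposed definition can be made concrete with a toy "dispersal entropy", taken here as the mean pairwise distance (a stand-in for the aggregate-distance function suggested above, not Boltzmann's formula):

```python
import itertools

# Toy version of the definition above: "entropy" as mean pairwise distance.
def dispersal_entropy(positions):
    pairs = list(itertools.combinations(positions, 2))
    return sum(abs(a - b) for a, b in pairs) / len(pairs)

# Four particles confined to the interval [0, 1] (the spatial constraint):
clumped = [0.50, 0.51, 0.52, 0.53]   # low dispersal
uniform = [0.0, 1/3, 2/3, 1.0]       # spread through the interval
extreme = [0.0, 0.0, 1.0, 1.0]       # maximal aggregate separation
print(dispersal_entropy(clumped),
      dispersal_entropy(uniform),
      dispersal_entropy(extreme))
```

The maximum of this quantity is set by the size of the container, matching the claim that maximum entropy corresponds to maximum average separation given the spatial constraints.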

Similarly, Boltzmann’s equation provides no way of
calculating the rate of increase of entropy, while a method based on the law of
inertia could calculate this as a function of the average velocity of the
individual particles. In a gas, the mean square of this velocity is proportional to the temperature.^{9}
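
A rough numerical check of this claim, treating "temperature" simply as the scale of the particle velocities (an assumption of this sketch): under pure inertia, doubling every velocity doubles the rate at which the mean separation grows.

```python
import numpy as np

rng = np.random.default_rng(2)

# Free particles with a clumped start; dispersal rate scales with speed.
n = 50
pos0 = rng.uniform(-1, 1, size=(n, 3))
vel = rng.normal(0, 1, size=(n, 3))

def mean_sep(pos):
    total, count = 0.0, 0
    for i in range(n):
        for j in range(i + 1, n):
            total += np.linalg.norm(pos[i] - pos[j])
            count += 1
    return total / count

def dispersal_rate(speed_scale, t=1000.0):
    """Mean separation gained per unit time at large t."""
    return (mean_sep(pos0 + speed_scale * vel * t) - mean_sep(pos0)) / t

ratio = dispersal_rate(2.0) / dispersal_rate(1.0)
print(ratio)  # close to 2: doubling the speeds doubles the dispersal rate
```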

Feynman resolves the reversibility/irreversibility paradox
with a more complicated explanation, of which Prigogine seems unaware. He comes
close to my perspective when, in a discussion of the mixture of two gases, he
states that entropy “is not a property of the molecule itself, but of *how
much room* the molecule has to run around in.”^{ 10} (italics his)
Implicit in this statement is that the molecules were initially confined to a
smaller space and that they came to occupy a larger space as a result of
inertia. As before, the Bayesian approach advocated by Jaynes is both simpler
and more logically precise.

Unable to identify the causal connection between the second law and the laws of motion, many writers turn to cosmology as an explanation by relating the second law and the direction of time to the expansion of the universe. However, this seems disingenuous in light of the fact that the laws of motion (including relativity) and thermodynamics were well established long before it was discovered and widely accepted that the universe was expanding due to the Big Bang.

The laws of dynamics and thermodynamics were both formulated and experimentally confirmed without regard to cosmological considerations during a time when the universe was assumed to be in a steady state condition of one form or another. The oft-repeated contention that a contracting universe would result in the reversal of the directions of time and entropy seems ill-founded if it is assumed that the laws of nature would be otherwise unaffected. For instance, the law of conservation of energy (and matter) is the most fundamental law of both dynamics and thermodynamics. Since this law was not modified with the discovery that the universe is expanding, there is no reason to believe that it would have been altered if the universe had been found to have been contracting. Since the law of inertia is a special case of the law of conservation of energy, one might expect it to operate regardless of whether the universe were in a steady state, expanding, contracting or oscillating. If so, then the second law should operate identically on the local level as a consequence of inertia and the initial condition of relative proximity, as previously discussed.

This leads to the question of the origin of the clumping that underlies the proximity assumption. Cosmologists have long been at pains to explain why the universe has not expanded uniformly, which would render any discussion of entropy meaningless (and not merely since we wouldn’t be here to have the discussion), since the universe would always be in a state of maximum entropy (even though entropy would constantly increase due to the expansion of the universe). Locally, the aggregation of matter is the consequence of energy being applied in the form of work, with heat as the byproduct. All order that we observe, both inorganic and organic, is the consequence of such work. But it is clear that if the second law is merely the result of inertia and does not constitute a force of any kind, then increasing entropy is also nothing but a byproduct of work and has no role, either directly or indirectly, in the creation (or destruction) of order in the universe.

Once again, the only logical candidate which is both universal and non-teleological is the principle of least action, and its thermodynamic counterpart, the Fourth Law. If we assume that asymmetries, however slight, occurred during the early stages of the Big Bang, then the Fourth Law would act to propagate and amplify these imperfections into the astounding variety we observe. The Fourth Law can best be interpreted as the rule that governs the application of force, with the second law describing the rule (as defined by the law of inertia) governing the absence of force.

The continuity between the second law and the laws of
motion can perhaps best be illustrated by a top-down example. We can modify the
standard two-compartment experiment (where a gas is released from one
compartment by removing the partition separating the two) by specifying that the
compartment initially containing the gas is much smaller than the adjoining
compartment into which the gas will spread. For instance, assume a box one cm
on a side containing 1000 molecules of a gas. This box is then placed inside
another box 10 cm on a side with a volume of 1000 cm^{3}. If the
smaller box is opened, the molecules will spread to fill the larger box such
that the long-term (equilibrium) distribution of the molecules in the larger
space is uniform with an average of 1 molecule of gas per cubic centimeter.
This uniformity is a consequence only of the inertia of each of the molecules,
and not of any mysterious second law “force” or influence. To see this clearly,
one need only repeat the experiment with progressively fewer gas molecules. In
moving from many to few molecules there is clearly no discontinuity down to the
level of a single molecule, in which case the sole motive factor is clearly the
inertia of that molecule. For this case, the gas would still be uniformly
distributed *on the average* at equilibrium, with a density of 0.001
molecules per cubic centimeter.
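
A sketch of this experiment, with the molecules treated as free points that reflect elastically off the outer walls (the position after free flight can be folded back into the box analytically, so no time-stepping is needed):

```python
import numpy as np

rng = np.random.default_rng(3)

# 1000 point molecules start inside a 1 cm cube in the corner of a
# 10 cm cube and move with constant random velocities (inertia only).
n, L, T = 1000, 10.0, 10_000.0
pos0 = rng.uniform(0.0, 1.0, size=(n, 3))   # initial 1 cm^3 compartment
vel = rng.normal(0.0, 1.0, size=(n, 3))     # cm per unit time, assumed scale

def reflect(x, L):
    """Fold free-flight positions back into [0, L] (specular reflection)."""
    x = np.mod(x, 2 * L)
    return np.where(x > L, 2 * L - x, x)

pos = reflect(pos0 + vel * T, L)

# After many wall crossings the distribution is essentially uniform:
# about 1 molecule per cm^3, with almost none left in the original corner.
mean_coord = pos.mean()                              # ~5, the box center
frac_in_corner = np.mean(np.all(pos < 1.0, axis=1))  # ~0.001
print(mean_coord, frac_in_corner)
```

No force drives the spreading; the uniformity emerges from each molecule's inertia alone, exactly as the paragraph argues.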

Once again, the crucial link between the dynamic and the thermodynamic descriptions is the specification of initial conditions concerning proximity. These initial conditions are always specified in thermodynamics, but frequently neglected or considered irrelevant in dynamics. It is the neglect of relevant prior information which causes the second law to become invisible to dynamics.

One might notice that in the above example there is no discussion of the interaction of molecules due to collisions, as is usually the case in discussions of diffusion. The space between particles minimizes encounters and underscores the similarity between the actions of single particles and the aggregate activity of numerous particles, which is just the result of summation. Brownian motion is a rare event.

**Physics in a Nutshell**

Both the 2^{nd} and 4^{th} laws are
consequences of the 1^{st} law. The 2^{nd} law operates in the
absence of force (inertia) and the 4^{th} law operates when force is applied
(PLA). At small scales the 2^{nd} (distances less than initial
separation) and 4^{th} (Feynman QED) laws may be attractors. The paths
of increasing entropy or least action are the most probable paths out of all the
possible paths.
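
The last claim can be illustrated for the simplest case, a free particle: minimizing the discretized action with the endpoints fixed relaxes any trial path toward the straight, constant-velocity path. This is a numerical sketch of the principle of least action, constructed for illustration and not taken from the text:

```python
import numpy as np

# Discretized free-particle action: S = sum_i 0.5*m*((x[i+1]-x[i])/dt)^2*dt.
# Setting dS/dx[i] = 0 for each interior point gives x[i] = average of its
# neighbors, so iterative relaxation converges to the straight-line path.
n = 21
x = np.zeros(n)
x[0], x[-1] = 0.0, 1.0            # fixed endpoints
x[1:-1] = np.random.default_rng(4).uniform(-1, 1, n - 2)  # wild trial path

for _ in range(5000):             # relax toward the stationary-action path
    x[1:-1] = 0.5 * (x[:-2] + x[2:])

straight = np.linspace(0.0, 1.0, n)
print(np.max(np.abs(x - straight)))  # essentially zero
```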

If we add the 3^{rd} law to the 1^{st} law
we can derive all of the classical laws of physics from two “axioms”. The 3^{rd}
law restated in terms of dynamics says that at absolute zero, all motion stops.
This implies that motion can only exist above absolute zero and that there is no
such thing as negative motion. Therefore the combination of the 1^{st}
law, which is the basis for the law of inertia, and the 3^{rd} law,
which says that inertia cannot be less than zero, implies that entropy cannot
decrease and that time can only move in one direction (forward).

It is interesting that the 1^{st} law
(conservation of energy) is symmetrical and that the 3^{rd} law is
asymmetrical (directional), resulting in an elementary kind of complementarity.
This is analogous to the line of reasoning taken by Richard Cox in “The Algebra
of Probable Inference”^{11}. Cox derives the principles of induction
from only the following two axioms.

- The probability of an inference on given evidence determines the probability of its contradictory on the same evidence.

- The probability on given evidence that both of two inferences are true is determined by their separate probabilities, one on the given evidence, and the other on this evidence with the additional assumption that the first inference is true.

The first can be paraphrased as saying that the sum of the
probabilities of an exhaustive set of possibilities must always be 1, and can
therefore be seen as a form of conservation law. The second axiom implies two
activities, one of which must (logically) precede the other, which implies a
directionality. Therefore, these axioms also appear to pair symmetrical and
asymmetrical rules. Thus we see an interesting similarity between the logical
structure of ontology, as prescribed by the 1^{st} and 3^{rd}
laws, and the logical structure of epistemology, as prescribed by Cox’s two
axioms.
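
Cox's two axioms can be checked on a finite sample space; the example below (a fair die) is mine, not Cox's:

```python
from fractions import Fraction

# Fair die, with A = "roll is even" and B = "roll is at least 4".
omega = [1, 2, 3, 4, 5, 6]

def prob(event):
    return Fraction(sum(1 for w in omega if event(w)), len(omega))

def cond(event, given):
    """P(event | given), computed on the restricted sample space."""
    sub = [w for w in omega if given(w)]
    return Fraction(sum(1 for w in sub if event(w)), len(sub))

A = lambda w: w % 2 == 0
B = lambda w: w >= 4
not_A = lambda w: not A(w)
A_and_B = lambda w: A(w) and B(w)

# Axiom 1 ("conservation"): P(A) + P(not A) = 1
# Axiom 2 ("direction"):    P(A and B) = P(A) * P(B | A)
print(prob(A) + prob(not_A), prob(A_and_B), prob(A) * cond(B, A))
```

The second identity makes the ordering explicit: P(B | A) is evaluated only after A is assumed, the directional step noted above.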

Footnotes and References:

^{1} Ilya Prigogine, *Modern Thermodynamics*, pp. 91-92.

^{2} Erwin Schrödinger, *What Is Life?* with *Mind and Matter* and *Autobiographical Sketches*, Cambridge University Press, 1967, p. 151. First published in 1944.

^{3} Edward …, *Six Roads from …*, pp. 66-68.

^{4} Richard Feynman, *The Character of Physical Law*, The Modern Library, p. 106.

^{5} E. T. Jaynes, *Probability Theory*.

^{6} Richard Feynman, *Lectures on Physics*, Volume I, Chapter 43, "Diffusion."

^{7} E. T. Jaynes, "Clearing Up Mysteries – The Original Goal", pp. 3-7.

^{8} E. T. Jaynes, "Clearing Up Mysteries – The Original Goal", p. 20.

^{9} Richard Feynman, *Lectures on Physics*, Volume I, p. 43-10.

^{10} Richard Feynman, *Lectures on Physics*, Volume I, p. 46-6.

^{11} Richard Cox, *The Algebra of Probable Inference*, pp. 3-4.