**Some early history**

It's been 206 years since Lazare Carnot published the first paper sketching the law of an increasing entropy (Carnot, 1803), an insight that was later elaborated upon by his son Sadi Carnot (Carnot, 1824).

At the macroscopic, i.e. thermodynamic, level, the notion was discussed by Rudolf Clausius and others in the 1850s and 1860s.

Finally, 113 years ago, the entropy was derived from the statistical properties of the atoms and it was proved that it can never macroscopically decrease (Ludwig Boltzmann, 1896).

Boltzmann proved his H-theorem, made the whole of classical statistical physics coherent, and became the true 19th-century forefather of quantum mechanics. Because of the vitriolically critical lesser minds such as Johann Josef Loschmidt and Ernst Mach, among other problems, he committed suicide in 1906. But before he did so, he made sure that "S = k log W" would be written on his tomb. ;-)
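The tombstone formula can be made concrete in a few lines. Here is a minimal toy illustration (my own construction, not from any of the cited sources): for N coin flips, the macrostate is the number of heads n, W(n) counts the compatible microstates, and S = ln W in units of Boltzmann's constant k.

```python
from math import comb, log

# Toy illustration of Boltzmann's S = k log W, in units where k = 1.
# Macrostate: the number of heads n among N coin flips.
# W(n) = C(N, n) counts the microstates (exact flip sequences) compatible with n.
N = 100
for n in (0, 25, 50):
    W = comb(N, n)
    print(f"n = {n:3d} heads: W = {W:.3e}, S = ln W = {log(W):.2f}")
```

The balanced macrostate n = 50 has about 10^29 microstates while n = 0 has a single one, which is why generic dynamics overwhelmingly drifts toward the high-W macrostates.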

To learn about some tragic stories of this underestimated genius, see e.g. this book.

Today, sensible people may admire Boltzmann for being so accurate, prophetic, and rigorous about all these insights and formulae at the time when people like Mach weren't even able to accept the very existence of atoms because they were not "scientific enough" for them.

Shouldn't Boltzmann have waited with the suicide, hoping that people would get more sensible along the way and he would receive the well-deserved credit? Well, it seems that his suicide was a sensible decision in this respect because 103 years of waiting wouldn't have been enough.

Anyway, it's incredible but almost two centuries after its conception and more than one century after the appearance of the crucial statistical proofs, entropy and its inevitable increase - as described by the second law of thermodynamics and proved in Boltzmann's H-theorem - apparently remains a controversial concept among some people who consider themselves physicists.

**More modern sources proving the law**

Obviously, society may have dumbed down but nothing has changed about the physics. The increase of entropy is a consequence of the basic rules of logic and statistics applied to the elementary building blocks of matter or, if you wish, the fundamental degrees of freedom - as long as their number is large. These insights have indeed been understood for more than a century, and the basic structure of the proofs hasn't been changed by quantum mechanics, either.

Let me offer you two prominent physicists (and expositors!) who have discussed this basic topic, Richard Feynman and Steven Weinberg. If you have any doubts that the arrow of time has been understood for decades if not centuries, you should first watch the fifth *Messenger Lecture* by Richard Feynman (Cornell, 1964):

The Distinction of Past and Future (via Bill Gates)

If you have problems with the link above, go to Project Tuva, click on Feynman's picture in the middle, and then click on the fifth lecture in the upper right corner. The talk is 46 minutes long.

Feynman starts by saying that it's obvious to anybody that the phenomena of the world are evidently irreversible. Well, despite some jokes that Feynman adds, this basic fact evidently fails to be obvious to many people who live in 2009.

One of the truly insightful parts of the lecture is Feynman's explanation of why Maxwell's demon cannot work. He imagines that atoms could push a one-way toothed wheel - a ratchet - by their chaotic motion. However, he finds out that the apparatus can't be used to extract energy from the thermal motion. The wheel turns differently depending on the temperature differences, and one can see why heat always wants to flow from warmer objects to cooler ones.

At any rate, Feynman makes it clear that the irreversibility of the phenomena

- is well understood
- exists because of the statistical properties of the atoms or other microscopic degrees of freedom, and doesn't depend on any physics outside the "lab" where these phenomena are observed
- is fully consistent with the other known laws of physics and with the time-reversal invariance of the microscopic ones,

and that it is *not*

- mysterious
- dependent on cosmology
- in subtle contradiction with the time-reversal invariance of the other laws of physics.

The other source you should look at is Weinberg's proof of Boltzmann's H-theorem, i.e. a direct proof of the entropy increase. Shift-click the pages to zoom in on them in a new window. Below the three asterisks, the proof begins with a few comments about unitarity and its consequences.

This proof is constructed analogously to similar proofs in classical physics and elsewhere but it's fun to have a proof formulated in the language of a current effective theory of almost everything - namely quantum field theory.

I would like to emphasize that the goal of this proof is not just to clarify a simple qualitative question about the sign of the entropy increase. It's much more powerful because it actually gives you quantitative tools to calculate how much the entropy is going to increase in a given situation.

You may see that the proof is very clean. It only uses a simple identity similar to "perfect balance" that directly follows from unitarity, together with universal mathematical inequalities. You may ask where the asymmetry between the past and the future enters the proof. The answer is that Weinberg de facto (and somewhat silently) assumes - and has to assume - that the density matrix becomes diagonal relative to his chosen basis in the future.
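The "perfect balance" identity has a finite-dimensional shadow that is easy to check numerically. This is of course not Weinberg's QFT derivation, just a toy sketch of the same logic under simplifying assumptions: for any unitary matrix U, the transition probabilities P[i,j] = |U[i,j]|^2 form a doubly stochastic matrix (rows and columns both sum to one), and acting with a doubly stochastic matrix on a probability distribution can never decrease its Shannon entropy.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_unitary(n):
    # QR decomposition of a random complex Gaussian matrix yields a random unitary.
    z = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    q, _ = np.linalg.qr(z)
    return q

def shannon_entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))

n = 8
U = random_unitary(n)
P = np.abs(U) ** 2              # doubly stochastic: unitarity forces both
                                # rows and columns of |U_ij|^2 to sum to 1
p = np.zeros(n)
p[0] = 1.0                      # sharp, zero-entropy initial distribution
entropies = []
for _ in range(6):
    entropies.append(shannon_entropy(p))
    p = P @ p                   # entropy never decreases under a doubly
                                # stochastic map (Schur concavity)
print(entropies)                # a non-decreasing sequence
```

The doubly stochastic property is the unitarity consequence; the monotonicity then follows because any doubly stochastic map is a mixture of permutations and entropy is concave - a finite-dimensional caricature of the H-theorem.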

That's why the thermodynamic arrow of time inevitably coincides with the arrow of time as seen by decoherence, a process that is inherently assumed to exist in the proof. It's the same logical arrow of time. As an observer loses contact with some parts of the system and the corresponding degrees of freedom, one must trace over them.

This manipulation is an inevitable procedure that has to occur whenever we deal with any system with many degrees of freedom. It has a logical arrow of time built into itself, and it inevitably leads to the evolution of pure (or almost pure) states into mixed (or effectively mixed) ones. Physical considerations involving incomplete information and a statistical treatment of many microstates cannot exist without such an arrow of time. It makes no sense to discuss macroscopic physics without an arrow of time: there simply can't be any macroscopic physics without one.
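The pure-to-mixed evolution under tracing can be checked explicitly in the smallest possible example (a generic two-qubit toy, not tied to any specific system in the text): the entangled state (|00> + |11>)/sqrt(2) is globally pure, but after tracing over the second qubit the first one is maximally mixed.

```python
import numpy as np

# The maximally entangled two-qubit state (|00> + |11>)/sqrt(2):
# the global state is pure (von Neumann entropy 0), but tracing out
# the second qubit leaves the first one maximally mixed (entropy ln 2).
psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho = np.outer(psi, psi.conj())          # pure global density matrix

def von_neumann_entropy(rho):
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]                     # drop numerical zeros
    return -np.sum(w * np.log(w))

# Partial trace over the second qubit: reshape to (2,2,2,2) and contract
# the two indices that belong to the unobserved qubit.
rho4 = rho.reshape(2, 2, 2, 2)
rho_A = np.einsum('ikjk->ij', rho4)

print(von_neumann_entropy(rho))          # ~ 0      (pure global state)
print(von_neumann_entropy(rho_A))        # ~ ln 2   (mixed after tracing)
```

Losing access to half of the system is exactly the "tracing over" operation in the text, and it is what turns a zero-entropy pure state into a positive-entropy mixed one.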

This evolution from pure to mixed states includes an arrow of time that agrees with the previous ones, too. Lorenzo Maccone just managed to publish a simple paper in *Physical Review Letters* that explains the second law as the consequence of our inability to observe and remember situations where states were "traced over" in reverse. ;-)

Well, I wouldn't approve such a paper - a few traces, inequalities, and obvious conclusions - for PRL, and Maccone is somewhat silly if he thinks that he's the first physicist in history who has understood the irreversibility of macroscopic physical phenomena. But at least he's sensible about the fact that the problem has been understood by now. ;-)

The detailed arguments in the paper are often unrealistic or incorrect - e.g. because of his isolation of the macroscopic observer Alice from decoherence. Because of these unjustified assumptions, the main statement that "one could never remember an entropy decrease" is incorrect. The arrow of time is such that people can remember the past and not the future, but that doesn't mean that any universal "similar statement with the same vague sign" is correct.

The philosophical spin that Maccone gives to the question - his focus on our "memory" - is nevertheless somewhat entertaining (and a corrected version of the paper would be much more equivalent to the decoherence arguments than he seems to think!) but it is not true that this particular story involving a "psychological" aspect of entropy and our memory is the unique or necessary framework to prove the irreversibility. It's morally equivalent to all the other proofs.

People in this semi-philosophical sub-discipline of statistical physics should understand that the result some of them still want to prove (repeatedly) is so simple that different proofs will inevitably contain different idiosyncratic philosophies. They should also understand that whoever is able to give a proof with the right conclusion probably does understand why it follows, even if his proof differs in the details.

**The common errors**

We've seen that the second law is a well-understood consequence of statistical physics, which must always have a built-in arrow of time, otherwise it cannot work. The people who don't understand the proof that the entropy is increasing - e.g. Weinberg's proof - simply shouldn't get an "A" in quantum field theory - or in other subjects - because the proof of the H-theorem is a standard part of the material taught in Weinberg-like QFT courses and many other courses. It is not cutting-edge physics open to speculation or vague philosophical chatter.

As we've explained many times, there exist reasons why some people simply seem to be unable to understand this piece of basic physics, namely the second law of thermodynamics, the arrow of time, and their explanation in terms of statistical physics. Besides general mental limitations and attempts to defend mistakes from the past even when they already know that they had been mistakes, the following three are probably the most important causes of the misunderstandings:

- arrogant and deliberate denial of thermodynamics as a legitimate subject of physics
- irrational cosmological chauvinism
- completely incorrect rules for retrodictions in physics, and the inability to notice that their rules are incorrect

**Thermodynamics as an emergent phenomenon**

The first point has something to do with reductionism and its incorrect interpretations. When we learn the basic microscopic laws of reality, we become able to predict and understand all phenomena in Nature. At least in principle, we do.

However, in practice, it may be damn difficult to do all the necessary calculations and construct all the convoluted chains of reasoning correctly. One may easily get lost and all the people who are confused about the second law of thermodynamics get lost at the very beginning.

The derivation of basic thermodynamic quantities such as entropy and their basic properties such as the entropy increase is the most elementary example of the work we have to do if we want our knowledge of the microscopic laws to be relevant for the world including macroscopic objects around us. Thermodynamic concepts and phenomena are omnipresent and the principles that control them are universal.

The concept of the entropy and its properties, together with the basic operations needed for the proper analysis of ensembles of microstates - the averaging over initial states, the summing over final states (note the past-future asymmetry!) - are simply new things that people have to learn and master.

And some people simply don't want to learn new things. They're happier if they remain stuck with more "straightforward" extrapolations of the microscopic laws to the macroscopic context. However, their version of this extrapolation is completely incorrect and becomes increasingly incorrect as we study systems with a growing number of degrees of freedom - or growing entropy.

**Cosmological chauvinism**

It may be good for someone to be proud of his or her occupation but if such pride leads to far-reaching yet irrational conclusions, it's just too bad.

Even if a carpenter thinks - and wishes - that the Moon is made out of wood, no one will help him to make this dream a reality. Every new astronaut who returns with a rock from the Moon will be a new source of unnecessary conflicts.

The very same thing is true if an excessively proud condensed matter physicist - if I have to politely avoid the name of Robert Laughlin - thinks that black holes must be made out of his favorite superconductors. It's nice that he likes superconductors but his love can't ever be strong enough to rebuild the structure of the black holes in our Universe according to his image. ;-)

A cosmologist's idea that the right explanation of thermodynamic phenomena must be hidden in cosmology is equally ludicrous.

Thermodynamics has nothing to do with cosmology. Thermodynamic phenomena may occur in the lab. The lab may be isolated and everything that takes place outside the lab will be completely inconsequential for the correct predictions inside the lab. This was a "phenomenological" type of an argument but we can offer a "theoretical" one, too: every theory that agrees about the local physics in the lab - regardless of its more or less unusual ideas about the cosmological evolution (consider inflation, braneworlds, multiverse, pocket universes, cyclic universes, bubbles, or whatever you like) - will predict exactly the same evolution and probabilities of outcomes of various measurements in the lab.

A cosmological theory can't ever predict that people in the lab will observe Boltzmann's Brains as long as its local physics agrees with QED or the Standard Model. Why? Simply because the predictions of the lab phenomena only depend on the local properties of the theory, not on the number of pockets, bubbles, or sister universes in the multiverse or their lifetime (or the number of angels on the tip of a needle, for that matter).

(By the way, Boltzmann only proposed a statistical fluctuation of a higher-entropy state as a possible low-entropy beginning of our Universe, which is a totally sensible proposal that is kind of realized in eternal inflation. Boltzmann never claimed that what is called "Boltzmann Brains" today was a sensible explanation or prediction of our lab experience, and he completely realized - and always emphasized - that macroscopically huge fluctuations are so unlikely that they can be neglected everywhere in physics. The concept behind "Boltzmann Brains" was really invented by his critics who didn't understand the statistical description of thermodynamics and who wanted to humiliate it. The term was recently revived by Don Page and other people who weirdly believe that it can actually be true and predicted in a realistic theory.)

The entropy is increasing in every cubic inch of matter. It is increasing every second. Its increase is not something that shows up only when you average observations over cosmological distances. The thermodynamic phenomena are universal and their key underlying processes are associated with physics at the atomic, microscopic, and possibly mesoscopic scales, not the astronomical or cosmological ones!

Indeed, the lower entropy in the past may be extrapolated to the Big Bang. The entropy of the early Universe - or at least its visible part - had to be very small and probably zero. There are independent reasons to think that this is the right picture in cosmology.

**Neither necessary nor sufficient condition**

And this proposition about the early Universe "has the same arrow of time" as we obtained from "lab" thermodynamics. But the cosmological proposition is in no direct logical relationship with the second law of thermodynamics. In particular, a low or vanishing entropy of the early Universe is neither a necessary condition for the validity of the second law, nor a sufficient assumption to derive the second law.

To see that it is not necessary, it is enough to consider a more complicated, e.g. "eternal" cosmology. You can create it in such a way that it has no beginning. Bubble universes are forever being born out of their parent universes and there's no real beginning. The whole universe including all such bubbles will be infinite. You can't say that there was a low-entropy beginning because there's no beginning.

Still, such a universe will predict the same behavior in the labs because these predictions only depend on the local physics. In a very large Universe, you may find a lot of unusual places and unlikely events that only occur because they had many chances to occur. But these unusual events won't change anything about the correctly calculated predictions of lab phenomena.

For example, if you boil a soup, it will "almost certainly" be getting cooler once you turn the stove off. This is obviously true even in a very big multiverse with many bubbles and pockets. The correctly calculated probability that the soup will spontaneously warm up by cooling down the (already cool) table and extracting extra energy from it will be exponentially small in any universe or multiverse.
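To put a rough number on "exponentially small", here is a back-of-the-envelope sketch with illustrative figures I am assuming (one joule of misdirected heat, a 295 K table, 345 K soup): the probability of heat dQ spontaneously flowing from the cool table into the hot soup is suppressed by exp(-dS/k), where dS = dQ(1/T_c - 1/T_h) is the entropy that would have to decrease.

```python
import math

# Back-of-the-envelope estimate with assumed, illustrative numbers:
# the probability of heat dQ flowing the "wrong" way (cool table -> hot soup)
# is suppressed by exp(-dS/k), with dS = dQ * (1/T_c - 1/T_h).
k_B = 1.380649e-23         # Boltzmann constant, J/K
dQ = 1.0                   # one joule of misdirected heat (assumed)
T_c, T_h = 295.0, 345.0    # table and soup temperatures in kelvin (assumed)

dS_over_k = dQ * (1.0 / T_c - 1.0 / T_h) / k_B
log10_prob = -dS_over_k / math.log(10)
print(f"dS/k ~ {dS_over_k:.3e}")            # about 3.6e19
print(f"probability ~ 10^({log10_prob:.3e})")
```

A suppression of order 10^(-10^19) for a single joule is why nobody will ever see the soup warm itself up, in this universe or any other.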

You would need a wrong calculation - e.g. one that "cherry-picks" anomalous events and abuses them in a flawed "statistical" calculation - to obtain a different result. The very same comment applies to any lab predictions you can think of. Get used to it. There's no logically valid derivation of anomalous phenomena - such as Boltzmann's Brains - in any cosmological setup as long as the approximate locality, Standard-Model-like local laws of physics, and basic rules of logic are preserved.

We've seen that a low entropy of the young Universe is not necessary for the validity of the second law in the lab. It is equally easy to see that a low entropy of the young Universe is not a sufficient condition or tool to explain the second law of thermodynamics.

**A condition about the Big Bang is not enough**

If you were not allowed to use any "local" statistical reasoning - such as Weinberg's proof of the H-theorem - the information about a low entropy of the early Universe could not be enough to derive the correct, "irreversible" predictions for your lab (and certainly not its quantitative issues, which surely require a calculation similar to Weinberg's). Why?

Well, let's be generous and assume that we're told that the entropy of the early Universe was zero. We're also allowed to assume that it can never go negative, a fact that is obvious from the statistical interpretation of entropy as a kind of information. Are these two points enough to show that the heat won't flow from cooler objects to the hotter ones in your lab?

Clearly, these facts are not enough. The total entropy of the Universe may start at zero but it may go up and down. A positive function may still be increasing and decreasing in different intervals.

But even if you showed that the total entropy of the Universe is going to increase, you won't be able to prove that the entropy is increasing in every cubic inch of matter in the Universe. The sum may be increasing but what about the individual terms - the individual cubic inches? You clearly need a local, Weinberg-like argument to say anything about them, i.e. to make any predictions we care about, namely predictions of the lab phenomena.

One may even construct a counter-example to the statement that cosmologies with "S=0" at a privileged point have to predict the desired arrow of time even though no "logical" arrow of time is inserted into the argument (I claim it has to be!). For example, there have been ideas (search for "The black hole final state") that future spacelike singularities - such as the Schwarzschild or Big Crunch singularities - may be associated with a unique state.

So the path integral would have to be calculated with specific and unique "final" (i.e. "initial" in reverse) boundary conditions. That corresponds to "S=0" in the future.

Despite this new rule affecting the singularity, the path integral would still reduce to the normal Feynman rules away from the singularities. So even though the laws of physics could say that the entropy approaches zero as we get really close to the future singularity, the entropy is actually increasing all the time as long as we are far from the singularity. That's because away from the singularity, the path integral reduces to the old local laws, with the same low-energy approximation, and they imply that the entropy is locally increasing.

**Correct and wrong rules of retrodiction**

I have also mentioned the third reason that prevents many people from understanding the second law of thermodynamics. It is related to the other reasons and it has been discussed on this blog many times: the reason behind the irrational conclusions is a wrong set of rules for retrodiction.

Some people think that even when it comes to macroscopic predictions, "predicting" the past follows the same rules as predictions for the future. Except that it doesn't.

When we imagine that we know and keep track of all the exact information about the physical system - which, in practice, we can only do for small microscopic physical systems - the microscopic laws are time-reversal-symmetric (or at least CPT-symmetric) and we don't see any arrow. There is a one-to-one unitary map between the states at times "t1" and "t2" and it doesn't matter which of them is the past and which of them is the future.

A problem is that with this microscopic description where everything is exact, no thermodynamic concepts such as the entropy "emerge" at all. You might say that the entropy is zero if the pure state is exactly known all the time - at any rate, a definition of the entropy that would make it identically zero would be completely useless, too. By "entropy", I never mean a quantity that is allowed to be zero for macroscopic systems at room temperature.

But whenever we deal with incomplete information, this one-to-one map inevitably disappears and the simple rules break down. Macroscopic laws of physics are irreversible. If friction brings your car to a halt and you wait for days, you won't be able to say when the car stopped. The information disappears: it dissipates.

The right predictions for the future of a closed macroscopic system say that the entropy will exceed the current one. But the right retrodictions for the past say that the entropy was lower than today. Whenever you calculate the probabilities of evolution from one "macroscopic state" (being identified with an ensemble of similar microscopic states) to another, it's important that you sum over the microstates in the future, but you take the average over the microstates in the past.

The sum is something different than the average.

The rules above follow from logic and nothing else: for future outcomes, all of the microstates extensively *add* to your odds, while for the past initial conditions, the different microstates must *share* your chance, whose sum equals one! There can't be any other logic where the past and the future would be treated "equally" in these probability calculations (essentially because assumptions and their logical conclusions are not symmetric in logic!), and this built-in asymmetry between the past and the future effectively means that the evolution from past states whose entropy was smaller by "dS" than the current entropy is more likely, by the huge factor of "exp(dS)", than the time-reversed evolution.

For common values of the entropy of macroscopic systems, "dS" is around "10^{26}" and "exp(dS)" is a number similar to "exp(10^{26})". This is the missing (or incorrectly added) multiplicative factor in those people's formulae for retrodictions. Despite its scary size, these people are simply unable to notice that they're doing something incorrectly: that they have a breathtakingly wrong prefactor in all of their formulae for probabilities.
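The sum-over-final versus average-over-initial asymmetry, and the resulting "exp(dS)" likelihood ratio, can be simulated with a toy reversible dynamics (my own construction, with made-up microstate counts): a random permutation of microstates is exactly invertible, with no microscopic arrow of time, yet the macroscopic transition A → B comes out more likely than B → A by roughly W_B/W_A.

```python
import random

# Toy model of "sum over final, average over initial" microstates.
# Macrostate A has W_A microstates, macrostate B has W_B >> W_A; the exact
# dynamics is a random permutation of all microstates - perfectly reversible.
random.seed(1)

W_A, W_B = 4, 4096
perm = list(range(W_A + W_B))
random.shuffle(perm)                 # one reversible evolution step

A = set(range(W_A))                  # microstates 0 .. W_A-1 belong to A
B = set(range(W_A, W_A + W_B))       # the remaining microstates belong to B

def prob(initial, final):
    # Average over initial microstates (divide by their number),
    # sum over final microstates (count every hit in the target set).
    hits = sum(1 for i in initial if perm[i] in final)
    return hits / len(initial)

p_AB = prob(A, B)    # low entropy -> high entropy: very likely
p_BA = prob(B, A)    # high entropy -> low entropy: suppressed by ~ W_A/W_B
print(p_AB, p_BA)
```

The ratio p(A → B)/p(B → A) comes out near W_B/W_A = 1024 = exp(dS) with dS = ln(W_B/W_A), even though the microscopic map is exactly invertible: the asymmetry comes purely from averaging over initial and summing over final microstates.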

The multiplicative error in their reasoning is much much greater than the multiplicative error in the creationists' estimate of the age of the Earth. But the people who think that the local laws of physics don't imply any irreversibility don't care. Despite all the evidence, they think that they're smarter than the creationists. Well, they're surely not.

What I also find amazing is that these people think that they can actually *prove* that for states at moments "t1" and "t2", the entropy at "t2" exceeds the entropy at "t1", but they can also prove that the entropy at "t1" exceeds the entropy at "t2". These two conclusions are manifestly inconsistent, which proves that their very logic is inconsistent. They don't care.

Or maybe they think that everyone else's assumptions must also be inconsistent - or perhaps that mathematics itself is inconsistent? Well, neither mathematics nor the basic laws of physics including the local laws (such as the Standard Model) and statistical physics (needed to derive the thermodynamic phenomena) nor their union is inconsistent when used properly!

**Summary**

I urge everyone to finally wake up and return to common sense. There may be many ways to define the concept of entropy and to prove that it is increasing. But in the limit in which the entropy becomes relevant - the thermodynamic limit - all of them become equivalent. The entropy becomes equal to the logarithm of the number of macroscopically (and/or mesoscopically) indistinguishable quantum microstates (or other definitions) and the proofs of the second law become statistical arguments dealing with a lot of microstates.

The incomplete information has to be treated differently in the past and in the future. In the past, we don't *know* the incomplete (e.g. microscopic) information, which means that we must *average* over the microstates (perhaps with some non-uniform weights, if the priors have a reason to differ). In the future, we don't *care* about the incomplete (e.g. microscopic) information, which means that we must *sum* the probabilities over the indistinguishable microstates (and there is no freedom to choose any weights in the future: there are no "priors" for events predicted for the future).

This difference between the past and the future, or any equivalent difference, is an inherent part of logic, can't be eliminated, and guarantees that high-entropy states (with many indistinguishable fellow microstates) are favored in the future but not in the past. There's no freedom to modify this logic: the only freedom is to switch the terminology for the past and the future, but that's such a trivial operation that without a loss of generality, we may require that the past and the future always mean what we are used to.

Thermodynamics, its basic laws, and statistical physics belong to the basic stuff that every physicist worth the name should master.

**There were so many ways...**

A common proposition that the deluded people love to repeat one after another is the following:

We expect the early Universe to have a high entropy because there are so many ways for an early universe to have high entropy, and so few to have low - that's what high entropy means.

But this completely wrong statement is based on the assumption that the number of "ways" - i.e. microstates - increases the probability of a given macroscopic initial state. Well, it doesn't.

Whenever we compute probabilities involving the evolution, initial states, and final states, we *sum* the probabilities over the final microstates (or other possibilities), but we *average* (possibly with proper weights) the probabilities over the initial microstates (or other possibilities). The averaging is needed because the sum of the prior probabilities for mutually exclusive "hypotheses" - and/or "initial states" - must remain equal to one.

It means that a higher number of microstates increases the probabilities if they are the microstates of the final state of an evolution. But it does nothing whatsoever if they are the microstates of our initial state. What's amazing is that the people who are eager to parrot the wrong argument above not only question that the second law has been proved: they question its very validity.

Even if they're unable to calculate the probabilities properly in theory, being unaware of the rules of logical inference, why don't they make at least one observation of the world around them to see that the entropy in the past was lower, indeed? That instantly falsifies all of their fantasies.

The proofs of the second law show that the thermodynamic arrow of time agrees with the logical arrow of time. Any rational evaluation of phenomena that take place in time requires something like (Bayesian or another kind of) logical inference. The rules of inference are past-future asymmetric: they inevitably have a logical arrow of time. It can be shown that this arrow of time agrees with the thermodynamic arrow of time, the arrow of time from decoherence, and others.

Without any logical (or another) arrow of time, it is manifestly impossible to prove the existence of a preferred direction. After all, if a structureless "t" axis with no information about its directions could be proved to have a privileged direction, one could repeat the same proof in reverse and prove that the preferred direction is the wrong one, too. ;-) No consistent proof like that can therefore exist and anyone who is waiting for such a proof misunderstands the basic rules of logic.

Boltzmann gave the first proof of the only non-trivial statement one can make in this context, and all other proofs relevant for the thermodynamic arrow of time are guaranteed to be nothing other than variations of Boltzmann's original proof of his H-theorem.
