Wednesday, February 20, 2013

Ludwig Boltzmann: a birthday

Off-topic: Yuri Milner, Facebook's Zuckerberg, and Google's Brin launched a Life Sciences counterpart of the Milner Prize, with the same prize money. Because it's about life sciences, the chairman of the foundation is the chairman of Apple.
Ludwig Boltzmann was born on February 20th, 1844, in Vienna, the capital of the Austrian Empire. He hanged himself 62 years later, on September 5th, 1906, near Trieste (then a part of Austria-Hungary), where he was on vacation with his wife Henriette von Aigentler and a daughter. They had 3 daughters and 2 sons; Boltzmann probably suffered from undiagnosed bipolar disorder.

I consider Boltzmann to be not only the #1 person behind classical statistical physics but also the last "forefather" of quantum mechanics. His name appears in something like 100 TRF blog entries.




Fine. His grandfather was a clock manufacturer while his father Ludwig Georg Boltzmann (who died when the future physicist was 15) was a tax official. He had some good teachers such as Loschmidt and Stefan (yes, they share the Stefan-Boltzmann law) and was exposed to the contributions of 19th century giants such as Maxwell rather early on. He collaborated with Kirchhoff and Helmholtz, among others. Conversely, he was later the adviser to Lise Meitner, Paul Ehrenfest, and many others.

Already in his early twenties, he wrote a dissertation on the kinetic theory of gases, the main subject he revolutionized. Graz, the second largest city of Austria proper (an important one for Slovenes), was his most successful workplace.




All his major contributions to physics are linked to statistical mechanics – the microscopic "explanation" of the laws of thermodynamics. They include the Boltzmann [transport] equation for a probability distribution on the phase space\[

\frac{\partial f}{\partial t} + \frac{\mathbf{p}}{m}\cdot\nabla f + \mathbf{F}\cdot\frac{\partial f}{\partial \mathbf{p}} = \left(\frac{\partial f}{\partial t} \right)_\mathrm{collisions}

\] but also the Stefan-Boltzmann law for the total radiated energy (scaling as the fourth power of the absolute temperature; the exponent \(4\) is indeed generalized to the dimension of the spacetime if you change it)\[

j^{\star} = \sigma T^{4}.

\] and the Boltzmann distribution saying that Nature exponentially suppresses the likelihood that things jump to higher energy levels – the lower the temperature, the stronger the suppression becomes:\[

{N_i \over N} = {g_i e^{-E_i/(k_BT)} \over Z(T)}.

\] Most famously, his tomb offers visitors his statistical interpretation of the entropy:



In this equation, \(S=k\cdot \log W\), and in many other equations, we see Boltzmann's constant \[

k=k_B= 1.38\times 10^{-23}\,{\rm J/K}.

\] This is the ultimate constant to convert between kelvins (temperature) and joules (energy per degree of freedom, e.g. an atomic one). In other words, it's the conversion factor between statistical physics and thermodynamics; mature physicists set it equal to one, \(k=1\), much like in the case of \(c=\hbar=1\). The numerical value is small because people had only been familiar with the "thermodynamic limit" in which the number of atoms is very large, effectively infinite. In this \(N\to\infty\) limit, the thermal energy per atom is comparable to \(kT\), which is numerically about as small as \(k\) itself. A large number of atoms (or photons) form a "continuum" and the energy is smooth.
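If you want to see the conversion factor in action, here is a minimal Python sketch; the two-level system with a \(0.1\,{\rm eV}\) gap is my made-up example, not anything from the text above:

```python
# A minimal numerical sketch of k_B as a conversion factor and of the
# Boltzmann distribution above; the 0.1 eV two-level system is made up.
import math

k_B = 1.38e-23          # J/K, Boltzmann's constant
T = 300.0               # K, roughly room temperature

kT = k_B * T
print(f"kT at {T} K = {kT:.3e} J = {kT / 1.602e-19:.3f} eV")   # ~0.026 eV

# Populations of a hypothetical two-level system with a 0.1 eV gap
# (degeneracies g_i = 1), using N_i/N = g_i exp(-E_i/kT) / Z(T):
E = [0.0, 0.1 * 1.602e-19]                     # level energies in joules
weights = [math.exp(-E_i / kT) for E_i in E]
Z = sum(weights)                               # the partition function
for i, w in enumerate(weights):
    print(f"level {i}: N_i/N = {w / Z:.4f}")
# The excited level carries only about 2% of the population at 300 K;
# rerun with T = 100 K to watch the exponential suppression sharpen.
```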

Let me mention that the \(W\) in Boltzmann's trademark tomb equation looks simple but it stands for a rather complicated word, Wahrscheinlichkeit. That simply means "probability", although a more correct explanation is "frequency of occurrence of a macrostate: how many microstates correspond to it" (the natural probability of one particular microstate is the inverse of this number, so its logarithm is the same thing as the logarithm of the number, with a minus sign).
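To make \(W\) tangible, here is a toy count of microstates – my illustrative example, not Boltzmann's – for \(N\) coins whose macrostate is the number of heads:

```python
# An illustration of S = k log W: count the microstates W compatible with a
# toy macrostate ("n heads among N coins") and take the logarithm.
import math

k_B = 1.38e-23  # J/K

def entropy(N, n):
    """S = k ln W for the macrostate 'n heads among N coins', W = C(N, n)."""
    W = math.comb(N, n)
    return k_B * math.log(W)

N = 100
for n in (0, 25, 50):
    print(f"n = {n:3d}: W = {math.comb(N, n):.3e}, S = {entropy(N, n):.3e} J/K")
# The 50:50 macrostate has by far the most microstates, hence the most
# entropy; for N ~ 10^23 coins, its dominance becomes utterly overwhelming.
```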

You may also wonder why the entropy is denoted \(S\). Well, I can tell you something. The symbol as well as the word "entropy" were introduced by Clausius in 1865. The word "entropy" was deliberately chosen to be similar to "energy". The word "ἐνέργεια" i.e. "energeia" means "activity" or "operation" in Greek; similarly, "τροπή" i.e. "trope" is a transformation. The symbol \(S\) wasn't explained but given that the front page of a major related article by Clausius mentioned Sadi Carnot as the ultimate guru in the theory of heat (he was in the first wave of proponents of entropy), it's likely that \(S\) actually stands for "Sadi".

Ludwig Boltzmann mastered lots of the combinatorial exercises we often learn in the context of classical statistical physics (factorials to calculate the number of arrangements, and so on). But he spent many years on efforts to prove the second law of thermodynamics. A particular result of this kind was his proof of the H-theorem in 1872.
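Before we get to the controversies, it may help to see what the H-theorem asserts. The following sketch is an illustration of the statement, not Boltzmann's 1872 proof: a gas distribution relaxes toward the Maxwellian with the same density, mean velocity, and energy, à la BGK, and \(H=\int f\ln f\,dv\) falls monotonically. The grid, the relaxation time, and the bimodal initial state are my arbitrary choices:

```python
# Illustration of the H-theorem's claim under a BGK-type relaxation toward
# the Maxwellian sharing the gas's density, momentum, and energy.
import numpy as np

v = np.linspace(-6, 6, 241)                  # 1D velocity grid (arbitrary units)
dv = v[1] - v[0]

f_eq = np.exp(-v**2 / 2)                     # Maxwellian: mean 0, variance 1
f_eq /= f_eq.sum() * dv

# Bimodal initial state engineered to share mean (0) and variance (1) with
# f_eq, because the BGK collision term conserves those moments:
a, s2 = 0.8, 0.36                            # a^2 + s2 = 1
f0 = np.exp(-(v - a)**2 / (2 * s2)) + np.exp(-(v + a)**2 / (2 * s2))
f0 /= f0.sum() * dv

def H(f):
    """Boltzmann's H-functional on the grid, H ~ sum f ln f dv."""
    g = f[f > 1e-300]                        # skip zeros; 0 ln 0 = 0
    return float((g * np.log(g)).sum() * dv)

tau = 1.0                                    # relaxation time (arbitrary)
for t in np.linspace(0.0, 5.0, 6):
    f = f_eq + (f0 - f_eq) * np.exp(-t / tau)   # exact BGK solution, uniform gas
    print(f"t = {t:.1f}: H = {H(f):+.5f}")
# H drops monotonically to its Maxwellian value; the entropy S ~ -kH grows.
```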

Controversies about the provability of the second law

Even though Boltzmann's proof of the H-theorem is obviously right and is a template for proving the second law of thermodynamics in any microscopic theory or any formalism (including quantum mechanics), it has been attacked by irrational criticisms from the very beginning. Poor Boltzmann spent a lot of energy and created lots of entropy defending his important insight and I am sure that the crackpots who criticized him contributed to his decision to commit suicide.

The so-called Loschmidt irreversibility paradox is named after Boltzmann's former teacher, Johann Joseph Loschmidt (born in Carlsbad, Czech lands), and it is a deep misunderstanding of the origin of the second law of thermodynamics (and the arrow of time). The basic logic behind this "paradox" is that the microscopic laws are time-reversal-symmetric (or at least CPT-symmetric, if you discuss a generic Lorentz-invariant quantum field theory; the impact of the CPT symmetry is almost identical). So it shouldn't be possible to derive time-reversal-asymmetric conclusions such as the second law of thermodynamics (entropy is increasing with time but not decreasing with time).

Well, if you state it in this way, it looks tautologically impossible to prove the second law. However, what this argument completely misses is the fact that the second law of thermodynamics is a statement about statistical or probabilistic quantities such as the entropy. And to derive such statements, we actually need more than just the "microscopic dynamical laws" of physics; we also need the probability calculus. The probability calculus applied to statements about events in time is intrinsically time-reversal-asymmetric. And this asymmetry, the "logical arrow of time", is imprinted onto time-reversal asymmetries in other contexts.

For example, Boltzmann's H-theorem proves that the thermodynamic arrow of time (the direction of time in which the entropy increases) is inevitably and provably aligned with the logical arrow of time.

Technically, the critics were attacking the assumption of molecular chaos and were claiming that it was the key assumption that made Boltzmann's proof vacuous or invalid or whatever they would say. But this is a complete misunderstanding of this assumption. Molecular chaos is just a technical assumption about the velocities of the gas molecules – that they're uncorrelated in the initial state – which allows one to calculate certain things analytically.

But even if we introduced arbitrary correlations or other features of the initial probability distribution, it would still be guaranteed – up to negligible, exponentially supertiny probabilities – that the entropy is actually going to increase with time! If you want the entropy to decrease with time, it is not enough to start with an initial state that contains correlations between velocities. You need to start with some totally unnatural correlations between the positions and velocities that are exponentially unlikely and that happen to evolve in a way that reduces entropy for some time. The only way to "calculate" the correlations required to start a decline of the entropy is to actually evolve a desired low-entropy final state backwards in time. But this can't occur naturally.

The very fact that the entropy increases in the real world has nothing whatsoever to do with any details of the molecular chaos assumption. The actual reason why the second law – a maximally time-reversal-asymmetric fact about Nature – holds is that it is a claim of probabilistic character. And probability calculus has an inevitable, intrinsic, logical arrow of time. When you calculate the probabilities for a transition between macrostates \(A\to B\), you need to sum the probabilities over the possible final microstates \(B_j\), but you need to average (not sum) over the possible initial microstates \(A_i\). Think about the origin of this summing and averaging. They are unavoidable consequences of "pure logic". For the final (mutually exclusive) microstates, the probabilities are simply summed; for the initial states, the fixed "total prior probability" must be divided among many microstates.

Averaging and summing are different things. As explained in dozens of TRF posts, this difference is reflected by an extra factor of \(1/N_A\sim \exp(-S_A/k)\) and guarantees that\[

\frac{P(A\to B)}{P(B^*\to A^*)} \sim \exp[(S_B-S_A)/k].

\] The asterisks represent the time-reversal- or CPT-transformed states. (The signs of velocities/momenta etc. are reversed.)

Because the entropy difference \(S_B-S_A\) is typically incomparably larger than \(k\), Boltzmann's constant, and because the probabilities can't exceed one, it's clear that the right hand side above is either zero or infinity in the thermodynamic limit and only one of the probabilities from the numerator or denominator (the probability of the process where the entropy increases) may be nonzero. This is the actual reason why the second law of thermodynamics holds.
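You can check this asymmetry in a toy model. In the following sketch (an invented example with made-up microstate counts), one step of the microscopic dynamics is a single permutation of microstates – perfectly reversible – yet averaging over the initial microstates and summing over the final ones reproduces the ratio \(N_B/N_A=\exp[(S_B-S_A)/k]\) exactly:

```python
# Toy check that "average over initial, sum over final" yields
# P(A->B)/P(B*->A*) = N_B/N_A = exp[(S_B - S_A)/k], with reversible dynamics.
import math
import random

random.seed(1)
N_A, N_B = 4, 64                 # made-up microstate counts; S = k ln N each
N = N_A + N_B
A = set(range(N_A))
B = set(range(N_A, N))

# Reversible microdynamics: one time step applies a fixed permutation pi.
pi = list(range(N))
random.shuffle(pi)

n_AB = sum(1 for i in A if pi[i] in B)   # microstates of A landing in B
assert n_AB > 0                          # essentially guaranteed at these sizes

P_fwd = n_AB / N_A   # P(A->B): sum over final microstates, average over N_A
# The time reverse is the inverse permutation; the very same n_AB pairs
# now run from B back to A, so:
P_rev = n_AB / N_B   # P(B*->A*): the same sum, averaged over N_B instead

print(P_fwd / P_rev, N_B / N_A)                 # both equal 16.0
print(math.exp(math.log(N_B) - math.log(N_A)))  # exp[(S_B-S_A)/k] ~ 16.0
```

Note that the ratio is independent of the details of the permutation; only the microstate counts – i.e. the entropies – enter.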

Glimpses of the quantum, Bohrian thinking about the world

Why was it so hard for the people to understand these things? And why is it so hard for some physicists – such as Brian Greene or Sean Carroll – even today, more than 100 years after these discoveries were made? Well, I think that the reason is the same as the reason why certain physicists (including the two I have mentioned) can't understand the foundations of quantum mechanics. In fact, they face almost the same problem. Let me explain why.

Niels Bohr is the author of a quote that's been mentioned on this blog many times:
There is no quantum world. There is only an abstract physical description. It is wrong to think that the task of physics is to find out how nature is. Physics concerns what we can say about nature...

As quoted in "The philosophy of Niels Bohr" by Aage Petersen, in the Bulletin of the Atomic Scientists Vol. 19, No. 7 (September 1963); The Genius of Science: A Portrait Gallery (2000) by Abraham Pais, p. 24; and Niels Bohr: Reflections on Subject and Object (2001) by Paul McEvoy, p. 291.
I have mentioned that this is the key psychological obstacle for many people in the context of quantum mechanics. They're permanently looking for a classical model. There is an objective reality, they believe, described by the values of some mathematical objects (functions): the mathematical functions and the objective reality are isomorphic to each other and observations are only passive reflections of this underlying reality.

In quantum mechanics, things are different. The fundamental things are propositions that we can make about observables (properties of physical systems) at various moments. The laws of quantum mechanics relate the truth value (and, more generally, probability) of these propositions directly and there is no way to reduce them to a classical model or objective reality in between. Although all of us laughed when we were kids and we were told about philosophers who question the existence of objective reality, it's nevertheless true that at the fundamental level, objective reality doesn't exist. It is just an emergent, approximate concept.

But what I haven't emphasized sufficiently often is that Bohr's quote actually applies to classical statistical physics as well. Many people misunderstand the proposition-based character of physics in this context which leads them to invent lots of irrational criticisms against classical statistical physics, too. What's going on?

The critics are still imagining that Nature is found in a particular, deterministically evolving microstate, and they try to evaluate the second law of thermodynamics from this viewpoint. In classical physics, it is tolerable to imagine that there is a particular, deterministically evolving microstate behind all the information about Nature; in quantum physics, it isn't tolerable, as Bohr's quote above pointed out.

However, even in classical statistical physics, the existence of such a particular, deterministically evolving microstate is completely irrelevant for the validity of the second law of thermodynamics. Why? Because the second law of thermodynamics isn't a statement about a particular microstate in the phase space (or Hilbert space) at all! It is an intrinsically statistical statement about a collection of such microstates or about a probability distribution and its evolution.

Let me give you an example. The second law says, for example, that...
...if you place a hot bowl of soup on a cold table and measure the soup-table temperature difference 20 minutes later, it's almost guaranteed that you will get a smaller number than you obtained at the beginning.
Let's analyze it a little bit. The first thing to notice is that the sentence above is a proposition, not an object. It is a probabilistic proposition and quantitatively speaking, it is true because we may show that for the proposition to be wrong, the entropy would have to decrease but the probability of such a process is exponentially tiny, something like \(p\sim \exp(-10^{26})\). The smallness of this number is what we mean by "almost guaranteed".

Fine. How is it possible that such a time-reversal-asymmetric proposition follows from the time-reversal-symmetric microscopic dynamical laws of physics? To see the answer, we must realize that the proposition deals with macroscopic objects such as a hot bowl of soup and a table. It's important to notice that these phrases don't represent any particular microstates – precise arrangements of atoms. It's very important to acknowledge that these phrases represent macrostates – statistical mixtures of microstates that look macroscopically (almost) indistinguishable. That's true for the soup in the initial state as well as the soup in the final state.

It is a technical detail whether these statistical mixtures are "uniform" (the same probability for all of the microstates) or not; that's just the difference between microcanonical and canonical or grand canonical ensembles. These mixtures don't have to be uniform. What matters is that there are many microstates that have comparably large probabilities to be realizations of concepts such as a hot bowl of soup. That's why the proposition "soup will cool down" above is a proposition about a transition between an initial state and a final state. And it is a proposition that is appropriately "statistically averaged or summed" over the microstates.

As I have emphasized above (and in dozens of older TRF blog entries), the right way to statistically treat such combined propositions about many microstates or macrostates is to sum over the possible final microstates, but average over the possible initial microstates (with weights identified with some "prior probabilities" that depend on other subjective choices and knowledge, but you won't lose much if you assume that all initial microstates in a set are equally likely). When you sum-and-average these transition probabilities for microstates of the soup+table, you will get a result that is totally time-reversal-asymmetric. The entropy increases because the summing/averaging asymmetry favors a larger number of "fellow microstates" for the final state and a lower number of "fellow microstates" for the initial state.
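If you want to watch this logic cool a "soup", here is a simple Monte Carlo sketch based on the standard Ehrenfest urn toy model (my choice of illustration; the particle number is arbitrary). \(N\) particles sit in the two halves of a box, one random particle hops per step, and the coarse-grained entropy \(S=k\ln W\) with \(W=\binom{N}{n}\) almost certainly climbs to its maximum and stays there:

```python
# Ehrenfest urn sketch: a low-entropy start (all particles on the left)
# relaxes to the macrostate with the most "fellow microstates".
import math
import random

random.seed(0)
N = 1000                      # number of gas particles (arbitrary)
n_left = N                    # low-entropy start: everything on the left

def S_over_k(n):
    """Coarse-grained entropy of 'n particles on the left', in units of k."""
    return math.log(math.comb(N, n))

for step in range(20001):
    if step % 4000 == 0:
        print(f"step {step:5d}: n_left = {n_left:4d}, S/k = {S_over_k(n_left):7.2f}")
    # pick a random particle; with probability n_left/N it sits on the left
    if random.random() < n_left / N:
        n_left -= 1
    else:
        n_left += 1
# n_left drifts to ~N/2, where C(N, n) peaks; a spontaneous return to
# n_left = N would require waiting for roughly 2^N steps.
```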

It's the mathematical logic, pure probability calculus applied to propositions about things that occur at various moments of time, that is the source of the time-reversal asymmetry. One doesn't need any time-reversal asymmetry of the microscopic laws. Indeed, these laws are time-reversal-symmetric (or at least CPT-invariant in the case of quantum field theories but the CPT-invariance plays the same role).

There is nothing paradoxical about the validity of the claim "soup will cool down". Every sane person knows that it's true. And Nature doesn't need any time-reversal-asymmetric terms in the microscopic equations of motion for the elementary particles to cool down the damn soup! It just cools down because of basic statistical considerations.

Things would be different if we made a statement about a particular microstate of the hot bowl of soup. Would a particular microstate of soup (in contact with a cold table) evolve into a microstate that looks like hotter soup or colder soup? Now, this question isn't completely well-defined. You would have to tell me what microstate you are actually asking about. And by this comment, I really mean that you would have to tell me the \(10^{26}\) positions and velocities of the elementary particles in the soup, with the amazing precision that I need to make the prediction.

No one ever does that in the real world and it isn't really needed because we know that, with insanely unlikely exceptions, whatever the initial state of the soup is, the soup will just cool down – within a second, within a minute, it will always cool down. A macroscopic decrease of the entropy is virtually impossible. It will never happen anywhere in the Universe during its lifetime (except possibly on super long timescales such as the Poincaré recurrence time).

What about the unlikely exceptions? Yes, a tiny fraction of the states, \(1/\exp[(S_B-S_A)/k]\) of them, will evolve in a way that decreases the entropy. But these exceptional states can't be isolated by any natural condition for their velocities that would only look at what their values are "now" (in the initial state). The only way to define these special states is to say that these are the states that will just happen to evolve into low-entropy final states.

Indeed, if you define the initial state of the soup in this way (as a microstate that evolves into a lower-entropy final state by the microscopic equations of motion), the right statement about its evolution will be that it will evolve into a lower-entropy final state. But this proposition will be an uninteresting tautology. What the actual second law of thermodynamics is concerned with is something entirely different: realizable initial states of hot soup, with the phrase "soup" simply defined as a statistical mixture of these microstates, which forces you to use statistical and probabilistic methods to evaluate the probabilities and truth values of propositions!

The critics are also entirely wrong that the same comments would apply to the final state. It is not true that the final microstate of the soup is "generic". Instead, among the equally high-entropy states, it is an extremely special microstate because it has evolved from a lower-entropy initial state and there are just "few" of these low-entropy initial microstates. We are told that it has evolved; it is a part of the homework exercise we were supposed to solve!

When we discuss the evolution of a bowl of soup on the table, it would be totally incorrect to think that "bowl of soup" in the final state denotes an equal statistical mixture of all conceivable similar high-entropy microstates of the soup and the table. Indeed, the very formulation of the problem says that the bowl of soup was sitting on the table so the final soup did evolve from an initial state, and it must therefore be a special microstate.

The key difference between the initial state and the final state is that it is legitimate to assume that all the allowed microstates in the initial state are comparably likely; but it is not legitimate to assume that all the macroscopically similar final microstates are equally likely. The latter claim about the final state is illegitimate simply because we aren't allowed to choose the probabilities of the final microstates; they are – by the very definition of the adjective "final" or the noun "future" – determined from the probability distributions in the initial state and the properties of the initial state in general. The future evolves from the past, not vice versa!

On the other hand, the initial state evolved from some states at even earlier instants of time and the formulation of the problem says nothing about those. That's why it's allowed to organize our knowledge about the initial state as a statistical mixture of microstates of a certain kind – where all the microstates are comparably represented. That's right because the only thing that we know about the initial state is that it is a hot bowl of soup etc.; this state doesn't have any special "micro" properties. The final microstate does have some special "micro" properties (correlations) because – as the description of the very problem says – this final state evolved from an initial state that was also a soup. We can't revert this statement and say that the initial state evolved from the final state because – by the definition of the words "initial" and "final" – it just hasn't.

As you can see, the critics reach wrong conclusions because they're sloppy about what we know and what we do not know. But it's a part of their philosophy to be sloppy about "what we know" because they think that the knowledge is an irrelevant spiritual subjectivist solipsist stuff that has nothing to do with physics. So they behave as if they knew the exact microstate. But that's a complete fallacy. They do not know the exact microstate and if they assume that they do, and that the initial state is very special, they inevitably reach wrong conclusions that are easily falsified by observations. Statistical claims about the soup or any objects in thermodynamics are all about our knowledge and ignorance. It's very important to distinguish what we know and what we do not know and what we partially (probabilistically) know and what the probabilities are.

Classical statistical physics is about the careful derivations of true (or extremely likely) claims about systems with many degrees of freedom out of some other true (or extremely likely) assumptions that we were told to be valid. It is all about propositions. Quantum mechanics has upgraded this principle to a new level because it became impossible – even in principle – to assume that there is a particular "objective reality" (microstate at each moment) at all. Nevertheless, the feature – that physics is about making right propositions, not about mindlessly visualizing a "model of the precise thing" – was already present in classical statistical physics. Even classical statistical physics tells us that interesting statements about Nature (or soup) are statements encoding partial and probabilistic knowledge of some features of Nature (or soup), and we should be very interested in how these statements are related to each other.

So the deluded folks who helped to drive Ludwig Boltzmann to suicide were direct predecessors of the contemporary anti-quantum zealots in the same sense in which Ludwig Boltzmann himself was a forefather of modern physics, a Niels Bohr prototype.

And that's the memo.


snail feedback (32) :


reader TooT said...

Is there a "quantum equivalent" of the H theorem for quantum mechanics? I mean looking at density operator of system immersed in a bath and subject to fluctuation dissipation, is it possible, looking at its master equation, to show that the Von Neumann entropy of our subsystem ($ = - \text{Tr} \rho_S \ln \rho_S$ with $\rho_S$ the density operator of the system only) to be "more likely" (in the sense Lubos described in his post) to increase?


reader Luboš Motl said...

Yup, one may rewrite the proof of the H-theorem in the quantum formalism, too. See e.g. the last section of

http://motls.blogspot.com/2009/04/god-and-boltzmann-eggs-dogmas-and.html?m=1

which also mentions a proof from a book by Weinberg.
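If you just want to compute the quantity you wrote down, here is a minimal sketch (a generic qubit example, not taken from that post):

```python
# Von Neumann entropy S = -Tr(rho ln rho), computed from the eigenvalues
# of the (Hermitian, unit-trace) density matrix rho.
import numpy as np

def von_neumann_entropy(rho):
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]      # drop zeros; 0 ln 0 = 0 by convention
    return float(-np.sum(evals * np.log(evals)))

pure = np.array([[1.0, 0.0], [0.0, 0.0]])    # a pure state: S = 0
mixed = np.array([[0.5, 0.0], [0.0, 0.5]])   # maximally mixed qubit: S = ln 2
print(von_neumann_entropy(pure))    # 0.0
print(von_neumann_entropy(mixed))   # 0.693... = ln 2
```

Monitoring this quantity along the solution of a Lindblad-type master equation is a standard way to watch the subsystem's entropy (almost always) grow.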


reader Gene Day said...

I am amazed that someone for whom English is a second tongue can write such perfect prose, Lubos. I am a real nit-picker when it comes to good writing and I can’t find any nits to pick here. I concur with every word of this “memo”, of course.


reader AlainCo said...

Just one naive idea. I don't know much about thermodynamics, besides some useful consequences.


If the law of entropy growth were false, it would be possible to use that to separate hot from cold without consuming energy; then you would be able to produce energy with a thermal engine...


So is this law a consequence of the energy conservation law (the 1st law of thermodynamics)?


Is it true, as I've read, that breaking the Heisenberg inequality would break the 2nd law of thermodynamics?


So is the Heisenberg inequality also a consequence of the conservation of energy?


Maybe it is a tautology? Or a stupidity?


It seems to me that there are few real laws that look hard to break (from the data we have today): the speed of data transmission, energy/mass conservation, entropy increase. Maybe they are all the same law?


What is your opinion?


reader nick said...

Hi Lubos,

I wonder why you didn't mention coarse-graining in your article (which is excellent and extremely well written, by the way). It is at the heart of my understanding of the second law, since it provides the mechanism by which the number of accessible microstates actually grows for a system – according to Liouville's theorem, that number would otherwise be constant. And if S_A and S_B were the same, the time asymmetry would vanish.
Cheers,
Nick


reader Luboš Motl said...

Dear Nick, I agree that coarse-graining is an essential concept here but I decided to avoid this term because I still have a plan to write a blog entry where coarse-graining and its character would play the key role...


Right, if S_A and S_B are the same, there's no asymmetry. The system must already be at equilibrium.


reader Luboš Motl said...

Thank you very much for your compliments – although I feel that you may be trying to balance some recent criticisms of my language skills that haven't tangibly hurt me because, as you may guess, I don't consider foreign languages to be a pillar of my self-confidence. ;-)


Good that you agree!


reader Luboš Motl said...

Right, if you could separate materials into cold and hot - so that the temperature difference would increase with time, conversely to the usual behavior - it would be very useful. You could effectively extract energy by constantly cooling another object, e.g. the whole Earth.


Such a wonderful device is known as the "perpetual motion machine of the second kind" and its non-existence is another way to formulate the second law of thermodynamics.


I think that the second law of thermodynamics; first law of thermodynamics (energy conservation); and the Heisenberg uncertainty principle/inequality are three mutually (pairwise) unrelated, inequivalent principles of physics.


reader Shannon said...

A cold soup tells us more about its own history than a hot soup.


reader chuck said...

I'm not clear that you have finessed dynamics with this argument. As a thought experiment, start with the classical thermodynamic rigid box with a divider, and in one side put a single large frictionless, perfectly elastic ball with a given energy. I think the classical state of the ball could be well enough determined to integrate its path well into the future after the divider was removed, and in almost all cases it would shortly be found on the other side of the box. Now there are statistics there in the form of 'almost all initial states of given energy', but I think there is also dynamics. In particular, if the ball is headed towards the other end, we can expect it to get there, which reduces to Newton's first law.

This thought experiment isn't a many-particle, macroscopic, thermodynamic system, but on the other hand, I think the fact that most macroscopic systems are comparable temperature-wise needs some explanation.


reader NumCracker said...

Dear Lubos, when you say: "But even if we introduced arbitrary correlations or other features of the initial probability distribution, it would still be guaranteed – up to negligible, exponentially supertiny probabilities – that the entropy is actually going to increase with time!" ... one would think that in case there are long-range correlations (decaying slower than power laws) and memory effects, one trying to prove the H-theorem would have a hard time, right?


reader Gene Day said...

For anyone to write really well they have to know a hell of a lot about the language AND they have to be careful. I think that anyone contributing to a public forum such as TRF should be pretty careful. Failing to do one’s best is a discourtesy to the readers.


reader Luboš Motl said...

Not at all. This is exactly the silly misconception I was attempting to clarify - but clearly, I have failed in your case. Long-range correlations have absolutely no power to invalidate the second law.


Long-range correlations are about a few degrees of freedom - because there are not too many degrees of freedom that have long wavelengths. The entropy is increasing because the bulk of the degrees of freedom - mostly the shortest wavelength degrees of freedom that still carry variable energy - is getting more chaotic.


To make the entropy decrease, you have to introduce insanely unnatural and extremely precise correlations of extremely short-wavelength variables across the system. Long-range correlations or memory effects won't do.


reader Robert Rehbock said...

One so fluent with the language of math doubting his language skill? I am still trying to absorb and learn enough background to follow sufficiently the recent Brane post but the English is "ausgezeichnet".




reader Rami Niemi said...

I can't get over this "puzzle": I can fine-tune classical billiard-ball particles into an arrangement that decreases entropy in time; a point of phase space. But if one chooses two points close around that point, with equal average positions and momenta, one loses the fine-tuning of the averaged entropy of these points at the speed determined by the Lyapunov exponent. Now entropy increases in both directions of time due to this averaging or ambiguity? What am I missing?


reader Luboš Motl said...

Dear JR, thanks for the tip but no, because of your last sentence. The popular book market is completely distorted - there is pretty much a "consensus" about virtually all the misconceptions one may think of in this context - and I don't want to be fighting windmills. I think that these delusions must have overwhelmed the popular books for a good reason - probably the popular book readers want to hear them all the time. In my guess, this makes it likely that a book offering the valid alternatives to all these things couldn't be anywhere close to a bestseller.


reader Luboš Motl said...

Fun topic but the author over there doesn't say too much beyond simple cliches, does he?


reader Luboš Motl said...

Dear Guest, what you're missing is the central point - one that I have been trying to stress throughout most of this blog entry - that statistical mechanics is about statistical claims.


You may fine-tune a particular classical microstate so that it has some behavior when evolved back and forth in time but that's a completely different statement than the statement that the entropy of a macroscopic object is almost certainly increasing into the future and was almost certainly lower in the past. The latter substatement requires retrodictions with the (relatively) final state determined by a probability distribution and with a choice of priors for the (totally) initial state. These claims about macroscopic/thermodynamic systems are simply *not* statements about individual microstates you may invent so these individual microstates with certain properties are irrelevant exceptions.


reader Rami Niemi said...

I assume that I do make the step from the deterministic view into a statistical claim by this toy model of averaging over two points in a vicinity. It might as well be a sphere S^6 around the fine-tuning point in phase space. In my imagination the outcome is that entropy always grows in both directions in time, because you drag the S^6 with you through the simulation, and because the nearby points end up in high-entropy microstates at both ends. Needless to say, I think the S^6 has size deltaX times deltaP.


reader Luboš Motl said...

Dear Rami, I have already told you where the mistake is, so why are you just repeating it verbatim as if you were deaf?

It is not true that all points in a sphere - you surely meant a much higher-dimensional sphere than a 6-dimensional one, something like a 10^{26}-dimensional sphere - are equally (or even comparably) likely as microstates describing the final state. Only those that evolve from much lower entropy states in the past have a noticeable probability to "be" the final state.


By talking about the sphere in the context of the final state, you are making an *assumption* about the probability distributions, and it's just a wrong assumption. What the hell is so hard about this to understand?


reader Rami Niemi said...

Sorry for the dubious expression, I didn't mean that the final states are in the sphere (and yes, a 10^(26+6)-dimensional sphere). What I tried to state was: let's follow the points on S^(10^32) around the fine-tuning point and find out that the average outcome is not at all fine-tuned to a low-entropy state. Looking now at this fine-tuning point in the opposite direction with respect to time, it looks like hot soup cooling, and the points on S^(10^32) will do the same.


To my view, the statement "the entropy of a macroscopic object is almost certainly increasing into the future and was almost certainly lower in the past" does not contradict the statement "simulating a low-entropy particle soup will end up in a high-entropy state regardless of the direction of time". Reversing time I assume to be the same as reversing all momentum vectors at the start.


reader Luboš Motl said...

I feel sure that this discussion has been running in circles for quite some time. Yes, "most" points around a microstate that evolved from an apparently low-entropy state didn't evolve from a low-entropy state - and there's no reason why it should be so because "most" is evaluated with respect to the (nearly) uniform measure for all microstates and one may show that this measure is not an appropriate one to describe the final state.


reader Rami Niemi said...

Feel free to end this discussion.

But is it reasonable to you to examine time reversal by reversing the momenta of the particles? Because if it is allowed, I feel that the 2nd law can be stated as "entropy grows as time goes" instead of "entropy grows as time goes forward", for classical gases. This especially in mixing two gases in a box, etc.


And for the record, I don't want to replace the wave function with something classical.


reader Luboš Motl said...

Right, you're still not getting it. The two statements are exactly equivalent because time always goes forward - that's a different way to say the same thing.


Indeed, it would be a catastrophe - a rudimentary logical inconsistency - if one could prove that the entropy is increasing in both directions because "increasing backwards in time" is the same thing as "decreasing forwards in time in the same interval" and no function S(t) may be both increasing and decreasing in an interval.


This self-evident logical inconsistency is what people like you *constantly* live with and you're apparently not capable of noticing that your beliefs about the world suffer from this lethal flaw. It's extremely important to realize that the growth of the entropy may only be proven when time goes in the right direction, namely forwards, and when all the tools are used properly, one may indeed prove this growth of entropy in the forward direction only.


I will not approve your next comment if it is equally un-new as this one.


reader Rami Niemi said...

"Time always goes forward" is less or equally plausible to "time always goes the same direction". Statement "entropy grows as time goes" fits both.

I'm not hinting any fluctuations of time, just the initial toss of a coin in which the direction was chosen. In universe of ours it goes forward (some SSB mechanism, perhaps). What is the way to determine which way time goes, except the way it always goes? And hence the Loschmidt's paradox?

Most sincerely


reader Luboš Motl said...

"What is the way to determine which way time goes, except the way it always goes?"



A method to find out which direction time goes is to wait for a while, and one gets to the future. Holy fuck. Is that really hard for you to distinguish the past and the future? They're completely different.


There's no paradox.


reader Rami Niemi said...

The past is closer than the future to the point at which the SSB of time took place. Mind me for this cosmology-influenced heresy, but at the beginning both options ought to be on the table, with the 2nd law and the quantum mechanical H-theorem describing them both.


reader Claes Johnson said...

For a formulation of the 2nd law with the big mystery of entropy replaced by the smaller mystery of turbulent dissipation, see

http://claesjohnson.blogspot.se/2013/02/2nd-coming-of-2nd-law.html


reader Jan Reimers said...

Dear Lubos, understood; as with progress in any field, it is an uphill battle. It seems you are in the same position as Richard Dawkins and Carl Sagan, who have both written extensively about debunking irrational myths. Dawkins' book "The God Delusion" is selling reasonably well, so there is some market for a rational exposé on reality. Anyway, I appreciate all your efforts at explaining the world to those of us who follow your work.

JR


reader Claes Johnson said...

Such a book has already been written, see The Clock and the Arrow: A Brief Theory of Time:

http://books.google.se/books?id=W0xp9JMnhFwC&lpg=PP1&pg=PP1&redir_esc=y#v=onepage&q&f=false


reader Gene Day said...

Yes, Gordon, I, too, found a few nits to pick upon a second reading but the clarity of Lubos’ thinking shines through, doesn’t it?