Thursday, November 05, 2009

An anti-inflation paper by Brian Greene et al.

The world must be coming to an end, indeed. :-) Brian Greene, Kurt Hinterbichler, Simon Judes, and Maulik K. Parikh wrote a preprint called

Smooth initial conditions from weak gravity (PDF)
which claims that inflation doesn't solve the smoothness problem and offers an extremely awkward solution - more precisely, a non-solution - to something that the authors present as a problem even though it is manifestly not one.

The main motivation for the paper seems to be their misunderstanding of the meaning and validity of the second law of thermodynamics, a topic that has been discussed here many times. The authors start their article by repeating pretty much all the myths about the second law. For example, they write:
The status of [Boltzmann's H-theorem] is less settled than often claimed, because it requires the so-called 'molecular chaos' assumption, doubts about whose applicability have not been firmly laid to rest.
Well, doubts will never be "firmly laid to rest" - partly for legitimate reasons (because science is never over) but, especially in this case, mostly for the reason that Brian Greene nicely explained on TV when he tried to teach general relativity to a dog. Certain things will simply always be difficult to understand for many mortals. A more important fact - for physics - is that the status of the molecular chaos assumption has been fully understood by the experts in statistical physics since 1895.

Recall that the assumption says that the velocities of the individual molecules [of a gas] are uncorrelated with each other and independent of the positions. This assumption allows one to perform various calculations pretty much analytically - which is what Boltzmann was able to do, using the simple Maxwell-Boltzmann-like distributions.

Is the assumption true? Well, strictly speaking, assuming the infinite accuracy of our measurements and an unlimited capacity to analyze the data, it is not true. It is a simplifying assumption that allows us to proceed quantitatively. There do exist correlations between the velocities: once two molecules collide, their velocities surely become correlated.

But what's important is that it is exponentially unlikely for any such microscopic correlation between the velocities to influence macroscopic, statistical features of the physical system in the future. If you assume that the initial correlations carry N bits of information, these N bits are rapidly distributed among all the molecules of the gas. Moreover, they are encoded into very subtle microscopic features of the final degrees of freedom that can only be reconstructed if you're able to measure pretty much all of the degrees of freedom, and to measure them with an exponential accuracy.

This process is called thermalization.
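
A toy model can make this concrete. Here is a minimal sketch - my own construction, not anything from the paper - in which every molecule starts with a perfectly correlated "twin", and random energy-conserving pairwise "collisions" then dissolve the correlation into microscopic detail, while the macroscopic predictions of the molecular-chaos calculation are untouched:

```python
# Toy thermalization sketch (illustrative assumptions: 1D velocities, collisions
# modeled as random rotations in the (v_i, v_j) plane, which conserve the pair's
# kinetic energy v_i^2 + v_j^2 exactly).
import numpy as np

rng = np.random.default_rng(0)
N = 2_000
half = rng.normal(size=N // 2)
v = np.concatenate([half, half])        # each molecule has a perfectly correlated twin

for _ in range(20 * N):
    i, j = rng.choice(N, size=2, replace=False)
    theta = rng.uniform(0.0, 2.0 * np.pi)
    vi, vj = v[i], v[j]
    v[i] = np.cos(theta) * vi - np.sin(theta) * vj
    v[j] = np.sin(theta) * vi + np.cos(theta) * vj

# The initial correlations are now spread over subtle microscopic features of all
# the degrees of freedom; no low-moment ("macroscopic") quantity remembers them.
print(f"residual twin correlation: {np.corrcoef(v[:N//2], v[N//2:])[0, 1]:+.4f}")  # ~ 0
print(f"total kinetic energy (conserved): {v @ v:.2f}")
```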

The probability that a macroscopic (or even mesoscopic, or even de facto measurable microscopic) quantity describing the final state is affected by some particular microscopic correlation in the initial state goes like "exp(-S/k)" where "S" is at least comparable to the entropy increase and "k" is a very small constant, Boltzmann's constant. The probability is negligible for any macroscopic system. This probability is morally closer to an inverse googolplex than to an inverse googol. It's just zero for all empirical purposes.
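
To see why the number is "morally" googolplex-like, just plug in a typical macroscopic entropy. The figure below uses an assumed, back-of-envelope value - "S/k" of order "10^{23}" for a gram of gas - rather than any precise computation:

```python
# exp(-S/k) expressed as a power of ten, for an assumed macroscopic entropy scale.
import math

S_over_k = 1e23                             # assumption: entropy (in units of k) of ~ a gram of gas
log10_p = -S_over_k / math.log(10)          # exp(-S/k) = 10^(log10_p)
print(f"probability ~ 10^({log10_p:.2e})")  # ~ 10^(-4.3e22): far below 1/googol = 10^-100
```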

For this reason, virtually all initial microstates of a gas with the same macroscopic and/or mesoscopic characteristics will evolve into a final state with the same macroscopic and/or mesoscopic characteristics. For this reason, it is also possible to compute the statistical averages over all initial microstates - which is equivalent to setting the correlations to zero. Such an averaged calculation can actually be done. And it leads to the same macroscopic final outcome as virtually all the individual initial microstates, with the exception of an expo-exponentially tiny fraction, "exp(-S/k)", of the initial states.

The initial conditions with "molecular chaos" simplify the calculations but the macroscopic results of the calculations are completely robust and hold for pretty much all the microstates, including those that don't satisfy the molecular chaos.

The purpose of Boltzmann's H-theorem is not to keep physicists confused about meaningless everlasting philosophical debates about "how to get the arrow of time from nothing". Of course, something can't ever be obtained from nothing. The quantitative (more ambitious) purpose is to allow us to calculate the increase of the entropy in any system, and the qualitative (more philosophical) purpose is to show that the thermodynamic arrow of time (defined by the increasing entropy) is always aligned with the logical arrow of time.
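
For readers who want to see the qualitative statement in action, here is a minimal toy demonstration - again my own construction, not Boltzmann's actual calculation: a gas released from the left half of a box, whose coarse-grained entropy grows and then saturates along the logical arrow of time.

```python
# Free expansion of an ideal gas in a 1D box [0, 1] with reflecting walls.
# The coarse-grained (binned) Shannon entropy of the positions increases.
import numpy as np

rng = np.random.default_rng(2)
N, bins = 100_000, 20
x = rng.uniform(0.0, 0.5, size=N)      # low-entropy start: left half only
v = rng.normal(size=N)

def coarse_entropy(positions):
    counts, _ = np.histogram(positions, bins=bins, range=(0.0, 1.0))
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))      # in units of k (nats)

for t in np.linspace(0.0, 2.0, 5):
    xt = (x + v * t) % 2.0             # fold trajectories to implement reflections
    xt = np.where(xt > 1.0, 2.0 - xt, xt)
    print(f"t = {t:4.1f}   S/k = {coarse_entropy(xt):.3f}   (max = {np.log(bins):.3f})")
```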

The logical arrow of time allows us to assume that the future is evolving from the past, and not the other way around. It is the principle that implies that we may remember the past but not the future, and so on. It includes everything that these people find "controversial" about the second law. It is completely true, too. It can't be "removed" from the logical description of any world where objects evolve in time. Well, at least the authors admit that
But it is uncontroversial that one way or another the second law follows from microscopic physics and an appropriate probability distribution over initial microstates.
But they abruptly return to the myths:
On the other hand, the existence of a low entropy state to begin with is more puzzling.
Oh, really? For whom? The lower entropy of the state of a system in the past is a direct consequence of the second law, which they just admitted to "follow from something", so it can't possibly be "puzzling". Can it?
Low entropy initial conditions are unnatural in the sense that with any straightforward measure they occupy an exponentially small (and unstable) region of phase space.
Indeed, low-entropy states form macrostates that occupy small regions of the phase space (classically) or small subspaces of the Hilbert space (quantum mechanically). This equivalence is true by the very definition of the entropy, and both of these equivalent propositions about the initial state follow from the second law of thermodynamics.
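
The dictionary between the two formulations is nothing else than Boltzmann's formula; writing it down makes the equivalence one line long:

$$ S = k \ln W \quad\Longrightarrow\quad \frac{W_{\rm low}}{W_{\rm high}} = \exp\!\left(\frac{S_{\rm low}-S_{\rm high}}{k}\right) \ll 1, $$

so an "exponentially small region of phase space" is literally a restatement of "lower entropy".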

The "instability" of the region follows from the second law, too. The very statement that the "entropy increases" means that the "region is evolving into something bigger", so it is unstable. States describing any non-equilibrium situation in science (which covers almost all situations in the world we have cared about - the ultimate "thermal death" being the only conceivable counterexample, and it is not too scientifically interesting, anyway) will always be "unstable" in this sense because the second law holds and the entropy increases.

Attempts to show something else are doomed from the very beginning because what these attempts want to show (probably that the regions should be stable?) is just not true. We're just repeating the same second law dressed in dozens of slightly different formulations but it's still the same thing and it is unquestionably true. What can they possibly find so controversial about the second law?

Well, if a "straightforward measure" predicts that the physical system should have had a high entropy in the past, it is indeed a wrong - falsified - measure to describe the state in the past because we know that the entropy in the past was low. ;-) But this conclusion is not a problem of the laws of Nature. Instead, it is a consequence of the laws of Nature (namely the second law, once again). It is just a problem of the given person who tried to propose a "straightforward" description of the state in the past and failed. ;-)

If someone else proposes another description of a state in the past, one that actually follows from our knowledge of science and seems correct (e.g. the right initial conditions for the conventional Big Bang cosmology), he won't run into these problems. :-) It may be an open and somewhat terminological question whether these right initial conditions determined by the correct scientific method and the sensible scientist are "straightforward" but even if they are not "straightforward", it is simply no problem. The goal of science is to find the correct answers, not the "straightforward answers", and indeed, the answers are usually not "straightforward".

Science prefers "natural" answers but "natural" is something different than "straightforward". The word "straightforward" indicates that an answer can be found even by uneducated or misguided people. But the correct mechanisms in Nature, while "natural", usually cannot be found by the wrong people. In this sense, they are simply not "straightforward".

The legitimate condition of "naturalness" says that theories should depend on parameters that are less awkward, according to the probability distributions dictated by the rules of these theories, than the parameters of the less complete theories we previously used. But no principle can ever prevent people from proposing "straightforward" hypotheses about difficult questions that turn out to be wrong. Is that really so surprising?

The initial conditions are parts of our "hypotheses". In every scientific discipline, some of the statements about the initial conditions are correct and some of them are wrong - just like in the general case of hypotheses. It's just incredibly silly to criticize a theory or a hypothesis for your own choice of wrong initial conditions. You may call that choice "straightforward", but that's just self-brainwashing, because the key fact about these "straightforward" ideas is that they are wrong.

Let me explain the same point in one more way. I have argued that "straightforward" is not the same thing as "natural" because if you want to decide about the "naturalness" of something, you first need a theoretical framework to discuss such matters. You may also interpret the adjective "straightforward" as "easy to be logically deduced".

However, this interpretation of "straightforward" clearly cannot be applied to the initial conditions because they can't ever be "logically deduced". You always need an additional hypothesis/theory - or additional information or choices of the priors - to say something about the initial conditions. And this additional information simply can't ever be obtained in a "straightforward" way. Instead, you need to follow the scientific method: be creative, invent sensible hypotheses, and eliminate some of them by the scientific evidence.

What Greene et al. (and others) are failing to do is simply to use the scientific method when they discuss the issue of the initial conditions.

They seem to think that they have already been told about a "God-given" or "straightforward" choice of the initial conditions and they seem to be surprised that this "God-given" opinion about the initial conditions has nothing to do with reality. Well, many "God-given" opinions that the people adopted in "straightforward" ways in the past have nothing to do with reality and a lot of work and open-mindedness has usually been needed, and is still needed, to find the sensible ones.

In particular, the proposition that "the initial state of the Universe had a low entropy" is not "improbable" as these people like to claim. Quite on the contrary, it is certainly true. And any hypothesis or a way of thinking that says that the proposition is false or extremely unlikely may be immediately falsified.
If regarded as fluctuations from equilibrium, they are very unlikely.
The Boltzmann Brains have returned. There you go again. ;-)

Well, indeed, it's unlikely that the world was born as a Boltzmann Brain, a fluctuation from equilibrium that instantly created a large macroscopic system with many degrees of freedom in the right state. But this observation doesn't mean that something is puzzling about the known laws of Nature: it means that the particular Boltzmann Brain hypothesis where the world around us is described as a fluctuation from equilibrium is ruled out. It is dead. It is ludicrous, too. Period.

Nevertheless, they're still convinced that there's a problem:
So the central problem of the arrow of time consists in finding a justification for the so-called past hypothesis - the assumption that the universe had low entropy at early times.
If you feel that you have never heard about the "past hypothesis" in physics, it's probably because you have not.

The authors refer to a weird popular book by D.Z. Albert (which unsurprisingly repeats the same myths about the second law that are also discussed in this paper). The proposition that the Universe had a lower entropy in the past than it has today is not an independent "assumption" deserving a new name. Once again, it is a direct consequence of the second law. We may discuss whether the entropy was really zero at the beginning and what that would mean - but that discussion is not really necessary for any questions involving our Universe at a positive "t". It is not an independent assumption. It is a well-known fact, a simply derivable consequence of a basic law of thermodynamics.

Attacking the cosmic inflation

After the basic results of statistical physics, inflationary cosmology becomes the next target of our friends:
At the start of the hot big bang era the universe was very smooth, with density perturbations "delta rho / rho = 10^{-5}". This corresponds to a state of very low entropy, because the generic fate of classical matter in a decelerating universe is to form an inhomogeneous configuration of black holes, i.e. "delta rho / rho = 1". It is occasionally suggested that inflation adequately explains this fact, since accelerated expansion smoothes out inhomogeneities that would otherwise be susceptible to gravitational collapse [Paul Davies].
Well, so far it may sound good - except that you may be slightly surprised that Paul Davies is credited with the solution to the smoothness problem. I actually think that among the authors of popular books, Paul Davies may possess one of the highest signal-to-noise ratios because he is able to avoid crackpottery even when he discusses very advanced and possibly speculative questions.

However, when we return to the preprint, it's predictable how the sentences above continue:
The reason this argument fails [6, 7, 8, 9] is that inflation itself requires extremely special initial conditions to get going. In the simple case of a single scalar field "phi", [we need a simple ratio "w" of functions of spatial and temporal derivatives of "phi" and of the potential energy to be smaller than "-1/3"].
Wow. What's so "extremely special" about a configuration whose "w" ratio is smaller than "-1/3"? The ratio is a priori a number of order one and the a priori probability that it is smaller than "-1/3" is of order 50%. It's certainly not "extremely special". I wouldn't even say it's "special". It just picks one of the two possible inequalities with respect to "-1/3". In fact, later in the text I will mention a "non-tachyon" argument for why the temporal derivatives should a priori be expected to exceed the spatial derivatives.

Inflation surely does solve the smoothness problem.

The only reason why someone could think that "w < -1/3" is "extremely unlikely" is that he would use a completely wrong - and highly non-relativistic - probability distribution to dictate what is likely and what is not. Any sensible, relativistic rule or intuition about the initial conditions will respect that the spatial and temporal derivatives of "phi" are comparable; the virial theorem implies that the kinetic and potential energy densities are going to be comparable (in a field theory) as well; and the "w" ratio computed from these quantities is inevitably of order one and pretty likely to be smaller than "-1/3".
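
If you want to quantify the "order one" claim, a crude Monte Carlo is enough. The sketch below is mine, not the paper's: it assumes the standard scalar-field expressions "rho = K + G + V" and "p = K - G/3 - V" (kinetic, isotropically averaged gradient, and potential energy densities) and draws all three densities independently and uniformly from the unit interval as a stand-in for "order one":

```python
# Monte Carlo estimate of how "special" the inflation-friendly condition w < -1/3 is,
# under a crude order-one measure for the scalar field's energy densities (assumption).
import numpy as np

rng = np.random.default_rng(1)
K, G, V = rng.uniform(0.0, 1.0, size=(3, 1_000_000))   # kinetic, gradient, potential
w = (K - G / 3.0 - V) / (K + G + V)                    # equation-of-state ratio p/rho
print(f"fraction with w < -1/3: {np.mean(w < -1.0/3.0):.3f}")   # an O(1) fraction (~0.25 here)
```

With this crude measure the fraction comes out around a quarter; the exact number obviously depends on the chosen distribution, but it is an order-one probability in any case, not anything "extremely special".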

The authors clearly offer no argument that would justify their extremely bizarre statement that the states satisfying the inequality are "extremely special", but that can't prevent them from slinging more mud at inflation - claims that it makes the "problem even worse", and so forth.

Their "replacement" for inflation

So after having told us all their irrational reasons to think that inflationary cosmology doesn't solve the smoothness problem, they present their own "solution".

A new scalar field effectively changes the value of Newton's constant, and they think that its initial value is low, which means that space wants to be pretty flat (like in gravitational fields sourced by small masses). This is supposed to give them the "right" initial conditions that imply the "past hypothesis" from the crazy popular book.

Well, it's just too much of a good thing. ;-) But let us look a little more closely at what they actually tried to solve.

First, it is ambiguous whether they wanted the initial state to be just geometrically smooth, or whether they wanted it to be "generic", i.e. a high-entropy state. Either way, it is extremely clear that they couldn't have solved either of these "problems".

Let me start with the second possible goal. Here, the verdict is immediate: a theory that would say that the initial state is "generic" in the sense of having a high entropy is simply wrong because it would contradict the second law of thermodynamics - the increasing entropy - which is a fact that can be demonstrated in any physical system, regardless of its degrees of freedom (and regardless of the number and couplings of its scalar fields).

While it's manifest that the authors of similar papers are bothered by the very validity of the second law and they would indeed like the entropy to be higher in the past, let me stop further comments about this downright stupidity because too much of my time - and yours - has already been wasted on this trivial point.

The other possible purpose of their "solution" was to find a "natural" initial state that is spatially smooth. As explained above - and in the basic literature about inflation - inflation clearly does solve this problem. What we need at the beginning of the ordinary Big-Bang cosmology is an unusually smooth space whose spatial curvature radius is much larger than all the microscopic scales and also exceeds the time or spacetime curvature radius.

Inflation achieves this goal beautifully because it is able to produce such an "unnaturally" smooth Universe from an initial state that is natural - one whose spatial and temporal derivatives are comparable and whose local quantities take no values vastly different from one ("unnatural" values).

Greene et al. have completely different criteria for what is "natural" and what is not. In their approach, it is "natural" for the spatial derivatives to be much (infinitely) greater than the temporal derivatives, and the goal is to find mechanisms in which they're comparable.

All of these assumptions of theirs are, of course, incorrect. To emphasize how natural it is for the temporal derivatives to be at least comparable to the spatial derivatives, just consider a de Broglie wave. The spatial derivatives are linked to the inverse wavelength, which is associated with the momentum, while the temporal derivatives are associated with the frequency and the energy. It is natural - and, in the case of the de Broglie wave, inevitable - that the temporal derivatives are at least as large as the spatial derivatives, because the energy of a particle is never smaller than its momentum; otherwise it would be a tachyon.
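
A one-line numerical check of this "non-tachyon" statement (in natural units, "hbar = c = 1"): for a plane wave "exp(i(px - Et))" with "E = sqrt(p^2 + m^2)", the ratio of the temporal to the spatial derivative is "E/p", which never drops below one.

```python
# Ratio of temporal to spatial derivatives, E/p, for a de Broglie plane wave
# with the relativistic dispersion E = sqrt(p^2 + m^2) (natural units, hbar = c = 1).
import numpy as np

m = 1.0                              # illustrative mass
for p in np.logspace(-2, 2, 5):      # a range of momenta
    E = np.sqrt(p**2 + m**2)
    print(f"p = {p:8.3f}   E/p = {E/p:10.3f}")   # always >= 1; -> 1 only as m/p -> infinity^-1
```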

But even if you switch your brain into a wrong operational mode where it's natural for the spatial derivatives to be almost infinitely greater than the temporal derivatives (because you build on some strange non-relativistic psychological feelings about the right "measure"), it is very clear that their picture can't possibly solve what they consider to be the "problem". As long as the metric tensor is a dynamical variable, its spatial derivatives can be large even in the absence of matter sources (or with a small Newton's constant).

It's simply because there can be gravitational waves in the initial state - and their geometry doesn't care about the value of Newton's constant at all. As long as you allow "any" initial states, it is clear that the high-spatial-curvature initial states will be among them and the "generic" initial state simply cannot be "unnaturally smooth". After all, Newton's constant is a dimensionful parameter, so saying that it was small at the beginning has no physical, unit-independent meaning.

Inflation solves as big a part of the smoothness problem as can be solved because it explains unnatural numbers - very large or very small numbers that are not of order one - in terms of natural initial numbers of order one, combined with the inflationary dynamics. If inflation is right, and it almost certainly is, it simply means that there is a whole new era of the evolution of the Universe that is understood - and where mysterious dragons are no longer allowed.
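
A sketch of the arithmetic behind that claim, using the standard textbook relation (assumed here) that during near-exponential expansion the curvature term behaves as "|Omega - 1| ~ 1/a^2", so that "N" e-folds suppress an order-one initial value by "exp(-2N)":

```python
# How inflation turns O(1) initial numbers into the tiny numbers the hot big bang
# needs: N e-folds of near-exponential expansion suppress |Omega - 1| by exp(-2N).
import math

for N in (30, 60, 100):
    log10_suppression = -2.0 * N / math.log(10)
    print(f"N = {N:3d} e-folds:  |Omega - 1| suppressed by ~ 10^({log10_suppression:.0f})")
```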

One can still ask what the state of the Universe was before the inflationary era. Inflation is very powerful because the details of the pre-inflationary initial state are pretty much inconsequential for the post-inflationary properties of the Universe. This feature is extremely good for inflation because it makes inflation robust and independent of the unknown things. On the other hand, this feature is also bad for theories that would like to go beyond inflation - for the same reason. ;-)

If inflation itself can already predict the observable features of the post-inflationary Universe in such a way that the pre-inflationary state becomes irrelevant and undetectable, it means that it is hard to measure or otherwise determine the properties of the pre-inflationary state that could be the holy grail of a hypothetical beyond-the-inflationary cosmology. :-)

Inflation is the king almost up to the Planck scale. The same conclusion makes the available candidate beyond-the-inflationary theories "less royal". Whether this is good news or not depends on whether you are Alan Guth (or his friend, or a fan of his and Linde's theory) or not. ;-) For your humble correspondent, it is good news, and it will remain good news until inflation becomes "boringly" obvious. It is not yet "boringly" obvious, I think.

At any rate, I remain baffled by the psychological problems that some people encounter when trying to understand and accept the second law of thermodynamics. Yes, I think that the second law is more obvious and more universal than e.g. the evolution of species, so these psychological problems with a low initial entropy seem to be more surprising to me than other people's problems with Darwin's evolution.
