Off-topic, CERN: the former ATLAS spokeswoman Fabiola Gianotti was chosen to be the next CERN Director General. Congratulations, but her term only starts in 2016 and she may very well miss the first big post-Higgs discoveries.

Interstellar: the reviewers seem totally captivated by the black hole blockbuster.

Yesterday, Natalie Wolchover and Peter Byrne wrote an article for the Quanta Magazine:

In a Multiverse, What Are the Odds?

It's the first part of a series. The article has received no comments so far and no other media have reacted. My reaction to this silence is mixed:

- On one hand, this article refers to many actual big shots in physics research (and yes, I know all the physicists mentioned there in person), unlike most of the "news about science" that report on the work of assorted cranks and third-class physicists
- Despite these physicists' being very good and at the top, this whole "measure problem" research program is fundamentally flawed, so it's good news that at least one set of flawed ideas is being ignored by most of the media

The article mentions various people who defend the anthropic reasoning, including Weinberg, Guth, Vilenkin, Bousso, Kleban, and others. Nathan Seiberg is quoted as a neutral person who would like to be anti-anthropic but he won't deny the evidence. Paul Steinhardt plays the role of the guy who denies the cosmic inflation itself.

Alan Guth is a key co-father of cosmic inflation and he pays lip service to the anthropic reasoning, too. In a video attached to the article, Guth says that a big problem of cosmology is how to calculate the fraction \(\infty/\infty\) of the cows across the Universe that are violet (the Milka cow could be just one example). Steven Weinberg gave us the argument estimating the cosmological constant in the late 1980s and this argument is being promoted as a success of the anthropic reasoning, which is misleading, as I will point out.

Various random methods to calculate \(\infty/\infty\) due to Vilenkin and others are said to lead to highly variable and unsatisfactory results. Raphael Bousso is promoted as the author of a better approach to the "measure problem" based on the causal diamonds.

However, the whole paradigm that one should look for a solution of the "measure problem" is fundamentally flawed. It's based on the assumption that the probabilities of various observations done "here and now" are affected by statistical properties of objects in the multiverse – either in the spacetime or in a spatial slice – because "we are typical" in some sense. But this very general assumption is wrong not just for one reason but for many reasons:

- Calculating properties of the Universe "now and here" from the distribution of observers including the future ones is acausal, and therefore logically inconsistent, so the "spacetime counting" is just plain wrong (how many observers of some type will exist also depends on our current and future decisions, so these numbers can't influence these decisions in the backward direction)
- The "spatial slice counting" is also wrong because there are no preferred "slices" of the Universe, due to relativity
- All this "anthropic" counting requires one to be able to define what is an observer and what is not. However, science obviously can't give us an answer to this "bureaucratic" question of who gets the "paperwork" – the difference between "non-observers" and "observers" is always gradual and quantitative
- There is absolutely no justification for the assumption that "every observer whose immediate vicinity is looking the same is equally likely". It's just one possible probability distribution among infinitely many – and distributions that actually try to "discourage" highly populous representatives are at least equally justified (some of this discouragement has to exist to make the distributions normalizable, see the next item)
- The uniform distribution doesn't even exist because the sets of observers in the multiverse are infinite and the normalized uniform distribution would have to have \(p(X) = (1/\infty)\cdot 1 = 0\)
- Any attempt to regulate this problem with infinities leads to a strong dependence on the cutoffs
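The cutoff dependence in the last item is easy to demonstrate with a toy computation. Here is a minimal sketch (the labels and schedules are my own invented example, not anything from the article): two countably infinite sets of "observers", A and B, are enumerated under two different cutoff orderings, and the regulated \(\infty/\infty\) fraction of type A converges to completely different values:

```python
from itertools import islice

def interleave(pattern):
    # Repeat a finite label pattern forever: 'AB' -> A, B, A, B, ...
    while True:
        yield from pattern

def fraction_of_A(schedule, cutoff):
    # "Regularize" the infinite count by truncating at `cutoff` observers.
    labels = list(islice(schedule, cutoff))
    return labels.count('A') / cutoff

# The same two (countably infinite) sets of A- and B-observers,
# enumerated in two different orders:
f1 = fraction_of_A(interleave('AB'), 10**6)   # one A per one B
f2 = fraction_of_A(interleave('ABB'), 10**6)  # one A per two Bs
# f1 is ~1/2 while f2 is ~1/3: the regulated infinity/infinity ratio
# depends entirely on the chosen cutoff ordering.
```

Both schedules enumerate the very same pair of infinite sets; only the order of the truncation differs, which is exactly why no "canonical" answer to the measure problem can come out of such counting.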

The basic issue that they still seem to misunderstand, after all these decades (!), is that every probability that is uniquely and unambiguously calculable by the laws of physics has the form\[

P(\text{property at a later moment} \mid \text{property at an earlier moment}).

\] In other words, the laws of Nature allow you to calculate the probability that a particular initial state – known either "maximally" (as a pure state) or "probabilistically" (as a density matrix) – will evolve into a final state with certain properties. That's how the dynamical laws of physics always work. In particular, I want to emphasize that the opposite probabilities – probabilities of something at an earlier moment given some assumptions about the later moment – are never uniquely calculable because they may only result from Bayesian inference, and this inference is an "inverse problem" that inevitably depends on some arbitrary, subjective priors! So the results of the "opposite" (past assuming future) probabilities can never be fully dictated by the laws of physics!
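The asymmetry between forward evolution and retrodiction can be made explicit in a few lines. A minimal sketch with made-up numbers (the "hot"/"cold" initial states and "ash"/"ice" final outcomes are hypothetical): the forward probabilities are fixed once and for all, but the Bayesian retrodiction of the initial state shifts with the chosen prior:

```python
# Forward dynamics P(final | initial): fixed by the "laws", no ambiguity.
P_final_given_initial = {
    ('hot', 'ash'): 0.9, ('hot', 'ice'): 0.1,
    ('cold', 'ash'): 0.2, ('cold', 'ice'): 0.8,
}

def retrodict(prior, observed_final):
    # Bayes: P(initial | final) is proportional to P(final | initial) * P(initial),
    # so the answer inevitably inherits the subjective prior.
    unnorm = {i: P_final_given_initial[(i, observed_final)] * p
              for i, p in prior.items()}
    z = sum(unnorm.values())
    return {i: v / z for i, v in unnorm.items()}

# Two equally "legal" priors yield different retrodictions of the past:
uniform = retrodict({'hot': 0.5, 'cold': 0.5}, 'ash')  # P(hot | ash) ~ 0.82
skewed  = retrodict({'hot': 0.1, 'cold': 0.9}, 'ash')  # P(hot | ash) ~ 0.33
```

No extra physics can remove the prior dependence here; only a law dictating the initial conditions themselves would.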

The only "new twist" that cosmology may add into this template is that it may provide us with a definition of the "ultimate initial conditions of the Universe", perhaps by a formula similar to the Hartle-Hawking wave function. With this addition, the probabilities\[

P(\text{property at any moment}|\text{right initial cond.})

\] may become meaningful, too.

In the future, it may be possible to calculate the probability that a particular compactification of string theory with certain extra properties has evolved from the right initial conditions. Other things could have evolved, too. They may even be "realized" in very distant parts of the Universe or the multiverse. But they don't *affect* what we observe here and now. Instead, they are *consequences* of the initial conditions, much like our observations here and now are consequences.

It's also misleading to say that Weinberg's estimate of the cosmological constant is a success of the anthropic reasoning. In reality, Weinberg simply observed the existence of stars and proved that the cosmological constant can't be too large and positive, and can't be too large and negative, because otherwise the theory would imply that no stars exist. This is nothing else than a *particular procedure to falsify a class of hypotheses by the empirical data*. It's the most ordinary scientific method – and Weinberg played the role of an experimenter.

However, there are usually many very different ways to falsify a wrong theory by finding predictions that contradict certain empirical observations. And it is *never* true (at least not demonstrably true) that a particular method to falsify the theories is the *unique* one. The anthropic approach (the anthropic lack of principles) is the belief that a procedure similar to Weinberg's should be the *only* relevant argument or observation that constrains or determines some property of Nature. But this belief has clearly been wrong almost everywhere in the history of science and it is very likely that it will be wrong in most scientific advances of the future, too.

There simply exist – and will exist – much sharper, more concrete and well-defined ways to eliminate or strengthen hypotheses than just the observation that "some intelligent life exists"! Nature gives us many hints (and clear evidence); we see some of them and fail to see or overlook the others, but the task for science is to find as many useful hints as possible. And the anthropic lack of principles is telling us nothing whatsoever about where we should look. On the contrary, it is telling us that we shouldn't look anywhere, perhaps with the single exception of a stupid process of counting the observers.

Another part of Weinberg's success is that the inequality for the cosmological constant that he derived from the existence of stars is obeyed by the Universe around us – it has to be – but "barely so". The observed cosmological constant is "just" 100 times smaller than the maximal one allowed by Weinberg's argument. The number 100 is much smaller than \(10^{123}\) but it is still pretty large. The opinion that we have to eliminate the values that contradict the existence of stars is a *fact*. But there are also other things that affect the justifiable opinions about the value of the cosmological constant, and it will never be possible to show that these other factors – e.g. conventional physics as we have known it for centuries – play no role. Whether the number 100 above is large or small is a matter of subjective taste, not science. This taste depends on some probability distribution for the surviving viable values of the cosmological constant, but without a law dictating the initial conditions, every such distribution is just a premature, subjective, idiosyncratic guess that simply cannot be "qualitatively more true" than the others.

Now, the causal diamonds and light-like sheets and slices are very natural in physics. This importance is also linked to the fact that the light-cone gauge is such a natural gauge-fixed formulation of string/M-theory (Matrix theory naturally arises in the light-cone gauge, too); the light sheets may always be viewed as generalized black hole horizons, so they're relevant for the generalizations of the black hole entropy to general spacetimes; and they play many other roles. But I wouldn't say that the discussions about the initial conditions for the Universe are among them. At most, Bousso's focus on the light sheets is yet another observation exposing a fundamental defect of the "measure problem" research as we have known it for many years. According to the horizon complementarity, the degrees of freedom behind the cosmic horizon shouldn't even be viewed as independent ones, so they can't simultaneously have well-defined values along with the actual observables within our causal patch! That's why we shouldn't be talking about the number of observers behind the horizon if we also want to talk about more tangible questions that are observable within our patch: speaking about both is just like assuming that both \(x\) and \(p\) are well-defined in quantum mechanics.

But this observation *weakens* the efforts to calculate something about "now and here" from counting observers; it doesn't strengthen them in any way.
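The \(x\) vs \(p\) analogy above can be checked numerically in a truncated harmonic-oscillator (Fock) basis – a standard textbook construction, used here only to recall why two non-commuting quantities can never be simultaneously well-defined:

```python
import numpy as np

N = 50                                    # truncated Fock-space dimension
a = np.diag(np.sqrt(np.arange(1, N)), 1)  # annihilation operator: a|n> = sqrt(n)|n-1>
x = (a + a.T) / np.sqrt(2)                # position operator (hbar = m = omega = 1)
p = 1j * (a.T - a) / np.sqrt(2)           # momentum operator

comm = x @ p - p @ x                      # the commutator [x, p]
# Away from the truncation edge, [x, p] = i times the identity, so x and p
# cannot be simultaneously diagonalized, i.e. simultaneously "well-defined":
assert np.allclose(np.diag(comm)[:-1], 1j)
```

The only deviation from \([x,p]=i\) sits in the last diagonal entry, an artifact of truncating the infinite-dimensional Hilbert space; on every state below the edge, assigning sharp values to both operators at once is impossible.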

Incidentally, a recent paper by Bena, Graña, Kuperstein, and Massai argues that in the KKLT construction of the "many stabilized flux vacua", the added anti-D3-brane actually makes the whole vacuum unstable, in the tachyonic sense, so the "uplifted vacua" are no good. Once supersymmetry is broken, calculations become significantly less controllable, so whether this criticism of KKLT is valid will remain an open question for some time, I guess.

Let me remind you that KKLT is a semi-specific construction of large classes of type IIB string theory compactifications. It starts by showing that many supersymmetric AdS vacua are possible solutions because of the many possible values of the fluxes over cycles; in the second part, KKLT argue that one may also "uplift" these SUSY AdS vacua to non-SUSY dS vacua by adding an ingredient that breaks supersymmetry, namely some anti-D3-branes (branes with the opposite charge than the D3-branes that were already there before that final step).

KKLT are generally more accomplished physicists than the critics and I think that they're generally slightly better "string mathematicians", able to calculate these matters, too. But otherwise my expectation is that the critics – these critics or similar ones – are likely to be correct. Why do I think so?

Because unbroken supersymmetry is the key feature that tends to produce moduli spaces, stable vacua, positivity of the energy (which is also linked to the stability of the stationary points), and large configuration spaces. When SUSY is broken, it just seems reasonable to expect that many "steep slopes" in the configuration spaces are created, ruining the stability of almost all conceivable stationary points, so the stationary points are likely to have some unstable directions. Quite generally, then, I do tend to think that the number of stable non-SUSY vacua is smaller than the number of stable SUSY vacua, in some sensible way of counting them. I know that some people have the opposite expectation, but it even seems plausible to me – just as we would sometimes like to believe – that if the analysis is done really perfectly, SUSY breaking leads to a single (!) realistic non-SUSY vacuum in four dimensions, and this vacuum has to be ours. Of course, this possibility is wishful thinking to a large extent and the uniqueness isn't implied by any established principle. But it may be a possibility, anyway.

## snail feedback (23) :

Hello Lubos,

but in another part of string theory – string field theory – major progress was made, I think: http://phys.org/news/2014-11-field-theory-foundation-quantum-mechanics.html

Best regards and thank you for your very interesting postings,

Klaus Lange

Dear Klaus, first, this article (in Quanta Magazine, or this blog post) has nothing to do with string *field* theory. String field theory is just a tiny fraction of the research in string theory - a particular way to write down the formulae for perturbative amplitudes.

Second, the progress you reported is not new. The preprint was posted in July

http://arxiv.org/abs/1407.6833

which is when I first responded to it in a comment. So far, the paper has 0 citations.

Third, the claim is really a fundamental misunderstanding of the interpretation of the string fields. The mathematical analogy between the operators in quantum mechanics with their non-commutative product on one side – and the star-product merging the strings on the other side – has been known since Witten's first paper about the cubic string field theory. So that's nothing new.

But this mathematical similarity doesn't mean that they're the same thing. A string field is a "matrix" on the space of string configurations, but it's also an operator on the Hilbert space. These are two completely different levels of the fields' being "operator-like". To some extent, the claim that the foundations of QM are "explained" by the string fields is as wrong as the claim that the foundations of QM are described by the droplets.

http://motls.blogspot.com/2014/07/droplets-and-pilot-waves-vs-quantum.html?m=1

And the latter is seriously wrong.

Hi Lubos,

Please tell me your views (if any) on this paper. I recently came across it and it looked very curious. http://arxiv.org/abs/1312.6523

Hi! I think it's a nice paper; some of the mathematical observations may be new but some of them are older. The relationship to the division algebras is heuristically great but much of the structure evaporates for the larger algebras, especially the octonions.

They say:

"If the properties of this universe still seem atypical even in the habitable subset, then the multiverse explanation fails."

Why is that? I think it should say that the anthropic explanation fails.

Agreed. They simply adopt the dogma that the anthropic principle ("typicality") must be *automatically* associated with the multiverse. Of course that I disagree with that entirely.

But do we even need to explain QM? QM is fundamental; it can't be explained by a deeper level. Not to mention that if this deeper level is not quantum mechanical, it has to be classical, which is just stupid.

(This comment is to be taken as one coming from a mathematical mosquito with its neck sticking out!)

Dear Lumo,

Soon enough into reading this to me beautifully efficient and swiftly encompassing overview of and revisit to this topic, I was somehow deriving a slight satisfaction from sensing that your raging against, and your own sense of room to move away from, The AP seems to have slightly shriveled and shrunk, respectively. :->

Thanks for a nice discussion. I will have to study the earlier points more carefully in order to try to understand them better, but the second half of the essay makes points that are much clearer to a layman like me, and which help to clarify the issues in this very important question – important because it is likely to color humanity's "religious" views of the nature of the universe for eons to come. I must say I share your wishful thinking, but I am ready for anything.

"They simply adopt the dogma that the anthropic principle ("typicality") must be *automatically* associated with the multiverse..."

Theoretical cosmologists assume typicality but they are themselves highly atypical human beings. I find a certain irony in this.

I agree with you, Giotis. SFT is just an application (and much more specific type) of quantum mechanics, and like any field theory, it's "rather interesting" already before quantization, as a classical theory, too.

What an irony! And it's not just some similarity between the two problems. The relevant definition of "intelligent life" would almost certainly assume some "on Earth atypical" elevated IQ, because those who can't really get these cosmological theories may very well be clumped with the apes and maybe other organisms etc.

Even on Earth, one may get estimates of relative probabilities obtained from "typicality" that differ by many orders of magnitude depending on "who is allowed to be counted". And that's just the Earth where all "good enough" candidates are rather similar. In the multiverse, the "candidates to be called intelligent observers" are almost certainly much more diverse in number and in intensive quantities, so a more demanding version of "intelligent life" may very well shrink the pool not by the factor of 1 million like on Earth but by the factor of 10^{123} or higher.

Dear Lubos,

Just a comment on the following statement: "KKLT are generally more achieved physicists than the critics and I think that they're generally slightly better "string mathematicians" able to calculate these matters, too."

I would like to inform you about the recent history of this issue. The first real criticism of anti-brane uplifting started in 2009, 6 years after KKLT. It took that long since computing the details of string theory vacua with broken SUSY is very dirty and the techniques were not available. At least 10 highly technical papers have been written since then that demonstrate that anti-brane uplifting is likely to be problematic. The "pro-KKLT" camp has not written any paper in response, apart from a paper by Dymarsky (http://arxiv.org/abs/1102.1734) that contained many claims that have been shown to be wrong. The people who are working on this should not be considered an "anti-KKLT camp". Instead, they are working out the details with an open mind, and the details seem to contain problems.

I am disappointed in the lack of scientific attitude of some members of the string pheno community. Instead of an honest scientific debate, the KKLT scrutiny is dismissed as politically motivated. A good scientist can defend his/her model by solid mathematics and not by authority arguments. Not a single computation has been done by the KKLT camp to disprove the critics. If they are the better "string mathematicians" then they should be able to show it.

In my opinion (for what it is worth) I would not bet my money on KKLT, but I would not say either it is shown to be inconsistent for sure (I would give it a 95% failure rate). It seems that the only way to tell for sure would be to do a computation at small length scales/or strong coupling.

Lightspeed is a robust observable (including exponents larger than 1) throughout the visible universe. Lightspeed survives both relativity and quantum mechanics. Lightspeed is impossibly small given the scale of the universe, across the solar system, even at circuit board dimensions.

One suspects there is a universal very small scale determinant rather than a large scale effect. The multiverse is then inversely wrong. Entanglement began intensely locally and resisted dissipative decay - a pervasive soliton background?

The paper on KKLT pulls my layman triggers for "this could be groundbreaking". A way to show that almost all of the landscape is unstable?

"Another feature that is important to understand is what is the endpoint of the tachyonic instability."

"Hence, such a construction will not give a long-lived de Sitter vacuum, but will either give an unstable one or one whose cosmological constant will jump down whenever the anti-D3 branes are shot out."

Tied to the cosmological constant? Could this force a de Sitter vacuum with a CC below some very low threshold? Absolutely amazing if so!

Are there any prominent researchers in the "pro-KKLT" camp? Specifically, KKLT themselves could reasonably think the papers pointing out problems are simply good work and have no interest in, or way to, find fault with them.

Catfighting aside, congrats on the "featured comment"; I had no idea such a feature existed.


The equivalence principle is more apropos. Unless there are a lot of small people running around that I cannot see the anthropic principle leaves much to be desired. While I am more than happy to entertain the thought that physics breaks down at small scales, where are all the little tiny people at? Isn't that more likely? Multiverse? Riiiiiiight..... http://youtu.be/UUARU7iawSA

Since the vacuum energy is close to the matter energy – or if we assume that Einstein did not make a blunder – has there been a conjecture tying both energies together directly by some process? Maybe one is responsible for the other.

A bit of humor: the extreme badassism of top-level physicists. I saw it in the presentation "Eternal Inflation and the Measure Problem" by Alan Guth in Madrid last month.

http://33.media.tumblr.com/a9e9e85c783a1ca0a5e50b658b6967a4/tumblr_nd67ljG81m1qaityko1_1280.png

Is it possible that there is a repulsive quantum field which gave rise to inflation and which decreased to a small value when inflation stopped, so that we interpret it as a small cosmological constant today? I know, this is a completely naive speculation on my part!! I suspect it is probably not possible to write a mathematical model for such a field, otherwise someone would have done it. But a discussion would be interesting.

Lubos, you might enjoy http://arxiv.org/abs/1401.2938

I think this paper comes close to answering my question

http://philsci-archive.pitt.edu/398/1/cosconstant.pdf

page 20

The idea that the field concept has to be used with great care lies also close at hand when we remember that all field effects in the last resort can only be observed through their effects on matter. Thus, as Rosenfeld and I showed, it is quite impossible to decide whether the field fluctuations are already present in empty space or only created by the test bodies.
