## Wednesday, November 13, 2013 ... /////

### Flux repulsion may make a tiny C.C. natural

A "Darwinian" proposal to solve the cosmological constant problem

Since the cosmological observations of the late 1990s, most of us have taken for granted that the Universe is filled with dark energy (currently believed to represent 68% of the energy density $\rho=T_{00}$) whose character may be refined as the ordinary positive cosmological constant (the "C.C."), i.e. $p=-\rho, \quad T_{\mu\nu}=\rho g_{\mu\nu}.$ However, the energy density $\rho$ seems to be extremely tiny in the apparently natural units of quantum gravity, $\rho\approx 10^{-123}\,m_{\rm Planck}^4,$ which is the worst known prediction of (dimensional analysis in) physics.
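The famous 123 orders of magnitude can be verified with back-of-the-envelope arithmetic. The sketch below compares the observed dark-energy density (68% of the critical density for a Hubble constant near 68 km/s/Mpc) with the Planck energy density; the constants are standard rounded SI values, and this is just a sanity check of the dimensional-analysis mismatch, not anything from the paper.

```python
import math

# Back-of-the-envelope check of the ~10^-123 mismatch (standard SI values;
# H0 and the 68% dark-energy fraction are rounded observational inputs).
G = 6.674e-11       # Newton's constant, m^3 kg^-1 s^-2
hbar = 1.055e-34    # reduced Planck constant, J s
c = 2.998e8         # speed of light, m/s
H0 = 2.2e-18        # Hubble constant in s^-1 (~68 km/s/Mpc)

rho_crit = 3 * H0**2 / (8 * math.pi * G) * c**2   # critical density, J/m^3
rho_lambda = 0.68 * rho_crit                      # observed dark-energy density

l_planck = math.sqrt(hbar * G / c**3)             # Planck length, m
E_planck = math.sqrt(hbar * c**5 / G)             # Planck energy, J
rho_planck = E_planck / l_planck**3               # Planck energy density, J/m^3

ratio = rho_lambda / rho_planck
print(f"rho_lambda / rho_planck ~ {ratio:.1e}")   # roughly 10^-123
```

The ratio lands within an order of magnitude of $10^{-123}$, which is all that dimensional analysis can be expected to say.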

This problem, the cosmological constant problem, doesn't have any convincing explanation except for the "possible" explanation involving the multiverse. There are many possible values of $\rho$ in different vacua of the theory of everything (we mean string/M-theory). Most of them don't admit any life but due to the large number, there are some vacua for which $\rho$ is tiny and those are "more important" because intelligent life may emerge in them. We are supposed to live in one of them.

This multiverse scenario avoids the "straight contradiction" but one could argue that the right way to estimate the probability that the C.C. is tiny still leads to an intolerably small value so we haven't explained anything. Isn't there a better way to argue that the C.C. has to be tiny?

Adam R. Brown, Alex Dahlen, and Ali Masoumi propose a new "mixed" approach to the question in the new preprint

Compactifying de Sitter Naturally Selects a Small Cosmological Constant

They remind the readers of the Bousso-Polchinski-like constructions (including KKLT) in which the vacuum energy density is distributed "uniformly", so very tiny values of the C.C. are as unlikely as any other fine-tuned values.

However, they argue that some flux repulsion terms may heavily distort the distribution so that the vacua with $|\rho|\ll 1$ in the Planck units are far more frequent.

They claim to have an example of anti de Sitter spaces with a tiny negative cosmological constant that accumulate near $\rho=0-\epsilon$ as well as (the phenomenologically relevant) numerous de Sitter vacua with $\rho=0+\epsilon$. In both cases, the distribution is "expo-exponential" (my preferred word for any function similar to $\exp[\exp (x)]$: they and others call it "doubly exponential").

These lower-dimensional vacua are said to have a rather naive geometry, $dS_{D-Nq} \times (S^q)^N.$ Here, $D$ is the total spacetime dimension and the overall spacetime is compactified $N$ times on $q$-dimensional spheres, exploiting a $q$-form field strength. The compactification on a sphere is metaphorically identified with "having offspring" and the language of natural selection is applied. The reason is that the "parent" vacua with a small C.C. produce many more daughters and sons – the number of offspring scales like a negative power of the parent's C.C. So the "small C.C. vacua" are more viable in the Darwinian sense.

For the expo-exponential distribution to appear, we need $N\geq 2$, i.e. at least two $q$-spheres, and the rank of the differential form, i.e. the dimension of each sphere, has to obey $q\geq 2$ for $N\geq 3$ and $q\geq 3$ for $N=2$. If $N,q$ are smaller than that, the divergence in the distribution isn't fast enough.
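A toy numerical model (my own illustration, not the authors' actual flux construction) shows why "offspring count scaling like a negative power of the parent's C.C." piles vacua up near $\rho=0$ after just a few generations of compactification. The $1/\lambda$ offspring law and the uniform daughter distribution are assumptions of this sketch only:

```python
import random

random.seed(0)

# Toy "Darwinian" branching, purely illustrative: each vacuum with
# cosmological constant `lam` spawns roughly 1/lam daughter vacua,
# each with a C.C. drawn uniformly below the parent's.
pop = [random.uniform(0.1, 1.0) for _ in range(100)]  # initial "parent" vacua

for generation in range(3):
    children = []
    for lam in pop:
        n_kids = min(int(1.0 / max(lam, 1e-9)), 50)   # offspring ~ lam^(-1), capped
        children += [random.uniform(0.0, lam) for _ in range(n_kids)]
    if len(children) > 20000:                         # keep the toy population finite
        children = random.sample(children, 20000)
    pop = children

frac = sum(1 for lam in pop if lam < 0.01) / len(pop)
print(f"fraction of vacua with C.C. < 0.01 after 3 generations: {frac:.2f}")
```

Even with the offspring count capped, the small-C.C. parents quickly dominate the population, which is the qualitative mechanism behind the doubly-exponential accumulation (the real construction iterates this over the $N$ sphere factors).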

It means that at least six extra dimensions (the same number as we envision in compactified $D=10$ superstring theory) are needed, but they argue that the ordinary superstring vacua are probably not ready for their construction (three two-spheres or two three-spheres in the role of the six extra dimensions sound too revolutionary in their simplicity, and it would be shocking if such vacua had been overlooked – but I must recheck this expectation). So they propose that a realization of their scenario could occur in $D\gt 10$ "supercritical" string theory, something I don't really like much, partly because of worries that such theories are non-perturbatively ill-defined, partly because they threaten us with a strictly infinite, "unbounded" landscape (the forefather spacetime dimension may be arbitrarily high).

Incidentally, when the number of vacua is strictly infinite (and it arguably is if we allow supercritical string theory), the number of vacua in any interval of the C.C. is infinite as well, and the "relative proportion" of the different C.C. values (the probability distribution) depends on how we number the vacua, so the accumulation could very well be just an unphysical artifact of a numbering scheme (these warnings are actually discussed in Brian Greene's popular book, The Hidden Reality).

Some of the annoying features of their approach have already been mentioned but there's one more: just as they predict a tiny C.C., they also predict a huge compactification radius. The distribution for the radius is similarly expo-exponentially peaked near the Hubble scale (which is obviously an unacceptable value). They may still get a "generic prediction of a small C.C." by demanding a much shorter compactification radius, which they find exciting, but it just means trading one hierarchy problem for another, with some extra complications that their construction brings.

So I am mostly skeptical. But the point that some overlooked dynamical features could make the C.C. in a class of vacua "much more likely to be near zero" than naively expected is a point that I have shared for years. I am less certain about the validity of their particular method of achieving the accumulation of the tiny-C.C. vacua, but "revolutionary enough model builders" should spend at least 10 minutes or an hour thinking about whether such an unusually simple scenario is really as impossible as they almost certainly believe while reading this sentence.

#### snail feedback (20) :

Thanks for explaining this Lumo :-)

I think it is much nicer to try such things (even if they do not pan out in the end) than just throwing in the towel on solving the C.C. or other hierarchy problems by invoking some anthropic cop-outs ...

Could the approach of these authors work with D=12 ...?

The meat of this bold proposal has the kind of 'smell of Infinity' that I have come to vaguely intuitively expect that the meta-mathematical marrow (mainly manifested by string/M-theory) of the multiverse MUST have. %-^-}
This will be my only comment on this blog-post (a post that exemplifies one of the several kinds of TRF-contents that causes me to feel lucky and privileged to be a faithful reader/follower and Lumo fan).

OFF-TOPIC

Lubos - 95% and all that... You might be interested in this:

http://www.nature.com/news/weak-statistical-standards-implicated-in-scientific-irreproducibility-1.14131

"compactification" is one among many misunderstandings that physicists so much love. If only someone would strip these notions of all the physics hype and take them as they are defined in mathematics or topology, one would understand far more.

Yep, this is why Urs Schreiber and friends provide this nice site

http://ncatlab.org/nlab/show/HomePage

for physicists to learn about how such notions are applied more in accordance with how mathematicians use them.

well, thank you for telling me about it. I knew it before. In fact, if you think about compactification as mathematicians do, you will see that the whole of string theory is a sort of incomplete compactification of some topological space. Honestly, I am not that sure that string theory as described now can work, but I am pretty sure it contains some very interesting principles nobody has actually observed, so I think on one side I am a string skeptic, on the other side I think string theory deserves some more attention and deep research... sounds strange, no?

If you mean to say that we should use purely rigorous methods in Physics, I disagree. If we were to turn all of (theoretical) physics into mathematical physics, we would still be stuck at Quantum Field Theory.

Physics is the study of the real world, so it requires intuition too; in fact, intuition is very important.

I once had a comment discussion on Physics.SE with a mathematician, Sergei Akbarov, who thought that physics should be done purely axiomatically, and the extent of his argument is that a lot of things that I and others take pretty trivially are actually pretty crazy to mathematicians.

I of course respect this view and actually think that Mathematical Physics is important, but I disagree with your idea that all of (Theoretical) Physics should be dominated by rigorous Mathematical Physics, since Physics is after all the study of... the "real world", not abstract Kirchhoff's this and that : ) (and yes, I know Kirchhoff has nothing to do with mathematical rigour, but I liked that phrase I found in a popular-level book).

I agree with you but look, take as an example supersymmetry... it is clear that it is a way to avoid the Coleman–Mandula theorem by having commuting and anticommuting variables, but the fact that the Haag–Łopuszański–Sohnius theorem proves that SUSY is the "only" way is by far not as rigorous as I would like. In my opinion it is *a* way, and if some people would care more about existence *and* uniqueness proofs in physics, one could advance farther. There are nevertheless lots of other aspects where things can be improved. You mention path integrals. They can be well defined or poorly defined and it is not obvious to everyone how to do it, but there is no "prescribed" way of doing the quantisation except for the case in which you want to work with "fields", and yes, quantum fields are not physical objects in any way... so, what I mean is that one should pay more attention to the rigor of some formulations because new things can appear from that

also, take flux compactification: if you look carefully, there can be no compactification towards a de Sitter or Minkowski spacetime unless you allow localized sources. Is that the only way? No uniqueness theorem says anything about that, except a "no-go" theorem that appears rather obvious but is not as "exact" as one might think

it's as if a bunch of people were in a hurry and accepted each time the first idea someone got...

Dark energy & dark matter... with a little bit of tweak those two are eliminated. G is not universal constant, that's the tweak. I can't believe how blind physicists seem to be :-)

As a bonus, with the tweak, one gets the TOE.

I mean, string theory is NOT wrong, how could it be? I understand all the proofs that I learned about it (probably I did not read all of them)... so, yes, it is the most well proved theory in terms of the existence of all its parts or the mathematical consistency, but it is NOT clear that all the parts put together represent any true real final theory, or that some aspects are not just one way of looking at things out of many others. Say, "multiverse"... who could honestly say there is a real necessity for a "multiverse"? There are some people making huge propaganda everywhere about why some anisotropy in the background radiation should be interpreted directly and only as proof that something is pulling on the universe... I mean, what if it's because of a phase transition that occurred in the early universe, or something else... See, this is what I call "rigor": just don't jump around whenever you have a new idea, and think a bit... it may be correct and consistent but then, maybe nature found another way... etc.

very interesting article. If "the C.C. is much more likely to be near zero" and a small C.C. is associated with life, then it may imply that not only our universe is full of life, but the multiverse would also be full of life. How many aliens do we have to face?!!!

So it's related to evolution by way of analogy, not literally. For a minute there I was worried this was the "universes literally reproduce and evolve" idea.

Dear Kashyap, whether other vacua are full of life or not, we will never "face" them.

If a no-go theorem based on several assumptions is explicitly falsified by relaxing one of its assumptions, then of course it can be falsified by relaxing the other assumptions too. But then logic dictates that the set of assumptions used in the first place must not have been unique, because adding another assumption may make the no-go theorem valid again, so the validity of the theory is reduced not only in the direction of the first assumptions but in the directions of all other possible sets of assumptions. Unfortunately there is no axiomatic foundation for the assumptions made, and there are lots of other possible choices of "assumptions", none of them proved to be any more fundamental than another. This is a problem that goes around string theory from one end to the other: the proofs of duality, anomaly cancellation, compactification, what do I say? Any tiny progress made in string theory is not proved to be either unique or relevant. The only proof there is is the proof of consistency, but then again, there is not only one unique framework of "consistency" in the universe... there are quite a lot of examples (take just all the different but equivalent quantization prescriptions). I studied anomaly cancellations carefully for a year or so and in the end I came to the conclusion that they are wishful thinking. It may be a hard word, but then, there is no actual reason why one should demand anything from the physical universe based on the consistency criteria of some theory. Most of the results since the 1980s are based on the assumption that Nature must do something to save us from saying nonsense... well... it must not! If anything proved that we can merrily say nonsense in a consistent way, it was the whole history of pre-quantum science...

It is not clear to me whether, if the cosmological "constant" is nonzero, it is time-dependent as a result of the dissipation of energy (and the associated entropy) that must accompany the curvature of spacetime.