Wednesday, March 20, 2013

Greene et al.: too large landscapes are unstable

For years, I've been intrigued by the general idea that the usual KKLT arguments – those supporting the view that the number of vacuum (dS or AdS) solutions to string theory is googol-like huge, that most of them are stable, and that they realize the anthropic principle (including the anthropic explanation of the tiny cosmological constant) within string theory – fail because an overlooked mechanism harms the classes of vacua that are too numerous and prefers the more unique vacua (or their small families).

In the landscape category of this blog, you will find numerous articles discussing various additional instabilities that may appear on configuration spaces of too high dimensions.

See e.g. Resonance tunneling and landscape percolation, Landscape decay channels, Disorder on the landscape, Locally predictive landscape, among others. I have personally spent some time with new ways in which the compactification manifolds could decay – even to several disconnected, simpler pieces – and various new factors that could enhance the decay rates. Now, there's a new addition to the collection with some famous author names.

Brian Greene, David Kagan, Ali Masoumi, Erick J. Weinberg, and Xiao Xiao released the following hep-th preprint:

Tumbling through a landscape: Evidence of instabilities in high-dimensional moduli spaces
Incidentally, I think it is a mistake for Brian et al. not to cite several of the papers discussed in the aforementioned TRF blog entries.

At any rate, the main message of the new paper is clear: if you look at portions of the stringy or similar configuration space with too many scalars (those resulting from "too complicated" compactification manifolds with too high Hodge numbers etc., which are typically necessary to obtain the anthropically huge sets of compactifications), the naive calculations of the rate of the lethal bubble nucleation have to be modified, and one actually finds many more instabilities, and more abrupt ones, on that sub-landscape. This dramatically reduces the percentage of viable vacua in these large classes.

So far, I haven't been able to "compress" the reason why this is claimed to occur into several words – their arguments are a mixture of analytic estimates and numerical calculations that makes the main idea somewhat less penetrable to me; but I haven't spent enough time with the paper yet, so this may change. At any rate, I do guess that the increased instabilities have something to do with some new freedom to construct instantons – or new (more complicated) instantons – that don't exist if you consider only one scalar or a few scalars.

Instead, let me offer you a few approximate formulae that summarize their main conclusions.

The bubble nucleation rate has the form\[

\Gamma = A e^{-B}

\] where the exponent \(B\), normally related to the instanton action, has a more important impact on the stability of the minimum in the landscape. They look at the percentage of the minima in the sub-landscape with \(N\) scalar fields for which \(B\geq 1\) – well, one surely wants \(B\) to be much greater than one for the Universe to be stable enough and to be a candidate to match ours – so let's look at a more general separation of the minima into stable and unstable ones.

They find out that the fraction of the minima in the landscape for which the exponent \(B\) is greater than \(B'\) i.e. that are more stable than a certain bound goes like\[

\frac{\#({\rm stable})_N}{\#(\rm all)_N} \sim \exp[-0.001\cdot\lambda\cdot B'\cdot \exp(b N)]

\] This formula results from a partly uncontrollable set of assumptions and some numerical calculations – at least, I can't prove that every piece is robust now. But they suggest that \(\lambda\) could be of order \(0.01\). If that's so, the double exponential decrease produces, for \(B'\sim 1\), the fraction \(10^{-20}\) or so for \(N=18\) and \(10^{-1,300}\) for \(N=23\). For higher values of \(N\), the double exponential drop of the percentage becomes insane.
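To get a feeling for how violent the double exponential is, here is a toy evaluation of the fraction formula above. The value \(\lambda=0.01\) follows the paper's suggestion, but the slope \(b\approx 0.85\) is purely my illustrative guess, chosen only so that the quoted numbers for \(N=18\) and \(N=23\) come out roughly right:

```python
import math

def log10_stable_fraction(N, lam=0.01, B_prime=1.0, b=0.85):
    """log10 of exp[-0.001 * lam * B' * exp(b*N)], the fraction of
    minima more stable than the bound B'.  The slope b = 0.85 is an
    illustrative assumption, not a number taken from the paper."""
    return -0.001 * lam * B_prime * math.exp(b * N) / math.log(10)

# log10_stable_fraction(18) ≈ -19    (about 10^-20, as quoted)
# log10_stable_fraction(23) ≈ -1343  (about 10^-1300, as quoted)
```

Note that going from \(N=18\) to \(N=23\) multiplies the exponent by \(e^{5b}\approx 70\), which is why the survival probability collapses so abruptly.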

At that point, for \(N\geq 23\) or so, the ratio is pretty much zero and one sort of expects (well, if he assumes that all the vacua are generic and the distributions are quasi-uniform, something I find hard to believe, but let me omit this complaint in this blog entry) that you won't find a single stable enough minimum on the landscape. The number of vacua in those classes, even though it's often claimed to be a "very high" number such as \(10^{500}\), is simply not high enough to beat the "instability disease" that cripples most of the candidates.

I think that if these formulae were right and if one still tried to follow the KKLT-like multiverse thinking, but with the corrected maths, the mechanism could still prefer the vacua with \(N\sim 18\) or so, using the example above, and there could perhaps still be enough minima on such a sub-landscape to anthropically explain the tiny cosmological constant, assuming that the potential were a function of a high enough order (or complexity) in these fields. So it seems to me that their conclusion that they have debunked the anthropic explanation of the cosmological constant may be premature or incorrect.

Your humble correspondent would prefer a principle that favors vacua with the smallest values of \(N\), the number of scalar fields, such as \(N=1\), and some completely different explanation of the small C.C. that is neither naively statistical nor anthropic, but my preferences don't matter, of course. It is perfectly plausible that Nature favors some intermediate values of \(N\). There could be \(10^{200}\) vacua for \(N=17\) and \(10^{-80}\) of them could be stable enough. That could still produce vacua with the minimum achievable positive cosmological constant around \(10^{-123}\) in the Planck units by some natural naive anthropic estimates.

Again, to repeat some points I have made many times in this category, I feel that too much intuition from quantum field theory has been blindly imported to string theory. However, string theory may modify many of the conclusions, and even in quantum field theory with very many scalar fields, events may simply proceed differently.

Just a trivial example: consider the \(N\)-dimensional unit ball. What is its volume (in units of the volume of the unit \(N\)-dimensional cube)? This is a fun exercise I numerically "derived" with the help of a Commodore 64 when I was 10 years old, and the answer is\[

V(B^N) = \frac{1}{(N/2)!} \pi^{N/2}

\] where \(n!=n\cdot (n-1)!\), \(0!=1\), \((-1/2)!=\sqrt{\pi}\). Note that for very large \(N\), the factorial ultimately grows faster than any simple exponential (or power law). Indeed, Stirling's formula says\[

X! \sim \sqrt{2\pi X}\left(\frac{X}{e}\right)^X, \quad X\to \infty.

\] The most important factor here is \(X^X\). This ultimately beats \(E^X\) for any fixed base \(E\) such as \(E=e\). Incidentally, you may calculate the volume of the ball analytically by computing the integral \(\int d^N x\,\exp(-|x|^2)\) in two different ways: by a decomposition of the integral into \(N\) one-dimensional factors, each yielding \(\sqrt{\pi}\) in Cartesian coordinates, or by a calculation in spherical coordinates whose angular part produces the volume of the unit sphere while the radial part generates a version of the Euler integral for the factorial.
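The closed formula above is easy to evaluate with the gamma function, since \((N/2)! = \Gamma(N/2+1)\); a quick check of the familiar low-dimensional values and of the factorial suppression at large \(N\):

```python
import math

def ball_volume(N):
    """V(B^N) = pi^(N/2) / (N/2)!, using (N/2)! = Gamma(N/2 + 1)."""
    return math.pi ** (N / 2) / math.gamma(N / 2 + 1)

# ball_volume(2)   ≈ 3.14159  (area of the unit disk, pi)
# ball_volume(3)   ≈ 4.18879  (4*pi/3)
# ball_volume(100) ≈ 2.4e-40  (the factorial has crushed pi^50)
```

The last line makes the point of the paragraph above concrete: the unit 100-ball occupies a fraction \(\sim 10^{-70}\) of the cube \([-1,1]^{100}\) circumscribing it.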

Now, this volume of the \(N\)-ball or a similar \(N\)-sphere (or something similar) naturally appears at various points of the calculation. So you may get something like \(10^N\) vacua but the rates and other quantities may be modified by coefficients that include things of the sort \(N^N\) – and the latter inevitably wins if \(N\) is really large.
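The claim that \(N^N\) inevitably beats a fixed-base exponential like \(10^N\) is elementary arithmetic – the crossover happens exactly when \(N\) exceeds the base – but a one-liner makes it tangible (my illustration, not from the paper):

```python
def power_tower_wins(N, base=10):
    """True once N^N exceeds base^N, i.e. once N > base."""
    return N ** N > base ** N

# power_tower_wins(9)  -> False  (9^9  ≈ 3.9e8  < 10^9)
# power_tower_wins(11) -> True   (11^11 ≈ 2.9e11 > 10^11)
```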

Many papers have been written and what really matters on the landscape of highly complicated compactification manifolds hasn't been settled yet. But one must surely be careful because much of our intuition has been trained in field theories with small values of \(N\sim{\mathcal O}(1)\) and they may break down when we switch to complicated landscapes.

Tomorrow, Keith Copsey will release a related but inequivalent paper arguing that orientifold planes – pretty much inevitable in all semi-realistic F-theory vacua – suffer from a perturbative instability (deformations allowed when you lift the orientifold planes to M-theory) that may destroy the stringy landscape as we know it. I find this claim confusingly far-reaching because, for large enough manifolds in Planck units, the instabilities from any non-local states or transitions must be hugely suppressed and essentially described by the effective low-energy field theory – am I wrong?


snail feedback (6) :

reader Luke Lea said...

"Just a trivial example. Consider the N-dimensional unit ball. What is its volume (in the units of the volume of a unit N-dimensional cube)? This is a fun exercise I numerically "derived" with the help of a Commodore 64 when I was 10 years old and the answer is . . ."

Wow, you really were a prodigy. Was it for you like it was for Sheldon in East Texas?

reader Luboš Motl said...

Dear Luke, one doesn't have to be a prodigy to do that! It's rather straightforward. A basic understanding of the concept of integrals, the ability to write a program for a simple integration, and some elementary numerological playfulness are enough.

It was almost 30 years ago when I wrote the program but I remember it as if it were yesterday. ;-)

I cut the N-dimensional ball into slices which are (N-1)-dimensional balls whose volumes, as a function of "r", had already been found by induction from the previous calculations. So the volume of the N-ball is an integral of a simple function from (-1) to (+1).

Now, because there are pi's everywhere, you try to divide the results by pi to see whether the ratio is rational – you know in advance that it works for the disk and for the 3-ball – and if it doesn't work for the 4-ball etc., you try to divide by pi again.

The accuracy of the numerical integration was so high that you had no doubt that the volumes of the 4-ball and 5-ball are rational multiples of pi^2, then pi^3 for the 6-ball and 7-ball, and so on. One determines the rational multiples by a simple division. I think that I found the multiples up to the 10-ball, and factorized the numerators and denominators of the rational numbers.

The formulae were clearly behaving differently for even and odd values of N. I reconstructed the factorial-like rules for both cases separately. It took me several more years to realize that the odd- and even-N cases may be unified if one naturally defines the factorial of half-integers, too. I had to wait until high school to get to the Gaussian non-numerical derivation and, as far as I remember, I didn't rediscover the analytic derivation independently: I just saw it in a book.
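In Python rather than Commodore BASIC, the slicing recursion described above reads roughly as follows; the midpoint rule and the step count are my choices, the idea (the cross-section of the unit N-ball at height x is an (N-1)-ball of radius sqrt(1-x^2)) is the one in the comment:

```python
import math

def unit_ball_volumes(n_max, steps=20000):
    """Volumes of unit N-balls built by the slicing recursion:
    V_N = V_{N-1} * integral_{-1}^{1} (1 - x^2)^((N-1)/2) dx,
    since each slice is an (N-1)-ball of radius sqrt(1 - x^2)."""
    vols = [1.0]  # V_0 = 1: a point
    dx = 2.0 / steps
    for n in range(1, n_max + 1):
        # midpoint-rule integration of the slice profile
        integral = sum((1.0 - (-1.0 + (i + 0.5) * dx) ** 2) ** ((n - 1) / 2.0)
                       for i in range(steps)) * dx
        vols.append(vols[-1] * integral)
    return vols

vols = unit_ball_volumes(6)
# vols[2] ≈ pi, vols[3] ≈ 4*pi/3, vols[4] ≈ pi^2/2, vols[6] ≈ pi^3/6
```

Dividing vols[4] by pi twice indeed leaves something very close to 1/2 – the rational multiples the comment talks about.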

Sheldon was arguably doing more hardcore things when he was 10.

reader Dilaton said...

Nice article, I'll have to follow the links to the different instability mechanisms too :-)

In particular I'd like to find a way to collapse the family of landscapes that contain too much snow ...!!!

reader scooby said...

There is a new preprint this morning on this topic: .

reader landscape_doubter said...

Great that you mention these works. Let me just add that in recent years there have been various papers claiming that the anti-brane uplifting mechanism suffers from problems which can also be interpreted as instabilities. These papers are not all hand-waving but contain actual serious computations. So it seems that in many ways certain string cosmologists have been claiming a landscape and multiverse a bit too soon... (The other uplifting scenarios suffer from their own problems as well. So de Sitter is very elusive in string theory – perhaps simply not there.)
