## Friday, April 22, 2005

### Kennedy's landscape

Frederik Denef (Rutgers U.) was explaining how to build a better racetrack (with Bogdan Florea and Michael Douglas), i.e. how to construct particular examples of the numerous KKLT anti de Sitter vacua - the mathematical constructions that are used to argue that the anthropic principle is needed in string theory. The talk today was actually based on a newer paper with Douglas, Florea, Grassi, and Kachru; sorry for the incorrect reference, and thanks to Frederik for the correction. Nevertheless, I will keep some examples from the older paper, too. This stuff is impressive geometry: really high-brow mathematics, even if it happens to be just recreational mathematics.

Nevertheless, the most illuminating idea was the following variation of Kennedy's famous quote, due to Abdus Salam:

> Ask not what your theory can do for you. Ask what you can do for your theory.

This could become the motto of landscape research. Suddenly it's not too important whether a theory teaches us something new about the real world - whether it predicts new, unknown phenomena or previously unknown links between the known phenomena and objects. It's more important that such an unpredictive scenario might be true, and that we should all work hard to show that the scenario is plausible, because we are supposed to like this scenario, for reasons that are not clear to me.

It's slowly becoming a heresy not to believe the anthropic principle - and it already is a heresy to think that even the question whether the anthropic reasoning explains the details of our universe is not the most interesting question, at least among the scientific ones. Even if some numbers in Nature - such as the particle masses - are random historical coincidences, we will never know it for sure.

Let me remind you about the basic framework of the Kachru-Kallosh-Linde-Trivedi (KKLT) construction - the most frequently mentioned technical result used to justify the anthropic principle in string theory. String theory often predicts many massless scalar fields that are unacceptable because they would violate the equivalence principle, and we would already have detected them.

They must be destroyed - i.e. they must acquire masses. The potential energy, as a function of the scalar fields, must have a finite or countable number of minima. The scalar fields then sit at these minima - we say that the moduli (scalar fields) are stabilized, which is a good thing and one of the unavoidable tasks. Moduli stabilization was, in fact, the main goal of Frederik's talk.

KKLT start with F-theory (a formally 12-dimensional theory due to Cumrun Vafa) compactified on an elliptically fibered Calabi-Yau four-fold (an eight-dimensional manifold in which an elliptic curve, i.e. a two-torus, is attached to every point of a lower-dimensional base space) to give you a four-dimensional theory with a negative cosmological constant and all moduli stabilized. Then they add some non-supersymmetric objects (anti-D3-branes) to create a de Sitter space (with the observationally correct, positive cosmological constant and broken supersymmetry) out of the original anti de Sitter space (AdS).

The talk today focused on the AdS, supersymmetric part of the task.

The F-theory vacuum on a four-fold may be re-interpreted as a type IIB vacuum with orientifold planes (both O3 and O7 where 3 and 7 count the spatial dimensions along the fixed planes). Moreover, there are some fluxes of the three-forms over three-cycles (both the NS-NS as well as the R-R field strengths). The integral
• int (H3 wedge F3) + #(D3)
must equal chi(X)/24, due to tadpole cancellation; this constrains the fluxes H3 and F3 (numerical constants are ignored). In terms of the four-fold, the flux contribution may be written as
• L = 1/2 (int G4 wedge G4) = chi (X) / 24 - #(D3)
where you may think about M-theory on a four-fold instead of F-theory (a dual description for finite areas of the elliptic fiber), and G4 is the standard M-theoretical four-form field strength (its integral over one of the two 1-cycles of the toroidal fiber gives you the NS-NS and R-R three-form field strengths, respectively). Such a cancellation condition still allows for a huge spectrum of possible choices of the integer-valued fluxes: as Bousso and Polchinski estimated 5 years ago, if there are 300 three-cycles and each of them can carry a flux roughly between 0 and 30, then there are 30^{300} or so possible universes. The light scalar fields that we need to stabilize are
• the dilaton/axion
• the complex structure moduli, the shape parameters of the four-fold
• the Kahler moduli, the areas of topologically non-trivial two-dimensional manifolds (2-cycles)
The first two categories are stabilized perturbatively by the Gukov-Vafa-Witten superpotential
• W = int (Omega wedge G3)
where Omega is the holomorphic three-form and G3 is the complexified three-form field strength that includes both the NS-NS and R-R components (with "tau" as the relative coefficient, which makes "tau" also stabilized). This perturbative superpotential handily stabilizes the dilaton/axion and the complex structure moduli at some values that are in principle calculable. Well, I should really write the 8-dimensional integral "int (Omega4 wedge G4)" from the M-theory or F-theory picture.
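The Bousso-Polchinski estimate quoted above is just a combinatorial count, and it is easy to sketch numerically. The cycle and flux numbers below are the illustrative round figures from the text, not data from any specific compactification:

```python
import math

# Naive Bousso-Polchinski-style counting: with n_cycles three-cycles, each
# carrying an integer flux between 0 and f_max, the number of distinct
# flux configurations is (f_max + 1) ** n_cycles.
n_cycles = 300   # illustrative number of three-cycles
f_max = 30       # illustrative maximal flux per cycle

n_vacua = (f_max + 1) ** n_cycles

# The number itself is astronomically large; quote its order of magnitude.
order_of_magnitude = n_cycles * math.log10(f_max + 1)
print(f"roughly 10^{order_of_magnitude:.0f} flux vacua")  # roughly 10^447
```

The count scales as (flux range)^(number of cycles), which is why adding cycles to the topology inflates the landscape so quickly.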

However, the Kahler moduli (the sizes of the two-cycles) are not stabilized by any perturbative effects. Such a fact is also known from other types of stringy models of reality, the so-called "no-scale supergravities" obtained e.g. by compactifying the heterotic strings on Calabi-Yau three-folds. These moduli are, however, stabilized by M5 (or "F5")-brane instantons wrapped on six-cycles of the four-fold. This can either be interpreted as D3-brane instantons in type IIB, or condensation of gauginos living on the D7-branes.

Note that we want to add new terms to the superpotential W that stabilize all the moduli. The precise value of the Kahler potential (not to be confused with the Kahler moduli, although Mr. Kahler is of course identical in both cases; the Kahler potential is another function that determines the physics of four-dimensional supersymmetric theories) is not protected, and it is always a source of controversies.

OK, these are the general rules - everything else amounts to looking for more exact, particular examples. A goal is to stabilize the Kahler moduli at sufficiently large volumes of the internal space, whatever the space exactly is. This (large volume) is something that can be marginally achieved (if you think that the number 20 is large), but the 2-cycles are never really large at the end. Instead, they are comparable to the string size.

The anthropic strategy is to pick Calabi-Yau manifolds that are as complicated as possible, to guarantee that there will be a lot of mess, confusion, and possibilities, and that no predictions will ever be obtained, as long as all the physicists and their computers fit into the observed Universe (which is an encouraging prediction that Frederik has also mentioned).

This means that you don't want to start with Calabi-Yaus whose Betti numbers are of order 3. You want to start, if one follows the 2004 paper, with something like F_{18}, a toric Fano three-fold. That's a 3-complex-dimensional manifold that is analogous, in a sense, to the two-complex-dimensional del Pezzo surfaces. But you don't want just this simple F_{18}. You take a quadric Z in a projective space constructed from this F_{18} and its canonical bundle. OK, finally, the Euler character of the four-fold X is 13,248. A great number, and one can probably estimate the probability that such a construction has something to do with the real world. It becomes a philosophical question whether we should distinguish this probability from the number "zero", and how much this "zero" differs from the probability that loop quantum gravity describes quantum gravity at the Planck scale. One can also estimate the values of the scalar fields at the minima of the potential, and the number of vacua (some of their models only had a trillion vacua; others have 10^{300} - of course, the Kennedy rule is that the more ambiguous and unpredictive a set of vacua is, the more attention physics should pay to it).
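For what it's worth, the quoted Euler character can be fed back into the tadpole condition from earlier in the post. A trivial arithmetic sketch, assuming purely for illustration that there are no mobile D3-branes (the real model may well differ), gives the total flux budget:

```python
# Tadpole budget for the four-fold X with the quoted Euler character,
# using L = 1/2 int(G4 wedge G4) = chi(X)/24 - #(D3).
chi = 13_248      # Euler character of X, as quoted above
n_d3 = 0          # hypothetical choice: no mobile D3-branes

# chi(X) must be divisible by 24 for the flux units to come out integral.
assert chi % 24 == 0
L_flux = chi // 24 - n_d3
print(L_flux)  # 552 units of G4 flux to distribute among the cycles
```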

The example today, from the 2005 paper, was the resolved orbifold "T^6 / Z_2 x Z_2", which has 51 Kahler moduli and 3 complex structure moduli. The singularities were analyzed by a local model, and the various toric diagrams shown were related by a flop (or a flip, which is now the more popular terminology). Sorry for neglecting the real model of this talk in the first version of this article.

Cumrun - who is not exactly a fan of the anthropic principle (unlike Nima, who tried to counter him) - was extremely active during the talk, and he argued for the existence of many new effects that were neglected. For example, there is new physics near a high-codimension singularity that is needed in one of these models. Cumrun argued that the fivebrane instantons could get destabilized - kind of unwrap from the singularity; that a lot of instanton corrections could arise from various cycles; and so forth. The expansions are never quite under control because they rely on some "small" numbers that can be as large as (4 pi / flux), where the "flux" is of order "ten" or "one hundred". Most estimates for the Kahler potential are unjustified, and so forth.

Their calculations required drawing a lot of toric diagrams (a representation of a manifold in which toroidal fibers are attached to a region with boundaries on which some of the circles of the tori shrink to zero); determining various cycles and their triple intersection numbers (it's like counting how many holes a doughnut has, but in a more difficult 8-dimensional setup), which are needed for the volume; and a lot of computer time. Do we really believe that by studying the orientifold of the weighted projective space CP^{4}_{[1,1,1,6,9]}, we will find something that will assure us (and others - and maybe even Shelly Glashow) that string theory is on the right track? I believe that the simplest compactifications, whatever the exact counting is, should be studied before the convoluted ones. If we deliberately try to paint the string-theoretical image of the real world as the most ambiguous and uncalculable one, I kind of feel that it's not quite honest.

When we study the harmonic oscillator and the Hydrogen atom, we want to understand their ground states (and low-lying states) first - where the numbers are of order one. Someone could study the "n=1836th" excited level of the Hydrogen atom, hoping that it is messy enough to explain why the proton mass is so much larger than the electron mass. But is it a well-motivated approach? Some people used to blame string theorists for only looking for the keys (to the correct full theory) under the lamppost. It's unfortunately not the case anymore: most of the search for the keys is now being done somewhere in the middle of the ocean (on the surface). Maybe someone will eventually show that the keys can't stay on the surface of the ocean, and we will return to the search for the keys in less insane contexts. But it's not easy to prove something about the middle of the ocean, especially if we don't yet understand the shape of the Manhattan island.

#### snail feedback (18) :

Hi Lubos,

I must say you're an impressively fast writer...

Just wanted to point out that you cited the wrong paper -- I was talking about our more recent hep-th/0503124. The model we study in that paper is T^6/Z_2 x Z_2 -- hardly an "as complicated Calabi-Yau manifold as possible". We did actually check an exhaustive list of nontrivial consistency requirements for this model and estimated various corrections.

Also, I was not advocating the anthropic principle or anything like that in this talk. I tried to keep the issue clean, technical and certainly non-philosophical, just showing that all necessary ingredients for a susy AdS KKLT vacuum can be realized in a very concrete and simple model. It's up to you what to do next with this piece of information.

Finally, the physics variant of the Kennedy quote actually goes back to Abdus Salam. This philosophy seems to have worked quite well for him.

Cheers,
Frederik

the paper is indeed a very interesting one, esp. the part mentioning 'O7 swallowing a D3';)
Salam's modification of JF Kennedy (airport's racetrack) captures the universal sufferings in string theory research very well. da theory is supposed to have infinite amount of potential, and we do everything and anything we can for the theory. sometimes, a nice piece of pie drops from sky.
there's some far-fetched similarity between commercial airlines and da string community. in a sense, commercial airlines are better because there the majority are pilots, not hi-jackers :)

Just a comment in respect of your 2nd paragraph:

If a theoretical or philosophical interpretation is (or can be shown to be) the one most likely to be true, then it "should" not matter if you or anyone else [even me :}] cannot stand to take in this truth.

String theory and evolutionary psychobiology type theorizing can (or may) never become meaningfully mathematically unified,
but these fields of Science sure can be seen to be closely correlated at the level of string theorists' sensibilities. %-)

With awed but entertained appreciation from afar,

Peter


It seems to me that, whilst you think that string/M-theory is an approximately correct approach at a fundamental quantitative explanation of "What Is going on", you (in addition) hope or would prefer (or perhaps even expect?!) to find a way to logically prove that out of an ~infinite number of "vacua" only one is real (or physically possible) and all the rest is just figments of string theoretically inspired imagination.

Do you think I am not even wrong?
%}

Whilst I am in unqualified agreement with you that string/M-theory "is it", I also sense that it will ultimately beg to be declared 'fundamentally theoretically uncertain' (so to speak); similar to so much else - including stringy stuff.

But I am also prepared to bet a good bottle of wine (as long as I'm allowed to share it;) on that string/M-theory is placed at the very end of a line of best possible workable mathematical logic.

Peter

While you technical boys like to play, I am going to bring some down to earth here if that's okay?:)

I think we are gaining quite appreciably on what physics and science have done for us, as we move our views to a transcendental view of our pictures of the earth and moon. Gia's measure of the variance using mirrors on the moon should suffice for insight?

We are understanding lag times (Clementine), and we understand the two-point measure of GRACE, and with this a deeper view of the reality of the mass, as a mean time, and other functions of the ocean, as a time variable against this mean.

So we have gained in perception about the island in the vast cosmos?

Is there ever a sane way to bring this theoretical world down to the population at large? I certainly hope so and am trying hard. I believe some like to see better, and just can't?

How would theoretical development inspire math to bring forth the images that are necessary for complete comprehension? Susskind understands this well, I think. Non?

Lubos said: "Do we really believe that by studying the orientifold of the weighted projective space CP^{4}_{[1,1,1,6,9]}, we will find something that will assure us (and others - and maybe even Shelly Glashow) that string theory is on the right track? "
But Lubos, the world is a complicated place. I *would* expect the actual internal space of our Universe to be something extremely complicated -- how else are you going to explain all those complicated mass matrices etc etc etc. The problem is not here, it is with how that particular complicated space is *selected*. And that, as you and Vafa have stressed, ought to come out of solving some kind of Wheeler-de Witt equation. What you should be telling us is not how silly the landscape is, but about how Vafa and his clever pals are going to push along their wavefunction of the Universe ideas. Or perhaps you can tell us what Sash Sarangi and Henry Tye are up to these days. I think that this is more useful than just telling us that the landscape is nonsense-- we know that already :-)

Fyodor: "But Lubos, the world is a complicated place."

Dear Fyodor, at the fundamental level, I just, respectfully, don't believe so. For example, the Standard Model and GR (plus 30 parameters or so) - three lines of equations, if you just want to define them - describe virtually everything we have observed so far.

And string theory in the classical vacua - e.g. those that respect Grand Unification based on E_8 or other groups, to quote the best example - has the power to reduce the number of independent ideas and objects further although the vocabulary required from a physicist is broader than for QFT (the compactness of the formulae is a cheat, to some extent).

Quarks and leptons naturally fit into a derivable representation and gravity is automatically the other low-energy force. This was a clear progress in describing the real world, albeit an incomplete one so far.

In the history of physics, the fundamental laws were never really "complicated" at the end. Recall the bootstrap era in the 1960s that was meant to explain the zoo of strongly interacting particles by something that can never be really calculated. Wrong. This atmosphere also suggested that effects with 10^{200} possibilities could occur. The true answer was an SU(3) gauge theory - which is "the next one" after the SU(2) that was already known to be relevant, if you allow me to use this counting.

In physics, it has also been a useful policy to look for the simplest theories with certain features, and then for the "next-to-simplest" and "next-to-next-to-simplest". The methods by which the theories are generalized, and by which the "next" is interpreted, became much richer and often included new fascinating concepts (quantum mechanics is also just "next to" classical physics, in a sense). It is just the opposite approach to look for "very-far-from-simplest" theories that only use the same concepts as some simple theories, and I am not aware of any example in history when it was successful.

The statement about a "complicated place" disagrees with the program of unification and the search for deeper theories. The most incomprehensible thing about the universe is that it is comprehensible, as Einstein said. Five years ago, I would not think that many people in the field would endorse the comment that the ultimate explanation for complicated mass matrices would be the existence of a very complicated landscape of backgrounds where the right one can't really be isolated. Today that's becoming mainstream, or - at least - the most visible research direction.

Better theories in physics - and even better theoretical frameworks for string theory - are always meant to explain a larger number of previous observations using a smaller number of independent objects, principles, and equations, which typically also includes yet-unobserved phenomena i.e. better theories are able to make new predictions.

If a description of the real world claims that the observed patterns - such as the fermion spectrum; hierarchies of various kinds; disappearance of unwanted particles - are a consequence of some uncontrollable accidents in an uncontrollably complicated framework, then this description is not progress in physics because it does not explain anything.

The statement that I believe that string theory describes the real world means that it will once become possible to predict new phenomena and numbers (or at least relations between them) that cannot be extracted from the previous theories. Explaining complicated mass matrices by saying that "there is a comparably complicated geometry behind it" is not an explanation of anything, and surely, in that case, it's more realistic to work with the low-energy effective field theory than with a hypothetical ensemble of hypothetically consistent 10^{200} different F-theory vacua all of which *might* reproduce it with an unknown probability.

Concerning Hartle-Hawking, I was planning to write a report about the journal club on Wednesday about the topic. But once again, if the Ooguri-Vafa-Verlinde program is successful in real physics one day, it means that it will present the correct background as a rather *simple* background (optimistically speaking, "the" background), in an appropriate counting of simplicity, not just "a" complicated background.

Counting probabilities by the HH wavefunction is physically more well-motivated than, for example, "democratic" counting of vacua as vacua. But being more well-motivated is still something different from being an acceptable or established description of physics. Even for the Hartle-Hawking program to be accepted as relevant for the selection problem, predictions or unification of the previous insights must occur.

Comparison of known observed physics and the string theory models: I think that the observed reality is undoubtedly simpler than the typical models that are being studied today. String theory always predicts many universal and almost universal phenomena that are needed to reconcile gravity with quantum mechanics, as I believe. Extra dimensions (although some people would even disagree that string theory makes any prediction or even bias about the total or even the maximal number of dimensions); at weak coupling, excited string states, and so forth.

Then there are lots of objects predicted by various stringy models - low-energy effective fields - that are far from being universal predictions of string theory - such as 51 Kahler moduli in a certain "simple" model. None of these things has been observed. They're not needed for any kind of theoretical consistency, and I think that a natural comment is that they're disfavored observationally. The main reason why many people push these complicated vacua is that they are complicated and numerous enough so that even if they're not true, one of them is likely to look much like our real world as they probably "almost densely" cover the parameter space of low-energy effective field theories. But this is not an explanatory framework for physics. And it goes against the Einsteinian program of unification.

Of course that neither of us knows what other fields and physics there is at higher scales than what we've seen. But I guess that a promising model should still confirm the physics we know that exists, and it is better if it avoids physics that we have not seen.

"Concerning Hartle-Hawking, I was planning to write a report about the journal club on Wednesday about the topic."

I --- and not just I --- will definitely look forward to seeing that!

"But once again, if the Ooguri-Vafa-Verlinde program is successful in real physics one day, it means that it will present the correct background as a rather *simple* background (optimistically speaking, "the" background), in an appropriate counting of simplicity, not just "a" complicated background."

That would be ideal, of course, but it is not unusual for very simple *equations* [Like WDW] to have very complicated *solutions*. The complexity of some landscape models is bad, but it would be ok if it could be shown to result from maximizing something coming out of a general, very *simple* wave function like the one proposed by Firouzjahi Sarangi Tye. In other words, the landscape is only bad if it is presented [as unfortunately it usually is] as being fundamental.

"Counting probabilities by the HH wavefunction is physically more well-motivated than, for example, "democratic" counting of vacua as vacua. But being more well-motivated is still something different from being an acceptable or established description of physics. Even for the Hartle-Hawking program to be accepted as relevant for the selection problem, predictions or unification of the previous insights must occur."

Absolutely. And that's bound to be hard. But that's what we should be trying to persuade people to do. Even in the case of Fred Denef's relatively simple construction, I still want him to tell me *how* that nice simple construction is physically preferred over the horribly complicated ones. Get it out of a modified HH wavefunction, even in a not-enormously-believable way, and *then* one can claim that some progress has been made. Meanwhile, as Lubos says, we still don't properly understand the HH wavefunction itself, and people should work on that. All these things are more necessary than still more statistical hocus-pocus.....

One simple calculation showing why it is very important to prefer "simple" models that don't contain much extra low-energy stuff.

String theory predicts many new particles that have not been seen. It's OK, some of them are guaranteed to be much heavier - excited strings and perhaps also the KK modes defined by their scales. But take all particles that are, in your approximation, massless or very light. Let's assume that your framework can't reliably tell you about a hierarchy among these light particles (no separation of scales), and there are M of them. Also, let's assume that your model(s) contain the Standard Model spectrum plus other things.

Let's say that the Standard Model has 20 or so elementary particles - different mass eigenvalues - and these are those lightest ones, namely those lighter than 92 GeV. Given our assumption that the spectrum of the light particles in a model is more or less random, the probability that you get the right ordering is one in (M choose 20).

For M safely greater than 20, the fraction of plausible backgrounds goes like 1/20^M.
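The suppression factor in this estimate is easy to tabulate. A minimal sketch (the helper name fraction_plausible is mine; note that 1/(M choose 20) falls off polynomially, like 1/M^20 up to a 20! factor, for M well above 20):

```python
import math

def fraction_plausible(m_light, n_sm=20):
    """Chance that the n_sm Standard Model states happen to be the
    lightest among m_light light fields with a random mass ordering."""
    return 1.0 / math.comb(m_light, n_sm)

# The suppression deepens rapidly with the number of light fields;
# 51 is the Kahler-moduli count of the T^6/Z_2 x Z_2 model above.
for m in (25, 51, 100):
    print(m, fraction_plausible(m))
```

With 51 light moduli the suppression is already roughly 10^{-14}, and with a hundred light fields it is near 10^{-21}.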

Note that this suppression of mine contributes much like the vacuum counting of the type 20^M if I identify the number of light particle species "M" with the number of cycles. Most likely, the number of light species could be bigger than the number of cycles, and then my factor wins. Also, my factor grows stronger as we learn more physics at higher energies and find more light particles (the number 20 may then grow). This is a textbook example of Feynman's quote that every new insight we gain reduces the room for religion; the anthropic explanations share the same fate.

The more we know about reality, the less likely it is that reality is described by one of the vacua in a fixed ensemble whose only "virtue" is that it is very large so that it may seem that it's more likely to be "lucky".

A general conclusion is that the 10^{-120} accuracy to fine-tune one number, the cosmological constant, may seem difficult, but in reality, having the right Standard Model at low energies is more constraining, and this difference increases with the convolutedness of the model because it becomes increasingly less likely that it is just the SM at low energies!

Note that if the number of a priori plausible vacua is finite, then the total probability that there is a model in the random landscape that agrees with reality might also converge "safely", and it would in fact be very small. The more low-energy stuff a "generic" theory contains, the more likely it is that it will contradict observations.

It is very important for a promising model to show that it contains just the Standard Model - or the Standard Model plus a very few new things - at sub-100-GeV energies. Surely no one can rationally look deliberately for complicated theories - except if someone thinks that string theory is wrong and the goal is to hide it for as long as possible.

Also, as the simple calculation has hopefully shown, it is not right to consider *one number* of the order 10^{-120} to be the *main* constraint on our theories. The existence of the Standard Model at low energies with the right spectrum is more constraining, and *far* more constraining for the class of convoluted models that are being proposed to solve the C.C. problem. The more convoluted vacua one considers, the more important other, non-cosmological-constant problems of these vacua will become.

OK, someone may object that the light particles are all light because they are all derived from the weak scale, and that therefore there is only one hierarchy, so the number "20" above should really be replaced by "1". I don't know how to address these questions without knowing how the Standard Model arises from the flux vacua, which does not seem to be the focus of the flux research. The number of (light) generations is certainly another constraint that becomes more restrictive as the convolutedness of the models grows.

Fyodor: yes, indeed, the solutions to the WdW equation (and other equations) are likely to be subtle - and should be subtle if the equation is useful at all. They should be "simple" in the sense that the physicists will eventually be able to understand these solutions, eliminate major open questions, and learn to use them quantitatively.

"T^6 / Z_2 x Z_2" sounds pretty simple but the number of moduli etc. just seems too large. Note that this particular model also gives O(10^{12}) vacua only and because this number is rather small, it would not be viewed as a serious candidate by the "strongly" anthropic guys.

Sorry, 20^M should have been M^20. ;-/

Hey Lubos,

As Fyodor pointed out already, theories should be simple, solutions need not. General relativity is summarized in one single, simple, elegant equation, and yet this equation has infinitely many solutions of arbitrary complexity and ugliness (to name a few of the simpler and prettier ones: all of those complicated and ugly Calabi-Yau manifolds).

It would have been great if people had obtained the real world from compactification on a torus. Unfortunately, that didn't quite work out. Nor did any of the zillions of other simple and less simple compactifications one has tried over the years.

So yes, it may very well be that our vacuum *is* complicated, at least to the mind of the Homo Sapiens at the present stage of evolution (a species for which, I should remind you, the seminal insight that hitting someone on the head with a stick is more efficient than with a bare fist, is only a very recent one, on evolutionary time scales). Alternatively, we may realize one day that we completely misunderstood string theory and that the 6-torus carried the whole standard model and solved the cosmological constant and the hierarchy problems all along. Who knows?

I know you have in mind slightly more complicated constructions than a six-torus, but where do you draw the line?

A much more rational approach to this issue is along the lines of your statistical argument. I like the rough idea behind it, but I don't think it is correct as it stands -- even independent of the numerics. The crucial question is how likely light particle species are. Naively at least, they can be expected to be very rare in any general ensemble of string vacua, just because the natural scale is the string scale. So generically, one would not expect any light particles at all, no matter how much one increases the complexity of the topology. Of course we know that there is a sector of light particles in the real world (very good, otherwise we wouldn't be here), but granting that it is a priori very unlikely for any sector of particles to be light, adding more cycles or whatever will not significantly increase the probability of finding more light particles.

Clearly also my reasoning is naive here (e.g. if susy happens to be broken at a low scale, more light particles may become much more likely), and there may certainly be something along the lines of what you suggest.

And yes, it would be great if something like the Hartle-Hawking wave function told us in which vacuum we live. But is this hope realistic? Look at the OVV model. The conditional probability, at given (small) cosmological constant \Lambda (\pm \epsilon), of finding the moduli in a certain region R of moduli space is essentially just proportional to the *number* of N=2 black hole attractor points with given (large) horizon area 1/(\Lambda \pm \epsilon) in that region R. The corresponding distribution on moduli space is known, from the work of those dumb democratic vacuum counters, to asymptote for \Lambda \to 0 to the *uniform* distribution w.r.t. the Weil-Petersson metric. It's not peaked at all; the HH wave function does not favor any region of moduli space -- it is, indeed, democratic.

So what can we do? At least two things at this point, in my opinion.

One is to take the landscape hypothesis and the randomness it suggests seriously and try to extract general qualitative predictions from it. It's naive to think that this is hopeless, almost as naive as it is to think that we can't say anything about a gas in a box because we can't figure out its exact microstate. The work of Nima et al clearly shows one can make this line of thought very concrete.

The kind of statistical arguments you are beginning to make are another way to try to learn something in this context.

The other thing one could try to do is to decimate the landscape. I would be as happy as you if we could make most of those would-be vacua go away. I think there is still reasonable hope one could come up with strong existence criteria. Not so much in the susy sector perhaps, but it is not inconceivable that in the nonsusy dS part of the landscape, metastability will turn out to be a very strong constraint, possibly eliminating a huge class of naive compactifications. There hasn't been much concrete research in this direction; I encourage you to put that great mind of yours to work on trying to achieve some progress in this area :)

frederik

Dear Frederik,

Indeed, all of us agree that arguably simple equations may have complicated solutions, and in the context of string theory, I think that there have been enough pretty convincing examples of this fact. ;-)

But having *a* complicated solution is a very different question from having the right solution. Would you agree?

"[Torus would be nice but did not work.] ... Nor did any of the zillions of other simple and less simple compactifications one has tried over the years."

I think that this statement is too quick and misleading. Although the complete, universally acceptable vacuum has not yet been found, of course, there have been many candidates that were significantly more promising as detailed descriptions of the real world than the generic flux vacua popular today. See the heterotic standard model for a recent example.

It has everything one needs and no exotics. In a certain pretty large class of models, the authors claim that the topology is unique with this property. Using the conventional scientific approach - judging how naturally a model agrees with detailed features of reality - this one would score pretty well. This kind of result gets virtually no attention today. What do you think is the main reason?

"I know you have in mind slightly more complicated constructions than a six-torus, but where do you draw the line?"

Yes, I guess that the torus leads to N=8 SUGRA, which is too much of a good thing ;-) (yes, let's admit the anthropic explanation of why we don't live with extended SUSY), and our present understanding of string theory does not allow us to say exactly where the line should be drawn. But in my viewpoint, this is just a technicality caused by our imperfect picture. What I consider much more important - and it is not clear whether you will agree - is that there must be *a* line at some finite place that we *must* draw.

If string theory is gonna require an infinite sequence of refinements and increases of the complexity (and decreases of the expected predictivity) of its vacua to match reality, then I would definitely love to know this fact as soon as possible, because in such a case I would consider string theory to be a wrong theory of physics - much like other incorrect theories in the history of science that were corrected 100 times before they were abandoned (recall Lorentz's explanations of the Michelson-Morley experiment) - and people should move on.

Of course I don't believe that this is the case, but to allow something like that for a theory *does* mean constructing a scientifically unjustifiable theory.

Your comment that the vacua typically have *no* light particles is probably true, but it only makes matters worse, and it is actually nothing other than my comment that some people would say that getting the right low-energy spectrum is more or less just a matter of creating the (one) hierarchy (between the weak scale and the Planck scale).

We don't know the exact statistics, but I still find it reasonable to expect that if a model has 300 scalars that can be very light, then it is much more likely to be ruled out, because in reality there seem to be no scalars among the "20" lightest elementary particles.

I agree that the probability that a random vacuum in the haystack (also called the "landscape") gives the sufficient hierarchy is probably small, but the probability that it gives the hierarchy *and* the low-energy Standard Model is smaller still, and the more unknown, exotic light matter a model contains, the more likely it is to violate the known facts. But the Standard Model is probably just a detail now, isn't it?
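The rough statistical intuition here can be made concrete with a toy Monte Carlo. This is purely illustrative: it treats each scalar species as an independent draw with a made-up smallness probability p_light, which the real distributions over flux vacua certainly are not.

```python
import random

def prob_any_light(n_species, p_light=1e-3, trials=20_000, seed=0):
    """Toy estimate: if each scalar species is independently 'very light'
    with a tiny probability p_light (an assumption invented for this
    illustration), how often does a vacuum with n_species scalars
    contain at least one unwanted light scalar?"""
    rng = random.Random(seed)
    hits = sum(
        any(rng.random() < p_light for _ in range(n_species))
        for _ in range(trials)
    )
    return hits / trials

# Analytically, P(at least one light) = 1 - (1 - p)^n: every extra
# scalar is another chance to conflict with the observed absence of
# light scalars, so more moduli make a vacuum easier to rule out.
for n in (1, 30, 300):
    print(n, prob_any_light(n))
```

With p_light = 10^-3, the chance of at least one unwanted light scalar grows from about 0.1% for one species to roughly 26% for 300, which is the sense in which a 300-modulus model is "much more likely to be ruled out."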

I agree that OVV in its present form looks much like the equal-weight ensembles, and many distributions mimic the structure of the low-energy HH wavefunction, which is known to have many problems - one of the reasons it can't be a complete solution to the important questions.

"It's naive to think that [extracting predictions from the landscape] is hopeless, almost as naive as it is to think that we can't say anything about a gas in a box because we can't figure out its exact microstate."

The simple reason why I think that this analogy is erroneous is that the gas in your box is naturally expected to be in a state of thermal equilibrium, at least locally. One can derive this fact from the interactions of the box with the external environment and from tracing over the undetectable internal degrees of freedom. The entropy grows and thermodynamics holds.

On the other hand, there is no equilibrium between our universe and the other universes in the anthropic haystack - at least none that is known, formulated, and calculable (and the Hartle-Hawking state may be the closest thing to it that we have). All "derivations" of whether the haystack implies split or supersplit supersymmetry at a low or high scale, assuming this or that cosmological constant, only reflect the authors' prejudices, and I tend to believe that the research has already shown that one can get virtually any result one wants in this way.

In other words, to reply to your similar comment, it is naive to think that the anthropic framework combined with the current knowledge of string theory allows one to make any unambiguous predictions.

Even if an equilibrium-like nature of the ensemble were found - by some quantitative treatment of eternal inflation - the precise "timing" at which the "right" ensemble should be taken would be equally ambiguous. No principle to resolve these ambiguities is known.

I agree that I/we should probably try harder to find new decay channels of the dS vacua, but it is already clear in advance that the current knowledge of the non-SUSY regimes of string theory won't allow a fully convincing final picture, and it's guaranteed that the landscape team would not like the decay channel if one were proposed.

Various other decay channels have been looked at - the NS5-brane-induced decay in the "Fall of stringy de Sitter", for example, at the very beginning. Concerning your statement that there hasn't been much research on these things: there are tens of papers related to these issues, all of them partially contradicting each other, and it's not possible - probably not even in principle, at least today - to decide what exactly is correct. I just think it's premature to ask all these questions.

The cosmological constant of the dS vacua, for example, is guaranteed to receive 1-loop corrections at the SUSY-breaking scale from the effects of the D3-branes through the bulk, which decorrelates the AdS and dS cosmological constants. Do you think it's true and/or a problem?

Best wishes
Luboš

no offense please, everyone. what you all said made great sense, but i feel the philosophizing has gone too far.
i guess we all understand and agree that qcd doesn't tell us much about the low-energy behavior of hadrons and such, but it's still THE correct theory of the strong interaction.
it's of course not guaranteed that the above example has full relevance in the case of string theory. but if one is willing to extend the analogy, then it's not inconceivable that string theory can make some qualitative but not quantitative predictions about the dirty world. and that should be what people are after.
in that sense, KKLT-kind constructions are not too far off. and they are certainly considered valid candidates for a mechanism to cook up the dirty world.
it's obvious that the statistical or even HH wave function kinds of studies should not form the whole scope of string theory research. i think it's fair to say that there are many more important problems to study in string theory than simply explaining a certain sort of numerical facts.
to be concrete, let's not forget that string theory should be a theory of quantum gravity, more so than a theory of everything and anything.

Dear gigbyte,

thanks for your comment. QCD is the correct theory of strong interactions, but your - silent but obvious anyway - assumption that KKLT is analogously the correct description of the rest of the world is slightly overly optimistic, don't you think? ;-)

QCD was the unique quantum field theory - the framework that had already explained electromagnetism and the weak interactions - that allowed asymptotic freedom for quarks and the construction of hadrons from quarks, among more quantitative but less well-known checks. Is there something remotely similar for the KKLT vacua? Is the conjectured large degeneracy supposed to be the analogous argument?

The probability that exactly this type of vacuum is the closest description of the real Universe is small, because evidence that its "details" agree is lacking; but even if one buys that we live in these kinds of IIB orientifold flux vacua, one simply can't get qualitative predictions out of it without localizing the right one, at least approximately. No such predictions have been found and none will be found in the future - that's pure logic. One can find various averages of various quantities over various ensembles with various measures - but they don't have any physical meaning.

"...let's not forget that string theory should be a theory of quantum gravity, more so than a theory of everything and anything."

I don't think so. String theory is an interesting candidate to replace the SM & GR because it seems to be the theory of all interactions. Surely, if one omits the non-gravitational part of the particle spectrum, one loses the predictions that would otherwise be universal (shared by all backgrounds of string theory).

The only universal features of quantum theories of gravity are those that can be derived from classical GR and its attempted quantization - which effectively ends at the semiclassical approximation. This includes classical and semiclassical GR, black hole thermodynamics, etc. Everything else we know about quantum gravity consists of *potential* features that may occur in some backgrounds but not others.

More frustratingly, using the words of Steve Shenker, AdS/CFT now says that nearly every QFT may be assigned a "gravitational dual" - i.e. every quantum theory is, in some loose sense at least, a theory of quantum gravity (in a higher-dimensional space). We definitely don't want to be *this* broad because then the term "quantum gravity" would become pretty vacuous.

All the "details" of a theory of quantum gravity - those that manifest themselves at higher loops as undetermined counterterms in the effective field theory - are model-dependent. The whole magic of how exactly string theory connects the low-energy gravity with the high-energy gravity (of black holes) is determined by the details of the models, which are heavily correlated with the rest of the stringy spectrum, i.e. with particle physics.

There is no way to separate even the qualitative predictions beyond (semi)classical GR from knowing something about the right model. Moreover, string theory always predicts some new physics beyond gravity - so a disagreement about it is not a sign of an incomplete theory, but a sign of a wrong model.

"i think it's fair to say that there're many more important problems to study in string theory than simply explaining certain sort of numerical facts."

Maybe, but could you please define more concretely what you think is more important than actual quantitative postdictions/predictions?

Best
Lubos

dear lubos,
i can agree with all the points you made in your last post. let me nonetheless clarify my point of view more carefully.
first, i'm not a big fan of having too many vacua, so the KKLT models are toy models, of relevance more as a proof of existence than as the final fantasy. they got quite a few things correct, although the embedding of inflation is less successful.
using QCD as an example, i wanted to stress that we may want to look within string theory for the kind of evidence that put QCD on its firm footing. i.e., we might find something. of course, i don't know what to look for, except the things now on the market, namely cosmic strings and potential TeV signatures.
setting aside the potential of running into something else interesting by studying less phenomenologically interesting theories, such as non-critical ones, we may remember that black holes are not fully under control. one can say we have a general picture, but there are many things we don't understand, and there are now quite a few novel (not necessarily correct) suggestions. i'm sure people at harvard are very interested in this too.
i'm actually personally rather pessimistic about the model-building efforts. mostly one does not learn new physics, but if one takes the position that we can't learn much more new stuff otherwise, then i consider it a fair thing to try. i also doubt that all the effort put into finding the finest signatures of a model pays off hugely, esp. when there are so many semi-viable models and we're certainly not sure which one is closer to the real world.
finally, the HH/WDW point of view is a very important one and definitely deserves further study, esp. if it holds the potential of picking out some vacua correctly.