As we announced one week ago, Frederik Denef and Michael Douglas present their proof that finding the right vacuum in the "landscape" is an NP-hard problem. They're very smart guys and the paper is very interesting, but let me admit that I don't believe a word of it.
First of all, they admit that such a proof can't be rigorous. Indeed, it can't. The main reason is that NP-hardness is an asymptotic notion that can only be defined for N going to infinity. But the various interpretations of N, such as the number of Calabi-Yau fourfold topologies that are relevant for particle physics and/or the homology classes within these manifolds, are always finite numbers. This prevents us from defining the limits and from asking the question in the first place.
The very notion of approximating the number of relevant Calabi-Yau topologies, or any analogous observable, by an infinite number is a misstep, I think. The number of physically relevant topologies must always be treated as a number of order one, by the very definition of the word "relevant", much like the number of physically relevant bound states describing the hydrogen atom. We don't yet know what scheme organizes the vacua and tells us which ones are relevant (the counterparts of the low-lying states) and which ones are not, but whatever the answer will be, it will make the number of relevant ones comparable to one.
We can only treat the number of objects or states as infinite if we actually don't want to distinguish between them, like the microstates in statistical physics, but this is clearly not the case for the vacua of string theory, which we definitely do want to distinguish from each other.
In order to estimate the time needed for the calculation, and their result is something like 100 million CPU-years (which could incidentally take just a few months with all the computers we have on Earth today), they must make various assumptions, and I find most of these assumptions suspicious, to say the least. The main assumption is, of course, that we live in one of the generic vacua among the 10^{500} vacua, one that does not really differ from the others. Of course, if you assume that, then the optimal search for the correct vacuum can't be much better than trying the vacua one by one, giving enormous times.
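As a sanity check of the parenthetical claim above, the arithmetic is trivial; note that the ~10^9 figure for the number of computers on Earth is my own rough assumption, not a number from their paper:

```python
# Back-of-the-envelope check: 100 million CPU-years spread over the
# world's computers (the 10^9 count is an assumption for illustration).
cpu_years = 1e8          # estimated serial cost of the vacuum search
computers = 1e9          # assumed number of CPUs available worldwide
years = cpu_years / computers
months = years * 12
print(f"{months:.1f} months")   # roughly a month of wall-clock time
```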
The more the vacua differ from each other in qualitative ways, the more you can organize them, the more you can divide the calculation into pieces, and the faster you can find the correct one.
I feel that their derived result, that the right vacuum can't be searched for, contains exactly as much information as the assumptions. If you assume that physics is an unpredictable jungle of possibilities that can't be distinguished and discriminated among, then it is not surprising that you can derive that you can't find the right possibility quickly enough.
But possibilities in physics, as long as it is science, can always be distinguished, and it is always possible to converge towards the correct answer step by step, so that each step takes a very reasonable amount of effort. In some sense, their philosophy is remotely similar to various statements that Gödel's theorems constrain what can be derived in physics. The conclusion of the "Gödelian" reasoning is, of course, that almost nothing can be derived or proved in physics. All the people whom I have met who claimed that Gödel's theorems are relevant for physics may safely be classified as crackpots. In fact, most of them also interpret the uncertainty principle as a proof that physics can't be an exact science. Frederik and Michael are not doing exactly the same thing, but the similarity is hard to overlook.
Later I noticed that they mention this analogy, too. So let me correct the previous statement: there are at least two physicists who are not crackpots and who believe that the Gödelian ideas may be relevant for physics.
An example of the old-fashioned approach to physics that works is the paper right after theirs. Braun, He, and Ovrut show that the SU(4) bundle in their heterotic MSSM is slope-stable, which is a good thing. (A debate about whether there exists a stable bundle in the hidden sector erupted in the fast comments.)
Obviously, models whose individual properties and predictions can be tested don't care about "theorems about the inability to calculate". Why don't they have to obey the theorems that "the calculations are impossible"? Well, it's simply because the calculations in their case obviously are possible. It does not take 100 million years for Braun, He, and Ovrut to find the arguably correct Calabi-Yau topology that leads to the right spectrum. It takes them a few months. It takes a few months to find the correct pure MSSM bundle and a few months to calculate the Yukawa couplings at tree level. Then it takes them a few weeks (impressive guys, aren't they?) to prove that the bundle is stable, and I don't know how many more months they need to calculate the masses of all elementary particles. But it is likely to be closer to a month than to 100 million years. ;)
Frederik and Michael's proof that the calculations can't be done only applies to the models in which physical calculations can't be done, and for this subclass, the statement is a tautology.
For other vacua, namely the physically interesting ones, their statement seems manifestly incorrect. Physically interesting vacua always have some features that allow us to localize them, identify them as the physically more interesting ones, and study their properties. There exists a hierarchical structure of the vacua that allows us to organize them and to find and review their properties using a finite (and reasonable) number of letters. For example, one can try to see whether the correct gauge groups and the desired fermionic spectrum are obtained. When you look at this question, you conclude that you should study heterotic strings on Calabi-Yau threefolds rather than F-theory on generic Calabi-Yau fourfolds, because they're simply more likely to give you the right physics.
One can continue with many similar considerations and localize the correct model with ever increasing accuracy. The result is that it takes a short enough time instead of an exponentially long one, just as it takes a finite time for a letter to reach the correct mailbox despite the large number of mailboxes in the world. The letter first gets to the right continent, then to the right country, the correct city, street, and mailbox. The grains of sand in the Sahara are not organized into cities and streets, but we usually don't send them mail anyway, so it does not matter. Whenever you see no patterns and features in your ensemble XY of objects or backgrounds, even though you are expected to see some patterns, it more or less proves that you are looking at the wrong thing.
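The mailbox analogy can be made quantitative with a toy calculation (my own illustration, not a computation from the paper): brute force over the oft-quoted 10^500 vacua takes 10^500 trials, while a hierarchy of qualitative yes/no distinctions, each roughly halving the candidate set, takes only about log2 of that:

```python
import math

num_vacua = 10**500                     # the oft-quoted landscape size

# Brute force: try candidates one by one -> ~num_vacua steps.
brute_force_steps = num_vacua

# Hierarchical search: each qualitative feature (gauge group, spectrum,
# topology class, ...) roughly halves the candidate set, like routing a
# letter continent -> country -> city -> street -> mailbox.
hierarchical_steps = math.ceil(math.log2(num_vacua))

print(hierarchical_steps)   # about 1661 yes/no distinctions in total
```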
Another criticism, discussed in the fast comments, is the religious treatment of the cosmological constant, which is just one parameter, and probably not the best one for locating the correct model. But even if we look at the cosmological constant only, I think that Michael and Frederik are very far from proving that we can't find, e.g., the model with the minimal positive cosmological constant after a reasonable amount of time. In reality, the contributions to the cosmological constant are likely to be hierarchical, and we may try to cancel the bigger contributions first and the smaller contributions later. By organizing the calculation in this way and by dividing the reasoning into many steps, the time needed to find the right vacuum will probably shrink dramatically. Their "no-go" theorem looks like an artifact of a lack of imagination, and similar "science can never do XY" theorems have always been refuted by new progress. It is reasonable to expect that the vacuum selection won't be any different.
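The "cancel the bigger contributions first" strategy can be sketched with a toy model (entirely my own illustration, not the Denef-Douglas setup): if the contributions live at hierarchical scales, you fix the coefficient of the largest scale first and descend, paying one small choice per scale instead of an exhaustive scan over all combinations:

```python
# Toy model: hit a target by cancelling the largest-scale contribution
# first, then the next, and so on. The cost is one small choice per
# scale rather than a combinatorial search over all of them at once.
def greedy_cancel(target, scales):
    remainder = target
    choices = []
    for s in sorted(scales, reverse=True):     # biggest contribution first
        c = round(remainder / s)               # best integer multiple of s
        choices.append(c)
        remainder -= c * s
    return choices, remainder

scales = [10**k for k in range(6)]             # hierarchical scales 1..10^5
choices, leftover = greedy_cancel(123456.7, scales)
print(choices, leftover)   # leftover is at most half the smallest scale
```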
The last paper is "Emergent gravity" by Jack Sarfatti, whose title is similar to Seiberg's recent paper on "Emergent spacetime". While Seiberg's paper contains the lore that most of us obviously agree with, Sarfatti's paper is much more revolutionary. It also proves the infinite tolerance of Paul Ginsparg with respect to original ideas. ;)
Prof. Sarfatti derives (...) the world [sic] hologram conjecture as the Aharonov-Bohm effect. The area quantization law, like the law in loop quantum gravity, is derived (...) from the single-valuedness of the inflation [sic] field. These breathtaking discoveries may be used to find extraterrestrial civilizations, especially those who live in the "level II hyperspace" and who contacted Prof. Sarfatti when he was 13. Jack Sarfatti thinks that the anthropic landscape is the most important recent discovery in physics, apparently because it allows him to prove his theories about the "level II hyperspace". ;)
Prof. Sarfatti, if you don't like this description, please don't complain, otherwise I will have to ask Jimbo Wales for assistance. And congratulations on your preprint. ;)
snail feedback (1):
Dear Lubos
First, new geometric transitions in IIA (a flop inside a G_2 manifold with torsion) relate D-branes wrapped on cycles of a non-Kähler manifold to fluxes on another non-Kähler manifold. This enriches the landscape picture. The non-Kählerity is proportional to the CC.
Second, non-geometric spaces are good string backgrounds which necessarily arise in string theory (non-geometric string backgrounds arise from mirrors, T-duals, S-duals and U-duals of flux compactifications, from reductions with duality twists, and from symmetric orbifolds). So generic string compactifications are non-geometric in any duality frame.
Third, there are uncountably many non-SUSY vacua, and you should weigh their relevance because of a conservation law due to the principle of quantum superposition.
Yes, you should not even try to find the right vacuum, because the required time will naturally exceed the recurrence time of the Universe, but you can try to find the interference of the superposition of all quantum-consistent vacua with a hypothetical quantum computer ;)
Best, planckeon