## Friday, February 11, 2005

### Recent talks: Berkovits, Tegmark, Arkani-Hamed

Recently there have been several interesting talks at Harvard. Let me start with the postdoc journal club.

• On Wednesday night, Nathan Berkovits explained his pure spinor formalism. He started by describing the advantages and disadvantages of the RNS and Green-Schwarz formalisms. The superparticle is a good starting point for his pure spinor formalism, and it already has many features that are also relevant for the superstring. For example, the polarizations of the gluon supermultiplet are described by a spinor-valued function A_alpha(X,theta) on superspace - this has 16 x 65536, i.e. roughly a million, components to start with, even for the open string. The situation for the closed string is this squared - i.e. more than a trillion components. ;-) Nevertheless, the condition "lambda^a lambda^b D_a A_b(X,theta) = 0" is enough to kill all the degrees of freedom except for the 16 open (or 256 closed) physical modes. Here "lambda^a" is a pure spinor ghost - i.e. a spinor whose only non-vanishing bilinear is the middle-dimensional one, "lambda_a gamma_{jklmn}^{ab} lambda_b". Nathan explained that pure spinors were already introduced by Cartan to describe possible complex structures in "2k" dimensions: the coset "SO(2k)/U(k)" is isomorphic to the space of projective pure spinors in "2k" dimensions. Nathan then discussed the BRST charge and its cohomology, and tried to answer many of our questions about the relation of his BRST charge to the conformal symmetry - this relation is different from the RNS formalism because Nathan's BRST charge does not follow from a canonical gauge-fixing procedure. Finally, he wrote down a pure spinor Lagrangian for the "AdS5 x S5" background that - as he can prove - is a consistent conformal field theory. However, so far it has been difficult to extract useful information from this model.
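The component counting quoted above is easy to check. A minimal sketch, assuming the standard ten-dimensional counting (a Majorana-Weyl spinor index runs over 16 values, and there are 16 Grassmann coordinates theta):

```python
# Component counting for the superfield A_alpha(X, theta) in D = 10.
# Assumption: alpha is a Majorana-Weyl spinor index (16 values), and the
# expansion in the 16 Grassmann coordinates theta^a has 2^16 terms,
# since each theta^a is either present or absent in a given monomial.
spinor_components = 16
theta_terms = 2 ** 16

open_string_components = spinor_components * theta_terms
closed_string_components = open_string_components ** 2  # closed ~ (open)^2

print(open_string_components)    # 1048576, i.e. roughly a million
print(closed_string_components)  # 1099511627776, i.e. more than a trillion
```

Almost all of these components are then eliminated by the constraint and gauge invariance, leaving only the physical modes.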

Thursday is the usual day of the Duality Seminars - this semester they are dedicated to cosmology and to the interface between cosmology and string theory.

• Yesterday, Max Tegmark from MIT gave an impressively high-tech Cosmological Duality Seminar about the cosmological parameters and their measurement. With his Apple laptop, he showed the Universe and the stars at all possible distance scales. He did a very good job of defending the idea that the funding for cosmology should grow, and that the whole NASA budget should not be sent to Mars. Max presented animations showing how the CMB anisotropy as a function of the multipole moment - as well as the visible matter density in the Universe at various scales and various other comparably important "graphs" describing the Universe - depends on the basic cosmological parameters: each of them seems to influence the patterns in a different way. It's impressive how cosmology is becoming a high-precision science. On the other hand, Max emphasized that we should not spend the rest of our lives determining the parameters with ever higher accuracy. We should find relations between them - a goal mostly for the theorists.

Finally, Nima gave a captivating and provocative family lunch seminar today about some important grand questions and his approach to them:

• He described his paper with Savas and Shamit about their friendly, predictive neighborhoods in the landscape. The toy models they consider are effective field theories with "N" real scalars. (Bobby Acharya et al. recently argued that a friendly neighborhood could appear in the M-theoretical landscape on G2 or Einstein seven-manifolds.) The potential in their toy model is a sum of double-well quartic functions, one for each scalar field. This means that each scalar field doubles the number of vacua - the total number of vacua is "2^N". Nima argues that if one assumes that the values of the observable parameters can be written as a constant plus small corrections from each of the scalar fields (which is another big assumption, I think), then the central limit theorem shows that the fluctuation of each observable across the vacua will be of relative order "1/sqrt(N)". In the limit of infinitely many vacua, he says, we don't get a landscape where everything is undetermined. On the contrary, everything is determined with a relative error of order "1/sqrt(N)". One can get predictivity back, given certain assumptions! What are his new rules to deduce new physics? I've been trying to get a well-defined answer for quite some time, and I can only claim partial success - which can of course be caused by my being slow. Today, Nima formulated the rule pretty clearly:
• A new particle, field, or mechanism may only be added to our description of Nature if it is necessary for avoiding one of the "major" disasters - such as the destruction of galaxies or atoms - the disasters one often discusses in the framework of anthropic reasoning.
• Obviously, one can raise the following objection:
• Isn't the new prescription for physics already ruled out? The top quark, for example, seems unnecessary for avoiding any of these low-energy disasters. This kind of anthropic thinking would have predicted that the top quark did not exist - and therefore Melissa Franklin et al. falsified your paradigm 10 years ago.
• What is Nima's answer to this question?
• We are not interested in these questions. We only want the neighborhood of the Standard Model. It's an assumption that we keep the broad features of the real world that are already known intact.
• Well :-), it's OK not to be interested in these questions, but they seem necessary to define what exactly the new principle constraining physics is. Obviously, the principle must be refined to avoid the contradiction mentioned above. The refined version says something like:
• Proposals for new physics that go beyond the insights known in 2005 - which is treated as an exceptional moment in the history of the Universe (and perhaps of humankind) - must be justified by their necessity for avoiding major global disasters such as the destruction of galaxies, nuclei, or atoms. Occam's razor must be applied - and the only way a proposal for new physics can survive Occam's razor is an anthropic argument.
Well, when one formulates the new principle in this more accurate way, it seems slightly less convincing because of its special emphasis on the present state of knowledge. Why should we believe that such a principle is a good guide in the search for new physics if the same principle, applied at earlier stages of the development of physics, is known to lead to incorrect conclusions?
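By the way, the "1/sqrt(N)" scanning claim from the friendly-landscape toy model above is easy to illustrate numerically. A minimal Monte Carlo sketch - with my own toy normalization, not the precise model from the paper: each of the N fields sits in one of its two minima and shifts an observable by plus or minus 1/N around a baseline value of 1, and we measure the relative spread of the observable over many randomly chosen vacua:

```python
import random
import statistics

def relative_spread(n_fields, n_vacua=20000, seed=0):
    """Relative spread of a toy observable over randomly chosen vacua.

    Each field contributes +1/n_fields or -1/n_fields depending on which
    of its two minima the vacuum uses, on top of a baseline value of 1.
    (A toy normalization chosen purely for illustration.)
    """
    rng = random.Random(seed)
    samples = [
        1.0 + sum(rng.choice((-1.0, 1.0)) / n_fields for _ in range(n_fields))
        for _ in range(n_vacua)
    ]
    return statistics.pstdev(samples) / statistics.fmean(samples)

# The spread falls like 1/sqrt(N): quadrupling N should roughly halve it.
for n in (25, 100, 400):
    print(n, round(relative_spread(n), 4))
```

With this normalization the exact spread is 1/sqrt(N), i.e. 0.2, 0.1, and 0.05 for N = 25, 100, 400, and the sampled values land close to those numbers - so in the large-N limit every observable of this type is sharply determined, exactly as the central-limit-theorem argument says.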

Don't get me wrong. I like simple models and Occam's razor, and we always use simplicity as one of the rather important arguments. However, many things in our Universe are simply not minimal. In fact, according to Gell-Mann's totalitarian principle, everything that can occur in agreement with the symmetries will occur. This principle is usually used to allow all possible terms in the action involving given fields, but its moral generalization can also be used to allow for new fields and particles.

This observation implies that the requirement of simplicity is not a universal principle that can always be safely trusted when we try to pick the correct theory. It is often useful, but it is not universally valid. No doubt Nima agrees. Yet Nima et al. seem to suggest that exactly for physics beyond the Standard Model, this principle of simplicity should suddenly become more important and more reliable than before. Also, beyond the Standard Model, anthropic arguments should suddenly become more important - or even omnipotent. But I don't see a rational argument behind this approach. Simplicity, anthropic arguments, technical arguments, naturalness, and other considerations all remain valid arguments, and assigning them weights is a matter of intuition and personal psychology. I don't see why the Standard Model should be the critical line beyond which we should suddenly change the way physics is done and which arguments are important.

In this sense, I would probably guess that the resulting "minimal" models - and especially the new "anthropic sector" that is used to explain the cosmological constant and the Higgs mass - have a probability of being correct comparable to that of any other models supported by other, less anthropic principles - except for the obvious fact that the authors of the paper are big shots, whose probability is always a bit higher. ;-)

#### snail feedback (3):

Lubos - Isn't it pitiful that most of the clowns who keep working on "predictions" and "interpretations" of superstring theory wouldn't even be able to compute the Veneziano amplitude, while a whole generation of serious young thinkers has been led out of town by Ed the Piper?

Jean-Paul