Tuesday, May 05, 2009

Predictions of local M-theory and F-theory models

Today, I consider the first paper by Jacob Bourjaily, Nima's student at Princeton,

Effective field theories for local models in F-Theory and M-Theory,
to be the best hep-th paper on the arXiv. The author describes a semi-systematic procedure to determine the allowed spectra and couplings - predictions - from local M-theoretical and F-theoretical models. Because Cumrun Vafa was just elected to the National Academy of Sciences (yes, I took the picture), it's natural to discuss this topic again.

Why local models

Because gravity is so weak, i.e. because the Planck energy scale is so high, it makes sense to consider the non-gravitational forces in our world to be approximately decoupled from gravity. If this decoupling has a simple interpretation within string theory, the non-gravitational forces and matter must arise from a local region of the compactified geometry. This means that the vicinity of the relevant locus of the compactification may be approximated by a non-compact geometry.
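To put numbers on "so high" (standard textbook values, just to set the scale - this is my addition, not anything from the paper): the Planck scale exceeds the electroweak scale by some sixteen or seventeen orders of magnitude,

```latex
M_{\rm Pl} = \sqrt{\frac{\hbar c^5}{G_N}} \approx 1.2 \times 10^{19}\ \mathrm{GeV}
\qquad \gg \qquad
M_{\rm EW} \sim 10^{2}\ \mathrm{GeV},
```

so gravitational corrections to particle-physics processes are suppressed by powers of roughly 10^{-17}, which is what makes the decoupled, local treatment sensible.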

In order for the visible sector not to be decomposed into decoupled pieces, the whole sector must arise from a single singular locus. In M-theory, we consider real 7-dimensional manifolds at each point of the visible space. In F-theory, we study real 8-dimensional (complex 4-dimensional) manifolds at each point: in this counting, the two real dimensions of the toroidal fiber are included.

The gauge group always arises from real co-dimension 4 loci (think about the standard ADE singularities); in the F-theory case, two of these transverse dimensions belong to the toroidal fiber so the co-dimension is 2 on the base space (recall Kodaira's classification of singular fibers). This counting means that the gauge group lives on real 3-dimensional (M-theory) or real 4-dimensional (F-theory) cycles in the compactified geometry; in the F-theory case, imagine a gauge multiplet carried by a del Pezzo surface.
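Just to make the counting explicit - this is nothing but the arithmetic behind the previous paragraph, in the standard conventions (a 7-manifold for M-theory, an elliptically fibered 8-manifold with a 6-real-dimensional base for F-theory):

```latex
% ADE singularities have real codimension 4
\text{M-theory:}\quad \dim(\text{gauge locus}) = 7 - 4 = 3
  \quad\text{(a three-cycle in the } G_2 \text{ manifold)}
\\[4pt]
\text{F-theory:}\quad \dim(\text{gauge locus}) = 8 - 4 = 4
  \quad\text{(equivalently } 6 - 2 = 4 \text{ in the base: a complex surface, e.g. a del Pezzo)}
```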

Bourjaily (see his scary CV) argues that in both cases, E_8 is the only possible "master" gauge group - or singularity type - that is powerful enough to incorporate the known forces and matter. He claims that the whole matter spectrum arises from the adjoint representation of the "large" group by a decomposition under a subgroup, a statement that could be technically wrong.
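For concreteness, one standard branching rule of this type - a textbook decomposition often quoted in F-theory GUT model building, not necessarily the precise pattern Bourjaily uses, and written up to conventions for the conjugates - breaks E_8 into an SU(5) GUT factor times a "perpendicular" SU(5):

```latex
E_8 \supset SU(5)_{\rm GUT} \times SU(5)_\perp,
\qquad
\mathbf{248} = (\mathbf{24},\mathbf{1}) \oplus (\mathbf{1},\mathbf{24})
             \oplus (\mathbf{10},\mathbf{5}) \oplus (\overline{\mathbf{5}},\mathbf{10})
             \oplus (\overline{\mathbf{10}},\overline{\mathbf{5}}) \oplus (\mathbf{5},\overline{\mathbf{10}})
```

The Standard Model generations, transforming as 10 + 5-bar of SU(5)_GUT, sit inside the off-diagonal pieces, which is why the adjoint of the "master" group is large enough to contain all the known matter.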

He explains that the light chiral matter imposes strong constraints on the geometry which in turn imply nontrivial constraints on the matter spectrum, especially in the case of M-theory where one N=1 chiral multiplet has to arise from each real co-dimension 7 point in the geometry. In F-theory, one gets full N=2 hypermultiplets from the co-dimension 6 places where the matter fields live (they're therefore massive, unless the right fluxes change this conclusion), and one has more freedom to choose the multiplicities etc.
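The same codimension arithmetic locates the matter - again just spelling out the counting in the paragraph above:

```latex
\text{M-theory:}\quad \dim(\text{matter locus}) = 7 - 7 = 0
  \quad\text{(isolated conical points, one } \mathcal{N}=1 \text{ chiral multiplet each)}
\\[4pt]
\text{F-theory:}\quad \dim(\text{matter locus}) = 8 - 6 = 2
  \quad\text{(a complex matter curve in the base, carrying } \mathcal{N}=2 \text{ hypermultiplets)}
```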

He also explains that many couplings tend to be naturally zero in M-theory and F-theory, even though you would think that there is no (symmetry or related) reason for such a vanishing in field theory. M-theory and F-theory also give you new methods of "non-unification" that nevertheless link the properties of different sectors of matter. These models naturally incorporate a heavy top quark and many other things.

Sociology: AGTs

I think that this guy is extremely smart, creative, and broad. It is very conceivable (and likely) that his knowledge of algebraic geometry is less comprehensive than the knowledge of the big shots in this subfield - the AGTs (algebraic geometry technocrats), a group that includes many ex-students of Cumrun and ex-students of other senior people. ;-)

On the other hand, I am absolutely convinced that this guy sees the big picture and the links between geometry and physics much more clearly than most of the AGTs do (but I am certainly not including Cumrun himself among the AGTs at this point!). So the AGTs should do their best to organize their ideas about the model building according to Bourjaily's template because he's simply a more intuitive, organized thinker who doesn't get easily distracted by irrelevant details. At the same time, he should probably study algebraic geometry more systematically. So should I. ;-)

I am sure that many AGTs think that much of the nice stuff that Bourjaily is writing is just marketing. But they're mostly wrong. This "marketing" is actually the basic framework that has to exist in physics. What many AGTs do is work out some details in very local, relatively small pieces of that framework. That's important, but it clearly can't be the only activity that this part of physics research consists of.

Sociology: anthropic people

Bourjaily's papers also exhibit a striking contrast with the anthropic papers. Bourjaily is still applying "physics inference" that has been so crucial in all major, conceptual developments of science. He observes (or "induces") some qualitative features of the real world and tries to guess which qualitative features of the underlying (stringy) description are likely to be correlated with the observed patterns.

The models that have the right qualitative properties are subsequently investigated and classified in detail, leading to predictions and new possibilities. What's important is that the research is concentrated under the lamppost - namely the lamppost of rational, qualitative arguments. One is not searching a haystack, trying to find a needle by chance; one searches mostly in the places where the concentration of promising models is high.

As our knowledge gets more extensive and more accurate, the light from the lamppost is getting ever more focused. We know what we're doing increasingly accurately and the "measure" determining where we are searching is getting increasingly non-uniform.

What the anthropic people apparently like is the idea that a random (according to a quasi-egalitarian measure) vacuum in the landscape reproduces the observed world "by chance". And they think that their "chance" is higher if they have many candidate vacua. They clearly disagree with common sense: common sense (and Bayesian inference) implies that if a model agrees with the observations just "by chance", it probably means that the model is wrong. ;-)

A model is only promising if it agrees with the aspects of observations that are likely to be more than just a coincidence!

So having many candidate vacua cannot possibly increase their probability of being correct. In the Daily Show segment about the LHC, LHC alarmist Walter Wagner calculated the probability of the world's destruction by the LHC to be 50% because there are two possible answers, Yes and No, so the probability of each had to be 1/2 = 50%. Even John Oliver, the journalist, was able to guess that this was probably not the way probability works. ;-)

However, Wagner's is basically the same understanding of "probability" that the anthropic people are implicitly or explicitly using all the time. It's breathtakingly stupid, too. The number of possible answers has nothing to do with their overall probability. Sometimes, a higher number of detailed realizations of an answer may increase the probability of this answer. In other cases, it's the other way around: having too many choices implies that none of them is right. If you're searching for the king of a country, he's likely to be the guy who looks different from the 10 million other people rather than one of those 10 million who look alike! In some cases, more is definitely less.
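A toy Bayesian calculation makes the point quantitative. This is only an illustrative sketch with made-up numbers - it is not taken from this post or from any paper:

```python
# Toy Bayesian comparison of two "theories" given one observed outcome:
#  - a sharp theory that predicts exactly the observed outcome,
#  - a diffuse theory whose prediction is spread over N equally likely outcomes
#    (a crude stand-in for "many candidate vacua").
# All numbers are made up for illustration.

def posterior_sharp(prior_sharp, n_outcomes):
    """Posterior probability of the sharp theory once its predicted outcome is observed."""
    prior_diffuse = 1.0 - prior_sharp
    likelihood_sharp = 1.0                  # the sharp theory predicted exactly this outcome
    likelihood_diffuse = 1.0 / n_outcomes   # the diffuse theory allowed N outcomes equally
    evidence = prior_sharp * likelihood_sharp + prior_diffuse * likelihood_diffuse
    return prior_sharp * likelihood_sharp / evidence

for n in (2, 10, 10**6):
    print(f"N = {n:>7d}:  P(sharp | data) = {posterior_sharp(0.5, n):.6f}")

# The larger N is, the more the diffuse theory is disfavored: merely having many
# candidate answers never increases the probability that one of them is right.
```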

A rational approach to determining which answers are correct must obviously care about the detailed physical questions - qualitative features of the spectrum, exact and approximate symmetries, the smallness or largeness of individual couplings, and the correlations between all these features, among other things. Whoever is trying to find the right theory or the right vacuum while paying no attention to these real dynamical issues is clearly not doing physics properly.

And that's the memo.


snail feedback (2) :


reader downquark said...

Hi,

I have a question regarding the anthropic principle.

Theologians often refer to the physical constants in this regard as proof of "fine tuning". This seems to me to implicitly assume that the constants are less fundamental than the structure of the equation itself. Is this logically justified?

It seems to me you could just as easily say the structure of the equation could be different, at which point all bets are off for the number of possible universes.


reader Lumo said...

Dear downquark, a good question. One must distinguish two cases.

In incomplete and approximate theories ("effective theories"), the equations and basic parameters are independent things. One needs both the qualitative form of the equations as well as the precise numerical values of the parameters that can be adjusted independently of the shape of the equations.

In deeper, more accurate, more "microscopic" descriptions, the number of independent parameters is typically smaller. The properties of atoms are no longer independent in quantum physics (unlike in the 19th century, when the properties of all materials were thought to be independently adjusted), being derivable from one equation and a few parameters describing electrons. In string theory, all continuous parameters are ultimately calculable.
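A textbook illustration of this point (my example, not part of the original comment): the hydrogen spectrum, once described by an independently measured Rydberg constant, follows in quantum mechanics from the electron's mass and charge alone,

```latex
E_n = -\frac{m_e e^4}{2\hbar^2 n^2} \approx -\frac{13.6\ \mathrm{eV}}{n^2},
\qquad n = 1, 2, 3, \ldots
```

(in Gaussian units), so a whole family of formerly independent numbers - all the spectral lines - is derived from a couple of more fundamental parameters.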

So if one uses the effective theories where equations and parameters are independent, I would agree with you that the constants are "equally fundamental" and Nature was "equally free" to adjust the form of the equations as well as the parameters for these equations.

However, in more accurate theories, the parameters are really calculable from equations that have fewer (or no) adjustable parameters. In this case, the equations are fundamental and the parameters are less fundamental because they're derived. You can't get any value of parameters you want - especially because the number of "qualitatively different" types of equations is countable, and if you want to avoid really contrived ones, it is essentially "finite".

So they can't give you all of the continuously (uncountably) infinitely many values of the parameters that could otherwise be adjusted in the first place.

As has been said many times, it is plausible that for some parameters, the existence of life or intelligent life is the only criterion their values had to satisfy - whether you imagine Nature or God to be the "selector" has pretty much no effect on the logic of these physical considerations.

In many cases, it's been shown that these parameters are essentially random - like the Sun-Earth distance - consequences of historical coincidences.

However, in hundreds of other, more fundamental cases, it's been already shown that the values of the parameters are, in fact, calculable from more fundamental equations (with fewer parameters), even though they used to be considered as independent (and independently adjusted for life/humans to exist, by God or Nature).

That's why a more "specific" explanation or justification can always be found and one can never be certain that the anthropic (non)explanation is the final word.