Monday, January 09, 2012

Compositeness and SUSY at LHC

Attempts to import Seiberg duality from string-friendly research to routine model building

The first hep-ph paper on the arXiv today is called

Light Stops from Seiberg Duality.
Csaba Csáki (Cornell), Lisa Randall (Harvard), and John Terning (UC Davis) propose a novel framework that could address the hierarchy of Yukawa couplings (why some fermions are much lighter than others), the little hierarchy problem (a small relic of the hierarchy problem that seems to survive even with SUSY or other solutions to the normal, "big" hierarchy problem), and the hierarchy in the squark masses that is apparently needed both to solve the hierarchy problem and to agree with the observed absence of supersymmetry signals at the LHC so far.

The new twist in their story is compositeness, the idea that the elementary particles of the Standard Model aren't quite elementary, and Seiberg duality, a seemingly abstract but, in this context, phenomenologically useful duality found in 1994 that exchanges electric and magnetic descriptions of \({\mathcal N}=1\) gauge theories and changes the gauge group along the way.

The composite particles transform as bifundamental representations which you may visualize as open strings stretching between two stacks of 6 D-branes.




Needless to say, all these features are related and they're required to make the model work naturally. The authors only focus on versions of these ideas that are compatible with the observed constraints from the LHC; moreover, they note that a Higgs mass around 125 GeV is obtained in these models automatically, without fine-tuning.

The lightest superpartner is a light stop or a light neutralino. The first choice may face some problems with the nature of dark matter, an issue they don't discuss (stops can't be dark matter because they're charged, so they had better be unstable), but it would be very interesting because in that case they would predict a stop mass very close to the top mass, 173 GeV, and such a degenerate top-stop pair would be hard to detect, resembling the "stealth supersymmetry" concept.



Csabajka, čabajka, or Hungarian sausage ;-). Like his two co-authors, Csaki also has books at amazon.com, but I needed a picture for this blog entry, too.

There are many cute ideas used in the paper. One of them is Seiberg duality. It's a form of electromagnetic duality analogous to the \(SL(2,\ZZ)\) Montonen-Olive duality of the \({\mathcal N}=4\) gauge theory (which may be thought of as a symmetry inherited from type IIB string theory whose D3-branes have the right low-energy dynamics). It was found by Seiberg around 1994, when he also worked, with Witten, on related issues in the intermediate case with \({\mathcal N}=2\) supersymmetry. In the \({\mathcal N}=2\) case of Seiberg and Witten, one has to deal with complicated monodromies as well. From this viewpoint, Seiberg duality, with its simple exchange of two theories and a \(g\to 1/g\) U-turn for the gauge coupling, is closer to the simple \({\mathcal N}=4\) case.
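
To see what the \(g\to 1/g\) statement looks like in formulas (a textbook aside, not anything specific to today's paper), one packages the gauge coupling and the theta-angle into the complexified coupling on which \(SL(2,\ZZ)\) acts by fractional linear transformations,

\[
\tau = \frac{\theta}{2\pi} + \frac{4\pi i}{g^2}, \qquad \tau \to \frac{a\tau + b}{c\tau + d}, \qquad \begin{pmatrix} a & b \\ c & d \end{pmatrix} \in SL(2,\ZZ).
\]

The S-generator \(\tau\to -1/\tau\) reduces, at \(\theta=0\), to \(g\to 4\pi/g\), i.e. the strong-weak exchange that is schematically quoted as \(g\to 1/g\).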

Seiberg duality claims the low-energy equivalence of two QCD-like gauge theories. Both of them have \({\mathcal N}=1\) supersymmetry and \(N_f\) flavors. However, they have different gauge groups, \(SU(N_c)\) and \(SU(N_f-N_c)\). Note that the number of colors gets reflected about the central value \(N_c=N_f/2\) when you pass to the dual description. The qualitative behavior of these theories (and especially the sign and value of the beta-function) depends primarily on the ratio \(N_f/N_c\). For \(N_f/N_c=3/2\), we encounter the lower end of a "conformal window" and that's exactly where the authors see the most viable models.
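
If you want to play with the elementary bookkeeping behind these statements, here is a minimal Python sketch (my own illustration, not code from the paper; the example values of \(N_c,N_f\) are arbitrary) that prints the dual gauge group \(SU(N_f-N_c)\), the one-loop beta-function coefficient \(b_0=3N_c-N_f\) of SUSY QCD, and whether \(N_f/N_c\) falls strictly inside the conformal window \(3/2<N_f/N_c<3\):

```python
# Toy bookkeeping for N=1 SUSY QCD with gauge group SU(N_c) and N_f flavors.
# Illustration only; not code from the Csaki-Randall-Terning paper.

def seiberg_dual_colors(n_c: int, n_f: int) -> int:
    """Number of colors of the dual ("magnetic") gauge group SU(N_f - N_c)."""
    return n_f - n_c

def one_loop_b0(n_c: int, n_f: int) -> int:
    """One-loop beta-function coefficient b0 = 3*N_c - N_f;
    the theory is asymptotically free when b0 > 0."""
    return 3 * n_c - n_f

def in_conformal_window(n_c: int, n_f: int) -> bool:
    """True when 3/2 < N_f/N_c < 3, the range where the theory flows
    to an interacting IR fixed point."""
    return 1.5 < n_f / n_c < 3.0

if __name__ == "__main__":
    # (4, 6) sits exactly at the lower edge N_f/N_c = 3/2,
    # so the strict window test reports False for it.
    for n_c, n_f in [(4, 6), (4, 7), (3, 7)]:
        print(f"SU({n_c}) with {n_f} flavors: "
              f"dual group SU({seiberg_dual_colors(n_c, n_f)}), "
              f"b0 = {one_loop_b0(n_c, n_f)}, "
              f"conformal window: {in_conformal_window(n_c, n_f)}")
```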

The simplest story about the relationship of supersymmetry and the hierarchy problem used to say that all superpartner masses (or their difference from the normal particle masses) should be just a bit higher than the Higgs mass, so that the cancellation guarantees that the Higgs remains light. But this "brutal" constraint on the lightness of all the superpartners is just a rough, sufficient condition for the cancellation to occur; it is not necessary for SUSY to solve the hierarchy problem. What we actually need isn't that stringent.

Dividing superpartners into castes

It turns out that the Higgs mass only cares (substantially) about the masses of higgsinos, gauginos, both stops, and the left-handed sbottom. The remaining superpartners don't substantially affect the natural unbearable lightness of Higgs' being. And it turns out that all the relevant particles are composite or "mostly composite" in the new model: you need Seiberg's dual magnetic description to understand where they come from! Be ready to work with bifundamental representations of \(SU(6)_1\times SU(6)_2\) if you want to read the paper. Moreover, you will need \(SU(4)\) instead of \(SU(3)\) as the strong group; I am still a bit confused about this elementary point.
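
For readers wondering why it is exactly these superpartners that matter, recall the standard leading-log estimate (a textbook formula, not anything specific to this paper) for the top-stop correction to the up-type Higgs soft mass,

\[
\delta m_{H_u}^2 \simeq -\frac{3 y_t^2}{8\pi^2}\left(m_{\tilde Q_3}^2 + m_{\tilde u_3}^2 + |A_t|^2\right)\ln\frac{\Lambda}{m_{\tilde t}}.
\]

The stops dominate because of the large top Yukawa \(y_t\); the left-handed sbottom sits in the same doublet \(\tilde Q_3\) as the left-handed stop, so its mass is tied to the same soft term; the gluino feeds into the stop masses at the next loop order; and the higgsino mass enters the Higgs potential already at tree level through \(|\mu|^2\). Superpartners with tiny Yukawa couplings to the Higgs contribute negligibly.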

The reason why they can divide the superpartners into important and unimportant ones is that – and this is a key new observation of today's paper – at the leading order, the composite particles' masses are unaffected by the supersymmetry-breaking scalar and gaugino masses. This only applies if the number of flavors and colors places you near the edge of the "conformal window" I mentioned above.

You may see that the model brings some new "qualitative structure". There seem to be more complicated gauge groups than we have in the Standard Model; and some particles we have considered elementary become composite. For these reasons, you might think that the model is a bit "contrived" or "arbitrary". However, it's only the qualitative properties that are being "fine-tuned" in this way. The continuous parameters, which may be much more important, seem to take the empirically required values much more naturally once you accept the qualitative assumptions.

Although the complexity of the "qualitative rules" may look unpleasant, I do think that pretty generally, explaining some hierarchies (continuous fine-tuning) by unexpected "qualitative choices" represents progress. So I hope that the paper will be read and followed by many more people than those who studied some of the previous "Seiberg duality" phenomenological papers.

Features that repel physicists from similar papers

There is one reason why many phenomenologists, including the supersymmetric ones, may be paying too little attention to papers involving Seiberg duality: they don't know what it is. Seiberg duality may look too stringy. After all, Seiberg could be counted as a formal string theorist, too. But be sure that he is mainly a field theorist and despite the important applications of Seiberg dualities in string theory and its "unrealistic" vacua, everything about Seiberg duality is about quantum field theory. So every phenomenologist, especially a SUSY phenomenologist, should surely try to learn what it is. It is exciting.

(Seiberg and Witten knew about the drop in readership that a reference to a stringy idea causes, so they carefully dropped all such references in their \({\mathcal N}=2\) papers around 1994, even though many of these ideas were found by thinking about the stringy realization of these systems and many later realizations of the theories within string theory shed new light on them.)

The aspects of similar papers that sometimes annoy me are somewhat different. I am kind of scared by one construction that is omnipresent in a class of papers – namely by the embedding of gauge groups into global symmetries, by weakly gauged symmetries, and by a few related concepts. The ultimate models probably make complete sense once all the dust settles but this way of describing how they're "constructed" sounds horrifying to me. Global symmetries and gauge symmetries are qualitatively different. If a subgroup of a global symmetry is gauged and the rest is not, then the global symmetry can't be a symmetry because it mixes qualitatively different generators – the gauged ones and the ungauged ones. ;-) Moreover, a "weakly gauged" group could physically be forced to correspond to a group that is gauged but given a tiny coupling constant, and each such group could represent an added fine-tuning (for the tiny new coupling) which never seems to be "counted" in the unnaturalness of similar models. But maybe this criticism of mine is irrelevant for some reason I don't quite see.

At any rate, one may see that the model builders are already adapting and adjusting to the results of the first serious year at the LHC, namely 2011. They have to; it's the nature of their subfield. Of course, formal theorists and string theorists have been affected by the results of the LHC only minimally; their focus is on much more long-term, ambitious, lasting, higher-energy questions.

