Thursday, December 15, 2011

Implications of a 125 GeV Higgs for SUSY

The results of the 2011 Higgs searches at the LHC have been announced, and phenomenologists have begun to write papers about the implications of the findings for their models.

In the case of the Standard Model, which contains nothing new beyond the freshly (and so far only tentatively) discovered Higgs, the implications are simple. At 125 GeV, the Higgs potential becomes unstable at a scale between \(10^{13}\,\,{\rm GeV}\) and \(10^{20}\,\,{\rm GeV}\) [formally]. A fresh paper from today (using two-loop equations) puts the instability scale at a value as low as \(10^{9\dots 11}\,\,{\rm GeV}\) even for 126 GeV. We don't know the exact value because it sensitively depends (especially) on the masses of the top quark and the W boson, which are not known very accurately, either.

And the same holds for the Higgs mass, which is also not known very accurately. It just seems much more likely than not that the vacuum gets destabilized below the Planck scale for a 125 GeV Higgs; if you count this instability as an inconsistency, the very existence of a 125 GeV Higgs implies that there have to be new particles beyond the Standard Model with masses much lower than the Planck scale (but not necessarily light enough to be seen by the LHC).
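To make the previous two paragraphs slightly more concrete, here is a toy sketch (in Python) of the one-loop running of the Standard Model Higgs quartic coupling; the instability scale is roughly where that coupling crosses zero. All inputs below are rough assumptions, and the sketch uses only tree-level matching and one-loop beta functions, so its output can easily differ from the careful two-loop numbers quoted above by orders of magnitude; it is only meant to illustrate the mechanism and its sensitivity to the top mass.

```python
import numpy as np

# Toy one-loop renormalization-group running of the Standard Model Higgs
# quartic coupling "lambda"; the (meta)stability scale is roughly where
# lambda crosses zero.  Inputs at mu = m_t are rough assumptions and the
# matching is tree-level only, so the resulting scale is merely indicative.

MT, V, MH = 173.1, 246.0, 125.0   # GeV: top pole mass, Higgs vev, Higgs mass

def betas(y):
    """One-loop SM beta functions; y = (lambda, y_t, g3, g2, gY)."""
    lam, yt, g3, g2, gY = y
    k = 1.0 / (16.0 * np.pi ** 2)
    b_lam = k * (24 * lam**2 - 6 * yt**4
                 + 0.375 * (2 * g2**4 + (g2**2 + gY**2) ** 2)
                 + lam * (12 * yt**2 - 9 * g2**2 - 3 * gY**2))
    b_yt = k * yt * (4.5 * yt**2 - 8 * g3**2 - 2.25 * g2**2
                     - (17.0 / 12.0) * gY**2)
    b_g3 = k * (-7.0) * g3**3
    b_g2 = k * (-19.0 / 6.0) * g2**3
    b_gY = k * (41.0 / 6.0) * gY**3
    return np.array([b_lam, b_yt, b_g3, b_g2, b_gY])

# Rough tree-level boundary conditions at mu = m_t (assumed, not precise):
y = np.array([MH**2 / (2 * V**2),      # lambda ~ 0.13
              np.sqrt(2) * 163.0 / V,  # y_t from a running top mass ~163 GeV
              1.166, 0.648, 0.358])    # g3, g2, gY near the top mass

t, dt = 0.0, 0.01                      # t = log(mu / m_t)
while y[0] > 0 and t < np.log(1e19 / MT):
    y = y + dt * betas(y)              # simple Euler step in log(mu)
    t += dt

print(f"lambda crosses zero near mu ~ {MT * np.exp(t):.1e} GeV "
      f"(crude one-loop estimate)")
```

Shifting the assumed top mass by a GeV or two moves the crossing scale by roughly an order of magnitude, which is exactly the sensitivity mentioned above.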

There's not much else to say (well, a new paper from today says that the Standard Model with 4 generations is compatible with the new Higgs data) and the uncertainty will probably not go away too soon. I just wanted to start with the Standard Model to assure everyone that all sensible people talking about particle physics – including your humble correspondent – think about the behavior of things in the Standard Model at the very beginning. The fact that they don't spend too much time on it boils down to the observation that the Standard Model is pretty simple. It also seems deficient in describing what happens at the natural fundamental scale of quantum gravity, the Planck scale.


Things become more interesting in the MSSM, the Minimal Supersymmetric Standard Model. Supersymmetry (or SUSY, for short) adds superpartners for all known particles. Each boson has its fermionic partner and vice versa. So we deal with winos, binos (or photinos and zinos, in another basis), higgsinos (the neutral ones among these mix into neutralinos), charginos (charged higgsinos and charged winos combined), gluinos (which complete the gauginos), sleptons such as the stau, squarks such as the stop, and so on. One more special superpartner is the gravitino.

However, this spectrum would be (gauge) anomalous (and would fail to generate masses for one-half of the quark flavors), so another Higgs doublet has to be added, together with its higgsinos. By definition, the resulting spectrum is that of the MSSM.

Supersymmetry is broken in the world around us because the superpartners are manifestly heavier than the particles we already know. Otherwise we would be encountering them on the sidewalk all the time. In a fully specific stringy compactification (or field-theoretical model), one could calculate all the consequences of the SUSY breaking.

However, we don't know the full theory, so we parameterize the implications of the SUSY breaking at low energies by generic "soft breaking terms". There are about 105 new terms with undetermined coefficients. These terms break SUSY at long distances – which is what we observe – but they preserve SUSY in the high-energy limit. Closer to the fundamental scale, the cancellation between fermions and bosons (in quantities as diverse as the vacuum energy or the Higgs mass) is preserved, up to corrections set by the soft-breaking mass scales.
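Schematically (this is just the textbook form, with flavor indices suppressed), the soft-breaking terms consist of gaugino masses, scalar mass-squared terms, trilinear A-terms, and the Higgs b-term,
\[ \mathcal{L}_{\rm soft} = -\frac{1}{2} M_a\, \lambda_a \lambda_a \,-\, m^2_{ij}\, \phi_i^\dagger \phi_j \,-\, \left( A_{ijk}\, \phi_i \phi_j \phi_k + b\, H_u H_d + {\rm h.c.} \right), \]
and counting the independent masses, mixing angles, and phases in these matrices (beyond those already present in the Standard Model) is what gives the roughly 105 new parameters.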

Related: Science Magazine's live chat with Gordon Kane and Robert Roser begins soon, at 3 pm Boston Winter Time.
If we go beyond the MSSM and add totally new particles (the NMSSM, where N stands for "next-to-minimal", adds a new neutral scalar and its superpartner on top of the MSSM spectrum), many things may happen and almost no universal inequalities are preserved. So things are "highly constrained" only if we restrict our attention to a more specific spectrum such as the MSSM.

Higgs mass in the MSSM

As Matt Reece summarized at Theoretical Physics Stack Exchange, the classical (tree-level) prediction for the lightest Higgs mass in the MSSM inevitably implies that
\[ m_h \leq m_Z \approx 91\,\,{\rm GeV}. \] That would be a contradiction with the observed mass of 125 GeV, but we live in a quantum world and the quantum corrections are actually significant. The most important effect on the allowed Higgs mass is brought to you by the loops (in Feynman diagrams) with stop squarks, the superpartners of the top quark, the heaviest quark. It's the most important one because the top quark (and the stop) have the strongest coupling to the Higgs (the Yukawa coupling is proportional to the mass).
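By the way, the textbook tree-level inequality quoted above really contains the angle \(\beta\) parameterizing the ratio of the two Higgs vevs, \(\tan\beta = v_u/v_d\):
\[ m_h^2 \,\leq\, m_Z^2 \cos^2 2\beta \,\leq\, m_Z^2, \]
so even in the most favorable limit of large \(\tan\beta\), the classical prediction cannot exceed \(m_Z\); everything above that has to come from the loops discussed here.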

The top quark is described by a Dirac spinor with 4 complex components, i.e. two Weyl fermions, which means that the bosonic superpartners have to consist of 2 complex scalars (and their complex conjugates): one complex scalar for each Weyl fermion. So even though there exists only one mass of the top quark (the mass term has to mix both Weyl fermions), there are actually 2 independent complex stop scalars and they may have two different mass eigenvalues, the heavier and the lighter stop. Quite generally, bosons' masses are much less protected and may arise in easier ways (and be larger). The principle banning such "easy" masses for a charged Weyl fermion is chiral symmetry.
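Schematically (suppressing the electroweak D-term contributions), the two stop scalars \(\tilde t_L, \tilde t_R\) mix via a mass-squared matrix of the form
\[ \mathcal{M}^2_{\tilde t} \,=\, \begin{pmatrix} m^2_{\tilde t_L} + m_t^2 & m_t X_t \\ m_t X_t & m^2_{\tilde t_R} + m_t^2 \end{pmatrix}, \qquad X_t = A_t - \mu \cot\beta, \]
whose two eigenvalues are the lighter and heavier stop masses; the off-diagonal mixing parameter \(X_t\) will reappear below.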

With stops appreciated, the leading quantum correction to the squared Higgs mass scales like
\[ \frac{m_t^4}{v^2} \log\frac{m_{\tilde t}^2}{m_t^2}. \] There are also similar terms where the logarithm is replaced by \( X_t^2/m_{\tilde t}^2 \) or \( X_t^4/m_{\tilde t}^4 \). The \(X_t\) quantity measures the amount of mixing between the two stop scalars. But let us focus on the logarithmic term above.
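(For orientation, the standard approximate one-loop formula that combines the logarithmic and mixing pieces reads, with \(v\approx 246\,\,{\rm GeV}\) and \(m_{\tilde t}\) an average stop mass,
\[ \Delta m_h^2 \,\simeq\, \frac{3\, m_t^4}{4\pi^2 v^2} \left[ \log\frac{m_{\tilde t}^2}{m_t^2} + \frac{X_t^2}{m_{\tilde t}^2}\left( 1 - \frac{X_t^2}{12\, m_{\tilde t}^2} \right) \right], \]
so the mixing contribution is maximized near \(X_t^2 = 6\, m_{\tilde t}^2\), the so-called maximal-mixing scenario.)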

The logarithm is an increasing, unbounded function. But it does grow very, very slowly. As Matt recalls, the usual upper bound on the Higgs boson mass – 135 GeV – assumes that the stop mass \(m_{\tilde t}\) is below 2 TeV or so. This was really a kind of wishful thinking (such a stop would be visible at the LHC), only sloppily supported by arguments involving the hierarchy problem (SUSY helps to make the lightness of the Higgs bearable, but naively it does so only when all the MSSM superpartners are light).

However, 135 GeV is really unnatural; more natural Higgs masses in the light-stop MSSM are (or would be) closer to 100-120 GeV.

OK, an important observation is that for stop masses around 10-100 TeV, one indeed gets a Higgs mass in the MSSM that is around 125 GeV. This is largely independent of other choices, although many model builders prefer to make many additional choices, and some of them even seem to have pretty good reasons for doing so. One may say that the apparently observed Higgs mass favors squarks at the multi-dozen-TeV scale, something that may be a good estimate for other reasons as well. It could also serve as an argument to build a Hyper Superconducting Super Collider, HSSC, that would bring us closer to 100 TeV, unless it were killed by Hyper Republicans and a Hyper Democratic administration. ;-)
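As a rough numerical illustration of this claim, one may plug a few stop masses and values of \(\tan\beta\) into the approximate formula quoted above. The Python sketch below assumes no stop mixing, no two-loop terms, and no resummation of the large logarithms, so the numbers are only indicative and tend to overshoot for very heavy stops.

```python
import numpy as np

# Crude leading-log estimate of the lightest MSSM Higgs mass: the tree-level
# bound m_Z |cos(2 beta)| plus the dominant stop-loop correction quoted
# above.  No stop mixing (X_t = 0), no two-loop terms, no resummation of
# the large logarithms, so treat the output as a qualitative illustration.

MZ, V, MT = 91.19, 246.0, 173.1    # GeV: Z mass, Higgs vev, top (pole) mass

def mh_estimate(m_stop, tan_beta, X_t=0.0):
    """Approximate lightest Higgs mass in GeV for a common stop mass m_stop."""
    cos2b = (1.0 - tan_beta**2) / (1.0 + tan_beta**2)
    tree = (MZ * cos2b) ** 2
    loop = (3.0 * MT**4 / (4.0 * np.pi**2 * V**2)) * (
        np.log(m_stop**2 / MT**2)
        + (X_t**2 / m_stop**2) * (1.0 - X_t**2 / (12.0 * m_stop**2)))
    return np.sqrt(tree + loop)

for tan_beta in (3, 10, 30):
    for m_stop in (2e3, 1e4, 3e4, 1e5):          # 2, 10, 30, 100 TeV
        print(f"tan(beta) = {tan_beta:>2}, m_stop = {m_stop/1e3:>5.0f} TeV:"
              f"  m_h ~ {mh_estimate(m_stop, tan_beta):.0f} GeV")
```

With these crude inputs, 125 GeV indeed lands in the multi-TeV to multi-dozen-TeV stop-mass range for moderate \(\tan\beta\), though the precise numbers require the full machinery used in the papers mentioned below.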

Finally, I want to mention two fresh preprints about the implications of a 125 GeV Higgs for supersymmetry:
Implications of a 125 GeV Higgs for supersymmetric models (A. Arbey, M. Battaglia, A. Djouadi, F. Mahmoudi, J. Quevillon)

Implications of a 125 GeV Higgs for the MSSM and Low-Scale SUSY Breaking (Patrick Draper, Patrick Meade, Matthew Reece, David Shih)
I only know (most of) the authors of the second paper. The titles are pretty similar, aren't they? Another Japanese preprint from today says that a 125 GeV Higgs probably implies anomaly mediation and heavy scalars, with the wino as the LSP. (But one more Japanese paper with an overlapping author list says that a 125 GeV Higgs is easy to understand with extra matter.)

Quite generally, one may say that the "garden variety" supersymmetric models with light squarks and "gauge mediation" of the supersymmetry breaking have become almost hopelessly contrived and fine-tuned, and have been nearly euthanized. The apparently observed SUSY-compatible but not-too-low value of the Higgs mass favors scenarios with heavy scalars (especially a heavy stop squark), or extensions of the MSSM with additional particle species. See another new paper by Carena et al. trying to obtain new possibilities with various hierarchies between slepton and squark masses.

On the Physics Stack Exchange, Matt Reece apparently disagreed with my classification of these vacua as "more stringy" reincarnations of supersymmetry. I think the description is accurate and I think that papers by Gordy and others do show calculations supporting the claim that it is accurate. Of course, some of it boils down to terminology. For example, grand unified theories are very naturally incorporated in string theory, e.g. in heterotic string theory, so I would surely view grand unified SUSY models as being "more stringy" than the ordinary MSSM.

Also, string/M-theory at the Planck scale simply offers different criteria of what is natural than field theory. Some extra fields may look very awkward from a field-theory perspective (for example some fractionally charged state) but they may be totally meaningful in a stringy context (e.g. as strings wound on some \(\ZZ_k\) cycles in the compact manifold). Much more generally, the bottom-up perspective also leads us to interpretations of naturalness that may be way too naive. Kane et al. models show that one may really avoid fine-tuning of the Higgs mass even if the stop is heavy. We often get these unexpected solutions by carefully analyzing new possibilities that look natural from the top-down perspective.

At any rate, as people are learning the physics, the boundary between "stringy physics" and "particle physics accepted by everyone" is gradually dissolving because insights obtained from string theory are being recycled and hijacked by phenomenologists, including those who utterly fail to pay lip service to the glory of string/M-theory in their daily prayers. :-) It's clear that some phenomenologists prefer to present their models in a stringy way, others prefer to present their models in a way where string theory is being as censored away as possible. And different readers react differently. And there are other political issues in those matters, issues that have often nothing to do with any "string wars".

However, the exciting question that remains is whether or not the LHC will see something else beyond the 125 GeV Higgs. The very likely mass near 125 GeV makes both possibilities – yes and no – approximately equally likely. Stay tuned.

The Science Magazine live chat with Gordon Kane and Rob Roser (mentioned above) turned out to be interesting. Of course, they would agree about most technical issues but in some respects, their answers were remarkably different. Needless to say, I was much closer to Gordon Kane.

Kane managed to explain that there are some indirect relationships to SUSY, mentioned their work on M-theory and its predictions, repeatedly stressed that the current theories can calculate all kinds of things and that most things people ask about are really understood, and said that he was sure the Higgs was there and above 5 sigma (I think it's only 4 sigma after the look-elsewhere subtractions, but that's kind of enough for me in this case). Roser said that on a scale of 1-10 of certainty about the Higgs, he would be just a 1. Holy cow. If 9 means 90% certainty, of course I would have to say 10. I guess that Gordon Kane would say 9 or 10, too.

At a Nature live blog, Bill Murray said he polled 10 of the leaders of the ATLAS search. Do they think that the LHC has found the Higgs? All of them said yes. Matt Strassler claims almost exactly the opposite: that at least 90 percent of the CERN folks think that it remains a totally open question. I just don't believe Matt on this not-purely-sociological issue. Apologies, but I can't get rid of the feeling that he has an agenda – to present the results in as fuzzy, skeptical, and unattractive a way as possible – and that he just made up the claims about the 90% inconclusive beliefs.

Also, Roser repeated the fashionable cliché that it's always great to find something totally unexpected because it changes our paradigm dramatically, blah blah blah. This phrase may sound "nice", but Kane disagreed and so do I. It's better to confirm a good theory because it's a faster way towards a deeper understanding. After all, we have obtained some "very surprising" findings in recent decades – such as the 1998 discovery of the positive cosmological constant. Did it really bring a paradigm shift and an avalanche of new insights, or just chaos and confusion? I think it's the latter. Chaos and decadent anthropic philosophies. In the end, we may find out that it was just a single number with a pretty simple, and localized, explanation that doesn't affect anything else and isn't too important.

Of course, Nature shows us whatever She wants to show, scientists have to respect Her decision without any prejudices, and the nonzero cosmological constant is almost certainly there, so it had to be discovered at some point. However, we may still ask which outcome is luckier for scientific progress, and I agree with Kane that confirmations of good theories are more beneficial.
