Friday, July 24, 2009

Ellis et al.: Higgs self-coupling doesn't blow up

Tommaso Dorigo liked the paper by John Ellis et al.,
The probable fate of the Standard Model (hep-ph preprint).
Let's assume that the Standard Model - the renormalizable theory of known elementary particles plus one Higgs doublet - is valid up to very high energies. What can you say about the Higgs mass?


[Figure: the Ellis et al. plot of the allowed Higgs mass as a function of the energy scale up to which the Standard Model remains valid. Click to zoom in.]

As you know, the Higgs potential is classically a simple function, schematically
V(h) = lambda h^4 - f(m) h^2.
As long as "lambda" is small enough, the quantum corrections are also small, and this classical form is a pretty good approximation. Here, "f(m)" is an increasing function of the Higgs mass (and of "lambda"). The coefficients must be such that the minimum of the potential occurs at "h = 246 GeV", to generate the right W, Z boson masses via the known gauge couplings.
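
To get a feeling for the numbers, here is a minimal numerical sketch. It assumes the common normalization "V = (lambda/4)(h^2 - v^2)^2", in which "m_H^2 = 2 lambda v^2" (other conventions differ by factors of two), and translates a few hypothetical Higgs masses into weak-scale values of "lambda":

    # Minimal sketch: the weak-scale quartic coupling for a given Higgs mass,
    # assuming the normalization V = (lambda/4)(h^2 - v^2)^2, i.e. m_H^2 = 2 lambda v^2.
    v = 246.0  # GeV, the electroweak vacuum expectation value

    for m_higgs in (115.0, 130.0, 160.0, 170.0):
        lam = m_higgs**2 / (2.0 * v**2)
        print(f"m_H = {m_higgs:5.1f} GeV  ->  lambda ~ {lam:.3f}")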

But the overall normalization of the potential is not determined: you can rescale it by choosing "lambda". And "lambda" is not really a constant. Quantum effects make it run logarithmically: if it starts out large enough, it becomes bigger at higher energies, much like most couplings (except those of asymptotically free gauge theories). A toy numerical model of this running appears below.

Blow-up scenario

If "lambda" is too high, it runs too quickly and eventually overshoots the critical value of "pi" or "2 pi" which only leads to infinity - a breakdown of the theory, at least in its qualitatively perturbative form. The higher energy scale you choose to require that "lambda" doesn't overshoot at that scale, the more strict bound on the Higgs mass you obtain.

The upper decreasing curves on the graph above imply that if you want the Standard Model to avoid a divergent "lambda" up to the Planck scale, the Higgs mass shouldn't exceed 170 GeV or so. But experiments indicate that the Higgs mass doesn't exceed 170 GeV anyway - the only new insight that Ellis et al. actually included in their "update" of the Higgs bound. In fact, even the 160-170 GeV window is excluded by the Tevatron, much more strictly than the masses between 170 and 185 GeV.

It also means that the experiments tell us that the Higgs self-coupling doesn't blow up even if extrapolated up to the Planck scale, assuming that the Standard Model is the whole story. That's the newest claim in the paper by Ellis et al.

Survival and metastability

The other possibilities are that the Higgs mass is between 130 and 160 GeV or so, which allows the Standard Model to be well-behaved up to the Planck scale - the "survival scenario" - and that the mass is lower than 130 GeV, in which case our vacuum becomes metastable or unstable. Why? What does it mean?

If you assume that the mass is too low, the "lambda" coupling has to be small, too. The one-loop corrections to the potential - dominated by the top quark loop - are proportional to "log phi" and negative, and it's very plausible that for a finite value of "phi", they actually beat the classical contribution (proportional to the small "lambda") and drive the total potential to a value below its value at our vacuum, which makes our value of "phi" unstable.
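
Schematically, the standard one-loop (Coleman-Weinberg) contribution of the top quark loop to the potential has the leading-log form

Delta V(h) = - (3 y_t^4 / 64 pi^2) h^4 [log(h^2/mu^2) + const],

where "mu" is the renormalization scale. If the classical "lambda" is small, this negative logarithm eventually wins, and the full potential dips below its value at "h = 246 GeV" for some large but finite "h".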

Well, the vacuum may be merely metastable, and one must distinguish different levels of threat - how high a temperature is sufficient to induce the catastrophic collapse, and how long a lifetime of such an unstable Universe you tolerate once you agree that it may be unstable.
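
To get a sense of the lifetime question, here is a rough order-of-magnitude sketch, not the calculation from the paper. It uses the standard result that a negative quartic coupling lets the vacuum decay through a bounce with Euclidean action "S = 8 pi^2 / (3 |lambda|)", so the decay probability integrated over our past light cone is roughly "(T_U Lambda)^4 exp(-S)"; the instability scale "Lambda" below is just an illustrative input:

    # Rough vacuum-decay estimate: probability over our past light cone,
    #   P ~ (T_U * Lambda)^4 * exp(-8 pi^2 / (3 |lambda|)),
    # with T_U the age of the Universe and Lambda the scale where lambda(mu)
    # is most negative.  Order-of-magnitude only; Lambda is an assumed input.
    import math

    T_U = 4.35e17 * 1.52e24  # ~13.7 Gyr converted to GeV^-1
    LAMBDA_INST = 1.0e10     # GeV, assumed instability scale

    def decay_probability(lam_abs):
        action = 8.0 * math.pi**2 / (3.0 * lam_abs)  # quartic bounce action
        return (T_U * LAMBDA_INST)**4 * math.exp(-action)

    for lam in (0.01, 0.03, 0.05, 0.10):
        p = decay_probability(lam)
        verdict = "metastable but safe" if p < 1.0 else "should have decayed already"
        print(f"|lambda| = {lam:4.2f}: P ~ {p:.1e} -> {verdict}")

The exponential makes the verdict extremely sensitive to "|lambda|", which is why the boundary between "acceptably metastable" and "excluded" is a fairly sharp line on the Higgs mass axis.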

No further exclusions

At any rate, the newer Tevatron data don't falsify any of the unstable, metastable, or surviving scenarios. Only the "blow-up" scenario is experimentally excluded at a 99% confidence level. Of course, you may still say that the "unstable" scenario is "theoretically" excluded, i.e. experimentally excluded by the very existence of our non-dying world. ;-)

The Higgs mass in the Standard Model still has a reasonably big window - between 130 GeV and 160 GeV - to live in without serious problems. Precision measurements suggest that the Higgs mass is actually below 130 GeV, which means that it satisfies exactly the constraint that can also be derived from the MSSM.

The supersymmetric extension of the Standard Model, together with the known experimental bounds, implies that the lightest Higgs boson mass must be between 114 and 130 GeV. This interval is slightly preferred by precision measurements over the heavier interval allowed by the Standard Model, but there's no definite answer at this point.
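
The upper end of that interval can be understood from the tree-level MSSM bound "m_h <= m_Z |cos 2 beta|" plus the large top/stop loop correction. Here is a minimal sketch using the standard leading-log one-loop formula; stop mixing is neglected, and the stop mass scale "M_S" and "tan beta" are illustrative inputs:

    # Leading-log estimate of the lightest MSSM Higgs boson mass:
    #   m_h^2 ~ m_Z^2 cos^2(2 beta) + (3 m_t^4 / (4 pi^2 v^2)) ln(M_S^2 / m_t^2),
    # with v = 174 GeV in this convention.  Stop mixing is neglected, which
    # tends to underestimate the maximal m_h; all inputs are illustrative.
    import math

    MZ, MT, V = 91.19, 173.1, 174.0  # GeV

    def mssm_higgs_mass(tan_beta, m_stop):
        cos2b = (tan_beta**2 - 1.0) / (tan_beta**2 + 1.0)  # cos(2 beta)
        tree = MZ**2 * cos2b**2
        loop = 3.0 * MT**4 / (4.0 * math.pi**2 * V**2) * math.log(m_stop**2 / MT**2)
        return math.sqrt(tree + loop)

    for tb in (3.0, 10.0, 30.0):
        for ms in (500.0, 1000.0):
            print(f"tan(beta) = {tb:4.1f}, M_S = {ms:6.0f} GeV: "
                  f"m_h ~ {mssm_higgs_mass(tb, ms):5.1f} GeV")

Together with the LEP lower bound of 114 GeV, this is roughly how the 114-130 GeV window arises; full calculations including stop mixing can push the upper end a few GeV higher.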

Hierarchy problem

The tiny value of the Higgs mass relative to the Planck scale - the hierarchy problem - can be explained by additional mechanisms as long as the small mass is protected against quantum corrections by supersymmetry, e.g. in the MSSM. The hierarchy remains unexplained by similar considerations in the Standard Model; however, you may always say that it is explained by anthropic considerations, because we would probably die before we were born if the hierarchy were not there.

So if some fine-tuning is needed for us to live, then we deserve a fine-tuning. ;-)

If the MSSM is (nearly) valid, there are much better reasons - such as gauge coupling unification and its general "stability" - to assume that it can be the right description up to the GUT scale or so. On the other hand, it is generally believed that the Standard Model, if correct at all, is only good as an effective theory up to the current accelerator energy scales. It just doesn't look terribly good or natural at higher energies.

Supersymmetry versus no supersymmetry

Laymen often incorrectly think that the MSSM is just the SM plus extra garbage. But supersymmetry might be a necessity imposed upon us by consistency and other considerations. From this viewpoint, every non-supersymmetric theory is just an effective approximation, and additional structure is always needed to produce a non-supersymmetric theory from a more fundamental supersymmetric one by SUSY breaking; it is therefore counterproductive to pretend that we don't know about SUSY whenever we consider new physics.

That's certainly the attitude adopted by virtually all people who take string theory seriously, including your humble correspondent. Just like we don't know the Higgs mass yet, we don't know the superpartner masses yet (assuming that they exist). But there are good reasons to think that the former is close to 120 GeV and that the latter are in the ballpark of 1 TeV, so if supersymmetry is correct, chances are that the LHC should see it.

The LHC should be able to detect gluinos or squarks as massive as 2-4 TeV. As the mass increases, the required running time increases, too. Of course, collisions at 14 TeV allow you to see "proportionally" heavier particles than collisions at 10 TeV.

At Tommaso Dorigo's blog, I explain that while "minimality" is a sensible strategy, the answer to the question of what "minimal" actually means keeps on evolving. Chances are that in a few years, we will know that - and why - one Higgs doublet is not really the "minimal viable option", much like we know today that charged currents (and W bosons) without neutral currents (and Z bosons) are not the "minimal viable" option to describe beta decay.

The Z bosons are needed, for reasons that were unknown in the past. On his blog, I give several other examples, and there are many more in the history of physics.
