Friday, July 20, 2012

Diphoton Higgs enhancement as a proof of naturalness

The first hep-ph preprint today is a paper I've known about since the July 4th Higgsfest and I've been looking forward to seeing it. The title is

2:1 for Naturalness at the LHC?
The score "2:1" has a double meaning: it either refers to a soccer match in which the home team won (although Plzeň actually defeated Rustavi of Georgia 3-to-1 last night); or it refers to the 100% excess of the Higgs decays to two photons.

The authors, Nima Arkani-Hamed, Kfir Blum, Raffaele Tito D'Agnolo, and JiJi Fan (who will be referred to as Nima et al. because Nima is the only co-author I know personally), propose a connection between a priori very different features, or possible features, of Nature:
  1. naturalness, essentially the opposite thing to the "anthropic principle" – one of the most conceptual principles we know in contemporary particle physics that may still be wrong (it says that dimensionless parameters shouldn't be surprisingly tiny unless their small or vanishing value is justified by a valid argument, ideally an enhanced symmetry)
  2. seemingly elevated diphoton branching ratio of the July 4th \(126\GeV\) Higgs boson, one of the boring yet distracting 2+ sigma anomalies and the only slight deviation of the observed God particle from the Standard Model predictions that has survived so far and that may be talked about
The probability that the Higgs boson decays to two photons (also known as the branching ratio) was observed by ATLAS+CMS to be about 1.8 times higher than the Standard Model prediction. Because measurements of the precise branching ratios require lots more data than the mere discovery that there is a new particle, these branching ratios have a large error margin and the 80% excess is therefore just a 2+ sigma effect at this moment.

But it could have profound consequences, Nima et al. argue.

As I said, if their arguments are right, it's exactly the type of connection that must please every physicist. One finds a litmus test, a previously irrelevant technicality – in this case the enhancement of the diphoton branching ratio – that is actually inseparably connected with something we really care about, almost religiously, and something that seems to decide about the soul of science: naturalness.

How does their argument work? And is it right?

A brief history of Nima

Before I try to offer my answer, I can't resist recalling some history about Nima and naturalness. Yes, after those years of interactions with Nima and listening to his talks, I could perhaps be employed as a historian of science focusing on the relationship of Nima Arkani-Hamed and naturalness. ;-) But I will simplify it a bit; let's hope that it won't be completely wrong.

One of the things about Nima has been his diversity of ideas and interests and the sheer size of the ensemble of models he has co-fathered or nearly co-fathered. There are several gods whose discovery would mean that Nima would deserve to share a Nobel prize; there are also several antigods, devils, and atheists' holy grails whose discovery would probably earn him a Nobel prize, too. He's famous for models with the huge gaps between the masses as well as small gaps between the masses; large extra dimensions and no extra dimensions at all; models with huge numbers of additional particle species and models with almost no new particles; enthusiastic garden-variety supersymmetric models as well as passionate feelings that SUSY looks much less powerful than a decade ago; and so on, and so on.

But one open question has become a defining sign of physics from Nima's viewpoint. It's nothing else than the anthropic reasoning. In the recent decade, Nima has repeatedly emphasized that it's a crossing that may send physics research of the future into vastly different directions. Needless to say, Nima, the ultimate opportunist (greetings, Nima!), has been a double spy in this cold war, too. :-)

He's co-written various papers that proposed natural solutions to the hierarchy problem – models implying that the Higgs boson should be light without fine-tuning. But he's also been involved in the opposite business. Together with Savas Dimopoulos, he gave rise to split supersymmetry, the culminating work of these men's pro-anthropic research. One could say that they decided to construct the most sensible particle physics model under the new assumption that one may leave the lightness of the Higgs to anthropic selection. Still, with this change in the paradigm, it's sensible to keep supersymmetry and reproduce the successes of low-energy supersymmetry such as gauge coupling unification and a dark matter candidate. It can be done. In the resulting model, most superpartners are very heavy and only some of them are kept light. By the way, split SUSY predicted a Higgs boson between \(120\) and \(150\GeV\), compatible with the July 4th discovery. But at least one of its co-fathers is now on a new mission whose goal is nothing else than the massacre of split SUSY.

Nima has always viewed the "naturalness vs unnaturalness" conflict to be very sharp, binary, and black-and-white. And the fresh paper they wrote fits into this philosophy perfectly. I almost have the feeling that the most general philosophy and storyline of the new paper has been decided for 8 years and they just recently added some technical details. ;-) Needless to say, I would also be happy if we could learn a clear black-or-white answer to the question whether Nature respects naturalness when it makes the Higgs boson light. However, I am ready for the answer that the answer isn't black-or-white. The question whether the lightness of the Higgs is natural is somewhat vague and non-rigorous and such questions often have unclear, grey answers. The Planck-Higgs gap could be partially covered by dynamical mechanisms and partially accounted for by the anthropic selection; it could also be explained or co-explained by completely new ideas that can't be easily classified as anthropic or non-anthropic ones.

But let's jump to the paper.

They want to "nearly classify" particle physics models that increase the diphoton Higgs branching ratio, i.e. the probability of the \(H\to \gamma\gamma\) decay, but that don't enhance the \(H\to ZZ\) branching ratio. The ratio of the two branching ratios should increase 1.5-2.0 times relative to the Standard Model.

They decide it can't be done by modifying the tree-level couplings of the Higgs field – something I wouldn't even discuss as a possibility because in the minimal model, i.e. the Standard Model, these couplings are completely determined by the measured masses. In their scheme, it follows that the affirmative action favoring the diphoton decays has to come from loop corrections. There have to be new loop contributions – which also means new particles running in the loops.

(I have some worries that even this first step could have loopholes – new tree-level exchange of new matter such as new \(W'\) bosons could also make an impact but I am ready to believe that light enough particles of this kind have been excluded.)

But loop contributions are naturally small. To make them large, you must have large values of the interaction coupling constants that appear in almost all the vertices in the loop. It seems simplest to add a fermionic loop. When it comes to the identity of the fermion, they decide that it is essentially a "new vector-like lepton species", a lepton with left-right-symmetric interactions whose mass shouldn't be far from \(100-200\GeV\).
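To get a feel for the numbers, here is a toy Python estimate of the diphoton enhancement using the standard one-loop form factors for \(h\to\gamma\gamma\). The charge, mass, and effective Higgs coupling of the extra lepton below are my illustrative assumptions, not values taken from the paper:

```python
import math

# Standard one-loop form factors for h -> gamma gamma,
# valid for loop masses above m_h/2 (so tau <= 1).
def f(tau):
    return math.asin(math.sqrt(tau)) ** 2

def a_half(tau):   # spin-1/2 loop (quarks, leptons)
    return 2.0 * (tau + (tau - 1.0) * f(tau)) / tau ** 2

def a_one(tau):    # spin-1 loop (the W boson)
    return -(2.0 * tau ** 2 + 3.0 * tau
             + 3.0 * (2.0 * tau - 1.0) * f(tau)) / tau ** 2

M_H, M_W, M_T = 125.0, 80.4, 173.0
tau = lambda m: (M_H / (2.0 * m)) ** 2

# Standard Model amplitude: W loop plus the top loop (3 colors, charge 2/3).
a_sm = a_one(tau(M_W)) + 3 * (2.0 / 3.0) ** 2 * a_half(tau(M_T))

# Hypothetical vector-like lepton: charge 1, mass 150 GeV, and an assumed
# effective Higgs coupling c_eff = -1 so that it adds to the dominant
# (negative) W-loop term instead of partially canceling it like the top.
c_eff, q_l, m_l = -1.0, 1.0, 150.0
a_new = a_sm + c_eff * q_l ** 2 * a_half(tau(m_l))

ratio = (a_new / a_sm) ** 2    # diphoton rate relative to the SM
print(round(a_sm, 2), round(ratio, 2))   # roughly -6.5 and 1.5
```

With these assumed inputs, a single light charged lepton with an order-one coupling is enough to push the diphoton rate to roughly 1.5 times the Standard Model value, which is exactly the regime the paper discusses.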

Note that Hooper and Buckley attribute the effect to stop squarks, which are scalars, so Nima et al. have nothing to say about those models.

The point is that if the required higher diphoton branching ratio forces you to add a new particle such as the vector-like lepton, it has additional consequences. The new particle or new particles will contribute to the running of the Higgs quartic coupling \(\lambda\), the running that I have discussed in the article about the Higgs instability. The running will have the form\[
\frac{d\lambda}{d\ln\mu} = -C\cdot {\mathcal N}\, y^4.
\] I have included the possibility of several, \({\mathcal N}\), new species. Unless I am an idiot, the fourth power of the Yukawa coupling \(y\) comes from the four vertices of a box (square) diagram inserted between four external lines of the God particle.

The Yukawa coupling \(y\) has to be large for the new lepton to substantially influence the diphoton branching ratio. It follows that the right hand side of the equation above is large, too. It means that the quartic coupling goes negative, the theory becomes unstable, and a new fix is needed. The scale at which the new fix is vital may be described as the "cutoff scale" of the theory with the single new lepton only – or, more generally, the cutoff scale of a theory with several new fermion species.

The cutoff scale comes quickly, below \(10\TeV\) or so, even if we try to delay it as much as possible. If we try to delay it, it is desirable to be satisfied with a smaller diphoton enhancement: the doubling of the diphoton branching ratio makes a breakdown below \(1\TeV\) inevitable. So we had better be satisfied with the enhancement by a factor of \(1.5\). And if we fix this factor and want to delay the breakdown, it is a good idea to make the new lepton as light as possible, e.g. \(100-150\GeV\). But even if we do it in this way, the theory inevitably breaks down below \(10\TeV\), they say, which is not too far from the Higgs mass scale. In this sense, naturalness – something more robust fixing the problems coming from the lightness of the Higgs – is guaranteed by the enhanced branching ratio.
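A back-of-the-envelope version of this running can be coded up in a few lines. The coefficients below (the \(-6y_t^4\) top term, the assumed coefficient \(c=6\) in the new \(-c\,{\mathcal N}\,y^4\) term, and the neglected gauge contributions) are my toy assumptions, not the paper's precise beta function; the point is only that a Yukawa of order one drives \(\lambda\) negative within a few e-folds above the Higgs mass:

```python
import math

M_H = 125.0        # GeV, starting scale
LAMBDA_0 = 0.13    # rough value of the quartic coupling at the Higgs mass
Y_TOP = 0.94       # rough top Yukawa at the weak scale

def beta_lambda(lam, y_new, n_new, c_new=6.0):
    """Toy one-loop d(lambda)/d(ln mu); c_new is an assumed O(1) coefficient.
    Gauge contributions and the running of the Yukawas are neglected."""
    return (24.0 * lam ** 2 - 6.0 * Y_TOP ** 4
            - c_new * n_new * y_new ** 4) / (16.0 * math.pi ** 2)

def cutoff_scale(y_new, n_new, mu_max=1.0e19):
    """Euler-step the running upward until lambda turns negative."""
    lam, mu, dt = LAMBDA_0, M_H, 0.01      # dt is the step in ln(mu)
    while lam > 0.0 and mu < mu_max:
        lam += beta_lambda(lam, y_new, n_new) * dt
        mu *= math.exp(dt)
    return mu

# One new vector-like lepton with y = 1.2: instability well below a TeV
# in this toy; larger y or more species bring the breakdown even closer.
print(round(cutoff_scale(y_new=1.2, n_new=1)))
```

Playing with `y_new` and `n_new` reproduces the qualitative trade-off described above: a bigger enhancement needs a bigger Yukawa, which drags the instability scale down.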

One may formulate the same proposition in the opposite way: all theories that only add extra light fermions to the Standard Model (which already includes the God particle) inevitably predict that the large diphoton branching ratio enhancement will disappear. It's a cute argument and equivalence, especially if it is true.

Concerning the last paragraph, the proposition is not as strong (or "audacious and obviously wrong") as you may think. The theories discussed in the previous paragraph don't really include full-fledged supersymmetric models because those models contain new bosons such as stop squarks, too. So there's really no contradiction with papers such as Hooper-Buckley's that claim to have achieved the diphoton enhancement with stop squarks or by other methods. However, the paper shows that split SUSY would be excluded if the diphoton excess survived. I personally would never say that "split SUSY" and "an unnatural theory" are the same thing, however (split SUSY is just the minimal theory obeying a certain complicated list of requirements that depended on the status of particle physics in 2004, i.e. on social sciences, and I find it extremely unlikely that the right theory of Nature may be obtained by exactly this minimal-in-2004 definition), and the observation by Nima et al. doesn't seem to affect most unnatural theories because they contain new light bosons, too. In fact, it's natural for unnatural theories to contain new light bosons. ;-)

[A wrong paragraph was here and it was removed.]

The LHC experiments could decide the big questions in these confrontations of ideas about naturalness earlier than the men of theory, however.


snail feedback (5):

reader Dude are you doped said...

Dear Lubos, the stops are scalars, and thus have nothing to do with the argument of Nima and friends, who only discussed fermion contributions.

You may want to fix sentences such as "depend on new fermion species -- stop squarks --..."
Like seriously?

reader Luboš Motl said...

Thanks, very good point.

reader Mitchell said...

I just ran across 't Hooft's original paper in which he introduced naturalness:

reader Luboš Motl said...

Cool, still a kind of post-golden-era paper.

But it's enough to read it roughly, or the abstract, to see that the details shouldn't have been that influential.

I introduce naturalness... at every scale - great.

Then I claim that for theories to be natural, one must have lots of other QCD/technicolor strong sectors and compositeness all over the place - completely wrong.

For a late 1970s paper, it's pretty remarkable to completely overlook SUSY and SUSY isn't really the only mechanism/symmetry that may help to protect naturalness.

Only when I see the paper in this form and imagine it was the bread-and-butter of the education of many people, I understand why so many people have been so obsessed by adding strongly interacting sectors and compositeness everywhere - something I would always find pretty much stupid and unmotivated (and that is excluded up to huge scales by the LHC today): an influential paper was identifying this ugly technological monstrosity with naturalness! ;-/

reader Dilaton said...

Ha ha Lumo, the history of Nima is as much fun to read as listening to a Nima talk :-D...

Where would these "new vector-like lepton species" come from? I mean, are they embedded in a more fundamental high-energy theory somehow? And if so, would Nima's new model give clues about how to resolve additional issues the SM refuses to explain?
