Tuesday, March 19, 2019

CMS: 2.4-sigma excess in the last gluino bin, photons+MET



Gluino, a vampire alchemist with human eyes

I just want to have a separate blog post on this seemingly small anomaly. We already saw the preprint one day in advance, but now the CMS preprint has finally appeared on the hep-ex arXiv:

Search for supersymmetry in final states with photons and missing transverse momentum in proton-proton collisions at 13 TeV
OK, they look at events in which two photons are created and seen in the calorimeters, and in which the momentum bookkeeping doesn't add up. The sum of the initial protons' momenta \(\sum\vec p_i\) seems to differ from the sum of the final particles' momenta \(\sum \vec p_f\). The (transverse part of the) difference is the "missing transverse momentum", but because such a momentum is carried by particles which must have at least the same energy, it's also referred to as MET, the missing \(E_T\) or missing transverse energy.
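In a formula, if you want one: the missing transverse momentum is minus the vector sum of the transverse momenta of all the visible final-state particles,

\[ \vec E_T^{\rm miss} = -\sum_{i\,\in\,{\rm visible}} \vec p_{T,i}, \qquad E_T^{\rm miss}\equiv \left|\vec E_T^{\rm miss}\right|. \]

Only the transverse (perpendicular to the beam) components are used because the longitudinal momenta of the colliding partons are unknown, so only the transverse sum is guaranteed to vanish when all final particles are visible.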



OK, CMS has picked the collisions with the qualitatively right final state, photons plus MET, and divided them into bins according to the magnitude of MET. The last bin has MET between \(250\GeV\) and \(350\GeV\). It's very hard to produce such a high missing transverse momentum at the LHC – in the Standard Model, MET is carried by the invisible neutrinos only. And although the protons carry \(13\TeV\) of energy in total, it's divided among many partons on average, and it's unlikely that a neutrino created in the process can steal more than \(0.35\TeV\) of energy for itself.



In this last bin of photons+MET, 5.4 events were expected, plus or minus a systematic error of 1.55 events or so. However, a whopping 12 events were observed. If you combine the 1.55 systematic error with the \(\sqrt{5.4}\) statistical error in the Pythagorean way, you get some 2.8 events for the total one-sigma error, and 12 is some 2.4 sigma above the predicted 5.4. Sorry if my calculation is wrong, but it's a mistake I have probably repeated a few times. They seem to say that the 1.5–1.6 error already includes the statistical error and I can't see how that can be true because 1.6 is smaller than the square root of 5.4.
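Here is a minimal sketch of that arithmetic in Python – the event counts are the paper's, the 1.55 systematic error is my rough reading of it:

```python
from math import sqrt

expected = 5.4    # predicted background events in the last bin
observed = 12     # observed events
syst = 1.55       # approximate systematic error on the prediction

stat = sqrt(expected)              # Poisson statistical error, about 2.32
total = sqrt(stat**2 + syst**2)    # combined "in the Pythagorean way", about 2.79
sigma = (observed - expected) / total
print(f"total error = {total:.2f} events, excess = {sigma:.2f} sigma")
# prints: total error = 2.79 events, excess = 2.36 sigma
```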

A 2.4 sigma excess corresponds to a 99% confidence level. It means that if this increase from 5.4 to 12 is due to chance, such a fluctuation would occur with a probability of just 1% or so. It means that the odds that it is due to a real signal are about 100 times higher than your prior odds. That's a significant increase. I think it's basically fair to interpret this increase as a reason to increase the visibility of this particular CMS search by a factor of 100 (for those who look for new physics).
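The translation between sigmas and confidence levels is just the one-sided Gaussian tail; a sketch using scipy:

```python
from scipy.stats import norm

sigma = 2.4
p = norm.sf(sigma)                       # one-sided tail probability, about 0.0082
print(f"p = {p:.4f}, CL = {1 - p:.1%}")  # p = 0.0082, CL = 99.2%

# the rough Bayesian reading used above: a p-value of ~1% multiplies
# the prior odds of a real signal by a factor of order 1/p ~ 100
```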

Some people have learned to react instinctively and easily: if the excess is just 2.4 sigma, it can't be real. I think that's sloppy reasoning. It's right to be affected by 2.4 sigma excesses. In softer sciences, this is more than enough to claim a sure discovery and demand a world revolution that is proven necessary by that discovery. Particle physicists want to be hard, nothing below 5 sigma counts, but the truth is fuzzy and in between. It's just right to marginally increase one's belief, hopes, or attention that something might be true when there is a 2.4 sigma excess.

And this excess is in the last bin – the cutting edge of energy and luminosity. With the full dataset, if things just scale and it's a signal, there could be 50 events instead of the predicted 22, just on CMS. The same could be seen by ATLAS – in total, 100 events instead of 44. Maybe some events would show up above \(350\GeV\). If true, it would surely be enough for a combined or even separate discovery of new physics.
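A back-of-the-envelope projection of that scaling, assuming the full dataset is roughly four times the analyzed 35.9/fb and that both the background and the excess scale linearly:

```python
from math import sqrt

scale = 4.0                       # full dataset / analyzed dataset, roughly
bkg, obs, syst = 5.4, 12.0, 1.55  # the last-bin numbers from above

bkg_full = scale * bkg            # about 22 expected events
obs_full = scale * obs            # about 48 observed, if the excess is real

# if the error were purely statistical, the significance would grow
# like sqrt(scale): 2.4 * 2 = 4.8 sigma on CMS alone
stat = sqrt(bkg_full)
# pessimistically, let the systematic error scale linearly, too
total = sqrt(stat**2 + (scale * syst)**2)
print(f"{obs_full:.0f} vs {bkg_full:.0f} expected, "
      f"{(obs_full - bkg_full) / total:.1f} sigma with scaled systematics")
# prints: 48 vs 22 expected, 3.4 sigma with scaled systematics
```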

And yes, this new physics looks just damn natural to me. Gluinos, the superpartners of gluons, may appear in this photons+MET channel assuming a popular version of supersymmetry breaking, gauge mediated supersymmetry breaking (GMSB). In that setup, the lightest supersymmetric particle – and therefore the dark matter particle – is the gravitino \(\tilde G\), the superpartner of the graviton, whose mass could be just \(1\eV\) or so, which is nice in cosmology. Its interactions are very weak and the particle is predicted to be invisible to doable direct detection experiments, because the gravitino's interactions are the superpartners of gravity, which is the weakest force, you know, thus easily explaining the null results (and suggesting that these direct search experiments were "wasted money" – such things are unavoidable somewhere in science; science is exploration, not a guaranteed, insured baking of bread).

The gravitino could still be unstable, i.e. R-parity could be broken (but the lifetime could be billions of years, so the gravitinos are still around as dark matter) – in which case the gravitino decays could be seen in cosmic rays. On top of that, the decays of the NLSPs – the next-to-lightest superpartners – to the gravitino could be seen in colliders. If the NLSP is too long-lived, there's a problem with Big Bang nucleosynthesis.



If gravitinos are dark matter, they're probably overpaying for the commodity. See how these gravitino balls are farmed.

We also need to assume a neutralino NLSP, and it's not just any neutralino. It must be close to the bino, the superpartner of the B-boson, the gauge boson of the electroweak \(U(1)\) hypercharge gauge group (the bino is closer to a photino than to a zino under the Weinberg angle rotation of bases; correct my mistakes, please). There are many other possibilities but for the largest percentage of my tenure as an amateur supersymmetry phenomenologist, I have considered this assignment of the LSP and NLSP to be the more natural one, despite my unclear grasp of which stringy vacua predict it. (Later, I looked at an American-Romanian-Slovak 2005 paper to get an idea that it's possible at all.)
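To check the parenthetical claim: the photino \(\tilde\gamma\) and the zino \(\tilde Z\) are related to the bino \(\tilde B\) and the neutral wino \(\tilde W^3\) by the same Weinberg rotation that mixes the gauge bosons,

\[ \tilde\gamma = \cos\theta_W\,\tilde B + \sin\theta_W\,\tilde W^3, \qquad \tilde Z = -\sin\theta_W\,\tilde B + \cos\theta_W\,\tilde W^3, \]

and because \(\cos\theta_W\approx 0.88\) exceeds \(\sin\theta_W\approx 0.48\), the bino indeed has the larger overlap with the photino.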

Just to be sure, it's different from Kane's \(G_2\) M-theory compactifications, from the split supersymmetry of Nima et al., and from many other things you've heard. The number of choices for the qualitative character of supersymmetry breaking and for the hierarchy of the superpartner masses is somewhat large.

The mass of the gluino indicated by the small excess would be about \(1.9\TeV\). Some squarks could be around \(1.8\TeV\) but detailed numbers are too model-specific because there are many squarks that may be different – and different in many ways.

As we discussed with Edwin, Feynman warned against the "last bin" in an experimental paper claiming that the Feynman–Gell-Mann (FG) theory wasn't right. But I think that he talked about a situation in which the systematic error in the last bin was high – the experimenters lost the ability to measure something precisely due to complications in their gadget. Here, the error in the last bin is already mostly statistical. And one can't draw a newer bin simply because no events were detected with MET above \(350\GeV\) – zero is too few.

In this sense, I think it's right to say the exact opposite of Feynman's sentence here. Feynman said that if the last bin were any good, they would draw another one. Well, I would say that if the last bin were really bad, it would have zero events and they wouldn't include it at all. ;-)

There's a significant probability it's an upward fluke, and some probability it's a sign of new physics. Neither of them is zero. To eliminate one possibility is to be sloppy or closed-minded. This last bin of this channel is arguably one of the most natural bins-of-channels where the superpartners could appear first, and a gradual appearance that grows from 2 sigma to 5 sigma and then higher is the normal way for a discovery to proceed. It's no science-fiction speculation. The LHC is struggling to see some events with very high energies of final particles that it is barely capable of producing – it just produces them too scarcely. In such a process, it's just rather natural for new physics to gradually appear in the last bin (what is the last bin in one paper may be earlier than the last bin in higher-luminosity papers).

The Higgs boson also had some small fluctuations first. Sometime in 2011, we discussed a possible bump indicating a \(115\GeV\) Higgs, if you remember. It went away and by December 2011, a new bump had emerged near \(125\GeV\); I was certain it was the correct one and I was right (I wasn't the only one, but the certainty wasn't universal). This last bin may be analogous to the wrong \(115\GeV\) Higgs. But it may also be analogous to the correct \(125\GeV\) Higgs and it may grow.

You see that this paper was only published now and it only uses 1/4 of the available CMS dataset. It sees a minor fluke. The LHC still has enough not-yet-analyzed data that could produce new physics – although the LHC hasn't been running for several months. It's just plain wrong for anyone to say that "the discovery of new physics in the LHC data up to 2018 has already been ruled out".



Update, March 21st:

There is a new CMS search for gauge mediation which also uses 35.9/fb and includes the diphoton channel above plus three more channels. One of them is one photon+jets+MET, which was already reported in July 2017, when I naturally ignored it because that paper seemed like a "clear no signal" in the absence of a diphoton analysis. But there's an excess in the (next-to-last) bin with transverse energy \(450\)–\(600\GeV\): one photon, jets (or "transverse activity"), and missing energy. Instead of 3 expected events, one gets 10.
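A minimal estimate of how large that single-bin excess is, ignoring the systematic errors (which would make it milder):

```python
from scipy.stats import norm, poisson

expected, observed = 3.0, 10   # the photon+jets+MET bin quoted above

# one-sided Poisson p-value: probability of >= 10 events given a mean of 3
p = poisson.sf(observed - 1, expected)
z = norm.isf(p)                # equivalent Gaussian significance
print(f"p = {p:.1e}, about {z:.1f} sigma before systematics")
# prints: p = 1.1e-03, about 3.1 sigma before systematics
```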

In combination, there are some 2-sigma-ish excesses everywhere, although the diphoton channel is probably still the most important source of them. The charginos are only excluded below \(890\GeV\) while the expected exclusion was \(1080\GeV\). The wino mass \(M_2\) seems to be excluded below \(1100\GeV\) although the expected exclusion was everything below \(1300\GeV\), and so on. I finally decided that the degree to which this increases the odds that CMS is seeing hints of gauge-mediated SUSY breaking (if it is an increase at all, not a decrease) is too small and doesn't justify a full new blog post.

