Thursday, August 27, 2015

LHCb: 2-sigma violation of lepton universality

In combination with earlier results by BaBar and Belle, the deviation is 3.9 sigma!

Since the end of June, I have mentioned the ("smaller") LHCb collaboration at the LHC twice: they organized their own Kaggle contest and they claimed to have discovered a pentaquark.

In its new article Evidence suggests subatomic particles could defy the standard model, Phys.ORG has just made it clear to me that I largely missed a hep-ex paper at the end of June,

Measurement of the ratio of branching fractions \(\mathcal{B}(\overline{B}^0 \to D^{*+}\tau^{-}\overline{\nu}_\tau)/\mathcal{B}(\overline{B}^0 \to D^{*+}\mu^{-}\overline{\nu}_\mu)\)
by Brian Hamilton and about 700 co-authors. The paper will appear in Physical Review Letters in a week – which is why it made it to Phys.ORG now. An early June TRF blog post could have been about the same thing but the details weren't available.

What is going on? They measured the number of decays of the \(\overline{B}^0\) mesons produced within their detector in 2011 and 2012 that have another meson, \(D^{*+}\), in the final state, along with a negatively charged lepton and the corresponding antineutrino.

Well, the decay obviously needs the cubic vertex with the \(W^\pm\)-boson – i.e. the charged current – and this current should contain the term "creating the muon and its antineutrino" and the term "creating the tau and its antineutrino" with equal coefficients. There are no Higgs couplings involved in the process, so the different generations of leptons behave "the same": they transform as the same kind of doublets whose components the \(W^\pm\)-bosons mix with each other.

The decay rates with all the \(\mu\) replaced by \(\tau\) should be "basically" the same. Well, because the masses and therefore the kinematics are different, the Standard Model predicts the ratio of the two decay rates to be\[

{\mathcal R}(D^*) = \frac{\mathcal{B}(\overline{B}^0 \to D^{*+}\tau^{-}\overline \nu_\tau)}{
\mathcal{B}(\overline{B}^0 \to D^{*+}\mu^{-}\overline \nu_\mu)} = 0.252\pm 0.003

\] The error of the theoretical prediction is just 1 percent or so. This is the usual accuracy that the Standard Model allows us to reach, at least when the process doesn't depend too much on the messy features of the strong force. This decay ultimately depends on the weak interactions – those with the \(W^\pm\)-bosons as the intermediate particles – which is why the accuracy is so good.
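
Just to be explicit, the relative uncertainty of that prediction is roughly\[

\frac{0.003}{0.252} \approx 1.2\%.

\]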

Well, the LHCb folks measured that quantity and got the following value of the ratio\[

{\mathcal R}(D^*) = 0.336 \pm 0.027 \text{ (stat) } \pm 0.030 \text{ (syst) }

\] which is 33% higher and, using the "Pythagorean" total error \(0.040\) combining the statistical and systematic one, it is about 2.1 standard deviations higher than the (accurately) predicted value.
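
Explicitly, combining the errors in quadrature,\[

\sqrt{0.027^2+0.030^2} \approx 0.040, \qquad \frac{0.336-0.252}{0.040} \approx 2.1.

\]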

As always, 2.1 sigma is no discovery to be carved in stone (even though it's formally or naively some "96% certainty of a new effect") but it is an interesting deviation, especially because there are other reasons to think that the "lepton universality" could fail. What am I talking about?
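
If you want to check the translation between sigmas and the formal "certainty", here is a minimal Python sketch – it assumes SciPy is available and the function name is just my label:

    from scipy.stats import norm

    def two_sided_certainty(n_sigma):
        """Formal two-sided 'certainty' attached to an n-sigma deviation."""
        return 1.0 - 2.0 * norm.sf(n_sigma)

    print(two_sided_certainty(2.1))   # roughly 0.96, the "96%" quoted above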

In the "lepton universality", the coupling of the \(W^\pm\)-boson to the charged_lepton-plus-neutrino pair is proportional to a \(3\times 3\) unit matrix in the space of the three generations. The unit matrix and its multiples are nice and simple.

Well, there are two related ways in which a generic matrix may differ from a multiple of the unit matrix:
  1. its diagonal elements are not equal to each other
  2. the off-diagonal elements are nonzero
In a particular basis, these are two different "failures" of a matrix. We describe them (or the physical effects that they cause if the matrix is used for the charged currents) as "violations of lepton universality" and "flavor violations", respectively. But it's obvious that in a general basis, you can't distinguish them. A diagonal matrix with different diagonal entries looks like a non-diagonal matrix in other bases.
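
Here is a minimal numpy sketch of that last sentence – the coupling matrix and the mixing angle are made-up numbers, purely to illustrate the basis dependence:

    import numpy as np

    # A flavor-diagonal coupling matrix that violates lepton universality
    # (the 20% enhancement of the tau entry is an invented, illustrative number)
    g = np.diag([1.0, 1.0, 1.2])

    # Rotate the lepton basis by an arbitrary angle in the mu-tau plane
    theta = 0.3
    c, s = np.cos(theta), np.sin(theta)
    U = np.array([[1.0, 0.0, 0.0],
                  [0.0,   c,  -s],
                  [0.0,   s,   c]])

    # In the rotated basis, the same matrix has nonzero mu-tau off-diagonal
    # entries: the "universality violation" now looks like "flavor violation".
    print(U @ g @ U.T)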

So the violation of the "lepton universality" discussed in this LHCb paper and this blog post (different diagonal entries for the muon and tau) is "fundamentally" a symptom of the same effect as the "flavor violation" (non-zero off-diagonal entries). And the number of these flavor-violating anomalies has grown pretty large! Most interestingly, CMS saw a 2.4-sigma excess in decays of the Higgs to \(\mu\) and \(\tau\) which seem to represent about 1% (plus or minus 0.4%, if you wish) of the decays even though such flavor-violating decays are prohibited in the Standard Model.

LHCb has announced several other minor flavor-violating results, but because they depend on some mesons, they are less catchy for an elementary particle physicist.

The signs of the flavor violation may be strengthening. If a huge, flavor-violating deviation from the Standard Model is seen and some discoveries are made, we will be able to say that "we saw that paradigm shift coming". Of course, right now, we must also admit that this would-be discovery may go away if Nature is less generous. ;-)

A comment I added in September: if one combines the results of LHCb with similar 2-sigma-ish anomalies of the Belle experiment (KEK, Tsukuba, Japan) and BaBar (Stanford), the deviation of the world average from the Standard Model prediction is 3.9 sigma – formally a 99.99% "certainty". And that's already pretty interesting.
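
Using the same two-sided convention as above, the formal figure is\[

1-2\left[1-\Phi(3.9)\right] \approx 1-9.6\times 10^{-5} \approx 99.99\%,

\] where \(\Phi\) is the cumulative distribution function of the standard normal distribution.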

