## Friday, August 08, 2014 ... /////

### Sum rule constraint on BSM models

Guest blog by Paul Frampton, paper by PF and Thomas Kephart

It is good to be back after my unexpected sabbatical of 2 years and 4 months in South America. During that time the BEH scalar boson (called $H$) was discovered, on July 4th, 2012, at the LHC by both the CMS and ATLAS collaborations. The subsequent experimental study of the production and decay of $H$ provides particle phenomenology with its first really new data in decades. Physicists younger than 50 cannot remember the excitement in particle phenomenology of the 1970s. The 1980s, 1990s and 2000s included important discoveries, among them the $W^\pm$, the $Z^0$ and the top quark, but the interplay between theory and experiment was nevertheless less exciting than in the 1970s. With the study of $H$, it is exciting again.

In this paper, two assumptions are made:

1. The masses of the fermions arise entirely from their Yukawa couplings.
2. The mass of the $W^\pm$ arises entirely from the BEH mechanism.

Both of these assumptions are implicit in the standard model, so if either is violated there is already new physics to understand.

With these two assumptions we (my coauthor is Tom Kephart) derive a sum rule which must be satisfied by the Yukawa coupling constants: the sum of the squares of the ratios of the standard-model Yukawa couplings to their measured values must equal one. This sum rule has several immediate consequences.
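A compressed sketch of where such a rule comes from (my paraphrase of the argument under the two assumptions above, not the paper's own equations; the notation is mine):

```latex
% Each fermion f gets its mass from one doublet with vacuum value v_f:
m_f = Y_f\, v_f \quad\Rightarrow\quad v_f = \frac{m_f}{Y_f},
% while the W mass fixes the quadrature sum of the vacuum values:
m_W^2 = \frac{g^2}{4}\sum_i v_i^2 = \frac{g^2}{4}\, v^2,
\qquad v \simeq 246\ \mathrm{GeV}.
% With Y_f^{\rm SM} \equiv m_f/v (the SM value), the vev constraint
% becomes the sum rule
\sum_f \left(\frac{v_f}{v}\right)^2
  \;=\; \sum_f \left(\frac{Y_f^{\rm SM}}{Y_f}\right)^2 \;\le\; 1,
% so each term is at most one and hence every Y_f \ge Y_f^{\rm SM}.
```

Since each term in the sum is at most one, every measured Yukawa coupling is bounded below by its standard-model value, which is what drives the decay-rate consequences discussed next.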

The partial decay rates for the decays $H\to b\bar b$ and $H \to \tau^+\tau^-$ cannot be less than the corresponding rates in the standard model. The reason is simple to explain: if a Yukawa coupling were smaller, the corresponding vacuum value would have to be bigger, but that gives too large a $W$ mass via the BEH mechanism and is hence disallowed, because the $W$ mass is known to an accuracy better than 0.01%.
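A minimal numerical sketch of that reasoning (my own illustration with round-number inputs, not taken from the paper): with the bottom mass held fixed, a bottom Yukawa below its SM value forces the vacuum value of "its" doublet above $v \simeq 246$ GeV, overshooting the $W$ mass before the other doublets are even added.

```python
# Numerical sketch (my own illustration, not from the paper): with the
# bottom mass held fixed, a bottom Yukawa below its SM value forces the
# vacuum value of "its" doublet above v, overshooting the W mass.
import math

V_SM = 246.22  # GeV, electroweak vev fixed by G_F (and hence by m_W)
M_B = 4.18     # GeV, illustrative bottom-quark mass

def vev_needed(yukawa_ratio):
    """Vev (GeV) the bottom doublet must supply to keep m_b fixed when
    the bottom Yukawa is rescaled by `yukawa_ratio` relative to the SM."""
    y_sm = math.sqrt(2) * M_B / V_SM   # SM normalization: m_b = y v / sqrt(2)
    return math.sqrt(2) * M_B / (yukawa_ratio * y_sm)

# Yukawa at its SM value: the required vev is exactly v.
assert abs(vev_needed(1.0) - V_SM) < 1e-9

# Yukawa 20% below SM: this doublet alone already needs more vev than v,
# so the quadrature sum over all doublets overshoots m_W.
print(round(vev_needed(0.8), 1))  # 307.8
```

The function simplifies to $v/\text{(ratio)}$, which makes the point transparent: any ratio below one pushes the required vacuum value above $v$.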

In beyond-the-standard-model (BSM) theories with two distinct scalar doublets coupling respectively to the top, and to the bottom and tau, such as the MSSM, the sum rule constrains $\tan\beta$ to be less than one, quite different from what is often assumed in fits. Although the MSSM was already on life support before this work, I would dare to say that the plug is now pulled half-way out of the socket. BSM theories like Peccei-Quinn and the 2HDMs are likewise constrained by the sum rule.
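My reading of how the $\tan\beta$ bound would arise, neglecting doublet mixing as the post does (a simplification that is disputed in the comments below), is the following hedged sketch:

```latex
% With v_u = v\sin\beta coupling to the top and v_d = v\cos\beta to the
% bottom and tau, the fixed fermion masses give
Y_t = \frac{Y_t^{\rm SM}}{\sin\beta}, \qquad
Y_b = \frac{Y_b^{\rm SM}}{\cos\beta}, \qquad
Y_\tau = \frac{Y_\tau^{\rm SM}}{\cos\beta}.
% Since \Gamma(H\to b\bar b) \propto Y_b^2 = (Y_b^{\rm SM})^2/\cos^2\beta,
% a measured rate close to its SM value forces \cos\beta \simeq 1,
% i.e. \tan\beta \lesssim 1.
```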

There are BSM models where three distinct scalar doublets couple to the top, bottom and tau. These include theories with global flavor symmetries, including several of my group's old models. Here the sum rule is even more exacting and almost no model of this type can survive at 3 sigma.

Regarding the MSSM, the supersymmetry community is very clever and no doubt a generalization of MSSM will be constructed which can satisfy the new sum rule even with the higher accuracy data expected from the LHC. But it will be challenging.

More generally, the sum rule means that constructing any viable theory beyond the standard model becomes more difficult and that is obviously a good thing.

So that's my guest blog, Lubos.

With my best regards as always,
Paul

#### snail feedback (40) :

Dear Paul, welcome back. It's an interesting paper.

I don't believe that MSSM tan(beta) above 1 may be generally excluded in this way. Maybe the loophole is in your assumptions (neglected loops etc.?), maybe it is something else, and maybe I am wrong. ;-) But I just don't believe the claim. Also, I would say that it is silly to say that "MSSM is on life support".

Dear Lubos,

I am delighted you find the paper interesting. In the two photon decay of H where there is no tree diagram, the one loop diagrams dominate. In the two body decays of H into bottom and tau, however, the loop corrections to the tree diagram are too small to change the conclusion. As for supersymmetry, there could be a non-minimal version which is in better health.

Paul

Dear Prof. Frampton,

thanks for this nice explanation of your paper :-).

What kind or class of BSM models gets impacted most by this new constraint?
Does this sum rule constrain minimal extensions of the SM more than models motivated by more far-reaching BSM (stringy, for example) frameworks, or is it the other way round?

BTW I am happy that you are back and hope you are ok.
The villains who imposed the more-than-2-year "sabbatical" on you made me very angry, and I will think twice before travelling to parts of the world without an appropriate legal system ... :-(0)!

Cheers

Paul--your equations neglect mixing between the doublets. Look at Table 2 in the review 1106.0034. If either sin(alpha) or cos(alpha) are small, the Yukawa couplings in the BSM are smaller than those in the SM, in contradiction to your eq. 10. Furthermore, there are dozens of papers in the past couple of years looking at LHC data in the context of 2HDMs, and they certainly don't have the same results (see 1305.2424 for a bunch of refs and cites).

Hey, I heard you used ideas from South American physicists repeatedly, without proper references, in your papers. Is this true? And why not refer to them?

I do not really understand the logic of this blog post:

1) >> Both of these assumptions are implicit in the standard model
so you make 2 assumptions assuming SM holds ...

2) >> constructing any viable theory beyond the standard model becomes more difficult
... you then show that from those 2 assumptions you limit the BSM physics.

You seem to say "if the SM holds then BSM models are restricted" - what am I missing?

Yes! We neglected mixing because we focused on cases like MSSM and PQ where the two doublets have different quantum numbers. Note that 2HDMs come in different types. The sum rule constraint applies to 2HDMs where the doublets have different quantum numbers and therefore cannot mix.

What is your source of information? It is a false accusation.

Given both yours and the newer comments, I'd be pretty interested in hearing your opinions on this result if you delved a bit deeper into the paper. Is this real and if so, how much does it really narrow down common SUSY models?

Sorry, but in the usual type II Higgs doublet model, the doublets have different hypercharges and there is still mixing. Do you really think that the dozens of papers studying the usual type II models (which includes SUSY/PQ) have missed this. If they both give mass to fermions, then they will mix. Look again at the references--there may now be over a hundred of these analyses, and they disagree with your sum rule.

One more thing. Your reply says that the doublets don't mix in the MSSM. This is utter nonsense, as any study of the Higgs structure of the MSSM will show you. The mixing angle is called alpha, and it is determined by the pseudoscalar mass and tan(beta).

Hey, you heard this libel from someone? Is that true? And why not refer to them?
Charitably, you may just be a twit, so let me explain that those are rhetorical questions. False accusations like the one you are spreading are pernicious. So who says this? That you should answer.

Even if you take the decoupling limit, where the Higgs couplings (including Yukawas) are pretty much identical to the SM ones, tanB can be as large as you want (not considering flavor constraints etc.).

Dear Eric, let me tell you that this is a potential issue for lawyers. If you won't present something that counts as specific enough evidence, I will later ban you and also provide Paul all the relevant information needed to beat you.

Send me the info as well. I have reason to correct people like Eric :) . I will find him and fly to his location within 48 hours. red.apple37 Skype id

Because the destination is Brussels, could I ask you to deal with about 3 dozens of EU villains on a list of mine, too? ;-)

Absolutely. Just send me the complete list. I will have flight confirmation details after mapping out the route.

"I do not really understand the logic of this blog post ... what am I missing?"—Anon

A few teeth if you're not careful.

But enough of these irrelevant interruptions.

Dear Luboš,

This is a little OT but since I have retired and found that I now have a lot of time on my hands, I have finally resolved—much like your good self—to commit myself to a long-held ambition to offend as many people as possible. Only in my case I'd prefer to adopt a much more intelligent and discriminate approach and restrict my targets to people I don't actually like, such as Gene Day*. There might be a few others.

After careful consideration and much research I decided to do this by way of your blog, since I know where to find it now but mostly because I don't have one of my own, since I can't be bothered to set one up.

Anyway, you seem to have the technique down pat. Do you have any guidelines in this regard?

Kind regards,
John

* He doesn't have the good grace even to pretend to be nice to those whom he dislikes. The man has no style. He'd never be accepted at my golf club.

In the notation of our paper, $H_t$ and $H_b$ are the scalar doublets coupling to the $t$ and $b$ mass eigenstates, and hence the corresponding $Y_i$ are those measured in the $H$ decays at the LHC; only those enter the sum rule. You might study assiduously whether earlier definitions of the scalar doublets are identical to the one I just gave you. If they disagree with the sum rule, they cannot be. I am not criticizing previous works; they were simply doing something different.

Rather than make general statements, each model can be examined with respect to the sum rule on a case-by-case basis. The BSM models that are impacted the most are those with the largest number of distinct scalar doublets. The possible top-down origin of the BSM is not so important; all that matters is its appearance near the electroweak scale.

Sorry, but those definitions are exactly the same as in all of the hundreds of preceding papers (ESPECIALLY in the MSSM and PQ models). H_t and H_b are not mass eigenstates - that is your problem. You explicitly stated that tan(beta) had to be less than 1 in the MSSM, and that is false, as shown in practically every MSSM Higgs paper in the past 30 years.

Dear Paul, CMS recently saw a 2.5-sigma excess of flavor-violating Higgs decays,

http://cds.cern.ch/record/1740976/files/HIG-14-005-pas.pdf

About 1% of the Higgs decays seem to be to $\tau^+\mu^-$ or vice versa. The simplest models clearly require some relatively "shifted" contributions from two Higgs doublets, or something comparably new. Is it correct to say that your theorem automatically assumes (assumption 1) that such things are not happening? Don't you think that such an assumption is too strong, because you're assuming that all kinds of new flavor-related processes etc. must be non-existent?

Quite a lot. With respect to the minimal MSSM, I stick with the characterization offered in my guestblog. As hinted there, however, I have little doubt SUSY experts will be able to wriggle out of the phenomenological problem.

Lubos--if this potentially exciting CMS result is true, then the Glashow-Weinberg theorem states that more than one Higgs doublet must couple to the charged leptons. In the minimal models, this doesn't happen. 2HDMs with a discrete symmetry or the MSSM don't have this, but one can always add extra doublets and make it happen.

Dear M guest, thanks, yes, I sort of realize that. But assuming that things are like in the MSSM or the symmetry-constrained 2HDM is already a big assumption for me.

For example, there are lots of very cute string models that have a pair of Higgs doublets per generation of fermions. It's hard to say how big a share - or prior probability - one assigns to them in total, but for me, the fraction is surely not infinitesimal.

Agree about discrete symmetries, but Mother Nature does tell us, from $B_s$-$\bar B_s$ mixing, that the coupling $bsH$, where $H$ is any Higgs, must be very small. That makes it unlikely, but not impossible, that $\tau\mu H$ isn't also small.

Hi Lubos, I have always thought the generational L conservations are more fragile than the overall L conservation although I expect all global symmetries are broken at some scale. If these data hold up, it will be revolutionary but let's wait for one additional sigma?
On your other point, the idea was to be conservative in assuming BSM retains some of SM. As you say, it was an exaggeration to imply our argument was circular.

Agreed with your appraisal. Unlikely but as long as you won't prove that the b-s-H coupling is effectively the same as a mu-tau-H coupling, sensible people are free to believe and investigate that the decay may be real. ;-)

Dear Paul, the overall goal of your paper surely seems to be that you claim to provide some evidence that one should be conservative. So if you use the being conservative as an assumption, then I would agree that the paper is circular at least from a moral point of view. ;-)

I am not claiming any Higgs-flavor-violating-decay discovery at this point and one needs to wait to 5 sigma for an official discovery. But a no-go theorem could help us to decide about similar questions without waiting for a potential 5-sigma signal like that: the answer could be No. I just don't think that your paper may exclude such possibilities, can it?

The LHC data are not 30 years old and I have no interest in discussing further here with "guest" as it creates heat but no light about physics.

Dear Paul, apologies, the name of guest can't be revealed for certain administrative reasons but I assure you that she or he is a particle physicist whose name is known to me with at least 5,000 citations.

I commented about this in a reply to Lubos. There is an element of circularity but the discussion is not completely circular. I would say it is conservative in assuming central elements of the SM survive in the BSM in order to make definite statements about the latter.

Lubos--thank you (although Paul has over 7,500 citations). My final comment is that MSSM Higgs studies at the LHC/SSC certainly ARE 30 years old, and the sum rule contradicts all of them. I explained why, but it's time to let it go.

Some of us have come to appreciate you for just that quality but I know what you mean.

Thanks, Rehbock. I appreciate it. :)

Normal service will be resumed as soon as possible, or not — it all depends on whether I can be bothered.

It's important to hold to one's principles.

Still, that's not too difficult for those of us who don't have any.

At present human beings are not capable of discerning the numerous wrong ideas of separation that build the fabric of their society, because they have been raised with them, and these have shaped their belief systems in a profound way.

Very often such wrong ideas carry the nimbus of infallible scientific truths that a humble human being can hardly oppose, as is the case with most light workers who are, on the one hand, ignorant on current failed science for good reasons, but, on the other, quite vulnerable to flawed scientific argumentation.

One central flawed concept of science is the exclusive application of closed rational numbers in mathematics and, from there, in all exact natural sciences, as we have shown above. Another basic flawed idea is the introduction of closed physical systems in modern physics, such as "ideal elastic collision", "blackbody radiation", "Carnot machine".

These closed systems are used to describe various energy interactions or to derive particular physical laws thereof.

The introduction of the concept of closed systems has, however, a sound ground. As already said, one can only describe the physical world in a correct manner, if one has grasped the essence of All-That-Is - first and foremost, its closed character.

This is precisely where empiric science has totally failed. The Whole is closed. In order to describe its parts, this idea must be introduced somehow in the physical view of the world. This has been achieved by inventing such abstract closed systems as ideal elastic collision, blackbody radiation, etc., which are hypothetically supposed not to exchange energy with the surroundings.

However, all systems are open. Here we have an antinomy, a fundamental paradox of physics and science, which we have also described as heavenly dichotomy. Just like mathematics, physics is essentially founded on paradoxes and antinomies. It is important to observe that conventional physics is fully unaware of this fundamental inconsistency.

While such closed systems can be found in any textbook on physics, this discipline has failed to acknowledge the fact that such a system as for instance Carnot machine, with which the thermodynamics of adiabatic processes is explained, is a phantom. Otherwise, we would have already created perpetuum mobile, an everlasting energy source, and all our energy problems would have been solved forever.

There are many other fundamental theoretical flaws of present-day physics. For instance, all current physical constants, magnitudes, quantities, variables, dimensions, and corresponding SI-units are defined in a circular manner, for instance, space (length) is defined by introducing time and assuming tacitly that this quantity is already firmly established.

The definition of time, itself, needs the a priori introduction of length (space); electric current and its corresponding SI-unit is defined through charge and vice versa - charge through current. This list can be extended ad infinitum. The author recommends the reader to scrutinize these definitions in any textbook on physics with a fresh eye after he has read this essay.

All definitions of physical quantities and their dimensions obey without an exception the principle of the vicious circle (circulus vitiosus): One part is defined through another, while the Whole is neglected. This circumstance has precluded a true understanding of All-That-Is in current empiric science.

This simple epistemological fact is so evident in the theory of physics, even for beginners, that the author is really baffled and has failed to find any other explanation as to why this fact has not been acknowledged so far but to assume that all scientists are imbeciles and totally incapable of any elementary logical thinking.

Christ! What a pile of shit. But it must have taken some effort on your part to put that together.

WTF are "wrong ideas of separation that build the fabric of their society" and "light workers"?

"One central flawed concept of science is the exclusive application of closed rational numbers in mathematics..."

'Closed rational numbers'? WTF are they?

The FIELD of rational numbers is closed (and in more than one way) but the numbers themselves aren't — 'closed' is a meaningless adjective the way you've used it.

Only someone with no real acquaintance with the concepts would use such an expression, e.g. a faker spouting off. On the other hand, someone who knew what he was talking about might sloppily (but without any harm) say, for example, "the rational numbers are closed under multiplication" but he would NEVER EVER use the expression "closed rational numbers" to impart anything like the same notion.

Yes, it's easy to tell you're full of shit. You've left clues all over the place and so far we've only got as far as the start of your third paragraph.

You don't write to be understood but rather to sound off. It's the kind of thing likely to cause a civil disturbance and best avoided.