In early February, I discussed a paper by Howard Baer and 4 co-authors which made some steps to update the estimates of superpartner masses and other parameters of new physics – by replacing naturalness with the string naturalness which takes "the number of string vacua with certain properties" as a factor that makes a vacuum more likely.

*Ace of Base, Living in Danger – recorded decades before the intense Islamization of Sweden began. In the eyes of a stranger, such as a Czech, Swedes are surely living in danger. The relevance will become clear later.*

They have claimed that this better notion of naturalness naturally drives cubic couplings \(A\) to large values (because those are more represented in the string vacua, by a power law), which means a large mixing in the top squark sector and stop masses that may exceed \(1\TeV\). Also, the other scalars (the first- and second-generation squarks etc.) are "tens of \({\rm TeV}\) in mass". The lightest two neutralinos should be close to each other, with a mass difference around \(5\GeV\). Most encouraging is the derivation that the Higgs mass could be pushed up towards the observed \(125\GeV\), plus or minus one.
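As a toy illustration of that power-law counting (my own sketch, not a calculation from the paper): if the density of string vacua grows with a soft SUSY-breaking parameter as \(f(A)\propto A^n\), then random draws pile up near the largest allowed values of \(A\), which is the statistical pressure towards large stop mixing. The numbers \(n=1\) and \(A_{\max}=10\) below are arbitrary placeholders.

```python
# Toy sketch: vacua distributed with a power-law density f(A) ∝ A^n
# concentrate near the largest allowed A.
import random

def sample_power_law(n, a_max, size, seed=0):
    """Draw A from f(A) ∝ A^n on [0, a_max] via inverse-CDF sampling."""
    rng = random.Random(seed)
    # CDF: F(A) = (A/a_max)^(n+1)  =>  A = a_max * u^(1/(n+1))
    return [a_max * rng.random() ** (1.0 / (n + 1)) for _ in range(size)]

samples = sample_power_law(n=1, a_max=10.0, size=100_000)
frac_top_half = sum(a > 5.0 for a in samples) / len(samples)
# For n=1, the exact fraction is 1 - (1/2)^2 = 0.75: three quarters of
# the "vacua" sit in the upper half of the allowed range.
print(f"fraction of vacua with A in the upper half: {frac_top_half:.2f}")
```

For a flat (naive) measure the fraction would be 0.50; the power-law weighting shifts it to 0.75, and higher powers \(n\) push it even closer to the boundary.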

They have continued to publish papers – roughly one paper per month – and the first hep-ph paper today comes from a similar author team, too.

Naturalness versus stringy naturalness (with implications for collider and dark matter searches)

I improved the title by closing the parenthesis. Baer, Barger, and Salam review some of the previous claims about the string naturalness – and look which general kinds of supersymmetry scenarios are likely according to the string naturalness.

String theory and supersymmetry are friends – but they are independent and different, too. Any person who does some actual research on the deeper origin of the observed laws of particle physics (currently the Standard Model) must have some theoretical basis to produce estimated probabilities of various statements about new particles and similar things. If you refuse to consider any such measures or probabilities, it just means that you are a complete non-expert in this part of physics and you shouldn't contaminate the discussions about these topics with your noise.

For bottom-up model builders, the "practical naturalness" has been the canonical framework to think about such matters. "Practical naturalness" says that none of the independent terms that add up to a quantity should be much greater in magnitude than the sum – the total quantity. It's a simple, partially justified rule, but it's also too heuristic and may be wrong or highly inaccurate, especially in some special (and perhaps even not so special) conditions.
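The rule in the previous paragraph can be written down as a one-liner. Here is a minimal sketch (my own, with hypothetical placeholder contribution names, not figures from any paper): the tuning measure is the largest individual term in magnitude divided by the magnitude of the total, and values far above one signal large cancellations, i.e. unnaturalness.

```python
# A minimal sketch of "practical naturalness": given the independent
# contributions C_i that add up to an observable O = sum(C_i), compute
# max|C_i| / |O|. Values far above 1 mean some term is much larger
# than the total, i.e. the observable relies on fine-tuned cancellations.
def practical_naturalness(contributions):
    total = sum(contributions.values())
    if total == 0:
        raise ValueError("total vanishes; the tuning measure is undefined")
    return max(abs(c) for c in contributions.values()) / abs(total)

# A "natural" observable: no term much bigger than the sum.
natural = {"tree": 0.6, "loop": 0.4}
# A fine-tuned one: two large terms cancel down to a small residue.
tuned = {"tree": 100.0, "loop": -99.0}

print(practical_naturalness(natural))  # 0.6
print(practical_naturalness(tuned))    # 100.0
```

The same logic underlies measures like \(\Delta_{\rm EW}\) used by Baer et al., where the contributions are the terms determining the \(Z\) boson mass.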

String theory is bound to modify these rules. As I said, it seems that string theory wants the \(A\) cubic couplings to be high and so on. This has other implications. We're being pushed to some more extreme corners of the parameter space – more extreme according to the previous notion of naturalness – and the counting is a bit different in these corners. In particular, some masses may be rather high while this doesn't imply too big a fine-tuning, and so on.

Non-stringy supersymmetry model builders have often considered subsets of the MSSM parameter space such as the CMSSM (Constrained Minimal Supersymmetric Standard Model) and mSUGRA (minimal supergravity). These are obvious enough choices to reduce the number of soft parameters in the MSSM with broken supersymmetry. However, Baer et al. present evidence that such vacua are actually rather rare. High-scale SUSY breaking models are more frequent, but only certain kinds of them. You need to read the paper to see the fate of PeV SUSY, minisplit SUSY, spread SUSY, and others.

The stringy counting arguments seem to prefer light enough higgsinos (and the related \(m_{\rm weak}\) parameter) in the vicinity of a hundred or hundreds of \({\rm GeV}\). On the other hand, gluinos and other strongly interacting superpartners are said to be out of the LHC reach.

Concerning the Higgs potential, which is what breaks the electroweak symmetry, Baer et al. claim that the stringy naturalness pressures push the Universe to "living dangerously". This means that the parameters of the potential sit relatively close to some of its "deadly" features. By "deadly" features, I mean potentials that break the electromagnetic \(U(1)\); or break the color \(SU(3)\); or don't break the electroweak symmetry at all; or produce a pocket-universe weak scale of a magnitude that is clearly incompatible with the observed one.
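The "living dangerously" mechanism can be mimicked with a crude toy model (entirely my own construction, not the paper's calculation): a statistical measure favoring large soft terms, combined with a veto on pocket universes whose weak scale comes out too large, leaves survivors that cluster near the deadly boundary. The proxy relation between the weak scale and the soft terms and the factor-of-four cutoff below are illustrative assumptions.

```python
# Toy "living dangerously" sketch: power-law preference for large soft
# terms + an anthropic veto on too-large weak scales => surviving vacua
# pile up near the edge of the viable region.
import random

rng = random.Random(42)
M_WEAK_OBS = 1.0          # observed weak scale, arbitrary units
VETO = 4.0 * M_WEAK_OBS   # hypothetical cutoff above which the pocket
                          # universe is "deadly" (no usual atoms, etc.)

survivors = []
for _ in range(200_000):
    # density ∝ m_soft: draws favor large soft terms (power-law measure)
    m_soft = 10.0 * rng.random() ** 0.5
    m_weak = m_soft           # crude proxy: weak scale tracks soft terms
    if m_weak < VETO:         # keep only the non-deadly pocket universes
        survivors.append(m_weak)

near_edge = sum(m > 0.75 * VETO for m in survivors) / len(survivors)
# Exact value here: 1 - (3/4)^2 = 7/16 ≈ 0.44, versus 0.25 for a flat
# measure — the survivors crowd towards the deadly boundary.
print(f"fraction of survivors within 25% of the deadly edge: {near_edge:.2f}")
```

The point of the sketch is only qualitative: when the measure rewards large parameters and viability cuts them off from above, the typical surviving vacuum is one that barely avoided the disaster.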

If they can get this preference for the dangerous life, couldn't they also explain by the stringy statistical arguments why the whole electroweak vacuum seems – due to the other minimum of the Higgs potential etc. – to be metastable and almost unstable? That the quadratic terms in the Higgs potential, when run up to the Planck scale, seem to go to zero – that the Standard Model seems to be "conformal" in the UV? And other coincidences that people have noticed...

At any rate, I find this research inconclusive but very interesting. The reasoning is imperfect but it's still much better than no reasoning or insisting on pure prejudices. And this reasoning indicates that the probability that some new particles such as higgsinos are just "hundreds of \({\rm GeV}\) in mass" and therefore (almost) accessible by the LHC is surely comparable to 50% or higher. The people who claim such a probability to be close to zero are just deluding themselves – they are defending themselves against a totally real possibility that they totally arbitrarily labeled blasphemous.
