Thursday, May 10, 2012

Thomas Bayes and supersymmetry

Phil Gibbs wrote a nice article about an insightful puzzle, Bayes and SUSY, which is relevant for a sensible answer to the question "how much we should change our mind when we eliminate a big chunk of a parameter space", i.e. a big portion of the possible values of some parameters that specify a theory.




Consider three shells, as in the classic shell game. ;-)




Here is the puzzle.

You're told that the probability is 90% that a marble is hiding under one of them and 10% that it is nowhere. You now turn over two of the three shells and find no marble. What is the probability that the marble is hiding under the last shell?

Phil tells us that some people are inclined to simply divide 90% by three and say that the probability has dropped to 30%. This is, however, very far from the right result, which is 75%. Why is the right result so high?

You may imagine that there are 10 equally likely possibilities. In 3 of them, the marble is under the first shell, in 3 of them it is under the second one, in 3 of them, it is under the third one, and in 1 of them (i.e. 10% of the cases), there is no marble.

By seeing that there's no marble under the first and second shells, we eliminate 3+3=6 possibilities out of the 10 possibilities we started with. The remaining 3+1=4 possibilities are still allowed and 3 of them (i.e. 75%) correspond to a marble under the third shell. So the probability resulting from this "frequentist calculation" is much higher than the naive guess that many people would quickly blurt out.
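For readers who prefer to see the counting spelled out, here is a minimal Python sketch of the argument above; the list of ten equally likely cases and the filtering are just a transcription of the previous two paragraphs.

# A transcription of the counting argument: ten equally likely cases,
# conditioned on the observation that shells 1 and 2 are empty.
from fractions import Fraction

# Labels 1, 2, 3 mean "the marble is under that shell"; 0 means "no marble".
possibilities = [1, 1, 1, 2, 2, 2, 3, 3, 3, 0]

# Keep only the cases compatible with finding nothing under shells 1 and 2.
surviving = [case for case in possibilities if case not in (1, 2)]

print(Fraction(surviving.count(3), len(surviving)))  # 3/4, i.e. 75%
print(Fraction(surviving.count(0), len(surviving)))  # 1/4, i.e. 25%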

General numbers

Let me generalize the calculation to a generic prior probability \(P\) that the marble was somewhere; and a fraction \(F\) of the possible shells that have been eliminated. We may rephrase the situation by considering a larger parameter space (and a larger set of initial possibilities) that also contains the region in which the "marble is nowhere".

The size of this larger parameter space is \(1/P\) times the size of the original "marble is somewhere" parameter space, which we normalize to have length one. Out of this interval of length \(1/P\), a length \(F\) corresponds to possibilities that have already been eliminated. So the surviving possibilities correspond to an interval of length \(1/P-F\) and "no marble" corresponds to the added length \(1/P-1\).

So the probability that there's no marble anywhere after we have eliminated the fraction \(F\) of the possibilities is\[

\eq{
Prob(\text{no marble}) &= \frac{1/P-1}{1/P-F} =\\ &= \frac{1-P}{1-FP}
}

\] Note that neither the numerator nor the denominator can be negative because \(F\lt 1\) and \(P\lt 1\). The probability of 25% for "no marble" in the shell puzzle above – i.e. 75% for the marble under the last shell – is obtained for \(P=9/10\) and \(F=2/3\).
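A tiny Python helper implementing this formula may be handy below; the function name prob_no_marble is of course mine, and the last line just checks the 25% quoted above.

from fractions import Fraction

def prob_no_marble(P, F):
    """Posterior probability that the marble is nowhere, given the prior
    probability P that it is somewhere and the fraction F of its possible
    hiding places that has been searched and found empty."""
    return (1 - P) / (1 - F * P)

# The shell puzzle: P = 9/10 and F = 2/3 reproduce the 25% quoted above.
print(prob_no_marble(Fraction(9, 10), Fraction(2, 3)))  # 1/4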

Of course, the main marble we are interested in is supersymmetry; it's the single most likely marble hiding under the shells of CERN. One reason that makes SUSY the winner in this contest is that the lower bounds on the sparticle masses are among the "lightest", i.e. "least constraining", lower bounds we may find in physics. Other new objects such as black holes, Kaluza-Klein particles, new massive gauge bosons etc. have been eliminated up to masses of a few TeV or so, and these "already high" thresholds are not going to grow too quickly if you talk about percentage growth – which you should, because the chances are approximately uniformly distributed on the log axis (the logarithm of the mass).

On the other hand, the stop squark may still exist near 300 GeV or so, and the interval between 300 GeV and 600 GeV – a whole doubling – is going to be investigated within a few months (it is already being investigated), so the stop has a relatively higher chance of being discovered.
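To get a feeling for why one doubling of a light bound is such a big deal on the log axis, here is a small illustrative Python estimate; the 3 TeV upper end of the stop mass range is purely my assumption for the sake of the example, not a number taken from any search.

import math

# Assume (for illustration only) a log-uniform prior for the stop mass
# between 300 GeV and a hypothetical 3 TeV cutoff.  The 300-600 GeV
# doubling currently being probed then covers this fraction of the prior:
low, high, cutoff = 300.0, 600.0, 3000.0  # GeV; the cutoff is an assumption

fraction_probed = math.log(high / low) / math.log(cutoff / low)
print(round(fraction_probed, 3))  # ~0.301, i.e. roughly 30% of the log range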

What are the numbers for SUSY? Of course, the main quantity that influences the result is the prior probability \(P\) that SUSY exists. Now, one must be a bit careful about what we mean by its existence. If we mean its existence at an arbitrarily high scale, the probability \(P\) is very close to one, something like \(P=0.9999\), pretty much guaranteed by string theory. However, \(P\) as estimated by your humble correspondent may have been as low as 60% if we mean the low-energy SUSY that is accessible to the LHC searches.

The value of \(F\) is problematic as well. How big a portion of the possibilities has been eliminated by the LHC searches so far? It could be \(F=2/3\) or much less. Of course, there is no "canonical measure" on the parameter spaces.

Moreover, we don't really know all the disconnected components of the parameter spaces because the MSSM isn't the only supersymmetric model. We don't know their relative weights. We don't know how strongly we should favor "simplified Ansätze" and values of parameters that favor unification and/or preserve flavor or CP in a simple way, and so on. And we must decide on some natural decrease of the probability distribution function for large values of the masses – something that is needed both for naturalness and for the normalizability of the overall probability. We don't know the rate of this decrease. We don't know the detailed shape of the distribution at all.

There's no canonical calculation. All these factors depend on subjective preferences. But if you say that 2/3 of the "shells" under which SUSY may be hiding have been eliminated, you may calculate the probability that there's no SUSY. For the existence of SUSY "anywhere", we get\[

\eq{
Prob(\text{no SUSY}) &= \frac{1-P}{1-FP} =\\ &= \frac{0.0001}{1-2/3\times 0.9999} \approx 0.0003
}

\] Note that before we excluded two thirds of the parameter space, the probability of "no SUSY" was 0.0001, and it has only grown to roughly 0.0003, so in absolute terms the change is almost non-existent: the probability that SUSY exists somewhere has merely dropped from 99.99% to about 99.97%.

Of course, the elimination of 2/3 of the parameter space makes a much greater difference for \(P=0.6\), which I said to be the pre-LHC probability that SUSY is accessible to the LHC. For this question, the post-elimination probability is\[

\eq{
Prob(\text{no LHC SUSY}) &= \frac{1-P}{1-FP}\\ &= \frac{0.4}{1-2/3\times 0.6} = \frac{0.4}{0.6} = 0.6666...
}

\] So the probability that the LHC will find SUSY drops to about 33.3% according to these numbers; it loses roughly 44% of its previous value of 60%, which is still a much gentler reduction than the naive rescaling by the surviving third of the parameter space, which would leave only 20%. At any rate, Phil's point is very important. One must be careful not to get caught in the trap of reducing the "Yes SUSY" probability in direct proportionality to the percentage of the surviving parameter space because this is not at all what the correct probabilistic calculation says!
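The same formula can be applied to both SUSY questions in a few lines of Python; the priors \(P=0.9999\) and \(P=0.6\) and the excluded fraction \(F=2/3\) are the illustrative numbers from the text, nothing more.

# Bayesian update for both SUSY questions, using the formula (1-P)/(1-FP).
F = 2 / 3  # the illustrative fraction of the parameter space excluded so far

for label, P in [("SUSY anywhere", 0.9999), ("LHC-accessible SUSY", 0.6)]:
    no_susy = (1 - P) / (1 - F * P)  # posterior probability of "no SUSY"
    print(f"{label}: prior {P:.4f} -> posterior {1 - no_susy:.4f}")

# SUSY anywhere: prior 0.9999 -> posterior 0.9997
# LHC-accessible SUSY: prior 0.6000 -> posterior 0.3333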

Pessimists' maths looks different

Let me mention that if your prior probability that "SUSY is right" were extremely low, \(P\ll 1\), while \(F\), the fraction of the excluded parameter space, were far from both zero and one, then \((1-P)/(1-FP)\) would be essentially equal to one. It would be more insightful to calculate the (in this case small) probability that SUSY is right,\[

\eq{
1 - \frac{1-P}{1-FP} &= \frac{1-FP-1+P}{1-FP} =\\ &= P\frac{1-F}{1-FP}\sim P(1-F)
}

\] so indeed, the prior probability that SUSY is right, \(P\), would be suppressed in proportion to the fraction \(1-F\) of the parameter space that has survived. But this approximation only holds if you considered SUSY very unlikely to start with. If you did not, the decrease of the probability is much gentler and the exclusion of a moderate fraction of the parameter space doesn't have enough potential to change the odds dramatically.
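As a quick numerical check of this limit, the sketch below compares the exact survival probability \(P(1-F)/(1-FP)\) with the approximation \(P(1-F)\); the pessimistic prior \(P=0.01\) is an arbitrary value chosen purely for illustration.

# Exact posterior probability that SUSY is right vs. the small-P approximation.
P, F = 0.01, 2 / 3  # an arbitrary pessimistic prior and the excluded fraction

exact = P * (1 - F) / (1 - F * P)  # P(1-F)/(1-FP)
approx = P * (1 - F)               # the naive proportional reduction

print(round(exact, 6))   # 0.003356
print(round(approx, 6))  # 0.003333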
