Wednesday, September 10, 2008

Why supersymmetry should be seen at the Large Hadron Collider

The observation of superpartners at the LHC would be the most spectacular discovery in experimental fundamental physics in the last 35 years, to say the least.


Expected breaking news: the LHC first beam day is celebrated by the world, including the Google main page.

Embedded Sky News Live TV was here; it was removed when the main events ended.
The "First LHC beam" webcast began at 9:00 am, Prague=CERN Summer Time (midnight Californian daylight saving time). At webcast.cern.ch, there were only 2,000 connections: sorry, you're unlikely to connect directly. Qbrick offers a working mirror of CERN TV but at some moments, I embed Sky News Live instead: their programs occasionally differ.

When you're bored, try the other one (Qbrick vs Sky). Sometime after 2:00 p.m., Sky News returned to non-LHC topics for a while (but it's back to the LHC now) while the CERN TV / Qbrick broadcast stopped at 6 p.m. Prague Summer Time.

See Adam Yurkewicz's and David Harris's live blogging from CERN: after adding sectors one by one, beam 1 made the full clockwise round trip at 10:23 a.m., as the updated map of sectors and fresh beam event pictures show. Applause! Orgasm across the room (YouTube)!

Unfortunately, a 17-year-old Indian girl was so devastated by the moving pictures (the end of the world) that she committed suicide. Other Indians blame the LHC for an earthquake in Iran. On the other hand, the LHC is supported by all obedient Christians.

After noon, beam 2 was getting ready for a counter-clockwise trip. Around 1:40 p.m., the first sector (78) was added to the beam's journey, followed by 67 at 2:02 p.m. when point 7 (betatron cleaning) was penetrated. Cryogenics was behind the slow progress of this stage. (Sectors around sector 8 were not always as cool as desired.)

Around 2:15 p.m., the beam unexpectedly stopped right after point 6 (dump). A difficult period of LHC history started, and it ended by 2:23 p.m. :-) when the beam was already knocking on the CMS (point 5): the glitch was caused by optics. Point 4 (RF) and point 3 (momentum cleaning) followed at 2:38 p.m. and one minute later.

At 2:44 p.m. and 2:51 p.m., points 2 (ALICE) and 1 (ATLAS) joined the winners. ATLAS (the only point on Swiss territory) gave the green light at 3:03 p.m., allowing the beam to reach point 8 (LHCb) and complete the second round trip! The director could finally say some nice words to his employees (in French). Each beam took an hour or so to make a round trip, debunking all kinds of pessimists.

See Russia Today and CNN and BBC for fresh video reports on the LHC.

There have been many fascinating people speaking on the live webcast - bosses of different teams at the LHC, numerous former and current leaders of CERN, Nobel prize winners, and other famous physicists (such as Rubbia, 't Hooft, Smoot, Randall, and even Mariño :-) of topological string fame, among many others). Unfortunately, I can't repeat the whole day's worth of interesting comments.

Supersymmetry (SUSY) would also count as the first experimentally confirmed prediction of string theory that was historically not a postdiction.




Its discovery would double the spectrum of elementary particles in a way that is not obvious, that was qualitatively predicted for decades, and that some people still find unbelievable. It could be interpreted as a discovery of new, anticommuting dimensions of space. The discovery of supersymmetry would surely be considered one of the most amazing discoveries of experimental science of all time.

It sounds fantastic. It sounds too good to be true.

Nevertheless, some of us are now predicting that the LHC is more likely to see SUSY than not. A figure "60%" has recently become popular as a description of our confidence that SUSY will be there. Of course, if you evaluate many arguments, it is extremely unlikely that you end up with a posterior probability that is so close to 50%. So what many of us actually expect may be a number close to 90% or higher. We just want to be modest and cautious so we artificially reduce the estimate to 60%. We mix our qualified opinions with the sociological priors. ;-)

In this text, I want to explain why I think that supersymmetry is more likely to be found there than not.
See also Gordon Kane's explanation of the same question, why SUSY should be there at the LHC, written for Cosmic Variance.


First, to have some fun, you can watch the fate of the protons at the LHC - from various pre-accelerators up to the collisions.
See also: Why the LHC should see the Higgs boson: Hawking vs Higgs
Fake counterarguments

Let me start with a couple of wrong fairy tales that are often presented as "counterarguments" against SUSY. Each of the comments below has been raised by people who either don't know what they're talking about or who are confused by their own emotions, as we will explain in detail.
  1. SUSY should have already been seen
  2. SUSY is contrived: there are too many unnecessary particles
  3. SUSY breaking as a principle is contrived: there are too many parameters
  4. There are many alternatives
  5. Anthropic arguments imply that a low-energy SUSY breaking scale is unlikely
1. SUSY should have been seen

First of all, if SUSY had already been seen, there would be no need to analyze the question of whether SUSY exists.

Fifteen years ago, some people may have guessed that the superpartner masses could have been as low as 30 GeV. But these were not real predictions justified by scientific arguments. They were just guesses mostly based on wishful thinking. The low values reflected a "sensationalist" bias of the phenomenologists who often want to say something spectacular that could be observed soon. Phenomenologists like to proceed "from the bottom up" which means that they often want to see new dragons "right around the corner".

As a person thinking in a top-down fashion, I think that this bias is cheap, irrational, and somewhat dishonest. So whether new things are "right around the corner" or "much further" - and how easily "testable" the new phenomena are - is something I leave to Nature. There may exist arguments that something is right around the corner or far away. But wishful thinking is not a rational argument. And it is unreasonable to try to intimidate Nature.

There has never been a fair argument that new supersymmetric particles should be that light. The electroweak scale - the Higgs vev - is around 246 GeV, so this value (and even somewhat higher masses, even by an order of magnitude) is clearly a good and natural enough expectation for the superpartner masses. Moreover, the people who predicted very light superpartners typically expected a very light top quark, too. It turned out to be at 172 GeV or so, which is much heavier than the lightest estimates. Things were simply not "around the corner" as some people wished.

And it was surely not the first time that Nature showed that some approximate theories have a much wider range of validity - and new phenomena are much further away - than some people wanted: see, for example, David Gross's review of a 1938 conference. This fake argument is often voiced by phenomenologists who can't quite separate facts from wishful thinking, not even a decade after it was proven that their previous wishful thinking was incorrect, and by theorists who are eager to participate in a permanent revolution.

The fact that SUSY hasn't been seen yet is nothing mysterious. Because of various decisions that were not good for science, the current colliders (before the LHC) were testing energies that didn't differ much from the 90 GeV mass of the Z boson, which was discovered 25 years ago. There hasn't been much progress in the brute force of colliders, so SUSY hasn't been discovered either.

2. SUSY brings too much baggage

This argument is usually not presented by phenomenologists but rather by people who really have no clue what physics is all about. It just looks complicated to them because their minds are not powerful enough, and they deduce that it must be complicated for everyone else and Nature, too. But Nature doesn't necessarily share these severe limitations.

While it is true that supersymmetry doubles the number of elementary particles, it is not true that each new particle makes the theory "more contrived" or "less likely" because of its apparent complexity. The new particles are not independent at all. Supersymmetry is really one principle, not dozens of principles: it is not one assumption for each new particle. And this one principle predicts dozens of new particle species, among many other things.

The doubling of particle species is not quite new in science. When positrons and antimatter were discovered, the number of particle species was essentially doubled, too. Supersymmetry is really one assumption, and the spectrum of new particles is its prediction. In science, it is always good for a theory to have a small number of independent assumptions (because each independent assumption makes a theory more contrived and less likely) and a high number of predictions (because they're what the theory is actually used for and they prove that the theory is not a vacuous truism). In this counting, supersymmetry is doing extremely well.

The assumption of supersymmetry is very natural because it's the only new conceivable symmetry that doesn't commute with the normal spatial (Poincaré) symmetries. It might be expected that such a new, unusual symmetry would have to be incompatible with basic observed features of reality. Surprisingly enough, it's not incompatible which is itself a nontrivial argument in its favor.

3. SUSY is too ambiguous

What is true is that supersymmetry must be spontaneously broken, and as long as we don't know exactly which mechanism does the job, there are many qualitative possibilities, each of which must be described using many new "soft terms" at low energies. This makes the space of possible "broken SUSY" effective theories hugely multi-dimensional.

What does it say about the likelihood that SUSY is correct? It says that if you pick a random small region in this multi-dimensional parameter space, it is extremely unlikely that you will pick the right one. That's indeed the case. People don't know what the superpartner masses will exactly be, among other unknown things. But that doesn't make SUSY as a principle any less likely. The likelihood that it is correct is the sum of the probabilities for its distinct detailed manifestations. Each of the detailed models becomes less likely if there are many of them but the sum doesn't.
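To make this counting explicit, here is a tiny toy sketch (my own illustration with made-up numbers, not part of the original argument): splitting the umbrella hypothesis into many exclusive detailed scenarios makes each scenario improbable, but their sum stays the same.

```python
# Toy sketch (my own illustration, not the author's): splitting a hypothesis
# into many mutually exclusive sub-models lowers each sub-model's probability
# but leaves the probability of the umbrella hypothesis unchanged.
p_susy = 0.6                  # assumed overall probability that "SUSY as a principle" is right
n_models = 1000               # hypothetical number of detailed broken-SUSY scenarios
p_each = p_susy / n_models    # probability of any single detailed scenario (uniform split for simplicity)

print(f"each detailed model: {p_each:.4f}")              # tiny: 0.0006
print(f"sum over all models: {p_each * n_models:.1f}")   # still 0.6
```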

Certain predictions are universal for all SUSY models and others are model-dependent.

Eventually, all the parameters of the "soft SUSY breaking" can be calculated from a detailed underlying high-energy theory, at least in principle (and they can be measured, too). The situation is analogous to the electroweak symmetry breaking by the Higgs sector. The only obvious difference between these two cases is that in the case of the electroweak symmetry, there exists a "canonical", simplest way to break the symmetry, namely with Weinberg's toilet (one Higgs doublet). In the case of SUSY, there are many possibilities for the SUSY breaking sector and none of them is "obviously better" than others.

So this fact prevents you from saying which detailed realization is the right one but again, it doesn't change anything about the probability that at least one of them is correct. The details of supersymmetry breaking are one of the aspects that are not well understood - even though there has been a lot of progress in the last two years. But that doesn't mean that everything is badly understood.

4. There are alternatives

This is a popular talking point among some physicists except that the sentence doesn't seem to be true and the likelihood that it is true seems to be decreasing.

Supersymmetric models usually assume that the Higgs particles are point-like and elementary although there would be no contradiction otherwise. But composite Higgs bosons are simply not needed. The unbearable lightness of Higgs's being is guaranteed by the supersymmetric cancellations, not by a substructure of the Higgs boson.

As we will mention in the second part of the article, the Higgs boson is known to be rather light but it is very hard - and unnatural - for a theory valid up to extremely short distances to explain why the Higgs should be so much lighter than the Planck scale. SUSY is the canonical solution to this problem and the only alternative solutions talk about some kind of "compositeness" of the Higgs boson.

The composite models of the Higgs - technicolor and generalizations - have all kinds of general problems. As some high-precision indirect measurements indicate, the compositeness scale has to be extremely high - essentially above the energies detectable by the LHC - so the bosons would look like elementary particles even at the LHC. There are a lot of problems with getting the right fermion masses, especially for the (heaviest) top quark. Technicolor at the LHC seems to be almost excluded.

Once you admit that the Higgs is a point-like particle up to very short distances, not far from the fundamental gravitational distance scale, supersymmetry is the only approach that explains the hierarchy problem. The composite models have technical problems and I would also claim that they're unmotivated a priori. Also, string theory seems to suggest it is "uneconomical" to view Higgs doublets as composite particles.

5. Anthropic counting of vacua

This method is often used by big-shot physicists who, however, believe in the anthropic principle. They count "almost all" vacua of string theory. Sometimes they find out that "most of them" don't exhibit any supersymmetry at the TeV energy scale. This argument is bogus because the number of vacua is really infinite, so there is no "uniform probabilistic measure" (the total probability wouldn't converge). Even if there were such a measure, there is absolutely no reason why the measure should be close to a uniform one, i.e. why the numerous types of vacua should "win".

Most of the bound states of the Hydrogen atom have "n" (the principal quantum number) much greater than one but that doesn't mean that most of the Hydrogen atoms you find in Nature have an infinite value of "n" on average. Many of the "typicality" arguments are demonstrably wrong and those that are not demonstrably wrong are unjustified. And even if you believed that the "typicality" assumption is correct, it remains a controversial topic which of them really win.
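Here is a small numerical version of the hydrogen analogy (my own sketch with an assumed room-temperature Boltzmann weighting): the degeneracy of level n grows like n², so "most" bound states have large n, yet essentially every real hydrogen atom sits in n = 1.

```python
import math

# Toy sketch (my own illustration of the hydrogen analogy): the degeneracy of
# level n is n^2 (ignoring spin), so "most" bound states have large n, but the
# Boltzmann weights at room temperature put nearly all atoms into n = 1.
kT = 0.025                        # eV, roughly room temperature (assumed)
E = lambda n: -13.6 / n**2        # eV, hydrogen energy levels

weights = {n: n**2 * math.exp(-(E(n) - E(1)) / kT) for n in range(1, 30)}
total = sum(weights.values())
print(f"fraction of atoms in n = 1: {weights[1] / total:.6f}")   # ~1.000000
```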

There is one more argument related to the Higgs mass and the anthropic principle that I would like to mention. It is sometimes said that the huge gap between the Higgs mass and the Planck scale is not a problem that requires an explanation because the gap is needed for stars (and animals) to be long-lived and large enough.

Well, this observation is pretty much a truism. Life as we observe it, as well as certain "similar" forms of life, needs the constants to be what they are. But if we agree with this truism, it doesn't mean that other arguments or explanations become prohibited. In science, there exists no legitimate justification for a "ban" that would prevent you from using certain (old-fashioned) kinds of rational arguments. At most, the anthropic tautological observation may have a vanishing impact on the perceived probabilities of various scenarios, just like all other truisms.

I could have also discussed possible phenomenological problems for SUSY such as the flavor-changing neutral currents etc. They're genuine problems but they are solvable and many satisfactory solutions to them are actually known within the SUSY framework. Moreover, most of these problems seem to be general for all models of new physics. In the context of SUSY, they're not insurmountable.

Fine. Let us now turn to the positive side of the story.

Positive arguments

Let us sketch five general arguments why SUSY is highly promising and likely, maybe even at the LHC:
  1. Hierarchy problem
  2. Improvement of the cosmological constant problem
  3. Dark matter candidate
  4. Gauge coupling unification
  5. String theory
1. Hierarchy problem: why the Higgs is light

The W and Z bosons have masses around 80 GeV and 90 GeV, which is much more than the 0 GeV mass of the photon. The discrepancy requires the electroweak symmetry to be broken. The breaking agent must be effectively equivalent to the Higgs ocean. The Higgs field expectation value must be around 246 GeV to obtain the right masses of the W and Z bosons.
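As a quick numerical sanity check (my own sketch; the tree-level formulas are standard and the coupling values are approximate), a vev of 246 GeV indeed reproduces the observed W and Z masses:

```python
import math

# Tree-level check (my sketch; g and g' are approximate textbook values):
#   m_W = g v / 2,   m_Z = sqrt(g^2 + g'^2) v / 2
v = 246.0          # GeV, Higgs vacuum expectation value
g = 0.652          # SU(2) gauge coupling (approximate)
g_prime = 0.357    # U(1)_Y gauge coupling (approximate)

m_W = g * v / 2
m_Z = math.sqrt(g**2 + g_prime**2) * v / 2
print(f"m_W ~ {m_W:.1f} GeV, m_Z ~ {m_Z:.1f} GeV")   # roughly 80 GeV and 91 GeV
```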

However, the Higgs mass is not quite known (even though I would bet it won't be far from 115 GeV). It is not known because the quartic Higgs potential can be multiplied by a real constant without changing the position of the minima at ±246 GeV. The overall normalization of the potential is related to both the Higgs mass and the dimensionless quartic coupling.

The coupling can't be too small because that would imply that the Higgs boson is very light and should have already been seen (but it hasn't). However, it can't be too high either. When the dimensionless coupling is greater than a number similar to "one", it starts to "run" noticeably as a function of the energy scale. The interaction, as quantified by the dimensionless parameter, grows stronger at higher energies and at some point (the Landau pole), it becomes infinitely strong and the theory breaks down.

That shouldn't happen in a good family of Mr Higgs: the theory would be either inconsistent or, at least, incomplete. These conditions imply that the quartic coupling is a number smaller than one (but not much smaller) while the mass is between 50 GeV and 800 GeV. If we assume that SUSY is correct, we get a shorter interval between 114 and 200 GeV or so, which seems to be perfectly consistent with other, e.g. high-precision, measurements. The self-consistency of this lighter-than-necessary Higgs predicted by SUSY is another detailed argument in favor of SUSY.
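The upper end of that window can be illustrated with a one-loop toy calculation (my own sketch: it uses the convention m_H² = 2λv², keeps only the quartic self-coupling and the top Yukawa in the beta function, and drops the gauge terms, so the resulting "Landau scales" are indicative only):

```python
import math

# One-loop toy estimate (my sketch, not a precision bound): run the quartic
# coupling lambda up in energy and record where it blows up (the Landau pole).
# Convention assumed: V = m^2 |H|^2 + lambda |H|^4, so m_H^2 = 2 lambda v^2.
# Only the lambda^2 and top-Yukawa terms are kept; gauge contributions are dropped.
v, y_t = 246.0, 1.0                      # GeV; top Yukawa taken as ~1 (approximate)

def landau_scale(m_higgs, mu0=246.0, mu_max=1.2e19):
    lam = m_higgs**2 / (2 * v**2)
    mu, dlogmu = mu0, 0.01               # crude Euler stepping in log(energy)
    while mu < mu_max:
        beta = (24 * lam**2 + 12 * lam * y_t**2 - 6 * y_t**4) / (16 * math.pi**2)
        lam += beta * dlogmu
        mu *= math.exp(dlogmu)
        if lam > 10:                     # treat this as "effectively infinite"
            return mu
    return None                          # no blow-up below the Planck scale

for m_h in (200, 400, 800):
    scale = landau_scale(m_h)
    print(m_h, "GeV ->", f"{scale:.1e} GeV" if scale else "no Landau pole below M_Pl")
```

With these crude inputs, heavier Higgs masses hit the pole only a few decades above the electroweak scale while lighter ones stay perturbative much longer, which is the qualitative origin of the upper bound.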

But decades before we had these detailed arguments, we understood that the Higgs mass must be lighter than 800 GeV or so; it is actually a hard conclusion to obtain from a fundamental theory. The loops attached to the Higgs propagator give quadratically divergent corrections to the Higgs mass. It seems that the Higgs mass naturally wants to become as heavy as the heaviest energy scale where your theory dares to say anything. It should really have the Planck mass. Most of the other particles that acquire their masses from interactions with the Higgs would be this heavy, too. They're not.

Supersymmetry implies cancellations that keep the quantum corrections to the Higgs mass zero. Even when it is spontaneously broken, you can show that the quantum corrections to the Higgs mass won't be huge - such as the Planck scale - but they will only reflect the mass differences between the light particles and their superpartners: the Higgs should be as light as the superpartners which sounds OK.
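The cancellation can be made semi-quantitative with a schematic one-loop estimate (my own sketch; the coefficients are order-of-magnitude only and the superpartner mass is a hypothetical value):

```python
import math

# Schematic one-loop estimate (my sketch; factors of order one are ignored):
# a fermion loop shifts m_H^2 by roughly -(y^2/16pi^2) Lambda^2 while its scalar
# superpartner contributes +(y^2/16pi^2) Lambda^2, so the cutoff dependence
# cancels and the leftover is set by the superpartner mass splitting.
y = 1.0                  # top-like Yukawa coupling (assumed ~1)
Lambda = 1.2e19          # GeV, Planck-scale cutoff
m_stop = 1000.0          # GeV, hypothetical superpartner mass
loop = y**2 / (16 * math.pi**2)

single_loop = loop * Lambda**2                              # either loop alone
leftover = loop * m_stop**2 * math.log(Lambda / m_stop)     # after the SUSY cancellation

print(f"one loop alone: ~{single_loop:.1e} GeV^2")          # ~1e36: hopeless fine-tuning
print(f"after cancellation: ~{leftover:.1e} GeV^2 (i.e. ~{math.sqrt(leftover):.0f} GeV)")
```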

This was the old "qualitative" argument that attempted to solve the otherwise puzzling hierarchy - the gap - between the Higgs mass and the Planck mass. But I would like to emphasize that it was not the last word on the question. Imagine that a few years ago, you had some probability estimates that MSSM or SUSY was correct. Later, they would become your priors. Now, I claim that your posterior for MSSM or SUSY must be higher than the prior - the probability has increased - because MSSM or SUSY have made some correct and nontrivial (although not totally accurate) statements about the Higgs mass.

Without SUSY, the Higgs mass should be between 50 GeV and 800 GeV or something like that. In fact, the best fits suggest that the Higgs in a pure Standard Model should be around 85 GeV which is excluded by direct tests. It is not a truly strong falsification but it may be annoying: it should slightly reduce your belief that the pure Standard Model is a correct theory up to 1 TeV. On the other hand, the Higgs in the Standard Model can be as heavy as 800 GeV or so. The model doesn't pinpoint the mass any more accurately.

On the other hand, SUSY makes more specific predictions for the Higgs mass, between 114 and 200 GeV, and this prediction, roughly 8 times more accurate (a shorter interval), seems to be consistent with all newer (high-precision and other) measurements. So in this sense, SUSY has successfully passed a non-trivial test whose a priori probability of passing was just 1/8 or so. The signal of this test is not too strong but it is another hint. The probability that SUSY is correct has recently increased.
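Spelled out as a toy Bayesian update (my own numbers, purely for illustration):

```python
# Toy Bayesian update (my sketch; all numbers are illustrative assumptions).
prior_susy = 0.5           # assumed prior that low-energy SUSY is right
p_data_given_susy = 1.0    # SUSY "predicts" a Higgs inside the narrow 114-200 GeV window
p_data_given_other = 1/8   # a generic alternative hits that window only ~1/8 of the time

posterior = (p_data_given_susy * prior_susy) / (
    p_data_given_susy * prior_susy + p_data_given_other * (1 - prior_susy)
)
print(f"posterior probability of SUSY: {posterior:.2f}")   # ~0.89: the hint raises the odds
```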

2. Vacuum energy: the cancellations may improve

The vacuum Feynman graphs (without external legs) produce the vacuum energy density that enters Einstein's equations as a source, curves the spacetime, and accelerates its expansion. Cosmologists observe this cosmological constant to be 10^{-123} in Planck units even though the generic prediction of a non-supersymmetric theory would be around 1 = 10^{0} in Planck units. That's a discrepancy of 123 orders of magnitude, the infamous cosmological constant problem.

With supersymmetry, the vacuum energy density is controlled by the SUSY breaking scale and the discrepancy drops to 60 orders of magnitude only. That's still bad enough but it's progress. At the same time, the estimate is "more robust" because there's less freedom to fine-tune the energy density in a supersymmetric theory. But I kind of believe that supersymmetry is one step in the solution to the cosmological constant problem and we misunderstand something about the other step(s).

If I were brutal, I could say that this Bayesian inference increases the chances of SUSY by 60 orders of magnitude because it passed another test - that the C.C. is between 0 and 10^{-60} in Planck units - that would almost certainly fail (chance would be 10^{-63}) in a non-SUSY theory but has a chance close to one in a SUSY theory. Because I don't quite think that the distributions should be uniform on the linear scale and that this argument should be taken too seriously, I wouldn't say that the cosmological constant counting increases the likelihood of SUSY 10^{60} times. But I just happen to think that it does increase it, anyway.
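For what it's worth, here is the crude arithmetic behind that remark, in round numbers (my own sketch; it assumes the uniform linear measure that the text itself warns against taking too seriously):

```python
# Crude arithmetic behind the "60 orders of magnitude" remark (my toy sketch;
# round numbers and a uniform linear measure, purely for illustration).
cc_window = 1e-60        # the test: vacuum energy between 0 and 10^-60 in Planck units

p_pass_non_susy = cc_window / 1.0    # a generic non-SUSY value is of order 1 in Planck units
p_pass_susy = 1.0                    # TeV-scale SUSY already lands near 10^-60

naive_bayes_factor = p_pass_susy / p_pass_non_susy
print(f"naive Bayes factor in favor of SUSY: {naive_bayes_factor:.0e}")   # ~1e+60
```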

3. Dark matter particle

Besides the 70% of the mass of the Universe stored in the cosmological constant (more generally called "dark energy", an invisible structureless medium that penetrates all of space), about 25% of the mass density in the Universe is dark matter, something that clumps in invisible halos around galaxies but doesn't fill the whole space. It's been determined that a big part of the dark matter must be composed of a new kind of particle whose mass is close to the energies probed by the LHC. So whatever it is, the LHC is likely to find it or at least show some hints.
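The claim that the dark-matter particle should sit near LHC energies is usually backed by the standard "WIMP miracle" estimate; here is an order-of-magnitude sketch of it (my own, with assumed round numbers, not the author's calculation):

```python
# Order-of-magnitude "WIMP miracle" estimate (my sketch; standard back-of-the-
# envelope freeze-out argument with round numbers). A particle with weak-scale
# mass and couplings freezes out with roughly the observed relic abundance.
alpha = 0.01                 # weak-ish coupling (assumed)
m = 100.0                    # GeV, hypothetical dark-matter mass
GeV2_to_cm3s = 1.17e-17      # conversion of sigma*v from GeV^-2 to cm^3/s

sigma_v = alpha**2 / m**2 * GeV2_to_cm3s     # ~1e-25 cm^3/s
omega_h2 = 3e-27 / sigma_v                   # standard freeze-out estimate
print(f"sigma*v ~ {sigma_v:.1e} cm^3/s, Omega h^2 ~ {omega_h2:.2f}")
# The observed dark-matter abundance is Omega h^2 ~ 0.1, so a weak-scale
# particle lands in the right ballpark.
```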

Neutralinos in SUSY theories (electrically neutral fermionic superpartners of the photon, Z boson, Higgs boson, and their mixtures) seem to have all the desired properties to be the dark matter particles which is another piece of non-trivial evidence in favor of SUSY. The theory predicts these new particles and they are needed (or "observed") by cosmologists, too.

4. Gauge couplings unify

Before SUSY was discussed by phenomenologists in the 1970s, grand unification was the most popular "very high energy" framework for constructing new promising and pretty quantum field theories. In this setup, the electroweak and strong interactions have to unify at a higher energy scale. The simplest grand unified (GUT) theory has various problems, for example that the strengths of the three (U(1), SU(2), SU(3)) interactions don't quite unify: three lines in a graph don't cross at a point, simply speaking.

The simplest GUT theory with supersymmetry included solves this discrepancy. The three lines intersect at a point, as far as we can say with the available accuracy. Again, this is a test that didn't have to work. Because it did work so well, it increases the probability of the simplest SUSY GUT model - or any other model that looks the same way at all energies below the GUT scale.
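The "three lines" are easy to redraw with a one-loop toy calculation (my own sketch: the beta coefficients are the standard one-loop values, the inputs at the Z mass are approximate, and the superpartners are unrealistically placed right at M_Z just to keep the code short):

```python
import math

# One-loop running of the three inverse gauge couplings (my sketch; standard
# one-loop beta coefficients, GUT-normalized U(1), approximate inputs at M_Z):
#   alpha_i^-1(mu) = alpha_i^-1(M_Z) - b_i/(2*pi) * ln(mu/M_Z)
alpha_inv_MZ = [59.0, 29.6, 8.5]       # U(1)_Y (GUT norm), SU(2), SU(3) at M_Z (approximate)
b_SM   = [41/10, -19/6, -7]            # Standard Model coefficients
b_MSSM = [33/5, 1, -3]                 # MSSM coefficients (superpartners at M_Z: a simplification)
M_Z = 91.2                             # GeV

def run(alpha_inv, b, mu):
    t = math.log(mu / M_Z)
    return [a - bi / (2 * math.pi) * t for a, bi in zip(alpha_inv, b)]

mu = 2e16                              # GeV, near the putative GUT scale
print("SM:  ", [round(x, 1) for x in run(alpha_inv_MZ, b_SM, mu)])
print("MSSM:", [round(x, 1) for x in run(alpha_inv_MZ, b_MSSM, mu)])
```

With the MSSM coefficients the three inverse couplings come out nearly equal around 10^16 GeV; with the Standard Model coefficients they visibly miss each other.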

It is true that other models with a lot of unknown corrections from various unusual particles could pass the test, too. By chance. But most of them would not. When you try to compute your confidence in various theories, it is useful to remember the following rule: If a model has actually passed a test, it was surely "doing better" than a model that had just a "chance" to pass the test. This is about the basic logic and the rules of inference. Needless to say, if SUSY is indeed found, this argument about gauge coupling unification, when combined with observed SUSY, will hugely strengthen the case for grand unification and the big desert, too.

Again, it won't be a proven thing but it will sound extremely reasonable. The gauge coupling unification doesn't influence the probability of models whose spectrum significantly (and "unpredictably") differs from the minimal SUSY GUT model.

5. String theory makes SUSY almost crucial

Finally, my most important reason to believe that SUSY is correct is that it seems to be a crucial part of all truly promising string theory models - namely superstring models - that we know. Supersymmetry was first discovered by Pierre Ramond (among other, mostly independent discoverers in the Soviet Union motivated by other, more mathematical considerations) who wanted to incorporate fermions into bosonic string theory.

The worldsheet SUSY can be shown to be almost necessary to do so naturally. Several natural models in the highest dimension then respect spacetime supersymmetry, too. In some sense, supersymmetry always seems to be present in string theory but it can be spontaneously broken. The hierarchy arguments above imply that, given all the data we have, superpartners near the electroweak scale are the most likely ones. But it is true that I cannot derive a low-energy SUSY breaking scale from the first principles (of string theory).

And the anthropic arguments to estimate the supersymmetry breaking scale are either demonstrably wrong or rationally unjustified. If there are too many vacua of a certain kind, it is a hint that these vacua are unlikely to be relevant for the real world (see the analogy with the Hydrogen atom discussed near the beginning). The anthropic people believe the opposite thing but both of our beliefs are just biases, not scientific arguments.

After the LHC

If the LHC observes SUSY, it will be stunning.

However, indeed, the LHC is not guaranteed to see SUSY. If it doesn't observe SUSY, the probability that SUSY at meaningfully low energies is correct will decrease but it will not decrease to zero. Such a complete drop is simply very unlikely. The arguments above won't just die away. We will just be chased into a smaller "corner" of the parameter space, which will reduce the probability, but it won't annihilate all the chances.

Someone might like it if theories could be killed by a particular collider but that's simply not how science works. We simply don't know at this moment what the mass scale should be so we're not guaranteed that the LHC is powerful enough to make the final verdict. Someone could pretend that a particular theory is (or should be) so "courageous" that it will either win or die at the LHC: everything or nothing. But the fact of the matter is that there doesn't exist any promising and sufficiently motivated theory worth talking about that would guarantee such a decision by the LHC. The purpose of science is not to be as "courageous" as it gets but to be as "correct" and "accurate" as we can. ;-)

The LHC is not an omnipotent cathedral where you can ask Nature any question. It is a great experiment but it is just another experiment, for a few billion euros, and there will be many more expensive experiments in the future (unless the primitives take over the civilization). Nevertheless, it will tell us something and it may even be stunningly exciting.

Stay tuned.


snail feedback (1):


reader Gordon Chalmers said...

I am very glad that the LHC is now on-line. Guess what: I have written a MATLAB program that can compute n-point and l-loop amplitudes with as many particles and higher dimension operators as you like (and form). The time it takes to compute, would you believe, is now polynomial in npoint and lloop, i.e. npoint^6 * lloop^6 * number of operators, with some minor complications associated with the number of particles. Isn't that great! I have mixed feelings about it. On the one hand you guys get the dang best program you can imagine, and on the other hand I get to hate my life (kind of, sort of - anyone want to call off the thugs from continually screwing me over).

I figure, and I have already estimated, that the CERN computer complex can now do in excess of an oodle number of loops, depending on several factors.

By the way, my front end JavaScript (not yet written into the computer) simply asks: npoint? lloop? operator types? particle content? and some other things such as how you like to arrange the memory allocation (compress? compress that way or this way?), which version? you want your momenta or randomly generated momenta? how many dimensions? want some pattern recognition? scattering amplitude or inclusive or exclusive? want some beam tips to enhance cross section? stuff like that. Experimentalists will never have to ask a particle theorist again for theoretical or numerical computations.

There is a bad side, however, and I am not going to tell you, but you are free to contact your local United Nations office as to what the bad side is.