See also: Naturalness is fuzzy...
I was told about Matt Strassler's 50-minute talk at JoeFest (click for different formats of the video/audio/slides) and his verbal exchange with David Gross that begins around 35:00.
Matt's talk is pretty nice, touching on technical topics like the Myers effect, pomerons etc., but also reviewing his work with Joe Polchinski and giving Joe some homework exercises all the time. Matt said various things about effective field theory's and/or string theory's inability to solve the hierarchy problem even with the anthropic bias taken into account. He kept distinguishing the existence of hierarchies from the lightness of the Higgs in a way that I didn't find quite logical.
They were thought-provoking comments but I just disagree with the basic conclusions. He can't pinpoint any contradiction in these matters because the QFT framework doesn't tell us which QFT is more likely – that goes beyond the domain of questions an effective QFT may answer. And even the rules for extracting such a probability distribution over the vacua from string theory are unknown. If there are no predictions about a particular question – even a "pressing" question like this one – there can't be contradictions.
But the main conflict arose from Matt's vague yet unusual and combative comments about the value of the 100-TeV collider.
He said it could be a bad idea to put all our eggs in the basket of this collider planned for the longer term. The reasons? Similar to the luminiferous aether. Michelson's attempt to find the aether wind was misguided, Matt says, so there should have been other experiments.
Unfortunately, he didn't say what those were supposed to be.
David Gross' reaction made it very clear that they disagree not only about the 100-TeV collider but also about the right strategy for interpreting and testing hypotheses in the late 19th century, i.e. about the Michelson issue. Previously, Matt Strassler had said lots of weird things such as "the prediction of new physics at the LHC is the only prediction one can deduce from string theory".
This is clearly wrong. No one can deduce anything like that. Effective field theories with some extra assumptions about the distribution of parameters could perhaps lead you to guess that particles with masses comparable to the Higgs may exist. But these are not conclusions of QFT itself. And in string theory, such "predictions" are even more impossible because string theory has no adjustable continuous dimensionless parameters. That means that there are no "natural distributions" on the space of parameters that would follow from string theory, at least as long as we interpret string theory as the currently understood body of knowledge.
Even more qualitatively, there is clearly no derivation that would imply that "string theory is progressive" or "string theory is conservative" when it comes to the amount of new low-energy physics. The latter – conservative string theory – is totally compatible with everything we know. After all, your humble correspondent is not the only one who thinks that string theory is a mostly if not very conservative theory. The claim that it inevitably predicts new things – or that it predicts more new things than possible or real alternatives – is just wrong.
Just to be sure, you may remember that your humble correspondent considers an even larger hadron collider to be the single most meaningful way to make progress in experimental particle physics. We are marching towards the unknown, which means that higher-energy experiments are needed. This relationship probably holds all the way up to the Planck scale. The higher the energies we investigate experimentally, the deeper we penetrate into the realm of the unknown. David Gross and others clearly share this viewpoint.
Is it possible that the Very Large Hadron Collider will find the Standard Model only and nothing else? Absolutely. That would be a disappointment but physicists would still learn something. But if you want to propose alternative experiments, you should know what they are. Some people are looking for dark matter directly. There are experiments trying to detect axions and other things. Physics seems to have enough money for those – they are not as expensive as the large colliders. If you have another important idea about what should be tested, you should say what it is; otherwise the claim that the colliders are overvalued contradicts the evidence you are able to offer. Matt isn't able to justify any true alternative.
But David Gross really disagreed with Matt's suggestion that the Michelson-Morley experiment wasn't naturally the best thing to do at that time. Well, it was very sensible. Their understanding of the origin of electromagnetic fields within classical physics implied an aether wind, so they were logically and justifiably trying to find it. More generally, these interferometer-based experiments were a "proxy" for the tests of many or all conceivable phenomena whose strength is proportional to \(v/c\) or its powers, i.e. effects that become strong when speeds approach the speed of light. They chose an experimentally smart representative of the "tests of relativistic physics", as we could call them decades later.
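To see how sensible the experiment was, one can estimate the signal the aether hypothesis predicted. The fringe shift of the interferometer scales as \((2L/\lambda)(v/c)^2\), and the sketch below plugs in standard textbook values for the 1887 apparatus (the arm length, wavelength, and orbital speed are my assumed round numbers, not figures from the talk):

```python
# Toy estimate of the fringe shift expected by Michelson and Morley
# under the aether hypothesis: delta_n ~ (2 * L / lambda) * (v/c)^2.
# All numbers are rounded textbook values (assumptions for illustration).

L_arm = 11.0          # effective arm length in meters (1887 apparatus)
wavelength = 500e-9   # visible light wavelength in meters
v_orbit = 3.0e4       # Earth's orbital speed in m/s
c = 3.0e8             # speed of light in m/s

fringe_shift = (2 * L_arm / wavelength) * (v_orbit / c) ** 2
print(f"expected fringe shift: {fringe_shift:.2f}")  # about 0.4 of a fringe
```

The predicted shift of roughly 0.4 of a fringe was comfortably above the apparatus' sensitivity, which is why the null result was so striking: the \((v/c)^2\) effect should have been visible if the aether wind existed.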
But in Michelson's times, there was no relativity. People just didn't know it yet. And Michelson's experiments happened to play just a minor role in Einstein's theoretical investigations. But as far as the people who relied on the experimental data rather than Einstein's ingenuity are concerned, it was completely logical to study the aether wind. It was really a critical experiment that would have kickstarted relativity if Einstein had been paying more attention to the experiments or if another physicist who was well aware of the experiments had had a little bit more of Einstein's ingenuity.
When relativity was discovered, it became clear that the aether wind doesn't exist but lots of other relativistic effects (corrections to Newton's physics suppressed by powers of \(v/c\)) do. Theorists like Einstein were crucial for the progress made in 1905 and afterwards, but when it comes to experiments, Michelson's experiments in the 1880s were really the optimal thing to do. They were even lucky because they revealed a particular situation where the right (now: relativistic) physics not only deviates from Newton's physics but gives a very simple result (the speed of light is constant, regardless of the speeds of sources or observers).
So just like David Gross and others, I think that Matt is just wrong even in the case of Michelson.
Nati Seiberg also criticized Matt Strassler. Seiberg says that Matt's doubts are not only in contradiction with the philosophy of high-energy physics; they are in contradiction with reductionism itself, which has worked for centuries. The further you go towards shorter distances, the more detailed an understanding of physics you may acquire. Why should it break down now? Matt says that quantum gravity is an example where shorter distances fail to allow you to study more detailed physics. Seiberg correctly replies that this is right but that quantum gravity is very far away. Strassler says that this comment by Seiberg might be right or wrong.
Well, I do think that quantum gravity is "probably" at the usual 4D Planck scale or nearby, roughly at the conventional GUT scale or higher. It is also hypothetically possible that quantum gravity kicks in at nearby energies, near 1 TeV. But this scenario predicts *exactly* what Matt Strassler attributes to string theory in general. This scenario implies significant deviations from the Standard Model at the LHC energies. It is really excluded experimentally. String theory is not excluded but quantum gravity at 1 TeV is excluded. I don't know why Matt is getting these basic things backwards.
Nima Arkani-Hamed addressed a question touched by Matt, "why a light Higgs and not technicolor". This question has an answer. As Nima decided with Savas after some scrutiny, technicolor models with light fermions are inevitably tuned to 1-in-a-million or worse because the light fermion masses require us to introduce some running that easily and generically creates new exponential hierarchies between the electroweak scale and the QCD scale, and related things. So SUSY with some tuning for a scalar is still less fine-tuned than technicolor. And if the electroweak symmetry is broken by a strong force, there are no baryons – just neutrinos.
Nima also defends the 100-TeV collider. No one is really suggesting putting all eggs into one basket; people are thinking about and building many, many experiments. Going to high energies is still very important for many reasons.
Matt replies to Nati that "this time is different" because for the first time, the Standard Model can be the "whole story" (he overlooked gravity but the Standard Model is a potentially complete theory of non-gravitational physics), so there is no reason to think that the new discoveries will come soon. Despite my expectation that new physics does exist below the scale of a few TeV, I agree with that. Strassler also says – and it is pretty much equivalent to the previous sentence – that naturalness and reductionism are not related in any direct way. I agree with that, too: there may be big hierarchies and deserts but reductionism still holds.
David Gross says that naturalness isn't a prediction; it is a strategy. I completely agree with him (and I have written down this point many times), so let me please try to present this claim in my words, assuming that David Gross would fully subscribe to them, too. Naturalness boils down to a probability distribution on the space of parameters which we can use to argue that certain values or patterns are "puzzling" because they are "unnatural" – which means "unlikely" according to this probability distribution. And that's why we focus on them; they are likely to hide something we don't know yet but should know. In the end, the complete theory makes all these effects in Nature natural (in a more general sense), but because they look unnatural according to an incomplete theory, these effects in Nature are likely to hold a key to a new insight that changes the game, an insight by which the new theory significantly differs from the current incomplete theory. Naturalness cannot be tested quantitatively, however.
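The idea that "unnatural" means "unlikely with respect to an assumed measure" can be made concrete with a toy Monte Carlo. The sketch below assumes a flat distribution of the \(O(1)\) coefficients multiplying the cutoff scale \(\Lambda^2\) – an illustrative choice of measure, not anything derived from QFT or string theory – and counts how rarely the bare Higgs mass-squared and the quantum correction cancel down to the observed value:

```python
import random

# Toy Monte Carlo for "unnatural = unlikely": draw the bare Higgs
# mass-squared and the quantum correction as O(1) coefficients times
# Lambda^2, with a flat measure on [-1, 1] (an assumed, purely
# illustrative distribution), and count how often the sum cancels
# down to the observed (125 GeV)^2 or below.

random.seed(0)
Lambda = 10_000.0   # assumed cutoff in GeV (10 TeV), illustrative
m_higgs = 125.0     # observed Higgs mass in GeV
trials = 1_000_000

hits = 0
for _ in range(trials):
    bare = random.uniform(-1, 1) * Lambda**2
    correction = random.uniform(-1, 1) * Lambda**2
    if abs(bare + correction) < m_higgs**2:
        hits += 1

fraction = hits / trials
print(f"fraction of 'natural' draws: {fraction:.2e}")
```

With this measure, the fraction comes out of order \(m_h^2/\Lambda^2 \sim 10^{-4}\): the observed lightness is "unlikely", which is exactly the sense in which naturalness flags the Higgs mass as a place to look for new physics rather than making a sharp prediction.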
Gross said that our inability to calculate something is a problem. I completely agree with that and I add that this problem is worse in the case of parameters that seem parametrically smaller than the expectations, because this gap suggests that there is some "big piece of meat" we are missing that changes the story substantially, not just some generic obscure calculation that spits out numbers of order one. That's where naturalness directs us, I add.
Matt is often drowning in a sea of vagueness – this is something we know from the discussions with him on the blogosphere, too. He tries to say something extremely unusual while claiming at the same time that he isn't saying anything nontrivial at all. You just can't have it both ways, Matt. In this case, he is saying that we're not spending our time wisely – that it's being focused too narrowly. Except that he never says in which direction one might or should broaden the interest or work.
Someone says he finds it frustrating that the reach for gluinos will only double from 1 TeV to 2 TeV in 2015 – not too big a difference. That's a reason to like a higher-energy machine.
Steve Giddings also points out a bug in Matt's logic concerning reductionism. Even if reductionism (meaning the need to study higher energies) were to end at X TeV, we would clearly need to go slightly above that level to find out that reductionism fails. Finally, Matt proposes a loophole. Maybe there are extremely light and extremely weakly coupled new effects somewhere, so going to higher energies doesn't help us. Great. So what should we measure instead of the larger collider data?
David Gross says that dark matter is an example of that, and Matt says that this makes his (Matt's) case stronger because according to many dark-matter models, one can't discover the new physics by going to higher energies. Well, right, it's plausible. But the difference is that there are "very many" such possible directions. Going to high energy means increasing the value of a quantity that is universal for all of physics – energy (or the inverse distance). Going to study very weakly coupled things means going in the direction of lowering every conceivable coupling constant anywhere, and there are just too many of them. We may try. We should try those cases that are justified by some arguments. But it is simply not true that any single march towards higher sensitivity in some particular coupling constant of some particular interaction is as important as our ability to go to higher energies. There is only one energy and it's the king; there are way too many coupling constants and each of them seems less fundamental and less universal than energy. So I don't really agree with Matt on this change of the bias, either – unless he tells us what the particular coupling constant or experiment is where it makes sense to go to "much better than considered" sensitivities.
Maybe Matt would propose to build a 1 GeV collider with the luminosity increased 1 billion times? Perhaps it could make sense for some potential possibilities. But he should at least propose such a thing explicitly instead of saying that others are narrow-minded just because they are doing everything that people have conceived so far.
At the very end, Joe Polchinski calmed all the bitterness and said that Matt was one of the young people who would come to Joe's office and pretty much solve a problem in confining gauge theory, getting the field-theory side 100% right and the string-theory side 80% right, so Joe added the remaining 20%. Joe improved the flattering joke by saying that this paper was never submitted for publication because it didn't meet Matt's standards LOL. Matt says it's not really true. Joe also said that he hadn't really deserved his PhD but that with Matt's help, and 15 years later, he had finally solved his thesis problem. ;-)