## Thursday, August 11, 2016

### Modern obsession with permanent revolutions in physics

Francisco Villatoro joined me and a few others in pointing out that it's silly to talk about crises in physics. The LHC just gave us a big package of data at unprecedented energies to confirm a theory that's been around for a mere four decades. It's really wonderful.

People want new revolutions and they want them quickly. This attitude to physics is associated with the modern era. It's not new, as I will argue, but on the other hand, I believe it was unknown in the era of Newton or even Maxwell. Modern physics – kickstarted by the discovery of relativity and quantum mechanics – has shown that something can be seriously wrong with the previous picture of physics. So people naturally apply "mathematical induction" and assume that a similar revolution will occur infinitely many times.

Well, I don't share this expectation. The framework of quantum mechanics is very likely to be with us forever. And even when it comes to the choice of the dynamics, string theory will probably be the most accurate theory forever – and quantum field theory will forever be the essential framework for effective theories.

David Gross' 1994 discussion of the 1938 conference in Warsaw shows that the desire to "abandon all the existing knowledge" is in no way new. It was surely common among physicists before the war.

Let me remind you that Gross has argued that all the heroes of physics were basically wrong and deluded about various rather elementary things – except for Oskar Klein, who presented his "almost Standard Model" back in the late 1930s. Klein was building on the experience with the Kaluza-Klein theory of extra dimensions, and his candidate for a theory of everything:

• appreciated the relationship between particles and fields, for fermions and bosons alike, which he treated analogously
• used elementary particles analogous to those in the Standard Model – gauge bosons as well as elementary fermions
• had doublets of fermions (electron, neutrino; proton, neutron) and some massless and massive bosons mediating the forces
• missed just a little bit needed to construct the full Yang-Mills Lagrangian etc.

If Klein had been given 10 brilliant graduate students, he might well have discovered the correct Standard Model before Hitler's invasion of Poland a year later. Well, they would probably have had to do some quick research on hadrons, to learn about the colorful $SU(3)$ and other things, but they would have had a chance. Three decades of extra work looks excessive from the contemporary viewpoint. But maybe we're even sillier these days.

Gross says that there were some borderline insane people at the conference – like Eddington – and even the sane giants were confused about the applicability of quantum fields for photons, among other things. And Werner Heisenberg, the main father of quantum mechanics, was among those who expected all of quantum mechanics to break down imminently. Gross recalls:
> Heisenberg concluded from the existence both of ultraviolet divergences and multi-particle production that there had to be a fundamental length of order the classical radius of the electron, below which the concept of length loses its significance and quantum mechanics breaks down. The classical electron radius, $e^2/mc^2$, is clearly associated with the divergent electron self-energy, but also happens to be the range of nuclear forces, so it has something to do with the second problem. Quantum mechanics itself, he said, should break down at these lengths. I have always been amazed at how willing the great inventors of quantum mechanics were to give it all up at the drop of a divergence or a new experimental discovery.
The electron Compton wavelength, roughly $10^{-13}$ meters, was spiritually their "Planck scale". Everything – probably including the general rules of quantum mechanics – was supposed to break down over there. We know that quantum mechanics (in its quantum field theory incarnation) works very well at distances around $10^{-20}$ meters, seven orders of magnitude shorter than the "limit" considered by Heisenberg.

This is just an accumulated piece of evidence supporting the statement that the belief in the "premature collapse of the status quo theories" has been a disease that physicists have suffered from for a century or so.

You know, if you want to localize the electron at distances shorter than the Compton wavelength, particle pair production becomes impossible to neglect. Also, the loop diagrams produce integrals dominated by their ultraviolet-divergent parts, suggesting that you're in a regime where the theory breaks down. In some sense, it was reasonable to expect that a "completely new theory" would have to take over.

In reality, we know that the divergences may be removed by renormalization and the theory – quantum field theory – has a much greater range of validity. In some sense, the "renormalized QED" may be viewed as the new theory that Heisenberg et al. had in mind. Except that by its defining equations, it's still the same theory as the QED written down around 1930. One simply adds rules for subtracting the infinities to get finite experimental predictions.
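The logic of the subtraction can be illustrated outside of QED with a toy example (my own illustration, not an actual QED computation): a cutoff-regularized, logarithmically divergent integral. Each term blows up as the cutoff $\Lambda$ is sent to infinity, but the difference of two such integrals – the analogue of a renormalized quantity – converges to a finite number:

```python
import math

def I(cutoff, m):
    # Closed form of the cutoff-regularized integral
    #   ∫_0^Λ k dk / (k^2 + m^2) = (1/2) ln(1 + Λ^2/m^2),
    # which diverges logarithmically as Λ → ∞.
    return 0.5 * math.log(1.0 + (cutoff / m) ** 2)

m1, m2 = 1.0, 3.0
for cutoff in (1e2, 1e4, 1e6):
    bare = I(cutoff, m1)                        # grows without bound with the cutoff
    subtracted = I(cutoff, m1) - I(cutoff, m2)  # cutoff dependence cancels
    print(f"cutoff = {cutoff:8.0e}   bare = {bare:7.3f}   subtracted = {subtracted:.6f}")

# The subtracted combination approaches ln(m2/m1) = ln 3 ≈ 1.098612,
# no matter how large the cutoff is taken to be.
```

This is the sense in which "renormalized QED" is still the same theory: the defining expressions are untouched, and only the prescription for combining divergent pieces into finite predictions is added.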

I want to argue that these two historical stories could be analogous:

• Heisenberg believed that around the electron Compton wavelength, $\sim 10^{-13}\,{\rm meters}$, all hell breaks loose because of particle production and UV divergences.
• Many phenomenologists have believed that around $1/m_{\rm Higgs}\sim 10^{-19}\,{\rm meters}$, all hell breaks loose in order to make the light Higgs mass natural.

In both cases, there is no "inevitable" reason why the theory should break down. The UV divergences are there and dominate at momenta $|p|\gg m_e$. But they don't imply an inconsistency because renormalization can deal with them.

In the case of naturalness, everyone knows that there is not even a potential for an inconsistency. The Standard Model is clearly a consistent effective field theory up to much higher energies. It just seems fine-tuned, correspondingly unnatural, and therefore "unlikely" – assuming that the parameters take some "rather random values" from the allowed parameter space, with some plausible measure on that space.
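How "unlikely" such a fine-tuning is under a random-parameter measure can be made quantitative with a toy Monte Carlo (all numbers here are invented for illustration; a far milder tuning of one part in $10^4$ stands in for the Higgs boson's roughly one part in $10^{32}$):

```python
import random

random.seed(0)

# Toy fine-tuning estimate: how often do two O(1) random contributions
# (think "bare mass squared" and "loop correction") cancel to one part
# in 10^4?  The numbers are invented; the real Higgs cancellation would
# need one part in ~10^32 and would essentially never occur by chance.
trials, eps = 10**6, 1e-4
hits = 0
for _ in range(trials):
    bare = random.uniform(-1.0, 1.0)
    correction = random.uniform(-1.0, 1.0)
    if abs(bare + correction) < eps:
        hits += 1

print(hits / trials)   # ≈ eps, i.e. the tuning is roughly as improbable as it is fine
```

The observed frequency tracks the fineness of the tuning, which is exactly the intuition behind calling a $10^{-32}$ cancellation "unlikely" under any smooth measure.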

In the end, Heisenberg was wrong that QED had to break down beneath the Compton wavelength. However, he was morally right about a broader point – that theories may break down and be replaced by others because of divergences. Fermi's four-fermion theory produces divergences that cannot be cured by renormalization, and that's the reason why W-bosons, Z-bosons, and the rest of the electroweak theory have to be added at the electroweak scale. An analogous enhancement of the whole quantum field theory framework to string theory is needed near the string/Planck scale or earlier, thanks to the analogous non-renormalizability of Einstein's GR.

So something about the general philosophy believed by Heisenberg was right, but the details just couldn't be trusted as mechanically as the folks in the 1930s tended to trust them. Whether QED was consistent at length scales shorter than the Compton wavelength was a subtle question, and the answer ultimately was: yes, it's consistent. So there was no reason why the theory "had to" break down, and it didn't break down at that point.

The reasons why the Standard Model should break down already at the electroweak scale are similarly vague and fuzzy. As I wrote a year ago, naturalness is fuzzy, subjective, model-dependent, and uncertain. You simply can't promote it to something that will reliably inform you about the next discovery in physics and its precise timing.

But naturalness is still a kind of argument that broadly works, much like Heisenberg's argument was right when applied more carefully in different, luckier contexts. One simply needs to be more relaxed about the validity of naturalness. There may be many reasons why things look unnatural even though they are actually natural. Just compare the situation with that of Heisenberg. Before the renormalization era, it may have seemed sensible to consider UV divergences a "proof" that the whole theory had to be superseded by a different one. But it wasn't true, for subtle reasons.

The relaxed usage of naturalness should include some "tolerance towards a hybrid thinking of naturalness and the anthropic selection". Naturalness and the anthropic reasoning are very different ways of thinking. But that doesn't mean that they're irreconcilable. Future physicists may very well be forced to take both of them into account. Let me offer you a futuristic, relaxed, Lumoesque interpretation why supersymmetry or superpartner masses close to the electroweak scale are preferred.

Are the statements made by the supporters of the "anthropic principle" universally wrong? Not at all. Some of them are true – in fact, tautologically true. For example, the laws of physics and the parameters etc. are such that they allow the existence of stars and life (and everything else we see around, too). You know, the subtle anthropic issue is that the anthropic people also want to okay other laws of physics that admit "some other forms of intelligent life" but clearly disagree with other features of our Universe. They look at some "trans-cosmic democracy" in which all intelligent beings, regardless of their race, sex, nationality, and string vacuum surrounding them, are allowed to vote in some Multiverse United Nations. ;-)

OK, my being an "opponent of the anthropic principle as a way to discover new physics" means that I don't believe in this multiverse multiculturalism. It's impossible to find rules that would separate objects in different vacua into those who can be considered our peers and those who can't. For example, even though the PC people are upset, I don't consider e.g. Muslims who just mindlessly worship Allah to be my peers, to be the "same kind of observers as I am". So you may guess what I could think about some even stupider bound states of some particles in a completely different vacuum of string theory. Is it bigotry or racism to consider some creatures from a different heterotic compactification subhuman? ;-)

So I tend to think that the only way to use the anthropic reasoning rationally is simply to allow the selection of the vacua according to everything we have already measured. I have measured that there exists intelligent life in the Universe surrounding me. But I have also measured the value of the electron's electric charge (as an undergrad, and I hated to write the report that almost no one was reading LOL). So I have collapsed the wave function into the space of the possible string vacua that are compatible with these – and all other – facts.

If all vacua were non-supersymmetric but numerous, I would agree with the anthropic people that it's enough to have one in which the Higgs mass is much lower than the Planck scale if you want to have life – with long-lived stars etc. So the anthropic selection is legitimate. It's totally OK to assume that the vacua that admit life are the focus of physics research, that there is an extra "filter" that picks the viable vacua and doesn't need further explanations.

However, what fanatical champions of the anthropic principle miss – and that may be an important point of mine – is that even if I allow this "life exists" selection of the vacua as a legitimate filter or a factor in the probability distributions for the vacua, I may still justifiably prefer the natural vacua with a rather low-energy supersymmetry breaking scale. Why?

Well, simply because these vacua are much more likely to produce life than the non-supersymmetric or high-SUSY-breaking-scale vacua! In those non-SUSY vacua, the Higgs is likely to be too heavy, and the probability that one gets a light Higgs (needed for life) is tiny. On the other hand, there may be a comparable number of vacua with low-energy SUSY and a mechanism that generates an exponentially low SUSY breaking scale (an instanton, a gluino condensate, something). And in this "comparably large" set of vacua, a much higher percentage will include a light Higgs boson and other things that are helpful or required for life.

So even if one reduces the "probability of some kind of a vacuum" to the "counting of vacua of various types", the usual bias equivalent to the conclusions of naturalness considerations may still emerge!

You know, some anthropic fanatics – and yes, I do think that even e.g. Nima has belonged to this set – often loved or love to say that once we appreciate the anthropic reasoning, it follows that we must abandon the requirement that the parameters be natural. Instead, the anthropic principle takes care of them. But this extreme "switch to the anthropic principle" is obviously wrong. It basically means that all the remaining physics arguments get "turned off". But it isn't possible to turn off physics. The naturalness-style arguments are bound to re-emerge even in a consistent scheme that takes the anthropic filters into account.

Take F-theory on a Calabi-Yau four-fold of a certain topology. It produces some number of non-SUSY (or high-energy SUSY) vacua, and some number of SUSY (low-energy SUSY) vacua. These two numbers may differ by a few orders of magnitude. But the probability of getting a light Higgs may be some $10^{30}$ times higher in the SUSY vacua. So the total number of viable SUSY vacua will be higher than the total number of viable non-SUSY vacua. We shouldn't think of this as some high-precision science because the pre-anthropic ratio of the numbers of vacua could have differed from one by an order of magnitude or two. But it's those thirty orders of magnitude (or twenty-nine) that make us prefer the low-energy SUSY vacua.
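The counting in the previous paragraph is simple enough to spell out with made-up numbers (every figure below is a hypothetical placeholder, not an actual F-theory count):

```python
# Toy version of the vacuum counting, with invented numbers.
N_susy    = 1e20    # hypothetical count of low-energy-SUSY vacua
N_nonsusy = 1e22    # hypothetical count of non-SUSY vacua: 100x more numerous

p_higgs_susy    = 1e-2    # a light Higgs is cheap once SUSY protects its mass
p_higgs_nonsusy = 1e-32   # ~ (m_Higgs/m_Planck)^2 fine-tuning price without SUSY

viable_susy    = N_susy * p_higgs_susy        # ~1e18 life-compatible SUSY vacua
viable_nonsusy = N_nonsusy * p_higgs_nonsusy  # ~1e-10: essentially none

print(viable_susy / viable_nonsusy)   # ≈ 1e28: SUSY vacua dominate the viable set
```

The anthropic "filter" is applied in both columns; the naturalness-style preference re-emerges from the counting because the pre-anthropic head start of the non-SUSY vacua (two orders of magnitude here) is dwarfed by the thirty-order fine-tuning price.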

On the other hand, there's no reliable argument that would imply that "new particles as light as the Higgs boson" have to exist. The argument sketched in the previous paragraph only works up to an order of magnitude or two (or a few).

You know, it's also possible that superpartners that are too light kill life for some reason, too; or there is no stringy vacuum in which the superpartners are too light relative to the Higgs boson. In that case, well, it's not the end of the world. The actual parameters allowed by string theory (and life) beat, by their superior credibility, whatever distribution you could believe otherwise. If the string vacuum with the lightest gluino compatible with the existing LHC observations has a $3\TeV$ gluino, then the gluino simply can't be lighter. You can protest against it but that's the only thing you can do against a fact of Nature. The actual constraints resulting from full-fledged string theory or a careful requirement of "the existence of life" always beat some vague distributions derived from the notion of naturalness.

So when I was listing the adjectives that naturalness deserves, another one could be "modest", i.e. "always prepared to be superseded by a more rigorous or quantitative argument or distribution". Naturalness is a belief that some parameters take values of order one – but we only need to talk about the values in this vague way up to the moment when we find a better, more precise, or more provable way to determine or constrain the value of the parameter.

Again, both the champions of the anthropic principle and the warriors for naturalness often build on exaggerated, fanatical, oversimplified, or naive theses. Everyone should think more carefully about the aspects of these two "philosophies" – their favorite one as well as the "opposite" one – and realize that there are lots of statements and principles in these "philosophies" that are obviously right, and also lots of statements made by the fanatical supporters that are obviously wrong. Even more importantly, "naturalness" and "anthropic arguments" are just the most philosophically flavored types of arguments in physics – aside from them, there still exist lots of normal, "technical" physics arguments. I am sure that the latter will make up a majority of physics in the future, just like they made up a majority of physics in the past.

In the end, I want to say that people could have talked about the scales in ways that resemble the modern treatment sometime in the 1930s, too. The first cutoff where theories were said to break down was the electron mass, below an ${\rm MeV}$. Quantum field theory was basically known in the 1930s. Experiments went from $1\keV$ through $1\MeV$ and $1\GeV$ up to $13\TeV$ – many, many orders of magnitude – but the framework of quantum field theory as the right effective theory survived. All the changes have been relatively minor since the 1930s. Despite the talk about some glorious decades in the past, people have just been adjusting technical details of quantum field theory since the 1930s.

And the theory was often ahead of experiments. In particular, new quarks (at least charm and top) were predicted before they were observed. The latest example of this gap was the discovery of the Higgs boson, which took place some 48 years after the particle was theoretically proposed. If string theory were experimentally proven 48 years after its first formula was written down, we would see a proof in 2016. But you know, the number 48 isn't a high-precision law of physics. ;-)

Both experimental discoveries and theoretical discoveries are still taking place. Theories are being constructed and refined every year – even in recent years. And the experiments are finding particles previously unknown to them – most recently, the Higgs boson in 2012. It's the "separate schedules" of theory and experiment that confuse lots of people. But if you realize that this is normal and has been a fact for many decades, you will see that there's nothing "unusually slow or frustrating" about the current era. Just try to fairly assess how many big experimental discoveries confirming big theories were made in the 1930s or 1940s or 1950s or 1980s etc.

The talk about frustration, nightmares, walls, and dead ends can't be justified by the evidence. It's mostly driven by certain people's anti-physics agenda.