He has attempted to convert the string theorists to the belief system of algebraic quantum field theory, which is not a trivial task. Algebraic quantum field theory is a newer incarnation of the older approach of axiomatic quantum field theory.

In this approach, the basic mathematical structure is the algebra of bounded operators acting on the Hilbert space. In fact, for every spacetime region R, they argue, one can define a subalgebra of the full algebra of operators. A goal is to construct - or at least prove the existence of - quantum field theories that do not depend on any classical starting point.

This is a nice goal. String theorists know S-dualities and many other phenomena in field theory and string theory which imply that a quantum theory can have many classical descriptions - more precisely, many classical limits - so we are certainly open to the possibility that we will eventually be able to formulate our interesting theories without any direct reference to a classical starting point. Instead, we will be able to derive all possible classical limits of string/M-theory from a purely non-classical starting point.

On the other hand, the particle physics and string theory communities are deeply rooted in experimental physics and we simply do not want to talk about some abstract concepts without having any particular theory that can at least in principle predict the results of experiments and that respects these concepts. In fact, we want to focus on theories of the same kind that are relevant for observational physics.

There is some kind of trade-off going on here: you want to be general, but on the other hand, you want to learn or predict something relevant about the observed physical phenomena and make progress.

Having a classical limit makes things extremely concrete and calculable which is one of the reasons why quantum theories constructed as a quantization of a particular starting point have been so successful, especially whenever they're weakly coupled. There exist extremists who would like to ban any physics that is based on a classical starting point or a background - and Prof. Buchholz is arguably not one of them.

As you can see, this trade-off between generality and relevance is completely analogous to the question of background independence. We would eventually like to have a theory that allows all types of backgrounds we need - but on the other hand, we must know that our assumptions still allow for the interesting theories that are relevant for experiments. And moreover, we must still be able to find new and true insights about these theories.

As you can guess, your humble correspondent thinks that both the loop quantum gravity community and the algebraic quantum field theory community are far from the ideal equilibrium: their hypothetical idealized theories are so carefully disconnected from perturbative expansions around the classical limits and from backgrounds that they are also rather safely disconnected from observationally relevant insights.

Prof. Buchholz has introduced us to some basic axioms of algebraic quantum field theory. The more concrete part of the talk was a mathematical proof of the existence of a particular infinite-dimensional family of integrable but interacting two-dimensional theories. Because I am using the word "theory" with a less general meaning than the meaning in algebraic quantum field theory, the term "certain formulae" would seem a more accurate description to me. My knowledge of these rigorous tools was not sufficient for me to understand in which sense the proof was non-trivial, so I cannot comment on this particular issue.

Differences

There are four main groups of topics that are answered and interpreted very differently by the AQFT community and the conventional high-energy physics community:

- the ultraviolet physics
- the infrared physics
- the physics in between
- the philosophy of physics

Ultraviolet physics

A conventional 21st century particle physicist or string theorist would tell you that the methods of the renormalization group are the key to understanding the actual interrelations between short-distance physics and long-distance physics: the details of short-distance physics only influence long-distance physics through a small number of parameters. These parameters can be determined without exact knowledge of the short-distance physics. Physical phenomena can be organized according to the length scale or the energy scale. The details of short-distance physics, especially the coefficients of very high dimension operators, simply become unimportant or irrelevant at long distances.
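The suppression of high-dimension operators can be written schematically - this is standard effective-field-theory notation, not anything specific to the talk. An operator of mass dimension d > 4 enters the effective Lagrangian with a coefficient set by the short-distance scale Lambda:

```latex
\mathcal{L}_{\rm eff} \;=\; \mathcal{L}_{\rm ren} \;+\; \sum_{d>4} \frac{c_d}{\Lambda^{d-4}}\,\mathcal{O}_d
```

so the contribution of each such operator to observables at energies E much below Lambda is suppressed by powers of (E/Lambda)^{d-4}, which is why the detailed coefficients c_d are "irrelevant" at long distances.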

This system of thinking allows us to determine what the experiments actually tell us and what they don't. Also, the fact that the laws are organized according to the scale explains why humans could have made gradual progress in the first place. This philosophy also puts the ultraviolet, short-distance divergences in the proper light: these divergences are not real physical inconsistencies because they can be cured by some kind of short distance physics or a regulator.

In renormalizable theories, one can show that physics at energies well below the cutoff scale - the energy scale above which the theory breaks down - is determined just by a few (finitely many) parameters. If this is the case, a predictive theory exists; otherwise it does not. These principles are very important for us. Even in string theory, which eliminates ultraviolet divergences completely and which is valid and exact at arbitrarily short distance scales, we find the renormalization group important. The insights of the renormalization group are also important in condensed matter physics. The renormalization group equations can be related to the dependence on the holographic dimension in the AdS/CFT correspondence.
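As a concrete toy illustration of RG running - a sketch assuming one-loop QED with only the electron in the loop, ignoring all other charged particles and thresholds - one can integrate the renormalization group equation for the fine-structure constant numerically and compare the result with the closed form:

```python
import math

def alpha_one_loop(alpha0, mu0, mu):
    """Closed-form one-loop running of the QED coupling with a single
    Dirac fermion in the loop: d(alpha)/d(ln mu) = 2*alpha**2/(3*pi)."""
    t = math.log(mu / mu0)
    return alpha0 / (1.0 - (2.0 * alpha0 / (3.0 * math.pi)) * t)

def alpha_numeric(alpha0, mu0, mu, steps=100000):
    """Integrate the same RG equation with simple Euler steps in ln(mu)."""
    t_total = math.log(mu / mu0)
    dt = t_total / steps
    a = alpha0
    for _ in range(steps):
        a += (2.0 * a * a / (3.0 * math.pi)) * dt
    return a

alpha_me = 1 / 137.036  # fine-structure constant near the electron mass scale
# run from m_e ~ 0.000511 GeV up to M_Z ~ 91.19 GeV (electron loop only)
a_closed = alpha_one_loop(alpha_me, 0.000511, 91.19)
a_euler = alpha_numeric(alpha_me, 0.000511, 91.19)
print(a_closed, a_euler)  # both slightly larger than alpha at the electron mass
```

The coupling grows only logarithmically over five orders of magnitude in energy - a quantitative version of the statement that long-distance physics depends on short-distance physics only through a few slowly varying parameters.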

Prof. Buchholz has strengthened my feeling that these RG insights play no role in algebraic quantum field theory; they seem to be entirely ignored. And yes, I even think that the concept of the idealized operator algebra is inconsistent with the insights of the renormalization group. The operators in quantum field theory are only finite with respect to a particular renormalization mass scale; they're well-defined only after we choose a renormalization scheme. The rigorous operator algebra seems to be an attempt to remove this dependence - i.e. to return to the naive picture where the cutoff is effectively sent to infinity. I feel that this can't work, not even in UV complete theories. Theories in low spacetime dimensions may be an exception.

Note that in conformal field theories, the correlator of two operators of the same dimension "Delta" goes like the distance of the two points to the power of "-2Delta". The composite operators have anomalous dimensions and can't be interpreted as simple products of the "elementary" operators in some algebra. The product of two operators at exactly coinciding points is always a singular object. Such a singular product occurs even if you multiply two operators that are averaged over some regions of space. A choice of the UV cutoff or a renormalization scale is necessary to define products in an algebra; an appropriate power of this scale gives the products the right dimensions.
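In formulas - standard conformal field theory conventions, nothing specific to the talk - for a primary operator of dimension Delta,

```latex
\langle \mathcal{O}(x)\,\mathcal{O}(y) \rangle \;=\; \frac{C}{|x-y|^{2\Delta}},
\qquad
\mathcal{O}(x)\,\mathcal{O}(y) \;\sim\; \sum_k \frac{c_k}{|x-y|^{2\Delta-\Delta_k}}\,\mathcal{O}_k(y)
```

and the operator product expansion on the right makes the singularity of the coinciding-point limit explicit: the product is not an element of a naive algebra but a sum of operators with coefficients that diverge as the points approach each other.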

An important group of questions in algebraic quantum field theory is apparently the choice of the test functions that define how the operators are smeared over finite regions. The researchers seem to worry about various "paradoxes" such as the ability to find negatively-normed states in the perturbative expansion with a sufficiently pathological choice of the test functions.

I find all these constructions completely unphysical. The existence of such pathological constructions does not mean that there is anything wrong with the theories because the pathological test functions don't correspond to anything that can occur in the experiments, not even in principle. In field theory, even if it is UV complete, we should first define a cutoff and we should never consider test functions that are changing drastically at distances shorter than the cutoff. This implies no limitation of predictivity because the cutoff energy scale can be chosen arbitrarily high and the measurable results at finite energies can be shown to be independent of the cutoff as long as the energy cutoff is high enough.

Infrared physics

According to conventional 21st century particle physics as we teach it, the nature of infrared divergences is very different from the character of the ultraviolet ones. They cannot be eliminated by a more careful definition of the laws of physics: the infrared divergences are real. They inform us that we have asked a wrong question. If you collide an electron with a positron and tell them to annihilate, you produce two hard photons but the one-loop diagrams contributing to this production are infrared-divergent.

What do you do with this infinity? You realize that you have asked a wrong question. You should not ask what the cross section is to create two photons from the initial electron-positron pair. You should ask what the cross section is to create two photons plus an arbitrary number of additional photons whose energy is so small, below "epsilon", that you can't really detect them.

Once you admit that you can't measure the photons whose energy is below "epsilon", you eliminate the infrared divergences from the loop diagrams, and the part that diverges for "epsilon" approaching zero actually cancels against the cross section where you produce two visible photons plus one additional invisible photon whose energy is below "epsilon". If you don't like this "epsilon", you should realize that its value can be chosen arbitrarily small. If you choose any finite "epsilon", the infrared divergences disappear. The measurable results are independent of "epsilon".

For any finite "epsilon", the infrared problems are absent. In spacetimes of higher dimensionality, these infrared problems do not arise at all. They don't arise even though the higher-dimensional theories are closely related to the four-dimensional ones; the latter can be thought of as compactifications of the former. The infrared divergences in "d=4" don't mean that the internal mathematical structure of the theories in "d=4" and "d=6" is completely different; they just mean that we must carefully choose the questions and the nature of the right questions may depend on the dimension.
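The bookkeeping of this cancellation can be sketched in a deliberately trivial toy model - the coefficient A and the functions below are invented for illustration and stand in for the actual order-alpha QED expressions:

```python
import math

# Toy bookkeeping of the soft-photon (Bloch-Nordsieck-style) cancellation.
# The numbers are illustrative, not a real QED computation.
A = 0.01   # stands in for (alpha/pi) times a process-dependent constant
Q = 100.0  # hard scale of the process, e.g. the collision energy in GeV

def sigma_two_photons(eps):
    """Rate for exactly two (visible) photons: the virtual correction
    carries a -A*log(Q/eps) logarithm that diverges as eps -> 0."""
    return 1.0 - A * math.log(Q / eps)

def sigma_extra_soft_photon(eps):
    """Rate for two photons plus one undetectable photon softer than eps:
    real emission carries the opposite +A*log(Q/eps) logarithm."""
    return A * math.log(Q / eps)

# The measurable, inclusive rate does not depend on the detector
# threshold eps at all - the logarithms cancel for any finite eps:
rates = [sigma_two_photons(e) + sigma_extra_soft_photon(e)
         for e in (1e-3, 1e-6, 1e-9)]
print(rates)  # all equal to 1.0 at this order
```

The point of the toy model is only that each term separately diverges as "epsilon" goes to zero, while the physically meaningful, inclusive sum is "epsilon"-independent.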

At any rate, if you ask physically meaningful questions with a cutoff, these problems become non-problems. Moreover, you will always be allowed to assume that the Feynman diagrams with additional vertices will be suppressed relative to the diagrams with fewer vertices: they will be suppressed by a power of the fine-structure constant. The perturbative expansion in nice theories such as QED works whenever the coupling constant is small and whenever we ask meaningful questions. Prof. Buchholz seems to disagree. Algebraic quantum field theory seems to think that the perturbative expansions are lethally ill in some sense, and I have not understood the justification of this feeling.

Both in the case of the ultraviolet as well as the infrared divergences, the fact that the loop diagrams diverge means that you must deal with these divergences properly, and if you do so, the higher-order terms will always be suppressed by powers of the fine-structure constant because all the coefficients will become finite. Tree level diagrams, whenever they are non-zero, dominate over the loop diagrams in all physically meaningful observables.

The algebraic quantum field theory community seems to disagree. Just because a loop contribution is infrared divergent, they assume that the "infinite" higher-order terms dominate over the lower-order terms. They think that the higher-order terms make the theory inconsistent; they make the perturbative expansion break down; and some actual correlators become either zero or infinity. All these conclusions are incorrect, I think. Things can only become zero or infinity if we ask unphysical questions. And we should never do so.

There also exists another "lore" in algebraic quantum field theory that I find irritating: the infraparticle. Unfortunately, with all my respect for Prof. Buchholz, he is associated with this concept. The statement is that one-particle (one-electron) states that are contained within the same superselection sector of the Hilbert space can't form a representation under the Lorentz group, just because of the existence of soft photons that are produced in most realistic processes.

Of course, I disagree with this description. Wigner's classification of one-particle states is an absolutely robust quantum conclusion that works at any coupling and does not depend on the perturbative expansions. Start with any kind of dirty electron you want. Wait for a long enough time. Every photon that can escape to infinity will escape. In the limit, you end up with the electron, a single-particle state, with the smallest possible energy. Put an arbitrarily large box around the electron - one whose size is still smaller than the distance light travels during the waiting time. What you end up with is an arbitrarily good approximation of the Minkowski space.

The punch line is that you can isolate "pure" one-electron states in the Minkowski space. For two electrons very far from each other, you can also find, by locality, a two-particle state made out of two "pure" electrons as a tensor product. For nearby particles, the simple tensor product construction won't be accurate because of their interactions but there is a construction of the right Hilbert space whose basis can be chosen to be a Fock space of all particles and their bound states.

We always consider initial states that contain the pure particle states without any kind of "soft cloud" that would raise their energy. Many soft massless particles may be contained in the final state, and we deal with them by introducing the infrared cutoff as explained above. If you add analogous extremely soft particles (below the measurable threshold) to the initial state, you can show that measurable physics won't be affected either. There is no open question here and there is no room for violations of Wigner's unitary action of the Lorentz symmetry on one-particle states.

The question of soft photons can be confusing - but it can also be completely comprehensible and meaningfully answered. In algebraic quantum field theory, the first alternative occurs; in conventional 21st century particle physics, it is the second alternative that we believe to be correct. I prefer the second alternative, too: things should be clear especially if they can be clear. Incidentally, these "soft photon clouds" could also obscure the uniqueness of the vacuum itself.

Cumrun Vafa was surprised that Prof. Buchholz did not include the uniqueness of the vacuum among his axioms - and instead offered slightly confusing comments about the possibility to work with "mixed states" containing "different vacua". I find mixed states constructed from vacua in different superselection sectors very problematic.

Physics in between

There are differences in the understanding of ultraviolet and infrared effects. But there are differences in between, too. Rajesh Gopakumar asked the same question as I did yesterday: is the algebraic quantum field theory formalism powerful enough to include gauge theories - quantum field theories whose importance for 21st century physics can't be overstated? Does the formalism allow gauge-non-invariant operators as intermediate results that are eventually removed by imposing gauge invariance and/or BRST invariance? Is there room for the Faddeev-Popov ghosts? If these things are not allowed and if we must work with gauge-invariant operators only, how could we ever derive the contributions of the Faddeev-Popov ghosts to one-loop quantities such as the beta-function?

I think that the answer that both of us obtained has made it rather clear that the field of algebraic quantum field theory has no idea how to deal with these questions that have become a routine in high-energy physics. Similar comments apply to other important phenomena such as gauge anomalies. How would you ever prove, in the context of algebraic quantum field theory, that the Standard Model is anomalous without the leptons? Note that this question can be formulated without any reference to the particular classical Lagrangian: we can describe the theory by its spectrum of spin 1/2 and spin 1 massless particles (let's avoid the Higgs mechanism in this gedanken experiment). The diagrams that calculate the anomaly can look like perturbative diagrams but it is also true that the one-loop contribution is actually exact. There are no further corrections.
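The anomaly counting hinted at above can be made concrete with exact arithmetic - a sketch of the standard triangle-anomaly bookkeeping for the cubic hypercharge anomaly of one generation, using the convention Q = T3 + Y with all fermions written as left-handed Weyl fields:

```python
from fractions import Fraction as F

# One Standard Model generation as left-handed Weyl fermions:
# (multiplicity = color copies x SU(2) components, hypercharge Y).
# Right-handed fields appear as their left-handed conjugates.
quarks = [
    (6, F(1, 6)),    # quark doublet Q: 3 colors x 2 components, Y = 1/6
    (3, F(-2, 3)),   # u^c (conjugate of u_R), 3 colors, Y = -2/3
    (3, F(1, 3)),    # d^c (conjugate of d_R), 3 colors, Y = 1/3
]
leptons = [
    (2, F(-1, 2)),   # lepton doublet L, Y = -1/2
    (1, F(1, 1)),    # e^c (conjugate of e_R), Y = 1
]

def cubic_anomaly(fields):
    """Coefficient of the [U(1)_Y]^3 triangle: sum of Y^3 over the
    left-handed Weyl fermions, weighted by their multiplicities."""
    return sum(n * y**3 for n, y in fields)

full = cubic_anomaly(quarks + leptons)
no_leptons = cubic_anomaly(quarks)
print(full, no_leptons)  # 0 and -3/4: the leptons are needed to cancel it
```

The computation refers only to the spectrum of massless fermions and their charges, not to any classical Lagrangian - exactly the kind of consistency condition one would want an axiomatic framework to be able to reproduce.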

Moreover, the concept of local operators and operators that can be uniquely associated to spacetime regions seems to contradict many insights about dualities. Strong-weak dualities typically interchange elementary excitations with solitons. Elementary excitations are created by local operators but solitons are non-local, extended configurations or solutions. It is clear that at a generic value of the coupling constant, a democracy between the elementary excitations and the solitons arises and these states can't be quite localized. All of them must be slightly extended. And they are mutually non-local. We can estimate the size of these objects in many cases. The rules of the operator algebras seem too strict and too point-like. I feel that many particular developments in gauge theories find no support in algebraic quantum field theory and are perhaps marginally inconsistent with it, at least in four dimensions and higher:

- gauge invariance, the contributions of Faddeev-Popov ghosts in loops
- spontaneous symmetry breaking
- chiral symmetry breaking and supersymmetry breaking by fermionic bilinears
- strong-weak duality
- non-commutative extensions of field theories

- anomaly cancellation as a non-trivial condition on the spectrum
- and, of course, the existence of gravity, diffeomorphism symmetry, all of its extensions from string theory, and holography

Philosophy of physics

This leads me to the final category of differences: the differences in philosophy. I find the approach of algebraic quantum field theory dogmatic and disconnected from the actual experiments and from the principles whose importance may be deduced from the experiments. When we are working on theoretical physics, the contact with experiment is the ultimate judge. Because we are often working on questions that are hard to test by direct experiments, the contact cannot be direct.

But we still care about an indirect contact. Our theories must be predictive at least in hypothetical universes that are qualitatively analogous to ours. They must contain qualitatively analogous ingredients that turned out to be important for the description of the actual physics around us: unitarity, gauge symmetries, diffeomorphism symmetries, Lorentz invariance, dependence of physics on the scale, the Higgs mechanism, gauge anomaly cancellation, and so forth. I think that the approach to physics in which these ingredients of our picture of the universe are treated as details that are less important than some untested "principles" - such as the existence of a cutoff-independent algebra or background independence - is a scientifically flawed approach.

Our experience with string theory teaches us that if we insist on the right principles, those that have actually been proved important by the experiments, we also obtain algebraically beautiful mathematics that fits together, gives us new and fresh ideas about how to solve old puzzles, and allows us to answer questions that mathematicians found difficult, by applying our physics intuition.

I am afraid that the approach of algebraic quantum field theory is very different. It starts with some dogmas - and some of them are most likely not satisfied in the real world. As explained above, the approach seems to require that you can define all physical quantities without any reference to energy scales, ultraviolet cutoff, and the infrared cutoff. It even insists that you must be able to combine the operator distributions with arbitrary test functions, and still expect nice results from your theories.

Many of these assumptions seem incorrect to me. Algebraic quantum field theory respects the principles of quantum mechanics - the principles that are relevant to calculate the "1/n^2" spectrum of the Hydrogen atom. But it fails to respect the new subtle features that distinguish quantum field theory from simple quantum mechanics, especially those related to the correct role of cutoffs of both types and to gauge symmetries.

Also, the algebraic quantum field theory approach does not seem to ask the question whether we are asking the right questions or not; whether we are making reasonable assumptions; whether the mathematical structures that satisfy our axioms are interesting. Some axioms are probably legitimate and they are satisfied by the physically relevant and mathematically attractive theories. But these axioms may be too weak. Some other axioms that are added could be, on the contrary, too strong: they may be violated by the physically relevant and mathematically beautiful theories that we normally study.

It is very important to know whether our axioms have the secret power to lead us to new interesting theories that may describe reality and that will be praised as mathematically pretty in the future. It is very important to think whether the insights that seem to be true are true exactly or just approximately. The feedback mechanism is very important. If a direct feedback from the experimenters is impossible, we must rely on the feedback of theoretical experiments. I mean the work of colleagues who investigate particular examples and who can say whether a general framework is interesting - or whether it is too loose or too constraining.

The question which questions in physics are good questions is itself a scientific question, even though it may look vague. This meta-question must be studied without prejudice, by following the scientific method. This is the main reason why I feel that most of the conceptual assumptions of algebraic quantum field theory have been superseded by more concrete and more coherent insights, many of which have already been confirmed experimentally.
