## Friday, June 27, 2008

### Basic concepts of physics and quantum gravity: some lore

In this text, I would like to describe how the essential concepts of theoretical physics - defined as a science linking the language of mathematics with the fundamental facts about the real world - and the qualitative statements about the real world evolved as people found ever more accurate theories of the Universe.

The focus will be on the most recent insights in the context of quantum gravity but let us begin at the very beginning. Well, almost.

#### Ancient Greece: astronomy, geometry, statics

Animals have been observing the real world for tens of millions of years but their observations lacked the kind of mathematical rigor that theoretical physicists are interested in. Monkeys have always been doing some kind of physics (involving bananas and other objects) but it was a very applied sort of physics and the accuracy wasn't great.

That's why we jump right into the epoch of ancient civilizations. They were observing the real world and besides number theory - a discipline for counting cows and assets - geometry became the oldest branch of quantitative natural science. Astronomy was one of the first applications. People learned something about the apparent trajectories of celestial bodies but they didn't understand their origin.

In the terrestrial context, geometry - today viewed as a branch of mathematics - became the oldest discipline of physics, one whose goal was to study the possible relationships between perfectly rigid bodies represented as geometrical objects. Euclidean geometry was born and people thought that it could be directly applied to the real world. In this setup, geometry was good only for understanding statics.

When you were a kid, did you ever think about the real world in terms of a binary function from space to the set {0,1}?

My emphasis on statics doesn't mean that they couldn't imagine objects moving. They could move them around but they knew almost nothing about how objects naturally move in the real world and especially why they move in one way rather than another. Time was therefore just an arbitrary external parameter. The precise dependence of things on time was unknown, much like the notion of time derivatives.

Dynamics was not understood at all and Aristotle's anti-Newtonian principle, claiming that things always come to rest unless pushed, is a great example of the limitations of their dynamics. At any rate, the Greeks (and others) were already good at statics. They understood the real world as a collection of points and other objects in three-dimensional Euclidean space. Geometry as a branch of mathematics even became useful in their lives.

#### Modern science

Let's move on. Galileo Galilei became the main father of modern science as well as the father of dynamics. He was the first to realize that the motion of objects - and other previously static quantities - could be understood in terms of functions of time, a variable measured by clocks, and that it was damn important to find out what the correct mathematical functions were.

In the gravitational context, he realized that the distance fallen increases with the second power of time. Isaac Newton combined Galileo's preliminary insights about kinematics and dynamics with rather accurate phenomenological rules describing the trajectories of planets in the old-fashioned, purely descriptive (non-why) approach - Kepler's laws - and discovered the first quantitative dynamical laws.
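Galileo's law of fall is easy to check numerically. The sketch below (plain Python, using the modern value g ≈ 9.81 m/s², which is of course not how Galileo stated it) shows that the total distances after 1, 2, 3 seconds scale as 1 : 4 : 9, which is equivalent to the "odd-number rule" 1 : 3 : 5 for successive equal intervals that Galileo actually observed on inclined planes.

```python
# Galileo's law of free fall: d(t) = (1/2) * g * t^2
g = 9.81  # gravitational acceleration in m/s^2 (modern value, used for illustration)

def distance_fallen(t):
    """Distance fallen from rest after t seconds."""
    return 0.5 * g * t ** 2

# Total distances after 1, 2, 3 seconds scale as the squares 1 : 4 : 9.
distances = [distance_fallen(t) for t in (1, 2, 3)]
ratios = [d / distances[0] for d in distances]
print([round(r) for r in ratios])  # -> [1, 4, 9]

# Distances covered in successive equal intervals therefore go as 1 : 3 : 5,
# the odd-number rule observed on inclined planes.
increments = [distances[0]] + [b - a for a, b in zip(distances, distances[1:])]
print([round(i / increments[0]) for i in increments])  # -> [1, 3, 5]
```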

#### Classical physics

In Newton's picture, the world was described in terms of particles that objectively had some positions, and these positions - being functions of time - evolved according to differential equations, a mathematical concept that Newton had to invent along the way.

The word "particles" is just an intuitive label meant to simplify our imagination but what matters mathematically are the positions - canonical coordinates - that evolve according to some equations. These coordinates are thought of as objectively existing numerical features of reality.

The "natural" differential equations were later derived from the Hamiltonian approach and the Lagrangian approach.

#### Field theory

The equations above could have been used for a large number of particles. When their number (and density) is high enough, a statistical treatment becomes appropriate and the resulting "continuum" may be described in terms of partial differential equations.

These equations were relevant for solids, liquids, and gases. While you can view them as limits of a description of a large number of atoms, it is also possible to interpret these partial differential equations as being fundamental. This approach eventually won in the context of the electromagnetic field even though the whole misguided aether movement may be viewed as an attempt to prove that only ordinary differential equations, and not the partial ones, may be fundamental in the real world.

Classical field theory continues to interpret the world as a set of "canonical coordinates" that evolve in time according to differential equations. But because the number of these "canonical coordinates" is large - they depend on additional parameters, the spatial coordinates - we need partial differential equations rather than ordinary differential equations with respect to time.

At some moment, people attempted to combine fields and particles, imagining that the world was made out of both. These composite pictures were phenomenologically useful in many contexts but they always ran into problems when promoted to fundamental equations. The infinite self-energy of a classical point-like electron was an example.

#### Relativity

Special relativity has preserved the basic picture based on partial (and ordinary) differential equations but it has unified space and time. The spatial and temporal derivatives in the fundamental equations have always looked similar. In relativity, they may be treated on equal footing. One is invited to think about the whole spacetime as the ultimate reality. The spacetime has some additional symmetry mixing space and time - the Lorentz symmetry. The acceptable laws of Nature as well as the vacuum - empty space - are postulated to be invariant under this new symmetry. This constraint dramatically reduces the number of acceptable theories.

When we focus on the Lorentz-invariant theories, we are already naturally residing in the class of causal or local theories where signals never propagate faster than light. The latter condition is a relativistic version of the pre-relativistic causality that allowed the future to be influenced by the past but not the other way around.

General relativity preserves the general picture based on differential equations, too. But it allows you to reparameterize the coordinates of spacetime in an arbitrary, non-linear way. Theories are still required to be invariant under all these transformations but the vacuum is not. Non-linear transformations of the empty space correspond to a world with fictitious, inertial forces seen by accelerating observers.

Because this framework has some new degrees of freedom that are needed to remember these inertial forces - namely the metric tensor - it turns out that the framework automatically predicts gravity, too. The gravitational field is influenced by matter via known equations, namely Einstein's equations, and it acts on matter by requiring that matter moves along geodesics. Because gravity is produced in the same way as inertial forces, the equivalence principle is automatically explained: all bodies accelerate equally in the same gravitational field.

The symmetry of special relativity has unified many concepts that were previously thought of as independent: space and time became parts of the same 4-vector, much like energy and momentum. These two unifications implied that mass and energy are really the same thing. Electricity and magnetism became two sides of the same coin, too. And I could continue.

#### Quantum mechanics

Although the frameworks above used diverse symmetries and various types (and numbers) of degrees of freedom as well as sets of equations they followed, the basic interpretation was always identical. These numbers are "real" and the laws evolve them deterministically.

The quantum revolution was therefore the first (and most likely also the last) true upgrade of this conceptual heart of theoretical physics.

In quantum mechanics, events cannot be predicted deterministically. Only probabilities of different outcomes may be predicted. They arise as the squared absolute values of complex amplitudes. The coordinates, velocities, and other quantities - the observables - are represented by linear operators acting on the Hilbert space of allowed states. The linearity of the Hilbert space physically means that the superposition principle holds - any superposition of allowed states is an allowed state. Evolution is expressed in terms of a unitary operator. These are the universal postulates of quantum mechanics.
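The Born rule mentioned in the postulates above - probabilities arising as squared absolute values of complex amplitudes - can be illustrated with a toy three-state system. This is only a minimal sketch; the particular amplitudes are made up for illustration:

```python
import math

# A toy state in a 3-dimensional Hilbert space: a list of complex amplitudes.
# These particular numbers are arbitrary, chosen only so the state is normalized.
amplitudes = [1 / math.sqrt(2), 0.5j, -0.5]

# Born rule: the probability of outcome i is |amplitude_i|^2.
probabilities = [abs(a) ** 2 for a in amplitudes]
print([round(p, 6) for p in probabilities])  # -> [0.5, 0.25, 0.25]

# The probabilities of all mutually exclusive outcomes sum to one,
# which is what "the state is normalized" means physically.
print(round(sum(probabilities), 6))  # -> 1.0
```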

But the postulates are not everything we need. Much like in classical physics, we must ask what the degrees of freedom are and what equations (Heisenberg equations, if I use the picture whose equations resemble those of classical physics) they follow. It turns out that well-behaved classical systems usually have their quantum counterparts that can be deduced by the process of "quantization". But that's just a heuristic trick. Once we talk about quantum mechanics, it is only the full quantum mechanical theory that "really" exists and the classical theory is "only" its limit that may or may not exist. In the familiar "schoolboy" examples of quantum theories, it does exist: all familiar quantum theories may be obtained from a classical theory.

Quantum mechanics may be formulated in one of several "pictures" that can be shown to be completely equivalent as far as the ultimate predictions of observable phenomena go. This type of redundancy is a commonplace feature of theories that are on the right track and that are also sufficiently well understood.

#### Quantum field theory

We have found a completely new "heart" of theoretical physics. Successful classical theories, such as particles with the Coulomb force, may be quantized to obtain very useful and realistic quantum equations (describing all of chemistry, among other things).

However, classical field theories may be quantized as well. It turns out that the energy spectrum becomes discrete and quantum fields may be equivalently interpreted as systems of particles. In the quantum context, the dilemma whether we should use particles or fields (or both) as fundamental objects evaporates. When you do it properly, it is really the same thing. If you start with fields, you may derive the quantization rules and you discover particles. If you start with particles and decide to define quantum rules for their wave functions that are fully compatible with relativity, you end up with quantum fields, too.

Because it remains natural to look for theories that have a Lagrangian and we still want the principles of special relativity to be obeyed, we are led to a pretty natural class of theories and the Standard Model is the most phenomenologically relevant representative of this class.

In these theories, one decides what the fields are and how they interact. The interpretation coincides with the basic interpretation of a quantum theory - the quantum postulates hold - but the number of degrees of freedom is large enough for us to be able to describe an arbitrary configuration of particles of different types (or the corresponding fields) and their interactions.

The number of psychologically different approaches to such theories increases. We can use not only the Schrödinger, Heisenberg, or Dirac (interaction) picture but also Feynman's path integral approach. In all cases, the ultimate goal is to calculate some probabilities of various outcomes of experiments with particles and fields and/or statistical expectation values of various operators - observables that can be measured.

If we were talking about technicalities such as gauge symmetries, the number of physically equivalent methods to treat them is large, too.

#### Interacting theories: features

In the text above, you were led to think of the "free fields" giving us the non-interacting particle species that are subsequently supplemented with interactions. And we usually choose the renormalizable interactions only. However, this description only reflects a particular "perturbative" method by which quantum field theories may be constructed.

If the interactions are strong or if you want to be very general, you don't want to think in this way. The free particles can't be seen in general - for example, quarks don't exist as actual isolated particles. Instead, you want to think about more physical - directly observable - objects that can be calculated from your theory, too.

Various scattering amplitudes and Green's functions can be analytically continued to complex values of energies and momenta. It turns out that these functions must be analytic almost everywhere and every non-analyticity has to have a physical interpretation such as a new particle (e.g. a bound state).

The collection of single poles and cuts in your scattering amplitude describes the spectrum of physical particles. Moreover, various general bounds and facts about the scattering and other amplitudes may be extracted.
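A standard toy illustration of how a pole encodes a particle is the Breit-Wigner amplitude A(E) = 1/(E − M + iΓ/2), whose pole at the complex energy E = M − iΓ/2 shows up on the real axis as a resonance bump of width Γ. The mass and width below are arbitrary illustrative numbers, not taken from any real process:

```python
# Breit-Wigner amplitude: a pole at complex energy E = M - i*Gamma/2
# appears on the real energy axis as a resonance of width Gamma.
M, Gamma = 1.0, 0.1   # illustrative mass and width (arbitrary units)

def amplitude(E):
    return 1.0 / (E - M + 1j * Gamma / 2)

# Scan |A(E)|^2 along real energies: it peaks at E = M ...
energies = [0.5 + 0.001 * k for k in range(1001)]
values = [abs(amplitude(E)) ** 2 for E in energies]
E_peak = energies[values.index(max(values))]
print(round(E_peak, 3))  # -> 1.0

# ... and the full width at half maximum comes out approximately Gamma.
half = max(values) / 2
above = [E for E, v in zip(energies, values) if v >= half]
print(above[-1] - above[0])  # approximately 0.1
```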

It is possible to think about a quantum field theory as a theory with a list of particle species following the principles of quantum mechanics - the different entries appear as poles in scattering amplitudes - and the scattering amplitudes for all possible processes involving these particles.

The set of a priori possible theories of this kind would be huge but we usually consider theories that are close to renormalizable theories or, almost equivalently but more invariantly, theories that can be obtained as long-distance limits of field theories that are nearly scale-invariant at very short distances (or, equivalently, very high energies). This is the truly interesting class. The constraints above pretty much tell us that once we determine the spectrum of light particles in the Standard Model, there are only 30 or so parameters that influence the behavior of these particles at low energies, as long as the theory is required to make sense at sufficiently (almost arbitrarily) high energies, too.

In the classical sections, we noted that general relativity doesn't change anything about the interpretational "heart" of physics. It added some symmetries but you could have viewed them as coincidences. Much like we required general covariance from a classical field theory (governed by the classical determinism and other principles), we should be able to add the condition of general covariance to quantum field theory (respecting the quantum postulates) and simply obtain a theory of quantum gravity.

However, when you actually try to follow this procedure in this straightforward way, you will fail. You find out that you don't really know the spectrum of your theory and the interactions at high energies. If you assume that the theory can be constructed purely from the low-energy fields, especially the metric tensor, you will find out that the resulting construction won't be renormalizable: it won't respect the rule mentioned in the QFT context that the high-energy limit of physics should exist and the low-energy physics should be pictured as its low-energy approximation. You won't know what the exact amplitudes should be near the Planckian energies where you really care about them.

#### Black holes are omnipresent

So what is the spectrum of localized objects in a gravitational theory?

In classical general relativity, a sufficiently heavy collapsing object is guaranteed to end up as a black hole. This conclusion certainly follows from Einstein's equations. But even if you imagine that Einstein's equations are not exactly correct, you should realize that you don't need any extraordinary density of matter or other extreme conditions to create a black hole. A large enough volume of ordinary water will end up as a black hole.
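The water claim is easy to quantify (a back-of-the-envelope sketch using standard constants): a uniform ball of density ρ sits inside its own Schwarzschild radius once its radius reaches R = sqrt(3c²/(8πGρ)). For water this is a few astronomical units - a huge amount of water, but no exotic density anywhere:

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
rho = 1000.0       # density of water, kg/m^3

# A uniform ball of density rho and radius R has mass M = (4/3) pi R^3 rho.
# It fits inside its own Schwarzschild radius r_s = 2 G M / c^2 once
# R = sqrt(3 c^2 / (8 pi G rho)).
R = math.sqrt(3 * c ** 2 / (8 * math.pi * G * rho))
M = (4.0 / 3.0) * math.pi * R ** 3 * rho

print(f"critical radius: {R:.2e} m")             # ~ 4e11 m, roughly 2.7 AU
print(f"mass: {M / 1.989e30:.1e} solar masses")  # ~ 1e8 Suns
```

The punch line is that the average density needed to form a black hole *decreases* with the hole's size, so large enough lumps of perfectly ordinary matter inevitably collapse.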

The corrections to Einstein's equations that are not experimentally excluded are insufficient to change anything about the qualitative conclusion that black holes must form. Black holes simply do exist. Less importantly, we also observe their effects through telescopes.

In the quantum context, you find out that they are not quite black. They are emitting thermal radiation. There are many ways to think about it. For example, a black hole confines light and other objects by a "classically impenetrable barrier" - a change to the geometry that would make escaping light equivalent to a forbidden "superluminal signal". But in quantum mechanics, we are familiar with quantum tunneling: objects have a nonzero probability to penetrate such barriers as long as they appear in the prohibited region only for a finite amount of time.

The black hole horizon (and the black hole interior) is no exception. There is a probability that a particle near the center of the black hole escapes from the black hole by making such a jump. The information tunnels out, too.

You may also talk about the pair-creation of virtual particles that become physical if they are created near the horizon. When you calculate all these seemingly different phenomena - tunneling, pair-production - carefully, you find out that they're completely equivalent.

Stephen Hawking has determined the temperature of the resulting blackbody radiation. It is proportional to the "gravitational acceleration" on the event horizon. From the known laws of thermodynamics - a useful additional layer of physics that we haven't discussed - you can also determine the black hole entropy which turns out to be proportional to the surface area of the event horizon in Planck units, just like Jacob Bekenstein predicted a few years earlier.
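For concreteness, the two formulas just mentioned - Hawking's T = ħc³/(8πGMk_B), proportional to the surface gravity, and the Bekenstein-Hawking entropy S = k_B A/(4 l_P²) - already give striking numbers for a solar-mass black hole (a sketch with standard SI constants):

```python
import math

hbar = 1.0546e-34   # reduced Planck constant, J s
G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8         # speed of light, m/s
kB = 1.381e-23      # Boltzmann constant, J/K
M_sun = 1.989e30    # solar mass, kg

# Hawking temperature, proportional to the surface gravity (~ 1/M):
T = hbar * c ** 3 / (8 * math.pi * G * M_sun * kB)
print(f"T = {T:.2e} K")   # ~ 6e-8 K, far colder than the microwave background

# Bekenstein-Hawking entropy: horizon area over four, in Planck units.
r_s = 2 * G * M_sun / c ** 2          # Schwarzschild radius, ~ 3 km
area = 4 * math.pi * r_s ** 2
l_p2 = hbar * G / c ** 3              # Planck length squared
S_over_k = area / (4 * l_p2)
print(f"S/k_B = {S_over_k:.2e}")      # ~ 1e77, i.e. exp(1e77) microstates
```

Note the inverse proportionality of T to the mass: heavier black holes are colder, which is why astrophysical black holes radiate so negligibly.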

In quantum mechanics, the entropy (divided by Boltzmann's constant) must be the natural logarithm of the number of macroscopically indistinguishable microstates. You may find out all kinds of general rules. Black holes are the highest-entropy objects that you can squeeze into a given volume. And each microstate looks like an elementary particle.

#### Quantum gravity: particle species

In classical general relativity, the existence of black holes was inevitable. Because we didn't need any extreme densities or extreme gravitational fields (with extreme tidal forces), this conclusion (about the existence of black holes) must also follow from a correct quantum theory of gravity: the classical limit is qualitatively correct when the star collapses.

Because we have seen that every black hole must carry a certain entropy and therefore information, it must correspond to a lot of microstates. Each of them looks like a new particle species. Many of them (with the same mass, angular momentum, and charge) are almost (macroscopically) indistinguishable. But microscopically, they must still differ.

In other words, very general arguments are enough to determine the type and number of high-mass species of elementary particles in any correct theory of quantum gravity: they must look like black holes and their density per unit mass must go as the exponential of the entropy (the surface area in Planck units over four).

Because the black holes maximize the entropy among localized objects of a fixed mass, they are the "generic localized states". There are so many of them that they dominate all the scattering processes with a certain high, trans-Planckian center-of-mass energy. It was possible to deduce this principle from very general arguments about gravity, quantum mechanics, and statistical physics. This principle - that black holes always dominate physics of processes at high, trans-Planckian center-of-mass energy - was called "asymptotic darkness" by Tom Banks.

There is nothing new - there is darkness - if you go to excessively high energies. Saying that there is nothing new above the Planck scale is actually equivalent to saying that there is no geometry at distances shorter than the Planck length.

If you think about a particular classical gravitational theory - for example, general relativity coupled to the Standard Model - you (more than) qualitatively know what happens at very low, sub-Planckian (center-of-mass) energies. It is encoded in the low-energy field theory. However, it turns out that you (more than) qualitatively know what happens at extremely high, trans-Planckian (center-of-mass) energies, too: it is fully encoded in the black hole microstates and all qualitative features of black holes can be determined from the classical black hole solutions - solutions to the same low-energy equations that were relevant for the low-energy regime.

If you collide particles whose center-of-mass energy is huge and trans-Planckian, you don't need to use some superesoteric high-energy properties of your theory to predict the outcome. Instead, you may use classical general relativity and determine that there's enough mass within the Schwarzschild radius and a black hole will be formed.
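The crossover can be made quantitative (a sketch with standard constants): below the Planck mass, the quantum size of an object - its Compton wavelength ħ/(Mc) - exceeds its Schwarzschild radius 2GM/c², while above the Planck mass the horizon is the larger of the two. Trans-Planckian collisions are therefore governed by classical horizons, not by ever shorter distances:

```python
import math

hbar = 1.0546e-34   # reduced Planck constant, J s
G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8         # speed of light, m/s

m_planck = math.sqrt(hbar * c / G)   # Planck mass, ~ 2.2e-8 kg

def compton(M):
    """Quantum 'size' of an object of mass M."""
    return hbar / (M * c)

def schwarzschild(M):
    """Horizon size of a black hole of mass M."""
    return 2 * G * M / c ** 2

# The ratio r_s / lambda_C = 2 (M / m_planck)^2 makes the crossover explicit:
# quantum effects win below the Planck mass, the horizon wins above it.
for M in (0.1 * m_planck, m_planck, 10 * m_planck):
    print(M / m_planck, schwarzschild(M) / compton(M))
# the ratios come out near 0.02, 2, and 200
```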

It means that the low-energy, heavily sub-Planckian regime is qualitatively understood but the very high-energy, trans-Planckian regime is qualitatively understood, too. Ever more energetic (i.e. massive) black holes correspond to ever larger objects with decreasing curvature. A very small curvature is normally relevant for very low energies but in the presence of gravity, it is relevant for very high energies, too.

We see that the very low and very high energy regimes must be governed by the same theory. It is a highly non-trivial requirement that the same theory must behave well in both limits. For example, you might imagine that quantum gravity is obtained by a direct quantization of Einstein's equations - equations that are relevant for low energies.

Imagine that you have a procedure how to determine the spectrum of such a quantized theory at arbitrarily high energies. It sounds very unlikely that such a spectrum would exactly match the black hole microstates that you need to get at very high energies: high-mass states of a theory (and their density) could look like hadrons, nuclei, or atoms. The details of such spectra depend on the behavior of your theory in the strongly coupled regime. However, you know that some of the answers (e.g. the density of states, controlled by entropy) should exactly agree with black hole properties (area) that can be completely calculated from a weakly curved (i.e. low-energy) calculation in your theory. It won't agree by chance.

You need to get discrete, hadron-like microstates that will however look just like black holes when their masses vastly exceed the Planck mass. It's tough. This is another method to explain why the constraints of quantum gravity are so terribly difficult to satisfy.

Is there a solution? Of course, there is. If you count every superselection sector as a new solution, there are countably infinitely many known solutions (and up to 10^{500} of them may be quasi-realistic, although it depends on how quasi-realistic you expect them to be). They are called the backgrounds of string/M-theory. It is a priori conceivable that besides the "10^{500}" solutions, there could exist another consistent theory of quantum gravity that properly interpolates between the known low-energy and the known high-energy limits.

But no such theory is known and it is likely that even if one were found, it would be natural to count it as another background of string theory even if its relationship to the remaining 10^{500} vacua were not fully understood.

After all, the term "string theory" is already used for many vacua where "strings" don't seem to be the elementary objects (such as 11-dimensional M-theory). And it would be even more likely that you would have to learn pretty much everything that string theorists have to know today in order to understand the hypothetical "new" theory of quantum gravity: most of these "stringy" tools are essential for any kind of good physics (in the quantum gravity context).

#### Beyond asymptotic darkness: new scales

You may have noticed that when we are discussing quantum gravity, we are mostly using the conceptual framework of quantum field theory but we are dealing with a very special and mysterious kind of quantum field theory with a very special spectrum of massive particles (and very special functions describing their density and interactions).

It follows that string theory may be understood as a gadget to produce generalized quantum field theories with the right particle spectra (and interactions) that correctly interpolate between the low-energy limit and the high-energy limit, dominated by the black holes.

But as the term "string theory" indicates, it is not about black holes only. There should also be some strings attached somewhere. ;-) However, it is important to understand that only the black holes (and gravity) are universal. Strings are new objects that appear (and are helpful) in many (but not all) corners of the configuration space (or moduli space).

In the simplest picture of "asymptotic darkness" that we started with, there is only one scale - the Planck scale - dividing two interesting regimes: low-energy physics below the Planck scale and high-energy physics above the Planck scale. There are no dimensionless parameters, certainly no small ones.

This simple picture is relevant e.g. for M-theory in 11 dimensions. This theory has no dimensionless parameters either. Qualitatively speaking, it only contains massless gravitons (and their siblings from the supermultiplet) and black holes. It also predicts the existence of (heavy if large) M2-branes and M5-branes, among other things, but these objects are not point-like localized objects: instead, they are extended.

String theory always invites us to study extended relativistic objects simultaneously with the point-like ones. But if you treat string theory as a gadget to generate generalized point-like quantum field theories, you are not interested in extended objects unless they are wrapped. And if they're wrapped on homologically trivial cycles in space, in order to be fully compact, they become just very special states in families that are dominated by black hole microstates anyway.

More typical vacua of string theory do have dimensionless parameters. Whenever strings are useful and "effectively fundamental" objects in a string-theoretical spacetime, we always have a new dimensionless parameter called the "string coupling", g_s, and this number (also expressed as the exponential of dilaton, phi) is much smaller than one (the dilaton therefore goes to minus infinity) if the stringy picture is really useful.

On the other hand, whenever a background of string theory has such a parameter that can be interpreted as the dilaton, you may always find some strings and the qualitative conclusions associated with string theory (such as the Hagedorn density of states and the characteristic decrease of the amplitudes at high energies) hold. They only hold up to some higher energy scale, where they gradually change into the quantum gravitational, black-hole-dominated functional dependence on energy.

In more typical vacua, there are many more fundamental as well as derived energy scales associated with the radii of different hidden dimensions, tensions of various types of branes, and so forth. As string theory interpolates between the required low-energy and high-energy limits of physics, it also inserts a lot of new interesting objects and phenomena in the middle. It is difficult to find a theory that properly interpolates between the low-energy and high-energy regimes of gravity. But if you want to find such a consistent theory "constructively", the known methods to construct stringy vacua are the only methods you can use.

Even if you didn't know about the results of string theory as we understand them in 2008, you would ultimately have to look for a theory that describes gravity as well as other forces and particles. Such a theory would have to interpolate between the correct low-energy equations for gravity and other forces and the correct high-energy states in the spectrum, corresponding to the black holes. It would have to reproduce many other intermediate energy scales, too.

Qualitatively, such a theory would have to look like string theory anyway. But at some moment, you would like to know some "quantitative" details. If you analyzed your questions carefully, you could answer many of them unambiguously and the answers would be identical to string theory's once again.

For example, you would ask whether the general covariance can be extended to a larger yet "simple" symmetry. You would re-discover supergravity. If you asked what is the maximum dimension that allows supergravity, you would end up with the number eleven. If you asked what objects exist in such an 11-dimensional theory assuming that the latter is consistent, you could derive many known things about M-theory. You would be studying the very same M-theory even if your starting point was different.

Similar comments apply to the other vacua of string theory. Anomaly cancellation, discovery of black holes and other objects, and a detailed research of their properties and excitations would allow you to reconstruct pretty much everything we know about string/M-theory. If you insisted on tools that don't assume that the theory is fundamentally a theory of strings, some questions would be hard to answer. But whenever you could answer a question, the answer would coincide with the answer extracted from string theory.

In other words, the term "string theory" sounds much more narrow-minded than the theory actually is. It would be much more accurate to call it "the only conceivable theory reconciling all known established principles of general relativity, quantum field theory, and statistical physics" or "all good physics that denies neither GR nor QFT".

The proposition that such a theory contains strings in pretty much all corners of the configuration space where dimensionless parameters much smaller than one occur doesn't have to be interpreted as a defining criterion of a "special theory" called "string theory". Instead, it is a derivable fact that follows from the very general rules, assuming that quanta, fields, and gravity exist.

#### The dualities: looking at lighthouses

At the beginning, I mentioned that quantum mechanics allowed us to use many equivalent pictures and quantum field theory unified the notions of particles and waves. String theory leads to a similar redundancy of equivalent descriptions: they are referred to as dualities. The internal structure of localized objects may typically be phrased in many ways. Each way involves a different geometry of compact dimensions. Nevertheless, all the pictures end up being exactly equivalent.

These dualities are an additional indication that string theory is essential to understand the Universe. If you have a physical system that can be described using two (or more), a priori very different collections of equations, it is analogous to a lighthouse that is seen from two (or more) nearby islands i.e. from two very different directions. Such a redundancy of perspectives reinforces our certainty that the lighthouses are real and that we might actually be seeing most of them.

At any rate, it is extremely unreasonable to expect that string theory may "go away" at any moment in the future of theoretical physics because pretty much all of its qualitative features may be shown to be logically inevitable and the theories with these features seem to be extremely rare - as rare as the backgrounds of string theory.

And that's the memo.