Saturday, October 04, 2008

Why canonical GR cannot work

A reader called Giotis at Shores of the Dirac Sea asked why we almost certainly know that canonical quantization of general relativity cannot work. Let me answer and carefully keep track of the statements that are established and those that have a chance to be changed in the future.

Classical gravity

First of all, when we talk about the canonical quantization of gravity, we must have the right starting point. By that starting point, I mean the general theory of relativity. It is a theory postulating that each point in space and time is equipped with a metric tensor and that this metric tensor satisfies certain equations.

Do we know it is true? Well, up to a very high accuracy, we know that it is true in all phenomena we have seen whose characteristic distance scale is macroscopic. The metric tensor in any region can be directly measured by sticks and clocks. The equations it satisfies - Einstein's equations - automatically imply Newton's gravity as the non-relativistic limit. All old-fashioned tests of gravity therefore support Einstein's equations. The detailed form of the equations is pretty much determined by the equivalence principle which is also established experimentally.

Once you go through these arguments, much like Einstein did, you may become certain that general relativity is on the right track and essentially correct at the classical level and dozens of experiments supporting it can't surprise you. The theory inevitably implies the existence of gravitational waves, black holes, and expansion of the Universe.

If you want to successfully deny the very existence of the metric tensor, or of Einstein's equations it satisfies, as the correct variables and laws of gravitational physics at long distances, you have to replace all of general relativity by something else. To do so, you have to start from scratch and provide science with your brand new explanation of every gravitational or relativistic experiment that has been performed in the 20th and 21st centuries and that was seemingly successfully explained by general relativity. Good luck.

The rest of us will continue with quantization.

Quantization: what is preserved

We have decided that the metric tensor must "exist", at least in some long-distance approximation. In classical, non-quantum physics, it has well-defined values that evolve pretty much deterministically. For a theory to be consistent with the quantum phenomena as well as the classical gravitational phenomena, these variables must be preserved in your quantum theory.

What does it mean? It means that they must become operators. Everything that looks like a dynamical, evolving observable - a degree of freedom - in a classical theory must become a linear operator in the corresponding quantum theory. In the classical limit, the expectation value of the operator gives you the classical value and the quantum fluctuations are small, relatively speaking.
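
To make the "small relative fluctuations" quantitative, here is a minimal Python sketch - a generic harmonic-oscillator coherent state in illustrative units, nothing gravity-specific: the absolute quantum spread of the position stays fixed while the classical amplitude grows, so the relative fluctuation dies off in the classical limit.

```python
import math

# Toy illustration of the classical limit (not gravity-specific):
# for a harmonic-oscillator coherent state |alpha>, the expectation
# value <x>(t) follows the classical trajectory while the absolute
# spread Delta x = sqrt(hbar / (2 m omega)) stays constant, so the
# relative fluctuation shrinks as the state becomes more macroscopic.

hbar, m, omega = 1.0, 1.0, 1.0   # illustrative units (assumption)

for alpha in [1.0, 10.0, 1000.0]:
    amplitude = math.sqrt(2 * hbar / (m * omega)) * alpha  # peak of <x>(t)
    delta_x = math.sqrt(hbar / (2 * m * omega))            # constant spread
    print(f"alpha={alpha:>7}: relative fluctuation = {delta_x/amplitude:.1e}")
```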

Why did I say linear operators? Because quantum mechanics is the only consistent framework that can agree with the basic experiments that were used to construct quantum mechanics in the first place, e.g. the interference experiments with electrons. Again, you may dispute this assertion. But in that case, you would also have to start from scratch and find your alternative explanations for all the quantum experiments of the 20th and 21st centuries. You will need even more good luck than before.

See my 16 pages about entanglement, interpretations of QM, and decoherence that reveal why various attempts to get rid of the probabilistic nature of QM cannot work, among other things.

The silliest thing for you would be to believe that at very short distances, physics becomes classical and deterministic again. Quite on the contrary, the shorter distances you study, the more important (relatively speaking) the quantum phenomena become and the more misleading our intuition from day-to-day life is.

As explained in the PDF file above, the probabilistic character of the correct theory - a key aspect of quantum mechanics - has been proven experimentally. The other postulates - such as the linearity and hermiticity of the operators, including the Hamiltonian - follow from the requirement of the "conservation of probability" and from the (at least approximate) locality of physical phenomena (the inability to influence distant events instantly). These requirements are either necessary for mathematical logic to work or have been demonstrated experimentally, at least at some good level of accuracy.

Also, the normal conceptual way to obtain the classical limit is the only possible way (or at least the only one proposed as of 2008).

Quantization: operators with a scale

Fine. So at this moment, you should believe me that there must exist something like the metric tensor at each point in your theory and it must be a linear operator, at least in some approximation. What's the approximation? Well, the errors predicted by your "refined" theory must be smaller than the errors with which the corresponding assumptions of the "simplest" theory have been validated. What are the errors? Well, it depends but these errors are extremely small at long, astronomical or macroscopic distances at which classical general relativity has been validated empirically.

The assumption that all the degrees of freedom must be linear operators on a Hilbert space doesn't depend on any direct experiments. It's a mathematical fact: it is impossible to design a theory that violates the postulates of quantum mechanics but is equally successful in describing all the observed phenomena (except for uninteresting theories obtained by supplementing quantum mechanics with some tiny, undetectable, bizarre, and useless modifications). This point has been discussed many times on this blog, too. But it is not the key point of this article: we want to show what's wrong with canonical quantization of GR and let me assume that everyone who talks about this phrase accepts that the metric tensor is an operator in a long-distance description of physics.

Fine. So which of these operators must exist, if we want to be a bit more accurate? The average value of components of the metric tensor must be a good operator as long as we average the metric tensor over a sufficiently large - perhaps macroscopic - region of space. Let us choose a cutoff, namely a distance scale L such that you will agree with me that general relativity has been verified at distances longer than L. Imagine it is a millimeter for a while: you can have millimeter sticks and measure gravity in between millimeter objects. Later, as we get closer to the conceptual questions of quantum gravity, you should imagine L to approach the tiny Planck length, 10^{-35} meters.
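
A quick dimensional-analysis sketch of where that ultimate value of L comes from; the constants are the standard CODATA-style values, inserted only for illustration:

```python
import math

# Dimensional estimate of the Planck length, the scale to which the
# cutoff L should eventually be pushed (SI units).
hbar = 1.054_571_8e-34   # J*s
G    = 6.674_30e-11      # m^3 kg^-1 s^-2
c    = 2.997_924_58e8    # m/s

l_planck = math.sqrt(hbar * G / c**3)
print(f"Planck length ~ {l_planck:.2e} m")   # ~ 1.6e-35 m
```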

Where have we gotten so far? We have a theory that must contain operators corresponding to the average metric tensor in regions of space of size L. These assumptions, pretty much proven experimentally, already imply that gravitons must exist. (The same approximation in the context of black holes is enough to show that black holes radiate and have a nonzero entropy.)

Gravitational waves follow from classical GR (and have been observed indirectly, through the slow orbital decay of accelerating binary pulsars). Once you add the quantum postulates, the energy of any wave of frequency "f" must be a multiple of "hf" (for the wave function/al of a state with a nearly sharp - but not quite sharp - energy to be nearly periodic in time). Gravitational waves are no exception; the quanta are called gravitons. Be sure that if someone doesn't know anything about gravitons, he doesn't know anything about quantum gravity.
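
A minimal sketch of how small one quantum is - the 100 Hz frequency is my illustrative assumption, roughly the scale of waves from compact binaries:

```python
# Energy of a single quantum (E = h*f) for an illustrative gravitational
# wave.  The frequency is an assumption chosen only to show how tiny one
# graviton's energy is, and hence why individual gravitons are invisible.
h = 6.626_070_15e-34   # J*s
f = 100.0              # Hz (illustrative assumption)

print(f"one graviton at {f} Hz carries E = {h*f:.2e} J")   # ~ 6.6e-32 J
```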

Computing loops

Fine. So it makes sense to ask what happens if you collide gravitational waves. In a quantum theory, we can only calculate probabilities (and cross sections) from the complex probability amplitudes. Because all waves are composed out of gravitons, we really reduce this whole task to the calculation of scattering amplitudes of gravitons. They must exist and they pretty much contain all the information about the classical waves, expressed in a language that is more acceptable, natural, and valid in a quantum theory.

The theory we are using has been almost directly deduced from observations, as argued above. How do we calculate the graviton scattering amplitudes? Well, you use whatever method you like to justify Feynman's diagrams - the path integral or some canonical, operator-based approach. Again, these are solid derivations and they will lead you to a couple of diagrams. The tree diagrams contain no power of Planck's constant: they will reproduce all the information about the classical gravitational waves and their interactions.

The loop diagrams are proportional to positive powers of Planck's constant: they're quantum corrections. What should you do with them? Should you throw them away? Well, if your experimental situation is nearly classical and you can prove that the quantum phenomena are smaller than the errors you want to overlook, you may ignore the loops. You will return back to classical GR even though the results may be expressed in a new quantum language.
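
Schematically - this is the generic estimate, with conventions chosen for readability rather than taken from any specific calculation - the expansion looks like:

```latex
% Schematic structure of a graviton amplitude (hbar = c = 1, M_Pl the
% Planck mass; conventions chosen so the tree term carries no loop factor)
\mathcal{A}(E) \;\sim\; \mathcal{A}_{\rm tree}(E)
\left[\, 1 + \sum_{L \ge 1} c_L \left(\frac{E}{M_{\rm Pl}}\right)^{2L} \right]
```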

But obviously, we study quantum gravity exactly in order to know what the quantum principles do with the old classical theory. So we can't ignore them. They can be relatively large. Now, the classical limit and the long distance limit are a priori independent limits. In other words, corrections proportional to powers of Planck's constant are independent from the corrections from physics below the cutoff distance scale L.

Which ones do we have to take seriously? We can afford to discard the corrections from physics at distances shorter than L because gravity hasn't been tested in this regime and many new things can happen over there. But we cannot afford to forget about the quantum corrections because quantum mechanics has already been established and its postulates must be universal. So the "virtual particles" running in the loop diagrams are very real as long as their de Broglie wavelength (and correspondingly the periodicity of their waves in time) is longer than L or so.

They're as real as the sticks whose length is L or longer and that have been experimentally tested in detail: these sticks directly measure the metric tensor, the metric tensor is an operator, and this operator must exist at long distances above L.

The only things one may have doubts about are the virtual particles whose typical de Broglie wavelength (or other typical distance scale) is shorter than L because, by our assumption, there can be some new objects (dragons and their bizarre interactions) in this regime. We haven't carefully looked at these questions experimentally.

Loops: divergences

As you know, the loop diagrams lead to divergences. In our plan, we get rid of them simply by omitting the contributions from virtual particles whose de Broglie wavelength is shorter than L. The contribution from this regime cannot be trusted, anyway. However, we must be completely open-minded about what the contribution actually is because these very short distances are empirically unknown.

The only thing we do know is that the total probability amplitudes are finite. So whatever the mysterious physics at short distances does, it cancels any divergences that we may have obtained. By divergences, I mean terms in the probability amplitudes that are proportional to negative powers of L (or logarithms) and that go to infinity if L is ultimately sent to zero.
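
Here is a toy sympy sketch of what "divergent as L is sent to zero" means; the integrand 1/k is a stand-in chosen for simplicity, not an actual graviton loop integrand:

```python
import sympy as sp

# Toy model of a logarithmic divergence regulated by the cutoff L:
# integrate a schematic loop integrand dk/k over momenta between an
# infrared scale mu and the ultraviolet cutoff 1/L.
k, mu, L = sp.symbols('k mu L', positive=True)

amplitude = sp.integrate(1 / k, (k, mu, 1 / L))
print(amplitude)                       # log(1/L) - log(mu)
print(sp.limit(amplitude, L, 0, '+'))  # oo: diverges as the cutoff is removed
```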

Because of the comments above, the correct theory - whatever it is - must produce some extra terms in the scattering amplitudes (and usually even in the Lagrangian itself) that combine with the other terms to make the probabilities finite. These extra terms can be attributed to the mysterious physical phenomena at distance scales shorter than L. They're called the counterterms.

Note that a counterterm is certainly needed whenever the naive theory without counterterms generates a similar, divergent contribution to the scattering amplitudes (or to Green's functions, whenever they're physical). In 1985, Goroff and Sagnotti calculated all the one-loop and especially the two-loop divergent terms in the scattering amplitudes (and the effective Lagrangian). You should understand that this calculation is in principle straightforward and only uses objects whose existence and behavior have been established, as argued above.

See also: Finiteness of SUGRA theories
The most important result is that there is a divergent term - a required counterterm - from two-loop Feynman diagrams (of 14 different topologies) contributing to the scattering of 3 gravitons. At the level of the Lagrangian, it is proportional to the integral of sqrt(-g) times a contracted third power of the Riemann tensor (or, more naturally, after some rearrangement of other terms, the Weyl tensor). The coefficient includes a divergent factor of 1/epsilon (in dimensional regularization; in the language above, you should imagine something like log(L) instead) and an impressive numerical factor, 209/2880 times 1/(4π)^4.
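
Written out - in dimensional regularization with d = 4 - 2ε; the overall power of κ² = 32πG_N depends on how one normalizes the graviton field, so take the prefactor conventions with a grain of salt - the divergence reads:

```latex
% The Goroff-Sagnotti two-loop divergence (dimensional regularization,
% d = 4 - 2*epsilon; overall powers of kappa^2 = 32*pi*G_N depend on
% the chosen normalization of the graviton field)
\Gamma^{(2)}_{\rm div} \;=\;
\frac{209}{2880}\,\frac{1}{(4\pi)^4}\,\frac{1}{\epsilon}
\int d^4x \,\sqrt{-g}\;
C_{\alpha\beta}{}^{\gamma\delta}\,
C_{\gamma\delta}{}^{\rho\sigma}\,
C_{\rho\sigma}{}^{\alpha\beta}
```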

What is the importance of this counterterm? It proves that whatever the short-distance physics (below the distance scale L) does, it must influence the scattering of 3 gravitons in the same way as an additional term of the type R^3 in the Lagrangian. But the coefficient of the required term diverges as L is sent to zero.

Counterterms and uncertainty

Now, you should realize that if you subtract two infinities from each other, there is usually no way to say what the result is: the difference is an indeterminate form. It may be another infinity or minus infinity. But because we know that the final probabilities are finite, the difference must be something like 0 or 5 or -7, a finite number. But we still don't know what the number is.

In other words, every counterterm required to get rid of some divergences also brings a new uncertain finite parameter to your theory.
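
A three-line sympy sketch of this bookkeeping - the symbols a and c are toy placeholders for the calculable finite part of the loop and the undetermined finite part of the counterterm:

```python
import sympy as sp

# Why each counterterm leaves behind a free finite constant: a toy
# amplitude has a 1/epsilon pole with some calculable finite part a;
# the counterterm must cancel the pole but may carry ANY finite piece c.
eps, a, c = sp.symbols('epsilon a c')

loop        = 1 / eps + a        # toy divergent loop contribution
counterterm = -1 / eps + c       # required subtraction, c undetermined

total = sp.simplify(loop + counterterm)
print(total)   # a + c : finite, but c remains a free parameter
```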

Needless to say, the two-loop divergence is just a "proof of concept". When you continue the calculations to higher loops, you will inevitably generate infinitely many new kinds of divergences. They will require infinitely many new counterterms with infinitely many new undetermined coefficients. Such a theory is referred to as a nonrenormalizable one.

Why is it a problem? Well, if we want to calculate the answer to some typical question - such as the probability that two gravitons at nearly Planckian energies collide and produce three gravitons with momenta in some interval - we need to know the values of all these coefficients. All of them matter. In fact, all of them are equally important. But all of them are undetermined. The theory clearly tells us nothing. We would have to make infinitely many measurements of Planckian scattering to determine all these parameters and only afterwards could we do some useful calculations.

Clearly, we cannot make a single measurement of this type in practice. It is very clear that any of these questions can only be clarified by a carefully constructed theoretical argument and everyone who says that quantum gravity should become a predominantly experimental science rather than an overwhelmingly theoretical science is either a hopeless layman, a hopeless laywoman, or a complete nutcase.

What is realistically needed here, of course, is not an infinite set of undoable experiments but rather a better organizing principle that tells us what is actually happening at the distances shorter than L - or even close to the Planck scale.

Questions to be answered

But before I tell you how one should find these principles and especially how one cannot proceed, let me emphasize the kind of questions we want to be answered.

In classical general relativity, evolution in time was determined by Einstein's equations. They can often be solved and they have many interesting consequences. But there is a lot of nontrivial calculation going on even if you're modest and stick to the solutions that are close to flat spacetime. All of them can be formulated as collisions of gravitational waves.

In a quantum theory, all the qualitatively new phenomena found in the classical theory still exist - or at least lead to corresponding physical questions. And the modest regime close to flat space is found here, too. All the meaningful quantitative physical questions about the collisions of gravitational waves may be reinterpreted as calculations of scattering amplitudes for gravitons in the quantum theory. They're damn important. If a theory can't say anything about these amplitudes, it can't say anything about quantum gravity.

If we only keep the classical diagrams, we effectively assume that Planck's constant vanishes and the results will return us to the classical theory. We clearly want to know what happens when Planck's constant is nonzero: in that case, we need the loop diagrams, too. All of them. In a "nearly classical" regime, the higher-loop diagrams may be very small and only a few terms in the effective action will matter (their coefficients are still unknown, except for Newton's constant and the cosmological constant). But in the "full quantum" theory, we need all of the loop diagrams, too.

The quantum corrections - proportional to powers of Planck's constant - to the classical (tree-level) scattering diagrams are the main and first quantities that your quantization of general relativity should be able to tell you, at least in principle. Some people can draw some funny pictures of networks or octopi. But these things are not yet physics. Physics is about observable phenomena and these octopi or networks are obviously unobservable. The real observed phenomena are encoded in the scattering amplitudes of elementary particles - something you could do with a big enough collider, for example. The classical limit of these amplitudes is known; what is not known are the quantum corrections.

If your theory gives us no hope to answer these things, it is contributing exactly nothing to our knowledge about quantum gravity. And if your theory even claims that the corrected amplitudes are not a legitimate question, it is a wrong theory because it's been proven (even above in the text) that the question is fully legitimate and must have an answer: in principle, we could even scatter the gravitons.

Completing a nonrenormalizable theory

In the approach above, a complete quantum theory can be defined by a Lagrangian with the cosmological constant term, the Einstein-Hilbert term, and infinitely many other terms (e.g. powers of the curvature) including the right coefficients that must be determined. I must stress that no set of values for these infinitely many coefficients is canonically better than others. In fact, the coefficients can be mixed with each other by field redefinitions or even by innocent changes of the cutoff scale L or even by much more subtle changes of the renormalization scheme.
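
In formulas - standard effective-field-theory notation, with the c_i standing for the undetermined couplings discussed above - the family of candidates looks like:

```latex
% Schematic form of the infinite-parameter family of candidate
% Lagrangians (standard EFT notation; the c_i are the undetermined
% couplings and the ellipsis hides infinitely many higher-curvature terms)
\mathcal{L} = \sqrt{-g}\left[
  \frac{-2\Lambda + R}{16\pi G_N}
  + c_1 R^2 + c_2 R_{\mu\nu}R^{\mu\nu}
  + c_3\, C_{\alpha\beta}{}^{\gamma\delta}
          C_{\gamma\delta}{}^{\rho\sigma}
          C_{\rho\sigma}{}^{\alpha\beta}
  + \dots \right]
```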

Most concretely, it is impossible to say that all the coefficients of the "complicated" terms are zero. If they're zero for one value of the cutoff L, they will be nonzero for other values of L. Moreover, even the vanishing at the scale L depends on the precise conventions for how you truncate the integrals in the loop diagrams etc.

At this level, the quantized general relativity is an infinite-dimensional space of a priori equally fit candidate theories. Any correct and complete theory must look like one point in this infinite-dimensional continuum, at least at long distances. It can contain many new things, strings, branes (or dragons or even foams and octopi, if you want to be highly speculative) at tiny distance scales. But at long distance scales, you must return back to the description we were using previously.

In the language of the nonrenormalizable theory, the question is how to determine the infinitely many unknown continuous coefficients of the counterterms. In principle, you might imagine that Nature simply needs the right values for all of them. But it is almost impossible to believe that there are infinitely many universal yet uncalculable (or environmental) continuous parameters in Nature. Some limited people have a problem even with a discrete degeneracy. But the existence of the Standard Model - which correctly describes millions of experiments in terms of only about 30 fundamental parameters - is a hint that a theory with infinitely many parameters that must be measured is probably not the final answer.

So there is something missing here. A better theory (and/or a more careful analysis of the consistency criteria) tells you what all (or most of) the unknown numbers are. At least, it must erase the infinite number of continuous adjustable parameters. But if your theory cannot do such a thing or if it disagrees with some statements that have been shown above to be experimentally established, your theory has simply been falsified. It's dead.

New variables: do they matter?

It is obvious that my discussion will now turn to the approaches that some people have incorrectly called "quantizations of GR", especially Ashtekar's variables, loop quantum gravity, and similar sleights-of-hand. First, let us answer the following question: can a choice of new variables bring progress to physics?

Well, it depends on the context. New variables can certainly help you to describe a system that was previously described in some other variables but these variables turned out to be inconvenient for seeing what actually happens in a physical situation. They could have been too strongly interacting (strongly coupled) and the calculations could have been messy. A clever choice of new variables can help you to "see" what happens. But in principle, the correct calculation could have been done in the old variables, too. They're just bookkeeping devices or convenient tools to speed up your calculation. The quantum Hydrogen atom is naturally solved in spherical coordinates but you could do it in the Cartesian ones, too.

But what new variables certainly can't tell you are the correct equations of your theory. If you didn't know what equations were satisfied by your old variables, you will be equally ignorant about the equations satisfied by the new variables. This statement is a truism. Variables or coordinates are just tools to describe a space - in this case, a configuration space of general relativity. Any set of coordinates should be equally good as any other set (even though some of them may be more convenient for calculations in specific contexts).

If your new variables lead you to new physical conclusions that were not included in the old variables, it is not just a change of variables. You're changing the rules of the game. You're adding new physics (or subtracting old physics). For example, Ashtekar's variables rewrite the time-derivatives of the metric tensor as bilinear expressions of a gauge field, roughly speaking. Can it help you in any way? Well, there are two basic problems with Ashtekar's variables:
  • they're not describing the configuration space of GR in a one-to-one way
  • much like any other set of variables, they can't tell you anything about the infinitely many unknown coefficients
Concerning the first point, it is one of hundreds of ways to see that loop quantum gravity is physically incorrect. For example, the "quantization of areas" in the LQG framework is often presented as a virtue. But you can easily see that it is simply a mistake. In the actual theory of gravity, the components of the metric tensor are numbers that can take any value. The integral of the induced area form over a two-dimensional surface can be any number. If your "new variables" tell you something else, you have clearly not described the configuration space in a one-to-one fashion.
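
For reference, the discrete spectrum that the LQG literature assigns to the area operator is the following (γ is the Barbero-Immirzi parameter, l_P the Planck length, and the j_i are half-integer spins labeling the edges piercing the surface); I quote it only to make the contrast with the continuous areas of GR explicit:

```latex
% The discrete area spectrum claimed by the LQG literature
A \;=\; 8\pi\gamma\, l_P^2 \sum_i \sqrt{j_i (j_i + 1)},
\qquad j_i \in \tfrac{1}{2}\mathbb{Z}_{\ge 0}
```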

It's not hard to see where this mistake of the LQG people comes from. They write the "density of proper area" as a magnetic field (a random identification of two tables with numbers - metric tensor components and non-gravitational, gauge fields - that was done in this way and not another, for no good reason). The integral of this field, a magnetic flux, is essentially quantized. But the proper area in the full gravity is clearly continuous.

The same problem can be described in dual variables (in the sense of coordinates vs. momenta): when the magnetic flux is quantized, the canonically dual variable to it - essentially a Wilson line - spans a compact space (a circle or a U(1) group manifold; in real LQG, it is really an SU(2) group manifold and the quantization rules are more complicated but the discussion is qualitatively isomorphic). But in the full theory of gravity, the space is not compact.
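
The elementary canonical fact behind this argument: whenever one member of a conjugate pair has an integer spectrum, the other member is forced to be periodic, i.e. to live on a compact circle (and vice versa):

```latex
% For a conjugate pair [\hat{\theta}, \hat{N}] = i, an integer spectrum
% of N forces theta to be a compact, periodic variable -- and vice versa.
\hat{N} = -i\,\frac{\partial}{\partial\theta},
\qquad
N \in \mathbb{Z}
\;\Longleftrightarrow\;
\theta \simeq \theta + 2\pi
```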

In fact, I can tell you what the dual variable to the "density of area" is: it is a function of the time-derivatives of the metric. If you're saying that this degree of freedom lives on a compact manifold, then you are saying that the time-derivatives of the metric cannot be arbitrarily large. When they're too large, they drop to zero again in your (LQG) picture. That's clearly wrong. It's as if you asked someone about the very important year Y when an event took place and the lady only told you the value of sin(π·Y). She has obviously told you zero. Literally. ;-)

So Ashtekar's variables are not a faithful parametrization of the configuration space of gravity.

But even if you picked better variables that describe the whole space correctly, it would still fail to help you in determining the infinitely many unknown coefficients in the action. Don't get me wrong: in some variables and conventions, some values of the parameters may be "easier to write down" and you might find it easier to define one theory (a point in the infinite-dimensional continuum) than others. But in other variables, you will get completely different "simple" points. Much like there is an infinite-dimensional space of possible Lagrangians, there exists an infinite-dimensional space of possible choices of variables and conventions. In fact, the latter infinity is even "more massive" an infinity.

Lorentz invariance

There are many other ways to see that Ashtekar's variables are wrong, i.e. that they eliminate any chance that you end up with a correct quantum theory of gravity. For example, in the discussion of the non-renormalizable Lagrangian, we had a huge, infinite-dimensional space of candidate Lagrangians.

But every single theory in this infinite-dimensional space was a theory that locally respected the Lorentz symmetry. Well, the classical theory did, and the quantum theory was constructed according to the classical template. Because the diffeomorphism symmetry - which includes the Lorentz symmetry as a subgroup (e.g. in the superselection sector of the flat space) - had no anomaly, we preserved the symmetry even in the quantum theory. It is very important that we did. For example, a theory that includes black holes but violates the Lorentz invariance makes a perpetual motion machine possible.

On the other hand, a theory based on Ashtekar's variables inevitably breaks the Lorentz symmetry. This problem is closely related to the artificial "compactness" of the configuration space that we mentioned a moment ago. We saw that the canonically dual variables to the density of proper areas - which are simple functions of time-derivatives of the metric in Ashtekar's dictionary - were assumed to span a compact space in Ashtekar's setup.

But if the time-derivatives of the metric tensor live on a compact configuration space, the spatial derivatives of the metric tensor should live on a compact configuration space, too. Otherwise the local Lorentz symmetry is manifestly violated. That's the case of loop quantum gravity, too: the Lorentz symmetry is manifestly broken in LQG. There are many other ways to see this fact. The spin network picks a privileged reference frame, an aether.

Because of these reasons, LQG is outside the candidate space for a short-distance completion of the quantized general relativity. It has no chance to be correct because it messes up your degrees of freedom, already at the kinematic level. You can't call it a quantization of GR: it is only a random discrete model that was perhaps inspired by GR but whose physics was mutated, even in the classical limit.

But even this sacrifice - the sacrifice of any hope that you will get a promising theory - is not enough to help you with the real problem of nonrenormalizable GR, namely the infinite number of unknown parameters. The discreteness that Ashtekar incorrectly introduced has nothing to do with the ambiguity of the parameters. There are still infinitely many unknown parameters in LQG: they appear as infinitely many unknown coefficients in the Hamiltonian constraint that remains a complete mess and mystery in LQG. That is pretty bad news because the Hamiltonian knows everything about dynamics and virtually all of physics is dynamics - so they don't know anything about dynamics (i.e. about most of physics), and the kinematics is incorrect, too.

Completely new physics

What is obviously needed instead is a theory that has new degrees of freedom and new interactions such that the gravitational interactions encoded in Einstein's equations are just a long-distance approximation. Even before string theory was found, we knew that this is how short-distance problems and divergences had always been solved whenever the mystery eventually disappeared.

In Fermi's (and Feynman-Gell-Mann's) theory of the beta decay, there is an explicit term in the Lagrangian that is quartic in the fermionic fields: that's why a neutron [or a down-quark] can directly decay to a proton [or an up-quark], an electron, and an antineutrino. But this quartic interaction leads to qualitatively identical problems with nonrenormalizability (many loop divergences) - and infinitely many new unknown parameters from the cancellation of these divergences - as general relativity.
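
A one-line numerical sketch of where Fermi's theory must break down - the coupling G_F has mass dimension -2, so amplitudes grow like G_F·E² and perturbative unitarity fails near E ~ G_F^{-1/2}:

```python
# Scale at which the four-fermion Fermi interaction must break down.
G_F = 1.166_378_7e-5    # Fermi constant in GeV^-2

E_breakdown = G_F ** -0.5
print(f"Fermi theory breaks down near E ~ {E_breakdown:.0f} GeV")  # ~ 293 GeV
```

And indeed, the new physics (the W and Z bosons) showed up safely below that scale.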

In the case of beta decay, the correct theory is a gauge theory. The fermions don't interact with each other directly. Instead, they interact with a new field, a massive vector field (psi.A.psi terms), and by exchanging this new field, they mimic the quartic coupling at low energies. The new vector field must respect a gauge invariance for consistency but because it must be massive, the gauge invariance must be broken (by the Higgs mechanism, ideally with simple fundamental scalar fields).
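
In formulas (tree-level matching with standard normalizations assumed; g is the SU(2) gauge coupling):

```latex
% How the W-exchange diagram mimics Fermi's quartic vertex at low energy:
\frac{g^2}{q^2 - M_W^2}
\;\xrightarrow{\;|q^2| \ll M_W^2\;}\;
-\frac{g^2}{M_W^2},
\qquad
\frac{G_F}{\sqrt{2}} = \frac{g^2}{8 M_W^2}
```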

The case of gravity is analogous except that it occurs at much higher energy scales: the corresponding new particles needed to make the theory finite - excited states of strings and other string-theoretical objects - couldn't have been detected experimentally. And a simple dimensional estimate shows that it is likely that they won't be detected. This is surely not a defect of string theory in any sense: it is a fact about Nature.

Every theory that is at least remotely promising must "hide" the new and unusual physical objects and phenomena to very inaccessible situations (energy scales), otherwise it would already have been falsified. I realized the importance of this dimensional analysis - and the likely impossibility to detect quantum gravity directly - when I was 15 or so and I would have never believed that these rudimentary points could ever become controversial in the physics community. The degree of stupidity of many people often exceeds all of my previous wildest expectations.

In quantum gravity, it is inevitable that new physics has to exist. In this text, we demonstrate this assertion by the desire to deal with the divergences and unknown coefficients. But even if you mysteriously answered these questions, pure gravity cannot work at the quantum level: extra matter is necessary (at least above three dimensions) and it seems that other forces actually have to be stronger than gravity if the theory is consistent.

Now, string theory predicts infinitely many new particle species - excited states of a string - and their masses and interactions are completely determined by the theory. Is that too bold a hypothesis? Well, it is as bold as needed. It is bold enough but it is also as conservative as necessary. There are conditions on both sides - experimentally-based requirements for the theory to be conservative enough and other requirements for it to be revolutionary - and the allowed range is very thin. String theory is walking right on this rope (or string, if you wish). The rope still has discretely many minima where you can comfortably sit (with your Universe). ;-)

In fact, the excited strings are known to be just a part of the full story. There are many other objects in string theory - branes and black hole microstates - whose properties are fully determined by the theory. In the context of the AdS/CFT, gravitons can be seen to be composed out of gauge fields on the boundary of the space. In the context of Matrix theory, they are bound states of many D0-branes that seem to respect the rules of noncommutative geometry whenever they organize themselves into the shape of membranes.

But once again, it doesn't mean that gravitons always have to become composite. In perturbative closed string field theory, the metric is still there and has the same properties as in classical GR: but it is supplemented with infinitely many "matter" fields from massive stringy vibrational patterns.

All of these objects in all of the known descriptions influence physics: the graviton scattering and many other important classes of phenomena are fully calculable. Today, the name "string theory" is really a misnomer. We are talking about all the possible ways in which general relativity and quantum mechanics can be merged into a consistent union - how the effective quantum field theories that must be correct to "some extent" can be extended into full-fledged theories that work for arbitrary phenomena. These possibilities include universes with strings behaving as in perturbative string theory but also many other regimes. They seem to be connected by the defining equations of the theory that continues to be called string theory.

So what's important is not whether one can actually see strings as intermediate states: they're good degrees of freedom only when they're weakly coupled. What's important is whether people's approach to physics is solid and avoids at least elementary mistakes such as those that have been shown in this article in the context of LQG. We don't know for sure that our Universe has a good description in terms of weakly coupled strings that behave just like in perturbative string theory. But we know many other things. A small part of them has been described in this article.

There can be new interesting physical phenomena in quantum gravity that will be discovered and some of them may turn out to be relevant for phenomenology. But if they are first discovered theoretically, it is very clear that the theory must be robust and consistent. In other words, the methods to calculate and decide about the fate of hypotheses must be the scientific methods, as used by string theory, and not the sloppy games presented by the LQG people and other subpar scientists.

And that's the memo.

