Monday, September 05, 2011

Joe Polchinski on failure of all Lorentz-breaking theories

In June 2011, Joe Polchinski, the main father of D-branes and other things, clarified some confusion in another June 2011 paper.

(Yes, I took this Wikipedia picture in Santa Barbara. If you think that Joe is a cowboy just like e.g. Rick Perry, yes, he is a cowboy but a left-wing one.)

In his paper, Joe Polchinski re-explained the reason why all theories starting with some Lorentz violation at the Planck scale are born dead. But let us be a bit more chronological.

In 2004, five authors explained that a Lorentz violation at the Planck scale leads to a significant (1% or so) Lorentz violation at low energies, which manifestly contradicts the observations. In their paper
Lorentz invariance and quantum gravity: an additional fine-tuning problem?
John Collins, Alejandro Perez, Daniel Sudarsky, Luis Urrutia, and Héctor Vucetich (CPSUV – a consumer price sports utility vehicle) showed that the Lorentz violation predicted by such theories for low-energy measurements was 20 orders of magnitude larger than previous misconceptions had indicated.

In early June 2011, three authors found this result inconvenient because this point (discussed many times on this blog) is a simple way to see that all loop quantum gravities, spin foams, causal dynamical triangulations, octopi, all other discrete models of Planck-scale quantum gravity, deformed special relativities, and also Hořava-Lifshitz theories etc. are just stinky piles of feces – and that these three authors, among dozens of others, have spent decades on a mindless "research" of these piles.

So in the paper called
Small Lorentz violations in quantum gravity: do they lead to unacceptably large effects?
Rodolfo Gambini, Saeed Rastgoo, and Jorge Pullin tried to emit as much fog as possible. They claimed that a divine intervention by God may produce Lorentz violations with supernatural properties that may miraculously save piles such as loop quantum gravity. They use a non-standard vocabulary which is widespread among their colleagues.

Instead of "miraculously", they use the term "non-perturbatively". However, the wording makes it clear that their usage of this adverb has nothing to do with "non-perturbatively" as used by serious physicists: the relevant physical phenomena are mostly understood non-perturbatively, and the non-perturbative treatment clearly doesn't lead to any "qualitative" revision of the results that may be guessed "perturbatively" or otherwise. Instead, it is a synonym of "miraculously" as used in discussions of a God who may violate the laws of mathematics whenever He needs to.

Finally, we're getting to the content of these papers that was explained and fixed at the end of June 2011 by Joe Polchinski's paper
Comment on [arXiv:1106.1417] "Small Lorentz violations in quantum gravity: do they lead to unacceptably large effects?"
Much like the two previous preprints, it was submitted as a gr-qc preprint – an arXiv category I usually don't read because the percentage of crackpot articles (plus boring articles on classical GR) is way above the critical threshold.

In this paper, Polchinski explains the actual reason why the examples used by Gambini et al. have led them (or at least their gullible readers) to believe that they may circumvent CPSUV. The reason has nothing to do with things' being "non-perturbative". Instead, the reason hides in special properties of the toy models used by Gambini et al. These special properties make it impossible to use the models as realistic descriptions of anything in Nature.

Polchinski notes that the effect of the Planckian Lorentz violation on low-energy phenomena may only be suppressed if this violation only produces operators \({\mathcal O}\) of dimension \(\Delta > 4\) i.e. non-renormalizable operators. If this were the case, dimensional analysis guarantees that \({\mathcal O}\) enters the Lagrangian density with a coefficient
\[ {\mathcal L}_{\rm Lorentz-breaking} = \frac{C}{M_{\rm Planck}^{\Delta - 4}} {\mathcal O} . \] Here, \(C\) is a purely numerical, dimensionless constant. Note that the powers of the only justifiable mass scale, the Planck scale, have to appear in the denominator for the action \(S\) to remain dimensionless (it's an exponent in the path integral!) in four spacetime dimensions we intimately know:
\[ S = \int d^4x\,{\mathcal L}. \] However, the Lorentz-violating effects at the Planck scale may also influence the low-energy physics by operators which are not non-renormalizable:
\[ \Delta\leq 4. \] Independently of the Lorentz symmetry, we need to demand the gauge symmetry of the action. This constraint usually destroys (almost) all candidate operators with \(\Delta < 4\): exceptions include the problematic cosmological constant and the Higgs mass.
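As a sanity check of the dimensional analysis, here is a back-of-the-envelope estimate in Python; the scales and the choice \(C=1\) are illustrative inputs of mine, not numbers taken from any of the papers:

```python
# Dimensional analysis: a dimension-Delta operator enters the Lagrangian with
# a coefficient C / M_Planck^(Delta - 4), so its effect at a characteristic
# energy E, relative to the renormalizable terms, scales as C * (E/M_Pl)^(Delta-4).

M_PLANCK = 1.2e19   # GeV (illustrative; the reduced Planck mass would also do)
E_LOW = 1.0         # GeV, a typical low-energy experimental scale

def relative_effect(delta, C=1.0, E=E_LOW):
    """Estimate of a dimension-`delta` operator's low-energy effect."""
    return C * (E / M_PLANCK) ** (delta - 4)

# Non-renormalizable operators (Delta > 4) are hugely suppressed and harmless:
print(relative_effect(5))   # ~ 8e-20
print(relative_effect(6))   # ~ 7e-39
# A marginal operator (Delta = 4) gets no suppression whatsoever, so an O(1)
# Lorentz violation at the Planck scale survives undiluted to low energies:
print(relative_effect(4))   # 1.0
```

The punch line is the last line: for \(\Delta = 4\), the Planck mass drops out entirely, so nothing dilutes the breaking on the way down to accessible energies.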

However, there are lots of dimension-4 operators that are gauge-invariant yet Lorentz-breaking, so they survive both constraints. In 1998, Sidney Coleman and Sheldon Glashow showed that there are 46 CPT-even but Lorentz-breaking operators with dimensions at most four that may deform the Standard Model. All of them obey anomaly cancellation.

Just to have an idea what such terms are, recall that the electromagnetic Lagrangian is proportional to
\[ {\mathcal L}_{\rm elmg.} = \frac{E^2 - B^2}{2}. \] Sorry if my overall sign is wrong. However, if you only keep the term \(E^2\), it is still gauge-invariant. Consequently, the coefficients of the electric (time-like gradients) and magnetic (space-like gradients) terms may be different and independent; their combination still preserves the spatial rotational symmetry and the gauge symmetry.

Moreover, this ratio of the time-like and space-like coefficients may be different for photons than it is for gluons or up-quarks or down-quarks or any other elementary particle. This would mean that each type of elementary particle – and there are dozens of them – would have a different limiting speed, a different "personal speed of light". Because the interaction terms may also violate the Lorentz symmetry or export the violation from one elementary species to another, bound states of particles would generally have different "personal speeds of light" as well.
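To get a feeling for the magnitudes, here is a toy Python estimate of what different "personal speeds of light" would do to arrival times; the distances and fractions are illustrative inputs of mine, with the SN 1987A figures rounded:

```python
# If two species have limiting speeds that differ by a fraction dc_over_c,
# signals emitted simultaneously from a source at distance D arrive
# separated in time by roughly dt = (D / c) * dc_over_c.

C_LIGHT = 2.998e8        # m/s
LIGHT_YEAR = 9.461e15    # metres

def arrival_lag(distance_ly, dc_over_c):
    """Arrival-time difference (in seconds) over `distance_ly` light years."""
    return distance_ly * LIGHT_YEAR / C_LIGHT * dc_over_c

# A generic, un-tuned Planck-scale breaking feeds the dimension-4 operators,
# so percent-level speed differences are what one naively expects:
print(arrival_lag(1.6e5, 1e-2))   # SN 1987A distance: ~5e10 s, i.e. millennia
# The observed photon/neutrino near-coincidence (hours) from that single
# supernova instead bounds dc/c at roughly the 1e-8 level or better:
print(arrival_lag(1.6e5, 1e-8))   # ~5e4 s, about half a day
```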

It could be a complete mess. Obviously, none of it is observed. Relativity seems to hold perfectly and universally. This observational fact can only be explained by fine-tuning dozens of coefficients in a hypothetical Lorentz-breaking theory (at the Planck scale) with huge accuracy. To agree with all the experiments, you would almost certainly need to fine-tune hundreds of digits of your fundamental Lorentz-breaking theory. Generically, these theories predict that we must see a gigantic and de facto random violation of special relativity at long distances, too. Note that this is not just a prediction caused by an "imperfect version" of these theories: it's the very defining purpose of their being Lorentz-breaking to predict Lorentz-breaking effects. We don't observe this robust and completely universal prediction, so these theories are ruled out.

Polchinski dedicates some special attention to the foggy arguments by Gambini et al. They consider a latticized theory in a Euclidean spacetime. Discrete symmetries of such a 4-dimensional lattice may prohibit various low-dimension operators. You may say that this method of generating an approximate low-energy Lorentz symmetry is based on the observation that a "hypercube is approximately a sphere". ;-)
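The "hypercube is approximately a sphere" point can be made quantitative with a few lines of Python: averaging over the 384-element hypercubic point group kills every two-index Lorentz-violating structure, while an explicitly non-isotropic rank-4 invariant survives and only feeds higher-dimension, Planck-suppressed operators. This is my own illustrative check, not code from any of the papers:

```python
import itertools
import numpy as np

# The 4D hypercubic point group: all 24 * 16 = 384 signed permutation matrices.
group = []
for perm in itertools.permutations(range(4)):
    for signs in itertools.product([1.0, -1.0], repeat=4):
        R = np.zeros((4, 4))
        for row, (col, s) in enumerate(zip(perm, signs)):
            R[row, col] = s
        group.append(R)

# Averaging any rank-2 tensor over the group leaves only a multiple of the
# identity: the lattice symmetry forbids every two-index Lorentz-violating
# structure, which is how the low-dimension operators can be absent.
rng = np.random.default_rng(1)
T = rng.standard_normal((4, 4))
T_avg = sum(R @ T @ R.T for R in group) / len(group)
assert np.allclose(T_avg, np.trace(T) / 4 * np.eye(4))

# At rank 4, however, the structure K_{abcd} (nonzero only when all four
# indices coincide) is hypercubic-invariant yet NOT Lorentz invariant:
K = np.zeros((4, 4, 4, 4))
for mu in range(4):
    K[mu, mu, mu, mu] = 1.0
for R in group[::25]:   # spot-check invariance on a subset of the group
    K_rot = np.einsum('ai,bj,ck,dl,ijkl->abcd', R, R, R, R, K)
    assert np.allclose(K_rot, K)
# An isotropic (Lorentz-invariant) rank-4 tensor would have to obey
# K_0000 == K_0011 + K_0101 + K_0110; here 1 != 0, so K breaks Lorentz.
```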

However, these very same discrete spacetime symmetries also imply physical predictions that make these theories incompatible with very basic observations. If you want to imagine that the spacetime is a lattice, you must ultimately appreciate that the Lorentz symmetry implies that 1 second is "as long as" 299,792,458 times \(i\) meters. Here, the letter \(i\) is the imaginary unit.

The imaginary unit is necessary because the actual symmetry between the physical time and physical space isn't \(SO(4)\) but \(SO(3,1)\): the group is formally related to \(SO(4)\), but the time-like coordinate must be real rather than imaginary. The spacing \(i\) in the time-like direction behaves much like a non-integer, so once you manage to get a realistic Minkowski spacetime out of your theory, it will be impossible to guarantee that the "same spacing" for space and time is protected. If you want a morally "lattice-like" toy model of the Minkowski space, the spacings in the time-like and space-like directions should be different in your model.
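To spell out the continuation hiding behind that factor of \(i\): with \(\tau = it\), the Euclidean interval turns into the Minkowski one,
\[ ds_E^2 = d\tau^2 + dx^2 + dy^2 + dz^2 = -dt^2 + dx^2 + dy^2 + dz^2 = ds_M^2, \]
so a lattice with a universal spacing \(a\) in all four Euclidean directions would correspond, after the continuation, to an imaginary spacing \(ia\) of the physical time – and there is no symmetry left that could force the time-like and space-like spacings to agree.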

You could also try to assume that the "Euclidean latticized spacetime" is fundamental and the Minkowski physics is its analytical continuation. Could your discrete symmetries of the Euclidean spacetime replace the full-fledged Minkowski symmetry and still naturally reproduce an accurate enough illusion of the Lorentz symmetry at low energies? Polchinski explains that the answer is, once again, no. Either you violate unitarity; or you will be able to show that the smallness of your Lorentz-breaking term is independent of the energy scale and was therefore assumed as an input, a fine-tuning, even at the Planck scale: the methods of RG can't explain this smallness.

Interestingly enough, supersymmetry allows one to prohibit many dimension-four operators, so supersymmetric theories actually have the potential to be fundamentally Lorentz-breaking at the Planck scale and approximately Lorentz-invariant at low energies with an impressive accuracy. Of course, I am not proposing that this is how the world works. There's no reason to think so. Once we include gravity, the Lorentz symmetry becomes necessary at the Planck scale even if you have supersymmetry at the Planck scale at the same moment.

I believe that many of the discrete "alternative physicists" must have understood this totally robust reason why their whole research program is based on an incurable incompatibility with the observations. But they're afraid to admit the truth because they're afraid that they wouldn't even be skillful janitors at McDonald's – which is what they clearly should be if the Academia were primarily something else than a huge welfare program for ordinary people who write crackpot papers and pretend to be scientists.

And that's the memo.

Guns'n'Roses recorded the most famous and most testosterone-rich version of Bob Dylan's song "Knockin' on Heaven's Door". Lisa Randall's book is out in two weeks: pre-order it.

The book on particle physics, the details of LHC physics, and the philosophy of science is great – I've read it in some detail. ;-) The people who have written very positive or enthusiastic few-sentence reviews include Larry Summers (Treasury), J. Craig Venter (DNA), Richard Dawkins (DNA+antigod), Steven Pinker (evolutionary psychology), and a particle physics student of Lisa Randall, Bill Clinton, a former employee of the White House (as an assistant of the entertainment manager of the leisure time of the interns, or something like that).

If you need to read this blog in the form of a Booktrack (what is it? Try Sherlock Holmes for free), an immersive 10-dimensional ereader multimedia experience (Peter Thiel is an investor), you need to download an app. To do so, you need an ex-GF of mine, Ms Petra Němcová, who will press a button. Don't get distracted by the superficial aspects of the explanation how to press the button, James.

The TRF visitor #6,666,666 was a user from Melbourne, Australia – apparently a climate skeptic. If he or she can identify himself or herself and show by a visit that the data agree, he or she is eligible to win $10 and the right to write a guest blog.


  1. Quoting Polchinski: "The Standard Model admits a large number of dimension 4 operators that are gauge invariant but not Lorentz invariant, for example the spatial gradient terms for each of the 19 gauge multiplets. These lead to different 'speeds of light' for the different multiplets, so that Lorentz breaking of order one at high energy would lead to unacceptably large breaking at low energy."

    If there is gauge unification before Lorentz breaking, then the single supermultiplet would give rise to only one speed of light, right?

  2. The invariance of Lorentz as measured by Einstein's STR is due to the violation of left-right handed rotational invariance of spins (or calculated by the metrics using spinors), which is equivalent to CP symmetry breaking – or the conservation of PT in stronger interactions – implying the Lorentz invariance is broken but CPT is conserved for the covariance of the Lorentz groups, or the extended Poincaré groups with complex coefficients and with infinitely extended groups that contain all the values of the speed of light...

  3. In first order the Lorentz symmetry is violated together with the operator PT, making the speed of light appear constant throughout the spacetime continuum in the 4-dimensional universe with different curvatures, and having a torsion tensor linked to the opposite spins of the particles. The antichronous Lorentz and CPT operations restore the Lorentz symmetry as an invariant linked with the rotational invariance that "a priori" is violated by PT, producing a complex spectrum – fundamental eigenvalues of a non-Hermitian Hamiltonian matrix. But the 4-dimensional manifold is non-Euclidean, or better, Riemannian, and the spacetime curves are elliptic.

  4. The spacetime is bifurcated, without violating causality and while locally maintaining the Lorentz symmetry, but generated by the breakdown of PT, which places the speed of light as constant and invariant for inertial systems. But there are multiple spacetime continua, bifurcated as perceived in the quantum interactions, with a time with two dimensions and two chiral spinors. The spacetime is curved, with nonlinear systems.