Thursday, July 31, 2014

Unitarity and analyticity say: gravity is the weakest force

In 2006, Arkani-Hamed, Motl, Nicolis, and Vafa presented evidence that gravity is the weakest force, a claim that would later be called the "weak gravity conjecture" (WGC). Yes, the swampland on Figure 1, so different from the peaceful Czech landscape, is a territory between Poland, Ukraine, and Belarus. ;-) The conjecture means that for any non-gravitational force – think of the electrostatic or magnetostatic forces – there have to exist sufficiently light particle species carrying enough charge that their mutual non-gravitational forces exceed the gravitational ones.

Be sure that this hypothesis passes the experimental test – gravity is the weakest force in the world around us. But the point is that it couldn't have been otherwise.

This claim – an inequality of a sort – seems to be satisfied everywhere in string theory. Some partial proofs for classes of stringy vacua may at least be sketched. Moreover, even if you ignore all string constructions, the inequality is needed to avoid the black hole remnants that would make any theory of quantum gravity inconsistent. For remnants not to occur, extremal black holes have to be able to decay as well, and by the charge and mass conservation laws, that's only possible if the charge/mass ratio of some of the Hawking particles they emit obeys the opposite inequality to the one obeyed by allowed black holes – i.e. if the Hawking particles act as "superextremal black holes" as far as their charge/mass ratio goes.
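In formulas, the decay argument goes as follows (a schematic sketch in units where the extremality bound reads \(M\geq Q\), with all the \(O(1)\) normalization factors absorbed into the definitions). An extremal black hole has \(M=Q\). For it to shed its charge by emitting a particle of mass \(m\) and charge \(q\), conservation laws and the extremality bound for the remaining black hole demand

\[
M = m + M', \qquad Q = q + Q', \qquad M' \geq Q',
\]

and subtracting the first two relations gives \(q - m \geq Q - M = 0\), i.e. \(m \leq q\) for at least one emitted species – the weak gravity inequality.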




I am sure that a rule like that is true but I am equally sure that we haven't found the most accurate formulation of the inequality in the most general vacuum of quantum gravity. A related problem is that we haven't found the most "primordial" reasons behind the claim. We haven't even figured out whether the inequality is truly fundamental and far-reaching – like the Heisenberg uncertainty principle (although this is almost certainly too formidable a foe to beat) – or just some minor technical result.




A derivation of the inequality from some reliable mathematics that is more tightly connected with the formalism of quantum field theory would always clarify the situation. The first hep-th paper today is

Infrared Consistency and the Weak Gravity Conjecture
by Clifford Cheung and Grant N. Remmen. They claim to have derived some weak-gravity-style inequalities from elementary principles in quantum field theory, namely unitarity and analyticity.

The result that the authors are most proud of is the inequality (4)\[
a_1 + b_1 - b_3 + c_1 + c_2 + 3c_3 \geq 0
\] but they have two similar inequalities and some extra comments. I must tell you that the constants \(a_i,b_i,c_i\) are coefficients of four-derivative terms in the photon-graviton (extended Maxwell-Einstein) action. The letters \(a\) refer to the \(F^4\) terms, \(b\) to the mixed \(F^2 R\) terms, and the \(c\) constants to the purely gravitational \(R^2\) terms. The indices of \(a,b,c\) encode the different ways in which the Lorentz indices of \(F,R\) may be contracted.
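For orientation, the four-derivative terms look schematically like this (a sketch only – the precise list of contractions and the normalizations are in the paper):

\[
\Delta\mathcal{L} \sim a_i\,(FF)^2_i + b_i\,(FFR)_i + c_1\, R^2 + c_2\, R_{\mu\nu}R^{\mu\nu} + c_3\, R_{\mu\nu\rho\sigma}R^{\mu\nu\rho\sigma},
\]

where \((FF)^2_i\) stands for the independent contractions of four field strengths, e.g. \((F_{\mu\nu}F^{\mu\nu})^2\), and \((FFR)_i\) for mixed terms such as \(F^{\mu\nu}F^{\rho\sigma}R_{\mu\nu\rho\sigma}\).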

I am a bit confused about why some of these coefficients aren't obliged to be positive definite (or negative definite) separately from the others, but I am probably being stupid. If they know what they're doing, the previously known arguments in quantum field theory allowed pretty much any values of all these \(a,b,c\) parameters, but they have derived new constraints that look incomprehensible as pure mathematics yet can be interpreted as a manifestation of the weak gravity conjecture (it's not trivial to understand why there is a relationship at all, but I hope it can be understood in the end).

The weak gravity conjecture became one of the most well-known examples of Cumrun Vafa's "swampland paradigm": string theory, or a consistent theory of quantum gravity (these are really two phrases denoting the same thing – the first one just makes you think more constructively and the second one more generally, bootstrappily), makes lots of universal predictions and constraints that hold even if one is ignorant about the vacuum that Nature chose – constraints that used to be unknown to generic builders of effective quantum field theories.

Unitarity and analyticity, which were used as the starting point, are fundamental in principle, but their consequences for a theory always look like a human-unfriendly technical mess. Maybe we need to understand all these features at a more intuitive level to become more familiar with the true foundations of quantum gravity.
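A familiar toy example of such a consequence (the classic argument of Adams et al., sketched here from memory, with \(\Lambda\) denoting the cutoff): for a shift-symmetric scalar with

\[
\mathcal{L} = \frac{1}{2}(\partial\phi)^2 + \frac{c}{\Lambda^4}(\partial\phi)^4,
\]

the forward \(2\to 2\) amplitude grows as \(\mathcal{A}(s)\sim c\, s^2/\Lambda^4\), and a dispersion relation combined with the optical theorem expresses \(c\) as an integral over a total cross section, which is manifestly positive:

\[
c \propto \int_0^\infty \frac{\mathrm{d}s}{s^3}\,\sigma_{\rm tot}(s) > 0.
\]

The inequalities of Cheung and Remmen seem to be photon-graviton analogues of this kind of statement.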


snail feedback (17):


reader Holger said...

Dear Lubos,

Thanks for these insights. Near the end you mentioned the "human-unfriendly technical mess" – a consequence of unitarity and analyticity. Are you suggesting that getting rid of analyticity might perhaps make things easier? You are not possibly drifting toward loop quantum gravity, I suppose ;-)


reader TwoBs said...

Equation (4), as far as I understand, is about 3D gravity, where the graviton doesn't propagate and the t-channel singularity is avoided. The equations they are most likely proud of are eqs. (5-6) for the 4D case, which do not involve analyticity arguments, which would otherwise give rise to the above singularity when propagating massless modes are in the t-channel. I should mention, however, that the t-channel singularity can be avoided even using analyticity arguments in forward scattering, as explained in this paper http://arxiv.org/abs/1405.2960 where the arguments of Adams et al. have been generalized.

As for your comments about whether the coefficients are (not) separately positive definite, I'd like to say that this is quite a common situation indeed. See e.g. the positivity constraints for the chiral Lagrangian in QCD or EW interactions in e.g. http://arxiv.org/abs/hep-ph/0604255 (and btw, eq. 54 of Arkani-Hamed and co. in http://arxiv.org/abs/hep-th/0602178 isn't quite correct; the correct version being $L_4+L_5\geq 0$ and $L_5\geq 0$)


reader Luboš Motl said...

No, you can't get rid of unitarity and (probably) analyticity, it would mean inconsistency - indeed, similar to those that arise in the crackpot theories you mentioned.


I am just saying that all particular consequences of unitarity and analyticity are (obviously true but) technical details of a sort, or they seem to be.


reader Shannon said...

Totally agree with you Lubos.
There are talks in France that the sale of 2 missile launcher frigates to Russia could be cancelled even though their construction has just finished and the Russian soldiers are now being trained in France. Washington is putting enormous pressure on France to cancel this sale. Maybe the US will suggest we give them to the Islamists in Libya?


reader Casper said...

During Earth's celebrated First Contact event, when the aliens dumped space garbage out of the window of their saucer at Roswell in 1947, they knew that the garbage would be hoovered up and sequestered by nearby US air force personnel. In so doing, the aliens were tacitly demonstrating that the Pentagon was the preeminent technological and political force on the planet.

Today we have the situation where NATO, the arm of the Pentagon in Europe, should have been made redundant 20 years ago. Instead today it is spending a mere $2B on unattractive new headquarters for its office staff. This is chump change. How many billions more do you think it spends on think tanks, plans and strategies for manipulating the political and media system?

The purpose of the Security Industrial Complex (SIC), like any bureaucracy, is to grow and expand itself. When it runs out of wars it just creates the conditions for new ones. What you are seeing here is this process in motion.





reader WolfInSheepskin said...

Hey Lubos, can you comment on this: https://www.sciencenews.org/blog/context/quantum-connection-could-revitalize-superstrings


reader anna v said...

IMHO the whole mess started because of the US. I do not know whether they want an open WWIII or, as you say, it is stupidity, but the EU is caving in to the pressure of the US.


reader Dan said...

Could one follow their strategy in N=4 SYM and see what the resulting inequalities imply for the holographic dual in the bulk?


reader Tom said...

Great post, Lubos, truly profound. Your phrase [that their parasitic life will become easier if they encourage hatred in millions of cheap, low-quality voters] cuts to an absolute reality that seemingly has become the central tenet of a great many Western politicians - most particularly in the current president of the USA and what is, laughably, called his foreign policy team.


reader williamb said...

The Fermilab magnet comes after many years of experience with Nb3Sn magnets. A large first step was the D20 dipole magnet built by Lawrence Berkeley Lab in 1996. D20 was 1 meter long with a beam tube aperture of 60 mm. The design field was 12 T and the magnet operated at 13.5 T. The field quality met all specifications for use in a collider. With the superconducting wire available at the time, D20 was expensive in terms of the amount of superconductor used. Nonetheless, this magnet could have been the basis of increasing the energy of the Tevatron to well over 2 TeV. The path of experimental physics at the energy frontier could have been much different.

Modern Nb3Sn wire is nearly 4x better in performance (amps/mm^2) than the wire of D20. Contrary to your guess, the material is between three and eight times as expensive as NbTi. As this intermetallic compound is a ceramic after processing at high temperature, it is brittle and far more strain-sensitive than NbTi alloy.


The work of the LARP collaboration (BNL-FNAL-LBNL-SLAC) has been the engineering development to make very large aperture quadrupoles suitable for the redesigned interaction regions of HL-LHC. In these large quadrupoles the peak field at the conductor is ~15 T.


reader Uncle Al said...

Gravitation's mechanism is not quantized carriers like EM, Strong, and Weak interactions. A geometric mechanism must be evaluated with geometric tests - chemistry. Chemistry-based tests are disavowed by physics for their violating founding assumptions.


A good idea need only be testable. It is believable afterward.


reader tomandersen said...

What's the strength ratio of gravity to the next weakest force? Is it 10^-42? At any rate this prediction is far better than the cosmological constant calculation, which comes out wrong by a factor of 10^100 or so. Progress!
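For two electrons, the ratio is easy to check with a quick back-of-the-envelope computation (a sketch using rounded CODATA values for the constants; since both forces scale as \(1/r^2\), the ratio is distance-independent):

```python
# Ratio of the gravitational to the electrostatic force between two electrons.
# Both forces go as 1/r^2, so the distance drops out of the ratio.
G = 6.674e-11    # Newton's constant, m^3 kg^-1 s^-2 (rounded CODATA value)
k = 8.988e9      # Coulomb constant 1/(4*pi*eps0), N m^2 C^-2
m_e = 9.109e-31  # electron mass, kg
e = 1.602e-19    # elementary charge, C

ratio = (G * m_e**2) / (k * e**2)
print(f"F_grav / F_Coulomb = {ratio:.2e}")  # ~2.4e-43
```

So for electrons the answer is roughly 10^-43; for protons, whose mass is about 1836 times larger, the ratio is correspondingly bigger but still astronomically small.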


reader Luboš Motl said...

LOL, the ratio is extreme in the real world because of the "hierarchy", essentially the Higgs being much lighter than the Planck scale. That would be viewed as the "biggest mystery" by the phenomenologists because it's so "unnatural".


The continuing null results from the LHC increasingly suggest that Nature doesn't care about this form of "naturalness", at least not much.


One may interpret – I add now – the weak gravity conjecture as implying that some of this gap is perfectly natural. At least some gap is really inevitable because it would be unnatural for the scales to saturate the WGC bound.
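In formulas (a rough sketch, up to \(O(1)\) factors): for a \(U(1)\) with gauge coupling \(g\), the WGC demands a particle of charge \(q\) with

\[
m \lesssim g\, q\, M_{\rm Pl},
\]

so a weakly coupled gauge sector, \(g \ll 1\), automatically forces a charged particle parametrically below the Planck scale – a built-in gap between the two scales.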


reader de^mol said...

Lubos, check out this smoking gun:
http://a.disquscdn.com/uploads/mediaembed/images/1188/5617/original.jpg


reader Giotis said...

The article is an outrage!

“…with the potential to make strings respectable again”

Who is this Tom Siegfried signing it and what the hell is he talking about?

String theory never stopped being respectable.

I’m fed up with all these semi-ignorant wannabes who don’t have a clue of what they are talking about and mislead the public perception of cutting edge theoretical physics.


reader Dilaton said...

Exactly!

These arrogant, pompous laymen who feel entitled to troll with a big mouth about cutting edge theoretical physics in popular media deserve, strictly speaking, the "Kalashnikov-answer" ;-)

(not sure how one writes this, but I hope it is clear what I mean)


reader WolfInSheepskin said...

Thx