Wednesday, October 28, 2015

Weak gravity conjecture from composite photons

Composite photons supported by factorization in the presence of Wilson lines through wormholes

Let me begin with a general sociological observation.

A week ago, the world's most notorious anti-physics hate blog was talking about "fads" in fundamental physics. There are periods of focused excitement in which a large portion of the research community spends a lot of time on a certain topic and writes followup papers to a publication that has created sufficient excitement.

In this case, the topic was "entanglement in quantum gravity", a concept that most quantum gravity experts have found irresistible for the last 6 years or so. Previously, people would witness "fads" rooted in the construction of huge classes of realistic vacua (the KKLT-like landscape), old matrix models, \(pp\)-waves, Matrix theory, Hořava-Lifshitz gravity, or entropic gravity (I mention the latter two to make it clear that in some cases, I consider the topics of the "fads" not too exciting or downright wrong, respectively), not to mention dozens of other topics.

Needless to say, the anti-physics blog was negative about the "fads" in general. I must say: "fads" are completely natural, allocate the intellectual potential reasonably, and make the researchers happy.

It's just like when a community searches for gold and someone makes a relatively big find. It's obvious that many other people will try to find gold at nearby places and/or utilize methods similar to those the lucky guy used. Researchers think analogously. An interesting paper suggests that some sufficiently low-hanging fruit may have been discovered. It makes sense to try to pick it.

Rest assured – and I can testify (from the roughly 5 "mini-fads" or "medium fads" I ignited) – it's fun for a researcher to see that his or her excitement and the suspicion that "something extremely interesting is hiding over here" is shared by many other people, and that it wasn't a waste of time to spend time and energy on a preprint (because the work also included lots of things that you hate, or at least I did, including the formal and formatting things).

Also, I want to point out that the existence of such "fads" – medium-term events that redefine the character of the field – is a clear sign that the field is alive and kicking. It's crazily inconsistent for the anti-physics blog or anyone else to argue that the field is both stagnant and faddish because these adjectives are almost exactly opposite to one another!

Why did I start with these comments about the "fads"? Because it almost looks as if a new "mini-fad" has arrived with a 9.8-year delay. Our "Weak Gravity Conjecture", the first hep-th preprint of 2006, has inspired 13 papers in the last two months. I am pretty sure that this safely exceeds the rate in any two-month period since 2006.

The new paper I want to mention looks particularly interesting:

Wormholes, Emergent Gauge Fields, and the Weak Gravity Conjecture by Daniel Harlow
This seems like an interesting enough paper that will make sure that I won't ever confuse Daniel Harlow with (much older) Don Page again. ;-)

Harlow is a postdoc who recently moved from Princeton University to Harvard University – where the "Weak Gravity Conjecture" was formulated. And in his 36-page-long paper, he has found a new perspective from which you may look at the "Weak Gravity Conjecture", understand why it should be true, and link it to some seemingly different principles in quantum gravity.

When I read the paper, it's immediately clear that this guy knows what he's doing and is able to see under the surface. For example, I believe that the "Weak Gravity Conjecture" paper was written in a self-confident tone, but we were never sure what the exact statement of the conjecture should be, and I am personally not certain even now.

Off-topic: Czechia celebrates its main public holiday today. On October 28th, 1918, Czechoslovakia was established: the new Czechoslovak law enforcement forces smoothly took over power from the Imperial & Royal troops. It was another Velvet Revolution, too. Despite the distasteful comments in the video that some Czechs wanted a "socialist republic" (which was thankfully not quite what happened), I think it was a net benefit for the country. But as the video from Prague of that day clearly shows (look at the elegant outfits etc.), despite the First World War that had only just ended, the Habsburg monarchy – the "prison of nations" – left the Czech lands and the Czech nation in a very good shape. Austria-Hungary was clearly a multi-national state analogous to what the EU wants to be – except that it arguably made the nations co-exist in a sweeter and fairer way than the EU. Slovakia still fails to celebrate October 28th – they will realize the immaturity of that attitude once Hungarian becomes the only official language over there again. ;-)

Approximately, the principle says that in any consistent theory including gravity as well as other forces, there must exist particles P whose self-interaction agrees with the statement that "gravity is the weakest force". So the repulsive force between two P particles must exceed their attractive gravitational force. This sounds like a virtually complete statement if there is just one \(U(1)\), but if the gauge fields are numerous, complicated, non-Abelian, in the presence of generic electro-magnetic mixing etc., the precise statement isn't known to us. But there seems to be something here.

(For two electrons, the electrostatic repulsion is about \(4\times 10^{42}\) times stronger than the gravitational attraction, so gravity is clearly the weakest force in the real world. But a nontrivial fact is that this seems to be true in all stringy vacua and/or consistent vacua of quantum gravity in general.)
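Because both forces fall off as \(1/r^2\), the ratio is independent of the distance and follows from the fundamental constants alone. A minimal numeric sketch (my own illustration, using CODATA values):

```python
# Back-of-the-envelope check (my own sketch, not from the paper): ratio of
# the Coulomb repulsion to the gravitational attraction between two
# electrons.  Both forces scale as 1/r^2, so the ratio is r-independent.

k_e = 8.9875517873681764e9    # Coulomb constant [N m^2 C^-2]
G   = 6.67430e-11             # Newton's constant [N m^2 kg^-2]
e   = 1.602176634e-19         # elementary charge [C]
m_e = 9.1093837015e-31        # electron mass [kg]

ratio = (k_e * e**2) / (G * m_e**2)
print(f"F_Coulomb / F_gravity ~ {ratio:.2e}")   # roughly 4.2e42
```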

In other words, if the gauge coupling is much smaller than one, there must exist particles whose mass is much lower than the Planck mass, at most\[
M \leq g\cdot M_{\rm Planck}.
\] This is exactly the opposite inequality to the "extremality bound" that must be obeyed by charged black holes for them to avoid naked singularities. So while the large black holes must never be "more charged than massive", tiny black holes – elementary particles – are not only allowed to violate this inequality but some of them are obliged to do so. This is not in contradiction with general relativity because you may only claim that elementary particles are "charged black holes" if you appreciate that the quantum gravity corrections are substantial or huge for them.
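For concreteness, one can check the inequality for the electron. A minimal sketch (my own illustration, not from the paper; conventions for \(g\) differ by factors of order one, here \(g=\sqrt{4\pi\alpha}\) for a unit-charge particle):

```python
import math

# Sketch (my own illustration): check the WGC inequality m <= g * M_Planck
# for the electron.  Conventions for g differ by O(1) factors; here
# g = sqrt(4*pi*alpha) is taken as the U(1) coupling of a unit charge.

alpha = 1.0 / 137.035999      # fine-structure constant
g     = math.sqrt(4 * math.pi * alpha)   # roughly 0.30
m_e   = 0.000511              # electron mass [GeV]
M_Pl  = 1.22e19               # (non-reduced) Planck mass [GeV]

print(m_e / M_Pl < g)   # True: the electron obeys the bound by ~22 orders
```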

There have been several basic types of arguments in the original paper, including
  1. the absence of remnants
  2. the instability of lots of extremal black hole eigenstates – things shouldn't be stable without a corresponding symmetry, so they must be able to decay, via Hawking radiation, to the "highly charged" particle whose existence we postulate
  3. the non-existence of global symmetries in quantum gravity: gauge fields with tiny values of \(g\) "formally" satisfy this lore, but if \(g\to 0\), it is obvious that they would be cheating. So for fixed values of the masses, something should prevent you from making \(g\) too small, because if \(g\) is too small, you "morally" violate the mandatory local character of all symmetries (the local symmetry would get as close to a global one as you want, which "feels" immoral because one shouldn't need infinite accuracy etc. to verify that Nature obeys the rule "there shall be no global symmetries")
  4. the fact that all known classes of string/M-theoretical vacua seem to obey this principle; clearly, this survey of the known string vacua gives you an argument that is "as close to an empirical proof" as you can get
Some other reasons to think that the "Weak Gravity Conjecture" is right were presented by some of the followup papers. For example, Simeon Hellerman derived a bound on certain operator dimensions in CFTs. Cheung and Remmen derived a seemingly equivalent inequality for the coefficients of a Lagrangian from unitarity and analyticity. Also note that the conjecture may ban naturalness and it's often said to be "nearly equivalent" to the claim that string theory prohibits large-field inflation.

Harlow has formulated his modified version of the principle – the charge lattice has to be "fully populated". He calls it the "principle of completeness" which may be an unfortunate name because it's been used in very different contexts (like for \(1=\int dx \ket x \bra x\): but maybe it's the "same thing"?). But he also has a new reason to believe that the principle should hold:
Construct a wormhole-like configuration in quantum gravity. The holographic CFT description is composed of two components of the "boundary". So the degrees of freedom "factorize". But the wormhole allows observables, namely the Wilson lines going from one side to another, that don't "factorize".

This looks like a potential paradox. Harlow's resolution is that it must always be possible to cut the Wilson line – physically, a thin flux tube through the wormhole – into two pieces. So there must exist charged particles at the two new ends of the cut Wilson line. And he calculates that they should be "light enough".
Harlow realizes that his "completeness" is weaker than the "Weak Gravity Conjecture" because his constraint doesn't prevent you from making all the particles heavy but he goes beyond his "completeness", too.

While our original paper was somewhat conceptual, vague, or philosophical, Harlow's paper seems quantitative. The specific content of the paper is primarily reflected in his detailed analysis of a CFT, the \({\mathbb{CP}}^{N-1}\) model, a boundary description from which he wants to reconstruct the bulk gauge field. At least in this limited context, he is able to address stronger versions of many statements, e.g. a strengthened "Weak Gravity Conjecture" that also says that the predicted charged states have to be in the fundamental representation.

While I think that his term "the principle of completeness" is too general and easy to confuse with other situations, I do sympathize with this change of the perspective. Indeed, it seems like "there is something in quantum gravity" that doesn't allow you to waste the resources.

In quantum mechanics, the phase space is "completely covered" by cells of area \((2\pi\hbar)^N\). Obviously, we understand why it's so. In the classical limit, the quantum mechanical "traces over the Hilbert space" have to reduce to the "integrals over the phase space" and the prefactor may be seen to be right because locally on the phase space, the geometry is always the same geometry we know from the simple \(x\) and \(p\) and their quantization.
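This counting can be verified in a toy model. The following sketch (my own illustration, not from any of the papers) compares the quantum state count of a 1D harmonic oscillator below an energy \(E\) with the enclosed classical phase-space area divided by \(2\pi\hbar\):

```python
import math

# Toy check (my own sketch): for the 1D harmonic oscillator, the number of
# quantum states below an energy E approaches
# (enclosed classical phase-space area) / (2*pi*hbar).

hbar, omega = 1.0, 1.0
E = 1000.5 * hbar * omega        # chosen halfway between two levels

# Quantum count: levels E_n = hbar*omega*(n + 1/2) with E_n < E
n_quantum = sum(1 for n in range(2000) if hbar * omega * (n + 0.5) < E)

# Classical estimate: the orbit H = E encloses an ellipse of area 2*pi*E/omega
area = 2 * math.pi * E / omega
n_classical = area / (2 * math.pi * hbar)    # = E/(hbar*omega)

print(n_quantum, n_classical)    # 1000 vs 1000.5: agreement up to O(1)
```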

Something like that holds for the charge lattices, too. The Dirac quantization rule (derived from the requirement that the Dirac string coming from a magnetic monopole is unobservable through the phases around it) says that the magnetic charges have to be integer multiples of \(2\pi/e\) where \(e\) is the minimum electric charge. But Dirac's argument wouldn't ban a "much sparser" lattice of allowed electric and magnetic charges.

Dirac's argument says that "the lattice of allowed electric and magnetic charges cannot be denser than the normal one" but it's natural to think that there's also a principle saying that "this lattice can't be too sparse, either". And this is the "principle of completeness", as Harlow calls it, and it's basically equivalent to a form of the "Weak Gravity Conjecture".
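To make the two-sided constraint concrete, here is a small sketch (my own illustration; the numerical value of the minimum charge is arbitrary) of the Dirac pairing condition \(q_e q_m \in 2\pi\mathbb{Z}\): the maximally dense mutually consistent lattice passes, while any denser one fails the condition.

```python
import math

# Illustration (my own sketch; e_min is an arbitrary choice): the Dirac
# condition q_e * q_m = 2*pi*n bounds how DENSE the charge lattice may be,
# while completeness/WGC says it also cannot be SPARSER than this maximal
# consistent lattice.

e_min = 0.3                      # minimum electric charge (arbitrary units)
g_min = 2 * math.pi / e_min      # minimum magnetic charge allowed by Dirac

def dirac_ok(q_e, q_m):
    """True iff q_e * q_m is (numerically) an integer multiple of 2*pi."""
    frac = (q_e * q_m / (2 * math.pi)) % 1.0
    return min(frac, 1.0 - frac) < 1e-9

# The maximally dense mutually consistent lattice passes for all pairs...
assert all(dirac_ok(i * e_min, j * g_min) for i in range(5) for j in range(5))

# ...while halving the electric spacing (a denser lattice) violates Dirac:
print(dirac_ok(0.5 * e_min, g_min))   # False
```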

So the arguments supporting the "Weak Gravity Conjecture" basically "push" the theory exactly in the opposite direction than the Dirac quantization arguments. The arguments on both sides look qualitatively different from each other. I feel that there must exist a more direct way to see that the "density of the lattice of allowed charges" is exactly what we think it is, a derivation analogous to the arguments proving that the phase space cells have the area of \((2\pi\hbar)^N\). But what is the direct derivation?

There is a problem here. For each point in the charge lattice, one could imagine a different degeneracy of the allowed states. The degeneracy may be large – any integer – and why would zero, another integer, ever be banned? There must exist formulae and mathematical arguments that ban zero in certain or all cases. Maybe, all the points on the lattice are "qualitatively the same" in some new sense – perhaps one needs to carefully look at all particles as small charged black holes whose charges and masses become "continuous" when the black holes grow large.

At the end, I believe that the charge lattices aren't the only example of the "required balanced density" that must be true in quantum gravity.

For another, more speculative, example, consider the landscape of semirealistic vacua. They produce some effective field theories. But string theory only implies a discrete set of such effective field theories. So we get discrete points in the parameter space of the effective field theories. What is the density of these points on the parameter spaces of effective field theories? I believe that the density, when correctly calculated, mustn't exceed a certain bound, either.

In fact, I even believe that the density could be "calculable" in some way. And with an appropriate "inner product" on the space of some "elementary predictions", two effective field theories coming from two different string vacua must be perfectly "orthogonal to each other" – which also means perfectly distinguishable via an appropriate "basket of balanced experiments". I don't know how to prove such a claim. I can't even offer you convincing circumstantial evidence. At this point, it is a belief based on my intuition.

But I think that on a sunny day in the future, all such claims will arise from a totally new way (or new ways) to geometrize concepts that we don't visualize sufficiently geometrically at this point. In other words, I expect the links between "algebra" and "geometry" to get even deeper – much deeper – than they are today.

The history of physics and mathematics is full of many examples of this trend. But in the core of fundamental physics, the \(N\)-dimensional Hilbert space for a large \(N\) may be visualized as a representation of operators similar to \(x,p\), coordinates on a phase space, and one may derive the universal density. The basis vectors of the Hilbert space (in a countable basis) were "discrete" but there are ways to see "continuous physics" behind them, too, and the discreteness only arises when we look at very small areas of the phase space.

Similarly, the ER-EPR correspondence and related business "visualizes" entanglement in new, more geometric ways.

I believe that the lattices of allowed charges will be viewed from a new direction when all the elementary particle species – discrete choices – will be interpreted as parts of a discretuum of charged black holes or other configurations in a general-relativity-like theory. And people will carefully match the "summation" and "integration" expressions for many quantities. Even more generally, I believe that for almost every "sum" that appears in our formulae, people will find totally new ways to rewrite those sums as smoother "integrals". Perhaps all the discrete data in all of our theories will emerge from some application of quantum mechanics or quantum-like mathematics to new continuous spaces.

The geometrization needed to understand things like Harlow's "principle of completeness" goes beyond the geometrization of the Hilbert space using the phase space: the densities depend not only on \(\hbar\) but also on Newton's constant \(G\). The Bekenstein-Hawking and Ryu-Takayanagi expressions involving things like \(S=A/4G\) will be understood as sketches of some "grander mathematics" in which the spacetime (and all the effective field theories in it, including the Einstein-Hilbert action) will be fully emergent and that will make things like the "Weak Gravity Conjecture" self-evident. It is not an established result. It is a prophesy. But there may exist very good reasons to take my prophesies seriously. ;-)

