## Tuesday, October 13, 2015

### Quantum character of gravity doesn't need to be "tested"

In one form or another, the content of this text has repeatedly appeared on this blog, especially in the 2012 blog post titled Why "semiclassical gravity" isn't self-consistent. But a blog post on a pseudoscientific blog,
A newly proposed table-top experiment might be able to demonstrate that gravity is quantized,
has convinced me to write this new comment anyway. The purpose of her tirade and of the preprint
Optomechanical test of the Schrödinger-Newton equation
by Großardt and 3 co-authors is to test the idea that much of the world obeys the principles of quantum mechanics but gravity is classical. Such a combination is referred to as "semiclassical gravity" and it is often used as an approximation to describe various things. (It was the approximate framework in which Hawking derived the black hole radiation for the first time.) Because gravity is classical in this would-be theory, so is the spacetime geometry. "Semiclassical gravity" effectively ends up being equivalent to quantum field theory on a classical background geometry.

This "mixed" classical-quantum picture is fine as an approximation if the sources of strong enough gravitational fields move in a way that may be approximated by the classical motion. Well, it's never the case exactly. And the "semiclassical theory" is unavoidably inconsistent if you demand it works exactly. Quantum mechanics and classical physics just can't mix.

The "semiclassical theory" wants to use equations such as $R_{\mu\nu}-\frac{1}{2} R g_{\mu\nu} = \frac{8\pi G}{c^4} \left\langle \hat T_{\mu\nu} \right\rangle_{\ket\psi}.$ Einstein's equations are classical but the stress-energy tensor is composed of matter that fundamentally obeys the quantum mechanical laws. So all the observables are operators but because all objects in the classical Einstein's equations are $$c$$-numbers, we have to calculate the expectation value of the important operator, in this case the stress-energy tensor field, and substitute it into the classical Einstein's equations.

Sabine Hossenfelder writes:
Though nobody can make much sense of this theory conceptually, it’s infuriatingly hard to disprove.
But this sentence contradicts itself. It's spectacularly obvious that the person who wrote it doesn't have the slightest clue about this part of physics – or about logical reasoning. If one has a totally well-defined equation but it doesn't make sense conceptually, then it's immediately disproved. This simple point is hard to understand for crackpots such as Hossenfelder who only write blog posts and papers that don't make any sense, while affirmative action and other things prevent others from effectively pointing out that everything these Hossenfelders have ever written down is self-evidently worthless rubbish that contradicts even the most basic observations.

The reason is that the phenomena in the microscopic world unquestionably obey the probabilistic principles of quantum mechanics, and these microscopic phenomena influence the macroscopic motion of matter, too. The previous sentence implies that the observable properties of macroscopic objects are predictable only at the probabilistic level, too. After all, every measurement of a tiny system imprints a microscopic observable onto a macroscopic one. Because the former isn't predictable deterministically and because the observations imply a tight correlation between the former and the latter, neither can the latter be. At most the probabilistic distributions of the observable properties of microscopic and macroscopic objects may be predictable by the laws of physics. Everyone who has understood the first few weeks of an undergraduate quantum mechanics course must understand the simple point in this paragraph.

The "uncertain" microscopic phenomena often imprint themselves into gravitational fields, too. Look at an asteroid that is approaching the Earth, to make it dramatic. Send a nuclear weapon over there and break the asteroid into two pieces that will be safe. The explosion may be triggered when the spin of an electron in the detonation apparatus is measured. If the spin is up, the bomb tears the asteroid into two pieces separated "vertically"; if the spin is down, they will be separated "horizontally".

The details don't matter. But the point is that the two new pieces of rock will have positions that are uncertain and only the probabilistic distributions are predictable; but the gravitational fields of those rocks will be unquestionably linked to the positions of the rocks, anyway.

If the pieces of the original rock A are B and C, the expectation value of $$(\vec r_B + \vec r_C)/2$$ may be equal to $$\vec r_A$$ but that doesn't mean that the gravitational field of the rocks will be spherically symmetric, like the field of A was. Instead, the gravitational fields will be centered around the actual rocks. If this weren't the case, we would see the consequences everywhere. The laws of gravity wouldn't work at all because the gravitational field wouldn't be centered at the actual locations of the objects where we observed them, but at some "average" position where they were predicted to land before the measurement was made, or something like that. The gravitational field wouldn't be caused by the objects themselves but by some "ghosts" that separate from the "bodies" all the time.

This prediction is clearly refuted experimentally so the hypothesis is instantly falsified. We don't need to build special tabletop experiments that cost millions of dollars because they teach us nothing new whatsoever. For a theory to agree with any experiment sensitive to gravity, the gravitational fields just can't have properties dictated by some expectation values. They must have properties entangled (correlated) with the properties of the matter that is sourcing those fields! So there must exist probabilistic distributions (and wave functions) for properties of the gravitational fields, too. We directly observe this fact whenever we see any gravitational fields correlated with the location or state of the matter.

Unlike Hossenfelder, Großardt et al. are actually aware of the "stillborn" character of the semiclassical gravity. After they write the equation $R_{\mu\nu}+\frac{1}{2} R g_{\mu\nu} = \frac{8\pi G}{c^4} \langle \Psi|\hat T_{\mu\nu} |\Psi\rangle$ with an embarrassingly wrong relative sign on the left hand side on page 2 (the Einstein tensor is $$R_{\mu\nu}-\frac{1}{2}Rg_{\mu\nu}$$), they state the following:
Of course, such presumption is not without complications. For instance, in conjunction with a no-collapse interpretation of quantum mechanics it would be in blatant contradiction to everyday experience [6]. Moreover, the nonlinearity that the backreaction of quantum matter with classical space-time unavoidably induces cannot straightforwardly be reconciled with quantum nonlocality in a causality preserving manner [7, 8]. Be that as it may, there is no consensus about the conclusiveness of these arguments [2, 9, 10].
They list several lethal problems of this theory. When one tries to insist that there's no "objective collapse" of the wave function, the theory is instantly refuted by the basic empirical data, as I stated above, because the gravitational fields get separated from the actual locations of the objects as if they were ghosts.

If you do include some ad hoc "objective collapse" of the wave function (and the collapse of the wave function and its timing are unavoidably objective because the wave function directly influences a measurable classical degree of freedom, the spacetime metric!), you will run into equally lethal problems. Theories with such an "objective collapse" violate the Lorentz invariance. After all, the very "expectation value" in the equation only makes sense if it is evaluated on a particular slice of the spacetime, i.e. if you pick a preferred coordinate system.

All the phenomena enforcing entanglement that "look" like violations of locality to the beginner (e.g. to Einstein, Podolsky, and Rosen) become "genuine" violations of locality if some classical degrees of freedom depend on the expectation values.

Moreover, the "semiclassical theory" involves equations that are nonlinear in the wave function. Schrödinger's equation for the wave function may still "look" linear but because the form of this equation depends on the background metric and the background metric is affected by the expectation values computed from the wave function, the superposition principle simply no longer holds. Schrödinger's equation becomes nonlinear because a wave-function-dependent metric tensor appears in it.
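The nonlinearity is explicit in the single-particle, Newtonian-limit Schrödinger–Newton equation that gives the Großardt et al. preprint its name (I quote the standard textbook form here):$i\hbar\,\frac{\partial \psi(\vec r,t)}{\partial t} = \left[-\frac{\hbar^2}{2m}\nabla^2 - G m^2 \int \frac{|\psi(\vec r\,',t)|^2}{|\vec r-\vec r\,'|}\,d^3 r'\right]\psi(\vec r,t).$ Because the gravitational potential is sourced by $$|\psi|^2$$ itself, a sum of two solutions is no longer a solution: the superposition principle is gone.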

If you ignore all these violations of the basic postulates of quantum mechanics, including the superposition principle, you end up with a theory where the "wave function" is nothing else than another set of classical degrees of freedom. They have to be evaluated at a given slice, collapse according to some rules, and all these features will unavoidably lead to the violation of the Lorentz symmetry, unitarity, or both.

Großardt et al. cite the 1981 paper by Page and Geilker titled Indirect Evidence for Quantum Gravity that has 128 citations and that basically elaborates on some of the obvious arguments above. These arguments imply that theories in which a classical metric tensor is coupled to the quantum mechanical matter contradict very basic and general experiments.

But there is a lot of relevant work that is being ignored, including the interference experiments with neutrons in the Earth's gravitational field. Take e.g. the paper Observation of gravitationally induced quantum interference (full PDF) which has 860 citations. Colella et al. – and other experimenters – have verified the simple quantum mechanical theory for a neutron whose Hamiltonian is $H_n = \frac{p^2}{2m} + V(x,y,z)$ where the potential term $$V$$ includes the Earth's gravitational potential energy, too! You can make the neutron jump above the ground and observe the bound states, just as predicted by quantum mechanics. These interference experiments falsify all kinds of childish alternative theories of gravity such as the "entropic gravity" of Erik Verlinde.
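The bound states of a neutron bouncing above a mirror in the linear potential $$V=mgz$$ are a standard textbook exercise: the energies are proportional to the zeros of the Airy function. A quick sketch (using only textbook constants; the comparison to the observed ~1.4 peV ground state is the well-known result of the bouncing-neutron experiments):

```python
import numpy as np
from scipy.special import ai_zeros

# Physical constants (SI); a neutron bouncing on a horizontal mirror
hbar = 1.054571817e-34   # J s
m    = 1.67492749e-27    # neutron mass, kg
g    = 9.81              # m/s^2

# For V(z) = m g z above a hard floor, the bound-state energies are
# E_n = (hbar^2 m g^2 / 2)^(1/3) * |a_n|, a_n = zeros of the Airy Ai
a_n = ai_zeros(4)[0]                       # first four (negative) zeros
E = (hbar**2 * m * g**2 / 2) ** (1 / 3) * np.abs(a_n)

E_peV = E / 1.602176634e-19 * 1e12         # convert J -> peV
print(E_peV)   # ≈ [1.41, 2.46, 3.32, 4.09] peV
```

The lowest level, about 1.4 peV, is exactly the tiny energy scale at which these quantum gravitational bound states were observed.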

But if you think about it, they also falsify the "semiclassical gravity" and any theory where the observables are replaced by some expectation values. In the expectation-value-based theory, you would need to replace the potential energy of the neutron by the expectation value of this energy, too: $V(\hat x,\hat y,\hat z) \to V(\langle \vec r \rangle).$ But that would lead to very different predictions for the neutron interferometry experiments, too. If you tried to average the field over the position of the Earth only but not the neutron, you would get an asymmetric theory that would violate energy conservation, and so on, and so on. Every particular modification that you could propose is safely excluded.
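That $$V(\langle \vec r\rangle) \neq \langle V(\vec r)\rangle$$ for any nonlinear potential is elementary but worth seeing in numbers. A minimal sketch: a neutron in an equal superposition of two radial positions in the Earth's $$-GMm/r$$ potential (the two positions are hypothetical numbers of my choosing):

```python
import numpy as np

G = 6.674e-11     # gravitational constant (SI)
M = 5.97e24       # Earth mass, kg
m = 1.675e-27     # neutron mass, kg
R = 6.371e6       # Earth radius, m

def V(r):
    """Newtonian gravitational potential energy of the neutron."""
    return -G * M * m / r

# Equal superposition of two radial positions (hypothetical split)
r1, r2 = R, R + 1.0e6
V_of_mean = V((r1 + r2) / 2)       # the semiclassical recipe: V(<r>)
mean_of_V = (V(r1) + V(r2)) / 2    # the quantum prediction:    <V(r)>

print(V_of_mean, mean_of_V)        # the two prescriptions disagree
```

The gap between the two numbers is the kind of shift the interference experiments are directly sensitive to, which is why the expectation-value substitution is excluded.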

The last sentence in the quote from the Großardt et al. paper talks about the "absence of the consensus". You may see how catastrophic a collapse of scientific quality you immediately get when people refer to the "consensus". There is no "consensus" because some people who are allowed to write papers are simply not good enough for the scientific discipline. But that shouldn't be a reason for others to join them and write lousy papers, too. (Or to propose useless experiments.) A good paper cares about the logical robustness of its arguments, the consistency between its assumptions and conclusions, and the compatibility of its results with the direct and indirect empirical evidence. None of these things demands any "consensus". For a paper to care about the "consensus" unavoidably means to reduce the quality of the paper towards the worst people in the field. No good scientist should ever risk such a thing.

People like Sabine Hossenfelder – the worst people in the field – love to forget about all the evidence whenever they write a new paper or a new blog post. They never learn anything. Their very salary depends on their remaining completely ignorant about all these things, including the very basic insights. They pretend to do research in quantum gravity – something they don't have the slightest clue about – and it's great for them to pretend that everything is unknown, including the very question whether our world both respects quantum mechanics and contains gravity. If everything were unknown, then the extent of their knowledge (namely zero) would be totally adequate, and this is the illusion they want everyone to believe.

But it is not true that everything is unknown and there is a huge difference between a layman who has no clue, like Sabine Hossenfelder, and a researcher in quantum gravity. The difference is some 40 years of intense work and insights spread over thousands of pages of nontrivial results, including hundreds of totally groundbreaking results that have changed how we view the Universe and that will never go away.

Also, Sabine Hossenfelder and many others keep on using the jargon of anti-quantum zealots. Take e.g. the title
...demonstrate that gravity is quantized
Sorry but this is a totally wrong way of talking about the laws of physics. We may say that "gravity is quantized" at the level of the formalism – have we added the hat above the metric tensor? Have we taken the quantum character of the geometry into account? It's about our approximations or the learning process. We often learn quantum theories as if they were "obtained" from classical theories by the process of quantization. But this is just our pedagogic or psychological strategy, not something that is important in the laws of physics. The laws of physics only have an objective meaning when it comes to their predictions and explanations, not when it comes to the particular method by which we learned them. And in fact, important and general enough quantum theories cannot be uniquely reconstructed from some classical templates (or from their classical limits, to put it a bit more intelligently).

However, when we talk about the properties of the laws of physics, it makes no sense to ask whether "gravity is quantized". It is not gravity but the laws of Nature that may be assigned the q-adjective. And the q-adjective is "quantum", not "quantized". The correct laws of physics are quantum mechanical. Quantum mechanics is a framework, a general type of the laws of physics. This template is different from and totally inequivalent to the framework of classical physics; but these two frameworks play the same role.

There is just no third framework, one that would allow you to cherry-pick "what is classical" and "what is quantized" in the real world. No such consistent hybrid framework exists, which is why a sensible physicist shouldn't talk about it at all! So even if the experiment by Großardt or any other experiment ended up contradicting some predictions of a quantum mechanical theory, the right conclusion couldn't be to "abandon the purely quantum mechanical framework and adopt a hybrid one" – simply because no such viable hybrid contender exists! Instead, we would be looking for a different theory obeying the postulates of quantum mechanics.

Again, the misguided terminology involving "gravity is quantized" is being systematically promoted by people such as the loop quantum gravity crackpots who still believe (or pretend to believe) that to have a theory of quantum gravity, the only thing you need is to place some hats or caps on the symbols of Einstein's equations, and then spend the rest of the time praising what the hats mean and how Einstein had the same truth, up to the hats.

But this is a totally wrong assumption: Einstein with a hat isn't "everything". The consistent quantum theory of gravity can't be obtained by any "straightforward quantization" of the classical theory of gravity. It is not the case that the classical theory basically contains the whole truth about quantum gravity and that the transition to quantum gravity is as cosmetic a step as adding the hats on the top of someone's head. Quantum gravity is a nontrivial, self-sufficient, independent, constrained, remarkable theory and the classical Einstein's equations are just features of its $$\hbar\to 0$$ limit. But the full theory, much like a generic function $$T(\hbar)$$, can't be fully reconstructed from its $$\lim_{\hbar\to 0}T(\hbar)$$ limit!
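An elementary textbook illustration of this irreversibility (not a claim from the preprint, just the standard operator-ordering ambiguity): the two Hamiltonians$\hat H_1 = \hat x\, \hat p^2\, \hat x,\qquad \hat H_2 = \frac{1}{2}\left(\hat x^2 \hat p^2 + \hat p^2 \hat x^2\right) = \hat H_1 - \hbar^2$ have the identical classical limit $$x^2 p^2$$ but differ as quantum theories by a term of order $$\hbar^2$$ (as one checks using $$[\hat x,\hat p]=i\hbar$$). The classical limit simply doesn't remember which ordering – which $$\hbar^2$$ corrections – the full theory had.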

The relationship between the two theories goes in the opposite direction. The quantum mechanical theory is the "real deal" and the classical theory is a derived theory. It's derived by taking the limit. The correct adjective is "taken to the limit", not "quantized". I don't know how we can explain to the laymen that even when we look up in the sky and we want to explain the stars that are shining all so brightly, we've got to take the quantum mechanical theory to the limit (instead of assuming that there is an exactly valid classical theory behind the phenomena). Maybe these guys in the 1990s did a better job:

Pseudoscientists like Hossenfelder keep on making naive assumptions that have been known to be wrong for decades and they never learn any lesson that is actually critical for a proper understanding of the problems. It's terrible that this sloppy, idiotic mess has spread even to places such as Nordita, which Niels Bohr had founded as a place that should have continued the Scandinavian glory of the Copenhagen school of physics for the following century. Instead, Nordita has become a hiding camp for cranks who spend their time doubting whether they should take quantum mechanics seriously at all.

It's sad and it's outrageous.