Saturday, July 09, 2011

Large black holes have a nearly continuous spectrum

Gia Dvali, Cesar Gomez, and Slava Mukhanov are clever chaps but their newest paper,

Black hole masses are quantized
is just silly. A similar team was previously promoting "classicalons" that cannot exist. When I was reading those articles by these authors - articles trying to say that quantum gravity is nothing special and that one may have lots of states behaving similarly to black holes even without gravity - I was sure that what they had to be misunderstanding was gravity itself.

The newest paper has proved this point explicitly.

They argue that the black holes must have a brutally quantized spectrum so that the entropy (or area) is essentially an integer in some Planck-like units. It's of course nonsense, the arguments supporting such a statement are a stream of irrational consciousness, and their paper has already led to some inflated irrational ramifications.

A striking example of the latter point is that the physics arXiv blog has urged the physics community to admit, in the wake of the publication of this paper, that the LHC may create a black hole that will eat the Earth, thank Walter Wagner for his wise attempts to save the Earth, and return to the blackboard. ;-)

Now, the physics arXiv blog is a kind of crackpot hub, so you shouldn't be shocked by preposterous comments of this caliber. However, it's actually true that if it were possible to rewrite the basic laws of quantum gravity in such a brutal way - as Dvali et al. do - pretty much everything would be possible.

Bekenstein and Mukhanov

The idea of a brutally discrete spectrum of the black holes goes back at least to 1995 when Jacob Bekenstein and V. Mukhanov proposed this violent new "spectroscopy of the quantum black hole". Bekenstein has done amazing things. I think that it would have taken several more years or decades to figure out the logic of black hole thermodynamics if Bekenstein hadn't offered his insights in the 1970s. And when Bekenstein was visiting Harvard for many weeks, I liked that he didn't try to hide his Jewishness, not even in the politically correct People's Republic of Cambridge. ;-)

However, despite all those things, those ideas are nutty.

The basic misconception leading to the quantized areas is easy to describe. The entropy of a black hole, "S = A/4G", is one quarter of the area in the Planck units. The entropy is a form of information. And the information, the argument goes, comes in bits, the supposedly smallest unit of information, so the area must also be a multiple of 4G.ln(2). Correspondingly, the spacing of the mass eigenstates goes like "delta M = C/M".
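For readers who want to see where the "C/M" comes from, here is a minimal numerical sketch, assuming the standard Schwarzschild relation S = 4.pi.G.M^2 in natural units (hbar = c = k_B = 1); the function names and the sample mass are my own illustration, not anything from the paper:

```python
import math

# Schwarzschild entropy in natural units (hbar = c = k_B = 1): S = A/4G = 4*pi*G*M^2
def entropy(M, G=1.0):
    return 4.0 * math.pi * G * M**2

# If one insists that S may only jump by ln(2), the implied mass spacing is
# delta M = ln(2) / (dS/dM) = ln(2) / (8*pi*G*M), i.e. the "C/M" behavior quoted above.
def bekenstein_mass_spacing(M, G=1.0):
    return math.log(2) / (8.0 * math.pi * G * M)

M = 100.0  # a black hole of 100 Planck masses, say
print(entropy(M))                  # ~1.26e5
print(bekenstein_mass_spacing(M))  # ~2.8e-4 Planck masses between adjacent levels
```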

Except that all these statements are wrong. A bit is not the minimum possible spacing of information. In digital computers, the information may be organized into binary digits but in Nature, the information is surely not organized in this way. The information may be used to distinguish N a priori possible, equally likely outcomes. The amount of information needed to do so is ln(N), which is not a multiple of ln(2) because N is usually not a power of two.

Most large numbers N are not powers of two. Moreover, the a priori probabilities of different outcomes don't have to be equal, so an outcome may carry an amount of information that is not even of the form ln(integer). The information obtained by learning about the outcome, given a priori probabilities p_i, is simply
Information = sum_i p_i ln(1/p_i).
A well-known formula, indeed. The value of the expression above may be any real non-negative number. So there's obviously no information-theoretic reason to think that the areas of black holes should be quantized with the huge spacing L_{Planck}^2 as those authors - and others - want to believe.
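As a quick sanity check of that formula, here is a minimal sketch (my own toy distribution, not anything from the paper) showing that the information carried by an uneven distribution is generally not a multiple of ln(2):

```python
import math

def shannon_information(probs):
    # Information = sum_i p_i * ln(1/p_i), measured in nats
    return sum(p * math.log(1.0 / p) for p in probs if p > 0)

uneven = [0.5, 0.3, 0.2]
info = shannon_information(uneven)
print(info)                # ~1.0297 nats
print(info / math.log(2))  # ~1.486 bits - not an integer, so not a multiple of ln(2)
```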

Quasinormal twist

At the end of 2002 and the beginning of 2003, I (and then I with Andy Neitzke) became a darling of those "quantized black hole" people because I/we proved that the asymptotic real part of the frequency of highly damped quasinormal modes of the Schwarzschild black hole is exactly equal to a value proportional to ln(3). See the Quasinormal story on quasinormal modes.

However, the reason is that even before you do the detailed calculation, it's pretty clear that the solution is a log of a simple expression - and the simple expression just turns out to be three for the neutral non-rotating black hole. It may also be shown to be something totally different from 3 for other black holes, so any particular conjecture linking the real part of the frequency to some quantization of black hole parameters may be safely ruled out.

Thermal radiation

In reality, the spectrum of the allowed masses becomes nearly continuous for large black holes. To say the least, the spacing of the mass M is parametrically smaller than 1/M in the Planck units. If you realize that the black hole has entropy S, it's clear that the density of eigenstates goes like exp(S), so the spacing - if it is nonzero at all - should be proportional to exp(-S). It is exponentially small for large black holes.
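To get a feeling for how tiny exp(-S) is compared to 1/M, here is a rough numerical sketch for a solar-mass black hole (my own order-of-magnitude numbers, nothing more):

```python
import math

planck_mass_kg = 2.18e-8
solar_mass_kg  = 1.99e30

M = solar_mass_kg / planck_mass_kg   # ~9e37 Planck masses
S = 4.0 * math.pi * M**2             # Bekenstein-Hawking entropy of a Schwarzschild hole

print(M)                  # ~9.1e37
print(S)                  # ~1.0e77
print(1.0 / M)            # the naive "1/M" spacing: ~1.1e-38 Planck masses
# exp(-S) underflows any floating-point type, so quote its base-10 logarithm instead:
print(-S / math.log(10))  # log10 of the exp(-S) spacing: roughly -4.5e76
```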

All arguments trying to claim that this is impossible are wrong.

If the mass spectrum were quantized with #/M being the spacing, the spectrum of the radiation coming from the black holes couldn't be thermal at all. In the thermal spectrum, one gets a Planck black-body curve with wavelengths that are comparable to the black hole radius: a wavelength close to the black hole radius is the most represented wavelength among the photons Hawking-emitted by a black hole.

But if the spacing of the mass were #/M, one could only emit discrete frequencies of photons and there would only be one allowed wavelength close to the black hole radius. The next allowed one would be twice as small, approximately speaking, and so on. A totally discrete spectrum.
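Here is a minimal sketch of that discrete ladder, assuming the spacing delta M = #/M with an order-one constant (the constant and the sample mass are my own toy choices):

```python
import math

G = 1.0        # Planck units
M = 1.0e6      # a large black hole of a million Planck masses
c = 1.0        # the "#" in delta M = #/M, assumed to be of order one

radius = 2.0 * G * M   # Schwarzschild radius
dM = c / M             # the claimed spacing between adjacent mass levels

# Emitted photon energies would then be (roughly) integer multiples of dM,
# so the allowed wavelengths would form a sparse ladder tied to the radius:
for k in range(1, 4):
    wavelength = 2.0 * math.pi / (k * dM)
    print(k, wavelength / radius)   # ~3.14, ~1.57, ~1.05 - nothing resembling a smooth Planck curve
```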

This would mean that the Hawking calculation is totally wrong, even for very large black holes. It seems that Dvali et al. are saying nothing less than that. I am not sure whether they realize that I am not the only person who is convinced that to believe the Hawking calculation is completely wrong, even for very large black holes, is to be a crackpot.

The Hawking calculation is totally robust. Because of Hawking's powerful technical brain, he was able to calculate the right result for a black hole (a pretty complicated geometry!) a long time before William Unruh could reproduce a much simpler but similar result - the radiation seen by an accelerating observer. But the essence of both of these calculations is the same.

One (or Hawking) only needs to know the effective quantum field theory at low energies together with some basic rules about locality in general relativity to prove that to a low-energy observer, the behavior of a black hole must look like the emission of the continuous thermal radiation (with the grey body factors etc.).

To deny that a black hole emits thermal radiation means to deny that quantum field theory works at all - or to deny that the world is at least approximately local. At any rate, such a denial contradicts the experiments and observations. And with such a denial, you are left with no principle of physics to rely upon - so indeed, the claim that in such a world, the LHC may produce a black hole that will swallow the Earth becomes conceivable.

It doesn't mean that it is actually conceivable. Instead, it means that your assumption - that all laws of physics should be thrown out of the window - is invalid.

Imaginary part of the mass; random matrix theory

Non-extremal black holes have a nonzero temperature and they therefore evaporate. It means that they correspond to unstable - or metastable - states. Correspondingly, their mass eigenvalue can't be real. Metastable states have an imaginary part of their mass equal to Gamma/2, i.e. half of the width. That's because the squared wave function - which measures the probabilities - must decay as exp(-Gamma.t) with time (if you represent the black hole by a generalized eigenstate with a complex energy eigenvalue and purely outgoing radiation).
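A one-line numerical check of that factor of two, with made-up numbers: if the eigenvalue is E = M - i.Gamma/2, the amplitude evolves as exp(-iEt) and the probability decays as exp(-Gamma.t):

```python
import cmath, math

M, Gamma, t = 1.0, 0.1, 5.0
E = M - 1j * Gamma / 2.0            # complex mass eigenvalue of a metastable state
amplitude = cmath.exp(-1j * E * t)  # psi(t) ~ exp(-i E t)

print(abs(amplitude) ** 2)          # probability |psi(t)|^2
print(math.exp(-Gamma * t))         # the same number: exp(-Gamma * t), as claimed
```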

So non-extremal black holes correspond to microstates whose mass eigenvalue is a complex number. It's close to the real axis but it is not real. The imaginary part Gamma is probably linked to the black hole lifetime - of order M^{D-1} in the Planck units, where D is the spacetime dimension. It could be proportional to Gamma = #/M^{D-1} but I am not quite sure.

Classically, the black hole states form a continuum. And indeed, I think that at the quantum level, all the branch cuts in the scattering amplitudes actually become a very dense system of single poles. The density of these poles goes like exp(S) where S is the black hole entropy. They're very dense, indeed. Because the exponential of S increases more quickly than any power of S - or any power of the mass or radius - it means that to all orders in perturbation theory, the discreteness of the spectrum will be invisible. The black holes' quantized information and their ability to remember it can only be extracted from non-perturbative physics - physical considerations that go beyond the power-law expansion in G_{Newton}.

For all approximate, perturbative questions, the black hole spectrum is continuous.

When you look at the non-perturbative structure of the spectrum, yes, I do think that you will find isolated (but very densely spaced) poles at very specific complex values of the mass. I think that the typical statistical pattern of how these poles are separated will be governed by random matrix theory. Random matrix theory is all about the distributions of eigenvalues of a Hermitian matrix with random entries.

Eigenvalues of such a random matrix won't be distributed by the Poisson distribution. Instead, they will repel from each other according to a very specific formula. For this reason, if you "beep" at time "t" every time there is a microstate with mass "M=t", the beeping will be much more uniform than the Poisson beeping, i.e. much more regular than the popping of popcorn in your microwave.
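Here is a minimal numerical sketch of that level repulsion (my own toy experiment, not anything from the papers discussed): diagonalize a random Hermitian matrix and compare its nearest-neighbor spacings to a Poisson sequence of the same density.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 400

# A random Hermitian matrix (a GUE-like ensemble)
A = rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N))
H = (A + A.conj().T) / 2.0
levels = np.sort(np.linalg.eigvalsh(H))

# Nearest-neighbor spacings from the middle of the spectrum, normalized to unit mean
mid = levels[N // 4 : 3 * N // 4]
s_rmt = np.diff(mid); s_rmt /= s_rmt.mean()

# Poisson comparison: independent, uniformly thrown "levels" on the same interval
pois = np.sort(rng.uniform(mid[0], mid[-1], size=mid.size))
s_pois = np.diff(pois); s_pois /= s_pois.mean()

# Level repulsion: tiny spacings are rare for the random matrix, common for Poisson
print("spacings below 0.1:", (s_rmt < 0.1).mean(), "(random matrix) vs",
      (s_pois < 0.1).mean(), "(Poisson)")
```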

Incidentally, the zeros of the Riemann zeta function - those that are relevant for the $1 million Riemann Hypothesis - are also distributed according to random matrix theory. I actually believe that there exists a very simple kind of black hole somewhere in the landscape of quantum gravity whose scattering amplitudes are proportional to 1/zeta(f(M)) and the zeros may be interpreted as black hole microstates. An analytic solution with a zeta function that knows about the random matrix theory and other things. But I can't tell you the details.

Extremal black holes and entropy including "pi"

So there are no huge degeneracies in the spectrum - unless one has huge symmetries. The degeneracy is nearly one for all levels - so the spacing of the mass must be exponentially small. The only exceptions are extremal black holes whose masses are determined by the charges and angular momenta. Such states don't evaporate - their mass is strictly real - and they inevitably have a huge degeneracy (this is the kind of degeneracy that Strominger and Vafa, and those who followed in their footsteps, have been calculating since late 1995).

For extremal black holes, the mass is calculable from the charges and the angular momenta (think of the BPS bounds etc.) and because these two things are essentially integers, the mass is an integer, too. If you care, the entropy S of the simplest extremal black holes is a multiple of "pi". Well, the remaining coefficient isn't an integer - it is a square root of an integer. But if you wanted to find traces of the "area quantization", the unit of entropy wouldn't be ln(2) or ln(3): it would be "pi" (which is, by the way, -i.ln(-1)). Thinking that the entropy has to be a multiple of ln(2) or ln(3) anywhere in Nature is a sign of a kindergarten naiveté.
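For concreteness, the textbook Strominger-Vafa example: the five-dimensional D1-D5-momentum black hole has S = 2.pi.sqrt(Q1.Q5.N) with integer charges, so S/pi is twice a square root of an integer rather than a multiple of ln(2) or ln(3). A toy evaluation (the particular charges below are my own arbitrary choice):

```python
import math

Q1, Q5, N = 4, 5, 9                          # integer charges of a D1-D5-P extremal black hole
S = 2.0 * math.pi * math.sqrt(Q1 * Q5 * N)   # Strominger-Vafa entropy

print(S / math.pi)      # 2*sqrt(180) ~ 26.83: twice a square root of an integer
print(S / math.log(2))  # ~121.6: nothing integer-like in units of ln(2) either
```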

Flawed arguments

I have already mentioned that the main motivation behind all these misconceptions is people's difficulty in working with continuous numbers. They have infinitely many digits, right, which is hard, so people prefer to think that the Universe is a digital computer which only needs a few digits. But unlike many humans, Nature is not mentally limited in this sense: She has no problem whatsoever to calculate things exactly and to work with continuous numbers.

In the new paper, Dvali et al. offer several wrong arguments (without admitting that they're wrong) for why the spacing of the mass eigenstates of a black hole should be huge. First, they say that with a more continuous spectrum, the production rate would be infinite. It wouldn't. For each individual microstate, the probability amplitude for the production is exponentially suppressed in just the right way so that the total cross section to create a black hole is similar to the geometric cross section of the future black hole - a finite number.
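A heuristic bookkeeping sketch of that cancellation, working with logarithms to avoid overflow; the schematic exp(-S) suppression per microstate and the toy mass are my own assumptions, not a derivation:

```python
import math

M = 50.0                             # Planck-unit mass of the would-be black hole
S = 4.0 * math.pi * M**2             # its Bekenstein-Hawking entropy
sigma_geo = math.pi * (2.0 * M)**2   # geometric cross section ~ pi * R^2

# Each of the ~exp(S) microstates is produced with a cross section suppressed by ~exp(-S) ...
log_sigma_per_microstate = math.log(sigma_geo) - S
# ... so summing over all exp(S) microstates gives back a finite, geometric total:
log_sigma_total = log_sigma_per_microstate + S

print(math.exp(log_sigma_total), sigma_geo)   # both ~ pi*(2M)^2, no divergence
```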

They also say that the virtual black hole states would induce infinitely strong couplings if the spacing were too small. That's wrong, too. Black hole microstates behave much like light species of elementary particles. But they're not equally fundamental. By the UV/IR duality, including the virtual black holes as independent intermediate states is a sort of double-counting.

We know all these things even in perturbative string theory: one-loop amplitudes are calculated as the integral over the shape of the torus, tau, restricted to the fundamental domain only. It's just incorrect to think that quantum field theory with independent particle species works up to arbitrarily high values of the masses of intermediate particles. Quantum gravity is *not* (and cannot be, because of consistency criteria) a quantum field theory (in the bulk) in the strictest sense - another point that Dvali et al. seem to completely misunderstand. Even perturbative open string theory - which is arguably closer to a local field theory than the proper regime of quantum gravity - modifies the way the loop amplitudes are calculated (fundamental domain of "tau" only etc.).

In another section, 2.3, they ask whether a black hole state may have an "infinite norm". Now, this is a dumb question, indeed. It has nothing to do with black holes per se. In strict mathematics, we define the Hilbert space to be a set of vectors and each of them has a finite norm. The Hilbert space is e.g. the L^2 space of square-integrable functions of some variables. So the infinite-norm states are never included in the Hilbert space. This is the standard convention - a simple result of a mathematical definition of the Hilbert space - and it has nothing to do with any dynamical details of quantum gravity or any other theory.

But even if we do allow states with an infinite norm, they differ from states with a finite norm just by some rescaling, a normalization! They only differ by the norm. So you may include states that are infinitely rescaled (up or down), too. You will gain exactly zero information about physics. There clearly can't be any physical argument that builds on the "finiteness of norm" of some states.

Various Hamiltonians may have discrete, continuous, or mixed spectra and there is no general way to show that one of them is impossible because they're not impossible. But it's true that because of the finite entropy of a black hole, there should only exist a finite number of microstates - and only in this sense, the spectrum is discrete (but exponentially dense).


The misconception that the real world is discrete - that it is, in fact, composed of gigantic pixels - is a widespread fallacy that plagues the brains not only of people who have never been serious physicists in any sense - for example, the proponents of loop quantum gravities and similar toxic junk - but even of some people who have been serious physicists. It's completely wrong. Nature is fundamentally continuous and all the discreteness we find in it is derived. The black hole spectrum is continuous to all orders of perturbation theory and there is no contradiction.

Hawking's calculation of the radiation must also be valid perturbatively; its errors - responsible for the spurious loss of information - are exponentially tiny in a proper counting. Quantum gravity, much like perturbative string theory, enjoys the virtues of the UV/IR mixing which guarantees that the high-mass, seemingly "elementary" particles shouldn't be counted as virtual states in the same way as light elementary particles because it would be a double-counting.

And finally, the LHC cannot produce any dangerous black holes or any other dangerous objects, for that matter.

And that's the memo.


snail feedback (3) :

reader Stephen King said...

Could Dvali be right if G (as in 4G.ln(2) or 4G.ln(M)) is not an absolute constant?

reader Luboš Motl said...

I don't understand what it means for a dimensionful quantity to be constant. It's constant or non-constant depending on the units. For example, in any universe, in G=1 units, G is constant. ;-)

reader Brian G Valentine said...

I'm misunderstanding something; "constant" would mean to me, independent of the motion of the observer (action density and entropy always are)