Wednesday, September 16, 2009

Myths about the minimal length

Many people interested in physics keep on believing all kinds of evidently incorrect mystifications related to the notion of a "minimal length" and its logical relationships with the Lorentz invariance. Let's look at them.



Myth: The breakdown of the usual geometric intuition near the Planck scale - sometimes nicknamed the "minimum length" - implies that the length, area, and other geometric observables have to possess a discrete spectrum.

Reality: This implication is incorrect. String theory is a clear counterexample: distances shorter than the Planck scale (and, perturbatively, even the string scale) cannot be probed because there exist no probes that could distinguish them. Consequently, the scattering amplitudes become very soft near the Planck scale and the divergences disappear.

However, there is no discreteness of geometric quantities - such as the radii of compact circles in spacetime. And general "intervals" or "surfaces" inside the spacetime can't even be localized with Planckian precision, which is also why their proper lengths and areas, if demanded with better-than-Planckian accuracy, are not even well-defined observables in string theory: what can't be measured operationally often can't be defined theoretically, either. ;-)

Many other aspects of quantum geometry or minimal length - such as T-duality, a critical, "maximal" Hagedorn temperature, or some kinds of noncommutativity - do emerge when we approach the smallest distance scales. But naive discreteness is just one conceivable way in which the usual concepts of continuous geometry could be modified at short distances. And it is a way that is not chosen by quantum gravity for many reasons, including its incompatibility with the Lorentz symmetry that we will discuss later in the text.
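
To see one of these stringy aspects concretely, here is a minimal numerical sketch of T-duality - my own illustration, with the string scale "alpha'" set to one - showing that the closed-string spectrum on a circle of radius "R" coincides with the spectrum on a circle of radius "alpha'/R" once the momentum and winding numbers are exchanged, so radii below the self-dual radius describe no new geometry:

```python
# A minimal sketch of T-duality (my own illustration; alpha' set to one):
# the momentum + winding contribution to the closed-string mass^2 on a
# circle of radius R is (n/R)^2 + (w*R/alpha')^2. Sending R -> alpha'/R
# and exchanging n <-> w maps the spectrum to itself.
alpha_prime = 1.0  # string scale, set to one for illustration

def mass_squared_contrib(n, w, R):
    """Momentum plus winding contribution to the closed-string mass^2."""
    return (n / R) ** 2 + (w * R / alpha_prime) ** 2

R = 0.3
R_dual = alpha_prime / R
for n, w in [(1, 0), (0, 1), (2, 3)]:
    original = mass_squared_contrib(n, w, R)
    dual = mass_squared_contrib(w, n, R_dual)   # n <-> w on the dual circle
    print(n, w, original, dual)                 # the two values coincide
```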




Even if you talk about non-relativistic quantum mechanics, you can see that naive "quantization" is not the only possible way in which quantum mechanics affects quantities whose units are the same as the units of Planck's constant. While the angular momentum "J" is literally quantized in units of "hbar/2", the action "S" has the same units but it is not quantized in any sense.

Instead, when the action "S" - more precisely, its difference between two relevant configurations in spacetime - becomes comparable to "hbar", it just means that interference and other quantum phenomena become important and the classical approximation breaks down. But a discreteness of "S" is not needed for this breakdown, and as you can verify by a simple integral, the trajectories that contribute to Feynman's path integral have continuous values of "S".
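
If you want to see the order-of-magnitude logic in numbers, here is a minimal sketch - my own illustrative figures, not a precise calculation - comparing the typical action of a macroscopic pendulum and of an electron in a hydrogen atom with "hbar":

```python
# A minimal numerical sketch (illustrative numbers only): "quantum" means
# that the relevant action is comparable to hbar, not that the action has
# a discrete spectrum.
hbar = 1.054571817e-34   # J*s

# Macroscopic example: a 0.1 kg pendulum bob moving at ~1 m/s for ~1 s.
# The kinetic part of the action is roughly (1/2) m v^2 * t.
m, v, t = 0.1, 1.0, 1.0
S_pendulum = 0.5 * m * v**2 * t

# Microscopic example: an electron in a hydrogen atom, over one orbital
# period; the action per period is of order hbar, which is what makes
# the system quantum.
m_e = 9.109e-31          # kg
v_e = 2.19e6             # m/s, roughly the Bohr velocity
T_e = 1.5e-16            # s, roughly one orbital period
S_electron = 0.5 * m_e * v_e**2 * T_e

print(f"S_pendulum / hbar ~ {S_pendulum / hbar:.1e}")   # ~ 5e32: classical regime
print(f"S_electron / hbar ~ {S_electron / hbar:.1f}")   # ~ a few: quantum regime
```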

To summarize, the adjective "quantum" in "quantum mechanics" or "quantum gravity" means that the theory must be compatible with the quantum postulates, including the superposition principle for complex amplitudes that are interpreted probabilistically. The adjective surely doesn't mean that all quantities have to possess a discrete spectrum.

Myth: A structure of links or surfaces filling a Minkowski space may be Lorentz-invariant, at least statistically.

Reality: First, let us start with a picture that actually is Lorentz-invariant. Imagine that you fill your spacetime with a large number of points whose coordinates are random, uniformly distributed, and independent.



The picture above was taken from an article about the egalitarian bias. The coordinates were actually not quite random. They were taken from the digits of "pi". But be sure that you can't tell - unless you are a crazy genius. ;-)

This picture has the striking feature that if you Lorentz-transform it, i.e. if you shrink it "gamma" times in the Southwest direction and expand it "gamma" times in the Southeast direction, it will look statistically indistinguishable from the original. After all, the statistical distribution needed to generate the picture only depended on the "density of dots". It didn't depend on any distances or other "metric" data.
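
You can check this statistical invariance yourself with a minimal Monte Carlo sketch - the rapidity, window size, and point count below are my own illustrative choices - that boosts a cloud of uniformly random points and compares the point density before and after:

```python
# A minimal Monte Carlo sketch (assumptions mine, not from the post):
# fill a 2D spacetime diagram with uniformly random points and apply a
# boost, i.e. stretch one null direction by exp(eta) and shrink the other
# by exp(-eta). The point density - the only data the distribution depends
# on - is unchanged, so the boosted cloud is statistically
# indistinguishable from the original.
import numpy as np

rng = np.random.default_rng(0)
t, x = rng.uniform(-1, 1, 100000), rng.uniform(-1, 1, 100000)

eta = 0.5                         # rapidity of the boost
u = (t + x) * np.exp(+eta)        # one null coordinate expands
v = (t - x) * np.exp(-eta)        # the other null coordinate shrinks
t_b, x_b = (u + v) / 2, (u - v) / 2

def density(tc, xc, half=0.3):
    # number of points per unit area inside a central window
    inside = (np.abs(tc) < half) & (np.abs(xc) < half)
    return inside.sum() / (2 * half) ** 2

print(density(t, x), density(t_b, x_b))   # the two densities agree within statistical noise
```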

However, whenever a picture does depend on the metric data, it is easy to demonstrate that it cannot be Lorentz-invariant, not even statistically. It will behave just like the luminiferous aether - a substance that picks a preferred reference frame i.e. a preferred "time" direction in the spacetime diagram.

To see an example, imagine a typical picture that would be relevant for the Planck scale model of spacetime according to the naive, discrete theories:



Well, yes, it's a U.S. map designed to check the four-color theorem but we will call it a "spin foam". ;-) Now, boost it to see what it would look like from a different inertial system:



All the states of the union have been stretched in the Northwest direction and shrunk in the Southwest direction. And you can see this fact by looking at the resulting picture. The elementary blocks inside the picture are simply closer to, say, ellipses stretched in the Northwest direction than to ellipses stretched in the Northeast direction.

Much more generally, whenever your caricature of the spacetime near the minimal length contains one-dimensional edges, two-dimensional surfaces, or any higher-dimensional objects of finite length, area, or their generalizations, you're guaranteed that the picture will break the Lorentz symmetry.

How can I prove it? It's very easy. Just look at a big enough region of spacetime that you expect to "statistically" respect the rules of Lorentz symmetry. And compute the probability distribution of the different "directions" of the edges, or the different "two-directions" of the surfaces (the space of projective two-forms), and so on. By assumption, you should get a Lorentz-invariant distribution on the set of different directions.

But there can't exist any such Lorentz-invariant distribution: the volume of the Lorentz group - and the volume of the space of the directions - is infinite, and there is no uniform, normalizable distribution on an infinite-volume manifold because the normalization factor would have to equal "1/infinity".



Let me just show you what, e.g., the space of spacelike directions in spacetime looks like. It looks like a hyperboloid - pretty similar to this orange tower in Kobe, Japan - and its volume, even the volume measured by the Minkowski metric, is infinite. So if you tried to "average" a picture over the Lorentz group, "almost all" the edges would have to be null because "almost all" elements of the Lorentz group correspond to boosts approaching the speed of light.

(There is really no uniform measure on the Lorentz group - so the previous sentence assumes that you are OK with a cutoff version of it. You can't really "average over the whole Lorentz group" because the latter has an infinite volume.)
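
Here is a small numerical sketch of that statement - my own illustration, using the hyperboloid of unit timelike vectors in 2+1 dimensions, whose invariant area up to a rapidity cutoff "eta_max" equals "2.pi.(cosh(eta_max)-1)"; the hyperboloid of spacelike directions diverges in the same way:

```python
# A small numerical sketch (my own illustration): the invariant area of the
# hyperboloid of unit timelike vectors in 2+1 dimensions, integrated up to a
# rapidity cutoff eta_max, is 2*pi*(cosh(eta_max) - 1). It grows without
# bound as the cutoff is removed, so no normalizable, Lorentz-invariant
# ("uniform") distribution over directions can exist.
import numpy as np

def hyperboloid_area(eta_max, steps=200000):
    # crude midpoint-rule integral of the area element dA = 2*pi*sinh(eta) d(eta)
    d_eta = eta_max / steps
    eta = (np.arange(steps) + 0.5) * d_eta
    return float(np.sum(2 * np.pi * np.sinh(eta)) * d_eta)

for eta_max in (1.0, 5.0, 10.0, 20.0):
    closed_form = 2 * np.pi * (np.cosh(eta_max) - 1)
    print(eta_max, hyperboloid_area(eta_max), closed_form)  # diverges exponentially with the cutoff
```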

But even if you designed a "spin foam" where all the edges were null, it would still hide a preferred reference frame as long as the coordinate lengths of these null edges were finite. They would also have a distribution on the space of possible coordinate lengths, and there is no convergent distribution that is Lorentz-invariant, either.

Whatever "picture" - similar to pictures that can be drawn in the Euclidean space (or on a sheet of paper) - you embed into the Minkowski space, it is guaranteed that it will violate the Lorentz symmetry by "order 100%" effects. All hypotheses that fill the spacetime with particular "discrete junk" at the Planck scale that can be drawn on a paper are incompatible with the Lorentz symmetry at the Planck scale which also means that they're ruled out because violations of the Lorentz symmetry by O(1) effects at the Planck scale have been ruled out experimentally.

This argument is very general and falsifies pretty much all non-stringy models of the spacetime near the Planck scale. More specific theories may be disproved in easier ways.

For example, loop quantum gravity (LQG) implies a formula for the proper area of two-surfaces in the form of a sum of positive, real, discrete terms ("8.pi.G.gamma.sqrt[j(j+1)]"): each term comes from one intersection of the surface with the spin network in space (or the spin foam in the spacetime). The areas can't ever be imaginary.
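
Just to make the quoted formula explicit, here is a tiny sketch - in Planck units and with an illustrative value of the Immirzi parameter "gamma" - evaluating the area contribution per puncture for the lowest spins; every eigenvalue is manifestly real and positive, which is exactly the property used in the next paragraph:

```python
# A tiny sketch (my illustration): evaluating the quoted LQG area spectrum
# 8*pi*G*gamma*sqrt(j*(j+1)) per puncture, in Planck units (G = hbar = c = 1)
# and with an illustrative value of the Immirzi parameter gamma. Every
# eigenvalue is manifestly real and positive.
import math

G = 1.0          # Newton's constant in Planck units
gamma = 0.2375   # an often-quoted value of the Immirzi parameter; illustrative here

def area_quantum(j):
    return 8 * math.pi * G * gamma * math.sqrt(j * (j + 1))

for j in (0.5, 1.0, 1.5, 2.0):
    print(f"j = {j}: area contribution = {area_quantum(j):.3f} Planck areas")
```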

But relativity implies that small proper areas may be both real and imaginary (fully spacelike, ++, or partly timelike, -+), which means that LQG cannot be reconciled with special relativity: the partly timelike surfaces are literally impossible in LQG. Note that I don't need to know anything else about LQG, or make it fully well-defined, to prove this incompatibility. It's the very basic philosophy that is incompatible with relativity. The project is doomed from the very beginning.

The spacetime - the vacuum - must be genuinely clean and empty, even at the Planck scale, and the only things that can occupy it are Lorentz-invariant "ground state" oscillations associated with something that can be parameterized as Lorentz-invariant quantum fields.

Myth: The existence of a kind of "minimal length" inevitably leads to a violation of the exact Lorentz symmetry.

Reality: This myth is a sort of combination of the previous two. Again, string theory is a counterexample. We must realize that the "minimum length" we sometimes talk about in string theory - or any other realistic theory - doesn't mean a strict discreteness of the geometric observables. Also, the "minimum length" doesn't imply that every observable with units of length is bounded from below by the "minimum length" in each reference frame.

The latter would be clearly incompatible with the Lorentz symmetry, as the U.S. maps above were meant to demonstrate. If you said that the wavelength of a photon - as measured in a particular reference frame - can't be shorter than the Planck length, that would violate relativity simply because the wavelength and the frequency of a photon can be increased or decreased by an arbitrarily large or small factor by the relativistic Doppler effect. Photons clearly can have wavelengths shorter than the Planck length and some of those that are flying through outer space probably do.
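
A back-of-the-envelope sketch - with purely illustrative numbers - shows how large a boost is needed to push an ordinary optical photon below the Planck length; the required Doppler factor is huge, but nothing in relativity forbids it:

```python
# A back-of-the-envelope sketch (numbers illustrative): how large a boost
# is needed to blueshift a 500 nm optical photon below the Planck length?
# The relativistic Doppler factor is D = exp(eta) = sqrt((1+beta)/(1-beta)),
# and the wavelength transforms as lambda' = lambda / D. Nothing in
# relativity bounds D, so no frame-dependent "minimum wavelength" can exist.
import math

planck_length = 1.616e-35     # meters
wavelength = 500e-9           # meters, an ordinary optical photon

doppler_needed = wavelength / planck_length       # ~3e28
rapidity_needed = math.log(doppler_needed)        # ~66

print(f"Doppler factor needed: {doppler_needed:.2e}")
print(f"Rapidity needed:       {rapidity_needed:.1f}")
```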

However, the "minimum length" - the typical scale where new phenomena linked to "quantum geometry" start to routinely occur - always refers to quantities with the units of length that were measured in a Lorentz-invariant way. The proper distance "L" between two (non-BPS) particles is such an example. The radius "L" of a black hole that admits an approximately smooth relativistic description is another example. In both cases, "L" can't really be shorter than the Planck scale, in the usual sense.

Nevertheless, it's manifestly not true that every quantity that can be called "L" and whose units are meters must be longer than the Planck scale, and the wavelength of a particle's de Broglie wave was our example. You must be careful what "L" or what "length" you are talking about. In our world, there is more than just one "length". ;-)

And because our world is relativistic, only inequalities about the "invariant" distances, and not coordinate distances in particular inertial systems, may be valid universally. And only some of them can be. The people who believe in inequalities that constrain coordinate distances are making the very same mistake that the defenders of the luminiferous aether - against relativity - were making 100 years ago. They have made no progress in their understanding of spacetime whatsoever, and the new ambitious term, "quantum gravity", doesn't change anything about the fact that their thinking is stuck in the very same kind of aether that prevented the 19th century physicists from discovering relativity.

Quantum gravity doesn't allow you to sleep during the high school course that covers relativity. ;-) In quantum gravity, you need to "know" and "feel" relativity and you also need much more than that.

Myth: It is a logical fallacy - or a "sin" - to assume Lorentz symmetry because such an assumption leads to a circular reasoning.

Reality: This kind of criticism directed against science is common with the "new Einsteins" or, more precisely, the new counter-Einsteins. ;-) Philosophers are obsessed with their critiques of "circular reasoning", too. See, for example, philosopher Amit Hagar who criticizes the people who "presuppose the Lorentz invariance".

But in physics, it doesn't hurt if you can show that "A" implies "B" and "B" implies "A". Quite on the contrary. It suggests that your structure of ideas and propositions is robust because it's a very good thing when principles and assumptions logically follow from each other in physics. Why? Because it shows that these assumptions are not quite independent from each other, and a theory with a small number of independent assumptions - able to agree with the same large number of observations - is more convincing than a theory with a larger number of independent assumptions.

What Amit Hagar and many others fail to realize is that if "B" follows from "A", it doesn't mean that "B" is wrong. Also, it doesn't mean that there is no other evidence supporting "B" (or "A", for that matter), even if Amit Hagar fails to see this other evidence. One always has to "presuppose" something to derive any conclusions in physics: it's no sin to "presuppose". Quite on the contrary: the whole scientific method is about "presupposing" hypotheses and comparing them with - and falsifying them by - the evidence.

What matters for the fate of the propositions and hypotheses in science is whether "A" or "B" or both are correct. And the Lorentz invariance is correct. Such insights always boil down to observations, although the link is often indirect and needs a lot of thinking.

Just to be sure, I also "presupposed" the Lorentz invariance of the "spin foam" because I wanted to show a contradiction between any "spin foam" and the Lorentz invariance - which is what I have done.

By the way, if you want to see how obsessed Amit Hagar is with critiques against "presupposing", you may look at another place that is separated by a few paragraphs only.

He criticizes some physicists for their "sin" which was to "presuppose" the validity of thermodynamics during their analyses of Maxwell's daemon. Well, it was clearly no sin because what's important is whether they were right or wrong. And they were damn right: the second law of thermodynamics is valid in all macroscopic situations and Maxwell's daemon couldn't exist. Maxwell's daemon is a nice exercise but it is only important pedagogically today, as a tool to explain what it means that the second law of thermodynamics always holds and the perpetuum mobile device of the second kind can't be constructed.

The fact that someone - even Maxwell himself - can "imagine" that there could exist a counterexample to an important statement TD2 doesn't imply that there actually exists a counterexample to TD2. And it can't prevent other people from "presupposing" TD2 when they analyze the real world or gedanken experiments. The right answer about the validity of TD2 is a priori unknown. A posteriori, the answer is that TD2 holds.

But it was completely necessary for some scientists to "presuppose" the laws of thermodynamics because it was the key general hypothesis that ultimately turned out to be the right answer. So someone had to "presuppose it" and look at the evidence in both directions - to see whether the evidence favors the hypothesis or falsifies it. And it evidently favors it while the hypothetical existence of (specific or general) counterexamples has been falsified.

The second law is a "postulate" or "axiom" in thermodynamics but it may be demonstrated in many examples experimentally and it can be proven in general theoretically, using the tools of statistical physics, too. That's what Boltzmann's H-theorem does, among other things. Feynman's fifth Messenger Lecture explains why the daemon is impossible very nicely.

Myth: The proper areas of well-defined two-surfaces in a Minkowski space depend on the velocity of the frame from which you observe them.

Reality: This particular myth is really silly but it has appeared in a paper by Amit Hagar in a philosophical journal published by Elsevier. Note that the file is called "LM2" - guess why.

Hagar published a normal physics paper about physics questions except that all of his statements are incorrect. That doesn't matter because he's a philosopher and the paper is only going to be read by other philosophers who have no idea about the subject, either.

At any rate, he wanted to "prove" that a majority of LQG papers about the Lorentz invariance are wrong. You know, LQG is a wrong approach that cannot agree with the gravitational and quantum phenomena that exist in the world around us. But there are many "very small" partial results in the papers that are correct or essentially correct. The calculations of the Lorentz violation - or at least the proofs of its existence - are an example.

Hagar didn't like the results. So his method of "proof" was to simply write the negations of all the statements, without any valid calculation or argument whatsoever. There are several invalid arguments in his paper, however. For example, he claims that some papers make the mistake of not appreciating that the proper areas of surfaces in the Minkowski spacetime should depend on the "velocity".

Except that they don't. The proper areas are invariant. What can get Lorentz-contracted are the lengths of three-dimensional objects. But that's because in different inertial systems, these lengths are measured as the proper lengths of differently tilted line intervals in the spacetime, depending on the speed. But once we actually talk about a particular line interval or a particular 2-surface in the four-dimensional spacetime, its proper length or area is an invariant!
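
If you want to verify the invariance explicitly, here is a minimal sketch - with my own example vectors - that computes the proper area squared of a small parallelogram from Minkowski inner products and checks that a boost doesn't change it, while a partly timelike surface indeed has a negative, i.e. "imaginary", proper area:

```python
# A minimal sketch (my own example vectors): the proper area of a small
# parallelogram spanned by two 4-vectors a, b in Minkowski space satisfies
#   area^2 = (a.a)(b.b) - (a.b)^2,   with the metric diag(+,-,-,-),
# and it is unchanged by a boost, because all the inner products are.
# A purely spacelike surface has area^2 > 0; a partly timelike one has
# area^2 < 0, i.e. an "imaginary" proper area.
import numpy as np

metric = np.diag([1.0, -1.0, -1.0, -1.0])      # Minkowski metric

def dot(u, v):
    return u @ metric @ v

def area_squared(a, b):
    return dot(a, a) * dot(b, b) - dot(a, b) ** 2

def boost_x(rapidity):
    ch, sh = np.cosh(rapidity), np.sinh(rapidity)
    L = np.eye(4)
    L[0, 0], L[0, 1], L[1, 0], L[1, 1] = ch, sh, sh, ch
    return L

a = np.array([0.0, 1.0, 0.2, 0.0])             # spacelike
b = np.array([0.0, 0.0, 0.0, 1.0])             # spacelike
c = np.array([1.0, 0.3, 0.0, 0.0])             # timelike

L = boost_x(0.8)
print(area_squared(a, b), area_squared(L @ a, L @ b))   # ~ +1.04 both times: spacelike surface
print(area_squared(c, b), area_squared(L @ c, L @ b))   # ~ -0.91 both times: partly timelike, "imaginary" area
```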

All other hints of an argument in Hagar's paper are incorrect, too. But I don't have enough time to describe them here. It would be a waste of time for you and me to study such crappy papers.

Myth: One 31-GeV photon found by the Fermi collaborations was not "statistically enough" to rule out the O(1) Lorentz violation at the Planck scale.

Reality: Well, it is enough. It is simply not true that the "confidence level" depends only on the "number of particles", so that the number of particles would have to be high for the confidence level to exceed 99.9%. One must carefully calculate the probabilities, which is what the 204 authors of the Fermi paper did.

Because the 31-GeV photon had a very high energy (which would predict a pretty long delay) and because it came really exactly when it should have come (within less than 0.1 seconds), the authors could calculate that the confidence associated with this event is higher than 4.4 sigma, and probably closer to the upper end of their interval, 5.6 sigma. The event is "certainly" associated with the relevant gamma-ray burst - and the comparably "confident" consequences (a delay surely shorter than tens of milliseconds) follow by similar considerations. See the bottom of page 1 and page 2 of their supplementary material.
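
For orientation, here is a hedged back-of-the-envelope sketch - the light-travel time below is an illustrative round number, not a figure from the Fermi paper - of the delay that an O(1), linear Planck-suppressed modification of the dispersion relation would predict for a 31-GeV photon:

```python
# A hedged back-of-the-envelope sketch (distance and numbers illustrative,
# not taken from the Fermi paper): for a linear, O(1) Planck-suppressed
# modification of the dispersion relation, the naive arrival delay is
#   delta_t ~ (E / E_Planck) * (light travel time).
# For a ~31 GeV photon crossing a cosmological distance this is a sizable
# fraction of a second, far above the observed sub-0.1 s coincidence.
E_photon = 31.0           # GeV
E_planck = 1.22e19        # GeV
light_travel_time = 3e17  # seconds, an illustrative ~10 billion years

delta_t = (E_photon / E_planck) * light_travel_time
print(f"naive delay for O(1) linear Lorentz violation: {delta_t:.2f} s")
```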

The probability that this photon could have been "noise" was calculated to be 17 parts per billion which is de facto zero.
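
As a quick consistency check - my own sketch, using the one-sided Gaussian tail - a noise probability of 17 parts per billion translates into roughly 5.5 standard deviations, consistent with the 4.4-5.6 sigma range quoted above:

```python
# A quick check (my own sketch) of how a tiny probability of being noise
# translates into a number of "sigmas", using the one-sided Gaussian tail.
from scipy.stats import norm

p_noise = 17e-9
print(f"{norm.isf(p_noise):.2f} sigma")   # about 5.5
```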

So one photon is actually enough to rule out O(1) violations of the Lorentz symmetry at the Planck scale. But be sure, additional bursts and photons will be observed in the future, and the probability that the laws of physics will suddenly change is pretty unlikely to exceed those 17 parts per billion. ;-)

