Thursday, March 23, 2006

Inaccurate clocks in quantum gravity?

Rodolfo Gambini, Rafael Porto, and Jorge Pullin have a paper
in which they argue that there exists some "additional" fundamental contribution to decoherence arising just from the fact that someone is able to measure time. They even say that there is a "fundamental loss of unitarity" in a theory of quantum gravity. I believe that all these papers are completely wrong. Five years ago, we investigated similar statements by Ng and van Dam together with Raphael Bousso and other guys who were in Santa Barbara.

The authors had claimed that quantum gravity implies that the uncertainty of measuring a distance "L" is always of order "L^{1/3} Lplanck^{2/3}", which is a pretty huge uncertainty that grows quite rapidly with "L". They followed Wigner and constructed a completely stupid kind of inaccurate clock. Because they had inaccurate clocks, they argued that all clocks must be inaccurate (probably because they are the smartest people in the world and no one can construct better clocks than they can), and therefore, they argued, the inaccuracy of their dumb clocks is a fundamental principle of Nature.




What a piece of crap. Of course, in reality, one can construct much better clocks. In fact, John Baez had independently realized the same idea as we did and constructed clocks in which the pendulum is attached to a very massive rod - something that guarantees that the position of the pendulum does not fluctuate. By adjusting the ratio of the masses of the two segments of these better clocks, Baez can get the uncertainty "Lplanck" for any measurement of distance "L" which is independent of "L". (The guys who constructed clocks in the 17th century knew similar tricks quite well.) Ng and van Dam tried to "refute" Baez's (and our) statements here - but the only thing that they did to "refute" the criticism was to say that they did not like Baez's (and our) clocks.

I suppose that if you see that even John Baez agrees with me, my argument must have a sociological weight. ;-)

Of course, Baez's result is the correct result for the uncertainty, and I believe that every grad student should be able to calculate these things after 2 semesters of quantum field theory. Quantize the gravitational field. Write the metric as
  • g_{mn} = eta_{mn} + sqrt(Gnewton) h_{mn}
The coefficient of "sqrt(Gnewton)" is inserted to make the kinetic term for the spin-two field h_{mn} canonically normalized, schematically:
  • S = integral (partial_{k} h_{mn})^2
What is the uncertainty of the proper length between two points separated roughly by "L"? This uncertainty depends on the uncertainty of "h". Clearly, the uncertainty of "h" (whose dimension is mass in 4D) must be given by the cutoff "Lambda" because in the theory for "h" whose action is written above, there is no other scale (no other dimensionful parameter). By averaging over distances on the rod of length "L", the cutoff "Lambda" can be chosen as small as "1/L".

The typical standard deviation of "h_{mn}" is of order "1/L", which means that the typical standard deviation of "g_{mn}" defined using "h_{mn}" is simply "Lplanck/L" because "sqrt(Gnewton)=Lplanck". The standard deviation of "g_{mn}" away from its background value measures the relative error in the measurement of the distance. If the relative error is "Lplanck/L", the absolute error in the measured length is "Lplanck". Note that this simple result only holds in four dimensions; in other dimensions, you need other power laws. Any result that is parametrically larger than this simple estimate is just an artifact of using suboptimal types of clocks and other tools.
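The estimate above can be turned into a few lines of arithmetic. A minimal sketch in Python (the numerical value of the Planck length is approximate, and the function name is just for illustration):

```python
# Sketch of the 4D estimate: delta(h) ~ 1/L (the cutoff Lambda ~ 1/L),
# so delta(g) ~ sqrt(Gnewton) * delta(h) = l_P / L, and the absolute
# error in the distance is delta(g) * L = l_P, independently of L.

l_P = 1.6e-35  # Planck length in meters (approximate value)

def absolute_length_uncertainty(L):
    """Absolute error of a measured distance L, from the 4D estimate."""
    delta_h = 1.0 / L        # fluctuation of the canonically normalized h_{mn}
    delta_g = l_P * delta_h  # relative metric fluctuation, sqrt(Gnewton) = l_P
    return delta_g * L       # absolute error = relative error times L
```

Whether "L" is a meter or the Hubble radius, the two factors of "L" cancel and the absolute uncertainty stays at the Planck length, which is the point of the argument above.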

If I generalize their flawed reasoning more ambitiously: many people think that if they are unable to prove, calculate, or measure something accurately and reliably, and if everything seems fuzzy and uncertain to them, then no one else can prove, see, or measure these things accurately either. Well, all these people are completely wrong because other people can simply be more skillful than they are - and some people could also say that the believers are "arrogant". But of course, I think that they're mostly nice people after all: they are just plain wrong.

D-dimensional generalization

Incidentally, in D spacetime dimensions, the fluctuation of "h_{mn}" must still be given by a power of "L", namely "L^{1-D/2}", and the corresponding relative error of "g_{mn}" - and hence the relative error in the measured distances - therefore goes like "sqrt(Gnewton)/L^{D/2-1}". Note that as the dimension gets very large, the relative error of the measured distances decreases much faster as these distances grow. In "D=2", the relative error is always of order 100 percent. In three dimensions, the absolute error for distances "L" goes like "sqrt(L)". In low dimensions, the long-distance (infrared) fluctuations are very large; in high dimensions, they are very small.
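The D-dimensional scaling can be sketched in a few lines as well. In D dimensions, "sqrt(Gnewton)" carries dimension "length^{(D-2)/2}", so in Planck units the relative error is "(l_P/L)^{D/2-1}" (the function below uses those units; it is a sketch of the scaling, not a precision formula):

```python
# Sketch of the D-dimensional scaling: delta(h) ~ L^{1-D/2}, and with
# sqrt(Gnewton) ~ l_P^{(D-2)/2} the relative error of a distance L is
# (l_P / L)^{D/2 - 1}, written here in Planck units (l_P = 1).

def relative_error(L, D):
    """Relative error of a measured distance L in D spacetime dimensions,
    in Planck units (l_P = 1)."""
    return (1.0 / L) ** (D / 2 - 1)
```

For "D=2" the expression is identically one (errors of order 100 percent for any "L"); for "D=4" it reproduces "Lplanck/L", i.e. a constant absolute error; for "D=3" the absolute error "relative_error(L, 3) * L" grows like "sqrt(L)".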

1 comment:

  1. An amusing clock-like quantum gravity postulate is
    "Any particle orbiting under a coupling constant K must sweep an area greater than the Planck length in one unit of quantum time"

    For a circular orbit of a particle of mass m with linear speed v, it amounts to the bound

    K/m > 2 v c l_P

    with l_P the Planck length, of course.

    Now for neutrinos the coupling comes from the Fermi interaction, thus

    K \approx hc (m/M_{EW})^2

    and taking v \approx c we have an approximate bound

    hc (m/M_{EW})^2 / m > 2 c * c * l_P

    thus

    m / M_{EW}^2 > 2 (c l_P / h)

    and then

    m m_P > (M_{EW})^2

    A very traditional see-saw bound for the neutrino, even if nowadays people prefer to use the nearby m_GUT instead of m_Planck.
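    As a rough numerical check of the final bound above (a sketch; M_EW ~ 250 GeV and m_P ~ 1.22e19 GeV are assumed round values, not precision inputs):

```python
# Rough numerical check of m * m_P > 2 * M_EW^2, the bound derived above.
# The input values are round approximations.

M_EW = 250.0   # electroweak scale in GeV (approximate)
m_P = 1.22e19  # Planck mass in GeV (approximate)

m_min_GeV = 2 * M_EW**2 / m_P  # minimal neutrino mass allowed by the bound
m_min_eV = m_min_GeV * 1e9     # convert GeV to eV
```

    The bound lands around 1e-5 eV, the familiar see-saw ballpark mentioned in the comment.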
