Monday, June 06, 2005

Deviations from Newton's law seen?

We have just returned from a lunch with Markus Greiner, and one of the very interesting rumors I learned from an experimental colleague of ours - who has also told me that the news is publishable (though I won't reveal his name for now) - is the following.

The most careful and respected experimental group in its field, which resides at the University of Washington - Eric Adelberger et al. - seems to have detected deviations from Newton's gravitational law at distances slightly below 100 microns, at the "4 sigma" confidence level. Because they are so careful, and because the implied assertion would be revolutionary (or, alternatively, would make them look spectacularly dumb), they intend to increase the significance to "8 sigma" or so and to construct different, complementary experiments testing the same effect, which could take a year or two (or more...) before the paper is published. You know, there are many effects, such as van der Waals forces and other, possibly unexpected, condensed-matter phenomena, that become important at multi-micron scales and must be separated from the gravitational signal.

Their measured force at these multi-micron distances is weaker than expected from Newton's formula. This is unusual because in most models, one expects the force to grow stronger at short distances. For example, in theories with two large extra dimensions (the ones that motivated these experiments) and quantum gravity at 10 TeV, the two large dimensions should be about 100 microns in size. The gravitational force law "1/r^2" should switch to "1/r^4" at shorter distances - it should become stronger. One can also add massive scalar fields, but the most typical expectation is that a force obtained from the exchange of a scalar field is also attractive (between objects with the same "scalar charges"); it would therefore strengthen gravity. Of course, a new (massive) spin 1 field would, on the contrary, lead to repulsive interactions. All these scenarios with new intermediate particles are problematic because we can ask: why haven't we seen these messengers yet? In most contexts, one would need to make their couplings incredibly small - as weak as gravity - which also suppresses the production of the messengers. But in this case, in a sense, the new interaction is a manifestation of a new mode of gravity (like its Kaluza-Klein modes).
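To see roughly where the figure of 100 microns comes from, here is a minimal back-of-the-envelope sketch in Python, assuming the standard matching M_Pl^2 ~ M_*^(2+n) R^n between the four-dimensional Newton constant and the higher-dimensional gravity scale M_*; the factors of order one (and the choice of reduced versus non-reduced Planck mass) are dropped, so the output should be read only as an order-of-magnitude estimate:

```python
# Back-of-the-envelope estimate of the size R of n large extra dimensions,
# using M_Pl^2 ~ M_star^(2+n) * R^n with all O(1) factors dropped.

HBARC_GEV_M = 1.9733e-16    # hbar*c in GeV*m (converts 1/GeV to meters)
M_PLANCK_GEV = 1.22e19      # non-reduced Planck mass in GeV

def extra_dimension_size_m(m_star_gev, n=2):
    """Radius R (in meters) below which 1/r^2 gravity turns into 1/r^(2+n)."""
    r_inverse_gev = (M_PLANCK_GEV**2 / m_star_gev**(2 + n)) ** (1.0 / n)  # R in 1/GeV
    return r_inverse_gev * HBARC_GEV_M

for m_star_tev in (1, 5, 10):
    r = extra_dimension_size_m(m_star_tev * 1e3, n=2)
    print(f"M_* = {m_star_tev:>2} TeV, n = 2  ->  R ~ {r * 1e6:.0f} microns")
```

With these conventions, a fundamental scale of a few TeV puts the crossover near 100 microns, while 10 TeV gives a few tens of microns; either way, the order of magnitude matches the distances probed by the Washington experiment.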

For the latest paper of that group - which could not announce any new effects yet - see
Note that 100 microns is also the scale of the vacuum energy - the cosmological constant. In other words, "1/(100 microns)^4", in natural units, roughly equals the energy density of the observed vacuum energy (see the quick numerical check below). Our experimentalist says that they would like to promote the idea that the gravitational oscillations do not exist below 100 microns. Apparently, the Washington physicists like to think about the theoretical concept of a "fat graviton" as being relevant for their current observations. Only virtual particles with wavelengths longer than 100 microns (small enough momenta) should contribute to the vacuum energy, because the modes of the graviton do not exist at shorter distances. This also means that gravity becomes weaker at shorter distances. Let me choose, with the help of Ann Nelson, the following references for this kind of idea:
Hsin-Chia Cheng also told me about a similar fat graviton framework by
Yes, I find these ideas about the origin of the cosmological constant more predictive (they are correlated with the predicted submillimeter deviations), more exciting, and more likely than the solution via the anthropic landscape. It remains to be seen how a fat graviton like this one may be derived from a more fundamental theory - namely (or "for example") from string/M-theory. No doubt, a fat graviton seems unnatural in all stringy models I know of - especially because the photon and other particles can't be fat (they're definitely smaller than 10^{-17} meters according to our observations). But there could exist a way to put these ideas on firm ground.
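As a sanity check of the numerology above, here is a minimal Python estimate showing that the energy scale hbar*c / (100 microns), raised to the fourth power, indeed lands near the observed vacuum energy density; factors of order one are ignored, and the numbers are only an order-of-magnitude check:

```python
# Order-of-magnitude check: does (hbar*c / 100 microns)^4 match the observed
# vacuum energy density? (O(1) factors are ignored throughout.)

HBARC_EV_M = 1.9733e-7     # hbar*c in eV*m
EV_TO_GRAM = 1.783e-33     # mass equivalent of 1 eV via E = m*c^2, in grams

L = 100e-6                                 # 100 microns, in meters
E = HBARC_EV_M / L                         # corresponding energy scale, ~2 meV
rho_ev_per_m3 = E**4 / HBARC_EV_M**3       # energy density E^4, in eV per cubic meter
rho_g_per_cm3 = rho_ev_per_m3 * EV_TO_GRAM / 1e6

print(f"E = hbar*c / (100 microns) ~ {E * 1e3:.1f} meV")
print(f"E^4 as a mass density      ~ {rho_g_per_cm3:.1e} g/cm^3")
# The observed vacuum energy density is about 6e-30 g/cm^3, so the two
# scales agree up to a factor of order one.
```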

My thanks also go to Ann Nelson for her speedy and detailed comments about the history of these ideas.

6 comments:

  1. Google is your friend: c*h/eV (but check also http://pdg.lbl.gov/2004/reviews/consrpp.pdf )

    So we are also in the neutrino mixing mass range, are we?
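    In numbers (a quick sketch, using hbar*c ~ 0.197 eV*micron):

    ```python
    # Sketch of the conversion hinted at above: the length scale that goes
    # with an energy E is hbar*c/E (or h*c/E, which is 2*pi times longer).
    import math

    HBARC_EV_UM = 0.19733          # hbar*c in eV*micrometers

    def length_um(e_ev):
        return HBARC_EV_UM / e_ev

    print(f"hbar*c / 1 eV   = {length_um(1.0):.3f} um,  h*c / 1 eV = {2 * math.pi * length_um(1.0):.2f} um")
    print(f"hbar*c / 100 um = {HBARC_EV_UM / 100 * 1e3:.1f} meV")
    # For comparison, the solar neutrino mass-squared splitting,
    # sqrt(8e-5 eV^2) ~ 9 meV, sits in the same millielectronvolt ballpark.
    ```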

    ReplyDelete
  2. Given that the distance in discrete Kaluza-Klein models is related to the mass (or Higgs coupling) content of the theory, the mass range of the lightest fermion suggests that these kinds of models could be included in your list. So:

    Lizzi, Mangano, Miele hep-th/0009180
    Arkani-Hamed, Schwartz hep-th/0302110
    Chamseddine hep-th/0301112?

    Generically, we should remind the younger generation that an extra discrete dimension, related to the Dirac operator, was the main theme of particle models from Connes-Lott and followers, and also from Coquereaux et al., in the early nineties, both teams working under the generic heading of noncommutative geometry.

    ReplyDelete
  3. h*c/eV equals approximately 1 micron in length. But that value is purely coincidental and bears no physical significance whatsoever. That's because while h and c are natural physical constants, the eV is NOT; the eV is a totally arbitrary quantity resulting from the artificial selection of a unit set. It goes like this:

    The eV equals the electron charge, which is a natural constant, times the volt, which is an artificial unit.

    The volt is defined as the electrical potential difference across which one joule of energy is gained or lost when one coulomb of charge moves through it. So the volt relies on two artificial units: the coulomb and the joule.

    The joule is defined as the energy gained by moving one meter under a pushing force of one newton, or the kinetic energy of two kilograms moving at one meter per second. Here the kilogram, meter, and second are all artificial units: the kilogram is whatever that piece of alloy in Paris happens to weigh, the meter was originally one part in 4x10^7 of the Earth's circumference through the poles, and the second is one part in 86400 of one Earth day.

    And the coulomb is defined as the amount of charge that flows in one second when the current is one ampere. One ampere is the current that causes a static force of 2x10^-7 newtons per meter of wire when two parallel wires carrying it are separated by one meter. So the ampere, too, is an artificial unit, depending on the meter and on the arbitrary number 2x10^-7.

    Therefore the eV is really NOT an authentic natural physics unit, but merely a convenient engineering unit. Its value bears no physical significance whatsoever.

    Quantoken

    ReplyDelete
  4. If gravitational attraction does appear to decrease at distances closer than 100 microns = 10^(-2) cm = 10^31 Planck Lengths (Lp)
    and
    if the corresponding energy density, 1/(10^31 Lp)^4 or about 10^-124 in Planck units, is taken to be the cosmological constant energy density
    and
    if the cosmological constant is seen as dark energy that causes large-scale expansion of our universe
    then
    could it be that
    when your experimental system is within a volume corresponding to the cosmological constant energy density
    the dark energy "force" becomes effectively a repulsive force that cancels some of the ordinary gravitational attraction and therefore makes gravity appear to be weaker at distances less than 10^31 Lp ?

    In other words, the dark energy "force" may be something that manifests itself only on large scales (expansion of the universe) and on small scales (at or below the scale of its energy density).

    That hypothesis can be tested by observing the degree of weakening at 100 microns and comparing it with the repulsive acceleration that would be produced by a dark energy model of the cosmological constant.
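    For the record, a quick numerical check of the first "if" above (a sketch, taking the Planck length to be about 1.6 x 10^-35 m):

    ```python
    # Check of the Planck-length numerology: how many Planck lengths is 100 microns,
    # and what vacuum energy density (in Planck units) does that length imply?
    PLANCK_LENGTH_M = 1.616e-35   # Planck length in meters

    d_m = 100e-6                              # 100 microns
    n_lp = d_m / PLANCK_LENGTH_M              # ~6e30, i.e. roughly 10^31 Planck lengths
    rho_planck_units = 1.0 / n_lp**4          # energy density 1/d^4 in Planck units

    print(f"100 microns ~ {n_lp:.1e} Planck lengths")
    print(f"1/(100 microns)^4 ~ {rho_planck_units:.1e} in Planck units")
    # i.e. of order 10^-123 to 10^-124, the famously tiny number usually quoted
    # for the observed cosmological constant in Planck units.
    ```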

    Tony Smith
    http://www.valdostamuseum.org/hamsmith/

    ReplyDelete
  5. I think that Tony's point is valid in the following context:

    Quoting relevant bits from John Baez, out of the Cosmology and Astrophysics section:

    http://math.ucr.edu/home/baez/physics/General/open_questions.html
    Einstein introduced dark energy to physics under the name of "the cosmological constant" when he was trying to explain how a static universe could fail to collapse. This constant simply said what the density of dark energy was supposed to be, without providing any explanation for its origin. When Hubble observed the redshift of light from distant galaxies, and people concluded the universe was expanding, the idea of a cosmological constant fell out of fashion and Einstein called it his "greatest blunder". But now that the expansion of the universe seems to be accelerating, a cosmological constant or some other form of dark energy seems plausible.

    Is the universe really full of "dark energy"? If so, what causes it?

    As mentioned above, evidence has been coming in that suggests the universe is full of some sort of "dark energy" with negative pressure. For example, an analysis of data from the Wilkinson Microwave Anisotropy Probe in 2003 suggested that 73% of the energy density of the universe is in this form! But even if this is right and dark energy exists, we're still in the dark about what it is.

    The simplest model is a cosmological constant, meaning that so-called "empty" space actually has a negative pressure and positive energy density, with the pressure exactly equal to minus the energy density in units where the speed of light is 1.
    However, nobody has had much luck explaining why empty space should be like this, especially with an energy density as small as what we seem to be observing: about 6 x 10^-30 grams per cubic centimeter if we use Einstein's E = mc^2 to convert it into a mass density.

    /quote

    ~

    The origin of Einstein's "dark energy" doesn't require any explanation if negative pressure increases when you create matter from his matter-less G=0 vacuum, because the pressure of the vacuum necessarily falls to less than zero as the pressure on the vacuum increases when you condense enough energy over a finite region of space to achieve positive matter density, pressure, and gravitational curvature, using Einstein's E=mc^2 to convert to the mass density of matter.

    It doesn't require knowledge of the quantum vacuum for Einstein to postulate at this time that constant matter generation *causes* the vacuum to expand in proportion to matter creation. Hoyle did the same thing, but at least Einstein had a built-in mechanism for it, unlike Hoyle, who did not.

    "Empty space should be like this" if the mass density of the vacuum falls in proportion to the creation of new matter, while negative energy increases in proportion to the resultant increase in negative pressure, so that the size of the universe becomes a disproportional function of the number of particles in it, because it requires a greater volume of the vacuum to attain positive matter density each time that you create more matter.

    ...when your experimental system is within a volume corresponding to the cosmological constant's energy density... then gravitational curvature falls off as pressure becomes negative when the volume of the vacuum corresponds to its mass density, which... cancels some of the ordinary gravitational attraction and therefore makes gravity appear to be weaker at distances less than 10^31 Lp

    In other words, the dark energy "force" may be something that manifests itself only on large scales (expansion of the universe) and on small scales (at or below the scale of its energy density).

    In other words, negative pressure over-rides locally positive gravitational curvature on a universal scale, and below the scale of the vacuum's mass density.

    ReplyDelete
  6. Could negative pressure explain the tunneling effect seen with closely bound orbital electrons?

    If there is no interaction between two points for a particle on a set trajectory and speed, would the distance between the two points matter if they are within a finite region of space with negative vacuum pressure?

    ReplyDelete