Friday, November 25, 2005

Supernovae: Lambda is constant

Scientific American reports that observations of 71 distant supernovae suggest that dark energy is indeed constant, i.e. the cosmological constant. The upper bound on the pressure/energy density ratio w is now

  • w = pressure/energy density < -0.85,

very close to "-1". More details are in the paper by Ray Carlberg et al.
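
As a quick reminder of why "-1" corresponds to a strictly constant vacuum energy density (a standard FRW fluid-equation sketch, not anything taken from the paper itself):

    % energy conservation for a perfect fluid with p = w*rho in an expanding FRW background
    \dot{\rho} + 3\,\frac{\dot{a}}{a}\,(\rho + p) = 0
        \quad\Longrightarrow\quad
    \rho \propto a^{-3(1+w)}
    % w = -1 gives rho = const, i.e. a cosmological constant;
    % w = -0.85 would still let rho dilute, but only as slowly as a^{-0.45}.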

ATLAS at Wikipedia

Incidentally, if you look at Wikipedia today, the featured article is about the ATLAS experiment.

It was mostly written by SCZenz, a graduate student of experimental particle physics at Berkeley. Very good job! Incidentally, some people think that the Bush-haters usually like science. That does not seem to be the case at Wikipedia. The user Z6 had only "edited" two pages before he was eradicated by an administrator: the ATLAS experiment, which Z6 proposed for deletion, and George W. Bush, whose page Z6 had modified in a very similar way. To show how difficult it is to destroy the communists, Z6 has also reincarnated as a new user, V9, who attempted to destroy the ATLAS experiment again. ;-)


snail feedback (3) :


reader nigel said...

'Dark energy' comes originally from Saul Perlmutter's 1997-8 experimental results on supernovae at vast distances: their recession obeys the Hubble law and is not slowing down due to gravity.

In the October 1996 letters page of Electronics World, http://www.softcopy.co.uk/electronicsworld/ , the editor sold a paper of mine at £4.50 a copy.

This paper gave the mechanism whereby the inward reaction to the outward expansion in the big bang is the cause of gravity.

This model is like an explosion, see Explosion physics.

You get an outward force of the big bang from F = ma, where a is the linear variation in recession speeds (c - 0) divided by the linear variation in times past (15,000,000,000 years - 0), i.e. a = c / (15 x 10^9 years) ≈ 6 x 10^-10 m/s^2.

Multiply this by the mass of the surrounding universe around us and you get an outward force of about 7 x 10^43 newtons. (I've allowed for the increased density at great distances, due to the earlier time in the big bang when the universe was denser, by the factor I derive here.)
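
A minimal sketch that merely reproduces the arithmetic quoted above; the implied mass is my back-calculation from the quoted force, not a figure given in the comment:

    # Reproduce the comment's numbers: a = c / t_universe, then back out the
    # effective mass that the quoted force F = 7e43 N would correspond to.
    c = 3.0e8                      # speed of light, m/s
    t = 15e9 * 3.156e7             # 15,000,000,000 years in seconds
    a = c / t                      # "outward acceleration" used in the comment
    F = 7e43                       # outward force quoted in the comment, newtons
    print(f"a ~ {a:.1e} m/s^2")    # prints roughly 6e-10 m/s^2
    print(f"m ~ {F / a:.1e} kg")   # roughly 1e53 kg of surrounding mass implied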

This outward force has an equal and inward reaction, due to Newton's 3rd law of motion.

The screening of this inward force by fundamental particles geometrically gives gravity to within 1.7%, as shown here.

Feynman said, in his 1964 Cornell lectures (broadcast on BBC2 in 1965 and published in his book The Character of Physical Law, pp. 171-3):

"The inexperienced, and crackpots, and people like that, make guesses that are simple, but [with extensive knowledge of the actual facts rather than speculative theories of physics] you can immediately see that they are wrong, so that does not count. ... There will be a degeneration of ideas, just like the degeneration that great explorers feel is occurring when tourists begin moving i on a territory."

On page 38 of this book, Feynman has a diagram which looks basically like this: >E S<, where E is earth and S is sun. The arrows show the push that causes gravity. This is the LeSage gravity scheme, which I now find Feynman also discusses (without the diagram) in his full Lectures on Physics. He concludes that the mechanism in its form as of 1964 contradicted the no-ether relativity model and could not make any valid predictions, but finishes off by saying (p. 39):

"'Well,' you say, 'it was a good one, and I got rid of the mathematics for a while. Maybe I could invent a better one.' Maybe you can, because nobody knows the ultimate. But up to today [1964], from the time of Newton, no one has invented another theoretical description of the mathematical machinery behind this law which does not either say the same thing over again, or make the mathematics harder, or predict some wrong phenomena. So there is no model of the theory of gravitation today, other the mathematical form."

Does this mean Feynman is after a physical mechanism, or is he happy with the mathematical model? The answer is there on pages 57-8:

"It always bothers me that, according to the laws as we understand them today, it takes a computing machine an infinite number of logical operations to figure out what goes on in no matter how tiny a region of space, and no matter how tiny a region of time. How can all that be going on in that tiny space? Why should it take an infinite amount of logic to figure out what one tiny piece of space/time is going to do? So I have often made the hypothesis that ultimately physics will not require a mathematical statement, that in the end the machinery will be revealed, and the laws will turn out to be simple, like the chequer board with all its apparent complexities."

Best wishes,
Nigel


reader nigel said...

The problem with "dark energy", as I predicted via the Oct. '96 EW paper, is that it is false.

The prediction of GR that gravity slows the big bang expansion is wrong because it ignores the mechanism for gravity, which says gravity is the asymmetry in a push.

The inward push is due to the surrounding expansion. Since supernovae at great distances have less of the universe's mass beyond them, there is less inward push from that direction, so they aren't slowed down. This is what drives the expansion as observed.

Hence the whole dark energy thing is a myth due to assuming gravity is not caused by the expansion.

I predicted this in EW before Perlmutter made his "discovery", which was why I tried to get the idea published.


reader nigel said...

The gauge boson radiation causing gravity and electromagnetism is DISPLACEMENT CURRENT. Catt shows that Maxwell got his interpretation of this 'displacement current' wrong by ignoring the time it takes light-speed electricity to flow along the capacitor plates. His co-authors, Drs. Walton and Davidson, mathematically worked out how Heaviside's transmission line theory can be applied to explain the charging curve of a capacitor, which is compared to reality and is a correct prediction.

Catt's error follows from Heaviside's false idea that the light-speed electricity Poynting-Heaviside vector is the same as light, with the two conductors guiding the light which travels in the insulator between them. This is false, as we know electricity originates as electrons in conductors and such like, although it is true that the measured speed is that in the insulator, not the wires. What is going on is plain from quantum electrodynamics: gauge bosons/photons are being exchanged via the insulator between the two conductors. This is why parallel wires carrying currents attract/repel.

In addition, the radio transmitter and receiver aerial form a capacitor with air as the dielectric. The radio waves are displacement current energy, detectable just when the varying current varies the electric field across the transmitter aerial. In the same way, the displacement current flows in the capacitor only while the field in the capacitor plate is varying, due to its charging up or discharging.

Maxwell's error was fiddling a theory to fit Weber's 1856 observation that 1/(square root of the product of permittivity and permeability) = c. This fiddle is like Rayleigh's application of a wave equation to sound without understanding the pressure and force mechanisms involved in particulate (molecular) sound waves. Planck showed the resolution to the problem with the wave model of light by the quantum theory, while Bohr had shown that Maxwell's light theory was incompatible with the atom. Nobody corrected Maxwell's false theory, however.

In reality, 'displacement current' is the gauge boson, causing electromagnetic and gravitational forces, and all radio and light waves. Emitted due to the centripetal acceleration of continuous, uniformly spinning charges (fundamental particles) with no oscillation, it is undetectable radiation, but it still carries pressure and force (pressure times area), causing fundamental forces.
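
As an aside on the one checkable claim here, that transmission-line theory reproduces a capacitor's charging curve: below is a toy bounce-diagram calculation of my own (not Catt, Walton and Davidson's actual derivation; all component values are assumptions). An open-circuited lossless line of impedance Z0 and one-way delay Td, charged through a resistor Rs, charges in voltage steps that track the familiar RC exponential with C = Td/Z0.

    # Toy model: a capacitor treated as an open-circuited lossless transmission
    # line charged through a series resistor; compare the staircase voltage at
    # the open end with the ordinary RC exponential. All values are assumptions.
    import math

    V_s = 1.0       # step source voltage, volts
    R_s = 1000.0    # series source resistance, ohms
    Z0  = 50.0      # characteristic impedance of the line, ohms
    T_d = 1e-9      # one-way transit time of the line, seconds

    C_line    = T_d / Z0                  # equivalent capacitance of the line
    gamma_src = (R_s - Z0) / (R_s + Z0)   # reflection coefficient at the source
    v_wave    = V_s * Z0 / (R_s + Z0)     # amplitude of the first launched step
    v_load    = 0.0

    print(" t/ns   staircase   RC exponential")
    for n in range(1, 11):
        v_load += 2 * v_wave              # open end doubles the incident step
        v_wave *= gamma_src               # wave re-reflected back toward the end
        t = (2 * n - 1) * T_d             # arrival time of the n-th step
        v_rc = V_s * (1 - math.exp(-t / (R_s * C_line)))
        print(f"{t/1e-9:5.0f}   {v_load:9.4f}   {v_rc:14.4f}")

With Rs much larger than Z0 the staircase is nearly indistinguishable from the exponential, which is the sense in which the two descriptions of capacitor charging agree.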