One of the "great" ideas proposed billions of times every day is that the fundamental physical laws of Nature are "discrete". The world resembles a binary computer - or at least a quantum computer, we are told very often. "Discrete physics" even has its own USENET newsgroup, "sci.physics.discrete", which has fortunately been completely silent since it was created. Various games and "types of atoms" that are supposed to produce spacetime at the Planck scale are even sold as "alternatives to string theory".

I am among those who are convinced that every single proposal based on the idea that "the fundamental entities must be discrete" has so far been a transparent crackpot fantasy. What's wrong with all of them?

**Both discrete and continuous mathematics matter**

First of all, both discrete and continuous mathematical structures are important for actual reasoning and calculations, in physics and elsewhere. We simply need both of these categories of tools and theorems. The people who like to say that only one of them may be fundamental are usually those who don't know the other set of insights well enough - or who don't know it at all. And they don't want to learn it. Instead, they want to promote a "theory" that implies that it is good not to learn it. In other words, they ignore at least one half of the basic math that is needed for physics.

**Discrete and continuous concepts are related**

Second of all, there are many deep relations between discrete and continuous objects, or between combinatorics and calculus. The Riemann zeta function is a completely continuous function of a complex variable. Nevertheless, it knows more or less everything about the distribution of the primes. There are many other examples like that in mathematics, and even more of them may be found in theoretical physics.
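As a toy illustration of that zeta/prime link (standard mathematics, not a claim specific to this post), the Euler product identity ζ(s) = Π_p (1 − p^{-s})^{-1} expresses the continuous function entirely in terms of the primes, and can be checked numerically:

```python
import math

def primes_up_to(n):
    """Sieve of Eratosthenes: all primes <= n."""
    sieve = [True] * (n + 1)
    sieve[0] = sieve[1] = False
    for p in range(2, int(n ** 0.5) + 1):
        if sieve[p]:
            for q in range(p * p, n + 1, p):
                sieve[q] = False
    return [p for p, is_p in enumerate(sieve) if is_p]

def euler_product(s, limit):
    """Truncated Euler product over primes <= limit."""
    prod = 1.0
    for p in primes_up_to(limit):
        prod *= 1.0 / (1.0 - p ** (-s))
    return prod

zeta2 = math.pi ** 2 / 6           # zeta(2), from the continuous (analytic) side
approx = euler_product(2, 10_000)  # from the discrete (prime) side
print(zeta2, approx)               # the two agree to several digits
```

The truncated product over the primes converges to the analytic value - a small numerical shadow of the fact that ζ "knows about" the primes.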

Knots in three dimensions are discrete objects, too; nevertheless, their properties are encoded in correlators of operators in Chern-Simons theory. The Gromov-Witten invariants and other integers associated with manifolds and their topology that seem to be completely combinatorial in their character may be derived from another continuous theory, namely topological string theory. The integer degeneracies in many contexts are calculated as coefficients in generating functions that are, once again, completely continuous.
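A minimal example of integer degeneracies sitting inside a continuous generating function (an illustration of the general point, not the Gromov-Witten computation itself): the coefficients of the smooth function Π_{n≥1} (1 − q^n)^{-1} are the partition numbers, the kind of integer degeneracies familiar from free string spectra:

```python
def partition_counts(kmax):
    """Coefficients of prod_{n>=1} 1/(1 - q^n) up to q^kmax:
    p(k) = the number of partitions of the integer k."""
    coeffs = [0] * (kmax + 1)
    coeffs[0] = 1
    for n in range(1, kmax + 1):
        # Multiply the truncated series by 1/(1 - q^n) = 1 + q^n + q^{2n} + ...
        for k in range(n, kmax + 1):
            coeffs[k] += coeffs[k - n]
    return coeffs

print(partition_counts(10))  # [1, 1, 2, 3, 5, 7, 11, 15, 22, 30, 42]
```

Every coefficient is an integer, yet the generating function itself is a perfectly continuous (indeed holomorphic) object on the unit disk.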

**Crackpots are almost always discrete**

The proof of Fermat's last theorem - a theorem that has attracted so many crackpots exactly because it looks so simple (and discrete) - is essentially based on geometry and smooth objects, too. Riemann's hypothesis has attracted far fewer crackpots - despite the $1,000,000 award from the Clay Institute. It's because most crackpots are discrete, zero-dimensional objects.

The mathematical insights that allow us to derive many "combinatorial" conclusions from analytical considerations, continuous functions, properties and equations defined over smooth manifolds etc. have been important for at least 200 years. As time goes by, the fundamental concepts that underlie a significant part of our knowledge of mathematics and mathematical physics become continuous in character. The discrete features and integers are increasingly integrated into increasingly continuous structures. The combinatorial concepts are being integrated with geometry.

**Modern history of continuous dominance**

Both the 19th and the 20th centuries escalated and accelerated this process. The discrete, point-like particles of classical mechanics were largely replaced by continuous fields in the 19th century. Although new discrete processes and objects were observed later, their complete description always happened to be a continuous one.

The "old" quantum theory with its model of the Hydrogen atom from Niels Bohr became a childish game when the "new" quantum mechanics was constructed. The correct picture that explains the quantization of energies may be defined in terms of a continuous wavefunction. Shockingly enough, the eigenfunctions of a Hermitian operator may form a discrete set. Also, discrete particles still existed. Nevertheless, it turned out that they were again manifestations of the discrete spectrum of a continuously defined operator in quantum field theory, acting on a completely continuous Hilbert space.
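The fact that a continuously defined Hermitian operator has a discrete spectrum can be seen in the simplest setting, a particle in a box. The sketch below is my own numerical toy, not anything from the post: it represents H = -d²/dx² on a grid (the grid is purely a numerical device, not physical discreteness) and recovers the discrete levels E_n = (nπ)²:

```python
import numpy as np

# H = -d^2/dx^2 on (0, 1) with hard walls (units hbar = 2m = 1).
N = 500                    # interior grid points (numerical resolution only)
h = 1.0 / (N + 1)          # grid spacing
H = (np.diag(np.full(N, 2.0))
     + np.diag(np.full(N - 1, -1.0), 1)
     + np.diag(np.full(N - 1, -1.0), -1)) / h ** 2

E = np.linalg.eigvalsh(H)[:3]  # lowest three eigenvalues
print(E / np.pi ** 2)          # approaches [1, 4, 9]: E_n = (n * pi)^2
```

The operator is defined on a continuum of functions, yet its bound-state spectrum is an integer-labeled discrete set - exactly the pattern quantum mechanics made universal.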

The more and the deeper we understand something, the more continuous and geometric the fundamental objects underlying our descriptions become. The previous sentence is more or less a tautology for objects: "objects" only look "discrete" when we describe them in a superficial manner, without looking into their structure. The discrete social security number is only a good description of a person for those who are not interested in anything else about the person.

String theory continues in the same direction. In quantum field theories, the identity of elementary particles - and their charges and masses - were discrete in character. There were dozens of types of "matter" in the Standard Model. In string theory, there is only one type of matter and the discrete choices only emerge as properties of eigenstates of another Hamiltonian. Also, many "discrete" phenomena and features of low-energy physics are provided with a geometric interpretation and realization emerging from string theory; it often involves additional dimensions of space.

**Future: more geometric**

I find it rather obvious that this process will have to continue if we continue to make progress in our understanding of the fundamental mathematics underlying the laws of the Universe. All kinds of discreteness will have to be derived from a starting point that is rather continuous. A new kind of fuzziness, namely non-commutativity, will be introduced into our physical laws while keeping them quantitative and predictive. Look for the discrete objects and choices that are used in our current laws of physics, and string theory in particular; they must be explained by a deeper principle that is inherently continuous and geometric, where the word "geometry" is used in a generalized sense.

**Discreteness is always a derived concept if you look carefully enough. Discreteness is emergent if you wish; it can never be quite fundamental.**

Even though the previous paragraphs were mostly about the future, I have also emphasized that the same kind of progress has been dominant in mathematics and theoretical physics at least for 200 years. This is why I am so flabbergasted by the huge number of people who seem to be interested in maths and physics but who have not yet realized what the trend has been for two centuries and who find their childish "discrete" theories so sexy.

As far as I can say, none of these theories had anything to do with physics. What do I mean by physics? I mean the actual phenomena that we observe in Nature and their mathematical description, which always happens to be quite unique. Examples? The Hydrogen atom. Other atoms. Chemistry. Newton's laws. The Lorentz invariance, the translational invariance, the rotational invariance, the required gauge symmetries and diffeomorphisms - all of them continuous symmetries. Superconductors. Particle scattering. Radioactive decay. The laws of quantum field theory and general relativity into which we must embed the previous theories to agree with all the observed phenomena. String theory, which is absolutely needed if we want to agree with quantum field theories as well as general relativity within a unified framework.

The "discrete geniuses" don't care about a single one of these experimental constraints. Frankly speaking, their theories may have been silly even in ancient Rome. They resemble Maxwell's pathetic and totally redundant models of the aether from the 19th century - whose naïveté was truly understood only when Einstein wrote down his special relativity, after appreciating insights by Hendrik Lorentz (who understood that only one electric and only one magnetic vector exist at each point, even in vacuum) - except that Maxwell's model of the luminiferous aether could at least agree with Maxwell's equations - and FitzGerald actually constructed a working model out of many gears and wheels.

**Modern "aether geniuses"**

The modern "aether geniuses" don't care about a single physical effect in the list above, or many other lists for that matter, and they don't even try to construct the models, probably because they know very well that they could not work. (The worst among them don't even care about quantum mechanics.) They may propose a theory that the Universe looks like LEGO or a binary computer or a binary quantum computer with a few operations. And they expect others to believe that all of physics will miraculously be reproduced. The Lorentz invariance will suddenly emerge without any reason (or maybe after a few (?) parameters of their LEGO building blocks are adjusted); the atoms will also emerge and they will have the right spectra. Gravity suddenly starts to act according to Einstein's laws. Why not? They have the great idea, don't they? The Universe is just like a simplified model of Commodore 64 - that's exactly what you need to revolutionize all of science.

**A technical paragraph:** Jacques Distler seems to be confused by my statement that the Lorentz symmetry can never emerge without a good reason, and his example is lattice QCD, apparently in Euclidean spacetime (and he apparently assumes some discrete symmetries that are nearly equivalent to the Euclidean Lorentz symmetry). I am talking about the actual "SO(d-1,1)" Lorentz symmetry in Minkowski space. If you start with a non-Lorentz-invariant theory, there is a separate dimensional analysis for operators with respect to the scaling of time and the scaling of space. You definitely cannot derive that the IR limit is Lorentz-invariant for a generic non-relativistic field theory, and if Jacques ever did so, it's only because he already *assumed* the Lorentz invariance - a symmetry between space and time - from the beginning. The best thing that can happen is a theory with different speeds of light for different particles, where a finite number of IR parameters must be adjusted. This is however not the case if you start with a generic non-relativistic theory. Then there are infinitely many fine-tunings needed for all the operators with arbitrarily many spatial derivatives (equivalently, in the language of lattice theories, with new interactions that are less and less local). Infinitely many miracles are needed to get a Lorentz-invariant theory from a generic non-relativistic interacting theory. You can only get to the point where the number of terms to adjust is finite if you assume that the theory will be a small perturbation of a Lorentz-invariant theory (and therefore the relativistic dimensional analysis is applicable), and that is simply not a right assumption when you start with a generic non-relativistic theory.
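A hedged sketch of that counting (my own notation, not a formula from the post): for a single nonrelativistic scalar, the quadratic Lagrangian compatible with rotations but not with boosts is

$$\mathcal{L} \;=\; \dot\phi^{\,2} \;-\; \sum_{n\ge 1} c_n\, \phi\,(-\nabla^2)^n \phi \;-\; V(\phi),$$

and Lorentz invariance requires $c_1 = c^2$ (with the *same* speed $c$ for every field in the theory) together with $c_n = 0$ for all $n \ge 2$. In a generic discrete theory, each $c_n$ is an independent coupling renormalized by the interactions, so restoring $SO(d-1,1)$ in the IR means tuning infinitely many parameters with no symmetry to protect the tuning.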

God is just like Lord Sinclair except that God is a bit lazier, so He created a simpler computer. At any rate, deep mathematics is certainly not needed. Every ordinary person may understand the "theory of everything", they believe. All the sins of physics between 1905 and 2005 - those that have made physics counterintuitive and too abstract for the farmers - will be undone, they seem to think.

Can this kind of simplicity be a selection criterion in physics? Certainly not. Nature does not care how much time we need to spend to understand some of Her fancy concepts. Nature is simple, but the simplicity is only revealed once we accept Her rules of the game. The laymen's "psychological" feelings of simplicity have nothing whatsoever to do with the kind of "beauty" that has become a good guide in the search for new and better theories. As the Russian physicist Okuň once said, "Prostoj prostoty ne budet." There will no longer be any simple simplicity.

**Do they really believe it?**

Whenever I see these proposals that Nature must be a simple discrete system and everything else will hopefully work out, it's hard to avoid the obvious question: have they completely lost their minds? In 2005, a usable theory must agree with billions of observations, and indeed, the Standard Model and General Relativity do agree with them. What miracle makes it happen? It's because the Standard Model and General Relativity satisfy many important principles that more or less imply the right physics, at least qualitatively - because the laws of mathematics are pretty strict if these principles are taken seriously. The principles are often almost enough to choose the right track or even the right theory.

**Principles of physics matter**

The Lorentz invariance of our theories guarantees that millions of properties of the actual collider observations will match with reality. The postulates of quantum mechanics are needed to agree with millions of experiments and many of their common features. Quantum mechanics plus special relativity implies that physics must look much like quantum field theory.

Quantum field theory only allows some types of fields and a very small number of relevant and marginal interactions. This simple classification of interactions is only possible because of a symmetry between space and time. When you think about the possibilities, there are not too many, and because the principles are apparently right, our theories agree with experiments. A similar comment applies to General Relativity, too. One just can't throw QFT and GR into the garbage bin. It is absolutely critical for any theory that is meant as a fundamental theory to reproduce the successful features of GR and QFT. This can be shown to be the case for many dual descriptions of reality that emerge from string theory, even though some of these facts could seem like fascinating miracles at the beginning; and it can easily be shown not to be the case for all other "alternatives", especially the "discrete alternatives", that have ever been proposed as of November 2005.
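For concreteness, the standard power counting behind that classification (textbook material, not specific to this post): in $d = 4$, with $[\phi] = 1$ and $[\psi] = 3/2$, a coupling multiplying an operator of dimension $\Delta$ has mass dimension $4 - \Delta$, so

$$\underbrace{\phi^2,\ \bar\psi\psi}_{\Delta<4:\ \text{relevant}},\qquad \underbrace{(\partial\phi)^2,\ \phi^4,\ \bar\psi\gamma^\mu\psi A_\mu,\ F_{\mu\nu}F^{\mu\nu}}_{\Delta=4:\ \text{marginal}},\qquad \underbrace{\phi^6,\ (\bar\psi\psi)^2,\ \dots}_{\Delta>4:\ \text{irrelevant}}.$$

Note that this counting itself assumes the relativistic scaling - the symmetry between space and time - without which the list of operators to control becomes infinite.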

But the "discrete geniuses" won't ever listen. They don't want to hear a 30-second proof that their theories can't work. This is not about any rational debates. The Discrete Universe is one of the postmodern religions. Why are there so many believers? The expansion of the computer industry may offer a clue.

**Miracles in a discrete world**

The Lorentz invariance of a generic interacting theory can never "emerge" accidentally from a starting point that is not Lorentz-invariant. The only way it can emerge is if we can prove that it does - for example, because the theory is exactly equivalent to a manifestly Lorentz-invariant theory. This is the case in Matrix theory or in the large N limit of the AdS/CFT correspondence, where the Lorentz invariance is not manifest but, in the appropriate large N limits, it appears because of the equivalence with a manifestly Lorentz-invariant description of the same physics.

But assuming that it will happen in a random discrete theory due to some miracle, without having a single rational reason why it should happen or why the discrete model should have anything to do with reality, makes Intelligent Design look like a highly reasonable up-to-date scientific theory in comparison.

It's just completely unreasonable to assume that a random, cheap, and childish discrete toy model will give you the Lorentz invariance, the physics of GR, the right atomic spectra, or the Standard Model. It's not only silly because it is too optimistic; we have dozens of rigorously proved theorems that show that such things can't work. String theory is able to circumvent many of these theorems because it is a very sophisticated theory that only differs from quantum field theories in very subtle ways. But the naive discrete models simply can't evade these theorems. The theorems were constructed to kill these silly theories, and they did so. I am aware that the first sentence of this paragraph implicitly says that thousands of people who are viewed as "alternative physicists" are completely unreasonable, and indeed, I don't think that their large number makes their opinions more justifiable.

**The discrete religion can't be killed**

At any rate, we will be hearing these things again and again because they have become a part of the "postmodern" era. The Universe is like an "Intel 8080" microprocessor or a simple type of a quantum computer with two or three gates. The Universe is a discrete history of tetrahedrons, dodecahedrons, or any Platonic polyhedra you can think about. The Lorentz invariance, gauge symmetries, diffeomorphism symmetries, chiral interactions, or the Higgs mechanism don't matter.

They will surely work out properly if the basic idea - namely the idea that the Universe is a simple computer that a mediocre geek can comprehend (but an idea that unfortunately does not have the tiniest glimpse of being realized in Nature) - is accepted and promoted to a new dogma. There will be a growing pressure on all of us to treat the "discrete geniuses" as physicists and never mention that we believe that they are really morons. This is simply how this postmodern era works.

Once again, I think that the statement that the proponents of Intelligent Design are less reasonable or more religious than the proponents of "discrete theory solves it all" hypotheses is a politically motivated and biased assertion. They're on the same level. Both of them may be described as pseudoscientific garbage that attempts to deny virtually everything we have learned in science (physics or biology) in the last 200 years. Both of them need a huge, rationally unjustifiable belief that seems to contradict everything we know and that is only supported by the dogmas themselves - dogmas that have seen no contact with the observable world whatsoever.

**What has provoked me to write this rant?**

I was asked what I thought about Seth Lloyd's ideas about quantum gravity. Seth Lloyd is a great expert in quantum computing whom many of us admire. In a slightly popular paper, he argued that the Universe can only have about 10^{90} bits. Well, today we believe that it is probably above 10^{100} because of the black holes at the galactic centers: the microwave background no longer dominates the entropy of the Universe. But except for this detail, Lloyd's statement is OK. We call the information "entropy" (up to a "ln(2)" factor) - and it is probably the only quantity discussed by Lloyd that can be defined in physics.
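A back-of-envelope version of that entropy count (the physical constants are standard; the ~10^{11} galaxies and the ~10^{7} solar-mass central black holes are illustrative assumptions of mine, not figures from Lloyd's paper):

```python
import math

# Physical constants (SI)
G, hbar, c = 6.674e-11, 1.055e-34, 2.998e8
M_sun = 1.989e30  # kg

def bh_entropy_bits(M):
    """Bekenstein-Hawking entropy S/k_B = 4*pi*G*M^2/(hbar*c), converted to bits."""
    return 4 * math.pi * G * M ** 2 / (hbar * c) / math.log(2)

# Illustrative assumptions: ~1e11 galaxies, each with a ~1e7 M_sun central black hole
total_bits = 1e11 * bh_entropy_bits(1e7 * M_sun)
print(f"{total_bits:.1e}")  # of order 1e102, comfortably above Lloyd's 1e90
```

Because the entropy scales like M², even modest supermassive black holes dominate the cosmic budget, which is the point of the correction above.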

But he also talks about the upper bound of 10^{120} of "operations" (the term is being imported from computer science) that could have been done in the Universe. I am convinced that no one has ever defined an "operation" of the Universe. (It could be a spacetime path-integral counterpart of the coarse-grained entropy formula but I remain skeptical that it is possible at all.)

On the contrary, I am convinced that they can only be defined in systems that approximately behave as binary (or other discrete) computers and where we pick a privileged basis of the Hilbert space. One of the important principles of quantum mechanics is that its Hilbert space has no privileged basis in general. The number of "operations" then has no meaning in fundamental physics. The only similar concept that matters in fundamental physics is entropy and its time-derivative, and even these concepts are well-defined only when we define some "macroscopic quantities" that divide the microstates into groups. The increase of the total entropy is guaranteed to exceed the number of "operations" - usually by a lot. Computers generate much more entropy than the information that they manipulate.

One of his conclusions is that the average operation must take 10^{-13} seconds, the geometric mean of the Planck time and the age of the Universe. This sounds like complete nonsense to me. 10^{-13} seconds is a huge time in particle physics. A few microprocessors can surely perform more than 10^{13} operations per second. What is the bound supposed to mean? Are these operations per Planck volume of space? Or in environments with particular bizarre values of the density?
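The arithmetic behind that 10^{-13} seconds is easy to reproduce - it is just the geometric mean of two standard timescales (the numerical values assumed here are approximate):

```python
import math

t_planck = 5.39e-44    # Planck time, seconds
t_universe = 4.35e17   # age of the universe, ~13.7 Gyr in seconds
t_mean = math.sqrt(t_planck * t_universe)
print(f"{t_mean:.2e} s")  # roughly 1.5e-13 s, i.e. the quoted 10^-13 seconds
```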

Whatever you do, your dimensional-analysis estimates of the total and global number of operations per unit time will either be clearly wrong, or they will be clearly higher than any practically realizable number of operations and clearly lower than the entropy times hbar divided by the total energy of the Universe. Pretty big window. ;-) And until someone gives me a quantitative definition (either a measurable one, or a function of operators in the theories we treat seriously) of the "number of operations in the Universe", I really don't care which point in the window someone chooses, because it is not physics.

If someone says that the Universe is a family of angels on the tip of a needle, she can also ask how many sisters a particular angel has. If someone says that the Universe is an Apple Macintosh (why I chose this one will be clear after the following paragraph), he can similarly ask how many keys or operations it has. In both cases, it is unphysical nonsense.

One of the happy punch lines is when Lloyd admits on page 15 that our Universe is not a computer that runs Windows or Linux. ;-)

In January 2005, Prof. Lloyd proposed that the world is a particular quantum computer. Fancy interdisciplinary terminology notwithstanding, it is essentially a paper about the Regge calculus with all the typical careless exercises. At the end of the paper, it is even suggested that - essentially - among the Standard Model gauge groups, SU(3) x U(1) may act on the wires while the SU(2) may act on the gates of the computer. Honestly, I really don't know whether it is a joke or not. (If it is not, it is at least as fascinating a representation of the Standard Model group as the intersecting brane worlds. Of course, my psychological certainty that this can't work exceeds 99.999%.) Honestly, it seems funnier to me than the Bogdanovs' suggestions - especially because Lloyd's ideas refer to a system that we know pretty well: the Standard Model.

## snail feedback (6) :

Dear Lumos,

Are you talking about string theorists and supersymmetric partners that aren't observed when you say 'modern aether geniuses'?

Aether string theory disproved

Best wishes,

Nigel

Thanks, Nigel, for this example of the alternatives to string theory. They're not quite discrete but they will undoubtedly impress the readers. ;-)

Be careful about using Feynman's name on these crackpot pages. The reincarnations of Richard Feynman may sue you. Have you tried to search for motl toronto images at Google.com? Why do you think that the first image appears?

Let me add another two bits of honesty. Space is not quantised, but angular momentum (and symplectic area) is. Time is not quantised, but perhaps the logarithm of decay width is (Phys. Rev. D 13, 574-590 (1976) and hep-ph/0506033).

Recently, Motl has said:

"GR has a symmetry principle that extends that of SR, not reduce it, so all constraints of SR remain true in GR" -Motl

We know how E got GR. He expressed Newton's gravity as a field equation and found that you have to include a contraction. The Newtonian field equation is R = 4πGT, but in 1915 E found that to correct it you need to put a contraction into the left-hand side (curvature), and correct the right-hand side (mass-energy) by doubling it: R − x = 8πGT.

The physical contraction of earth's radius is by 1/3 MG/c^2 = 1.5 mm.

The physical content of GR is the OPPOSITE of SR:

‘… the source of the gravitational field can be taken to be a perfect fluid…. A fluid is a continuum that ‘flows’... A perfect fluid is defined as one in which all antislipping forces are zero, and the only force between neighboring fluid elements is pressure.’ – Professor Bernard Schutz, General Relativity, Cambridge University Press, 1986, pp. 89-90.

Notice that in SR, there is no mechanism for mass, but the Standard Model says the mass has a physical mechanism: the surrounding Higgs field. When you move a fundamental particle in the Higgs field, and approach light speed, the Higgs field has less and less time to flow out of the way, so it mires the particle more, increasing its mass. You can't move a particle at light speed, because the Higgs field would have ZERO time to flow out of the way (since Higgs bosons are limited to light speed themselves), so inertial mass would be infinite. The increase in mass due to a surrounding fluid is known in hydrodynamics:

‘In this chapter it is proposed to study the very interesting dynamical problem furnished by the motion of one or more solids in a frictionless liquid. The development of this subject is due mainly to Thomson and Tait [Natural Philosophy, Art. 320] and to Kirchhoff [‘Ueber die Bewegung eines Rotationskörpers in einer Flüssigkeit’, Crelle, lxxi. 237 (1869); Mechanik, c. xix]. … it appeared that the whole effect of the fluid might be represented by an addition to the inertia of the solid. The same result will be found to hold in general, provided we use the term ‘inertia’ in a somewhat extended sense.’ – Sir Horace Lamb, Hydrodynamics, Cambridge University Press, 6th ed., 1932, p. 160. (Hence, the gauge boson radiation of the gravitational field causes inertia. This is also explored in the works of Drs Rueda and Haisch: see http://arxiv.org/abs/physics/9802031 http://arxiv.org/abs/gr-qc/0209016 , http://www.calphysics.org/articles/newscientist.html and http://www.eurekalert.org/pub_releases/2005-08/ns-ijv081005.php .)

So the Feynman problem with virtual particles in the spacetime fabric retarding motion does indeed cause the FitzGerald-Lorentz contraction, just as they cause the radial gravitationally produced contraction of distances around any mass (equivalent to the effect of the pressure of space squeezing things and impeding accelerations). What Feynman thought may cause difficulties is really the mechanism of inertia!

In his essay on general relativity in the book ‘It Must Be Beautiful’, Penrose writes: ‘… when there is matter present in the vicinity of the deviating geodesics, the volume reduction is proportional to the total mass that is surrounded by the geodesics. This volume reduction is an average of the geodesic deviation in all directions … Thus, we need an appropriate entity that measures such curvature averages. Indeed, there is such an entity, referred to as the Ricci tensor …’ Feynman discussed this simply as a reduction in radial distance around a mass of (1/3)MG/c2 = 1.5 mm for Earth. It’s such a shame that the physical basics of general relativity are not taught, and the whole thing gets abstruse. The curved space or 4-d spacetime description is needed to avoid Pi varying due to gravitational contraction of radial distances but not circumferences.

The velocity needed to escape from the gravitational field of a mass (ignoring atmospheric drag), beginning at distance x from the centre of mass, by Newton’s law will be v = (2GM/x)^{1/2}, so v^2 = 2GM/x. The situation is symmetrical; ignoring atmospheric drag, the speed with which a ball falls back and hits you is equal to the speed with which you threw it upwards (the conservation of energy). Therefore, the energy of mass in a gravitational field at radius x from the centre of mass is equivalent to the energy of an object falling there from an infinite distance, which by symmetry is equal to the energy of a mass travelling with escape velocity v.

By Einstein’s principle of equivalence between inertial and gravitational mass, this gravitational acceleration field produces an identical effect to ordinary motion. Therefore, we can place the square of escape velocity (v^2 = 2GM/x) into the FitzGerald-Lorentz contraction, giving g = (1 − v^2/c^2)^{1/2} = [1 − 2GM/(xc^2)]^{1/2}.

However, there is an important difference between this gravitational transformation and the usual Fitzgerald-Lorentz transformation, since length is only contracted in one dimension with velocity, whereas length is contracted equally in 3 dimensions (in other words, radially outward in 3 dimensions, not sideways between radial lines!), with spherically symmetric gravity. Using the binomial expansion to the first two terms of each:

FitzGerald-Lorentz contraction effect: g = x/x_0 = t/t_0 = m_0/m = (1 − v^2/c^2)^{1/2} = 1 − ½v^2/c^2 + ...

Gravitational contraction effect: g = x/x_0 = t/t_0 = m_0/m = [1 − 2GM/(xc^2)]^{1/2} = 1 − GM/(xc^2) + ...,

where for spherical symmetry (x = y = z = r), we have the contraction spread over three perpendicular dimensions, not just one as is the case for the FitzGerald-Lorentz contraction: x/x_0 + y/y_0 + z/z_0 = 3r/r_0. Hence the radial contraction of space around a mass is r/r_0 = 1 − GM/(3rc^2).

Therefore, clocks slow down not only when moving at high velocity, but also in gravitational fields, and distance contracts in all directions toward the centre of a static mass. The variation in mass with location within a gravitational field shown in the equation above is due to variations in gravitational potential energy. The contraction of space is by (1/3) GM/c2.

This is the 1.5-mm contraction of earth’s radius Feynman obtains, as if there is pressure in space. An equivalent pressure effect causes the Lorentz-FitzGerald contraction of objects in the direction of their motion in space, similar to the wind pressure when moving in air, but without viscosity. Feynman was unable to proceed with the LeSage gravity and gave up on it in 1965. However, we have a solution…

‘Recapitulating, we may say that according to the general theory of relativity, space is endowed with physical qualities... According to the general theory of relativity space without ether is unthinkable.’ – Albert Einstein, Leyden University lecture on ‘Ether and Relativity’, 1920. (Einstein, A., Sidelights on Relativity, Dover, New York, 1952, pp. 15, 16, and 23.)

‘The Michelson-Morley experiment has thus failed to detect our motion through the aether, because the effect looked for – the delay of one of the light waves – is exactly compensated by an automatic contraction of the matter forming the apparatus…. The great stumbling-block for a philosophy which denies absolute space is the experimental detection of absolute rotation.’ – Professor A.S. Eddington (who confirmed Einstein’s general theory of relativity in 1919), MA, MSc, FRS, Space Time and Gravitation: An Outline of the General Relativity Theory, Cambridge University Press, Cambridge, 1921, pp. 20, 152.

‘It has been supposed that empty space has no physical properties but only geometrical properties. No such empty space without physical properties has ever been observed, and the assumption that it can exist is without justification. It is convenient to ignore the physical properties of space when discussing its geometrical properties, but this ought not to have resulted in the belief in the possibility of the existence of empty space having only geometrical properties... It has specific inductive capacity and magnetic permeability.’ - Professor H.A. Wilson, FRS, Modern Physics, Blackie & Son Ltd, London, 4th ed., 1959, p. 361.

‘All charges are surrounded by clouds of virtual photons, which spend part of their existence dissociated into fermion-antifermion pairs. The virtual fermions with charges opposite to the bare charge will be, on average, closer to the bare charge than those virtual particles of like sign. Thus, at large distances, we observe a reduced bare charge due to this screening effect.’ – I. Levine, D. Koltick, et al., Physical Review Letters, v.78, 1997, no.3, p.424.

If the electron moves at speed v as a whole in a direction orthogonal (perpendicular) to the plane of the spin, then the c speed of spin will be reduced according to Pythagoras: v^2 + x^2 = c^2, where x is the new spin speed. For v = 0 this gives x = c. What is interesting is that this model gives rise to the Lorentz-FitzGerald transformation naturally, because x = c(1 − v^2/c^2)^{1/2}. Since all time is defined by motion, this (1 − v^2/c^2)^{1/2} factor of reduction of fundamental particle spin speed is therefore the time-dilation factor for the electron when moving at speed v.

Motl's quibbles about the metric of SR are just ignorance. The contraction is a physical effect as shown above, with length contraction in the direction of motion, mass increase and time dilation having physical causes. The equivalence principle and the contraction physics of spacetime "curvature" are the advances of GR. GR is a replacement of the false SR, which gives wrong answers for all real (curved) motions since it can't deal with acceleration: the TWINS PARADOX.

Strangely, the ‘critics’ are ignoring the consensus on where LQG is a useful approach, and just trying to ridicule it. In a recent post on his blog, for example, Motl states that special relativity should come from LQG. Surely Motl knows that GR deals better with the situation than SR, which is a restricted theory that is not even able to deal with the spacetime fabric (SR implicitly assumes NO spacetime fabric curvature, to avoid acceleration!).

When asked, Motl responds by saying Dirac’s equation in QFT is a unification of SR and QM. What Motl doesn’t grasp is that the ‘SR’ EQUATIONS are the same in GR as in SR, but the background is totally different:

‘The special theory of relativity … does not extend to non-uniform motion … The laws of physics must be of such a nature that they apply to systems of reference in any kind of motion. Along this road we arrive at an extension of the postulate of relativity… The general laws of nature are to be expressed by equations which hold good for all systems of co-ordinates, that is, are co-variant with respect to any substitutions whatever (generally co-variant). …’ – Albert Einstein, ‘The Foundation of the General Theory of Relativity’, Annalen der Physik, v49, 1916.

24 Jan 2006

The expanding universe consists of an expanding scalar field in which at random times and locations, a transition takes place from the "false vacuum" to a single Planck volume. Each such volume expands in discrete steps as a self-avoiding random walk.

Assuming the universe is continuous and behaves like a manifold is accurate enough for work in volumes larger than 10^{-30} cm, but fails in discussions of the early universe and small-scale quantum phenomena.

http://phillipgood.info/guth.pdf

"The traditional view 'continuous from discrete' gives way to the inverted paradigm: 'discrete from continuous.'" - Yuri Manin

http://www.ams.org/notices/201002/rtx100200239p.pdf
