Monday, December 06, 2010

How many degrees of freedom are there in Nature

...and how they are organized...

A very large number of people still believe that Nature is discrete in some sense, equivalent to a physical system that can only be found in a finite number of configurations. Every localized object is determined by a sequence of bits, they think: it is represented much like a taxpayer in the IRS database.

I have tried to understand why people like to believe such self-evidently incorrect ideas but I have largely failed. As far as I can tell, they must believe that Nature is "economical" in the same sense as an environmentalist family. Nature has to produce everything out of cheap trash. It is very difficult for Nature to calculate, especially with complex continuous objects.

Nature has to hire some inexpensive bureaucrats who can only compute with bits or other simple packages of information. And they must treat everyone - or at least every degree of freedom - as equal to others. In particular, the number of the degrees of freedom has to be finite, those people think, because Nature can't afford to buy the infinite memory needed for continuous numbers or the accurate chips needed for precise calculations involving continuous objects. ;-)

Needless to say, Nature doesn't suffer from any of these limitations. It has no problem calculating the evolution of a system exactly, whether or not it involves continuous numbers or an infinite collection of degrees of freedom. The limitations described above are anthropomorphic in character. More precisely, they're the limitations of some particular people who just don't like difficult enough maths.

Approximate or rounded or discrete calculations are always emerging out of the accurate and continuous calculations that are fundamental; it's never the other way around.

Nature loves difficult and advanced maths. It doesn't have to recycle trash, or buy special microprocessors or memory chips. Nature doesn't run on them; computer chips are made out of the natural blocks, not the other way around. Clearly, the kind of simplicity that the discrete people believe has absolutely no rational justification.

In fact, the evidence makes it overwhelmingly clear that all the fundamental descriptions of Nature require an infinite number of degrees of freedom. At least in one natural language, they're continuous. Both facts are implied by the existence of continuous symmetries in Nature.

Continuous symmetries require the objects themselves to be continuous; and they also require the number of degrees of freedom to be infinite. It's enough to see that a frequency-f wave may be boosted to higher frequencies by the Lorentz symmetry. The Lorentz group is noncompact so it's clear that there are infinitely many types of waves in each volume.
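To see the noncompactness in numbers, here is a small Python sketch of the longitudinal relativistic Doppler factor, which can boost any frequency as high as you wish (my own illustration; the frequency is an arbitrary choice):

```python
import math

def doppler_boost(f, beta):
    """Frequency of a wave after a head-on Lorentz boost with v = beta*c.

    The longitudinal relativistic Doppler factor sqrt((1+beta)/(1-beta))
    is unbounded as beta -> 1: the noncompact Lorentz group maps any
    frequency to arbitrarily high frequencies.
    """
    return f * math.sqrt((1 + beta) / (1 - beta))

f = 5.0e14  # an optical frequency in Hz (an arbitrary illustrative choice)
for beta in (0.5, 0.9, 0.99, 0.999):
    print(beta, doppler_boost(f, beta))
```

Because the boost parameter is unbounded, any finite list of "allowed wave types" in a volume would be mapped outside itself by a large enough boost.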

Only the holographic principle brings limitations that restrict the information that can be squeezed into a finite volume. The information in an infinite volume is still infinite. Moreover, even the finite information in a finite volume cannot be constructed from some predetermined "bits" or other simple anthropomorphic "building blocks". The potential information that a system can carry is always infinite; it is only by specifying many things about the system that the infinite reservoir gets mostly suppressed.

In this text, I will review how many degrees of freedom - or numbers that describe the state of the system that evolve in time - have been present in various pictures of physics from Newton's era to the cutting edge in quantum gravity and string theory. But let us begin in the pre-scientific age.

Ancient Greece and Euclidean geometry

How did the ancient civilizations understand the amount and organization of the information in Nature? They were surely confused but I believe that the civilizations that understood geometry at the technical level knew that there was some space and at each point of space, matter may have been either present or absent.

In some sense, you may imagine that at each point of space and time, there is one bit of information that tells you whether something is there or not. The vacuum is separated from the matter. People didn't know whether they needed more than one bit to remember the individual "kinds of matter" - the elements, using their vocabulary - or whether the detailed microscopic structure decided about the properties of a material - which is what the atomists pretty clearly believed.

However, I do remember that when I was 4 years old or so, I had this rather explicit - yet totally naive - understanding of the information of Nature. Such a theory doesn't allow you to say anything about the allowed shapes that decide about the boundaries between the vacuum and the matter; and it doesn't allow you to calculate what will happen after some time. But it tells you at least something.

These days, we consider geometry to be an indisputable part of mathematics - which divorced itself from physics a century or so ago. It is based on axioms that don't have to be exactly satisfied by the real world. And indeed, the axioms of simple geometries are not exactly satisfied in the real world, as we know from general relativity and other theories. However, we shouldn't forget something that Einstein liked to emphasize - that geometry was born as the oldest branch of physics. It was a science about the possible positions of perfectly solid objects.

We could say lots of things about the pre-scientific representations of Nature in the minds of the humans - including many contemporary people who still don't quite use the scientific picture to understand the reality. But the presentation would be far too sociological and it couldn't be quite coherent so let us jump to the first scientific theories that were able to predict the evolution of physical objects.

Isaac Newton and point masses

When Isaac Newton described the Solar System, he approximated each planet by a point mass with a position. One could use the coordinates to describe the position of any point - a helpful "analytic" addition to the old-fashioned "synthetic" geometry. If a point-like particle is allowed to move, its position will be a function of time:

x(t), y(t), z(t)
These coordinates are called the "degrees of freedom"; they become directions in the configuration space or the phase space (the latter is also parameterized by the momenta "p_x(t), p_y(t), p_z(t)"). What makes the "degrees of freedom" well-separated is that the simplest Hamiltonians or Lagrangians are simply sums of terms, one for each degree of freedom. A term may be "p^2/2m = mv^2/2" for a free particle. Such a decomposition of the Hamiltonian into a sum means that the individual degrees of freedom evolve separately.
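The separate evolution may be made concrete with a toy sketch (mine, nothing canonical): two harmonic degrees of freedom integrated completely independently of one another, each conserving its own energy:

```python
def evolve(x, p, m, k, dt, steps):
    """Leapfrog evolution of one harmonic degree of freedom with
    H = p^2/(2m) + k*x^2/2.  Because the free Hamiltonian is a sum of
    such terms, each (x, p) pair may be evolved while ignoring all the
    others - that's what "independent degrees of freedom" means."""
    for _ in range(steps):
        p -= k * x * (dt / 2)   # half kick
        x += p / m * dt         # drift
        p -= k * x * (dt / 2)   # half kick
    return x, p

# Two decoupled oscillators, evolved completely separately:
x1, p1 = evolve(1.0, 0.0, m=1.0, k=1.0, dt=0.01, steps=100)
x2, p2 = evolve(0.5, 0.2, m=1.0, k=4.0, dt=0.01, steps=100)
print(x1, p1)   # close to (cos 1, -sin 1): each term conserves its own energy
```

Any cross term such as "x1*x2" in the Hamiltonian would couple the two loops and destroy this factorization.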

Newton realized that the planets were not infinitely small points. However, he also exploited the fact that the approximation treating them as infinitesimal points was damn useful.

In fact, he also proved that the gravitational field of a ball - or any spherically symmetric arrangement of matter, i.e. any configuration of concentric spheres - is the same as the gravitational field of a point mass with the same mass. Gauss's law and the symmetry make the proof trivial today. These insights made him self-confident enough to write down the laws that governed not only the Solar System.

The time evolution of the coordinates can be given by differential equations, i.e. mathematical conditions of the kind
m d^2/dt^2 (x, y, z) = (F_x, F_y, F_z)
where the components of force, the vector F, could have been calculated as the sum over all other point masses. The individual contributions to the force were usually radial forces that depended on the distance. So the equations took the schematic form
m d^2/dt^2 (x, y, z, ...) = (functions of (x, y, z, ...))
The forces destroy the exact separation of the system into the individual degrees of freedom but if there's any sense in which the forces are just "finite perturbations", it still makes sense to calculate the degrees of freedom just like it was done in the free theory that neglects the interactions.
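As a minimal illustration of such differential equations (my own sketch, in units where the Sun's GM equals 4*pi^2 AU^3/yr^2), a leapfrog integration of one planet's coordinates returns it to its starting point after one period:

```python
import math

GM = 4 * math.pi ** 2   # the Sun's GM in units of AU^3 / yr^2

def accel(x, y):
    """Acceleration from a radial inverse-square force: a = -GM * r / |r|^3."""
    r3 = (x * x + y * y) ** 1.5
    return -GM * x / r3, -GM * y / r3

# A circular orbit at 1 AU has speed 2*pi AU/yr; integrate for one year.
x, y = 1.0, 0.0
vx, vy = 0.0, 2 * math.pi
dt, steps = 1e-4, 10000          # 10000 steps of 1e-4 yr = 1 yr
ax, ay = accel(x, y)
for _ in range(steps):
    vx += ax * dt / 2; vy += ay * dt / 2     # half kick
    x += vx * dt;      y += vy * dt          # drift
    ax, ay = accel(x, y)
    vx += ax * dt / 2; vy += ay * dt / 2     # half kick
print(x, y)   # back near (1, 0) after one orbital period
```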

Newton realized that the point masses had a crisp mathematical description, which was why he assumed that everything had to be composed out of them. Even light was a stream of corpuscles in his world view. This picture was only partially vindicated by the modern theory of photons and became a genuine "loser" especially in the 19th century, when interference experiments became possible and popular.

So the information about the state of Nature was equivalent to the knowledge of the coordinates of all the point masses, "(x,y,z, ...)", at each moment. Those numbers evolved according to some differential equations.

The modern atomic theory made it reasonable to assume that atoms could be the "elementary" point masses. You need 6.022 x 10^{23} atoms to construct one mole - a modest macroscopic amount - of matter. Let me round the number to 10^{24}. So you roughly needed a trillion trillion coordinates to describe a macroscopic chunk of matter.
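The arithmetic is trivial, but here it is anyway (water's molar mass is my illustrative choice):

```python
N_A = 6.022e23             # Avogadro's number, particles per mole
molar_mass = 18.0          # grams per mole for water - an illustrative choice
molecules_per_gram = N_A / molar_mass
coordinates = 3 * molecules_per_gram    # x, y, z for each molecule
print(f"{coordinates:.1e}")             # ~1e23 coordinates per gram of water
```

Doubling that to include the momenta still leaves a finite, if huge, list of continuous numbers.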

Things were conceptually simple. Of course, the elementary numbers encoding the world were already continuous. But to describe a finite chunk of matter, you only needed a finite number of such numbers.

Classical field theory

However, people eventually became brave enough to use Newton's classical mechanics to describe the "continuum". They could imagine that it had lots of point masses - atoms or molecules, if you use the modern terminology (and refinement of Newton's views).

Instead of remembering the positions and velocities of all the point masses in the material, you could make the following approximation: in each cubic micrometer of matter (or another volume whose size is sent to zero but that should still be thought of as being much larger than the volume occupied by one molecule), you calculated the number of atoms and their average velocity, among other things.

The assumption was that these averages over tiny volumes contained all the relevant information that can be measured and that determines how the continuum moves and evolves. In this way, people could begin to study classical field theory - including hydrodynamics, aerodynamics, tension of solid objects, and electromagnetism. Thermodynamics described the flow of heat in the materials, too.

Because the average velocity must be remembered for each de facto infinitesimal volume in space, quantities such as the velocity have become fields. They were no longer just functions of time "t"; they became functions of the spatial coordinates "x,y,z", too. Classical field theory began to work with these fundamental numbers that knew about "everything" in the real world:
field1 (x,y,z,t), field2 (x,y,z,t), ...
Because there are many independent variables, such fields are required to obey partial differential equations instead of the ordinary differential equations that we encountered in mechanics.
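A minimal sketch of such a partial differential equation - the 1+1-dimensional wave equation on a grid; my toy discretization, not a serious solver:

```python
import math

# The wave equation d^2f/dt^2 = c^2 * d^2f/dx^2 discretized on a grid.
n = 200
c, dx, dt = 1.0, 1.0 / 200, 1.0 / 400     # Courant number c*dt/dx = 0.5 < 1
f_prev = [math.exp(-((i * dx - 0.5) / 0.05) ** 2) for i in range(n)]  # a bump
f_curr = f_prev[:]                         # the field starts at rest
for _ in range(100):                       # evolve to t = 0.25
    f_next = f_curr[:]
    for i in range(1, n - 1):
        lap = f_curr[i - 1] - 2 * f_curr[i] + f_curr[i + 1]
        f_next[i] = 2 * f_curr[i] - f_prev[i] + (c * dt / dx) ** 2 * lap
    f_prev, f_curr = f_curr, f_next
print(max(f_curr))   # the bump has split into two traveling half-bumps
```

The grid is just a computational crutch: the field itself assigns a number to every point of the continuum.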

Electromagnetism may be fully formulated in terms of 4 fields, the electromagnetic potential 4-vector, at each point. Note that the original point masses can no longer be quite reconstructed from the fields. While the fields may have a pretty simple dependence on "(x,y,z,t)", the trillions of trillions of velocities of the atoms could be much more complicated.

However, in the case of aerodynamics or hydrodynamics or the science about the tension inside the solids, you may always imagine that the fields are calculated from coordinates of the atoms and that the latter are more fundamental.

This simple fact has led to the most painful misconception of the 19th century physics - the luminiferous aether. Because the fields such as "velocity of a point in the river" may be calculated from the atoms, it was assumed that the electric and magnetic fields had to be some averaged properties of some "atoms" in a material, too. The material was called the luminiferous aether.

The aether had been a bizarre "element" identified with empty space by most of the ancient civilizations. The adjective "luminiferous" means that this kind of aether could carry light. This property made it even more bizarre because the aether had to be able to penetrate through glass and water but it still had to behave as a solid.

However, the assumption that everything "has to be" extracted from some "atoms" was exactly one of the unjustified and unjustifiable constraints and artificial dogmas that people invented for themselves to fool themselves and to make progress almost impossible. Using the words of Murray Gell-Mann, the aether was an example of the "excess baggage" that had to be "jettisoned" to make progress.

Special relativity

Indeed, as Hendrik Lorentz and especially Albert Einstein fully appreciated, the electromagnetic fields could exist and did exist independently of any "carrier" comprised of small particles. Lorentz wrote down the vacuum electromagnetic equation, realizing that there was only one electric "E" and one magnetic "B" vector at each point.

For Einstein, this was one of the most important insights that led him to special relativity. He didn't really need any experiments - such as the fancy Michelson-Morley experiments, for example. Einstein had simply realized that the speed of light was predicted to be universal - it didn't require any privileged reference frame. And because the vacuum was empty and couldn't remember the information about the privileged frame, either, it followed that space and time had to behave differently than they did in Newtonian mechanics.

It's not hard to see why the liquidation of the "aether" dogma was paramount for relativity. Any continuum that is composed out of "atoms" inevitably picks a preferred reference frame that breaks the Lorentz symmetry, so it can't be the vacuum. Just compute the average velocity of the "atoms" in a volume "V". This velocity may vanish at most in one reference frame, which is the privileged one. Any boost makes the velocity nonzero, so the rotational symmetry would be broken in all other frames.

I find it flabbergasting that all the people who are trying to "construct" the vacuum out of small bricks, spin foams, triangulations, and spin networks - much like the atoms of the aether - are still unable to understand this amazingly simple point about an irrational dogma that perniciously plagued the 19th century physics and that was heroically killed by Albert Einstein.

So we must distinguish relativistic field theories and non-relativistic field theories. All the "field theories" relevant for aerodynamics, hydrodynamics, and the solids were inevitably picking privileged reference frames associated with the materials so they had to break the Lorentz symmetry. We call them non-relativistic field theories. On the other hand, the electromagnetic field obeying Maxwell's equations respects the Lorentz symmetry, too.

The electromagnetic field may be found in the "vacuum state" in which "E=0" and "B=0". This state itself preserves the Lorentz symmetry. Not only are the laws of physics - Maxwell's equations - Lorentz-invariant; the vacuum is Lorentz-invariant, too. Note that the vacuum may break some symmetries of the laws of physics; in that case, the symmetry is "spontaneously broken", and the electroweak symmetry and supersymmetry are two major examples.

Clearly, the condition that the Lorentz symmetry is preserved by the vacuum is violated by any point masses. That's why field theory became "more automatically" compatible with special relativity than mechanics. However, even mechanics may be reconciled with special relativity. We must only appreciate that any configuration of localized objects such as atoms will break the Lorentz symmetry.

If you remember, the original setup of classical mechanics was breaking the symmetry between space and time. The information was encoded into functions "x(t), y(t), z(t)". Here, time is the independent variable while the spatial coordinates depend on time. However, it's important to realize that this asymmetry between space and time is just an artifact of a convenient way to describe the world lines.

In reality, the world lines remembering the history of the moving point masses don't have to violate the Lorentz symmetry between space and time in any way. You may describe the very same curves e.g. parameterically, by functions
x(tau), y(tau), z(tau), t(tau)
of another, auxiliary coordinate "tau" that has no special relationship with either of the coordinates "x,y,z,t" (although, of course, it is easiest to imagine that "tau" is a function of "t"). The laws that describe the motion of point masses may be perfectly Lorentz-invariant. Special relativity tells us what the laws are.
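For instance, the free relativistic point mass may be described by an action that is manifestly Lorentz-invariant and independent of the choice of "tau" (a standard textbook form, written here with the coordinates spelled out):

```latex
S = -mc \int d\tau \,
    \sqrt{\, c^2 \left(\frac{dt}{d\tau}\right)^2
          - \left(\frac{dx}{d\tau}\right)^2
          - \left(\frac{dy}{d\tau}\right)^2
          - \left(\frac{dz}{d\tau}\right)^2 \,}
```

Reparameterizing "tau" changes nothing in "S", and a Lorentz transformation mixing "t" with "x,y,z" leaves the quadratic form under the square root invariant.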

One of the big lessons - also misunderstood by many people - is that if some formalism doesn't make a symmetry "immediately obvious", it doesn't mean that the symmetry is actually broken. Symmetries may be respected in many ways in which they are not obvious. If physics requires a symmetry to be respected, Nature wants the symmetry to be respected and you must evaluate this condition rigorously and literally. A principle of Nature never says that such a respect for the symmetry should be obvious, and it surely doesn't say that it should be obvious to the stupidest people. ;-)

Quantum mechanics: a whole industry of transforming the information

So far, we talked about classical - i.e. non-quantum - physics. Things were pretty simple in the sense that the "fundamental degrees of freedom" always had one simple representation only. Well, the positions of particles could be expressed in spherical coordinates and the fields could have been Fourier-transformed. But none of these operations really changes our idea about the "number of degrees of freedom" and their basic character.

Things began to dramatically change with the explosion of quantum mechanics. But as we will see later, it was just the beginning.

Particles were originally described by de Broglie's or Schrödinger's wave and some people thought that these functions of "x,y,z,t" were on par with classical fields. However, Max Born was the first one who clearly articulated that the waves had a probabilistic interpretation. Moreover, "N" particles were not described by "N" fields or waves of 3 spatial coordinates but by one wave function of "3N" spatial coordinates (and time): recall that Nature has to remember all correlations.

The fact that the wave function cannot be interpreted as a classical field - and there are no "realist" i.e. classical degrees of freedom behind quantum mechanics at all - has been discussed many times on this blog so I will avoid this topic in this text.

Instead, I want to focus on the amount of information that is encoded in the wave functions - the number of numbers that it carries. Imagine that a Hydrogen atom is found in its ground state. The electron's motion may be described by the ground state wave function, "psi(x,y,z)".

However, you may also Fourier-transform this wave function into "psi(p_x,p_y,p_z)". The position basis and the momentum basis are not the only two bases of the Hilbert space - not even the only two important ones. The energy basis is at least equally important. Instead of specifying the continuous functions "psi(x,y,z)" or its momentum counterpart, you may specify countably many amplitudes
c1s, c2s, c2p (m=-1), c2p (m=0), c2p (m=+1), ...
The squared absolute values of these complex numbers remember the probabilities that the atom is found in the ground state or one of the first low-lying excited states. If you know that it's not too excited, you don't really need the "c" coefficients from much higher levels.

But even though we have no function of continuous variables "x,y,z" here, the coefficients "c" above actually remember all the information that may be encoded in the wave function! Fourier series are the simplest example of the fact that a collection of countably many complex coefficients carries the very same information as a normalizable (square-integrable) complex function of an arbitrary number of real variables.
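A tiny numerical check (my example function; there is nothing special about it): 21 Fourier coefficients reconstruct a smooth periodic function essentially exactly:

```python
import cmath, math

# Sample a smooth periodic function on [0, 2*pi) and show that a handful
# of Fourier coefficients c_n reconstruct it: countably many complex
# numbers carry the same information as the continuous function.
N = 256
xs = [2 * math.pi * k / N for k in range(N)]
f = [math.exp(math.cos(x)) for x in xs]       # an arbitrary smooth example

def coeff(n):
    """c_n = (1/2pi) * integral of f(x) exp(-i*n*x) dx, as a Riemann sum."""
    return sum(f[k] * cmath.exp(-1j * n * xs[k]) for k in range(N)) / N

c = {n: coeff(n) for n in range(-10, 11)}     # keep only 21 coefficients
rebuilt = [sum(c[n] * cmath.exp(1j * n * x) for n in c).real for x in xs]
err = max(abs(a - b) for a, b in zip(f, rebuilt))
print(err)   # tiny: the discrete data reconstruct the continuous function
```

For a smooth function the coefficients decay so fast that a finite truncation already captures everything a measurement could notice; for a generic square-integrable function you need the whole countable tower.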

The transformation from one basis to another became one of the most frequently used "dictionaries" that are totally essential for any meaningful calculation - and even verbal descriptions - of quantum mechanical phenomena. Moreover, the uncertainty principle implies that the complete state of a particle is encoded in a complex wave function of e.g. the positions and you don't need to - and you can't - simultaneously describe the dependence on the momenta. The dependence of the probabilities on the momenta is encoded in the same wave function and it becomes totally readable if you Fourier-transform it.

Again, the permanent switching from one basis to another - bases of eigenvectors of different observables that don't commute with each other (that's why the bases are different!) - is totally essential to do anything in quantum mechanics. Whoever doesn't understand that there are many bases of the Hilbert space, none of which is "fundamentally more real" than the others, has simply misunderstood the most fundamental point of the whole quantum revolution. Quantum mechanics shows that Nature looks at the world from infinitely many angles - in the Hilbert space.

There can't be any segregation of observables into beables and non-beables, or into primitive and contextual ones. All observables obey the same laws of quantum mechanics and their bases are equally good to describe physics.

Quantum field theory

Quantum field theory has reconciled the postulates of quantum mechanics with the degrees of freedom that existed in classical field theory: quantum field theory adds "hats" to the classical fields.

Because of the hat, the energy of a typical field is no longer continuous; its spectrum is discrete. Every quantum field can be interpreted as a collection of infinitely many harmonic oscillators.

Each Fourier mode of each quantum field - a mode is given by the allowed frequency and the direction of the motion (and polarization if there are many) - may be considered a "degree of freedom". Each degree of freedom may carry "N" particles. The number is integer-valued because the degree of freedom is a harmonic oscillator whose energy spectrum is discrete and "E=hf", the energy of one particle, is the spacing between the allowed energy levels of the single degree of freedom.
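Numerically, the ladder looks like this (the optical frequency is an arbitrary illustrative choice):

```python
h = 6.62607015e-34   # Planck's constant in J*s (exact by the SI definition)

def level(n, f):
    """Energy of the n-th level of one field mode of frequency f: the mode
    is a harmonic oscillator with E_n = (n + 1/2)*h*f, and the state with
    n quanta is interpreted as "n particles in this mode"."""
    return (n + 0.5) * h * f

f = 5.0e14                                  # an illustrative optical frequency
levels = [level(n, f) for n in range(4)]
spacings = [b - a for a, b in zip(levels, levels[1:])]
print(spacings)   # each spacing equals h*f, the energy of a single photon
```

The discreteness of the particle number is therefore derived from the continuous oscillator, not postulated.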

Interactions are added to the dynamics and they mix the different degrees of freedom - modes of the individual fields - much like the gravitational interactions between point masses mixed the old degrees of freedom, namely the positions of the point masses.

It's important to realize that this simple counting of the degrees of freedom is only possible if the dynamical laws of the theory may be interpreted as a small perturbation of the "free laws", and the "free laws" have a Hamiltonian that is simply a sum over many degrees of freedom, so that the individual degrees of freedom don't interact with each other. They're decoupled. This condition, if satisfied, also simplifies the counting of energy and other extensive quantities in thermodynamics (the equipartition theorem).

If the interactions between the degrees of freedom are strong, lots of strong and inevitable correlations between these degrees of freedom will arise in all the permissible low-energy states of the physical system. That will mean that many of these degrees of freedom will cease to look independent from each other and this fact will make the counting more subtle.

That's why it's very hard to say how many degrees of freedom a proton has. Clearly, it contains some quarks and gluons but the precise number of them that matter depends on the resolution you choose to study the proton. The higher the collision energies you're interested in, the shorter the cutoff distance you have to resolve, and the more quark and gluon degrees of freedom you have to add. You're effectively choosing the degrees of freedom from the "infinite bath" of degrees of freedom that Quantum Chromodynamics (and any field theory) provides us with, even in a small region of space.

The counting of the degrees of freedom in strongly interacting quantum field theories is hard. However, let us imagine that we have a weakly coupled quantum field theory that may be interpreted as a small perturbation - weak interaction - applied on a free, non-interacting system of a free quantum field theory whose degrees of freedom may be easily counted.

Even in this case, the number of degrees of freedom may become substantially larger if we include many fields at each point. That's possible if the elementary particles have many flavors (up/down/strange...) or many colors (quarks have three colors).

In particular, the latter possibility - a large number of colors - can be realized "en masse". Gauge theories with a parametrically large number of colors exhibit many new phenomena. Effectively, they may secretly produce new dimensions, such as the new radial holographic dimension in the AdS/CFT correspondence. But that's already a new story.

Quantum gravity

One of the elementary "upgrades" of the previous counting of the degrees of freedom is that the dimension of spacetime itself may be higher than four. That's actually one of the most conceptually modest steps that string theory forces upon us. But let us start with the totally conservative issues.

Perturbative string theory - in which the interactions are so weak that all effects are reliably calculated by Taylor expansions in the small string coupling constant (the increasingly higher-order corrections correspond to an increasingly high number of basic interactions that split or merge strings) - may also be defined as a "string field theory", a kind of quantum field theory with infinitely many types of fields. Each possible energy level of a single string produces one field in the spacetime.

In this approach, which is surely not the only possible approach to perturbative string theory, string theory is equivalent to a quantum field theory with infinitely many fields. However, except for a couple of them, all these string fields correspond to very massive particles and can be neglected because we don't typically have enough energy to create a single particle of a heavy sort.

So it would look like string theory is effectively equivalent to a quantum field theory living in the same spacetime, equipped with a finite number of fields at each point. And if you allow arbitrarily high energy densities, you will guess that the number of fields at each point is infinite - so string theory seems to have an "infinitely" higher number of degrees of freedom than a simple quantum field theory in the spacetime of the same dimension.

However, the exact answer is actually the other way around: string theory has many fewer degrees of freedom than a field theory in a 4-dimensional or 10-dimensional spacetime.

This vague insight has been known from many angles. Perturbative string theory is fully described by a conformal theory on the world sheet, the 1+1-dimensional surface that a moving string draws in the spacetime as its history. The world sheet is just two-dimensional and it is smooth. This fact surprisingly allows us to say that the dynamics of string theory - which looked like quantum field theory in "D=4" or "D=10" with many fields at each point - can actually be fully incorporated into a D=2 field theory. In some counting that you have to precisely define, string theory only has "two dimensions" even if the phenomena look 10-dimensional.

These unusual ways of counting the degrees of freedom are actually not the deepest truth we have. The deepest truth is that string theory is the only consistent theory of quantum gravity and that quantum gravity respects the holographic principle.

The maximum entropy you can squeeze into a given volume - or the maximum entropy that can be carried by a bound object of a fixed mass - is the black hole entropy corresponding to the predetermined volume or mass. That's why the black hole is the final stage of any classical evolution involving gravitational collapse. (The hole continues to Hawking-evaporate, producing radiation of even higher entropy. But this radiation is no longer a bound state so its entropy can exceed the black hole entropy for the same mass.)
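For orientation, here is the Bekenstein-Hawking count for a solar-mass black hole (rounded constants; only the order of magnitude is the point):

```python
import math

# Bekenstein-Hawking entropy S/k = A / (4 * l_p^2) with horizon area
# A = 4*pi*r_s^2 and r_s = 2GM/c^2.  Natural logarithms throughout:
# the number of microstates is exp(S/k), with no powers of two anywhere.
G, c, hbar = 6.674e-11, 2.998e8, 1.055e-34   # SI units
M_sun = 1.989e30                             # kg

def entropy_over_k(M):
    r_s = 2 * G * M / c ** 2        # Schwarzschild radius
    area = 4 * math.pi * r_s ** 2   # horizon area
    l_p2 = hbar * G / c ** 3        # Planck area
    return area / (4 * l_p2)        # dimensionless S / k_B

print(f"{entropy_over_k(M_sun):.1e}")   # ~1e77 nats for one solar mass
```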

Note that the entropy can't decrease so if a collection of 40 stars and 760 other celestial bodies ends up as a black hole, it's guaranteed that the black hole entropy is higher than the entropy of the initial state - any bound state of any objects with the same mass.

This insight is actually more than just an insight about the black holes: it's an insight into all the information in quantum gravity. The degrees of freedom describing a quantum system that gravitates may be imagined to live on the boundary of the volume occupied by this physical system. The maximum information scales with the surface area - not with the volume. The AdS/CFT correspondence is the most well-defined - and the only truly "settled" - example of this phenomenon.

This decrease of the number of degrees of freedom is true despite the fact that the perturbative picture of string theory seemed to produce many more degrees of freedom than a quantum field theory with a few fields. If you take the interactions - and gravity - into account, the degrees of freedom become so correlated (if your condition is to have a limited mass/energy and/or small volume) that the number of independent degrees of freedom effectively drops.

However, even though there is a unit of information per Planck area, it is totally fallacious to imagine that there are actual "bits" localized in these Planck areas of the event horizon. Recall that Nature doesn't have to use recycled trash to produce physical objects. Bits are easy to imagine for binary people but Nature doesn't give a damn whether something is easy to imagine for anyone. It cares whether things work.

The sequence of "N" bits allows "2^N" different configurations. It's an exponential whose base is two. It could also be three, four, or any integer. However, none of them is fundamental in physics - or in mathematics. What is the base that is natural in Nature? What is the base of the natural logarithms and exponentials?

It's called "e" and it is approximately equal to 2.71828.

Despite the beautifully chosen adjective "natural", some people apparently still don't understand that the logarithms and exponentials based on "e" are more natural than those based on two, three, ten, or any other integer. You know, the derivative of "exp(x)" is "exp(x)" again. This is only satisfied if "e" is the base.

Everywhere in fundamental physics, all exponentials and logarithms appear with the natural base "e". For example, if you count the number of possible microstates, it's naturally written as "exp(S/k)" where "S" is the entropy. You could redefine the entropy by a factor of "ln(2)" except that it would make all the other formulae with "S" more contrived: at least some of them would have to include unnatural factors of "ln(2)" to compensate your sick choice of the base of the logarithms.

Equations are only simple - and free of arbitrary and unnecessary complications - if you write them in terms of natural logarithms. The entropy of the black hole - normalized so that the base is "e" - is simply "A/4G". Similarly, Cardy's formula counts the number of highly excited states in a conformal field theory and it is given by the "e"-based exponential of a simple exponent. No powers of two. If you were using the base 2 or any other base, there would be an extra factor of "ln(2)" somewhere in the formula for the elementary area that carries one unit (bit) of information.
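The conversion is a one-liner; the point is only that the factor of ln(2) appears the moment you insist on base two:

```python
import math

# S_bits = S_nats / ln(2): any base-2 bookkeeping drags a factor of
# ln 2 ~ 0.693 into formulas that are clean in natural units.
def nats_to_bits(s_nats):
    return s_nats / math.log(2)

print(nats_to_bits(1.0))                        # ~1.4427 bits per nat
print(math.exp(1.0), 2 ** nats_to_bits(1.0))    # the same microstate count
```

One "bit" of horizon information would correspond to 4*ln(2) Planck areas instead of a clean 4 - the ln(2) is the price for the "sick" choice of base.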

It would also be crazy to find "binary degrees of freedom" that are separated from each other inside the black hole because the black hole is the single physical system where everything maximally interacts (gravitationally) with everything else - so the separation of the information into decoupled degrees of freedom is more impossible for black holes than for anything else.

At the beginning, I was describing some people's obsession with arguments that Nature has to be made out of the same bits as digital computers. In reality, such an idea may be good as an order-of-magnitude estimate of the amount of information carried by a system.

A real physical system in Nature has infinitely many degrees of freedom but all of them, except for a finite number, have an overwhelming probability to remain in their ground state (the state with the lowest allowed energy) - otherwise they would make the total energy too high.

The finite number of degrees of freedom that are relevant and active may be imagined to carry the same information as each of its peers - and the same energy. This is clearly never exactly true in Nature but it is enough for order-of-magnitude estimates: you may talk about the estimated number of "active" degrees of freedom and their average energy or information, among other things.
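The freezing-out of the stiff modes is just Boltzmann suppression. For a harmonic mode of frequency "omega" at temperature "T" (in units where "ħ = k = 1"), the thermal occupation probabilities form a geometric series, so the probability of being anywhere above the ground state is exactly "exp(-omega/T)". A minimal sketch:

```python
import math

def excited_prob(omega, T):
    """Probability that a harmonic mode of frequency omega at temperature T
    (hbar = k = 1) is in any state above its ground state. The Boltzmann
    weights form a geometric series, so this tail sums to exp(-omega/T)."""
    return math.exp(-omega / T)

T = 1.0
# Modes much stiffer than the temperature are frozen in their ground state,
# so only the finitely many "soft" modes are effectively active.
for omega in (0.5, 1.0, 5.0, 50.0):
    print(f"omega = {omega:5.1f}: P(excited) = {excited_prob(omega, T):.3e}")
```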

However, such a visualization of what's going on is never accurate. People's obsession with the idea that this is how the world should fundamentally work - even though it is totally obvious that the world cannot work in this way - is just another piece of evidence for the fact that human stupidity knows no borders.

And that's the memo.


snail feedback (3) :

reader Eric said...

I think it "is" possible that the information of the universe is finite. One might consider the main principle of QCD, i.e. the asymptotic freedom of each quark with relation to the other two, to be the safety valve that allows this. In QCD, as the quarks in a nucleon get physically closer to each other, each quark has more chance of acting independently of the other two. This could be considered a state of each nucleon where each quark within it has an infinite degree of freedom when they are infinitely close together. As each quark moves farther away from the others, the probability of independent action of each becomes less and less. My understanding is that at the statistical cross section of the proton that freedom regresses to zero - that is, confinement.

One could consider the state of each quark to have infinite degrees of freedom with relation to the others when they are infinitely close to each other. Conversely, when they are at the statistical cross section of the nucleon, they have zero degrees of freedom in relation to each other.

I think eventually this principle will be able to be applied to much larger cross sections of the universe in a way that allows a statistical average number of degrees of freedom to be limited "per volume" as the universe expands. It may even be possible that at some late stage in the expansion, the universe as a whole will converge to a zero-degree-of-freedom state. Something to ponder.

reader Andrew Kazyrevich said...

Hi Lubos,

Quick question about your note that "...Any continuum that is composed out of "atoms" inevitably picks a preferred reference frame that breaks the Lorentz symmetry.."

From what I (think I) know about Lorentz symmetry, it's about all physics laws being the same for all inertial reference frames. If someone juggles in a steadily moving train, the cones are moving in the same way as for a non-moving juggler. The preferred reference frame you're talking about is apparently the reference frame bound to the "atoms of the continuum" themselves. If it's an inertial reference frame, I don't see why Lorentz symmetry is broken.

What am I missing?


reader Luboš Motl said...

Dear Andrew,

the principle of relativity, as you explained yourself, requires that you not be able to say whether your train is moving, or the next one is moving, etc.

But if the trains were going through a (dense enough) material composed out of "atoms", e.g. water, you could surely say whether you're moving. The water would be hitting the front side of your body. (Even the air could be enough for you to feel it if you open the windows/remove the roof.)

More generally, in any environment that is composed out of "particles" - not necessarily water molecules - there will surely exist effects that "know" about the velocity of these atoms, and the remaining physical objects will be influenced depending on their relative velocity with the "particles" - such as the water molecules.

So the laws of physics will be different for observers at different velocities, thus violating the principle of relativity.

That's the reason why there cannot be any luminiferous aether or spin foam - or any other material composed out of particles (with well-defined speeds) - that is filling the vacuum. The vacuum must be empty of any objects that could be assigned any "trajectories".

Condensates such as the Higgs field that are not composed of "localized particles" may be OK. The Higgs field, in particular, *is* OK. Of course, not being composed of particles is not a sufficient condition for Lorentz symmetry: even non-particulate "substances" might break the Lorentz symmetry. But "corpuscular" materials *have to*.

Best wishes