Monday, August 01, 2011

Old theories as limits of the new ones

Scientific progress leads to the discovery of new theories that go beyond the old ones.

But the new theories rarely denounce their successful predecessors as pure trash. Instead, they incorporate them as limits. There are several general points to make about the relationship between the old theories and the new theories. For example:

• One needs to use the language of the new theory to be more accurate; the newer theory is often qualitatively different
• New theories lead to new effects whose implications may also be described in the old theories' language but such a description can't be accurate
• New theories that transcend the limitations of the old ones are determined by new constraints: they're rarely randomly assorted additions to the old theories
• New theories usually set some parameters from the older theory to a constant, or "one" in natural units; those parameters were undetermined and often considered zero or infinite in the older theories
I will discuss some examples, including examples that are usually not presented as examples of this "old theory as a limit" paradigm.

Flat Earth limit

The very first example is sort of funny and refers to geometry and geography - which may also be considered the oldest branches of physics. Long ago, people surely thought that the Earth's surface was flat (except for the hills and mountains). This misconception didn't prevent them from eating bananas and hunting pigs.

That obsolete theory is also a limit: it's the limit
$R_{\rm Earth} \to \infty$
If you imagine that the radius of our blue, not green planet is infinite, the sphere effectively becomes a plane. Note that the radius is a dimensionful quantity: it has units of distance. So how can it go to $\infty$, which doesn't seem to have units? Isn't the displayed formula above dimensionally incorrect?

Strictly speaking, it is. We should say "infinitely many meters" which is pretty much the same thing as "infinitely many feet". However, it's not the same thing as "infinitely many radii of Earth" because this unit, the radius of the Earth, is exactly what we want to be infinite.

My point is that you should avoid such limiting procedures. When something is sent to zero or infinity, you had better talk about dimensionless quantities. So a better way to write the condition above is
$R_{\rm Earth} / R_{\rm distances\,\,we\,\,care\,\,about} \to \infty$
The left hand side is dimensionless so it's fine. The distances we care about are the sizes of your family members, the pig you want to kill to extend your food reserves, or the distance to the pig. Equivalently, the limit above may be written as
$R_{\rm distances\,\,we\,\,care\,\,about} / R_{\rm Earth} \to 0$
This limiting procedure means that if the hunter evaluates some quantity and there are corrections that contain a factor equal to the ratio above - which goes to zero - he may neglect the corrections.
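To see how small the neglected corrections actually are, here is a minimal numerical sketch - with illustrative numbers of my own choosing for the Earth's radius and for a hunter-scale distance - of how far the spherical surface drops below the tangent plane:

```python
import math

# Illustrative numbers (not from the text above):
R = 6.371e6      # Earth's radius in meters
d = 100.0        # a hunter-scale distance, e.g. the distance to the pig, in meters

# Exact drop of the spherical surface below the tangent plane,
# and its leading approximation d^2 / (2R):
drop_exact = R - math.sqrt(R**2 - d**2)
drop_approx = d**2 / (2 * R)

print(d / R)          # the dimensionless ratio sent to zero in the flat-Earth limit
print(drop_exact)     # less than a millimeter over 100 meters
print(drop_approx)    # nearly identical: the corrections are tiny
```

The dimensionless ratio $d/R$ is about $10^{-5}$, so the flat-Earth "theory" predicts the hunter's geometry to fantastic accuracy.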

Geocentric limit

We understand geocentrism as another flawed paradigm, a qualitatively mistaken theory. However, it may also be classified as a limit. For example, you may consider the strict geocentric system in which the Earth isn't even spinning around its axis. Instead, the whole heaven - the celestial sphere - is spinning around the Earth every day.

If you do so, you will naturally create a theory without e.g. the Coriolis force. This force is equal to
$\vec F_{\rm Coriolis} = -2m \vec {\Omega} \times \vec v.$
The mass $m$ of the object is thought of as a finite number, so the limiting procedure is
$|\Omega v| \to 0$
Again, you should divide this quantity by another quantity with the same units - in this case, some everyday acceleration. This limit means that people achieve their velocity - by normal acceleration while running - in a much shorter time than $1/\Omega$ which is approximately $1/(2\pi)$ of a day.

If you need much less than one day to accelerate from $\vec v=0$ to your final speed, the accelerations induced by your muscles will be much greater than the Coriolis acceleration, so you will be allowed to neglect the latter. Because the Coriolis acceleration is a textbook example of a sign that the geocentric frame isn't quite correct, the situations in which the Coriolis force (and effects sharing the same origin) can be neglected are situations in which the geocentric perspective may be legitimate and even useful.
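The comparison is easy to make explicit. A minimal sketch, with an illustrative runner's speed and $g$ chosen as the everyday acceleration scale:

```python
import math

Omega = 2 * math.pi / 86400    # Earth's angular frequency in rad/s (solar day; good enough here)
v = 10.0                       # a runner's speed in m/s (illustrative)
g = 9.81                       # an everyday acceleration scale, m/s^2

# Magnitude of the Coriolis acceleration in the worst case (velocity perpendicular to Omega):
a_coriolis = 2 * Omega * v

print(a_coriolis)       # about 1.5e-3 m/s^2
print(a_coriolis / g)   # a dimensionless ratio of order 1e-4, justifying the neglect
```

The ratio of order $10^{-4}$ is exactly the kind of small dimensionless number that lets the hunter - or the geocentrist - drop the correction.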

That's why most people forget that every 24 hours, they're actually flying around a circle of circumference 40,000 kilometers. They are so used to the geocentric perspective that if you tell them that the Earth is spinning, it gives them vertigo. :-)

There are many other approximations or limits that were routinely made and are still being made in many contexts. For example, the distance to the stars is infinite. That means that the stars have fixed places in the sky. That's not true - the apparent motion of the stars during the year is studied under the name "parallax", and so on.

Note that in the case of the Flat Earth, people ultimately had to discover a new finite constant - that was previously thought to be infinite (and therefore carried no detailed information) - the radius of the Earth. It was associated with previously unseen phenomena.

On the other hand, the main extra constant needed to determine the Coriolis force was $\Omega$, which is essentially the inverse day. And people knew the length of one day: they just didn't know that the alternation of days and nights leads to a new small (fictitious) force acting on all moving objects, the Coriolis force.

Infinite friction limit

Aristotle established his "old kind of physics". He was convinced that the only natural state of objects is rest. You need to spend energy or force - he couldn't distinguish those concepts because his picture was way too non-quantitative - if you want to move an object.

(Recall that Newton could have made progress because he realized that any uniform motion is a natural state of affairs and you only need force if you want to change the velocity.)

You must imagine some ancient Greek horse in the mud. Every step is hard. You may summarize Aristotle's physics as the limit
$|\vec F_{\rm Friction}| \to \infty$
The friction is simply infinite. That's why the horse immediately stops. Everything else stops, too. Aristotle graciously overlooked that balls thrown into the air are able to move for quite some time. He just closed his eyes so his theory was OK for him. ;-)

Note that the friction force on the left-hand side of the limit is really understandable only once you know something about the new theory. Aristotle, with his brain living inside the old theory, couldn't even understand what the "friction force" he was accused of neglecting actually was.

By focusing on the limiting situation where the friction forces are infinite, Aristotle and his blinded followers de facto justified the notion that Euclidean geometry - another gift of Ancient Greek science - is the only physics subdiscipline you need. There was no genuine "dynamics" because all motion was viewed as an "unnatural" thing. Only static arrangements of objects were "natural". They still realized that people and horses had muscles and could move other objects but they couldn't say anything quantitative about these effects.

Heat-energy conversion

Newton of course showed that friction forces are not infinite. No forces in Nature are really infinite. Instead, his new theory carefully and quantitatively analyzed their values. It provided formulae for the magnitude of these forces in some situations - e.g. the gravitational inverse square law - and it also described the effect of these forces - the acceleration of objects.

Energy also started to be used as a useful, properly understood observable - a quantity that is conserved. (Because of the time-translational symmetry of the laws of physics, as Emmy Noether has figured out.)

People knew something about heat but they would think it's something other than energy - much like milk is something other than pork. Again, because people didn't know the conversion factor between heat and energy, they could have believed that the conversion factor is infinite or zero: friction (that reduces the velocity) isn't enough to heat objects up, some people could have thought.

Well, the infinite limit seems nonsensical because friction surely doesn't increase the temperature by an infinite amount. The zero limit looks more reasonable at first sight: the people believing in it could have thought that heating an object isn't enough to induce the motion of anything within the object. We know that this is wrong, too. Joule found the correct finite conversion factor - which is why we may use the same unit, named the joule to remember his contributions, both for heat as well as energy.
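The finiteness of the conversion factor is easy to illustrate with Joule-style bookkeeping. The numbers below are illustrative, not Joule's actual apparatus: a falling weight stirs a kilogram of water and all the mechanical work ends up as heat.

```python
# Illustrative numbers for a Joule-style experiment (not his actual setup):
m = 10.0        # falling mass, kg
h = 2.0         # drop height, m
g = 9.81        # gravitational acceleration, m/s^2

work = m * g * h               # mechanical energy released, in joules

M_water = 1.0                  # mass of the stirred water, kg
c_water = 4186.0               # specific heat of water, J/(kg*K)

dT = work / (M_water * c_water)  # temperature rise if all the work becomes heat
print(work)                      # about 196 J
print(dT)                        # a few hundredths of a kelvin
```

The temperature rise is tiny - a few hundredths of a degree - which is one reason why the conversion factor was so hard to pin down and why the zero-conversion belief could survive for so long.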

Speed of light, Planck's constant, Newton's constant

There are many "less important" examples of limits that may interpret various beliefs about the real world that people had held before the 19th century. But let us jump to the canonical examples of the three limits that people associate with the 20th century science.

Special relativity has shown that Newton's ideas about the spacetime are only valid in the limit
$c \to \infty$
which is equivalent to
$1/c \to 0.$
Again, this should be more properly written as $v/c\to 0$ for all velocities $v$ of objects we study. Special relativity qualitatively changes the properties of the spacetime. For example, the simultaneity of pairs of events is no longer absolute; it depends on your reference frame. However, the old picture emerges in the appropriate limit.
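How good the $v/c\to 0$ limit is for everyday speeds can be checked directly. A minimal sketch, with an illustrative speed of an airplane-like 300 m/s, comparing the exact Lorentz factor with its leading correction $v^2/(2c^2)$:

```python
import math

c = 299792458.0          # speed of light, m/s
v = 300.0                # an illustrative everyday-fast speed, m/s

beta = v / c             # the dimensionless ratio sent to zero in the Newtonian limit
gamma = 1 / math.sqrt(1 - beta**2)   # exact Lorentz factor

print(beta)              # about 1e-6
print(gamma - 1)         # about 5e-13: Newtonian physics is an excellent approximation
print(beta**2 / 2)       # the leading correction, matching gamma - 1
```

The relativistic corrections enter at order $(v/c)^2 \sim 10^{-12}$, which is why Newton's spacetime served so well for centuries.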

The quantum revolution has shown that classical (=non-quantum) physics was qualitatively wrong for many related reasons. The observables such as $x,p$ used to commute in classical physics, different histories couldn't interfere with each other, and predictions were deterministic. That was wrong but the classical physics still emerges in the
$\hbar \to 0$
limit. It's similar to the $1/c\to 0$ limit that defines non-relativistic physics.

Adult theoretical physicists like to use natural units in which
$c=\hbar=1.$
This is a legitimate choice because e.g. the SI units have independent basic units for length, time, and mass. By choosing two of them differently and appropriately, you may achieve $c=\hbar=1$. Well, it essentially means that you use seconds for time, light seconds for distance, and inverse seconds as units for energy (because energy may be expressed as the angular frequency of the photon with the same energy).
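The statement that energy becomes an inverse time is just $E=\hbar\omega$. A minimal sketch converting one electronvolt - an illustrative choice - into inverse seconds:

```python
# In units with c = hbar = 1, an energy is an inverse time: E = hbar * omega.
hbar = 1.054571817e-34     # reduced Planck constant, J*s
eV = 1.602176634e-19       # one electronvolt in joules

E = 1.0 * eV               # an illustrative energy: one electronvolt
omega = E / hbar           # the same energy expressed as an angular frequency, 1/s

print(omega)               # about 1.5e15 inverse seconds
```

So "one electronvolt" and "about $1.5\times 10^{15}$ inverse seconds" are the same statement once $\hbar=1$.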

By setting $c=1$, the physicists simplify all equations that have something to do with special relativity - they may erase $c$ - and they declare that special relativity is a mundane thing for them. In the same way, $\hbar=1$ means that they want to erase $\hbar$ from all formulae related to quantum mechanics which also simplifies them.

If you set $c=\hbar=1$, it means that you accept both special relativity and quantum mechanics as important facts. With these two assumptions, you're effectively led to quantum field theory - or something that is more powerful and accurate than quantum field theory, namely string theory.

It's usually not discussed in this way but the SI units also include an independent unit of absolute temperature or temperature difference, the kelvin. A similar choice
$k_{\rm Boltzmann} = 1$
also simplifies some equations, those that have something to do with the microscopic description of thermal phenomena. Temperature is roughly speaking the energy per one microscopic degree of freedom.

Note that in the SI units, $k_{\rm Boltzmann}$ is a very tiny number which suggests that people implicitly used to believe that this constant was zero. What does this belief mean? Well, it means that the energy corresponding to the room temperature is divided among infinitely many degrees of freedom. Saying that there are infinitely many degrees of freedom of course means that you believe that there are no atoms - that matter may be divided arbitrarily finely.

So just like the replacement of $c\to \infty$ and $\hbar\to 0$ by $c=1$ and $\hbar=1$ is associated with the discovery of special relativity and quantum mechanics, the switch from the $k_{\rm Boltzmann} \to 0$ limit to $k_{\rm Boltzmann} = 1$ is linked to the discovery of the atoms and the statistical, microscopic explanation of thermodynamics that uses these atoms.
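The "tininess" of $k_{\rm Boltzmann}$ in SI units is just the atomic scale in disguise. A minimal sketch evaluating the thermal energy per degree of freedom at an illustrative room temperature of 300 K:

```python
k_B = 1.380649e-23    # Boltzmann constant, J/K
eV = 1.602176634e-19  # one electronvolt in joules
T = 300.0             # an illustrative room temperature, K

E_thermal = k_B * T
print(E_thermal)        # about 4e-21 J: "tiny" only because SI units hide the atoms
print(E_thermal / eV)   # about 0.026 eV, a perfectly natural atomic-scale energy
```

In joules the number looks negligible; in electronvolts - the natural unit of atomic physics - it's a perfectly ordinary energy, which is the whole point of setting $k_{\rm Boltzmann}=1$.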

Now really: Newton's constant

So far, I haven't mentioned another universal constant, Newton's gravitational constant $G$. It's also tiny in the SI units, which means that people automatically use all kinds of approximations assuming that this constant is very small.

Well, they don't assume that it's strictly zero, which would mean that they deny the existence of gravity altogether. Instead, they just think it's small, which means that they would neglect the effects scaling as $G^2$ relative to those that scale like $G$, and so on.

General relativity has to transcend this limit because one of its jobs is to describe extremely strong gravitational fields - e.g. black holes. That's why general relativity replaces $G\to 0$ by $G = 1$ - well, sometimes it's more natural to use rationalized Planck units in which $8\pi G = 1$.
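Combining $G$, $\hbar$, and $c$ determines the scales at which all three constants matter at once. A minimal sketch computing the standard Planck length, time, and mass from their textbook definitions:

```python
import math

G = 6.674e-11            # Newton's constant, m^3 kg^-1 s^-2
hbar = 1.054571817e-34   # reduced Planck constant, J*s
c = 299792458.0          # speed of light, m/s

# Planck length, time and mass: the scales where gravity, quantum
# mechanics and relativity must all be taken into account.
l_P = math.sqrt(hbar * G / c**3)
t_P = l_P / c
m_P = math.sqrt(hbar * c / G)

print(l_P)   # about 1.6e-35 m
print(t_P)   # about 5.4e-44 s
print(m_P)   # about 2.2e-8 kg
```

Setting $G=1$ (or $8\pi G=1$) on top of $c=\hbar=1$ amounts to measuring everything in these Planck units.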

Old-fashioned general relativists often set $8\pi G=1$ as well as $c=1$ because general relativity also includes special relativity as its limit. However, they don't set $\hbar=1$ because they're rarely brave enough to study quantum phenomena at the same moment.

If you study quantum gravity, it's natural to set $1=c=\hbar=G$. However, the only consistent theory of quantum gravity is string theory and it guarantees that many new interesting things have to take place near the Planck scale. For example, there are perturbative strings with tension $T = 1/2\pi \alpha'$ and the usual useful convention in perturbative string theory is
$T=1/\pi\quad$ i.e. $\quad \alpha' = 1/2$.
The limit of point-like particle field theories that neglect the stringy vibrations is $T\to\infty$ i.e. $\alpha'\to 0$. It's important to emphasize that the choice of units $\alpha'=1/2$ is only useful for weakly coupled string theory and it is incompatible with the choice $G=1$.

Effective field theories

Once people understood the conceptual paradigms that make it natural to set $1=c=\hbar=k_{\rm Boltzmann}$ - i.e. to use quantum field theories (or stronger weapons), with a microscopic understanding of thermal phenomena considered an obvious part of the story - the character of additional "old theory as a limit" developments narrowed.

In fact, if you choose the units in which all these conditions hold, all quantities have units that may be expressed as a power of a second (or an electronvolt or any other single unit you choose). So there's only one remaining independent scale - e.g. the energy - and you may say that the old theory is valid at smaller values of it and the new one at greater values.

This is the only kind of the "improvement" of theories that remains possible in quantum field theory. An old theory is valid for all distances longer than $L_{\rm min}$ or, equivalently, for all energies per particle smaller than $E_{\rm max}$. And a better theory may shrink the value of $L_{\rm min}$ or increase the value of $E_{\rm max}$ and allow you to get further.
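With $c=\hbar=1$, the two cutoffs are literally the same number: $L_{\rm min} \sim \hbar c / E_{\rm max}$. A minimal sketch, using an illustrative cutoff of 1 GeV for a hypothetical effective theory:

```python
hbar_c = 197.3269804   # hbar*c in MeV*fm, a standard conversion constant

# An effective field theory valid up to E_max resolves distances down to
# roughly L_min ~ hbar*c / E_max.
E_max = 1000.0                 # illustrative cutoff of a hypothetical effective theory, MeV
L_min = hbar_c / E_max         # the corresponding shortest distance, in femtometers

print(L_min)                   # about 0.2 fm: raising the cutoff shrinks the resolvable distance
```

Raising $E_{\rm max}$ and shrinking $L_{\rm min}$ are one and the same improvement, which is why there's only one direction of progress left.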

Once that happens, the old theory is a limit of the new one - we call it an effective field theory.

Of course, at the end, you want to realize that near the Planck scale, the improvements must stop because if you increase the energy per particle above the Planck scale, a further increase of the energy will guarantee that the collisions produce old, well-known black holes - so they prevent you from seeing any kind of physics that would be genuinely new.

At the end, when you also set $G=1$ or, inequivalently, $\alpha'=1/2$ - which has a similar impact - you're at the end of the story. All quantities are expressed as dimensionless numbers which also means that your calculation is either right or wrong. If all continuous quantities become calculable, just like in string theory, there is no room for any improvement of your theories. For example, you can't take the
$\pi \to 0$
limit. ;-)

In this brief history of physics, I was talking about the constants $1/c, \hbar, G$ that were once thought to be infinitely small which meant that all effects related to their finiteness were inaccessible. Better theories (and, in most cases, better experiments) made these phenomena accessible. They became the "bread and butter" of modern physics which is why it makes sense to set these constants equal to one. You only do it thrice and you seem to be finished.

This doesn't mean that there's nothing to learn about physics after you understand string theory. However, all the other constants - that you originally consider infinitely large or infinitely small, before you learn about their actual finite value - fail to be universal. In this sense, they differ from $c, \hbar, G$.

In fact, the very first example in this blog entry was meant to clarify this issue. It was about the Flat Earth limit. The extra parameter, $R_{\rm Earth}$, could have been sent to infinity. But the radius of the Earth isn't, unlike $c,\hbar,G$, a universal constant. It's just a property of one object.

It may be universally important for all members of the homo sapiens species because they happen to live on planet Earth (greetings to the International Space Station where the exceptions live - but in fact, the radius of the Earth is even more important for them than for us, haha). But it is not universal in the whole visible Universe. So of course, science that studies particular objects or particular events - which are not unique in this Universe or its history - will always have lots of things to study and it will always try to work with various limits and to develop better theories that transcend the limitations of these limits.

But in principle, the progress in fundamental physics only includes "a finite, rather small number of revolutions" that replace a limit by an inequality, setting a universal constant equal to one.