Saturday, April 12, 2008

Einstein and the physics of principles

A week ago, I wrote an essay beginning with the question
Is theoretical physics possible?
I argued that it was very hard for laypersons to understand that the brain, abstract reasoning, and careful calculations are not only useful but sometimes essential for the deepening of our knowledge about the physical world.

A more recent article in Discover Magazine shows this misunderstanding very clearly:
Has the Einstein revolution gone too far?
Its author, Richard Panek, is a "faculty advisor" at an institution with a rather scary name, namely "Goddard (!) College, Progressive (!) education for creative (!) minds". Wow. Well, he doesn't seem too creative, as we will discuss in detail. Also, I am not sure about his being progressive when he argues that progress has gone too far. ;-)

It is good that Panek has at least realized that the fashionable contemporary criticism of theoretical physics can be equally well applied to Albert Einstein and that the math-driven techniques in physics have been extremely important, if not critical, for more than 100 years - and maybe much longer. Some vitriolic haters of science try to pretend that the importance of theory in physics is a recent phenomenon.

But Panek is still completely wrong about every other idea he advocates.

Limits that scientists mustn't cross?

First of all, the very question whether "science has gone too far" in one respect or another is a symptom of entirely unscientific preconceptions.

The goal of science is to search for the right answers, not to determine how far scientists can go. Heliocentrists have gone too far in claiming that our great planet revolves around an irrelevant dot in the skies. Biologists have surely gone too far in eliminating God from the origin of species. Geneticists and neurobiologists have gone too far in demonstrating the biological origin of many properties and differences between humans (or animals). Quantum physicists have gone too far in getting rid of determinism from physics. Geologists, biologists, and cosmologists have "made" the Earth and the Universe too old, perhaps unnecessarily old, and too large. String theorists have found too many string vacua, i.e. too many solutions to the consistency criteria of quantum gravity and their theory has too many dimensions anyway.

Is it a problem? It is only a psychological problem for those who don't want to learn how Nature actually works and who prefer preconceived opinions or dogmas, if you wish. There exists no scientific way to determine, in advance, how far scientists may be allowed to go or how much they are allowed to use one particular tool or another. What is the procedure by which Mr Panek wants to decide how far the scientists can go? A public vote? Only research can determine such things as long as we are talking about real science, not a controlled theater.

Panek criticizes Einstein for knowing whether various theories were valid long before the popular experiments checking these issues were performed. Well, it's easy to criticize, but Einstein knew the right answers anyway. And it is possible to know these things. In fact, science and the history of science (including Eddington's questionable experiments), as understood in 2008, show that Einstein's criteria for deciding about the validity of various theories - at least before 1920 - were more robust and trustworthy than the particular experiments that were hyped back in 1919.

All insights of both special and general relativity are pretty much inevitable consequences of the postulates of these two theories and the postulates are known to be right. Good physics students know the derivations. And the application of solid logical arguments can never go "too far". Logic is one of the queens of science and it can break all artificial boundaries that would like to claim that an idea is "too much".
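As one example of such a derivation, the time dilation formula follows from the two postulates alone via the standard light-clock argument (the symbols below are the usual textbook ones, not taken from Panek's article):

```latex
% A photon bounces between two mirrors separated by a distance L.
% In the clock's rest frame, one half-tick takes \Delta t_0 = L/c.
% In a frame where the clock moves with speed v, the photon travels
% along a hypotenuse, and the constancy of c (second postulate) forces
\begin{align}
  (c\,\Delta t)^2 &= L^2 + (v\,\Delta t)^2, \\
  \Delta t &= \frac{L/c}{\sqrt{1 - v^2/c^2}} = \gamma\,\Delta t_0,
  \qquad \gamma \equiv \frac{1}{\sqrt{1 - v^2/c^2}}.
\end{align}
```

Nothing in this two-line argument is an unverified guess: it is pure logic applied to the postulates, which is why it cannot go "too far".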

Einstein attempted to "know" more difficult things, too. But his dreaming about a simple unified theory was not fruitful, and he knew it - his later struggles failed largely because he fell out of touch with quantum physics. Such a disappointing outcome is possible because physics is not (only) about dreaming. Regardless of the methods you use to search for the truth, your theories are still being tested and they must compete with others.

Einstein's intuition about the "simplicity" of physical theories was right on the money in the 1900s and 1910s but it turned out to be too naive in the 1920s and later.

Synthetic vs analytic approach

Also, the currently fashionable references to experiments and experience are mostly hot air because the actual difference between the two scientific approaches may be explained differently and more accurately - in a way that only moves experience to a different place but doesn't eliminate it.

And such a classification of scientific strategies was done a century ago. In 1919, as soon as he became a global celebrity, Einstein had to write an essay about the types of theories in physics. As the readers of his books know, he wrote a text that was probably inspired by Henri Poincaré's musings:
We can distinguish various kinds of theories in physics. Most of them are constructive. They attempt to build up a picture of the more complex phenomena out of the materials of a relatively simple formal scheme from which they start out. Thus the kinetic theory of gases seeks to reduce mechanical, thermal, and diffusional processes to movements of molecules - i.e., to build them up out of the hypothesis of molecular motion. When we say that we have succeeded in understanding a group of natural processes, we invariably mean that a constructive theory has been found which covers the processes in question.

Along with this most important class of theories there exists a second, which I will call "principle-theories." These employ the analytic, not the synthetic, method. The elements which form their basis and starting-point are not hypothetically constructed but empirically discovered ones, general characteristics of natural processes, principles that give rise to mathematically formulated criteria which the separate processes or the theoretical representations of them have to satisfy. Thus the science of thermodynamics seeks by analytical means to deduce necessary conditions, which separate events have to satisfy, from the universally experienced fact that perpetual motion is impossible.

The advantages of the constructive theory are completeness, adaptability, and clearness, those of the principle theory are logical perfection and security of the foundations. The theory of relativity belongs to the latter class.
Indeed, philosophy of science hasn't made much progress during the last century. A century ago, similar essays were being written by leading minds such as Poincaré and Einstein. These days, it is a domain of zeroes such as Panek, Woit, and Smolin.

Einstein makes it clear that the true underlying difference is not between the theory and experiments but between the synthetic (constructivist) approach and the analytic approach (based on principles). In both cases, we need to use the brain in one way or another. And in both approaches, we need to rely on experience.

It is just the chronology that differs.

In the synthetic approach, one usually begins by guessing an answer - without taking experience into account in any solid way. In this case, the guess typically postulates some elementary or microscopic building blocks or rules. More complicated situations are theoretically constructed out of these ingredients and their predictions are compared with experience, which either confirms or falsifies the guess. The theoretical work here is synthesis: we mentally create larger entities out of more elementary theoretical objects.

In the analytic approach, one doesn't begin with a hypothetical guess. Instead, she begins with empirically discovered rules, patterns, and principles that classes of phenomena seem to universally satisfy. These principles are reformulated as mathematical conditions and these conditions are used to analytically deduce new, so far unknown properties of the real world. In this case, the theoretical work is described as analysis, i.e. breaking a complex object or situation into pieces in one's mind. The analytical method leads us to more qualified guesses whose implications must still be tested, but we shouldn't forget that a part of the test was already made when the guess was being constructed. We are performing a kind of preselection that is effectively equivalent to the falsification of alternatives that occurs in later stages of the synthetic approach.

Analysis is not an invention of string theorists. It is not an invention of Einstein either. As Wikipedia says, it has been ascribed, as a practical method of physical discovery, to Ibn al-Haytham, Descartes, Galileo, and Newton. But the mode of thinking really goes back to Aristotle. All these people were rationally deducing the internal structure of reality from its complex manifestations.

It is no coincidence that calculus is referred to as mathematical analysis. It extracts universal properties of mathematical objects such as functions that can be used in many contexts and the discipline involves a lot of intermediate results that can't be directly measured in the physical applications.

Examples: heat, relativity, strings

Besides relativity, Einstein's main example of the synthetic and analytical approaches involves heat. Statistical physics is a synthetic approach because it starts with a guess - atoms - and derives their macroscopic consequences. Thermodynamics is the other, principled approach to heat. It is based on empirically observed principles such as the laws of thermodynamics (energy conservation and the increasing entropy).
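The synthetic direction can be made concrete with a minimal sketch of my own (not from the article; the units are hypothetical, with k_BT = m = V = 1): one starts from the molecular "guess" - atoms with Maxwell-Boltzmann velocities - and recovers the macroscopic ideal gas law as a derived consequence.

```python
import math
import random

def kinetic_pressure(n_particles=200_000, mass=1.0, k_T=1.0, volume=1.0, seed=0):
    """Estimate gas pressure purely from molecular motion: P = (N/V) * m * <vx^2>.

    Each wall collision transfers momentum 2*m*vx; averaging over a
    Maxwell-Boltzmann velocity distribution gives the pressure.
    """
    rng = random.Random(seed)
    sigma = math.sqrt(k_T / mass)  # Maxwell-Boltzmann: vx ~ Normal(0, k_T/m)
    mean_vx2 = sum(rng.gauss(0.0, sigma) ** 2 for _ in range(n_particles)) / n_particles
    return (n_particles / volume) * mass * mean_vx2

P_kinetic = kinetic_pressure()
P_ideal = 200_000 * 1.0 / 1.0  # macroscopic law: P = N k_T / V
print(P_kinetic / P_ideal)     # close to 1: the guess reproduces the law
```

The thermodynamic route runs in the opposite direction: it would take the relation P V = N k_T as an empirically established principle and constrain theories with it, never mentioning atoms at all.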

The atoms in the first approach are just a "guess" while the principles, namely the laws of thermodynamics, have already been extracted from observations in the case of the analytical approach. It is possible to use these principles to severely constrain the possible form of physical laws. Indeed, energy conservation has been an important guiding principle for formulating the laws of physics over the last two centuries. It is this law that naturally leads us to a (time-independent) Lagrangian or a Hamiltonian as the fundamental object that encodes the laws of physics.
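The step from energy conservation to a time-independent Hamiltonian can be made explicit in one line of Hamiltonian mechanics (standard textbook material, not a quote from Einstein's essay). Using Hamilton's equations, the total time derivative of the energy is

```latex
\begin{equation}
  \frac{dH}{dt}
  = \frac{\partial H}{\partial q}\,\dot q
  + \frac{\partial H}{\partial p}\,\dot p
  + \frac{\partial H}{\partial t}
  = \frac{\partial H}{\partial q}\frac{\partial H}{\partial p}
  - \frac{\partial H}{\partial p}\frac{\partial H}{\partial q}
  + \frac{\partial H}{\partial t}
  = \frac{\partial H}{\partial t},
\end{equation}
```

so the energy is conserved precisely when the Hamiltonian has no explicit time dependence - the empirical principle directly constrains the mathematical form of the fundamental object.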

The postulates of relativity - the equivalence of inertial frames, the universality of the speed of light, and the equivalence principle - play the same role as energy conservation. They constrain the possible form of physical laws and allow us to deduce more elementary laws out of their more complex manifestations and eliminate vast classes of theories that would otherwise be conceivable.
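A tiny illustration of how the postulates eliminate conceivable theories: the relativistic velocity-addition formula, itself deduced from the postulates, guarantees that no composition of subluminal velocities ever exceeds the speed of light, which rules out any theory predicting otherwise. A sketch of mine in units where c = 1:

```python
def add_velocities(u, v, c=1.0):
    """Relativistic composition of collinear velocities: w = (u + v) / (1 + u*v/c^2)."""
    return (u + v) / (1.0 + u * v / c**2)

# Two speeds of 0.9 c do not compose to 1.8 c:
print(add_velocities(0.9, 0.9))   # ~0.9945, still below c
# Composing with c itself reproduces c, as the second postulate demands:
print(add_velocities(0.5, 1.0))   # 1.0
```

The Galilean rule w = u + v is one of the "vast classes of theories" that the postulates kill in this way.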

String theory was discovered just like in the synthetic approach - the elementary building blocks were guessed (as a description of the strong interactions) - but it is fair to say that the analytic approach, the favorite approach of Albert Einstein, has dominated string theory ever since. The postulates of relativity combined with the principles extracted from quantum field theory are enough to see that theories defined under the string-theoretical umbrella are almost certainly the only solution.

And even when the incremental progress is being made in string theory itself, it often uses principles that are typical for the analytical approach, such as the existence of various required symmetries and low-energy fields or the absence of anomalies and unphysical singularities. That's how many dualities and transitions were first discovered.

The properties of the building blocks in string theory are not "flexible" in any way. There are no dimensionless non-dynamical adjustable parameters in string theory. That's another reason why string theory shouldn't be classified as a constructivist theory in Einstein's sense. It is a theory based on principles even though the ultimate, most powerful principle that may govern the whole structure in all contexts remains unknown (if it exists at all).

Atoms: synthesis vs analysis

The article that contained a part of Einstein's essay also discusses the subtle question whether the atomic theory itself was a synthetic theory or an analytical theory. Well, first of all, we can only classify methods of searching for the right answers or theories that are works in progress. Once our theories are fully completed, they are just theories. They can be dealt with analytically or synthetically. And all of us know that thermodynamics and statistical physics fit together and there exist many logical relationships between them that go both ways.

Statistical physics is mostly synthetic because the existence of atoms is a rapid guess and most of the theoretical work involves combining atoms into macroscopic configurations and deriving their macroscopic behavior. On the other hand, the idea of atoms extracted from the observed fixed mixing ratios in chemistry was a victory of the analytical approach.

The last enemies of the atomic theory evaporated as soon as Einstein (and Smoluchowski) successfully described the Brownian motion back in 1905. I am not really able to say whether Einstein's work on Brownian motion was synthetic or analytical. It was analytical in the sense that he had to qualitatively infer the existence of molecules from the observed chaotic motion. On the other hand, this answer - the existence of molecules - already existed at that time and most of Einstein's calculations relevant for the Brownian motion were as synthetic as those employed in statistical physics.
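The synthetic flavor of such calculations can be seen in a toy model (a sketch of mine, not Einstein's actual 1905 computation): postulating microscopic random kicks, one derives the macroscopic signature of Brownian motion, a mean square displacement growing linearly with time.

```python
import random

def mean_square_displacement(n_walkers=20_000, checkpoints=(25, 50, 100), seed=1):
    """Unit random walk as a toy Brownian particle.

    After t independent +/-1 steps, <x^2> equals t; the linear growth
    of <x^2> with time is the observable macroscopic prediction.
    """
    rng = random.Random(seed)
    positions = [0.0] * n_walkers
    msd = []
    for step in range(1, max(checkpoints) + 1):
        for i in range(n_walkers):
            positions[i] += rng.choice((-1.0, 1.0))
        if step in checkpoints:
            msd.append(sum(x * x for x in positions) / n_walkers)
    return msd

print(mean_square_displacement())   # approximately [25, 50, 100]
```

The analytical ingredient entered earlier, when the chaotic motion observed under the microscope was used to infer that discrete molecular kicks must exist at all.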

The article surrounding the essay also correctly states that there may be a lot of confusion about the very question whether relativity is a principled theory. The main point of Einstein's essay was that it was one. On the other hand, relativity may also be presented as a constructivist theory, beginning with a simple object, the Minkowski spacetime, and deducing its consequences.

I would still prefer to agree with Einstein that relativity remains a principled theory because it leaves the detailed character of the building blocks of matter more or less arbitrary. Don't get me wrong: relativity constrains them - they must preserve the Lorentz symmetry - but it doesn't uniquely say what they are which also means that relativity is not enough to "construct" complicated and explicit objects out of these blocks. This inherent residual ambiguity is the main reason why Einstein didn't classify relativity as a constructivist theory.

(When the folks were constructing the "right" aether that would agree with the Michelson-Morley and other experiments, there was a lot of constructivism in their convergence towards relativity except that the very idea of these aether constructions turned out to be unphysical bunk.)

The same comment applies to the space of vacua in string theory. The a priori infinite-dimensional space of possible effective field theories is constrained by the rules of string theory but there still remains a whole landscape of possibilities that prevents us from presenting string theory as a conventional constructivist theory of the real world. In the future, the situation can change. If our picture of string theory ever becomes complete, it will be possible to present it as a constructivist theory.

So I would emphasize that the classification of the theories only applies to a particular approach one takes in a given situation - it is about the history and methods of physics, not about the physical content of theories themselves. If you allow me to repeat myself in different words, both analytic and synthetic approaches require some reasoning and inference. In the analytical approach, the properties of the world (and objects) that are harder to observe directly are deduced from those that can be (or have been) observed. In the synthetic approach, it is the other way around.

In the synthetic approach, the bulk of the empirical information is applied at the end when the theory is verified. In the analytical approach, the bulk of the empirical information is used at the beginning, when the theory is being constructed.

However, as our understanding of the physical phenomena deepens, both sides of various deduced implications tend to become obvious, established, and observable. It often happens that the insight that used to be less obvious or hardly observable becomes the more empirically accessible one, and vice versa. When a theory is completely understood, it is usually hard to say which of its insights are assumptions or axioms and which of them are derived consequences. Whenever that happens, the classification of theories into analytical and synthetic ones may flip or become confusing and ambiguous.

Finally, I want to talk about the difference between synthetic and analytical approaches using a slightly different terminology.

Induction and deduction

One can also distinguish inductive reasoning and deductive reasoning. In deductive reasoning, the conclusions inevitably follow from the premises. In inductive reasoning, the implication is not inevitable, just likely: induction includes various types of generalizations and extrapolations.

Mankind must obviously rely on both methods. We couldn't have gotten anywhere if deduction were the only method to proceed. And on the other hand, we would be foolish to rely on uncertain induction when we can deduce something rigorously.

Deduction is a part of the synthetic reasoning: for example, the macroscopic properties of objects in statistical physics can be literally deduced from the existence and properties of the atoms. On the other hand, induction is more typically connected with the analytical method: the features of the world that are empirically inaccessible usually cannot be deduced from the observations rigorously. However, the latter rule is not universal. I would argue that Einstein rigorously deduced the conclusions of special relativity from his postulates. It was a deductive work. In fact, he used the word "deduce" in his essay.

The only "non-rigorous" induction that he had to employ before he ended up with relativity as a theory of the real world was the derivation of the postulates themselves. All observed physical phenomena seemed to satisfy the postulates of special relativity but he had no real "proof" (even though the postulates are believed to be 100% correct even now, a century later). The postulates were extracted from the experience in a typically inductive fashion. Although such extrapolations fail to be rigorously proven, they are often extremely likely. If a non-trivial event - or the validity of a principle - is observed billions of times, it is at least useful to consider the possibility that it will happen again because it seems "likely". Sometimes it is much more than just "useful" and "likely". Even if such principles have a limited range of validity, the classes of phenomena where they are an extremely important and accurate approximation are usually vast.

At any rate, the people who would like to eliminate whole methods of reasoning such as deduction, induction, analytical reasoning, or synthetic reasoning from science (or those who would like to impose quotas on them) obviously have no idea how theoretical physics in particular, and science in general, operates or can operate, and they could never have seriously worked on it. You should ignore them because they are simply too ignorant either about the scientific method in general or about a scientific discipline in particular.

And that's the memo.

2 comments:

  1. Hi Not a Surfer Dude,

    I suspect that the most precise minimum length [min(L)] lies within the range:
    anthropic calculated min(L) <= actual min(L) <= anthropic measurable min(L).

    It may be that min(L) is less important than minimum time [min(T)] of a trajectory.
    For example(s):
    1 - The Peregrine Falcon stoop is an attack corkscrew dive that is not the shortest distance, but rather the shortest time or fastest means of intercepting prey by using gravity assistance, 'NATURE: Raptor Force'.

    2 - Human satellites often use gravity assistance to explore the planets; again not the shortest distance, but the most efficient if not the shortest time.

    This is probably due to the difference between 2D and 3D representations.

    Consider Hamilton’s 2D Traveling salesman problem which is the shortest distance.
    If this problem is modified to that of a Traveling Spacecraft in 3D, the shortest distance is no longer the most efficient route.

    I suspect that energy is a Lie process of constant transformation at both intra- and inter-gauge scales.

    By energy economics, I mean treating the problem as identifying
    a - agents as standard model particles, including annihilation forms,
    b - strategies by which these particles self organize,
    c - utilities or payoffs which may be as simple as existence or longevity.

    I suspect that one could make the argument that Wolfgang Pauli utilized a constant sum game to deduce the existence of the neutrino when he accepted the concept of the conservation of energy. One may even argue that Pauli used a John Harsanyi [1994 Nobel economics] like method in an incomplete information problem.

  2. Hi Lubos,

    This URL should go to Kozlov page 50 quote, last paragraph:
    "The duality of the time t and the energy H taken with the opposite sign can be seen in the explicit expression of the Cartan 1-form phi = y • dx - H dt."
