Tuesday, December 27, 2011

Steven Weinberg and anti-quantum zeal

A great scientist is defending GRW-like pseudoscience

In recent months, I have made several comments suggesting that Steven Weinberg is somewhere close to one-half of his path towards a personal transmutation into an anti-quantum zealot. This was based on some random sentences from his interviews etc. But I vaguely realized that there was actually a much stronger reason for such beliefs.

And I was reminded of a preprint proving that the figure "one-half" in the previous paragraph is unfortunately a significant underestimate:

Collapse of the State Vector (September 2011)
In this text, Weinberg endorses his own version of a Ghirardi-Rimini-Weber collapse theory. Recall that this is a model designed by several deeply confused people who think that every \(10^{15}\) seconds, each particle in the Universe is forced to fill out a "census" after which its position has to become well-defined to an accuracy of 1 micron or so.

The reason why Ghirardi, Rimini, Weber, and others are willing to believe that there exists such a mechanism is a combination of the following ideas:
  1. The wave function has to be a "real complete description" of the physical system whose "reality status" is that of a classical electromagnetic wave [this assumption is pure BS]
  2. Because the wave function spreads, something has to keep it "nice and compact": random flashes therefore make it "collapse" approximately \(10^{-15} N\) times per second, where \(N\) is the number of particles in the system [this assumption is only needed because of assumption #1]
  3. Those authors believe that they may choose the parameters of the "flashes" so that the macroscopic world will look like a classical object with sharp positions (up to a micrometer) while the microscopic world won't lose its coherence [this goal is pure wishful thinking and BS]
In a June 2011 article on this blog, I explained that such flashes would obviously generate huge perturbations of the system that are experimentally proven not to exist, so any version of such a theory is immediately falsified.

One may say that the GRW collapse fails on both sides: it's both "too vigorous" and "not stringent enough". It's too vigorous because every \(10^{15}\) seconds, the energy of each electron is modified – by the prescribed "material collapse" – by a substantial fraction of an electronvolt.

If such brutal random perturbations of a large system existed, a macroscopic piece of material would have to exhibit billions of spontaneous ionization events of atoms (and other "flashes") each second, even at arbitrarily low temperatures. Demonstrably, nothing like that is taking place: a cold enough piece of material lives in complete peace. So you may think that the right recipe to resuscitate the GRW theory would be to make these "flashes" less frequent and/or to make the required localization of the particles "more tolerant", i.e. to make the typical modification of the particle's energy smaller.
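To put a number on the "billions of flashes", here is a back-of-the-envelope check – a minimal sketch, assuming the rate quoted above (one flash per particle per \(10^{15}\) seconds) and counting only the nucleons in a kilogram of matter:

```python
# Back-of-the-envelope GRW flash rate in 1 kg of matter.
# Assumptions (from the rate quoted above): one flash per particle per
# 1e15 seconds; we count only nucleons, each weighing ~1.67e-27 kg.

NUCLEON_MASS = 1.67e-27          # kg
RATE_PER_PARTICLE = 1e-15        # flashes per second per particle

mass = 1.0                       # one kilogram of material
n_particles = mass / NUCLEON_MASS
flashes_per_second = n_particles * RATE_PER_PARTICLE

print(f"particles in 1 kg:  {n_particles:.1e}")         # ~6.0e+26
print(f"flashes per second: {flashes_per_second:.1e}")  # ~6.0e+11
```

Hundreds of billions of "flashes" per second in every kilogram: if even a tiny fraction of these kicks ionized an atom, a cold piece of material would visibly sizzle.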

However, this solution won't help you because the GRW collapses are already too loose with the parameters the proponents of this scheme typically propose. If we use the values of the parameters proposed in the GRW literature, an electron – and everything that has correlated positions with it – only has a "sharp" position after the "authoritatively imposed spontaneous measurement" (the flash) at the accuracy of one micron. However, we may easily measure positions of macroscopic objects (parts of interferometers) with the accuracy of \(10^{-15}\) meters or so, nine orders of magnitude more accurately, and there's still no "stochastic Zitterbewegung" at this scale.

Of course, the right explanation of such observations is that no GRW-like flashes exist. "Their" frequency in Nature is strictly zero, so these collapses can't have anything to do with the fact that large objects "behave" classically. And indeed, quantum mechanics confirms that the emergence of classical physics has nothing to do with "additional collapses" added upon the unitary evolution in quantum mechanics. The classical limit emerges because
  1. in the \(\hbar\to 0\) limit, all commutators vanish and we recover a set of commuting classical observables; all the related quantization rules (e.g. for the angular momentum or energy) that result from quantum mechanics dissolve, and we return to a world of simultaneously measurable observables with continuous spectra
  2. for large enough objects, the information about the relative quantum phases evolves chaotically, so the potential for future interference evaporates, too: the density matrix abruptly acquires a diagonal form with respect to a preferred basis (eigenstates of those observables that are able to imprint themselves into orthogonal states of the environment), which is why its diagonal entries may be interpreted as ordinary classical probabilities, and the quantum density matrix reduces to a probability distribution on the classical phase space (see the toy simulation below)
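To see the second mechanism in a toy model – my own illustration, not anything from Weinberg's paper – take a single "system" qubit prepared in the superposition \((|0\rangle + |1\rangle)/\sqrt 2\) and let it interact with \(n\) environment qubits. Each environment qubit ends up in \(|e_0\rangle\) or \(|e_1\rangle\) depending on the system's state, and the off-diagonal entry of the system's reduced density matrix gets multiplied by the overlap \(\langle e_0 | e_1\rangle\) once per environment qubit, so it decays exponentially with \(n\):

```python
# Toy decoherence: one system qubit coupled to n environment qubits.
# The system starts in (|0> + |1>)/sqrt(2). Each environment qubit ends in
# |e0> if the system is |0> and in |e1> = cos(t)|0> + sin(t)|1> if it is |1>.
# The off-diagonal ("interference") entry of the reduced density matrix is
# multiplied by <e0|e1> = cos(t) per environment qubit; the diagonal stays put.

import numpy as np

theta = 0.3                  # per-qubit scattering angle; <e0|e1> = cos(theta)
overlap = np.cos(theta)

for n in (0, 10, 50, 200):
    coherence = 0.5 * overlap**n          # off-diagonal entry of reduced rho
    rho = np.array([[0.5, coherence],
                    [coherence, 0.5]])
    print(f"n = {n:3d}:  |rho_01| = {abs(rho[0, 1]):.2e}")
# n = 200 gives |rho_01| ~ 5e-05: effectively a diagonal, i.e. classical, matrix
```

The diagonal entries – the classical probabilities – never change; only the interference terms die as the environment "measures" the system.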
Nothing else is needed for quantum mechanics to reduce to classical physics in the appropriate limits; and there exist no other effects that would make the reduction "more real" than the reduction above indicates. In particular, at the fundamental level, the world is always a quantum system, regardless of the size; the only "classical" thing we may say is that classical physics becomes an acceptable approximation, a scheme that produces results that may agree with the observations (and with the exact QM answer) within some error margins. Nevertheless, much like many other approximations in any context you may think of, classical physics is never fundamentally accurate; it is always just an approximation with a very limited range of validity.

Weinberg's introduction

Just to be sure, Steven Weinberg offers nothing that could "resuscitate" GRW-like pseudoscientific theories that were falsified within a picosecond, as explained above. Despite the 45 years that separate us from the first attempts to make the collapse physical (Bohm + Bub, 1966), there isn't a glimpse of a quantitative idea about the actual values of the parameters that describe the new processes that, according to this GRW school of thought, are needed for the classical limit to arise.

People writing papers about this junk never discuss the actual physics, like the properties of the hypothetical "electronvolt flashes" they predict billions of times a second in a kilogram of material, or the consequences of the inaccurate positions of macroscopic objects that still "suffer" from a micrometer error margin.

Papers like Weinberg's are purely formal masturbations with physically nonsensical equations, driven by the author's (or authors') inability to understand the postulates of quantum mechanics. So far, Weinberg's paper has the deserved number of 0 citations, but I am sure that even this very blog entry will encourage some people with these beliefs to write equally meaningless follow-ups.

Let's go through the introduction in Weinberg's paper:
There is now in my opinion no entirely satisfactory interpretation of quantum mechanics [1].
Weinberg promises us "Lectures on Quantum Mechanics" to be published soon; its Section 3.7 will discuss similar stuff. By choosing the same title as one of Dirac's classic books, Weinberg may be attempting to overshadow Dirac. In this comparison of two books of the same name, the currently living generation would find itself in an embarrassing situation, as the preview of Section 3.7 indicates.
The Copenhagen interpretation [2] assumes a mysterious division between the microscopic world governed by quantum mechanics and a macroscopic world of apparatus and observers that obeys classical physics.
Holy cow. The purpose of the Copenhagen interpretation was to provide physicists with a solid phenomenological basis that allows them to make predictions for all new observable phenomena found in the microscopic world and to verify their detailed microscopic theories experimentally, by measurements that involve macroscopic objects. To do so, the Copenhagen school postulated that some parts of the physical system – the macroscopic "apparatuses" – may be assumed to follow the logic of classical physics while the microscopic degrees of freedom can't.

What primarily matters in science is not whether something is "mysterious" but whether it is correct, and be sure that all the Copenhagen school's comments about the objects obeying the logic of classical physics were right! In particular, it is demonstrably true that classical physics is inapplicable to elementary particles and atoms; but it is almost perfectly applicable to the center-of-mass coordinates of macroscopic bodies. The boundary between the situations in which classical physics is "totally invalid" and in which it is "almost fully satisfactory" is fuzzy but it is completely absolute and objective.

There are two realms!

The Copenhagen interpretation, if we interpret it as a particular selection of the papers by the founding fathers, didn't derive the origin of classical physics in detail. But they didn't claim to have done so; they knew that the emergence of classical physics had something to do with the large size of the systems. And they were proved right. Today, using calculations in decoherence theory, we may derive the location of the "classical-quantum boundary". It is there, the location is "absolute" and calculable, and all the properties of this boundary confirm the assumptions that were made by the Copenhagen school folks.

So the fact that large objects may be nicely approximated by classical physics isn't really mysterious in any sense; it was known to have something to do with the large number of degrees of freedom – a situation for which quantum mechanics makes classical-like predictions – and with the modern progress in decoherence, we may derive many more details which make perfect sense.
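Decoherence theory even lets one estimate where that boundary sits. A standard order-of-magnitude relation (Zurek's estimate; my illustration, not anything from Weinberg's paper) compares the decoherence time \(\tau_D\) to the thermal relaxation time \(\tau_R\) via \(\tau_D \approx \tau_R\,(\lambda_{\rm th}/\Delta x)^2\), where \(\lambda_{\rm th} = \hbar/\sqrt{2mk_BT}\) is the thermal de Broglie wavelength:

```python
# Order-of-magnitude location of the quantum-classical boundary, using
# Zurek's relation tau_D ~ tau_R * (lambda_th / dx)^2. Inputs are purely
# illustrative: a 300 K environment, relaxation time 1 s, and a
# superposition split dx of 1 micron.

import math

HBAR = 1.05e-34                  # J s
KB = 1.38e-23                    # J / K

def decoherence_time(mass, dx, temperature=300.0, tau_relax=1.0):
    """Zurek's estimate tau_D = tau_R * (lambda_th / dx)**2."""
    lambda_th = HBAR / math.sqrt(2 * mass * KB * temperature)
    return tau_relax * (lambda_th / dx) ** 2

dx = 1e-6                        # a 1-micron superposition
for name, mass in (("electron", 9.1e-31), ("dust grain", 1e-12)):
    print(f"{name:10s}: tau_D ~ {decoherence_time(mass, dx):.1e} s")
# electron  : tau_D ~ 1.5e-06 s  (coherence survives long enough to matter)
# dust grain: tau_D ~ 1.3e-24 s  (any superposition is killed instantly)
```

The electron can interfere; the dust grain's center of mass cannot, by about eighteen orders of magnitude. The boundary is calculable, not mysterious.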
During measurement the state vector of the microscopic system collapses in a probabilistic way to one of a number of classical states, in a way that is unexplained, and cannot be described by the time-dependent Schrödinger equation.
There is nothing unexplained about the "collapse of the state vector" that occurs in "a probabilistic way". Quite on the contrary, the insight that the state vector has a probabilistic interpretation – the insight for which Max Born received his deserved Nobel prize – fully explains its character. The "collapse" has the very same interpretation as the "collapse" of the probability distribution
\[ \left( \frac 16, \frac 16, \frac 16, \frac 16, \frac 16, \frac 16 \right) \] for a die when we throw it to
\[ \left( 0,0,0,1,0,0 \right) \] when we see that the outcome was 4. The probability distribution meant that there were six possible outcomes and each of them could have occurred. We could have only predicted the odds of different outcomes, but we were guaranteed to get one particular outcome (and not a "mixture" or other irregular results). We got 4 in the classical case of a die; in the quantum case of a photon, we may get a location of the photon on the photographic plate. But the logic is the same: we can't predict the unique outcome, just the odds of different outcomes, but when an outcome actually becomes a fact, it is guaranteed to be one of the allowed ones.
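The classical side of this analogy is literally a two-line update of a probability distribution upon learning the outcome – a minimal sketch of my own, purely illustrative:

```python
# The classical "collapse": conditioning a probability distribution on an
# observed outcome. Nothing physical happens to the die; only our knowledge
# of it is updated.

import numpy as np

prior = np.full(6, 1 / 6)        # (1/6, ..., 1/6) before the throw
print("before:", prior)

outcome = 3                      # we observe a 4 (zero-indexed face 3)
posterior = np.zeros(6)
posterior[outcome] = 1.0         # (0, 0, 0, 1, 0, 0) after we see the result
print("after: ", posterior)
```

In the quantum case, the same update is applied to the Born probabilities \(|\langle i|\psi\rangle|^2\); the update itself is no more "physical" than it is for the die.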

The only difference between classical physics and quantum physics is that in classical physics, one may consistently imagine that all the uncertainty was just due to our incomplete knowledge and that in principle, we could know everything about the observables describing the real world (and predict the outcomes of all the dice in Las Vegas). In quantum physics, this is not possible because the observables generally refuse to commute with each other: the sharp well-definedness of one of them automatically implies the uncertain state of almost all others.
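A two-line check with Pauli matrices makes the point concrete (an illustration of the general statement, nothing more): a state with a sharp value of \(\sigma_z\) necessarily has a maximally uncertain \(\sigma_x\), precisely because the two operators refuse to commute:

```python
# Non-commuting observables: a sharp sigma_z forces an uncertain sigma_x.
# We use the eigenstate |0> of sigma_z with eigenvalue +1.

import numpy as np

sz = np.array([[1, 0], [0, -1]], dtype=float)
sx = np.array([[0, 1], [1, 0]], dtype=float)
psi = np.array([1.0, 0.0])                    # sigma_z eigenstate

def mean_and_variance(op, state):
    mean = state @ op @ state
    return mean, state @ (op @ op) @ state - mean**2

print("[sz, sx] =\n", sz @ sx - sx @ sz)                    # nonzero commutator
print("sigma_z (mean, var):", mean_and_variance(sz, psi))   # (1.0, 0.0): sharp
print("sigma_x (mean, var):", mean_and_variance(sx, psi))   # (0.0, 1.0): uncertain
```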

Nothing else needs to be "explained" in the scientific sense. We may try to "explain" these issues to a person who seems to be dense and doesn't understand what the concept of a probability means (this is really the core of his or her problem: it has nothing to do with quantum mechanics). But this "explanation" isn't a part of the scientific research: it is a part of the (undergraduate) physics education.
The many-worlds interpretation [3] assumes that the state vector of the whole of any isolated system does not collapse, but evolves deterministically according to the time-dependent Schrödinger equation. In such a deterministic theory it is hard to see how probabilities can arise. Also, the branching of the world into vast numbers of histories is disturbing, to say the least.
While I agree with the comment that no one has identified a way to attribute different probabilities to the alternative histories, and I agree with the general proposition that the many worlds interpretation sucks, Weinberg's criticism strikes me as an incredibly vague and superficial one. What does it exactly mean that something is "disturbing"? Science has found many things that looked "disturbing" to many people – starting from the heliocentric theory – but being "disturbing" isn't something that would reduce the likelihood that a scientific theory is right. The truth is often disturbing to many.

In particular, a theory of the world could predict a huge number of "parallel universes" – after all, the multiverse theory that Weinberg generally advocates is another example of this theme. Their mere existence, assuming that it doesn't lead to invalid predictions, simply isn't a "crippling property" according to science. One could try to show that no scheme of "branching the world" into many histories can be made compatible with the other principles of physics, but Weinberg hasn't done so.
The decoherent histories approach [4] gives up on the idea that it is possible to completely characterize the state of an isolated system at any time by a vector in Hilbert space, or by anything else, and instead provides only a set of rules for calculating the probabilities of certain kinds of history. This avoids inconsistencies, but without any objective characterization of the state of a system, one wonders where the rules come from.
Just like the word "disturbing", the verb "give up" in the previous paragraph means that Weinberg has "given up" the scientific mode of thinking in favor of a political one. Whether or not there exists an "objective description of a physical system" is not a cherished "idol" that may be "preserved" or "given up": it is a scientific question that has a Yes/No answer, and scientists aren't allowed any prejudices about the right answer. And in the mid-1920s, it was shown that the answer is negative: there can't be any objective description of the physical system in which everything is certain. Attaching the verb "give up" to this profound insight of the early 20th-century physicists is a sign of an emotional, biased attitude to these physics questions.

Also, I find it puzzling why the decoherent (well, "consistent" may be a more accurate adjective) histories were described as a completely separate interpretation from the Copenhagen interpretation. It's really the same theory, quantum mechanics. This whole meme that there are "many interpretations" of quantum mechanics is flawed. There is only one quantum mechanics which only has one right meaning for each of its concepts. The consistent histories approach may focus on different kinds of questions than the old Copenhagen school but it is using the same "interpretation" of all the objects such as the state vector or the density matrix.

There aren't dozens of possible colorful "interpretations" of quantum mechanics. One either understands quantum mechanics – which is one theory, one framework underlying physics – well, or one understands it not so well. Weinberg has unfortunately switched to the second camp.
Faced with these perplexities, one is led to consider the possibility that quantum mechanics needs correction.
I find the irrationality of the sentence above breathtaking. There isn't an infinitesimal glimpse of evidence that quantum mechanics needs a correction. There isn't a single experiment that would indicate that quantum mechanics needs corrections. Quite on the contrary, pretty much all corrections that someone could propose – like the GRW "flashes" – may be instantly falsified.

Such observations of "blunders of quantum mechanics" are absent both from the scientific literature and from the previous sentences by Weinberg. So if one is "led to consider that quantum mechanics needs corrections", then "one" has just totally abandoned rational reasoning, reasoning based on the actual evidence. The only things that Weinberg has listed as "reasons" to be "led" to modifications of quantum mechanics were emotional outbursts and politically loaded, biased words such as "disturbing" and "giving up". They only show Weinberg's emotional discomfort with quantum mechanics. It is the same kind of discomfort that many people felt when they were presented with the heliocentric theory or any other correct theory of the world. But discomfort isn't what determines the truth in science. In the past, I considered Steven Weinberg to be a top thinker who would insist on such basic principles (and who would insist that scientific evidence beats philosophical prejudices); I am disappointed that he is no longer one. And he clearly isn't.
There may be a Hilbert space vector that completely characterizes the state of a system, but that suffers an inherently probabilistic physical collapse, not limited as in the Copenhagen interpretation to measurement by a macroscopic apparatus, but occurring at all scales, though presumably much faster for large systems.
As explained at the beginning, there can't be such new "physical collapses". They're experimentally excluded at every accessible scale because they're qualitatively wrong. They also violate the basic principles of quantum mechanics such as unitarity whose validity has been tested in millions of accurate experiments: that's another way to understand why all such theories disagree with the observations.
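One way to see the conflict concretely: a "physical collapse" acts on the wave function as a non-unitary Gaussian localization, and every such act pumps kinetic energy into the particle. Here is a minimal numerical sketch of my own – illustrative parameters, not Weinberg's model – that localizes a free electron's wavepacket to a width \(a\) and compares \(\langle p^2\rangle/2m\) before and after:

```python
# Energy kick from a GRW-style localization "flash": multiplying the wave
# function by a Gaussian of width a and renormalizing (a manifestly
# non-unitary operation) raises <p^2>/2m. Illustrative parameters: an
# electron, initial packet width 10 microns, localization width 1 micron.

import numpy as np

HBAR, M_E = 1.05e-34, 9.1e-31
N, L = 4096, 2e-4                               # grid points, 200-micron box
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
p = 2 * np.pi * HBAR * np.fft.fftfreq(N, d=x[1] - x[0])

def kinetic_energy(psi):
    psi_p = np.fft.fft(psi)
    return np.sum(np.abs(psi_p)**2 * p**2 / (2 * M_E)) / np.sum(np.abs(psi_p)**2)

sigma0, a = 1e-5, 1e-6
psi = np.exp(-x**2 / (4 * sigma0**2))           # broad packet, width 10 um
psi /= np.linalg.norm(psi)

flashed = psi * np.exp(-x**2 / (4 * a**2))      # the localization "flash"
flashed /= np.linalg.norm(flashed)              # renormalized by hand: non-unitary

ratio = kinetic_energy(flashed) / kinetic_energy(psi)
print(f"kinetic energy ratio after/before: {ratio:.0f}")
# ~100: each flash multiplies the electron's kinetic energy a hundredfold here
```

The norm has to be restored by hand because the map isn't unitary; and the narrower the imposed localization, the bigger the heating – which is exactly the effect that cold matter demonstrably does not exhibit.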
From time to time specific models for this sort of collapse have been proposed [5]. In the present article we will instead consider the general properties of theories of the stochastic evolution of the state vector, assuming that this evolution depends only on the state vector, with no hidden variables.
In other words, Weinberg is randomly fluctuating in the sea of anti-quantum confusion.
In contrast to earlier work, we concentrate on the linear first-order differential equation that in general describes the evolution of the probability distribution of the state vector in Hilbert space.
Except that if he randomly fluctuated into other corners of the sea of confusion, he could just as well concentrate on non-linear or higher-order equations. When one abandons the strict standards according to which falsified theories must be abandoned, he may introduce random modifications of any kind to any equation or any principle. Any person may become a gremlin. Because there are infinitely many arrangements of typos one may introduce into science, the potential for such "misprint factories" is basically unlimited. But that doesn't mean that this activity has a positive value.
We find conditions on this evolution so that it leads to final states with probabilities given by the Born rule of ordinary quantum mechanics. This general formalism is also applied to the special case of a state vector that evolves through quantum jumps.
In other words, we may make assumptions such as \(3+6=14\) and \(5\times 4 = 19\) but we still derive that it had better be the case that \(1+1=2\). That may be interesting for Weinberg, but others may still see that \(3+6=9\) and \(5\times 4=20\), too, so Weinberg's paper is rubbish even if he manages to calculate \(1+1\) correctly.
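For what it's worth, the mathematical core of every such "Born rule from stochastic collapse" exercise is elementary and well known: if the evolving squared amplitude \(P\) of an outcome undergoes zero-drift noise confined to \([0,1]\) (a martingale), the probability of being absorbed at \(P=1\) automatically equals the initial Born value \(P_0\). A minimal Monte Carlo sketch of my own (not Weinberg's equations):

```python
# The martingale argument behind "stochastic collapse reproduces the Born
# rule": evolve P, the squared amplitude of an outcome, with zero-mean noise
# that vanishes at P = 0 and P = 1. Because E[P] is conserved and every run
# ends at an endpoint, the fraction absorbed at P = 1 converges to P0.

import numpy as np

rng = np.random.default_rng(0)

def collapse_run(p0, step=0.2, tol=1e-3):
    p = p0
    while tol < p < 1 - tol:
        # symmetric kick shrinking near the endpoints: keeps p inside (0, 1)
        # and keeps E[p] constant (the martingale property)
        p += step * rng.choice((-1.0, 1.0)) * p * (1 - p)
    return p > 0.5

p0, runs = 0.3, 2000
hits = sum(collapse_run(p0) for _ in range(runs))
print(f"Born rule predicts {p0:.2f}; simulated collapse gives {hits / runs:.2f}")
# prints ~0.30, up to Monte Carlo noise
```

The optional stopping theorem guarantees the agreement, so the arithmetic is bound to work out; the criticism above is that the added noise itself – not the bookkeeping – is what experiments exclude.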


snail feedback (1):


reader Brian G Valentine said...

If I am interpreting Weinberg correctly, then the action (density) does not remain invariant under coordinate transformations, and so it is nonsense.

Anyway, this type of "science" reminds me very much of climate "science" nowadays. Make up anything that sounds good to arrive at some outcome that can be given pre-assigned probabilities.

Justify it by smearing and mocking anybody who questions it. Tie it in with polar bears and floods and teach it in grammar schools.

Use it to justify wasted money on chimerical solar power projects.