## Wednesday, August 25, 2010

### The unbreakable postulates of quantum mechanics

Many people use every opportunity to diminish the importance of the principles of quantum mechanics and to shroud them in fog and shadows - even though they're the most important and most robustly established insights of 20th century science.

Any quantum mechanical theory is composed of two layers:
1. Postulates and interpretation of quantum mechanics
2. The choice of the Hilbert space and the Hamiltonian (or action or whatever plays its role of determining the evolution in time)
These two layers are largely isolated from one another. The first layer is universal for all quantum mechanical theories and one only needs to understand it once; the second layer has to be refreshed for each new physical system we want to study and for each description of such a system that is more accurate than the previous one.

Pretty much all the advanced mathematics of every modern physical theory is hiding in the second layer while the philosophy and all the "controversies" of quantum mechanics are concentrated in the first layer.

You may read a lecture of mine about the interpretation of quantum mechanics. However, in this text, I want to focus on more mathematical and less philosophical issues of the postulates of quantum mechanics.

Fine. What do I mean by the postulates?
1. The set of possibilities in which a physical system may be found is described by a linear Hilbert space (more precisely by the rays in this space) equipped with an inner product.
2. Complex (nonzero) linear combinations of allowed states are allowed states, too.
3. A physical system composed of N separated (or fully independent) subsystems has a Hilbert space equal to the tensor product of the Hilbert spaces describing the individual subsystems.
4. Physical quantities, also referred to as "observables" in the fancy quantum mechanical context, are encoded in Hermitean (linear) operators acting on the Hilbert space.
5. In particular, the evolution in time is generated by the operator known as the Hamiltonian.
6. The exponentials of its imaginary multiples are the operators that evolve the system over a finite interval and these operators are unitary; similarly, other symmetry transformations are given by other unitary (or anti-unitary, if the time reversal is included) operators.
7. The expectation values of the quantity "A" are given by the inner product <psi|A|psi>; if "A" is replaced by the projection operator "P", this expectation value expresses the probability that the condition connected with "P" will be satisfied once the system is measured.
Again, these rules are universal for all quantum mechanical theories - and they hold for quantum mechanics of a single Hydrogen atom, for Quantum Electrodynamics, for the Standard Model, and even for all of String Theory.
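The last two postulates are easy to check numerically. Here is a minimal sketch in Python, assuming only numpy; the two-level state and the matrices below are arbitrary illustrations, not anything canonical:

```python
import numpy as np

psi = np.array([3/5, 4j/5])                  # a normalized state |psi>

A = np.array([[1, 2 - 1j],
              [2 + 1j, -1]])                 # a Hermitean observable
assert np.allclose(A, A.conj().T)            # A equals its conjugate transpose

expectation = np.vdot(psi, A @ psi)          # <psi|A|psi>
assert abs(expectation.imag) < 1e-12         # Hermiticity makes it real

P = np.array([[1, 0],
              [0, 0]])                       # projector on the first basis state
prob = np.vdot(psi, P @ psi).real            # probability of that outcome: 9/25
```

Note that np.vdot conjugates its first argument, which is exactly the bra in <psi|A|psi>.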

Do we know that they are true? Well, we actually do. First: why do we know that there is a Hilbert space?

If a physical theory has any content, it must be able to manipulate information. We insert some information that we know - and it spits out another piece of information that we didn't know but that is predicted, or postdicted, by the theory.

So there must exist some states; specifying which state was realized in Nature, is realized in Nature, or will be realized in Nature is the way to phrase all the information we have - or want to have - about the world or its pieces. That was true even in classical physics: different states of a physical system were given by points in the phase space (spanned by the positions and their canonical momenta).

The new thing about quantum mechanics is that the complex linear superpositions of two allowed states are also allowed states. How do we know that? Well, we may actually design procedures that create such combined states in practice.

For example, a particle may be found in Boston (B) and in Washington D.C. (W). Is there a way to continuously interpolate between B and W? In classical physics, the only way to interpolate was to think about positions between Boston and Washington D.C., such as New York (N): note that it is in between them alphabetically, too. ;-) We needed to think about a train between Massachusetts and Maryland.

However, in quantum mechanics, there's one more way to interpolate between the states B and W. We may consider states
|psi> = a |B> + b |W>
where a, b are complex numbers. If (a,b) is equal to (1,0) or (0,1), respectively, we are back to the states B and W. Such a state will mean that the particle will "certainly" be found in Boston or in Washington D.C., respectively.

In quantum mechanics, a and b may both be nonzero. You can construct such a state by putting Boston and Washington D.C. behind the slits of a two-slit experiment and by adjusting the path between the source of the particles and the two cities. If you do things right, you may change both the ratio of the absolute values of a and b (by screening or by the choice of the approximate relative distance between the source and the cities) - as well as the relative phase (by more accurate adjustments to the distance).
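The superposition above can be sketched numerically. In this little Python example (numpy assumed; the amplitudes are arbitrary illustrations), both a and b are nonzero, with a relative phase of 90 degrees:

```python
import numpy as np

B = np.array([1, 0])     # "certainly in Boston"
W = np.array([0, 1])     # "certainly in Washington D.C."

a, b = 0.6, 0.8j         # nonzero complex amplitudes
psi = a * B + b * W      # |psi> = a|B> + b|W>

p_boston = abs(psi[0])**2        # probability of finding the particle in Boston
p_washington = abs(psi[1])**2    # ... and in Washington D.C.
```

With these particular amplitudes, the probabilities come out 36 and 64 percent, respectively, and they add up to one because the state is normalized.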

The absolute normalization of the state psi doesn't matter. As long as at least one of the numbers a, b is nonzero, the vector itself is also nonzero and it can be multiplied by a number that makes its norm - its length - equal to one, which is the usual choice.

The right normalization factor still has an undetermined phase. If we multiply both a,b by a complex phase (whose absolute value equals one), we get a physically indistinguishable state. That's why only the relative phase between a,b is physical - only the relative phase has observable consequences.
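A small numerical check of this point, assuming nothing beyond numpy (the phase and the reference state below are arbitrary): the overall phase of (a,b) drops out of every probability, while the relative phase does not.

```python
import numpy as np

psi = np.array([0.6, 0.8j])                      # some state a|B> + b|W>
chi = np.array([1, 1]) / np.sqrt(2)              # a reference state to project on

p1 = abs(np.vdot(chi, psi))**2                   # probability |<chi|psi>|^2

# Multiplying the whole state by a phase changes nothing measurable
psi_overall = np.exp(1j * 1.234) * psi
p2 = abs(np.vdot(chi, psi_overall))**2

# ...but rotating the phase of b alone does change the prediction
psi_relative = np.array([0.6, 0.8j * np.exp(1j * 1.234)])
p3 = abs(np.vdot(chi, psi_relative))**2
```

Here p1 and p2 agree exactly while p3 differs - which is the statement that only the relative phase is physical.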

Aside from this overall normalization of (a,b) by a shared constant, it's clear that any choice of (a,b) leads to a physically inequivalent description. If one repeats the same experiment many times, one may measure both the absolute value of the a/b ratio and its (relative) phase.

All quantum theories obey the postulates of quantum mechanics much like all ducks are wearing dog masks.

How do we know that the amplitudes have to be allowed to be complex? Well, a wave is connected with every particle. The wavelength is linked to the (inverse) momentum. However, cos(kx), which would be a real wave, is not good enough because it doesn't distinguish "k" from "-k": we would know the speed of the particle but not whether it moves to the left or to the right. In a similar way, we wouldn't distinguish a positive and a negative energy, and so on. Clearly, cos(kx) must be completed to exp(ikx), which does distinguish the signs.
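The sign-blindness of the real wave is a one-liner to verify (numpy assumed; the grid and the value of k are arbitrary):

```python
import numpy as np

x = np.linspace(0, 10, 200)
k = 2.0

# The real wave is identical for both signs of the momentum
assert np.allclose(np.cos(k * x), np.cos(-k * x))

# The complex wave distinguishes a particle moving right from one moving left
right_mover = np.exp(1j * k * x)
left_mover = np.exp(-1j * k * x)
assert not np.allclose(right_mover, left_mover)
```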

Now, this information about the particle - given by (a,b) modulo a uniform scaling - is neither incomplete nor redundant. It is not redundant because different values of a/b (which is what is kept constant if you uniformly scale a,b) lead to different predictions of experiments. It is also not incomplete; if the particle contained some additional "hidden variables", one can show that they would typically destroy the interference pattern.

If you wanted some "classical hidden variables" to mimic all the observable consequences that come out of the relative complex phase, you would find out that the measured correlations would always satisfy the so-called Bell's inequalities. The correlations would be way too small, and experiments show that the real world easily exceeds the bounds imposed by Bell's inequalities. Quantum mechanics is able to violate and exceed Bell's inequalities, too. That's not surprising because the correlations predicted by quantum mechanics - and everything else - exactly agree with the observations.
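To make the numbers concrete, here is a sketch of the CHSH form of Bell's inequality: every local hidden-variable theory obeys |S| <= 2, while the standard quantum prediction for a spin singlet, E(alpha,beta) = -cos(alpha-beta), reaches 2*sqrt(2). (Numpy assumed; the angles are the textbook optimal choice.)

```python
import numpy as np

def E(alpha, beta):
    # Quantum correlation of spin measurements on a singlet pair
    return -np.cos(alpha - beta)

# The standard optimal measurement angles (in radians)
a, a2 = 0.0, np.pi / 2
b, b2 = np.pi / 4, 3 * np.pi / 4

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)

assert abs(S) > 2                          # exceeds the classical bound
assert np.isclose(abs(S), 2 * np.sqrt(2))  # the Tsirelson value
```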

The very physical relevance of the complex linear Hilbert space is what dooms all existing attempts to replace the postulates of quantum mechanics by something else. Those who dislike quantum mechanics like to think of themselves as people who are ahead of their time. However, it is very easy to see that they're the people who haven't yet understood 20th century science; the world of physics is at least one century ahead of them. They just want to return to the distant past and "undo" some key discoveries. That's not possible in science.

In the future, the postulates of quantum mechanics may be phrased differently. They may be derived from a different starting point. They may be more tightly connected with the more detailed dynamical information about the Hamiltonian from our second layer. However, the complex numbers will never go away. The same is true about their relationships to the probabilities (via squaring).

In the 1920s, the complex numbers entered fundamental physics and they will never leave it.

You know, until the 1920s, the complex numbers could have been viewed as mathematical curiosities. Some people would even consider them silly bookkeeping devices for two real numbers. However, in the 1920s, the complex numbers became "more real" than the real numbers whenever we compute the relative probabilities of different microstates.

It is not only the relative probability; it is also the relative phase of two microstates that has observable consequences. And it can be non-trivial i.e. non-real. These insights can obviously never go away again. Thinking that the complex amplitudes will disappear from physics is as naive as the idea that astronomy will return to epicycles and Elvis Presley will be found on the Moon.

People who think that physics will undo this key 20th revolution may present themselves as smart chaps and they may find journalists who do the same thing. In reality, however, they're as dumb as a doorknob.

Now, there are other postulates and universal rules of quantum mechanics. For example, the composite systems are described by tensor products of Hilbert spaces. It's not hard to see why: if the dimensions of Hilbert spaces H1, H2 are equal to d1, d2, there are clearly d1 basis vectors of H1 and d2 basis vectors of H2. These basis vectors parameterize some linearly independent (i.e. fully mutually exclusive) possibilities.

The set of linearly independent possibilities for the composite system obviously has to be the Cartesian product of the two sets for the separate subsystems. And the "linear envelope" of this Cartesian product - the new basis - is the tensor product of the original spaces. Its dimension - its number of basis vectors - is equal to d1.d2 as expected. This conclusion is pretty much inevitable, by basic logic.
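The dimension counting is easy to sketch with np.kron (numpy assumed; the small dimensions below are arbitrary):

```python
import numpy as np

d1, d2 = 2, 3
psi1 = np.ones(d1) / np.sqrt(d1)     # a normalized state in H1
psi2 = np.ones(d2) / np.sqrt(d2)     # a normalized state in H2

psi = np.kron(psi1, psi2)            # the product state in H1 (x) H2

assert psi.shape == (d1 * d2,)       # 2 * 3 = 6 basis vectors, as expected
assert np.isclose(np.linalg.norm(psi), 1.0)
```

The six coordinates of psi are labeled by the pairs (i,j), i.e. exactly by the Cartesian product of the two sets of basis choices.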


Now, one may continue to question the universal principles of quantum mechanics. For example, probabilities are given by the squared absolute values of the complex amplitudes, e.g. by |a|^2. Is that necessary? You bet.

If a system is composed of two isolated subsystems, it's clear that the laws of physics have to evolve the two subsystems independently of each other. The only universal way to design rules that evolve a state in the tensor product Hilbert space out of the rules that evolve the states in the smaller factor Hilbert spaces is to take the tensor product of operators.

However, the tensor product of two operators on the smaller Hilbert spaces is only defined for linear operators. It follows that all the operators that encode the evolution of the subsystems (and therefore of any systems) in time have to be linear operators. Now, the total probability has to be conserved, so there has to be a quantity that is conserved by these linear operators. You will find out that the norm of the states in the Hilbert space is the only simple enough quantity that can be conserved by a sufficiently large class of operators - the unitary operators - that may play the role of evolution by a finite amount of time or of other symmetries.
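Here is a sketch of that norm conservation: U = exp(-iHt) built from a Hermitean H is unitary, so it preserves the length of every state. (Numpy only; the Hamiltonian, the time t, and the state are arbitrary illustrations.)

```python
import numpy as np

H = np.array([[1.0, 0.5],
              [0.5, -1.0]])                       # a Hermitean Hamiltonian
t = 0.7

# Build exp(-iHt) via the eigendecomposition of H
evals, evecs = np.linalg.eigh(H)
U = evecs @ np.diag(np.exp(-1j * evals * t)) @ evecs.conj().T

assert np.allclose(U.conj().T @ U, np.eye(2))     # U is unitary

psi = np.array([0.6, 0.8j])
assert np.isclose(np.linalg.norm(U @ psi), np.linalg.norm(psi))  # norm conserved
```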

Experiments unequivocally confirm that the norm of the vectors - and the squared absolute values of the individual complex coordinates - are what determines the probabilities. Can you imagine that this "squared" relationship is deformed a little bit? You may imagine it but by pure thought, you may show that such a deformation is physically inconsistent.

Imagine that the probability that you measure "Boston" is not given by |a|^2 computed out of the relevant complex amplitude a; instead it is equal to something else, e.g. |a|^2 + |a|^4, which you may claim is compatible with experiments because the experiments have only measured the system for small values of a (it's not really true, but imagine that it may be).

However, it's clear that the operators evolving the system in time won't conserve
|a|^2 + |a|^4 + |b|^2 + |b|^4
which is the new proposed total probability of all alternatives. You will be losing the total probability. That's too bad because something has to happen - the total probability of all mutually exclusive outcomes has to be equal to 100 percent.
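The leak is easy to exhibit numerically. In this sketch (numpy assumed; the rotation angle and the state are arbitrary), a unitary rotation conserves |a|^2 + |b|^2 exactly but fails to conserve the deformed sum:

```python
import numpy as np

theta = 0.3
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # a simple unitary rotation

def deformed_total(state):
    # The hypothetical "total probability" |a|^2 + |a|^4 + |b|^2 + |b|^4
    return sum(abs(c)**2 + abs(c)**4 for c in state)

psi = np.array([0.6, 0.8])
psi_later = U @ psi

# The quantum rule survives the evolution...
assert np.isclose(np.linalg.norm(psi_later), np.linalg.norm(psi))
# ...but the deformed "total probability" does not
assert not np.isclose(deformed_total(psi), deformed_total(psi_later))
```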

You may imagine that the "better" rules simply scale a,b by the right number so that the sum above is equal to 100 percent, after all. But such a rescaling will depend on all the amplitudes (or many other amplitudes). And you may see that if such rules are applied to the composite system, the predictions for one subsystem will inevitably be affected by the amplitudes for the other subsystem.

In other words, the non-quadratic formula for the probability will not respect the tensor structure of the Hilbert space. It means that you will lose locality; it becomes impossible to have two systems whose interaction goes to zero. Even two particles that are separated by billions of light years will influence one another by terms of order 100 percent. That's too bad because locality, and even approximate locality, becomes impossible.

In fact, such distorted rules of quantum mechanics would violate not only locality; they would also violate basic logic or causality. The rescaling of the probabilities would even be affected by amplitudes for outcomes that are already known not to have occurred. The more you deform the proper rules, the bigger the inconsistency becomes - but some inconsistency is always there whenever you deform the rules at least a little bit. So the predictions for your new measurement would depend on the ordering in which you "reduce the wave function". This fact would mean that it matters when the reduction takes place; it's a real process.

Consequently, the reduction needed to predict the right entangled outcomes would be a real process that takes place faster than light and violates relativity. If you look at such superluminal processes from another reference frame, they're equivalent to travel to the past, which leads to the usual grandfather paradox.

In proper quantum mechanics, it's essential that the "reduction" of the wave function is not a real physical process and the wave function is not a real classical wave. It is just a tool that stores the information about the probabilities - something that we know about the system. Our knowledge about the whole system - including its distant portions - may change instantly (faster than light) and this "subjective change of our knowledge" doesn't violate any rules of relativity. It doesn't even matter when we "refresh" our knowledge about the expected future outcomes.

As you can see, relativity and the probabilistic nature of quantum mechanics nicely support each other. If you try to attack one of them, the other principle jumps to defend its friend and you will be beaten into small bloody pieces of trash. Don't mess with the basic principles of relativity or the basic postulates of quantum mechanics.

In similar ways, you could try to claim that various operators are not quite Hermitean. An unsophisticated version of this proposal would lead you to imagine that an operator associated with a real quantity - such as position - has non-real eigenvalues. That's too bad because by the other assumptions, only real values may be measured. The apparatus that measures the quantity can only show real answers - which is why the corresponding operator has to have real eigenvalues.

However, you may be a bit smarter and think about non-Hermitean operators with real eigenvalues. You know, an operator can have real eigenvalues but the basis of the eigenvectors may fail to be orthogonal. You may obtain such operators by the conjugation of Hermitean operators by generic non-unitary matrices.

Nevertheless, even this "smarter" proposal is easily seen to be inconsistent. If the eigenvectors linked to different eigenvalues of the operator fail to be orthogonal to each other, it means that the two possible outcomes will fail to be mutually exclusive alternatives. It will no longer be true that
Prob (A or B) = Prob (A) + Prob (B).
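This failure of the "smarter" proposal can be sketched in a few lines (numpy assumed; the matrices are arbitrary illustrations): conjugating a Hermitean operator by a generic non-unitary matrix keeps the eigenvalues real but makes the eigenvectors non-orthogonal.

```python
import numpy as np

H = np.diag([1.0, -1.0])               # Hermitean, eigenvalues +1 and -1
S = np.array([[1.0, 0.7],
              [0.0, 1.0]])             # a generic non-unitary matrix
A = S @ H @ np.linalg.inv(S)           # non-Hermitean, same real eigenvalues

evals, evecs = np.linalg.eig(A)
assert np.allclose(np.sort(evals.real), [-1.0, 1.0])
assert np.allclose(evals.imag, 0)      # the eigenvalues stayed real

v1, v2 = evecs[:, 0], evecs[:, 1]
overlap = np.vdot(v1, v2)              # nonzero: the outcomes aren't exclusive
assert abs(overlap) > 0.1
```

The nonzero overlap is exactly the statement that the two measurement outcomes fail to be mutually exclusive alternatives.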
You may keep on trying, but any attempt to abandon - or even slightly deform - the general postulates of quantum mechanics will quickly kill the self-consistency of your picture or its consistency with the basic logical structure of any hypothetical world that resembles the real one. And it will immediately mark you as just another crank.

So the recommendation has to be repeated once again: don't mess with the basic postulates of quantum mechanics.

And that's the memo.

1. Hi Lubos-

As a non-expert, something has always puzzled me about the relationship between relativity and quantum mechanics. At least early on, the two areas developed independently of each other, and their underlying postulates did not seem to have much in common, and in particular were expressed in terms of very different looking mathematics. Historically, major progress eventually came about when the correct way to combine the two theories was discovered.

So we now know that the two theories are not independent of each other, and did not get married and have a baby called quantum field theory. It is really QFT that is more fundamental, that was always there first, and the two separate theories are just manifestations of the larger one. Which brings us back to how different their underlying postulates are. Are there corresponding "fundamental postulates" of QFT from which the fundamental postulates of SR and QM can be derived? It just seems like things should go in that order since QFT is more, well, fundamental. Ralph

2. Dear Ralph,

as you correctly say, the postulates of QM and relativity are independent of one another, and they continue to be independent today.

There still exist non-relativistic theories that are quantum, and classical or non-quantum theories that are relativistic.

As you correctly say, QFT - or string theory - is the general framework for respecting both sets of postulates, QM as well as relativity.

When I referred to the postulates supporting each other, I was not talking about the space of all conceivable theories but only about theories that can describe the real world - which is both relativistic and quantum.

What I mean is that certain experiments (like the entangled particles experiments) that might naively be thought to be testing just one set of these postulates are actually testing both of them simultaneously.

Assuming some known experimental facts - like the elevated, trans-Bell-inequality correlations seen in the entanglement experiments - one can show that the quantum postulates and the relativistic postulates are not independent. There can't be any non-quantum but relativistic (or even merely causal) theory that would agree with the experiments.

But if you ignore the experiments in this particular Universe, of course that all combinations are possible and the postulates of QM and relativity are independent.

Best wishes
Lubos