Friday, October 10, 2014

The very meaning of "probability" violates the time-reversal symmetry

An exchange with a reader reminded me that I wanted to dedicate a special blog post to one trivial point which is summarized by the title. This trivial issue is apparently completely misunderstood by many laymen as well as some low-quality scientists such as Sean Carroll.

This misunderstanding prevents them from understanding both quantum mechanics and classical statistical physics, especially its explanation for the second law of thermodynamics (or the arrow of time).

Time goes up (up=future, down=past). The right diagram.

What is the issue? For the sake of completeness, let's talk about the spreading of the wave function $\psi(x,t)$ describing the position of a particle. In the diagram above, time starts at the bottom and goes up. You see that there are three stages of "spreading". The wave packet spreads between $t=0$ and $t=1$, then it abruptly shrinks because the particle is observed; it spreads again from $t=1$ to $t=2$, shrinks at $t=2$, and spreads between $t=2$ and $t=3$. The diagram is qualitative and could be applied to the probability distributions for any observable in classical or quantum physics, OK?
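The shape of the "green tree" can be sketched in a few lines of code. This is a toy illustration only: the width formula $\sigma(t)=\sigma_0\sqrt{1+(t/\tau)^2}$ is the standard result for a freely spreading Gaussian packet (in units where the time constant $\tau$ absorbs $\hbar/2m\sigma_0^2$), and the numbers $\sigma_0 = 0.1$, $\tau = 0.3$ are invented for the picture, not taken from the post.

```python
import math

def packet_width(sigma0, t, tau=0.3):
    """Width of a freely spreading Gaussian packet: sigma(t) = sigma0*sqrt(1+(t/tau)^2)."""
    return sigma0 * math.sqrt(1.0 + (t / tau) ** 2)

# Three stages of the "tree": spread from t=0 to t=1, collapse, spread, collapse, spread.
sigma0 = 0.1            # narrow width right after each measurement (collapse)
widths = []
for stage in range(3):                 # the intervals (0,1), (1,2), (2,3)
    for step in range(11):             # sample each unit interval at 11 points
        t = step / 10.0                # time elapsed since the last measurement
        widths.append(packet_width(sigma0, t))
    # at the end of each stage, the measurement abruptly resets the width to sigma0

print(widths[0], widths[10], widths[11])
```

Printing the list shows the sawtooth of the correct diagram: the width grows smoothly within each stage and drops discontinuously at each measurement, never the other way around.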

You see that the diagram above is self-evidently asymmetric with respect to the upside-down flip. The flipped version looks like a tree

and I will refer to it as "the wrong diagram".

What is going on here? Between $t=0$ and $t=1$, and similarly in the other two stages of the correct tree diagram, the probability distribution or the wave function evolve according to some equations that have a mathematical property: they are invariant under the time-reversal symmetry. The wave function $\psi^*(x,t)$ with the extra complex conjugation evolves in the same way (i.e. obeying the same Schrödinger equation) as $t$ goes up as $\psi(x,t)$ evolves if $t$ goes down.
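The time-reversal symmetry of the unitary segments can be checked numerically. A minimal sketch, assuming a two-level system with the real Hamiltonian $H=\sigma_x$ (so that $U(t)=\cos t\cdot 1 - i\sin t\cdot\sigma_x$ exactly, with $\hbar=1$); the initial state and the time are arbitrary choices:

```python
import math

def evolve(psi, t):
    """Apply U(t) = exp(-i*H*t) for H = sigma_x: U = cos(t)*I - i*sin(t)*sigma_x."""
    a, b = psi
    c, s = math.cos(t), math.sin(t)
    return (c * a - 1j * s * b, -1j * s * a + c * b)

psi0 = (0.6, 0.8j)                  # an arbitrary normalized two-level state
t = 1.234
psi_t = evolve(psi0, t)

# Time reversal: take the complex conjugate of the evolved state, evolve it
# *forward* by the same t, and (because H is real) we recover the conjugate
# of the initial state, conj(psi0) = (0.6, -0.8j).
back = evolve(tuple(z.conjugate() for z in psi_t), t)
print(back)
```

This is exactly the statement in the text: $\psi^*$ run forward in time retraces the motion that $\psi$ performs when $t$ runs backward.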

Similar comments apply to the evolution of the phase space probability distribution in classical statistical physics. The equation is known as the Liouville equation and its fundamental form is symmetric under the time-reversal symmetry, too. These three "continuous segments" of the "green tree" diagram are something that the confused people don't have a problem with.

What they have a problem with are the "discontinuous jumps" at $t=1$ and $t=2$ – and lots of their counterparts in the real world. Needless to say, they're the moments of the "measurement". Look at the measurement at $t=1$, for example: this moment is the horizontal line of the first picture that divides the tree to 1 triangle below the line and 2 triangles above the line. Why did the distribution shrink at that moment? Why didn't it "expand" instead? To answer this simple question, let's first describe the situation near the moment $t=1$:

At time $t=1-\epsilon$, i.e. before the measurement, the wave packet was spread. The location of the particle (or any property of a classical or quantum system) was ill-defined or fuzzy or uncertain or partly unknown.

At time $t=1+\epsilon$, i.e. after the measurement, the wave packet was concentrated. The location of the particle (or any property of a classical or quantum system) became well-defined or sharp or certain or well-known.
I originally wrote the second sentence using the clipboard (via copy-and-paste) but then I had to edit it because the adjectives are different. In fact, they are completely opposite. Note that if you interchange the moments $1\pm \epsilon$ with one another, you simply obtain propositions that are wrong. One may "suddenly learn" some information but one may never "unlearn it" abruptly after an infinitesimal period of time.

Of course, you may "flip" these definitions – but then you will get an equivalent description of physics in which $-t$ is used instead of $+t$, and/or in which the word "past" means the "future" and vice versa. There is no reason to add this extra confusion; you won't gain anything by this proposed chaos in the terminology. The "past" and the "future" are totally and qualitatively different whenever something about learning or observing or probabilities is involved (and it always is).

The probability distribution at the moment $t=1-\epsilon$ i.e. before the measurement – whether the probability distribution is calculated from a wave function in quantum mechanics, or it is a fundamental object in classical statistical physics (or its informal counterparts: my statements really apply to any form of probability discussed by anyone and anywhere) – determines at what locations $x$ the wave packet is more likely to be concentrated at $t=1+\epsilon$, i.e. right after the measurement. Yes, I have used the word "likely" again so the "definition" is circular. It's inevitable because one can't really define the Bayesian probability in terms of anything more fundamental. There is nothing more fundamental than that.
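The asymmetric role of the distribution at $t=1\pm\epsilon$ can be written as a two-line simulation. The numbers in the prior are made up; the point is only that the broad pre-measurement distribution tells us how likely each sharp post-measurement outcome is, and the post-measurement distribution is concentrated:

```python
import random
random.seed(0)

prior = {1: 0.1, 2: 0.2, 3: 0.4, 4: 0.3}   # broad distribution at t = 1 - eps

# The measurement at t = 1: one outcome is realized with the prior probabilities...
outcome = random.choices(list(prior), weights=prior.values())[0]

# ...and at t = 1 + eps the distribution has shrunk to a sharp one.
posterior = {x: (1.0 if x == outcome else 0.0) for x in prior}

print(outcome, posterior)
```

Note that the code cannot be run "backwards": nothing in it could take the sharp posterior and reconstruct the broad prior, which is the whole asymmetry.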

But what's important to notice is that the meaning of the probability always refers to the situation
a property is unknown/blurred at $t=1-\epsilon$
it is well-known/sharp at $t=1+\epsilon$
The two signs simply cannot be interchanged. The very meaning i.e. the right interpretation of the wave function or the phase space probability distribution is in terms of probabilities so the time-reversal-breaking quote above is inevitably understood in every and any discussion about probability distributions and wave functions.

Their very meaning – their defining property – is to tell us something about the final state of the measurement at $t=1+\epsilon$, out of some incomplete knowledge at $t=1-\epsilon$. Again, to stress the point, their very meaning is to tell us something about time-reversal-asymmetric abrupt events. If there were no time-reversal-asymmetric abrupt changes of our knowledge i.e. if there were no learning and no measurements and no observations, there could be no probabilities! In that case, there would be no probability distributions and there would be no wave functions because the very meaning of all these things is to tell us what to expect at $t=1+\epsilon$.

There is no contradiction connected with the existence of the "event of learning" or "measurement" at $t=1$. Obviously, we sometimes have to learn some information about something, otherwise we couldn't talk about anything and there could be no science – or ordinary life, for that matter. If the process of learning has some internal structure, if we are measuring something with an apparatus that works for complicated reasons, there is something to discuss.

But if we only want to talk about the general claim that "there are measurements" i.e. events in which we suddenly learn some sharp or sharper information about something, there is really nothing to talk about. It's as elementary and irreducible a fact about human thought as you can get. People learn. Ergo there are these "shrinking discontinuities" in the probability distributions for everything and anything in the world. The "past side" ("before" side) of these measurements always has a more blurry distribution than the "future side" (or "after" side). Whoever writes whole chapters or books or book series about the very existence of "observations" or "learning of the information" is guaranteed to have written meaningless, pompous, vacuous philosophical flapdoodle only.

This behavior of the probabilities around the measurement – where the probability tells us what to expect "after" the measurement – is the source of what I call the logical arrow of time. Stephen Hawking and others use the word "psychological arrow of time" and it's clearly the same thing. Hawking uses the term "psychological" for a good reason – learning about something by seeing it is a "psychological process".

The reason why I prefer to avoid this "psycho-" prefix is that it leads people to think that an analysis of how brains work and whether they have consciousness and what consciousness means is an obligatory ingredient in a complete analysis of the logical arrow of time. It's not and that's why the term "logical arrow of time" is more appropriate. What we really need is just the fact that some information (about an observable, a property of the external world) or the truth value of a proposition is unknown at $t=1-\epsilon$ but it is known at $t=1+\epsilon$. I don't need to assume anything whatsoever about the "agent" for whom it is known, his or her structure, the mechanisms inside the brain, and so on. I don't need to assume that there is an "agent" that also has some other capabilities aside from knowing or not knowing whether a proposition about Nature is true. The logical arrow of time is about the (logical) truth values of propositions that abruptly change at $t=1$, and the probabilities tell us what the final product of the change (which "after" state) is reasonable to be expected!

This logical arrow of time is a simple, elementary, and irreducible part of our existence within Nature. But it has consequences. If you think about the comments above and recognize that all these things are as clear as you can get, you should also understand that there is no "measurement problem" in quantum mechanics – the existence of "a measurement" is tautologically an inseparable part of any statement about the corresponding "probabilities".

And you will understand that there is no problem with the thermodynamic arrow of time, either. The proof of Boltzmann's H-theorem or its variations and generalizations are proofs that the thermodynamic arrow of time (showing the direction of increasing entropy) is inevitably correlated with the logical arrow of time. But the logical arrow of time always exists in any logical framework that talks about probabilities because probabilities are always relevant before a moment when a property is "learned" or "decided" so they are linked to time and the relationship treats the past and the future absolutely asymmetrically!
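The correlation between the two arrows can be seen in the simplest toy model of the H-theorem: a probability distribution on a ring of cells, smoothed by a doubly stochastic diffusion step (the kernel weights below are illustrative). Because the map is doubly stochastic, the coarse-grained Shannon entropy cannot decrease, so the entropy grows in the same time direction in which the probabilities are being used and updated:

```python
import math

def step(p):
    """One diffusion step on a ring: each cell keeps 1/2 and sends 1/4 to each
    neighbour. The map is doubly stochastic, so Shannon entropy never decreases."""
    n = len(p)
    return [0.5 * p[i] + 0.25 * p[(i - 1) % n] + 0.25 * p[(i + 1) % n]
            for i in range(n)]

def entropy(p):
    return -sum(x * math.log(x) for x in p if x > 0)

p = [1.0] + [0.0] * 15          # low-entropy initial state: all probability in one cell
entropies = [entropy(p)]
for _ in range(20):
    p = step(p)
    entropies.append(entropy(p))

print(entropies[0], entropies[-1])
```

The monotonic growth of the list is a cartoon of the second law; reversing the loop has no meaning because the "earlier" distributions cannot be recovered from the smoothed ones.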

I don't really believe too much that this clear-as-sky explanation of this issue will make someone new scream "Heureka" because these people love to be confused idiots and they are proud of it. It is probably a part of their self-confidence to think that there is something seriously wrong with statistical physics or quantum mechanics or thermodynamics (and maybe with mathematics, too) and it would hurt their ego if they had to learn that something has been wrong with the (time-reversal-asymmetric) semi-infinite part of their world lines i.e. with their lives up to this moment when they had a nonzero probability to understand what "probability" means and why none of these would-be problems exist. But the probability was too low so it's not surprising that most of them have remained confused morons instead.

And that's the memo.

snail feedback (94):

Searching for a concrete model of this, I suggest to myself that mere random noise in systems would result in practical information loss that is by its very nature one-way in time: unknown information about earlier conditions has no mechanism to suddenly reappear if I try to turn back time.

Thank you, yet again, for making sense of physics.

I sometimes wonder (you being all the way out there in the Czech wilderness, on the edge of Conan's steppe) if your position is not somewhat like that of Davy Crockett at the Alamo, surrounded by the likes of Carroll thirsting for the kill.

LOL, thanks, and I am going to relearn what I know about the Texan revolution now. ;-)

I agree, if I understand. ;-) The basic step of the shrinking (throwing dice and learning the result) is irreversible in the normal real world. So despite the T-symmetry of the evolution for the amplitudes, the overall rules still know that the future depends on the past and not vice versa.

Isn't this somewhat like checking if there's a deviation from expectations using all the data instead of looking separately at individual channels?

This seems like a pretty smart thing to do. They mention one double check of their method, "It is interesting to note that the distributions observed here are very different from those observed in our analysis of the 7 TeV data" that could indicate they have something real.

And I thought it was going without saying!... Either I was being too simplistic or I am just one clever duda.

Yup, it could be very smart, and far-reaching, unless it is wrong.

When I was casting my votes in the local and senate elections just now, I was thinking about what could be wrong with it.

If they neglect that the number of events "N" in a bin isn't a continuous real number, but a positive integer, then they may get completely distorted predictions for channels where N is really really low, like 0,1,2,3 etc.

But if their result isn't an artifact of such things, it could be cool and there is some sense in which they accumulate the signal of "new physics Yes/No" from all the places.
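The low-N worry above is easy to make concrete. A sketch, with an invented bin of expected count $\lambda = 2$: the discrete Poisson probabilities differ visibly from the continuous Gaussian approximation at small counts (and the Gaussian even assigns density to impossible negative counts):

```python
import math

def poisson_pmf(k, lam):
    """Exact probability of observing k events when lam are expected."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

def gauss_density(x, mean, var):
    """Continuous (Gaussian) approximation sometimes used for bin counts."""
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

lam = 2.0   # hypothetical expected number of events in a low-statistics bin
for k in range(4):
    print(k, round(poisson_pmf(k, lam), 4), round(gauss_density(k, lam, lam), 4))
```

For $N=0$ the exact Poisson value $e^{-2}\approx 0.135$ is about 30% larger than the Gaussian estimate, which is exactly the kind of distortion that matters when a channel has 0, 1, 2, or 3 events.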

Yes - totally agree with all that - rather trivial as you say.
But isn't it rather missing the point as far as QM is concerned? All of this is fine and dandy **given** a measurement. The problem isn't this interpretation of the change of probability upon measurement - and it is clearly asymmetrical - the issue is the following.
System S, System M - M is our measuring device. They interact. QM applies to everything including the measurement device. S and M interact - QM says this interaction gives a unitary evolution for the state of the S+M system agreed?
The measurement occurs and S is now described by an eigenstate of the observable. This is a **non-unitary** evolution as I'm sure you would agree. How has this occurred if all interactions within QM are described as unitary processes? Is 'measurement' a different kind of interaction obeying different dynamical rules?
One FAPP solution is the decoherence approach in which we derive a master equation for the reduced density operator of S - by a suitable coarse graining procedure. This then gives 'irreversibility' because we've smoothed out over things we don't know (the degrees of freedom of the measuring device + environment).
If we suppose that S starts in a pure state then the entropy of S+M increases after measurement - assuming evolution to a mixture of eigenstates as required - this increase in entropy comes about because we effectively lose information (it's 'stored' in the environment upon which we do some kind of averaging procedure).
Sure, we can just assume a measurement (and yes if we're talking at the level of probabilities only, a measurement is implicit anyway) - and yes we have less uncertainty after the measurement so we get the inverted tree diagram for the probabilities as above - but the question is whether the formalism of QM can describe this process, or whether measurement is an axiom we have to apply (the projection postulate).

You *are* one clever dude! I think that many folks like you have lots of common sense (more than myself) that's been trained in the real world, so if they get the concept of probability or something, it's really immune against some basic confusions.

Everything is fine given a measurement.

If a measurement isn't given, then we can't talk about it, can we? ;-)

The measurement occurs and S is now described by an eigenstate of the observable. This is a **non-unitary** evolution as I'm sure you would agree.

You shouldn't be sure about such things. I disagree because the jump from the spread wave function for S+M to a localized one isn't *evolution at all*. It's just an interpretation of the wave function. All systems in Nature *always* evolve unitarily and this "collapse" is just a psychological process. One learns the value of an observable describing S, so one may work with a simplified wave function where all other components of the wave function are set to zero. But one doesn't have to. One may still work with the full wave function for S+M that hasn't collapsed in any way and talk about the conditional and correlated probabilities of all properties of S and M.

Presumably they don't have enough information to feel very confident of their results. I hope some people from Atlas/CMS take the time to look at this.

Could this logical arrow of time be identified with the non-commutation of the projection operators in the quantum mechanics of closed systems? I mean, is quantum mechanics endowed with the logical arrow of time by this non-commutation?

Dear Lucas, good try but I don't think so. The nonzero commutators don't have any preferences for one direction of time or another. Moreover, these comments about the logical arrow of time apply to probabilities even in classical statistical physics - any contexts where probabilities appear - where the commutators are zero.

It's also plausible that they delay the publication of papers that find too few events... ;-)

Philosophically it goes back to the hidden variable hypothesis, where no information really is lost, except to us macroscopic scientists, and that all this statistics is merely measurement theory imposed upon actual reality. Your promotion of mathematics as the basis for the uncertainty principle helps me take randomness seriously, but I don't understand the basis for that randomness, and if that randomness really is dictated by the uncertainty principle itself. Lacking a math background, I can only ask vaguely.

Dear Lubos, I've caught on now. When we build a history, we put the time evolution operator between these projections. So the non-commutation of these projections only tells us that some histories can't have the usual probabilities.

So, where is the logical arrow of time in the quantum mechanics of closed systems? Is the logical arrow of time manifest only in the decoherence mechanism? Is the logical arrow of time manifest only in the quasiclassical realm?

Note: By "manifest", I don't mean an explanation of the logical arrow of time, I only mean a symptom of this arrow (like the second law).

Thanks to you too Lubos.

Right, Lucas! The nonzero commutators of the projection operators are the reason why it's nontrivial to make the set of histories "consistent".

Decoherence is a complex process - a derivation of the evolution of the density matrix on paper, in certain circumstances. Note that the text about probabilities above didn't mention the word "decoherence" once because the basic arrow of time of the probability doesn't depend on decoherence. It doesn't depend on anything quantum in fact, as I already said. So I don't understand why you keep on promoting the word "decoherence" here that is much less fundamental than the truly primordial source of the arrow of time.

Also, the arrow of time has nothing to do with the question whether a system is open or closed. Both open and closed systems require probabilities to be described and every time a probability is interpreted - when we measure something, new time-reversal asymmetry or irreversibility is introduced.

So I guess that the answer to all your questions - and probably 60 other similar questions you are going to ask - is No, No, No, No. The logical arrow of time appears in absolutely all systems and all theories and all circumstances and every attempt of yours to suggest that it depends on some very special effects or circumstances is just completely wrong, wrong, wrong, wrong.

Isn't it enough to ask this question once instead of 64 times?

"One may still work with the full wave function for S+M that hasn't collapsed in any way and talk about the conditional and correlated probabilities of all properties of S and M."
Agreed. But now I would suggest you're skirting perilously close to MWI. In order to be consistent with unitary evolution you'd then have to have a combined entangled wavefunction including the possible outcomes - but somehow only one of those possibilities is actually realized when the measurement is done - according to the probability rule. We only experience one of these outcomes - so a full entangled description including all possible outcomes (unitary evolution) doesn't really accord with this experience (we don't seem to consciously perceive superpositions - not, of course, that this has anything to do with consciousness!)
Keep everything unitary, by all means, but I can't see a way of incorporating new knowledge (the measurement result) in this approach other than non-unitarily 'by hand' - to calculate relevant probabilities for subsequent measurements we'd have to work along those branches of the entangled wavefunction selected by our previous actual measurement results.
We use a new wavefunction to describe the post-measurement state based on our new knowledge (we assign the alternative 'null result' branches a zero probability after measurement)
So when we incorporate new knowledge (an actual result) rather than just calculate a priori probabilities the non-unitarity is implicit. There's no unitary process that allows us to acquire this actual knowledge in the first place.

(You can just change your answer if you want, without my submitting the question again ;))

Forgive me if I'm slow to understand. Are probabilities in quantum mechanics interpreted in this way due to the decoherence mechanism and the construction of the quasiclassical realm?

I saw your tweet about the time asymmetry in the definition of probability. I'm very okay with that. So this question and the question about the symptom of the logical arrow of time in quantum mechanics are strongly correlated.

If the answer is yes, then a universe made of a few degrees of freedom wouldn't present the logical arrow of time. Of course, in such a universe we wouldn't have anything to do or see, because no emergent reality (quasiclassical realm) would exist.

Sean and his acolytes think that you misunderstand entropy and the arrow of time, and they say so sarcastically and dismissively; ultimately they refer to some famous pedagogue who taught you stat mech at Rutgers who they say would disagree with you. It makes no sense to me – your explanation is clear; theirs is muddy hand-waving (to mix metaphors). At this point in time, I am not current, but I have taken thermodynamics and statistical mechanics and think, like you, that Boltzmann is basically all that is needed, along with some understanding of QM, to understand the arrow of time. Sean's book was OK for the first few chapters or so, then it launched into lala land.

I need time to think; these things are very complicated for me. But thanks for your time and patience.

It's complete bullšit. I have always been in complete agreement with my StatMech and Thermodynamic instructors, undergraduate or graduate ones, and I've been arguably the best student in every single class.

Is the lie written somewhere so that I could sue the jerk for libel?

"But now I would suggest you're skirting perilously close to MWI."

I don't know what to do with claims (or accusations?) like that. How do you measure the "distance from MWI"? Not only is the distance ill-defined; MWI itself is ill-defined. So what can any sentence about the "distance from MWI" possibly mean? If some sentences sold as "MWI" are closer to correct physics – i.e. to what I say – than other sentences, it's clearly a coincidence.

There is no room for "many worlds" in quantum physics.

We only experience one of these outcomes

We only experience one outcome because that's what physics predicts. When an electron is in the state "0.6 times up plus 0.8 times down", it implies that if we learn the value of the spin (in other words, if we measure it), it will be either up or down. The first option has 36% likelihood, the second one has 64% likelihood. This is the only right interpretation of the implications of the wave function for the question "what spin we will experience and how many options we will experience".
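This prediction is directly checkable as a frequency statement. A sketch simulating many repetitions of the spin measurement on the state $0.6\,|{\uparrow}\rangle + 0.8\,|{\downarrow}\rangle$ (the trial count and seed are arbitrary):

```python
import random
random.seed(1)

amp_up, amp_down = 0.6, 0.8          # the state 0.6|up> + 0.8|down>
p_up = amp_up ** 2                   # Born rule: |0.6|^2 = 36% for "up"

trials = 100_000
ups = sum(random.random() < p_up for _ in range(trials))
print(ups / trials)                  # close to 0.36
```

Each run yields exactly one outcome, up or down; only the long-run frequencies reproduce the 36% and 64%, which is all the wave function ever promised.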

The logical arrow of time is this:
A => B ?

Hi Lubos, thanks again for your efforts on this topic. Sean makes a lot of effort to portray himself as a deep thinker ... but he is NOT.

Why do we have to use the term "measurement problem" instead of just calling it the "interaction problem"? This would subtract humans with laboratories and machines from the discussion. And then once that step is complete just get rid of the word problem ... interactions in QM are not a "problem".

Also why not put a little bit of slope on the horizontal branch bottoms on your diagram to show schematically the (very tiny) de-coherence time? Then people can stop puzzling about *apparent* non-unitary behaviour.

Cheers
Jan

Dear Lubos, The word knowledge is something that has always confused me about the uncertainty principle. You wrote in one of the comments,

"Nonzero commutators - quantum mechanics' uncertainty principle - means that the maximum knowledge must be smaller than the knowledge of all observables, so something has to be probabilistic. You can't know X and P at the same time."

What bothers me is that the uncertainty principle is also used to explain why the electron does not spiral into the proton (and why the vacuum state is full of virtual particles too, I think?). Clearly these facts are not about knowledge. They are true irrespective of anything we know. It's more like X and P cannot even simultaneously exist within some limit.

Is this right? thanks,

That 8:24 snapshot looks superficially similar, but the rough appearance in the Russian reproduction case only appears on the exit side, whereas the incoming shrapnel signature is all over the real crash.

Hi Lubos, It looks like you didn't read my 'And then once that step is complete just get rid of the word problem ... interactions in QM are not a "problem"'

My point is that there are processes like your tree taking place far away from earth where particles collide and there is no measuring going on -- just interactions and de-coherence.

"The effect of the measurement on the wave function isn't given by any particular map on the Hilbert space"

That is true for the Hilbert space of Psi (the "system" that the tree is describing), but I don't see how this is true for the much larger Hilbert space of "system"+environment. That larger Hilbert space is still undergoing unitary evolution which causes the apparent change in probabilities of the Christmas tree "system".

You implicitly think that the wave function is something real. This is wrong. The wavefunction is only a tool to calculate probabilities of properties. Also, there is no objective description of the system, which is the reason behind your problems with the realization of a particular property (during measurement). I suggest you read the book "Consistent Quantum Theory" by Robert Griffiths. It is free.

Dear Jan, the laws of quantum physics hold for a "system" just like they hold for "system+environment". There is really no physical difference between the two. The separation of some degrees of freedom to the "system" and "environment" is really just a matter of conventions. So it's clearly wrong to suggest that the general laws of quantum mechanics apply in one case and not another.

Dear Lubos, yes, you are right (what else is new :)) and that is where the FAPP (For All Practical Purposes) concept makes its entry. If the "system" is interacting then we cannot apply the laws of QM to just the system, and strictly speaking we cannot talk about probabilities of observables for just the system. But FAPP you can describe the system as isolated with its own unitary evolution between t=1 and t=2, and FAPP you can ask about probabilities of the system. Your errors will be exponentially small until you hit t=1 or t=2.

Luke, the uncertainty principle has many implications. Everything that was impossible in classical physics but becomes possible in quantum mechanics (or vice versa) is due to the uncertainty principle. You surely don't believe that if a principle has more than 1 implication, it is a contradiction, do you?

Concerning the knowledge, I am just saying that in classical physics, because it is defined as physics where all observables commute with each other, it is possible to "simultaneously diagonalize them" - more ordinarily, to find the values of all of them. The maximum knowledge has certain, error-free values of everything that can be known, and this set of numbers typically evolves according to deterministic laws.

In quantum mechanics, such a complete knowledge of the state of the system is impossible due to the uncertainty principle. If X is known, then P is perfectly unknown, and so on. Or some compromise with uncertainties in both. And similarly for lots of other sets of observables. This is a *consequence* of the nonzero commutators.

You are perfectly right that X and P cannot simultaneously *exist*. It is a stronger statement of the uncertainty principle than the one I gave, and it's true, too. However, what "really exists" is a potential minefield and one may still talk about what is "possible to know" too, right?

Yes John, it is a mathematical object that allows us to calculate probabilities - it is also a mathematical object that evolves unitarily (however we wish to 'interpret' it). When we make a measurement there's a real physical change - a bit is recorded - what's the unitary connection between the states of the world before and after the recording of this bit?

Dear Jan, thanks for your comment which I don't understand. Which exponentially small errors are you talking about? What is the exponent and why aren't the errors zero? Why do you think that there will be errors at t=1 or t=2? We're just not on the same wavelength at all.

In physics, we are calculating what we see when we measure something. Science isn't obliged to predict anything that can't be measured, and indeed, quantum mechanics uses this principle maximally because it labels all such questions meaningless.

The picture with the tree was sort of meant to suggest that we only make measurements at t=1 and t=2 etc. so all the state vector at fractional times is just a speculative auxiliary object and it makes no physical sense to talk about "errors" of the wave function at those times because nothing is measured at those times. Physically, errors may only refer to the difference between what we measure when we measure - e.g. at t=1 or t=2 - from the predictions. In a single repetition of the measurement, there will be unavoidable uncertainties from the uncertainty principle etc. If one repeats the same experiment many times to calculate the observed probability distributions, they will agree with the predictions of a correct quantum mechanical theory exactly.

To (almost) exactly measure these distributions etc., one needs a classical device - a device for which the basic laws of classical physics are an excellent approximation. So you may say that it is a gadget that perfectly decoheres, and/or with lots of environment, and so on. The founding fathers of QM chose not to decompose this condition to some smaller criteria. They emphasized what really matters and what really matters is that the object used to measure these distributions in (repeated) experiments has to be a classical object. The more classical it is, the more accurately one may measure the distributions etc.

Simon, it's like talking to a wall. Because the wave function is *not* a real object, as John tried to remind you, the change of the wave function is clearly *not* a real physical process, either.

The logical arrow of time is the principle that all general rules of physics of the form

$A(t_1)$ implies $B(t_2)$

– where the propositions at the given times are either guaranteed to hold or have a calculable probability – obey "$t_1$ is smaller than $t_2$".

So the future is probabilistically but otherwise "accurately" determined from the past and this relationship cannot be reverted.

The quantum randomness may be shown not to arise from hidden variables.

The nonzero commutators imply that the probabilities of different values can't be just 0% or 100% because if the observables X,P had well-defined c-values x,p at a given moment, they would obey XP-PX=0 because xp-px = 0, like all numbers, instead of the correct XP-PX = i*hbar. So some generalization has to hold instead, and when one looks what the generalization actually says, it says that for a given state, every observable such as X (or P) has some probability for each allowed eigenvalue. The non-existence of the "common eigenstates" of non-commuting observables is enough to allow a nonzero commutator.
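The non-existence of common eigenstates is easy to verify in the simplest non-commuting example, the Pauli matrices $\sigma_x$ and $\sigma_z$ (standing in for X and P; the helper functions are just bare 2x2 linear algebra):

```python
# Minimal 2x2 matrix helpers (matrices as lists of rows)
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def matvec(A, v):
    return [sum(A[i][k] * v[k] for k in range(2)) for i in range(2)]

sx = [[0, 1], [1, 0]]    # sigma_x
sz = [[1, 0], [0, -1]]   # sigma_z

# The commutator [sigma_x, sigma_z] is nonzero...
comm = [[matmul(sx, sz)[i][j] - matmul(sz, sx)[i][j] for j in range(2)]
        for i in range(2)]
print(comm)              # [[0, -2], [2, 0]]

# ...so an eigenvector of sigma_z ("spin up") cannot be an eigenvector of sigma_x:
up = [1, 0]
print(matvec(sz, up))    # [1, 0] -> eigenvector of sigma_z, eigenvalue +1
print(matvec(sx, up))    # [0, 1] -> not proportional to [1, 0]
```

A state with a sharp value of $\sigma_z$ necessarily has probabilistic values of $\sigma_x$, which is the finite-dimensional cartoon of the X, P story above.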

"As I have already explained to you in a way that most dogs must have already understood but you have not, the concept of "unitarity" is only meaningfully defined for actual "maps" from the Hilbert space to the Hilbert space but the measurement isn't defined by any map"
But that's what I've been saying! Measurement is a non-unitary process, i.e. it isn't described by a unitary map!
In a strict function sense a measurement provides a probabilistic 'map' from a set of input states (all possible states of the physical system to be measured) to a set of output states (eigenstates) which is not invertible. The measurement is just a function that takes an input state and gives an output state according to a well-defined probability rule.
This mathematical description is independent of how we choose to interpret what the wavefunction means.
Interactions are described by invertible and deterministic maps. So yes measurements are clearly not described by unitary maps. So measurements in QM are not interactions in the usual sense.
All physical objects are supposed to be described by QM. Measuring devices are physical objects. All physical objects interact with one another according to a unitary map, according to QM. A measurement is an example of an interaction that cannot be described by a unitary map.
Personally I think there's a problem here that's not wholly explained away by decoherence, or by appeal to probability. In physical terms there's a real physical change when a measurement is made - a real bit is recorded. The input and output states are not connected by a unitary map. We must appeal to something different than the usual unitary evolution rule in order to explain this (or introduce some further assumptions if we believe that decoherence or consistent histories provide an adequate explanation).

No, the change of the wave function during measurement is not a map - equivalently, it is not a function - which implies that it makes no sense to ask whether the change is unitary or not, and it implies that everything else you write is gibberish, too.

I was talking about the density matrix elements between the (small) system Hilbert space and the (large) environment Hilbert space. These decay rapidly towards zero, as exp(-t/decoherence_time), right after the interaction (measurement) takes place. Assuming these elements are zero is FAPP a good approximation after t1+eps, where eps is a few e-foldings of the decoherence time. But you already knew that.

I see, Jan.

You probably mean that the off-diagonal elements of the reduced density matrix for S only quickly decay - they decay much more quickly than exponentially, however (in decoherence). They decay like exp(-D*exp(C*t)). These matrix elements of the reduced density matrix may be computed as a sum of many terms calculated from the whole S+M. Each of these terms is even smaller because it is a product of many tiny numbers.
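Just to see the difference in magnitude between an ordinary exponential decay and the double-exponential law exp(-D·exp(C·t)) mentioned above, here is a tiny numerical comparison (the constants D, C are arbitrary illustrative values, not taken from any real decoherence model):

```python
import numpy as np

D, C = 1.0, 3.0                        # illustrative constants only
t = np.linspace(0.0, 2.0, 5)

plain = np.exp(-t)                     # ordinary exponential decay
double = np.exp(-D * np.exp(C * t))    # decoherence-like double exponential

for ti, p, d in zip(t, plain, double):
    print(f"t={ti:.1f}  exp(-t)={p:.3e}  exp(-D*exp(C*t))={d:.3e}")

# The double exponential is smaller at every t here, and the gap explodes:
print(np.all(double <= plain))         # True
```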

The usage of the reduced density matrix is absolutely sufficient and *exact* for any predictions as long as we only measure observables acting on the Hilbert space (tensor factor) of S. If we want (and/or are able) to measure observables of both S and M, there is no point in separating the "environment" M.

So must there be time within the realm of mathematics itself, then?

A measurement is a stochastic process that takes an input and produces an output - whether we call it a map or a function is irrelevant. What is relevant, as you keep emphasizing, is that, whatever it is, it is not describable by any unitary map - therefore it is not an interaction in the usual QM sense - because interactions between physical objects are all describable by unitary maps.
I'm sorry that you think all this is gibberish, that I am more stupid than most dogs, and that my thinking is completely fucked-up, but there must surely be something a bit puzzling with a theory that
(1) stipulates interactions are governed by unitary evolutions
(2) is unable to describe the output of an interaction between a measuring device and the measured system as a unitary evolution of the input state (without recourse to further assumptions such as those required to make decoherence treatments work)
It's one thing being entirely comfortable with the rules of QM (which I am) and quite another trying to fit it all into some kind of intuitive framework. As mentioned earlier - it's not immediately obvious why vectors in a complex Hilbert space should be the mathematical entity we require to describe the 'state' of something - even if we only think of this as an entirely abstract object that represents our state of knowledge (whatever that means precisely). It's even less intuitively obvious why the interaction of 'measurement' is different from other interactions in QM.

Hi Lubos: Slightly off-topic, but not completely so. There is a report at PhysicsWorld.com. Christopher Ferrie and Joshua Combes (Physical Review Letters) argue that such measurements, and their counterparts known as "weak values", might not be inherently quantum mechanical. They say that the results from such measurements can be replicated classically and are therefore not properties of a quantum system. I know, you do not care for the theory of weak measurement. What do you think of this proof?

You can think in the following way. Quantum mechanics tells us what we get when we measure a system using an "infinitely" large apparatus (these are the words of a winner of the fundamental physics prize; you can trust them). The wave function isn't something corresponding to a point in phase space in classical mechanics. Also note that it is extremely important that measurements disturb the system; if they didn't, all of quantum mechanics would collapse. Measurements don't commute <-> corresponding operators don't commute. This approach is totally fine. However, I believe there is a better approach, at least "aesthetically".

You must have noticed that with this point of view you talk about closed systems such as the universe. The Consistent Histories or Decoherent Histories approach (created by Omnes, Griffiths, Hartle, Gell-Mann) defines what a physical property is and allows you to say something about closed systems. However, the Copenhagen interpretation can be derived using Consistent Histories, so they are really not different things.

Personally, the CH approach made me understand quantum mechanics much better. And as I understood it (at least I believe so :) ), I also understood the Copenhagen interpretation better (one can easily derive it in CH), so now I am much more comfortable using the Copenhagen interpretation in daily life. For example, you can prove in CH that "measurements don't commute <-> corresponding operators don't commute", so you really don't lose anything while you are thinking about what you can measure or not. I did understand what complementarity is about and what the wave function really is. The CH approach stresses these things very strongly. For example, the CH approach doesn't call the wave function the state of the system; it calls it a pre-probability.

Something I have said above may seem cryptic to you. I can't summarize the CH approach here but I strongly recommend you read the book by Griffiths.

"...is all over the real crash".
I don't know what you mean. First of all, it is only confined to the cockpit area. Also, in the Russian video you see outgoing (7:56 & 8:18) AND incoming holes (8:13 & 8:16). You see the same incoming AND outgoing holes on MH17:
https://niqnaq.files.wordpress.com/2014/07/14749781785_221e3fb040_o.jpg

Here, more outgoing holes:
https://www.metabunk.org/data/MetaMirrorCache/20698820663ebd4ef150060de6b4333f.jpg

I don't know what they mean exactly by weak measurement but if they propose that they can simultaneously measure position and momentum, like said here http://en.wikipedia.org/wiki/Weak_measurement, then they are wrong. As a proof, read http://www.feynmanlectures.caltech.edu/III_01.html

I get all that, and I'm really not thinking of the wavefunction as some kind of 'real' object - but the fact remains that, 'real' (which I agree it clearly isn't) or representative of our probabilities (when mod-squared), its evolution is governed by a unitary and deterministic map, unless the object 'represented' by this wavefunction is subjected to a measurement. So we look at 2 quantum systems and we can use our knowledge of the input states (or wavefunctions) and the physical interaction to determine what our probabilities should be for some observable of one of the systems at some subsequent time.

If one of those systems, however, is designated as a 'measuring' device - then we can no longer apply the same mathematical rules to determine the mathematical representation of these changing values at this subsequent time - despite the fact that measurement must be an interaction. The same mathematical rules don't apply when one of the systems performs what we call a measurement - or when the interaction between the two systems is deemed to have constituted a measurement of one of the systems.

If we input a single photon to a beamsplitter then the output state/representation of our probabilities for the 2 arms is described by ~ |10> + |01>, which is an entangled state of the 2 spatial modes of the field. Suppose we perform an ideal photon measurement in the output arms and find that we detect the photon in arm 1 (let's not quibble here about the fact that a photon measurement is destructive). It would now be wrong, given this knowledge, to describe our new representation of the probabilities by an entangled wavefunction between the 2 modes. Indeed, no experiment that could subsequently (individually) be performed on these 2 modes would reveal entanglement (with or without knowledge that a measurement had been performed, or its result).
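The before/after entanglement in this beamsplitter example can be checked numerically. Here is a small sketch of my own (not from the thread), using the purity of the reduced density matrix of arm 1 as the entanglement diagnostic:

```python
import numpy as np

# Two-mode basis {|00>, |01>, |10>, |11>}, index = 2*n1 + n2.
psi = np.zeros(4, dtype=complex)
psi[1] = psi[2] = 1 / np.sqrt(2)      # the state (|01> + |10>)/sqrt(2)

def purity_of_arm1(state):
    """Purity Tr(rho^2) of arm 1 after tracing out arm 2."""
    m = state.reshape(2, 2)           # axes: (arm 1, arm 2)
    rho = m @ m.conj().T              # reduced density matrix of arm 1
    return np.trace(rho @ rho).real

print(purity_of_arm1(psi))            # 0.5: maximally entangled modes

# After detecting the photon in arm 1, the updated description is |10>:
psi_after = np.zeros(4, dtype=complex)
psi_after[2] = 1.0
print(purity_of_arm1(psi_after))      # 1.0: no entanglement remains
```

Purity 1 means a pure (unentangled) reduced state; purity 1/2 is the two-dimensional minimum, i.e. maximal entanglement.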

But all that's happened is that we've interacted things together (our field modes with our detectors) - it's just that we call this special kind of interaction a 'measurement'. If we'd interacted these photons with, say, 2 2-level atoms in high-Q cavities, we'd be using the unitary evolution of the interaction to determine our representation of the probabilities - and we'd also see evidence of entanglement between the photons in subsequent measurements if we tailored things just right.

Our mathematical description, our mathematical representation of things changes according to whether the interaction is deemed to be a measurement or not.

And if a measurement has been performed our mathematical description then changes according to the new knowledge that we get (or is recorded) from this special kind of interaction we call measurement, and yes it can be subjective. So what's different? In one interaction there is a physical record of a bit which requires energy to erase (the change in the state of our knowledge which is a physical change in our brains, or the physical output of the measuring device), in the other there is not.

Hi Simon, I've posted about 10 comments of yours just during the last 10 hours. None of them makes any sense and you show zero potential for learning even if it comes to completely trivial things.

I've wasted at least an hour just with you today and I don't want this to continue so I am placing you on the black list.

Sorry, Jan, "matrix element between a small Hilbert space and large Hilbert space" is just an oxymoron, a nonsense. The density matrix is an operator so its matrix elements are always between two elements of the *same* Hilbert space.

Refreshing to know that it happens in physics. I thought Cook, Lewandowsky, and their Australian teams only handled "climate".

Thank you for this explanation Lubos.
As I understand it, before throwing a dice we have a probability distribution saying that we will get one of the six numbers. That probability is 'sharp' because we always get exactly one of the six numbers - excluding the unlikely case where the dice lands so that it manages to balance on one of its edges :)
Once the dice is thrown, we see the final result. One may say we measured it with our eyes. One may say, if one really likes those words better, that the probability distribution collapsed to a single outcome.
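The dice "collapse" described here is literally just conditioning on the observed outcome; a few-line sketch:

```python
import random

# The distribution before the throw: each face equally likely.
prior = {face: 1 / 6 for face in range(1, 7)}

# Throwing and looking at the dice = acquiring the information.
outcome = random.choice(list(prior))

# The "collapse" is nothing but conditioning on what was seen.
posterior = {face: (1.0 if face == outcome else 0.0) for face in prior}

print(outcome, posterior[outcome])    # the observed face now has probability 1
```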

For the same reason that the precise wavelength of a localized wave packet cannot exist. Just the math of Fourier transforms, nothing more mysterious. The conceptual hurdle is the association of a particle with a wave packet.
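The Fourier-transform point can be made quantitative with a Gaussian wave packet: the product of the spreads in x and k comes out as 1/2, the minimum the math allows. A sketch of my own (grid sizes are arbitrary):

```python
import numpy as np

N, L, sigma = 4096, 200.0, 2.0
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
psi = np.exp(-x**2 / (4 * sigma**2))          # localized Gaussian packet

k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)    # wavenumbers of the Fourier modes
psi_k = np.fft.fft(psi)

def spread(vals, amps):
    """Standard deviation of vals weighted by |amps|^2."""
    p = np.abs(amps) ** 2
    p /= p.sum()
    mean = (vals * p).sum()
    return np.sqrt(((vals - mean) ** 2 * p).sum())

dx, dk = spread(x, psi), spread(k, psi_k)
print(dx * dk)   # ~0.5: a packet sharper in x is necessarily broader in k
```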

Tony, one may talk about the "collapse" of the distribution for the dice if it makes him feel so much better. In some sense, it does collapse.

But the point is that there is no "internal structure" of this collapse to be decomposed or looked for. The collapse is an inevitable part of the term "probability".

Subtleties are only legitimately important if 1) the big picture is at least approximately right, and 2) if these subtleties are not cherry-picked or selectively interpreted and adjusted according to double standards.

Both conditions have to hold. Unfortunately, none of them is being satisfied in the "mainstream" Western reporting about the Ukrainian civil war and many other events.

Yep, I agree in that, too.

Great to hear that, Eclectikus!

Thanks Lubos for a detailed reply. In physics we are learning again and again that the fact that someone does very important work once, does not mean he is right all the time!!

Or the excess/deficit may be coming from a part of the variables that is not well explored as far as cuts etc. When one is concentrating on clearing up the Higgs, for example, it could be that the cuts etc. that clear things up around 125 GeV are not appropriate for other, higher intervals. Then, if some excess or deficit appears, they will hold their horses, reanalyzing or waiting for more data.

An interesting quote from the preposterous universe:
"As far as I can tell, the revolutionaries make their case by setting up a stripped-down straw-man version of quantum mechanics that nobody really believes (nor ever has, going back to Copenhagen), then proclaiming victory when they show that it’s inadequate, even though nobody disagrees with them."

We can't all be experts in everything, but if you want to call our differences nuance, that's fine with me. But the personal part? We were both personally affected by WWII. My homeland was not only betrayed, it was wiped off the map permanently. That doesn't change a thing. CinC is a term of art with actual meaning, to some. To others, not so much.

Dear Michael, you are forcing me to discuss this personally because your otherwise indefensible claim that Hitler didn't become the commander-in-chief in early 1938 is being justified by mysterious references to your being an expert.

I don't respect that as an argument. I don't respect you as an expert. Show me one paper or something like that which would claim that Hitler was never a commander-in-chief in 1938. You must know that you won't find anything like that because you are making it up and your claims of expertise are just rubbish.

You may have become a great fan of some generals of the German Army in the late 1930s or something like that, and worship them etc. but they were *not* commanders-in-chief e.g. in February 1938.

In this simple picture, saying that the dice may show 3 in this universe, but all the other values may and will occur in other, parallel universes, looks like a useless, pot-headed non sequitur.
I was watching a video by David Deutsch and he seems to believe that only many worlds can account for the additional computational power that quantum computation brings relative to classical computation - as if additional computations from parallel universes somehow leaked into ours.

Absolutely, Tony! If a many-worlds description is meant to be at least approximately equivalent to quantum mechanics, the different copies of the world must completely decouple - the worlds where Hitler won the Second World War (and the whole "tree" of such worlds) must become forever inaccessible from our world where Hitler lost, otherwise it would be a contradiction.

But if those "other worlds" are completely inaccessible, their large number can't possibly influence what is happening in our world!

There is one seemingly different but ultimately equivalent way to say what's wrong. The many-worlds evangelists never say "how many worlds" there are in their list of "many worlds". But the idea is that the worlds get "reproduced" when some measurement is done, i.e. after some amount of decoherence.

But the very point and power of a quantum computation is that there is *no* decoherence during the whole process of computation at all, so the number of worlds doesn't grow at all! So it can't be exponential.

The fairy-tale that the "exponential speedup is possible because the number of many worlds is exponential" may sound OK at the level of this slogan composed of a dozen words. But it fails every other basic test that it should pass.

Deutsch's claims are actually stronger and therefore even weirder. He doesn't only say that many-worlds give *a* viable explanation why quantum computers may be fast. He says that the quantum computers *prove* that there must be many worlds.

This is not one but at least two major steps away from any defensibility. In order to show that the quantum computer speedup *proves* the many worlds, one would have to show that the many worlds are the *only viable* way to explain such a fast computation. But in reality, not only can he not prove that it is the *only viable* explanation; he doesn't even have *a* viable explanation, because of the two contradictions in the first paragraphs.

And this Deutsch guy is often sold as a guru of quantum mechanics. He hasn't penetrated a micron beneath the surface of popular nonsense that the laymen are being served about quantum mechanics. Nothing he says really makes any sense whatever. It's just a sequence of words for people who think that sentences with certain buzzwords and a certain predetermined message are "cool" - even if every piece of logic in those statements is completely defective.

The BUK missile has a continuous rod warhead, basically explosives wrapped in a steel rod that has regular indentations in it, sort of a cocoon of integrally connected steel lumps.
On exploding, this produces a shower of rod fragments of very similar size. It is very difficult to distinguish holes produced in a sheet aluminum skin by this warhead from those produced by a gun. The best clue is the pattern of the holes, but in this case there are few pieces big enough to show patterns.
The cockpit fragments that have been shown in news photos are more suggestive of an explosion imho, simply because the metal is really peppered, which is hard to achieve with a gun.
Given the lack of transparency in the inquiry, I do not see how the eventual report will be credible.

[sigh] I really have no reason to argue with you, but, technically, Hitler was never Commander in Chief of the Wehrmacht. This is a non trivial distinction. If you need a link, try this: http://spartacus-educational.com/GERhalder.htm

Hitler did, however, usurp the command structure and name himself "Supreme Commander of the German Armed Forces" and become the de facto CinC at the end of 1941. https://www.jewishvirtuallibrary.org/jsource/Holocaust/fuhrer.html

I am sure that you could argue convincingly that dark matter is not merely a convenient fudge factor to support a cosmological theory that is no longer supported by observation, and, in that arena, I would never be able to win an argument with you. And I am also sure that my losing argument, like yours here, would be supported by spurious entries in Wikipedia.

The page you linked to doesn't say that Hitler didn't become the commander-in-chief in 1938. You know that very well. No book, paper, or page has ever claimed something of the sort because it's bullshit. Hitler unambiguously became the commander-in-chief in early 1938.

Ban me if you like, but I stand by what I wrote here.

But it's possible to unlearn! It's called "forgetting".

And it's possible for wavefunctions to contract over time. An example would be a Loschmidt reversal. Or let's say we have a simulation of a quantum system on a quantum computer with no interaction between the memory states of the quantum computer and the environment. Then, uncompute. Voila!
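The "uncompute" scenario is easy to demonstrate in linear algebra, as long as the evolution really stays unitary throughout (i.e. nothing measures or decoheres the register in between). A toy sketch of my own, with a random unitary standing in for a hypothetical 3-qubit quantum computation:

```python
import numpy as np

rng = np.random.default_rng(0)

# A random 8x8 unitary: QR decomposition of a generic complex matrix.
A = rng.normal(size=(8, 8)) + 1j * rng.normal(size=(8, 8))
U, _ = np.linalg.qr(A)

psi0 = np.zeros(8, dtype=complex)
psi0[0] = 1.0                      # a sharply known initial state

psi1 = U @ psi0                    # evolve: the amplitudes spread out
psi2 = U.conj().T @ psi1           # uncompute: apply U^dagger

print(np.allclose(psi2, psi0))     # True - unitary spreading can be undone
```

This only shows the reversibility of the unitary segments of the evolution; the dispute in this thread is about whether the collapse upon measurement can be undone the same way.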

What you call a "collapse" due to learning by an agent is actually none other than the creation of an entanglement between the memory states of the agent and the system in question. That analysis was done by von Neumann a long time ago.

You haven't solved the measurement problem in the least bit. Neither have you explained the thermodynamic arrow of time.

No, you're wrong, Quantum. The reversal of the "abrupt shrinking" associated with interpreting a "probability" cannot exist in Nature - or anything that contains the concept of ordered time similar to the physical one.

I've wondered if "collapse" is a bit too classical a notion, in that it gives rise to intuitions that there is a measurement problem, so I propose you say it narrows to a classical limit or approximation. Nothing changes but the words, so it's of purely pedagogical value.

It is not only that you may "flip" these definitions; it is that both definitions coexist. There are two functions that collapse: the future probability of measuring, and the past probability of coming from some determinate past state.

That is even classical. If I see a person, whose location I didn't know, crossing a transfer gate in an airport, I have both an estimate of the possible flights he could be going to take and an estimate of the past flights he could be coming from. There is a collapse, but not a violation of time symmetry.

Why do you want entropy to be low during the big bang? The only thing you have to assume is the big bang; you don't need to assume a special state of the big bang for entropy to increase. On the contrary, you have to impose a special condition on the big bang for entropy to decrease.

I think this is indeed a nuance that only matters for those who study military history.
In every country and Germany is not an exception, there is political command and military command.
Also in every country and Germany is not an exception, the latter is subordinated to the former.
So everything is only a question of how the interface is managed and what names are used for the interface managers.
Up to 1938 this was organised: Chancellor -> Ministry of Defence (political layer) -> OKH (Oberkommando des Heeres) -> Army groups -> Armies -> Army corps -> etc. (military layer). Air and sea add to that separately.
The top of the military layer (the head of the OKH) was called Oberbefehlshaber des Heeres (in English: CinC or supreme commander of the Army).
.
In 1938 Hitler created the OKW (Oberkommando der Wehrmacht) and the chain became :
Kanzler und Führer (political layer) -> OKW -> OKH (Army) + OKL (air) + OKM (sea) (military layer)
The OKW role was coordination and in practice it became the general staff of the Führer.
.
So on one side you are right - the Kanzler und Führer (the political head) was never head of the OKW, so strictly speaking he had never been the supreme military CinC, even though he had (political) authority over it, like in every country even today.
You are also right that at the end of 1941 Hitler took over in his own name the head of the OKH (not the OKW!), so that he became head of the ground forces directly.
One can note that this created a horrible situation on paper because in the military hierarchy Keitel (chief of OKW) became superior of Hitler (only chief of OKH) :)
.
On the other side, Lubos is not quite wrong either, because in practice the OKW was a tool for Hitler to take direct influence on the OKH, which was up to 1938 rather independent-minded. Btw, the German generals nicknamed Keitel (the OKW chief) "Lakeitel" (a pun on "Lakai", lackey), which says it all!
So while Hitler was not directly (de iure) supreme army commander (=OKH chief) in name before 1941, he became it in practice (de facto) in 1938 with the creation of the OKW.
.
While this is important for understanding the German army's chain of command and strategy in 1938-1945, it is of second order for this thread.

Concerning people who have a problem with measurements in QM or with the thermodynamic arrow of time, I would say it is their own psychological problem ;-P

And the people who suffer from this deficit and have an ego with a radius bigger than the Hubble radius think everybody else is confused as well, which encourages them to pompously make themselves seen and heard in popular media channels, blogs, books, etc ...

I think I understand Lubos's response. What about a CD-ROM degrading after a long time, or just getting melted?
Those are two completely different processes that erase information. In principle, I think they are reversible, but it is not time-reversal symmetry at work in these instances.

Thanks, Tony! Concerning the differences,

1) obtaining information by observation of previously uncertain information may be said to be instantaneous; forgetting is gradual

2) this is related: the "end side" of the observation really eliminates the probabilities for all other options except for one (collapse); in forgetting, both sides may be generic distributions with no zeroes

3) the likelihood of one answer or another on the "sharp side" of the observation is determined - uniquely, according to the distribution on the other side; the distribution in the case of forgetting is unconstrained

4) obtaining information by observing (with predictable probabilities) occurs even in a "perfect brain" with a huge excess memory; forgetting etc. only happens when there are imperfections or capacity limitations

5) forgetting is a reducible process that depends on particular smaller processes in the brain etc., and only some of them; obtaining information through observation is irreducible and the information is universal

One may generally feel that something goes up or down, so they're the reverse of each other, but all the details are different. The laws governing these two processes are just completely different, T-asymmetric.

The claim that they are T-images of each other is like saying that gas spontaneously spreads from a full kitchen to the other (previously empty) room due to the second law; and the reverse is to close the door and pump all the gas back into the kitchen only. The trend of a particular observable, the ratio of the masses of the gas in the two rooms, gets reversed, but all the details of these two processes are completely different from T-images of one another.

The web page you have linked to contains no new information or new images that we haven't discussed yet, and surely no new justification of the proposition in your comment. It is an incoherent summary of a TV program that says exactly the opposite than your comment.

Hi Lubos,

So, for me - as a layman - quantum mechanics has always been inherently difficult. Are you saying that the probability distribution (and the expanding range, as time goes on) does not mean that the particle is in all possible places at once (as quantum mechanics is often explained), but that it is actually in one place, except that we can't know where it is until we actually measure it?

Second, if that is the case, why doesn't your diagram at t = 1,2,n reduce down to the same certainty as t=0?

Of course I am. The particle isn't "here AND there". It is "here OR there". Schrödinger's cat is "dead OR alive". This is true for all probability distributions, and those derived from a wave function in quantum mechanics aren't exceptions.

Quantum mechanics only differs by the ability to discuss probabilities for mutually non-commuting observables, and the nonzero commutators (aka the uncertainty principle) are responsible for everything that is new about quantum mechanics.

But once you fix one observable, like the position of a particle, the probability distributions derived for that observable are probability distributions, so the different options are true with the word "OR" in between them, not "AND"!

Fantastic! Thank you very much. If every physicist/journalist explained the probability-distribution aspect of quantum mechanics to laymen in these terms, it would be far more approachable.

[Mind you, quantum entanglement - spooky action at a distance - is still something to get the mind around.]

One more question, if I may. The "Is light a particle, or a wave?" question. Would I be right in assuming that, yes it is a particle, but that it travels along a wave-like path through space/time?

Surely the actual impact holes, with their huge variation of size and shape, are a lot different from the almost uniform, round aircraft-gun impact holes. Also, and that is my take, a lot of the impact is on the front of the airplane - cockpit panels etc. - which would mean a head-on flying fighter, not a very likely mode of operation when it comes to using aircraft guns. With so many impacts, it would also mean that the gun would have to keep perfect aim for a very long time - not likely when shooting unguided projectiles from a distance, from a flying platform that is influenced by turbulence and maneuvering etc.

It is neither a flow of ordinary (classical) particles, nor an ordinary (classical) wave.

It is a set of excitations of a quantum field, a thing that may only be properly understood within the formalism of quantum mechanics. This new entity may either be interpreted as a flow of particles whose probabilities of positions and velocities are only predictable using probability (amplitude) waves; or as a state of waves in the electromagnetic field whose quantities, however, don't commute, which also means that the energy carried by frequency-f waves is quantized in multiples of E=hf, the energy of a "photon".

OK, thank you. I think I understand that. So, why a wave? Why not a straight line? Is the quantum field naturally "wavy"?

Dear Anto, every field (classical or quantum) is naturally wavy. Waves are the only way a field may differ from the vacuum.

A field is a number at each point, and F=0 means the vacuum state, the configuration with the minimum energy E=0. Everything else with a nonzero F means that the field is excited.

When you excite it to F=A in a small region, the excitation will spread to the rest of space. Far enough, it will always look like waves. More precisely, every solution for the electromagnetic field may be written down as a linear combination (sum) of sine waves with a given direction, frequency, and polarization.
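The statement that every field configuration is a sum of sine waves is just a Fourier decomposition; here is a one-dimensional toy check (the grid values are arbitrary):

```python
import numpy as np

N, L = 256, 2 * np.pi
xs = np.linspace(0, L, N, endpoint=False)

# Excite the field in a small region: a localized bump on top of the F=0 vacuum.
F = np.exp(-((xs - L / 2) ** 2) / 0.1)

amps = np.fft.fft(F)                # amplitudes of the individual wave modes
F_rebuilt = np.fft.ifft(amps).real  # summing all the sine waves back up

print(np.allclose(F, F_rebuilt))    # True: the bump is exactly a sum of waves
```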

I probably don't understand how you are imagining a "straight line field". It sounds like an oxymoron to me. A field is everywhere, by its definition.

It does *not* imply a head-on flying fighter. SU-34/35/37, for example, are known to have rear-pointing radars and the capability to install rear-pointing guns.

"Franco was a deeply ideological fascist"

Except he wasn't; sure, he was anti-democratic and a traditionalist and nationalist defender of the monarchy, the Church, and the army, but he relegated the real ideological fascists (the Falangists) to a very minor role in his regime. On this see the books of Prof Stanley Payne for a corrective.

As for the Loyalists, they celebrated their commitment to democracy (of Robespierre's totalitarian kind during his carnival of terror and death - see Prof. J. L. Talmon's work on this) during the Spanish Civil War by murdering ten thousand unarmed priests, nuns, and monks. And the Spanish Stalinists set about methodically murdering those on their own side - namely, anarchists and syndicalists. Some "democrats," uh-huh. . . .

Dear Jock, it makes no sense to argue about, or prolong, this war that occurred 80 years ago. I have no doubt that, just like democratic Czechoslovakia in the 1930s, I would have sympathized with the republicans.

The point is that it was a civil war where different people stood on each side, that it is primarily a war polarizing the nation (Spain or Ukraine) itself, but that everyone abroad has some opinion about it, too. But the existence of opinions and interests doesn't change the fact that it is a civil war.

Lubos,

Sorry - I was confusing quantum with classical. As laymen do!

However, I think I understand your field explanation. Zero energy/excitation = zero field = flat/straight.

However, energy/mass = excitation = agitation/disruption = wave-like travel?

I guess that, like many other laymen, I'm able to get my mind around the macro-universe, but struggle with the micro-universe.

I think that I've properly understood your explanation of the classical statistical physics, but I'm still struggling with the translation to quantum physics.

I don't want to take up any more of your time. I see your explanation of the quantum field, above.

Something which I find hard to understand is that, with all of the different forces/particles/masses/energies in the universe, that anything is predictable at all. Normally, in say a pond, if you have 5 or 10 people throwing stones into the water, there is chaos.

However, with the universe, despite all of these competing sources of mass/energy, there is still much predictability.