Thursday, April 02, 2015

Feynman on silly "philosophical implications" of quantum mechanics

He criticized the flawed idea that good scientific theories should only work with experimentally accessible concepts

I was sent this 14-minute audio recorded during the original "Feynman Lectures on Physics".

With a somewhat combative common-sense voice, Feynman talked about the generalized "philosophical lessons" that people extract from quantum mechanics, or at least claim to.

Well, almost all this stuff is just junk. The actual physical content is almost always distorted beyond recognition and the resulting essence is just silly.

Feynman mentioned several examples of ideas that follow from quantum mechanics that are being distorted in this way.

One topic is about the inevitable influence of the measuring apparatus on the system. The true idea says that it is not possible to "arbitrarily minimize" this influence by rearranging the apparatus – there is no objective or classical underlying state (point in the phase space) that would describe how the physical system really "is" independently of any measurements. Instead, we may predict the probabilities of different outcomes of various measurements and the dependence on the question we ask – on the measurement we make – cannot be eliminated.

If you like Feynman's audio above, you may buy a CD with 6 hours of his lectures for $21. See the link below.
But that doesn't imply, as some people think, that everything is fuzzy and uncertain. He describes the actual answer using the example of the popular question whether the wind makes any noise in the forest if there's no one to hear it. Well, it does because the noise still leaves some small scratches, some minor traces that may later be observed and used to prove that there was noise. Feynman chose to be "explicitly silent" about the question whether the trees had consciousness.

I would say that people love to err on both sides here. Sometimes, they incorrectly say that some phenomenon's existence becomes "forever undetermined" just because this phenomenon is not being perceived "when it is occurring". That's wrong because one can observe it later. Sometimes, people want to deny that our knowledge about the process ultimately comes from some observations and it can't exist without them.

But I really enjoyed the crisp comments about Feynman's attitude to "concepts that can't be measured", around 4:00.

People love to say that quantum mechanics (and relativity) taught us that
you cannot speak about the things you cannot measure. If you can't define something by a measurement, it has no place in the theory.
This is almost exactly what the Šmoity stinky scumbags and the crackpot sect around them love to say every day. April 1st was no exception, so a bunch of morons submitted a would-be witty astro-ph preprint named A Farewell to Falsifiability yesterday.

This opinion is a "false position", Feynman says, a result of an "uncareful analysis of the situation". The inequality \[

\Delta x \cdot \Delta p \geq \frac{\hbar}{2}

\] doesn't a priori mean that you can't talk about these observables. Instead, it just means that you don't need to talk about them! The situation in science is as follows:
A concept or an idea which cannot be measured or cannot be directly referred to an experiment may or may not be useful. It need not exist in the theory.
I have never heard this Feynman audio before but I am sure that you may find TRF blog posts (look e.g. for the word "auxiliary" here) where I wrote this principle in an almost identical way. This guy has scooped my articulate, original idea and recorded it a decade before I was born. ;-)

He revisits how the principle was actually used by Heisenberg et al. Everyone else was a classical physicist, so they were asking questions about the precise position and momentum, and so forth. Heisenberg's answer was that he didn't need to answer such questions. But it doesn't mean that you and your theories mustn't use ideas that can't be measured. We just don't have to do so.

It is not necessary to remove all the components that can't be measured. It is not true that a theory which uses no such unmeasurable concepts is preferred over a theory that does use some. The existence of phenomena in the theory that haven't been observed is really necessary for the theory's nonzero predictive power. The theory must be able to make predictions independently of the experiment i.e. say something about regions that haven't been experimentally probed.

So it wasn't an obvious stupidity for the classical physicists to keep on assuming that the position of the electron objectively exists in the same sense as it does for a baseball. Similarly, we still assume today that the laws of relativity hold at all energies even though we haven't been able to verify that. Maybe someone in the future will tell us that "we were stupid". But to do science, one has to stick his neck out. And it's not a stupidity to do so. An assumption may be shown to be wrong, but it wasn't stupid that it was made.

After 9:00, he talked about the indeterminacy of quantum mechanics.

This has led to interpretations such as weird comments about the freedom of will (in the human sense) and the innocence of criminals. People have said that the indeterminacy opens the room for the supernatural agents to work. Well, feel free to work, supernatural agents, but then the physicists' task is to calculate the probabilities that govern the behavior of the supernatural agents as a function of the circumstances.

One may talk about "supernatural things" but physics remains equally analyzable and there's no evidence that it shouldn't be.

Feynman says that the ideas that the quantum indeterminacy radically changes the character of the free will of the human mind go too far. If the world were classical, it seems plausible that the mind would feel pretty much the same! After all, the brain isn't a coherent quantum computer. It's too warm for that. While the elementary building blocks "internally" rely on quantum mechanics to function (even atoms need quantum mechanics to exist, be stable, and have other basic properties), their mutual relationships are pretty much compatible with the basic framework of classical physics.

The people claiming that quantum mechanics is absolutely needed for an aspect of the human mind should do a careful analysis, and if they still believe in their opinion, they should present a solid argument or proof that classical physics makes incorrect predictions about this feature of the mind.

Feynman mentions that even in classical physics, the dependence on the inaccuracy of initial conditions exists – chaos theory – and this unpredictability affects the "free will" of the human mind in pretty much the same way as the (unavoidable) uncertainty dictated by the laws of quantum mechanics. He describes the "butterfly effect" using the words "given a small but nonzero \(\varepsilon\), there exists a long enough time \(T(\varepsilon)\) so that you can't predict it for this long future at the 10% accuracy". The students obviously didn't like the \(\varepsilon\)-\(\delta\) gymnastics, as their roaring revealed. ;-)

The required time only grows logarithmically, \(T\sim |\log \varepsilon|\).
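This logarithmic scaling is easy to check numerically. Here is a minimal sketch using the logistic map as a stand-in for a chaotic system; the map itself, the parameter r = 3.9, the initial point, and the 10% threshold are my illustrative choices, not anything from the lecture:

```python
import math

def divergence_time(eps, r=3.9, x0=0.4, threshold=0.1, max_steps=100000):
    """Steps until two logistic-map orbits, initially eps apart,
    differ by `threshold` (the "10% accuracy" in Feynman's phrasing)."""
    x, y = x0, x0 + eps
    for t in range(1, max_steps):
        x = r * x * (1 - x)
        y = r * y * (1 - y)
        if abs(x - y) >= threshold:
            return t
    return max_steps

# Shrinking eps by orders of magnitude adds only a roughly constant
# number of steps per factor, i.e. T grows like |log eps|:
for eps in (1e-4, 1e-6, 1e-8, 1e-10):
    print(f"eps = {eps:.0e}  ->  T = {divergence_time(eps)}")
```

The horizon T roughly follows ln(threshold/eps) divided by the map's Lyapunov exponent, so buying a hundred times better initial accuracy only buys a fixed number of extra predictable steps.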

Applause. I would have joined it, too.

Remarks on the template:

The readers who prefer this new experimental universally "dark on light" color template should fix the problem with this corner_main.gif image that seems to have wrong colors that look disturbing in the right upper corner of the "nearly white" left strip of the blog. Please create a better corner_main.gif image for me. It may be a transparent GIF so if you don't know how to deal with these things, give it up. ;-)


snail feedback (49) :

reader br said...

That's the one ;)

reader QsaTheory said...

black on sky blue is more readable. IMO

reader Eclectikus said...

Great Feynman! Luboš, do you think that if Feynman had lived longer, for example until after the second superstring revolution, his skepticism about it would have dissipated?
I like this new theme; readability is clearly improved and it looks cleaner. Maybe you should change only the link colors (because they are distinguished only when you hover over them). Also, you should improve the resolution of the "Veritas" coat of arms... Well, these are minor tweaks, but they surely give shine and splendor ;-)

reader Luboš Motl said...

Thanks for the feedback, Eclectikus. The resolution of the Pilsner coat of arms really looks lousy. I never realized that. ;-)

It's great to believe that someone like RPF would understand and love ST or XY if he lived for 100 years. But realistically, I think that old people are slowed down in their ability to learn sufficiently fundamentally new things. So even if you live for 60 more years between 100 and 160 years of age, you may only make as much progress as between 30 and 35, for example. ;-) So I have some doubts about the scenario.

Feynman's doubts about string theory were really technical in character - he believed that there had to be more straightforward ways to quantize gravity (and do other things). Well, I think that by now, he could have understood that this expectation has simply been shown incorrect. Even 50 years later, there aren't any non-stringy consistent theories of quantum gravity in d=4 or higher.

But to understand that this negative conclusion really implies the positive one, that one should take ST seriously and analyze near-Planckian and QG questions using the string formalism, really requires some active understanding of the technical issues in string theory, and being 97-year-old like RPF would be now could be a disadvantage, indeed.

reader john said...

Dear Lubos, you probably know that in his volume Weinberg almost derives quantum field theory from Lorentz invariance + unitarity + cluster decomposition. Do you know at which step he misses string theory? I guess he starts with 4 dimensions from the beginning, but I don't think he has used the fact that space-time is 4-dimensional explicitly anywhere. The only thing that comes to my mind is that string theory has an infinite number of particles; I think Weinberg always assumes that there is only a finite number of particles.

reader Luboš Motl said...

Dear John, Weinberg is aware of this flaw in his proof, see the top of page 8 at

for example. His proof really has a clear gap. Something... and therefore the Hamiltonian is an integral of a local density.

It obviously doesn't follow. More seriously, a theory of quantum gravity can't be defined with an integrated Hamiltonian at all.

One may also say that he's using "cluster decomposition" in several different ways - a weak one and a strong one. The strong one is de facto exact locality, and that's the assumption violated by string theory.

A totally different way to resolve your "paradox", at least in some case, is that string theory actually isn't a counterexample at all. In a broader sense, when one is allowed to change the spacetime, string theory *is* quantum field theory (like in AdS/CFT). And even in the same spacetime, *some* truncations of string theory, like open cubic string field theory, *are* a quantum field theory - with infinitely many fields.

If you have a very particular version of the proof, you may obviously go through each step and assumption and check whether it's obeyed by string theory or not, and in this way, you will see all the defects of the would-be proof and hidden assumptions.

reader john said...

Dear Lubos, thanks for your answer. I also thought that Weinberg's assumption that the Hamiltonian can be written as an integral of a local density may be wrong, but I wasn't definitely sure that this was wrong in string theory, although I suspected it (I only know the basics of bosonic string theory).

The mysteries of string theory are really amazing, but I also think that quantum field theory isn't properly understood yet (I am not talking about renormalization etc., but about a new formulation). I feel like wonderful things will be discovered in 10-20 years, based on some conferences I watched like Strings 2015, although of course I don't understand them properly. I believe that the third revolution will be caused by a new understanding of quantum field theory, but of course I might be wrong. I also think that this is the best way to proceed; I don't know if there is anyone who can directly (and successfully) work on the foundations of string theory. Even Witten says dualities are too hard.

reader AHD said...

Love it... I wonder if there's an easy way to see T~|log eps| ?

reader Luboš Motl said...

Hi AHD, chaos is a form of instability, so the deviation from the initial state grows like delta X = X0*exp(t/t0). The time needed to reach a particular enhancement, like from epsilon to 10%, i.e. exp(t/t0) = 0.1/epsilon, is therefore t = t0*ln(0.1/epsilon) by inverting the exponential. ln(0.1) may be neglected relative to the dominant ln(1/epsilon).
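This inversion takes one line to check numerically; the values of t0 and epsilon below are arbitrary illustrative choices:

```python
import math

t0 = 2.0        # assumed e-folding time of the instability
eps = 1e-6      # assumed initial uncertainty
t = t0 * math.log(0.1 / eps)       # time for eps to grow to 10%

# Inverting the exponential recovers the 10% level:
assert abs(eps * math.exp(t / t0) - 0.1) < 1e-12

# ln(0.1) is a small additive shift next to the dominant ln(1/eps):
print(t, t0 * math.log(1 / eps))
```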

reader AHD said...

Hi Lubos. I understand that saying T ~ ln(1/epsilon) for some epsilon is equivalent to saying delta X ~ exp(t). But I don't have an intuition or back of the envelope understanding of why the growth rate of the deviation is exponential.

reader Luboš Motl said...

Dear AHD, try to read

and its references. At the end, the exponential growth y = y0*exp(lambda*t) isn't anything else than the solution to the differential equation

dy/dt = lambda * y

which implicitly appears within a much more complicated stuff.

reader AHD said...

I understand that T ~ |log eps| is equivalent to del X ~ exp(t/tau). But I don't have an intuition or back-of-the-envelope understanding of why the growth should be exponential. That's really the key, right? Anything sub-exponential would allow similar initial states to stay "similar" for much longer periods. Any simple way to argue that the divergence from the initial state grows exponentially?

reader Luboš Motl said...

Note that the light cone gauge Hamiltonian (to be specific and to make things work in string theory) may also be written as an integral over space (of points where the center-of-mass point of the string is located) but one must be really careful about the attribution of the string fields to individual points. Differences in the distance by L_{string}, the characteristic length scale of string theory, matter.

If one is not careful, the effective field theory is the only thing that is left, and the effective theory is indeed a quantum field theory again. I would say that if one wants to sharply see everything that distinguishes string theory from the effective QFT, he must be careful about the new phenomena that occur at the string (length) scale, and in the more straightforward variables, the locality (or "strong cluster decomposition") will fail at this length scale.

There may be other variables in which the locality is exact, but Weinberg doesn't really prove that there exist variables in which it's exact assuming Lorentz invariance, unitarity and the weak cluster decomposition. So in some sense, his way of assuming cluster decomposition is assuming the QFT description.

QFT is perhaps imperfectly understood, but I think that the aspects of QFT that remain misunderstood are so different from what we teach in QFT courses that they really deserve to be called string theory and not QFT. It's the cutting-edge features and abilities of QFT that remain only partly understood, and those aspects have something to do with the more general perspective on all these theories; it's better to call this view "string theory" although much of it goes beyond our present understanding of string theory, too. But string theory is meant to be the more complete view of all the phenomena in this class of theories.

reader AHD said...

Thanks, Lubos. The wiki link just assumes the result I'm trying to develop an intuition for. It says that the Lyapunov exponent characterizes the exponential rate at which trajectories of two nearby points in phase space will diverge. I guess I need to dig into a stat mech book and see how the diff eqns that describe phase space dynamics imply this exponential dependence. I was just hoping I could be lazy and someone could feed me a simple answer :D

reader Luboš Motl said...

Dear AHD, you really get a simple

dy/dt = lambda*y

(for a matrix lambda, to start with, but it's the eigenvalues that matter) when you linearize the evolution at a given point of time around a given trajectory.

It's really a trivial thing. Try to look e.g. at

for a different treatment.

reader bog said...

He emphasizes the centrality of prediction for a scientific theory in that piece. The two main points, which were both consistent with that, were (1) that a predictive theory does not necessarily have to consist entirely of components that are falsifiable in their own right; (2) that background concepts - the meaning of words (like 'position'), how they couple to other words (like 'momentum'), what rules associate with that, and what impact that might have on yet other words like 'measure' - can totally change. What he definitely was not endorsing was predictionless, wholly unfalsifiable theories.

reader Luboš Motl said...


reader Richard said...

> the dependence on the question we ask – on the measurement we make – cannot be eliminated

I think it was Zurek who said QM treats correlations between properties, rather than the properties themselves, as measurement-independent. Statements about properties themselves, predictive, retrodictive or otherwise, are made in the context of the system interacting with an external environment.

reader zeGogglesDoNossing said...

The questions around consciousness and quantum mechanics are not philosophical questions, they are metaphysical questions. It's unfortunate physicists don't understand this distinction, because they end up talking past philosophers, casting aspersions on the whole field of philosophy, and, sometimes, becoming metaphysicians themselves. As always, Feynman is, essentially, on the money, though. Those metaphysical questions are silly.

reader Tony said...

Re site design and access: note that even in a desktop browser you can always press the mobile icon, which shows the site as designed for mobile browsers - much lighter on widgets and ads.

reader Tony said...

Great article by Weinberg, thanks for the link.

reader kashyap vasavada said...

Nice explanation of the log(eps) behavior, Lubos. One question: Is Weinberg talking about all field theories or a specific kind of field theory which has chaotic behavior?

reader kashyap vasavada said...

Sorry Lubos, I withdraw the above question. log(eps) was about Feynman's remarks and not about Weinberg's paper. But in general, probably highly nonlinear field theories give rise to chaotic behavior. Is that so?

reader Luboš Motl said...

Could you please explain the difference between philosophical and metaphysical questions?

reader Luboš Motl said...

Dear Kashyap, the chaotic behavior is really omnipresent. In this sense, almost all classical Hamiltonian systems produce chaotic behavior.

They are "highly nonlinear" in your jargon. This is particularly true for systems with many degrees of freedom, like a description of the motion of all 10^26 atoms in a kilogram of matter, and that's what Feynman meant.

reader RAF III said...


reader Richard the Second said...

Feynman once said that the essence of QM is captured by the double slit experiment. I wonder what he would say about these experiments:

reader Prathyush Manchala said...

Hi, Can you comment on the following discussion between Bohr and Pauli

"The idea that any observation must necessarily involve an increase in entropy has been much discussed and I remember that already in the discussions with Stern and you in Hamburg, when you helped me with the ‘proofs’ of the old paper on complementarity, I stressed the principal irreversibility of the concept of observation. More specifically, any observation must make use of some registering device, whether through a photographic plate or directly by the retina of the eye, which involves processes of amplification by which free energy is spent. I know that also Teller is interested in these problems which were discussed in Los Alamos. I shall be glad if either you or Stern would write to me what has come out of your discussions in Zurich."

Bohr to Pauli

reader Prathyush Manchala said...

I cannot reply on my old comment pending approval, This is letter 2 From Pauli to Bohr

"The discussions which I had here with Stern (he left Zurich a few days ago) concerned the quantitative side of the connection of the concepts of entropy and of observation, a connection which, as we all agree, is of a very fundamental character. The problem arises whether there is a well defined minimum of the increase in entropy, independent of the particular experimental arrangement in use, if a certain quantity (‘observable’) is measured. Our discussions seemed to indicate that this is actually the case, although we did not reach yet any final conclusion. The increase in entropy can easily [be] computed if one starts with a ‘mixture’ as initial states and changes it into a ‘pure case’ by constatation of the value of a certain quantity. If, however, before and after the measurement the observed system is in pure cases the situation seems to be more difficult to judge. (Example: a single particle in a closed box. Before the measurement it is supposed to be in a certain eigenstate with a sharply fixed value of the energy. Then one observes the place of the particle in space, perhaps by constatation that it is in a certain partial volume. What one can [can one] say about the amount of the increase in entropy through this measurement, independent of a particular experimental arrangement?) We discussed different experimental arrangements, but we are not sure how general our preliminary results are. Stern and I are both trying to continue the considerations of the problems, possibly by correspondence. Needless to say how anxious I am to hear what you know yourself on this quantitative side of the increase in entropy by observations and how grateful I would be if you could write to me your views on this problem. I feel that you might know the answer already or at least that you may find it out quicker than we.”
Pauli To Bohr

reader Luboš Motl said...

Dear Prathyush, I do agree that the entropy strictly increases at "almost all time" - except in equilibrium when it's constant - so it must strictly increase during measurement because this can't be quite at equilibrium.

On the other hand, when one is solving the Maxwell's demon paradox, I think that the right solution is that the entropy really increases only at the moment when memory is erased to make room for new measurements. But it's a technical detail. What is more important is that one can't systematically repeat measurements all the time while keeping the state of the overall system pretty much the same and while keeping entropy constant: it always grows macroscopically after a macroscopic number of measurements.
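For reference, the minimum erasure cost invoked in this resolution is Landauer's bound of k_B T ln 2 per bit; a quick sketch of the actual number, assuming an illustrative room temperature of 300 K:

```python
import math

k_B = 1.380649e-23        # Boltzmann constant in J/K (exact SI value)
T = 300.0                 # assumed room temperature in K

# Landauer's bound: minimum heat dissipated to erase one bit of memory,
# about 2.9e-21 J at room temperature
E_per_bit = k_B * T * math.log(2)
print(f"Landauer bound at {T:.0f} K: {E_per_bit:.3e} J per bit")
```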

reader Luboš Motl said...

Dear Prathyush, is there a reason why you posted three long, off-topic, and almost identical comments within 5 minutes?

reader Prathyush Manchala said...

Long they are. I've been reading them many times for years now, so they don't seem as long to me. You can delete Pauli's second letter if you want. I don't see these comments as off-topic, however; the topic is quantum mechanics. I think Bohr makes very important remarks in his third letter, which I don't fully appreciate. I was hoping you had something to add.

reader Leo Vuyk said...

IMHO, future Libet statistical experiments should solve the consciousness-QM problem.

The famous B. Libet brain scan experiment of 1964 suggests that in only a few percent of the brain measurements, test persons seem to pre-plan a simple action such as the pressing of a button. In most measurements he found no pre-planning at all - so are we robots?

Much of Libet's position (about the brain activity of test persons pressing a button) hinges on the distinction between Type I and Type II Readiness Potentials (RP). While the distinction between the two types in his data is very clear, it is less clear what critical difference in mental activity leads to these different classes of brain activity.

We would expect that, based on causality, the Time of Conscious Awareness (TCA) always comes before the electric RP found to be present in our brain, because this would be the proof that we humans are equipped with the free will to press the button at the moment WE want to do that!!! However, Libet (and later all other researchers) measured the opposite for most of the students, but not for all!

My suggestion is to measure the average percentages. Only a few Libet students reported having "preplanned their action" (RP I), which is in my multiverse-based consciousness perspective a key issue to focus on. The statistical knowledge of how few students are preplanning their action should indicate how many instant entangled copy universes we have to deal with in a democratic sense.

See: Democratic Free Will in the Instant Entangled Multiverse

reader Prathyush Manchala said...

Sorry I Posted very long comments instead of just the summary. Feel free to remove them.

What do you think Bohr means when he says this?
"On the one hand, it is evident that any practical observational arrangements, making use of photographic plates, cloud chambers or direct sensual impressions, involve a mechanism of amplification in the working of which free energy is spent in amounts out of all proportion with the energy exchanges characterizing the individual atomic processes under investigation."

reader Prathyush Manchala said...

Hmm... The Third letter is pending approval. Any reason? I think this letter contains some of the important insights into quantum mechanics.

reader Liam said...

Er, I think it was just a typo and zeGoggles meant to say "physical" not philosophical...

reader Luboš Motl said...

Dilaton made me remove you from the black list but you just seem to be utterly insane. Within 1 hour, you have posted 6 off-topic, unprovoked, uninteresting, lame comments using two IDs.

Because you clearly can't make yourself behave decently by yourself, you have the limit of 1 comment per day (counting all of your sockpuppets) - the limit has been depleted for today - and once you surpass it, you're back to the black list, OK?

reader Prathyush Manchala said...

Sorry, won't happen again, I promise.

reader Fer137 said...

I found this excuse by D. Radin for not claiming the famous Randi prize. (It seems disingenuous.)

"For the types of psi effects observed in the laboratory, even a million dollar prize wouldn't cover the costs of conducting the required experiment. .... So, from a purely pragmatic perspective, the various prizes offered so far aren't sufficiently enticing."

reader Liam said...

He means that when we measure something about a microscopic system, we do so by coupling it to a macroscopic system (the apparatus) that rests in some kind of fine-tuned unstable equilibrium, so that the microscopic outcome can be amplified into a macroscopic record that we can actually inspect.

The microscopic event involves at most some small Delta-E; the creation of the macroscopic record, a much larger one (sourced from the stored potential energy of the apparatus). Entropy increases, decoherence occurs, etc.

So measurement is always in some sense a thermodynamic process.

This is *so* well understood today that we take it pretty much for granted and it seems "obvious". I guess at the time of writing the letters it was a (relatively) new way of thinking about things, so Bohr goes on about it at frankly surprisingly great length.

But he's basically making the same core point in a variety of different ways - that measurement is a kind of entropy-increasing amplification that arises from coupling a micro-system to an "appropriately sensitive" macro one.

Unless you want to start going into technical details, there really isn't any more to say about it, or much of a "mystery" here.

reader Prathyush Manchala said...

Yes, I would be happy to study it more carefully. If you have any papers to share please do so.

reader Jason said...

It's remarkable how Feynman is the polar opposite of John A. Wheeler when it comes to taking the more bizarre consequences of QM seriously, especially since Wheeler played no small role in helping Feynman develop his QED theory and "path integral" formulation of QM. Wheeler didn't believe in ghosts, but he had no problem imagining a double slit experiment the size of the universe and asking what consequences it could have. Feynman didn't live to see that one, but he probably could pinpoint with one guess who came up with the idea, and then come up with a counter-argument no one today has thought of yet to disprove it.

reader zeGogglesDoNossing said...

Ever since Hume, metaphysical questions have been criticized as impossible if not meaningless. A classic instance is the question of substance - in metaphysics, this means things that exist in themselves. The most interesting proposal here was Spinoza's construction of the monad as, essentially, the terminal object in a category whose objects are beings and whose morphisms are attributes (and, yes, Grothendieck did love Spinoza). Questions about the existence of first causes, the unchanging, the unmoved mover, the distinction between necessary and contingent beings, the existence of beings standing outside the system of all possible worlds, the mental vs. the physical, and the problem of free will... these are metaphysical questions.

Weyl was interested in phenomenological questions in philosophy (he sent Husserl a copy of Space-Time-Matter and claimed it was a work in phenomenology in Husserl's sense), and this inspired his invention of, among other things, gauge theories.

reader Luboš Motl said...

I don't see the difference you mention at all. Feynman surely did believe that the double slit experiment works and interferes at astronomical length scales as well, and this very monologue reveals that it would be his default assumption.

On the other hand, I think that Wheeler had rather orthodox ideas about quantum mechanics, too. They did some of the "seemingly unconventional" work together - like the semi-retarded propagators etc. Many of these things have morphed into the standard theorist's toolkit.

reader Liam said...

Sure, for some recent work in the area you could try here (Zurek et al.):

Or to start with a more general discussion, chapter 21 of this introductory QM textbook looks pretty good.

The chapter is called "Decoherence and Thermodynamics" and is online viewable on ggl books:

reader Liam said...

Ha ha - I liked Clara Moskowitz's question about how come we should censor the exterior of cosmic horizons but the interior of black hole ones, (not *really* a new point is it, but still a good one)! :D

I guess if Polchinski doesn't believe the Equivalence Principle anyway, he's allowed to censor whichever damn side he likes by arbitrary fiat... :P

reader TomVonk said...

Like Lubos said, it's a property of a Taylor expansion in the neighbourhood of some initial point.

In detail, this is how it comes about.

Let's define some system's dynamics by dx/dt = f(x)

where x is a point in the phase space and f is some nonlinear function of x.

Solutions of this ODE are orbits in the phase space x(x0,t), where x0 is the initial point for t=0.

Let's look at an orbit that started slightly off x0 at t=0, e.g. x(x0+eps,t), where eps is infinitesimally small.

The distance between the orbits is then x(x0,t) - x(x0+eps,t), and the rate of change of this distance is:

dx(x0,t)/dt - dx(x0+eps,t)/dt = f(x(x0,t)) - f(x(x0+eps,t)) (by the definition of the dynamical equations giving f)

Taylor expanding f in x, we have: f(x(x0,t)) - f(x(x0+eps,t)) = df/dx(x0,t) . (x(x0,t) - x(x0+eps,t)) + higher order

Calling the distance between the orbits D(t), we then have:

dD/dt = df/dx(x0,t) . (x(x0,t) - x(x0+eps,t)) = df/dx(x0,t) . D, and for small times where the linearisation is OK it follows that:

dD/D = df/dx(x0,t) . dt = constant . dt and D = K.exp(lambda.t).
Lambda is called the Lyapunov coefficient, and if it is positive, the exponential divergence of orbits (aka sensitivity to initial conditions) follows.
The generalisation to N-dimensional spaces is immediate by using the Jacobian matrix.
I would also like to stress what Lubos already wrote - chaotic systems are the rule and regular, integrable systems are the exception.
For instance, almost all classical Hamiltonian systems are chaotic because they conserve volumes in the phase space, which implies that the sum of the Lyapunov coefficients is 0. So if they are not all zero, there is necessarily at least one which is positive. Hence chaos.
The chaotic gravitational system of N bodies (N>2) is a famous example, known since Poincaré 100 years ago.
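This recipe can be turned into a few lines of code: for a 1-D map, the Lyapunov coefficient is just the orbit average of ln|f'(x)|. A sketch for the logistic map x -> r*x*(1-x), where the parameter values below are my illustrative choices:

```python
import math

def lyapunov_logistic(r, x0=0.4, transient=1000, steps=100000):
    """Estimate the Lyapunov exponent of the logistic map by averaging
    ln|f'(x)| = ln|r*(1 - 2x)| along the orbit, i.e. the 1-D version of
    linearizing the dynamics around the trajectory."""
    x = x0
    for _ in range(transient):       # discard the initial transient
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(steps):
        total += math.log(abs(r * (1 - 2 * x)))
        x = r * x * (1 - x)
    return total / steps

print(lyapunov_logistic(3.9))   # chaotic regime: positive exponent
print(lyapunov_logistic(2.5))   # stable fixed point: negative exponent
```

A positive average means nearby orbits separate exponentially; a negative one means they converge, which is TomVonk's criterion for chaos vs. regularity.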

reader Luboš Motl said...


And at some moment, the selective presence of weird new phenomena in otherwise equivalent situations becomes as bad as a generic belief in paranormal phenomena or anything of the sort.

reader kashyap vasavada said...

Thanks Lubos. Then, if I understand your reply correctly, the majority of systems would be chaotic and therefore unpredictable after a long time. Am I overstating the case? BTW, is there an emphasis on the word "classical" in your reply? Does it mean that the majority of quantum systems (which means everything, since everything is quantum anyway) may not be chaotic?