Well over 90% of the visitors to threads such as this Stack Exchange question, in which Gerard 't Hooft asks why professional physicists consider his research of the last 10+ years to be wrong, are either unwilling or unable to think rationally, or simply ignorant of the basics of modern physics.

Pretty much every Internet forum about sufficiently fundamental physics questions is completely overwhelmed by cranks, and the market for popular books on quantum mechanics and for newspaper articles isn't much different.

That's why one must be kind of happy about every single person who hasn't completely lost his mind and who is still willing to "speak" in public. In this case, it's Peter Shor of MIT, who answered 't Hooft's question as follows:

PS: I can tell you why I don't believe in it. I think my reasons are different from most physicists' reasons, however.

Regular quantum mechanics implies the existence of quantum computation. If you believe in the difficulty of factoring (and a number of other classical problems), then a deterministic underpinning for quantum mechanics would seem to imply one of the following.

- There is a classical polynomial-time algorithm for factoring and other problems which can be solved on a quantum computer.
- The deterministic underpinnings of quantum mechanics require \(2^n\) resources for a system of size \({\mathcal O}(n)\).
- Quantum computation doesn't actually work in practice.

For the second, deterministic underpinnings of quantum mechanics that require \(2^n\) resources for a system of size \({\mathcal O}(n)\) are really unsatisfactory (but maybe quite possible ... after all, the theory that the universe is a simulation on a classical computer falls in this class of theories, and while truly unsatisfactory, can't be ruled out by this argument).

For the third, I haven't seen any reasonable way in which you could make quantum computation impossible while still maintaining consistency with current experimental results.

LM: It's of course a type of argument that uses tools characteristic of someone doing quantum computation rather than fundamental high-energy, condensed-matter, or other "real phenomena" physics, but I for one consider this concise answer totally valid and kind of indisputable.

Quantum mechanics implies that quantum computers may be constructed in principle and that once they're constructed, they work. This conclusion is really a totally straightforward application of the verified laws that govern the behavior of all physical systems in our quantum Universe – that govern all the quantum information around us. One just combines a thousand degrees of freedom (\(n={\mathcal O}(1{,}000)\)), a thousand qubits, and one can already build devices that break the codes the world relies upon today within seconds, or much faster than that.

It is not really plausible that the laws we have verified for collections of several degrees of freedom, or for quadrillions of degrees of freedom arranged in various ways, will fail for a thousand qubits in a quantum computer. It's still true that the size (dimension) of the Hilbert space has to grow exponentially with the number of qubits, and it's still true that the Hilbert space obeys the superposition principle. Deviations from quantum mechanics that were able to cripple the correct quantum behavior of \({\mathcal O}(1{,}000)\) qubits would almost certainly produce measurable deviations and problems already when tested on several qubits, and we know that no such deviations exist.
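The exponential growth of the dimension is just the tensor-product structure of quantum mechanics, and it can be checked in a few lines. A minimal sketch (the choice of the \(|+\rangle\) state and of 10 qubits is illustrative): composing single-qubit state spaces with the Kronecker product multiplies dimensions, so \(n\) qubits live in a \(2^n\)-dimensional Hilbert space.

```python
import numpy as np

# Each qubit contributes a 2-dimensional state space; the joint state of
# n qubits is the tensor (Kronecker) product, so the dimension doubles
# with every qubit added.

plus = np.array([1.0, 1.0]) / np.sqrt(2)  # single-qubit |+> state, dim 2

state = np.array([1.0])
n = 10
for _ in range(n):
    state = np.kron(state, plus)  # each qubit doubles the dimension

print(len(state))                       # 2^10 = 1024 amplitudes
print(np.isclose(state @ state, 1.0))   # the product state stays normalized
```

Note that nothing "collective" was imposed: the dimension \(2^n\) follows from letting each qubit have its own independent state.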

So if you avoid some science-fiction-like conspiracy theories in which Nature is not only deceitful but literally evil – making it look like the laws of quantum mechanics work for one, two, or any number of atoms, as well as for their complicated macroscopic bound states containing trillions of trillions of atoms that we have tested – you're bound to conclude that it must be possible in principle to create a gadget with just thousands (or millions, it doesn't really matter) of elementary building blocks that may crack these codes in a short enough time. Because we need this gadget to work for just thousands or millions of steps, it's enough to suppress the "decoherence per operation" and the errors of the operations by a factor of a million, which is doable because it's far less dramatic than the super-exponential size of the numbers that may be factored.
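The "thousands or millions of steps" arithmetic can be made slightly more quantitative. Textbook estimates (the cubic exponent and the absence of a constant factor are rough assumptions of this sketch, not claims from the text above) put Shor's algorithm at roughly \(n^3\) elementary operations for an \(n\)-bit modulus, so the tolerable error per operation only needs to shrink polynomially in \(n\), while the number being factored grows exponentially in \(n\):

```python
# Back-of-the-envelope sketch: with a ~n^3 gate-count estimate for Shor's
# algorithm on an n-bit number (constant factors omitted; illustrative),
# the required per-gate error rate shrinks only polynomially.

def rough_gate_count(n_bits):
    """Crude O(n^3) gate-count estimate; the cubic scaling is an assumption."""
    return n_bits ** 3

for n_bits in (1024, 2048):
    gates = rough_gate_count(n_bits)
    print(f"{n_bits}-bit modulus: ~{gates:.1e} gates, "
          f"error per gate must be well below {1.0 / gates:.1e}")
```

So a polynomial improvement in fidelity buys an exponential improvement in the size of the numbers one may attack, which is the asymmetry the paragraph above exploits.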

But this conclusion has far-reaching consequences for the proposal that there's any fundamentally classical or deterministic or realist "mechanism" running behind the scenes. If this were the case, it would be possible to break the codes with a fundamentally classical computer, too – simply because a quantum computer may solve these problems via Shor's algorithm and related algorithms.

This sounds inconceivable. Even if you were ready to believe that those "super difficult" problems in computer science have a polynomially fast classical solution, in contradiction with the beliefs of most complexity theorists, it wouldn't be enough to make this unusual picture of computer science compatible with the assumption of realist underpinnings of quantum mechanics. Because all these quantum algorithms really work on the same principle – some kind of period finding by a method that looks like massively parallel computation – there would have to be a rather simple and unified classical algorithm that just emulates a quantum computer and solves all the difficult problems in pretty much the same way.

All these things are unimaginable. The classical computer would really have to be very simple. Because the basic idea behind Shor's algorithm is also simple, you can't rely on any hyper-complicated classical algorithm that would resemble Wiles' proof of Fermat's Last Theorem. The classical algorithm to factor large numbers would pretty much be bound to be composed of two pieces: a simple enough gadget that emulates quantum mechanics, and Shor's or a similar algorithm added on top of that.
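For concreteness, here is the second of those two pieces – the purely classical post-processing of Shor's algorithm – sketched in Python for the standard toy case N = 15, a = 7. The period-finding loop below is done by brute force, which is exactly the exponential step a quantum computer replaces with an efficient one:

```python
from math import gcd

# Classical half of Shor's algorithm: once the period r of f(x) = a^x mod N
# is known (the part a quantum computer finds efficiently), nontrivial
# factors of N follow from a gcd computation.

def factor_from_period(N, a, r):
    """Recover nontrivial factors of N from the period r of a^x mod N."""
    if r % 2 != 0:
        return None  # need an even period; one would retry with another a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None  # a^(r/2) = -1 mod N yields only trivial factors
    return gcd(y - 1, N), gcd(y + 1, N)

# Find the period classically -- exponential in general; this brute force
# is the step the quantum Fourier transform makes efficient.
N, a = 15, 7
r = next(k for k in range(1, N) if pow(a, k, N) == 1)
print(r, factor_from_period(N, a, r))  # period 4 -> factors (3, 5)
```

The gcd step is trivially fast; everything hard is concentrated in finding r, which is why a classical gadget that cheaply emulated the quantum period finding would break factoring outright.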

If this were possible, it should be really simple to see what the "emulation" of quantum mechanics would look like.

The only conceivable classical solution that imitates quantum mechanics "faithfully enough" is one that remembers the whole wave function or the density matrix but gives them an invalid interpretation – it treats them as classical observables. It then has to be equipped with some extra "collapses" that guarantee that the wrong "the wave function is a classical wave" interpretation is effectively converted to the right "the wave function is just a probability (amplitude) wave" one, so that it doesn't conflict with the observations – because the fact that the wave function is just a template to get probabilities is something we directly observe in the experiments (although some crazy people are trying to turn it into a super-controversial mysterious assumption).

(Such collapse-based fakes of quantum mechanics have to be infinitely fine-tuned to agree with things like the Lorentz symmetry of all effects but they're still fundamentally wrong because in Nature, the wave function simply isn't an objective observable and this fact may be demonstrated and has been demonstrated.)

But if you adopt such a classical fake of quantum mechanics, the number of classical degrees of freedom – recall that the individual amplitudes in the wave function have become classical observables – will grow exponentially with the number of qubits, i.e. with the number of quantum degrees of freedom. This is really implausible as well. Both in classical field theory and in quantum field theory, the experimentally verified Lorentz symmetry seems to imply that the number of elementary observables scales with the volume. If the quantum field theory is an effective quantum field theory including gravity, the true number of degrees of freedom grows even less quickly than that – it's proportional to the surface area (because of the holographic principle).

It seems implausible that you could design a theory that agrees with the precision tests of the Lorentz symmetry in so many contexts and that would nevertheless imply that the number of degrees of freedom grows exponentially with the number of qubits in a quantum computer. And you may verify that none of the proposed theories (or classes of theories) intended to fake quantum mechanics achieves such a goal. Such a dramatic superlinearity would contradict not just the apparent absence of action at a distance; it would really conflict with the extensiveness of ordinary materials such as air and water, too (and with other basic facts).

The logic is really watertight and any proponent of a fundamentally "realist" description of quantum mechanics must choose one of Shor's options. Incredibly enough, Gerard 't Hooft chooses the "quantum computers can't work" option, which is almost certainly the least plausible among the three implausible ones. (Of course, I've known about this insane claim of 't Hooft's for many years.)

A quantum computer is just a clever piece of applied math or engineering. It combines many "qubits" – quantum degrees of freedom whose behavior has been verified separately, in small groups, or (very accurately) in special large groups – with local steps that may be applied to the state of these objects (e.g. electrons' spins). What could possibly go wrong that would prevent the quantum computer from working yet wouldn't show up as a disease in any of the experimental tests that quantum mechanics has been passing flawlessly for almost a century? It just makes no sense whatsoever.

If you have \(n\) qubits, the dimension of the Hilbert space is \(2^n\). It has to be so because each qubit is allowed to be in a state independent of the others. If a group of 1,000 electrons were already forced by Nature to adjust their behavior according to some collective properties, it would mean that Nature includes a very blatant action at a distance. If this were the case, we would have already observed telepathy, telekinesis, voodoo, and other such things. Imagine that Nature were secretly imposing quotas on the number of electron spins in a region that can be up. Such quotas would instantly lead to violent violations of the angular momentum conservation law, and so on. While my sister, who just returned from the Summer Tarot School for Beginning Witches ;-), surely thinks that such phenomena are commonplace, I think that people familiar with the basics of the physical sciences realize that they have never been observed despite many somewhat sophisticated attempts to find them.

You could invent other "toy models" in which Nature doesn't remember an exponentially large number of complex amplitudes for 1,000 qubits. For example, Nature secretly chooses a basis and only remembers the 1 trillion largest complex amplitudes (in absolute value) among all the amplitudes, while the rest are set to zero after each Planck time and the whole wave function is renormalized. Imagine anything of the sort and think of the consequences. You would see that any such unnatural interventions would immediately lead to spectacularly observable new "supernatural" effects. All these interventions would be nonlocal effects that may easily become macroscopic if they operate in macroscopic situations. For example, because the largest amplitudes would likely be either positively or negatively correlated with a large energy, the elimination of the smaller or larger amplitudes would lead to an energy increasing or decreasing on average: a perpetual motion machine would become possible. And so on, and so on. There's just no way to avoid these things.
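One can make such a toy model concrete in a few lines of Python (the sizes are illustrative – 10 qubits and 16 surviving amplitudes instead of 1,000 qubits and a trillion). Truncating and renormalizing the state vector is a nonlinear, basis-dependent operation: it discards probability and distorts the state, precisely the kind of intervention that would produce observable effects.

```python
import numpy as np

# Toy version of the "keep only the largest amplitudes" model: zero out all
# but the largest-magnitude amplitudes of a random state, then renormalize,
# and see how much the state is distorted.

rng = np.random.default_rng(0)
n = 10
psi = rng.normal(size=2**n) + 1j * rng.normal(size=2**n)
psi /= np.linalg.norm(psi)  # a normalized random 10-qubit state

keep = 16  # keep only the 16 largest amplitudes, zero the rest
idx = np.argsort(np.abs(psi))[:-keep]
chopped = psi.copy()
chopped[idx] = 0.0

lost = 1.0 - np.linalg.norm(chopped) ** 2
chopped /= np.linalg.norm(chopped)  # renormalize, as the toy model demands

print(f"probability discarded before renormalizing: {lost:.3f}")
print(f"overlap with the true state: {abs(np.vdot(psi, chopped))**2:.3f}")
```

Most of the probability is silently thrown away and the surviving state has little overlap with the correct one, so measurement statistics would shift grossly – the "supernatural" signature described above.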

The excuse that there are many degrees of freedom isn't a license for you to mess with the laws of quantum mechanics. Quantum mechanics has been tested not only for a few degrees of freedom; its numerous predictions have been verified for trillions of trillions of degrees of freedom, too. Only if quantum mechanics produces the right classical limit for macroscopic bodies may you argue that it inherits the agreement with experiment from its classical predecessor. Any selective filtering of or messing with the quantum information in the case of many qubits – something needed to "kill" the quantum computers – would imply that quantum mechanics no longer has the right classical limit, so it would fail even in tests that were done before quantum mechanics was born. Everyone who suggests that quantum mechanics has only been verified for 2 or 3 spins etc. misunderstands quantum mechanics in the same sense as those who say that physics only works for the motion of planets but says nothing about phenomena that matter to humans (where witchcraft takes over).

There can't be any constraint that would prevent 1,000 qubits from trying all \(2^{1,000}\) states. The holographic limitations of quantum gravity come closest to this possibility, but they may only become relevant when you can see that they're relevant – when the matter is too dense and collapses into a black hole. When you don't see anything this spectacular, it's just impossible for electrons' spins to routinely deviate from their quantum-mechanics-predicted behavior.

Now, the quantum computer is capable of doing several basic operations on localized groups of qubits. Those things have really been tested in isolation; the ability of these groups to work is physically equivalent to the many tests of the laws of quantum mechanics that have already been performed.

An imitation of quantum mechanics that agrees with the normal experiments that verify quantum mechanics but that also miraculously bans quantum computers at the same moment is exactly as crazy as a theory of classical computers that suddenly prevents you from connecting a GPU with a microprocessor. It can't happen. The denial of the existence of quantum computers is the complete and full-fledged denial of all of quantum mechanics. I am amazed by people such as Mr Ron Maimon who are ready to say that they really don't deny anything about quantum mechanics at all, that it's just a straw man: they "only" deny quantum computers, the superposition principle, the existence of entanglement (at least when there are many degrees of freedom), the unitarity of the evolution, every other postulate of quantum mechanics, and every single consequence of quantum mechanics (perhaps unless it may be verified directly in their kitchen, while they're allowed to forget everything they've learned previously).

It's a stunning amount of incoherence and stupidity.

Why don't they test their ideas against at least a single, simple situation – a collection of 3 qubits, the ammonia molecule, anything else? They would see as clearly as I did when I ran these tests that the status of all these "theories" is exactly on par with telepathy. They just don't copulating work at all.

And that's the memo.

## snail feedback (46) :

Aah, this came in with exactly the right timing to be nice lunchtime reading :-)

I'll probably reread the nice introductory TRF article about quantum computers, too, later today.

I don't know why this is, but questions on Physics SE or articles on the internet that deal with "quantum foundations", "quantum interpretations", etc. always make my mind shut down ... :-D. I barely manage to read the title if it describes the body of the post in too obvious a manner, much less can I follow the often lengthy discussions ... :-/

This is strange, maybe I should see a doctor and ask him what is wrong with my hardware ... :-P

Wow, what a memo! %-o

Off topic:

Tommaso Dorigo has become really ridiculous now; he gets excited about, and cheers as "plot of the week" etc., every single BSM phenomenon that has NOT been observed (at the LHC) so far, LOL :-D

Not sure if such pathological scornfulness and negativity (well known from Dorigo's best friend...) is the right attitude for doing serious and honest work in physics impartially ... :-/. Maybe he needs a supervisor to have a close look at what he is doing ...

(I did not click him of course, just saw the title in Kneemo's side bar ...)

Well, *nothing* beyond the Standard Model has been (clearly) observed by the LHC (and published), so it may be easier to say what *was* observed, namely the Standard Model, which predicts various cross sections for all processes as functions of energy. These generally give decreasing (as a function of energy) charts whose detailed shape isn't really interesting for anyone, and so far the model nicely agrees with the experiments.

What's silly is for him to suggest that the Standard Model is "possessed" by sourballs like himself. The Standard Model was found by folks like Gross and Wilczek, among others, and most of them continued to think about physics beyond what they found – and another majority of these fathers, in fact, believes that it will be found at some moment. That a theory works for years doesn't mean that it will hold forever.

Interesting: a modern incarnation of the Einstein-Eddington debates.

Since QED it has been clear that even the simplest particle interactions have a fractal nature and so require an infinite amount of computation. By comparison, O(2^N) is trivial. If I understand correctly, the only particle-level problems that can be solved are the ones where there is enough symmetry to use analytic methods to get rid of the infinities; all the other problems are left unsolved and experimental values are plugged in. Nature has no such luxury and is simply capable of doing all that, or else what is really happening is something quite different from what the models say is happening.

Apologies, Jonathan, I have personally no idea what you're talking about and how it's relevant to the topic of this blog entry.

Whether "particle level problems" may be solved analytically – of course, most problems in physics can't be solved analytically – has nothing to do with the question whether combinatorial problems such as finding the factors of very large integers of the form prime1*prime2 can be efficiently solved. These are completely different questions. One of them belongs to physics – and to analytic solutions (numerical solutions always exist, at least in principle); the other belongs to computer science, where we search for discrete answers.

But even if I imagine that it's OK for you to be off-topic, I don't really understand what you're saying about QED. To get analytically unsolvable problems, you don't need QED. The three-body problem in classical mechanics is already enough. The helium atom is a good enough quantum example, too.

So good try but probably not too good. ;-)

The point I am trying to make in relation to the post is that the 2^N order of complexity vastly underestimates the problem. If you step back and look at what is going on in QED, then 2^N is not so hard at all to accept by comparison. The point is not that numerical methods are required; it is that the calculation of the series of virtual particle interactions is divergent and so cannot be solved numerically. Only in some cases can a combination of analytic and numerical methods be used, and if the analytic method cannot help then no solution is possible based on what is now known. In some sense QED is more advanced than the basic superposition postulates used to describe a quantum computer, but the idealized quantum computer models are really implemented out of building blocks that use QED. So the 2^N is a huge simplification of what is known to be really happening already. I hope that clarifies what I am trying to say.

I went to that page on Stack Exchange intent on upvoting every answer or comment from the boss, but then realized that I lack just about everything needed to evaluate the respective arguments made by the disputants. Don't get me wrong, I am willing to bet on Doc Motl and Prof Shor being right and the opponents being wrong, but this is based on "trust", not on a qualified analysis.

I had to make an exception, however, for this comment:

Dear @Ron, what is "conservative" and what is "progressive" among physical theories can't really be classified so easily and unambiguously. You may describe quantum mechanics as a conservative theory. That's fair but someone else could also describe it as the still-fresh progressive development and the people who would like to restore determinism of the 17th-19th centuries are the true paleoconservatives or reactionaries or whatever. These are just different ways to spin the "camps". The truly relevant separation is good physics and bad physics and the denial of QM in 2012 is bad physics. – Luboš Motl 7 hours ago

Woo hoo! First class -- and dear Ron walked right into it ;)

Dear Jonathan, 2^N underestimates *which* problem?

Again, these types of problems as well as these types of "complexity" have nothing to do with one another, and your particular claims are wrong, too. The fact that QED or almost any other QFT has UV divergences doesn't mean that it can't be calculated numerically. QCD has the same divergences when treated perturbatively but it's treated numerically all the time – it's called "lattice QCD". The divergence only appears if one naively takes a limit at the beginning instead of taking it at the end.

You are using "2^N" for some quantity and you never say what the quantity actually is. The degree to which it is ill-defined is really striking because you seem to be using it for the dimension of the Hilbert space, size of large integers, as well as time needed to solve a numerical problem in quantum field theory. These are three totally different things.

Taking a limit at the end is an example of an analytic technique, because taking a limit is an analytic tool that is used to deal with the problem. My claim is that nature does not have access to such techniques, or if it does, they do not yet appear in physical theories. In the case of lattice QCD, the physical numbers that are plugged into the models are not the real numbers, but are pre-adjusted so that the result agrees with experiment. That's because the technique cannot get the correct answer on its own, so we use nature to get the correct answer and then fudge the inputs. I believe this is because of the infinite fractal nature of the Feynman diagrams needed for the calculations. To make my previous statements clearer, let me rephrase them to say that the 2^N size of a Hilbert space for a quantum computer is simple by comparison to the infinitely recursive Feynman diagrams needed to describe the implementation of even the simplest possible QC.

Yeah, that is a very good one :-)

When going to Physics SE to upvote Lumo systematically, you have to be careful, since if the "system" detects what you are doing, the upvotes could get reversed ... :-/. I would probably not cast more than 10 upvotes, well separated in time, per day :-).

Dear Lubos,

I'm looking forward to reading this new article.

How sure are you that the legitimate Gerard 't Hooft was visiting your blog and not just some random physics student looking for some attention?

One thing which bugs me is that there used to be a wonderful photograph on the internet of the young Gerard 't Hooft lecturing in front of the blackboard. I can't find it anymore right now. Maybe some of you know a link. If it is public domain it would make a nice illustration for one of Lubos's articles.

Oh, this is about quantum computers. In fact, I remember Gerard 't Hooft addressed this question and predicted that it would break down at a certain level.

Dear Mikael, I am absolutely sure that 't Hooft on this blog is the real 't Hooft because he has the same IP address and for many other reasons.

I've talked to 't Hooft several times before, too, and also took several pictures of him. Attached is 't Hooft in front of the Harvard high-energy coffee machine. ;-)

I guess I'm missing something here, but it seems to me that the double-slit experiment alone shows that there cannot be a pre-existing reality that determines the result of a measurement. If you try to force such a "reality" into the double-slit experiment you wind up in fantasyland. This isn't rocket science, for God's sake. You simply cannot twist classical mechanics into an explanation of the double-slit diffraction pattern.

Give it up, folks, and come in out of the rain. It's eroding your brains.

Right, I remember the paper as well. That's when I reduced my attention from his papers to the abstracts.

There isn't any "some point here". Quantum computers are systems of 1,000 or so qubits, which is neither too few nor too many. The theory is tested both for lower and higher numbers. There's nothing remarkable or "limiting" about a quantum computer running Shor's or a similar algorithm.

"An imitation of quantum mechanics that agrees with the normal experiments that verify quantum mechanics but that also miraculously bans quantum computers at the same moment is exactly as crazy as a theory of classical computers that suddenly prevents you from connecting a GPU with a microprocessor. It can't happen"

That's a perfect exorcist's prayer before bed ;-)

Goodnight !

Exactly, Gene. The double-slit experiment contains all the wisdom about QM – one only needs to think about it a bit carefully. And the non-existence of any objective information about the particle/wave prior to the measurement is a directly experimentally measured fact whose denial is equivalent to a suicidal attack against one's own brain.

Thanks for the picture Lubos. :-)

I am still not convinced by your arguments about the quantum computer, but maybe I just have to read them more carefully. After all, we know that at the level of cats we cannot distinguish whether there is unitarity or a true collapse of the wave function, because we are not able to follow all the degrees of freedom, let alone roll them back. Interestingly, these limitations are of the same kind as the ones in thermodynamics. Our attempts to do quantum interference with hotter and more macroscopic objects will fail at some point for this reason. So we can never be quite sure whether quantum mechanics breaks down or we have just reached the limits of our technical abilities. Similar things could apply when constructing bigger and bigger quantum computers. So I would have tried to attack 't Hooft's position on the basis of the Bell inequalities, where I really see no way for him to escape them without any conspiracy theories. Apparently it is your "sportive" ambition to kill his arguments without even using this killer argument. :-) And to pick up Gene's comment, I indeed have a hard time imagining a classical explanation of the double-slit experiment.

Darn, I need such a high-energy (maybe Planck scale ?) coffee machine too in the morning... LOL :-D

Cool picture !

Gene: "I guess I'm missing something here"... The point of thought-experiments like Schrodinger's cat? No reality before you measure would mean the cat really is neither alive nor dead until you look.

Dear Lubos, OT (or maybe not) but I thought this was pretty funny:

http://tinyurl.com/7fdf87t

Nima would have about 30 espressos a day so of course he soon organized a personal coffee machine upstairs. ;-)

Dear Mikael, I can't make sense out of what you're saying. As long as you adopt the basic quantum framework, you surely *can* decide whether unitarity holds. Unitarity always holds in the proper QM framework because it's a consistency condition: the total probability of all possibilities is 100%. The number of measurements one needs to prove this conclusion is zero but this doesn't make it less scientific; on the contrary, it makes it less reliable.

You can also prove – not 100% purely mathematically, but with the help of observations – that there's no process that would "collapse" a physical wave, because if such processes existed, they would lead to violations of the Lorentz invariance that would inevitably show up in other situations, but they never do.

Both learning and thermodynamics have irreversibility of a sort that depends on arrows of time, and the thermodynamic arrow of time is indeed demonstrably aligned with the logical one. But why are you opening this extra layer of off-topic fog in a discussion on whether or not quantum computers can exist? Don't you see that what you're saying can't possibly have any relevance for the question – while on the other side, there is an actual, relevant, full proof that quantum computers have to be possible in principle? I find texts like yours completely irrational. The coherence length is way shorter than one sentence. You just seem to combine some physics jargon that doesn't make any sense. There's no way anything of the sort you're writing could imply anything. It's just pure gibberish.

Lubos - I'm not sure whether you have a marvelous knack for coming up with simple plausibility tests or a marvelous willingness to use them. (I'm not well-read enough.) I suspect it's the former, but whichever it is, I applaud you.

I'm sure the sourball gracefully overlooks the fact that Gross is a cofather of the SM too ;-)

And without actually letting my mouse slip to click the dark side, I conjecture that the sourball's "Plot of the week" has become the Trollmaster's current "This week hype" now, LOL ... :-P :-D ;-)

Yeah, I always like looking at the pictures you have made of your colleagues; they are cool :-) and some party pictures are very funny :-D

@Lubos Motl

Well, it is not a comment on this article, but I have a question – can you write something about the perspectives for theoretical physics in Eastern Europe (Poland)?

I mean, I will be a math and physics student – 1st year at the University of Warsaw – and, if it is possible, I would like to hear from you whether it is possible to do serious research in Poland etc.

I have read that you did your undergraduate studies in the Czech Republic but your PhD in the USA. Was there a good environment there to do string theory or did you just have to do it all alone?

Thanks in advance.

Krzysztof J

Dear Dilaton, I've mentioned Gross and Wilczek because Wilczek is a great supersymmetry enthusiast, too. Of course, there are others among the fathers of the SM. Weinberg is a mild ST champion.

Abdus Salam unfortunately died in 1996 but he had worked on string theory, too, see e.g.

http://www.sciencedirect.com/science/article/pii/0550321386903664

The idea that the Standard Model is "against string theory or SUSY" is completely missing the point of high-energy physics which surely isn't to try to stop any progress beyond a certain arbitrary, obviously not final, point.

Krzysztof J asked in the defunct blogger.com comments:

@Lubos Motl Well, it is not a comment on this article, but I have a question – can you write something about the perspectives for theoretical physics in Eastern Europe (Poland)? I mean, I will be a math and physics student – 1st year at the University of Warsaw – and, if it is possible, I would like to hear from you whether it is possible to do serious research in Poland etc. I have read that you did your undergraduate studies in the Czech Republic but your PhD in the USA. Was there a good environment there to do string theory or did you just have to do it all alone? Thanks in advance. Krzysztof J

LM: I learned lots of things as an undergrad in Prague – good basics up to general relativity and quantum field theory. There wasn't any string theory research in the Czech Republic (and even more clearly, none of it in Prague), so of course everything here was my personal "hobby", so to say, although it was possible to write and defend a diploma thesis on it.

The research of string theory - and almost any sufficiently advanced discipline close to the cutting-edge research - is of course vastly better and stronger in the U.S. Despite some good places elsewhere, I still think that the U.S. as a country is an indisputable leader in fields such as theoretical physics even in 2012.

Wow, that is really a wonderful argument by Peter Shor. I just can't understand why some physicists still cling to the idea of a deterministic interpretation of quantum mechanics when we've learned over the years that you have to really bend over backwards in order to even attempt to make it work.

In your first paragraph, should the second "less" in "but this doesn't make it less scientific; on the contrary, it makes it less reliable" be a "more"? (Feel free to delete this question, either way.)

Mitchell Porter,

You are stuck in the same place that gets everyone else in trouble with QM. As Lubos points out, nature simply refuses objective reality at a profound level.

QM and science itself is about observation and observation alone. Try, try and try, my friend; you can do it.

Mitchell Porter, I think you meant the cat is both alive and dead. If he is neither then there's nothing in the box ;-)...

Dear Lubos,

you write:

"To invalidate unitarity, you need to invalidate the whole structure of quantum mechanics and build a completely different one - and reproduce all of the successes of QM despite having a fundamentally different theory - which you haven't and which isn't really possible."

I never claimed to have this different theory. But there is a guy called Gerard 't Hooft who tries to construct it and claims some partial success. I consider it more likely than not that his papers are a dead end. But since at one point in time he was the most able person on this planet doing technical work on quantum mechanics, I entertain for a moment the possibility that he is on the right track. From this point of view, the arguments about the quantum computer are not very convincing, because the new, deeper theory will always mimic the older one in some approximation. It is quite possible that you can see that his papers are junk without studying them in detail, but I can't. The only point I wanted to make beyond this is that in any experiment to check the validity of quantum mechanics, and in any attempt to construct a quantum computer, decoherence is your main practical obstacle. So if you wanted to construct a classical theory which mimics quantum mechanics in some limit, the way to "hide" the differences from too easy detection would be a slight additional portion of decoherence.

Shannon, that would be a many-worlds answer. But yes, clearly it doesn't make much sense to say that the cat is there but it's actually neither alive nor dead.

Gene, the emphasis on observation is meant to keep you honest and remind you of what you do and don't know. It doesn't mean that there's nothing rational to say about unobserved reality. If we truly say that to exist is to be observed, that's going to be solipsism - reality consists of nothing but a sequence of sensations.

The antirealist theorems like Hardy's theorem

http://motls.blogspot.com/2011/01/hardys-paradox-kills-all-realistic.html

don't imply solipsism or even instrumentalism. The specific theorem mentioned in that article implies only that the particle can't have predetermined values for *all* of "A, B, A', and B'". You never see all of those observables at once anyway, so this doesn't actually contradict realism, it just contradicts a sort of super-maximal realism, which wants too many observables to have values at once.

In trying to figure out a sensible theory for which observables do get realized, you could actually start with the "sequence of sensations", but approach it like a neuroscientist rather than a solipsist. My sensations do exist, and they have something to do with brain activity, so that brain activity has to be really existing and not just potentially existing. But the brain is just another physical object, and it's irrational to suppose that only it really exists; so you then try to deduce, from the reality of sensations, principles which explain why, out of the possible brain observables, those are the ones that exist; and then you apply those principles to all of physics.

Why has nobody yet built a 1000-qubit quantum computer then, when we have for some time had fully functional quantum computers with a handful of qubits only? The greatest achievement of quantum computing thus far seems to be factoring the number 21 using Shor's algorithm (according to Wikipedia).

It seems that the problem with building larger and larger quantum computers is fighting off decoherence. Perhaps it will turn out to be impossible /in practice/ to build a quantum computer of such size, because decoherence can not be overcome.

(Not that this affirms 't Hooft's ideas, of course - just countering the objection to option #3.)
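For context on what the factoring-21 record actually demonstrates: in Shor's algorithm, only the period-finding step needs quantum hardware; the rest is classical number theory. A minimal sketch of that classical part, with the order computed by brute force here - which is exactly the step a quantum computer speeds up:

```python
from math import gcd

def order(a, n):
    """Multiplicative order of a mod n - the step Shor's algorithm
    delegates to the quantum period-finding subroutine."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n, a):
    """Classical post-processing: given an even order r of a mod n,
    derive factors of n from gcd(a^(r/2) +- 1, n)."""
    r = order(a, n)
    if r % 2:
        return None  # odd order: retry with a different base a
    return gcd(a ** (r // 2) - 1, n), gcd(a ** (r // 2) + 1, n)

print(shor_classical(21, 2))  # (7, 3): order of 2 mod 21 is 6
```

The brute-force `order` loop takes exponential time in the number of digits of n; replacing it with quantum period finding is the entire source of Shor's speedup.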

Dear Old Wolf, except that your speculations are easily seen to be nonsensical if one is a little bit quantitative.

Decoherence is an expo-exponentially fast process once it starts. But before it starts, it pollutes the matrix elements by some terms proportional to the time and a coefficient determining the rate.

This coefficient only scales as a power law with the number of qubits and required operations while the abilities of the resulting quantum computer grow exponentially.

There exists no "absolute lower bound" on the amount of decoherence per unit time, except for zero, so it's obvious that the decoherence limitations we experience today are technical. Just like LIGO hasn't detected gravitational waves yet, we haven't built a large quantum computer (and if anyone has built one, they have kept it secret).
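The scaling claim above can be illustrated numerically. The numbers below are purely illustrative - the cubic exponent is a stand-in for "some power law", not a figure from the discussion: a polynomially growing decoherence cost is quickly dwarfed by the exponentially growing dimension of the state space a quantum computer exploits.

```python
# Toy comparison: hypothetical polynomial decoherence cost (n**3 chosen
# arbitrarily) vs. the 2**n dimension of an n-qubit Hilbert space.
for n in (10, 20, 40, 80):
    polynomial_cost = n ** 3
    exponential_power = 2 ** n
    print(f"n={n:3d}  cost ~ {polynomial_cost:>9,d}  power ~ 2^{n} = {exponential_power:,d}")
```

Whatever the actual polynomial exponent turns out to be, the ratio of computational power to decoherence cost still grows without bound, which is why a merely technical (rather than fundamental) decoherence floor cannot forbid large quantum computers.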


Dear Lubos,

I recently watched the following seminar by 't Hooft at CERN and want to take this as an opportunity to ask you the following question.

http://indico.cern.ch/conferenceDisplay.py?confId=241015

As 't Hooft knows, all deterministic models of quantum mechanics are confronted with Bell's theorem.

But as 't Hooft points out, there is a maybe ugly-looking loophole, which is superdeterminism.

Let me discuss this.

The problem is that if we can freely decide which polarizations of two entangled photons we measure, then there is simply no local realistic model which is prepared to give all the right answers. I would like to keep free will out of this discussion because it is a very loaded term and nobody knows what it is. So let's say we have two quantum mechanical random generators which decide this for us. It sounds totally absurd that the random generators always determine two axes for which the hidden variables of the photons are prepared to give the right quantum mechanical answer. But despite sounding absurd, couldn't it still be true? We find it absurd because we imagine the entangled photon pair as one qubit of information. But maybe there is a lot of redundancy in the hidden variables, and quantum mechanics just describes the tip of the iceberg. In any case, my question is: why do you think that superdeterminism is not a good way out of Bell's theorem?
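To make the tension concrete: for polarization-entangled photons, quantum mechanics predicts the correlation E(θa, θb) = cos 2(θa − θb), and with the textbook-optimal CHSH angles this yields 2√2, above the local-realistic bound of 2. A short sketch (the angle choices are the standard ones from CHSH tests, not anything specific to 't Hooft's model):

```python
from math import cos, radians

def E(ta, tb):
    # Polarization correlation for a maximally entangled photon pair,
    # with polarizer angles ta, tb in degrees.
    return cos(2 * radians(ta - tb))

# Standard CHSH-optimal settings: Alice uses 0 and 45 degrees,
# Bob uses 22.5 and 67.5 degrees.
a, ap, b, bp = 0.0, 45.0, 22.5, 67.5
S = E(a, b) - E(a, bp) + E(ap, b) + E(ap, bp)
print(round(S, 3))  # 2.828, i.e. 2*sqrt(2) > 2
```

It is this measured excess over 2, for freely (or pseudo-randomly) chosen settings, that any superdeterministic hidden-variable model must explain away by correlating the setting choices with the photons' hidden variables.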

Dear Mikael,

superdeterminism as a would-be loophole is clearly ruled out as long as one actually searches for the scientific explanation of genuinely observed phenomena.

Prof 't Hooft may tell an experimenter that he, the experimenter, doesn't have a free will so he's not testing the response of a system/apparatus to a choice of which buttons the experimenter pressed etc. because the choice was guaranteed a priori. This is meant to be superdeterminism, right?

But that's completely irrelevant: the experimenter still *did* press some buttons, and he may press different ones at different times, effectively proving his free will and forcing a theory to predict what happens in all the cases.

An alternative theory to quantum mechanics would have to actually predict what the experimenter is going to do, or at least be close to that, otherwise it's as empty nonsensical babbling as 't Hooft's talk. Needless to say, in quantum mechanics, the experimenter's choice of the pressed buttons is just another process that may be predicted probabilistically and that is indeed correlated with various things, according to the maths of QM. But the mechanisms that lead to one experimenter's decision or another are so complex that they can't be correlated "a priori". They may only be correlated "a posteriori" which is nothing else than a technical way of describing the "free will".

So by babbling about superdeterminism without an actual ability to predict how people will decide, one is not only giving up on science in the sense of an explanation of what will happen to the observed system in an experiment; one is also denying - not only the free will but - the fact that the brain is a complex physical system. One may say lots of supernatural, would-be ambitious words like that, but every sensible person must have *some* threshold beyond which the ideas are declared far-fetched and their proponents are counted as cranks. Gerard 't Hooft crossed this threshold of mine many, many years ago.

Cheers

Lubos
