Someone sent me The Case Against Quantum Computing (IEEE), written by Mikhail Dyakonov in November 2018. Dyakonov is an ex-Soviet physicist who discovered Dyakonov surface waves in 1988. For decades, he has worked in Montpellier, France – where he's called Michel. He has investigated some spin physics – but most of it looks like a rather "classical treatment" of these phenomena.

Political violence close to me: On Saturday, Ladislav Jakl – a very close life-long collaborator of the Czech ex-president Klaus and my fellow lecturer in a climate bi-lecture we gave to Southern Bohemian schools in Fall 2014 [Jakl-Motl just rhymes, doesn't it?] – was brutally attacked in the Prague subway. There were probably two attackers. One of them screamed "You are that Jakl, the leader of SPD". The attackers were clearly warriors for diversity, peace, and love. He's not really a "leader" in any way, not even a member; he once ran for the Senate for SPD and failed, and he was formally appointed by SPD, the "nationalist" party, to some chair[s] (Jakl is a member of the Committee for Public TV – and he wants to abolish this neo-Marxist den). For the first time, I coincidentally voted for SPD on Friday. Jakl ended up on the floor of the car near the "Square of the Republic" stop of the yellow B Line. The attack left visible cosmetic traces on his head and shoulders that shouldn't have long-term consequences – although he went through one sleepless night due to the pain. Let me hope that no Prague subway will be built in Pilsen. The attack was denounced by many, including Jakl's political foes. Ex-president Klaus' criticism was harsh and he blamed the attack on the globalist scoundrels who don't respect the results of elections, while his son Klaus Jr "credited" those who are brainwashing our youth.

*The IBM Q cryostat is remarkably similar to the Central Brain of Mankind used in 2484 AD, from the 1984 AD Czech sci-fi sitcom "The Visitors". That CBM/CML had predicted a collision with an asteroid, so an expedition to 1984 using a time machine was organized to get the notebooks of Adam Bernau, a boy genius, which explain how to move the continents. After many funny episodes showing the life of the future people in the advanced socialist society of the 20th century, the mission failed, and Adam's "teacher", a pensioner, was taken back to 2484 AD, where he fixed the CBM/CML by noticing it wasn't placed horizontally on the floor. The global-warming-like catastrophic prediction was undone and mankind was "saved".*

You can see that Dyakonov has posted many similar anti-QC papers to the arXiv. In early 2013, John Preskill opposed some statements by Dyakonov on Scott Aaronson's blog. In January 2019, Ben Criger, a QC postdoc, wrote a concise and reasonable response, The Case Against The Case... See also an interactive, entertaining, and rather multi-dimensional lecture by Aaronson about the QC skeptics – in a QC course he taught.

OK, what are the skeptics and especially Dyakonov skeptical about?

When we hear people who say that quantum computers will never be created – and Dyakonov is one of them, although he is a bit ambiguous – we might distinguish roughly three levels of criticism:

- It's bad, it's bad – criticism without any content whatsoever
- Quantum mechanics doesn't really work, it can't be precisely true etc., and the "new" effects will make QC impossible – anti-quantum zeal
- Some potentially plausible doubts about the feasibility of scaled QC due to trouble with the error correction schemes or decoherence etc.

Well, he sometimes seems to belong to the first class – pure hostility without any content that would indicate that Dyakonov knows what he is talking about. A criticism that could be applied to anything else that someone decided to dislike. Like pop music.

Quantum computing is all the rage. [...] Most commentators forget, or just gloss over, the fact that people have been working on quantum computing for decades—and without any practical results to show for it. [We've been deceived...]

Much of the article is indeed just "no" repeated 100 times – the kind of hostile stuff addressed to the least demanding readers out there. Accusations and complaints about hype, fervor, the absence of practical results, and bubbles are everywhere.

My answer is simple. No, never.

I believe that, appearances to the contrary, the quantum computing fervor is nearing its end. [LM: He's been saying it for many years, I would say, but the QC industry grows bigger.] That’s because a few decades is the maximum lifetime of any big bubble in technology or science. After a certain period, too many unfulfilled promises...

Indeed, you may replace "quantum computing" with "string theory" and you obtain a tirade almost identical to the tirades that the generic critics of string theory address to their readers – the most brainwashed and dumbest possible "consumers", who have absolutely no idea about the field and who think that they're smart if they understand, after 100 unjustified repetitions of "no" with various emotional decorations that clearly have nothing to do with the issue, that the writer meant "no" – which is why they must uncritically believe that the answer is "no". Just to be sure: if you still failed to understand, with sufficient certainty, that the criticisms of string theory out there are just worthless emotional garbage written by people who have no idea about the topics they discuss, then you are simply not an intelligent person.

Great.

In the second iteration, you read Dyakonov's article and look for something that wouldn't immediately insult your intelligence, something that could have been addressed to readers with an IQ above 70. And indeed, you find some introduction to quantum mechanics. Quantum mechanics allows qubits, which generalize classical bits, and quantum computers, as collections of \(N\) qubits, are described by an exponentially large number, \(2^N\), of probability amplitudes.

This theme is actually omnipresent in criticisms of string theory, too. Some anti-string writers want to emphasize or pretend that they're not addressing their tirades just to readers with the IQ below 70. So they also include some "technical" but very elementary background, like "there was quantum field theory" or "there are qubits". In this way, they make it clear that their tirade is addressed to consumers with the IQ between 70 and 75, too.

"There are qubits" has nothing to do with the claim that "quantum computers won't be built", however. So if your IQ is detectably above 75 and you want to look for Dyakonov's *actual arguments* against quantum computers, assuming that there are any, you need to spend much more time isolating the "gems". And you will find something after a while:

[...] Experts estimate that the number of qubits needed for a useful quantum computer, one that could compete with your laptop in solving certain kinds of interesting problems, is between 1,000 and 100,000. So the number of continuous parameters describing the state of such a useful quantum computer at any given moment must be at least \(2^{1,000}\), which is to say about \(10^{300}\). That's a very big number indeed. How big? It is much, much greater than the number of subatomic particles in the observable universe.

This is arguably the technical "core" of Dyakonov's diatribe. The first two paragraphs above would make you reclassify Dyakonov as an "anti-quantum zealot" while the final paragraph could allow him to make it to the "somewhat conceivable criticisms of quantum error correction". But I think that he never makes it to the final, sophisticated category of critics.

To repeat: A useful quantum computer needs to process a set of continuous parameters that is larger than the number of subatomic particles in the observable universe.

At this point in a description of a possible future technology, a hardheaded engineer loses interest. But let’s continue. In any real-world computer, you have to consider the effects of errors. [...]

Now I have to increase the number of readers who have a clue what I am talking about, by mentioning a few basic facts. The largest quantum computer as of now has a register with some 50+ qubits. So it is described by a Hilbert space of dimension \(2^{50+}\). Mathematically, you need \(2^{50+}\) complex quantum probability amplitudes to describe a pure state.

If you wanted to simulate this pure state on a classical computer, you would need each amplitude to be described by a rather accurate number. Such a classical computer would require more than \(2^{50+}\) classical bits. Well, you really need to multiply the number at least by 1,000 (which wouldn't make much difference) because the simulation would have to safely distinguish many numbers comparable to \(2^{-25}\), the typical magnitude of a probability amplitude in a wave function with \(2^{50}\) complex coordinates (whose squared absolute values have to sum to one).

The properties of the classical computer discussed above – especially if you add the need to correct the noise and errors in the representation of the \(2^{50+}\) complex numbers, and especially after you realize that 50 has to be replaced with 1,000–100,000 because the error correction has to be incorporated – make a "hardheaded engineer lose his interest".
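To make these magnitudes concrete, here is a back-of-the-envelope sketch – my illustration, not anything from Dyakonov's text – of what a brute-force classical storage of a 50-qubit pure state would cost:

```python
def classical_simulation_cost(n_qubits, bytes_per_amplitude=16):
    """Cost of storing the full state vector of n qubits classically:
    2**n complex amplitudes, 16 bytes each in double precision."""
    n_amplitudes = 2 ** n_qubits
    return n_amplitudes, n_amplitudes * bytes_per_amplitude

amps, mem = classical_simulation_cost(50)
print(amps)            # 2**50 ≈ 1.1e15 amplitudes
print(mem / 10 ** 15)  # ≈ 18 petabytes just to store the state once
print(amps ** -0.5)    # ≈ 3e-8 = 2**-25, the typical amplitude magnitude
```

Of course, this is the cost of a *classical simulation* – which, as explained below, is exactly what a quantum computer is not.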

The issue that Dyakonov – and similarly every superficial critic of anything, including and especially string theory – overlooks is that quantum computing is a rather fancy enterprise pursued by extremely skillful and smart people. So whether an average engineer, hardheaded or otherwise, loses his interest in quantum computing (or string theory) is absolutely irrelevant. Indeed, he or she almost certainly does. Almost all people on Earth have nearly zero interest in quantum computing (or string theory). But why should it matter? Their being rather dumb doesn't imply that everyone in the world has to be rather dumb. Again, this dimension of the diatribe is just the superficial hostility addressed to readers whose IQ is below 70.

By the way, note that Dyakonov has referred to "experts" when he wanted to estimate the number of qubits in a useful quantum computer. But he's forgotten about the experts when he was building his conclusions on whether the QC was possible. Even if you don't understand any quantum mechanics, you should be able to see that something is fishy here. He clearly *needs other experts*, which indicates that he is not one himself, but at the same time, he distrusts them when subjectively needed. This combination is irrational. In effect, he is just a guy who doesn't understand the field in its entirety and who cherry-picks statements by experts to write a text that sounds negative about QC. Even if you don't get QM, you should be able to see that this kind of argumentation is pretty much worthless noise because it's dominated *both* by incompetence and by bias.

If we isolate the sentences further, the most technical sentence in Dyakonov's text that is related to his skepticism is the following short paragraph:

To repeat: A useful quantum computer needs to process a set of continuous parameters that is larger than the number of subatomic particles in the observable universe.

But this sentence is a sentence by an anti-quantum zealot. It is simply not true that a quantum computer "processes \(2^{50+}\) or \(2^{1,000+}\) physical observables". A quantum computer only processes 50 or 1,000 observables, the qubits. The extra exponentiation that Dyakonov added proves that he is not thinking quantum mechanically at all. He is thinking classically. He is thinking about a classical computer that has the task to simulate a quantum computer!

However, the whole point of quantum computing – and quantum mechanics – is that the world fundamentally doesn't obey the laws of classical physics. It works in a way that is *fundamentally different* from the inner workings of a classical world or any object in it. Quantum mechanics is *not* just classical physics with an exponentially large number of degrees of freedom. And quantum computers are *not* massively (exponentially) parallel classical computers.

A quantum computer with 50 qubits has \(2^{50}\) mutually exclusive states, not something of order \(\exp(2^{50})\). The expression \(2^{50}\) is the dimension of the Hilbert space, i.e. the number of vectors in a basis, e.g. an orthogonal basis. If you think that Nature distinguishes roughly \(\exp(2^{50})\) states from each other, then you're just assuming that the wave function is a classical wave. But it's not.

This basic misunderstanding of the meaning of the wave function also affects Dyakonov's talk about the error correction schemes. He thinks that errors arise in the \(2^{50+}\) or \(2^{1,000}\) complex numbers and each of these complex numbers has to be separately corrected. If this were the case, QC would be impossible because the correction of each complex number would probably need a separate building block inside the computer (or at least a separate operation performed by a building block) and you can't have that many. But that's a deep misunderstanding of quantum mechanics in the context of quantum computing. The errors arise in 50 or 1,000 qubits only. The quantum error correction schemes aren't correcting \(2^{50}\) independent complex numbers. They are correcting 50 independent qubits. That's the number of bits we can measure at the end – the maximum number of mutually commuting, independent observables that the quantum computer's Hilbert space admits.

There is simply some nonzero probability that some of these 50 qubits get switched to the wrong value. And that has to be corrected. In practice, one qubit is replaced with many qubits where the errors may arise independently, too. An error in such a "qubit that is made redundant" occurs in a "fraction" of the representation of the qubit, and it may be fixed separately. There is a contest between the birth of new errors or noise and the fixing procedures. As the quantum threshold theorem shows, if the error rate is beneath some bound – which is realistic and doesn't scale "insanely" with the number of qubits – then the rate at which the errors are fixed will beat the rate at which the new errors are created, and the computer will work.
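The logic of "making a bit redundant and fixing fractions of it separately" may be illustrated with the classical three-bit repetition code – a toy classical analogue of my own, not a full quantum code. Majority voting converts a physical error rate \(p\) into a logical rate of roughly \(3p^2\), which beats \(p\) whenever \(p\) is small enough – the essence of being "below the threshold":

```python
import random

def encode(bit, copies=3):
    # Repetition code: one logical bit is stored in several physical bits.
    return [bit] * copies

def noisy(bits, p):
    # Each physical bit flips independently with probability p.
    return [b ^ (random.random() < p) for b in bits]

def decode(bits):
    # Majority vote corrects any single flip (for 3 copies).
    return int(sum(bits) > len(bits) / 2)

def logical_error_rate(p, copies=3, trials=100_000):
    errors = sum(decode(noisy(encode(0, copies), p)) for _ in range(trials))
    return errors / trials

p = 0.05
# The logical rate comes out near 3*p**2 ≈ 0.007, well below p itself.
print(p, logical_error_rate(p))
```

Quantum error correction has to do more (it must also protect the phases, without measuring the encoded information), but the contest between newly born and freshly fixed errors has the same structure.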

Don't get me wrong. It's not trivial. And I think it will take a decade or more before quantum computers are used for really practical purposes. I think (but I am not sure) that Google's and other companies' claims that they will have one in 2019 are just self-motivating chanting – which may be helpful psychologically. But there's no valid physical argument that would make this optimistic outcome impossible. It's clearly a major trend in applied physics and/or really high-tech engineering which may take 20 or 100 years of work – but intelligent people should exploit their abilities for QC or other demanding things!

Longer calculations will require you to fix a bunch of errors or errors within the error-correcting system, too. So you might need to embed the redundant qubits into an even larger system of redundant qubits. But the number of such embeddings only grows logarithmically with the length of the calculation, so the size of the real-world quantum computer only grows polynomially with the task that needs to be solved.
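The logarithmic growth can be made explicit with the standard concatenation estimate: below the threshold, each extra level of encoding squares the suppression factor. A small sketch of mine – the numbers are hypothetical, chosen only for illustration:

```python
# Standard concatenation picture: if the physical error rate p is below
# the threshold p_th, then L levels of concatenation suppress the
# logical error rate to roughly p_th * (p / p_th) ** (2 ** L).
def levels_needed(p, p_th, target):
    L = 0
    while p_th * (p / p_th) ** (2 ** L) > target:
        L += 1
    return L

# Hypothetical illustrative numbers: p = 1e-3, threshold p_th = 1e-2.
for target in (1e-9, 1e-15, 1e-21):
    L = levels_needed(p=1e-3, p_th=1e-2, target=target)
    # With e.g. 7 physical qubits per level, the overhead is 7**L.
    print(target, L, 7 ** L)
```

Even as the target error rate drops by many orders of magnitude, the number of levels \(L\) creeps up very slowly – which is why the overall size of the machine grows only polynomially (indeed polylogarithmically per qubit) with the length of the computation.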

Dyakonov is also skeptical about the usability of quantum computers, even if they are built. Well, many laymen have surely been misled to think that the quantum computers will supersede the classical ones. They almost certainly won't. Quantum computers will only be used for rather special tasks such as

- code breaking, e.g. via Shor's algorithm and related ones
- database searches, via Grover's algorithm (Lov Grover found it at Bell Labs, now Nokia Bell Labs)
- simulations of quantum objects

The second example, database searches, could be practical for Internet companies with big servers. Searching in databases quickly may be helpful. And the final application could be good for science and applied science – anyone who actually wants to know how complex enough quantum systems behave. Again, these applications won't directly affect the life of the most ordinary people or even average "hardheaded engineers" but there's no reason why they should.
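To get a concrete feel for the second item, here is a minimal statevector simulation of Grover's search – a toy illustration of mine with 3 qubits, not production code. After about \(\frac{\pi}{4}\sqrt{N}\) iterations, nearly all the probability is concentrated on the marked item:

```python
import math

def grover(n_qubits, marked):
    """Toy statevector simulation of Grover search over 2**n items."""
    N = 2 ** n_qubits
    # Start from the uniform superposition over all N basis states.
    state = [1 / math.sqrt(N)] * N
    iterations = round(math.pi / 4 * math.sqrt(N))
    for _ in range(iterations):
        # Oracle: flip the sign of the marked item's amplitude.
        state[marked] = -state[marked]
        # Diffusion: reflect every amplitude about the mean amplitude.
        mean = sum(state) / N
        state = [2 * mean - a for a in state]
    return [abs(a) ** 2 for a in state]

probs = grover(3, marked=5)
print(max(range(8), key=probs.__getitem__))  # the marked index 5 wins
print(round(probs[5], 3))                    # with probability ≈ 0.95
```

The simulation above manipulates \(2^n\) amplitudes because it is classical; the actual quantum computer would only manipulate the \(n\) qubits.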

Let me return to the anti-quantum zeal. The very fact that Dyakonov talks about "continuous parameters" unmasks him as an anti-quantum zealot. The \(2^{50}\) variables are continuous but they are not physical variables or observables. They are probability amplitudes. They are analogous to – in fact, complex generalizations of – probabilities in classical physics.

There is nothing characteristically quantum mechanical about the existence of \(2^{50}\) numbers describing 50 bits or 50 qubits. 50 bits in classical physics may *also* be described by \(2^{50}\) numbers – real numbers, in this case: by the probabilities of each of the \(2^{50}\) configurations of the classical bits! This is the right analogy for the quantum wave function.

Quantum mechanics is only new because these \(2^{50}\) numbers that were real become complex; and, correspondingly, they may be manipulated by unitary operations whose effect is more general than just a permutation of the \(2^{50}\) probabilities. (I discussed the role of permutations of probabilities in classical physics a week ago.)
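The analogy can be spelled out in a few lines – a toy example of mine with 2 bits/qubits. A deterministic classical gate merely permutes the \(2^n\) probabilities, while a quantum gate such as the Hadamard mixes the \(2^n\) amplitudes by a unitary that is not a permutation:

```python
import math

# Classical: n bits described by 2**n probabilities. A deterministic
# gate (NOT on bit 0) is just a permutation of the index labels i -> i^1.
def apply_classical_not_on_bit0(probs):
    return [probs[i ^ 1] for i in range(len(probs))]

# Quantum: the 2**n numbers become complex amplitudes, and unitaries
# beyond permutations are allowed, e.g. a Hadamard on qubit 0.
def apply_hadamard_on_bit0(amps):
    out = [0j] * len(amps)
    s = 1 / math.sqrt(2)
    for i in range(0, len(amps), 2):
        a, b = amps[i], amps[i + 1]
        out[i] = s * (a + b)
        out[i + 1] = s * (a - b)
    return out

probs = [0.5, 0.25, 0.25, 0.0]             # distribution over 2 bits
print(apply_classical_not_on_bit0(probs))  # [0.25, 0.5, 0.0, 0.25]

amps = [1, 0, 0, 0]                        # 2-qubit state |00>
print(apply_hadamard_on_bit0(amps))        # weight 1/sqrt(2) on |00>, |01>
```

In both cases the list has \(2^n\) entries; what changes in quantum mechanics is the *class of allowed operations* on it, not the number of physical degrees of freedom.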

Because the descriptions with \(2^{50}\) numbers are possible both for 50 qubits and 50 classical bits, we could create an equally valid (i.e. invalid) argument in classical physics, by saying that a classical computer with 50 bits also has \(2^{50}\) "parameters", namely the classical probabilities, and they need to evolve sufficiently precisely for the computer to work. As you know, classical computers do work and may be reliable enough. So the evolution of the \(2^{50}\) probabilities – which isn't perfect because transistors are never quite "ideal" – is possible.

The case of quantum mechanics is analogous. The counterparts of "transistors" are also imperfect but the errors may be described as errors in individual qubits or in nearby bunches of qubits. And those can be corrected. In classical computers, transistors can be made reliable enough so that the error correction isn't needed at all. In realistic quantum computing, error correction is always needed. But that's the extra step that the higher "complexity" of quantum mechanics requires. This step may be done in principle.

It's right to say that the quantum computer only has 50 independent (mutually commuting) observables – qubits – and in this sense, it is a *discrete*, not a continuous, system! The probability amplitudes are continuous but so are all probabilities, including probabilities in classical physics. The fact that you describe a physical system by continuous probabilities (they are always continuous) doesn't make the system itself continuous! Dice remain rather discrete. But Dyakonov is not really thinking about the quantum computer in a quantum mechanical way. He is thinking about a classical simulation of it, misunderstanding that these two things are completely different and that, in the end, they unavoidably behave differently in any real-world conditions as well as conceptually.

You may describe errors and noise as actions on \(2^{50}\) complex amplitudes but only some "highly correlated" transformations of these "very many" numbers are physically allowed – and these allowed transformations or perturbations may basically be traced to (or derived from) errors and noise that physically affect 50 or so physical variables (or pairs of their neighbors etc.), i.e. many fewer than \(2^{50}\) variables.

Dyakonov also likes to say – and it's repeated in every text he writes, it seems; he must be proud of it – that the quantum threshold theorem has well-defined assumptions – assumptions that presume we live in an ideal world – and those cannot be precisely satisfied. This is also a bizarre form of criticism to be proud of.

*Every theorem should have well-defined assumptions.* This is how theorems work! Theorems are results in mathematics where everything – both assumptions and the final statements (and the proofs, too) – must be rigorously formulated. Is that a disadvantage? No. It's really an advantage. When we're precise and when our arguments are rigorous, our knowledge is more reliable. That's why mathematics is so valuable and essential in science, especially but not only in physical sciences and engineering.

Also, in general, rigorous mathematics is never guaranteed to apply to the real world. We *never* have a perfect proof that assumptions that are precisely formulated at the mathematical level are obeyed in Nature. But this is just a general fact about the difference between mathematics and physics. You could use this general fact to criticize *any* usage of mathematical reasoning anywhere in science. It would always be a "legitimate argument" in principle. But it would almost always be a "deeply wrong argument" in practice.

In science, when we really discuss the messy world around us, we can never "rigorously" prove that some claims are correct. But we can do measurements that show that if the assumptions are violated, they are violated just a tiny little bit. We can accumulate increasingly strong scientific and empirical evidence in favor of many statements, including statements that look like mathematically rigorous ones. That's how science works.

For example, the theorem surely has to assume that it's possible to describe the 50-qubit quantum computer in terms of a Hilbert space whose dimension is \(2^{50}\) – or similarly \(2^{1,000+}\) when the first layer of error correction is included. It has to assume that the wave function or density matrix has the expected number of components and that they evolve according to some Schrödinger-like equation. Or, with some noise added, perhaps along some Lindblad equation or some generalization of it.

There is some parameterization of the mathematical variables that represent the system and its evolution; and some *Ansatz* is used for the possible source of errors or noise. If Dyakonov really made it to the "third, sophisticated class" of critics of quantum computers, he would discuss whether the *Ansatz* for the noise and errors is sufficiently general. And he would propose at least a "sketch" of a more general type of noise or errors that is possible, that has the capacity to make quantum computing impossible, and that has been assumed not to exist by the theorem.

As far as I can see, he has nothing of the sort. He hasn't gotten this far in his critiques of quantum computing – his critiques have remained mostly stuck at the first two levels: the "superficial, repetitive, and populist hostility addressed to moronic readers" and the "denial of basic rules of quantum mechanics".

You know, I think that if he seriously tried to add "new kinds of noise", he would either find out that they're not really new i.e. they're either effectively equivalent to those that have been assumed in the theorem or those that can be tamed by a safer design and made infrequent enough, like in classical transistors; or they're impossible by basic consistency of quantum mechanics.

Concerning the latter, some deformations of the rules for the noise and errors simply cannot work because they would amount to new deformations of quantum mechanics. And as I argued last week and many times in the past, the basic framework of quantum mechanics cannot really be deformed.

Because he hasn't formulated any details – what is actually being ignored – I can only guess, using his actual texts as a guide. And when I do so, it looks overwhelmingly clear to me that his expectations that "something will doom the quantum computers" are derived from his anti-quantum zeal. He is actually thinking about the quantum mechanical description of a quantum computer as if it were a classical system with \(2^{50}\) complex numbers – classical degrees of freedom – that evolve according to some classical equations. And he thinks that these equations may be corrected in a general way – like any classical equations.

But that's simply not the case. We have overwhelming physical evidence that this is not the right way to think about our quantum world. In particular, unlike the classical degrees of freedom in any realistic system, all the probabilities or probability amplitudes evolve linearly. There cannot be any non-linear deformation in this sense! The linearity with respect to probabilities or probability amplitudes ultimately follows from mathematical logic expressed in terms of the probability calculus. Nothing can be changed about this.

Aside from non-linearities, he could also consider some evolution of "separate probability amplitudes" that is linear but almost arbitrary. Again, a totally generic evolution of probability amplitudes is virtually impossible in a quantum computer, too. It's really the *observables* that evolve according to some equations, not the *wave functions*. And there are just 50 or so independent observables. Why don't you just think in terms of the Heisenberg picture? The 50 qubits are observables that evolve according to some Heisenberg equations of motion. (You actually need 50+50 "basic" observables, \(\sigma_{x,i}\) and \(\sigma_{y,i}\), and all other observables may be written as polynomials of these 100 "basic" ones.) They may be modified, but the modifications are highly restricted and all possible modifications of the Heisenberg equations of motion are allowed in the assumptions of the quantum threshold theorem.
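The Heisenberg-picture point can be shown with a two-line computation – a minimal single-qubit sketch of mine. A gate \(U\) evolves an observable \(A\) into \(U^\dagger A U\); for the Hadamard gate, the Pauli observables \(X\) and \(Z\) are simply exchanged, so it is the handful of observables that evolve, not the exponentially long amplitude vector:

```python
import math

# Heisenberg picture: a gate U evolves an observable A into U† A U.
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def dagger(U):
    return [[complex(U[j][i]).conjugate() for j in range(2)] for i in range(2)]

def heisenberg(U, A):
    return matmul(dagger(U), matmul(A, U))

s = 1 / math.sqrt(2)
H = [[s, s], [s, -s]]   # Hadamard gate
X = [[0, 1], [1, 0]]    # Pauli X
Z = [[1, 0], [0, -1]]   # Pauli Z

print(heisenberg(H, X))  # equals Z up to rounding: X -> Z
print(heisenberg(H, Z))  # equals X up to rounding: Z -> X
```

With \(n\) qubits, one tracks how the \(2n\) basic Pauli observables evolve – a list that grows linearly, not exponentially, with the size of the register.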

Of course, if you think that the quantum computer "is" a set of \(2^{50}\) classical observables whose evolution is almost arbitrary – nonlinear, selective permutations of amplitudes in this exponentially huge ensemble, and more – you will end up with the expectation that quantum computers will never be possible, because you may imagine deformations of these classical equations for \(2^{50}\) classical degrees of freedom that introduce errors that cannot be corrected. But if you do so, you are using a fundamentally wrong model to think about quantum mechanics – and about the Universe. You are stuck in the scheme of classical physics but Nature follows the laws of quantum mechanics. In quantum mechanics, the wave function appears in a mathematical description of the system, but the system still has 50 or so degrees of freedom, not \(2^{50}\) degrees of freedom!

Because we don't have the truly universal and most general *Ansatz* for a description of a quantum computer – how the qubits interact with the environment, with the classical parts of the quantum computer in the real-world sense, and so on – there may always be some complaint that the theorems are missing something that is important in the real-world realization of the theoretical concept. However, as long as we are thinking as scientists, we must still have an idea about whether these hypothetical omitted effects have the capacity to render quantum computation impossible. I think that the people who generally understand quantum mechanics (because they have carefully studied some quantum mechanical systems, including the theory and the evaluation of experiments) – and in what sense it's different from classical physics, and which parts of both frameworks should be considered analogous – end up with the conclusion that quantum computation is probably possible in principle.

And those who think incorrectly and who are stuck in classical physics (because they're more experienced with the old, simple, classical systems than the 20th century quantum ones) are much more likely to conclude that quantum computers are probably impossible.

More generally, Dyakonov just looks unfamiliar with the error correction literature – there are 25 years' worth of it – much like the string critics are unfamiliar with the string theory literature. He was even persuaded to change a demonstrably incorrect statement about the non-existence of some analysis of some QC errors. But the correction he chose has made it clear that the major conclusions – negative ones about QC – have been predetermined and nothing will ever change his mind about that. Dyakonov's writing is simply nothing like an impartial analysis of a problem. It's a rationalization of a prejudice.
