Two days ago, Scott Aaronson was rightfully confused by some bizarre statements in the literature on "superqubits" and he asked:
Two very intriguing papers recently appeared on the arXiv, claiming that one can use "superqubits" – a supersymmetric generalization of qubits – to violate the Bell inequality by more than standard quantum mechanics would allow. (That is, they claim one can violate the Tsirelson bound, which says that the CHSH game can be won quantum mechanically with probability at most \(\cos^2(\pi/8) \sim 0.85\).) The first paper is by Borsten, Brádler, and Duff and the second is by Brádler. (LM: Kamil Brádler is a quantum information theorist trained at my Alma Mater in Prague.)
Alas, I remain deeply confused about the physical meaning of these results, if any. As the authors define them, "superqubits" seem to involve amplitudes that can be Grassmann numbers rather than just complex numbers. While I know next to nothing about the topic, that seems like a fundamental departure from "supersymmetry" in the sense that high-energy physicists use the term! I take it that supersymmetry is "just" a proposed new symmetry of nature, alongside the many other symmetries we know, and doesn't involve tampering with the basic rules of quantum mechanics (or with spatial locality). In particular, in supersymmetric theories one still has unit vectors in a complex Hilbert space, unitary transformations, etc.
If that's correct, though, then what on earth could superqubits have to do with supersymmetry in physics – besides perhaps just repurposing some of the same mathematical structures in a totally different context? Is there any possibility that, if nature were supersymmetric in such-and-such a way, then one could do an actual experiment that would violate Tsirelson's bound?
Your humble correspondent answered:
I completely agree with Scott that this particular "Grassmannization" isn't equivalent to what supersymmetry is doing in physics. Supersymmetry is a constraint that picks a subset of theories – ordinary theories with ordinary bosonic and fermionic fields that are just arranged (and whose interactions are arranged) so that there is an extra Grassmann-odd symmetry. Because supersymmetric theories are a subset of more general theories, of course all the general inequalities that hold for the more general theories hold for supersymmetric theories, too. And there are many new, additional inequalities and conditions that hold for supersymmetric theories – but not fewer constraints.
In supersymmetric theories, what becomes Grassmann numbers are never probability amplitudes. Only particular observables are fermionic operators – operator counterparts of Grassmann-number-valued quantities in classical physics. These fermionic operators only have nonzero matrix elements between Grassmann-odd states and Grassmann-even states, for the same reason that bosonic operators only have nonzero matrix elements between states of the same grading. One may introduce a grading on the Hilbert space but the amplitudes are still complex commuting \(c\)-numbers.
If a basis of the Hilbert space has Grassmann-even as well as Grassmann-odd states (e.g. states with an even or odd number of fermionic excitations), then the actual state in which the system may be found is either Grassmann-even or Grassmann-odd, which means that it is only composed of basis vectors of the same kind. Mixing of these two types of basis vectors isn't allowed; that's what the grading (or, equivalently, the superselection rule for bosons and fermions) means. All the coefficients of the wave function in front of the basis vectors are still ordinary complex commuting \(c\)-numbers.
There's a simple reason why probability amplitudes can't be Grassmann numbers. To get physical commuting quantities out of Grassmann numbers, one always has to integrate. That's why the Grassmann variables may appear as integration variables in Feynman's path integral; but that's also why they have to be set to zero if we're doing classical physics. There aren't any particular nonzero values of Grassmann numbers. On the other hand, probability amplitudes don't have to be integrated; their absolute values should be just squared to obtain the probabilities (or their densities such as differential cross sections).
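If you want to see the nilpotency and the "only integration yields a number" point explicitly, here is a minimal, self-contained Python sketch of a Grassmann algebra with a Berezin integral. All the names and the dictionary representation are my own illustrative choices, of course, not anything from the papers:

```python
# Minimal Grassmann algebra over generators theta_1, ..., theta_n.
# An element is stored as {sorted tuple of generator indices: coefficient}.
class Grassmann:
    def __init__(self, terms=None):
        self.terms = {k: v for k, v in (terms or {}).items() if v != 0}

    @staticmethod
    def gen(i):
        """The generator theta_i."""
        return Grassmann({(i,): 1})

    def __add__(self, other):
        out = dict(self.terms)
        for k, v in other.terms.items():
            out[k] = out.get(k, 0) + v
        return Grassmann(out)

    def __mul__(self, other):
        if not isinstance(other, Grassmann):  # scalar multiple
            return Grassmann({k: v * other for k, v in self.terms.items()})
        out = {}
        for ka, va in self.terms.items():
            for kb, vb in other.terms.items():
                if set(ka) & set(kb):
                    continue  # theta_i * theta_i = 0 (nilpotency)
                merged = list(ka + kb)
                # sign of the permutation needed to sort the generators
                sign = 1
                for i in range(len(merged)):
                    for j in range(i + 1, len(merged)):
                        if merged[i] > merged[j]:
                            sign = -sign
                key = tuple(sorted(merged))
                out[key] = out.get(key, 0) + sign * va * vb
        return Grassmann(out)

    __rmul__ = __mul__

    def berezin(self, i):
        """Berezin integral over theta_i: keep terms containing theta_i,
        with the sign from commuting theta_i to the front."""
        out = {}
        for k, v in self.terms.items():
            if i in k:
                rest = tuple(x for x in k if x != i)
                out[rest] = out.get(rest, 0) + (-1) ** k.index(i) * v
        return Grassmann(out)

t1, t2 = Grassmann.gen(1), Grassmann.gen(2)
assert (t1 * t1).terms == {}                              # nilpotent: no nonzero "value"
assert (t1 * t2 + t2 * t1).terms == {}                    # anticommuting
assert (t1 * t2).berezin(1).berezin(2).terms == {(): 1}   # integration yields a c-number
```

Note that the only way this toy algebra ever produces an ordinary number is through the Berezin integral; there is no "evaluation" of \(\theta\) at a nonzero point, which is exactly why a Grassmann-valued amplitude could never be squared into a probability.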
So if their construction is consistent at all, it's just a mathematical analogy of superspaces at a different level – amplitudes themselves are considered "superfields" even though in genuine quantum physics, amplitudes are always complex numbers. That's why the inequalities can't be considered analogous to Belllike inequalities and can't be applied to real physics. In particular, once again, Tsirelson's bound can't be violated by theories just because they're supersymmetric (in the conventional sense, just like the MSSM or type IIB string theory) because it may be derived for any quantum theory, whether it is supersymmetric or not, and supersymmetric theories are just a submanifold of more general theories for which the inequality holds.
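As a sanity check on that last claim, the Tsirelson bound is easy to verify numerically inside ordinary quantum mechanics: with a Bell state and suitably rotated spin measurements, the CHSH correlator reaches \(2\sqrt{2}\) and the winning probability \(\cos^2(\pi/8)\approx 0.854\), and no choice of angles does better. A short NumPy sketch (the angles are the standard optimal ones; all variable names are mine):

```python
import numpy as np

# Pauli matrices and the Bell state |Phi+> = (|00> + |11>)/sqrt(2)
Z = np.array([[1, 0], [0, -1]], dtype=float)
X = np.array([[0, 1], [1, 0]], dtype=float)
phi_plus = np.array([1, 0, 0, 1]) / np.sqrt(2)

def correlator(a, b):
    """<psi| A(a) x B(b) |psi> for spin measurements in the X-Z plane."""
    A = np.cos(a) * Z + np.sin(a) * X
    B = np.cos(b) * Z + np.sin(b) * X
    return phi_plus @ np.kron(A, B) @ phi_plus

# Optimal CHSH angles for |Phi+>
a0, a1 = 0.0, np.pi / 2
b0, b1 = np.pi / 4, -np.pi / 4

S = (correlator(a0, b0) + correlator(a0, b1)
     + correlator(a1, b0) - correlator(a1, b1))
win = 0.5 + S / 8  # CHSH-game winning probability

assert abs(S - 2 * np.sqrt(2)) < 1e-12            # Tsirelson's bound, saturated
assert abs(win - np.cos(np.pi / 8) ** 2) < 1e-12  # ~ 0.8536
```

Each correlator here equals \(\cos(a-b)\), and the bound \(|S|\le 2\sqrt{2}\) follows for any state and any observables – which is the point: the derivation never asks whether the theory is supersymmetric.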
I would point out that it wouldn't be the first time that Michael Duff and collaborators have given wrong interpretations to various objects related to quantum computation. Some formulae for the entropy of black holes mathematically resemble formulae for entangled qubits etc. But the interpretation is completely different. In particular, the actual information carried by a black hole is \(A/4G\) nats i.e. the black holes roughly parameterize an \(\exp(A/4G)\)-dimensional space of microstates. That's very different (by one exponentiation) from what is needed for the quantum-information interpretation of these formulae in which the charges themselves play the role of the number of microstates.
So I think that at least Michael Duff has been sloppy when it came to the interpretation of these objects which was the source of his misleading comments about the "black hole entropy formulae emulating tasks in quantum computation". There may be mathematical similarities – I am particularly referring to the Cayley hyperdeterminant appearing both in quantum computing and black hole entropy formulae – but the black holes aren't really models of those quantum algorithms because their actual Hilbert space dimension is the exponential of what it should be for that interpretation and they're manipulating pretty much all the qubits at the same moment. The objects in the hyperdeterminant have completely different interpretations on the string theory and quantum computing side; there isn't any physical duality here, either.
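The mathematical similarity is concrete, by the way: Cayley's hyperdeterminant of a \(2\times 2\times 2\) array appears both as the three-qubit "3-tangle" \(\tau = 4|{\rm Det}\,\psi|\) and, with completely different objects inserted, in the STU black hole entropy formula. A self-contained sketch using the standard expansion of the hyperdeterminant (the function name is mine):

```python
import numpy as np

def cayley_hyperdet(a):
    """Cayley's hyperdeterminant of a 2x2x2 array a[i,j,k]."""
    d1 = a[0,0,0] * a[1,1,1]
    d2 = a[0,0,1] * a[1,1,0]
    d3 = a[0,1,0] * a[1,0,1]
    d4 = a[0,1,1] * a[1,0,0]
    return (d1**2 + d2**2 + d3**2 + d4**2
            - 2 * (d1*d2 + d1*d3 + d1*d4 + d2*d3 + d2*d4 + d3*d4)
            + 4 * (a[0,0,0]*a[0,1,1]*a[1,0,1]*a[1,1,0]
                   + a[0,0,1]*a[0,1,0]*a[1,0,0]*a[1,1,1]))

# GHZ state (|000> + |111>)/sqrt(2): maximal 3-tangle, tau = 4|Det| = 1
ghz = np.zeros((2, 2, 2))
ghz[0,0,0] = ghz[1,1,1] = 1 / np.sqrt(2)

# W state (|001> + |010> + |100>)/sqrt(3): tau = 0
w = np.zeros((2, 2, 2))
w[0,0,1] = w[0,1,0] = w[1,0,0] = 1 / np.sqrt(3)

assert abs(4 * abs(cayley_hyperdet(ghz)) - 1) < 1e-12
assert abs(cayley_hyperdet(w)) < 1e-12
```

On the black hole side, the eight entries of the array are charges, not amplitudes of a normalized state – which is exactly why the shared formula doesn't make the black hole a model of the quantum algorithm.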
Let me return to the superqubits. Normal quantum mechanics realizes all "bosonic Lie group" transformations you may think of – such as rotations or translations – as elements of \(U(N)\) acting on the Hilbert space; that's true for all quantum field theories we know and for string theory, too. You could think that there could be a "natural" extension where \(U(N)\) is replaced by \(U(M|N)\), a supergroup. Similarly, the normalization condition for the wave function "could" include squared fermionic amplitudes.
The first "novel" idea is partly possible: generators of supersymmetry indeed correspond to operators that map bosonic states to fermionic ones and vice versa. However, to get an "actual finite supersymmetry transformation", you need to consider objects of the type\[
\exp(\theta_\alpha Q^\alpha - \text{h.c.})
\] in which the generator \(Q^\alpha\) is multiplied by a Grassmann-odd coefficient \(\theta_\alpha\). Again, there are no "particular nonzero values" of Grassmann-odd numbers so you can't talk about any "particular transformation" of this form. In particular, the evolution in time can never map bosonic states to fermionic ones and vice versa. You must view the generators of supersymmetry as fermionic operators that map particular bosonic states to particular fermionic states but the exponential above can't be constructed because there are no "actual values" of \(\theta_\alpha\) that you could insert. The finite exponential is just an abstract construct designed to look analogous to similar exponentials of bosonic operators but there's still a difference, namely that the bosonic numbers take values in a particular set while the fermionic ones don't.
Equivalently, we may observe that quantum mechanics makes it possible to learn the initial state of a physical system – everything about it – up to an overall phase (or normalization). This couldn't be the case if some amplitudes were allowed to be fermionic because Grassmann numbers can't really be "measured" by any apparatus, and their values can't be "pronounced" (except for zero).
Effectively, these superqubit folks conclude that they may violate universal inequalities for quantum mechanics because they allow certain objects – probability amplitudes – to take values of an entirely different form than what is allowed in quantum mechanics. In practice, it has the same effect as if you assumed that \(|c|^2\) may be negative for a complex number \(c\) (it can't), which is the ultimate reason why they think that the winning probability may be higher than what is allowed by quantum mechanics.
This is actually more than just an analogy. The actual role of the Grassmann-odd amplitudes is that there is an extra term in \(\langle\psi|\psi\rangle\) which is equal to \(i\theta_1\theta_2\) or \(\theta^*\theta\), a product of two Grassmann variables, replacing \(|c_i|^2\). But given one possible arrangement of this sort, the other arrangements are obtained by rescaling \(\theta_1,\theta_2\) by the inverse factors \(k\) and \(1/k\), respectively, so this extra term in the overall probability is really equivalent to the bosonic \(xy\), which isn't positive definite; if subsystems were described by superqubits, they could be created with negative probabilities. The feeling that \(\theta^*\theta\) could be positive definite is an illusion because these two Grassmann variables must actually be independent; there's no way to impose the usual reality condition on one Grassmann variable.
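The bosonic analogue of this last point is elementary linear algebra: the cross term \(xy\), unlike \(|c|^2 = x^2 + y^2\), is an indefinite quadratic form, so any "probability" built from it can go negative. A two-line numeric check (purely illustrative):

```python
import numpy as np

# The cross term q(x, y) = x*y written as v^T M v with M symmetric
M = np.array([[0.0, 0.5], [0.5, 0.0]])
eigs = np.linalg.eigvalsh(M)  # ascending order

assert np.allclose(eigs, [-0.5, 0.5])  # one negative, one positive: indefinite
v = np.array([1.0, -1.0])
assert v @ M @ v < 0                   # e.g. q(1, -1) = -1
```

By contrast, the matrix of \(|c|^2 = x^2 + y^2\) is the identity, with both eigenvalues positive – that positivity is precisely what the Grassmann replacement throws away.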
There are no superqubits but there are superorganisms. ;) A community of ants (a superorganism), together with several human collaborators, built a sophisticated system of routes and tunnels out of concrete in the soil. It's the equivalent of the Great Wall of China but it's much more structured and fractal than the Chinese counterpart. The human society is doing similar things although the algorithms are less "hardwired" in the heads of the humans. However, it's essential for human progress that many people work "outside the system", as individuals.
snail feedback (7) :
You said "Some formulae for the entropy of black holes mathematically resemble formulae for entangled qubits etc. But the interpretation is completely different. "
This criticism is not warranted because Duff has always tried to make it clear that he and his collaborators are not claiming anything more than a mathematical similarity between quantum information theory and some objects appearing in string theory. Duff has said that so far it is useful but not deep (see e.g. http://mathdl.maa.org/mathDL?pa=mathNews&sa=view&newsId=936)
Dear Philip, the very page you link to doesn't actually say it's not deep. Instead, it says:
This may be telling us something very deep about the world we live in, or it may be no more than a quirky coincidence.
It also says that this research is an application of string theory to quantum computation, which it surely is not. That's really the question here. The objects that enter the analogous stringy formulae at the place of the qubits simply aren't qubits so the formulae aren't quantum information formulae. Also, and it's related, the title of the article "mathematics of string theory may be tested" doesn't follow from that research, because if it could be tested for the Duff reasons, it would be because string theory shows up in quantum computation in this way. But it doesn't, because those objects aren't qubits.
If you, Lubos, remain nonplussed about (or do not know) what is going on here, it guarantees that I will also remain short of a clue – *at least* until you do. %}
Certain tunnels, in particular below Geneva, are very important for humanity ... :D.
Thanks Lumo for pointing out this interesting question and your clear answer. From this even I can see that superqubits do not work :). I still have a paper sitting on my writing desk about a renormalization group analysis of turbulence where some fermionic fields are introduced, too, that cancel (some?) corresponding loops of the bosonic fields etc... I'm wondering what the fermionic fields mean in this case but I'll have to come back here (or go to Physics SE :P) when I've read it (and hopefully understood a bit) more carefully ...
Now I'm going to physics SE to upvote the question and your answer :).
(Phil Gibbs has written an answer too now)
The crucial word in the quote is "may". It may be telling us something deep but so far there is no real indication of that. It is just a mathematical correspondence. The word "may" makes the statement very weak but when people think back on what they read, they forget the word was there and make a big deal about what they think was said. It seems to be very hard to avoid this kind of confusion when there are people out to criticise anything to do with string theory.
Yep, I can vividly imagine how the trollmaster jumps on that ...
Dear Dilaton, superqubits are a fun mathematical generalization of qubits, just not a physical one.
As you wrote elsewhere, it's intriguing to soften claims by the word "may" but I still think that the right word is "may not" so the contradiction is there even after the softening...
Things may be telling us lots of deep things indirectly but if we know that a factor in a formula is not a qubit and can't be a qubit, we also know that the formula isn't telling us anything about qubits entering similar formulae. ;)