Wednesday, January 08, 2020

Donald Knuth clarifies why \(P=NP\) seems likely

Neutrino physics: IceCube has ruled out an astrophysical explanation of the strange ANITA events (see arXiv) by placing an upper limit on the flux from the would-be sources, thereby strengthening the odds that ANITA has observed a stau (the supersymmetric partner of the tau lepton, \(\tilde \tau_R\))
Old men who have been amazing contributors to STEM fields evolve in various ways. When I randomly looked at the 599th followup of the Weak Gravity Conjecture – the first hep-th paper posted in 2006 (hep-th/0601001: I didn't have to be particularly fast to get this nice number) – it turned out to be a paper titled "Musings..." by Mikhail Shifman, who got an even better number, arXiv:2001.00101, except that his paper only appeared in a historical archive dedicated to crackpots, senile men, and the general public.

Shifman has been great and he's still extremely active but I think that he has lost the ability to follow and evaluate the new ideas from the truly best physicists who are around right now. In many respects, the paper seems as shallow as the whining of many childish fake scientists who just repeat "physics or string theory must be wrong because I don't understand why it's correct, so it must be bad science or no science" and all this amazing garbage often heard from the failures-to-think who pretend to be thinkers.



Thanks to Willie Soon for the video

When I listen to Donald Knuth, it's a totally different experience: even though he sometimes stutters (and if you are annoyed by this superficial imperfection like I am, believe me that our reactions are mostly due to our compassion and intelligence; we just can't help ourselves), he is clearly the intellectual master in the room – despite the fact that the young host, Lex Fridman, isn't just a journalist: he is developing some AI and autonomous vehicles at MIT. But relative to Knuth and his depth, Fridman sounds like just another journalist who is interviewing someone vastly smarter.

Between 58:02 and 1:10:05 in this 1:46:00-long interview, he explains his reasons for thinking that \(P=NP\) is more likely than \(P\neq NP\), although many people would love to promote the latter to the only allowed faith. I have previously discussed Knuth and \(P=NP\) in 2016.



In the interview, Knuth talks about tons of other things but I can't cover everything. You are more than encouraged to discuss anything related that you find interesting.

Recall that \(P\) is the set of all problems, dependent on a size parameter \(n\in \ZZ\), whose solution may be found after \(O(n^K)\) steps (on a Turing machine) where \(K\) is some finite exponent: they can be solved in polynomial time. \(NP\) is a class that is at least as large, containing the problems for which the correctness of a proposed solution may be verified in polynomial time, defined in the same way (you can find a proof that the solution is correct; the proof is automatically assumed to be a proof of some general type with a template into which the long computation just inserts some problem-specific details).
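To make the solve-versus-verify asymmetry concrete, here is a minimal Python sketch of my own (not something from the interview) for the subset-sum problem: checking a proposed certificate takes a number of steps roughly linear in the input, while the naive search for a solution tries exponentially many subsets.

```python
from itertools import combinations

def verify(numbers, target, certificate):
    """Polynomial-time check: is the certificate drawn from the input and does it sum to the target?"""
    return all(x in numbers for x in certificate) and sum(certificate) == target

def solve_naively(numbers, target):
    """Brute force: tries up to 2^n subsets, i.e. exponentially many in the input size."""
    for r in range(len(numbers) + 1):
        for subset in combinations(numbers, r):
            if sum(subset) == target:
                return list(subset)
    return None

numbers = [3, 34, 4, 12, 5, 2]
certificate = solve_naively(numbers, 9)              # slow in general
print(certificate, verify(numbers, 9, certificate))  # prints [4, 5] True; the check itself is fast
```

Whether a vastly cleverer algorithm can always replace such brute-force searches by a polynomial one is exactly the \(P\) vs \(NP\) question.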



Finding the correct solution automatically includes the verification, so the "solve it" task is at least as hard as the "verify it" task, and therefore the polynomially "solvable" problems \(P\) are a subset of \(NP\). The question is whether it is a proper subset, i.e. \(P\neq NP\), or whether the two classes are equal, \(P=NP\). If they're equal, it means that for these problems, the process of finding the solution is at most "polynomially harder" than the process of verification.

The groupthink that some people want to promote to a universal dogma, despite the absence of any proof, says that it must be \(P\neq NP\) because lots of problems are known that we can verify in practice but can't solve by any known algorithm even though "many smart people have tried" to find such algorithms, motivated, moreover, by $1 million prizes etc.

Knuth says that, first, the money is no argument at all. One million dollars is simply too small a reward for finding an extremely hard-to-comprehend algorithm. Second, it is an illegitimate appeal to authority which is applied in a biased way. Just like one can say "look, so many smart people and no one has found a fast solution to an \(NP\) problem", one can say "look, so many smart people and no one has found a proof of \(P\neq NP\)". These comments may be equally well used as a demagogic justification of \(P\neq NP\) or \(P=NP\), respectively. Third, and more importantly, the number of algorithms that "exist" in the mathematical sense is vastly greater than the number of algorithms that a mortal human being may master.

There are lots of algorithms that are unknown to humans, that would be surprising to humans, and that exploit lots of patterns and regularities which exist but which humans would simply not understand because humans are stupid in comparison with the "entire wisdom of the Platonic world of mathematics". So humans unavoidably work with probability distributions that implicitly deny the existence of all major, far-reaching patterns – and that's a fundamentally wrong assumption.



To simplify the main point, Knuth simply says that the number of algorithms that exist mathematically is so vast that some of them are likely to be good enough to solve the so-far-unsolved \(NP\) problems in polynomial time. Knuth presents two wonderful examples. The first one is Hex, a board game generalizing Tic Tac Toe (see the picture above), co-invented by John Nash. The red and blue players place stones of their color on the grid of hexagons. When the left and right blue boundaries are connected by a blue path, the blue player wins. When the upper and lower red boundaries are connected by a red path, the red player wins. The Czech public TV knowledge contest AZ Quiz uses a similar game (just as a background; the stones are placed when you answer correctly) but the hexagons sit inside an equilateral triangle, not a diamond.
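To make the winning condition concrete, here is a minimal Python sketch of my own (nothing from the interview): a flood fill over the six hexagonal neighbours that checks whether the blue stones connect the left and right edges of an \(n\times n\) Hex board.

```python
from collections import deque

def blue_wins(board):
    """board[r][c] is 'B', 'R', or '.'; True iff blue connects the left and right edges."""
    n = len(board)
    # the six neighbours of a cell in the standard rhombic Hex layout
    steps = [(-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0)]
    # start a flood fill from every blue stone touching the left edge
    queue = deque((r, 0) for r in range(n) if board[r][0] == 'B')
    seen = set(queue)
    while queue:
        r, c = queue.popleft()
        if c == n - 1:              # reached the right edge: blue has won
            return True
        for dr, dc in steps:
            nr, nc = r + dr, c + dc
            if 0 <= nr < n and 0 <= nc < n and (nr, nc) not in seen and board[nr][nc] == 'B':
                seen.add((nr, nc))
                queue.append((nr, nc))
    return False

# a 3x3 example where blue has a winning chain across the middle row
print(blue_wins(["RB.", "BBB", ".RR"]))   # True
```

Detecting a win is trivially fast; the hard, combinatorially exploding part is deciding which move to play, which is exactly where the unknown winning strategy hides.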

One may show that there are no draws (the first proof was written down by Nash in 1949) and "just a slight" extension of the proof shows that, assuming two ideal players who can see all the combinations and paths (and there is just a finite number of them), Hex is actually a "won game" for the first player! While we know which idealized player wins, we have no idea what the right strategy to win is.

Knuth's second example concerns the classification of non-planar graphs. A graph is a bunch of vertices connected by edges; a planar graph is one that can be drawn so that no edges cross. There exists a proof that for any class of graphs closed under a certain operation (taking a minor) – and such minor-closed classes really contain the "bulk" of nontrivial classes of graphs, so the theorem morally applies to "all classes of graphs" – every graph outside the class contains one of the bad minors, and the number of such bad minors that have to be classified is finite. Well, in the fully understood special example, Wagner's theorem, it's enough to ban \(K_5\) and \(K_{3,3}\) as minors to guarantee planarity. A minor is a generalization of a subgraph: you obtain it from the original graph either by removing vertices with all their edges (that alone would give a subgraph) or by contracting edges, i.e. identifying pairs of vertices that were connected.



So it has been proven that there exists a finite list of "viruses", i.e. bad minors, that your "immune system" or "antivirus program" has to check, and if all of them are absent, then you may say that the graph is planar, i.e. free of the non-planar infection. Knuth talked about the generalization of this fact due to Robertson and Seymour to other classes. The planar graphs form a class that is closed under taking minors and we may indeed decide whether a graph is planar by checking for the absence of the two minors above ("everyone connected with everyone among five vertices", and "three utilities connected to three consumers, each with each"). But the point is that one may define many other minor-closed classes for which it has been proven that a polynomial method exists to determine membership in the class (by checking for a finite number of obstructions); yet the actual list of "bad minors" is unknown for the generic other classes. There are lots of polynomially fast algorithms which exist thanks to an existence proof but we can't construct these algorithms.
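As a tiny self-contained illustration of why those two graphs are the culprits (my own sketch using Euler's formula bounds, not a genuine minor-testing algorithm): a simple planar graph with \(v\ge 3\) vertices has at most \(3v-6\) edges, a bipartite planar one at most \(2v-4\), and both \(K_5\) and \(K_{3,3}\) violate these bounds.

```python
def complete_graph_edges(n):
    """Number of edges of K_n: every pair of the n vertices is connected."""
    return n * (n - 1) // 2

def complete_bipartite_edges(a, b):
    """Number of edges of K_{a,b}: every vertex on one side meets every vertex on the other."""
    return a * b

# Euler's formula implies: planar => e <= 3v - 6; bipartite planar => e <= 2v - 4 (for v >= 3)
v5, e5 = 5, complete_graph_edges(5)
v33, e33 = 6, complete_bipartite_edges(3, 3)
print(f"K_5:   {e5} edges > {3 * v5 - 6} allowed for planar graphs")          # 10 > 9
print(f"K_3,3: {e33} edges > {2 * v33 - 4} allowed for bipartite planar")     # 9 > 8
```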

It is very clear to a sufficiently intelligent person why this example is relevant for an intelligent opinion about \(P=NP\) but the MIT AI specialist clearly had no idea what the relationship was. Well, the finite number of "bad minors" is exactly analogous to the "finite exponents" in a hypothetical algorithm to solve an \(NP\) problem quickly. The religion maintaining the \(P\neq NP\) dogma doesn't necessarily say that all such lists of "bad minors" and similar things must be infinite sets; but if its believers had a coherent philosophy, it's rather clear that they would make this claim about the "infinite cardinality of these sets", too. And that's a provably wrong proposition.

So the list of the "bad minors" is finite. Nevertheless, we don't know what the list exactly is.

Knuth generally says that due to deep and complicated reasons, a particular sequence of operations may simply but totally surprisingly solve a seemingly unrelated problem. Take an NP-complete problem, e.g. one involving graphs. There might be a similar obstruction-checking algorithm that solves it and the existence of such a single algorithm would be enough to prove \(P=NP\).

I think that a person who comes from "deep theoretical physics" or "beautiful mathematics" such as group theory must totally sympathize with Knuth's point because he is really saying that there are many special patterns and large integers whose special character just cannot be obvious to an untrained, shallow human brain – and they must look surprising.

OK, the simple compact Lie groups may look like a chaotic mess that could have any dimensions. But we may actually classify them. They're the infinite classes \(SU(N)\), \(SO(N)\), \(USp(2N)\), and five exceptional groups \(G_2,F_4,E_6,E_7,E_8\). The largest exceptional group, \(E_8\), is famously 248-dimensional. All these claims about the "largest exceptional group" with a strange dimension such as 248 would have looked unbelievable to people who lived a long time ago.
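For completeness – these dimensions are standard facts, not something discussed in the interview – the dimensions of the classified groups are\[
\dim SU(N)=N^2-1,\quad \dim SO(N)=\frac{N(N-1)}{2},\quad \dim USp(2N)=N(2N+1),
\]\[
\dim G_2=14,\quad \dim F_4=52,\quad \dim E_6=78,\quad \dim E_7=133,\quad \dim E_8=248.
\]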

The finite simple groups are perhaps even more shocking. Aside from some systematic infinite classes, you need to list 26 or 27 exceptional cases, the sporadic groups (the count depends on whether the Tits group is included). The largest one among them, the counterpart of \(E_8\), is the monster group with
808,017,424,794,512,875,886,459,904,961,710,757,005,754,368,000,000,000
elements. That's almost 10 to the 54th power. The smallest nontrivial linear representation is 196,883-dimensional and 196,884 is a coefficient in the most important expansion of the seemingly unrelated \(j\)-function from complex analysis. The proximity of these numbers could be guessed to be an unexplainable accident – "there is nothing to see here" – except that we know that there is a lot to see here. The relationship – the monstrous moonshine – has deep, now well-understood reasons which can be made manifest by a special compactification of (basically) bosonic string theory on the 24-dimensional Leech lattice.
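Just to show how concrete this monstrous number is – the prime factorization below is the standard one quoted in the literature, and the check is my own – a few lines of Python reproduce it:

```python
# standard prime factorization of the order of the monster group
factors = {2: 46, 3: 20, 5: 9, 7: 6, 11: 2, 13: 3,
           17: 1, 19: 1, 23: 1, 29: 1, 31: 1, 41: 1, 47: 1, 59: 1, 71: 1}

order = 1
for prime, exponent in factors.items():
    order *= prime ** exponent

print(order)           # 808017424794512875886459904961710757005754368000000000
print(f"{order:.3e}")  # roughly 8.080e+53, i.e. almost 10^54
```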

Also, you may ask how much \(\exp(\pi \sqrt{163})\) is. It looks like a stupid exponential of a stupid product of a stupid pi and a stupid arbitrary square root of some bizarre large integer. But the numerical value is
262537412640768743.99999999999925007...
It's almost an integer: there are 12 copies of the digit 9 right after the decimal point. That must be a coincidence, lots of people would say, confident that by denouncing any conspiracy theory – and this does smell like a conspiracy theory – they look smarter and more rational. But you know, some conspiracy theories are true theories about conspiracy facts. And the set of possible conspiracy facts is large enough that we simply do encounter them.

And we do understand why the exponential above, exploiting the largest Heegner number 163, is almost an integer: the proof boils down to the same expansion of the same aforementioned \(j\)-function around a different point.
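If you want to check the near-integrality yourself, a couple of lines of Python will do – a sketch of mine assuming the mpmath library is available; the comparison with \(640320^3+744\) comes from the \(j\)-function expansion just mentioned:

```python
from mpmath import mp, exp, pi, sqrt

mp.dps = 40                         # work with 40 significant decimal digits
value   = exp(pi * sqrt(163))       # exp(pi * sqrt(163))
nearest = 640320**3 + 744           # the integer predicted by the j-function expansion
print(value)                        # 262537412640768743.99999999999925007...
print(nearest - value)              # about 7.5e-13
```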

I've mentioned some examples of surprising mathematical facts that are understood because they were sufficiently unique and fundamental – revolving around the largest exceptional or sporadic groups \(E_8\) or \(M\) and the unique \(j\)-function that maps the fundamental domain of \(SL(2,\ZZ)\) to a sphere. General problems with traveling salesmen and similar things are analogous – they are also mathematics – but they are far less unique, far less beautiful, and the patterns that govern them are likely to look far more contrived. But there is no reason why the wisdom and "conspiracy facts" that allow you to master these problems should be non-existent.

When we say that the number of steps needed to solve an \(NP\) problem is polynomial, it doesn't mean that the number of steps is around \(n^2\). It may be\[
P\cdot n^Q
\] where \(P\) or \(Q\) or both (here \(P\) is just a constant prefactor, not the complexity class) are some extraordinarily large numbers – comparable to 196,883, or to the order of the monster group, or even greater integers. It simply can happen. An algorithm may be "recursive" in character and naively it could take an exponentially or factorially large number of operations. Except that there may also exist mathematical patterns – such as those behind the finite list of "bad minor graphs" or the finite number of exceptional simple compact Lie groups etc. – which guarantee that the degree of recursion is never greater than some large number like 196,883. That's why the "potentially" exponentially slow algorithm may very well be universally polynomial, and provably so. We just don't know the proof now.

And the existence of such finite numbers \(P,Q\) that make the \(NP\) class "polynomially solvable" may be totally useless for practical purposes. If the prefactor \(P\) were the order of the monster group, almost \(10^{54}\), then the algorithm would be practically useless despite its formally polynomial character because \(10^{54}\) is just too high a number: this number times the Planck time is still more than \(10^{10}\) seconds, i.e. well over a thousand years. Similarly, if the exponent were \(Q=196,883\), the growth of the required time would be so steep that the formally polynomial character of the function would be useless.
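A back-of-the-envelope check of that claim – my own arithmetic, using the usual approximate value of the Planck time:

```python
PLANCK_TIME = 5.39e-44      # seconds, approximate value of the Planck time
SECONDS_PER_YEAR = 3.156e7

prefactor = 8.08e53                  # roughly the order of the monster group
seconds = prefactor * PLANCK_TIME    # one algorithmic step per Planck time, absurdly optimistic hardware
print(seconds, "seconds")                    # ~4.4e10 seconds
print(seconds / SECONDS_PER_YEAR, "years")   # ~1,400 years, i.e. well over a thousand
```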

But I am not claiming that either \(P\) or \(Q\) or both must be insanely large. It's also possible that they're rather low. Feasibly fast polynomial algorithms could very well exist. We just don't know them. They could be very clever and dependent on mathematical "accidents" that even the deepest living mathematicians would consider incredible miracles. But they would have some reasons that are potentially understandable and hackable, just like the monstrous moonshine. They're just not understood in early 2020, much like there were no actually constructed flying machines heavier than air in early 1903 (but you should have waited to see what the Wright brothers would show in December).

In the end, I think that both Knuth as a \(P=NP\) believer and the \(P\neq NP\) believers are rationalizing their beliefs. It is possible to collect evidence and general "visions" about the organization of proofs and algorithms that make one believe \(P=NP\), \(P\neq NP\), or any unproven assertion increasingly strongly with time. This process is always a rationalization and a form of demagogy and self-deceit. One must understand that any strict opinion about an unproven assertion such as the \(P\) vs \(NP\) problem is just a prejudice.

Any such problem may be framed as a clash of the titans similar to the question whether "an omnipotent God may destroy an unbreakable stone" and similar stuff which many of us have heard from our would-be deep 10-year-old classmates – and sometimes this stupid sophistry did look deep to us, indeed. ;-) Just to avoid misunderstandings, I almost always found those discussions dumb because an axiomatic system equipped both with an "omnipotent God" and an "unbreakable stone" is simply inconsistent (self-contradictory) and intelligent people stop right away when they realize that a system is inconsistent (intelligent people do spend some time checking the consistency at the very beginning; they care about it!). They dismiss such systems. I find this attitude critical for the psyche of a good theoretical physicist – who just wants to spend his time on consistent theories and systems.

You know, in the metaphor, there is an omnipotent God-solver who knows an infinite number of algorithms, most of which are inaccessible to mortal human beings. That's Knuth's side. But the \(P\neq NP\) believers will insist that there also exists an unbreakable stone, the \(NP\) problems that would need to be solved in polynomial time to be broken. They must be unbreakable because humans would otherwise be too dangerous for their socialist system or too close to God, whatever the exact argument is. Well-paid academicians who worship socialism and transsexuals have tried for years to find flying machines heavier than air or polynomially fast algorithms and they found none. So you are obliged to agree that airplanes and fast algorithms don't exist, otherwise you would insult these comrades' sensitivities! ;-) That's exactly how I understand Scott Aaronson's main argument.

Well, maybe humans may be close to God and they may ask Him for help in breaking the \(NP\) stones, maybe it's not enough. We just don't know.

We don't know the resolution to the \(P\) vs \(NP\) problem but each of us may still report on his feelings about the rationality and honesty of someone's argumentation. And I personally find e.g. Donald Knuth's argumentation to be far more impartial and rational than e.g. Scott Aaronson's argumentation.


