It’s All Algorithms, Algorithms and Algorithms
All results, whether upper or lower bounds, are based on new programming tricks
Donald Knuth has won the Turing Award, and almost every other prize possible. He created the modern field of the analysis of algorithms, and his work in combinatorics and algorithms has had a tremendous impact on the field. He also created TeX, which allows us to write mathematics that looks great. Thanks Don.
Today I want to discuss one reason why I think that P=NP is at least a possible outcome. It has to do with Knuth’s Dictum.
Knuth’s Dictum is: Computer Science is algorithms. That’s it. Not just theory. All. Everything. If there is not an algorithm lurking around, then it is not Computer Science. I am not sure I have always agreed with this, but I think it probably is a reasonable statement. Back to this in a moment.
Knuth has a rule, not a dictum, that when he cites you in his famous textbooks on The Art of Computer Programming, he must have your full name. He wants your first, middle, and last name. Nothing less will do.
Today with Google, getting full names is a bit easier than back in the 1970s when he was editing his second volume. Although it still is not trivial to get even just the first and last name of some people. I know because I have a rule: if I mention you in a post, I want your full first and last name. I decided from the beginning that it was courteous to always refer to everyone by at least their first and last name. This rule has not always been easy to follow, since there are many who do not use their full name. I have sometimes had to work pretty hard to get their full name, but so far I believe I have always gotten them.
Knuth’s rule is in another league—the big leagues. He will not settle for less than the full name, middle name included. In the seventies Knuth sent me a letter—this was before email—asking me for my full name. I love to have fun, I hope that comes out, and decided to write a letter back. The letter I sent him went like this:
I never heard a peep from him about this, but I was correctly cited in the next edition.
Let’s turn now to one reason why I feel the way I do about P=NP.
Algorithms, Algorithms, and Algorithms
I have a thesis that I wish to advance—it is a corollary to Knuth’s Dictum. The thesis is that all the advances in computational complexity theory, all advances, have been from the discovery of new algorithms or programming “tricks”. Every advance has been the result of someone discovering a new way to program. Sometimes the programming was explicit, as when they discovered a new algorithm; and sometimes it was implicit, as when, for example, they proved a lower bound. But all advances come from new programming tricks.
These tricks may involve programming nondeterministic machines in some new way, using randomness in some novel manner, or programming circuits in some clever way. But, it’s algorithms, always algorithms.
Here are some examples of what I mean. The examples are from famous results that are not stated directly as algorithmic papers.
Neil Immerman and Róbert Szelepcsényi proved that nondeterministic space is closed under complement. This is a theorem all about space classes, but at the core is a clever algorithm that proves that a path does not exist in a directed graph. They use a counting trick to prove this.
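To make that counting trick concrete, here is a toy deterministic sketch in Python (the function name and graph encoding are mine; the real Immerman–Szelepcsényi algorithm carries out the counting nondeterministically in logspace, which is the whole point):

```python
def nonreachability_certificate(adj, s, t):
    """Toy rendering of the inductive-counting idea: compute the set of
    vertices reachable from s in at most k steps, layer by layer, and
    use the final count to certify that t is NOT reachable.
    adj maps each vertex to a list of its out-neighbors."""
    n = len(adj)
    reachable = {s}                       # R_k for k = 0
    for _ in range(1, n):
        # R_k = R_{k-1} plus every out-neighbor of a vertex in R_{k-1}
        reachable = reachable | {w for v in reachable for w in adj.get(v, [])}
    c = len(reachable)                    # the key count |R_{n-1}|
    return (t not in reachable), c
```

In the nondeterministic logspace proof, the count c is what matters: a verifier who knows c can guess a path to each of c distinct vertices, confirm it has accounted for every reachable vertex, and thereby certify that t was not among them—all without ever storing the set itself.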
Alexander Razborov and Roman Smolensky proved that constant-depth circuits with mod-p gates cannot compute the mod-q function, provided q is a different prime than p. The core of this proof is again clever programming of low-degree polynomials. For example, one of the key steps is to show how to approximately compute the OR function with a low-degree polynomial.
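To see the flavor of that step, here is a toy numerical version over GF(2), with names and parameters of my own choosing: for a random subset S of the inputs, the linear polynomial that sums the bits in S mod 2 agrees with OR on any fixed nonzero input with probability 1/2, so combining t independent copies yields a degree-t polynomial that errs on only about a 2^-t fraction of inputs.

```python
import random

def approx_or_poly(n, t, seed=0):
    """Degree-t polynomial over GF(2) approximating OR on n bits:
    pick t random subsets S_1..S_t of the inputs; on input x return
    1 - prod_j (1 - (sum of x[i] for i in S_j, taken mod 2))."""
    rng = random.Random(seed)
    subsets = [[i for i in range(n) if rng.random() < 0.5] for _ in range(t)]
    def q(x):
        val = 1
        for S in subsets:
            val *= 1 - sum(x[i] for i in S) % 2
        return 1 - val
    return q

def or_error_rate(n, t, trials=2000, seed=1):
    """Fraction of random n-bit inputs on which the approximation
    disagrees with the true OR."""
    rng = random.Random(seed)
    q = approx_or_poly(n, t, seed=seed)
    errors = 0
    for _ in range(trials):
        x = [rng.randint(0, 1) for _ in range(n)]
        if q(x) != (1 if any(x) else 0):
            errors += 1
    return errors / trials
```

On the all-zero input the polynomial is exactly 0, and with t = 8 the disagreement rate on random 10-bit inputs is typically well under a percent—low degree bought at the price of a small error, which is exactly the trade the lower-bound proof exploits.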
Omer Reingold proved that undirected reachability could be done in logspace. Essentially he de-randomized the previous random algorithm. His method is a very clever use of the zig-zag product of Omer Reingold, Salil Vadhan, and Avi Wigderson. Again at the core it is an algorithm for searching graphs. For example, one simple subroutine is how to search an expander—just do a brute force search. A much more important part of his proof is how to construct the edges of the result of applying many zig-zag products to a graph. This is an algorithm that must run in logspace for the whole proof to work.
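Here is what that brute-force subroutine looks like in a sketch (the names are mine, and I use a rotation-map-style interface rot(v, i) = the i-th neighbor of v, like the one in the proof; the real algorithm must of course also carry out this enumeration within logspace):

```python
from itertools import product

def reachable_by_short_paths(rot, d, s, t, length):
    """Search a degree-d graph by enumerating every sequence of edge
    labels of the given length and walking it from s.  The only memory
    needed is the current label sequence -- O(length * log d) bits,
    which is logspace once length = O(log n).  Reingold's zig-zag
    transformation turns each connected component into an expander of
    logarithmic diameter precisely so that this brute force suffices."""
    if s == t:
        return True
    for labels in product(range(d), repeat=length):
        v = s
        for i in labels:
            v = rot(v, i)
            if v == t:
                return True
    return False
```

For example, on a 4-cycle with rot(v, 0) = v+1 and rot(v, 1) = v-1 (mod 4), label sequences of length 2 already connect vertex 0 to vertex 2.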
Mark Braverman recently proved that AC0 can be fooled by polylog-wise independent random variables. Part of the proof is the construction of certain polynomials that compute various required predicates. This requires that Mark be an expert AC0 programmer. See Luca Trevisan’s post for a nice summary of this.
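As a tiny illustration of the kind of object involved—a toy pairwise-independent construction of my own, far weaker than the polylog-wise independence Braverman's theorem handles—XOR-ing subsets of a short random seed stretches n truly random bits into 2^n - 1 bits, any two of which are jointly uniform:

```python
def xor_subset_bits(seed_bits):
    """Stretch a short random seed into pairwise-independent bits:
    output one bit per nonempty subset of seed positions, namely the
    XOR of the seed bits in that subset.  Any two distinct outputs
    XOR over different subsets, so they are jointly uniform as a pair."""
    n = len(seed_bits)
    out = []
    for mask in range(1, 2 ** n):
        b = 0
        for i in range(n):
            if (mask >> i) & 1:
                b ^= seed_bits[i]
        out.append(b)
    return out
```

A circuit that cannot distinguish these 2^n - 1 bits from truly random ones is "fooled" by only n bits of real randomness; Braverman's theorem says that AC0 is fooled by any distribution with enough limited independence.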
I will not go into details on all possible examples, but they do include many others, for example: IP = PSPACE, Seinosuke Toda’s Theorem, and the PCP Theorem.
P=NP and Knuth’s Dictum
The relationship to P=NP is simple, in my mind. Since we are constantly discovering new ways to program our “machines”, why not a discovery that shows how to factor? Or how to solve SAT?
Why are we all so sure that there are no great new programming methods still to be discovered? Methods that could allow us to program a random polynomial time Turing machine to factor, or to solve SAT? I am puzzled that so many are convinced these problems could not fall to new programming tricks, when discovering such tricks is what they do each and every day in their own research. I am confused.
Take a look at some result that you know. See if I am right that at the core it’s based on a programming trick of some kind. I would be surprised if you can suggest any results that do not fall into this category.