Valiant’s Permanent Contributions
The complexity theory of the permanent of a matrix
Leslie G. Valiant is one of the top theorists in the world. He has just won the 2010 Turing Award to add to the many other honors he has already received. He is a perfect choice for computer science’s highest award.
Today Ken and I would like to talk about just a part of Valiant’s seminal contributions, in particular his work on the complexity of counting.
His Turing Award announcement cites his work on machine learning and more. We think his paper on machine learning was an important event that helped create the vital field of modern machine learning. Since we are not experts on machine learning, we will let others discuss that contribution, and will discuss here his brilliant work on the permanent of a matrix.
The permanent is like the determinant in that it maps square matrices to values. The definition of the permanent of an $n \times n$ matrix $A$ is:

$\displaystyle \mathrm{perm}(A) = \sum_{\sigma} \prod_{i=1}^{n} a_{i,\sigma(i)}.$

Here $\sigma$ ranges over all permutations of $\{1,\dots,n\}$. It was introduced by Augustin-Louis Cauchy in 1812 while he developed the notion of determinant.
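For concreteness, here is a direct implementation of this definition in Python; it is exponential time, of course, since it sums over all $n!$ permutations (the function name is ours):

```python
from itertools import permutations
from math import prod

def perm_brute(A):
    """Permanent straight from the definition: sum over all n! permutations
    of the products of entries A[i][sigma(i)]."""
    n = len(A)
    return sum(prod(A[i][s[i]] for i in range(n)) for s in permutations(range(n)))

# the permanent keeps all the determinant's terms, but with all + signs:
print(perm_brute([[1, 2], [3, 4]]))  # 1*4 + 2*3 = 10
```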
We note that David Glynn in this paper has given a new formula for the permanent. It can be implemented to have the same best-known complexity as a formula due to Herbert Ryser. Glynn’s formula takes an average over all the vectors $\delta$ with entries $\pm 1$ and beginning with a $+1$, where $\pi(\delta)$ means the product (i.e., parity) of those entries:

$\displaystyle \mathrm{perm}(A) = \frac{1}{2^{n-1}} \sum_{\delta} \pi(\delta) \prod_{j=1}^{n} \sum_{i=1}^{n} \delta_i a_{ij}.$
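A sketch of Glynn's formula in Python, which uses $2^{n-1}$ terms rather than $n!$; this straightforward version omits the Gray-code ordering of the sign vectors that yields the same $O(2^n n)$ running time as Ryser's formula:

```python
from itertools import product
from math import prod

def perm_glynn(A):
    """Glynn's formula: a parity-weighted average over all 2^(n-1) sign
    vectors delta in {+1,-1}^n whose first entry is +1."""
    n = len(A)
    total = 0
    for rest in product((1, -1), repeat=n - 1):
        delta = (1,) + rest
        parity = prod(delta)
        total += parity * prod(sum(delta[i] * A[i][j] for i in range(n))
                               for j in range(n))
    return total // 2 ** (n - 1)  # the sum is always divisible for integer matrices

print(perm_glynn([[1, 2, 3], [4, 5, 6], [7, 8, 9]]))  # 450, matching the definition
```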
The work was published in 1979 as, “The Complexity of Computing the Permanent.” It introduced the complexity class $\#\mathsf{P}$ and many ideas that are fundamental to modern complexity theory, along with the companion paper, “The Complexity of Enumeration and Reliability Problems.” Both papers were originally written in 1977 while he was at the University of Edinburgh, made into technical reports and submitted straight to journals; neither appeared at STOC or FOCS or apparently any other conference. A third paper, “Completeness Classes in Algebra,” did make it into STOC 1979. The main theorem, which entails that the computation of $\mathrm{perm}(A)$ is $\#\mathsf{P}$-hard, is:
Theorem: Over any field of characteristic other than $2$, the permanent is complete for $\#\mathsf{P}$ under polynomial-time reductions.
Of course the restriction to characteristic other than $2$ is due to the simple observation:

$\displaystyle \mathrm{perm}(A) \equiv \det(A) \pmod 2,$

which follows from the neat fact that $-1 \equiv 1 \pmod 2$. In mathematics $2$ is often special—being the only even prime makes it unique.
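A quick sanity check of this observation; both functions below are naive exponential-time reference implementations written for this post:

```python
from itertools import permutations
from math import prod

def perm(A):
    """Permanent by brute force over all permutations."""
    n = len(A)
    return sum(prod(A[i][s[i]] for i in range(n)) for s in permutations(range(n)))

def det(A):
    """Determinant by cofactor expansion along the first row."""
    if len(A) == 1:
        return A[0][0]
    return sum((-1) ** j * A[0][j] * det([row[:j] + row[j + 1:] for row in A[1:]])
               for j in range(len(A)))

# over characteristic 2 the signs vanish, so permanent = determinant:
A = [[1, 0, 1], [1, 1, 0], [0, 1, 1]]
assert perm(A) % 2 == det(A) % 2
```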
Genesis of the Ideas
George Pólya asked, in 1913, a question about the relationship between the permanent and the determinant. The question asked whether there was an “easy” way to reduce the calculation of a permanent to that of a determinant. The rationale was simple: computing determinants is easy, while at the time it was unknown how to compute the permanent in any manner significantly faster than computing all the possible terms.
The question precisely was: Given an $n \times n$ matrix $A$, is there a way to set the signs of the entries so that the resulting matrix $B$ satisfies

$\displaystyle \det(B) = \mathrm{perm}(A)?$

This is trivially true for matrices of size $2$, and the question was, is there a way to do this in general?
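For size $2$ the claim is a one-line computation; here is a sketch in Python (names ours), flipping the sign of the upper-right entry:

```python
def perm2(a, b, c, d):  # permanent of [[a, b], [c, d]] = ad + bc
    return a * d + b * c

def det2(a, b, c, d):   # determinant of [[a, b], [c, d]] = ad - bc
    return a * d - b * c

# negating b converts the determinant's -bc term into +bc:
assert perm2(3, 5, 7, 2) == det2(3, -5, 7, 2)  # both equal 41
```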
I think the thought process behind the question may have been something like this: The determinant differs from the permanent only in signs on terms. Perhaps there is a way to get all the signs right by flipping some signs of the matrix $A$. Actually to prove that this is impossible is not trivial, and it was done by Gábor Szegő. Note the power of Valiant’s notion of $\#\mathsf{P}$-completeness. We now know that if there was a way to do this, even a way that was non-uniform, then every problem in $\#\mathsf{P}$, and in particular all of $\mathsf{NP}$, would have polynomial size circuits. Since this would be quite surprising—against most people’s beliefs—we could say even without Szegő’s proof that finding such sign-setting rules is quite unlikely.
Actually the beauty of this argument via complexity is that it even “proves” a much stronger rule: there is no sign setting method that is adaptive. Adaptive means the method is allowed to decide which signs to flip based on the values of the entire matrix. I believe proving this result without complexity theory would be very hard.
The general impact of Valiant’s work is that it explained why mathematicians had been unable to make an easy connection between two seemingly similar concepts. It is surprising that the permanent is less tractable to deal with, because its formula looks simpler without the signs that the determinant has. This was an early example of complexity theory speaking truth to the deep structure of mathematics. Or at least speaking evident truth—no one has yet proved that the permanent cannot be computed feasibly like the determinant can. The rest of this discussion is about efforts to understand and compute permanents.
We have just mentioned the famous van der Waerden Conjecture: a doubly-stochastic $n \times n$ matrix $A$ of non-negative real numbers has

$\displaystyle \mathrm{perm}(A) \ge \frac{n!}{n^n},$

and this is uniquely achieved by the matrix with all entries $1/n$. In 2008, Leonid Gurvits proved an inequality that generalizes the van der Waerden Conjecture, giving a new and simpler proof besides. This also extended related inequalities by Alexander Schrijver and W.G. Valiant. (No—we have not had another visit from the Leprechaun Neil L. playing tricks with Les’ name. The name “W.G. Valiant” appears on their paper, but Lex Schrijver tells us that it was a pseudonym for the researchers in the Werkgroep van Lint, and hence honors not Les or “Prince Valiant” but the (late) famous combinatorialist Jack H. van Lint.)
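The bound can be checked numerically. The sketch below (all names ours) samples doubly stochastic matrices as convex combinations of permutation matrices, which is one direction of Birkhoff's theorem, and compares their permanents against $n!/n^n$:

```python
import random
from itertools import permutations
from math import prod, factorial

def perm(A):
    """Permanent by brute force over all permutations."""
    n = len(A)
    return sum(prod(A[i][s[i]] for i in range(n)) for s in permutations(range(n)))

def random_doubly_stochastic(n, k=6, rng=random):
    """A convex combination of k random permutation matrices is doubly
    stochastic (one direction of Birkhoff's theorem)."""
    w = [rng.random() for _ in range(k)]
    s = sum(w)
    A = [[0.0] * n for _ in range(n)]
    for wk in w:
        p = list(range(n))
        rng.shuffle(p)
        for i in range(n):
            A[i][p[i]] += wk / s
    return A

n = 4
bound = factorial(n) / n ** n  # 24/256 = 0.09375 for n = 4
assert all(perm(random_doubly_stochastic(n)) >= bound - 1e-9 for _ in range(25))
assert abs(perm([[1 / n] * n for _ in range(n)]) - bound) < 1e-12  # the minimizer
```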
Here is a statement of Gurvits’ inequality, as given in a recent article by Schrijver with Monique Laurent in the American Mathematical Monthly: for any doubly stochastic $n \times n$ matrix $A$,

$\displaystyle \mathrm{perm}(A) \ge \prod_{j=1}^{n} \left(\frac{d_j - 1}{d_j}\right)^{d_j - 1}.$

Here $d_j$ is the number of non-zero entries in column $j$ of $A$. Gurvits’ paper actually proves a more-general version involving $n$-fold partial derivatives of homogeneous polynomials. Here is a video of Donald Knuth recently giving a talk on this work.
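A sketch of the bound as a Python check (our naming); note that for a column with a single nonzero entry the factor is $0^0 = 1$, which Python's `**` also returns:

```python
from itertools import permutations
from math import prod

def perm(A):
    """Permanent by brute force over all permutations."""
    n = len(A)
    return sum(prod(A[i][s[i]] for i in range(n)) for s in permutations(range(n)))

def gurvits_bound(A):
    """Product over columns j of ((d_j - 1)/d_j)^(d_j - 1), where d_j is the
    number of nonzero entries in column j (A assumed doubly stochastic)."""
    n = len(A)
    b = 1.0
    for j in range(n):
        d = sum(1 for i in range(n) if A[i][j] != 0)
        b *= ((d - 1) / d) ** (d - 1)
    return b

A = [[0.5, 0.5, 0.0], [0.5, 0.0, 0.5], [0.0, 0.5, 0.5]]  # doubly stochastic
assert perm(A) >= gurvits_bound(A) - 1e-12               # 0.25 >= 0.125 here
```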
A final inequality that we find quite interesting is the following:

$\displaystyle |\mathrm{perm}(A)|^{2} \le \mathrm{perm}(AA^{*})$

for any matrix $A$. The key point is that this implies the neat fact—one that I found surprising—that the permanent of any unitary matrix is at most $1$ in absolute value: for unitary $A$ we have $AA^{*} = I$ and $\mathrm{perm}(I) = 1$.
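Both facts are easy to test numerically. The sketch below (our code) checks the squared inequality on a small real matrix, and checks the unitary consequence on the normalized $3 \times 3$ discrete Fourier matrix, a standard unitary:

```python
import cmath
from itertools import permutations

def perm(A):
    """Permanent by brute force; works for real or complex entries."""
    n = len(A)
    total = 0
    for s in permutations(range(n)):
        p = 1
        for i in range(n):
            p *= A[i][s[i]]
        total += p
    return total

# |perm(A)|^2 <= perm(A A*) on a small real example:
A = [[1, 2], [3, 4]]
AAt = [[5, 11], [11, 25]]               # A times its transpose
assert abs(perm(A)) ** 2 <= perm(AAt)   # 100 <= 246

# a unitary matrix: the normalized 3x3 DFT matrix
n = 3
w = cmath.exp(2j * cmath.pi / n)
F = [[w ** (j * k) / n ** 0.5 for k in range(n)] for j in range(n)]
assert abs(perm(F)) <= 1 + 1e-9         # here |perm(F)| = 1/sqrt(3)
```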
Reductions to Determinants
Valiant’s STOC 1979 paper proved that every arithmetical function with a formula of size $u$ can be computed symbolically by taking the determinant of an $(u+2) \times (u+2)$ matrix. For a slight tightening, see this paper.
Note that the determinant itself is not known to have polynomial-sized formulas. The best known formulas have size $n^{O(\log n)}$. Of course if the permanent had formulas of such quasi-polynomial size, then it could be computed by taking the determinant of basically a $n^{O(\log n)} \times n^{O(\log n)}$ matrix. Thus by proving a lower bound on the size of the matrices whose determinants give $\mathrm{perm}_n$, we could prove formula size lower bounds on $\mathrm{perm}_n$ itself.
However, even proving this determinantal size to be more than linear in $n$ was a devil of a task. For a long time the best known lower bound was linear, proved by Jin-Yi Cai in this 1990 paper. Then in 2004 came a breakthrough result of Thierry Mignon and Nicolas Ressayre, showing the size must be at least quadratic in $n$ for fields of characteristic zero. Others including Cai with Xi Chen and Dong Li have refined the technique, but have not yet definitively pushed above quadratic.
Approximating the Permanent
Mark Jerrum and Alistair Sinclair proved in 1989 that the permanent of 0-1 matrices admits a fully polynomial randomized approximation scheme in several cases, including the case when the matrix has sufficiently many 1’s. This condition was lifted, and the scheme extended to any matrix with non-negative real entries, when Eric Vigoda joined the team for this paper. This result has been cited as a major answer to the question, Who cares about Permanents?
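Their algorithm is an involved Markov-chain Monte Carlo method, well beyond a blog snippet. As a much simpler taste of randomized estimation, here is the classical Godsil–Gutman estimator, which is unbiased for 0-1 matrices; unlike the Jerrum–Sinclair–Vigoda scheme its variance can be exponentially large, so it is not an FPRAS:

```python
import random
from itertools import permutations
from math import prod

def perm(A):
    """Permanent by brute force over all permutations."""
    n = len(A)
    return sum(prod(A[i][s[i]] for i in range(n)) for s in permutations(range(n)))

def det(A):
    """Determinant by cofactor expansion; fine for tiny matrices."""
    if len(A) == 1:
        return A[0][0]
    return sum((-1) ** j * A[0][j] * det([r[:j] + r[j + 1:] for r in A[1:]])
               for j in range(len(A)))

def godsil_gutman(A, trials, rng):
    """E[det(B)^2] = perm(A) when each 1 in the 0-1 matrix A is replaced by
    a uniformly random +1 or -1 (Godsil-Gutman estimator)."""
    n = len(A)
    total = 0
    for _ in range(trials):
        B = [[A[i][j] * rng.choice((1, -1)) for j in range(n)] for i in range(n)]
        total += det(B) ** 2
    return total / trials

A = [[1, 0, 1], [1, 1, 0], [0, 1, 1]]
est = godsil_gutman(A, 20000, random.Random(1))
assert abs(est - perm(A)) < 0.3   # perm(A) = 2; the estimate concentrates nearby
```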
Other Aspects of The Permanent
There are many other studies of the permanent that were launched by Valiant’s seminal work, and have helped shape computational complexity. Since this is already getting long let me just list the top level ideas.
Random Self Reductions:
I proved in 1991 that the ability to compute $\mathrm{perm}(A)$ for a large fraction of the matrices $A$, say over a suitably large finite field, was enough to compute the permanent for all values. I would like to discuss the story of this simple, but useful result, another time.
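The heart of the argument is that $\mathrm{perm}(A + tR)$ is a polynomial of degree at most $n$ in $t$, so its values on a random line through $A$ determine $\mathrm{perm}(A)$ by interpolation. Here is a sketch modulo a small prime, with a perfect brute-force oracle standing in for the faulty one; handling an oracle that errs on a fraction of inputs is the real content of the result:

```python
import random
from itertools import permutations
from math import prod

P = 101  # a prime comfortably larger than n + 1

def perm_mod(A):
    """Stand-in for the permanent oracle: exact brute force mod P."""
    n = len(A)
    return sum(prod(A[i][s[i]] for i in range(n)) for s in permutations(range(n))) % P

def perm_by_self_reduction(A, rng=random):
    """Random self-reduction sketch: perm(A + t*R) is a degree-n polynomial
    in t, so n+1 oracle calls along a random line recover perm(A) by
    Lagrange interpolation at t = 0."""
    n = len(A)
    R = [[rng.randrange(P) for _ in range(n)] for _ in range(n)]
    ts = list(range(1, n + 2))
    ys = [perm_mod([[(A[i][j] + t * R[i][j]) % P for j in range(n)] for i in range(n)])
          for t in ts]
    # Lagrange interpolation at t = 0, arithmetic mod P
    value = 0
    for k, tk in enumerate(ts):
        num, den = 1, 1
        for m, tm in enumerate(ts):
            if m != k:
                num = num * (-tm) % P
                den = den * (tk - tm) % P
        value = (value + ys[k] * num * pow(den, -1, P)) % P
    return value

A = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
assert perm_by_self_reduction(A) == perm_mod(A)   # both are 450 mod 101 = 46
```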
Lower Bounds:
Eric Allender has shown that uniform constant-depth threshold circuits of polynomial size cannot compute the permanent. This is one of the few provable results we have on the permanent, among those that actually show it is not too easy to compute.
Values of Random Permanents:
There has been recent interest in the value distribution of the permanent of random Bernoulli matrices. Terry Tao and Van Vu have just proved some anti-concentration theorems—see their paper here for the details.
The Road Ahead
Perhaps the greatest homage paid to Les’ work on the permanent is that it has become a metonym for the $\mathsf{P}$ versus $\mathsf{NP}$ question itself. (Ken wrote this sentence, since I had no idea what “metonym” means.) Or rather vice-versa: when people talk of Ketan Mulmuley’s attack on $\mathsf{P}$ versus $\mathsf{NP}$ with Milind Sohoni and others, they are really talking about a strategy that has not yet gone beyond showing that the permanent does not polynomial-time reduce to the determinant. This is symbolized by Valiant as $\mathsf{VP} \neq \mathsf{VNP}$, and is not unconditionally known to be related either way to $\mathsf{P} \neq \mathsf{NP}$—see this recent “Stack Exchange” discussion. Nevertheless, $\mathsf{VP} \neq \mathsf{VNP}$ captures so much of the spirit that it can even stand for the other question in what we teach our children.
We owe Les a great deal of thanks, for without his seminal work connecting the permanent to complexity theory, much research would have been impossible. Thanks Les and congratulations on receiving this year’s Turing Award.
The main open problem is: while various barriers seem to stop us cold on $\mathsf{P} \neq \mathsf{NP}$, or on trying to show directly that $\mathsf{NP}$ does not have polynomial-size circuits, can we develop mathematical tools and make steady incremental progress toward $\mathsf{VP} \neq \mathsf{VNP}$?
[fixed Diaconis reference “Why care…” to “Who cares…”; 2011 -> 2010 as date of his Turing Award]