An Approach to Graph Isomorphism?
Graph isomorphism and classic invariant theory
David Hilbert is one of the greatest mathematicians of all time, but you knew that. He solved hard open problems, he unified fields, and he created his famous “program” for the foundations of mathematics. The latter was essentially destroyed by Gödel's Incompleteness Theorem.
Today I want to focus on an area he made important contributions to, one that is related to a problem from his famous list of open questions. He presented his list at the international mathematics meeting in Paris in 1900. His list of problems has been extremely influential, and has led to great work by many mathematicians. Hilbert was generally an excellent selector of what problems were important and could be solved.
Some of the problems were solved quickly, some were solved decades later, some are still open, and some are “fuzzier” and may never really be solved. Others were solved, then errors were found in their proofs decades later, and the problems were re-solved. See this for a wonderful treatment of the problems and the people who have worked on them.
In the category of problems that were solved quickly is Hilbert's question about tetrahedra: his third problem. This problem was solved fairly quickly: Ron Graham once told me this problem was so “easy” that the solver, Max Dehn, found it during Hilbert's talk, but he did miss the statements of the next two problems. Ron is slightly exaggerating the situation, but not by much. While the proof is clever, it is probably not as hard as most of the rest of the Hilbert problems.
Today I want to talk about graph isomorphism and its relationship to classic invariant theory. Invariant theory is an area that Hilbert made major contributions to and is related to one of his problems, the fourteenth. This post is based on work that I am doing with one of my graduate students, Farbod Shokrieh, and with Ken Regan.
Graph Isomorphism (GI) is one of the most famous problems in complexity theory and mathematics. The appeal of this problem is that it sounds so approachable: it is just about undirected graphs, after all. But it is one of the great “diseases,” that is, it is one of those easy to state problems that has bent a little over the years, but refuses to break. There are some quite pretty results known about GI, but still it is unknown whether GI is in polynomial time. We do not even know if GI is in P/poly, i.e. does it have polynomial size circuits, nor do we know if it is in coNP. We owe Laszlo Babai the great insight of creating the new class AM, so at least we have that GI is in coAM.
Many special cases are known to be easy: trees, planar graphs, random graphs with high probability, and constant degree graphs: the latter is a beautiful result of Eugene Luks. There are also many equivalent problems and related results, but I will not attempt to give a complete list of all the known results.
My goal today is to outline a connection between GI and some interesting questions from classic mathematics. In the best case, it could lead to real progress on GI. In any event the idea raises questions and potential applications that beg to be explored further.
Instead of the general GI problem we will focus on another GI-complete problem. The problem is: given two n × n 0–1 matrices A and B, are there two permutation matrices P and Q so that PAQ = B?
We claim that this problem is equivalent to the GI problem. We will call this problem the restricted bipartite graph isomorphism (RBGI) problem.
The rationale for the name is that essentially it is a special case of the bipartite graph isomorphism problem. The only extra restrictions we have added are:
- We require the two color sets to have the same cardinality;
- We insist that the isomorphism preserve the colors.
It is easy to see that these restrictions leave the problem GI-complete, since it is an old result that GI can be reduced to the bipartite case. However, it is convenient to have the more restricted problem.
For the remainder of this post we will use the matrix representation of graphs. Thus, when we say that A is a graph, we simply mean that A is a 0–1 matrix. The graph A is isomorphic to B provided there are permutation matrices P and Q so that PAQ = B.
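For tiny matrices this definition can be checked directly by brute force. Here is a minimal sketch (the function name is my own); it simply tries every row permutation σ and column permutation τ:

```python
from itertools import permutations

def rbgi_isomorphic(A, B):
    """Brute-force RBGI test: is PAQ = B for some permutation matrices P, Q?

    A and B are 0-1 matrices given as tuples of tuples.  (PAQ)[i][j]
    equals A[sigma(i)][tau(j)] for some row permutation sigma and some
    column permutation tau, so we try all n!^2 pairs.  Feasible only
    for very small n, of course.
    """
    n = len(A)
    for sigma in permutations(range(n)):       # row permutation, i.e. P
        for tau in permutations(range(n)):     # column permutation, i.e. Q
            if all(A[sigma[i]][tau[j]] == B[i][j]
                   for i in range(n) for j in range(n)):
                return True
    return False
```

For example, the matrices ((0,1),(1,0)) and ((1,0),(0,1)) are carried onto each other by swapping rows, so the test accepts them.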
Invariant Theory for the Symmetric Group
A central problem of invariant theory is: given a finite group G that acts on the variables x_1, ..., x_n, what polynomials are invariant under the group? A polynomial p(x_1, ..., x_n) is invariant under a group G, provided for all σ in G, p(x_σ(1), ..., x_σ(n)) = p(x_1, ..., x_n).
When the group is the full symmetric group, this is simply the class of symmetric polynomials. Thus, x_1 + x_2 + ... + x_n is invariant, while x_1 - x_2 is not: a permutation that exchanges x_1 and x_2 changes the polynomial.
For any finite group G the polynomials invariant under G form an algebra. That is just a fancy way of saying that the sum and product of two invariant polynomials is still invariant. The goal is to describe this algebra in as succinct a manner as possible. Let us see how this is done, first, in the basic case of the full symmetric group.
The elementary symmetric functions e_1, ..., e_n are the sums of all distinct products of distinct variables: e_k(x_1, ..., x_n) is the sum of x_{i_1} x_{i_2} ... x_{i_k} over all indices i_1 < i_2 < ... < i_k.
The elementary functions satisfy two basic theorems. First, they generate the algebra of all symmetric polynomials, and second, they are easy to compute.
Theorem: If p(x_1, ..., x_n) is a symmetric polynomial, then there is another polynomial q so that p(x_1, ..., x_n) = q(e_1, ..., e_n).
This can be proved by an induction on the number of variables. See this for details. Thus, the elementary symmetric functions generate, as an algebra, all the symmetric polynomials. Note, they do not generate all symmetric polynomials as a vector space: just consider a very high degree symmetric polynomial; it clearly will not be a linear combination of the elementary functions.
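As a tiny worked instance of the theorem (my own example): the symmetric polynomial x_1^2 + x_2^2 + x_3^2 equals e_1^2 - 2 e_2, since squaring e_1 produces the squares plus twice the pairwise products. A quick numeric spot check:

```python
def e1(x):
    # e_1 = x_1 + x_2 + ... + x_n
    return sum(x)

def e2(x):
    # e_2 = sum of x_i * x_j over all pairs i < j
    return sum(x[i] * x[j] for i in range(len(x)) for j in range(i + 1, len(x)))

def p(x):
    # the symmetric polynomial x_1^2 + ... + x_n^2
    return sum(v * v for v in x)

# p = q(e_1, e_2) with q(u, v) = u^2 - 2v, because
# (x_1 + ... + x_n)^2 = sum x_i^2 + 2 * sum_{i<j} x_i x_j
x = (3, 5, 7)
assert p(x) == e1(x) ** 2 - 2 * e2(x)    # both sides equal 83
```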
Theorem: Given a_1, ..., a_n there is an algorithm that computes e_k(a_1, ..., a_n) for all k in polynomial time.
The proof of this theorem uses the following simple insight. Let f(z) = (z + a_1)(z + a_2) ... (z + a_n). Expanding this out gives f(z) = z^n + e_1(a) z^(n-1) + e_2(a) z^(n-2) + ... + e_n(a).
Thus, given the values of f at n + 1 distinct points, we can use Lagrange interpolation to obtain the coefficients of the resulting single-variable polynomial, which give us the desired evaluations of every elementary function.
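In fact the expansion can also be carried out directly, multiplying in one factor (z + a_i) at a time; each step updates the coefficients by e_k ← e_k + a_i · e_{k-1}, so the whole computation is O(n^2) arithmetic operations. A sketch (the function name is my own):

```python
def elementary_symmetric_values(a):
    """Return [e_0(a), e_1(a), ..., e_n(a)] for the sequence a.

    Maintains the coefficient list of the partial product
    (z + a_1)...(z + a_m); multiplying by one more factor (z + v)
    replaces each e_k by e_k + v * e_{k-1}.
    """
    coeffs = [1]                       # the empty product: e_0 = 1
    for v in a:
        new = [1]                      # e_0 stays 1
        for k in range(1, len(coeffs)):
            new.append(coeffs[k] + v * coeffs[k - 1])
        new.append(v * coeffs[-1])     # the new top coefficient
        coeffs = new
    return coeffs
```

For a = (1, 2, 3) this returns [1, 6, 11, 6], matching e_1 = 1+2+3, e_2 = 2+3+6, e_3 = 6.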
Invariant Theory for Bipartite Graphs
Let x = (x_11, ..., x_nn) be the general vector that stands for a general n × n matrix. The group that we are interested in is not the full symmetric group on these n^2 variables. Instead we need to use the subgroup that is isomorphic to S_n × S_n. The action of a group element (σ, τ) is to permute the rows of the matrix by σ and then the columns by τ. Let's denote the group by G_n, or just G when the subscript is clear. We are interested in polynomials that are invariant under this group.
Note, because the row and column permutations are independent, the invariant theory is based on a direct product of symmetric groups. Otherwise, the action would be more complex.
What we hope is that there is something like the basis of elementary functions for the polynomials invariant under G. We need there to be polynomials p_1(x), ..., p_m(x) that are invariant and that generate the algebra of all invariant polynomials. And further:
- The number m of the polynomials is bounded by n^O(1).
- Each polynomial p_k has a straight-line arithmetic computation that is at most n^O(1) long.
Let us call these assumptions (A). Note, the analogous assumptions are true for the symmetric polynomials. Classic results from invariant theory, I believe, show that at least a weaker result is easy: the assumption is true without the restriction on the number or the complexity of the polynomials p_k. As usual, we in complexity theory need more.
Invariant Theory and GI
The reason we have raised the assumption (A) is that it is enough to make a breakthrough on GI.
Theorem: Suppose that (A) is true; then GI is in P/poly.
The algorithm is very simple. Suppose that A and B are graphs. Then, for each k = 1, ..., m, check whether or not p_k(A) = p_k(B).
If this fails for some k, then clearly A cannot be isomorphic to B. If on the other hand this holds for all k, then we claim that A must be isomorphic to B. Note, the algorithm can be done by a circuit of polynomial size by the assumption (A).
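We do not have the hypothetical generators p_k, of course, but the shape of the algorithm can be illustrated with a few easy invariants of the row/column action. These toy stand-ins are mine, and they are far from complete: disagreement refutes isomorphism, while agreement proves nothing.

```python
def easy_invariants(A):
    """A few polynomials in the entries of A that are invariant under
    permuting rows and columns independently: the sorted row sums, the
    sorted column sums, and the total number of ones.  They are NOT a
    complete set of generators.
    """
    row_sums = sorted(sum(row) for row in A)
    col_sums = sorted(sum(col) for col in zip(*A))
    total = sum(row_sums)
    return (row_sums, col_sums, total)

def certainly_not_isomorphic(A, B):
    # True means the invariants already separate A and B.
    return easy_invariants(A) != easy_invariants(B)
```

For instance, the all-ones 2 × 2 matrix is separated from the identity matrix by the row sums alone.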
It only remains to show that the algorithm is correct. The key case is where A and B satisfy the above test for all k, yet are not isomorphic. We will show that this is impossible.
We need some definitions first. Let H be a graph. Define the following function: F_H(x) is the product, over all pairs (i, j), of x_ij when H_ij = 1 and of 1 - x_ij when H_ij = 0.
Lemma: Consider F_H as a polynomial in the matrix x. Then, F_H(H) = 1, and F_H(Y) = 0 for any 0–1 matrix Y that is not equal to H.
Proof: The value of F_H(H) is equal to the product of H_ij over the pairs with H_ij = 1 and of 1 - H_ij over the pairs with H_ij = 0, which is clearly 1. The value of F_H(Y) is equal to the product of Y_ij over the pairs with H_ij = 1 and of 1 - Y_ij over the pairs with H_ij = 0. Since Y and H are different, there is a pair of indices (i, j) so that either Y_ij = 0 and H_ij = 1, or Y_ij = 1 and H_ij = 0. In either case the product is clearly 0.
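Evaluated on 0–1 matrices, an indicator polynomial of this kind is easy to compute directly; a sketch (under my reading of the construction):

```python
def indicator(H, X):
    """Evaluate the indicator polynomial F_H at the matrix X: the
    product of X[i][j] over entries where H[i][j] == 1 and of
    1 - X[i][j] over entries where H[i][j] == 0.  On 0-1 matrices
    this is 1 exactly when X == H, and 0 otherwise.
    """
    n = len(H)
    value = 1
    for i in range(n):
        for j in range(n):
            value *= X[i][j] if H[i][j] == 1 else 1 - X[i][j]
    return value
```

Any single entry where X disagrees with H contributes a factor of 0, which is the whole proof of the lemma in one line of code.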
Let G = S_n × S_n. Then, the following is the symmetric form of the polynomial F_H: F̂_H(x) is the sum of F_H((σ, τ) · x) over all (σ, τ) in G, where (σ, τ) · x permutes the rows of x by σ and the columns by τ.
Finally, for a graph A, note that F̂_H(A) is exactly the number of elements (σ, τ) of G such that (σ, τ) · A = H.
Lemma: Suppose that H and A are isomorphic graphs; then F̂_H(A) > 0. Suppose that H and A are not isomorphic graphs; then F̂_H(A) = 0.
Proof: The key is that if they are not isomorphic, then the sum is over all zeros. In the case that they are isomorphic, it sums over all possible σ and τ so that (σ, τ) · A = H, and at least one such pair exists.
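On tiny graphs the symmetrized polynomial can be evaluated by brute force, and the value literally counts the pairs (σ, τ) carrying A onto H, as the lemma asserts. A sketch under that reading:

```python
from itertools import permutations

def symmetrized_indicator(H, A):
    """Sum the indicator polynomial F_H over the whole group S_n x S_n
    acting on A.  Each term is 1 exactly when that row/column
    permutation of A equals H, so the sum counts the isomorphisms
    from A to H: positive iff A and H are isomorphic.  Brute force,
    tiny n only.
    """
    n = len(H)
    count = 0
    for sigma in permutations(range(n)):
        for tau in permutations(range(n)):
            if all(A[sigma[i]][tau[j]] == H[i][j]
                   for i in range(n) for j in range(n)):
                count += 1
    return count
```

For H = A = ((0,1),(1,0)) the value is 2: both the identity pair and the swap-both pair fix the matrix.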
This lemma shows that the algorithm works as claimed. For suppose that A and B are not isomorphic. By assumption p_k(A) = p_k(B) for all k. Since the polynomials p_k generate the algebra of all invariant polynomials, it must be the case that F̂_A(A) = F̂_A(B). This is a contradiction since F̂_A(A) > 0 while F̂_A(B) = 0.
The main open problem is to see if this approach can work. We need, of course, to understand the assumption (A). Note, if somehow we proved that GI did not have polynomial size circuits, then we would show something non-trivial about the action of S_n × S_n: either there would be a super-polynomial number of generators, or the generators must have large complexity.
Also, even if (A) fails, there are weaker versions that would say something non-trivial about GI. For example, suppose G_n had an exponential number of generators p_k, but each still had polynomial complexity. Then we would get that GI is in coNP. The latter would require that we could efficiently go from the index k to the circuit for p_k.