Graphs, Permutations, Characters, and Logspace
Counting the number of cycles of a permutation with connections to some classic open problems of complexity theory
Ken Regan is a great theorist, and probably the best chess player in the world who can really state what P=NP means.
Today I want to talk about a connection between logspace and the computation of the characters of a symmetric group. I also want to show how stupid I can be, which is not fun to point out. However, one of the agendas I had in starting this blog was to show research warts and all, and showing my errors is one way to do that. Oh well.
Ken is a very strong chess player. Very. He is an International Master (IM), which is the rank below Grandmaster (GM). His international rating is 2405—ratings in the 2400’s are typical for IM’s, over 2500 for GM’s. I am impressed: when I played in high school my rating started with a 1, and the next digit was not large.
Ken has recently been working on some fascinating projects on detecting cheating in chess tournaments. The issue is simple: is a player using a computer program to help make the moves? As you probably know, computer programs are terrific today. Top-level programs are matched very well against the best chess players in the world. See for example this. They are especially strong in the tactical positions that routinely arise in real matches. So access to a computer would yield a great advantage to the player.
During the 2006 Kramnik-Topalov world championship match cheating accusations against Kramnik were made. Ken was asked to try to decide how one might evaluate such accusations—that a player is making “too many” moves favored by a computer program, in this case the Fritz 9 program. Within days he posted his conclusion that Kramnik’s high coincidence to Fritz and another program, Rybka, was due to the unusually forcing nature of the games, Game 2 in particular. As Ken says,
“If you force play and limit your opponent’s options, don’t be surprised if he matches a computer’s moves!”
See this for further material and details on other cheating controversies and on ratings in general.
Ken deserves thanks from all chess fans, at all levels, for his quick and insightful analysis of this controversy. Now let’s turn to logspace and other concerns, and leave behind the coolness of high-caliber chess.
I need to define a number of languages that will be important for our discussion. All the languages consist of directed graphs with some additional property. The input will always be of the form:

(i_1, j_1), (i_2, j_2), ..., (i_m, j_m)

where each i_k and j_k is in {1, ..., n}, encoded as binary strings. The meaning of (i, j) is that there is a directed edge from vertex i to vertex j.
The first is the language CYCLES, which consists of all inputs that encode a directed graph consisting only of vertex-disjoint cycles. The next languages, CYCLES_k, form a family of languages: CYCLES_k consists of all inputs that encode exactly k vertex-disjoint directed cycles. Note,

CYCLES_k ⊆ CYCLES

for any k. Also

CYCLES = CYCLES_1 ∪ CYCLES_2 ∪ CYCLES_3 ∪ ···

Finally, define CYCLES_even to be

CYCLES_2 ∪ CYCLES_4 ∪ CYCLES_6 ∪ ···
Some definitions are in order. The class AC^0 is the uniform version of the famous class of constant-depth and polynomial-size circuits with AND/OR/NOT gates, where the AND/OR gates have unbounded fan-in. Also AC^0[m] is the uniform class that additionally allows gates that count mod m. The class L denotes logspace. The following two theorems are straightforward.
Theorem: The language CYCLES is in AC^0.
Proof: The key is that a directed graph has this form if and only if the in-degree and out-degree of each vertex is exactly 1. Clearly, this can be checked in AC^0. For example, the statement that the vertex v has in-degree at least 1 is equivalent to: there is some k so that (i_k, j_k) is an edge and j_k = v. The conditions "at most 1" are similar, quantifying over pairs of edges.
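To make the degree test concrete, here is a small Python sketch (the function name and the edge-list representation are my own, not from the proof). It checks exactly the condition above: every vertex has in-degree and out-degree 1. Each of these conditions is an OR/AND over the input pairs, which is what makes the test constant depth.

```python
def is_disjoint_cycles(n, edges):
    """Check that a directed graph on vertices 1..n is a union of
    vertex-disjoint cycles: every vertex has in-degree and
    out-degree exactly 1."""
    indeg = [0] * (n + 1)
    outdeg = [0] * (n + 1)
    for (i, j) in edges:
        outdeg[i] += 1
        indeg[j] += 1
    return all(indeg[v] == 1 and outdeg[v] == 1 for v in range(1, n + 1))

# A 3-cycle plus a 2-cycle: accepted.
print(is_disjoint_cycles(5, [(1, 2), (2, 3), (3, 1), (4, 5), (5, 4)]))  # True
# A path: vertex 1 has in-degree 0, so rejected.
print(is_disjoint_cycles(3, [(1, 2), (2, 3)]))  # False
```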
Theorem: For any k, the language CYCLES_k is in L.
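For intuition, here is a sketch in Python of the standard logspace approach (the names are mine): walk around each vertex's cycle, and count the cycle only when the starting vertex is the minimum on its cycle. The algorithm keeps only a constant number of vertex pointers and counters, which is O(log n) bits; it assumes the promise that the input is a union of disjoint cycles.

```python
def count_cycles(n, edges):
    """Count the cycles of a graph promised to be a union of disjoint
    cycles, in the style of a logspace algorithm: only a few pointers
    and counters, no marking of visited vertices."""
    succ = {i: j for (i, j) in edges}  # the unique out-neighbor of each vertex
    cycles = 0
    for v in range(1, n + 1):
        # Walk around v's cycle; count the cycle once, at its minimum vertex.
        u = succ[v]
        is_min = True
        while u != v:
            if u < v:
                is_min = False
            u = succ[u]
        if is_min:
            cycles += 1
    return cycles

print(count_cycles(5, [(1, 2), (2, 3), (3, 1), (4, 5), (5, 4)]))  # 2
```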
Logspace Complete Problems
A language is logspace complete, for us, always with respect to AC^0 reductions. The following theorem and proof are based on Stephen Cook’s and Pierre McKenzie’s classic paper—Problems complete for deterministic logarithmic space.
Theorem: It is L-complete to tell CYCLES_1 from CYCLES_3, or any CYCLES_j from CYCLES_k with j ≠ k and j ≡ k (mod 2). Hence none of these tasks is in AC^0[p], for any prime p.
I have a proof of this theorem, but Ken worked out a better proof. The key parts of the proof were already in Cook-McKenzie, and Ken and Pierre worked out the rest. I thank them for allowing their proof to be included here. If you like you can skip the proof, but Ken worked hard to get all the details right. So please read it now. Or skip it now and come back later on. Or both.
Even Cycle Theorem
So if it’s hard to tell 1 cycle from 3, how about 1 from 2, or 2 from 3? Long before I knew 1 from 3 is hard, I proved the following, which seems to be new as far as Ken and Pierre and some others can tell.
Theorem: The language CYCLES_even is in AC^0[2].
A corollary of this is that CYCLES_even cannot be L-complete. This is an unconditional result, because otherwise L would be contained in AC^0[2], and that would violate known lower bounds on AC^0[2]—for instance, MOD_3 is in L but not in AC^0[2]. More on this in the next section.
Every input that describes a directed graph of disjoint cycles can be viewed as describing a permutation. This is the key to the proof of the theorem. Therefore, we need some basic facts about permutations. While they are simple and classic properties, they are surprisingly useful and powerful.
Each permutation σ has a value associated with it, the number of inversions of the permutation, I(σ). An inversion is a pair of indices i, j so that i < j and yet σ(i) > σ(j).
Moreover, the following classic lemma is the key:
Lemma: For any permutation σ on n letters,

c(σ) ≡ n + I(σ) (mod 2),

where c(σ) denotes the number of cycles of σ.
We now turn to the proof of the even cycle theorem (ECT):
Proof: Recall that we can view the input graph as a permutation σ. We want to calculate the parity of the number of cycles. By the above lemma, it is enough to calculate the parity of the number of letters n and of the number of inversions I(σ). Then the theorem will follow by the lemma. But computing each of these is quite simple for a constant-depth polynomial-size circuit that has mod 2 gates. To compute the number of inversions one needs only try all pairs i < j and see if they form an inversion; if they do, count the number of them mod 2. This completes the proof.
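The lemma can also be checked by brute force. This Python sketch (helper names are mine) verifies that c(σ) + I(σ) + n is always even, which is equivalent to the congruence c(σ) ≡ n + I(σ) (mod 2), for all permutations on up to 6 letters.

```python
from itertools import permutations

def num_cycles(perm):
    """Number of cycles of a permutation in one-line notation
    (perm[i] is the image of i, 0-indexed)."""
    seen, cycles = set(), 0
    for i in range(len(perm)):
        if i not in seen:
            cycles += 1
            j = i
            while j not in seen:
                seen.add(j)
                j = perm[j]
    return cycles

def num_inversions(perm):
    """Number of pairs i < j with perm[i] > perm[j]."""
    n = len(perm)
    return sum(1 for i in range(n) for j in range(i + 1, n)
               if perm[i] > perm[j])

# Exhaustive check of the lemma for all permutations on n <= 6 letters.
for n in range(1, 7):
    for p in permutations(range(n)):
        assert (num_cycles(p) + num_inversions(p) + n) % 2 == 0
```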
I Was Stupid
Quite a while ago I began to look at directed graphs that consisted only of vertex disjoint cycles. I could encode certain problems into this simple class of graphs, and began to try to understand the complexity of telling the number of cycles and other properties.
After a while I realized that they could be viewed as permutations in a natural way. This eventually led me to prove a theorem that was equivalent to the ECT. I then started to think about how to tell how many cycles a permutation had modulo 3. This seemed, at the time, to be a reasonable possibility. I tried a wide range of tricks. They all were based on the fact that mod 2 gates can compute mod 4. This is done by a kind of simple amplification trick. So I tried many graph and permutation constructions that attempted to reduce counting mod 3 to counting mod 2.
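One concrete way to see the amplification trick, sketched below in Python (names are mine): if k is the number of 1s in the input, then k mod 4 is determined by k mod 2 together with C(k,2) mod 2, and C(k,2) mod 2 is just a mod 2 gate applied to AND gates over all pairs of inputs.

```python
from itertools import combinations, product

def mod4_from_parities(bits):
    """Compute (number of 1s) mod 4 using only mod-2 counting:
    k mod 2, together with C(k,2) mod 2, determines k mod 4.
    The pair count is a parity of ANDs of input pairs."""
    k_parity = sum(bits) % 2
    pair_parity = sum(b1 & b2 for b1, b2 in combinations(bits, 2)) % 2
    # Decode the table (k mod 2, C(k,2) mod 2) -> k mod 4.
    table = {(0, 0): 0, (1, 0): 1, (0, 1): 2, (1, 1): 3}
    return table[(k_parity, pair_parity)]

# Exhaustive check on all 8-bit inputs.
for bits in product([0, 1], repeat=8):
    assert mod4_from_parities(bits) == sum(bits) % 4
```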
I had expected the ECT to generalize into statements like this:

For each prime p, the number of cycles modulo p is computable in AC^0[p].

For each m, the number of cycles modulo m is computable in AC^0[m].
My hope was all in vain, since each of these is impossible. When I say impossible, I mean that these statements are actually false—not just if-horses-can-fly-then-pigs-can-whistle unlikely, but really false. The reason is the brilliant work of Roman Smolensky. I am a bit embarrassed that I took a while—make that a long while—to see the connection with his well known work. I can only say that I guess I got caught in the trees, and could not see the forest. In any event I now understand much better what is going on.
Recall Smolensky proved that MOD_q is not in AC^0[p] when p is a prime and q is not a power of p. So AC^0[2] and AC^0[3] are not contained in each other, and MOD_6 is not in AC^0[2] or in AC^0[3]. This is not symmetrical, though—e.g. it does not give you that MOD_2 is not in AC^0[6], since 6 is not prime. Indeed, AC^0[6] is a class that gets more annoying every year. It should be small, but we currently cannot even tell that it is different from NP. Why do we think we can separate P from NP when we cannot even separate NP from AC^0[6]? Thus counting modulo primes versus composites is the frontier of lower bounds, and has been for over two decades. I thought counting cycles would behave like counting 1s in a binary string, but be more important because of how graphs relate to Turing machines.
It should now be clear why such statements are impossible. Look at the case p = 3—if it were true, then you could tell 1 cycle apart from 3 cycles. By the Theorem above that would imply L ⊆ AC^0[3], which would contradict his famous result, since L contains MOD_2. Likewise the case p = 5 would put telling 1 from 3 cycles into AC^0[5], which is impossible.
Further, if I promise that the input graph is either a single cycle, or is 3 disjoint cycles, with AC^0[p] circuits for any prime p, can you tell apart these two cases? No. Likewise if I promise the graph forms either 2 cycles or 4, or 5 versus 7 cycles, the answer is no. However, you can tell 1 versus 2 by the ECT.
There is a way to try to generalize the ECT based on character theory for the symmetric group. This attempt is an approach to prove a theorem like

L ⊆ AC^0[6].

Or if you think this is false, this approach can be viewed as RCT: reverse complexity theory. What we may be able to prove is that characters of the symmetric group are “hard” to compute.
For us characters will just be mappings from the symmetric group to the integers. A character χ has many properties, but one of the key properties is that if σ and τ are conjugate permutations, then

χ(σ) = χ(τ).
Recall two permutations are conjugate if they have the same cyclic structure. Functions that are constant on all permutations with the same cyclic structure are called class functions, since they are constant on a conjugacy class. The characters form a linear basis of the class functions.
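A quick Python check of this fact (the function names are mine): conjugating a permutation preserves its cycle type, i.e. the multiset of cycle lengths.

```python
def cycle_type(perm):
    """Sorted cycle lengths of a permutation in one-line notation."""
    seen, lengths = set(), []
    for i in range(len(perm)):
        if i not in seen:
            length, j = 0, i
            while j not in seen:
                seen.add(j)
                j = perm[j]
                length += 1
            lengths.append(length)
    return tuple(sorted(lengths))

def conjugate_by(perm, g):
    """Return g o perm o g^{-1} in one-line notation."""
    result = [0] * len(perm)
    for i in range(len(perm)):
        result[g[i]] = g[perm[i]]
    return tuple(result)

p = (1, 2, 0, 3)            # a 3-cycle plus a fixed point
q = conjugate_by(p, (2, 0, 3, 1))
assert cycle_type(p) == cycle_type(q) == (1, 3)
```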
Consider the problem of telling apart two permutation types: σ and τ. Suppose that σ has one cycle, and τ has three cycles of equal size. The class AC^0[p], for any prime p, cannot tell these apart, for that would again yield an impossible collapse. However, can AC^0[6], for example, tell them apart? This is the main open question that we are interested in.
Since the characters form a linear basis of the class functions, if σ and τ are not conjugate permutations, there is a character χ so that

χ(σ) ≠ χ(τ).
Theorem: Suppose that for some m, all the characters of the symmetric group on n letters can be computed by AC^0[m] circuits; then L ⊆ AC^0[m].
I think that this theorem can be greatly strengthened, by carefully calculating which characters are actually needed—needed to tell one cycle from three cycles. Finally, note that one of the characters is always easy to compute:
Lemma: For any permutation σ, the character sgn(σ) can be computed in AC^0[2].
The character sgn is defined as follows. Every permutation can be written as a product of transpositions. Recall a transposition leaves all elements fixed except that it switches two elements. A permutation can be written in many ways as a product of transpositions, but the parity of the number of transpositions is always the same. Thus, we can assign to σ a value sgn(σ) in {−1, +1}: the value is −1 for odd permutations and +1 for even permutations.
Another way to view this lemma is that there is a summation that computes sgn. For any permutation σ,

sgn(σ) = (−1)^(Σ_{i<j} [σ(i) > σ(j)]).

Here [P] is 1 if P is true, and 0 otherwise. The question is: are there similar summations for any other interesting characters?
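The two views of sgn, by inversions and by transpositions, can be compared directly in Python (function names are mine):

```python
from itertools import permutations

def sign_by_inversions(perm):
    """sgn(sigma) = (-1)^(number of inversions)."""
    n = len(perm)
    inv = sum(1 for i in range(n) for j in range(i + 1, n)
              if perm[i] > perm[j])
    return (-1) ** inv

def sign_by_transpositions(perm):
    """Decompose sigma into transpositions by sorting it with swaps;
    the parity of the number of swaps gives the sign."""
    p, swaps = list(perm), 0
    for i in range(len(p)):
        while p[i] != i:
            j = p[i]
            p[i], p[j] = p[j], p[i]
            swaps += 1
    return (-1) ** swaps

# The two definitions agree on every permutation of 6 letters.
for perm in permutations(range(6)):
    assert sign_by_inversions(perm) == sign_by_transpositions(perm)
```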
There are other characters that are also easy to compute, but they seem to be uninteresting from a complexity point of view. For example, let fix(σ) be the number of points that the permutation σ leaves fixed. This is easy to compute too, but does not seem to help.
There are many interesting open questions. First, are there any applications of the ECT? Second, are there some characters that have special formulas like sgn does? I also do not know if the number of cycles modulo p is computable in AC^0[p] for any odd prime p.
The key question is: what can we say about the computational cost of computing characters in AC^0[m] for composite m? If the answer is that AC^0[6] can compute all characters, then as we observed, that would solve a long standing open problem.