Proving Cook’s Theorem
Another proof idea using finite automata
Steve Cook proved three landmark theorems with 1971 dates. The first has been called a “surprising theorem”: that any deterministic pushdown automaton with a two-way input tape can be simulated in linear time by a random-access machine. This implies that string matching can be done in linear time, which inspired Donald Knuth and Vaughan Pratt to find a direct algorithm that removes a dependence on the input alphabet size. This was before they learned that James Morris had found it independently, later than Cook but without knowledge of his work, and their algorithm is now called KMP. Fast string matching has thousands of applications. Second was his characterization of polynomial time by log-space machines with auxiliary pushdowns, a fact which may yet help for proving lower bounds. Then there was his third result, which appeared at STOC 1971, and was given a “slif” (stronger, later, independent form) by Leonid Levin.
Today I want to present a proof of the famous Cook-Levin Theorem that $\mathsf{SAT}$ is $\mathsf{NP}$-complete, and also mention one used by Ken.
I am currently giving a class on complexity theory, and thought this proof might have some advantage over the usual proofs. There are the original tableau-based proofs, the later circuit-based proofs, and variations of them.
The proof here is based on my favorite kind of objects—well one of them—finite automata. They are remarkably powerful and can be used to give a relatively clean proof. At least I believe the proof is clean for students—I would like to hear any thoughts that you all may have about it. It needs no assumption about oblivious tape access patterns, and does not use lots of complex indexing.
So let’s take a look at the proof. Here is a Wordle from STOC 1971:
My goal is to try and give a proof of Cook’s famous theorem that is easy to follow. There are many proofs already, but I thought I would try and see if I could give a slightly different one, and hope that it is clear. My thought is that sometimes the measure of clarity of a proof is the same as “ownership”: if you wrote it—own it—then it is clear. But here goes a proof that is a bit different.
The overarching idea is to do two things: (i) avoid as much detailed encoding as possible, and (ii) leverage existing concepts that you probably already know.
Let $S$ be any set in $\mathsf{NP}$. Then there is always a one-tape nondeterministic Turing Machine (NTM) $M$ that accepts exactly $S$ and runs in time $n^{c}$ for some fixed $c$. For any input $x$ this means that $M$ accepts $x$ if and only if $x$ is in $S$. An ID of $M$ is an encoding of the total state of $M$ in the usual way. We have two special ID's:
- the start ID $I_{\text{start}}$, which depends on the input, and
- the accepting ID $I_{\text{accept}}$, which can be made unique and independent of the input.
$I \vdash J$ means that $I$ can reach $J$ in one step of $M$. This is all standard and is stated here just for review.
Saying that $A$ is an FSA means that $A$ is a deterministic finite state automaton. The language that $A$ accepts is $L(A)$. Also for any two strings $x = x_{1}\cdots x_{n}$ and $y = y_{1}\cdots y_{n}$ of the same length let $x \bowtie y$ be the shuffle,

$$x \bowtie y = x_{1}y_{1}x_{2}y_{2}\cdots x_{n}y_{n}.$$

If one string is longer, we suppose the other is padded with a special null character to have equal length, and then we shuffle.
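To make the shuffle concrete, here is a small Python sketch of the operation with padding; the names `shuffle` and `NULL` are mine, not the post's:

```python
from itertools import zip_longest

NULL = "#"  # stand-in for the special null (padding) character

def shuffle(x, y, pad=NULL):
    """Interleave x and y symbol by symbol, padding the shorter string."""
    return "".join(a + b for a, b in zip_longest(x, y, fillvalue=pad))

# For example, shuffle("abc", "XY") gives "aXbYc#".
```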
You might ask why introduce FSAs when our goal is to encode NTMs? It turns out that our proof will take the following steps:
- Show that the behavior of an FSA can be encoded by general-$k$-SAT, which is a more powerful version of $\mathsf{SAT}$. This version is easier to use and understand.
- Show that the behavior of an NTM can be reduced to the behavior of FSAs, and so to general-$k$-SAT.
- Finally, replace general-$k$-SAT by regular $3$SAT.
A Slight Generalization of $\mathsf{SAT}$
A general-$k$-SAT problem is of the following form:

$$C_{1} \wedge C_{2} \wedge \cdots \wedge C_{m}.$$

Here each $C_{i}$ is a general clause, which means it is a Boolean function of the variables $x_{1},\dots,x_{n}$. We allow the general clauses to be any Boolean function, but will restrict the number of variables that they can depend on to at most $k$. The problem is satisfiable provided there exist Boolean values for the variables so that each $C_{i}$ evaluates to true.
Note that general-$k$-SAT is closed under conjunction in the following sense: Suppose that $\Phi$ is $C_{1} \wedge \cdots \wedge C_{m}$ and $\Psi$ is $D_{1} \wedge \cdots \wedge D_{r}$. Then

$$\Phi \wedge \Psi = C_{1} \wedge \cdots \wedge C_{m} \wedge D_{1} \wedge \cdots \wedge D_{r}$$

is the conjunction. This is satisfiable if and only if there are values for the variables so that both sets of clauses are true. This is a simple, but very useful, property of general-$k$-SAT.
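As an illustration (my sketch, not part of the proof), a general-$k$-SAT instance can be represented as a list of pairs (variable indices, Boolean function); conjunction is then literally list concatenation:

```python
from itertools import product

def satisfiable(clauses, n):
    """Brute-force check: clauses is a list of (vars, fn) pairs, where
    vars is a tuple of variable indices (< n, at most k of them) and
    fn maps their Boolean values to True/False."""
    for assignment in product([False, True], repeat=n):
        if all(fn(*(assignment[v] for v in vs)) for vs, fn in clauses):
            return True
    return False

# Closure under conjunction: conjoining two instances is just
# concatenating their clause lists.
phi = [((0, 1), lambda a, b: a != b)]       # x0 XOR x1
psi = [((1, 2), lambda b, c: b and not c)]  # x1 AND NOT x2
conj = phi + psi
```

The brute-force search is exponential, of course; the point is only the representation and the trivial closure property.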
There is nothing mysterious or strange about general-$k$-SAT. It is just a way of defining a type of search problem. Informally it asks: are there Boolean values $x_{1},\dots,x_{n}$ that satisfy all of the properties $C_{1},\dots,C_{m}$? We have lots of problems like this in mathematics. Consider a system of linear equations over the reals:

$$Ax = b,$$

where $A$ is an $m \times n$ matrix, $b$ is an $m$-vector, and $x$ is an $n$-vector. The LINEAR problem is: does there exist a vector $x$ so that the above is true? We could have written it as

$$A_{1} \cdot x = b_{1} \;\wedge\; \cdots \;\wedge\; A_{m} \cdot x = b_{m}$$

to make it look like the general-$k$-SAT problem. You should, and probably do, know what Gauss knew long ago: LINEAR is “easy”; in modern terms LINEAR is in $\mathsf{P}$, that is, in polynomial time.
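To back up the claim that LINEAR is in $\mathsf{P}$, here is a sketch of mine deciding the consistency of $Ax = b$ by exact Gaussian elimination over the rationals:

```python
from fractions import Fraction

def linear_solvable(A, b):
    """Decide whether A x = b has a real solution, by exact Gaussian
    elimination over the rationals (polynomial time)."""
    rows = [[Fraction(v) for v in row] + [Fraction(c)] for row, c in zip(A, b)]
    m = len(rows)
    n = len(rows[0]) - 1 if m else 0
    r = 0
    for col in range(n):
        piv = next((i for i in range(r, m) if rows[i][col] != 0), None)
        if piv is None:
            continue
        rows[r], rows[piv] = rows[piv], rows[r]
        for i in range(m):
            if i != r and rows[i][col] != 0:
                f = rows[i][col] / rows[r][col]
                rows[i] = [a - f * p for a, p in zip(rows[i], rows[r])]
        r += 1
    # Inconsistent iff some row has reduced to 0 = nonzero.
    return all(not (all(v == 0 for v in row[:-1]) and row[-1] != 0)
               for row in rows)
```

Using `Fraction` keeps the arithmetic exact; the bit-sizes of the entries stay polynomially bounded, which is the classical reason elimination runs in polynomial time.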
The difference between general-$k$-SAT and LINEAR is that while the latter is extremely useful and can be used to model many important problems, general-$k$-SAT is “universal.” By universal we mean that any problem from $\mathsf{NP}$ can be reduced to solving a general-$k$-SAT problem.
Our plan is to show that we can encode $I \vdash J$ by a general-$k$-SAT formula. Then the behavior of $M$ for $n^{c}$ steps will be encoded by simply using the fact that general-$k$-SAT is closed under conjunction.
Theorem 1 The problem general-$k$-SAT is $\mathsf{NP}$-complete.
Proof: Let $M$ be a one-tape NTM that runs in time $t = n^{c}$ and let $x$ be an input of length $n$. Clearly $M$ accepts $x$ if and only if there exists a series of ID's

$$I_{0} \vdash I_{1} \vdash \cdots \vdash I_{t}$$

of length $t$ so that $I_{0}$ is the initial ID for the input $x$, $I_{t}$ is the final accepting ID, and for each $i < t$,

$$I_{i} \vdash I_{i+1}.$$

Let $V_{i}$, for all $i \le t$, consist of Boolean variables that encode the ID $I_{i}$.
Claim: We have general-$k$-SAT formulas over these variables so that (i) $F_{\text{start}}$ is satisfied precisely when $V_{0}$ encodes the initial ID corresponding to the input $x$; (ii) $F_{\text{accept}}$ is satisfied precisely when $V_{t}$ encodes the final accepting ID; (iii) and for each $i < t$, the formula $F_{i}$ is satisfied precisely if $I_{i} \vdash I_{i+1}$.
Then we claim that the conjunction of the clauses of all the formulas,

$$F_{\text{start}} \wedge F_{0} \wedge F_{1} \wedge \cdots \wedge F_{t-1} \wedge F_{\text{accept}},$$

is satisfiable if and only if $M$ accepts $x$. This follows directly from the definitions.

Thus to complete the proof of the theorem we must show that the claim about the existence of the $F_{i}$'s is true. But this is easy, as we will now show.
Encoding $\vdash$ With FSA
Theorem 2 Let $M$ be a one-tape NTM. Then there is an FSA $A$ that depends only on $M$ so that for all ID's $I$ and $J$,

$$I \vdash J \quad\Longleftrightarrow\quad I \bowtie J \in L(A).$$
Proof: The FSA $A$ just passes over $I \bowtie J$, checking that the tape contents are the same, except at the location of the head. There it checks that a legal move of $M$ has been used. Because the ID's are shuffled together, this can be done by the automaton.
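The automaton of Theorem 2 is more intricate near the head position, but the shuffle trick itself is easy to see in a toy version (my sketch): a finite automaton that reads $x \bowtie y$ and accepts exactly when $x = y$:

```python
def shuffled_equal(s):
    """Toy finite automaton over a shuffled string s = x ⋈ y:
    accept iff x = y. The state is the last symbol seen,
    or None when we are between pairs."""
    state = None  # expecting the x-symbol of the next pair
    for ch in s:
        if state is None:
            state = ch       # remember x_i
        elif state == ch:
            state = None     # y_i matched x_i
        else:
            return False     # mismatch: reject immediately
    return state is None     # reject odd-length (unpaired) input
```

Its state set is the alphabet plus one "between pairs" state, so for any fixed alphabet it is a genuine FSA; the real automaton additionally allows a bounded window of difference around the head and checks it against the move relation of $M$.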
Encoding FSA's With general-$k$-SAT
Theorem 3 Let $A$ be a fixed FSA. Then for any input string $w = w_{1} \cdots w_{n}$ there is a set of general clauses over the variables $w_{1},\dots,w_{n}$ and auxiliary variables $s_{0},\dots,s_{n}$ so that the clauses are satisfiable, with the given values of the $w_{i}$, if and only if $w \in L(A)$.
Proof: The idea is to look at the sequence

$$s_{0},\, w_{1},\, s_{1},\, w_{2},\, s_{2},\, \dots,\, w_{n},\, s_{n},$$

where $s_{0},\dots,s_{n}$ are strings of Boolean variables that are long enough to encode the states of the FSA. Then the general clauses are of three types:
- One clause that checks that $s_{0}$ is the start state of $A$.
- For each $i$, one clause that checks that $s_{i}$ is a possible state that $A$ can reach from $s_{i-1}$ after seeing the input bit $w_{i}$.
- One clause that checks that $s_{n}$ is an accept state of $A$.
Clearly all of these can be done by general clauses: they are just Boolean tests. Each test depends only on $s_{i-1}$, $w_{i}$, and $s_{i}$, a bounded number of variables since $A$ is fixed.
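A tiny worked example of mine: take the two-state DFA that accepts bit strings with an odd number of $1$s, so each $s_{i}$ is a single Boolean variable, and check by brute force that the three clause types are jointly satisfiable exactly when the DFA accepts:

```python
from itertools import product

def step(s, w):
    """Fixed toy DFA: parity of 1s seen so far; the state is one bit."""
    return s ^ w

def accepted(w):
    s = 0
    for bit in w:
        s = step(s, bit)
    return s == 1

def clauses_satisfied(s, w):
    """The three clause types of the proof, for this toy DFA.
    Each move clause reads only s[i-1], w[i-1], s[i]: three variables."""
    start_ok = (s[0] == 0)                                   # s_0 is start
    move_ok = all(s[i] == step(s[i - 1], w[i - 1])
                  for i in range(1, len(s)))                 # legal moves
    accept_ok = (s[-1] == 1)                                 # s_n accepts
    return start_ok and move_ok and accept_ok

def satisfiable(w):
    """Some choice of the state variables s_0..s_n satisfies all clauses
    iff the DFA accepts w."""
    return any(clauses_satisfied(s, w)
               for s in product([0, 1], repeat=len(w) + 1))
```

Because the DFA is deterministic, the only way to satisfy the start and move clauses is to write down the true run, which then satisfies the accept clause exactly when the machine accepts.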
Final Reduction to $3$SAT
A $3$SAT problem is a general-$k$-SAT problem except that the clauses are restricted to be disjunctions of at most three literals, that is, variables and their negations. Thus a clause can only be of the form

$$x \vee \bar{y} \vee z,$$

for example.
The final step is to note:
Theorem 4 The problem general-$k$-SAT can be reduced to $3$SAT.
This proves Cook’s Theorem.
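The proof of Theorem 4 is standard, and for $k \le 3$ it is especially simple: expanding a general clause by its truth table already yields clauses of at most three literals. A sketch of mine (for larger $k$ one also needs auxiliary variables to break up long clauses):

```python
from itertools import product

def general_to_3cnf(vars_, fn):
    """Expand one general clause on at most 3 variables into 3-CNF:
    for every falsifying assignment, add the clause that rules it out.
    vars_: tuple of variable indices; fn: Boolean function of them.
    A literal is a pair (v, polarity); polarity True means plain v."""
    assert len(vars_) <= 3
    cnf = []
    for vals in product([False, True], repeat=len(vars_)):
        if not fn(*vals):
            # For each variable include the literal that is False
            # under this row, so the clause excludes exactly this row.
            cnf.append([(v, not val) for v, val in zip(vars_, vals)])
    return cnf
```

For instance, XOR on two variables expands to the two clauses $(x_{0} \vee x_{1})$ and $(\bar{x}_{0} \vee \bar{x}_{1})$. Since $k$ is fixed, each general clause produces only constantly many $3$SAT clauses, so the reduction is polynomial.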
Nand Now For a Shortcut
Ken tells me that especially in Buffalo’s undergraduate theory course, he uses—and gets away with—a big handwave. He states the principle
“Software Can Be Efficiently Burned Into Hardware.”
He uses this to assert that the witness predicate $R(x,y)$ for a given $\mathsf{NP}$-language $L$ can be replaced by a polynomial-sized circuit $C_{n}$ for any input length $n$. Since NAND gates are universal, every gate can be a binary NAND. Thus $x \in L$ if and only if there exist a $y$ and an assignment to every wire in $C_{n}$ that is consistent with every NAND gate, such that the output wire has value $1$. For the inputs $u, v$ and the output $z$ of each NAND gate, the assignment must satisfy the clauses

$$(u \vee z) \wedge (v \vee z) \wedge (\bar{u} \vee \bar{v} \vee \bar{z}).$$
Together with the singleton clause $(z_{\text{out}})$ asserting the output wire, and clauses fixing the wires for $x$ to the corresponding bits of the given input string, this creates a $3$SAT formula that is satisfiable if and only if $x \in L$.
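One can machine-check that the three clauses attached to a NAND gate are satisfied precisely by the wire values consistent with the gate; a quick sketch of mine:

```python
from itertools import product

def nand_clauses_ok(u, v, z):
    """The three 3SAT clauses enforcing z = NAND(u, v):
    (u or z), (v or z), (not u or not v or not z)."""
    return (u or z) and (v or z) and ((not u) or (not v) or (not z))

# They hold exactly when z agrees with NAND(u, v):
assert all(nand_clauses_ok(u, v, z) == (z == (not (u and v)))
           for u, v, z in product([False, True], repeat=3))
```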
This proof has some nice features. It shows the completeness of several restricted forms of $3$SAT, for instance where the literals in each clause all have the same sign, and (with a little more work) where assignments must also make at least one literal in each clause false. But it hides the way computations are represented behind the handwave.
What do you think of the proof via finite automata?