# Taking Passes at NP Versus L

*Trying to extend the power of Nisan’s PRG for polylog passes*

Noam Nisan is one of the great researchers in the world of computer science theory. His brilliant work has covered many areas, from complexity theory to his more recent work on theoretical aspects of games and economics. I am still at a loss as to why he did not win a Gödel Prize for his early brilliant research. See his terrific blog for more on his recent work.

Today we wish to talk about an approach to separating logspace from $\mathsf{NP}$.

Actually what I want to talk about is a seeming paradox: Why do we not already today have a proof that logspace, $\mathsf{L}$, is different from $\mathsf{NP}$? In order to show that $\mathsf{L}$ is different from $\mathsf{NP}$, we need only find a property that one class has and the other does not. Apples and oranges are different colors; apples and bananas have different skins.

Now it sounds like we can state such a property to distinguish these complexity classes: Nisan’s pseudorandom generator (PRG) *fools* logspace, but does not fool polynomial time, and in an even stronger sense, does not fool $\mathsf{NP}$. So why doesn’t this prove $\mathsf{L} \neq \mathsf{P}$, or even $\mathsf{L} \neq \mathsf{NP}$?

The quick answer is that Nisan’s PRG is only known to fool logspace machines that have a *one-way* input tape, or a limited number of left-to-right re-reads. The not-so-quick answer is less clear, however—it is a bit of a complicated story, one I would like to share with you. I believe it could be used to attack some of the open problems that we face. So let’s turn to the story.

**The Story: Part I**

The high-level point is: Nisan exhibits an unconditional pseudo-random number generator (PRG) for logspace. This generator fails to work for higher classes, even polynomial time, so this is indeed a difference between the two classes. The reason this does not lead to a separation theorem is a bit tricky.

The story starts with a paper by Matei David, Periklis Papakonstantinou, and Anastasios Sidiropoulos (DPS) that I really like and discussed earlier here.

I will first explain Nisan’s generator in the way they use it in their paper. Here $U_r$ and $U_n$ are the uniform distributions over $\{0,1\}^r$ and $\{0,1\}^n$, respectively. The generator maps an $x \in \{0,1\}^r$ to a $G(x) \in \{0,1\}^n$. For any deterministic finite-state machine (FSM) $M$, and state $v$ of $M$, let

$$\Pr[M(G(U_r)) = v]$$

denote the probability over $x$ drawn from $U_r$ that $M$ on input $G(x)$ ends up in state $v$. In this manner $G(U_r)$ and $U_n$ induce distributions on the states of $M$, and by reasonable use of notation we also call these distributions “$M(G(U_r))$” and “$M(U_n)$.” We then have:

Let $G$ be computable in time polynomial in $n$ and let $r < n$. Say that $G$ is a *PRG against finite state machines (FSMs)* for space $s(n)$ with parameter $\epsilon(n)$ if for any FSM $M$ with at most $2^{s(n)}$ states,

$$\sum_{v} \Big| \Pr[M(G(U_r)) = v] - \Pr[M(U_n) = v] \Big| \leq \epsilon(n).$$

See their paper for all the details.
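To make the definition concrete, here is a minimal Python sketch of the quantity being bounded: the sum, over states, of the gap between the FSM’s final-state distribution under the generator’s outputs and under the truly uniform distribution. All function names are mine, and the “PRG” here is simply whatever list of output strings you pass in.

```python
from itertools import product

def state_distribution(transition, start, inputs):
    """Exact distribution over final states of a deterministic FSM,
    averaged over the given multiset of input strings."""
    counts = {}
    for w in inputs:
        s = start
        for bit in w:
            s = transition[(s, bit)]
        counts[s] = counts.get(s, 0) + 1
    total = len(inputs)
    return {s: c / total for s, c in counts.items()}

def fooling_error(transition, start, n, prg_outputs):
    """The L1 sum from the definition: distance between the FSM's
    state distribution under the PRG outputs and under the uniform
    distribution on {0,1}^n."""
    uniform = [list(w) for w in product([0, 1], repeat=n)]
    p = state_distribution(transition, start, prg_outputs)
    q = state_distribution(transition, start, uniform)
    states = set(p) | set(q)
    return sum(abs(p.get(s, 0) - q.get(s, 0)) for s in states)
```

For example, with the two-state parity automaton and a “generator” that emits only even-parity strings, the distance comes out to the maximum value of 1: parity is not fooled at all by that generator.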

Nisan’s theorem is:

Theorem: There exists a constant $c > 0$ such that for any space bound $s(n) \geq \log n$, there exists a function $G: \{0,1\}^{c\,s(n)\log n} \rightarrow \{0,1\}^n$, computable in time polynomial in $n$, that is a PRG against FSMs for space $s(n)$, with parameter $2^{-s(n)}$.
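For intuition, here is a hedged sketch of the recursion behind Nisan’s generator: each hash function doubles the output length, so $k$ hashes stretch one $m$-bit block into $2^k$ blocks. The helper names are mine, and for simplicity I use random GF(2) matrices as the pairwise-independent hash family, which spends more seed bits than the $2m$-bit field-based family Nisan actually needs.

```python
import random

def random_affine_hash(m, rng):
    """A pairwise-independent hash h: {0,1}^m -> {0,1}^m, here
    h(x) = Ax + b over GF(2) with A a random m x m bit matrix
    (a seed-hungry stand-in for Nisan's 2m-bit family)."""
    rows = [rng.getrandbits(m) for _ in range(m)]
    b = rng.getrandbits(m)
    def h(x):
        y = 0
        for i, row in enumerate(rows):
            y |= (bin(row & x).count("1") & 1) << i  # <row, x> mod 2
        return y ^ b
    return h

def nisan_prg(x, hashes):
    """Nisan's recursion G(x; h_1..h_k) = G(x; h_1..h_{k-1}) followed
    by G(h_k(x); h_1..h_{k-1}), with G(x) = [x] when no hashes remain.
    Output: 2^k blocks of m bits from a seed of one block plus k hashes."""
    if not hashes:
        return [x]
    head, last = hashes[:-1], hashes[-1]
    return nisan_prg(x, head) + nisan_prg(last(x), head)
```

With $m = \Theta(s(n))$ and $k = \log n$ hash functions one gets $n$ output bits, and the point of Nisan’s analysis is that a small-space machine cannot tell the two halves of the recursion were generated from correlated seeds.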

The definition allows the machine to read the “random” bits only once. They suggest that you think of the bits as placed on a special *random* tape that is one-way. The machine reads a bit, uses it any way it wishes, and then can move on to the next bit of the tape. The only way it can “recall” the exact bit is by storing it in its limited storage.

The connection to logspace is that when $s(n) = O(\log n)$, the time to compute $G$ is polynomial, and the allowed number of states of $M$ is polynomial. A logspace Turing machine has only polynomially many possible worktape configurations, which can become the states of the FSM $M$. However, the conversion requires that $M$ have the same sequential-only access to the input as the logspace machine has. A two-way FSM can be converted into a one-way FSM via a crossing-sequence argument, but this can require an exponential blowup in the number of states. Our question is whether one can effectively compromise by allowing multiple passes over the input, and exploit a greater power of non-determinism in a scaled-down situation.
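Here is a toy illustration of the configurations-to-states conversion (the names are mine; this is not the construction from the papers): a streaming machine that counts 1s needs only $O(\log n)$ bits of worktape, and its possible worktape contents become the $n+1$ states of an explicit FSM.

```python
def logspace_to_fsm(n):
    """Toy conversion: a one-way machine whose worktape holds a
    counter of 1s seen so far (O(log n) bits of worktape) becomes
    an FSM whose states are the n+1 possible worktape contents --
    polynomially many states, as described above."""
    states = list(range(n + 1))          # counter values 0..n
    transition = {}
    for s in states:
        transition[(s, 0)] = s           # reading a 0 keeps the count
        transition[(s, 1)] = min(s + 1, n)  # reading a 1 increments it
    return states, transition
```

The same idea applies to any logspace machine: a state records the worktape contents, head positions, and finite control, which is $2^{O(\log n)} = n^{O(1)}$ states, but only so long as the input is consumed left to right.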

**The Story: Part II**

The basic thought is to try and exploit the distinction that Nisan seems to create and use it to prove a separation theorem. Consider a logspace machine that only uses its own input tape to get the input length $n$; this helps avoid any issue with the input tape causing any extra complexity. Suppose the machine uses $y$ as its random tape. It reads the random bits from $y$ and makes some decision based on the randomness. The key is that $y$ can be generated by a PRG and not be real randomness. This “fools” the logspace machine.

Suppose now that $\mathsf{L} = \mathsf{NP}$; we may be able to handle a weaker assumption, but let’s use this one for now. Then consider a new machine $N$ that is nondeterministic and runs in polynomial time. It operates as follows: it reads some of the tape $y$, and then stops reading. It guesses the seed that the PRG may have used. Then it checks that its guess is correct. If not, it rejects. If yes, then it computes the next unread random bit, and uses this knowledge to “cheat.” This makes the PRG fail, since it cannot fool this machine.
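Here is a small sketch of the cheating machine’s strategy, with nondeterminism replaced by brute force over seeds. The toy “generator” used in the example below just repeats its seed and stands in for a real PRG; all names are mine.

```python
from itertools import product

def next_bit_predictor(G, seed_len, prefix):
    """Simulate the nondeterministic machine N deterministically:
    try every seed (N would 'guess' one), keep those whose output is
    consistent with the bits of the tape read so far, and predict
    the next unread bit. Returns the predicted bit, or None if no
    seed fits or different seeds disagree."""
    t = len(prefix)
    candidates = set()
    for seed in product("01", repeat=seed_len):
        out = G("".join(seed))
        if out[:t] == prefix and len(out) > t:
            candidates.add(out[t])
    if len(candidates) == 1:
        return candidates.pop()
    return None
```

On a truly uniform tape the next bit is a coin flip, so any predictor is right only half the time; on a pseudorandom tape with a short seed, a long enough prefix pins the seed down and the predictor is always right. That gap is exactly why the PRG fails against this machine.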

That is the high-level idea. The key question is: why does this not contradict $\mathsf{L} = \mathsf{NP}$? The logspace machine was fooled and the machine $N$ was not fooled. Seems like a proof to me; where did we go wrong? Note also that $N$ uses non-determinism only to guess a relatively short seed, so this would separate $\mathsf{L}$ from a subclass of $\mathsf{NP}$.

The answer is that on close examination what we really proved is this: treat the random tape as the real input tape for both machines. If they both can only read the tape once, left-to-right, then the logspace machine is weaker than the machine $N$. But this says nothing about $\mathsf{L}$ versus $\mathsf{NP}$. The difficulty is that restricting the logspace machine to read the input once is a huge restriction, while the same restriction on the other machine costs nothing. Clearly the second machine can first copy the tape into a temporary storage area and then read it over and over.

Note that a logspace machine that can only read its input tape once, left-to-right, is really just a finite state automaton. More exactly, for each input length $n$ it yields an FSM with $n^{O(1)}$ states. We can show that such machines cannot even do trivial tasks like accept the language

$$\{\, w \# w : w \in \{0,1\}^* \,\}.$$

Indeed, for each $w \in \{0,1\}^n$ take $\sigma_w$ to be the sequence of states the machine $M$ is in, counting only the times it moves rightward off the $\#$ character, in its accepting computation on input $w \# w$. Then we must have $\sigma_w \neq \sigma_u$ whenever $w \neq u$, as else we could reconstruct accepting computations of $M$ on the bad inputs $w \# u$ and $u \# w$. Thus some $\sigma_w$ must embody close to a linear number of passes, or there must be super-polynomially many states.
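The counting argument can be checked mechanically for tiny $n$, using the language of strings of the form $w \# w$ as the hard example (names below are mine). Every two distinct $n$-bit strings $w \neq u$ are separated, in the Myhill-Nerode sense, by the extension consisting of the marker followed by $w$, so all $2^n$ prefixes are pairwise inequivalent and a one-pass machine needs at least $2^n$ states.

```python
from itertools import product

def in_lang(s):
    """Membership in { w#w : w a binary string }."""
    if s.count("#") != 1:
        return False
    w, u = s.split("#")
    return w == u

def distinguishable_prefixes(n):
    """Count Myhill-Nerode-inequivalent prefixes among the 2^n
    binary strings of length n. Since w and u are distinguished by
    the extension '#'+w, every prefix gets its own class, forcing
    any one-pass acceptor to have at least 2^n states."""
    prefixes = ["".join(p) for p in product("01", repeat=n)]
    classes = []
    for w in prefixes:
        # w's acceptance behavior across all extensions '#'+v
        sig = tuple(in_lang(w + "#" + v) for v in prefixes)
        if sig not in classes:
            classes.append(sig)
    return len(classes)
```

The brute-force count confirms $2^n$ classes for small $n$, matching the state sequence argument above.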

**The Story: Part III**

The restriction to reading the random tape only once seems to make the difference between the (one-way) complexity classes uninteresting. But wait. The paper of DPS shows that it is possible to read the tape multiple times for the Nisan generator. If it were possible to read the tape a polynomial, or even linear, number of times, then this really would yield a proof that $\mathsf{L}$ is not equal to $\mathsf{NP}$.

The result of DPS is that Nisan’s generator still works if a limited number of passes are allowed over the random tape. This is far too few for the above argument to work, but it is suggestive, and DPS do mention the possible connection between PRGs and complexity separations.

Recently Ken, H. Venkateswaran, and I have started to discuss how we might make a tight connection between the power of Nisan-like generators and complexity separations. Venkateswaran has worked for years on related problems in this area, and once had a paper on extending Nisan’s PRG to fool Auxiliary Pushdown Automata.

Let’s make a definition. Say a PRG is a **strong Nisan** generator if it fools logspace as his generator does, uses a seed of size $\log^c n$ for some absolute constant $c$, allows computing each random bit from the seed in polylogarithmic size, and fools machines that make $\log^k n$ passes for any fixed $k$. Then I believe (we believe?) that this would prove a separation theorem:

Conjecture: The existence of a strong Nisan PRG implies $\mathsf{NP} \neq \mathsf{L}$.

The sketch of the proof is based on assuming that the two classes are equal. Then we *scale down* to make a nondeterministic machine $N$ that reads only a poly-log number of bits and then guesses the seed. If the guess is correct, $N$ can use this knowledge, operating deterministically, to make the PRG fail. Scaling down the input size comes from focusing on subsets of the bits produced by Nisan’s generator and using the fact that those bits are *succinct*: each is computable from the polylog-size seed. It also means that we are making full use of non-determinism relative to the scaled-down input size, rather than guessing only polylog bits out of a polynomial-size input. We are working on the details, and hope to complete them soon.

**Open Problems**

Can we prove that there is a strong Nisan generator? Note that we can weaken the error bounds on the generator: I believe the PRG need only be uniform with a larger error, rather than polynomially small error. I think this could be a viable approach to proving such great theorems. What do you think?