All: Too late!
from the 1985 movie “Clue”

Lane Hemaspaandra was among the youngest present at the creation of the Computational Complexity Conference (CCC). We should thank him and all those who promoted this great conference in the green days of our field—indeed we will. He is older now, as are we all, but continues to make important contributions to complexity theory.

Today Dick and I want to talk about an older idea from the kitchen of Lane and others, and how it adds mustard to ideas we have already discussed on this blog.

Initially CCC was called Structure in Complexity Theory to emphasize the scientific position that the real-world behavior of many thousands of disparate computational problems is channeled by a small number of information-theoretic properties. Although the online conservatory called the Complexity Zoo is about to go over 500 classes, most problems are tied with rope into the completeness levels of barely more than a dozen classes. Many complexity classes have complete problems; many are closed under various operations and types of reductions; others involve languages with various densities and properties such as self-reducibility that affect worst-case/average-case behavior. Other structural notions have been proposed, studied, furthered, or put aside according to their utility.

The concept that Dick and I have just found reason to study further originated with Andrew Goldberg and Michael Sipser in their 1985 STOC paper “Compression and Ranking.” The ranking task for a given language ${A}$ is to compute, given any string ${x}$, the number ${f_A(x)}$ of strings ${w \in A}$ such that ${w \leq x}$ in the standard ordering of strings. If this function ${f_A}$ is computable in polynomial time, then ${A}$ is ${\mathsf{P}}$-rankable. In his single-author paper at the second Structure in Complexity Theory conference at Cornell in 1987, Lane expanded one of their implications into equivalences that hold for several weaker notions of ranking as well, and gave many related results. Stephen Rudich had obtained several of these results plus some others, and became a co-author on the journal version.
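For concreteness, here is a small Python sketch (our own toy illustration, with invented function names) of a ${\mathsf{P}}$-rankable language: take ${A}$ to be the set of binary strings with an even number of ${1}$s, so that ${f_A}$ can be computed by simple counting over prefixes rather than by enumeration.

```python
def rank_even_parity(x):
    """f_A(x) = |{w in A : w <= x}| for A = binary strings with an even
    number of 1s, under the standard length-then-lexicographic ordering.
    Runs in time polynomial in |x|, so A is P-rankable."""
    n = len(x)
    # Even-parity strings strictly shorter than x: the empty string plus
    # 2^(l-1) strings of each length 1 <= l < n, which sums to 2^(n-1).
    total = 2 ** (n - 1) if n > 0 else 0
    # Even-parity strings of length n that are lexicographically <= x.
    ones = 0                        # 1s seen in the prefix x[:i]
    for i, b in enumerate(x):
        if b == '1':
            rest = n - i - 1        # free positions after fixing x[:i] + '0'
            if rest == 0:
                total += 1 if ones % 2 == 0 else 0
            else:
                total += 2 ** (rest - 1)  # half of all suffixes even out the parity
            ones += 1
    if x.count('1') % 2 == 0:
        total += 1                  # count x itself
    return total
```

Each call does ${O(|x|)}$ arithmetic, whereas enumerating all ${w \leq x}$ would take exponential time.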

Lane is also an expert on voting systems. Indeed Lane and his wife Edith Hemaspaandra and his postdoc Jörg Rothe began in 1997 by analyzing the work of Lewis Carroll on voting systems that we mentioned in passing here, and this led to a large ongoing project with its own library of three dozen papers in computational social choice theory. Lane Hemachandra and Edith née Spaan amalgamated their names when they married. I (Ken) have been happy to have them in nearby Rochester, though this year they are on sabbatical in Düsseldorf, Germany, visiting Rothe—whose name is akin to scarlet.

## Succinctness

Large objects are succinct if they have small descriptions that are easy to operate with. Psst—let us take you out into the hall to explain the title of this post:

People who say, ‘…and to make a long story short’ usually have already missed being succinct.

Consider a large graph with ${N = 2^n}$ vertices. The graph may be fairly sparse, with each vertex having at most ${n}$ or ${n^2}$ or so neighbors, but in toto it is too big to write down in ${n^{O(1)}}$ time. However, to navigate this graph, it may suffice to have a small program that given any vertex ${v}$ outputs the polynomial-sized list ${N_v}$ of neighbors of ${v}$. Or weaker, given any ${v}$ and ${w}$ we would at least like to know whether they are connected by an edge.
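As a concrete, standard example (sketched here in Python with names of our choosing): the ${n}$-dimensional hypercube has ${N = 2^n}$ vertices, yet both forms of access are trivial programs.

```python
def hypercube_neighbors(v, n):
    """Neighbor oracle for the n-dimensional hypercube on N = 2^n vertices:
    the full edge list is exponentially large, but the list of neighbors of
    any vertex v is computed in poly(n) time by flipping one bit at a time."""
    return [v ^ (1 << b) for b in range(n)]

def hypercube_edge(v, w):
    """Weaker access: are v and w adjacent?  True iff they differ in
    exactly one bit, i.e., v XOR w is a power of two."""
    d = v ^ w
    return d != 0 and d & (d - 1) == 0
```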

For the simpler case of a binary string, the condition which we have discussed before goes as follows:

Definition 1 With regard to some fixed polynomial ${p}$, a binary string ${x}$ of length ${N = 2^n}$ is (weakly) succinct if there is a Boolean circuit ${D}$ of size at most ${p(n)}$ with ${n}$ input gates such that for all ${i}$, ${0 \leq i \leq N-1}$, ${D(i)}$ outputs bit ${i}$ of ${x}$.

By size of ${D}$ we mean its number ${m}$ of wires; if we require instead that ${m\log_2 m \leq p(n)}$ then ${p(n)}$ is also an upper bound on the polynomial-time Kolmogorov complexity of ${x}$. Compared to standard Kolmogorov complexity notions which allow programs arbitrary time to lounge around generating ${x}$ serially from a short description ${d}$, the circuit description ${D}$ provides fast parallel computation of the bits of ${x}$.
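A minimal sketch of the weak notion, using the well-known Thue-Morse string as our example (the function plays the role of the circuit ${D}$):

```python
def thue_morse_bit(i):
    """Bit i of the Thue-Morse string of length N = 2^n: the parity of the
    number of 1s in the n-bit binary expansion of i.  An XOR tree over the
    n input bits realizes this as a poly(n)-size circuit, so the string is
    (weakly) succinct even though writing it all out takes 2^n steps."""
    return bin(i).count('1') % 2
```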

## New Definition

The above sounds good, but is it good enough? When trying to extend the notion to graphs and matrices in general, we encountered situations where we know there is a single ${1}$ in a large interval, and would like to find where it is. We tried making special-case definitions with wording like “strongly succinct” for some of the cases that we worked on, yet this issue threw a monkey wrench or spanner into what we wanted to do next. Finally we hit upon extending the base definition instead. Again with regard to a fixed polynomial ${p(n)}$, the new property reads:

Definition 2 A string ${x \in \{0,1\}^N}$, ${N = 2^n}$, is succinct if there is a circuit ${D}$ of size at most ${p(n)}$ with ${n}$ inputs and ${n+1}$ outputs such that for all indices ${j}$, ${D(j) = |\{i \leq j : x_i = 1\}|}$.

That is, ${D}$ computes the census of places up to and including ${j}$ where ${x}$ has a ${1}$. Now we can recover bit ${x_i}$ itself as ${D(i) - D(i-1)}$. To be strictly-ballroom, of course, we put ${D(-1) = 0}$.
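For a worked example (ours, not from the sources above): the string ${x}$ with ${x_i = 1}$ exactly when ${3}$ divides ${i}$ has the closed-form census ${D(j) = \lfloor j/3 \rfloor + 1}$, from which individual bits fall out as differences.

```python
def census(j):
    """D(j) = |{i <= j : x_i = 1}| for the string x with x_i = 1 iff 3 | i,
    with the convention D(-1) = 0."""
    return 0 if j < 0 else j // 3 + 1

def bit(i):
    """Recover bit x_i as D(i) - D(i-1)."""
    return census(i) - census(i - 1)
```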

More to the point, we can quickly find a place with a ${1}$ between any places ${i}$ and ${j}$ by binary search. We call a family of strings ${x^n}$ succinct if there is a polynomial ${p(n)}$ with regard to which each ${x^n}$ is succinct. Although doing binary search bumps up the runtime to ${O(np(n))}$, in asymptotic usage this is still “polynomial.”
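The binary search can be sketched generically (a hypothetical helper of our own; ${D}$ stands for the census circuit): it maintains an interval known to contain a ${1}$ and halves it with one census query per step.

```python
def find_one(D, i, j):
    """Given a census oracle D(j) = |{k <= j : x_k = 1}| with D(-1) = 0,
    locate the position of some 1 in x[i..j] using O(log(j - i)) oracle
    calls, or return None if the interval holds no 1.
    Loop invariant: the interval [lo, hi] contains at least one 1."""
    if D(j) - D(i - 1) == 0:
        return None
    lo, hi = i, j
    while lo < hi:
        mid = (lo + hi) // 2
        if D(mid) - D(lo - 1) > 0:
            hi = mid          # a 1 lies in [lo, mid]
        else:
            lo = mid + 1      # all 1s of the interval lie in (mid, hi]
    return lo
```

With a census circuit of size ${p(n)}$, each of the ${O(n)}$ probes costs one circuit evaluation, which is where the ${O(np(n))}$ bound comes from.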

## Relation to Ranking

Given any language ${A}$ and string ${u}$ of length ${n}$, take ${x_A}$ to be the binary string that encodes ${A}$ on strings of length up to ${n}$. Then ${f_A(u)}$ gives the census of ${x_A}$ on indices up to that of ${u}$. Alternatively we may consider the family of strings ${x^n = A^{=n}}$ for each ${n}$. Then the family is succinct if and only if ${A}$ is rankable by ${n^{O(1)}}$-sized circuits.

This is actually called strong rankability in the Hemachandra-Rudich paper. The base definition puts ${f_A(u) = -1}$ whenever ${u \notin A}$, while their “weak” definition allows ${f_A(u)}$ to be arbitrary when ${u \notin A}$. Neither weakening seems to allow binary search—though both allow equivalence theorems for ${\mathsf{P}}$ and ${\mathsf{NP}}$. The angle here is that the task of ranking the members of ${A}$ can be restricted to the members of ${A}$.

We also want to extend the notion to vectors ${v}$ and matrices over arbitrary sets ${F}$ of entries. Here the small circuit ${D_v}$ computes ${D_v(j,a) = |\{i \leq j: v(i) = a\}|}$ for all ${a \in F}$ and indices ${j}$. We can still do binary search to find some entry ${i}$ where ${v(i) = a}$. Indeed if there are only ${p(n)}$ such entries, then we can find them all. This is the property needed to navigate various kinds of sparse graphs. It is a plum that a single idea allows us to treat various combinatorial structures as partially-white boxes.
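That search can be sketched as follows (our own helper names; D(j, a) stands for the vector census circuit): repeated binary searches peel off the occurrences of ${a}$ from left to right, one per search.

```python
def find_all(D, n_len, a, bound):
    """Given D(j, a) = |{i <= j : v(i) = a}| for a vector v of length n_len,
    list every position holding value a, assuming there are at most `bound`
    of them.  Each position costs one binary search over the census oracle."""
    positions = []
    lo = 0
    while len(positions) < bound and D(n_len - 1, a) > len(positions):
        target = len(positions) + 1   # rank of the next occurrence to find
        hi = n_len - 1
        # leftmost index whose census reaches `target` is the next occurrence
        while lo < hi:
            mid = (lo + hi) // 2
            if D(mid, a) >= target:
                hi = mid
            else:
                lo = mid + 1
        positions.append(lo)
        lo += 1                       # resume searching past this occurrence
    return positions
```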

## Matrix Computations

Now let ${A}$ stand for a big ${N \times N}$ matrix. We say ${A}$ is succinct if the string or vector of length ${N^2 = 2^{2n}}$ obtained by unrolling ${A}$ in row-major order is succinct. It is weakly succinct if the string satisfies the original succinctness notion, i.e., if a small circuit can compute individual entries ${A(i,j)}$.

For a ${\{0,1\}}$ matrix, succinctness means we are computing ${f_A(i,j) =}$ the census of ${1}$'s at columns up to ${j}$ in row ${i}$ plus all those in rows above. We can do binary search in row ${i}$ and thus, in a sparse graph, find all neighbors of vertex ${i}$ in polynomial time.

We do not know whether succinct matrices, even Boolean matrices, are closed under matrix product. This involves the problem of computing or estimating the inner product of two succinct exponentially-long vectors, which is a knife-edge challenge all its own. We can, however, prove some special cases.

Lemma 3

• (a) The product of two succinct permutation matrices is succinct.
• (b) The product of a succinct sparse matrix and a weakly succinct matrix is weakly succinct.

Proof: For (a), let ${F = AB}$ for permutation matrices ${A,B}$ and let any ${(i,j)}$ be given. We know the census of ${1}$'s in rows above ${i}$ will be ${i-1}$, so we need only tell whether the ${1}$ in row ${i}$ is to the right or left of (or in) column ${j}$. We can first find the unique ${k}$ such that ${A(i,k) = 1}$. Then in row ${k}$ of ${B}$ we find the unique column ${j'}$ with a ${1}$ and check whether ${j' \leq j}$.

For (b), we have ${F(i,j) = \sum_{k \in S} A(i,k)B(k,j)}$ where ${S}$ is the ${n^{O(1)}}$-sized set of columns in which row ${i}$ of ${A}$ has nonzero entries. Since ${A}$ is succinct this can be found by binary search, and then the availability of a small circuit computing ${B(k,j)}$ for those values completes one for ${F(i,j)}$. $\Box$
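Part (a) can be sketched in a few lines of Python (0-indexed, so the rows above row ${i}$ contribute ${i}$ ones rather than ${i-1}$; the maps pi_a, pi_b are our stand-ins for the small circuits locating the unique ${1}$ in a row):

```python
def product_census(pi_a, pi_b, i, j):
    """Census circuit for F = A B when A, B are permutation matrices given
    by maps pi_a, pi_b (row -> column of that row's unique 1).  The i rows
    above row i contribute one 1 each; row i adds one more exactly when
    its 1, which sits in column pi_b(pi_a(i)), lies at a column <= j."""
    col = pi_b(pi_a(i))   # the unique k with A(i,k)=1, then B's 1 in row k
    return i + (1 if col <= j else 0)
```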

## Is the Notion Strong Enough? Too Strong?

To multiply by a succinct sparse matrix on the right, for ${F(i,j) = \sum_k B(i,k)A(k,j)}$, we need to find the sparse set of rows in which column ${j}$ of ${A}$ has a nonzero entry. This may require symmetrizing the notion of succinctness for matrices by stipulating that the column-major string also is succinct. This ensures it is closed under transpose ${A^{\dagger}}$—note this is A^\dagger in ${\mbox{\LaTeX}}$.

Stronger still, we could stipulate that a small circuit ${D}$ computes the census of the quadrant northwest of ${(i,j)}$. Then ${D(i,j) + D(i-1,N) - D(i-1,j)}$ gives the row-major census, and the column-major census is similar. Is this equivalent to both, or to either?
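A quick 0-indexed sanity check of that identity (our own code; ${Q}$ is the northwest-quadrant census with ${Q(-1,\cdot) = 0}$, and columns run up to ${N-1}$):

```python
def row_major_from_quadrant(Q, i, j, N):
    """Row-major census R(i,j): all 1s in rows < i plus those in row i at
    columns <= j, derived from the northwest-quadrant census
    Q(i,j) = |{(r,c) : r <= i, c <= j, x_{r,c} = 1}| via
        R(i,j) = Q(i,j) + Q(i-1, N-1) - Q(i-1, j)."""
    return Q(i, j) + Q(i - 1, N - 1) - Q(i - 1, j)
```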

It is also possible that counting is stronger than we need—note that part (b) of the theorem falls short of getting census counts for ${F}$. Perhaps we can make our desired applications work by crafting an intermediate notion that allows binary search but stops short of full counting? Ryan Williams has pointed out to us that for applications with for-contradiction assumptions that entail ${\mathsf{\#P} \subset \mathsf{P/poly}}$, the assumption makes the counting-based definitions equivalent to the original one.

In any event, the strongest definitions do seem to be met by important examples of succinct objects in complexity theory. Some of them come from circuits that simulate Turing machines ${M}$. These circuits have a regular grid structure that replicates a fixed-size gadget for the finite control of ${M}$. Owing to the regular structure, global counting—whether north, west, or northwest in the grid—does not impose any greater difficulty. Another source is quantum computation. Deterministic gates such as controlled-NOT and Toffoli are represented by succinct permutation matrices, and the lemma above suffices to compose them at will while preserving succinctness.

Why fret these details? Proving another nice lemma could make the choice a lead-pipe cinch. In non-uniformity and some other ways the quest for lower bounds suffers from the lack of a ‘standard candle’ for complexity, but first one needs to make a standard candlestick. We feel there should be a unique solution to this puzzle of logic and esthetics, but we are still picking up clues.

## Open Problems

Admittedly we’ve bounced around like balls in a billiard room, but to make a long story short, what is the best notion of succinctness?

Are the row-major and/or column-major and “northwest” notions of succinctness for matrices equivalent?

For fans of the game “Clue” or Cluedo: who did it, where, and how?


December 18, 2011 6:46 am

Interesting post! I suppose that the Hemaspaandra pair's interest in voting systems is not independent of the fact that they shared a university with Riker, the heresthetician.

• December 19, 2011 9:58 pm

They certainly cite Riker. Wow, I learned a new word! This page on “Heresthetics” says that “Riker coined this term from a Greek root meaning ‘choosing and electing.'” I would have preferred to see it used to describe the iconography of iconoclasm.

SPOILER ALERT! on the Clue(do) puzzle below—hence this comment is being elongated vertically by as many words as I have the patience to type right now.