Birthday workshop at Rutgers last January


Eric Allender and Michael Saks have been leading lights in computing theory for four decades. They have both turned 60 this year. I greatly enjoyed the commemorative workshop held in their honor last January 28–29 at DIMACS on the Rutgers campus.

Today Dick and I salute Eric and Mike on this occasion.

Eric and Mike have been together on the Rutgers faculty since the mid-1980s. I have known Eric since we were both graduate students. We both had papers at the first Structure in Complexity Theory conference in 1986—when it was co-located with STOC at Berkeley—and again at the 1986 ICALP in Rennes, France. I don’t know whether I first met Mike at the Berkeley conferences or a couple of years later at FOCS 1988 in White Plains, New York. The “Structures” conference was renamed CCC, for “Computational Complexity Conference,” in 1996. The 2017 conference starts tomorrow in Riga, Latvia.

Mike has also recently been named an ACM Fellow, joining Eric on that illustrious roster. Mike’s primary appointment is in Mathematics, where he is currently serving as Chair; Eric was Chair of Computer Science at Rutgers not long ago. I have somehow managed never to write a paper with Eric—nor has Lance Fortnow, as Lance noted during his talk—but we collaborated with Michael Loui on three book chapters covering complexity theory in the CRC Algorithms and Computation Theory Handbook. I did write a paper with Eric’s student Martin Strauss, who along with Michal Koucký—another of Eric’s students—organized the workshop.

## Structure and Combinatorics

One hears all the time about movements and generations in art and music but they happen also in science. We’ve discussed the “AI Winter” among other things. Complexity theory is no exception.

Eric and I were among the avant-garde of “Structural Complexity.” The idea was to understand common features of complexity classes and reductions between problems, apart from specific features of the problems in isolation. In part this was a reaction to how direct analysis of problems had not only failed to resolve ${\mathsf{P}}$ versus ${\mathsf{NP}}$ in the 1970s, but had met barriers in the form of oracle results that applied in similar ways across many levels of classes.

Eric’s paper at STOC 1987, titled “Some Consequences of the Existence of Pseudorandom Generators,” kicked off his study and use of multiple forms of Kolmogorov Complexity (KC). We have covered this in Eric’s research before. The “structural” flavor of KC comes from how it avoids referencing specific combinatorial structures like graphs or formulas or set systems and how it can be applied at many levels of complexity.

It seems fair to say the structural approach clarified and systematized many questions of complexity theory but did not resolve many on its own. But it was great for fashioning molds into which combinatorial arguments can be injected. Eric’s signature lower bound with his student Vivek Gore, separating the classes uniform-${\mathsf{TC^0}}$ and ${\mathsf{PP}}$, employs these ‘structural’ ingredients:

• Oracles;

• Modular counting;

• Uniformity of Boolean circuits (as needed to frame the result, since we still seem far from separating ${\mathsf{PP}}$ from non-uniform ${\mathsf{TC^0}}$);

• Alternating Turing machines.

A similar list can be made for Ryan Williams’s separation of ${\mathsf{NEXP}}$ from nonuniform ${\mathsf{ACC^0}}$: oracles again employed constructively; succinctness; quasi-linear time complexity as a stepping-stone; and probabilistic simulation of AND/OR using modular counting—for which Allender-Gore is cited. The use of symmetric functions and gates in both papers might be deemed more “combinatorial” but this still goes toward my point here.

That said, Mike has always represented more the “combinatorial” side—indeed, his first three journal papers after joining Rutgers appeared in the journal Combinatorica. Further, the workshop emblem ${E + M = C^2}$ adds up to “Eric + Mike = Complexity and Combinatorics.” The two flavors were evident in the series of talks chosen to honor each and both.

## The Talks

All but one talk has video online. DIMACS Director Rebecca Wright gave a welcoming introduction.

${\bullet}$ Avi Wigderson spoke on “Brascamp-Lieb Inequalities, Operator Scaling, and Optimization.” After relating why his and Mike’s families are close, he set the tone by telling what led Mike into combinatorics as a graduate student in mathematics:

“When he was near the algebraists, the typical conversation he would hear is, ‘Remember this really extremely general result I proved last year? Well, guess what—I can now generalize it even more and I can prove an even more general one.’ [But] then when he hung around with the combinatorialists, he would hear, ‘You remember this extremely trivial problem that I could not solve last year? Well, I found a special case that I can still not solve.’ So he decided, ‘that’s for me.’

Well … Mike still has some affinity to the algebraic side, … and has this tendency whenever he is facing a problem the first thing to do is to generalize it just below the point where it becomes false, and then scale it a bit.”

Avi then introduced the technical part by saying that he was led into it by the Polynomial Identity Testing (PIT) problem, “a problem that Eric cares about, Mike cares about, lots of people care about”:

“I just want to mention that Mike and I spent five years on [PIT], meeting every week in a café for the day. We had lots of great ideas that ended up with nothing. I think that’s the story with a lot of other people.”

Avi could have appended to that last sentence, “…and in complexity theory in general.” Thus C & C are married to a hard bed. The main body of his talk was about testing inequalities, where things can be done.

${\bullet}$ Harry Buhrman spoke on “Computing With Nearly Full Space.” We covered this work with a different slant here. Harry’s first six minutes featured many stories and photos of conferences and meetings with Eric and Mike.

${\bullet}$ Meena Mahajan spoke on “Enumerator Polynomials: Completeness and Intermediate Complexity.” Although she began by saying she mainly knew Eric, having invited him to India and vice-versa, her talk was highly combinatorial, involving polynomial enumerators for cliques, Hamiltonian circuits, and much else, including projections in real space.

${\bullet}$ Clifford Smyth spoke on “Restricted Stirling and Lah numbers and their inverses.” This involved the problem of computing (the signs of) entries of certain inverse matrices without having to do the whole inversion.

${\bullet}$ Yi Li spoke on “Estimating the Schatten Norm of Matrices in Streaming Models.” He started with a problem about ${n}$-dimensional real vectors ${x}$: starting from ${x = 0}$, you receive sequential updates ${x_i \leftarrow x_i + \delta}$ to the components of ${x}$. You want to maintain an estimate of a function ${F(x)}$ to within a factor of ${(1+\epsilon)}$ without using space ${n}$—ideally, using space polylogarithmic in ${n}$. He then took this to the case of matrices and described solvable and hard cases.
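As a concrete illustration of this streaming setting (my own sketch, not from Yi’s talk), here is a minimal median-of-means estimator for the special case ${F(x) = \sum_i x_i^2}$, in the style of the classic Alon-Matias-Szegedy sign sketches. All names and parameters are mine; a real implementation would derive the random signs from 4-wise independent hash functions so that the space is genuinely polylogarithmic, whereas this toy version stores them explicitly for simplicity.

```python
import random

def ams_f2_sketch(updates, n, groups=11, per_group=60, seed=1):
    """Estimate F2(x) = sum_i x_i^2 from a stream of (i, delta) updates,
    via median-of-means over sign sketches z = sum_i s_i * x_i.
    Illustration only: real AMS derives the signs s_i from 4-wise
    independent hashes, giving polylog space; storing them explicitly,
    as done here for simplicity, costs O(n) per sketch."""
    rng = random.Random(seed)
    k = groups * per_group
    signs = [[rng.choice((-1, 1)) for _ in range(n)] for _ in range(k)]
    z = [0.0] * k
    for i, delta in updates:          # one pass over the stream
        for s in range(k):
            z[s] += signs[s][i] * delta
    # Each z^2 is an unbiased estimator of F2; average within a group
    # to shrink the variance, then take a median across the groups.
    means = [sum(v * v for v in z[g*per_group:(g+1)*per_group]) / per_group
             for g in range(groups)]
    return sorted(means)[groups // 2]
```

The point is that the number of counters depends on the accuracy wanted, not on the length of the stream.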

${\bullet}$ Mary Wootters—who along with Yi Li did her PhD under Martin—closed the first day by speaking on “Repairing Reed-Solomon Codes.” This, from a joint paper with Venkat Guruswami, was my favorite talk. The basic problem is deliciously simple: Given an unknown polynomial ${f(x)}$ of degree ${k-1}$ over ${\mathbb{F}_N}$ where ${N = 2^t}$, and an argument ${b \in \mathbb{F}_N}$, we want to compute ${f(b)}$. A random set of ${k}$ values ${\alpha_j = f(a_j)}$ suffices to compute any ${f(b)}$ by interpolation, while having only ${k-1}$ values is never enough. Each value has ${t}$ bits. Do we really need all ${kt}$ bits of the ${k}$ values? Mary gave cases where, amazingly, sampling only ${T \ll kt}$ total bits from the values is enough. The bits sent by each node ${j}$ must be computed locally from ${\alpha_j}$, without communication with any other node.
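To see why ${k}$ values pin down ${f(b)}$, here is a minimal Lagrange-interpolation sketch over a prime field ${\mathbb{F}_p}$. The talk’s setting is ${\mathbb{F}_{2^t}}$, whose arithmetic is more involved, but the prime-field case shows the same principle; the function name is my own, and Python 3.8+ is assumed for the three-argument `pow` modular inverse.

```python
def interpolate_at(points, b, p):
    """Given k points (a_j, f(a_j)) of a polynomial of degree < k over
    F_p (p prime, the a_j distinct), recover f(b) by Lagrange
    interpolation.  Needs Python 3.8+ for pow(den, -1, p)."""
    total = 0
    for j, (aj, yj) in enumerate(points):
        num, den = 1, 1
        for m, (am, _) in enumerate(points):
            if m != j:
                num = num * (b - am) % p
                den = den * (aj - am) % p
        # Accumulate yj * prod_{m != j} (b - a_m)/(a_j - a_m), mod p.
        total = (total + yj * num * pow(den, -1, p)) % p
    return total
```

With only ${k-1}$ points, two distinct degree-${<k}$ polynomials can agree on all of them and still differ at ${b}$, which is why ${k-1}$ values are never enough.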

The conference dinner had several speeches and toasts and a joint birthday cake.

## The Second Day

${\bullet}$ Neil Immerman spoke on “Algebra, Logic and Complexity.” He began by noting that he met Eric at the same joint STOC and Structures meeting in 1986 which I mentioned above. He started with how the descriptive complexity program refined notions of reductions to make them very sharp, culminating with uniform-${\mathsf{AC^0}}$ reductions formalized via first-order logic. This covered his 2009 paper with Eric and three others, showing that under standard complexity assumptions there are exactly six equivalence classes of Boolean-based constraint satisfaction problems under ${\mathsf{AC^0}}$ isomorphisms.

${\bullet}$ Nati Linial spoke on “Hypertrees.” He has been Mike’s most frequent collaborator—21 joint papers and counting—and vice-versa. He related how they have visited each other often since they were post-docs together at UCLA. The talk involved large matrices whose nonzero entries are ${+1}$ or ${-1}$.

${\bullet}$ Toniann Pitassi was supposed to speak on “Strongly Exponential Lower Bounds for Monotone Computation” but she was unable to travel at the last minute.

${\bullet}$ Ramamohan Paturi spoke on “Satisfiability Algorithms and Circuit Lower Bounds.” This covered Mike’s algorithmic ideas in the famous “PPSZ” paper, “An Improved Exponential-Time Algorithm for ${k}$-SAT.”
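For flavor, here is a toy sketch of the random-order, unit-propagation idea underlying the PPZ and PPSZ algorithms: visit the variables in a random order, obey any clause that has become a unit clause, and guess otherwise. This is my own simplification—the actual PPSZ algorithm adds bounded-resolution preprocessing and a delicate analysis—and all names are mine.

```python
import random

def ppz_once(clauses, n, rng):
    """One trial: visit variables 1..n in random order; if some clause
    has shrunk to a unit clause forcing the current variable, obey it,
    otherwise set it by a coin flip.  Clauses are lists of nonzero ints,
    with -v meaning the negation of variable v."""
    assign = {}
    order = list(range(1, n + 1))
    rng.shuffle(order)
    for v in order:
        forced = None
        for clause in clauses:
            if any(assign.get(abs(lit)) == (lit > 0) for lit in clause):
                continue  # clause already satisfied
            live = [lit for lit in clause if abs(lit) not in assign]
            if live == [v]:
                forced = True
            elif live == [-v]:
                forced = False
        assign[v] = rng.random() < 0.5 if forced is None else forced
    sat = all(any((lit > 0) == assign[abs(lit)] for lit in clause)
              for clause in clauses)
    return assign if sat else None

def ppz_solve(clauses, n, tries=2000, seed=0):
    """Repeat trials; the PPZ/PPSZ analysis shows that for satisfiable
    k-CNF this needs far fewer than 2^n expected trials."""
    rng = random.Random(seed)
    for _ in range(tries):
        a = ppz_once(clauses, n, rng)
        if a is not None:
            return a
    return None
```

The clever part of the papers is not the procedure but the bound on how often a trial succeeds.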

${\bullet}$ Lance Fortnow closed the technical part of the meeting by talking about “Connecting Randomness and Complexity.” He started by noting that he and Eric have gone a combined 61-for-62 in attending the Structures/Complexity conferences, Lance having missed only 2012 in Porto, and told more personal stories. His talk covered Eric’s work involving degrees of Kolmogorov complexity-based randomness, which I’ve noted above.

I had to drive back to Buffalo early the next day, so I missed the festivities on the second evening and a third-day brunch at Eric’s house. Overall it was a really nice and convivial time. It was great seeing friends again, and one conversation in particular has proved valuable to me since then: I heard Eric and Harry and Michal and Jack Lutz and perhaps Mario Szegedy or Mohan Paturi talking about how Kolmogorov complexity is “not so concrete.” Without giving the actual details I outlined a practical case where one would want a definite, concrete measure. I thank DIMACS for sponsorship and the organizers for putting together a great event.

## Open Problems

I’ve just now returned from Poland, whose classic toast “Sto lat!” means, “May you live one hundred years.” Accordingly we wish that ${E + M = C^2}$ may come to denote their ages in Roman numerals.
