Blasts From the Past
Is the “Forsch” awakening in complexity theory?
Max von Sydow starred as the chess-playing knight in Ingmar Bergman’s iconic 1957 film The Seventh Seal. He has the first line of dialogue in this year’s Star Wars VII: The Force Awakens:
This will begin to make things right.
His character, Lor San Tekka, hands off part of a secret map before… well, that’s as much as we should say if you haven’t yet seen the movie. Highly recommended.
Today Dick and I marvel at how some of this past year’s best results hark back to the years when Star Wars first came out.
That was in 1977, followed by The Empire Strikes Back in 1980 and Return of the Jedi in 1983. Von Sydow wasn’t in any of those, nor the “prequel” films in 1999–2005. He starred as Jesus in 1965’s The Greatest Story Ever Told and as arch-villain Ernst Stavro Blofeld in the 1983 James Bond film Never Say Never Again. He earned a 2011 Oscar nomination for Best Supporting Actor but lost to the equally venerable Christopher Plummer, with whom he’d co-starred in the 2007 film Emotional Arithmetic. He is appearing in season 6 of the TV series Game of Thrones and is apparently in the next Star Wars movie too. We should all be so active so long.
Both of us remember seeing Star Wars in 1977 as if it were yesterday. That is now as long ago as 1939—the beginning of World War II—was then. If you double back from 1957, you get 1899. This can make one feel old. Or it can make one feel energized by the Forsch—that is the root of ‘research’ in German. Let’s see some things that 2015 gave us and that 2016 might have in store.
EDC and GI
1957 is when Paul Erdős first published his Discrepancy Conjecture, though he dated it “twenty-five years ago” to 1932. We have already covered Terry Tao’s stunning proof of this conjecture. The latest news on Tao’s blog from early this month covers attacks on related conjectures.
1977 marks the beginning of László Babai’s citable work on the Graph Isomorphism problem according to the bibliography of his full paper. Beginning to work through it now, we are again struck by how firmly it stands on the foundation of Eugene Luks’s 1980 algorithm for the bounded-degree case. Babai and Luks mapped the terrain further in two 1983 papers, one including William Kantor. Other major references also date to the initial Star Wars years.
One of them is a 1981 paper by Peter Cameron, who was a co-advisor when I came to Merton College, Oxford, that year, and who has recently made a flurry of posts on his own blog about travels to New Zealand and Australia. Although Peter’s paper does not reference Luks’s algorithm, its results identify the shield to extending it: a class of permutation groups with highly symmetric actions and no “nice” subgroups of small index. What Laci’s algorithm does resembles the isomorphic climactic plot elements of movies IV, VI, and VII: either there is a polylog-size local irregularity and exhaustively firing on it gets the whole thing to split (IV and VII), or the absence of one enables building a global automorphism to reach a known case which is like taking out the shield generator (VI).
We could have included “local-global” in our roundup of general themes. We should add that although Peter’s result depends on the Classification, Laci’s section 13.1 outlines a second wave of attack that avoids needing either for its analysis. A primary dependence on the classification theorem remains, but for reasons we discussed before, we feel the possibility of error from that is only a phantom menace.
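To appreciate the scale of Laci’s claimed quasipolynomial bound, it helps to see the naive approach it leaves behind: a brute-force check over all $n!$ vertex bijections. This is a toy sketch of our own in Python, not anything from Babai’s paper, and it is hopeless beyond a handful of vertices.

```python
# Brute-force Graph Isomorphism: try all n! vertex bijections.
# Graphs are dicts mapping vertex -> set of neighbors, on vertices 0..n-1.
from itertools import permutations

def are_isomorphic(g, h):
    n = len(g)
    if n != len(h):
        return False
    verts = range(n)
    for perm in permutations(verts):
        # perm maps vertex v of g to vertex perm[v] of h;
        # check that edges and non-edges are both preserved
        if all((perm[u] in h[perm[v]]) == (u in g[v])
               for v in verts for u in verts if u != v):
            return True
    return False

# A 4-cycle and a relabeling of it are isomorphic; a 4-path is not.
c4 = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2}}
c4_relabeled = {0: {2, 3}, 1: {2, 3}, 2: {0, 1}, 3: {0, 1}}
p4 = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2}}
print(are_isomorphic(c4, c4_relabeled))  # True
print(are_isomorphic(c4, p4))            # False
```

The $n!$ blowup is exactly what Luks tamed for bounded degree and what Babai’s group-theoretic machinery now tames in general.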
1980 is when William Masek and Michael Paterson shaved a $\log n$ factor off the previous $O(n^2)$-time algorithms for computing the edit distance between two length-$n$ strings. The question for 35 years has been, can we save more to get $O(n^{2-\epsilon})$ time for some fixed $\epsilon > 0$? There seemed to be no solid reason why not, until Arturs Backurs and Piotr Indyk earlier this year applied a connection first shown by Ryan Williams to the strong exponential time hypothesis (SETH). Namely, an $O(n^{2-\epsilon})$-time algorithm for edit distance implies the same for Ryan’s “Orthogonal Vectors” problem, which in turn implies an algorithm for satisfiability of $n$-variable, $m$-clause CNF formulas in time $2^{(1-\delta)n}\, m^{O(1)}$ for some fixed $\delta > 0$, which SETH says cannot exist.
The heart is a new degree of fine-grained reduction between problems in $\mathsf{P}$. The point is that the problems involved are all in the bedrock of polynomial time, not flung into the space of NP-completeness. As we discussed last June, this is a puzzling kind of revenge of the SETH.
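For concreteness, here is the textbook quadratic dynamic program that Masek and Paterson improved by a log factor and that Backurs–Indyk show is essentially the best possible under SETH. A minimal sketch of our own, not their algorithm:

```python
# Classical O(n^2)-time, O(n)-space dynamic program for edit distance.
def edit_distance(s, t):
    m, n = len(s), len(t)
    # prev[j] = edit distance between s[:i-1] and t[:j] from the last row
    prev = list(range(n + 1))
    for i in range(1, m + 1):
        cur = [i] + [0] * n
        for j in range(1, n + 1):
            cur[j] = min(prev[j] + 1,      # delete s[i-1]
                         cur[j - 1] + 1,   # insert t[j-1]
                         prev[j - 1] + (s[i - 1] != t[j - 1]))  # substitute
        prev = cur
    return prev[n]

print(edit_distance("kitten", "sitting"))  # 3
```

The inner double loop is the $n \times n$ table that, per Backurs–Indyk, cannot be shrunk to $O(n^{2-\epsilon})$ without refuting SETH.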
1981 is when Péter Frankl and Richard Wilson explicitly constructed uniformly succinct families of graphs of size $N$ having no cliques or independent sets of size $k$, where
$$k = 2^{O(\sqrt{\log N \log\log N})}.$$
The lower $k$ as a function of $N$, the harder this is to achieve. Erdős obtained $k = 2\log_2 N$ in his 1947 paper but this famous first use of the probabilistic method does not give an explicit family. “Explicit” could mean definable by a formula of first-order logic, but consensus requires only that whether node $i$ has an edge to node $j$ is decided in $(\log N)^{O(1)}$ time, which is what we mean by “uniformly succinct.”
Frankl and Wilson’s upper bound stood for over 30 years until 2012 when Boaz Barak, Anup Rao, Ronen Shaltiel, and Avi Wigderson achieved $k = 2^{(\log N)^{o(1)}}$, indeed $k = 2^{(\log N)^{\epsilon}}$ for every fixed $\epsilon > 0$. Now Gil Cohen has proved a major further step by constructing graphs with $k = 2^{(\log\log N)^{O(1)}}$. This is the first explicit construction giving bounds that are quasipolynomially related to the nonconstructive bound of Erdős. Such graphs of course witness that Frank Ramsey’s famous theorem needs a higher $N$ for that value of $k$.
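To illustrate what “uniformly succinct” buys, here is a toy example using Paley graphs, a classical explicit family that is far weaker than the constructions above (it is not Frankl–Wilson’s or Cohen’s): adjacency between nodes $i$ and $j$ is decided by one modular exponentiation, time polynomial in $\log N$, and for tiny sizes we can brute-force the largest clique or independent set.

```python
# Paley graph on a prime p = 1 (mod 4): i ~ j iff i - j is a nonzero
# quadratic residue mod p, testable via Euler's criterion in polylog(p) time.
from itertools import combinations

def paley_adjacent(i, j, p):
    d = (i - j) % p
    return d != 0 and pow(d, (p - 1) // 2, p) == 1

def max_homogeneous(p):
    """Brute-force largest clique or independent set size (tiny p only)."""
    best = 1
    for size in range(2, p + 1):
        found = False
        for subset in combinations(range(p), size):
            pairs = [paley_adjacent(a, b, p)
                     for a, b in combinations(subset, 2)]
            if all(pairs) or not any(pairs):  # clique or independent set
                found = True
                break
        if not found:
            return best
        best = size
    return best

print(max_homogeneous(13))  # largest homogeneous set in the Paley graph on 13 nodes
```

The exhaustive check is of course exponential; the point is only that the graph itself is described by a polylog-time adjacency predicate rather than a stored edge list.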
3 + 1/86
1984 is when Norbert Blum proved a lower bound of $3n - 3$ on the circuit complexity of an explicitly-given family of $n$-variable Boolean functions. This is for circuits using the full set of binary gates, not just $\wedge, \vee$ (with free use of $\neg$) for which bounds have gone as high as $5n - o(n)$ before meeting resistance.
Now Magnus Find, Alexander Golovnev, Edward Hirsch, and Alexander Kulikov have beaten the “3” — by “1/86.” That is, they have constructed an explicit family of Boolean functions that require circuit size
$$\left(3 + \frac{1}{86}\right)n - o(n).$$
How impressed should we be? Their paper has several new and striking ideas, including that allowing circuits to have cycles (using just $\oplus$ and $\neg$ gates to yield linear equations) increases the toolbox of recursions for inductive proofs. However, it is still far from a healthy jump to, say, $10n$, let alone a nonlinear lower bound. Our next featured result from this past year gives a bit of pause.
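To see what “circuit complexity over the full binary basis” means in miniature, here is a sketch of our own: represent functions as truth tables and search for the fewest two-input gates, drawn from all 16 binary operations, needed to build a target. We restrict to tree circuits (formulas) so the search is exact; for the tiny example below this coincides with circuit size. Exhaustive search dies almost immediately as $n$ grows, which is why explicit bounds like $3n - 3$ are so hard-won.

```python
# Minimal number of two-input gates (any of the 16 binary ops) in a
# tree circuit computing `target`, a 2^n-bit truth-table integer.
def min_formula_gates(target, n, limit=6):
    size = 1 << n
    mask = (1 << size) - 1
    # truth table of input x_i as a bitmask over all 2^n assignments
    inputs = [sum(((a >> i) & 1) << a for a in range(size)) for i in range(n)]
    best = {t: 0 for t in inputs}
    if target in best:
        return 0
    by_cost = {0: set(inputs)}
    for cost in range(1, limit + 1):
        by_cost[cost] = set()
        for i in range(cost):          # root gate splits cost as 1 + i + j
            j = cost - 1 - i
            for f in by_cost[i]:
                for g in by_cost[j]:
                    for op in range(16):
                        # op's four bits are its truth table on
                        # (f,g) = (0,0), (0,1), (1,0), (1,1)
                        h = 0
                        if op & 1: h |= ~f & ~g
                        if op & 2: h |= ~f & g
                        if op & 4: h |= f & ~g
                        if op & 8: h |= f & g
                        h &= mask
                        if h not in best:
                            best[h] = cost
                            by_cost[cost].add(h)
        if target in best:
            return best[target]
    return None

n = 3
parity = sum((bin(a).count("1") % 2) << a for a in range(1 << n))
print(min_formula_gates(parity, n))  # x1 XOR x2 XOR x3 needs 2 gates
```

Even verifying a bound like $(3 + 1/86)n$ by search is out of the question; the lower-bound proofs must instead reason about all circuits at once, which is what the gate-elimination and cyclic-circuit machinery is for.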
The Epsilon Strikes Back
A few years ago we hailed a pair of independent ‘breakthroughs’ on the exponent $\omega$ of the $O(n^{\omega})$ time for multiplying two $n \times n$ matrices that had been achieved by Andrew Stothers and Virginia Williams, the latter getting down under 2.372873 (correcting an earlier claim of 2.3727). This was nudged down to 2.372864 by François Le Gall last year.
A new hope for getting perceptibly toward 2, or at least achieving $2.373 - \epsilon$ for non-microscopic values of $\epsilon$, is however apparently quashed by a paper of Le Gall with Andris Ambainis and Yuval Filmus which came out at STOC 2015. They show that the particular technique of these papers cannot even reach down to 2.3725. More foreboding is that a wider class of methods cannot get down to 2.3078.
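The whole line of work descends from Strassen’s 1969 observation that $2 \times 2$ blocks can be multiplied with 7 multiplications instead of 8, giving $\omega \le \log_2 7 \approx 2.807$. A minimal recursive sketch for $2^k \times 2^k$ integer matrices, just to show where the seven products come from:

```python
# Strassen's algorithm for square matrices of power-of-two size.
def strassen(A, B):
    n = len(A)
    if n == 1:
        return [[A[0][0] * B[0][0]]]
    h = n // 2
    def quad(M, r, c):  # h-by-h submatrix with top-left corner (r, c)
        return [row[c:c + h] for row in M[r:r + h]]
    def add(X, Y):
        return [[x + y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]
    def sub(X, Y):
        return [[x - y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]
    A11, A12, A21, A22 = quad(A,0,0), quad(A,0,h), quad(A,h,0), quad(A,h,h)
    B11, B12, B21, B22 = quad(B,0,0), quad(B,0,h), quad(B,h,0), quad(B,h,h)
    # the seven recursive products
    M1 = strassen(add(A11, A22), add(B11, B22))
    M2 = strassen(add(A21, A22), B11)
    M3 = strassen(A11, sub(B12, B22))
    M4 = strassen(A22, sub(B21, B11))
    M5 = strassen(add(A11, A12), B22)
    M6 = strassen(sub(A21, A11), add(B11, B12))
    M7 = strassen(sub(A12, A22), add(B21, B22))
    C11 = add(sub(add(M1, M4), M5), M7)
    C12 = add(M3, M5)
    C21 = add(M2, M4)
    C22 = add(sub(add(M1, M3), M2), M6)
    top = [r1 + r2 for r1, r2 in zip(C11, C12)]
    bot = [r1 + r2 for r1, r2 in zip(C21, C22)]
    return top + bot

print(strassen([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19, 22], [43, 50]]
```

The modern bounds near 2.373 come from the laser method applied to far larger tensors, not from tricks of this hands-on kind, which is precisely the class of methods whose limits Ambainis, Filmus, and Le Gall analyze.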
Getting back to the circuit lower bounds, it is thrilling to beat $3n$ but sobering that a paper with three new creative ideas beat the ‘3’ by only some kind of $\epsilon$. The novelty of the methods will hopefully avert an attack of clone papers pushing it no higher than, say, $3.0125n$. We can also mention a new paper by Daniel Kane and Ryan Williams on non-linear lower bounds for threshold circuits of depths 2 and 3.
We could perhaps devote a whole separate post to developments in quantum complexity. The simplest to state is a new mainstream complexity class upper bound for quantum Arthur-Merlin games with unentangled provers:
$$\mathsf{QMA}(k) \subseteq \mathsf{EXP}.$$
The paper by Martin Schwarz improves the previous upper bound of $\mathsf{NEXP}$. (Update: Per this the result is in doubt.) We can also mention a new paper by Le Gall with Shojo Nakajima that improves ten-year-old bounds for quantum triangle-finding on sparse graphs.
The headline-getting progress, however, has been in efforts to build larger-scale quantum computers. In April, IBM announced advances in fabricating quantum circuits that detect errors.
Earlier this month, Google released a paper on experiments with their quantum computer built by D-Wave Systems claiming a speedup over classical methods. See also this interview with Scott Aaronson and a comment with remarks on the paper in Scott’s own post on it. We have been intending for a long time to cover the debate over whether speedups claimed by D-Wave are necessarily quantum. At least we can say this marks a return of the company, led by Geordie Rose and Vern Brownell, to the driving side of that debate.
We look forward to $2^{11} - 2^5$, that is next year. Perhaps it will be the year that leads to some real progress on lower bounds. Who knows?
We wish everyone a safe New Year’s Eve and all the best for 2016—may the Forsch be with you.
[Update on doubt about QMA(k) result, Robin->Richard Wilson, some word and format changes]