Algorithms At Oxford
A report on a recent workshop at Oxford University
Joël Ouaknine, Georg Gottlob, and Andreas Pieris are faculty at Oxford University in St. John’s College. They recently held a workshop on algorithms to celebrate the creation of an algorithms group within the computer science department.
Today I wish to give a report on the talks that were given at the workshop.
The workshop was a joy to attend. It was well organized and well attended, and the talks were terrific. The talks covered algorithms in the broadest sense. I was honored to be included among the speakers, but will not say anything about my talk. Unfortunately I could only attend about two-thirds of the talks, and therefore will only comment on those I had the pleasure of seeing in person. I heard that all the talks were great, and wish I could have been there for all of them. The ones I missed were given by Leslie Goldberg, Monika Henzinger, and Elias Koutsoupias.
Joël told me they plan to put the talks on the web—I will update you if and when this happens.
Here are those talks that I did get to hear.
- Paul Goldberg: Towards Zillions of PSPACE-completeness Results. Paul spoke on non-zero-sum games and the complexity of solving them. He focused mostly on various versions of Brouwer's Fixed Point Theorem in a discrete setting. This leads to quite interesting questions about how to encode certain path-following problems, and he had exquisitely beautiful two- and three-dimensional diagrams of colored blocks arranged in complex patterns. The key insight is: any algorithm that solves these problems by a "path following method" is in effect solving a PSPACE-complete problem.
What does it mean for an algorithm to be complete? The algorithm is working on a problem for which there are multiple correct answers—in this case, equilibria. Reproducing the particular outputs favored by the algorithm is the PSPACE-complete task.
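To make "path following" concrete, here is a toy one-dimensional version of the discrete Brouwer setup, in Python. This is my own illustration with an arbitrary example function, not a construction from Paul's talk: the direction of $f(x)-x$ points right at $0$ and left at $1$, so some grid cell must flip, and scanning for the flip is a path-following method.

```python
def discrete_fixed_point(f, n):
    """Approximate fixed point of f: [0,1] -> [0,1] on a grid of n+1
    points, found by following the sign of f(x) - x left to right.
    The direction is +1 at x = 0 and -1 at x = 1 (unless f(1) = 1),
    so by a discrete Brouwer argument some cell must flip."""
    def direction(i):
        x = i / n
        return 1 if f(x) >= x else -1

    for i in range(n):                  # walk the path of grid cells
        if direction(i) == 1 and direction(i + 1) == -1:
            return i / n                # sign flips inside this cell
    return 1.0                          # f(x) >= x everywhere, so f(1) = 1

# Arbitrary example map; its fixed point lies between 0.4 and 0.5.
x = discrete_fixed_point(lambda t: 0.5 * t ** 3 + 0.4, 10 ** 5)
```

In two or more dimensions the same idea needs the colored-block encodings Paul showed, and following the path can take exponentially many steps, which is where the hardness enters.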
- Joan Feigenbaum: DISSENT: Accountable, Anonymous Communication on the Internet. Joan spoke on a large DARPA-sponsored project that she is involved with on how to communicate on the Internet and really stay anonymous. There are ad-hoc methods in wide use today, but she points out that they are well known to be attackable. Often simple traffic-analysis methods can defeat the claims that users remain hidden. Her group is working on a system, which they call DISSENT, that has provable properties. This was a great example of taking basic tools from modern crypto and putting them together in ways that solve an important problem. The project goal is to make the system practical, and to have it work with huge numbers of users.
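DISSENT's starting point is the dining-cryptographers idea: with pairwise shared random pads, a message can be published so that no observer can tell which participant sent it. Here is a minimal one-round sketch in Python, my own illustration of the underlying primitive and not the DISSENT protocol itself, which adds accountability and scalability on top.

```python
import secrets

def dc_net_round(n_players, sender, message):
    """One dining-cryptographers round: every pair of players shares a
    random pad; each player announces the XOR of its pads, and the
    sender also XORs in the message.  Each pad appears in exactly two
    announcements, so XOR-ing all announcements cancels the pads and
    reveals the message -- without revealing who the sender was."""
    n_bytes = len(message)
    pads = {(i, j): secrets.token_bytes(n_bytes)
            for i in range(n_players) for j in range(i + 1, n_players)}

    def xor(a, b):
        return bytes(p ^ q for p, q in zip(a, b))

    def announce(i):
        out = bytes(n_bytes)
        for (a, b), pad in pads.items():
            if i in (a, b):
                out = xor(out, pad)
        if i == sender:
            out = xor(out, message)
        return out

    result = bytes(n_bytes)
    for i in range(n_players):
        result = xor(result, announce(i))
    return result

msg = b"meet at noon"
recovered = dc_net_round(5, sender=2, message=msg)   # equals msg
```

The traffic-analysis weakness Joan mentioned is exactly what this avoids: the announcements are information-theoretically independent of who the sender is.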
- Peter Jeavons: Discrete Optimisation—The Big Picture. Peter took the big picture, as he likes to say. He also focused on the last 100 days of research in this classic area. In his view almost all hard optimization problems can be viewed as constraint satisfaction problems (CSPs). He then showed how the structure of CSPs is related in a beautiful way to classic universal algebra. While there has been great progress in the recent past—100 days—some of the key ideas date back to work of Emil Post from 1945. A great combination of a grand vision and hard concrete results.
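To see the CSP framing concretely, here is a minimal backtracking CSP solver in Python. It is my own toy sketch, nothing from Peter's talk; constraints are given extensionally as sets of allowed value pairs, and the example is 2-coloring a three-vertex path.

```python
def solve_csp(variables, domains, constraints):
    """Backtracking search.  `constraints` is a list of pairs
    ((u, v), allowed) where `allowed` is a set of permitted
    (value_u, value_v) combinations."""
    def consistent(assign):
        return all((assign[u], assign[v]) in allowed
                   for (u, v), allowed in constraints
                   if u in assign and v in assign)

    def backtrack(assign):
        if len(assign) == len(variables):
            return dict(assign)
        var = next(v for v in variables if v not in assign)
        for val in domains[var]:
            assign[var] = val
            if consistent(assign):
                result = backtrack(assign)
                if result is not None:
                    return result
            del assign[var]
        return None

    return backtrack({})

# 2-colour the path a - b - c: adjacent variables must differ.
neq = {(0, 1), (1, 0)}
sol = solve_csp(["a", "b", "c"], {v: [0, 1] for v in "abc"},
                [(("a", "b"), neq), (("b", "c"), neq)])
```

The algebraic theory Peter described classifies which constraint languages make this search tractable and which make it NP-hard.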
- Raphaël Clifford: Lower Bounds for Streaming Problems. Raphaël spoke on lower bounds for streaming problems—big surprise—but the interesting part was that he could use methods that were developed already. These are lower bound methods for data structure problems based on the so called “cell-probe” model. This model assumes that there is a random-access memory that can be read and written one cell at a time. There already are a multitude of results for data structures based on this model, but somewhat surprisingly Raphaël is able to adapt them to solve his streaming problems.
- Phil Blunsom: A Bayesian Approach to Learning the Structure of Human Languages. Phil spoke on the acquisition of languages. His group is one of the best in the world at learning languages without any prior models or assumptions about the language. His group gets scores on known test sets that are better than anyone else’s.
- Georg Gottlob: Hypergraph Transversals and Monotone Boolean Formula Dualization. Georg gave a wonderful talk on a very old problem about hypergraphs. Recall a hypergraph is nothing more than a finite collection of subsets of a set. You can think of graphs, no "hyper," as the special case where the sets are all pairs of vertices. Many problems can be expressed as properties of hypergraphs. One of the big open questions in the area is the complexity of deciding whether one hypergraph is the transversal hypergraph of another. The surprising result, due to Fredman and Khachiyan, is that this can be done in time

$$n^{o(\log n)},$$

where $n$ is the combined size of the two hypergraphs. Georg is able to generalize this result and show that it can be done in small space:

$$O(\log^2 n).$$
This is an unusual space bound, and the result seems to be quite neat.
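For concreteness, here is a brute-force Python sketch of the transversal hypergraph, that is, the family of all minimal hitting sets. It is mine, and exponential in the number of vertices, so nothing like the bounds just mentioned; on a triangle the transversal hypergraph turns out to be the triangle itself.

```python
from itertools import combinations

def transversal_hypergraph(vertices, edges):
    """All minimal transversals (hitting sets) of a hypergraph, found
    by enumerating vertex subsets in order of size -- brute force,
    suitable only for tiny examples."""
    def hits(s):
        return all(s & e for e in edges)

    minimal = []
    for k in range(len(vertices) + 1):
        for combo in combinations(sorted(vertices), k):
            s = set(combo)
            # keep s only if it hits every edge and contains no
            # smaller transversal already found
            if hits(s) and not any(m < s for m in minimal):
                minimal.append(s)
    return minimal

triangle = [{1, 2}, {2, 3}, {1, 3}]
tr = transversal_hypergraph({1, 2, 3}, triangle)   # the triangle again
```

The decision problem Georg studies asks whether a second given hypergraph equals this family, without ever being able to afford listing it.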
- Joël Ouaknine: Decision Problems for Linear Recurrence Sequences. Joël spoke on a problem that has been open for over 80 years: the Skolem Problem. The Skolem question is just this: consider a linear recurrence sequence $u_0, u_1, u_2, \dots$ over the rationals. Does there exist an $n$ so that $u_n = 0$? In more detail, the sequence satisfies

$$u_{n+k} = a_{k-1} u_{n+k-1} + \cdots + a_1 u_{n+1} + a_0 u_n,$$

where the coefficients $a_0, \dots, a_{k-1}$ are rationals, and the initial values $u_0, \dots, u_{k-1}$ are also rationals.
He quoted Terence Tao—as we did—saying:
It is faintly outrageous that this problem is still open; it is saying that we do not know how to decide the Halting Problem even for ‘linear’ automata!
I agree. I have worked on some related theorems that are now decades old—see here—with Ravi Kannan, and am well aware of the difficulty of the area. Joël gave a masterful explanation of the area, stated what is known about it, and presented some new types of barriers to further advances. I will say more about this type of barrier in the next section.
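One direction of the Skolem Problem is easy to automate: run the recurrence with exact arithmetic and look for a zero. The Python sketch below, with a made-up example recurrence of my own, is only a semi-decision procedure; the entire difficulty, still open after 80 years, is knowing when one may stop and safely answer "never."

```python
from fractions import Fraction
from itertools import islice

def lrs(coeffs, initial):
    """Generate u_0, u_1, ... satisfying
    u_{n+k} = a_{k-1} u_{n+k-1} + ... + a_0 u_n,
    with coeffs = (a_0, ..., a_{k-1}), in exact rational arithmetic."""
    window = [Fraction(v) for v in initial]
    yield from window
    while True:
        nxt = sum(a * u for a, u in zip(coeffs, window))
        yield nxt
        window = window[1:] + [nxt]

def first_zero(coeffs, initial, bound):
    """Return the first n <= bound with u_n = 0, else None.  Only a
    semi-decision: a None answer proves nothing about larger n."""
    for n, u in enumerate(islice(lrs(coeffs, initial), bound + 1)):
        if u == 0:
            return n
    return None

# u_n = n - 3 satisfies u_{n+2} = 2 u_{n+1} - u_n, so u_3 = 0.
hit = first_zero(coeffs=(-1, 2), initial=(-3, -2), bound=100)   # -> 3
```

Exact rationals matter here: floating point could report a spurious zero, or miss one, and the question is about exact equality.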
- Robert Tarjan: Algorithm Design: Theory and Practice. Bob spoke on some new results on data structures and algorithms that are driven by practical questions. One example was on balanced binary search trees—something Bob is the world expert on. He re-visited some of his classic work on splay trees, and showed how they can be improved in two ways. The data structures can be made to handle very uneven frequencies of access and still yield better performance. Also they can be made more concurrent. Today such data structures are used in many-core applications, in which the old method had a hot spot at the root. That is, all updates would touch the root. Bob’s new result yields a data structure in which almost all updates are near the leaves, and this yields very good concurrent performance.
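For readers who have not seen splay trees: below is a Python rendering of the classic top-down splay from Sleator and Tarjan's original work, the textbook version and not Bob's new variants. Note how every access and insert rewrites the root; that is exactly the hot spot that hurts concurrency and that his new designs avoid.

```python
class Node:
    """Plain binary search tree node."""
    __slots__ = ("key", "left", "right")
    def __init__(self, key):
        self.key = key
        self.left = self.right = None

def splay(root, key):
    """Top-down splay: restructure the tree so the node with `key`
    (or the last node on its search path) becomes the root."""
    if root is None:
        return None
    header = Node(None)           # temporary header for the L and R trees
    left = right = header
    t = root
    while True:
        if key < t.key:
            if t.left is None:
                break
            if key < t.left.key:          # zig-zig: rotate right
                y = t.left
                t.left = y.right
                y.right = t
                t = y
                if t.left is None:
                    break
            right.left = t                # link current root into R tree
            right = t
            t = t.left
        elif key > t.key:
            if t.right is None:
                break
            if key > t.right.key:         # zag-zag: rotate left
                y = t.right
                t.right = y.left
                y.left = t
                t = y
                if t.right is None:
                    break
            left.right = t                # link current root into L tree
            left = t
            t = t.right
        else:
            break
    left.right = t.left                   # reassemble the three pieces
    right.left = t.right
    t.left = header.right
    t.right = header.left
    return t

def insert(root, key):
    """Splay `key`'s neighborhood to the root, then split around it."""
    if root is None:
        return Node(key)
    root = splay(root, key)
    if key == root.key:
        return root                       # already present
    node = Node(key)
    if key < root.key:
        node.left, node.right = root.left, root
        root.left = None
    else:
        node.right, node.left = root.right, root
        root.right = None
    return node

def inorder(t):
    """Keys in sorted order."""
    return inorder(t.left) + [t.key] + inorder(t.right) if t else []
```

Accessing a key splays it to the root, so frequently used keys stay near the top, which is what gives splay trees their adaptive behavior under uneven access frequencies.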
A Type Of Proof Barrier
In computer science theory we have a variety of barriers that explain why it is hard to prove something. Prominent examples that come to mind are relativization (oracle) results, the natural proofs barrier, and the algebrization barrier. There is another type of barrier: If you solve X, then you solve Y; but Y is really hard.
Ken and I pointed out one example recently in a post on the famous Hartmanis-Stearns conjecture. We noted that a proof of the conjecture would entail a super-linear Turing machine lower bound on integer multiplication, a bound that on current knowledge seems far out of reach.
Joël Ouaknine's work with James Worrell, and recently also with Matt Daws, does the same for the Skolem conjecture. They show that either resolution of the Skolem problem would solve deep open problems in number theory. If Skolem's conjecture is true, then there are new results that seem beyond the reach of current methods, and the same if the conjecture is false. The pivotal concept is the Lagrange value of a real number $x$:

$$L_{\infty}(x) = \liminf_{n \to \infty} \; n \, \| n x \|,$$

where $\|y\|$ denotes the distance from $y$ to the nearest integer. For "most" $x$ this is zero, but for most (not just many) important irrational numbers precious little is known, even whether $L_{\infty}(x)$ is computable. What their work shows is that for the real angular parts of certain classes of complex algebraic numbers on the unit circle, decidability of Skolem's problem (for order 6 or higher) implies computability for one class, while undecidability implies positivity of a related quantity for another class. This is a very neat state of affairs.
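One can at least explore Lagrange values numerically. The sketch below is my own heuristic; a finite minimum of $n \, \|nx\|$ over a range of $n$ can suggest but never certify a liminf. It recovers the classical value $1/\sqrt{5} \approx 0.447$ for the golden ratio; small $n$ are discarded since a liminf ignores any finite prefix.

```python
def lagrange_estimate(x, lo, hi):
    """Minimum of n * ||n x|| over lo <= n <= hi, where ||y|| is the
    distance from y to the nearest integer.  A crude stand-in for
    L(x) = liminf_{n -> oo} n * ||n x||."""
    best = float("inf")
    for n in range(lo, hi + 1):
        dist = abs(n * x - round(n * x))
        best = min(best, n * dist)
    return best

golden = (1 + 5 ** 0.5) / 2
est = lagrange_estimate(golden, 100, 100_000)   # close to 1/sqrt(5)
```

The good $n$ here are the Fibonacci numbers, the denominators of the golden ratio's continued-fraction convergents; for a "typical" real number the analogous minima drift toward zero instead.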
One of his solved special cases is even neater, for those who are fans of the counting hierarchy:
Theorem 1 Whether a linear recurrence of order at most 5 has no negative terms is decidable, and belongs to the complexity class

$$\mathsf{coNP}^{\mathsf{PP}^{\mathsf{PP}^{\mathsf{PP}}}},$$

which lies within the counting hierarchy.
At dinner the Oxford people told the following "joke": "How many Oxford dons does it take to change a light bulb?" The answer is: "Change?"
The context of this is the difficulty of creating the algorithms group. Ken remembers the discussions around creating a computer science department in the 1980s. They seem to be off to a great start, and look set to become one of the top groups in the world in the near future. Cheers to them.
Does Skolem’s Problem have more direct implications for issues in complexity lower bounds?