Game Changing Conjectures In Mathematics
As a CMU alum I think the Steelers are a lock, but those Packers are dangerous.
Joseph Oesterlé and David Masser are two mathematicians who are world-renowned for their wonderful conjecture, the ABC conjecture. Serge Lang called it "one of the best conjectures of the century"; he was referring to the twentieth century, but it may well earn the title in this century too.
Today I really want to talk about the Super Bowl, since I am a huge football fan. But I think that this is hard to fit under the umbrella of our usual topics, so instead I will talk about game changing.
The ABC conjecture is one of those surprises that changes the way people think about an area, which in this case is number theory. One shock, I believe, is that the conjecture was only made late last century, in 1985. That a conjecture of such importance, especially a conjecture about integers, could be found so recently is extremely exciting. Perhaps there are other basic conjectures lurking out there, conjectures that will change the "game"—help change the way that we think of an entire area of mathematics. Perhaps.
I must add a story of how I once caused a stir on the ABC conjecture, without any intention of doing so. Jin-Yi Cai and Zeke Zalcstein and I had proved a result on the following problem: Suppose that $A_1, \dots, A_k$ and $B$ are matrices over the rationals, and that the $A_i$ commute pairwise: $A_i A_j = A_j A_i$ for all $i, j$.
We were able to show that it is decidable, actually in polynomial time, whether or not there are natural numbers $x_1, \dots, x_k$ so that
$$A_1^{x_1} \cdots A_k^{x_k} = B.$$
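To make the question concrete, here is a tiny brute-force sketch in Python. The function names and example matrices are my own, and this exponential search is emphatically not our polynomial-time algorithm; it only illustrates what is being decided.

```python
from fractions import Fraction
from itertools import product

def mat_mul(X, Y):
    """Multiply two 2x2 matrices of Fractions."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def mat_pow(X, n):
    """Compute X^n by repeated multiplication (n >= 0)."""
    R = [[Fraction(1), Fraction(0)], [Fraction(0), Fraction(1)]]
    for _ in range(n):
        R = mat_mul(R, X)
    return R

def in_semigroup(gens, B, bound):
    """Search for exponents x_i <= bound with A_1^{x_1} ... A_k^{x_k} = B.
    Exponential brute force -- a sketch of the question, not the algorithm."""
    for exps in product(range(bound + 1), repeat=len(gens)):
        P = mat_pow(gens[0], exps[0])
        for A, e in zip(gens[1:], exps[1:]):
            P = mat_mul(P, mat_pow(A, e))
        if P == B:
            return exps
    return None

# Two commuting matrices (diagonal matrices always commute pairwise).
A1 = [[Fraction(2), Fraction(0)], [Fraction(0), Fraction(3)]]
A2 = [[Fraction(5), Fraction(0)], [Fraction(0), Fraction(7)]]
B  = [[Fraction(20), Fraction(0)], [Fraction(0), Fraction(63)]]  # A1^2 * A2
print(in_semigroup([A1, A2], B, bound=5))  # -> (2, 1)
```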
This is a question essentially about the word problem for commutative semigroups of matrices. The result really comes down to the study of certain exponential Diophantine equations.
Our mistake, innocent but a mistake nevertheless, was to title the paper initially "The ABC question resolved," or something like that. Well, this caused a great deal of email to be sent to us asking for the paper. A lot of email. Our result has nothing to do with the real ABC conjecture, and eventually all was worked out. One obvious suggestion: be careful how you title your paper—we knew about the ABC conjecture, we knew it was not the "ABC problem," but we messed up. The conference version, in FOCS 1994, was changed to: "The Complexity of the Membership Problem for 2-generated Commutative Semigroups of Rational Matrices." Strangely, the final paper in the SIAM Journal on Computing had the older title: "The Complexity of the A B C Problem." Maybe the spaces between the letters helped avoid confusing anyone.
One of the traditions in mathematics when faced with a resistant problem is to change the problem. Perturb it a bit, change the definitions, or the model—make a change. The art is to make the change small enough so that the new problem is close to the “real” problem, yet large enough so that you can make progress.
Football—I had to work this in somehow—is one sport that often changes the game. The NFL Rules Committee regularly makes small but tangible changes in the rules of play, enough to change the results of games. For example, the NFL recently abolished the old “force-out rule”—now a receiver must always get both feet in-bounds even if he is man-handled over the line. This came close to being a factor the last time the Steelers were in the Super Bowl in 2009, when Santonio Holmes just barely got his feet in for the game-winning catch. It is not clear these changes make the game better, or fairer, but they are changes that the NFL feels are important.
Integers $\rightarrow$ polynomials: This is a classic switch that is used in number theory. I will discuss it in more detail in a moment. But it is a powerful idea: integers share many properties with polynomials over the integers, but polynomials are usually much easier to handle. One simple example: the famous Goldbach conjecture is solved for polynomials but is still open for integers. The first solution was due to David Hayes, who proved in 1965:
Theorem: Every polynomial of degree at least one over the integers can be written as the sum of two irreducible polynomials.
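As a sanity check, here is a tiny instance of Hayes' theorem; the particular decomposition is my own illustration, not from his paper:
$$x^2 = (x^2 - x - 1) + (x + 1).$$
The polynomial $x + 1$ is irreducible since it has degree one, and $x^2 - x - 1$ is irreducible over the integers since its discriminant is $5$, which is not a perfect square.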
Graphs $\rightarrow$ trees: Many problems, perhaps most interesting problems, on graphs are difficult. They are often NP-complete, and cannot be solved efficiently—at least not today. One way to change the game is to restrict the class of graphs that we work on to a tractable subclass. Trees are often a great class: many problems that are hard in general are easy for trees. Another way to change the game is to move to graphs that have bounded tree-width, graphs that are planar or have low genus, and many other restrictions.
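As one concrete illustration of the trees switch, here is a Python sketch, with names of my own choosing, of the standard linear-time dynamic program for maximum independent set on a rooted tree, a problem that is NP-complete on general graphs.

```python
def max_independent_set(tree, root):
    """Size of a maximum independent set in a rooted tree, by DP.
    `tree` maps each node to the list of its children.  NP-complete on
    general graphs, but a simple bottom-up recursion on trees."""
    def solve(v):
        # incl = best size if v is in the set, excl = best if v is not.
        incl, excl = 1, 0
        for c in tree.get(v, []):
            ci, ce = solve(c)
            incl += ce              # children of a chosen node must be out
            excl += max(ci, ce)     # otherwise take the better option
        return incl, excl
    return max(solve(root))

# A small rooted tree:      0
#                          / \
#                         1   2
#                        / \
#                       3   4
tree = {0: [1, 2], 1: [3, 4]}
print(max_independent_set(tree, 0))  # -> 3  (e.g. {2, 3, 4})
```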
General circuits $\rightarrow$ monotone circuits: There are still essentially no non-linear lower bounds for general non-uniform circuits. One way to change the game here is to replace general circuits by monotone circuits or by other restricted classes of circuits. Then one can often obtain strong lower bounds, though the proofs are still quite hard. It is notoriously unclear how well they help us understand the general problem of circuit lower bounds.
Turing machines $\rightarrow$ machines with oracles: I have discussed this at length before, and am on the record as not being a fan of oracle results. I do agree that many oracle results are technically very challenging, and many oracle questions remain open. Perhaps this is the best message of this game change: even replacing Turing machines by machines with oracle access does not make some separation results easy. If the problems are still hard after this game change, then it seems plausible that the "real" problem without oracles must be very difficult. Oracles also play a major role in Scott Aaronson and Alex Arkhipov's recent important work on quantum complexity.
Exact $\rightarrow$ approximate: There are countless problems that we would like to solve, but cannot solve exactly. On some we are blocked because they are NP-complete or worse; others are in polynomial time, but we cannot find a fast exact algorithm. Examples of the former include solving integer programs, non-linear optimization problems, and many others; the latter include nearest-neighbor problems, eigenvalue computations, and again many others. The game change is to accept an approximate answer in place of an exact one.
Worst case $\rightarrow$ average case: Changing the model of how we measure complexity is another important game changing idea. Average-case analysis is sometimes insightful, but for many problems the choice of the distribution is critical. Another way to change the basic game is to use smoothed analysis, which seems to be quite powerful. This brilliant idea is due to Dan Spielman and Shang-Hua Teng—I will discuss it in more detail in the future.
The honest point is: no one wants a solution to the changed problem for its own sake. We switch the game so that "near" solutions become acceptable, but no one really wants these—people want the answer.
The ABC Conjecture
The ABC conjecture connects the additive structure of the integers with their multiplicative structure. It has its own home page, created by Abderrahmane Nitaj. Results that connect these two structures are usually deep; many of the open problems in number theory concern exactly the interplay of addition and multiplication. For example, the Twin Prime Conjecture does exactly this: given an $x$, is there a prime $p > x$ so that $p + 2$ is also prime? The adding of $2$ to a prime is the reason this conjecture is so hard.
Define $\mathrm{rad}(n)$ to be:
$$\mathrm{rad}(n) = \prod_{p \mid n} p.$$
This is called the radical of the number $n$. It is the product of all the primes that divide $n$, but taken with multiplicity $1$. Thus $\mathrm{rad}(16) = 2$ and $\mathrm{rad}(12) = 6$. The ABC conjecture is:
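For concreteness, here is a small Python sketch of the radical; it uses trial-division factoring, so it is only sensible for modest inputs, and the function name is mine.

```python
def rad(n):
    """Radical of n: the product of the distinct primes dividing n."""
    n = abs(n)
    result = 1
    p = 2
    while p * p <= n:
        if n % p == 0:
            result *= p          # record the prime once ...
            while n % p == 0:
                n //= p          # ... and strip all its copies
        p += 1
    if n > 1:                    # leftover factor is itself prime
        result *= n
    return result

print(rad(12))    # -> 6   (12 = 2^2 * 3)
print(rad(1024))  # -> 2   (1024 = 2^10)
```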
Given $\epsilon > 0$ there is a constant $K_\epsilon$ so that for all non-zero relatively prime integers $a, b, c$ with $a + b = c$, we have the inequality
$$\max(|a|, |b|, |c|) \le K_\epsilon \cdot \mathrm{rad}(abc)^{1 + \epsilon}.$$
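A natural way to play with the conjecture is to measure the quality of a triple, $q(a, b, c) = \log c / \log \mathrm{rad}(abc)$; the conjecture says that $q$ exceeds $1 + \epsilon$ only finitely often. Here is a small Python sketch (function names mine) that computes the quality of the famous triple $2 + 3^{10} \cdot 109 = 23^5$, found by Eric Reyssat:

```python
import math

def rad(n):
    """Product of the distinct primes dividing n (trial-division sketch)."""
    r, p = 1, 2
    while p * p <= n:
        if n % p == 0:
            r *= p
            while n % p == 0:
                n //= p
        p += 1
    return r * n if n > 1 else r

def quality(a, b, c):
    """q = log c / log rad(abc) for an abc-triple a + b = c with gcd(a,b)=1.
    The ABC conjecture predicts q > 1 + eps for only finitely many triples."""
    assert a + b == c and math.gcd(a, b) == 1
    return math.log(c) / math.log(rad(a * b * c))

print(round(quality(2, 3**10 * 109, 23**5), 4))  # -> 1.6299
```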
One simple demonstration of the immense power of this conjecture is that it implies Fermat's Last Theorem is true for all $n$ large enough. Suppose that
$$x^n + y^n = z^n,$$
where $x, y, z$ are non-zero and relatively prime; we may take them positive with $x, y < z$. Then let $a = x^n$, $b = y^n$, and, you guessed right, $c = z^n$. The key insight is that the radical of $abc$ is very small:
$$\mathrm{rad}(abc) = \mathrm{rad}(x^n y^n z^n) = \mathrm{rad}(xyz) \le xyz \le z^3.$$
A simple calculation using the ABC Conjecture shows that this is a contradiction for $n$ large enough. Pretty impressive.
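For the record, here is the simple calculation, with $\epsilon = 1$ say. Since $\mathrm{rad}(abc) \le xyz \le z^3$, the conjecture gives
$$z^n = c \le K_1 \cdot \mathrm{rad}(abc)^{2} \le K_1 \cdot z^{6}.$$
Hence $z^{n-6} \le K_1$, and since $z \ge 2$ this forces $n \le 6 + \log_2 K_1$. So any counterexample to Fermat must have a bounded exponent $n$.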
The ABC Problem Changed
Walter Stothers in 1981, and independently Richard Mason in 1983, discovered a basic theorem, not a conjecture, about polynomials that looks like the ABC Conjecture for integers. Their game changing idea thus preceded the integer ABC Conjecture. I am not sure whether there was actual intellectual synergy between the two forms or whether the timing was a coincidence—I welcome comments on this. In any event, the polynomial version is provable, and this really is the point that I wish to make: often the polynomial version of a hard integer problem can be solved.
Theorem: If $a(t), b(t), c(t)$ are relatively prime univariate polynomials over the complex numbers, not all constant, such that $a(t) + b(t) = c(t)$, then
$$\max(\deg a, \deg b, \deg c) \le N - 1,$$
where $N$ is the number of distinct roots of $a(t)\,b(t)\,c(t)$.
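A small example of my own making shows that the bound can be tight. Take $a(t) = 1$, $b(t) = t^2 + 2t$, and $c(t) = (t+1)^2$, so that $a + b = c$. The product $abc = t(t+2)(t+1)^2$ has the three distinct roots $0, -1, -2$, so $N = 3$, and indeed
$$\max(\deg a, \deg b, \deg c) = 2 \le N - 1 = 2.$$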
I will not give a proof—see this for a well-written version of one. I will say that one of the many reasons this works for polynomials is that they have additional structure: for instance, we can form the derivative of a polynomial, but there seems to be no counterpart for the integers.
Though it is a “second prize” compared to the original ABC Conjecture, this theorem can still be used to solve many questions. In particular it can be used to prove:
Theorem: Let $n \ge 3$ be an integer. Then
$$x^n + y^n = z^n$$
has no solution in relatively prime, non-constant polynomials $x(t), y(t), z(t)$ over the complex numbers.
The proof is similar to the above deduction of Fermat-for-large-$n$ from ABC. Note that since there are no unknown constants here, the proof can handle all $n \ge 3$.
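Here is the deduction spelled out. Suppose $x(t)^n + y(t)^n = z(t)^n$ with $x, y, z$ relatively prime and not all constant, and let $d = \max(\deg x, \deg y, \deg z) \ge 1$. Apply the theorem with $a = x^n$, $b = y^n$, $c = z^n$: the distinct roots of $abc = (xyz)^n$ are just the distinct roots of $xyz$, so $N \le \deg x + \deg y + \deg z \le 3d$. The theorem then gives
$$nd = \max(\deg x^n, \deg y^n, \deg z^n) \le N - 1 \le 3d - 1,$$
so $n \le 3 - 1/d < 3$, a contradiction for every $n \ge 3$.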
What other game changing methods can we use in complexity theory? Are the results from the game changing really insightful? Or not? What do you think?