Unexpected Connections In Mathematics
Connections in mathematics: games and graphs
John Nash is famous for his creation of one of the central notions of modern game theory: his concept of what constitutes a “solution” for a non-zero-sum game. For this work he was awarded the Nobel Prize in Economic Sciences in 1994, along with John Harsanyi and Reinhard Selten.
Today I wish to talk about surprising connections between problems in mathematics.
Nash is famous for three things: his wonderful work on game theory, his wonderful embedding theorem, and his unfortunate illness that made him unable to work for most of his adult life. His embedding theorem proves that every Riemannian manifold can be isometrically embedded into some Euclidean space. This is a deep result—way out of my area of expertise. But the core is that a Riemannian manifold is defined abstractly without assuming that it lies in any ambient Euclidean space. Nash’s theorem proved that such a manifold could always be embedded into a Euclidean space. His original proof was quite difficult; later the proof was unravelled and it was shown that it depended on a new implicit function theorem.
One wonders what other great results he could have proved had he not been ill for so long.
During his illness he was a well known “figure” who spent much of his time at the main Princeton Library, Firestone. My wife, Judith Norback, has told me she often would see him, since she spent many days at the library. Her research required a great deal of “lit” review. I almost never saw him, since I spent my library time at the main mathematical library, which was located across campus.
Roger Myerson has written a paper on Nash’s life and work. Myerson points out that Nash’s work on game theory spread slowly. It started with a short note in the Proceedings of the National Academy of Sciences, published in 1950. His concept of what a solution to a non-zero-sum game is may be elegant, but elegant or not it did not cause immediate excitement. It is useful to remember this in our own research: often it takes a long time for others to appreciate the importance of a new result. Even the great John von Neumann did not acknowledge Nash’s concept quickly; as Myerson puts it:
The short perfunctory reference to Nash’s work in the preface to the 1953 edition of von Neumann and Morgenstern’s book is particularly disappointing.
Either von Neumann and Oskar Morgenstern did not understand the importance of Nash’s achievement, or they chose not to highlight it—who knows.
In science, especially mathematics, it is reasonable to expect that different areas of research, even if they are “far” apart, may have interesting connections. Part of this is due to the beauty and power of mathematical ideas—there often is no precise demarcation between areas. Often a problem can be viewed, for example, as a question about a group, or a graph, or a set. Each view may yield different insights into a problem, and may eventually yield enough different insights to allow the problem to be solved; this fusion of ideas is often very powerful. We expect and welcome these connections between disparate areas, since they usually enrich both of them: ideas and methods from one can flow into the other. These expectations aside, we are still sometimes taken aback by unexpected connections.
One of the most exciting events in mathematics is when an area that seems very far from another one helps solve a problem there. For a concrete example: if the question of whether P equals NP is solved by a clever argument using complexity theory and some key new combinatorial lemma, we would all rejoice, but few would be shocked by the connection. If it is solved by the use of model theory and large cardinal axioms, or by the application of the theory of non-linear partial differential equations, however, I believe we would be truly surprised.
It is these connections that are the most exciting and the most fun. They do happen. Here are two that come to mind; I will add a third, the main point of this discussion, in a moment.
Analysis and Number Theory: This book by Donald Newman is a gem, in my opinion. I love how he starts the book:

The most intriguing thing about Analytic Number Theory (the use of Analysis, or function theory, in number theory) is its very existence! How could one use properties of continuous valued functions to determine properties of those most discrete items, the integers? Analytic functions? What has differentiability got to do with counting? The astonishment mounts further when we learn that the complex zeros of a certain analytic function are the basic tools in the investigation of the primes.
Model theory and Analysis: James Ax’s solution of a problem in analysis using model theory is one of my favorite examples of connections that are unexpected. I have discussed this before more than once.
A recent book review in the Bulletin of the AMS is a great source of very interesting comments on the nature of connections. The review, by Joan Birman, is on braids, knots, and links. Birman is one of the world experts in this area, and has written a beautiful summary of some of the unexpected connections that this area has made with other parts of mathematics. I strongly recommend that you read her piece—perhaps I will discuss it in the future.
The connection I wish to discuss is between game theory and a crypto-type problem. The latter is the so-called planted clique problem. Lorenz Minder and Dan Vilenchik have proved a neat theorem that connects these two areas—I find the connection surprising, and one that could have important consequences. (I should point out that Elad Hazan and Robert Krauthgamer actually made the first connection here.)
In order to state their theorem I have to first explain what the planted clique problem is. It sounds a bit like a botanical problem, but it is a natural question about the difficulty of solving the well-known clique problem. Of course we have known for a long time that determining whether or not a graph has a clique of size k is NP-complete.
Dick Karp first asked, in 1976, whether or not one could find a clique of size (1+ε) log₂ n in a random graph in polynomial time. This is the usual random graph model G(n, 1/2), where each edge is present independently with probability 1/2. He noted that a simple greedy algorithm would yield a clique of size about log₂ n, while the largest clique has size about 2 log₂ n with high probability, so his question was essentially whether one could even slightly improve on the greedy algorithm. His question appears to still be open today—see the thesis of Benjamin Rossman for details and some other interesting results.
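To make the greedy baseline concrete, here is a small sketch in Python (the function names and parameter choices are my own illustration, not from Karp’s paper): scan the vertices in order and keep each one that is adjacent to everything kept so far. Each accepted vertex roughly halves the pool of remaining candidates, which is why the greedy clique has size about log₂ n.

```python
import random
from itertools import combinations

def random_graph(n, p=0.5, seed=0):
    """Adjacency sets for G(n, p): each edge present independently with prob p."""
    rng = random.Random(seed)
    adj = {v: set() for v in range(n)}
    for u, v in combinations(range(n), 2):
        if rng.random() < p:
            adj[u].add(v)
            adj[v].add(u)
    return adj

def greedy_clique(adj):
    """Scan vertices in order; keep any vertex adjacent to all kept so far."""
    clique = []
    for v in adj:
        if all(v in adj[u] for u in clique):
            clique.append(v)
    return clique

adj = random_graph(2048)
clique = greedy_clique(adj)
# By construction every pair in the result is adjacent; its size is
# typically close to log2(2048) = 11.
assert all(v in adj[u] for u, v in combinations(clique, 2))
print(len(clique))
```

Beating this by even a constant factor—finding a clique of size (1+ε) log₂ n—is the improvement Karp asked about.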
Lack of progress can often suggest that we modify our goals; hence the introduction of the planted clique problem. The problem is this: take a random graph G(n, 1/2) and add a clique of size k to it. The question is: can we find the added clique in polynomial time? If the size of the clique is very large, this is easy. The best known result is that the problem can be solved in polynomial time when k is of order √n. Any smaller value of k seems to be hard.
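The easy large-k regime can be demonstrated directly. The sketch below is my own illustration, with parameters chosen well above the √n threshold so that a naive method works: plant a clique of size k = 120 in G(300, 1/2) and recover it from degrees alone. Planted vertices carry roughly k/2 extra edges, so the k highest-degree vertices essentially locate the clique, and one refinement step cleans up any stragglers. Nothing like this is known once k drops near √n.

```python
import random
from itertools import combinations

def planted_clique_graph(n, k, seed=1):
    """G(n, 1/2) with a clique planted on k randomly chosen vertices."""
    rng = random.Random(seed)
    adj = {v: set() for v in range(n)}
    for u, v in combinations(range(n), 2):
        if rng.random() < 0.5:
            adj[u].add(v)
            adj[v].add(u)
    planted = rng.sample(range(n), k)
    for u, v in combinations(planted, 2):  # force the planted set to be a clique
        adj[u].add(v)
        adj[v].add(u)
    return adj, set(planted)

def recover_clique(adj, k):
    """Take the k highest-degree vertices, then keep the vertices adjacent
    to at least 3k/4 of that candidate set."""
    by_degree = sorted(adj, key=lambda v: len(adj[v]), reverse=True)
    candidates = set(by_degree[:k])
    return {v for v in adj if len(adj[v] & candidates) >= 3 * k // 4}

adj, planted = planted_clique_graph(n=300, k=120)
recovered = recover_clique(adj, k=120)
assert recovered == planted  # succeeds w.h.p. in this easy, large-k regime
```

With k this large the degree gap between planted and ordinary vertices is many standard deviations wide; for k of order √n one needs spectral methods instead, and below that no polynomial-time algorithm is known.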
The connection of Minder and Vilenchik is to relate this problem to Nash solutions. In particular they prove:
Theorem: There exists a positive constant ε so that if there is a polynomial-time algorithm that finds, in a two-player game, the ε-best ε-equilibrium, then there is a probabilistic polynomial-time algorithm that distinguishes with high probability between two graphs: G drawn from G(n, 1/2), and an arbitrary graph on n nodes with a clique of size c log n, for some constant c.
Roughly, any good approximation algorithm for even an approximate Nash equilibrium yields insight into the planted clique problem. As always, take a look at their paper to see the details and the proof. I find this connection pretty interesting—I hope you do too.
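For readers who want the terms of the theorem pinned down, here is a minimal sketch of what an ε-equilibrium of a two-player (bimatrix) game is: a pair of mixed strategies from which neither player can gain more than ε by deviating. (The “ε-best” part asks, further, for an ε-equilibrium whose payoff is within ε of the best achievable by any ε-equilibrium; that optimization is what the reduction exploits.) The code and the matching-pennies example are my own illustration.

```python
def is_eps_equilibrium(A, B, x, y, eps):
    """Check whether mixed strategies x (row player) and y (column player)
    form an eps-equilibrium of the bimatrix game with payoff matrices A, B:
    neither player can gain more than eps by switching to a pure strategy."""
    m, n = len(A), len(A[0])
    row_payoff = sum(x[i] * A[i][j] * y[j] for i in range(m) for j in range(n))
    col_payoff = sum(x[i] * B[i][j] * y[j] for i in range(m) for j in range(n))
    best_row = max(sum(A[i][j] * y[j] for j in range(n)) for i in range(m))
    best_col = max(sum(x[i] * B[i][j] for i in range(m)) for j in range(n))
    return best_row - row_payoff <= eps and best_col - col_payoff <= eps

# Matching pennies: the uniform strategies are an exact Nash equilibrium,
# while the pure pair (heads, heads) is not even a 0.5-equilibrium.
A = [[1, -1], [-1, 1]]   # row player's payoffs
B = [[-1, 1], [1, -1]]   # column player's payoffs
assert is_eps_equilibrium(A, B, [0.5, 0.5], [0.5, 0.5], eps=0.0)
assert not is_eps_equilibrium(A, B, [1, 0], [1, 0], eps=0.5)
```

Checking a given pair of strategies is easy, as the code shows; the hard part, which the theorem ties to planted clique, is finding a near-optimal approximate equilibrium.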
What are your favorite unexpected connections in mathematics?