What do you think?

Charles Lindbergh is famous for his solo non-stop flight that left Roosevelt Field on Long Island on May 20, 1927 and arrived the next day at Le Bourget Field in Paris. He covered a distance of almost 4,000 miles, which is a long flight even for today’s jumbo jets. He did it in his single-seat, single-engine Ryan monoplane, the “Spirit of St. Louis.”

Today Ken and I want to talk not about solo flights, or air travel, or jumbo jets, but about selecting the “Person of the Year.”

Lindbergh indirectly helped create the “Man of the Year” concept at Time magazine. Toward the end of 1927, the editors at Time wanted to remedy their earlier error of not putting Lindbergh’s historic flight on the magazine’s cover. So Lindbergh became Time’s first “Man of the Year.” This clever solution started their tradition of selecting what is now, correctly, called the Person of the Year, where Time features:

A person, group, idea or object that “for better or for worse … has done the most to influence the events of the year.”

The usual winners are leaders of countries, although in 1960 science was selected and represented by a group of scientists:

George Beadle, Charles Draper, John Enders, Donald Glaser, Joshua Lederberg, Willard Libby, Linus Pauling, Edward Purcell, Isidor Rabi, Emilio Segrè, William Shockley, Edward Teller, Charles Townes, James Van Allen, and Robert Woodward.

Over two decades later, in 1982, “The Computer” was the winner.

## Our Plan

We would like to have your input on who or what should be the “Person Of The Year” winner from the view of theory, and more widely, mathematics. We have some ideas that we will share, but would love to get some additional input. The winner(s) will be determined by a complex system of voting—forget Arrow’s Impossibility Theorem—and later this month we will announce them. The first prize is a free subscription to Gödel’s Lost Letter, and the second prize is a fifty percent discount on a subscription.

## Some Candidates From Experts

We have directly asked a few experts already. Both Paul Beame and Avi Wigderson independently selected the following results on constant-depth arithmetic circuits. We quote Paul:

I think that the biggest complexity results of the year were not contained in a single paper. It is the chain of papers on algebraic complexity which show that arbitrary algebraic circuits can be reduced to homogeneous depth-3 formulas of restricted size and fan-in, and that this simulation is tight up to constants in the exponents. This chain includes several papers, among them the following two. These are deserving together, but there may be another to include.

• Ankit Gupta, Pritish Kamath, Neeraj Kayal, and Ramprasad Saptharishi, Approaching the chasm at depth four, in Conference on Computational Complexity (CCC), IEEE, 2013.

• Ankit Gupta, Pritish Kamath, Neeraj Kayal, and Ramprasad Saptharishi, Arithmetic Circuits: A chasm at depth three, in Foundations of Computer Science (FOCS), IEEE, 2013.

Ravi Kannan suggests the co-winners of the STOC 2013 Best-Paper Award: Ken Clarkson and David Woodruff for their paper Low Rank Approximation and Regression in Input Sparsity Time.

## Some Candidates

Here are some of our candidates.

Vladimir Voevodsky: For the new approach to mathematics in his paper, Univalent foundations and set theory. We have discussed this lightheartedly, but still intend serious treatment as time allows.

Colin McLarty: For work on trying to unravel the logical complexity of Andrew Wiles’s great proof of Fermat’s Last Theorem. See this and this for details.

Yitang Zhang: For his unexpected and wonderful result on prime gaps, which shows that an “almost twin prime” conjecture is true: there are infinitely many pairs of primes that differ by at most a fixed bound. See this for recent results and improvements on the original paper.

Shinichi Mochizuki: For his claimed proof that the ABC conjecture is true. This is a continuation from last year, when we discussed it, but the efforts by the mathematical community to grapple with the heft of this new work have continued all this year.

Atri Rudra and Mary Wootters: For their result, Every list-decodable code for high noise has abundant near-optimal rate puncturings.

What sets this result apart from the large literature on list-decoding is that it is combinatorial. Perhaps the reason not much progress was made for years was because people were trying to prove a “purely algorithmic” version of this result. Atri tells me that

I don’t think the statement of the result is surprising: many folks have believed that such codes should be list-decodable beyond the Johnson bound.

The fact still seems quite neat, in my opinion.

## A Non-standard Candidate

How about our friends at Wikipedia? They might be a good choice for “Person” of the year. We use them all the time here at GLL. And they are trying right now to raise some money—maybe we could all help?

## Open Problems

Who or what would be your choice?

1. December 21, 2013 1:02 am

Marcus, Spielman, and Srivastava.

Interlacing Families I: Bipartite Ramanujan Graphs of All Degrees
Interlacing Families II: Mixed Characteristic Polynomials and the Kadison-Singer Problem

• December 21, 2013 6:46 am

I thought this was an obvious candidate and was very surprised not to see it in the original post. The results are important, and the interlacing families of polynomials technique seems very powerful.

2. December 21, 2013 2:27 am

AR and MW’s coding work quoted here is interesting and backs up DL’s conjecture that DLOG is in P! But am I the only one noticing the ugly math in their (AR and MW) paper? Is this how it is supposed to be? I bet Qi Cheng knows a better trick to get the same results in an easier and clearer way.

3. December 21, 2013 6:55 am

For math, I would vote the Marcus-Spielman-Srivastava papers.

For algorithms, Clarkson-Woodruff.

For complexity, I like the recent work on extended formulations. In particular, Chan-Lee-Raghavendra-Steurer, who showed that the Sherali-Adams hierarchy essentially gives the strongest approximation guarantees of all polynomial size LPs for MaxCSP problems. And Rothvoss’s recent result that the matching polytope requires exponential size LP formulations.

4. December 21, 2013 3:25 pm

Zhang! Lots of links to info, articles, etc. Few note that this is a two-millennia-old problem, apparently first considered by Euclid, from what I understand. The narrow problem itself may be insubstantial, but the techniques its study is developing might have some reusability. This is why it makes sense to study the hardest open problems to some degree and not denigrate them (as even elite mathematicians do at times, e.g., Gauss vs. FLT).

December 22, 2013 7:12 am

The probability of finding a pair of twin primes greater than N tends to zero as N tends to infinity.
The probability of finding an algorithm of time complexity f(n) for an NP-complete problem tends to zero as f tends to a polynomial.
I don’t think these problems can get more than a probabilistic solution.

• December 22, 2013 10:03 am

“The probability of finding an algorithm of time complexity f(n) for an NP-complete problem tends to zero as f tends to a polynomial.” What is your reasoning here?

December 22, 2013 10:50 am

Of course it’s only intuitive, otherwise I’d get the Millennium Prize. 🙂 I think there are fewer and fewer algorithms as one raises the efficiency expectations. It seems possible to state such a conjecture rigorously. In case it’s undecidable, it would make a nice axiom.
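The twin-prime claim in this thread is easy to probe numerically. Here is a minimal Python sketch, ours rather than the commenter’s, that sieves up to one million and reports the fraction of numbers n in each decade with n and n+2 both prime. The densities fall off decade by decade, in line with the Hardy–Littlewood heuristic of roughly $2C_2/(\ln n)^2$, though of course this says nothing about whether infinitely many twin primes exist.

```python
def sieve(limit):
    """Sieve of Eratosthenes: is_prime[i] tells whether i is prime."""
    is_prime = [True] * (limit + 1)
    is_prime[0] = is_prime[1] = False
    for p in range(2, int(limit ** 0.5) + 1):
        if is_prime[p]:
            for multiple in range(p * p, limit + 1, p):
                is_prime[multiple] = False
    return is_prime

def twin_density_by_decade(limit=10**6):
    """Fraction of n in [10^k, 10^(k+1)) with both n and n+2 prime."""
    is_prime = sieve(limit + 2)  # +2 so the test n+2 stays in range
    densities = []
    lo = 10
    while lo * 10 <= limit:
        hi = lo * 10
        twins = sum(1 for n in range(lo, hi) if is_prime[n] and is_prime[n + 2])
        densities.append(twins / (hi - lo))
        lo = hi
    return densities

d = twin_density_by_decade()
# d[0] is 6/90 (six twin pairs start in [10, 100)); each later decade's
# density is strictly smaller, dropping under 0.01 by [10^5, 10^6).
```

The monotone decline is only a finite observation; proving anything about the infinite tail is exactly where Zhang’s work comes in.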

5. December 21, 2013 8:26 pm

Clarkson-Woodruff. David’s talk for a broader audience at Brown: http://grigory.us/files/theory/david.ppt

6. December 22, 2013 5:44 am

More on the amplituhedron
http://mathoverflow.net/questions/142841/the-amplituhedron-minus-the-physics/143421

7. December 22, 2013 9:15 am

I think Yitang Zhang would be a good choice overall.
For algorithms, wouldn’t Antoine Joux be a good nominee?

December 22, 2013 10:35 am

Me (not me)

Yes, Antoine Joux is an excellent additional candidate.

8. December 23, 2013 8:20 am

As far as I understand from the Approaching the chasm at depth four paper, Conjecture 26 is the most relevant part, and it is the only one that is useful for separating VNP and VP. Pretty much all the other material does not show anything toward the separation. Then why is the paper Wigderson’s favorite? The main difficulty is carpeted as Conjecture 26 and beautifully framed as a paper. The other paper (Arithmetic circuits: A chasm at depth three) is somewhat interesting. Still, the difficulty is carpeted under Conjecture 26.

9. December 23, 2013 8:08 pm

Dear Dick and Ken

Thank you for sustaining one of the mathematical world’s *great* weblogs, and thank you especially for having the chutzpah to nominate Colin McLarty’s What does it take to prove Fermat’s Last Theorem? Grothendieck and the logic of number theory as a candidate for best result of the year.

Why is chutzpah needed? Two reasons.

First, fans of Sidney Redner’s article Citations for 100 Years of Physical Review know one answer: history shows plainly that it’s mighty hard (pretty much impossible in fact) to identify seminal articles in their initial year of publication.

The second reason is McLarty’s solid-gold credentials as a no-fooling philosopher:

Colin McLarty … is Truman P. Handy Professor of Philosophy and the Chair of the Philosophy Department at Case Western Reserve University

McLarty’s article exemplifies an accelerating trend of trans-disciplinary boundary-blurring that is — beneficially as it seems to me — transforming and unifying the entire STEM enterprise (including even engineering seminars).

To see this trend in action, just read the “Foreword” and/or “Preface” to pretty much any well-respected book by any well-respected philosopher / mathematician / scientist / engineer. You will find that nowadays it’s increasingly difficult to guess the author’s profession from the preface.

In honor of which, here is a quiz for Gödel’s Lost Letter and P=NP readers. Try to guess the profession — philosopher, mathematician, scientist, or engineer — that is associated with this Preface:

“As a graduate student working in [redacted] I encountered deep irritation caused by the work I was doing, something quite fundamental that I did not understand.

Certain elementary notions that are accepted as starting points for work in [redacted] somehow had no fundamental foundation, no verifiable root. My inclination was to mine [redacted] vertically, and here was a subject whose tunnels were dug horizontally.

I now appreciate more clearly where my question was headed: Yes it does head downward, and it goes very deep. What is less clear is that there is a path in that direction understood by anyone very well.

Here one must face those notorious issues of interpretation that stimulate much confusion and contention but few definite answers.”

Hint: the (sole!) author of this textbook is the person who invented the one-word answer to the crossword-puzzle clue “process associated to BOSONSAMPLING mysteries”

“_ n _ _ v _ _ _ i _ _”

Bonus Question: Name the profession (different book, different profession, and no fair Googling!)

We have been particularly anxious to introduce the pioneers to whom we owe so much, and from whom we can still draw useful inspiration. They were human beings not so different from ourselves, and perhaps some of our readers will be inspired to make similar contributions.

However, it is always the youngest generation of professionals who see the future most clearly and who must build on their imperfect inheritance.

Each new book depends for its success on many more individuals than those whose names appear on the title page. The most obvious debt is certainly to the hard-working and gifted students who have collectively taught us much more than we have taught them.

Best Wishes for Happy Holidays and productive boundary-blurring are extended to all Gödel’s Lost Letter and P=NP readers and their families!