What Is The Best Result Of The Year?
What do you think?
Charles Lindbergh is famous for his solo non-stop flight that left Roosevelt Field in Long Island on May 20, 1927, and arrived the next day at Le Bourget Field in Paris. He covered a distance of about 3,600 miles, which is a long flight even for today’s jumbo jets. He did it in his single-seat, single-engine Ryan monoplane called the “Spirit of St. Louis.”
Today Ken and I want to talk not about making solo flights, or air travel, or jumbo jets, but about selecting the “Person Of The Year.”
Lindbergh indirectly helped create the “Man of the Year” concept at Time magazine. Toward the end of 1927, the editors at Time wanted to remedy an earlier error: they had not pictured Lindbergh’s historic flight on the magazine’s cover. So Lindbergh became Time’s first “Man of the Year.” This clever solution started the tradition of selecting what is now correctly called the Person of the Year, where Time features:
A person, group, idea or object that “for better or for worse … has done the most to influence the events of the year.”
The usual winners are leaders of countries, although in 1960 science was selected and represented by a group of scientists:
George Beadle, Charles Draper, John Enders, Donald Glaser, Joshua Lederberg, Willard Libby, Linus Pauling, Edward Purcell, Isidor Rabi, Emilio Segrè, William Shockley, Edward Teller, Charles Townes, James Van Allen, and Robert Woodward.
Over two decades later, in 1982, “The Computer” was the winner.
Our Plan
We would like to have your input on who or what should be the “Person Of The Year” winner from the view of theory, and more widely, mathematics. We have some ideas that we will share, but would love to get some additional input. The winner(s) will be determined by a complex system of voting—forget Arrow’s Impossibility Theorem—and later this month we will announce them. The first prize is a free subscription to Gödel’s Lost Letter, and the second prize is a fifty percent discount on a subscription.
Some Candidates From Experts
We have directly asked a few experts already. Both Paul Beame and Avi Wigderson independently selected the following results on constant-depth circuits. We quote Paul:
I think that the biggest complexity results of the year were not contained in a single paper. It is the chain of papers on algebraic complexity which show that arbitrary algebraic circuits can be reduced to homogeneous depth 3 formulas of restricted size and fan-in, and that this simulation is tight up to constants in the exponents. This includes several papers including the following 2 papers. These are deserving together but there may be another to include.
Ankit Gupta, Pritish Kamath, Neeraj Kayal, and Ramprasad Saptharishi, Approaching the chasm at depth four, in Conference on Computational Complexity, IEEE, 2013.
Ankit Gupta, Pritish Kamath, Neeraj Kayal, and Ramprasad Saptharishi, Arithmetic Circuits: A chasm at depth three, in Foundations of Computer Science (FOCS), IEEE, 2013.
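In rough terms, and simplifying the papers’ exact parameters (this is our paraphrase, not their stated theorems): if an $n$-variate polynomial of degree $d$ has an arithmetic circuit of size $s$, then it has a homogeneous depth-four ($\Sigma\Pi\Sigma\Pi$) formula of size

$$2^{O(\sqrt{d}\,\log d\,\log n\,\log s)},$$

which is $n^{O(\sqrt{d})}$ for polynomial-size circuits, and over fields of characteristic zero it even has a depth-three ($\Sigma\Pi\Sigma$) circuit of size $2^{O(\sqrt{d\,\log d\,\log n\,\log s})}$. The tightness Paul mentions comes from matching $n^{\Omega(\sqrt{d})}$-type lower bounds against homogeneous depth-four circuits, so improving the constant in the exponent for an explicit polynomial would separate VP from VNP.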
Ravi Kannan suggests the co-winners of the STOC 2013 Best-Paper Award: Ken Clarkson and David Woodruff for their paper Low Rank Approximation and Regression in Input Sparsity Time.
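To convey what “input sparsity time” buys, here is a minimal toy sketch in Python of the CountSketch-style hashing idea behind such algorithms: hash each row of the regression problem into a few buckets with random signs, then solve the much smaller sketched problem. The function name and parameter choices are ours for illustration; this is a sketch of the general technique, not the authors’ actual algorithm.

import numpy as np

def countsketch_regression(A, b, m, seed=0):
    """Toy CountSketch least squares: hash the n rows of [A | b] into m
    buckets with random signs (one pass over the data, so O(nnz(A)) work
    on sparse input), then solve the small m-row least-squares problem."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    buckets = rng.integers(0, m, size=n)     # hash h: row index -> bucket
    signs = rng.choice([-1.0, 1.0], size=n)  # random sign per row
    SA = np.zeros((m, A.shape[1]))
    Sb = np.zeros(m)
    np.add.at(SA, buckets, signs[:, None] * A)  # computes S @ A without forming S
    np.add.at(Sb, buckets, signs * b)
    return np.linalg.lstsq(SA, Sb, rcond=None)[0]

# Tiny demo: the sketched solution lands near the exact least-squares solution.
rng = np.random.default_rng(1)
n, d = 20000, 15
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d) + 0.01 * rng.standard_normal(n)
x_sketch = countsketch_regression(A, b, m=10 * d * d)
x_exact = np.linalg.lstsq(A, b, rcond=None)[0]
print(np.linalg.norm(x_sketch - x_exact))  # small, with high probability

The point of the analysis is that a CountSketch with roughly $d^2$ rows is an oblivious subspace embedding, which is what makes the sketched solution provably close while the sketch itself costs only one pass over the nonzeros of $A$.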
Some Candidates
Here are some of our candidates.
Vladimir Voevodsky: For the new approach to the foundations of mathematics in his paper Univalent foundations and set theory. We have discussed this lightheartedly, but still intend serious treatment as time allows.
Colin McLarty: For work on trying to unravel the logical complexity of Andrew Wiles’s great proof of Fermat’s Last Theorem. See this and this for details.
Yitang Zhang: For his unexpected and wonderful result on prime gaps that shows an “almost” version of the twin prime conjecture is true. See this for recent results and improvements on the original paper.
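For concreteness (our paraphrase): Zhang proves the first finite bound on prime gaps holding infinitely often,

$$\liminf_{n\to\infty}\,\left(p_{n+1}-p_n\right) \;<\; 7\times 10^{7},$$

where $p_n$ is the $n$-th prime. Follow-up work this year, by the Polymath8 project and by James Maynard, has already driven the bound down by several orders of magnitude.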
Shinichi Mochizuki: For his claimed proof that the ABC conjecture is true. This is a continuation from last year, when we discussed it, but the efforts by the mathematical community to grapple with the heft of this new work have continued all this year.
Atri Rudra and Mary Wootters: For their result, Every list-decodable code for high noise has abundant near-optimal rate puncturings.
What sets this result apart from the large literature on list-decoding is that it is combinatorial. Perhaps the reason not much progress was made for years was because people were trying to prove a “purely algorithmic” version of this result. Atri tells me that
I don’t think the statement of the result is surprising: many folks have believed that such codes should be list-decodable beyond the Johnson bound.
The fact still seems quite neat, in my opinion.
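For context, and in our words rather than the paper’s: the classical Johnson bound guarantees that any binary code of relative distance $\delta$ is combinatorially list-decodable with small lists up to radius

$$J(\delta) \;=\; \frac{1}{2}\left(1-\sqrt{1-2\delta}\right),$$

no matter how the code is structured. The Rudra-Wootters result shows that suitable random puncturings remain list-decodable well beyond this generic radius, up to $\frac{1}{2}-\epsilon$, while keeping the rate near the optimal $\Theta(\epsilon^2)$.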
A Non-standard Candidate
How about our friends at Wikipedia? They might be a good choice for “Person” of the year. We use them all the time here at GLL. And they are trying right now to raise some money—maybe we could all help?
Open Problems
Who or what would be your choice?
Marcus, Spielman, and Srivastava.
Interlacing Families I: Bipartite Ramanujan Graphs of All Degrees
Interlacing Families II: Mixed Characteristic Polynomials and the Kadison-Singer Problem
I thought this was an obvious candidate and was very surprised not to see it in the original post. The results are important, and the interlacing families of polynomials technique seems very powerful.
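For readers who have not seen the statements, the Kadison-Singer paper proves Weaver’s conjecture in (roughly) the following form: if $v_1,\dots,v_m \in \mathbb{C}^d$ satisfy $\sum_{i=1}^m v_i v_i^* = I$ and $\|v_i\|^2 \le \alpha$ for every $i$, then the indices split into two sets $S_1, S_2$ with

$$\Bigl\|\sum_{i\in S_j} v_i v_i^*\Bigr\| \;\le\; \frac{\left(1+\sqrt{2\alpha}\right)^2}{2}, \qquad j = 1,2.$$

The first paper uses the same interlacing-families machinery to show that bipartite Ramanujan graphs, those whose nontrivial adjacency eigenvalues are at most $2\sqrt{d-1}$ in absolute value, exist for every degree $d \ge 3$.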
AR and MW’s coding work quoted here is interesting and backs up DL’s conjecture that DLOG is in P! But am I the only one noticing the ugly math in their (AR and MW) paper? Is this how it is supposed to be? I bet Qi Cheng knows a better trick to get the same results in an easier and clearer way.
For math, I would vote the Marcus-Spielman-Srivastava papers.
For algorithms, Clarkson-Woodruff.
For complexity, I like the recent work on extended formulations. In particular, Chan-Lee-Raghavendra-Steurer, who showed that the Sherali-Adams hierarchy essentially gives the strongest approximation guarantees of all polynomial size LPs for MaxCSP problems. And Rothvoss’s recent result that the matching polytope requires exponential size LP formulations.
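To pin down the second result in a formula (our wording): Rothvoss shows that the perfect matching polytope of the complete graph $K_n$ has extension complexity

$$\mathrm{xc}\bigl(P_{\text{matching}}(K_n)\bigr) \;=\; 2^{\Omega(n)},$$

that is, every polytope projecting linearly onto it needs exponentially many facets. This is striking because matching is solvable in polynomial time, so small linear programs fail even for a problem that is otherwise easy.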
Zhang! Lots of links to info, articles, etc. Few note that this is a two-millennia-old problem, apparently first considered by Euclid, from what I understand. The narrow problem itself may be insubstantial, but the techniques it is leading to may have some reusability. This is why it makes sense to study the hardest open problems to some degree and not denigrate them (as even elite mathematicians do at times, e.g., Gauss vs. FLT).
The probability of finding a pair of twin primes greater than N tends to zero as N tends to infinity.
The probability of finding an algorithm of time complexity f(n) for an NP-complete problem tends to zero as f tends to a polynomial.
I don’t think these problems can get more than a probabilistic solution.
“The probability of finding an algorithm of time complexity f(n) for an NP-complete problem tends to zero as f tends to a polynomial.” What is your reasoning here?
Of course it’s only intuitive, otherwise I’d get the Millennium Prize. 🙂 I think there are fewer and fewer algorithms as one raises the efficiency expectations. It seems possible to state such a conjecture rigorously. In case it’s undecidable, it would make a nice axiom.
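On the first claim in this thread: the Hardy-Littlewood heuristic quantifies it and also shows why a vanishing density does not settle anything. The conjectured count of twin primes up to $x$ is

$$\pi_2(x) \;\sim\; 2C_2 \int_2^x \frac{dt}{(\ln t)^2} \;\sim\; 2C_2\,\frac{x}{(\ln x)^2}, \qquad C_2 \approx 0.6601,$$

so the density near $N$ decays like $1/(\ln N)^2$, yet the count still tends to infinity. A probability tending to zero at each scale is perfectly consistent with infinitely many occurrences.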
Clarkson-Woodruff. David’s talk for a broader audience at Brown: http://grigory.us/files/theory/david.ppt
== 2013 ==
* NSA corrupted cryptography algorithms:
– http://www.math.columbia.edu/~woit/wordpress/?p=6522
* HoTT:
– The HoTT Approach to Physics: http://golem.ph.utexas.edu/category/2013/10/the_hott_approach_to_physics.html
– http://blogs.scientificamerican.com/guest-blog/2013/10/01/voevodskys-mathematical-revolution/
* Debate about set theory
– https://www.simonsfoundation.org/quanta/20131126-to-settle-infinity-question-a-new-law-of-logic/
* Amplituhedron
– https://www.simonsfoundation.org/quanta/20130917-a-jewel-at-the-heart-of-quantum-physics/
* Progress report on Mochizuki’s proof of the abc conjecture
– http://www.kurims.kyoto-u.ac.jp/~motizuki/IUTeich%20Verification%20Report%202013-12.pdf
* Quantum-light-harvesting
– http://www.technologyreview.com/view/522016/quantum-light-harvesting-hints-at-entirely-new-form-of-computing/
* Semidefinite programming
– https://www.simonsfoundation.org/quanta/20131220-nudging-spheres-closer-little-by-little/
* Approximate computing
– http://www.purdue.edu/newsroom/releases/2013/Q4/approximate-computing-improves-efficiency,-saves-energy.html
* Kadison-Singer problem
– http://terrytao.wordpress.com/2013/11/04/real-stable-polynomials-and-the-kadison-singer-problem/
* Wormhole entanglement
– http://motls.blogspot.co.uk/2013/06/maldacena-susskind-any-entanglement-is.html
* More on the amplituhedron
– http://mathoverflow.net/questions/142841/the-amplituhedron-minus-the-physics/143421
– http://www.preposterousuniverse.com/blog/2013/10/03/guest-post-lance-dixon-on-calculating-amplitudes/
I think Yitang Zhang would be a good choice overall.
For algorithms, wouldn’t Antoine Joux be a good nominee?
Me (not me)
Yes, Antoine Joux is an excellent additional candidate.
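For the record, the case for Joux rests on this year’s dramatic progress on discrete logarithms in finite fields of small characteristic: Joux’s $L(1/4)$-type algorithm, and then the heuristic quasi-polynomial algorithm of Barbulescu, Gaudry, Joux, and Thomé, which for $\mathbb{F}_{q^n}$ with small $q$ runs in time roughly

$$\max(q,n)^{O(\log n)},$$

far below the $L(1/3)$-type subexponential bounds that had stood for decades.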
As far as I understand from the Approaching the chasm at depth four paper, Conjecture 26 is the most relevant one, and it is the only one that is useful for separating VNP and VP. Pretty much all the other information shows nothing toward the separation. Then why is the paper Wigderson’s favorite? The main difficulty is carpeted over as Conjecture 26 and beautifully framed as a paper. The other paper (Arithmetic circuits: A chasm at depth three) is somewhat interesting. Still, the difficulty is carpeted under Conjecture 26.
Dear Dick and Ken
Thank you for sustaining one of the mathematical world’s *great* weblogs, and thank you especially for having the chutzpah to nominate Colin McLarty’s What does it take to prove Fermat’s Last Theorem? Grothendieck and the logic of number theory as a candidate for “What Is The Best Result Of The Year?”
Why is chutzpah needed? Two reasons.
First, fans of Sidney Redner’s article Citations for 100 Years of Physical Review know one answer: history shows plainly that it’s mighty hard (pretty much impossible in fact) to identify seminal articles in their initial year of publication.
The second reason is McLarty’s solid-gold credentials as a no-fooling philosopher.
McLarty’s article exemplifies an accelerating trend of trans-disciplinary boundary-blurring that is — beneficially as it seems to me — transforming and unifying the entire STEM enterprise (including even engineering seminars).
To see this trend in action, just read the “Foreword” and/or “Preface” to pretty much any well-respected book by any well-respected philosopher / mathematician / scientist / engineer. You will find that nowadays it’s increasingly difficult to guess the author’s profession from the preface.
In honor of which, here is a quiz for Gödel’s Lost Letter and P=NP readers. Try to guess the profession — philosopher, mathematician, scientist, or engineer — that is associated with this Preface:
Hint: the (sole!) author of this textbook is the person who invented the one-word answer to the crossword-puzzle clue “process associated to BOSONSAMPLING mysteries”
“_ n _ _ v _ _ _ i _ _”
British spelling please.
Bonus Question: Name the profession (different book, different profession, and no fair Googling!)
Best Wishes for Happy Holidays and productive boundary-blurring are extended to all Gödel’s Lost Letter and P=NP readers and their families!
PS It turns out that Sidney Redner’s survey is available on the arXiv server as “Citation Statistics from 110 Years of Physical Review” (arXiv:physics/0506056 [physics.soc-ph]). No, it’s not easy (in physics, or presumably any STEM field) to recognize “best results” in the same year that they appear.