
Avi Wins The Knuth Prize

April 5, 2019


Wigderson wins the well-deserved Knuth Prize

From Avi’s 60Fest at IAS

Avi Wigderson is this year’s selection for the ACM/IEEE Donald E. Knuth Prize.

Today Ken and I wish to congratulate Avi and say a bit more.

We thank first the Knuth Prize committee, chaired by Avrim Blum, for their excellent selection. It must be very difficult each year to select one person for this honor. There are so many strong candidates. Of course the Knuth Prize is awarded to individuals for their overall impact in the field. Who could deny that Avi has had vast impact on the whole field?

Here is a nice comment from the “in theory” blog by Luca Trevisan.

Avi has worked on all aspects of computational complexity theory, and he has had a transformative influence on the way theoretical computer science relates to pure mathematics. I will not repeat what I wrote about his work on the occasion of his 60th birthday here and here. Long-term readers of in theory will remember that I consider him as one of the saints of computational complexity.

Hmmm. Saints of computing—we find that a pretty interesting comment.

The Three-Chain Race

Since the prize was created in 1996, it has been awarded to:

1996 Andrew Yao
1997 Leslie Valiant
1999 László Lovász
2000 Jeffrey Ullman
2002 Christos Papadimitriou
2003 Miklós Ajtai
2005 Mihalis Yannakakis
2007 Nancy Lynch
2008 Volker Strassen
2010 David S. Johnson
2011 Ravi Kannan
2012 Leonid Levin
2013 Gary Miller
2014 Richard J. Lipton
2015 László Babai
2016 Noam Nisan
2017 Oded Goldreich
2018 Johan Håstad
2019 Avi Wigderson

I am now cheering for a number of potential future Knuth Prize winners. The reason is that Jeff Ullman and I have just become tied for the longest advisor-to-advisee chain in which every member has won the Prize. We both have a chain of length two: Ullman to Yannakakis is his, and mine is to Avi, who was my doctoral student at Princeton. The next prize winner could tip the scales toward one of us, or could create another chain of length two. We will see.

Lower Bounds and Randomness

We thought we would present some cool story about Avi. We have already given some good ones—the Chinese Remainder Theorem story is a favorite. Or the brilliant first theorem he proved while a graduate student: an approximation algorithm for coloring three-colorable {n}-vertex graphs with {O(\sqrt{n})} colors in {\mathsf{poly}(n)} time, when {O(n/\log n)} colors had been the best known in polynomial time. That was really beautiful. We could zig-zag through some of his great papers {\dots} and surveys. We could go on and on.
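
For readers who like to see such things concretely, here is a minimal Python sketch of the textbook two-phase form of that coloring algorithm. The code and its helper names are our own reconstruction for illustration, not taken from Avi’s paper; the driving fact is that in a 3-colorable graph the neighborhood of any vertex induces a bipartite subgraph, so it can be properly 2-colored with two fresh colors.

# Sketch of the O(sqrt(n))-coloring idea for 3-colorable graphs.
# Phase 1: while some vertex has >= sqrt(n) uncolored neighbors, 2-color its
#   (bipartite) neighborhood with two fresh colors and delete it.
# Phase 2: the leftover graph has max degree < sqrt(n), so a greedy pass
#   finishes with O(sqrt(n)) further colors.
import math
from collections import deque

def wigderson_color(adj):
    """adj: dict mapping each vertex to the set of its neighbors, assumed to
    describe a 3-colorable graph.  Returns a proper coloring vertex -> int."""
    n = len(adj)
    threshold = math.isqrt(n)
    remaining = {v: set(nbrs) for v, nbrs in adj.items()}
    color = {}
    next_color = 0

    def remove(v):
        for u in remaining[v]:
            remaining[u].discard(v)
        del remaining[v]

    # Phase 1: knock out high-degree neighborhoods, two fresh colors each.
    while True:
        high = next((v for v, nbrs in remaining.items()
                     if threshold > 0 and len(nbrs) >= threshold), None)
        if high is None:
            break
        neighborhood = set(remaining[high])
        c0, c1 = next_color, next_color + 1
        next_color += 2
        for start in neighborhood:           # BFS 2-coloring: the induced
            if start in color:               # subgraph on N(high) is bipartite
                continue                     # because the graph is 3-colorable
            color[start] = c0
            queue = deque([start])
            while queue:
                u = queue.popleft()
                for w in remaining[u] & neighborhood:
                    if w not in color:
                        color[w] = c1 if color[u] == c0 else c0
                        queue.append(w)
        for u in neighborhood:
            remove(u)

    # Phase 2: greedy coloring of the low-degree leftover graph, using only
    # colors that were never handed out in Phase 1.
    for v in remaining:
        used = {color[u] for u in adj[v] if u in color}
        c = next_color
        while c in used:
            c += 1
        color[v] = c
    return color

# Tiny usage check on a 5-cycle, which is 3-colorable:
cycle = {i: {(i - 1) % 5, (i + 1) % 5} for i in range(5)}
coloring = wigderson_color(cycle)
assert all(coloring[u] != coloring[v] for u in cycle for v in cycle[u])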

What Ken and I find most emblematic is that Avi has been at the forefront of lower bounds in complexity throughout the four decades since I knew him as a student. He and Noam Nisan delineated the “Hardness Versus Randomness” tradeoff at the end of the 1980s, pulling together work that began with Adi Shamir and Andy Yao at the beginning of that decade and was furthered by others including Russell Impagliazzo, with whom Avi later wrote two famous papers. The main theorem of the first of those papers connects upper and lower bounds in iconic fashion:

Theorem 1 If there are problems in exponential time having no {2^{o(n)}}-sized circuits, then {\mathsf{BPP = P}}.
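
To convey how a theorem like this gets used, here is a toy Python sketch of the generic “run the algorithm on every short seed and take a majority vote” pattern that hardness-versus-randomness results enable. To be clear: the generator below is a hash-based stand-in that we made up purely for illustration, not the Nisan–Wigderson or Impagliazzo–Wigderson construction, and all of the function names are ours. The point is only the shape of the deterministic simulation, whose running time is polynomial whenever the seed length is {O(\log n)}.

# Toy derandomization pattern: replace true random bits by the outputs of a
# generator over all short seeds, and take a majority vote of the answers.
import hashlib

def toy_generator(seed: int, num_bits: int) -> int:
    """Stand-in generator stretching a short seed into a num_bits-bit integer.
    A hardness-based generator would be built from a hard function instead."""
    stream = b""
    counter = 0
    while 8 * len(stream) < num_bits:
        stream += hashlib.sha256(f"{seed}:{counter}".encode()).digest()
        counter += 1
    return int.from_bytes(stream, "big") % (1 << num_bits)

def derandomize(randomized_test, instance, num_random_bits=32, seed_bits=8):
    """Majority vote of randomized_test(instance, bits) over all 2^seed_bits
    seeds.  With O(log n)-bit seeds this enumeration is polynomial."""
    total = 1 << seed_bits
    votes = sum(randomized_test(instance, toy_generator(s, num_random_bits))
                for s in range(total))
    return 2 * votes > total

# Toy randomized test in the spirit of polynomial identity testing: accept
# iff two polynomials agree at a point derived from the random bits.
def poly_eval(coeffs, r):                  # coefficients, highest degree first
    value = 0
    for c in coeffs:
        value = value * r + c
    return value

def identity_test(polys, random_bits):
    p, q = polys
    r = random_bits % 10_000 + 1           # evaluation point from the bits
    return int(poly_eval(p, r) == poly_eval(q, r))

# For these two tests the vote is unanimous either way, so any generator works.
print(derandomize(identity_test, ([1, 2, 1], [1, 2, 1])))   # equal polynomials: True
print(derandomize(identity_test, ([1, 2, 1], [1, 0, 1])))   # x^2+2x+1 vs x^2+1: False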

Just like the flip side of a coin, this work has also put Avi at the forefront of our understanding of randomness. That was the subject of an interview in 2013 with the Polish magazine Polityka titled, “How to Rule Randomness Mathematically.” Quoting him in it (Ken’s translation):

The concept of secrecy on the Internet almost makes no sense without random bits. If you use a procedure, one that I know, I can impersonate you in a perfect way. I can become your master. Secrets must be generated in a random way. Of course, we do not have to do it ourselves—our laptop or phone does the work for us.

The interview goes on to marvel at, but also warn about, how all of this still rests on properties of prime numbers and the putative hardness of factoring. It looks ahead to implementations of more advanced cryptography, whether or not “we will be forced to do it.” In all of this, Avi truly represents dedication to foundations. He backs that up with tireless advocacy and organizational work for people in our field—extending to the 2018 IAS workshop we covered last June.

A Symbolic Gift

As a symbol of our affection we dedicate to Avi a simple riff on lower bounds. We start with arguably the oldest lower bound, except that the vocabulary to realize it was lacking 2,000 years ago—and Ken and I weren’t conscious of it until recently. This is that {\pi(x)}—the number of primes {\leq x}—satisfies

\displaystyle  \log\log(x) \le \pi(x)

for {x \geq 2}, with logs to base 2. To show this, argue as in Euclid’s famous proof: Suppose that

\displaystyle  0 < p_{1} < p_{2} < \cdots < p_{n},

are the first {n} primes. Euclid’s argument shows that the next prime, {p_{n+1}}, must satisfy

\displaystyle  p_{n+1} \le 1 + p_{1}p_{2}\cdots p_{n}.

The property {p_n \leq 2^{2^{n-1}}} follows by induction: we have {p_1 = 2 = 2^{2^0}}, and assuming the bound for {p_1,\dots,p_n} gives

\displaystyle  \begin{array}{rcl}  p_{n+1} &\le& 1 + 2^{1}\cdot 2^{2}\cdot 2^{4} \cdots 2^{2^{n-1}}\\ &=& 1 + 2^{1 + 2 + 4 + \cdots + 2^{n-1}}\\ &\le& 1 + 2^{2^n - 1} \le 2^{2^n}. \end{array}

Now given any {x \geq 2}, put {n = \pi(x)}, so that {x < p_{n+1} \leq 2^{2^{n}}}. Taking base-2 logarithms twice gives {\log_2 x < 2^{n}} and then {\log_2\log_2 x < n = \pi(x)}. Thus {\pi(x) \geq \log_2\log_2 x} for all {x \geq 2}, as claimed.
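
For the playful, here is a minimal Python check, just an illustration and not part of the proof, that the doubly exponential bound on {p_n} and the {\log\log} lower bound on {\pi(x)} hold in a range a laptop can reach. The helper names are ours.

# Check (1) p_n <= 2^(2^(n-1)) for small n, and
#       (2) pi(x) >= log2(log2(x)) for 2 <= x <= LIMIT.
import math

def primes_up_to(limit):
    """Sieve of Eratosthenes: all primes <= limit."""
    sieve = [True] * (limit + 1)
    sieve[0:2] = [False, False]
    for p in range(2, math.isqrt(limit) + 1):
        if sieve[p]:
            sieve[p * p::p] = [False] * len(sieve[p * p::p])
    return [i for i, flag in enumerate(sieve) if flag]

LIMIT = 100_000
primes = primes_up_to(LIMIT)

# (1) Only small n: 2^(2^(n-1)) grows far too quickly to test much further.
for n, p in enumerate(primes[:20], start=1):
    assert p <= 2 ** (2 ** (n - 1)), (n, p)

# (2) Walk x upward, maintaining pi(x) as each prime is passed.
pi, idx = 0, 0
for x in range(2, LIMIT + 1):
    if idx < len(primes) and primes[idx] == x:
        pi, idx = pi + 1, idx + 1
    assert pi >= math.log2(math.log2(x)), x

print("Both checks passed up to", LIMIT)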


Now we note from here an exponentially better lower bound that is almost as easy to prove. Take any {x \geq 2} and again let {n = \pi(x)}, so that {p_1,\dots,p_n} are exactly the primes up to {x}. If {1 \leq y \leq x}, then it is possible to write

\displaystyle  y = s^2 p_1^{a_1} p_2^{a_2} \cdots p_n^{a_n}

with {s \leq \sqrt{x}} and each {a_i \in \{0,1\}}. This is because {s^2}, the largest square dividing {y}, is at most {y \leq x}, so {s \leq \sqrt{x}}, while {y/s^2} is squarefree and its prime factors are all at most {x}, hence among {p_1,\dots,p_n}. How many different numbers can we make this way? There are at most {2^n} choices for the exponents {a_1,\dots,a_n}, multiplied by at most {\sqrt{x}} choices for {s}. This means

\displaystyle  x \leq 2^n \sqrt{x},

so

\displaystyle  \sqrt{x} \leq 2^n = 2^{\pi(x)}.

Taking the base-2 log of both sides gives

\displaystyle  \pi(x) \geq \frac{1}{2}\log_2(x).
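
In the same spirit, a few lines of Python confirm the square-times-squarefree decomposition and the resulting bound for small {x}; again the helper names are ours and this is only an illustration.

# Check that every y <= X factors as y = s^2 * m with m squarefree and
# s <= sqrt(X), and that consequently pi(x) >= (1/2) * log2(x) up to X.
import math

def square_and_squarefree(y):
    """Return (s, m) with y == s*s*m and m squarefree."""
    s, m, d = 1, 1, 2
    while d * d <= y:
        exp = 0
        while y % d == 0:
            y //= d
            exp += 1
        s *= d ** (exp // 2)
        if exp % 2:
            m *= d
        d += 1
    return s, m * y    # any leftover factor of y is a prime appearing once

def is_prime(q):
    return q >= 2 and all(q % d for d in range(2, math.isqrt(q) + 1))

X = 2000
assert all(square_and_squarefree(y)[0] ** 2 <= X for y in range(1, X + 1))

pi = 0
for x in range(2, X + 1):
    pi += is_prime(x)
    assert pi >= 0.5 * math.log2(x), x

print("pi(x) >= (1/2)*log2(x) verified for 2 <= x <=", X)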

This bound holds for all {x \geq 2} and is much better, but it is still far from the truth that {\pi(x) = \Theta(x/\log x)}. The appearance of {\sqrt{x}} in the derivation, however, harks back to Avi’s graph-coloring result. So in that spirit let us ask:

What is the easiest way to prove {\pi(x) \geq C\sqrt{x}} for some constant {C}? Is there a really easy way?

Open Problems

We congratulate Avi once more. The same Polish interview notes at the end that the Gödel and Nevanlinna Prizes “did not do any evident damage to his personal charm,” nor does having been born in 1956 diminish his energy “to bound down stairs … several breakneck steps at a time.” We are sure the new prize will have no effect on either.

Comments

  1. April 5, 2019 9:10 am

    Of course, the Knuth Prize to Avi is incredibly well deserved!

    On your open question: There is an elementary lower bound of Omega(x/log x) that I have seen attributed to Chebyshev. It comes from looking at the factorization of N=(2n choose n).

    Each prime dividing N contributes to pi(2n).

    For every prime p that divides N, one can easily count the index of p in the prime factorization of N as follows:

    Consider the numerator of N first: The index of p in the prime factorization of (2n)! is [2n/p]+[2n/p^2] + … + [2n/p^k] where p^k is the largest power of p that is at most 2n.

    Now consider the denominator (n!)^2: There the index of p is
    2([n/p]+[n/p^2] + … + [n/p^k]) where the last term may be 0.

    Therefore the index of p in the prime factorization of N is:
    ([2n/p]-2[n/p])+([2n/p^2]-2[n/p^2]) + … +([2n/p^k]-2[n/p^k]).
    It is easy to see that each term is at most 1 so the index is at most k.

    Therefore the largest prime power dividing N is at most 2n.

    Since N is a product of at most pi(2n) such prime powers, each at most 2n, we get N at most (2n)^pi(2n). It follows that pi(2n) is at least log_(2n) N.

    Now – keeping things purely elementary – we have that N=(2n choose n) is the largest of the 2n+1 binomial coefficients (2n choose i) whose sum is 2^(2n). Hence
    N is at least 2^(2n)/(2n+1) which is larger than 2^(2n)/(2n)^2.

    Therefore pi(2n) is more than (2n)/log_2 (2n) – 2. []

    • April 6, 2019 6:58 pm

      Neat—thanks, Paul. There still might be room on the quality/ease-of-proof tradeoff to insert a point for {\sqrt{n}}.
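
For good measure, here is a quick numerical check of Paul’s Chebyshev-style bound {\pi(2n) \geq 2n/\log_2(2n) - 2} and of the two intermediate facts behind it. The code is ours and purely an illustration.

# Check pi(2n) >= 2n/log2(2n) - 2 for 1 <= n < 300, along with the two facts
# driving it: C(2n,n) >= 2^(2n)/(2n)^2 and (2n)^pi(2n) >= C(2n,n).
import math

def pi(x):
    """Prime counting by trial division; adequate for this small range."""
    return sum(all(q % d for d in range(2, math.isqrt(q) + 1))
               for q in range(2, x + 1))

for n in range(1, 300):
    central = math.comb(2 * n, n)
    count = pi(2 * n)
    assert central * (2 * n) ** 2 >= 2 ** (2 * n)   # C(2n,n) >= 4^n / (2n)^2
    assert (2 * n) ** count >= central              # each prime power in C(2n,n) is <= 2n
    assert count >= 2 * n / math.log2(2 * n) - 2, n

print("Chebyshev-style bound verified for n < 300")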

Trackbacks

  1. Shtetl-Optimized » Blog Archive » Congratulations!
