
Consistency and P=NP

May 11, 2020


Can we at least show that {P\neq NP} is consistent?

[ From personal page ]

Jan Krajicek is an expert on linking computational complexity and mathematical logic. He has authored four books in this area, including the often-cited text Bounded Arithmetic, Propositional Logic, and Complexity Theory. Moreover, he has studied the consistency of various theories and whether they can prove {\mathsf{P \neq NP}} and related questions.

Today I thought we would discuss consistency proofs in general.


Mathematics of COVID-19

May 1, 2020


It's not just {R_0}

[ Sir Francis Galton by Charles Wellington Furse ]

Francis Galton is a perfect example of a Victorian-era scientist. Galton, who was knighted in 1909, had many roles: statistician, sociologist, psychologist, anthropologist, tropical explorer, geographer, meteorologist, geneticist, and inventor. He coined the phrase “nature versus nurture” and more.

Today we trade in jokes for some mathematics of the virus.


Time For Some Jokes

April 26, 2020


Can we still smile?

[ Hardy and Littlewood]

John Littlewood lived through the 1918–1919 flu pandemic, yet he appears not to have remarked on it in print. Nor can we find mention of it by Godfrey Hardy in A Mathematician’s Apology—though Hardy did write about the ravages of WW I.

Today, Ken and I thought you might like some fun comments that are not about the current pandemic.

This is not to say we are ignoring it. We are all fighting the virus in one way or another. Our hearts go out to those of you fighting it directly. We are all worried about ourselves and others. We are stuck at home, at least most of us. We are all in this terrible time together. We hope you all are safe and well.

We thought we would list a few jokes and stories that you might enjoy. We wrote recently about one kind of mathematical joke that can be given various proportions of pure levity and mathematical content. Our friends Lance Fortnow and Bill Gasarch, plus commenters in their item, collected some jokes on the computer science side.

Littlewood’s notion of “mathematical joke” leaned more on mathematical content, though his memoir A Mathematician’s Miscellany includes many funny stories as well. At the end of his introduction to the book, he wrote:

A good mathematical joke is better, and better mathematics, than a dozen mediocre papers.

We will start at the levity end. This is almost a math joke:

The Daily News published a story saying that one-half of the MPs (Members of Parliament) were crooks.
The Government took great exception to that and demanded a retraction and an apology.
The newspaper responded the next day with an apology and reported that one-half of the MPs were not crooks.

We like this one, even if it is not really a hardcore math one. It does rely on the fact that {\frac{1}{2} + \frac{1}{2} = 1.}

Jokes and More

The following are some examples that we hope you all like. They are from a variety of sources:

  • Jokes that mathematicians think are funny.

  • Some are from StackExchange.

  • Others are from Andrej Cherkaev’s page.

We have lightly edited a few.

{\bullet } “My age is two billion years old,” said Paul Erdős. The point is:

When I was seventeen years old it was said the earth was two billion years old. Now they say it is four billion years old. So my age is about two billion years old.

{\bullet } There was a statistician who drowned crossing a river {\dots} It was 3 feet deep on average.

{\bullet } An infinite number of mathematicians walk into a bar. The first one orders a beer. The second orders half a beer. The third orders a third of a beer. The bartender bellows, “Get the heck out of here, are you trying to ruin me?”

{\bullet } A chemist, a physicist, and a mathematician are stranded on an island when a can of food rolls ashore. The chemist and the physicist come up with many ingenious ways to open the can. Then suddenly the mathematician gets a bright idea: “Assume we have a can opener {\dots}”

{\bullet } A theorist decides she wants to learn more about practical problems. She sees a seminar with the title: “The Theory of Gears.” So she goes. The speaker stands up and begins, “The theory of gears with a finite number of teeth is well known {\dots}”

{\bullet } The reason that every major university maintains a department of mathematics is that it is cheaper to do this than to institutionalize all those people.

Regarding the last one, Littlewood did after all write in his book:

Mathematics is a dangerous profession; an appreciable proportion of us go mad.

This appears to have been a playful swipe at Hardy’s decision to leave Cambridge for Oxford. It was couched in a discussion of events that would seem to have had tiny probabilities before they happened.

The last two we’ve picked out from the above sites verge into philosophy:

{\bullet } The cherry theorem: Question: What is a small, red, round thing that has a cherry pit inside?
Answer: A cherry.

{\bullet} René Descartes went into his favorite bar and the bartender asked, “would you like your usual drink tonight, Monsieur Descartes?” Descartes replied “I think not.” Then he promptly ceased to exist.

Wrong Derivations, Right Results

Littlewood’s standards for a “mathematical joke” were higher than ours, but we will start by adapting an example from this MathOverflow discussion of Littlewood-style jokes. Sometimes we can play a joke on ourselves by deriving a result we know is right but with an incorrect proof. Here is the example:

{\bullet} Casting out 6’s. Suppose we want to simplify the fraction {\frac{166}{664}}. We can use the rule of casting out 6’s to get

\displaystyle  \frac{166}{664} = \frac{16}{64} = \frac{1}{4}.

The rule works quite generally:

\displaystyle  \frac{1666}{6664} = \frac{16666}{66664} = \frac{166666}{666664} = \cdots

\displaystyle  \frac{26}{65} = \frac{266}{665} = \frac{2666}{6665} = \frac{26666}{66665} = \cdots

You can even turn the paper upside down and cast out the {6}‘s that you see then:

\displaystyle  \frac{19}{95} = \frac{199}{995} = \frac{1999}{9995} = \frac{19999}{99995} = \cdots

\displaystyle  \frac{49}{98} = \frac{499}{998} = \frac{4999}{9998} = \frac{49999}{99998} = \cdots

Note, this is a joke: The rule of course does not actually work all the time:

\displaystyle  \frac{56}{65} = \frac{5}{5} = 1.
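The pattern in the joke can be checked mechanically. Here is a small verification of our own, using exact rational arithmetic so that no floating-point rounding muddies the fun:

```python
from fractions import Fraction

# Each family from the joke really does collapse to a single value...
for k in range(1, 7):
    assert Fraction(int("1" + "6" * k), int("6" * k + "4")) == Fraction(1, 4)
    assert Fraction(int("2" + "6" * k), int("6" * k + "5")) == Fraction(2, 5)
    assert Fraction(int("1" + "9" * k), int("9" * k + "5")) == Fraction(1, 5)
    assert Fraction(int("4" + "9" * k), int("9" * k + "8")) == Fraction(1, 2)

# ...but "casting out" is not a valid rule in general:
assert Fraction(56, 65) != 1
print("all four families check out; 56/65 is the counterexample")
```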


We thought we would try to come up with our own examples, or at least blend in other sources. It once struck me (Ken), on reading a column by Martin Gardner on difference equations, that they give a “convincing proof” of {0^0 = 1}. Consider the powers of a natural number {k}, say {k = 5}. Take differences like so:

\displaystyle  \begin{array}{ccccccccccc} 1 & & 5 & & 25 & & 125 & & 625 & & \dots\\ & 4 & & 20 & & 100 & & 500 & & \dots\\ & & 16 & & 80 & & 400 & & \dots & & \\ & & & 64 & & 320 & & \dots & & &\\ & & & & 256 & & & & \\ & & & & & \ddots & & & \end{array}

The powers of {k-1} always appear on the bottom diagonal. Thus we have: {(k-1)^0 = 1}, {(k-1)}, {(k-1)^2}, and so on. Now do this for {k = 1}:

\displaystyle \begin{array}{ccccccccccc} 1 & & 1 & & 1 & & 1 & & 1 & & \dots\\ & 0 & & 0 & & 0 & & 0 & & \dots\\ & & 0 & & 0 & & 0 & & \dots & & \\ & & & 0 & & 0 & & \dots & & &\\ & & & & 0 & & & & \\ & & & & & \ddots & & & \end{array}

The diagonal now holds the powers of {0}. It thus follows that {0^0 = 1}.
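The difference-table construction is easy to replay in code. The following sketch (the helper name is ours) computes the leading diagonal of the table:

```python
def diff_diagonal(first_row):
    """Leading entries of successive difference rows (the table's diagonal)."""
    diag, row = [], list(first_row)
    while row:
        diag.append(row[0])
        # Next row: pairwise differences of consecutive entries.
        row = [b - a for a, b in zip(row, row[1:])]
    return diag

# Powers of k = 5: the diagonal holds the powers of k - 1 = 4.
print(diff_diagonal([5**i for i in range(5)]))  # → [1, 4, 16, 64, 256]
# Powers of k = 1: the diagonal "holds the powers of 0".
print(diff_diagonal([1**i for i in range(5)]))  # → [1, 0, 0, 0, 0]
```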

Open Problems

What are your favorite mathematical jokes? Please send them to us. Be safe. Be well.


Proof and Cake Envy

April 18, 2020


Our proofs can be big too

[Mackenzie and Aziz]

Haris Aziz and Simon Mackenzie are computer scientists at UNSW and CMU respectively. Of course, UNSW is the University of New South Wales and CMU is Carnegie Mellon University.

Today we will discuss cake cutting and more.

Aziz and Mackenzie have solved an open problem concerning how to cut cakes. Their paper is in the April issue of the CACM: “A Bounded and Envy-Free Cake Cutting Algorithm.”

Why Cake Cutting?

Before we talk about cake cutting from a theory viewpoint, let's take a look at why it is interesting. The real answer is probably that it is a beautiful math problem. It is easy to state without lots of background. It is simple like Fermat's Last Theorem:

\displaystyle  x^{n} + y^{n} = z^{n}

has no solutions over the integers with {xyz \neq 0} and {n > 2}. Cake cutting is hard. We like problems that strike back: problems that are not easy to solve. This is a strange view; in real life we might prefer problems that we can easily solve. But not in math. We like problems that are not trivial. The cake cutting problem is hard, so we like it.

Cutting Cakes

We are theorists so our cakes are one-dimensional line segments. The problem involves a finite set of agents, say Alice, Bob, and so on. They want to divide the cake, the line segment, into a finite number of pieces. The pieces are then allocated to the agents. The goal is to get a fair division of the cake.

The notion of “fair” is what makes the problem interesting. Often agents will not have the same tastes: Some like icing more than others, some like the end pieces, while others do not. The fact that the agents assign different values to a piece of the cake is what makes the problem challenging.

If there are two agents, the problem has long been solved. Let Bob divide the cake into two pieces so that he is happy to get either of these pieces. Then have Alice choose which piece she wants. It is easy to see that both Bob and Alice are happy. Both are envy-free: neither would exchange their piece for the other's piece.
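Divide-and-choose can be sketched in a few lines. This is our own illustration, not from the paper: we represent each agent's value measure as a function on subintervals of {[0,1]} and find Bob's halving cut by bisection (the names `bob_cut` and `divide_and_choose` are ours):

```python
def bob_cut(value, iters=60):
    """Bisect for x with value(0, x) equal to half of value(0, 1)."""
    lo, hi = 0.0, 1.0
    half = value(0.0, 1.0) / 2
    for _ in range(iters):
        mid = (lo + hi) / 2
        if value(0.0, mid) < half:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def divide_and_choose(bob_value, alice_value):
    x = bob_cut(bob_value)
    left, right = (0.0, x), (x, 1.0)
    # Alice takes the piece she values more; Bob is happy with either.
    if alice_value(*left) >= alice_value(*right):
        return {"Alice": left, "Bob": right}
    return {"Bob": left, "Alice": right}

# Bob values the cake uniformly; Alice weights the left half double
# (she likes the icing there, say).
bob = lambda a, b: b - a
alice = lambda a, b: 2 * (min(b, 0.5) - min(a, 0.5)) + (max(b, 0.5) - max(a, 0.5))
print(divide_and_choose(bob, alice))
```

With these measures Bob cuts at {x = 1/2} and Alice happily takes the left piece, which she values at twice the right one.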

There is a large literature on the cake-cutting problem. Its creator, Hugo Steinhaus, noted:

Interesting mathematical problems arise if we are to determine the minimal numbers of “cuts” necessary for fair division.

We have taken the quote from an article on Medium that neatly conveys details on various protocols. Some main results are:

  • The Selfridge-Conway discrete procedure produces an envy-free division for {3} people using at most {5} cuts.

  • The Brams-Taylor-Zwicker moving knives procedure produces an envy-free division for {4} people using at most {11} cuts.

  • Three different procedures produce an envy-free division for {n} people. All three require a finite but unbounded number of cuts. That is to say, the number of cuts may depend on details of the agents' preference functions.

  • The procedure by Aziz and Mackenzie finds an envy-free division for {n} people in a bounded number of cuts.

The last is the result in the CACM paper. Note, the number of cuts can be large:

\displaystyle  n^{n^{n^{n^{n^{n}}}}}.

Even for {n=2} this is immense, galactic. This should be compared to the best lower bound, which is of order {n^{2}}. This gap is even larger than the usual gaps we find in complexity theory. The P=NP question is only one exponential, not five.

This has started me thinking: what exactly is the relationship between this and proof complexity? The latter has well-established relationships to complexity-class questions. The link from proofs in various systems of bounded arithmetic goes through the heart of P=NP. See for instance these slides by Sam Buss and notes that were scribed by Ken and others. What I am puzzled by is that in most cases the blowup is only one or two exponentials. The setting with cake-cutting is different, but how different?

Easy Cases

The Aziz and Mackenzie algorithm takes a long time. It is a nontrivial result, but not one that applies in any practical case. It always takes way too long. The cake will be stale by the time the agents have agreed on their pieces.

This raises a question that also applies to many computational problems. Is there a way to cut a cake faster on some interesting examples? We can explain this by analogy to sorting. The fastest sorting algorithms run in {O(n \log n)} time where there are {n} objects. But what happens if the objects are already in sorted order? Or at least close to sorted order? The answer is: it depends:

  1. Some sorting algorithms always take the same time, independent of the input structure.

  2. There are other sorting algorithms that can take advantage of the nature of the input.
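The second kind can be illustrated with insertion sort, a standard example (the comparison counter below is our own addition): it does one comparison per element on sorted input but quadratic work on reversed input.

```python
def insertion_sort(a):
    """Sort a in place; return the number of comparisons performed."""
    comparisons = 0
    for i in range(1, len(a)):
        x, j = a[i], i - 1
        while j >= 0:
            comparisons += 1
            if a[j] <= x:
                break
            a[j + 1] = a[j]   # shift larger elements right
            j -= 1
        a[j + 1] = x
    return comparisons

print(insertion_sort(list(range(1000))))        # → 999 (already sorted)
print(insertion_sort(list(range(1000))[::-1]))  # → 499500 (reversed)
```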

That is, some sorting algorithms can run, say, in linear time if the input is almost sorted. For the cake cutting problem we ask:

Is there a way to cut cakes that is envy-free when the agents have some property {P}?

We do not know the answer, but we think it is an interesting question. Here is an example. Suppose that the agents have the same measures. That is, they evaluate every piece of cake in the same way.

If we know this—and if we continue our supposition above that Bob can cut with exact precision—then there is an easy answer: Have Bob do the cuts. Then all agents will be equally happy since they have the same measures. The question is, what if we do not know? I believe there should be some theorem like this:

Theorem 1 (Conjecture) There is an algorithm that operates in {O(n^{2})} time so that either:

  1. It yields an envy-free solution, or

  2. It determines that some agents have different measures.

In the second case the cake will be cut as before.

Open Problems

I originally planned on discussing size and complexity of proofs. This is driven by the complexity of the cake cutting algorithms. They tend to have lots of cases and are difficult to understand.

They are also difficult to find—this is why cake-cutting questions have been resistant to progress. More on this in the future.


John Horton Conway 1937–2020

April 14, 2020


An appreciation

Names for large numbers source

John Horton Conway just passed away from complications of COVID-19. We are all saddened by this news, and we hope you all are doing your best to stay safe and help others cope.

Today Ken and I thought we would reflect on some of Conway’s many contributions and emphasize three in which we see connections to computational complexity.

Conway was a Fellow of the Royal Society, and was the first recipient of the London Mathematical Society’s Pólya Prize. His nomination to the Royal Society reads:

A versatile mathematician who combines a deep combinatorial insight with algebraic virtuosity, particularly in the construction and manipulation of “off-beat” algebraic structures which illuminate a wide variety of problems in completely unexpected ways. He has made distinguished contributions to the theory of finite groups, to the theory of knots, to mathematical logic (both set theory and automata theory) and to the theory of games (as also to its practice).

A Life Force

Conway may be most noted for his game of Life. This is a two-dimensional cellular automaton. Conway invented it in 1970, which he rounded up from 1969. The game—and Martin Gardner’s 1970 column on it in Scientific American—made him famous in the wider community. The website conwaylife.com and several others link to more information than we could digest in a lifetime.

We want to emphasize instead how Conway was a special force in mathematics. He applied an almost elementary approach to deep hard problems of mathematics. This is a unique combination. There have been mathematicians who worked on deep problems and also on recreational math, but few who established integral flows across the boundary between them. Conway infused both with magic in a way conveyed by an iconic photograph of his Princeton office in 1993:

Guardian via Dith Pran, NY Times source

What Ken remembers is how accessible Conway was outside his office. “I know I met him at least once while I was an undergraduate at Princeton in 1979 or 1980, though this is overlaid by a memory of finding just him and a few others in the Fine Hall tea room when I was there for my tenth reunion in 1991. My most evocative memory is when Conway gave an evening talk to the undergraduate mathematics club at Oxford when I was there sometime after 1981. It was relatively sparsely attended, perhaps because it was literally a dark and stormy winter night. But after his lecture we all got to huddle around him for another hour in the tea room as he regaled us with stories and mathematical problems.”

We also remember that Conway was one of Andrew Wiles’s main confidants during the months before Wiles announced his proof of Fermat’s Last Theorem in June 1993. Here is a transcript of a PBS Nova documentary on the proof in which Conway appears prominently. Ken has picked out two of Conway’s other contributions that we feel may have untapped use for research in complexity theory.

Conway’s Numbers

One of this blog’s “invariants” is first-name last-name style, thus “Godfrey Hardy” not “G.H. Hardy.” But we make an exception in Conway’s case. Partly this owes to how his initials were amplified by Donald Knuth in his novella Surreal Numbers:

In the beginning, everything was void, and J.H.W.H. Conway began to create numbers.

Besides the void (that is, {\emptyset}), the creation uses the idea of a left set {L} and a right set {R}. Every number has the form {\langle L~|~R \rangle}. The initial number is

\displaystyle  \langle \emptyset ~|~ \emptyset\rangle = 0.

Once a number is generated, it can be in the {L} or {R} of other numbers. Thus, next come

\displaystyle  \begin{array}{rcl}  \langle 0 ~|~ \emptyset \rangle &=& 1\\ \langle \emptyset ~|~ 0 \rangle &=& -1. \end{array}

You might think of {\langle 0 ~|~ 0 \rangle} next, but it violates the invariant

\displaystyle  (\forall \ell \in L)(\forall r \in R)\neg (r \leq \ell),

which defines an {\langle L~|~R \rangle} form to be a number.

The relation {\leq} is inductively defined for {a = \langle L_a ~|~ R_a \rangle} and {b = \langle L_b ~|~ R_b \rangle} by

\displaystyle  a \leq b \quad\equiv\quad (\forall \ell_a \in L_a)\neg(b \leq \ell_a) \;\wedge\; (\forall r_b \in R_b)\neg(r_b \leq a).

That is, no member of the left-set of {a} “bumps” {b} (in the sense of rowing races) and {a} does not bump any member of the right-set of {b}. Note that {R_a} and {L_b} are not involved—they already behave correctly owing to the invariant. The numbers {a,b} are equal if {a \leq b} and {b \leq a} both hold. The rule for addition is

\displaystyle  a + b = \langle (L_a \boxplus b) \cup (a \boxplus L_b) ~|~ (a \boxplus R_b) \cup (R_a \boxplus b) \rangle,

where {L_a \boxplus b = \{\ell_a + b: \ell_a \in L_a\} = b \boxplus L_a} and so on. The logical rule {\emptyset \boxplus a = \emptyset} for any {a} makes the definition of addition well-founded. This yields the numerical fact

\displaystyle  0 + 0 = \langle (\emptyset \boxplus 0) \cup (0 \boxplus \emptyset) ~|~ (\emptyset \boxplus 0) \cup (0 \boxplus \emptyset) \rangle = \langle\emptyset ~|~ \emptyset\rangle = 0.

It is immediate that {+} is commutative. There is also a rule for multiplication but addition gives us enough to talk about here.
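For the curious, the rules above translate almost line-for-line into code. Here is a minimal, unoptimized sketch of forms, the order relation, and addition; the class and method names are ours, not Conway's:

```python
# A form <L | R> as a pair of tuples of previously created forms.
class S:
    def __init__(self, L=(), R=()):
        self.L, self.R = tuple(L), tuple(R)

    def __le__(a, b):
        # a <= b: no member of a's left set "bumps" b, and a bumps
        # no member of b's right set.
        return (not any(b <= l for l in a.L) and
                not any(r <= a for r in b.R))

    def eq(a, b):
        # Equality as numbers: a <= b and b <= a both hold.
        return a <= b and b <= a

    def __add__(a, b):
        # Conway's addition rule; an empty set contributes nothing.
        return S([l + b for l in a.L] + [a + l for l in b.L],
                 [a + r for r in b.R] + [r + b for r in a.R])

zero    = S()              # < empty | empty >
one     = S(L=[zero])      # < 0 | empty >
neg_one = S(R=[zero])      # < empty | 0 >

assert (zero + zero).eq(zero)    # 0 + 0 = 0
assert (neg_one + one).eq(zero)  # <-1 | 1> equals 0 as a number
two = one + one                  # a form of < 1 | empty >
assert one <= two and not (two <= one)
```

Note that `eq` tests equality as numbers, while distinct objects may be distinct as forms; that distinction is exactly the one the post draws next.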

Redundancy and Simplicity

It is straightforward to compute that {0 + 1 = 1} and {-1 + 0 = -1}. Now consider:

\displaystyle  -1 + 1 = \langle (\emptyset \boxplus 1) \cup (-1 \boxplus \{0\}) ~|~ (-1 \boxplus \emptyset ) \cup (\{0\} \boxplus 1)\rangle = \langle -1 ~|~ 1\rangle.

This is a legal number. You can check that the relations {\langle -1 ~|~ 1\rangle \leq 0} and {0 \leq \langle -1 ~|~ 1\rangle} both hold. Thus—as a number rather than a “form”—the number {\langle -1 ~|~ 1\rangle} equals {0}.

That seems to make sense since {0} is the average of {-1} and {1}, but now compute {2 = 1 + 1} as a formal Conway number and consider {c = \langle -1 ~|~ 2\rangle}. This also satisfies the relations {c \leq 0} and {0 \leq c}, so {c} must likewise equal {0}. Thus {\langle L ~|~ R \rangle} is not some kind of numerical interpolation between {L} and {R}. The interpretation that grabbed my imagination as a teenager in 1976 is that:

{\langle L ~|~ R \rangle} equals the simplest number that is between {L} and {R}.

This is especially evocative in cases like {\langle 1 ~|~ \emptyset \rangle}, which is what one gets by computing {1 + 1}. In general, {m+1} is the simplest number between {m} and {\emptyset}. Conway made this a theorem by giving each number a set-theoretic ordinal for its “time of generation” and proved that {\langle L ~|~ R \rangle} always equals a (the) least-ordinal number {c} such that {\ell \leq c \leq r} for every {\ell \in L} and {r \in R}.

Conway’s rules allow {L} and {R} to be infinite sets—any sets of numbers built by the rules of set theory. Then not only do all real numbers emerge at ordinal times, so do infinitesimals and further richness of structure. We should remember that Conway began as a set theorist with a dissertation under Harold Davenport titled Homogeneous ordered sets. All Conway numbers with finite creation times are dyadic rational numbers, which may seem trivial from the standpoint of set theory, but those are akin to binary strings.

What became magic was how Conway’s rules characterize games. Through games we can also interpret forms like {\langle 0 ~|~ 0 \rangle} that are not numbers. I did not know about complexity when I purchased Conway’s book On Numbers and Games around 1980, let alone the connections between games and complexity. The book has a lot of depth that might be useful to complexity theory. To quote Peter Sarnak, per this article by Conway’s biographer Siobhan Roberts on Conway’s meeting with Kurt Gödel:

The surreal numbers will be applied. It’s just a question of how and when.

Modular Programming

Most of us know that the conditional-jump instruction


if (x == 0) goto k


where k is the label of another instruction, creates a universal programming language when added to the usual programming primitives of assignment, sequencing, and simple arithmetic. Conway was a maven of the “modular-jump”:


if (x == 0 mod m) goto k.


In complexity theory we know that mod-{m} gates having 0-1 inputs define the idea of {\mathsf{ACC}} circuits, with {\mathsf{ACC}^0} denoting problems solved by families of these circuits having fixed depth and polynomial size. If we don’t insist on fixed depth and unary inputs, we get modular programs. They are more complex than {\mathsf{ACC}^0} circuits, but we can learn from what can be done concretely with them.

Conway created a particular form of modular programs in a language he called FRACTRAN. A program is just a list of positive fractions {\frac{a_r}{b_r}} in lowest terms. The input is an integer {n} held in a separate register. Each fraction represents the code line

\displaystyle  \text{if } (n*a_r \equiv 0 \pmod{b_r}) \{ n = n\frac{a_r}{b_r}; \text{goto start} \};

In other words, each iteration takes the first fraction {f} such that {nf} is an integer and updates {n} to {nf}; if there is no such fraction then the program exits and outputs {n}.
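This iteration rule is only a few lines of code. Here is a sketch of an interpreter (the function name is ours); the toy program {[\frac{3}{2}]} is a classic example, not from this post, which trades each factor of {2} for a factor of {3} and so turns {2^a} into {3^a}:

```python
from fractions import Fraction

def fractran(program, n, max_steps=100_000):
    """Run a FRACTRAN program: repeatedly replace n by n*f for the
    first fraction f such that n*f is an integer; when no fraction
    applies, halt and output n."""
    for _ in range(max_steps):
        for f in program:
            m = n * f
            if m.denominator == 1:
                n = int(m)
                break
        else:
            return n  # no fraction applies: halt with output n
    raise RuntimeError("step limit exceeded")

print(fractran([Fraction(3, 2)], 2**5))  # → 243, i.e., 3^5
```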

For example, the following FRACTRAN program given in Wikipedia’s article implicitly computes integer division:

\displaystyle  \left[\frac{91}{66},~\frac{11}{13},~\frac{1}{33},~\frac{85}{11},~\frac{57}{119},~\frac{17}{19},~\frac{11}{17},~\frac{1}{3}\right].

The notation is unary: The input {x} has the form {2^n 3^d 11} and the output is {5^q 7^r} where {n = qd + r} with remainder {r < d}. This already hints at the fact that FRACTRAN is a universal programming language. Powers of primes serve as memory registers. The following program computes the Hamming weight {k} of the binary expansion of a natural number {x} encoded as {2^x}, returning the value {13^k}:

\displaystyle  \left[\frac{33}{20},~\frac{5}{11},~\frac{13}{10},~\frac{1}{5},~\frac{2}{3},~\frac{10}{7},~\frac{7}{2}\right].

This might help bridge to our notions of {\mathsf{ACC}}. The Wikipedia article does a good job of de-mystifying the fractions in terms of their actions on the prime-power registers under the unary-style encoding. We wonder what happens when we try to work directly with binary encodings.

The Collatz Example

The famous “{3n+1}” problem of Lothar Collatz is a case in point. It iterates the function

\displaystyle  T(n) = \begin{cases} \frac{3n+1}{2} & \text{if } n \text{ is odd} \\ \frac{n}{2} & \text{if } n \text{ is even} \end{cases}

The following FRACTRAN program given by Kenneth Monks iterates {T(n)} under the unary encoding {2^n}:

\displaystyle  \left[\frac{1}{11},~\frac{136}{15},~\frac{5}{17},~\frac{4}{5},~\frac{26}{21},~\frac{7}{13},~\frac{1}{7},~\frac{33}{4},~\frac{5}{2},~\frac{7}{1}\right].

Note that since the last fraction is an integer the program runs forever. If {n = 1} so that the input is {2}, it would go {2 \rightarrow 5 \rightarrow 4 \rightarrow 33 \rightarrow 3 \rightarrow 21 \rightarrow 26 \rightarrow 14 \rightarrow 2 \cdots} and thus cycle, unless we stop it. The powers of {2} that appear in its output give the desired sequence.

More natural to us, however, is the following modular program—which can use binary or any notation:


start: if (n == 1) { halt; }
if (n == 0 mod 2) { goto div; }
n = 3*n + 1;
div: n = n/2;
goto start;
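The same iteration reads naturally in Python (a direct translation of the modular program; the function names are ours):

```python
def T(n):
    """One step of the Collatz iteration defined above."""
    return (3 * n + 1) // 2 if n % 2 else n // 2

def orbit(n):
    """Iterate T until reaching 1 (which, conjecturally, always happens)."""
    seq = [n]
    while n != 1:
        n = T(n)
        seq.append(n)
    return seq

print(orbit(7))  # → [7, 11, 17, 26, 13, 20, 10, 5, 8, 4, 2, 1]
```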


One can generalize the Collatz problem to moduli {m > 2}. For each {k < m} we have a linear transformation {n \mapsto c_k n + d_k} that always gives an integer value when {n \equiv k \pmod{m}}. We want to know about the orbits of numbers {n} under this iteration.

In fact, this is exactly what FRACTRAN does. Take {m} to be the least common multiple of the denominators {b_r} in a FRACTRAN program {[\frac{a_r}{b_r}]}. Then for each {r} we can list the remainders {k} that are multiples of {b_r} and we get {c_k = \frac{a_r}{\gcd(k,m)}}, with {d_k = 0}. The Turing-universality of FRACTRAN then proves a general theorem Conway stated in 1972:

Theorem 1 Generalized Collatz-type problems for moduli {m > 2} are undecidable.

Several followup papers have proved stronger and more particular forms of the undecidability. The paper by Monks linked above leverages the unary encoding to show that having {d_k = 0} is essentially without loss of generality for universality; it is titled “{3x+1} Minus the {+}.”

Having digested universality, it is natural to wonder about complexity. Can we use modular programming to achieve stronger connections between number theory and complexity classes—classes above the level of {\mathsf{ACC}^0}, say? One possible mode of connection is exemplified by this paper from STACS 1994, which both Dick and I attended. We wonder whether the kind of connection noted by Terry Tao in his tribute to Conway can also smooth the way to understanding {\mathsf{MIP^* = RE}}.

Open Problems

Conway posed many open problems himself. Here is a list of five for which he posted cash rewards in the manner of Paul Erdős. The fifth was recently solved. The fourth can be stated in one sentence:

If a set of points in the plane intersects every convex region of area 1, then must it have pairs of points at arbitrarily small distances?

Our condolences go out to his family and all who were enthralled by him in the mathematical world. We could talk endlessly about his impact on mathematics education—even about simple things like how to prove that {\sqrt{2}} is irrational—or try to tangle with his applications of the “Monster” group to modular forms, but those must be for another time. Also see Scott Aaronson’s tribute and its comments section for many more stories and items.



Nina Balcan Wins

April 10, 2020


Congrats and More

[ CMU ]

Nina Balcan is a leading researcher in the theory of machine learning. Nina is at Carnegie Mellon and was previously at Georgia Tech—it was a major loss to have her leave Tech.

Today we applaud her winning the ACM Hopper Award.

Not As Easy As ABC

April 5, 2020

Is the claimed proof of the ABC conjecture correct?

[ Photo courtesy of Kyodo University ]

Shinichi Mochizuki is about to have his proof of the ABC conjecture published in a journal. The proof needs more than a ream of paper—that is, it is over 500 pages long.

Today I thought we would discuss his claimed proof of this famous conjecture.

The decision to publish is also discussed in an article in Nature. Some of the discussion we have seen elsewhere has been about personal factors. We will just comment briefly on the problem, the proof, and how to tell if a proof has problems.

The Problem

Number theory is hard because addition and multiplication do not play well together. Adding numbers is not too complex by itself; multiplication by itself is also not too hard. For those into formal logic, the theory of addition alone, for example, is decidable. So in principle there is no hard problem that uses only addition. None. A similar point holds for multiplication.

But together, addition and multiplication are hard. Of course, Kurt Gödel proved that the formal theory of arithmetic is hard. It is not complete, for example. There must be statements about addition and multiplication that are unprovable in Peano Arithmetic.

The ABC conjecture states a property that is between addition and multiplication. Suppose that

\displaystyle  A + B = C,

for some integers {1 \le A \le B \le C}. Then

\displaystyle  C \le ABC

is trivial. The ABC conjecture says that one can do better and get

\displaystyle  C \le F(ABC),

for a function {F(X)} that is sometimes much smaller than {X}. The function {F(X)} depends not on the size of {X} but on the multiplicative structure of {X}. That is, the function depends on the multiplicative structure of the integers. Note, the bound

\displaystyle  C \le ABC

only needed that {A,B,C} were numbers larger than {1}. The stronger bound

\displaystyle  C \le F(ABC),

relies essentially on the finer structure of the integers.

Roughly, {F(X)} operates as follows: Compute all the primes {p} that divide {X}. Let {Q} be the product of all these primes. Then {F(X) \le Q^{2}} works:

\displaystyle  C \le Q^{2}.

The key point is: Even if {p^{100}}, for example, divides {X}, we only include {p} once in the product {Q}. This is where all the savings come from. This is why the ABC conjecture is hard: repeated factors are thrown away.

Well, not exactly: there is a constant missing here. The bound is

\displaystyle  C \le \alpha Q^{2}

where {\alpha>0} is a universal constant. We can replace {Q^{2}} by a smaller number—the precise statement can be found here. This is the ABC conjecture.

The point here is that in many cases {F(ABC)} is vastly smaller than {ABC}, and so the inequality

\displaystyle  C \le F(ABC),

is much better than the obvious one of

\displaystyle  C \le ABC.

For example, suppose that one wishes to know if

\displaystyle  5^{z} = 2^{x} + 3^{y},

is possible. The ABC conjecture shows that this cannot happen for {z} large enough. Note

\displaystyle  F(2^{x} 3^{y} 5^{z}) = 30

for positive integers {x,y,z}.
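The quantity {Q}, the product of the distinct primes dividing {X} (its radical), is easy to compute by trial division. A small sketch of our own (the function name is ours):

```python
def radical(x):
    """Product of the distinct primes dividing x (the Q above);
    repeated prime factors are thrown away."""
    q, p = 1, 2
    while p * p <= x:
        if x % p == 0:
            q *= p
            while x % p == 0:  # strip all copies of this prime
                x //= p
        p += 1
    if x > 1:  # whatever remains is itself a prime
        q *= x
    return q

print(radical(2**10 * 3**5))     # → 6: the exponents do not matter
print(radical(2**4 * 3**2 * 5))  # → 30, as in the 2^x 3^y 5^z example
```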

Is He Correct?

Eight years ago Mochizuki announced his proof. Now it is about to be published in a journal. He is famous for earlier work in number theory: he solved a major open problem there years ago. This gave him instant credibility, and so his claim of solving the ABC conjecture was taken seriously.

For example, one of his papers is The Absolute Anabelian Geometry of Canonical Curves. The paper says:

How much information about the isomorphism class of the variety {X} is contained in the knowledge of the étale fundamental group?

A glance at this paper shows that it is for specialists only. But it does seem to be math of the type that we see all the time. And indeed the proof in that paper has long been believed to be correct. This is in sharp contrast to his proof of the ABC conjecture.

Indicators of Correctness

The question is: Are there ways to detect if a proof is (in)correct? Especially long proofs? Are there ways that rise above just checking the proof line by line? By the way:

The length of unusually long proofs has increased with time. As a rough rule of thumb, 100 pages in 1900, or 200 pages in 1950, or 500 pages in 2000 is unusually long for a proof.

There are some ways to gain confidence. Here are some that, in my opinion, are useful.

  1. Is the proof understood by the experts?

  2. Has the proof been generalized?

  3. Have new proofs been found?

  4. Does the proof have a clear roadmap?

The answer to the first question (1) seems to be no for the ABC proof. At least two world experts have raised concerns—see this article in Quanta—that appear serious. Nor has the proof been generalized, which is an important milestone for any proof. Andrew Wiles's famous proof that the Fermat equation

\displaystyle  x^{p} + y^{p} = z^{p},

has no solutions in integers with {xyz \neq 0} for {p \ge 3} a prime has been extended. This certainly adds confidence to our belief that it is correct.

Important problems eventually get other proofs. This can take some time, but new and different proofs are almost always found. Probably it is way too early for the ABC proof, but we can hope. Finally, the roadmap issue: does the argument have a clear logical flow? Proofs, even long proofs, often have a logical flow that is not too complex. A proof that says "Suppose there is an object {X} with this property. Then it follows that there must be an object {Y} so that {\dots}" is more believable than one with a much more convoluted logical flow.

Open Problems

Ivan Fesenko of Nottingham has written an essay about the proof and the decision to publish. Among the factors he notes is "the potential lack of mathematical infrastructure and language to communicate novel concepts and methods"—noting the steep learning curve of trying to grasp the language and framework in which Mochizuki has set his proof. Will the decision to publish change the dynamics of this effort?

[Fixed typo]

Research at Home

April 1, 2020


An idea for human-interest interviews

Pixabay free src

Dr. Lofa Polir is, like many of us, working from home. When we last wrote about her two years ago, she had started work for the Livingston, Louisiana branch of LIGO. They sent her and the rest of the staff home on March 19 and suspended observations on the 26th. Since Polir’s duties already included public outreach, she is looking to continue that online.

Today we helped Dr. Polir interview another pandemic-affected researcher.

We liked her idea of interviewing young people just starting their careers, who are facing unexpected uncertainties. Her first choice was a new graduate of Cambridge University doing fundamental work related to LIGO. Unfortunately, he had been unable to install a current version of Zoom on his handheld device, or perhaps he was wary owing to security issues. So she requested the special equipment we have used to interview people in the past.

He replied at the speed of light that he was willing to do the interview so long as we respected some privacy measures. As for what name to use, he said we could just call him Izzy—Izzy Jr., in fact. So Dick, I, and Dr. Polir all used our own Zoom to port into our machine’s console room. The connection worked right away as Izzy’s head glimmered into view.

Starting The Interview

At first glimpse, all we could see was his long, light-brown hippie hair. This really surprised us—not the image we had of Cambridge—and we gasped about it before even saying hello. He replied that it was the fashion from the Sixties. We asked how his family was doing and he said his fathers had passed on but his mother and young siblings were at home and fine. We think he said "fathers" plural—the machine rendered him in a drawl like Mick Jagger and he was hard to follow.

Izzy picked up on our discomfort and immediately assured us he hadn’t been doing any drugs: “You can’t get them anyway because they’re all being diverted to treat the sick.” But he did open up to us that he was in some kind of withdrawal. He confessed that he had resorted to looking at the sun with one eye. “It was ecstasy but bad—I still can see only reds and blues with that eye, and I need to use an extra-large rectangular cursor to read text.” We were curious what brand of handheld device he was using because of his problems with Zoom, and he told us it was a Napier 1660 by Oughtred, Ltd. We hadn’t heard of that model but he said he’d connected three of them into a good home lab setup.

We asked how he was coping with distance teaching, but he said he hadn’t yet started his faculty position at Trinity College. We were surprised to learn that lecture attendance at Cambridge University is optional. “I shall be required to give the lectures but nobody will come to them so that’s all the same now—at least here I’ll have a cat for audience. No dogs and not my mother or siblings—I’d sooner burn the house down.” He quickly added, “Oh, my mother and I get along fine now and I love playing teatime with my little sisters.”

We really didn’t want to go into Izzy’s personal life, and I tried to shift the small-talk by noting a little chess set on a shelf behind him. He snapped that he shouldn’t have spent money on it and he was a poor player anyway. We thought, wow, either this guy’s really down on himself or the cabin fever of the pandemic is getting to him. So Dick, always quick to pick up on things and find ways of encouragement, said:

“Dr. Polir here works on gravity and we’re told you have some great new ideas about it. We’d love to hear them.”

“Yes, I do—or did. But something happened yesterday that is making me realize that it’s all wrong, rubbish really…”

In the Garden

Izzy started by explaining that it's a basic principle of alchemy that all objects have humors that can manifest as kinds of magnetism. ("Alchemy"? Did we hear him right?) If you realize that the Earth and Sun are objects just like any other, then you can model gravity that way. You just need to assign each object a number called its "mass" and then you get the equation

\displaystyle  F = G \frac{m_1 m_2}{r^2}

for the force of attraction, where {r} is the distance between the objects with masses {m_1,m_2} and {G} is a constant that depends on your units.

“We understand all that,” said Dr. Polir.

Izzy said the point is that {G} depends only on your units and is the same regardless of where you are on Earth or on the Moon or wherever. It is very small, though. Then he went into his story of yesterday.
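Izzy's point that {G} is the same everywhere can be made concrete with a small numeric sketch. This is our illustration, not part of the interview; the SI constants and the Earth-Sun figures below are standard textbook values we have plugged in ourselves.

```python
# Newton's law of universal gravitation: F = G * m1 * m2 / r^2.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2 (standard SI value)

def gravitational_force(m1, m2, r):
    """Attractive force in newtons between masses m1, m2 (kg) at distance r (m)."""
    return G * m1 * m2 / r**2

M_SUN   = 1.989e30   # kg
M_EARTH = 5.972e24   # kg
AU      = 1.496e11   # mean Earth-Sun distance, m

f = gravitational_force(M_SUN, M_EARTH, AU)
print(f"{f:.3e} N")  # on the order of 3.5e22 newtons
```

The same function with the same {G} works for the Moon and the Earth, or any other pair; only the masses and the distance change, which is exactly the universality Izzy is describing.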

“I was in our garden by the path to the neighbor’s farm. I was supposed to be watching my little brother Benjamin who wanted to help harvest squash but I hate farming so I let him go without me. I was lying under an apple tree for shade when an apple fell and I realized all my mistakes.”

"What?" we thought silently. We didn't need to speak up—Izzy launched right into his litany of error:

“First, I’d thought the force was in what made the apple fall, but that’s nonsense. The apple would fall naturally because down is the shortest path it would be on if the tree branch were not holding it back. The only force is the tensile strength of the branch which was restraining it. I think that the tensile force really is magnetism, by the way.”

“Second, it’s ridiculous to think the force is coming from the Earth. On first principles, it could come from the ground, but that’s not what the equations say. They could have it all coming from one point in the center of the Earth. Just one point—four thousand miles deep!”

“Third and worst, though, is when you apply it to the Sun and the Earth. My equation means they are exerting force on each other instantaneously. But they are millions of miles apart. Whereas, the tree was touching the apple. Force can work only by touch, not by some kind of spooky action at a distance.”

We realized what he was driving at. Dick again always likes to encourage, so he said:

“But the math you developed for this force theory—surely it is good for calculations…?”

“No it’s not—it’s the Devil’s own box. I can calculate two bodies—the Earth and the Sun, or the Moon and the Earth if you suppose there is no Sun, but as soon as you have all three bodies it’s a bog. Worst of all, I can arrange five bodies so that one of them gets accelerated to infinite velocity—in finite time. This is a clear impossibility, a contradiction, so by modus tollens… it can all go in the bin.”

We didn't think it would help to tell him that his math was good enough to calculate a Moon landing but not to locate a friend's house while driving. He supplied his own coup de grâce anyway:

“And even the two-body calculations are tainted. I can calculate the orbits of the planets but the equations I get aren’t stable. I would wind up having to postulate something like God keeps the planets on their tracks. Yes, you need an intelligent Agent to start the planets going—all in one plane, basically—but to need such intervention all the time defeats the point of having equations.”

Inklings

We asked Izzy what he was going to do. He said that the one blessing of enforced solitude is that one gets time to reflect on things and deepen the foundations. And he said he’d had an idea later that afternoon.

“Toward supper I realized I needed to get Benjamin home. The path to the farm is straight except it goes over a mound. I was sauntering along and when I got to the hill I realized that if I didn’t watch it I’d have fallen right into it. So that got me thinking. First, what I thought was straight on the path was really a curve—the Earth is after all a ball. We think space is straight, but maybe it too is curved. So when I’m standing here, perhaps I would really be moving in a diagonally down direction, but the Earth is stopping me. The Irish blessing says, ‘may the road rise up to meet you.’ Perhaps it does.”

“So are you doing math to work that out?,” I ventured.

“I started after supper. One good thing is that it allows light to be affected by gravity—which I was already convinced of—even if light has no mass. But a problem is that it appears Time would have to be included as curved. That does not make sense either.”

We asked when he might write up all this. He said he didn't want to be quick to publish something so flawed on the one hand, or incomplete on the other, "unless someone else be about to publish the same." We noted that there weren't going to be any in-person conferences to present papers at for a while anyway.

“Besides, that’s not what I’m most eager to do. What the respite is really giving me time for is to start writing up my work on Theology. That’s most important—it could have stopped thirty years of war. For one thing, homoiousios, not homoousios, is the right rendering. There will be a time and times and the dividing of times in under 400 years anyway.”

That last statement somehow did not reassure us. We thanked Izzy Jr. for the interview and he gave consent to publish it posthumously.

Open Problems

We hope that your April Fool’s Day is such as to allow a time to laugh. But also seriously, would you be interested in the idea of our interviewing people during these times? Is there anyone you would like to suggest?

Logic and Star-Free, Part Deux

March 29, 2020


A visual proof with no abstract-algebra overhead

Composite crop of src1, src2

Dominique Perrin and Jean-Éric Pin are French mathematicians who have done significant work in automata theory. Their 1986 paper “First-Order Logic and Star-Free Sets” gave a new proof that first-order logic plus the {<} relation characterizes star-free regular sets.

Today we present their proof in a new visual way, using “stacked words” rather than their “marked words.” We also sidestep algebra by appealing to the familiar theorem that every regular language has a unique minimal deterministic finite automaton (DFA).
Read more…

Star-Free Regular Languages and Logic

March 21, 2020


Part 1 of a two-part series

Daniel Winton is a graduate student in mathematics at Buffalo. He did an independent study with me last semester on descriptive complexity.

Today we begin a two-part series written by Daniel on a foundational result in this area involving first-order logic and star-free languages.

Read more…