# The ABC Conjecture And Cryptography

*A missed point in the history of cryptography*

Shinichi Mochizuki recently released a series of four papers (I, II, III, IV) totaling 512 pages that claim to contain a proof of the ABC Conjecture of number theory. He is a strong mathematician with a strong track record, so he may well be correct. Of course such a long paper will take a serious effort by the experts to check, and given the depth and importance of the ABC Conjecture, we all hope that it is correct.

Today I want to talk about a connection that I see between his work and modern cryptography.

No, there is no direct mathematical connection: the conjecture does not break any crypto-system—at least I do not see how it could be used to do that. Rather there is an interesting—I believe—philosophical connection between his approach and cryptography. I will explain later.

I have already told the story of a joint paper of mine with Jin-Yi Cai and Zeke Zalcstein proving that, given three commuting matrices $A$, $B$, $C$, one could decide in polynomial time whether or not there were natural numbers $x$ and $y$ so that

$$A^x B^y = C.$$

This is essentially an exponential Diophantine equation, and it was a nice result—but not a millionth as important as the ABC Conjecture. It was accepted at a top conference, and we used the term “ABC” in the title. Mistake. We were quickly flooded with requests for a preprint, and soon realized that we needed to change the name of the paper. This shows public awareness of the importance of what Mochizuki has done, if correct.

**Update 9/18/2012:** Today’s New York Times Science Tuesday section has an article on Mochizuki and his papers, which also links to a post by Jordan Ellenberg, in which Brian Conrad, Terence Tao, and Noam Elkies among others make comments.

## The Conjecture

You may know the conjecture, but I thought for completeness I would state it here, along with its curious history. The best source for this, I think, is Serge Lang’s beautiful book *Math Talks for Undergraduates*. Lang has written many, many books, some easy to read, some, well, not so easy to read, but this softcover book is a gem—perhaps over-priced and under-typo-checked, but still a gem.

I will follow Lang’s version of the story. Richard Mason in 1983 discovered a new property of polynomials. The property is quite neat, and as Lang points out, it came as a surprise that no one earlier had noticed such a basic property of polynomials. That’s not quite true: Walter Stothers had proved it two years earlier, but his work was unknown to many in the 1980s. As Lang said:

“So the history of mathematics does not always flow smoothly.”

We need a few definitions about polynomials first to state their beautiful theorem. If $f$ is a polynomial, let $\deg(f)$ be the degree of the polynomial and let $n_0(f)$ be the number of *distinct* zeroes of $f$.

**Theorem:** Let $a(t)$, $b(t)$, $c(t)$ be relatively prime polynomials over the complex numbers, not all constant, so that $a + b = c$. Then,

$$\max\{\deg(a), \deg(b), \deg(c)\} \le n_0(abc) - 1.$$

See this for a simple proof due to Noah Snyder, who found it while still in high school. One cool consequence of this theorem is that there are no non-trivial polynomials $x(t)$, $y(t)$, $z(t)$ so that

$$x(t)^n + y(t)^n = z(t)^n,$$

where $n \ge 3$. That is, Fermat’s Last Theorem is true for polynomials.
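For readers who like to experiment, the Mason–Stothers bound is easy to check numerically. Here is a small sketch (the triple is my own illustrative choice, not from Lang) using SymPy, computing the number of distinct roots of $f$ as $\deg f - \deg \gcd(f, f')$:

```python
import sympy as sp

t = sp.symbols('t')

def n0(f):
    # Number of distinct complex roots of f: deg f minus deg gcd(f, f').
    return sp.degree(f, t) - sp.degree(sp.gcd(f, sp.diff(f, t)), t)

# A coprime triple with a + b = c:
a, b, c = t**2, 2*t + 1, (t + 1)**2
assert sp.expand(a + b - c) == 0

maxdeg = max(sp.degree(poly, t) for poly in (a, b, c))
print(maxdeg, n0(sp.expand(a * b * c)))  # 2 and 3
```

For this triple the bound holds with equality: $\max \deg = 2$ and $n_0(abc) - 1 = 2$.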

## The Game Change

Not long after the Mason-Stothers work, Joseph Oesterle and David Masser made a conjecture of the same spirit, but replacing polynomials by integers. Of course they needed to replace the degree by something that made sense for integers and replace the number of zeroes by something also.

The degree was not too hard, since it measures the “size” of a polynomial, so the degree of an integer becomes the absolute value. However, what plays the role of the number of zeroes? They found that the *radical* of an integer is the correct idea: the radical of $m$, denoted by $\operatorname{rad}(m)$, is the product of all the *distinct* primes that divide $m$.

The ABC Conjecture then is:

Given $\varepsilon > 0$, there exists a constant $K_\varepsilon$ such that for all relatively prime integers $a$, $b$, $c$ with $a + b = c$, we have the inequality

$$|c| \le K_\varepsilon \cdot \operatorname{rad}(abc)^{1+\varepsilon}.$$
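Radicals are simple to compute for small numbers. The following sketch (toy trial-division code of my own) checks the triple $1 + 8 = 9$, for which $\operatorname{rad}(abc) = \operatorname{rad}(72) = 6$ is actually smaller than $c = 9$; such triples are exactly what makes the conjectured bound delicate:

```python
from math import gcd

def rad(n):
    # Radical of n: the product of the distinct primes dividing n.
    n, r, p = abs(n), 1, 2
    while p * p <= n:
        if n % p == 0:
            r *= p
            while n % p == 0:
                n //= p
        p += 1
    return r * (n if n > 1 else 1)

# A classic "abc triple": coprime a + b = c with rad(abc) < c.
a, b = 1, 8
c = a + b
assert gcd(a, b) == 1
print(rad(a * b * c), c)  # rad(72) = 6 is smaller than c = 9
```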

Again as we noted before, it can be used to also prove Fermat’s Last Theorem for all $n$ large enough. This is due to the constant $K_\varepsilon$. If we knew $K_\varepsilon$ explicitly, then this could be a way to prove Fermat for all $n$, since it is known to hold, independent of Andrew Wiles’s work, for very large $n$.

The above conjecture is the amazing ABC Conjecture that may now be proved by Mochizuki.

## Why Is ABC Amazing?

The integers are closed under addition and multiplication. Their structure under one of these operations alone can be non-trivial, but it is relatively well understood. Under addition the integers just form an infinite abelian group with one generator. Under multiplication they form an abelian semigroup with all the primes as the generators.

A precise measure of how “easy” the additive group structure is comes from Mojżesz Presburger’s Theorem. This theorem shows that the first-order theory of addition is decidable: any question about addition that can be stated in the first-order theory can in principle be decided. There is no Gödel Incompleteness Theorem to worry about. The same is also true for the first-order theory of multiplication: it is also decidable.

Thus questions about only addition or only multiplication are at least in principle decidable. The lower bounds on proofs in these respective theories are not friendly, so automatic answers are in general impractical, but at least we know that answers exist.
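For instance, the true statement that every natural number is even or odd can be written as a first-order sentence about addition alone (a simple example of my own choosing), and so falls within Presburger’s decision procedure:

```latex
\forall x \,\exists y \,\bigl( x = y + y \;\lor\; x = y + y + 1 \bigr)
```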

Everything changes when we ask questions that talk about the interplay between addition and multiplication. The first-order theory is undecidable and incomplete. There are countless simple-to-state questions that mix the two structures—addition and multiplication—that are beyond our current abilities. A simple example of this is the Twin Prime Conjecture: Are there an infinite number of primes $p$ so that $p + 2$ is also prime? Note how it mixes primality (which depends on multiplication) with addition.

This is also the reason that the ABC Conjecture is so deep. It mixes in a critical way the additive structure

$$a + b = c$$

with the multiplicative structure, the radical of $abc$. This mixture is at the core of why the question is so deep.

## The Connection

A disclaimer: **I am a complete innocent when it comes to the type of number theory that is being used by Mochizuki.** So is Ken. I am completely in the dark, but that never stops us here at GLL from trying to explain things. So the following analogy could be way off and misleading. That said, I will push on and try to give you some idea of perhaps how his breakthrough work can be explained without mentioning:

the theory of semi-graphs of anabelioids, Frobenioids, the étale theta function, and log-shells, and more.

Books and texts are messages. In the pre-modern era of cryptography, messages were text strings. Letters could be changed from one to another. Or they could be re-ordered. But that was all. This meant that cryptosystems were limited in what they could do to text to attempt to encode it.

For example, the German Enigma machine used rotors and plugboards to change a letter to another letter in a potentially complex manner. It was broken as we know by the Allies, and this helped save many lives, but it was a quite difficult feat.

The real modern era started when the famous paper of Whitfield Diffie and Martin Hellman (DH) was published. It contains the great idea of a public-key asymmetric cipher system. This is what we think about when we credit them for their great work.
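To make this concrete, here is a toy sketch of the Diffie–Hellman key exchange in Python; the small prime and generator are my own illustrative choices and are far too small for real security:

```python
import secrets

# Toy Diffie-Hellman key exchange; parameters are illustrative only.
p = 2**64 - 59        # a small prime modulus (real systems use 2048+ bit primes)
g = 5                 # public generator

a = secrets.randbelow(p - 2) + 1   # Alice's private exponent
b = secrets.randbelow(p - 2) + 1   # Bob's private exponent
A = pow(g, a, p)                   # Alice publishes A = g^a mod p
B = pow(g, b, p)                   # Bob publishes B = g^b mod p

# Both sides derive the same shared secret g^(ab) mod p.
assert pow(B, a, p) == pow(A, b, p)
```

Both parties compute the same value $g^{ab} \bmod p$ without ever transmitting their private exponents.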

## The Message is the Medium

But in my opinion a greater advance was the realization that messages had a natural richer structure. Messages could be mapped one-to-one to integers, and this simple idea opened whole new vistas. Previously a book could be encoded by changing its letters or re-ordering them. Now new things could be done. One could multiply two messages together, or add them, or raise them to a power modulo a prime, or a composite. This made no sense previously: what is the square of Charles Dickens’s *A Tale of Two Cities*? I have no idea. But now there were magic things that we could do to text. This opened the door for modern cryptography.
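A minimal sketch of this “messages are numbers” idea (the UTF-8 byte encoding is my own choice for illustration):

```python
def msg_to_int(msg: str) -> int:
    # Map a text message one-to-one to an integer via its UTF-8 bytes.
    return int.from_bytes(msg.encode("utf-8"), "big")

def int_to_msg(n: int) -> str:
    # Inverse map: recover the text from the integer.
    return n.to_bytes((n.bit_length() + 7) // 8, "big").decode("utf-8")

m = msg_to_int("A Tale of Two Cities")
assert int_to_msg(m) == "A Tale of Two Cities"

# Once a message is a number, we can do arithmetic on it,
# e.g. square it modulo a prime:
p = 2**127 - 1   # a Mersenne prime
print(pow(m, 2, p))
```

The squared message modulo $p$ is meaningless as text, but it is a perfectly good number, and that is the point.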

In my opinion it was a huge breakthrough, just as important as, if not more important than, the concept of public key. Suppose DH had written a paper that simply said:

Messages are numbers; they can be manipulated as numbers in any way you like.

I believe one could argue that this would have been a huge advance, and that shortly thereafter many great ideas would have come forward.

I may be in the minority here, but I do believe that the idea that messages had rich hidden mathematical structure changed cryptography forever. This is the connection with the work on the ABC Conjecture. Really. From what I hear, what Mochizuki has done is create new mathematical structures that allow Diophantine problems to be attacked in new ways. It is immensely more difficult, more abstract, more technically demanding, than what DH did. But I would argue it is in the same spirit: take ancient objects, in their case messages, and embed them in a richer structure that allows entirely new ideas to be used. This revolutionized cryptography, and I believe that the same *may* be happening now in number theory.

## Open Problems

Besides whether the claimed proof is correct, is the above connection cogent? Okay, I am out on a limb, and I hope that what I said today is not too silly. I hope that it helped a little in our understanding of the ABC Conjecture and of cryptography. If not, then so be it.

[fixed spurious “deg” in Mason-Stothers theorem; added update]


This post probably shows a certain disrespect to what Mochizuki has claimed to have done and to his efforts in anabelian geometry. I strongly suggest a researcher who is not even familiar with what Grothendieck has done to suggest anything along the lines of comparing DH’s and Mochizuki’s work. DH’s work was done on something that is representable, and working with representations is always easy. No one (not even Mochizuki) could have advanced anabelian geometry to such a degree that many of his objects would even be known to be representable. (Just as when Grothendieck invented étale cohomology groups, I do not think he worked with any explicit representations.) Messages are numbers. It has been known a long time (for example, Shannon). If not DH, someone else would have done it.

“to not suggest”

“messages have a representation through numbers”

Chill.

:)

First theorem right side should not contain deg.

Along with Dick Lipton, and pretty much every human being in the world, I can rejoice in Shinichi Mochizuki’s progress toward proving(?) the ABC Conjecture, without having even the faintest clue regarding

p-adic and/or inter-universal Teichmüller theory. This post will instead contribute a few thematic quotations regarding *why* everyone is so very pleased to hear of Mochizuki’s result … even though pretty much none of us understand his proof methods.

The first thematic quotation is from the quantum pioneer David Deutsch’s *Fabric of Reality* (1997):

The second thematic quotation is from Ed Witten’s very recent (Tuesday!) preprint “Notes on super Riemann surfaces and their moduli” (arXiv:1209.2459v1 [hep-th]):

The third thematic quotation is from Lars Onsager (and is especially dear to us engineers):

For me, the Borromean link among these three quotations is associated to the 21st century’s rapidly evolving ideals of mathematical naturality … and so it is notable that the word “natural” and its variants appear no less than eighty times in Witten’s preprint, and moreover, Mochizuki’s four preprints use “natural” and its variants more than six hundred times. :shock: :!:

**Conclusion** Researchers like Mochizuki and Witten provide for us mathematically natural ideas that present to us the natural world from an eagle’s soaring perspective. And not less importantly, these same natural ideas — once we have thoroughly digested them — are certain to yield Onsager-style fertile humus for practical 21st century applications! :)

I don’t understand your comment. Usually “natural” simply means “functorial”, a notion with a precise mathematical meaning…

I do not agree with the attribution to the DH paper. The idea that messages could be manipulated as symbols and numbers probably came up no later than with the invention of electronic communication and the telegraph. A definite attempt to use math to manipulate messages for cryptographic purposes was the Hill cipher, around 1929, which used linear algebra for the transformation. The era of modern cryptography arguably is not just due to DH but also due to Shannon, who anticipated many ideas of modern cryptography. Lucifer, the precursor to DES, was based on Shannon’s concept of a product cipher, and was developed in the late 1960s, a decade before DH’s seminal paper. While it uses neither multiplication nor addition, the abstraction is already there.

Of course, it does use addition modulo 2, if you will.

Mal Malego,

Thanks for the comment. You are right about the Hill cipher, but it still did all its operations on letters. The arithmetic was modulo 26, so it still did not treat messages as a whole as numbers. See here.

Shouldn’t Gödel get some credit here, for representing statements as numbers? (Like many greats, Gödel gets too much credit for things he didn’t do, and too little for things he did…)

Unlike 1ly1, I find your cryptographic analogy quite suggestive. But unlike you, I’m almost sure the ABC theorem will lead sooner or later to a new cryptography system.

If Shinichi Mochizuki’s proof is found to be correct, will it represent an even more important breakthrough than Andrew Wiles’ achievement? Or maybe this question is meaningless?

Hi Mr. Serge: If Mochizuki’s techniques prove correct, it probably is a revolution. Andrew Wiles’s techniques used the entire machinery of étale cohomology (everything that was known in the 20th century up to his proof). Mochizuki’s techniques go beyond that. Wiles lived in the Grothendieck ‘universe’; Mochizuki creates his own.

Hi Mr. Serge: It probably is true that his proof will eventually lead to new ‘concrete’ techniques in CS. I don’t understand what my contributing anything to crypto has to do with Professor Lipton’s post.

Oops, please disregard my post; I am a bit sleepy.

Don’t worry, some of my “best” comments here were written when I was about to go to bed. :) Anyway, whatever Mochizuki did, it won’t diminish my admiration for Wiles and Grothendieck. Schemes are indeed a very nice topic, an instance of generalization up to the appropriate level and not for its own sake.

Hi Serge: Of course, on the shoulders of giants.

Hi 1ly1: I admit this. The slow rise toward generality, from the Ancient Greeks to the present days, is especially impressive in algebraic geometry.

One thing that puzzles me: 500 pages of proof probably carry 500 bits of information. Did it take 2^500 relative human time units? Would verifying it take that long?

That messages are numbers, and the linkage between math and human language, is absolutely fascinating. In the recent debate about teaching algebra in schools, I think the point that language is understandable in the context of algebra was something people failed to grasp, and it highlights the absolutely horrible path math education has taken.

I read the problems my son gets in school and I realize more and more that they don’t actually teach math in school; my own recollection verifies that. They are taught process, and in many cases it’s arbitrary as to how even that is taught. If a person is unable to instinctively deduce the process, then they are simply left behind, because even the teachers don’t typically understand the math beyond the process.

In any case, the interesting point brought up about messages and numbers is that what we are really talking about is the “coordination” (i.e., assigning coordinates) of messages. The question then is what is the most efficient coordination for a particular language. There is an interesting interplay between having a coordinate system sufficiently robust to capture all statements, but that doesn’t become dominated by nonsensical statements.

There are a lot of similarities between this problem of sensible statements in a language and “physical states” in a physical theory. It’s interesting to think about.

> That messages are numbers and the linkage between math and human language is absolutely fascinating.

Whenever you encrypt a message, you don’t have to make use of its meaning. So if the messages are written in English, cryptography won’t tell you anything about the structure of the English language. The digital nature of your messages is relevant only as far as their encryption is concerned. Understanding the mathematical nature of language is something else…

If you are not focussing on giving any structure to your objects (in this case messages), then you can represent them, say, as ‘donkeys’, provided the sequences of ‘donkeys’ are distinguishable from one another. And by using operations on these ‘donkey’ objects you can pretty much do anything to your underlying objects. Cryptography happens to choose ‘numbers’ as its ‘donkeys’, since this particular ‘donkey’ object has a lot of rich operations. Is it anything great? The surprise is why no one cared before. Maybe growth in electronics and automation?

It’s like saying P=NP is a statement about numbers just because the programs are recursively enumerable… whereas an integer doesn’t say anything about the algorithm it’s supposed to represent!

Serge,

Thanks, but syntax and semantics (à la Weaver’s exposé of Shannon) are just as encodable in principle as words and numbers. If words can be used to convey concepts, then there must be some stability in a particular concept to allow for some mapping. The interesting thing when it comes to language and people is that there is an apparent flow and morphology in language as it relates to concepts. The question that puzzles me is whether there are any invariant quantities in the flow. I don’t immediately think there is any real conservation principle in place when it comes to human thought, outside of your standard *physically* conserved quantities.

Neither do I. Concepts are evanescent in nature: that’s why translating a scientific paper is more reliable than translating a novel. Nobody has yet been able to express concepts without words, and from one language to another the concepts aren’t exactly the same. To define an invariant notion of concept would probably require a new kind of math… and maybe some fuzzy logic could help.

Dear Dick, regarding your remark on Presburger arithmetic and Gödel: Presburger arithmetic is indeed decidable, but Fischer & Rabin’s 1974 proof of its high complexity was allegedly described by Gödel himself as a kind of second incompleteness theorem (that is, even through restriction to a decidable case, decision is prohibitively expensive).

You use |a| instead of |c| in the statement of ABC.

A conjecture must be explicit: either true or false. The ABC Conjecture tolerates some counter-examples (quite a few). Therefore number theory cannot prove this conjecture.

You’re wrong.