# Psst—It’s The Definition

*Definitions are extremely important*


Shafi Goldwasser and Silvio Micali won the 2012 ACM Turing Award last March for their seminal work in cryptography. We are not a year late saying this—that is how the award is dated. By contrast, the official Academy Awards website says that the 2012 Oscars mean the ones for films in 2011, so February’s ceremony was the 2013 Oscars. The Turing Award is not like an Oscar anyway, as it is for lifetime achievement. The ACM-EATCS Gödel Prize, whose just-announced winners we intend to recognize soon, is more like an Oscar in that it honors specific performances (papers) within a limited timespan. Like the Oscar, it has been won more than once by some, including Shafi, while we cannot tell from the definition whether the Turing Award can be won twice.

Today Ken and I wish to talk about how great definitions transmit advances in math and theory, and how to make them greater.

Shafi and Silvio won the prestigious award not mostly for a deep theorem—although both have proved some very deep results—but for creating the “right” definition for security. Well, actually, the definition needed some work after their famous 1982 paper, as noted here and as we expand below. But it was an important milestone in the history of modern cryptography.

This phenomenon of creating the right definition as a seminal result in math is actually a recurrent theme. Deep results are important, and are the life-force of math, but the right definitions can advance a field more. A deep theorem closes a door: X is now known to be true. A deep definition opens a door: Y and Z are now new questions to be considered. We have previously talked about how “the right definition” can enable proving a theorem, and about different roles of definitions, but *great* definitions are known for larger impacts.

Let’s take a look at some of the great definitions. The list is obviously personal and incomplete, and we invite you to add to it.

## Six Examples

**Numbers**. Leopold Kronecker was apparently referring to $\mathbb{N}$ when saying what “the dear God” made, while “everything else is human work.” So perhaps he excluded zero and the negative integers as “definitions,” but that still leaves the rationals, the reals, and the complex numbers as some of the most important definitions of all.

**ZF**. Ernst Zermelo and Abraham Fraenkel created this definition of the most popular modern view of set theory.

**Lebesgue measure**. Henri Lebesgue defined the measure in 1901, and it appeared in his 1902 dissertation. Modern analysis would be very different without this beautiful definition. There are other notions of measure and related integrals, but Lebesgue’s definition is wonderful.

**Turing Machines**. Alan Turing—you knew that—created this definition in his famous work on the Halting Problem and undecidability. Where would complexity theory be today without this simple, intuitive, but brilliant definition? Perhaps it deserved a Turing Award all by itself.

**Complexity Classes**. We earlier singled out Hartley Rogers and his classic 1967 textbook for the importance of defining classes of languages, and perhaps the influence goes back to Rogers’ work in the 1950’s. But the seminal 1965 paper cited for Juris Hartmanis’ and Richard Stearns’ own Turing Awards is what introduced complexity classes in the way known by our field, long before the theory of NP-completeness revealed the scientifically amazing extent to which problems “quantize” into a relative handful of complexity classes. Another reason the notion of a *class* did more to advance the field than that of a *problem* is that the definition of a class expresses the computational goal or barrier.

**PAC Learning**. Ideas of computational learning had existed for decades prior to 1983, of both recursion-theoretic and artificial-intelligence flavors, but had limited boosting power compared to the explosion that followed Leslie Valiant’s PAC definition that year. His paper and concept were the first things cited in his own Turing Award, and he has just published a book titled simply, *Probably Approximately Correct*.

## How to Tell a Secret

Many good definitions come from a direct construction of a mathematical object or logical property. However, sometimes this involves choices of details that are in line with the way the intended application is framed but are not really part of the *concept*. They may be representation-dependent, such as in a particular cryptographic system. This occurred with initial attempts to define:

> What is a secret?

Similarly, what *is* security, what *is* a secure system? No one could say directly.

One approach was to *classify* protocols that were known or believed to be secure apart from ones known to be compromised, with the idea of:

> How to tell a secret…apart from a non-secret?

But this is still wrong—it is still trying to say what a secret “is.” Anything can *be* a secret—what you need to define is whether it *stays* a secret. This requires a definition that is operational. The brilliant first insight of Shafi and Silvio was to shift the definition from knowing something to whether you gain the power to do something else.

## How to Tell a Secret Hasn’t Been Told

Suppose $x$ is a secret—remember $x$ could be any string of some length $n$. We whisper a string $y$—we intend $y$ to be the encoding of $x$, but again it could be anything. How can we tell that telling $y$ didn’t give away anything about $x$? We consider $x$ given away if a probabilistic polynomial-time algorithm (PPTA) $A$ given $y$ gains the ability to succeed on a task involving $x$, with tangibly higher probability than it could without $y$.

Let’s suppose the task is directly to learn some bit $x_i$ of $x$. Suppose $A$ gets $x_i$ correct with some probability $p$. Was $y$ to blame for this? We can say $y$ was not to blame if there is another PPTA $A'$ such that $A'$ gets $x_i$ correct with probability $p - \epsilon$, where $\epsilon$ is negligible in terms of $n$. That is, we can **simulate** $A$ by an $A'$ that does almost as well without $y$.

To be more general, let $f$ stand for any easily-computed property of the strings $x$, which are drawn from $\{0,1\}^n$ under some distribution $D_n$. Then the basic simulation idea of **semantic security** is:

> For every PPTA $A$ there is a PPTA $A'$ such that over all (sufficiently large) lengths $n$, drawing $x$ according to $D_n$,
>
> $$\Pr[A(y) = f(x)] \le \Pr[A'(1^n) = f(x)] + \epsilon(n),$$
>
> where $\epsilon(n)$ is negligible.

Now comes a question: What if there are two secrets $x_1, x_2$, and $f$ is the property that they are different? If the encoding $y$ is always a simple function of $x$, then whether $y_1 = y_2$ would give this property away. For general properties $f$ this already indicates that the encryption must be *probabilistic*, producing multiple valid $y$’s for the same $x$. The decoding can still be deterministic—in practice often aided by trap-door information generated by the system. Moreover, probabilistic encryption enables achieving security for any $D$, with similar import to allowing an adversary to choose the plaintext $x$.
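The need for probabilistic encryption can be seen in a toy sketch. The functions `det_encrypt` and `prob_encrypt` below are hypothetical and deliberately insecure, for illustration only: a deterministic encoder leaks whether two plaintexts are equal, while mixing in fresh randomness on every call hides even that.

```python
import secrets

# Toy sketch (hypothetical, NOT a real cryptosystem) of why encryption
# must be probabilistic.
KEY = secrets.token_bytes(16)

def det_encrypt(x: bytes) -> bytes:
    # Deterministic: the same plaintext always yields the same ciphertext.
    return bytes(a ^ b for a, b in zip(x, KEY))

def prob_encrypt(x: bytes) -> bytes:
    # Probabilistic: prepend fresh randomness r and mask the plaintext
    # with a value derived from r and the key.
    r = secrets.token_bytes(16)
    mask = bytes(a ^ b for a, b in zip(r, KEY))
    return r + bytes(a ^ b for a, b in zip(x, mask))

x = b"attack at dawn!!"
# Deterministic encryption leaks the property "these two secrets are
# equal": identical plaintexts produce identical ciphertexts.
assert det_encrypt(x) == det_encrypt(x)
# Probabilistic encryption gives many valid encodings of the same x,
# so ciphertext equality reveals nothing.
assert prob_encrypt(x) != prob_encrypt(x)
```

The point of the sketch is only the shape of the leak: any fixed function of the plaintext gives away equality, no matter how strong the cipher otherwise is.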

Simulation and probabilistic encryption, in the complexity context, were Shafi’s and Silvio’s two main innovations. Well, so was building a great probabilistic cryptosystem that worked, but we’re talking about definitions. Simulation was the right *concept*, and it shortly grounded their wonderful work on zero-knowledge proofs. However, it was still not the right *definition*.

## Playing Games is Fairer

This is a problem peculiar to definitions: it’s not enough for them to say the right thing, they have to say it the right way. To use a definition in formulas for a proof, we need good syntax, not just semantics.

The problem for crypto was having all these loose quantities: $f$, $D$, and even the arbitrary success probability $p$ for $A$. They made the definition hard to prove for specific systems, which came with their own multiple quantities. But by the time of their 1984 journal paper, Shafi and Silvio had found a way to shave the definition.

Let’s go back to $x_1$ and $x_2$. Let the adversary choose them, and let’s run the system over and over again.

Each time, the adversary picks one of $x_1$ or $x_2$, probabilistically encodes it as $y$, and sends $y$ to our algorithm $A$. Can $A$ tell whether $y$ came from $x_1$ or from $x_2$? That’s the game, and though the definition had a new name, it came to denote the same:

A cryptosystem satisfies

> **indistinguishability under chosen plaintext attack** (IND-CPA) if for every PPTA $A$, the success rate of $A$ in this game is no more than $\frac{1}{2} + \epsilon(n)$, where $\epsilon$ is a negligible function of $n$.

This definition has no $f$, no $D$, and fixes $p$ to be the convenient “fair” value *one-half*. It doesn’t even have $A'$. It says that no information about any one secret is being given away, if there is no feasible way to tell its encodings apart from those of anything else with better success than random guessing.

Under quite general conditions on a cryptosystem, Shafi and Silvio proved that the system is semantically secure if and only if it satisfies IND-CPA. That may have been a difficult proof, but it was just one proof. The point is that when you make a new system, IND-CPA is usually a lot easier to prove for it than the original definition—conditioned on some hardness assumption, of course. The point is even stronger when specific settings such as public-key cryptography add their own particular quantities to the applicable forms of these definitions, as in the original papers.
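A minimal sketch of the IND-CPA game may help, under the assumption of two toy ciphers (the names `det_encrypt`, `otp_encrypt`, and `ind_cpa_success` are hypothetical): a deterministic cipher loses the game outright, because chosen-plaintext access lets the adversary re-encrypt $x_1$ and compare, while a one-time pad with fresh randomness per call holds any such adversary to about one-half.

```python
import secrets

KEY = secrets.token_bytes(16)

def det_encrypt(x: bytes) -> bytes:
    # Deterministic toy cipher (insecure): equal plaintexts always
    # give equal ciphertexts.
    return bytes(a ^ b for a, b in zip(x, KEY))

def otp_encrypt(x: bytes) -> bytes:
    # Fresh pad per call, never reused or revealed: the ciphertext is
    # uniformly random, independent of x.
    pad = secrets.token_bytes(len(x))
    return bytes(a ^ b for a, b in zip(x, pad))

def ind_cpa_success(encrypt, trials=2000) -> float:
    # The game: the challenger flips a bit b and encrypts x1 or x2;
    # the adversary must guess b.  This adversary re-encrypts x1
    # itself (chosen-plaintext access) and guesses 0 on a match.
    x1, x2 = b"YES", b"NOO"
    wins = 0
    for _ in range(trials):
        b = secrets.randbelow(2)
        c = encrypt(x1 if b == 0 else x2)
        guess = 0 if c == encrypt(x1) else 1
        wins += (guess == b)
    return wins / trials

assert ind_cpa_success(det_encrypt) == 1.0            # wins every round
assert abs(ind_cpa_success(otp_encrypt) - 0.5) < 0.1  # no better than guessing
```

The comparison attack in `ind_cpa_success` is exactly the quantifier-free shape that makes IND-CPA pleasant to work with: one game, one adversary, one success rate to bound.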

## Open Problems

What is your favorite definition?

Again we congratulate Shafi and Silvio on their award. As a footnote, the June *Communications of the ACM* article by Neil Savage dates their achievement to 1983 and equates semantic security with IND-CPA. It leads by saying:

> …Yet it was not until 1983 … that cryptographers actually defined what they were doing.

Well, since you asked for it …

My favorite definitions would have to be C.S. Peirce’s definitions — in one long breath — of not only the *sign relations* that found semiotics, or the theory of signs, but *logic* itself as “formal semiotics”. → PlanetMath • Sign Relation

It is commonly said that a definition gives necessary and sufficient conditions for the application of a concept, predicate, or term. In that light, Peirce’s definition of *logic*, depending in turn on his definition of a *sign relation*, makes claims about the necessary and sufficient complexity of logical processes, among which are numbered mathematical inference, scientific inquiry, communication, and computation — to name just a few that concern us most here.

Peirce’s definitions of logic and signs amount to the claim that all of the above processes comfortably live within the space of triadic relations. And they claim moreover that the minimal complexity necessary to support thinking according to norms is also marked by triadic relations.

For favorite definitions: Kolmogorov or descriptional complexity, Einstein’s time synchronization in physics, sizes or cardinalities of infinite sets.

I/we considered including the definition of algorithmic probability below the Lebesgue measure item, but decided this would interrupt the flow, and there is some multiplicity of Kolmogorov complexity definitions used here.

Reblogged this on Pink Iguana and commented:

six examples, Valiant’s PAC book looks interesting, and a shout out to Dick Stearns in Albany vs the wtf Google Richard Stearns President of World Vision, really? Google Phil Lewis and you get an actor from the Suite Life of Zack & Cody. Dudes you got the wrong Phil Lewis in pretty much any measure. Maybe it’s a secret or something.

Dear Ken and Pip,

Inspired by a comment from a reviewer of my paper The Cook-Levin Theorem is False (http://www.andrebarbosa.eti.br/The_Cook-Levin_Theorem_is_False.pdf), I’ve created (or discovered) a new concept for the enrichment of TCS, the Observer-Dependency Complexity (ODC): knowing means here that a person completely knows the involved polynomial and is fully conscious of this fact; hence the definition of NPg is surely subjective, in the sense that a problem may be in NPg [NPu] for one person but not for another. Questions such as “Is L in NPg [NPu]?” can have different answers for different people, although whether it is in NP remains an objective matter.

Thus, as with Quantum Mechanics in respect to Physics, this revolutionary and seminal concept (ODC) puts the observer at the center of the TCS stage. The human factor eventually wins again, even within the abstract aridity of Computational Complexity Theory: Turing Machines and other similar formalisms are very important and powerful intellectual tools, but we, people who think non-mechanically [“– cogito, ergo sum.”], are more than they are!

The comment:

“On page 8, the author asks the reviewer to consider the question whether NP=NP_g and whether NP_u=\emptyset.

I have thought about it. However, unfortunately I cannot answer the first question because the definition of NP_g is not clear to me. The definition asks that an integer p is known such that the size of the certificate is O(|x|^p). The question is *who* has to know p, and also what you mean by “know”. If knowing means that a person is fully conscious of the fact, then surely the definition of NP_g is “subjective” in the sense that a problem may be in NP_g for one person but not for another person. Questions such as “does L belong to NP_g” then can have a different answer for different people. On the other hand, if the definition were to ask merely for the existence of p then the definition becomes equivalent to the standard definition of NP. As for the second question, NP_u=\emptyset: again, the definition is ambiguous for me: what does “unknown” mean? Does it mean that no human individual on this planet is fully and vividly conscious of the fact? If so then the question becomes an empirical (and probably unverifiable) question, but if unknown is replaced by “not necessarily known” then NP_u encompasses all of NP and therefore is certainly not an empty class.”

It is possible to make formal counterparts of these ideas with notions of “provable complexity properties” that were current in the 1980’s, dialing back the philosophy but preserving some of it. See, for instance, this recent post, including the problem at the end and the comment by Harvey Friedman.

Pip = Dick+Ken, since Pip from *Great Expectations* is among the best-known Dickens characters. The idea of having a *pip* in place of a *bit* could have been worked into last week’s post, but it was already long enough.

Dear Ken, Dick and Pip, (since {Ken}, {Dick} and {Ken, Dick} are different people (sets)),

Verify that, as with the Calculus in the XIX century, I – with my brilliant articles, a real fantastic four (1. http://arxiv.org/ftp/arxiv/papers/0907/0907.3965.pdf, 2. http://www.andrebarbosa.eti.br/P_different_RP_Proof_Eng.pdf, 3. http://www.andrebarbosa.eti.br/NP_is_not_in_P-Poly_Proof_Eng.pdf, and 4. http://www.andrebarbosa.eti.br/The_Cook-Levin_Theorem_is_False.pdf) – am providing stronger and more logically consistent foundations to TCS.

This simple-yet-vast definition was (essentially!) first proposed by Wilhelm Killing in 1888. Per A. J. Coleman’s review:

> Even today, 125 years later, the STEM community is far from a full appreciation of the implications of Killing’s definition.

Yet another influential definitional broadening was set forth by Arnold in an article titled *Contact geometry: the geometrical method of Gibbs’s thermodynamics*. The opening sentence of Arnold’s article is itself celebrated among mathematicians.

Arnold speaks of the geometric arguments that justify his Gibbs’s Thesis as “a joyful exercise”, and here Arnold’s language distinctly echoes the central theme of Bill Thurston’s also-celebrated essay “On proof and progress in mathematics”.

Not the least of these joys comes when we distill new definitions that dissolve old obstructions, open new windows to our understanding, and open new doors for our STEM journeys.

Kudos to this fine *Gödel’s Lost Letter* essay for reminding us to peer curiously through new definitional windows, and pass boldly through new definitional doors.

Thank you, Jon! :)

The circumstances that led Bill Thurston to write “On Proof and Progress” are relevant both specifically to Ken and Dick’s topic of “definitions” and broadly to present trends in complexity theory.

Then as now, there was no scarcity of folks enamored of “biting, merciless and hard-nosed criticism.” Such criticism is of course valuable in science, and a well-respected article in this class is Arthur Jaffe and Frank Quinn’s “‘Theoretical mathematics’: Toward a cultural synthesis of mathematics and theoretical physics” (Bull. AMS 1993, arXiv:math/9307227, 188 citations).

A considerable portion of Jaffe and Quinn’s criticism was directed specifically toward Bill Thurston:

> **Cautionary Tales** (arXiv p. 7) William Thurston’s “geometrization theorem” concerning structures on Haken three-manifolds is another often-cited example [of “weak standards” and “casual reasoning” causing “major damage”]. A grand insight delivered with beautiful but insufficient hints, the proof was never fully published. For many investigators this unredeemed claim became a roadblock rather than an inspiration.

Jaffe and Quinn’s biting criticism elicited on the pages of the *Bulletin of the AMS* a spirited exchange of passionate letters by researchers that included Michael Atiyah, Armand Borel, Gregory Chaitin, Daniel Friedan, Jeremy Gray, Morris Hirsch, Saunders MacLane, Benoit B. Mandelbrot, David Ruelle, Albert Schwarz, Karen Uhlenbeck, René Thom, Edward Witten, and Christopher Zeeman (arXiv:math/9404236, 1994). Their remarks are well worth reading! :)

It is striking and reassuring — and even hilarious — that (1) no individual AMS commenter agreed with Jaffe and Quinn, (2) taken pairwise, neither did any commenters agree with one another, and (3) *all* of these AMS commenters nowadays are sufficiently well-regarded as to have biographical Wikipedia pages devoted to their achievements! :)

Bill Thurston’s essay “On proof and progress in mathematics” was a response to Jaffe and Quinn’s criticism (Bull. AMS 1994, arXiv:math/9404229, 389 citations), and yet Thurston’s essay is remarkable for its entire *absence* of criticism — *Gödel’s Lost Letter* readers are invited to read for themselves, and reflect for themselves, upon Thurston’s (much-admired) response to Jaffe and Quinn.

This is not to say that Thurston’s ideas were uncontroversial. Some of Thurston’s most disquieting observations were definition-related:

It is plausible that at least some of the critical rancor directed toward Thurston was associated with the disquieting implications of these ideas. If, as Thurston asserts, we cannot be confident of the well-ordering of the real numbers, then how shaky should our confidence be in the well-ordering of complexity classes? If we cannot know whether well-ordered cardinalities exist between the integers and the reals, then how confident can we be that no well-posed dynamical state-spaces exist between the tangent manifolds of Newton and Lagrange, and the vector spaces of Hilbert and Dirac? These Thurston-style definitional questions are sufficiently disquieting as to keep even the most rational-minded researcher awake at night! :)

**Question** Who among us, who comment here on *Gödel’s Lost Letter*, can show accomplishments to compare with those of Atiyah, Borel, Chaitin, … (to begin alphabetically a long list of Thurston’s defenders and critics)? In view of the vast benefits that are associated with this diversity (as the history of mathematics shows us plainly!) perhaps each of us ought not to be over-confident that our definitions necessarily are the best definitions, or our views the only good views, or our roadmap the best roadmap.

**Conclusion** The entire STEM community has benefited greatly from Bill Thurston’s outstanding cordiality, composure, and self-control, which from the mud-and-straw of Jaffe and Quinn’s criticism created beautifully sound bricks of principled mathematical progress. :)