# Still A Brilliant Idea

*An apology and correction*

Cynthia Dwork, Frank McSherry, Kobbi Nissim, and Adam Smith are the inventors of differential privacy, as formulated in their 2006 paper “Calibrating Noise to Sensitivity in Private Data Analysis,” in the proceedings of the 2006 Theory of Cryptography Conference.

Today Ken and I want to talk about differential privacy again.

Our last discussion was on differential privacy (DP). We stand by our statements that it is a brilliant idea, has direct societal relevance, and is important in theory because it is now being used to prove theorems that do not, on their face, concern privacy. But we erred in giving the main credit for the framing of the idea to Dwork. For this we apologize.

## History Of DP

The paper by Dwork that we featured was an invited presentation to ICALP 2006 that opens the proceedings volume. We referenced its acknowledgments but did not bring those names up to the top level. In addition to McSherry, Nissim, and Smith, her paper names Moni Naor as joint author of the proof of impossibility of semantic security in that context.

We have since been apprised by them and others of the longer history of both the concept and its fulfillment by the mechanism of carefully perturbing database query responses. Among the several precursors cited in their work are:

* two PODS 2003 papers: “Revealing Information While Preserving Privacy” by Irit Dinur and Nissim, and “Limiting Privacy Breaches in Privacy Preserving Data Mining” by Alexandre Evfimievski, Johannes Gehrke, and Ramakrishnan Srikant;
* the 2000 SIGMOD paper “Privacy Preserving Data Mining” by Rakesh Agrawal and Srikant;
* a Crypto 2004 paper by Dwork and Nissim; and
* the 2005 PODS paper “Practical Privacy: The SuLQ Framework,” which added McSherry and Avrim Blum.
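To make the “careful perturbations” concrete, here is a minimal sketch of what is now called the Laplace mechanism: answer a query with the true value plus noise drawn from a Laplace distribution whose scale is the query's sensitivity divided by ε. The function names and the toy data below are ours, for illustration only:

```python
import math
import random

def laplace_noise(scale):
    # Inverse-CDF sample from the Laplace(0, scale) distribution.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(records, predicate, epsilon):
    # A counting query has sensitivity 1: adding or removing one
    # person changes the true count by at most 1, so Laplace noise
    # with scale 1/epsilon yields epsilon-differential privacy.
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical example: ages in a tiny database.
ages = [25, 41, 33, 19, 57]
noisy = private_count(ages, lambda a: a >= 30, epsilon=0.5)
```

The delicacy is all in ε: a smaller ε means more noise and stronger privacy, a larger ε means a more accurate but more revealing answer.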

There are now many followup papers and surveys on DP. Three notable followups are “Mechanism Design via Differential Privacy” by McSherry with Kunal Talwar (FOCS 2007 version), “Smooth Sensitivity and Sampling in Private Data Analysis” by Nissim and Smith with Sofya Raskhodnikova (STOC 2007 version), and “What Can We Learn Privately?” by N-R-S plus Shiva Prasad Kasiviswanathan and Homin Lee (FOCS 2008; 2013 version). We also note the paper “How Much Is Enough? Choosing ε for Differential Privacy” by Jaewoo Lee and Chris Clifton in the 2011 Information Security conference, which speaks to the technical delicacy of the mechanism; Lee and Clifton have gone on to vary the concepts. Of course there is also the new book by Dwork and Aaron Roth. If you are interested in the full history of DP and its progress from roots to current applications, we suggest consulting these sources.

## Open Problems

We apologize again. We retract any statement that suggested an inaccurate history of the invention of DP. We *do not* retract our opinions regarding its importance.
