David just passed away

David Johnson was a computer theorist who worked on many things, with special emphasis on the care and treatment of hard computational problems.

Ken and I are sad today to hear that David just passed away.

Many will announce this sad event; we expect that the whole community will express how much they miss David. He was unique among theorists in his dedication to seeing how hard so-called “intractable” problems really are. He devoted much of his career to building challenges: a typical one asked you to design an algorithm that solved some hard problem, often an NP-complete one. These challenges were of great importance in pushing forward the field of attacking difficult but important problems.

We owe David much for working on these challenges, usually in conjunction with DIMACS—the Center for Discrete Mathematics and Theoretical Computer Science. See this for the list of DIMACS challenges over the years. Note, David was usually part of the team, if not the sole person, running a particular challenge. He also did the lion’s share of assembling a 2000 draft for NSF about challenges in theory.

## A Sample Challenge

We quote from the challenge page for the traveling salesman problem (TSP):

> The TSP is probably the most-studied optimization problem of all time, and often the first problem that newcomers to the field (or visitors from other domains) attack. Consequently, it has one of the most fractionated literatures in the field, with many papers written in apparent ignorance of what has been done before.
>
> One goal of this Challenge is to create a reproducible picture of the state of the art in the area of TSP heuristics (their effectiveness, their robustness, their scalability, etc.), so that future algorithm designers can quickly tell on their own how their approaches compare with already existing TSP heuristics. To this end we are identifying a standard set of benchmark instances and generators (so that quality of tours can be compared on the same instances), as well as a benchmark implementation of a well-known TSP heuristic (the greedy or “multi-fragment” algorithm), so that the effect of machine speed on reported running time can (roughly) be normalized away.
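To give a concrete feel for the benchmark heuristic the challenge page mentions, here is a sketch of the greedy (multi-fragment) algorithm: sort all edges by length, add an edge whenever both endpoints still have degree less than two and it does not close a premature subtour, then join the two remaining endpoints. This is our own illustrative implementation, not the challenge's benchmark code.

```python
import math

def greedy_tour(points):
    """Multi-fragment (greedy-edge) TSP heuristic on Euclidean points.

    Sketch for illustration: O(n^2 log n) from sorting all edges,
    with a union-find to reject edges that would close a subtour.
    """
    n = len(points)
    dist = lambda i, j: math.dist(points[i], points[j])
    edges = sorted((dist(i, j), i, j)
                   for i in range(n) for j in range(i + 1, n))

    parent = list(range(n))  # union-find for subtour detection
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    degree = [0] * n
    adj = [[] for _ in range(n)]
    added = 0
    for d, i, j in edges:
        if added == n - 1:
            break
        if degree[i] < 2 and degree[j] < 2 and find(i) != find(j):
            parent[find(i)] = find(j)
            degree[i] += 1; degree[j] += 1
            adj[i].append(j); adj[j].append(i)
            added += 1

    # The fragments now form one Hamiltonian path; close it into a tour.
    ends = [v for v in range(n) if degree[v] == 1]
    adj[ends[0]].append(ends[1]); adj[ends[1]].append(ends[0])

    # Walk the cycle to produce the tour as a vertex order.
    tour, prev, cur = [0], None, 0
    while len(tour) < n:
        nxt = adj[cur][0] if adj[cur][0] != prev else adj[cur][1]
        tour.append(nxt)
        prev, cur = cur, nxt
    return tour
```

Because every challenge machine runs the same reference heuristic on the same instances, its measured time becomes a yardstick against which other heuristics' times can be roughly calibrated.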

A good place to read about the details that go into creating and running a challenge is the paper by David and Lyle McGeoch. It is quite a complex endeavor to do one of these challenges properly. Just some of the issues that strike us are:

• What instances make good tests for the problem in question? Randomized generators are usually used to build the instances.

• How do you handle the representations of the problems? This sounds simple, but in practice it can be quite difficult. In the TSP case some algorithms cannot handle input instances whose points have non-integral coordinates. You can imagine that this issue can be quite problematic for more specialized problems.

• How do you compare algorithms run on different machines? We theorists are content to say ${O(n^{2})}$ beats ${O(n^{2}\log n)}$. But in these challenges the measure is actual running time. Making this work is not easy, and requires a clear understanding of what the various machines can do.
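One simple way to attack the last issue, roughly in the spirit of the challenge's use of a shared benchmark implementation: time the benchmark on both the local and the reference machine, and scale every reported time by the ratio. This is a hypothetical sketch of the idea, not the challenge's exact protocol.

```python
def normalize_time(measured_seconds, benchmark_local, benchmark_reference):
    """Scale a time measured on the local machine to the reference machine.

    benchmark_local / benchmark_reference: seconds each machine needs to
    run the shared benchmark heuristic on the same instance. A machine
    that runs the benchmark twice as fast is assumed (crudely) to run
    everything twice as fast.
    """
    return measured_seconds * (benchmark_reference / benchmark_local)


# A machine that does the benchmark in 2.0 s versus 4.0 s on the
# reference is twice as fast, so its 10 s run counts as about 20 s.
print(normalize_time(10.0, 2.0, 4.0))  # -> 20.0
```

The "(roughly)" in the quote above is important: such a single scalar ignores caches, memory bandwidth, and compiler differences, which is part of why running these challenges well is hard.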

## From Ken

I am also deeply saddened to learn the news this morning. David was a particularly welcoming voice at conferences in the 1980s that formed my outlook on the field as a graduate student. I also remember he helped host me for a one-day visit to Bell Labs in the summer of 1984. I recall that during this visit I was held to a draw in a chess game against Ken Thompson’s Belle computer.

I last saw him at FOCS 2009 in Atlanta at a lunch table with Dick and others I’ve been glad to know. He was an oracle for the status of algorithmic problems: He added to my knowledge concerning a problem that one of my then recently-graduated students was working on. Among things he did to build up the community was to create and edit the STOC/FOCS Bibliography, which I used often to look up journal versions of papers before the rise of the Internet. His book with Michael Garey was the first book on complexity that I read, and it set the style for presenting computational problems in our field.

## Open Problems

Ken and I send all of our thoughts to David’s family and many friends. He will be missed.

Update 3/11: Lance Fortnow’s memorial has been hailed by some others, and today the ACM has posted a memorial by Lawrence Fisher.

1. March 10, 2016 11:39 am

Will never forget, as a naive newbie writing about SAT on Usenet many years ago, talking about the brief few-page intro to the subject in Sedgewick's “Algorithms”. Someone told me that it was “possibly one of the worst intros to the subject” at the time and to immediately go look at Computers and Intractability: A Guide to the Theory of NP-Completeness, one of the great and most influential CS books of all time, standing the test of time. How many copies have been sold? How many computer scientists have it on their shelf? It's interesting how much of the P vs NP theory was fleshed out early on, while much of what remains seems to be super-hard. It's as if there are very strong extremes in the field, and this was not anticipated early on. And notice how the recent Babai graph isomorphism quasi-polynomial-time breakthrough ties into the history: graph isomorphism was pointed out as an open problem at the time, and to some degree it still is! (It is still not known to be in P even with this recent breakthrough.) Guess we will all have to adopt that zen patience and realize that some of the big questions are not even resolvable in a single lifetime… at least we have (had) the honor of wrestling with them some.