
DC Post

November 6, 2014


Discussing lower bounds for this blog’s 600th post

[Image: the DC "Trinity" of Superman, Batman, and Wonder Woman]

Superman, Batman, and Wonder Woman are three of the best known superheroes in DC Comics books, which were a childhood staple for many of our age. They brought us action in color before our families had color TVs. There is a new nonfiction book on Wonder Woman. She and Superman have super powers: flying, super strength, X-ray vision, magical healing, and more. It is notable that Batman does not. All of his powers come from great training, great study, great equipment, and great helpers.

Today this blog makes post number 600, DC in Roman numerals.

Our joint-author handle “Pip” recently passed 100 posts, and I have 67 more as single author. We now have over 15,000 non-spam comments. Some technically-detailed posts take much of the full average 4–5 day interval to write, while others with lighter fare or discussion points are quicker. We are always open to suggestions for kinds of posts and balance of content.

DC can stand for various things: Washington where power changed hands on Tuesday, direct current, da capo in music meaning to restart from the beginning, even Dyson’s Conjecture. It is also the product code for all vacuums made by James Dyson (no close relation to Freeman Dyson). But we will stay with DC Comics and the question:

Will it require “super powers” to break down the lower-bound barriers?

We have of course discussed barriers often before, but they came up again last week at a workshop at MIT for Michael Sipser’s 60th birthday.

Kryptonite

Superman has one weakness of a kind not shared by Wonder Woman or Batman: he becomes powerless in the presence of kryptonite. The name and material come from Superman’s home planet of Krypton, though substances with much the same chemical formula were found to exist innocuously on Earth. There is a huge variety of forms of kryptonite in official DC Superman stories, not to mention others. In complexity theory we also have several forms of kryptonite, of which the three best known are:

  1. Relativization: You are powerless to prove a statement {S} by many techniques if there is an “oracle set” {K} such that a similar “relativized statement” {S^K} is false. (A classic instance follows this list.)

  2. Natural Proofs: Many other techniques to try to prove {S} must fail, unless security systems holding up the entire Internet suffer an unexpected blow to their integrity.

  3. Algebrization: Techniques that could conquer the set-of-binary-strings kind of kryptonite {K} are vaporized by the higher-grade form {\tilde{K}} that results when {K} is extended to a low-degree polynomial over a larger field containing the binary elements {0} and {1}.
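
The classic instance of the first form is the Baker–Gill–Solovay theorem: there are oracle sets {A} and {B} such that {\mathsf{P}^A = \mathsf{NP}^A} while {\mathsf{P}^B \neq \mathsf{NP}^B}, so no argument that relativizes can settle {\mathsf{P}} versus {\mathsf{NP}} in either direction.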

New radioactive varieties may yet be enriched from sources such as the hardness-versus-randomness tradeoff, by which certain strong lower bounds require the ability to de-randomize certain algorithms (and in several senses vice-versa), or the possibility that problems of different worst-case complexity share the same average-case complexity. Weaponry that was supposed to penetrate to the core but was melted or blunted includes the Sipser programme for circuit lower bounds and the Fusion Method. The ravages in our knowledge go far beyond ignorance of whether {\mathsf{P = NP}}:

  • No super-linear time lower bound is known for any of the {\mathsf{NP}}-complete problems like {\mathsf{SAT}} that are actually in nondeterministic linear time, if Turing machines are allowed to have planar tapes.

  • No super-linear circuit size lower bound is known for any function in {\mathsf{E^{NP}}}, even though {\mathsf{E = DTIME}[2^{O(n)}]} is an intractable uniform time class and the oracle queries are allowed to be exponentially long.

  • Indeed the highest circuit size lower bound known for such an “explicit function” is {5n - o(n)}, and even that bound “cheats” by excluding XOR and its negation from the gate basis; moreover the technique for it can go no further.

  • The main technique for algebraic circuit size lower bounds tops out at size {\Theta(n\log n)}.

  • Even the famous {\Omega(n\log n)} lower bound for sorting, which is taught in undergraduate data structures and algorithms courses, is false in reasonable models; a toy sketch appears after this list.
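
On the last point, here is a minimal sketch of our own (not drawn from any of the results above) of how sorting escapes the comparison-model bound: on a word RAM, integer keys from a universe of size {u} can be sorted in {O(n + u)} time with no comparisons between keys at all.

```python
# Counting sort: a toy sketch (our own illustration) of sorting n integer
# keys drawn from {0, ..., u-1} in O(n + u) time on a word RAM.  No two
# keys are ever compared, so the comparison-model Omega(n log n) bound
# simply does not apply in this model.

def counting_sort(keys, universe_size):
    """Return the keys in nondecreasing order without comparing keys to each other."""
    counts = [0] * universe_size
    for k in keys:                      # tally how often each value occurs
        counts[k] += 1
    output = []
    for value, count in enumerate(counts):
        output.extend([value] * count)  # replay each value in increasing order
    return output

if __name__ == "__main__":
    print(counting_sort([5, 3, 8, 3, 0, 9, 5, 1], universe_size=10))
    # -> [0, 1, 3, 3, 5, 5, 8, 9]
```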

The last item hints at a kind of kryptonite that no champion of “truth, justice, and the American Way”—at least not the first of these—can overcome. Perhaps many of the lower-bound statements we have been trying to prove are just false.

Lex’s Vengeance?

Superman’s greatest enemy, Lex Luthor, reminds us that “lex” stands for binary words, which are the Gotham City we are fighting to control. What is it about information that refuses to be known? As is clear from a post made by Bill Gasarch about the Sipser-fest, questions like these filled the halls. A discussion of the Fusion Method with its co-originator, Mauricio Karchmer, led to Karchmer saying the following:

(1) We can’t prove lower bounds because they are false, and (2) we can’t prove upper bounds because we are stupid.

A followup comment by him says he was not kidding.

What makes this double trouble is a theme we have talked about on this blog, that proving a good lower bound often involves finding a new upper bound. Bill gave eleven examples in a post last year titled, “Are most lower bounds really upper bounds?”

The link works in the other direction with Natural Proofs and hardness-versus-randomness: from any lower-bound proof (of a broad kind) for certain problems (again of a broad kind) one can derive a new algorithm for some other problem. For discrete log it can be the same problem, and this leads to ironclad nonexistence of natural proofs of its difficulty. Lance Fortnow has just now remarked the following about Ryan Williams and his circuit lower bound for {\mathsf{NEXP}}, which is Bill’s eleventh example, and which we covered two years ago:

Recently Ryan has turned that process around, for example getting a faster algorithm for all-pairs shortest path using the techniques from the Razborov-Smolensky circuit lower bounds.

To our mind, this raises a natural challenge, at least for making survey charts of objectives to guide people entering our field:

Which upper bounds seem hard to prove because we might be stupid, and which seem hard to prove because they imply lower bounds that might be false? And vice-versa…

Can Batman Win?

As we said in our intro, Batman is distinguished by not needing to rely on any super powers, just diligence and training and study and a sonar sense of where to attack. A new movie, tentatively titled “Batman v Superman: Dawn of Justice,” is in production for a 2016 release. So what chance does a Batman have in our arena?

Dick and I have tried to offer a “progressive” outlook that at least frames the issue of what an algorithm might be said to “learn,” and trial notions of constructiveness and succinctness. These may be mere gadgets, but Batman after all keeps a large utility belt. There are of course many other ideas out there, and Batman has no illusion that just using any one of them will bring the hard cases down.

What we suspect is needed, however, is a deeper strategy—a new algorithmic idea. We are struck that many breakthroughs in the past—including several mentioned above—have come from perception and application of a new procedural idea. Even diagonalization is algorithmic—given a function {F} from a set {S} to its power set, it shows how to find a subset {D} of {S} that is not in the range of {F}. Indeed, for any function {g} on {S}, we can define {D_g = \{g(x): g(x) \notin F(x)\}}, and there is still scope for interest in how the diagonal sets {D_g} one obtains depend on the function {g} as well as on {F}.
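
To make the algorithmic flavor concrete, here is a toy sketch of our own of the diagonal construction on a finite set, with {g} defaulting to the identity:

```python
# Diagonalization as an algorithm: given F mapping a finite set S into
# subsets of S (finite only so we can run it), build the diagonal set
# D_g = { g(x) : g(x) not in F(x) }.  With g the identity this is Cantor's
# classic diagonal set D, which cannot equal F(x) for any x.

def diagonal_set(S, F, g=lambda x: x):
    """Return D_g = { g(x) : x in S and g(x) not in F(x) }."""
    return {g(x) for x in S if g(x) not in F[x]}

if __name__ == "__main__":
    S = {0, 1, 2, 3}
    F = {0: {0, 1}, 1: {2}, 2: set(), 3: {0, 1, 2, 3}}   # a sample F: S -> P(S)
    D = diagonal_set(S, F)
    print(D)                                  # {1, 2}
    print(any(D == F[x] for x in S))          # False: D is missed by F
```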

What we wonder is whether there is greater scope for “introspection” of the process of executing a proof, specifically the effect of complexity limitations on the process. Suppose a deduction hinges on whether certain functions {g_n} on a family of sets {S_n} are 1-to-1. Suppose this may in fact be false, but {g = \{g_n\}} is a cryptographic hash function for which finding colliding pairs {x_n,y_n \in S_n} such that {g_n(x_n) = g_n(y_n)} is hard. Already we have tools for relating this kind of hardness to independence in (weak) formal systems, and many papers on the proof-theoretic power of various degrees of the pigeonhole principle have faced this issue. We still feel this is a cave worth exploring more deeply. Independence results are also mentioned in Bill’s post, quoting Steven Rudich as telling him at the Sipser workshop that they too are alas not close.
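
As a toy rendering of that setup (our own sketch, with SHA-256 truncated to {n} bits standing in for {g_n}): the pigeonhole principle guarantees that collisions exist, and a birthday-style search finds one after roughly {2^{n/2}} evaluations, an effort that grows exponentially with the output length.

```python
# Toy sketch (our own illustration): model g_n by SHA-256 truncated to n bits.
# Collisions must exist by the pigeonhole principle, and a birthday-style
# search finds one after about 2**(n/2) hash evaluations -- quick for n = 24,
# hopeless for full-length cryptographic outputs.

import hashlib

def g(x: bytes, n_bits: int = 24) -> int:
    """Truncated hash standing in for g_n; many-to-one, so collisions exist."""
    digest = hashlib.sha256(x).digest()
    return int.from_bytes(digest, "big") >> (256 - n_bits)

def find_collision(n_bits: int = 24):
    """Hash successive inputs until two distinct ones collide."""
    seen = {}
    i = 0
    while True:
        x = i.to_bytes(8, "big")
        h = g(x, n_bits)
        if h in seen:
            return seen[h], x            # two distinct inputs with equal hash value
        seen[h] = x
        i += 1

if __name__ == "__main__":
    x, y = find_collision(24)
    print(x.hex(), y.hex(), g(x) == g(y))    # two colliding inputs, then True
```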

In DC Comics, Batman was eventually able to seize Lex Luthor’s kryptonite ring, though he keeps it only in case Superman goes wrong. Perhaps with more-clever self-reference we can turn it around and find a sense in which a little kryptonite can be exploited for good?

Open Problems

Can Batman make progress, or will we need a new super power idea? What do you think?

We also salute Mike on his 60th.

[minor word and symbol tweaks]

Comments
  1. November 7, 2014 11:29 am

    yeah what a great analogy. complexity theory as a comic book. there are “impenetrable adversary lairs” so to speak. (jukna writes in his great book a chapter on “our adversary”.) 👿
    still waiting on our superhero with Einstein-level mad scientist powers to vanquish this fiendishly diabolical foe, nontrivial lower bounds or P=NP.
    as for results in this area, the time/space tradeoffs of van melkebeek / williams / fortnow are notable.
    you might enjoy this link The Holy Grail of Comic Books Hid in Plain Site at New York Comic Con… & here around my parts someone rendered this $3.2M cover in a lego mosaic. awesome!
