
Happy Unbirthday To Turing

June 24, 2012

Happy 100th birthday to our great Machine Tools provider


Alan Turing was born 100 years ago yesterday. Hence today is his first unbirthday since his one hundredth birthday. Per Wikipedia, an unbirthday is an event that can be celebrated on any day that is not the person’s birthday. It was created by Lewis Carroll in his celebrated novel Through the Looking-Glass.

Today we speculate on what Turing would find most amusing about what he helped create.

Of course many of his near-contemporaries are giving their own impressions at celebratory conferences around the globe, including ACM San Francisco a week ago, Cambridge University this past week, and Manchester right now.

We decided at GLL that we could not compete with all the celebratory events, dedicated pieces, and speeches that are being given for him. These will no doubt, like the earlier Princeton Turing Celebration, talk about how great his work was, how ground-breaking it was, and how he helped create computer science itself as a new discipline.

No, we cannot compete with all these, but we do feel we could put our own spin on which developments would most bemuse Turing. Recall that “bemuse” means:

to cause to have feelings of wry or tolerant amusement.

Turing Is Not Amused

Since Turing was a British subject, we are paraphrasing the famous statement “We are not amused.” See here for this quote:

This supposed quotation was attributed to Queen Victoria by Caroline Holland in Notebooks of a Spinster Lady, 1919. Holland attests that Victoria made the remark in 1900, but supplies no details of the circumstances. Other reports have suggested that the line wasn’t an example of the ‘royal we’, but that Victoria was speaking on behalf of all the ladies present at court.

We contend that the following “obvious” examples would not be the things that would most amuse Turing:

  • He would not be most amused by the sheer technology advances. Yes an iPhone, an iPad, a laptop, and all the other gadgets are remarkable. But he was already thinking beyond Siri.

  • He would not be most amused by chess playing programs being better than world champions.

  • He would also not be most amused by the other advances in what we now call AI.

You may or may not agree. But surprise is not the same as being amused—or bemused.

Turing Is Amused

So what would he find most amusing? We believe—perhaps you will too—that it is the long staying power of his machine model that we now call Turing machines. And our unfortunate inability, over 70 years later, to have any real idea of their limitations.

His proof that they cannot solve the Halting Problem was brilliantly mutated into proofs that Turing machines with computational limits on time and space and other measures also are restricted in their abilities—with respect to the same measure. But today, beyond the kind of diagonalization results due to Juris Hartmanis and Richard Stearns and a few others, there is precious little that we can prove about his famous machines. Ken and I find this dual success and failure—his machines are still the model of computation for theory and the diagonal proof is still the only way we can prove restrictions on their power—what might amuse Turing the most.

In the next sections we would like to expand a bit more on this point, and also show how versatile Turing’s machines are. His formalism not only founded a science—computational complexity—that didn’t really begin until a decade after his death, it still has growing room.

The Multi-Tape Turing Machine

Turing showed how the universe can be encoded onto a single one-dimensional tape. This is what Google’s Turing 100 doodle shows.

The only extension complexity theorists ask for is to have any finite number {k} of tapes. The first tape conventionally receives the input {x}, and optionally the last tape provides the output {y}. Both {x} and {y} are strings over an I/O alphabet {\Sigma}, while the machine employs a larger work alphabet {\Gamma}, which includes the blank character {B}. The machine has a finite set of states {Q}, which includes the start state {s} and a set {F} of designated final states. Let {D = \{L,R,S\}} stand for directions to move one cell left or right or to stay, respectively. Then all we need to specify a Turing machine {M} is to give a subset

\displaystyle  \delta \subset (Q \times \Gamma^k) \times (\Gamma^k \times D^k \times Q)

of instructions {(p,\vec{c};\vec{d},\vec{D},q)}, where {p,q} are states with {q = p} allowed, and the other components are {k}-tuples of chars or directions.

Mathematically that’s all {M} is—we need to mention tapes and heads only to execute {M}. To define a computation we need that every configuration includes the current state {p} and the characters {\vec{c} = c_1,\dots,c_k} currently scanned on the tapes. Find an instruction that begins with {p} and {\vec{c}}, change the characters to {\vec{d}}, move heads per {\vec{D}}, and go to state {q}, setting up the configuration for the next move. Start with {p=s} and {c_1 = x_1} and everything else blank, and halt if and when you reach a configuration with no applicable instruction. The language {L(M)} is the set of {x} that have a computation that halts in a state in {F}.

The time is the number of steps it took to halt. {M} is deterministic if {\delta} is a partial function on {Q \times \Gamma^k}, so there is always at most one applicable instruction and hence one computation. {\mathsf{NP}} is the class of languages of machines {M} whose accepting computations all have length {|x|^{O(1)}}, and {\mathsf{P}} the subclass when {M} is deterministic. Complexity theory in a nutshell.
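To make the formal definition above concrete, here is a minimal Python sketch of a deterministic multi-tape simulator—our own illustration, not from the post. The interface and the example machine (a hypothetical one-tape machine accepting strings of 0s of even length) are assumptions for demonstration only:

```python
from collections import defaultdict

BLANK = "B"

def run_tm(delta, start, finals, x, k=1, max_steps=10_000):
    """Simulate a deterministic k-tape TM. Here delta is the partial
    function (state, scanned chars) -> (written chars, directions, next
    state); tape 1 holds the input x; the machine halts when no
    instruction applies. Returns (accepted, steps taken)."""
    tapes = [defaultdict(lambda: BLANK) for _ in range(k)]
    for i, c in enumerate(x):
        tapes[0][i] = c  # input on the first tape
    heads = [0] * k
    state = start
    steps = 0
    while steps < max_steps:
        scanned = tuple(tapes[j][heads[j]] for j in range(k))
        if (state, scanned) not in delta:
            return state in finals, steps  # halted: accept iff final
        written, dirs, state = delta[(state, scanned)]
        for j in range(k):
            tapes[j][heads[j]] = written[j]
            heads[j] += {"L": -1, "R": 1, "S": 0}[dirs[j]]
        steps += 1
    raise RuntimeError("step budget exceeded")

# Hypothetical example: a 1-tape machine accepting 0^n for even n
delta = {("e", ("0",)): (("0",), ("R",), "o"),
         ("o", ("0",)): (("0",), ("R",), "e")}
```

Here `run_tm(delta, "e", {"e"}, "0000")` accepts in 4 steps, while an odd-length input halts in the non-final state `"o"` and is rejected. Determinism is built in because `delta` is a Python dict, i.e. a partial function.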

TM Definitions for Everything

The real power of this formalism is how easily other important concepts can be defined from it. Conditions apply to every instruction.

  • Finite Automaton: {k = 1}; {c \in \Sigma} and {D = R}.

  • Pushdown Automaton: {k = 2}; {c_1 \in \Sigma}, {D_1 \neq L}, {D_2 = L \implies d_2 = B}.

  • Linear Bounded Automaton: {k = 1}; {c = B \implies d = B}.

The notions of language accepted and (non-)determinism factor right in—for instance, a finite automaton halts on the {B} following {x} and accepts iff the current state is in {F}. Thus Turing gives us Noam Chomsky’s hierarchy.
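As a small illustration of the finite-automaton restriction above ({k=1}, {c \in \Sigma}, {D = R}), such a machine never rewrites a cell it will revisit, so its run collapses to a single left-to-right sweep. This sketch, with a hypothetical two-state machine of our own, is one way to see that:

```python
def run_finite_automaton(delta, start, finals, x):
    """Run a TM obeying the finite-automaton conditions: one tape,
    every scanned character is from the input alphabet, every move
    is R. The run is a single sweep; the machine halts on the blank
    after x and accepts iff the state reached is final."""
    state = start
    for c in x:  # one cell per step, always moving right
        if (state, c) not in delta:
            return False  # no applicable instruction: reject
        state = delta[(state, c)]
    return state in finals

# Hypothetical example: accept binary strings with an even number of 1s
delta = {("even", "0"): "even", ("even", "1"): "odd",
         ("odd", "0"): "odd", ("odd", "1"): "even"}
```

With this `delta`, `run_finite_automaton(delta, "even", {"even"}, "1001")` accepts and `"1011"` is rejected, exactly as the corresponding DFA would decide.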

  • Space complexity of a computation: the number of cells whose character was ever changed and re-read.

  • Reversal: when a head moves {L} having previously moved {R}, or vice-versa.

  • Reversal complexity: the total number of reversals.

  • {\mathsf{NC}}: The class of languages accepted by polynomial-time deterministic TM’s that make {\mathsf{polylog}(n)} reversals.
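The reversal count in the bullets above can be sketched directly—an illustrative helper of our own, operating on a recorded sequence of one head's moves:

```python
def count_reversals(moves):
    """Count reversals in a sequence of head moves ('L', 'R', 'S').

    A reversal occurs when the head moves L having previously moved R,
    or vice versa; stationary 'S' steps do not reset the previous
    direction of motion."""
    reversals = 0
    last = None  # last non-stationary direction seen
    for m in moves:
        if m == "S":
            continue
        if last is not None and m != last:
            reversals += 1
        last = m
    return reversals
```

For instance `count_reversals("RRSLLSR")` returns 2: one reversal at the first L, one at the final R. Summing this over all heads of a computation gives its reversal complexity.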

Thus we can define the class of problems with fast parallel algorithms without having to define a parallel machine. Can we define streaming algorithms? Well, we can try:

  • Streaming machine: All tapes either use {\mathsf{polylog}(n)} space or are limited to {\mathsf{polylog}(n)} reversals.

Let {\mathsf{SA}} be the class of languages accepted by streaming machines that also run in polynomial time. Then {\mathsf{SA}} contains both {\mathsf{NC}} and {\mathsf{SC}}, the latter named “Steve’s Class” after Steve Cook. Is it in some sense a closure of both classes? What if polylog is replaced by log?

Here are some concepts that lend themselves to more-stringent notions of streaming and have other uses.

  • Online machine: {D_1 \neq L}, wlog. {d_1 = c_1}.

  • Real-time input: {D_1 = R}.

  • Real-time output: {D_k = R}, {d_k \in \Sigma}.

  • Sequence generator 1: Input is empty, {D_k \neq L}, {d_k \in \Sigma}.

  • Sequence generator 2: Input is {0^n}, {D_1 = R}, {c_1 \neq B}, {D_k \neq L}, {d_k \in \Sigma}.

  • Fixed stride on tape {j}: for some fixed {m} the head moves right every {m}th step and is stationary otherwise. Similar concept with “left” in place of right, and allowing worktapes to alternate phases of each.

Thus we can talk about streaming in real time or some fixed stride, not just for the input and output tapes but also for the non-logspace or non-polylog-space worktapes. Turing’s flexible framework allows many handles on the processing of information. This even goes for oracle machines—Turing defined those too.

Is TM Linear Time “Right”?

Turing machines are equivalent to many models for defining polynomial time, but as surveyed here, linear or other fixed polynomial time bounds are not so robust. Most of those models, including random-access machines and TM’s with planes instead of tapes, can do computations that TM’s provably cannot imitate step-for-step, even with some fixed delay. This does not prove they accept more languages in linear time, but that is plausible. Nevertheless we may ask:

Are these richer models technologically scalable? Or is the more-restrictive Turing-tape notion of linear time “correct”?

The richer models do not charge for poor locality of reference, something you can actually feel whenever a program making many small random-access reads causes page faults and “thrashing” of disks. Turing tapes, by contrast, have an affinity for streaming and pipelining, and embody the technological notion that “locality is one-dimensional.” But what can distinguish the Turing machine notion of linear time?

One distinction is that the standard circuit-complexity notion of {\mathsf{DLOGTIME}}-uniformity really defines the circuits’ connection languages to belong to TM linear time, because the inputs to these languages have encodings of size {O(\log n)}. What makes this uniformity notion salient is the way Turing machine computations lend themselves to definability in first-order logic. The language classes of non-deterministic and alternating linear-time TMs have notable equivalent characterizations.

The “purest” indicator, however, may be the truth of the Hartmanis-Stearns Conjecture, which we have been discussing. This connects Turing machine linear time to the classic concepts of rational, algebraic, and transcendental number. Although it is for sequence generation, not language acceptance, the linear structure of Turing’s tapes is the key feature.

Open Problems

Is {\mathsf{SA}} an interesting class? Does it have a simple characterization via known classes? Could it equal {\mathsf{P}}?

If the Hartmanis-Stearns conjecture is false, can one give a natural, more-restrictive notion of linear time that makes it true?

Can the function {f} on {\{0,1,2\}^*} inductively defined by {f(\lambda) = \lambda}, {f(0x) = 0f(x)}, {f(1x) = 1f(x)}, {f(2x) = f(x)2} be computed by linear-size circuits? It moves all 2’s flush-right while leaving the sequence of 0’s and 1’s the same, e.g. {f(02112202) = 01102222}. See this guest post by Ken on the Computational Complexity blog.
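The function {f} itself is easy to compute; a direct Python transcription of the inductive definition (our own sketch) reproduces the stated example:

```python
def f(x):
    """Inductive definition from the post: f("") = "",
    f("0"+x) = "0"+f(x), f("1"+x) = "1"+f(x), f("2"+x) = f(x)+"2".
    Net effect: move all 2's flush-right, preserving the 0/1 order."""
    if x == "":
        return ""
    head, rest = x[0], x[1:]
    if head == "2":
        return f(rest) + "2"
    return head + f(rest)

print(f("02112202"))  # -> 01102222, matching the example in the post
```

The open question, of course, is not computing {f} but whether linear-size circuits can.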

[changed/added sentences with “unbirthday” and “oracle” and “salient”]

10 Comments
  1. June 24, 2012 11:20 pm

    Doron Zeilberger’s latest (heretical) opinion starkly contrasts with this post.

    • June 25, 2012 12:08 am

      I think his opinion runs transverse to ours, not opposed. Notice the bolded word “scalable”, which is a finitistic way to say “O(n)”. I added the sentence about “oracles” as an edit after we posted—elsewhere on this blog you can find that we’re not fans of them. We did put some pointed deprecation of Turing Machines into Kurt Gödel’s mouth here, but that’s not inconsistent—the fact of their long-term use and their morphable richness is our point. But thanks for noting the opinion!

      [Regarding the opinion itself, I think the statement “Every meaningful statement is either provable or disprovable” runs afoul of finitism already—what if the proof is really long? The issue of how proofs scale with meaningful statements is a complexity issue, for which (for instance) this work on Groebner proof systems is a good place to start.]

  2. Emmanuel permalink
    June 25, 2012 4:43 am

And 100 years after the birth of Alan Turing, his machine comes true! Without electricity or electrons! Check the video, it looks great.

  3. John Sidles permalink
    June 25, 2012 11:58 am

    Although the mathematical idea of the number-line (more formally, the real numbers) and the practical applications of that idea both date to antiquity, even today — after thousands of years of mathematical effort — we have achieved only an incomplete mathematical taxonomy of the Real Zoo: integers, rationals, transcendentals, incomputables, infinitesimals, and surreals (to name only some).

    Today the complexity-theoretic idea of P and the practical applications of that idea are only a few decades old, and so is it any wonder that we have achieved (to date) only a hazy complexity-theoretic grasp of P’s taxonomy?

    That is what Alan Turing might perhaps have enjoyed most about our present understanding of his machines … how strikingly little we know about the ecosystem of mathematical creatures who dwell on those tapes!

As with many researchers, my own interest in the creatures of Turing’s Zoo is largely or even wholly practical, and indeed I can say with the early Arab physician Abu Bakr Muhammad ibn Zakariyya al-Razi (865–925 CE)

With respect to mathematics, I acknowledge that I have looked into them only to the extent that was indispensable for me. … That I have not consumed my time in trying to master them is deliberate on my part … If what I have reached with respect to knowledge is not what is reached by the one deserving to be called a philosopher, then I would like to know who such a one would be in this epoch of ours.

    Like very many modern-day researchers I combine a practical interest in the creatures of Turing’s Zoo with a keen appreciation of their mystery and beauty. And therefore, to any person who can help to improve our collective understanding of these creatures, to that person in three days I shall distribute my modest store of TCS StackExchange reputation points with the prodigious generosity, and the delighted appreciation, and the sincere thanks, that such contributions deserve! 🙂

  4. Serge permalink
    June 26, 2012 9:14 am

    Turing machines are, with lambda-calculus, good languages for proving properties of algorithms but bad ones for humans to make use of computers. Symmetrically, a language such as C is highly usable but it’s no good for doing theory. Let’s say that the more you know about things, the less you can make use of them and the other way around…

In complexity theory, this means that the more you know about how an algorithm behaves, the less you can make use of it practically, and the other way around. The algorithms taught in the classroom with their complexity proofs (sorting, searching, etc.) are easy exceptions studied precisely for their simplicity. At the opposite end we have the NP-complete problems. In between we have a problem such as integer factoring. Hence my conjecture:

    If an algorithm A solves SAT in polynomial time, then either:

    1) A is “galactic” (A is physically unrunnable)
    2) the proof of its polynomiality is “galactic” (it can’t be understood by any brain, whether natural or artificial).

    • John Sidles permalink
      June 26, 2012 11:24 am

      Serge, it is fun to transform your post into a form that the Babylonians, Egyptians, and Greeks of antiquity would have understood along the following lines:

      Turing machine \to real number

      complexity class P \to rational number

      run-time exponent \to least denominator

      SAT \to \pi

      Then your conjecture becomes a plausible intuition of mathematical antiquity:

If a rational number is \pi — that is, the ratio of the circumference to the diameter of a circle — then either (1) \pi‘s denominator is “galactic” (that is, \pi is physically infeasible to compute) or (2) the proof of \pi‘s rationality is “galactic” (that is, it can’t be understood by any brain).

      To the ancient Babylonians, Egyptians, and Greeks this conjecture might have seemed reasonable. What “bemused” (in Dick and Ken’s excellent phrase) these ancient mathematicians was the lack of (1) well-posed mathematical definitions of irrationality and transcendence, and the associated lack of (2) a theorem-proving tool-set adapted to these notions.

      These bemusing lacunae were not easily or quickly filled … more than four thousand years of mathematical effort were required before Johann Heinrich Lambert proved that \pi is irrational and Ferdinand von Lindemann proved that \pi is transcendental. And of course, the proofs of Lambert and Lindemann could not even be created until *after* well-posed definitions of irrational and transcendental had been conceived, and an armamentarium of proof methods for drawing rigorous inferences regarding these definitions had been developed.

These reflections inspire us to wonder (along with Juris Hartmanis, for one) whether our present definitional appreciation of the complexity classes P and NP is adequate even to the proper statement of theorems that we might reasonably strive to prove. And this is a possibility that might plausibly have bemused Alan Turing.

      • Serge permalink
        June 26, 2012 3:00 pm

Yes John, your analogy’s both fun and clever. 🙂 Moreover, it’s quite possible that your opinion about a lack of accurate definitions is the right one.

However, you’ll have noticed that algorithms are to be examined by means of proofs, and that proofs are also algorithms in their own right. When you study something by means of itself, all kinds of weird phenomena may happen. This is quite a different situation from that with real numbers, where the topological field R is in no way isomorphic to the set of proofs.

I really feel that complexity theory needs its Heisenberg. We might well discover that, as for quantum particles in the 1920s–30s, most interesting facts about algorithms and problems are essentially unknowable… unless we manage to build more powerful tools such as quantum algorithms, quantum proofs and quantum computers. But for now, the P vs. NP question seems to require an act of faith: there’s the P=NP religion, the P!=NP religion, the undecided religion, and probably many others which I’m not aware of.

      • John Sidles permalink
        June 26, 2012 3:32 pm

Serge, both you and I, and Doron Zeilberger in his latest opinion, and Juris Hartmanis in his classic monograph Feasible Computations and Provable Complexity Properties, and perhaps Dick Lipton and Ken Regan too, all share an uneasiness regarding the central role that oracles play in the definitions of P and NP.

        Progress in understanding the number-line continuum required mathematicians to associate a great many sophisticated definitions to the (essentially correct) intuition that “a real number is a point on the number line”. Even after many centuries, creative definitional adjustments to our appreciation of the number line still are ongoing (witness the surreal numbers of Conway and Knuth, for example). Good!

Similarly, to make progress in understanding computational complexity, perhaps we need to improve the definitions associated to the (essentially correct) intuition that “a language is in P iff it is accepted by some Turing machine that runs in polynomial time.” If we are lucky, then perhaps creative definitional adjustments to our appreciation of computational complexity will similarly continue in decades and even centuries to come. If so, good! 🙂

  5. June 27, 2012 2:29 am

    The Guardian prize crossword last Saturday had the theme “centenary”. It was very enjoyable, and the fact that I have mentioned it here should give you a head start.

    • June 27, 2012 11:27 pm

      Thanks, Peter! This link to the PDF of the puzzle worked for me, but I may have a grandfathered Grauniad sub.
