
Transcomputational Problems

April 30, 2015


Problems beyond brute force search

[Photo: Hans-Joachim Bremermann, cropped from Wikipedia source]

Hans-Joachim Bremermann was a mathematician and biophysicist. He is famous for a limit on computation, Bremermann’s limit, which is the maximum computational speed of a self-contained system in the material universe.

Today Ken and I wish to talk about the limit and why it is not a limit.

A transcomputational problem is a problem that requires processing of more than {10^{93}} bits of information. The number comes from Earth-scale considerations, but adding fewer than 30 to the exponent takes the bound from the Earth to the scale of the entire known universe. Our friends at Wikipedia say:

This number is, according to Bremermann, the total number of bits processed by a hypothetical computer the size of the Earth within a time period equal to the estimated age of the Earth. The term transcomputational was coined by Bremermann.
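
A back-of-envelope check of the number, with round figures that are ours rather than Bremermann’s exact ones: his limit works out to roughly {10^{47}} bits per second per gram of matter, and multiplying by the Earth’s mass in grams and its age in seconds lands within a factor of a few of {10^{93}}.

    # Rough reconstruction of the 10^93 figure from Bremermann's limit.
    # All constants are approximate and for illustration only.
    c = 3.0e8        # speed of light, m/s
    h = 6.626e-34    # Planck's constant, J*s

    bits_per_sec_per_gram = 1e-3 * c**2 / h   # ~1.4e47 (the limit is c^2/h per kilogram)
    earth_mass_grams = 6.0e27                 # mass of the Earth, in grams
    earth_age_seconds = 1e10 * 3.15e7         # ~10^10 years, in seconds

    total_bits = bits_per_sec_per_gram * earth_mass_grams * earth_age_seconds
    print(f"{total_bits:.1e}")                # about 2.6e92, within a factor of a few of 10^93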

Pro

What is interesting is that he thought about “transcomputational” problems in 1962. Yes, almost a decade before the P=NP problem was stated. See his paper for details.

He noted back then that certain problems were beyond any reasonable brute-force search. In his own words:

The experiences of various groups who work on problem solving, theorem proving and pattern recognition all seem to point in the same direction: These problems are tough. There does not seem to be a royal road or a simple method which at one stroke will solve all our problems. My discussion of ultimate limitations on the speed and amount of data processing may be summarized like this: Problems involving vast numbers of possibilities will not be solved by sheer data processing quantity. We must look for quality, for refinements, for tricks, for every ingenuity that we can think of. Computers faster than those of today will be a great help. We will need them. However, when we are concerned with problems in principle, present day computers are about as fast as they ever will be. We may expect that the technology of data processing will proceed step by step—just as ordinary technology has done. There is an unlimited challenge for ingenuity applied to specific problems. There is also an unending need for general notions and theories to organize the myriad details.

Quite insightful for a paper that dates from a decade before Cook-Karp-Levin on P=NP. It also predates the limits associated with Jacob Bekenstein’s bound on the information capacity of finite space and/or with Rolf Landauer’s principle.

One wonders what might have happened if Bremermann’s paper had been better known in our theory community. Ken notes that the Russian theory community in the 1960s highlighted the question of perebor—brute-force search. But he senses the emphasis was on problems for which it could be necessary in the abstract, rather than tied to Earth-scale considerations like Bremermann’s.

Con

Of course there are several eventualities that were missed. One is quantum computation—I believe all his calculations depend on a classical view of computation. There are several other points that we can raise to attempt to beat his “limit.”


{\bullet } Change the algorithms: Of course his limit could be applied to computing primality, for example. The brute force method is hopeless for even modest-sized numbers. Yet we know methods that are much better than brute force and so we can easily beat his limit.
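
As a concrete sketch (ours, for illustration): trial division inspects on the order of {\sqrt{n}} candidate divisors, while the Miller-Rabin test needs only a handful of modular exponentiations; the fixed witness set below is known to decide primality exactly for all 64-bit inputs.

    def is_prime_trial_division(n):
        """Brute force: try every candidate divisor up to sqrt(n)."""
        if n < 2:
            return False
        d = 2
        while d * d <= n:
            if n % d == 0:
                return False
            d += 1
        return True

    def is_prime_miller_rabin(n, bases=(2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37)):
        """Miller-Rabin with a fixed witness set that is exact for 64-bit n."""
        if n < 2:
            return False
        for p in bases:
            if n % p == 0:
                return n == p
        d, r = n - 1, 0
        while d % 2 == 0:
            d //= 2
            r += 1
        for a in bases:
            x = pow(a, d, n)
            if x in (1, n - 1):
                continue
            for _ in range(r - 1):
                x = pow(x, 2, n)
                if x == n - 1:
                    break
            else:
                return False            # base a witnesses that n is composite
        return True

    print(is_prime_miller_rabin(2**61 - 1))   # True in microseconds; trial division
                                              # would need about 1.5 billion divisions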

Steve Wozniak visited Buffalo yesterday as a UB Distinguished Speaker. In a small group attended by Ken, he told his standard anecdote about the first program he ever wrote. This was to solve the Knight’s Tour problem on an {8 \times 8} chessboard. He first coded a brute-force solution trying all Knight moves at each step but realized before he hit “run” that it would take about {10^{25}} years. This awakened him, he said, to the fact that good algorithms have to go hand-in-hand with good hardware.
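
For the record, the gap Wozniak ran into is algorithmic rather than a hardware problem: ordering the search by Warnsdorff’s old rule (try the square with the fewest onward moves first) finds a full tour in a fraction of a second. A minimal sketch of that idea, ours and not Wozniak’s program:

    def knights_tour(n=8, start=(0, 0)):
        """Backtracking search for a knight's tour, ordering moves by Warnsdorff's
        rule: prefer the square with the fewest onward moves."""
        jumps = [(1, 2), (2, 1), (2, -1), (1, -2), (-1, -2), (-2, -1), (-2, 1), (-1, 2)]
        visited = {start}
        tour = [start]

        def moves(sq):
            r, c = sq
            return [(r + dr, c + dc) for dr, dc in jumps
                    if 0 <= r + dr < n and 0 <= c + dc < n
                    and (r + dr, c + dc) not in visited]

        def extend():
            if len(tour) == n * n:
                return True
            for nxt in sorted(moves(tour[-1]), key=lambda sq: len(moves(sq))):
                visited.add(nxt)
                tour.append(nxt)
                if extend():
                    return True
                visited.discard(tour.pop())
            return False

        return tour if extend() else None

    print(len(knights_tour()))   # 64 squares visited, found almost instantly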


{\bullet } Change the answers: Another method is to change what we consider as an answer. Approximation algorithms are of course one important example: allow the answer to be near the optimal one. This has opened the floodgates, greatly increasing the class of problems that we can solve.
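
A standard example, sketched below in a few lines: the greedy matching heuristic for Vertex Cover returns a cover at most twice the optimum, in linear time, with no search at all.

    def vertex_cover_2approx(edges):
        """Greedy 2-approximation for Vertex Cover: repeatedly take both endpoints
        of an uncovered edge. The chosen edges form a matching, and any cover must
        hit each of them, so the result is at most twice the optimum."""
        cover = set()
        for u, v in edges:
            if u not in cover and v not in cover:
                cover.update((u, v))
        return cover

    # A star on {0,1,2,3} plus a triangle on {4,5,6}; the optimum cover has 3 vertices.
    edges = [(0, 1), (0, 2), (0, 3), (4, 5), (5, 6), (6, 4)]
    print(vertex_cover_2approx(edges))   # e.g. {0, 1, 4, 5}: size 4 <= 2 * 3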


{\bullet } Change the problems: Another method is to change the problems that we attack. In many cases we can avoid general problems and exploit special structure of a problem. Examples that come to mind include: replace dense matrices by sparse ones; replace arbitrary graphs by planar ones or those with restricted minors; and replace data analysis of arbitrary data sets by analysis of data that is generated with specific noise, like Gaussian.
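
To make the sparse-versus-dense point concrete, here is a toy compressed-sparse-row matrix-vector product; storage and work scale with the number of nonzeros rather than with {n^2}. (In practice one would reach for a library such as scipy.sparse; this is only a sketch of the idea.)

    def csr_matvec(indptr, indices, data, x):
        """y = A x for a matrix stored in compressed sparse row (CSR) form.
        Row i keeps its nonzeros in data[indptr[i]:indptr[i+1]], with their column
        positions in the matching slice of indices, so the work is O(nonzeros)."""
        y = [0.0] * (len(indptr) - 1)
        for i in range(len(y)):
            for k in range(indptr[i], indptr[i + 1]):
                y[i] += data[k] * x[indices[k]]
        return y

    # The 3x3 matrix [[2, 0, 0], [0, 0, 3], [1, 0, 4]] stored as CSR:
    indptr = [0, 1, 2, 4]
    indices = [0, 2, 0, 2]
    data = [2.0, 3.0, 1.0, 4.0]

    print(csr_matvec(indptr, indices, data, [1.0, 1.0, 1.0]))   # [2.0, 3.0, 5.0]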


{\bullet } Change the world: We have posted about the idea of a world without true randomness, presenting Leonid Levin’s proof that SAT is nearly polynomially solvable in such a world. That post offered the weaker idea that every efficient generator of SAT instances might be solved by Levin’s algorithm on all but finitely many instances. The finite bound might be huge, but the fact of Levin’s algorithm would carry weight: it would solve everything else based solely on the principle that “nothing succeeds like success.” We can put this idea in concrete terms like Bremermann’s:

Could we live in a world where the act of creating an instance that requires processing more than {10^{93}} bits of information requires processing more than {10^{93}} bits of information?

We note that Larry Stockmeyer proved that every Boolean circuit capable of deciding a certain class of logical formulas {f} that fit into 8 lines of 80-column ASCII text must have more gates than atoms in the observable universe. But this does not rule out a little algorithm {A} solving every {f} that we could generate—unless we spend the time to cycle through every legal formula that fits into 640 characters.
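
The engine under Levin’s algorithm is nothing more exotic than dovetailing plus cheap verification: interleave all candidate solvers and accept the first proposal that checks out. A toy sketch of that pattern for SAT (ours and purely illustrative; Levin’s construction enumerates all programs rather than a fixed list):

    from itertools import count, product

    def satisfies(clauses, assignment):
        """Cheap verification: does the assignment satisfy every clause of the CNF?"""
        return all(any(assignment[abs(l)] == (l > 0) for l in clause) for clause in clauses)

    def dovetail_sat(clauses, n_vars, solvers):
        """Run the candidate solvers in an interleaved fashion and return the first
        verified answer -- the shape of Levin-style universal search, with the
        enumeration of all programs replaced here by a fixed list of generators."""
        streams = [s(clauses, n_vars) for s in solvers]
        for _ in count():
            alive = False
            for s in streams:
                guess = next(s, None)
                if guess is None:
                    continue
                alive = True
                if satisfies(clauses, guess):
                    return guess
            if not alive:
                return None        # every solver ran out of guesses

    def all_true(clauses, n_vars):
        yield {i + 1: True for i in range(n_vars)}

    def brute_force(clauses, n_vars):
        for bits in product([False, True], repeat=n_vars):
            yield {i + 1: b for i, b in enumerate(bits)}

    # (x1 or not x2) and (x2 or x3)
    print(dovetail_sat([[1, -2], [2, 3]], 3, [all_true, brute_force]))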

Open Problems

Are there realistic limits on computation of the type that Bremermann postulated? What are the right limits in light of today’s insights into computation?

20 Comments
  1. May 1, 2015 6:17 am

    This fine Gödel’s Lost Letter essay encompasses themes of complexity-theoretic mystery and madness that are inherent in two great intellectual pursuits: chess and statistical mechanics. To appreciate this, we reflect as follows.

    Modern computer-chess and statistical mechanics alike are grounded in large databases of micro-theorems (e.g., in chess position X, move Y is legal; in system state X, increment Y is Hamiltonian). Each micro-theorem is in itself entirely rigorous, and even a cell-phone nowadays can crank out billions of them.

    For scientists, chess-players, and entrepreneurs alike, the chief opportunity to create value is to distill large numbers of micro-theorems into a small number of high-reliability macro-postulates (e.g., white to play and win, drug-design Z will bind strongly, macroscopic entropy never decreases, scalable quantum computing is/isn’t feasible).

    In recent decades humanity’s empirical capabilities in extracting high-reliability macro-postulates have been improving exponentially; this ongoing improvement is Moore’s Law broadly conceived, and it is among the dominant features of the 21st century STEAM enterprise.

    Rigor has been harder to achieve than empirical success, and the obstructions to rigor are maddening. As David Goodstein famously put it:

    “Ludwig Boltzmann, who spent much of his life studying statistical mechanics, died in 1906, by his own hand. Paul Ehrenfest, carrying on the work, died similarly in 1933. Now it is our turn to study statistical mechanics. Perhaps it will be wise to approach the subject cautiously.”

    Conclusion  If the 20th century study of chess-play and statistical mechanics has proved to be maddening yet hugely productive — especially the vexing transition from micro-theorems to macro-postulates — then how much more productively maddening will be the 21st century study of computational complexity and quantum computing/simulation?

  2. Alexandre de Castro permalink
    May 1, 2015 6:35 am

    “One wonders what might have happened if Bremermann’s paper had been better known in our theory community”

    The theory community would have realized that the P=NP identity is the Maxwell’s demon paradox.

    • Serge permalink
      May 4, 2015 7:07 pm

      Surely, Maxwell’s demon was clever enough to turn any thermal disorder into a quantum computer… 🙂

      • Alexandre de Castro permalink
        May 5, 2015 1:39 pm

        exactly Serge, Maxwell’s demon is a quantum computer.
        Maxwell’s demon is an involution of controlled-NOT gates.

      • Serge permalink
        May 5, 2015 4:15 pm

        On the quantum scale, there’s no outstanding time variable (according to loop quantum gravity). A time variable can only emerge as an average on thermal disorder. As an argument, recall that the only known irreversible phenomena must involve heat in one way or another. So, on our macroscopic scale, complexity is nothing but a struggle against the flow of time. But Maxwell’s demon lives on the microscopic scale, which allows it to run a quantum process (contrary to us poor humans).

    • Serge permalink
      June 1, 2015 5:11 pm

      Thank you very much Alexandre for giving me this idea. I really think it settles the P=NP puzzle!

      Mathematics ignores time and P=NP does hold in this timeless science. Nothing indeed in the Peano axioms stipulates that time is irreversible. But, in a world like ours with its irreversible processes, everything behaves as though P!=NP. Unfortunately, the second law of thermodynamics prevents us from proving it, and also prevents mathematicians from proving P=NP. Finally, the existence of one-way functions seems to be the one missing axiom in Peano arithmetic that’s required for doing computer science.

      • Alexandre de Castro permalink
        June 7, 2015 11:13 am

        Thanks, Serge.
        The NOT gate (bijection) is the core of irreversibility!

  3. May 3, 2015 5:48 am

    Perhaps a world without true randomness is possible. Howard Wiseman wrote:

    “Parallel universes – worlds where the dinosaur-killing asteroid never hit, or where Australia was colonised by the Portuguese – are a staple of science fiction. But are they real?
    In a radical paper published this week in Physical Review X, we (Dr Michael Hall and I from Griffith University and Dr Dirk-André Deckert from the University of California) propose not only that parallel universes are real, but that they are not quite parallel – they can “collide”.
    In our theory, the interaction between nearby worlds is the source of all of the bizarre features of quantum mechanics that are revealed by experiment.”

    http://app.griffith.edu.au/sciencesimpact/when-parallel-worlds-collide/

  4. May 4, 2015 10:08 pm

    Thanks for pointing me to Bremermann’s limit — I hadn’t heard this concept before!

    There’s a somewhat similar idea in Paul Davies’ book, “The Last Three Minutes,” about what we know and conjecture about the “end” of the universe. One of the topics discussed at length is whether there is a universal limit on the total amount of computation, since computation fundamentally must take up some energy.

  5. Serge permalink
    May 5, 2015 8:43 am

    Are there realistic limits on computation of the type that Bremermann postulated? What are the right limits in light of today’s insights into computation?

    The limits on computation, if any, seem to be the same as the limits on proving that there are limits on computation. I think this vicious circle can only be broken by some physical hardness principle. Indeed, the class NPC is a sort of computational black hole, with NP as its boundary. Now let’s postulate that the reduction of a black hole to its boundary is always computationally inefficient; P≠NP might thus be a mere relativistic effect, a special case of the so-called holographic principle. Examples of that principle in physics are:

    a black hole vs. its boundary
    quantum gravity vs. quantum field theory (the AdS/CFT correspondence from string theory)

    I suggest extending it to computer science:

    non-deterministic polynomial time vs. deterministic polynomial time (NP/P)
    integer programming vs. linear programming (IP/LP)
    quantum computing vs. classical computing

    Computational complexity would then turn into a chapter of general relativity. Moreover, I believe that computability theory is already, to some extent, a chapter of special relativity. Indeed, should extraterrestrials who live outside our light cone develop an exotic kind of mathematics, their theorems might contradict ours without any problem… provided they were self-consistent and that any potential contradiction between their results and ours could never get computed – thanks to the speed of light!

    So, if computability is a consequence of special relativity, why not consider complexity to be a consequence of general relativity?

    • Alexandre de Castro permalink
      May 5, 2015 1:58 pm

      Serge, our very mathematics is contradictory!!!
      See: https://unitambo.wordpress.com/2015/04/01/one-way-bijection-over-gf2/

      Consider GF(2) as the smallest finite field (binary system), a two-element Galois field, with Z2 = {0,1}. Let P1 = x²+1 and P2 = x. The following are equivalent representations of the same values in a characteristic-2 finite field:

      Polynomial (P1): x²+1, binary: {101}
      Polynomial (P2): x, binary: {010}

      Remembering that a sum in GF(2) corresponds to the bitwise XOR operation, P1 ⊕ P2 = x²+x+1. See:

      x²+1 ⊕ x = x²+x+1

      {1 0 1} XOR {0 1 0} = {1 1 1}

      So, since the XOR operation is used for making the sum P1⊕P2, now check the truth table of the XOR gate. Remember that the polynomial x²+x+1 is irreducible over the finite field GF(2), because x²+x+1 has no root (try plugging in 0 or 1 to see). In other words, x²+x+1 is a polynomial that may not be factored into the product of two non-constant polynomials:

      x²+1 | x | x²+x+1
      Irreducible?  T | T | F
      Irreducible?  T | F | T
      Irreducible?  F | T | T
      Irreducible?  F | F | F

      Now, note that x²+1 (= x+1) corresponds to the NOT gate in GF(2), thus being a bijection (reversible). Note also that x cannot be irreducible; then, for x²+x+1 to be irreducible in the truth table of its XOR gate, x²+1 must be irreducible (one-way).

      Therefore, the NOT gate is both reversible and irreducible in GF(2) at once. This true contradiction is the core of Gödel’s P vs.NP problem!!!

      NOT gate is a one-way function in binary system???

  6. May 5, 2015 9:29 am

    GLL quotes Hans-Joachim Bremermann’s 1962 Question  Could we live in a world where the act of creating an instance that requires processing more than 10^{93} bits of information requires processing more than 10^{93} bits of information?

    In the light of 53 subsequent years of research in complexity theory and quantum computing, and with the experience gained in 53 years of Moore’s law progress in computing machinery and simulation algorithms, it is natural to amend Bremermann’s question to read:

    Bremermann’s Question (amended for 2015)  Could we live in a world where the act of simulating a physical system whose specification requires n bits of information never requires more than polynomial-in-n logical gate operations?

    The just-announced transformationally large and ambitious collaboration of the Swiss-based pharmaceutical corporation Sanofi with the Portland-based quantum simulation corporation Schrödinger ($120M over 5 years) amounts to a vote of confidence that the answer is “yes” … as does this week’s xkcd cartoon titled “Degree-Off” (#1520).

    One of the great lessons of the GLL-sponsored Kalai/Harrow debate “Is quantum computing feasible” (as it seems to me) is that the answer to Bremermann’s Question (as amended) cannot be yes in any universe in which scalable quantum computing and/or scalable BosonSampling is feasible.

    From this perspective, the immense scale and vast ambition of the Sanofi/Schrödinger program amounts to a vote-of-confidence that:

    The Sanofi/Schrödinger Postulate  (1) If we live in a world that is dynamically governed by quantum electrodynamics (but not necessarily if our world is governed by some other quantum Hamiltonian) then the answer to Bremermann’s Question is “yes”, and (2) Hot wet noisy biological creatures — meaning, all of us humans — *DO* live in a world that is dynamically governed by quantum electrodynamics.

    Conclusion  Enterprise-investors are placing increasingly large bets that the answer to Bremermann’s Question is “yes”.

    Question  Is the quantum dynamics of the world — both its small-scale low-energy QED limit and its large-scale high-energy (string theory?) limit — determined by the requirement that the answer to Bremermann’s Question (amended) is “yes”?

    Don’t ask me (or anyone) for the answer to this tough question … but we can all be grateful to Gödel’s Lost Letter for inspiring us to ask ourselves these foundational questions, and to read with sympathy the marvelously hopeful literature that tries to answer them.

  7. May 5, 2015 1:56 pm

    Further Readings  Several of the themes raised in recent essays here on Gödel’s Lost Letter have also been raised in recent essays on Shtetl Optimized (“Is There Something Mysterious About Math?”); these themes include large-F (Fundamental) foundations that guarantee mathematical rigor, small-f (practical) foundations for doing mathematics, and even Gil Kalai’s recent interest in theological foundations (per Gil’s SO comment #87).

    It may interest GLL/SO readers that these same themes are being vigorously debated, at a high mathematical level, in recent essays and comments on (Fields Medalist) Michael Harris’ weblog Mathematics Without Apologies.

    Particularly commended (by me anyway) is Harris’ essay Pantheism and homotopy theory, Part 2 (April 30, 2015), which begins

    “Every mathematician should probably read [Fields Medalist] Vladimir Voevodsky’s article [“The origins and motivations of univalent foundations”] in the Summer 2014 IAS Letter.”

    Please let me add that (as it seems to me) it’s no bad idea for scientists, engineers, and historians in general (and GLL/SO readers in particular) to read Voevodsky’s article too. `Cuz when Fields Medalists trade candid views regarding large-F and small-f STEAM foundations, then to borrow Feynman’s phrase, it’s terrific!

    These Harris/Voevodsky essays will make clear to Shtetl Optimized readers that the ongoing debate among mathematicians in regard to univalent foundations is as heated and enlightening as the Harrow/Kalai debate regarding the feasibility of quantum computation.

    Both debates are thrilling and even share common F/foundations … regarding which Michael Harris’ weblog Mathematics Without Apologies is an outstanding resource for background information.

    As further student-friendly/rabble-rousing mathematical reading, it’s tough to better Michael Harris’ own textbook, titled (like his weblog) Mathematics Without Apologies (2015). A delightful historical extension — and worthy counterpoint to both Jonathan Israel’s ongoing multivolume history of the Enlightenment and Gil Kalai’s comment #87 on SO — is Leszek Kolakowski’s “Dutch seventeenth-century non-denominationalism and Religio Rationalis: Mennonites, Collegiants and the Spinoza connection” (2004).

    And finally, the title alone of Leszek Kolakowski’s 2005 essay collection, My Correct Views on Everything, will bring a smile to regular readers of Gödel’s Lost Letter and Shtetl Optimized, and more broadly, to all fans of enlightened friendly discourse.

    Summary and Conclusion  The Enlightenment that was born in the seventeenth century’s friendly ferment of Spinozists, Collegiants, and Anabaptists is today being reborn and renewed — in full radical vigor and public view — as a friendly ferment of quantum information theorists (as Spinozists), category/type/univalent theorists (as Collegiants), and medical researchers (as Anabaptists).

    As eventuated in the 17th century, our 21st century’s friendly ferment may turn out to be wonderfully nourishing (as it seems to me) of the entire STEAM community … full of wonderful opportunities for young researchers in particular … especially because (just as in the 17th century) opinions in this regard are so marvelously various and vigorous.

  8. Someone permalink
    May 8, 2015 10:17 am

    Is it possible to get at last an answer: what or who is “Pip”? This is unclear and annoying.

    • May 8, 2015 1:11 pm

      Please see the “About Us” tab at top. WordPress doesn’t (didn’t?) allow multiple author names.

  9. May 9, 2015 2:00 pm

    ❗ thx for playing a pioneering role in recognizing the significance of what might be called “source code complexity” in the study of TCS. alas its kind of controversial & a fringe area still. intend to blog on this at some pt. this recent cs question on complexity shows some of the controversy on the downvoted answer that mentions source code complexity & there was some ensuing discussion in the cs chat room also on the topic. it was handy to have some of your blogs on the subj to cite.

  10. Serge permalink
    June 8, 2015 8:05 am

    Dick, in 1979 you wrote about the social processes which output commonly acceptable proofs and programs. I contend that this notion of social process can be formalized. A social network is a network of neural networks – mathematical language being its communication protocol – and a social process evolves by rules analogous to those of Conway’s Game of Life, only more complex. Thanks to Church’s thesis, social processes are Turing-complete – that is, everything which can be done by a social process could also be done, in principle, by a Turing machine.

    Now, you might wish to apply all the modern results of complexity theory to this natural super-computing model. It would be nice indeed to prove that Levin’s universal one-way function has probability zero of getting inverted by a social process that’s executed within a closed system such as the Earth’s light-cone. I wonder if it’s a reasonable consequence of the second law of thermodynamics…

