Happy Thanksgiving

Apple and Cider are the names of the two turkeys pardoned this year by President Barack Obama in a ceremony at the White House. This continues a tradition started years ago. Afterward, they will go to George Washington’s Mount Vernon Estate and Gardens to live out their lives.

Thus Apple and Cider will avoid an even older tradition in America—serving turkey as dinner on tomorrow’s Thanksgiving holiday. The tradition is said to come from observances and feasts thrown by Pilgrim settlers in Massachusetts in 1621, but it may go back even further in Canada, which celebrates its Thanksgiving in October. Yet Americans have an ambivalent attitude toward “turkey”, because the word also means a failure or a flop. Or even worse.

Today I want to talk about possibly the worst posts I have made in the last two years, ones whose ideas may be “turkeys”.

In the spirit of holiday fun, I thought I would nominate three of them myself. You are welcome to add your own nominations, though I hope you do not have too many others. But we will see.

Do understand, I believe that it is important to try out and discuss ideas, and the fact that some become turkeys is a necessary price for a greater good. But it is still important to identify the turkeys, even if they get pardoned. And if discussion decides the posts weren’t too bad and finds something new to say, then it might play into another Thanksgiving tradition: leftovers.

The Worst?

I have posted 228 times since starting this blog almost two years ago. Here are my three suggestions for my personal “turkey” awards.

${\bullet}$ The World is Digital. This generated a lot of comments and some applause, but as a turkey it may have been shot down by my own student (Subruk), though I posted anyway. I still think I made a good point; let me restate it as ${\dots}$ Perhaps another time.

${\bullet}$ Complexity Classes Meet Particle Physics. I argued that particles were like complexity classes. Oh well.

${\bullet}$ The Iceberg Effect in Theory Research. I discussed the notion that often it is hard to tell if a result is new. Perhaps this was not a good idea, or even a new idea.

Open Problems

Have a happy Thanksgiving. Yes I know that in Canada it was October 11, and in many countries you do not celebrate Thanksgiving like we do. Lots of food, lots of parades, lots of football on the tube, and on Friday lots of shopping. In any event have a wonderful next few days.

Of course the main question is: which were my worst posts? It is unfair to vote for this post, but I guess that it is possible to do that.

1. November 24, 2010 7:14 pm

In the opposite spirit, I’d like to nominate a high point: your introductory Theory lectures around 1977 at Yale. In those days there were no Theory textbooks. The selection of topics, the order of presentation, the depth to which to cover each — everything teachers take for granted had to be created from scratch. As I’ve grown older, I’ve realized more and more how much of an obstacle that would have been.

I remember not missing the textbook, which I think shows the skill with which you put the course together.

2. November 24, 2010 7:23 pm

This being the season of Thanksgiving in the USA, I prefer to say why I am thankful for Dick’s weblog (and many other CS/CT/QIT weblogs).

In response to the challenges and opportunities associated to quantum mechanics, it was Niels Bohr who in 1948 articulated the famous principle: “All well-defined experimental evidence, even if it cannot be analyzed in terms of classical physics, must be expressed in terms of ordinary language making use of common logic.”

Now in the 21st century we are faced with challenges and opportunities associated to computation and simulation; challenges that are similarly global in scope and consequence to those of quantum mechanics. My (wonderfully enjoyable) experience with Dick’s weblog has been that it embodies a 21st century version of Bohr’s maxim, which is effectively along the lines of: “All well-defined computational processes, even if they cannot be analyzed without recourse to oracles, must be expressed in terms of ordinary language making use of common logic.”

To cite just one example, Dick’s account of Stockmeyer’s Approximate Counting Method—which Gil Kalai’s weblog has recently highlighted—is expressed in Dick’s characteristically lucid style of “ordinary language making use of common logic” … and the Stockmeyer post in particular has been immensely helpful to me (and I think many people) in coming to grips with the increasing torrent of results on simulating quantum dynamics with classical resources.

I think pretty much anyone who follows this weblog can name a dozen or more posts that explain tough concepts via “ordinary language making use of common logic” … for which Dick’s efforts (and Ken Regan’s efforts too) deserve appreciation and thanks from everyone.

3. November 24, 2010 7:35 pm

I too am thankful for this blog. I get the feeling that many people who do not formally consider themselves part of the “theory crowd” enjoy following this blog and get a lot out of it.

I personally think your treatment of the Deolalikar situation was a highlight and a model to be emulated.

Happy Thanksgiving!

4. November 25, 2010 12:53 am

Ooh, physics. I vaguely remember something about that from 1966 (the last physics course I took, fourth year physics at the University of Sydney).

I have no turkeys, only two questions.

1. Is the space that quantum mechanics currently models as Hilbert space really flat (in the sense of Gauss) the way Hilbert space is?

2. If not, would that compromise quantum computing in any way? For example, is there some curvature, positive or negative, at which the current best error correction methods fail?

• November 25, 2010 10:58 pm

@Vaughan

I can think of one caveat that separates Hilbert spaces from the sorts of things we see in experimental physics (this is from Hans Ohanian’s undergraduate text – not something I noticed myself).

|electron> and |proton> are both valid vectors in H. As required by the rules of Hilbert spaces, a|electron> + b|proton> must also be a valid vector, but alas we would think this repugnant (and there is no experimental evidence to suggest that it should be so).

Based on this simple example I have to conclude that the structural realists have it right – obviously “stuff happens.” Whether that “stuff happening” _is_ a Hilbert space is not interesting. What is interesting is that it can be modeled very well by a Hilbert space.

• November 25, 2010 10:59 pm

Of course, I don’t think this answers your question. An interesting (I hope) aside.

November 26, 2010 1:56 am

I am not a physicist. My perspective is that states like a|electron> + b|proton> can in principle exist, but won’t persist because of superselection rules. In combinatorial models, this is equivalent to charge decoherence. There is more to be said; until we understand the ultimate origin of particles, this is only intuition.
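(For readers who haven’t seen it, the superselection point can be made precise by a standard textbook argument — my sketch, not part of the comment above. Suppose |e> and |p> are eigenstates of the charge operator Q with distinct eigenvalues, and that every physical observable A commutes with Q. Then:

```latex
0 \;=\; \langle e\,|\,[A,Q]\,|\,p\rangle
  \;=\; (q_p - q_e)\,\langle e\,|\,A\,|\,p\rangle
\quad\Longrightarrow\quad
\langle e\,|\,A\,|\,p\rangle \;=\; 0 ,
```

so for $|\psi\rangle = a|e\rangle + b|p\rangle$ every expectation value collapses to $\langle\psi|A|\psi\rangle = |a|^2\langle e|A|e\rangle + |b|^2\langle p|A|p\rangle$. The superposition is a perfectly valid vector in H, but its relative phase is unobservable, so it is experimentally indistinguishable from a classical mixture — which reconciles Ohanian’s “repugnant” vector with the linearity of Hilbert space.)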

• November 27, 2010 9:38 am

Ross, there are well-studied physical systems in which quantum superpositions of particles and anti-particles are experimentally observed: the neutral kaons and B-mesons. The Wikipedia topic CP violation in neutral meson oscillations provides a concise introduction to the fascinating physics of these superpositions. The numerical values of many of the parameters in the Standard Model of particle theory are fixed solely or largely by the data from these neutral meson experiments.

The extent (if any) to which symmetry-breaking effects in neutral meson physics can be associated to geometric features of some underlying non-Hilbert state-space (perhaps M-theory?) is not presently known. More broadly, we appreciate that it is very challenging to construct candidate M-class theories that respect all that we know—or think we know!—about dynamics, causality, locality, relativity, information theory, thermodynamics, and symmetry-breaking.

When the dust settles—hopefully sometime in the 21st century—will the quantum state-space of Nature still be understood to be a static geometrically flat Hilbert space? Alternatively, will classical and quantum physics both be appreciated as flows on state-spaces that are non-flat and geometrically dynamic? The long history of classical and quantum dynamics suggests that a humble attitude is prudent, namely: “We don’t know, but fortunately, when we seek answers, we discover an abundance of opportunities in fundamental math, physical science, and practical engineering.” Good!

• November 28, 2010 9:25 pm

John, this is really cool. I wish I understood it all.

The Ohanian text seems really good, and it certainly put me through the gauntlet this past quarter. I’ve been considering picking up Sakurai or Griffiths, which I have heard mentioned a lot. In Ohanian’s defense the example used was specifically |electron> and |proton> which are not antiparticles, nor kaons or mesons. Has a superposition of fermions/hadrons and leptons been observed? In any case there still seems to be good reason to probe, scrutinize and study “archetypal” flat Hilbert spaces.

Makes me wish I could devote more time studying the algebraic metaphysics behind quantum kinematics and dynamics. TCS, I think, will only demand I get part of the way there. Luckily I’m young, and might be here when the dust settles.

• November 29, 2010 8:33 am

Ross, the textbooks you mention (Sakurai, Griffiths) are very good, and there is considerable virtue too in “old-school” quantum dynamics textbooks like Gottfried and/or Messiah, and in applied textbooks like Slichter’s Principles of Magnetic Resonance. It doesn’t hurt to have a working familiarity with the table of contents of all of these classic texts.

With regard to basic mathematical tools, one exercise that I recommend to students (it’s a tough exercise though) is first to check whether one can read Chapters 2 and 8 of Nielsen and Chuang’s Quantum Computation and Quantum Information with reasonable comprehension; if so, then you’ve got a solid grasp of the algebraic aspects of dynamical flow and information theory. Good!

Then check whether one can read with reasonable comprehension (say) the concluding two chapters (8 and 9) of Woodhouse’s (new-this-year) 2010 edition of Introduction to Analytical Dynamics. If so, then you’ve got a solid grasp of the geometric aspects of metric and symplectic dynamical flow. Good!

Now systematically transpose Nielsen and Chuang’s informatic and algebraic descriptions of dynamical flow into the vocabulary of Woodhouse’s geometric descriptions of dynamical flow, and vice versa. As you become comfortable applying a blend of informatic, algebraic and geometric mathematical tools to solve practical dynamical problems, you’ll be acquiring a solid grasp of modern dynamics. Good! But tough! A well-written, friendly-to-students next step is John Lee’s Introduction to Smooth Manifolds.

Both the Woodhouse text and the Nielsen and Chuang text commonly are assigned at the undergraduate and/or first-year graduate level. At first sight these texts look mighty different, but upon closer inspection we appreciate that they are highlighting the informatic, algebraic and geometric aspects of the same underlying subject … as I mentioned earlier, the articles in the 1990s of (physicist) Abhay Ashtekar and his student Troy Schilling were among the earliest to explicitly point out this natural unity.

I have long had a feeling that somewhere out there, there must be texts on graph theory and/or complexity theory that dovetail naturally with this emerging 21st century synthesis/unification of our 20th century dynamical understanding … and I would be very grateful for suggestions in this regard.

• November 26, 2010 1:16 pm

Q1. Is the space that quantum mechanics currently models as Hilbert space really flat (in the sense of Gauss) the way Hilbert space is?

Vaughan, the safest answer is arguably this: “We don’t know, because the experimental and mathematical evidence either way is far from compelling.”

There is reasonably strong experimental evidence that whatever the state-space of Nature may be, the dynamical flow on that state-space must induce a symplectomorphism, because this fundamental principle of Hamiltonian dynamics ensures that thermodynamics “just works” and (importantly) it also helps ensure that Nature’s computational capabilities are not extravagant.
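(John’s point that Hamiltonian flows induce symplectomorphisms can be illustrated with a toy numerical example — my own sketch, not from the comment. For the harmonic oscillator, one step of a symplectic integrator is an exactly area-preserving linear map on phase space, while a naive Euler step is not:

```python
import numpy as np

dt = 0.1  # time step

# Harmonic oscillator, H = (p^2 + q^2)/2, phase-space coordinates (q, p).
# One integration step, written as a linear map acting on (q, p).

# Forward Euler:    q' = q + dt*p,  p' = p - dt*q
euler = np.array([[1.0,  dt],
                  [-dt, 1.0]])

# Symplectic Euler: p' = p - dt*q, then q' = q + dt*p'
symplectic = np.array([[1.0 - dt**2, dt],
                       [-dt,         1.0]])

# A linear map on 2D phase space is a symplectomorphism exactly when it
# preserves area, i.e. its determinant equals 1.
print("Euler det:     ", np.linalg.det(euler))       # 1 + dt^2: area grows each step
print("Symplectic det:", np.linalg.det(symplectic))  # 1: area exactly preserved
```

The same area-preservation property, suitably generalized, is what makes Liouville’s theorem — and hence equilibrium thermodynamics — “just work.”)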

There is strong experimental evidence too for a second broad principle: that Nature’s noise mechanisms are Lindbladian; this principle ensures that locality, causality, and collapse all “just work” both in the common sense and in the sense of John Bell’s inequalities.
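(For readers meeting the term for the first time: “Lindbladian” refers to the standard Gorini–Kossakowski–Sudarshan–Lindblad form of a Markovian master equation for the density matrix $\rho$ — this addendum is mine, not the commenter’s:

```latex
\frac{d\rho}{dt} \;=\; -\,\frac{i}{\hbar}\,[H,\rho]
\;+\; \sum_k \Big( L_k\,\rho\,L_k^\dagger
\;-\; \tfrac{1}{2}\,\big\{ L_k^\dagger L_k ,\; \rho \big\} \Big)
```

Here the $L_k$ are the noise (“jump”) operators; this is the most general generator of a completely positive, trace-preserving Markovian evolution, which is why it guarantees that measurement, collapse, and locality “just work” without superluminal signaling.)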

But these “it has to just work” requirements do not necessitate that the geometry of quantum state-space be Hilbert; pretty much any Kählerian state-space that supports these metric/symplectic structures “just works” in terms of classical and quantum physics.

A good way to see this is to transpose Herbert Goldstein’s 1951 classic of dynamical analysis, The classical motion of a rigid charged body in a magnetic field, into the modern language of symplectomorphic flow on a Kählerian state-space (a.k.a. the Riemann/Bloch sphere of Landau-Lifshitz-Gilbert dynamics). Goldstein’s analysis was commissioned by Norman Ramsey to help in understanding the astounding (in the 1950s) linearity and spectral purity of atomic beams and atomic clock experiments. These experiments are sometimes cited as evidence for the flatness of quantum state-space, but (as Goldstein’s 1951 analysis established) the observed linearity and spectral purity of atomic clocks are wholly consistent with strongly curved state-spaces too.

Nowadays pretty much all large-scale quantum computations are carried through by computing (what amounts to) metric and/or symplectic integral curves on tensor network manifolds, which are ruled manifolds whose sectional curvatures (from a Riemannian point-of-view) are generically non-positive (this is implied by a 1967 theorem of Goldberg and Kobayashi). It can be shown that both the negative sectional curvature attribute (which is essentially geometric) and the ruled attribute (which is essentially algebraic) are crucial to the accuracy and efficiency of modern quantum simulation codes.

As for whether Nature’s state-space might have (for example) a tensor network geometry versus a Hilbert geometry … well … no one knows.

Q2. If not, would that compromise quantum computing in any way? For example, is there some curvature, positive or negative, at which the current best error correction methods fail?

If one construes “quantum computing” broadly, to encompass (for example) everyday computing with semiconductor junctions, then non-Hilbert quantum dynamics would *NOT* compromise quantum computing. If one construes “quantum computing” narrowly, to encompass (say) factoring by Shor’s algorithm, then much depends upon the dimensionality and geometry of the state-space … about which at present we know very little.

To my knowledge, the only class of experiments ever specifically designed to uniformly sample a large-dimension quantum state-space is the linear optics experiments of Aaronson and Arkhipov; that is why (IMHO) their proposed class of experiments is novel and (potentially) seminally important.

Although many books have been written upon classical and quantum dynamics, no single book has yet been written that presents the above ideas in what is (in retrospect) their most logical order. The textbook would develop first the general idea that dynamical state-spaces are manifolds endowed with metric and symplectic structures, then develop the tools necessary to pullback functions and pushforward trajectories, and finally stipulate that symplectic flows are associated to Hamiltonian (smooth) potentials, and metric flows are associated to Lindbladian (stochastic) potentials.

From these fundamental principles, the book would demonstrate that thermodynamics, locality, causality, and collapse all “just work,” and that Nature’s computational capabilities are not extravagant (equivalently, the Extended Church-Turing Thesis (ECT) is consistent with experimental evidence). Hilbert state-spaces would be introduced only in the concluding chapter, as a large-dimension approximation associated to certain spectral theorems; theorems that are computationally convenient even as approximations, yet are by no means essential to practical simulations of quantum dynamics.

As for the distinction between classical and quantum dynamics, this imagined textbook would draw no such distinction, in accord with today’s increasingly widespread recognition that (as Troy Schilling put it in his 1996 PhD thesis) “the mathematical differences between classical and quantum mechanics are not so dramatic as they initially appear.”

Overall, it seems to me that Abhay Ashtekar and Troy Schilling got the key ideas right in their 1999 article Geometrical formulation of quantum mechanics. Moreover, if we’re ever going to achieve a deep understanding of why today’s quantum simulations on tensor network manifolds work so astoundingly well … and if we’re ever going to experimentally observe the geometry of quantum dynamical state-spaces … then it seems likely to me that this progress will be achieved (in part) via extensions of these geometric ideas, in both classical and quantum dynamics.

• November 28, 2010 2:11 am

John Sidles

There is reasonably strong experimental evidence that whatever the state-space of Nature may be, …

There is indeed a state-space of our current model of Nature, but there is no such thing as the “state-space of Nature”.
If our model(s) were to change, all this would go the way of phlogiston, the four humors, etc.
(Prediction is difficult, especially of the future, eh?)

• November 28, 2010 10:13 am

If our model(s) were to change, all this would go the way of phlogiston, the four humors, etc.

Kevembuangga, IMHO we should all have the greatest respect for phlogiston theory and the four humors.

After all, phlogiston theory evolved to become the First Law of Thermodynamics … the four humors evolved to become the theory of phase transitions in condensed matter … Euclid’s axioms evolved into the tangent and cotangent spaces of differential geometry … and many more such examples could be cited (the miasma theory of disease! heliocentric astronomy! Newton’s dynamics! Galen’s theories of blood circulation!)

A particularly exciting aspect of the 21st century is our shared, slow-dawning perception of deeper mathematical and physical structures that seem mysteriously to underlie … and perhaps eventually will unify … the 20th century’s best theories (quantum dynamics, general relativity, and complexity theory, to name three).

It seems to me that the only folks who are likely to be disappointed by the 21st century are the folks who expect that the existing foundations of our knowledge will remain unaltered, without amendment or extension.

5. November 25, 2010 1:45 am

Prof. Lipton, your blog as a whole has served as an inspirational crystal ball for so many budding minds in theory. While I certainly enjoyed some posts more than others – and it is obvious that some essays were written with more enthusiasm and more authority than others – I do feel that I shouldn’t try to isolate the ‘bad’ ones in particular. Happy Thanksgiving.

6. November 25, 2010 11:30 am

How often are CS students asked to brainstorm far-out ideas as part of their undergraduate education?

Or maybe that’s not a good thing as it would result in lower job satisfaction later when they are working as software engineers?

7. November 29, 2010 10:42 pm

Hey, I liked your particles as complexity classes post! Not because I agreed with it (!) but because looking for analogies across fields is fun and a good way for each side to learn about the other. For example a particle theorist with no knowledge of P, NP, PP, and friends might stumble upon your blog and then realize he or she has much to learn about these odd one, two, three, and more lettered sets. Similarly a complexity theorist might start thinking that he or she doesn’t know enough physics and start learning all about Lie algebras and the eightfold way. Often this cross-disciplinary learning leads nowhere, but sometimes, as Peter Shor can probably tell you, it leads to really awesome results 🙂

• November 30, 2010 7:30 am

I definitely agree with Dave that the Complexity Classes Meet Particle Physics post was very enjoyable.

I am old enough to remember when particles like protons were thought to be fundamental. But in the end, that wasn’t a productive way of thinking … nowadays we regard them as condensates of (fermionic) quark fields and (bosonic) gauge fields.

Similarly, complexity theorists nowadays regard classes like P and NP as fundamental. But perhaps that too is not a productive way of thinking … after all, membership in P and NP can only be decided by an appeal to oracles.

Perhaps the “quarks” of (N)P are languages that provably are in (N)P, and the “gluons” of (N)P are languages for which no such proof exists, but which nonetheless are “oracularly” in (N)P.

The laws of physics prevent the observation of quarks and gluons as isolated particles … but it took many decades for this non-intuitive phenomenon to be recognized and accepted as physics orthodoxy. I have often wondered whether the laws of mathematics similarly prohibit the concrete exhibition of “(N)P-quarks” and “(N)P-gluons”.

Here the point is that even though “(N)P-quarks” and “(N)P-gluons” may perhaps be inseparable, they might nonetheless possess very different informatic properties that have to be analyzed separately if we are ever to understand the aggregate properties of the standard complexity classes.

8. December 27, 2010 8:56 am

Of course, the biggest turkey some of us saw was the “How To Stop Wikileaks” entry, which was lurking but a few days into the future when this post was written.

Gentle ribbing, I hope.