An Old Galactic Result

July 25, 2014

A cautionary tale


Karl Sundman was a Finnish mathematician who solved a major open problem in 1906. His solution would have been regarded as paradigm-shifting had it come a century or so earlier. It employed power series to represent the solutions of equations, namely the equations of the 3-body problem famously left unsolved by Isaac Newton. The tools of analysis needed to regard a convergent power series as defining a valid real number had been well accepted for a century, and the explicit giving of the series together with a convergence proof even met the demands of the mathematical constructivism of his day.

Today Ken and I want to explain why that problem is nevertheless still considered open, even though Sundman solved it over a hundred years ago.

This is a cautionary tale: For some problems there are solutions and there are solutions. There are solutions that make you famous and there are solutions that no one cares about. Unfortunately for Sundman his solution, which is correct mathematically, is the latter type; hence, his lack of fame—did you know about him? He did win honors from the top French and Swedish academies of science for this and other work in mechanics, and he has a Moon crater and an asteroid named for him.

His solution was generalized to any finite number {N} of bodies in 1991 by Qiudong Wang of the University of Arizona, albeit with more restrictions on collisions and singularities than Sundman needed. All of this including Wang’s “beautiful paper” is surveyed in a great 1996 article for the Mathematical Intelligencer by Florin Diacu. We try to add a little more to what Diacu says from the perspective of our paradigms in computational complexity.

Complexity and Uselessness

There is another aspect to this tale: Sometimes impossibility theorems are not ironclad. Sundman solved a problem that had previously been shown to be “impossible.” This is another lesson for us—beware claims that something is impossible. It may not be.

Perhaps his solution failed to win him fame for two reasons: it went against what was thought to be impossible, and it was useless. That is, in modern terms, the solution did not yield a practical algorithm.

Ken and I like this tale, even though it is about an old result, a non-theory problem, because it sheds light onto problems that are dear to us in complexity theory. We have talked many times before about galactic algorithms—recall these are algorithms that are useless in practice. We have also talked before about impossibility results—we believe that some of these may not be ironclad. So what happened to Sundman over one hundred years ago is relevant to complexity theory today.

One more word about the following tale: Ken and I are not even amateurs in Sundman’s area of research so be careful to check what we say—another caution.

The Three Body Problem

The problem Sundman solved concerns the motion of three bodies under the sole force of gravity. In more detail, the problem is taking an initial set of data that specifies the positions, masses, and velocities of three bodies at some particular point in time, and then determining the motions of the three bodies at a given future time. All of this must be in accord with the laws of Newtonian mechanics; relativity need not enter. It is an important problem: think sun, earth, and moon. The one-body problem is easy, while much of the two-body problem was solved by Johannes Kepler before Newton.

Simulating a system with {N \geq 3} bodies is not the issue. Subject to limitations of numerical precision and computing power, it is straightforward to apply the attractions and momenta of the bodies at each instant of time to calculate their positions at the next. Animations of particular {N}-body cases are readily available, see here for instance. The issue is representing the positions after {t} steps by a formula of {t}, namely a “closed-form solution.”
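To make the "straightforward" part concrete, here is a minimal sketch of step-by-step simulation of the kind those animations use: a leapfrog (kick-drift-kick) integrator in hypothetical units with {G = 1}. The function names and the state layout are our own illustrative choices, not anyone's standard API.

```python
def accelerations(pos, masses):
    """Pairwise Newtonian accelerations: a_i = sum over j of m_j * r_ij / |r_ij|^3."""
    n = len(masses)
    acc = [[0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            dx = pos[j][0] - pos[i][0]
            dy = pos[j][1] - pos[i][1]
            d3 = (dx * dx + dy * dy) ** 1.5
            acc[i][0] += masses[j] * dx / d3
            acc[i][1] += masses[j] * dy / d3
    return acc

def leapfrog_step(pos, vel, masses, dt):
    """One kick-drift-kick step: apply the attractions at this instant
    to get positions and velocities at the next."""
    a = accelerations(pos, masses)
    vel = [[v[0] + 0.5 * dt * a[i][0], v[1] + 0.5 * dt * a[i][1]]
           for i, v in enumerate(vel)]
    pos = [[p[0] + dt * vel[i][0], p[1] + dt * vel[i][1]]
           for i, p in enumerate(pos)]
    a = accelerations(pos, masses)
    vel = [[v[0] + 0.5 * dt * a[i][0], v[1] + 0.5 * dt * a[i][1]]
           for i, v in enumerate(vel)]
    return pos, vel
```

Iterating `leapfrog_step` traces out the motion numerically, subject to exactly the precision limits mentioned above; it gives positions after {t} steps, but no formula in {t}.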

Getting a closed-form solution is what we commonly mean by "solving" parametric equations. At least that's what Newton meant by solving when he posed the problem, and what others meant by offering prizes for {N \geq 3}. Newton could not solve it. Nor could anyone else. Let's look and see why the great Newton failed—what was the roadblock that left the problem open?

An Impossibility Theorem

The obvious way to solve any problem is to count the quantities one needs to solve for, and then gather enough equations to pin them down. In linear systems this is easy: if there are {n} unknowns, then in general one needs at least {n} linear equations. Of course we may need more if the equations are not independent. The precise theorem is well known: to solve for {n} quantities we need {n} independent linear equations.

This can roughly be generalized to non-linear systems, like the three body problem. A natural approach is to try and find additional equations that can be added to the basic equations of motion. If enough equations can be found, even though the system is nonlinear, then it could be solved. This was tried for years by some of the best mathematicians of the 19th century, and later rigorized in analytic geometry as we know it today.

Finally, in 1887 Ernst Bruns and Henri Poincaré showed that there were not enough extra equations. More precisely, there were not enough extra conserved quantities that would yield enough equations. This leads to the “obvious” conclusion that there could be no analytic solution. For {N} bodies there are {6N} quantities and only {10} equations that meet a natural constraint of defining quantities called “integrals”—that is, constants of motion that are algebraic functions of positions and velocities alone, or integrals of such functions. Thus in the three-body case with {N=3} this yields {18} variables and only {10} equations. The theorem is:

Theorem: There are only ten integrals, algebraic with respect to the time, position, and the velocities only of the bodies.

This is the “impossibility” result for the three-body problem. This barrier seems definitive because it concerns equations, and so trumps any issue of how real numbers are represented.

Sundman’s Result

Yet Sundman was able to solve the three-body problem, by a completely different method. This shows that care must be used when discussing impossibility theorems. Bruns and Poincaré had really shown only that one kind of natural approach must fail. In modern terms this would be like showing that there is no fast algorithm of a special type for a certain problem. They did not rule out all possible algorithms. This is why Sundman's result is not in contradiction with the above impossibility theorem.

We are not the best place to explain what Sundman did. For more details, see Diacu's article, this 1995 MAA Chauvenet Prize survey by Donald Saari involving complex variables, and also this recent term paper by Peter Senchyna, who is currently a physics undergraduate at the University of Washington.

Roughly speaking, Sundman used an analytic method to get a power series for the three-body problem. The key issue then was: does the series converge for all future time? The problem is that the series will not converge for initial data that yields singularities. Sundman used some nontrivial results to show that for generic data he could avoid singularities. The trouble arises from the fact that he uses several transformations that make the series converge slowly. That is, getting the value to any useful precision requires so many terms that his solution is of little practical use. Or is it?

Is It Really Useless?

So why is Sundman’s result not famous? The answer seems to be that it is a galactic algorithm. He solves the problem by finding a convergent series that correctly predicts where the bodies will be in the future, but the convergence is galactically glacial. Our friends at Wikipedia state that to get a reasonable accuracy one would need to sum {10^{8000}} terms. That is obviously way too many.

The point we want to make is simple:

Who cares if the series converges slowly, there may be a fast way to get the same result…?

I wish we could show that Sundman's series could be sped up tremendously. We cannot. But we can show that there are many examples of slow series that look useless, yet can be summed quickly. For instance, consider the sum

\displaystyle \sum_{k=1}^{\infty} \frac{1}{k^{2}}.

To get this to an error of {2^{-1000}} say would require the summation of an astronomical number of terms. So should we give up? Of course not.

Leonhard Euler in 1735 proved the amazing result that

\displaystyle \sum_{k=1}^{\infty} \frac{1}{k^{2}} = \frac{\pi^{2}}{6}.

This means that your laptop can almost instantly compute the sum to the required accuracy not by summing the series but by computing {\pi^{2}/6}. This is easy to do.
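The contrast is easy to see in a few lines. This sketch (our own illustration, with hypothetical function names) compares brute-force partial sums against Euler's closed form: the error of the partial sum shrinks only like {1/n}, so reaching error {2^{-1000}} by direct summation would need roughly {2^{1000}} terms, while {\pi^2/6} is instant.

```python
import math

def basel_partial_sum(n):
    """Sum the first n terms of sum 1/k^2 by brute force."""
    return sum(1.0 / (k * k) for k in range(1, n + 1))

exact = math.pi ** 2 / 6  # Euler's closed form: one multiplication and one division

for n in (10, 100, 1000):
    err = exact - basel_partial_sum(n)
    print(n, err)  # the error shrinks only like 1/n
```

The tail of the series after {n} terms is about {1/n}, which is exactly what the printed errors show; no amount of patience makes direct summation competitive with the closed form.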

Getting the closed form for this series was called the Basel problem. It was first stated in 1644 by Pietro Mengoli, and worked on by many top mathematicians. It became a famous open problem of the day. Euler's solution was a shock that made him immediately famous.

Of course the three-body series may not be able to be sped up in this way. But maybe it can. It might even be possible to transform it analytically into another series for which any decimal place {d} of the answer can be computed locally, in the manner of the "spigot algorithms" for {\pi} that we have covered. This leads to two further observations:

  • Such a fast series would probably be called the solution, eclipsing Sundman’s.
  • Our notorious unsolved problems about complexity classes get involved when we approach the practicality of convergence in this way.
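For {\pi} itself such local computation really exists: the Bailey–Borwein–Plouffe formula yields any hexadecimal digit of {\pi} without computing the earlier ones, using modular exponentiation. The sketch below is a standard implementation of that formula (for {\pi}, of course, not for the three-body series); the function name is ours.

```python
def pi_hex_digit(d):
    """Hex digit of pi at position d+1 after the point, via the BBP formula,
    computed locally without producing the earlier digits."""
    def series(j):
        # fractional part of sum over k of 16^(d-k) / (8k + j)
        s = 0.0
        for k in range(d + 1):
            # three-argument pow does fast modular exponentiation
            s = (s + pow(16, d - k, 8 * k + j) / (8 * k + j)) % 1.0
        for k in range(d + 1, d + 16):  # a few rapidly vanishing tail terms
            s = (s + 16.0 ** (d - k) / (8 * k + j)) % 1.0
        return s
    x = (4 * series(1) - 2 * series(4) - series(5) - series(6)) % 1.0
    return int(16 * x)
```

Since {\pi = 3.243\mathrm{F}6\ldots} in hex, `pi_hex_digit(0)` through `pi_hex_digit(3)` give 2, 4, 3, 15. A fast series with this property for the three-body problem would be a different object entirely from Sundman's.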

Still, the naive remark that the series itself converges too slowly may be no barrier at all. There are many other examples of slow series that can be computed via various tricks. Perhaps the three-body case is one of these. The slow convergence is not an ironclad impossibility theorem—not even close.
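Here is one such trick in miniature, again only as an illustration on the Basel series: correcting the partial sum with the first terms of the Euler–Maclaurin estimate of the tail. The same 100 terms that naively give about two correct digits then give about eleven.

```python
import math

def basel_partial(n):
    """Brute-force partial sum of sum 1/k^2."""
    return sum(1.0 / (k * k) for k in range(1, n + 1))

def basel_accelerated(n):
    """Partial sum plus an Euler-Maclaurin estimate of the tail:
    sum over k > n of 1/k^2  ~  1/n - 1/(2 n^2) + 1/(6 n^3) - ..."""
    return basel_partial(n) + 1.0 / n - 1.0 / (2 * n * n) + 1.0 / (6 * n ** 3)

exact = math.pi ** 2 / 6
print(exact - basel_partial(100))      # error about 1/100
print(exact - basel_accelerated(100))  # error smaller by many orders of magnitude
```

The acceleration costs nothing extra yet changes the error from {\sim 1/n} to {\sim 1/n^5}. Whether any analogous trick applies to Sundman's far more complicated series is exactly the open question.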

Open Problems

The obvious problem is to look at the series for the three-body problem and see if we can apply methods to speed it up. Can we use any theory tricks to make the calculation go exponentially faster?

Shifts In Algorithm Design

July 21, 2014

How to find approximate page rank fast, among other things


Jennifer Chayes is the current director of a research lab in Cambridge—that is Cambridge Massachusetts—for a company called Microsoft. She is famous for her own work in many areas of theory, including phase transitions of complex systems. She is also famous for her ability to create and manage research groups, which is a rare and wonderful skill.

Today Ken and I wish to talk about how to be “shifty” in algorithm design. There is nothing underhanded, but it’s a different playing field from what we grew up with. Read more…

Constructive Sets

July 16, 2014

An approach to complexity lower bounds?

Book source

Kurt Gödel did it all, succinctly. His famous 1938 paper “The Consistency of the Axiom of Choice and of the Generalized Continuum-Hypothesis” was {1\frac{1}{3}} pages long. Since the Proceedings of the National Academy of Sciences was printed on single-column pages in fairly large type, it would have been under one page in FOCS or STOC format. Only Leonid Levin has famously been that succinct.

Today Ken and I wish to advance another approach on complexity questions based on succinctness and Gödel’s brilliant idea. We have covered succinctness before, but now our angle is more arithmetical.
Read more…

High School Theorems

July 10, 2014

Taking a conjecture about identities to college


Alex Wilkie is a Fellow of the Royal Society, and holds the Fielden Chair in Mathematics at the University of Manchester. Ken knew him at Oxford in 1981–1982 and again from 1986. In 1993 Wilkie won the Karp Prize of the Association of Symbolic Logic, equally with Ehud Hrushovski. This is named not after Dick Karp but rather Carol Karp, a mathematical logician in whose memory the prize was endowed.

Today I wish to talk about logical theories, motivated by some questions from Bill Gasarch and others.
Read more…

Do Random Walks Help Avoid Fireworks?

July 4, 2014

Musings on gravity and quantum on the 4th of July

Cropped from Crumbel source

Amanda Gefter is the author of the book titled Trespassing on Einstein’s Lawn: A Father, a Daughter, the Meaning of Nothing, and the Beginning of Everything. As the title suggests—and as Peter Woit pointed out in his review—there are parallels with Jim Holt’s book Why Does the World Exist?, which we covered a year ago. Gefter’s essay for Edge last year lays out more briefly how Einstein regarded relativity and early quantum as structurally divergent theories.

Today her book has led me to think further about the relationship between gravity and quantum mechanics. Read more…

Remembering Ann Yasuhara

July 1, 2014

Ann will be missed


Ann Yasuhara was a mathematician and a complexity theorist, who passed away this June 11th. She lived over half of her 82 years in Princeton, and had been a member of the faculty of computer science at Rutgers since 1972.

Today Ken and I send our thoughts and condolences to Ann’s husband Mitsuru Yasuhara, her family, and her many friends—Ann will be missed—she was special.
Read more…

A Pretty Identity

June 29, 2014

Or rather two beautiful identities


Joseph Lagrange was a mathematician and astronomer who made significant contributions to just about everything. Yet like all of us he could make mistakes: he once thought he had proved Euclid’s parallel postulate. He wrote a paper, took it to the Institute, and as they did in those days, he began to read it. But almost immediately he saw a problem. He said quietly:

Il faut que j’y songe encore.

“I need to think on it some more.” He put his paper away and stopped talking.

Today Ken and I thought we would talk about a beautiful identity of Lagrange, not about the parallel axiom. Read more…

