# Proud Simons

*Help get the new institute rolling*


Dick Karp is now the director of the Simons Institute for the Theory of Computing.

Today Ken and I are going to do something that we rarely do. Well, hardly ever do. Okay, we have done it once or twice before.

We have been asked by Dick to help announce the Institute’s call for new proposals. So propose away, and help them get started.

## The Call

Being asked to do something that we usually do not do reminded us of the great song “Proud Mary.” With apologies to John Fogerty and Tina Turner, here is our version:

Y’ know, every now and then

I think you might like to hear something from us

Nice and simple

But there’s just one thing

You see we never ever do nothing

Nice and simple

We always do it nice and complex

So we’re gonna take the beginning of this

And do it easy

Then we’re gonna do the finish complex

This is the way we do “Proud Simons”

And we’re rolling, rolling, rolling across the Bay

Listen to the story

I left a good job at my department

Working for the chair every night and day

And I never lost one minute of sleeping

Worrying ’bout the way theorems might have been

Big wheel keep on turning

Proud Simons keep on burning

And we’re rolling, rolling

Rolling across the Bay-ay…

Here is the pointer. Or just study the wordle.

## Another Announcement

Our rule is we break rules only when we’ve already broken the rules. So here is another announcement: Christoph Koch, Aleksander Mądry, and Rüdiger Urbanke are organizing an Algorithmic Frontiers Workshop, June 11–14 at EPFL in Lausanne, Switzerland. It appears they are still accepting registration, which is free. The program includes talks by people previously featured here.

## Something Complex

We promised to end on something complex. This new StackExchange “Code Golf” puzzle asks for the shortest program that generates the lyrics to another song, “Never Gonna Give You Up” by Rick Astley. Note that carminal complexity is an established research subject going back to Donald Knuth’s famous paper “The Complexity of Songs.”
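The spirit of the puzzle can be sketched without reproducing any actual lyrics: a song whose repeated chorus makes its text length grow linearly with the number of verses can be emitted by a program whose own length stays constant, which is exactly the phenomenon Knuth’s paper plays with. The verse labels and chorus below are invented placeholders, not any real song.

```python
# Toy illustration of "carminal complexity": the printed song grows
# linearly with n, but the program describing it stays constant size.
# The verses and chorus here are made-up placeholders.

def song(n, chorus="na na na, na na na"):
    """Return n verses, each followed by the same fixed chorus."""
    lines = []
    for i in range(1, n + 1):
        lines.append(f"verse {i}")
        lines.append(chorus)
    return "\n".join(lines)

print(song(3))
```

A real golf entry would squeeze the repetition far harder (string multiplication, compressed literals), but the asymptotic point is the same: short description, long output.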

## Open Problems

One of the stated purposes of the Simons Institute is to “bring new insights into key problems in domains beyond conventional computation that require the analysis of vast amounts of data using new algorithms and mathematical approaches.” What should a complexity theory of Big Data look like? Closer to the complexity of song lyrics, or closer to data-center models where darkness can be a complexity measure? Or must we heed the caveats about “data science” voiced here?

*Jim Simons Builds 3000 Mile Long Bridge*

There is one experimental system that manages to analyze vast amounts of data quite well, with no competitive alternative across a wide range of applications: the human visual system, particularly when it comes to texture discrimination. This carries over to many other scientific domains, though it is not widely appreciated. One approach to modeling textures is Markov Random Fields. The problem is that one must learn the “important neighborhood” (formally, the cliques), and then the probability densities on that neighborhood; since we are in an experimental, data-driven setting, “good enough” approximations suffice. For a compact representation it makes sense to learn the family of densities itself from the data (I have not seen papers on this topic; references would be welcome). If the probabilities are approximated with exponential families, then we arrive at statistical mechanics, where we need to learn the potentials that describe the data: more formally, the subspace of the Hilbert space in which the family lies, i.e., an approximation of the sufficient statistics. Consider what physicists do: given experimental data, they learn the laws of nature, that is, the probabilities of co-occurrence. In classical physics those laws are 100% co-occurrences. In quantum mechanics this is less so, although the description is again a set of potentials lying in a particular subspace of a Hilbert space, with wavefunctions serving as the probing functions for these probabilities. So the Big Data approach should be the automation of physics discovery.
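The MRF idea above can be made concrete with a toy Ising-style texture model: binary pixels on a grid, pairwise cliques over 4-neighborhoods, and a single exponential-family potential `J` (the coupling). This is a minimal sketch under those assumptions, not a texture model from the vision literature; all names and the energy are illustrative.

```python
import math
import random

def neighbors(i, j, n):
    """Yield the 4-neighborhood of pixel (i, j) on an n-by-n grid."""
    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        a, b = i + di, j + dj
        if 0 <= a < n and 0 <= b < n:
            yield a, b

def gibbs_sample(n=16, J=0.8, sweeps=50, seed=0):
    """Sample a binary texture x with p(x) proportional to
    exp(J * sum over neighboring pairs of x[i]*x[j]) via Gibbs sweeps."""
    rng = random.Random(seed)
    x = [[rng.choice((-1, 1)) for _ in range(n)] for _ in range(n)]
    for _ in range(sweeps):
        for i in range(n):
            for j in range(n):
                # Local field: sum of neighboring pixel values.
                h = sum(x[a][b] for a, b in neighbors(i, j, n))
                # Conditional probability that this pixel is +1.
                p_up = 1.0 / (1.0 + math.exp(-2.0 * J * h))
                x[i][j] = 1 if rng.random() < p_up else -1
    return x
```

Learning the potential `J` (rather than fixing it) from observed textures is the estimation problem the comment alludes to; here only the sampling direction is shown.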

This is just to say that computer vision, coupled with a good experimental system (human vision), seems to hold an overlooked depth. Computer vision may help us understand in what sense quantum mechanics is epistemological, and in what sense it describes reality. By that I mean: where are the artificial mathematical constructions (artificial by definition, as interpretations of “reality”) that seem natural merely byproducts of the mathematical language, and how can we overcome them?