An amazing new set of conjectures and the Jacobian Conjecture

Arno van den Essen is one of the world experts on the Jacobian Conjecture. He has written a book on the problem—really the book. Besides this he has made many contributions to the understanding of this great conjecture.

Today I plan to talk about a paper Essen just posted to the arXiv on the Jacobian Conjecture (JC). His paper is amazing—really amazing. Its title is The Amazing Image Conjecture.

Years ago Essen, Evangelos Markakis, and I wrote a simple paper on an approach to the JC. I still believe the paper describes a viable approach to the problem. But we could only prove a very weak result about the JC.

Since we could not make our ideas work, let’s turn to Essen’s new paper.

Jacobian Conjecture

I thought you might like to go through a possible scenario of how the JC was created, and see some of the rationale behind the conjecture. This is not a “real” history, but one that I hope helps expose the ideas that go into the creation of a great conjecture.

Imagine that you are studying mappings ${F}$ from the two dimensional Euclidean plane to itself. A natural question is: when is ${F}$ one-to-one? There are many properties that mappings may or may not have, but being one-to-one is a basic property. Such functions are called injective—this name is due to Nicolas Bourbaki.

The first thing you notice is that ${F}$ being injective is a “global” property, since ${F}$ is one-to-one, or injective, provided there are no two distinct points ${a}$ and ${b}$ in the plane so that

$\displaystyle F(a) = F(b).$

Determining whether or not there are such distinct points is equivalent to asking for the solvability of the two equations

$\displaystyle \begin{array}{rcl} f(a_1,a_2) &=& f(b_1,b_2) \\ g(a_1,a_2) &=& g(b_1,b_2) \end{array}$

where ${(a_1,a_2) \neq (b_1,b_2)}$ and ${F(x,y) = (f(x,y),g(x,y))}$.

In general deciding whether such equations have a solution can be very difficult. In order to have any hope of solving this problem you decide to restrict the function ${F}$ to be a polynomial mapping: this means that ${f(x,y)}$ and ${g(x,y)}$ are both polynomials.

Even with the restriction to polynomial maps the difficulty is that ${a}$ and ${b}$ can be anywhere in the plane; they need not be “near” each other. This global nature is the key reason that determining whether or not ${F}$ is injective is hard.

You realize one way to simplify the problem is to ask a much weaker question: is the mapping ${F}$ “locally injective?” The mapping ${F}$ is locally injective if for each point ${p}$ there is a disk around ${p}$ so that for all ${a}$ and ${b}$ in the disk, if ${F(a) = F(b)}$, then ${a=b}$.

There are two reasons you think this is a good move. First, the mapping ${F}$ cannot be injective unless it is locally injective at every point ${p}$. This is a simple idea, but a powerful one. When facing a tough question, often relaxing the question can yield insight.

Second, you recall a classic theorem that has a sufficient condition for a smooth map to be locally injective at a point. The theorem is the famous Inverse Function Theorem. In one dimension it says that a function ${h}$ is locally injective at point ${p}$, provided

$\displaystyle h'(p) \neq 0.$

For example, ${h(x) = x^2}$ is locally injective at any point ${p \neq 0}$.

The Inverse Function Theorem generalizes this simple idea from calculus-101 to two dimensions and beyond. The critical question is what takes the role of the derivative of a function? It turns out the role is played by the Jacobian matrix of the mapping ${F}$—clearly we are close to why the conjecture is called the JC.

Associated with any smooth mapping ${F}$ is the Jacobian matrix, defined as

$\displaystyle J_F(x,y) = \left[ {\begin{array}{cc} f_x(x,y) & f_y(x,y) \\ g_x(x,y) & g_y(x,y) \\ \end{array} } \right]$

Note, ${f_x(x,y)}$ is the partial derivative of the function ${f}$ with respect to ${x}$. In the one-dimensional case this would degenerate to the ${1}$ by ${1}$ matrix ${f_x(x) = f'(x)}$. If the determinant of this matrix is non-zero at the point ${p}$, then by the famous Inverse Function Theorem the map ${F}$ is locally injective at ${p}$.

Thus to tell if the mapping ${F}$ is locally injective at every point you need only look at the determinant of the matrix ${J_F(x,y)}$. Note this is a polynomial in the variables ${x}$ and ${y}$. It can be a high degree polynomial, and over the reals it could be quite difficult to tell if this polynomial is or is not ever ${0}$. For example, does

$\displaystyle 17x^3 - 348xy^6 + x^7 - 89x^2y^4 + 12 = 0$

have a solution over the reals?

Now you make a move that is brilliant. Rather than try to figure out whether such polynomials have zeroes or not, you realize that the polynomial ${c}$, where ${c}$ is a non-zero constant, has no zeroes. So you make the “simplifying” conjecture:

Conjecture: If ${F}$ has ${ \det(J_F(x,y)) = c }$ where ${c}$ is a non-zero constant, then ${F}$ is globally injective.

This is the Jacobian Conjecture, due to Ott-Heinrich Keller in 1939; Shreeram Abhyankar later gave it its modern name.
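As a concrete illustration of the hypothesis (my example, not from the post), consider the triangular map ${F(x,y) = (x + y^3, y)}$. Its Jacobian matrix is upper triangular with ones on the diagonal, so ${\det J_F \equiv 1}$, and ${F}$ is indeed globally injective, with polynomial inverse ${G(x,y) = (x - y^3, y)}$. A quick numerical sanity check:

```python
import random

# The triangular map F(x, y) = (x + y^3, y) has Jacobian matrix
# [[1, 3y^2], [0, 1]], so det J_F is identically 1; it is globally
# injective, with polynomial inverse G(x, y) = (x - y^3, y).

def F(x, y):
    return (x + y**3, y)

def G(x, y):
    return (x - y**3, y)

def jac_det(F, x, y, h=1e-6):
    """Approximate det J_F at (x, y) by central differences."""
    fx = (F(x + h, y)[0] - F(x - h, y)[0]) / (2 * h)
    fy = (F(x, y + h)[0] - F(x, y - h)[0]) / (2 * h)
    gx = (F(x + h, y)[1] - F(x - h, y)[1]) / (2 * h)
    gy = (F(x, y + h)[1] - F(x, y - h)[1]) / (2 * h)
    return fx * gy - fy * gx

random.seed(0)
for _ in range(5):
    x, y = random.uniform(-2, 2), random.uniform(-2, 2)
    assert abs(jac_det(F, x, y) - 1.0) < 1e-4        # determinant is constant 1
    u, v = F(x, y)
    gx, gy = G(u, v)
    assert abs(gx - x) < 1e-9 and abs(gy - y) < 1e-9  # G inverts F
```

Of course such triangular maps confirm the conjecture only in a trivial way; the whole difficulty is the maps that are not so obviously invertible.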

You might ask why not replace the condition that ${ \det(J_F(x,y)) }$ is a non-zero constant by the condition that ${ \det(J_F(x,y)) }$ is never zero for any real numbers. The answer is surprising—it is false. This was the so-called “Real Jacobian Conjecture,” which is now known to be false, thanks to a counter-example due to Sergey Pinchuk.

I will not go into more details, but the intuition, I believe, is that many real analysis problems are only understood when complex numbers are allowed. This happens in understanding the singularities of rational functions. For example, the reason the power series of ${ 1/(1+x^2)}$ has radius of convergence only ${1}$ is that there is a complex singularity. Over the reals this function has no poles, while over the complex numbers it has a pole at ${i}$.
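A quick numerical illustration of this (mine, not from the post): the Taylor series of ${1/(1+x^2)}$ at ${0}$ is ${\sum_{k \ge 0} (-1)^k x^{2k}}$, and its partial sums converge for ${|x| < 1}$ but blow up for ${|x| > 1}$, even though the function itself is perfectly smooth on the whole real line.

```python
# Partial sums of the Taylor series of 1/(1+x^2), namely
# sum_{k>=0} (-1)^k x^(2k): convergent inside |x| < 1, divergent outside.

def partial_sum(x, n):
    return sum((-1) ** k * x ** (2 * k) for k in range(n))

inside = partial_sum(0.9, 200)    # |x| < 1: geometric-like convergence
outside = partial_sum(1.1, 200)   # |x| > 1: terms grow without bound

print(abs(inside - 1 / (1 + 0.9 ** 2)))  # tiny error
print(abs(outside))                      # astronomically large
```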

JC Approaches

There have been many claims of proofs of the JC; some are still active. The incorrect proofs—I think all have been proofs, not counter-examples—have come from amateurs and professionals alike. The problem is slippery. There seem to be many difficulties in understanding the problem. Yet the problem’s statement only requires basic calculus.

The JC seems to be one of those problems that should be resolvable. Yet it has withstood the attacks of mathematicians for over 70 years, ever since Keller posed it in 1939. There have been many partial results and reductions, but no proof yet.

${\bullet }$ Equivalent Problems: Like many problems in mathematics, there are numerous equivalent statements. See Essen’s book for many examples. One of the most famous is to reduce the degree of the mapping ${F}$. In order to do this the problem is generalized to higher dimensions: the map ${F}$ is allowed to be

$\displaystyle F(x) = (f_1(x),\dots,f_n(x))$

where ${x = (x_1,\dots,x_n)}$. Hyman Bass, Edwin Connell, and David Wright proved the famous:

Theorem: If the generalized JC is true for all ${n}$ and polynomial maps of degree at most three, then JC is true.

As is often the case, two is special. The general JC is easily proved for mappings of degree at most two.

${\bullet }$ Direct Proof Attempts: One “obvious” idea is to try to prove an inductive statement. If ${F = (f,g)}$ is the mapping, then prove that there is a transformation of ${F}$ that preserves injectivity and lowers the degree. A claim a few years ago was based roughly on this type of approach, but it fell apart.

${\bullet }$ Advanced Attempts: There have been approaches to the JC based on advanced ideas from algebra. I am the last person who should discuss these, but while they look promising they have not yet worked.

${\bullet }$ Algebra or Analysis: There are other approaches based on analysis rather than algebra. Of course all methods are allowed, but I have often thought the problem is closer to number theory than algebra; then again, who knows.

Amazing Conjectures

Essen’s amazing paper describes a beautiful connection recently discovered by Wenhua Zhao between the JC and some striking questions that seem interesting on their own. These questions form a new set of extremely interesting conjectures. I will explain one of them—see Essen’s well-written paper for the full set of conjectures.

The generic class of open problems concerns particular linear functionals ${L}$ on spaces of polynomials. Call ${L}$ “nice” if the only polynomial ${f}$ such that ${L(f^m) = 0}$ for all ${m \ge 1}$ is the zero polynomial. The amazing fact is that there exist particular functionals ${L}$ such that if ${L}$ is nice, then the JC is true. These functionals are fairly complicated to define, so here I will follow Essen’s paper by starting with three simpler functionals for which niceness is non-trivial—and still open for the third:

$\displaystyle \begin{array}{rcl} A(f) &=& \int_0^1 f(x) dx \\ B(f) &=& \int_0^{\infty} f(x)e^{-x} dx \\ C(f) &=& \int_0^{\infty} \cdots \int_0^{\infty} f(x_1, \ldots, x_n) e^{-(x_1 + \cdots + x_n)} dx_1 \ldots dx_n \end{array}$

Each of ${A,B,C}$ sends a polynomial to a scalar. The questions are:

1. If ${A(f^m) = 0}$ for all integers ${m \ge 1}$, must ${f=0}$?
2. If ${B(f^m) = 0}$ for all integers ${m \ge 1}$, must ${f=0}$?
3. If ${C(f^m) = 0}$ for all integers ${m \ge 1}$, must ${f=0}$?

Essen can prove that the answers to the first two questions are yes, but the third, about ${C(f^m) = 0}$, is open for multivariate polynomials.
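To get a feel for question 1, here is a small exact computation (a sketch of mine, not from Essen's paper) of ${A}$ on powers of a univariate polynomial, using rational arithmetic. The polynomial ${f(x) = 2x - 1}$ is a natural near-miss: ${A(f^m)}$ vanishes for every odd ${m}$, but the even powers give nonzero values, so it does not defeat the question, consistent with the proved univariate case.

```python
from fractions import Fraction

def A(coeffs):
    """A(f) = integral of f over [0, 1], for f as a coefficient list (index = degree)."""
    return sum(Fraction(c, k + 1) for k, c in enumerate(coeffs))

def poly_mul(a, b):
    """Multiply two coefficient lists."""
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] += ai * bj
    return out

def poly_pow(a, m):
    r = [1]
    for _ in range(m):
        r = poly_mul(r, a)
    return r

# f(x) = 2x - 1: A(f^m) = 0 for odd m (f is odd about x = 1/2),
# but the even powers are nonzero.
f = [-1, 2]
print([str(A(poly_pow(f, m))) for m in range(1, 7)])
# → ['0', '1/3', '0', '1/5', '0', '1/7']
```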

I will give a bit of flavor of how he proves such theorems, but as always see his paper for the details.

Consider the polynomial

$\displaystyle f(x) = c_n x^n + \cdots + c_l x^l$

where ${l>0}$ and each coefficient is a Gaussian integer. I am assuming that the constant term is zero, and plan on showing that the lowest coefficient ${c_l}$ is also zero. This will inductively prove that ${f}$ is zero.

The key is to raise the polynomial to a special power ${m}$, which is selected so that all the terms but the lowest “disappear” when ${B}$ is applied. Let ${m = (p-1)/l}$, where ${p}$ is a prime and ${l}$ divides ${p-1}$; by Dirichlet’s theorem there are infinitely many such primes. Then ${f^m}$ is congruent to

$\displaystyle g(x) = h(x) + c_{l}^m x^{ml}$

modulo ${p}$. Define ${d}$ to be the minimum degree of the terms of ${h(x)}$. Then ${d}$ is at least

$\displaystyle lm + 1 = p.$

Thus ${B(f^m)}$ is congruent modulo ${p}$ to

$\displaystyle c_{l}^m (p-1)!,$

since all the terms in ${h(x)}$ have degree at least ${p}$, and ${r!}$ is divisible by ${p}$ whenever ${r \ge p}$. It is easy to prove that ${B}$ maps

$\displaystyle x^r \rightarrow r!.$

Since ${B(f^m) = 0}$ and ${(p-1)!}$ is not divisible by ${p}$, it follows that ${c_{l}^m}$ is divisible by ${p}$. As this holds for infinitely many primes ${p}$, we must have ${c_l = 0}$. The last step uses that ${c_l}$ is a Gaussian integer.
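The congruence at the heart of this argument can be checked directly (a sketch of mine, using integer coefficients for simplicity rather than general Gaussian integers):

```python
from math import factorial

def poly_mul(a, b, p):
    """Multiply two coefficient lists (index = degree), reducing mod p."""
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] = (out[i + j] + ai * bj) % p
    return out

def poly_pow(a, m, p):
    r = [1]
    for _ in range(m):
        r = poly_mul(r, a, p)
    return r

def B_mod(coeffs, p):
    """B(f) = sum_r c_r * r!, reduced mod p; note r! vanishes mod p once r >= p."""
    return sum(c * factorial(r) for r, c in enumerate(coeffs)) % p

# f(x) = 3x + 5x^2 + 2x^3: zero constant term, lowest term c_l x^l with
# l = 1 and c_l = 3. Take p = 7 (so l divides p - 1) and m = (p - 1) / l.
f, l, c_l, p = [0, 3, 5, 2], 1, 3, 7
m = (p - 1) // l

lhs = B_mod(poly_pow(f, m, p), p)
rhs = (pow(c_l, m, p) * factorial(p - 1)) % p
print(lhs, rhs)  # → 6 6: B(f^m) ≡ c_l^m (p-1)!  (mod p), as in the text
```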

Open Problems

Of course the main open question is the JC itself. Is it true? This problem is not a Clay problem—perhaps it should be added to replace the Poincaré Conjecture. The JC is on Steve Smale’s list of open problems, as problem ${16}$.

1. July 18, 2010 6:19 am

i don’t understand the finite field case.
if a polynomial map: GF(2)^n -> GF(2)^n has non-invertible jacobian matrix, does this mean the map is not one-to-one? (the derivative is the formal derivative in case this makes any sense…)

July 18, 2010 3:24 pm

j,

I do not get your question. The JC is over the complex numbers. However, it makes sense over any field and even over commutative rings. Check out Essen’s book, where he discusses some issues related to finite fields.

• July 20, 2010 9:46 am

{x-y, 3x+3y} over Z/24Z ?

• August 29, 2010 7:33 am

sorry my question wasn’t clear, will try to explain by example:

given a polynomial map GF(2)^n -> GF(2)^n can you probabilistically tell if it is one-to-one?

a specific example: consider the md5 hash function restricted to 128 bit input (16 bytes). the output is 128 bits too and it is a map GF(2)^128 -> GF(2)^128. (addition mod 32 can be emulated over GF(2) basically by using an adder circuit).

i was told it is open question if this is one-to-one.

a possible approach of settling this may be to try to use the jacobian matrix.
to compute the numerical jac. matrix partial derivatives can be computed via automatic differentiation (http://en.wikipedia.org/wiki/Automatic_differentiation taking into account x^k = x, k>0) or Derivatives of Binary Functions (http://cr.yp.to/cubeattacks.html see the scanned pages by djb).

so if the jacobian matrix has determinant 0 on random input it is not invertible (if it is 1 it is possible just to be the value of polynomial). i got a 0 witness for 128 bit md5 and failed to get 0 witness for the known 1-to-1 128 md5 restricted to the 1st round.

do you have opinion if md5 restricted to 128 input is one-to-one ?

2. July 18, 2010 12:51 pm

Thanks so much for sharing this fantastic conjecture. I’m shocked that the general real case has a counter-example.

July 18, 2010 9:56 pm

This is a great problem. I read the Wikipedia article on the Jacobian Conjecture first.

I wonder why the Wikipedia definition of the conjecture defines the mappings to be between complex numbers rather than from the two-dimensional Euclidean plane to itself.

In that definition it’s obvious why the Jacobian determinant has to be a constant. But how can it still be the same conjecture?

July 19, 2010 8:12 am

I thought the real case would be more natural. The problem makes sense over any field or even ring. So there are many versions—which are often the same. We see this, of course, all the time in theory. For example, if one can prove it over the rationals or even the integers, that implies the JC—one needs to allow any number of variables.

July 19, 2010 4:27 pm

I absolutely agree that the real case is much more natural. Even I can see why this is an interesting problem 🙂

I guess I just got confused: over the complex numbers it’s much easier to show the Jacobian determinant has to be a constant. But as Sergey Pinchuk’s counterexample is known in the real case, you don’t have an advantage for the real case anymore.

Am I right in thinking that if the counterexample of Pinchuk weren’t known, the real case would look easier to prove than the complex case?

If I am right I have another question; I’m sorry if this is a stupid question, as I’m actually less than an amateur:

As the problem makes sense over any field or even ring, is it proven over any field or ring that the Jacobian determinant has to be a constant?

July 19, 2010 2:27 pm

Over an algebraically-closed field like the complex numbers, a (possibly multivariate) polynomial is a nonzero constant if and only if it is nowhere zero.

July 19, 2010 2:48 pm

I wonder – are there proofs in complexity theory that draw upon the world of complex numbers?

Quantum world – yes. But in classical or probabilistic cases?

July 19, 2010 5:48 pm

Mendel,

Yes, the real case looked nice—I would think. The reals have an ordering and other properties that might have been exploited.

Your last question is quite interesting. Must the determinant be constant over other fields? I think the answer is likely yes for some other fields, but I do not know.

7. July 20, 2010 6:43 am

Hmmm … Dick’s column gives a name to a problem that arises naturally in quantum simulation theory, that problem being: Given a (finite-dimensional) Hilbert vector that can be written as a tensor network state of rank r, is the tensor network representation unique?

We can regard the tensor network state as a compressed algebraic representation of the Hilbert vector, so the question is really about the uniqueness of algorithmically compressed quantum states. We see that it is also natural to restrict the question to bosonic (symmetrized) and fermionic (antisymmetrized) tensor network states.

It is perfectly feasible to search numerically for tensor network representations, and these searches turn out to work amazingly well, in essence because the manifold of tensor network states has both a ruled structure and negative sectional curvature.

Our impression from numerical experiments is that the uniqueness conjecture is true for tensor network states, but on the other hand, it never occurred to us to search for counterexamples. For that idea, we are indebted to Dick’s fine post!

Oftentimes it’s not easy to recognize the correspondence between abstract mathematical theorems and practical engineering problems, so comments regarding the relevance of existing theorems to the injectivity of tensor network maps would be welcome.