Mathematical Intuition—What Is It?
Why is mathematical intuition easier to come by in some areas than in others?
Enrico Bombieri is one of the world leaders in many areas of mathematics, including number theory, algebraic geometry, and analysis. He has won many awards, and is a Fields Medalist.
Today I want to talk about the notion of intuition in mathematics. I am curious what it is, how to get it, and how to use it.
One story, perhaps an urban legend, is that a senior expert in real analysis was once sent a paper that “proved” a surprising theorem. The expert looked at the proof, and was immediately skeptical. The “theorem” seemed too surprising—his intuition, based on his great experience, was that the theorem could not be true. Yet even after hours of studying the proof he could not find any mistakes. But his intuition continued to bother him. He finally looked even more carefully, and found the problem. The author of the proof had used a lemma from a famous topology book, exactly as it was stated in that famous textbook. But there was a typo in the book. Somehow the words “closed” and “open” had been exchanged in the statement of the lemma. This made the lemma false, caused a gap in the proof of the surprising theorem, and left the poor author with a buggy paper.
Proving theorems is not mechanical; it does require formal manipulation. Yet proving theorems also requires intuition: the ability to see what is reasonable or not, and the ability to put all these together. Blindly using a lemma from even the most famous textbook can be dangerous, as the story shows.
I once lost several months of hard work trying to use a published theorem to solve an open problem. I almost had a proof, but eventually, as in the story, I found a bug in the published result—the result was the sole result of a friend’s Ph.D. thesis. Oh well. This is not an urban legend, since I was there, but I will discuss it another time.
For now let’s turn to the discussion of intuition in mathematics.
In mathematics you don’t understand things. You just get used to them. — John von Neumann
Number Theory
I think that it is not too hard to have a reasonable intuition in number theory, especially concerning the behavior of prime numbers. A pretty fair approximation is to treat the distribution of primes as “random.” That is, primes in an interval $[x, x+y]$ behave roughly like randomly selected odd numbers with the correct density: the number of primes in such an interval, for suitable $y$, is about $y/\ln x$. This of course follows from the prime number theorem.
A classic example of this is the conjectured behavior of twin primes. If primes are random, then one would expect that there are about

$$\frac{Cx}{(\ln x)^2}$$

twin primes in $[1, x]$, where $C$ is a constant. Godfrey Hardy and John Littlewood made an even more detailed conjecture, which included the value of the constant $C$. They guessed that $C$ is

$$2\prod_{p>2}\left(1-\frac{1}{(p-1)^2}\right) \approx 1.3203\ldots$$
Of course there are local properties that need to be added to the model. The number of primes $p$ so that $p$, $p+2$, and $p+4$ are all primes is not of order

$$\frac{x}{(\ln x)^3}$$

but one. This follows since one of $p$, $p+2$, $p+4$ is divisible by $3$: so $p$ must be equal to $3$.
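As a sanity check of this heuristic, here is a short sketch that counts twin primes and compares against the Hardy–Littlewood prediction. The cutoff $10^5$ and the truncation point for the infinite product are my own illustrative choices, not part of the post.

```python
from math import log

N = 100_000  # illustrative cutoff

# Sieve of Eratosthenes up to N.
is_prime = [False, False] + [True] * (N - 1)
for i in range(2, int(N ** 0.5) + 1):
    if is_prime[i]:
        for j in range(i * i, N + 1, i):
            is_prime[j] = False

# Count twin-prime pairs (p, p+2) with both members <= N.
twins = sum(1 for p in range(3, N - 1) if is_prime[p] and is_prime[p + 2])

# Twin-prime constant C = 2 * prod_{p>2} (1 - 1/(p-1)^2), product truncated.
C = 2.0
for p in range(3, 10_000):
    if is_prime[p]:
        C *= 1.0 - 1.0 / (p - 1) ** 2

# Hardy-Littlewood estimate: C * integral dt/(ln t)^2, taken as a simple sum.
estimate = C * sum(1.0 / log(n) ** 2 for n in range(3, N + 1))

print(twins, round(estimate))  # the two counts agree to within a few percent
```

The simple form $Cx/(\ln x)^2$ undercounts noticeably; the integral form used here tracks the true count much more closely, which is the usual way the conjecture is stated.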
I once had a proof that needed only a lemma about the structure of the primes to be complete. It was about a communication lower bound that I was working on at the time with Bob Sedgewick. We could not prove the lemma, nor could we find any references to anything like it. So we made an appointment to see the famous number theorist, Enrico Bombieri. He is a member of IAS, and was there at the time. So Bob and I went over to ask him about our lemma.
Bombieri listened very politely, asked a question or two for clarification, and said, “yes that lemma is surely correct.” We were thrilled, since this would make our proof complete. I then asked him for a reference. He looked at me and said:
Of course the lemma about primes is true, but it is completely hopeless to prove it.
He had great intuition about primes, but proving certain results was then and still is today completely beyond anything anyone can do.
Geometry
My intuition in geometry, especially in high dimensions, is very poor. I have proved basic geometry theorems that hold in $n$ dimensions, but I have terrible geometric intuition.
I am not sure why; I am not sure what makes geometry so hard for me, but it is very different from prime numbers. There are plenty of geometric questions where I would not have any idea how to conjecture what is true. Perhaps it is just a defect in my abilities, but geometry seems to be orders of magnitude trickier than number theory.
See this for some nice comments on high-dimension geometry. It is a quite interesting anonymous blog—it is being written by a Ph.D. student, and I wish him/her well.
Groups
My intuition about finite groups is even worse than my geometric intuition. No that is not quite right. In a sense my intuition about groups is really very good. Over the years I have hit upon a rule in thinking about groups. I figured out that if I thought that X was a reasonable theorem that should hold for finite groups, then X was likely to be false.
Of course, this is a bit silly. It is like having a really poor sense of picking stocks—if you were always wrong, then there would be a great strategy. But, somehow I do believe there is something to what I am saying. My intuition is so bad that after a while I just started to guess the opposite of whatever I first thought.
My last post is a perfect example. I have a series of conjectures about solvable groups. The conjecture I listed took an expert, Colin Reid, a few minutes to disprove. He is a group theorist. I should have known better, but my intuition about groups is terrible, yet they may play an important role in our understanding of theory.
Reid’s Proof
Colin Reid posted his proof that the conjecture is impossible in the comments section of the last discussion. The parts of his comments using angle brackets were snipped as HTML-style tags, so this is an expanded version fixing the glitches. Ken Regan and I have also replaced his quotient-subgroup notation by homomorphism notation.
Let $G$ be a non-trivial solvable group—some say soluble group. The composition series is defined by $G_0 = G$, and for $i \geq 1$, $G_i = [G_{i-1}, G_{i-1}]$, the subgroup generated by the commutators $[g,h] = ghg^{-1}h^{-1}$ for $g, h \in G_{i-1}$. Solvability of $G$ means that some $G_r$ is the trivial subgroup $\{1\}$, whereupon the series terminates. The important facts are that not only is every $G_i$ normal in its predecessor, it is normal in the entire group $G$, and that the immediate quotients $G_{i-1}/G_i$ are abelian. This means that we can define a homomorphism $\pi$ on all of $G$ whose kernel is $G_i$, and for all $c, d \in G_{i-1}$, $\pi(c)$ and $\pi(d)$ commute. The reason for the latter is that

$$dc = cdk \quad\text{where } k = [d^{-1}, c^{-1}],$$

since $k$ is in the kernel $G_i$.
Now recall that asserted the existence in of elements such that some conjugates of and of are in , where , and likewise some conjugates of and are in , with a third condition on the orders of these elements in . The condition is that be relatively prime to , and this is what finally prevents the existence of .
For contradiction, suppose a solvable group with such elements exists. Then in the composition series, there is some such that contains all of , but does not. By symmetry, without loss of generality, we can suppose does not have . Take as above, so . It does not matter whether or belongs to ; that both belong to is enough. Since is a conjugate, . Now take the homomorphism with as kernel, and observe:
- .
- divides , divides , and divides .
- Since $c$ and $d$ are in $G_{i-1}$, $\pi(c)$ and $\pi(d)$ commute.
- Hence $\pi(c)$ and $\pi(d)$ can generate at most $m$ different elements.
- Put more strongly, the subgroup $\langle \pi(c), \pi(d) \rangle$ they generate is also a subgroup of the free abelian group generated by $\pi(c)$ and $\pi(d)$, which has exactly $m$ elements, so by Lagrange’s theorem, the order of $\langle \pi(c), \pi(d) \rangle$ divides $m$.
- Since is in , it follows that divides , which divides .
- However, since divides which is relatively prime to , the only way this can happen is .
- That means , so since is the kernel of .
- But is normal in , not just in , so is in . This puts into , which yields a contradiction.
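As a small illustration of the series used above (my own toy example, not part of Reid’s argument), one can compute the successive commutator subgroups of the solvable group $S_4$ directly, using the post’s convention $[g,h] = ghg^{-1}h^{-1}$:

```python
from itertools import permutations

def compose(p, q):
    """(p o q)(i) = p[q[i]] -- apply q first, then p."""
    return tuple(p[q[i]] for i in range(len(p)))

def inverse(p):
    inv = [0] * len(p)
    for i, v in enumerate(p):
        inv[v] = i
    return tuple(inv)

def closure(elements, n):
    """Smallest subgroup of S_n containing the given elements."""
    group = set(elements) | {tuple(range(n))}
    while True:
        new = {compose(a, b) for a in group for b in group} - group
        if not new:
            return group
        group |= new

def derived_subgroup(group, n):
    """Subgroup generated by all commutators [a,b] = a b a^-1 b^-1."""
    comms = {compose(compose(a, b), compose(inverse(a), inverse(b)))
             for a in group for b in group}
    return closure(comms, n)

n = 4
G = set(permutations(range(n)))   # S_4, which is solvable
orders = [len(G)]
while len(G) > 1:
    G = derived_subgroup(G, n)
    orders.append(len(G))

print(orders)  # S_4 -> A_4 -> V_4 -> {e}: orders 24, 12, 4, 1
```

The series hits the trivial subgroup after three steps, which is exactly the “finite, non-renewable resource” mentioned below: a solvable group only has so many levels to descend.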
To someone with good algebraic intuition this comes trippingly off the tongue, which is why it is a service to communicate it in public. So we thank Colin Reid—and we will see if the insight gained works against our more-complicated stratagems of this kind. Colin is currently a grad student at Queen Mary, University of London.
It may be that deterministically simulating each level of a Boolean circuit simply must bump you one step along a composition series, which in a solvable group is a finite, non-renewable resource.
However, we also have other ideas… Perhaps we can get mileage out of choosing different solvable groups for different input sizes $n$. This relates complexity questions to ones about the possible lengths of composition series in groups of certain sizes—although what we know about such sizes is not so promising. But perhaps a randomized simulation can avoid these limitations.
Complexity Theory
I think we have good intuition here, but I have seen many surprises in my career:
- Linear Programming is in polynomial time.
- Nondeterministic space is closed under complement.
- Bounded-width polynomial-size branching programs are powerful.
- Permanent has random self-reduction.
- Quantum polynomial time can factor integers.
- De-randomization of random walks for undirected graphs.
- Proofs can be checked in constant time.
- The existence of Zero-Knowledge protocols.
I have reasonable intuition, yet all of these were surprising to me, even some whose development I worked on and contributed to.
Open Problems
Is intuition simply built up by learning more and more about an area? Or is intuition something that is separate from just being an expert in an area? Can you be quite strong in an area and still have weak intuition, or is that impossible?
Comments

I’m actually the owner of that anonymous blog. I’ll have to deanonymize it. :-)
Thank you for the link. I’m glad that you found my comments useful. Let me also take the opportunity to say that I get a lot of insight and enjoyment from your posts here.
This is a surprisingly deep subject, and I doubt I’ll be able to do it much justice – but it’s a personal interest of mine.
To answer your open problems from my own intuition:
> Is intuition simply built up by learning more and more about an area? Or is intuition something that is separate from just being an expert in an area?
I think this could be paraphrased as “Is intuition something real and distinct?”
It’s clearly distinct from merely being an expert, in that it’s certainly possible to know a given subject broadly enough to do useful work, and yet not be able to come up with results that someone else – with less purely factual knowledge – could come up with intuitively.
We probably need a working definition of mathematical intuition, at this point. I would say it’s the ability to come up with answers that are correct more often than they would be by pure chance, without reasoning them out consciously or rigorously. One could say it’s nothing more than a highly developed ability to make educated guesses (though I’d say “nothing more” is a bit missing the point).
> Can you be quite strong in an area and still have weak intuition, or is that impossible?
Sure. You’d certainly be at a disadvantage, though.
This starts to drive towards the question of what intelligence is. In doing anything creative, you aren’t working entirely at a conscious level; you have to come up with ideas – perhaps randomly, at first – try them out, see where they lead, repeat.
It’s this coming-up-with-ideas part that a well-developed intuition helps with. The better your guesses are, the faster you’ll go – you could think of creativity as a Monte Carlo method, with your guesses coming from a random number generator. Only the random number generator isn’t fixed – it’s got a bunch of parameters you can tweak.
But coming up with ideas, by itself, isn’t doing math – and a good intuition isn’t enough to create a proof. If you’re good at doing math at the conscious level, that’s still quite a lot to be good at – if you’re very good at holding an entire potential proof in your head, that’ll compensate for a lesser ability to come up with things to try.
But in anything creative – i.e. coming up with new math, versus merely applying things that are already known – I would also say that a good intuition is a very powerful tool.
Oops, it’s possible my use of an “\mbox” for text inside an equation got snipped: in the section on Reid’s proof after dc = cdk the box defined k as the commutator [d^{-1},c^{-1}], which turns cd into dc. (And in step 9., actually it’s x^{-1} a’ x that equals a, though this barely matters.)
About Italians and intuition: Luigi Salemi proved that 3Sat is solvable in polynomial time.
I think it is substantially correct.
Hmm, that academia.edu page needs updating. I’m currently a postdoc at the University of Göttingen.
Thanks for writing out my proof in detail, this ought to make it clearer for non-algebraists. Just one small nitpick: for 5., it’s better to think of it as a quotient. Essentially, you can write down the full list of m possible strings and they do all map to elements, but the map need not be injective.
KWRegan: in algebra there is a ‘left’ convention and a ‘right’ convention; x^{-1} a x is the right conjugate of a by x (which is the same as the left conjugate by x^{-1}). It seems most group theorists (myself included) write on the right as default (because most languages including English are written ‘on the right’, that is, from left to right), whereas many other algebraists prefer the left (because the ubiquitous f(x) notation is left-handed). Which side you prefer is completely irrelevant mathematically, of course, but in general it’s important not to switch sides by accident.
Thanks for explaining the beautiful proof! I never realized that Gᵢ is normal in G, although it seems so trivial. It helped me to view π(g) as gGᵢ = g[Gᵢ₋₁,Gᵢ₋₁] = [Gᵢ₋₁,Gᵢ₋₁]g to follow the proof, so I’ll mention it.
In addition to Colin’s comment on 5. please add the small world “free” (abelian group) to it, before one reads on to “exactly order m”, it is a bit puzzling to read that 〈π(c),π(d)〉 might be anything else but the (abelian) group generated by π(c),π(d) itself ;)
That should read “word”, not “world”, of course.
What a great topic! Here is a trio of further quotes — all are wholly compatible with Dick’s remarks on the role of intuition:
———————–
George Dantzig: “In brief, one’s intuition in higher dimensional space is not worth a damn!”
———————–
Richard Feynman: “One thing is to prove it by equations; the other is to check it by calculations. I have mathematically proven to myself so many things that aren’t true. I’m lousy at proving things — I always make a mistake. … So I always have to check with calculations; and I’m very poor at calculations — I always get the wrong answer. So it’s a lot of work in these things. … If [we] can do a real physical problem with real physical things in them, then I’m sure we have the right method.”
———————–
Bill Thurston: “Our brains are complicated devices, with many specialized modules working behind the scenes to give us an integrated understanding of the world. Mathematical concepts are abstract, so it ends up that there are many different ways that they can sit in our brains. ”
“A given mathematical concept might be primarily a symbolic equation, a picture, a rhythmic pattern, a short movie—or best of all, an integrated combination of several different representations. These non-symbolic mental models for mathematical concepts are extremely important, but unfortunately, many of them are hard to share.”
“Mathematics sings when we feel it in our whole brain. People are generally inhibited about even trying to share their personal mental models. People like music, but they are afraid to sing. You only learn to sing by singing.”
————
Dick’s column and the above Dantzig/Feynman/Thurston remarks can be compactly merged as follows: “Intuition is natural ideas, concretely instantiated, that sing in our minds.”
Every student of every subject, of every age in every land, is familiar with the wonderful feeling of new subjects beginning to “sing in our minds.” For teachers and students alike, cultivating this wonderful feeling is (IMHO) what education is mainly about.
Indeed a great topic and a great post, John. And I like your definition of intuition.
The shadows on the face are easily identifiable. It’s the shadows on numbers that offer an unusual tautology. The bits of theory one did not see as related are then the key to understanding them in a stranger light than the norm. Who has then made the progress talk-tall?
Francis Su’s notes on mathematical understanding and his newfound priorities in mathematical education seem apropos here ( http://www.math.hmc.edu/~su/papers.dir/leitzel.pdf ).
I love that post, at last some “serious” questions. ;-)
Alas, these do not appear to be easy ones.
From the Bombieri anecdote it appears that there might be three distinct capabilities in maths:
- Grinding out proofs while also holding some understanding of the big picture.
- Having correct intuitions about the expected outcome of complex questions.
- Translating these intuitions to actual proofs.
Which means that the representations used in the mind by intuition are nothing like written maths; otherwise someone like Bombieri would not find certain proofs completely hopeless.
From where could the answer about the nature of intuition arise?
Psychology? Neurology? Linguistics?
My bet is on linguistics, not the syntactic side but the semantic side.
Some research at the boundary between natural languages and computer languages appears interesting, though it doesn’t yet approach the problems with the dreaded ontologies, which keep fluctuating and morphing while you are using them.
It should be noted that if this succeeds there will be a mathematical model of intuition, somehow “closing the loop”: language -> maths -> intuition model -> language …
John Lomont, a mathematician I used to play go with, once gave me the following test. It’s important not to think too hard about it, because you can grind out the answer if you spend enough time on it, but that’s not the goal of the question.
Describe the shape of the object formed by the intersection of three long identical cylinders, one running down the x-axis, one along the y-axis, and one along the z-axis.
Stop and think about it.
Now check that answer.
He suggested that some people (topologists, if I remember correctly) have an uncanny ability to answer this immediately and correctly.
I’m sure that there is a battery of such tests you could design for number theory, group theory, logic, topology, etc. for various ages or education levels of questionee.
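For what it’s worth, the object in Lomont’s puzzle is the “tricylinder” Steinmetz solid, whose volume for unit radius is known to be $8(2-\sqrt{2}) \approx 4.686$. A quick Monte Carlo sketch (the sample size and seed are my own choices) confirms the closed form numerically:

```python
import random
from math import sqrt

# Estimate the volume of the intersection of three unit-radius cylinders
# along the x-, y-, and z-axes by sampling points in the cube [-1, 1]^3.
random.seed(0)
trials = 200_000
hits = 0
for _ in range(trials):
    x = random.uniform(-1, 1)
    y = random.uniform(-1, 1)
    z = random.uniform(-1, 1)
    # Inside all three cylinders: each pair of coordinates within the unit disk.
    if x*x + y*y <= 1 and y*y + z*z <= 1 and x*x + z*z <= 1:
        hits += 1

volume = 8.0 * hits / trials      # the cube has volume 8
exact = 8.0 * (2.0 - sqrt(2.0))   # known closed form, about 4.6863
print(volume, exact)
```

Of course this only checks the volume, not the shape; the point of the puzzle is whether you can see the curved faces and edges of the solid in your head before doing any computation.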
There’s a symbiotic relationship between learning and this intuition, though. You can get more and more intuition (in the areas which are in your wheelhouse) by spending more time working in and thinking about the area. Which you are likely to do, because presumably you enjoy thinking about those things.
To go to one of your other points, I think a good example is logical intuition. One logician I know gave me this strongly-worded and useful piece of student advice, “The problem with intuition is that it can be wrong. You need a proof.”
In January of ’99 I was an undergrad who somehow made a trip to what was basically a colloquium lecture series. Someone commented that one of the early speakers had ‘great intuition’ for the problem. One of the postdocs who was there made the point to me that that really meant that he had dealt with the problem so much and from so many different angles [which in light of his presentation is a bad pun] that he had a deep and broad understanding of the problem outside of the rigorous results. That had a huge formative effect on me.
I do not like the word ‘intuition’ in this context. It screams ‘chance’ and ‘luck’ to me. I prefer to think of it as having a well-trained aesthetic.
Very good comment, but I believe in a compromise of both views presented here.
Let me illustrate with something that happens in some sports (e.g. soccer):
A very well trained and very focused athlete will certainly have a good performance. (We may call him a disciplined athlete.) However, some athletes who don’t like to train much (and are not very focused on their career) can still perform superbly in their sport. (We usually say he has “talent”.)
I may relate this discussion somehow to the range of psychological theories. Some tend to believe that mental abilities are innate (Innatism), some believe they are just the result of life experiences (Empiricism). However, most balanced theories recognize that both things occur: there are innate mental “structures” and experience may “shape” them somehow. (I am not a psychologist, so some terms may not be precise.)
To conclude:
I would say that “intuition” is a natural “talent” (just a matter of definition). But certainly experience/knowledge develops the intuition, allowing one to go much further.
Another little nitpick: the series G_i is the derived series (sometimes called the commutator series), not the composition series. A (descending) composition series is a series of subgroups H_i, each normal in the previous one, such that H_i/H_{i+1} is simple, that is, there are no subgroups strictly between H_i and H_{i+1} that are normal in H_i. Otherwise though it is all very well explained.
As for intuition: while mathematicians do develop deeper insights into problems they have put a significant amount of thought into, I think a lot of intuition is a more basic kind of Pavlovian association. For instance, a finite group theorist is primed to look for and react to the words ‘relatively prime’ or something like it, because there are so many results in finite group theory about elements/subgroups of relatively prime order. Of course, most of us only have this trained response to a limited number of stimuli, hence the specialised nature of the intuition.
Timothy Gowers has spoken/written about whether it would be possible to program a computer with mathematical insight and use it as a high-level aid to proof. See for instance the start of this talk:
http://www.dpmms.cam.ac.uk/~wtg10/gafavisions.ps
While I often realize that the statement of the theorem I was trying to prove was wrong, it’s far more useful to realize when my _intuition_ about a problem is wrong — for instance, when an analogy between two mathematical objects will work and when it won’t.
For an example, which I hope can be appreciated even by people unfamiliar with the area: Alex Russell and I, and many other people working in quantum computation, deal a lot with irreducible representations of finite groups. In many ways these act like random unitary matrices; for instance, Schur’s lemma implies that if rho is a d-dimensional irrep and u and v are vectors with norm 1, then
Exp_g |⟨u, rho(g) v⟩|^2 = 1/d ,
where Exp_g is the expectation over all elements of G, the finite group.
This is the same answer you would get if rho(g) were a uniformly random d-dimensional unitary matrix, which rotated this d-dimensional space in a random way. (For experts, if it were uniform in the Haar measure.) But rho(g) turns out to be random “only up to second order”, and if we look at higher-order averages (i.e. more than quadratic in rho(g)) then rho can act in highly non-random ways that depend on the specific structure of the group G, and how tensor products of its representations decompose into irreducibles.
Realizing to what extent rho(g) acts as if it were random, and to what extent it doesn’t, took me a long time. But now I have a clearer picture of when I can apply my intuition about random unitary matrices to the case of finite groups, and when I can’t.
p.s. a typo in your section on prime pairs: “since on of” should be “since one of”, and you mean p=3 if p, p+2, and p+4 are all prime.
p.p.s. is there a generalized version of the Prime Pairs Conjecture that states that any arithmetic pattern of primes appears with the density we expect when “local” obstructions are taken into account?
my inner product got removed. That should read
Exp_g |(inner product of u and rho(g).v)|^2
In answer to your PPS, there is, and it is called the Hardy-Littlewood k-tuples conjecture. It has recently been proved in many cases by Green and Tao (but not for twin primes).
Unfortunately I am out of town and don’t have my copy of Douglas Hofstadter’s book “Gödel, Escher, Bach” with me. I am not able to post a direct quote at this moment.
In the text, Hofstadter shares anecdotal evidence that has a great deal to say about the ontology of the “platonic heavens” – man’s ability to reach into ‘the divine’ to see glimpses of truth. It comes in the form of an experiment done with savants. Savants describe their sensation as a nebulous “right feeling,” and it can be broadly put that savants intuitively “feel” the answer to large calculations. For example, a savant might be asked for the product of a large number x and a large number y. The product n “feels” right.
In the experiment, savants were timed on a series of calculations of growing size. The findings were that savants took longer to “feel” the right answer the larger the problem posed to them. Furthermore, the plot of the time it took them to “feel” the right answer, in general, agreed with known complexity results of the time. This hints that there may be a biological mechanism, some “satisfiability solver” with incredibly tuned heuristics, that exists a layer deeper than conscious access. Of course the mind is much more complicated than this analogy, which only captures some aspects of its effects.
I am very interested to hear about this, as I have sometimes had the fantasy that one could show that P “probably equals” NP by finding a savant who can spot cliques in graphs in a time that scales up polynomially rather than exponentially — or some similar problem.
I would be interested in comments on the role of intuition, not in proof-proving, but in proof-checking.
It is clear that (human) proof-checking cannot be a wholly logical process, in which (for example) a mathematician converts the proof into some internal, rigorous, formal representation. If that were true, proof-checking could be readily automated … which it has not been.
Thus in Don Knuth’s phrasing, proof-checking (in practice) must be partly an Art and not wholly a Science.
So what is the role of intuition in proof-checking? A mathematical book titled The Art of Proof-Checking would perhaps be a very interesting book, both to read and to write.
“Gas me with a spoon,” to a Standard Model expert who is already situated in a world of 19 parameters such as quarks, gluons, and leptons, could at times get problematic, as he might not see the distinction between symmetry and supersymmetry, distinguishing the gas from the spoon. Let alone the situation when he enters the Minimally Supersymmetric Standard Model. Nonetheless, since there are also some kinds of renormalization groups involved with the SM and MSSM too, it might be wise to bring up the problem of finding, in polynomial time, the Galois group, solvable or not, of a given polynomial. At the time of Galois this probably did not make sense. But I have seen an algorithmic book on it after I mentioned it in the presence of a friend some ten years ago or so. I have been out of touch with the advanced subject for some time apparently.
It was most probably totally random that I came up with that algorithmic Galois group book. I just went after anything that had Galois in it, like Galois cohomology, Galois theory, Galois motives, and so on. But I was truly flabbergasted to see that the idea that I mentioned to a couple of math profs at tea time actually had applications. Actually, that was because I was so stupid at Fort Collins to find the particular Galois groups of given polynomials to the extent that I was called disgusting. Or was it that I got stuck at a paper of Hochschild on local Galois theory that he called me so. And I tell you, with the advent of modern algebraic number theory based on profinite groups and Galois cohomology (which can presumably be translated into Hochschild cohomology), the matter becomes hopeless without serious computer hardware supported with high-level software to figure things out for Fermat equations like x^n+y^n=1. Sometimes a savant can figure out a difficult problem that no one else can, but sometimes one does not even know how to use these huge supercomputers to pose a difficult problem. And sometimes one has simply not climbed up the academic ladder step by step to figure out the full scope of the problem of, say, connecting the software to the hardware properly. Thank God I’ve never had the experience of teaching electrodynamics at different undergrad and grad levels, but it must give greater intuition to the instructor than to the straight-A student who successfully reaches the postdoc stages with exemplary lab work. Thank God, but I also go nuts when I still wonder about the missed opportunities. Of course, one could always try politics or the military instead, which make science more glamorous exactly because not everything in life is math and physics. It is better to go to the market and buy what engineers came up with, possibly remotely using the efforts of mathematical physicists.
That way the mathematician can sit home and think about his math, or go to the market to buy what he needs at home for a better mathematical intuition. I never doubted the ethics of someone like Bombieri or Miranda as mathematicians. John would be convinced that I am babbling too much by now. But sometimes one is forced to babble when one is left with no other choices.
Hi.. I’m in the first year of ‘serious’ research. I too have encountered ‘problems’ with intuition. I was working on a problem in Computer Science (related to Data Flow Analysis), and I got stuck. Intuitively my solution seemed fine… but I just couldn’t prove it correct. Later, the solution was proved to be incorrect. But I was convinced that my intuition was fine and hence went on to think about the problem from a different perspective… hurray… and now I have a ‘working’ solution. Intuition helps!
I suspect that we form a large series of models internally as we learn things. We build them up from experience and education. In that sense they can be both overlapping and contradictory. Perhaps intuition is just our ability to utilize some of our rougher internal models to help explain our more complete ones.
When I’m building huge software systems, I generally ‘see’ them, but my internal view is nearly impossible to articulate. I can write out projections of it in many different ways, but my overall picture defies a single syntactic representation (but is extremely useful for making sure all of the pieces are complete and behaving as necessary).
Paul.
Of course intuition is incredibly messy and contradictory yet it captures some similarities, analogies, salient features, correlations (?) relevant to the question at hand.
About the same way that Ayurvedic or Chinese medicines do work, and even sometimes better than rational Western medicine, despite their “theories” being utter nonsense.
This is why mathematicians will have a very hard time finding out how their intuition works; they will recoil in horror at the first glimpse of this reality. :-)
I’ve always sensed that the sum of all of our knowledge is also messy and contradictory. It’s just a large mash of all of the projections of everyone’s individual models in various forms. Incomplete in parts, it is a vague outline of the reality around us. Some people perceive it as being more concrete and complete than this, in the same way that we see the objects around us as being continuous. It’s not until we can leverage a tool like a computer, that we’ll really be able to see the gaps and contradictions clearly.
Paul.
A possible mathematical intuition:
1. Suppose we have a 3-dimensional coordinate system; then we have 8 octants {(000), (001), (010), (011), (100), (101), (110), (111)}.
2. Suppose we have a 3-SAT formula such as (1 ∨ ~2 ∨ 3) ∧ (~3 ∨ 4 ∨ ~5); then by De Morgan’s law we may convert it to its complement form (~1 ∧ 2 ∧ ~3) ∨ (3 ∧ ~4 ∧ 5).
3. Suppose we REJECT the complements of the 3-SAT formula; then we have a program of reversibility (reducibility), such that the remaining assignments are the ACCEPTed ones.
See http://arxiv.org/abs/1009.3687 for more detail.
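The rejection idea in steps 1–3 can be sketched in a few lines of Python (a toy illustration of my own, not the construction from the linked paper): each clause’s complement marks exactly one “octant” of assignments to its three variables, and rejecting every such octant leaves exactly the satisfying assignments.

```python
from itertools import product

# A 3-SAT formula as a list of clauses; a literal is a signed
# variable index: 1 means x1, -2 means "not x2".
# The formula from the comment: (x1 v ~x2 v x3) & (~x3 v x4 v ~x5)
formula = [(1, -2, 3), (-3, 4, -5)]

def rejected_octant(clause):
    """The complement of a clause: the single assignment to its
    three variables (an 'octant' of the cube) that falsifies it."""
    # Literal l is falsified exactly when var |l| gets the value (l < 0).
    return {abs(l): (l < 0) for l in clause}

def satisfying_assignments(formula, n):
    """Enumerate all assignments, REJECTing those that fall into
    some rejected octant; what remains is ACCEPTed."""
    octants = [rejected_octant(c) for c in formula]
    for bits in product([False, True], repeat=n):
        a = dict(enumerate(bits, start=1))
        if not any(all(a[v] == b for v, b in o.items()) for o in octants):
            yield a

sols = list(satisfying_assignments(formula, 5))
print(len(sols))  # 24 of the 32 assignments survive
```

Each clause here kills 4 of the 32 assignments, and the two rejected octants are disjoint (they disagree on x3), so 24 assignments remain.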
Interesting post; I may subscribe to this blog. I remember when I was in college, I asked my math professor what area of math he studied, and he said, “numbers”.
What is Gödel’s lost letter about? From an intuitive point of view:
1. Existence of the law of reciprocity, which he stated in his letter.
2. Existence of iterative set structure, which is his conception of set theory.
3. Existence of bidirectional reducibility, which he did not state in the letter. However, bidirectional reducibility is necessary for P=NP.
The central question is how to apply his iterative conception of set to realize his bidirectional reducibility. An iterative set structure http://arxiv.org/abs/1006.2495 may offer such an intuition.
Gödel’s lost letter, or P≠NP, appears to be much more difficult than, say, his incompleteness theorem, numerous examples of which appear on this fine site of Professor Lipton. One of my own interests is Fermat’s Last Theorem. It is true that the theorem is proved at last, but there seems to be much more to it in physical or geometric terms, in the same spirit in which Alain Connes related noncommutative geometry and number theory via the Riemann Hypothesis. The complexity arises when a set of equations, as in FLT, is singled out and its decidability is questioned. Fortunately FLT was finally solved for the case of integers. It might just be that the Galois theory of these equations over various extension fields of the rational numbers is already a textbook problem and I am too dumb, intuitionistically or algorithmically (via various CS softwares), but I can’t stop thinking about the problem and wishing I could solve it with just a snap of the fingers. I’ll try to go back to the Barrington’s theorem reference in one of the previous posts, as it appeared relevant to the problem on a first skim.
> Proofs can be checked in constant time.
Wait, what?
Isn’t proof-checking linear in the size of the proof?
@Joe
The PCP Theorem.
NP = PCP[O(log n), O(1)].
Perhaps a better way to say it: Proofs can be checked for correctness with negligible probability of error by checking only a constant number of bits.
Thanks for the info, very interesting =)
For most proofs it should not matter in practice whether it’s O(n) instead of O(1).
Also, doesn’t the proof need to be fully formalized?
Has this been done for the four color theorem?
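One intuition for the “negligible probability of error” phrasing in this thread: a single constant-query round of a PCP verifier rejects any false proof with some constant probability, and independent repetitions shrink the acceptance error exponentially while the total number of queried bits stays constant for any fixed target error. A toy calculation (the 1/2 per-round error is an illustrative assumption, not a number from the theorem):

```python
def soundness_error(rounds, per_round_error=0.5):
    """Chance that a false proof survives `rounds` independent
    constant-query checks, assuming each check catches it with
    probability at least 1 - per_round_error."""
    return per_round_error ** rounds

# The error shrinks exponentially in the number of rounds,
# while each round reads only a constant number of bits.
for k in (1, 10, 20):
    print(k, soundness_error(k))
```

So driving the error below, say, one in a million still costs only a constant number of bit queries overall.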
Great post! Thanks, Richard.
Intuition has been a topic of interest for me for the past four years. My interest began when I first read Malcolm Gladwell’s book “Blink”, which highlights the marvels of expert intuition.
My favorite reference on this topic is a lecture by Prof. Daniel Kahneman (Nobel 2002) of Princeton, given at Berkeley. The lecture is titled “Explorations of the mind. Intuition: the marvels and the flaws”. The video is available at http://www.youtube.com/watch?v=dddFfRaBPqg. I have written a blog post on this lecture, “My most favorite YouTube video”: http://cataligninnovation.blogspot.com/2010/09/my-most-favorite-youtube-video-and.html
Prof. Andrew Wiles explains his view of the process in Simon Singh’s “Fermat’s Last Theorem” – “Often you write something down to clarify your thoughts. In particular when you’ve reached a real impasse, when there’s a real problem that you want to overcome, then the routine kind of (mathematical) thinking is of no use to you. Leading up to that kind of new idea there has to be a long period of tremendous focus on the problem without any distraction. You have to think about nothing but that problem – just concentrate on it. Then you stop. Afterwards there seems to be a kind of period of relaxation during which the subconscious appears to take over, and it’s during that time that some new insight comes.”
Vinay
A quotation:
“The only real valuable thing is intuition.”
(Albert Einstein)
I think I know what Doug Hofstadter would say. It’s about GISTs.
I totally disagree with Von Neumann. He must have just been a formula person, a “Bourbakian”.