# Bounds On Binomial Coefficients

*A simple but useful bound on binomial coefficients*

Andreas von Ettingshausen was a German mathematician and physicist who lived in the early part of the 19th century. He studied philosophy and jurisprudence, but later taught and wrote exclusively on mathematics and physics. Something that today, a mere 200 years later, would be impossible.

Today I wish to share a simple inequality with you on binomial coefficients.

This issue arose in a proof I am working on that concerns a new approach to circuit lower bounds. That is still in progress, could easily fail, but I thought this simple result would be of some interest.

The bound on binomial coefficients made me think about the related issue of who created the beautiful notation $\binom{n}{k}$ for them.

The answer—you may already have guessed—is that von Ettingshausen did. Binomial numbers were known long before his work. Pascal’s triangle was known for centuries already in Europe, and had been discovered even earlier by the tenth century in India. Earlier notations included some ugly ones like ${}^{n}C_{k}$.

I especially find terms like ${}^{n}C_{k}$ unpleasing—the superscript just hangs there in space. So thank you, Herr Prof. Dr. von Ettingshausen.

## Binomial Coefficients

Recall that

$$\binom{n}{k} \;=\; \frac{n!}{k!\,(n-k)!}$$

is the number of ways of choosing $k$ things from $n$ objects, where order does not matter. In many proofs it is quite useful to have approximate results for binomials. For example, the central term for $n$ even,

$$\binom{n}{n/2},$$

is well known to be approximately

$$\frac{2^{n}}{\sqrt{\pi n/2}}.$$

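A quick numeric sketch of how good this classical estimate $\binom{n}{n/2} \approx 2^{n}/\sqrt{\pi n/2}$ is (the function name here is mine, for illustration):

```python
from math import comb, pi, sqrt

def central_ratio(n):
    # Ratio of the exact central coefficient C(n, n/2) to the
    # estimate 2^n / sqrt(pi * n / 2); n must be even (and moderate,
    # to keep the float arithmetic in range).
    estimate = 2**n / sqrt(pi * n / 2)
    return comb(n, n // 2) / estimate
```

Already at $n = 10$ the ratio is about $0.975$, and it tends to $1$ as $n$ grows.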
However, I need bounds on binomials where $k$ is much smaller than $n$. In this case it is also well known that

$$\binom{n}{k} \;\approx\; \frac{n^{k}}{k!}.$$

The trouble with this is that I needed control on the exact errors. So I will sketch a proof of a bound that works well when $k$ is small. For $1 \le k \le \sqrt{n}$:

$$\frac{1}{4}\cdot\frac{n^{k}}{k!} \;\le\; \binom{n}{k} \;\le\; \frac{n^{k}}{k!}.$$

## Binomial Inequalities

We need the following simple inequalities for the binomial coefficient, which hold for $1 \le k \le \sqrt{n}$:

$$\frac{1}{4}\cdot\frac{n^{k}}{k!} \;\le\; \binom{n}{k} \;\le\; \frac{n^{k}}{k!}.$$

The upper bound holds for all $k$, and the lower bound requires the limit on the size of $k$. Note that a much weaker lower bound is often given,

$$\left(\frac{n}{k}\right)^{k} \;\le\; \binom{n}{k}.$$

This holds for all $k$, but is exponentially weaker when $k$ is smaller than the square root of $n$. This is the primary reason for this work.
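To see the gap between the common $(n/k)^{k}$ lower bound and the stronger $\frac{1}{4}\,n^{k}/k!$ one, here is a small numeric comparison (a sketch; by Stirling's formula the two lower bounds differ by roughly a factor of $e^{k}$):

```python
from math import comb, factorial

def lower_bounds(n, k):
    # Returns (weak, strong, exact): the classical (n/k)^k lower bound,
    # the (1/4) * n^k / k! lower bound, and the exact binomial coefficient.
    weak = (n / k) ** k
    strong = n**k // factorial(k) // 4   # integer arithmetic avoids float overflow
    return weak, strong, comb(n, k)
```

For instance, with $n = 10{,}000$ and $k = 100 = \sqrt{n}$ the weak bound trails the exact value by dozens of orders of magnitude, while the stronger bound is within a small constant factor.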

Here is the proof. We note that

$$\binom{n}{k} \;=\; \frac{n^{k}}{k!}\,\prod_{i=0}^{k-1}\left(1-\frac{i}{n}\right).$$

The upper bound is immediately clear from this—note that it too is best for small $k$. For the lower bound, consider the function equal to

$$(1-x)^{1/x}.$$

For $x$ in $(0,\tfrac{1}{2}]$ the function $(1-x)^{1/x}$ decreases as $x$ increases, and at $x=\tfrac{1}{2}$ it equals $\tfrac{1}{4}$; hence $1-x \ge 4^{-x}$ throughout this range.

Therefore,

$$\prod_{i=1}^{k-1}\left(1-\frac{i}{n}\right) \;\ge\; \prod_{i=1}^{k-1} 4^{-i/n} \;=\; 4^{-\sum_{i=1}^{k-1} i/n},$$

by our above remark. But we know that for all $k \le \sqrt{n}$,

$$\sum_{i=1}^{k-1}\frac{i}{n} \;=\; \frac{k(k-1)}{2n} \;\le\; \frac{k^{2}}{n} \;\le\; 1.$$

Putting this all together shows that the inequality claimed is true: the product is at least $4^{-1} = \tfrac{1}{4}$. We can improve the constant by limiting $n$ to be large enough, but the above is sufficient for our needs. For an example, take $n=100$ and $k=10$: then $\binom{100}{10} = 17{,}310{,}309{,}456{,}440 \approx 0.63 \cdot \frac{100^{10}}{10!}$, comfortably between the claimed bounds.
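As a sanity check, the inequality $\frac{1}{4}\frac{n^{k}}{k!} \le \binom{n}{k} \le \frac{n^{k}}{k!}$ for $1 \le k \le \sqrt{n}$ can be verified by brute force for small $n$; a quick sketch:

```python
from math import comb, factorial, isqrt

def bound_holds(n):
    # Check (1/4) * n^k / k! <= C(n, k) <= n^k / k! for all 1 <= k <= sqrt(n).
    # The fraction n^k / k! is kept as (num, den) to stay in exact arithmetic.
    for k in range(1, isqrt(n) + 1):
        num, den = n**k, factorial(k)
        if not (num <= 4 * den * comb(n, k) and den * comb(n, k) <= num):
            return False
    return True
```

Near $k = \sqrt{n}$ and large $n$ the ratio $\binom{n}{k}\big/\frac{n^k}{k!}$ sits around $e^{-1/2} \approx 0.61$, so the constant $\tfrac14$ has room to spare.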

## Open Problems

Is this result well known to you all? I found it out of necessity. Even though it is coarse, the fact that it is an inequality and not an asymptotic result is useful in some proofs. Sometimes we just need to have a convenient inequality.

“He studied philosophy and jurisprudence, but later taught and wrote exclusively on mathematics and physics. Something that today, a mere 200 years later, would be impossible.”

I disagree. Ed Witten studied history and linguistics before becoming a top physicist and mathematician.

I did not know that! We thought of mentioning Alexander McCall Smith, who besides his novels and children’s books is Professor of Medical Law (emeritus) at U.Edinburgh and wrote what remains the only digest of Botswana’s criminal code, besides co-founding the country’s opera training center.

Also in the 19th century, Hermann von Helmholtz was both a practicing physician and an eminent mathematician/physicist. It is surprising and regrettable too — perhaps even shocking? — that during the 20th century there were (to the best of my knowledge) no MacTutor-level mathematician/physicians … unlike every previous century since the Islamic Golden Age (of Rhazes, for example). A modest hope: perhaps the 21st century will witness a return to the broadly Enlightened traditions of earlier centuries.

For many French mathematicians, $C_n^k$ is the French notation while $\binom{n}{k}$ is the American one. As you can imagine, some (should I say old?) teachers in France were upset when the new curriculum replaced the French notation with the American one… I think the American notation is now totally accepted, and this is a good thing to my mind since, as you mention, it is much nicer!

Can 1/4 be replaced by 1/e to get a better bound?

Some binomial sum bounds can be surprisingly tricky — I humbly submit Lemma 5 here as an example.

The result amazes me. I always thought that the approximation $\binom{n}{k} \approx \frac{n^{k}}{k!}$ was somewhat crude, since it seems as though you are tossing out so much information.

Thanks much!

I think ${n \choose k} \le \frac{1}{2}\frac{n^k}{k!}$ when $k\le\sqrt{n}$.

Consider a hashtable with $n$ buckets and $k$ items. There are $n^k$ ways to distribute the items, out of which ${n \choose k}k!$ have no collision. The probability of having some collision is $1 - {n \choose k}k!/n^k$. Now suppose $Y$ is a random variable whose value says how many collisions there are. So,

$$P(Y\gt 0)=1-\frac{{n \choose k}}{n^k/k!} \le {\rm E} Y$$

The inequality comes from the first moment principle. The expected number of collisions increases with $k$, so we just look at the case $k=\sqrt{n}$. There are $\sqrt{n}(\sqrt{n}-1)/2$ possible collisions, each happening with probability $1/n$. Thus,

$${\rm E}Y=\frac{\sqrt{n}(\sqrt{n}-1)}{2}\frac{1}{n}=\frac{1}{2}-\frac{1}{2\sqrt{n}}$$

Putting the previous displayed inequalities together, we get

$$1-\frac{{n \choose k}}{n^k/k!}\le\frac{1}{2}-\frac{1}{2\sqrt{n}}$$

and finally

$$\frac{{n \choose k}}{n^k/k!} \ge \frac{1}{2}+\frac{1}{2\sqrt{n}} \gt \frac{1}{2}$$
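This first-moment bound can be spot-checked numerically; a small sketch (taking $k=\lfloor\sqrt{n}\rfloor$, the worst case allowed by the hypothesis — function names are mine):

```python
from math import comb, factorial, isqrt, sqrt

def radu_bound_holds(n):
    # Check C(n,k) / (n^k / k!) >= 1/2 + 1/(2*sqrt(n)) for k = floor(sqrt(n)).
    k = isqrt(n)
    ratio = comb(n, k) * factorial(k) / n**k
    return ratio >= 0.5 + 1 / (2 * sqrt(n))
```

At perfect squares like $n=4$ the bound is tight (the ratio is exactly $3/4$); elsewhere there is slack.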

Trying to get LaTeX to work. I really hope this works, because I don’t know how to remove comments. Sorry.

I fixed the parse errors, though I don’t understand how or why, and also the inequality direction you note below. I can say this since both of me are in my office now.

This looks good to me. I like the use of the moment method.


Oh, dear: The very first inequality goes the wrong way. I meant that ${n \choose k} \ge \frac{1}{2}\frac{n^k}{k!}$ when $k\le\sqrt{n}$. (The rest seems OK, apart from the formulas that aren’t parsed and appear just in the LaTeX version.)

We can also use the simple lower bound $\prod_{i=1}^{k-1} \left(1-\frac{i}{n}\right)\ge e^{-k^2/n}$, though it doesn’t give quite as good a bound as Radu Grigore’s.
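This product bound can be spot-checked as well; a quick sketch over the range $k \le \sqrt{n}$, where each factor stays above $\tfrac12$:

```python
from math import exp, isqrt

def product_bound_holds(n):
    # Check prod_{i=1}^{k-1} (1 - i/n) >= exp(-k^2 / n) for 1 <= k <= sqrt(n).
    for k in range(1, isqrt(n) + 1):
        prod = 1.0
        for i in range(1, k):
            prod *= 1 - i / n
        if prod < exp(-k**2 / n):
            return False
    return True
```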
