A finite version of the Axiom of Choice

Thomas Jech is a set theorist and logician who, among many other things, wrote a classic book on The Axiom of Choice (AC). I strongly recommend this book for its wonderfully lucid explanation of many aspects of AC, including Kurt Gödel’s and Paul Cohen’s results on the independence of AC from ZF set theory.

Today I want to discuss an old result about the famous Axiom of Choice. The result concerns a finite version of AC, and may be related to some of our problems in complexity theory. In any event it is a quite neat result, in my opinion.

At the recent Association for Symbolic Logic meeting, hosted by George Washington University in Washington, DC, a group of us went to dinner after our session on complexity theory. As usual we discussed many things during dinner: who is hiring, who is moving, P=NP, but somehow the conversation hit upon AC. I told the group that I vaguely recalled a result on AC for finite sets; no one seemed to be aware of such a result. I was a bit embarrassed, since I could not recall the exact statement of the theorem, only that it concerned choice on finite sets. We then moved on to other topics.

This is my attempt to make up for forgetting these pretty theorems about the AC for finite sets.

Finite Choice

“The Axiom of Choice is necessary to select a set from an infinite number of socks, but not an infinite number of shoes.” — Bertrand Russell

Recall that AC is all about choice functions. Since we are computer scientists, we will only consider choices among finite objects. Consider a family ${\cal F}$ of sets, each of cardinality exactly ${n}$. The statement ${C_{n}}$ says that every such family has a choice function: more precisely, there is a function ${s}$ from sets ${A \in {\cal F}}$ to elements so that $\displaystyle s(A) \in A$

for every set ${A}$. Jech gives several theorems, but the most interesting ones solve the following problem: when does ${C_{m}}$ imply ${C_{n}}$? Here are two key examples:

Theorem: The statement ${C_{2}}$ implies ${C_{4}}$.

Theorem: The statement ${C_{2}}$ does not imply ${C_{3}}$.

What I find interesting is the non-monotone nature: the relationship from ${C_{m}}$ to ${C_{n}}$ is not simple.
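Before the proof, it may help to make the object under discussion concrete: a choice function for a family is just a map ${s}$ with ${s(A) \in A}$ for every ${A}$ in the family. Here is a minimal Python sketch of the definition (the names are mine, for illustration only):

```python
def is_choice_function(family, s):
    """True iff s picks a member from every set in the family."""
    return all(s(A) in A for A in family)

# Russell's "shoes": when elements carry an order, min is a uniform
# choice rule and no axiom is needed.
family = [frozenset({1, 2}), frozenset({5, 9}), frozenset({3, 7})]
print(is_choice_function(family, min))
```

Of course, any concretely represented finite family admits such a rule; the force of ${C_{n}}$ is that a choice function exists for arbitrary families with no given ordering.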

Let me prove the first, ${C_{2}}$ implies ${C_{4}}$. In the spirit of Russell this implication says

The ability to choose from “socks” implies the ability to choose from “four dining table chairs”.

I will follow Jech’s proof almost exactly. It is important to understand what we can do with sets:

1. Given a finite set ${A}$ we can tell the cardinality of the set.
2. Given sets ${A}$ and ${B}$ we can form the set difference ${A-B}$.
3. Given any set ${A = \{a,b\}}$ of two elements, there is a universal function ${g}$ that is a choice function: $\displaystyle g(A) = x$

where ${x=a}$ or ${x=b}$.

The last property follows from the assumption ${C_{2}}$.

We now assume we have a family ${\cal F}$ of four-element sets, and we are looking for a choice function for ${\cal F}$. We will construct it using the above properties; the existence of ${g}$ is critical.

Let ${A}$ be in the family ${\cal F}$. There are six two-element subsets of ${A}$. For every ${a \in A}$, let $\displaystyle q(a) = \mbox{ Number of pairs } \{a,b\} \subset A \mbox{ such that } g(\{a,b\}) = a.$

Observe that ${q(a)}$ cannot be the same for all ${a \in A}$: each of the six pairs contributes exactly one to the sum ${\sum_{a \in A} q(a)}$, so the sum is ${6}$, which is not a multiple of ${4}$. Let ${u}$ be the least positive value of ${q(a)}$ over all ${a \in A}$. Then, define ${B}$ as $\displaystyle B = \{ a \mid q(a) = u \}.$

Thus, ${B}$ must have ${1}$, ${2}$, or ${3}$ elements: it is nonempty, since some ${q(a)}$ is positive, and it cannot be all of ${A}$, since the values ${q(a)}$ are not all equal.

1. Suppose ${B}$ has one element. Then use this element as the choice for the set ${A}$.
2. Suppose ${B}$ has two elements. Then use ${g}$ to choose among the two elements and this is the choice for the set ${A}$.
3. Suppose ${B}$ has three elements. Then use the unique ${a \in A-B}$ as the choice for the set ${A}$.
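The three cases above assemble into a small program. Here is a sketch of Jech’s construction in Python, where the pair-chooser g stands in for the function granted by ${C_{2}}$, a black box about which we only know that g(S) is in S:

```python
import itertools

def choose_from_four(A, g):
    """Jech's C2 => C4 construction: select an element of a 4-element
    set A, using only a pair-chooser g with g(S) in S for every
    2-element frozenset S."""
    assert len(A) == 4
    # q(a) = number of pairs {a, b} inside A on which g picks a
    q = {a: 0 for a in A}
    for pair in itertools.combinations(A, 2):
        q[g(frozenset(pair))] += 1
    u = min(v for v in q.values() if v > 0)  # least positive value of q
    B = {a for a in A if q[a] == u}
    if len(B) == 1:                  # case 1: B is a singleton
        return next(iter(B))
    if len(B) == 2:                  # case 2: let g choose inside B
        return g(frozenset(B))
    return next(iter(set(A) - B))    # case 3: |B| = 3, take the leftover
```

Feeding in an arbitrary consistent g, the returned element always lies in ${A}$, exactly as the case analysis guarantees.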

Pretty neat argument—no?

The General Theorem

In order to state the general relationship we need what Jech calls condition ${S}$: say that ${(m,n)}$ satisfies condition ${S}$ provided there is no decomposition of ${n}$ into a sum of primes, $\displaystyle n = p_{1} + \ldots + p_{s}$

such that ${p_{i} > m}$ for all ${i=1,\dots,s}$.

Theorem: If ${(m,n)}$ satisfies condition ${S}$ and ${C_{k}}$ holds for every ${k \le m}$, then ${C_{n}}$ holds.

See his book for the details of the proof, which is an induction argument using some simple properties of binomial coefficients. Note that ${(2,4)}$ does satisfy condition ${S}$, since ${4}$ cannot be written as a sum of primes each greater than ${2}$.
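Condition ${S}$ is easy to test by machine. A small sketch (my code, not Jech’s), under the natural reading that a prime may be repeated in the decomposition:

```python
def is_prime(k):
    return k >= 2 and all(k % d for d in range(2, int(k ** 0.5) + 1))

def satisfies_S(m, n):
    """True iff (m, n) satisfies condition S: n has NO decomposition
    n = p_1 + ... + p_s into primes with every p_i > m
    (repetitions allowed)."""
    primes = [p for p in range(m + 1, n + 1) if is_prime(p)]
    # reachable[s] <=> s is a sum of primes greater than m
    reachable = [True] + [False] * n
    for p in primes:
        for s in range(p, n + 1):
            if reachable[s - p]:
                reachable[s] = True
    return not reachable[n]
```

With this reading, `satisfies_S(2, 4)` is true while `satisfies_S(2, 3)` is false, matching the two theorems above.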

He also shows that condition ${S}$ completely characterizes when ${C_{m}}$ implies ${C_{n}}$. The technically harder direction is showing that ${C_{m}}$ does not imply ${C_{n}}$, since this requires the machinery of set-theoretic independence proofs. If you are interested, please check his book for the details. It allows him to prove:

Theorem: If ${(m,n)}$ does not satisfy condition ${S}$, then there is a model of set theory in which ${C_{k}}$ holds for every ${k \le m}$ but ${C_{n}}$ fails.

Open Problems

Can we use this finite AC in any part of complexity theory? Does the “trick” of proving ${C_{2}}$ implies ${C_{4}}$ have any application in any part of theory?

1. April 12, 2010 10:11 am

Neat argument indeed! Though it is just barely related, this seems an opportunity to mention the delightful paper Division by three by Doyle and Conway. “In this paper we show that it is possible to divide by three.”

2. April 12, 2010 12:12 pm

This is tangential to the main content of your post but I hope it is interesting. Martin Escardo has been doing some intriguing work on searching infinite sets in finite time. Programs that implement such search are available on his page.

A hands-on demonstration is given here. There is a LICS paper that provides more details.

• April 12, 2010 12:16 pm

Whoops! The demonstration link is here.

3. April 12, 2010 12:58 pm

Great post. You are really raising two interesting questions. The first is whether AC has a direct application (I suppose there might be a meta-result in which choice is essential); the second is whether the methods of independence proofs can be applied. I think the answer is almost certainly yes. In fact our early independence results used some related methods, although I think Paul Young eventually showed they were not needed.

4. April 13, 2010 9:24 am

I heard about this type of problem once. I managed to figure out that C2 implies C4, but couldn’t prove any other nice relation. It’s nice to know about the last result you mention (and to have the reference, too!).

For what it’s worth, here’s my seemingly ungeneralizable argument for C2 implies C4:

Each four element set can be expressed as the disjoint union of two two element sets in three different ways. Using a choice function for two element sets you can select, for each of those three ways, one of the two element sets. Think of the four element set as the complete graph on four vertices; we are choosing one edge out of each of the three pairs of disjoint edges. Three such edges either form a triangle or a K1,3. Define the choice function on the four element set as follows: if the edges form a triangle, pick the fourth vertex; if they form a K1,3, pick the degree 3 vertex.

I’m biased of course, but I like my argument slightly better than Jech’s. 🙂
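[The commenter’s construction can also be written out in code. This transcription is mine; here ${g}$ is again the ${C_{2}}$ pair-chooser, now applied to two-element sets of edges, since each perfect matching of ${K_4}$ is such a set.]

```python
from collections import Counter

def choose_from_four_via_k4(A, g):
    """The commenter's C2 => C4 argument: view the 4-element set A as
    the vertices of K4.  Each of the three perfect matchings is a
    2-element set of disjoint edges, so the pair-chooser g (g(S) in S)
    selects one edge per matching.  Edges from distinct matchings share
    a vertex, so the three selected edges form a triangle or a star."""
    assert len(A) == 4
    x, *rest = A  # any enumeration; the set of matchings below is canonical
    matchings = [frozenset({frozenset({x, y}), frozenset(A - {x, y})})
                 for y in rest]
    edges = [g(M) for M in matchings]          # one edge per matching
    degree = Counter(v for e in edges for v in e)
    if 3 in degree.values():                   # star K_{1,3}: pick its center
        return next(v for v, d in degree.items() if d == 3)
    return next(iter(A - set(degree)))         # triangle: pick the 4th vertex
```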

• April 13, 2010 10:58 am

Very nice argument. I like it very much.
