Charles Peirce, who is regarded as among the greatest logicians and philosophers, wrote a paper in 1897 titled “The Logic of Relatives.” Now we might want a guide to the logic of our own relatives, and in fact the word “brother” furnished Peirce’s first example. But what Peirce meant by “relative” is a term that attains its meaning only in conjunction with another noun as object, plus a word or other sign giving the relation. Examples, the first two his, are “brother of Napoleon,” “slayer of giants,” and “among the greatest logicians and philosophers.” The last has a wordless sign after “among” where the others have “of,” but then the rest may seem to make it a four-way relation. However, Peirce gave a general calculus of how all four and higher-way relations in semiotics could be decomposed into twos and threes. Such “relatives” and some other three-way relations, however, he proved to be irreducible within his system.

Today we consider an open problem about decomposing 3-ary finite automata.

Peirce’s name is usually written with at least the ‘S.’ of his middle name. He was born Charles Sanders Peirce, but late in life he inserted “Santiago” before or in place of his middle name. Meaning “St. James” in Spanish, this was said to honor his friend the philosopher William James, whose son Peirce designated as second heir after his own wife Juliette. However, earlier usages may have honored the European origins of Juliette herself. His taking up with Juliette before his divorce from his first wife became final was made a scandal that cost Peirce the one academic position he ever had, at Johns Hopkins University, in 1884.

I was prompted to write this by a response from Jon Awbrey, who frequents our pages and writes the blog “Inquiry Into Inquiry.” In Boolean logic every ${N}$-way function can be decomposed as a circuit of binary functions, indeed composed entirely of the binary NOR function, as Peirce himself discovered. Awbrey termed it a “trap” to infer that this carries over to logical semantics. The rest of this post isn’t related to Peirce, but we offer it toward the general question: when do threes decompose into twos?

## Multi-tape Finite Automata

The familiar idea of a finite automaton ${M}$ can be generalized to any number ${k}$ of input tapes. If each input head must advance one cell on each move, then the input strings may as well be padded to the same length ${n}$ with an extra symbol ${\$}$ added to the alphabet ${\Sigma}$. Then they can be shuffled into one string of length ${n}$ over the alphabet ${\Sigma' = \Sigma^k}$, whereupon ${M}$ becomes equivalent to the resulting ordinary one-tape finite automaton ${M'}$.

Things are more interesting when one or more heads are allowed to stay stationary on each move. For example, the string-matching relation can then be recognized by a nondeterministic 2-tape finite automaton (2-NFA). This relation consists of all pairs ${(x,y)}$ such that ${y}$ can be written as ${y = uxv}$ for some ${u,v \in \Sigma^*}$. The 2-NFA pauses its first tape head and advances the second until it guesses where ${x}$ may start in the input ${y}$, and accepts if and when it matches all characters of ${x}$ on consecutive steps. It is easy to show that no 2-DFA can recognize this relation. Thus the equality “NFA = DFA” does not carry over to multiple tapes when pausing is allowed.

A three-way relation that is deterministically recognizable is the concatenation relation ${C(x,y,z) = (xy = z)}$. With ${x}$ and ${y}$ on two separate tapes, a 3-DFA can detect where ${x}$ stops upon reading the (implicit or explicit) terminal ${\$}$ on that tape, and then begin matching the rest of the third tape against ${y}$. In an early paper, I showed that there is a bijection ${f: \{0,1\}^* \times \{0,1\}^* \longrightarrow \{0,1\}^*}$ whose graph is similarly recognizable as a three-way relation. Indeed I gave a two-tape finite transducer to compute it. This showed a pairing function that is linear-time computable and invertible, whereas previous examples used integer multiplication, for which linear time is unknown.
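The lockstep reduction to a one-tape automaton can be sketched in a few lines of Python (the names `shuffle` and `PAD` are my own, not from the post):

```python
# Sketch of the "shuffle" reduction: pad the k input strings to a common
# length with a fresh pad symbol '$', then zip them into one string over
# the product alphabet Sigma^k, so that an ordinary one-tape automaton
# can read all k tapes in lockstep, one k-tuple of symbols per move.

PAD = '$'

def shuffle(*tapes):
    n = max(len(t) for t in tapes)
    padded = [t.ljust(n, PAD) for t in tapes]
    return list(zip(*padded))   # one symbol of Sigma^k per position

# Example: shuffle("ab", "xyz") -> [('a','x'), ('b','y'), ('$','z')]
```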

## Multi-tape Finite Transducers and the Problem

The definition of a ${k}$-tape finite-state transducer (${k}$-FST) is straightforward: Each transition has the form

$\displaystyle (p,(c_1,\dots,c_k),(D_1,\dots,D_k),y,q)$

where ${p}$ and ${q}$ are states, ${c_1,\dots,c_k \in \Sigma}$ are input characters or the end marker ${\$}$, each ${D_i}$ is ‘S’ for “stay” or ‘R’ for move one cell right, and ${y \in \Sigma^*}$ is appended to the output. To make the computation straightforward, we can stipulate that at least one ${D_i}$ in each instruction is an R, and that the machine goes to a designated halting state only when each ${c_i = \$}$. Then on inputs ${x = (x_1,\dots,x_k)}$ where ${\max_i |x_i| = n}$, the machine runs for at most ${k(n+1)}$ steps.
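As a sanity check on the definition, here is a hypothetical minimal simulator (my own naming and encoding, not from the post) for a deterministic ${k}$-FST under these conventions, together with a 2-FST for binary concatenation:

```python
END = '$'   # end marker read once a head passes the last cell

def run_kfst(transitions, start, halt, tapes):
    """Run a deterministic k-FST: each step reads one symbol per tape,
    looks up (moves, output, next state), appends the output, and
    advances each head whose move is 'R' one cell to the right."""
    heads = [0] * len(tapes)
    state, out = start, []
    while state != halt:
        chars = tuple(t[h] if h < len(t) else END
                      for t, h in zip(tapes, heads))
        moves, y, state = transitions[(state, chars)]
        out.append(y)
        heads = [h + (m == 'R') for h, m in zip(heads, moves)]
    return ''.join(out)

# Usage sketch: a one-state 2-FST computing c2(x, y) = xy over {'a','b'}.
SIGMA = 'ab'
T = {}
for a in SIGMA + END:
    for b in SIGMA + END:
        if a != END:
            T[('p', (a, b))] = (('R', 'S'), a, 'p')    # copy x, pause y
        elif b != END:
            T[('p', (a, b))] = (('S', 'R'), b, 'p')    # then copy y
        else:
            T[('p', (a, b))] = (('R', 'R'), '', 'h')   # halt on ($, $)
```

Note that every instruction moves at least one head right, so the run length is bounded as stated above.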

An example of a function computed by a 3-FST is the ternary concatenation function ${c_3(x,y,z) = xyz}$. The latter two tapes pause while ${x}$ is copied to the output, then ${y}$, then ${z}$. This function, however, can be written as the composition

$\displaystyle c_3(x,y,z) = c_2(c_2(x,y),z)$

of the usual binary concatenation function. This leads straight into our question:

Can every 3-FST function be computed by a finite composition of 2-FSTs?

We can think of other problems involving relations and functions computed by binary and ternary finite automata that seem to have ready answers. One is whether every 3-DFA relation can be written as a Boolean combination of 2-DFA relations on its three pairs of inputs; consider the relation ${z = x \oplus y}$, where ${\oplus}$ is bitwise exclusive-or. The answer for the function question has, however, eluded me. I suspect it is no, but I have not even found a “killer” candidate function for being 3-FST only.
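A small computation (my own, suggesting why the exclusive-or relation resists such a decomposition) shows that for a fixed length ${n}$, each of its three pairwise projections is the full set of pairs of length-${n}$ strings, so the projections by themselves carry no information about the relation:

```python
# For { (x, y, z) : z = x XOR y } over length-n bit strings, check that
# each of the three pairwise projections is the full square of strings.
from itertools import product

def xor(x, y):
    return ''.join('1' if a != b else '0' for a, b in zip(x, y))

n = 3
strings = [''.join(bits) for bits in product('01', repeat=n)]
triples = {(x, y, xor(x, y)) for x in strings for y in strings}

proj_xy = {(x, y) for x, y, z in triples}
proj_xz = {(x, z) for x, y, z in triples}
proj_yz = {(y, z) for x, y, z in triples}
full = {(a, b) for a in strings for b in strings}
# Each projection equals `full`, although the relation holds for only
# 2^(2n) of the 2^(3n) possible triples.
```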

## Open Problems

What about other cases of ${k}$-way into ${2}$-way decomposition?

1. June 11, 2013 10:34 am

Re: “When do threes decompose into twos?”

One of the slogans emblazoned in my brain by freshman algebra was, Binary Operations Are Ternary Relations.

For example, 2-ary operations like Peirce’s amphecks count as 3-ary relations.

Long story short, when it comes to counting coup in twos and threes, functions are apples and relations are oranges, and we need to be sure of comparing like to like in weighing the ways these fruits decompose.

2. June 11, 2013 12:34 pm

Speaking of FSM transducers, here’s a brief writeup of the construction/use of one for the Collatz conjecture. Also looking for volunteers for a proposed Polymath problem.

3. June 12, 2013 1:15 am

Nice question. At the end, k-way and 2-way should read k-tape and 2-tape, right?

4. June 12, 2013 4:06 am

Here is a killer candidate: The first and second tapes each contain a length-n sequence over some alphabet, while the third contains a length-n sequence of 1's and 2's. Whenever an i is read on the third tape, copy the current symbol of the i-th tape to the output tape and advance on the i-th tape (doing nothing on the (3-i)-th tape).

• June 12, 2013 4:08 am

Well, pity there is no delete comment here, this is pretty simple to realize as the combination of 2-FST’s…

5. June 12, 2013 4:30 am

Okay, now this is really a killer: The symbols of our alphabet are 1, 2, 3. We always have an “actual” tape, initially the first. In each step we write the current symbol of the actual tape on the output tape, advance on that tape, and make the tape named by the symbol just read the new actual tape. We proceed until we reach the end of a tape.
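A sketch of this function as I read it (assuming, as in the previous candidate, that the head of the actual tape advances after each symbol is read):

```python
def actual_tape(t1, t2, t3):
    """Three tapes over alphabet {1,2,3}; tape 1 starts as the "actual"
    tape. Repeatedly output the current symbol of the actual tape,
    advance that tape's head, and let the symbol just read name the
    next actual tape. Stop when the actual tape is exhausted."""
    tapes = [t1, t2, t3]
    heads = [0, 0, 0]
    actual, out = 0, []
    while heads[actual] < len(tapes[actual]):
        sym = tapes[actual][heads[actual]]
        out.append(sym)
        heads[actual] += 1
        actual = int(sym) - 1       # symbol read names the next tape
    return ''.join(out)
```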

6. June 12, 2013 2:00 pm

It may help to clarify the relationship between logical relatives and mathematical relations. The word relative as used in logic is short for relative term — as such it refers to a piece of language that is used to denote a formal object. So what kind of object is that? The way things work in mathematics, we are free to make up a formal object that corresponds directly to the term, so long as we can form a consistent theory of it, but in our case it is probably easier to relate the relative term to the kinds of relations we already know and love in mathematics and database theory.

In those contexts a relation is just a set of ordered tuples and, if you are a fan of strong typing like I am, such a set is always set in a specific setting, namely, it’s a subset of a specific Cartesian product.

Peirce wrote ${k}$-tuples ${(x_1, x_2, \ldots, x_{k-1}, x_k)}$ in the form ${x_1 \colon\! x_2 \colon\! \ldots \colon\! x_{k-1} \colon\! x_k}$ and referred to them as elementary ${k}$-adic relatives. He expressed a set of ${k}$-tuples as a “logical sum” or “logical aggregate”, what we would call a logical disjunction of these elementary relatives, and he regarded them as being arranged in the form of ${k}$-dimensional arrays.

Time for some concrete examples, which I will give in the next comment …

• June 13, 2013 12:16 am

Meanwhile, was my use of “sign” in the first paragraph OK? It seems to signify both the (effect of the) 3-way relation and (in terms by other semioticians) an element.

7. June 13, 2013 1:01 am

A couple of questions about the open problem:

Can we do any of the following: reuse inputs? reuse the same pairs of inputs? copy outputs? As a concrete example, if we want to recreate f(x,y,z) can we do stuff like: g_5(g_2(g_1(x, y), z), g_4(g_3(x, y), z))?

What are the restrictions on the alphabet? I assume the 2-FSTs have to work over the same alphabet as the 3-FST?

• June 15, 2013 6:38 pm

That’s a very good point, and I had to remember it myself. Yes, you are allowed to re-use inputs. The 2-FSTs are also allowed to use their own richer alphabets.

8. June 13, 2013 10:50 pm

Table 1 shows the first few ordered pairs in the relation on positive integers that corresponds to the relative term, “divisor of”. Thus, the ordered pair ${i \text{:} j}$ appears in the relation if and only if ${i|j}.$

$\begin{tabular}{|c||*{11}{c}|}
\multicolumn{12}{c}{Table 1. Elementary Relatives for the ``Divisor Of'' Relation} \\[4pt]
\hline
$i|j$&1&2&3&4&5&6&7&8&9&10&\ldots \\
\hline\hline
1&1:1&1:2&1:3&1:4&1:5&1:6&1:7&1:8&1:9&1:10&\dots \\
2&&2:2&&2:4&&2:6&&2:8&&2:10&\dots \\
3&&&3:3&&&3:6&&&3:9&&\dots \\
4&&&&4:4&&&&4:8&&&\dots \\
5&&&&&5:5&&&&&5:10&\dots \\
6&&&&&&6:6&&&&&\dots \\
7&&&&&&&7:7&&&&\dots \\
8&&&&&&&&8:8&&&\dots \\
9&&&&&&&&&9:9&&\dots \\
10&&&&&&&&&&10:10&\dots \\
\ldots&\ldots&\ldots&\ldots&\ldots&\ldots&\ldots&\ldots&\ldots&\ldots&\ldots&\ldots \\
\hline
\end{tabular}$

Table 2 shows the same information in the form of a logical matrix. This has a coefficient of ${1}$ in row ${i}$ and column ${j}$ when ${i|j}$, otherwise it has a coefficient of ${0}.$ (The zero entries have been omitted here for ease of reading.)

$\begin{tabular}{|c||*{11}{c}|}
\multicolumn{12}{c}{Table 2. Logical Matrix for the ``Divisor Of'' Relation} \\[4pt]
\hline
$i|j$&1&2&3&4&5&6&7&8&9&10&\ldots \\
\hline\hline
1&1&1&1&1&1&1&1&1&1&1&\dots \\
2& &1& &1& &1& &1& &1&\dots \\
3& & &1& & &1& & &1& &\dots \\
4& & & &1& & & &1& & &\dots \\
5& & & & &1& & & & &1&\dots \\
6& & & & & &1& & & & &\dots \\
7& & & & & & &1& & & &\dots \\
8& & & & & & & &1& & &\dots \\
9& & & & & & & & &1& &\dots \\
10&& & & & & & & & &1&\dots \\
\ldots&\ldots&\ldots&\ldots&\ldots&\ldots&\ldots&\ldots&\ldots&\ldots&\ldots&\ldots \\
\hline
\end{tabular}$

In much the same way that matrices in linear algebra represent linear transformations, these logical arrays and matrices represent logical transformations.
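For what it is worth, the logical matrix of Table 2 can be reconstructed in a couple of lines of Python (a sketch of mine, not part of the comment):

```python
# Logical matrix for "divisor of" on {1,...,10}: entry 1 in row i,
# column j exactly when i divides j, otherwise 0.
N = 10
M = [[1 if j % i == 0 else 0 for j in range(1, N + 1)]
     for i in range(1, N + 1)]
# Row 1 (i = 1) is all ones; row 7 (i = 7) has a single 1 in column 7.
```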

To be continued …

9. June 18, 2013 2:07 pm

Arrays like the ones sampled above supply a way to understand the difference between a relation and its associated relative terms. To make a long story short, we could say that a relative term is a relation plus an indication of its intended application. But explaining what that means will naturally take a longer story.

In his first paper on the “Logic of Relatives” (1870) Peirce treated sets — in other words, the extensions of concepts — as logical aggregates or logical sums. He wrote a plus sign with a comma ${(+\!\!,)}$ to indicate an operation of logical addition, abstractly equivalent to inclusive “or”, that was used to form these sums.

For example, letting ${t}$ be the concept of the first ten positive integers, it can be expressed as the following logical sum.

${t ~=~ 1 ~+\!\!,~ 2 ~+\!\!,~ 3 ~+\!\!,~ 4 ~+\!\!,~ 5 ~+\!\!,~ 6 ~+\!\!,~ 7 ~+\!\!,~ 8 ~+\!\!,~ 9 ~+\!\!,~ 10}$

Relations, as sets of tuples, can also be expressed as logical sums.

For example, letting ${\mathfrak{D}}$ be the divisibility relation on positive integers, it is possible to think of ${\mathfrak{D}}$ as a logical sum that begins in the following way.

${\mathfrak{D} ~=~ 1\text{:}1 ~+\!\!,~ 2\text{:}2 ~+\!\!,~ 2\text{:}1 ~+\!\!,~ 3\text{:}3 ~+\!\!,~ 3\text{:}1 ~+\!\!,~ 4\text{:}4 ~+\!\!,~ 4\text{:}2 ~+\!\!,~ 4\text{:}1 ~+\!\!,~ 5\text{:}5 ~+\!\!,~ 5\text{:}1 ~+\!\!,~ \ldots}$

It should be apparent that this is only a form of expression, not a definition, since it takes a prior concept of divisibility to say what ordered pairs appear in the sum, but it’s a reformulation that has its uses, nonetheless.

To be continued …

• June 18, 2013 2:40 pm

Arrgh, I somehow reversed the terms of $\mathfrak{D}$ … it clearly should be this:

${\mathfrak{D} ~=~ 1\text{:}1 ~+\!\!,~ 1\text{:}2 ~+\!\!,~ 2\text{:}2 ~+\!\!,~ 1\text{:}3 ~+\!\!,~ 3\text{:}3 ~+\!\!,~ 1\text{:}4 ~+\!\!,~ 2\text{:}4 ~+\!\!,~ 4\text{:}4 ~+\!\!,~ 1\text{:}5 ~+\!\!,~ 5\text{:}5 ~+\!\!,~ \ldots}$

10. June 30, 2013 12:14 pm

I’ve been meaning to get back to this, but the way things are going it may be August or September before I do. For the time being here’s a link to my commentary in progress on Peirce’s 1870 Logic of Relatives.

To my way of thinking this is one of the most amazing papers in the history of mathematical logic. Many of its deepest and most far-reaching ideas are so far ahead of their time that we have yet to see their full potential realized even today.

11. July 11, 2013 12:48 pm

Define operations on the elementary relatives that obey the following rules:

${(i : j) ~ j = i},$

${i ~ (i : j) = j},$

$(i : j) (k : \ell) = \left\{\begin{array}{cl} (i : \ell) & \text{if}~ j = k, \\ 0 & \text{if}~ j \neq k. \end{array}\right.$

Extending these rules in the usual distributive fashion to sums of elementary monadic and dyadic relatives allows us to define relative multiplications of the following forms:

$\mathit{a}\mathit{b},$ where $\mathit{a}$ and $\mathit{b}$ are 2-adic relatives,

$\mathit{a}\mathrm{b},$ where $\mathit{a}$ is 2-adic and $\mathrm{b}$ is 1-adic,

$\mathrm{a}\mathit{b},$ where $\mathrm{a}$ is 1-adic and $\mathit{b}$ is 2-adic.

For example, expressed in terms of coefficients, the relative product $\mathit{a}\mathit{b}$ of 2-adic relatives $\mathit{a}$ and $\mathit{b}$ is given by the following formula:

$\displaystyle (\mathit{a}\mathit{b})_{ij} = \sum_{m}^{,} \mathit{a}_{im}\mathit{b}_{mj}.$

This will of course remind everyone of the formula for multiplying matrices in linear algebra, but I have affixed a comma atop the summation symbol to remind us that the logical sum is the inclusive disjunction $(\lor)$ — that Peirce wrote as $(+\!\!,)$ — and not the exclusive disjunction that corresponds to the linear algebraic sum $(+).$
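In code terms (a sketch of mine, not Peirce's notation), the coefficient formula is exactly Boolean matrix multiplication, with AND for the products and inclusive OR for the sum:

```python
def bool_matmul(a, b):
    """Relative product of two logical matrices: entry (i, j) is the
    inclusive OR over m of (a[i][m] AND b[m][j])."""
    return [[int(any(a[i][m] and b[m][j] for m in range(len(b))))
             for j in range(len(b[0]))]
            for i in range(len(a))]

# Usage: divisibility on {1,...,6} composed with itself is itself,
# since a "divisor of a divisor" is again a divisor.
D = [[int(j % i == 0) for j in range(1, 7)] for i in range(1, 7)]
```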

To be continued …

12. October 21, 2013 10:52 pm

Here’s a bit on Triadic Relations, including examples of sign relations.

• October 24, 2013 3:14 pm

As far as relational reducibility goes, there are two concepts of reducibility that are commonly used.

1. Compositional reducibility, or reducibility under relational composition.
2. Projective reducibility, or reducibility under relational projections.

All triadic relations are irreducible under composition, since composing any mixture of monadic and dyadic relations can give at most dyadic relations as a result.

With respect to projective reducibility — a weaker form of reducibility since it makes use of specialized triadic relations to effect the reduction — some triadic relations are reducible to their dyadic projections and some are not.
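The projective notion can be made concrete in a few lines (my own sketch, using small finite relations, not from the comment): a triadic relation is projectively reducible when it can be recovered from its three dyadic projections.

```python
from itertools import product

def projections(R):
    """The three dyadic projections of a triadic relation R."""
    return ({(x, y) for x, y, z in R},
            {(x, z) for x, y, z in R},
            {(y, z) for x, y, z in R})

def rebuild(pxy, pxz, pyz, domain):
    """Largest triadic relation consistent with the given projections."""
    return {(x, y, z) for x, y, z in product(domain, repeat=3)
            if (x, y) in pxy and (x, z) in pxz and (y, z) in pyz}

# Exclusive-or on bits is projectively irreducible: its projections are
# each all of {0,1}^2, so rebuilding from them yields all eight triples
# rather than the four in the relation.
XOR = {(x, y, x ^ y) for x in (0, 1) for y in (0, 1)}
```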

There’s a bit of exposition on that score at PlanetMath:

13. October 29, 2013 2:54 pm

An article I started on Peirce’s theory of signs via triadic sign relations is still something of a work in progress, but I did just improve the formatting a bunch, so maybe it will fit in here.

• November 1, 2013 10:30 pm

I just added a section on Semiotic Equivalence Relations, which is a crucial concept for computational applications (not to mention most other applications).