Triads and Dyads
Not dryads and naiads…
Charles Peirce, who is regarded as among the greatest logicians and philosophers, wrote a paper in 1897 titled “The Logic of Relatives.” Now we might want a guide to the logic of our own relatives, and in fact the word “brother” furnished Peirce’s first example. But what Peirce meant by “relative” is a term that attains its meaning only in conjunction with another noun as object, plus a word or other sign giving the relation. Examples, the first two his, are “brother of Napoleon,” “slayer of giants,” and “among the greatest logicians and philosophers.” The last has a wordless sign after “among” where the others have “of,” but then the rest may seem to make it a four-way relation. Peirce gave a general calculus of how all four-way and higher relations in semiotics could be decomposed into twos and threes. Such “relatives” and some other three-way relations, however, he proved to be irreducible within his system.
Today we consider an open problem about decomposing 3-ary finite automata.
Peirce’s name is usually written with at least the ‘S.’ of his middle name. He was born Charles Sanders Peirce, but late in life he inserted “Santiago” before or in place of his middle name. Meaning “St. James” in Spanish, this was said to honor his friend the philosopher William James, whose son Peirce designated as second heir after his own wife Juliette. However, earlier usages may have honored the European origins of Juliette herself. His taking up with Juliette before his divorce from his first wife became final was made a scandal that cost Peirce the one academic position he ever had, at Johns Hopkins University, in 1884.
I was prompted to write this by a response from Jon Awbrey, who frequents our pages and writes the blog “Inquiry Into Inquiry.” In Boolean logic every $n$-way function can be decomposed as a circuit of binary functions, indeed composed entirely of the binary NOR function, as Peirce himself discovered. Awbrey termed it a “trap” to infer that this carries over to logical semantics. The rest of this post isn’t related to Peirce, but we offer it toward the general question: when do threes decompose into twos?
Multi-tape Finite Automata
The familiar idea of a finite automaton can be generalized to any number $k$ of input tapes. If each input head must advance one cell on each move, then the input strings may as well be padded to the same length $n$ with an extra symbol $ added to the alphabet $\Sigma$. Then they can be shuffled into one string of length $n$ over the alphabet $(\Sigma \cup \{\$\})^k$, whereupon the $k$-tape automaton becomes equivalent to the resulting ordinary one-tape finite automaton.
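As a quick illustration, the padding-and-shuffling step can be sketched in a few lines of Python (the function name is mine, not from any standard library):

```python
def shuffle_tapes(strings, pad="$"):
    """Pad the input strings to a common length with `pad`, then
    interleave them position by position into one string whose
    "characters" are k-tuples over the enlarged alphabet."""
    n = max(len(s) for s in strings)
    padded = [s + pad * (n - len(s)) for s in strings]
    return [tuple(s[i] for s in padded) for i in range(n)]

# Each tuple feeds one move of the equivalent one-tape automaton.
assert shuffle_tapes(["ab", "xyz"]) == [("a", "x"), ("b", "y"), ("$", "z")]
```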
Things are more interesting when one or more heads are allowed to stay stationary on each move. Then the string-matching relation $\{(x, y) : x \text{ is a substring of } y\}$ can be recognized by a nondeterministic 2-tape finite automaton (2-NFA). This relation consists of all pairs $(x, y)$ such that $y$ can be written as $uxv$ for some strings $u$ and $v$. The 2-NFA pauses its first tape head and advances the second until it guesses where $x$ may start in the input $y$, and accepts if and when it matches all characters of $x$ on consecutive steps. It is easy to show that no 2-DFA can recognize this relation. Thus the equality “NFA = DFA” does not carry over to multiple tapes when pausing is allowed.
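The relation, and the role of the NFA's guess, can be mirrored deterministically in Python (a sketch, not part of the original construction):

```python
def string_matching(x, y):
    """Check the relation recognized by the 2-NFA: is y = u + x + v for
    some strings u, v?  The loop over i plays the role of the NFA's
    nondeterministic guess of where x starts in y."""
    return any(y[i:i + len(x)] == x for i in range(len(y) - len(x) + 1))

assert string_matching("ana", "banana")
assert not string_matching("abc", "banana")
```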
A three-way relation that is deterministically recognizable is the concatenation relation $\{(x, y, z) : z = xy\}$. With $x$ and $y$ on two separate tapes, a 3-DFA can match $z$ against $x$ until it stops upon reading the (implicit or explicit) terminal $ on that tape, and then begin matching the rest of the third tape against $y$. In an early paper, I showed that there is a bijection between pairs of strings and single strings whose graph is similarly recognizable as a three-way relation. Indeed I gave a two-tape finite transducer to compute it. This showed a pairing function that is linear-time computable and invertible, while previous examples used integer multiplication, for which linear time is unknown.
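The two-phase behavior of such a 3-DFA can be sketched as follows (the function name is mine):

```python
def concat_relation(x, y, z):
    """One left-to-right pass mimicking the 3-DFA: match z against x
    until x's tape ends, then match the rest of z against y."""
    i = 0                       # head position on the z tape
    for c in x:                 # phase 1: y's tape pauses
        if i >= len(z) or z[i] != c:
            return False
        i += 1
    for c in y:                 # phase 2: x's tape has read its terminal $
        if i >= len(z) or z[i] != c:
            return False
        i += 1
    return i == len(z)          # all of z must be consumed

assert concat_relation("ab", "cd", "abcd")
assert not concat_relation("ab", "cd", "abdc")
```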
Multi-tape Finite Transducers and the Problem
The definition of a $k$-tape finite-state transducer ($k$-FST) is straightforward: Each transition has the form
$$(q, c_1, \dots, c_k) \;\to\; (r, d_1, \dots, d_k, w),$$
where $q$ and $r$ are states, $c_1, \dots, c_k$ are input characters or $, each $d_i$ is ‘S’ for “stay” or ‘R’ for move one cell right, and $w$ is an output string (possibly empty). To make the computation straightforward, we can stipulate that at least one $d_i$ in each instruction is an R, and that the machine goes to a designated halting state only when each $c_i$ is $. Then on inputs $x_1, \dots, x_k$ where $n = \max_i |x_i|$, the machine runs for at most $kn$ steps.
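Under these stipulations a deterministic $k$-FST can be simulated directly. Here is a minimal Python sketch; the dictionary encoding of transitions is my own illustrative choice, not from the post:

```python
def run_kfst(delta, start, halt, tapes):
    """Run a deterministic k-FST.  delta maps (state, symbols) to
    (new_state, moves, output), where symbols is the tuple of currently
    scanned characters ('$' past the end of a tape) and each move is
    'S' (stay) or 'R' (one cell right)."""
    pos = [0] * len(tapes)
    state, out = start, []
    while state != halt:
        syms = tuple(t[p] if p < len(t) else "$" for t, p in zip(tapes, pos))
        state, moves, w = delta[(state, syms)]
        out.append(w)
        pos = [p + (m == "R") for p, m in zip(pos, moves)]
    return "".join(out)

# A 2-FST for binary concatenation g(x, y) = xy over the alphabet {0, 1}:
# state 0 copies x while y pauses; state 1 copies y; 'H' halts.
delta = {}
for c in "01":
    for d in "01$":
        delta[(0, (c, d))] = (0, ("R", "S"), c)
for d in "01":
    delta[(0, ("$", d))] = (1, ("S", "R"), d)
    delta[(1, ("$", d))] = (1, ("S", "R"), d)
delta[(0, ("$", "$"))] = ("H", ("S", "S"), "")
delta[(1, ("$", "$"))] = ("H", ("S", "S"), "")

assert run_kfst(delta, 0, "H", ["01", "10"]) == "0110"
```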
An example of a function computed by a 3-FST is the ternary concatenation function $f(x, y, z) = xyz$. The latter two tapes pause while $x$ is copied to the output, then $y$, then $z$. This function, however, can be written as the composition
$$f(x, y, z) \;=\; g(g(x, y), z)$$
of the usual binary concatenation function $g(u, v) = uv$. This leads straight into our question:
Can every 3-FST function be computed by a finite composition of 2-FSTs?
We can think of other problems involving relations and functions computed by binary and ternary finite automata that seem to have ready answers. One is whether every 3-DFA relation can be written as a Boolean combination of 2-DFA relations on its three pairs of inputs; consider the relation $\{(x, y, z) : z = x \oplus y\}$, where $\oplus$ is bitwise exclusive-or. The answer for the function question has, however, eluded me. I suspect it is no, but have not even found a “killer” candidate function for being 3-FST-only.
Open Problems
Can you prove the answer?
What about other cases of $k$-way into 2-way decomposition?
Re: “When do threes decompose into twos?”
One of the slogans emblazoned in my brain by freshman algebra was, Binary Operations Are Ternary Relations.
For example, 2-ary operations like Peirce’s amphecks count as 3-ary relations.
Long story short, when it comes to counting coup in twos and threes, functions are apples and relations are oranges, and we need to be sure of comparing like to like in weighing the ways these fruits decompose.
Speaking of FSM transducers, here's a brief writeup of the construction and use of one for the Collatz conjecture. Also looking for volunteers for a proposed Polymath problem.
Nice question. At the end, k-way and 2-way should read k-tape and 2-tape, right?
Here is a killer candidate: The first and second tapes each contain a length-$n$ sequence over some alphabet, while the third contains a length-$n$ sequence of 1’s and 2’s. Whenever an $i$ is read on the third tape, copy the current symbol of the $i$-th tape to the output tape and advance on the $i$-th tape (do nothing on the $(3-i)$-th tape).
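A direct Python rendering of this candidate function (names are mine):

```python
def merge_by_control(x, y, c):
    """c is a string over {'1', '2'}; each symbol of c directs which of
    the two data tapes supplies the next output symbol, and only that
    tape's head advances."""
    tapes, pos, out = [x, y], [0, 0], []
    for ch in c:
        i = int(ch) - 1
        out.append(tapes[i][pos[i]])
        pos[i] += 1
    return "".join(out)

assert merge_by_control("abc", "XYZ", "121212") == "aXbYcZ"
```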
Well, a pity there is no delete-comment option here; this one is pretty simple to realize as a composition of 2-FSTs…
Okay, now this is really a killer: The symbols of our alphabet are 1, 2, 3. We will always have an “actual” tape, in the beginning the first. In each step we write down the current symbol of the actual tape on the output tape, advance on it, and make the tape named by the symbol we just read the new actual tape. We proceed until we reach the end of a tape.
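The same candidate as a Python function (my rendering of the description, assuming the actual tape advances after each read):

```python
def follow_tapes(t1, t2, t3):
    """Start on tape 1; at each step output the current symbol of the
    'actual' tape, advance that tape, and make the tape named by the
    symbol (1, 2, or 3) the new actual tape.  Stop when the actual
    tape is exhausted."""
    tapes = [t1, t2, t3]
    pos = [0, 0, 0]
    cur, out = 0, []
    while pos[cur] < len(tapes[cur]):
        sym = tapes[cur][pos[cur]]
        out.append(sym)
        pos[cur] += 1
        cur = int(sym) - 1      # switch to the tape the symbol names
    return "".join(out)
```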
It may help to clarify the relationship between logical relatives and mathematical relations. The word relative as used in logic is short for relative term — as such it refers to a piece of language that is used to denote a formal object. So what kind of object is that? The way things work in mathematics, we are free to make up a formal object that corresponds directly to the term, so long as we can form a consistent theory of it, but in our case it is probably easier to relate the relative term to the kinds of relations we already know and love in mathematics and database theory.
In those contexts a relation is just a set of ordered tuples and, if you are a fan of strong typing like I am, such a set is always set in a specific setting, namely, it’s a subset of a specific Cartesian product.
Peirce wrote $k$-tuples in the form $i_1{:}i_2{:}\cdots{:}i_k$ and referred to them as elementary $k$-adic relatives. He expressed a set of $k$-tuples as a “logical sum” or “logical aggregate”, what we would call a logical disjunction of these elementary relatives, and he regarded them as being arranged in the form of $k$-dimensional arrays.
Time for some concrete examples, which I will give in the next comment …
Meanwhile, was my use of “sign” in the first paragraph OK? It seems to signify both the (effect of the) 3-way relation and (in terms by other semioticians) an element.
A couple of questions about the open problem:
Can we do any of the following: reuse inputs? reuse the same pairs of inputs? copy outputs? As a concrete example, if we want to recreate f(x,y,z) can we do stuff like: g_5(g_2(g_1(x, y), z), g_4(g_3(x, y), z))?
What are the restrictions on the alphabet? I assume the 2-FSA has to work over the same alphabet as the 3-FSA?
That’s a very good point, and I had to remember it myself. Yes, you are allowed to re-use inputs. The 2-FSAs are also allowed to use their own richer alphabets.
Table 1 shows the first few ordered pairs in the relation on positive integers that corresponds to the relative term “divisor of”. Thus, the ordered pair $(i, j)$ appears in the relation if and only if $i$ divides $j$.
Table 2 shows the same information in the form of a logical matrix. This has a coefficient of $1$ in row $i$ and column $j$ when $i$ divides $j$; otherwise it has a coefficient of $0$. (The zero entries have been omitted here for ease of reading.)
In much the same way that matrices in linear algebra represent linear transformations, these logical arrays and matrices represent logical transformations.
To be continued …
Arrays like the ones sampled above supply a way to understand the difference between a relation and its associated relative terms. To make a long story short, we could say that a relative term is a relation plus an indication of its intended application. But explaining what that means will naturally take a longer story.
In his first paper on the “Logic of Relatives” (1870) Peirce treated sets — in other words, the extensions of concepts — as logical aggregates or logical sums. He wrote a plus sign with a comma to indicate an operation of logical addition, abstractly equivalent to inclusive “or”, that was used to form these sums.
For example, letting $x$ be the concept of the first ten positive integers, it can be expressed as the following logical sum:
$$x \;=\; 1 \,+\!\!,\, 2 \,+\!\!,\, 3 \,+\!\!,\, 4 \,+\!\!,\, 5 \,+\!\!,\, 6 \,+\!\!,\, 7 \,+\!\!,\, 8 \,+\!\!,\, 9 \,+\!\!,\, 10.$$
Relations, as sets of tuples, can also be expressed as logical sums.
For example, letting $d$ be the divisibility relation on positive integers, it is possible to think of $d$ as a logical sum that begins in the following way:
$$d \;=\; (1{:}1) \,+\!\!,\, (1{:}2) \,+\!\!,\, (2{:}2) \,+\!\!,\, (1{:}3) \,+\!\!,\, (3{:}3) \,+\!\!,\, \cdots$$
It should be apparent that this is only a form of expression, not a definition, since it takes a prior concept of divisibility to say what ordered pairs appear in the sum, but it’s a reformulation that has its uses, nonetheless.
To be continued …
Arrgh, I somehow reversed the terms of … it clearly should be this:
I’ve been meaning to get back to this, but the way things are going it may be August or September before I do. For the time being here’s a link to my commentary in progress on Peirce’s 1870 Logic of Relatives.
• Notes on Peirce’s 1870 Logic of Relatives
To my way of thinking this is one of the most amazing papers in the history of mathematical logic. Many of its deepest and most far-reaching ideas are so far ahead of their time that we have yet to see their full potential realized even today.
Define operations on the elementary relatives that obey the following rules:
$$(i{:}j)(j{:}k) = (i{:}k), \quad (i{:}j)(k{:}l) = 0 \text{ if } j \neq k; \qquad (i{:}j)\,j = i, \quad (i{:}j)\,k = 0 \text{ if } j \neq k.$$
Extending these rules in the usual distributive fashion to sums of elementary monadic and dyadic relatives allows us to define relative multiplications of the following forms:
$LM$, where $L$ and $M$ are 2-adic relatives,
$Lm$, where $L$ is 2-adic and $m$ is 1-adic,
$\ell M$, where $\ell$ is 1-adic and $M$ is 2-adic.
For example, expressed in terms of coefficients, the relative product of 2-adic relatives $L$ and $M$ is given by the following formula:
$$(LM)_{ik} \;=\; \overset{,}{\sum_{j}}\; L_{ij} M_{jk}.$$
This will of course remind everyone of the formula for multiplying matrices in linear algebra, but I have affixed a comma atop the summation symbol to remind us that the logical sum is the inclusive disjunction (which Peirce wrote as $+\!\!,$) and not the exclusive disjunction that corresponds to the linear algebraic sum.
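The coefficient formula is just Boolean matrix multiplication; here is an illustrative Python check on a divisibility matrix like the one in Table 2, restricted to {1, …, 6} (function and variable names are mine):

```python
def logical_product(L, M):
    """Relative product of two logical matrices: entry (i, k) is the
    inclusive OR over j of L[i][j] AND M[j][k]."""
    n = len(L)
    return [[any(L[i][j] and M[j][k] for j in range(n)) for k in range(n)]
            for i in range(n)]

# Divisibility on {1, ..., 6} as a logical matrix.  "Divisor of a
# divisor" composes back to divisibility itself, by transitivity.
n = 6
D = [[(j + 1) % (i + 1) == 0 for j in range(n)] for i in range(n)]
assert logical_product(D, D) == D
```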
To be continued …
Here’s a bit on Triadic Relations, including examples of sign relations.
As far as relational reducibility goes, there are two concepts of reducibility that are commonly used.
1. Compositional reducibility, or reducibility under relational composition.
2. Projective reducibility, or reducibility under relational projections.
All triadic relations are irreducible under composition, since composing any mixture of monadic and dyadic relations can give at most dyadic relations as a result.
With respect to projective reducibility — a weaker form of reducibility since it makes use of specialized triadic relations to effect the reduction — some triadic relations are reducible to their dyadic projections and some are not.
There’s a bit of exposition on that score at PlanetMath:
☞ Relation Reduction
An article I started on Peirce’s theory of signs via triadic sign relations is still something of a work in progress, but I did just improve the formatting a bunch, so maybe it will fit in here.
☞ Sign Relations
I just added a section on Semiotic Equivalence Relations, which is a crucial concept for computational applications (not to mention most other applications).
A three-tape FST $A$ reads from its input tapes in a particular order. You can take the first two of these input tapes and shuffle them in an arbitrary order (assuming we allow nondeterminism). Take copies of letters if the alphabets are not disjoint. Now the original three-tape $A$ can be simulated on two tapes, one of which is a shuffle of the first two original tapes. The new automaton just blocks when it expects a letter from either of the first two tapes that is not available on the shuffled input.
Hendrik, thanks for the thought. The problem does intend deterministic FST’s.
I’ve been reworking my excerpts and commentary on Peirce’s 1870 “Logic of Relatives” as a series of blog posts. The initial installment is here:
☞ Peirce’s 1870 “Logic Of Relatives” • Preliminaries
And the rest can be followed up from there on the sidebar.
Update
I’ve been working away at upgrading my commentary on Peirce’s 1870 “Logic of Relatives”. Here’s a tag that runs through the series of posts so far (from the latest to the first).
☞ Logic of Relatives