Today I want to talk about relationships between monoids. These can be useful to think about when we’re developing libraries involving monoids, and we want to express some algebraic laws among them. We can then check these with automated tests, or indeed prove them with algebraic reasoning.
This post kind of fell together when writing notes on chapter 10, “Monoids”, of Functional Programming in Scala. I am putting it here so I can reference it from the chapter notes at the end of the book.
Monoid homomorphisms
Let’s take String concatenation and Int addition as example monoids that have a relationship. Note that if we take the length of two strings and add them up, this is the same as concatenating those two strings and taking the length of the combined string:

```scala
("foo" + "bar").length == "foo".length + "bar".length
```
So every String maps to a corresponding Int (its length), and every concatenation of strings maps to the addition of the corresponding lengths.

The length function maps from String to Int while preserving the monoid structure. Such a function, which maps from one monoid to another in such a structure-preserving way, is called a monoid homomorphism. In general, for monoids M and N, a homomorphism f: M => N, and all values x: M and y: M, the following law holds:

```scala
f(x |+| y) == f(x) |+| f(y)
```
The |+| syntax is from Scalaz and is obtained by importing scalaz.syntax.monoid._. It simply references the append method on the Monoid[T] instance, where T is the type of the arguments.
This law can have real practical benefits. Imagine, for example, a “result set” monoid that tracks the locations of a particular set of records in a database or file. This could be as simple as a Set of locations. Concatenating several thousand files and then searching through the combined file is going to be much slower than searching through the files individually and then concatenating the result sets, particularly since we can potentially search the files in parallel. A good automated test for our result set monoid would be that it admits a homomorphism from the data file monoid.
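As a sketch of what such a test might check (the DataFile, ResultSet, and search names here are hypothetical stand-ins, not from any particular library):

```scala
// Hypothetical types: a "data file" is a list of records, and a result
// set is the set of matching records. Both are monoids: files combine
// by concatenation (++), result sets by union.
type Record = String
type DataFile = List[Record]
type ResultSet = Set[Record]

// Hypothetical search function: all records containing the query string.
def search(file: DataFile, query: String): ResultSet =
  file.filter(_.contains(query)).toSet

// The homomorphism law we would test (e.g. with a property-based testing
// library): searching the concatenation of two files gives the same
// result set as searching each file and unioning the results.
def law(f1: DataFile, f2: DataFile, query: String): Boolean =
  search(f1 ++ f2, query) == (search(f1, query) union search(f2, query))
```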
Monoid isomorphisms
Sometimes there will be a homomorphism in both directions between two monoids. If these are inverses of one another, then this kind of relationship is called a monoid isomorphism and we say that the two monoids are isomorphic. More precisely, we will have two monoids A and B, and homomorphisms f: A => B and g: B => A. If f(g(b)) == b and g(f(a)) == a for all a: A and b: B, then f and g form an isomorphism.
For example, the String and List[Char] monoids with concatenation are isomorphic. We can convert a String to a List[Char], preserving the monoid structure, and go back again to the exact same String we started with. This is also true in the inverse direction, so the isomorphism holds.
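A quick sanity check of this isomorphism in plain Scala (using ++ and + directly rather than Scalaz syntax):

```scala
// The two directions of the isomorphism between String and List[Char].
val f: String => List[Char] = _.toList
val g: List[Char] => String = _.mkString

val (s1, s2) = ("foo", "bar")

// f is a homomorphism: it maps String concatenation to list concatenation.
val homF = f(s1 + s2) == f(s1) ++ f(s2)

// g is a homomorphism in the other direction.
val homG = g(s1.toList ++ s2.toList) == g(s1.toList) + g(s2.toList)

// f and g are mutual inverses.
val inverses = g(f(s1)) == s1 && f(g(s1.toList)) == s1.toList
```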
Other examples include (Boolean, &&) and (Boolean, ||), which are isomorphic via not.
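This is just De Morgan’s laws in disguise, plus the fact that negation is its own inverse; a quick check in plain Scala:

```scala
// not maps between (Boolean, &&) and (Boolean, ||). By De Morgan's laws
// it is a homomorphism in both directions, and it is its own inverse.
def not(x: Boolean): Boolean = !x

def isoLaws(x: Boolean, y: Boolean): Boolean =
  (not(x && y) == (not(x) || not(y))) && // homomorphism (&&) -> (||)
  (not(x || y) == (not(x) && not(y))) && // homomorphism (||) -> (&&)
  (not(not(x)) == x)                     // not is its own inverse

// It also maps the identities onto each other: not(true) == false.
```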
Note that there are monoids with homomorphisms in both directions between them that nevertheless are not isomorphic. For example, (Int, *) and (Int, +). These are homomorphic to one another, but not isomorphic (thanks, Robbie Gates).
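To see that homomorphisms exist in both directions, one simple witness (my own choice here, not necessarily the maps the original argument had in mind) is the constant function to each target’s identity element:

```scala
// f maps (Int, +) to (Int, *): sending everything to 1, the identity of
// multiplication, trivially preserves the structure: f(x + y) == f(x) * f(y).
def f(x: Int): Int = 1

// g maps (Int, *) to (Int, +): sending everything to 0, the identity of
// addition: g(x * y) == g(x) + g(y).
def g(x: Int): Int = 0

def lawF(x: Int, y: Int): Boolean = f(x + y) == f(x) * f(y)
def lawG(x: Int, y: Int): Boolean = g(x * y) == g(x) + g(y)

// These maps are clearly not inverses of each other, so they do not
// witness an isomorphism.
```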
Monoid products and coproducts
If A and B are monoids, then (A,B) is certainly a monoid, called their product:

```scala
implicit def productMonoid[A: Monoid, B: Monoid]: Monoid[(A, B)] =
  new Monoid[(A, B)] {
    def append(x: (A, B), y: => (A, B)) =
      (x._1 |+| y._1, x._2 |+| y._2)
    val zero = (mzero[A], mzero[B])
  }
```
But is there such a thing as a monoid coproduct? It’s certainly not possible to build a monoid on Either[A,B] for monoids A and B. For example, what would be the zero of such a monoid? And what would be the value of Left(a) |+| Right(b)? We could certainly choose an arbitrary rule, but not one that actually satisfies the monoid laws of associativity and unit.
To resolve this, we need to know the precise meaning of product and coproduct. These come straight from Wikipedia, with a little help from Cale Gibbard.
A product M of two monoids A and B is a monoid such that there exist homomorphisms fst: M => A and snd: M => B, and for any monoid Z and morphisms f: Z => A and g: Z => B there has to be a unique homomorphism h: Z => M such that fst(h(z)) == f(z) and snd(h(z)) == g(z) for all z: Z. In other words, the following diagram must commute:
A coproduct W of two monoids A and B is the same except the arrows are reversed. It’s a monoid such that there exist homomorphisms left: A => W and right: B => W, and for any monoid Z and morphisms f: A => Z and g: B => Z there has to be a unique homomorphism h: W => Z such that h(left(a)) == f(a) and h(right(b)) == g(b) for all a: A and all b: B. In other words, the following diagram must commute:
We can easily show that our productMonoid above really is a monoid product. The homomorphisms are the methods _1 and _2 on Tuple2. They simply map every element of (A,B) to the corresponding elements of A and B. The monoid structure is preserved because:

```scala
((a1, b1) |+| (a2, b2))._1 == (a1, b1)._1 |+| (a2, b2)._1
((a1, b1) |+| (a2, b2))._2 == (a1, b1)._2 |+| (a2, b2)._2
```
And for any other monoid Z, and morphisms f: Z => A and g: Z => B, we can construct a unique morphism from Z to (A,B):

```scala
def h[Z, A, B](f: Z => A, g: Z => B): Z => (A, B) =
  z => (f(z), g(z))
```
And this really is a homomorphism because we just inherit the homomorphism law from f and g.
What does a coproduct then look like? Well, it’s going to be a type C[A,B] together with an instance coproduct[A: Monoid, B: Monoid]: Monoid[C[A,B]]. It will be equipped with two monoid homomorphisms, left: A => C[A,B] and right: B => C[A,B], that satisfy the following (according to the monoid homomorphism law):

```scala
left(a1 |+| a2) == left(a1) |+| left(a2)
right(b1 |+| b2) == right(b1) |+| right(b2)
```
And additionally, for any other monoid Z and homomorphisms f: A => Z and g: B => Z, we must be able to construct a unique homomorphism fold from C[A,B] to Z:

```scala
def fold[A: Monoid, B: Monoid, Z: Monoid](c: C[A, B])(f: A => Z, g: B => Z): Z
```
The simplest thing that could possibly work
We can simply come up with the smallest data structure that lets the coproduct satisfy the monoid laws. We will start with two constructors, one for the left side and another for the right side:

```scala
sealed trait These[+A, +B]
case class This[A](a: A) extends These[A, Nothing]
case class That[B](b: B) extends These[Nothing, B]
```
This certainly allows an embedding of both monoids A and B. But These is now basically Either, which we know doesn’t quite form a monoid. What’s needed is a zero, and a way of appending an A to a B. The simplest way to do that is to add a product constructor to These:

```scala
case class Both[A, B](a: A, b: B) extends These[A, B]
```
Now These[A,B] is a monoid as long as A and B are monoids:

```scala
implicit def coproduct[A: Monoid, B: Monoid]: Monoid[These[A, B]] =
  new Monoid[These[A, B]] {
    val zero: These[A, B] = Both(mzero[A], mzero[B])
    def append(x: These[A, B], y: => These[A, B]) = (x, y) match {
      case (This(a1), This(a2))         => This(a1 |+| a2)
      case (That(b1), That(b2))         => That(b1 |+| b2)
      case (This(a), That(b))           => Both(a, b)
      case (That(b), This(a))           => Both(a, b)
      case (This(a1), Both(a2, b))      => Both(a1 |+| a2, b)
      case (That(b1), Both(a, b2))      => Both(a, b1 |+| b2)
      case (Both(a1, b), This(a2))      => Both(a1 |+| a2, b)
      case (Both(a, b1), That(b2))      => Both(a, b1 |+| b2)
      case (Both(a1, b1), Both(a2, b2)) => Both(a1 |+| a2, b1 |+| b2)
    }
  }
```
These[A,B] is the smallest monoid that contains both A and B as submonoids (the This and That constructors, respectively) and admits a homomorphism from both A and B. And notice that we simply added the least amount of structure possible to make These[A,B] a monoid (the Both constructor). But is it really a coproduct?
First we must prove that This and That really are homomorphisms. We need to prove the following two properties:

```scala
This(a1) |+| This(a2) == This(a1 |+| a2)
That(b1) |+| That(b2) == That(b1 |+| b2)
```
That’s easy. The first two cases of the append method on the coproduct monoid prove these properties.
But can we define fold? Yes we can:

```scala
def fold[A: Monoid, B: Monoid, Z: Monoid](t: These[A, B])(f: A => Z, g: B => Z): Z =
  t match {
    case This(a)    => f(a)
    case That(b)    => g(b)
    case Both(a, b) => f(a) |+| g(b)
  }
```
But is fold really a homomorphism? Let’s not assume that it is, but test it out. Here’s the homomorphism law:

```scala
fold(t1 |+| t2)(f, g) == fold(t1)(f, g) |+| fold(t2)(f, g)
```
What happens if both t1 and t2 are This, or both are That?

```scala
fold(This(a1) |+| This(a2))(f, g) == fold(This(a1))(f, g) |+| fold(This(a2))(f, g)
                     f(a1 |+| a2) == f(a1) |+| f(a2)  // f is a homomorphism
fold(That(b1) |+| That(b2))(f, g) == fold(That(b1))(f, g) |+| fold(That(b2))(f, g)
                     g(b1 |+| b2) == g(b1) |+| g(b2)  // g is a homomorphism
```
That holds. But what if we introduce a Both on one side?

```scala
fold(This(a1) |+| Both(a2, b))(f, g)
  == fold(Both(a1 |+| a2, b))(f, g)  // by the append cases above
  == f(a1 |+| a2) |+| g(b)
  == f(a1) |+| (f(a2) |+| g(b))      // == fold(This(a1))(f, g) |+| fold(Both(a2, b))(f, g)
```
So far so good. That holds because of associativity. What about the other side?

```scala
fold(Both(a1, b) |+| This(a2))(f, g)
  == fold(Both(a1 |+| a2, b))(f, g)  // Both commutes the A values together
  == f(a1) |+| f(a2) |+| g(b)
  != f(a1) |+| g(b) |+| f(a2)        // == fold(Both(a1, b))(f, g) |+| fold(This(a2))(f, g)
```
No! Something has gone wrong. This will only hold if the Z monoid is commutative. So in general, These[A,B] is not the coproduct of A and B. My error was in the Both constructor, which commutes all B values to the right and all A values to the left.
So what kind of thing would work? It would have to solve this case:

```scala
fold(Both(a1, b1) |+| Both(a2, b2))(f, g) == f(a1) |+| g(b1) |+| f(a2) |+| g(b2)
```
We need to preserve that a1, b1, a2, and b2 appear in that order. So clearly the coproduct will be some kind of list!
We could modify the Both constructor this way:

```scala
case class Both[A, B](left: These[A, B], right: These[A, B]) extends These[A, B]
```
And let’s add an empty case for the combined zero:

```scala
case object Neither extends These[Nothing, Nothing]
```
In which case These[A,B] has become a kind of tree, or an unbalanced list, of This[A] and That[B] values: a free product of the two monoids A and B. The implementation of append for the coproduct monoid could be:

```scala
implicit def coproduct[A: Monoid, B: Monoid]: Monoid[These[A, B]] =
  new Monoid[These[A, B]] {
    val zero: These[A, B] = Neither
    def append(x: These[A, B], y: => These[A, B]) = (x, y) match {
      case (Neither, t)                  => t
      case (t, Neither)                  => t
      case (This(a1), This(a2))          => This(a1 |+| a2)
      case (That(b1), That(b2))          => That(b1 |+| b2)
      case (Both(l, This(a1)), This(a2)) => Both(l, This(a1 |+| a2))
      case (Both(l, That(b1)), That(b2)) => Both(l, That(b1 |+| b2))
      case (This(a1), Both(This(a2), r)) => Both(This(a1 |+| a2), r)
      case (That(b1), Both(That(b2), r)) => Both(That(b1 |+| b2), r)
      case (t1, t2)                      => Both(t1, t2)
    }
  }
```
This append normalizes the list so that consecutive values of the same type are added together.
And we would modify fold to recurse over the tree:

```scala
def fold[A: Monoid, B: Monoid, Z: Monoid](t: These[A, B])(f: A => Z, g: B => Z): Z =
  t match {
    case Neither    => mzero[Z]
    case This(a)    => f(a)
    case That(b)    => g(b)
    case Both(l, r) => fold(l)(f, g) |+| fold(r)(f, g)
  }
```
This is now a homomorphism! We already know that this is so for the This and That cases. And now that the Both case simply appeals to the inductive hypothesis, we know that it holds for Both as well.
Free monoids on coproducts
To better understand what’s going on, let’s try going the other way. What if we start with the coproduct of the underlying sets and get a free monoid from there?
The underlying set of a monoid A is just the type A without the monoid structure. The coproduct of types A and B is the type Either[A,B]. Having “forgotten” the monoid structure of both A and B, we can recover it by generating a free monoid on Either[A,B], which is just List[Either[A,B]]. The append operation of this monoid is list concatenation, and the identity for it is the empty list.
Clearly List[Either[A,B]] is a monoid, but does it permit a homomorphism from both monoids A and B? If so, then the following properties should hold:

```scala
List(Left(a1)) ++ List(Left(a2)) == List(Left(a1 |+| a2))
List(Right(b1)) ++ List(Right(b2)) == List(Right(b1 |+| b2))
```
They clearly do not hold! The lists on the left of == will have two elements and the lists on the right will have one element. Can we do something about this?
Well, the fact is that List[Either[A,B]] is not exactly the monoid coproduct of A and B. It’s “too big” in a sense. But if we were to reduce the list to a normal form that approximates a “free product”, we can get a coproduct that matches our definition above. What we need is a new monoid:

```scala
class Eithers[A, B] private (val toList: List[Either[A, B]]) {
  def ++(es: Eithers[A, B])(implicit A: Monoid[A], B: Monoid[B]): Eithers[A, B] =
    Eithers(toList ++ es.toList)
}

object Eithers {
  // The smart constructor normalizes the list, collapsing runs of
  // consecutive Lefts and consecutive Rights with the underlying monoids.
  def apply[A: Monoid, B: Monoid](xs: List[Either[A, B]]): Eithers[A, B] =
    new Eithers(xs.foldRight(List.empty[Either[A, B]]) {
      case (Left(a1), Left(a2) :: t)   => Left(a1 |+| a2) :: t
      case (Right(b1), Right(b2) :: t) => Right(b1 |+| b2) :: t
      case (e, t)                      => e :: t
    })
  def empty[A, B]: Eithers[A, B] = new Eithers(Nil)
}
```
Eithers[A,B] is a kind of List[Either[A,B]] that has been normalized so that consecutive As and consecutive Bs have been collapsed using their respective monoids. So it will contain alternating A and B values.
This is now a monoid coproduct because it permits monoid homomorphisms from A and B:

```scala
def left[A: Monoid, B: Monoid](a: A): Eithers[A, B]  = Eithers(List(Left(a)))
def right[A: Monoid, B: Monoid](b: B): Eithers[A, B] = Eithers(List(Right(b)))
// left(a1) ++ left(a2) == left(a1 |+| a2), and likewise for right,
// because ++ collapses adjacent Lefts (resp. Rights) with |+|.
```
And we can implement the fold homomorphism:

```scala
def fold[A: Monoid, B: Monoid, Z: Monoid](es: Eithers[A, B])(f: A => Z, g: B => Z): Z =
  es.toList.foldRight(mzero[Z]) {
    case (Left(a), z)  => f(a) |+| z
    case (Right(b), z) => g(b) |+| z
  }
```
And this time fold really is a homomorphism, and we can prove it by case analysis. Here’s the law again:

```scala
fold(e1 ++ e2)(f, g) == fold(e1)(f, g) |+| fold(e2)(f, g)
```
If either of e1 or e2 is empty then the result is the fold of the other, so those cases are trivial. If they are both nonempty, then they will have one of these forms at the seam (writing the underlying lists, with xs and ys for the remainders):

```scala
e1 = xs :+ Left(a1);  e2 = Left(a2) :: ys

e1 = xs :+ Right(b1); e2 = Right(b2) :: ys

e1 = xs :+ Left(a1);  e2 = Right(b2) :: ys

e1 = xs :+ Right(b1); e2 = Left(a2) :: ys
```
In the first two cases, on the right of the == sign in the law, we perform a1 |+| a2 and b1 |+| b2 respectively before concatenating. In the other two cases we simply concatenate the lists. The ++ method on Eithers takes care of doing this correctly for us. On the left of the == sign we fold the lists individually, and they will be alternating applications of f and g. So the law amounts to the fact that f(a1 |+| a2) == f(a1) |+| f(a2) in the first case, and the same for g in the second case. In the latter two cases this amounts to a homomorphism on List. So as long as f and g are homomorphisms, so is fold(_)(f, g). Therefore, Eithers[A,B] really is a coproduct of A and B.
Since we have already convinced ourselves that These is a monoid coproduct, we could also simply show that there is a homomorphism from Eithers to These:

```scala
def toThese[A: Monoid, B: Monoid](es: Eithers[A, B]): These[A, B] =
  es.toList.foldRight(mzero[These[A, B]]) {
    case (Left(a), t)  => This(a) |+| t
    case (Right(b), t) => That(b) |+| t
  }
```

Which would amount to proving that toThese(xs) |+| toThese(ys) == toThese(xs ++ ys).
The lesson learned here is to check assumptions and test against laws. Things are not always as straightforward as they seem.