through this consolidation the emergence of individuals
(Flack 2017a, b).
Given our proposition that individuals are aggregates
that “propagate” information from the past to the future and
have temporal integrity, and that individuality is a matter of
degree, can be nested, distributed and possible at any level,
how can we formalize individuality?
Formalizing individuality
We will take as our starting point measurements from a
stochastic process. This could be a vector of chemical con-
centrations over time, the abundance of various cell types,
or probabilities of observing coherent behaviors. We use
information-theoretic filters that coarse-grain, or quantize, the
measurements. Some of these filters will reveal
a coordinated pattern of behavior, whereas others will filter
out all signal and detect nothing. Thus, signal amplitude
given an appropriate filter becomes a means of discovering
different forms of individuality. This is somewhat analo-
gous to observing patterns in infrared that would be invis-
ible using the wavelengths of visible light—individuality is
revealed through characteristic patterns of information flow.
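To make the filtering idea concrete, the sketch below (an illustration of ours, not part of the formal development that follows) quantizes a simulated noisy measurement series under two different filters and uses the mutual information between successive coarse-grained states as the "signal amplitude"; the smoothing window, bin count, and simulated process are all illustrative choices.

```python
import numpy as np

def mutual_information(x, y):
    """Empirical mutual information (bits) between two discrete symbol series."""
    joint = np.zeros((x.max() + 1, y.max() + 1))
    np.add.at(joint, (x, y), 1)
    joint /= joint.sum()
    px, py = joint.sum(axis=1), joint.sum(axis=0)
    nz = joint > 0
    return float((joint[nz] * np.log2(joint[nz] / np.outer(px, py)[nz])).sum())

def quantize(series, n_bins=4):
    """Coarse-grain a real-valued series into n_bins discrete symbols."""
    edges = np.quantile(series, np.linspace(0, 1, n_bins + 1)[1:-1])
    return np.digitize(series, edges)

rng = np.random.default_rng(0)
t = np.arange(5000)
coordinated = np.sin(2 * np.pi * t / 250)              # slow, coordinated component
measurements = coordinated + rng.normal(size=t.size)   # what is actually observed

window = np.ones(25) / 25
smoothed = np.convolve(measurements, window, mode="same")

# Filter A is tuned to the slow pattern; filter B keeps only the fast residual.
filter_a = quantize(smoothed)
filter_b = quantize(measurements - smoothed)

for name, symbols in [("filter A", filter_a), ("filter B", filter_b)]:
    amplitude = mutual_information(symbols[:-1], symbols[1:])
    print(f"{name}: signal amplitude = {amplitude:.3f} bits")
```

Under the first filter, consecutive coarse-grained states are highly informative about one another; under the second, the measured amplitude is close to zero, in the sense intended above of a filter that detects nothing.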
The basis for this approach to aggregation comes from
information theory, and throughout this paper we assume
that individuals are best thought of in terms of dynamical
processes and not as stationary objects that leave informa-
tion-theoretic traces. In this respect, our approach might rea-
sonably be framed through the lens of “process philosophy”
(Rescher 2007) which makes the elucidation of the dynami-
cal and coupled properties of natural phenomena the primary
explanatory challenge. From the perspective of “process
philosophy,” the tendency of starting with objects and then
listing their properties—“substance metaphysics”—places
the cart before the horse.
The origin of information
Our proposal that individuals are aggregates that propagate
information from the past to the future and have temporal
integrity can be viewed as a pragmatic operational definition
that captures the idea there is something persistent about
individuals. However, our motivation for defining individu-
ality this way is actually much deeper. It lies in the informa-
tion-theoretic interpretation of entropy, its connection to the
physical theory of thermodynamics, and the formal definition of
work introduced by Clausius in the 1860s [see (Müller 2007)
for an introduction to this history].
Briefly, work (displacement of a physical system) is
produced by transferring thermal energy from one body to
another (heat). Entropy captures, or measures, the loss in
temperature over the range of motion of the working body. In
other words, entropy measures the energy lost from the total
energy available for performing work. The insights
of Clausius were formalized and placed in a mathematical
framework by Gibbs in 1876.
In 1877, Boltzmann provided, in his kinetic theory of gases,
an alternative interpretation of entropy. For Boltzmann,
entropy is a measure of the potential disorder in a system.
This definition shifts the emphasis from energy dissipated
through work to the number of unobservable configurations
(microstates) of a system, e.g., particle velocities consistent
with an observable measurement (macrostate), e.g., temper-
ature. The thermodynamic and Boltzmann definitions are
closely related as Boltzmann entropy increases following the
loss of energy available for work attendant upon the colli-
sion of particles in motion during heat flow. There are many
different microscopic configurations of individual particles
compatible with the same macroscopic measurement, only a
few of which are useful.
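As a toy illustration of the macrostate/microstate distinction (our own example, not one drawn from the sources cited above), consider N coins whose macrostate is the number of heads showing; Boltzmann's relation S = k_B ln W then assigns each macrostate an entropy proportional to the logarithm of the number of coin configurations, W, compatible with it.

```python
from math import comb, log

N = 20                      # number of "particles" (coins) in the toy system
k_B = 1.380649e-23          # Boltzmann constant, in J/K

for heads in (0, 5, 10):    # three macrostates: how many coins show heads
    W = comb(N, heads)      # number of microstates compatible with the macrostate
    S = k_B * log(W)        # Boltzmann entropy, S = k_B ln W
    print(f"{heads:2d} heads: W = {W:6d} microstates, S = {S:.3e} J/K")
```

The half-heads macrostate, compatible with the most microstates, carries the highest entropy; the all-tails macrostate, compatible with exactly one microstate, carries none.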
In 1948, encouraged by John von Neumann, Claude Shan-
non used the thermodynamical term entropy to capture the
information capacity of a communication channel. A string
of a given length (macrostate) is compatible with a large
number of different sequences of symbols (microstates). A
target word will be disordered during transmission in pro-
portion to the noise in a channel. If there were no noise,
each and every microstate could be resolved and the entropy
would define an upper limit on the number of signals that
could be transmitted. The study of the maximum number
of states that can be transmitted from one point to another
across a channel, in the face of noise and when efficiently
encoded, is called information theory.
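To give this upper limit a concrete form, a standard textbook case (not one treated in this paper) is the binary symmetric channel: if each transmitted bit is flipped with probability p, at most C = 1 − H(p) bits per channel use can be communicated reliably, where H(p) is the binary entropy. A short calculation:

```python
from math import log2

def binary_entropy(p):
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p):
    """Capacity (bits per use) of a binary symmetric channel with crossover p."""
    return 1.0 - binary_entropy(p)

for p in (0.0, 0.05, 0.11, 0.5):
    print(f"noise p = {p:.2f}  ->  capacity = {bsc_capacity(p):.3f} bits per use")
```

A noiseless channel transmits one full bit per use, while a channel that flips bits half the time transmits nothing at all.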
Shannon did not describe entropy in terms of heat flow
and work but in terms of information shared through a chan-
nel transmitted from a signaler to a receiver. The power of
information theory derives in part from the incredible gen-
erality of Shannon’s scheme. The signaler can be a phone in
Madison and the receiver a phone in Madrid, or the signaler
can be a parent and the receiver its offspring. For phones, the
channel is a fiber-optic cable and the signal pulses of light.
For organisms the channel is the germ line and the signal the
sequence of DNA or RNA polynucleotides in the genome.
Increasing entropy for a phone call corresponds to the loss
or disruption of light-pulses, whereas increasing entropy
during inheritance corresponds to mutation or developmen-
tal noise. The same scheme can be applied to development,
in which case the signaler is an organism in the past and the
receiver the same organism in the future. One way in which
we might identify individuals is to check to see whether we
are dealing with the same aggregation at time t and t + 1.
If the information transmitted forward in time is close to
maximal, we take that as evidence for individuality.
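A minimal sketch of this criterion, under illustrative assumptions of our own (the aggregate is observed as a two-state stationary Markov chain, and the quantity I(X_t; X_{t+1}) stands in for the information transmitted forward in time):

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector (zero entries ignored)."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def stationary(T):
    """Stationary distribution of a row-stochastic transition matrix T."""
    w, v = np.linalg.eig(T.T)
    pi = np.real(v[:, np.argmin(np.abs(w - 1.0))])
    return pi / pi.sum()

def forward_information(T):
    """I(X_t; X_{t+1}) in bits for a stationary Markov chain with transitions T."""
    pi = stationary(T)
    joint = pi[:, None] * T                  # P(X_t = i, X_{t+1} = j)
    return 2 * entropy(pi) - entropy(joint.ravel())

# A "persistent" aggregate: its state now almost determines its state at t + 1.
persistent = np.array([[0.95, 0.05],
                       [0.05, 0.95]])
# A memoryless aggregate: its state at t + 1 is independent of its state at t.
memoryless = np.array([[0.5, 0.5],
                       [0.5, 0.5]])

for name, T in [("persistent", persistent), ("memoryless", memoryless)]:
    info = forward_information(T)
    bound = entropy(stationary(T))
    print(f"{name}: I(X_t; X_t+1) = {info:.3f} bits (upper bound {bound:.3f} bits)")
```

The persistent chain carries most of its entropy forward in time, whereas the memoryless chain carries none, matching the intuition that only the former behaves as an individual on this timescale.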
In its simplest form, Shannon made use of the following
formal measures when defining information. The entropy