3.1. Functions and the subjective/objective distinction
We can distinguish between two forms of causal interaction, and I further specify the physical system that corresponds to the observer system O. This specification states that the system O contains a function. This is the most abstract expression of the intuitive fact that observations trace back to a purpose, whether intentional, as in the case of the human experimenter, or as part of a cybernetic feedback mechanism, as in the case of a bacterium. Functions involve observations in the sense that they are selective; that is, a function is triggered by a certain property of the object system with which the relation of observation is constituted. The notion of a function is a most general one that applies to both biological and artificial systems [52]; it therefore includes, in particular, the physical interactions taking place in the context of human experiments.
A function is a special kind of causal process that can be described as having the following general structure [53]: The function of X is Z means:
- (a) X is there because it does Z, and
- (b) Z is a consequence (or result) of X’s being there.
For example, the sun emits light which causes the existence of life on earth, but the sun does not have this function. The heart pumps blood, and because it does so, it exists; so the latter physical process has a function. It is important to notice that on this level of generality, functions do not relate only to technological functions (based on design) or physiological functions (a result of evolution), but also to all imaginable kinds of interlocking and embedding functions in larger systems. Thus, in this general conceptual frame, one cannot state in any empirically meaningful sense that the prey exists because it feeds a predator, but one can say that the prey exists because it is part of a larger ecological system, of which both prey and predator are a part. As shown in Figure 2a, we can therefore relate Z to another effect Z’, which is part of a larger system into which the relation (a) is embedded.
Figure 2.
(a): Elementary form of a function; (b): Autocatalysis.
The simplest function corresponding to the definition given above is a symmetric autocatalytic reaction cycle in chemistry [54] (Figure 2b). In this case, (b) describes the original chemical reaction linking up two cycle intermediates X and Z, and (a) describes the autocatalytic effect of the product; Y refers to the raw material of the reaction. Correspondingly, the ontologically primordial example of an embedded function is an enzyme [15,58]. The enzyme has the function of increasing chemical production in a cell. This function is derived from the function of the product in the cell, from which the causal feedback to enzyme production is generated within the entire web of systemic interdependencies. The selectivity of enzymes is also the primordial example of a physical relation of observation.
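The dynamics of the elementary autocatalytic cycle of Figure 2b can be sketched in a few lines of code. The following is a minimal numerical illustration, assuming simple mass-action kinetics; the rate constant and initial amounts are hypothetical values chosen for the illustration, not taken from [54]:

```python
# Minimal sketch of the autocatalytic loop of Figure 2b under
# mass-action kinetics: Y + Z -> 2Z, i.e., the product Z catalyzes
# its own production from the raw material Y. All numbers are
# hypothetical illustration values.

def simulate_autocatalysis(y0=10.0, z0=0.01, k=0.5, dt=0.01, steps=2000):
    """Euler integration of the autocatalytic step; returns the Z trajectory."""
    y, z = y0, z0
    trajectory = []
    for _ in range(steps):
        rate = k * y * z      # autocatalysis: the rate grows with the product Z
        y -= rate * dt        # raw material Y is consumed
        z += rate * dt        # product Z accumulates
        trajectory.append(z)
    return trajectory

traj = simulate_autocatalysis()
```

The trajectory is sigmoidal: a slow start while Z is scarce, self-accelerating growth once the feedback loop (a) takes hold, and saturation when the raw material Y is spent, with total mass y + z conserved throughout.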
The extensive philosophical discussion of functions has shown that this term raises similar ontological and epistemological issues as the term entropy, which suggests that the two concepts belong to a related philosophical terrain. This is because a function, taken as a fact, can be epistemologically subjective or objective, as well as ontologically subjective or objective, depending on the fundamental philosophical stance taken [56]. Some hold that functions are always assigned by human observers; others treat functions as objective physical phenomena. Hence, as with entropy, the notion vacillates between being observer relative and observer independent.
One possible classification of functions is offered in Table 1. Following Searle [57], I distinguish between two kinds of judgment about facts, which can be epistemically subjective, thus depending on the point of view of the person who makes or hears the judgment, or epistemically objective, thus depending on some independent criteria of truth. Additionally, I distinguish between two kinds of entities: ontologically subjective ones, whose mode of existence depends on mental states, and ontologically objective ones, which are independent of mental states. In this two-by-two scheme, one standard assignment looks like the following (differing, however, from Searle’s original position):
Technological functions are epistemologically objective because statements on them relate to physical laws, but they are ontologically subjective because they relate to human design;
Biological functions are epistemologically objective because statements on them are science-based, and they are ontologically objective, because they are the result of natural selection;
Mental functions are ontologically objective because they relate to neuronal states, which are observer independent, but they are epistemologically subjective because we can only refer to them from the viewpoint of the person who experiences them (such as pain);
Semantic functions are ontologically subjective because they relate to individual mental states, and they are epistemologically subjective, because they relate to individual intentionality.
Table 1.
Types of functions.
| Entity \ Judgement | Epistemically subjective | Epistemically objective |
|---|---|---|
| Ontologically subjective | Semantic function | Technological function |
| Ontologically objective | Mental function | Biological function |
This helps to further clarify the Jaynes statement that entropy is an “anthropomorphic concept.” At first sight, this means that entropy is a notion that is epistemologically objective, because it relates to physical laws, but ontologically subjective, because it relates to particular experimental settings which fix the degrees of freedom of the observed macrostate; the settings themselves reflect mental states of the experimenter. However, in the philosophical debate over functions the assignment of technological and biological functions has been questioned [52]. This is because it depends on the level of system under consideration. For example, we can certainly describe a part of an engine independently of the original design, merely in terms of its endogenous functioning in the engine, hence assigning it to the ‘ontologically objective’ box. The deeper reason for this is that we analyze a two-level system, in which the engine has a function that directly relates to human design, but a part of the engine contributes to that function based on mere design necessities which follow physical laws. A similar reconsideration is possible if we adopt the perspective developed in Figure 1, which results in a mutual embedding of functions, treated there as observers. If we keep the technological function in its original box, a higher-level observer may relate to the design process directly. This design process can be seen either as a mental function or as a semantic function, depending on the particular methodological perspective taken. However, in both cases one can proceed along similar lines as when shifting the technological function to the ‘ontologically objective’ box: if we integrate the technological function and the design function qua mental function into one higher-level system, we could argue, for instance, that the technological function is part of a mental function. This discussion applies directly to the Jaynes notion of entropy: its assignment depends on whether we include the observer system O in the analysis or not.
Regarding this integrated view of a higher-level observer, there are two choices. This is evident from the ongoing controversy over evolutionary approaches to technology [58,59]. One position posits that the design notion distorts the facts about the population-level evolution of designs, thus attributing too strong a role to individuals. The other follows early approaches in evolutionary epistemology and reconstructs the design process as an evolutionary process in the human brain [60,61,62]. In both cases, the role of independent mental states in determining technological functions would be reduced, if not nullified altogether.
These different possibilities are directly relevant for the interpretation of the Jaynes conception of entropy. If we interpret the observer in Figure 1 as a neuronal system that is a component in an evolutionary process, concepts such as entropy would not relate to mental states, but to physical processes that establish a causal connection between a physical system A and an observer system O. In this case, the notion of the entropy of an integrated system <A,O> becomes meaningful, along the lines of the solutions to the Maxwell’s demon paradox: on the most abstract level, the observer and the object system are just two physical systems which stand in a causal relation to each other. Hence, we end up with a further clarification of the final conclusion of Section 2.1: Gibbs/Jaynes entropy can be interpreted
Either as an epistemologically objective, but ontologically subjective notion, as long as we consider the standard use in physics, which relates object systems to the mental states of observers who conduct experiments,
Or as an epistemologically objective and also ontologically objective notion, if we consider the integrated system of observer and object system, and conceive of the former as a physical system, which implies that not only the experiment as such, but also the design of the experiment is a physical process.
Thus, by means of these distinctions we reach an important clarification and extension of the Jaynes proposition (Figure 3):
There are two different notions of entropy. Observer-relative entropyOR is ontologically subjective and can only be determined with reference to mental states of human observers, for example, the physicist conducting an experiment. Observer-independent entropyOI relates to coupled physical systems of observers and object systems.
Figure 3.
Observer relative and observer independent entropy.
Obviously, this approach raises the question of the next-higher-level observer. In Jaynes’ original approach, there is only entropyOR. EntropyOI would thus obtain the role of an unattainable Ding an sich, since any attempt to grasp it involves the establishment of an observer’s position in a potentially infinite sequence of nested observations. Subsequently, I wish to show that this distinction relates to the distinction between micro- and macrostates, and I return to the tricky issue of levels of observers in the final section. This follows a line of thinking developed by Salthe [18], who distinguishes between the internalist and the externalist perspective. Correspondingly, Salthe argues that the total information capacity, hence entropy in the Boltzmann and Shannon sense, of a system can be decomposed into the capacity of a supersystem and the subsystems. Then, if we consider the internal position of the observer within the system, the observer is a subsystem that also has a particular information capacity, which is part of the total information capacity. This observer cannot, in principle, take a position outside the supersystem in order to measure its total information capacity. The only way is to estimate the information capacity by the observation of another subsystem. In doing so, the observer adapts structurally, hence also changing its own information capacity. Thus, the total information capacity from the internalist perspective is the sum of the observer’s and the observed subsystem’s information capacities, which, however, does not imply that the total information capacity is directly measured by the observer inside the system. This argument states the distinction between entropyOR and entropyOI without referring to the Jaynes conception. An important consequence would be that the two entropies are not simply two different views of the same physical magnitude, but will always differ, as entropyOR does not include the observer, unless the latter establishes a self-referential relation.
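Salthe’s additive decomposition of information capacity can be illustrated with a toy Shannon-entropy computation. The distributions below are hypothetical, and statistical independence of observer and observed subsystem is assumed purely for simplicity:

```python
# Toy illustration of the decomposition of total information capacity.
# Assumption (for illustration only): observer O and observed subsystem A
# are statistically independent, so their capacities simply add.

from math import log2

def shannon_entropy(p):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(x * log2(x) for x in p if x > 0)

p_A = [0.5, 0.25, 0.25]   # hypothetical microstate distribution of system A
p_O = [0.5, 0.5]          # hypothetical microstate distribution of observer O

# Joint distribution of the supersystem <A, O> under independence.
p_joint = [a * o for a in p_A for o in p_O]

entropy_OR = shannon_entropy(p_A)      # observer-relative: excludes O itself
entropy_OI = shannon_entropy(p_joint)  # observer-independent: whole supersystem
```

Here entropy_OI exceeds entropy_OR by exactly the observer’s own capacity, which mirrors the point in the text: the internally measured entropy cannot include the observing subsystem, so the two magnitudes always differ.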
Let us now explore the consequences of introducing the notion of function into our analysis of the Jaynes approach to entropy, which is straightforward because observations are just special kinds of functions.
3.2. Evolving functions and maximum entropy production
Subsequently, I follow those strands in the literature which propose an evolutionary generalization of the concept of function. This states that all functions are the outcome of evolutionary processes, which, at the same time, determines the generic criterion for proper functioning, that is, the reproducibility of the function [63]. In a nutshell, all functions are evolved functions, and the continuing existence of a function lies in the fact that the function reproduces through time, independent of the specific mechanism of reproduction. Thus, a machine function is reproduced if there is an energy throughput and regular maintenance, and an ecological function is reproduced by stabilizing natural selection, given an energy throughput. This perspective is straightforward to accept for biological functions, it seems. As for technological functions, I simply follow the previously mentioned evolutionary approaches which eschew the centrality of the design notion. As for mental functions, I accept the different versions of neuronal network selectionism in the brain sciences. So, in principle, a unified evolutionary approach to functions is possible. In this approach, the previous version of the Jaynes proposition would have to be translated into:
There are two different notions of entropy. Observer-relative entropyOR is ontologically subjective and can only be determined with reference to a function. Observer-independent entropyOI relates to coupled physical systems of functions and object systems.
If two physical systems are causally connected in such a way that the proper functioning of system O is the result, we have a substantial modification of the standard conception of causality [46]. This is because the effect of system A on O is mediated by all other factors that determine the proper functioning of O. In both artefacts and living systems, the most general condition for proper functioning is energy throughput. If we remain on a purely ontological level, following Bunge’s [8] exposition, energy is an additive property of all things which measures their degree of changeability relative to a reference frame. Accordingly, a causal relation is a sequence of events through time in which energy is transferred from x to x’. In a slightly more specific way, all causal relations involve the transformation of free energy into bound energy and, following the Second Law, the production of entropy. Therefore, interactions involving physical systems with functions differ fundamentally from other forms of causality, as they manifest both a causal relation between system A and system O, hence an energetic transformation, and a simultaneous energy throughput in system O which is necessary for the function to be realized. We can also state that the energy throughput is necessary for the reproduction of the function. So, Figure 2a can be modified and extended into Figure 4.
It is important to notice that, if we regard the entire structure of embedded functions as the observer system O, the causal impact of A on O only happens on the level of the single function. For example, one such causal impact is the kinetic impact of a substrate on the active site of an enzyme. The enzyme reaction occurs in the heat bath of the cell, and so involves an energy throughput. The function of the fundamental enzyme reaction is embedded into a hierarchy of functions in the cell.
So far, the argument about functions seems to come close to the famous Schrödinger definition of life, because it is tempting to see the function as a biological function which, according to Schrödinger’s view, feeds on energy, produces negentropy as internal order, and exports entropy into the environment. However, although this is an inspiring idea, it does not grasp the essential point, which is the role of functions in the physical process. The notion of function also easily accommodates the alternative argument related to non-equilibrium systems, in which energy dissipation results in the formation of structure; this corresponds to a Second Law trajectory and thus seems to contradict the Schrödinger definition. The notion of function includes both possibilities, because it does not carry any implications about how the underlying physical patterns emerged.
In order to further clarify the physical role of functions, I propose to explore an evolutionary approach to the Jaynes conception of entropy. That is, I present a naturalization of the concept of inference which follows the lines of evolutionary epistemology [65,66]. In this kind of approach, the experimenter in Jaynes’ original setting is seen as an evolving system O which produces inferences about the behavior of system A; these translate into further causal interactions between systems A and O, which in turn impact the reproducibility of system O. This process takes place against the background of a population of observer systems O1,…,n, the reproduction of which is maintained by energy throughputs that are constrained such that selection among observer systems takes place, depending on how far the observations approximate the true physical functioning of system A. In other words, I assume that observations have a fitness value.
Figure 4.
Function and causality.
Along the lines of the Jaynes approach, the degrees of freedom of the system A can only be specified in relation to the function of the system O. A function can be fulfilled or can fail. In case of failure, system O collapses or loses out in competition with the other systems. Therefore, the question is whether the degrees of freedom that are specified in relation to a function result in a pattern of causal interaction between systems A and O in which proper functioning of O is possible, given the selective pressures. A function is a mechanism that relates an observation to a process, in the sense that the function defines certain states of the object system which relate to states of the system with the function. These states can be grained to vastly differing degrees, depending on the functioning and on the energetic costs of the observation. So, in any kind of function a distinction between micro- and macrostates is involved, with the macrostates being those directly relevant for the function: this results in functional selectivity.
From this it follows that the entropy measure implied by the Gibbs/Jaynes approach is not arbitrary if the two systems interact with a function involved. The function implies a certain set of constraints on the macrostates of system A, and it is the possible tension between functional relevance and irrelevance that determines the evolution of constraints, in a similar way as the experimenter produces physical models in order to explain her observations and continuously improves the estimates. If we envisage a selection of functions, the crucial question is whether unobserved microstates are relevant for the functioning, such that the causal interaction results in less than proper functioning of system O. If the functioning does not distinguish between microstates, even though they affect the functioning causally, the system would have to change the number and kind of degrees of freedom in the macrostate which are implied by the function. In the evolutionary setting, this does not happen because of a new design created by an experimenter, but by variation and selective retention in the population of observer systems.
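The point that the entropy measure depends on the functionally defined macrostates can be illustrated with a toy coarse-graining; the microstate set and the two partitions are hypothetical examples:

```python
# Toy illustration: the same microstate distribution yields different
# macrostate entropies under different functionally defined partitions.
# Microstates, probabilities, and partitions are hypothetical.

from math import log2
from collections import defaultdict

def shannon_entropy(p):
    return -sum(x * log2(x) for x in p if x > 0)

# Eight equiprobable microstates of the object system A.
micro = {s: 1 / 8 for s in range(8)}

def macro_entropy(partition):
    """Entropy of the macrostates induced by a coarse-graining map."""
    macro = defaultdict(float)
    for state, p in micro.items():
        macro[partition(state)] += p
    return shannon_entropy(macro.values())

# A finer-grained function distinguishes four macrostates, a coarser
# one only two: the implied entropy measure differs accordingly.
fine = macro_entropy(lambda s: s % 4)     # four equiprobable macrostates
coarse = macro_entropy(lambda s: s % 2)   # two equiprobable macrostates
```

In the vocabulary of the text, the choice of partition corresponds to the functional selectivity of the observer system: which microstate differences are lumped together is fixed by what matters for proper functioning, not by the microphysics alone.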
So we end up with a further modification of the basic structure of a function in a physical context (Figure 5). Physically, the causal impact of system A on O happens between the microstates of A and X, such as the kinetic or quantum-level interactions between substrate molecules and enzymes. The relation of observation supervenes on the microlevels in the sense that the molecular shape of the active site of the enzyme (which is an emergent property of the quantum level [67,68]) fits the molecular shape of the substrate. The relation between the two levels becomes physically explicit in the induced-fit process, in which the two sides adapt to each other. As for the function, a similar consideration holds. The causation (b) is between the microstates of X and Z, but the relation (a) operates on the macro-level, which is implied by the proper function in relation to Z’, i.e., the embedding system. For proper functioning, certain variations on the micro-level of X do not matter, even though they operate in the causality between X and Z. For example, the products of the enzymatic reaction may manifest slight variations resulting from the chemical process (e.g., the product may deviate from a pure substance), but for the functioning only the pure-substance features are relevant. So, in this example it is possible to reconstruct the notion of ‘molecular structure’ as the set of constraints, so that it is the very fact of chemical reactivity which establishes the differentiation between the macro- and the microlevel (i.e., molecular structure and quantum-level processes): the notion of a chemical substance refers to patterns of reactivity, which obtain the status of proper functioning, e.g., when conducting a test [69,70].
Figure 5.
Functional selectivity and the Micro/Macro distinction.
In this evolutionary setting, the central question is whether we can interpret the Maximum Entropy correction process in the Jaynes model of inference as an evolutionary optimization. I posit that a function operates properly if the macrostates that underlie its functioning correspond to a maximum entropyOR state of the microstates. This is just an evolutionary extension of the Jaynes approach to maximum entropy in statistics [46]: if the least-biased assignment of probabilities does not apply, this is an indication that the constraints have not been properly defined and hence need readjustment. So, one can view the series of statistical tests in which degrees of freedom are changed by the tester, according to the criterion of whether MaxEnt applies, as steps in an evolutionary sequence of selections. Hence, the MaxEnt principle is also a principle of learning. In other words, if the initial observation results in observations which fail to confirm the MaxEnt distribution, this is a cause for a realignment of the constraints on the macrostates. Following Dewar [71], we can apply this to the relation between physical theories and the MaxEnt principle (Figure 6): the physical theory posits a set of macroscopic constraints on an object system, and the MaxEnt principle serves to generate predictions about the macro-behavior of the system, such that a failure of those predictions implies the need to change the physical theory. From the viewpoint of evolutionary epistemology, and therefore treating the physical theory as a function in the sense of a physical state of the observer system, we can also argue that in a process of natural selection, the MaxEnt principle will underlie the convergence of functional macrostates with macrostates of object systems. Proper functioning requires that all other possible causal interactions between the object system and the function are irrelevant to the fulfilment of the function, such that any information about a deviation from equal probability of all states is also irrelevant. For this conceptual transfer of the MaxEnt principle from theory evolution to physical evolution, it is sufficient to refer to a general principle of computational efficiency, in the sense that the MaxEnt principle allows for a minimization of the effort (time, energy, resources) invested in the processing of epistemically or functionally relevant information.
Figure 6.
MaxEnt as inference.
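The inference step just described (positing constraints, deriving the least-biased distribution, checking predictions) can be made concrete with Jaynes’ well-known dice example. The sketch below solves the MaxEnt problem for a die whose mean face value is constrained to 4.5; the least-biased distribution then has the exponential form p_i proportional to exp(-lambda*i), and the Lagrange multiplier is found by plain bisection. The bracket and tolerance are implementation choices:

```python
# MaxEnt inference for a loaded die: given only the constraint that the
# mean face value is 4.5, the least-biased distribution over faces 1..6
# is p_i ~ exp(-lam * i). We solve for the Lagrange multiplier lam.

from math import exp

FACES = range(1, 7)

def maxent_die(target_mean, lo=-10.0, hi=10.0, tol=1e-12):
    """Bisection on the Lagrange multiplier; returns p_1..p_6."""
    def mean(lam):
        w = [exp(-lam * i) for i in FACES]
        z = sum(w)
        return sum(i * wi for i, wi in zip(FACES, w)) / z
    # mean(lam) is strictly decreasing in lam, so bisection applies.
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mean(mid) > target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [exp(-lam * i) for i in FACES]
    z = sum(w)
    return [wi / z for wi in w]

p = maxent_die(4.5)
```

With a constrained mean of 4.5, probability mass shifts monotonically toward the high faces; with a mean of exactly 3.5, the procedure recovers the uniform distribution. A systematic mismatch between such predictions and observed frequencies is precisely the signal, in the text’s terms, that the constraints need readjustment.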
I summarize:
The Jaynes approach to entropy as an inference procedure corresponds to an evolutionary sequence of adaptations of macro level constraints in the observing system O to the macro level constraints in the observed system A, such that all micro states of the observed system are assigned to the maximum entropyOR state. This process is driven by natural selection, such that deviations from the state of equiprobability trigger further adaptations of the observer system, if they prove to be causally relevant for its proper functioning. Hence, evolutionary change follows the MaxEnt principle.
This argument links the reproducibility of the function with the reproducibility of the macrostates of system A, in the sense that predictions which correspond to the MaxEnt principle will be confirmed in the course of time. In other words, if the MaxEnt principle is the best way to predict macrostates which relate to a large number of causally productive microstates, or, as in Dewar’s [72] reformulation, to microscopic paths that result in the same particular macrostate, then the principle of natural selection of functions implies that they will approximate the MaxEnt principle in a sequence of observations across a population of functions, i.e., observer systems.
By this argument we have established that evolving functions, as observations, will maximize entropyOR in the process of adapting their structure according to the macrostates of the system A with which they causally interact. This is a statement about the informational content of the evolving structures, in the classical Shannon sense. The question is whether this purely informational view also corresponds to a physical reality, that is, whether the implied MaxEnt principle also corresponds to maximum entropy production. In fact, this correspondence between the MaxEnt principle and the MEP hypothesis has recently been stated for many non-equilibrium systems, as an alternative to the older dissipative-systems hypothesis [73]. Interestingly, this approach also includes basic feedback mechanisms which correspond to the general notion of a function that I have outlined previously: a MEP steady state can be described as a trade-off between a thermodynamic force and a thermodynamic flux, in which changes of the flux cause changes in the force, which generate subsequent changes of the flux that lead back to the steady state [74]. Beyond the specific thermodynamic assumptions, the correspondence thesis can be supported by a purely ontological argument on the structure of the evolutionary system that I have just described. If the MaxEnt principle correctly grasps the evolutionary dynamics of evolving functions, then it must also present an approximately true picture of the underlying physical reality of the system A. That means, if the notion of observation is naturalized, then the epistemic aspect of the maximum entropy principle corresponds to the ontological aspect of maximum entropy production, in the sense that with properly identified macro-constraints, the observed system will also be in a state of maximum entropy production (this may correspond to Virgo’s [75] proposal that the MEP principle relates to the problem of how to define the relevant system boundary). This argument stands alongside the thermodynamic analysis of non-equilibrium systems, which shows that such systems will maximize entropy, given a local variable with a local balance of flux and local sources [72,76].
Clearly, this assertion is in fact a statement about the relation between entropyOR and entropyOI, as maximum entropy production is a process that is observer independent. Yet, at the same time, the determination of this entropy can only happen in a relation between observer and observed system. So, the evolutionary correspondence between the MaxEnt principle and maximum entropy production provides the foundation for the assumption that the two entropies converge.
This second step in the evolutionary interpretation of Jaynes’ notion of entropy results in the hypothesis:
In physical systems with evolving functions, the structural features of the functions which establish an observational relation between them and object systems will result in a causal interaction between functions and systems in which those structures correspond to a set of constraints on the macro-level of the object systems which maximize entropyOI in the object system. This entails a correspondence between the MaxEnt principle and maximum entropy production.
We can now develop this insight further by modifying the assumption about system A: we consider the case that system A is a function, too. In other words, we deal with the case of mutual observation. In this case, the structural features of function O become the constraints that are the object of the observation by system A’. Again, if we add the evolutionary context, hence envisaging a population of systems A’1,…,n, we can reproduce the entire argument presented above. Evidently, this interlocking of evolving functions implies a problem, analogous to the determination of equilibria in games where agents have to adapt their mutual expectations. If the observer system changes its structure because of an adaptation to the existing structure of system A’, system A’ will adapt its structure to this new structure, driven by the ongoing process of selection. A mutual fit of structures corresponds to the notion of Nash equilibrium in games, in the sense that both systems will keep a certain structure in which the mutual predictions of their macrostates will be correct, that is, self-reinforcing through time. In game theory, this is a state of pure mental coordination. In the naturalistic interpretation of the observation relation, we can say that this is a structural equilibrium that, according to the MaxEnt process, results in maximum entropy production of both systems, hence also of the coupled system. So we can posit another hypothesis:
In physical systems with coupled evolving functions, the functions will mutually adapt their structures such that the implied constraints of the respective object system maximize entropyOI production; from this it follows that the integrated system also maximizes entropyOI production. Maximum entropy production defines the point of optimal mutual adaptation, hence an equilibrium. In this state, entropyOR and entropyOI converge, because the mutual adaptation of constraints removes the contingency involved in observer-relative entropy.
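The mutual adaptation invoked in this hypothesis can be caricatured as an iterated best-response process of the kind used to compute equilibria in games. The linear response rules and their coefficients below are hypothetical; they merely guarantee a contraction, so that the two structures converge to a mutual fixed point:

```python
# Caricature of two coupled observer systems adapting their structures
# to each other by iterated best response. The linear rules and their
# coefficients are hypothetical illustration choices; because the slopes
# are below one, each round is a contraction and the pair converges to
# a fixed point, the analogue of the Nash equilibrium in the text.

def best_response_iteration(steps=200):
    x, y = 0.0, 10.0              # initial structures of O and A'
    for _ in range(steps):
        x = 1.0 + 0.5 * y         # O adapts to the current structure of A'
        y = 2.0 + 0.5 * x         # A' adapts to the new structure of O
    return x, y

x, y = best_response_iteration()
# Fixed point of x = 1 + y/2 and y = 2 + x/2: x = 8/3, y = 10/3.
```

At the fixed point, each structure is the best response to the other, so the mutual predictions are self-reinforcing through time, which is the structural equilibrium described above.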
This hypothesis is very significant, as it shows that the Schrödinger approach needs to be corrected and amended, for very fundamental ontological reasons. Systems with functions evolve structural constraints precisely by maximizing entropy production. For this, we do not need the more specialized approach of dissipative systems, because the ontological argument holds for both equilibrium and non-equilibrium states. Reconsidering the fundamental point, it boils down to this: in the Schrödinger view, it is the structure of the functions that is regarded as a state of lower entropy. In my view, it is precisely the evolution of that structure which results in maximum entropy production. The emphasis is on ‘maximum’ here, because entropy production as such happens under any circumstances involving energy flows. Therefore, in the most general terms, and contrary to Schrödinger, the evolution of biological functions corresponds to the Second Law, and by no means contradicts it. However, there are also reasons to assume that the MaxEnt state is also the state of minimum entropy of the system that produces entropy [74,77]. To this we can add the further insight that in evolutionarily coupled functions the contingency of the relation between entropyOR and entropyOI is reduced, as a result of the mutual coupling. Again, this statement leaves open the question of hierarchical embeddedness in higher-level relations of observation. Interestingly, this observation corresponds to a speculation by Deutsch [78], who argued that only evolution by natural selection can reach states in which certain structures are stable across different parallel universes, seen according to the Everett interpretation of quantum theory. I will come back to this idea later.
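The mutual-adaptation dynamic described above can be made concrete with a minimal numerical sketch. Everything in it is invented for illustration: the one-dimensional ‘structure’, the best-response map, and the adaptation rate are stand-ins, not part of the formal argument. The point is only that iterated mutual adjustment settles into a fixed point at which each structure is a best response to the other, the analogue of a Nash equilibrium:

```python
# Minimal sketch of mutual structural adaptation (hypothetical toy model):
# each system's "structure" is one number; each adaptation step moves it
# toward the best response to the other system's current structure.

def best_response(other, target=0.5):
    # Hypothetical best-response map: track the other's structure,
    # pulled toward a fixed environmental target.
    return 0.5 * (other + target)

def coadapt(x, y, rate=0.5, steps=200):
    for _ in range(steps):
        x += rate * (best_response(y) - x)  # O adapts to A'
        y += rate * (best_response(x) - y)  # A' adapts to O
    return x, y

x, y = coadapt(0.0, 1.0)
# At the fixed point, each structure is a best response to the other:
assert abs(x - best_response(y)) < 1e-6
assert abs(y - best_response(x)) < 1e-6
```

The fixed point is self-reinforcing in the sense of the text: once reached, neither system faces any further selective pressure to change, because each structure already predicts the other correctly.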
3.3. Endogenous entropy and evolution
Before I continue developing the more general philosophical argument, I present the most significant example of coupled functions, which is the emergence of life, understood as a fundamental chemical process. One of the pertinent theories that corresponds directly to the general framework developed in this paper is the ETIF theory (Emergence of Templated Information and Functionality) [79]. Its basic units are TSD (template and sequence directed) reactions. The central building block of this theory is a pair of coupled reaction cycles, both occurring in a heat bath with the necessary chemical components (Figure 7). One cycle is the translation cycle, in which peptide synthesis takes place from amino acids, proto-tRNA, and proto-mRNA; the other is the replication cycle, in which RNA is produced via template RNA. The connection between the cycles is constituted by the catalytic effect that peptides resulting from the translation cycle exert on the replication cycle. In this system, we have one autocatalytic cycle, the translation cycle, which therefore contains the most elementary type of function. At the same time, there is a more complex feedback loop via the catalysis of RNA replication. This elementary feedback loop can evolve into more complex structures and manifests increasing selectivity of the chemical reactions, resulting in an increasing capacity to distinguish between different chemical species.
Figure 7. Emergence of templated information and functionality, after Lahav et al. [55].
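The coupling of the two cycles can be caricatured in a toy numerical model (the two-variable reduction and all rate constants are my own illustrative assumptions, not part of the ETIF theory): peptide synthesis is driven by the RNA pool, RNA replication is catalyzed by the peptide pool, and a logistic factor stands in for the finite substrate of the heat bath.

```python
# Toy two-variable caricature of the coupled cycles (illustrative only).
# P = peptide pool (translation cycle output), R = RNA pool (replication
# cycle output). RNA templates peptide synthesis; peptides catalyze RNA
# replication. Simple Euler integration with decay and substrate limits.

def step(P, R, dt=0.05, a=0.1, dP=0.1, b=0.5, dR=0.05, K=5.0):
    dPdt = a * R - dP * P                      # translation driven by RNA
    dRdt = b * P * R * (1 - R / K) - dR * R    # peptide-catalyzed replication
    return P + dt * dPdt, R + dt * dRdt

P, R = 0.2, 0.2
for _ in range(4000):
    P, R = step(P, R)

# The mutual catalysis amplifies both pools far beyond their start values:
assert P > 1.0 and R > 1.0
```

The design choice to include a substrate limit reflects the text: the cycles run in a heat bath with finite chemical components, so the coupled feedback amplifies both pools up to a bounded stationary state rather than without limit.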
From the viewpoint of the argument presented so far, the increasing selectivity of the evolving networks of chemical reactions corresponds to the Second Law, but simultaneously reduces the contingency of entropyOR on the level of the single functions, in the sense of the contingency of the constraints operating in the MaxEnt process. This reduction of contingency is what we perceive as increasing “order,” but it does not imply that the Second Law fails to hold.
There are two aspects of origin-of-life theories that need further scrutiny in our context. The ETIF model is one specific version of the general chemoton model, which works without enzymes [54]. In this model, an autocatalytic reaction cycle corresponds to the metabolic subsystem of an elementary living system, which interacts with a cycle of template replication. This cycle produces a byproduct that reacts with another byproduct of the metabolic subsystem to form a membranogenic molecule. The latter process creates the possibility of a physical separation of different chemotons, which is the precondition for the evolutionary selection of different functions. Clearly, the chemoton is a system of coupled functions in the sense of Section 3.2.
Now, compartmentation is crucial for two reasons (as an important aside, this provides the ultimate explanation of the central role of the notion of membranes in biosemiotics [80]). The first is that compartmentation happens as a result of the Second Law, not against it. In this, compartmentation is similar to the lumping together of matter that results from the entropic transformation of gravitational energy. Similarly, the formation of fluid vesicles, especially at particular sites such as rock surfaces, supports compartmentation simply because such configurations represent states of higher entropy. However, compartmentation triggers a fundamental change in the evolutionary process, because it causes the emergence of group selection [81,82]. Group selection offers the evolutionary explanation of the nesting of functions. It is important to notice that group selection is a necessary statistical outcome in systems with elementary heredity, mutation, and replication.
The reason for this is easy to grasp when one considers that, in a selective environment, the production of a byproduct (or the catalytic use of the original product) that catalyzes another reaction is costly in terms of the reproduction of the base reaction [54]. In this sense, cross-catalysis cannot be an evolutionarily stable equilibrium, because the mass of molecules that do not cross-catalyze will grow faster than the former. On the other hand, a cycle that receives cross-catalysis will always grow faster than other cycles. The only way this can be stabilized through time is by compartmentation. That is, if cycles with different degrees of ‘molecular altruism’ separate, the cycle with higher catalytic potential can reproduce better than other cycles. Absent compartmentation, this ‘altruistic’ molecule would be driven out by parasites. ‘Altruism’ in this context just designates the emergence of a new function, matching our elementary model of function. That is, once a chemical web including autocatalytic and replicative cycles manifests compartmentation, there is a possibility that a Z’ can emerge, i.e., a higher-level function to which the lower-level function is directed.
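The logic of this argument can be sketched in a deterministic toy model (the fitness numbers are made up for illustration; ‘altruists’ pay a cost to confer a shared replication benefit): in a single well-mixed pool, parasites always out-replicate altruists, whereas compartmentation lets the altruist compartment capture its own benefit and outgrow the parasitic one.

```python
# Toy model of 'molecular altruism' (hypothetical fitness parameters).
# Altruists (A) pay cost C to confer a replication benefit B shared
# within their pool; parasites (P) free-ride on the same benefit.

B, C = 0.5, 0.1  # group benefit and individual cost, with B > C

def mixed_step(p):
    """One generation in a single well-mixed pool at altruist frequency p."""
    wA = 1 + B * p - C   # altruists pay the cost
    wP = 1 + B * p       # parasites enjoy the benefit for free
    return p * wA / (p * wA + (1 - p) * wP)

def compartment_step(nA, nP):
    """Two separated compartments, one all-altruist and one all-parasite.
    Each grows with its own mean fitness; separation lets the altruist
    compartment keep its own benefit."""
    return nA * (1 + B - C), nP * 1.0

# Without compartments, parasites drive altruists out:
p = 0.5
for _ in range(500):
    p = mixed_step(p)
assert p < 0.01

# With compartments, the altruist share of the whole system grows:
nA, nP = 1.0, 1.0
for _ in range(50):
    nA, nP = compartment_step(nA, nP)
assert nA / (nA + nP) > 0.99
```

The contrast between the two regimes is the whole point: within any mixed pool selection works against the costly function, so only physical separation of pools makes the higher-level function evolutionarily stable.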
This argument further supports the general view that evolution does not contradict the Second Law, because the emergence of more complex functions is a necessary result of the underlying physical processes, in this case even of a very simple mechanical fact, namely compartmentation. However, this conclusion depends on how we conceive of the process of selection. In the scenario discussed so far, the central feature is that even on the primordial level of molecular evolution, chemical reactions compete for the acquisition of substrate and the utilization of thermodynamic potentials.
The second aspect of life is the throughput of energy, which relates to the most universal selective constraint on the fundamental chemical processes. Evolution is a physical process in the sense that the ultimate currency of selection is the relative capacity for processing energy, relative to a reference frame that is endogenous in the sense of being co-determined by the existence of other living systems competing for the same energy sources.
This observation allows us to relate the argument so far to Lotka’s principle of maximum energy flows in evolution, which is, of course, a principle corresponding to the principle of Maximum Entropy Production [74,83]. Lotka’s principle has been invoked in many contexts that fit my general argument, at different levels of empirical specification, ranging from Lotka’s most general argument to biological systems and to human technological systems [84,85,86,87,88]. This broad scope shows that it is related to the general notion of function as developed in this paper. Therefore, we can state another formulation of Lotka’s principle:
Systems with coupled evolving functions will evolve along a trajectory in which states with increasing energetic throughput are achieved in temporal sequence. These states correspond to states of MEP, in which entropyOI and entropyOR converge relative to the set of coupled functions.
There are different specific mechanisms that produce this result. The most general one is the principle of maximum power, in the sense that living systems will evolve into states which manifest a growing capacity to do useful work, such as larger size, greater reach, higher speed, etc. [87]. There are more specific mechanisms, such as the Red Queen principle, which applies to interlocking functions of the predator-prey type [89]. The Red Queen principle triggers the maximization of power as the result of an arms race. As we shall see in the next section, the most important mechanism is signal selection [90]. Signal selection builds on the so-called handicap principle [91,92]: natural selection favours the mutual coordination of living systems via signals that are costly, because only costly signals transmit truthful information. Signal selection therefore drives the evolution of seemingly non-functional diversity, such as colours or additions to body structure.
Lotka’s principle is a description of the general trajectory taken by systems with evolving functions. However, the relation to entropy is not a simple one. This is immediately obvious if we go back to the Jaynes approach, where the measure of entropy which links the micro- and the macrolevel is specific to the particular observation, hence function. This means that in a sequence of coupled systems O1↔A’1, …, On↔A’n, the corresponding measures of entropy S1,…,Sn are incommensurable, because they define different entropiesOR. So we have two different propositions that we need to reconcile. One is that in an evolutionary system, with reference to a given state space, hence a given set of functions and their structural characteristics, the mutual adaptation of functions results in the maximization of entropyOR. At the same time, the entire sequence of evolving functions follows Lotka’s principle, and hence maximizes entropy with reference to the environment, which is entropyOI. Yet the state-specific entropies and the general entropy are not commensurable, even though on a foundational level the Second Law holds under all circumstances and refers to maximum entropyOI production.
This problem can only be solved if we consider the role of functions again. Coming back to our original discussion of the notion of information, we can say that the MaxEnt process with reference to given functions results in a state-specific information capacity in the original Shannon sense. This information capacity does not change endogenously, but through the emergence of another function whose structural features identify a bias in the distribution of microstates that was not identified by the previous function. In other words, the MaxEnt principle, viewed from the evolutionary perspective, ultimately only applies when the functional structure fully corresponds to the constraints that actually hold in the object system. In a system with coupled functions this is not a given; rather, every change of a function opens up new evolutionary possibilities for another, observing function. That means the evolution of functions implies a continuous increase in the information capacity of the encompassing system of all evolving functions. The central point is that the information carried by a function is not ‘stored’ in any meaningful sense in that function, but relates to another function that is causally connected with it. So a function evolves following the MaxEnt principle, and at the same time becomes information relative to another function.
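The claim that a newly emerging function ‘identifies a bias’ and thereby creates information capacity can be made concrete with a small Shannon-style calculation (the four-state system and the particular mean constraint are invented for illustration): relative to a function whose constraints are exhausted by normalization, the MaxEnt distribution is uniform; a function that registers an additional constraint assigns a lower-entropy MaxEnt distribution, and the entropy difference is the information capacity created by that constraint.

```python
from math import exp, log

def entropy(p):
    """Shannon entropy in bits."""
    return -sum(q * log(q, 2) for q in p if q > 0)

def maxent_with_mean(states, mean, lo=-50.0, hi=50.0):
    """MaxEnt distribution over `states` subject to a fixed mean.
    The solution has the Gibbs form p_i proportional to exp(-lam * x_i);
    the multiplier lam is found by bisection (mean decreases in lam)."""
    def dist(lam):
        w = [exp(-lam * x) for x in states]
        Z = sum(w)
        return [wi / Z for wi in w]
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        m = sum(p * x for p, x in zip(dist(mid), states))
        if m > mean:
            lo = mid   # mean too high: increase lam
        else:
            hi = mid
    return dist(0.5 * (lo + hi))

states = [0, 1, 2, 3]
p_unconstrained = [0.25] * 4               # MaxEnt, no constraint: uniform
p_biased = maxent_with_mean(states, 1.0)   # a function registers a bias

gain = entropy(p_unconstrained) - entropy(p_biased)
assert abs(entropy(p_unconstrained) - 2.0) < 1e-9  # 4 equiprobable states
assert gain > 0  # the additional constraint lowers entropy: information
```

The entropy drop (`gain`) is exactly the sense in which a coupled function that detects a bias turns the other function’s structure into information: the capacity is not stored in the observed distribution itself, but appears only relative to the constraint the observing function registers.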
In other words, it is wrong to state that a function carries information with reference to the state space that reflects its specific constraints. It can only carry information relative to other functions that are coupled with it. The best illustration of this distinction is the fact that genomes carry a lot of ‘junk’ DNA. Interestingly, there is not even a clear relation between genome size and the level of complexity of the phenotype. This observation clearly shows that there is no direct correspondence between the role of the genome in causing development and its role in carrying information. The information resides in the system into which the genome, as a set of functions, is embedded. For this system, only parts of the genome carry information, here understood as having a higher-level function. It follows that the increasing divergence between functional and non-functional DNA in genomes just reflects the MaxEnt process over time: all functions evolve into a state in which there is a maximum amount of diversity that is non-functional. In plain terms, this is why simple organisms can have larger genomes than humans: everything depends on whether the system can make use of the genome, which otherwise just accumulates diversity.
Figure 8. Evolution of entropies.
This particular relation between functional evolution and information capacity has been stated by Brooks and Wiley [93] (building on earlier work by Layzer [94]) and has been further developed by Salthe [18], whose distinction between externalist and internalist notions of entropy corresponds to my distinction between entropyOI and entropyOR, as we have already seen. So we can adapt a diagram that has been recurrently used in the literature (Figure 8). The important insight is that the evolution of functions implies a series of MaxEnt processes with reference to entropyOR. EntropyOI increases simultaneously, because the entire process is driven by energy throughputs and follows Lotka’s principle. This means that an increasing amount of energy is dissipated into bound energy, so that entropyOI in the general thermodynamic sense increases.
The important additional insight we can gain from this graphical exposition is that we recognize the role of functions explicitly. Instead of envisaging the role of an observer who measures the entropy of a system and processes information, we see a sequence of evolving functions, coupled and nested in an increasingly complex way, such that the entire embedding system evolves into states with an increasing number and scope of constraints [74,76]. There is no way to step out of this process and adopt the perspective of an entirely exogenous observer (in effect, Maxwell’s demon). Instead, we realize that evolving functions imply an endogenous state space, and therefore create new information capacity that materializes when new coupled functions evolve. The entire process operates according to the Jaynes MaxEnt principle, which, in the evolutionary context, also implies that the MEP hypothesis holds. The MEP hypothesis refers to the evolution of the outer curve in the diagram, hence entropyOI.
We have now sketched the theoretical framework necessary to discuss the relation between entropy and semiosis in a fresh way. It is already obvious that Shannon information plays only a marginal, because merely descriptive, role here, and only in certain contexts. Instead, we concentrate on the thermodynamic notion of entropy and the Gibbs/Jaynes view of the relation between macro- and microlevel. We only need to take one simple step: semiosis is a function.