This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution license.

Evolutionary information theory is a constructive approach that studies information in the context of evolutionary processes, which are ubiquitous in nature and society. In this paper, we develop foundations of evolutionary information theory, building several measures of evolutionary information and obtaining their properties. These measures are based on mathematical models of evolutionary computations, machines and automata. To measure evolutionary information in an invariant form, we construct and study universal evolutionary machines and automata, which form the base for evolutionary information theory. The first class of measures introduced and studied in this paper is the evolutionary information size of symbolic objects relative to classes of automata or machines. In particular, it is proved that there is an invariant and optimal evolutionary information size relative to different classes of evolutionary machines. As a rule, different classes of algorithms or automata determine different information sizes for the same object. More powerful classes of algorithms or automata decrease the information size of an object in comparison with its information size relative to weaker classes of algorithms or machines. The second class of measures for evolutionary information in symbolic objects is studied by introducing the quantity of evolutionary information about symbolic objects relative to a class of automata or machines. To give an example of applications, we briefly describe a possibility of modeling physical evolution with evolutionary machines, demonstrating the applicability of evolutionary information theory to all material processes. At the end of the paper, directions for future research are suggested.

Evolutionary information theory studies information in the context of evolutionary processes. All operations with information, namely information acquisition, transmission and processing, are treated from the evolutionary perspective. There are many evolutionary information processes in nature and society. The concept of evolution plays important roles in physics, biology, sociology and other scientific disciplines. In general, evolution is one of the indispensable processes of life, as well as of many other natural processes. For instance, human cognition in general, and scientific cognition in particular, is an evolutionary information process. Indeed, over the centuries, science has been obtaining ever more exact and comprehensive knowledge about nature. Although many important discoveries and scientific accomplishments were considered scientific revolutions, the whole development of science was essentially evolutionary.

After biologists found basic regularities of biological evolution, computer scientists began simulating evolutionary processes and utilizing operations found in nature for solving problems with computers. In such a way, they brought forth evolutionary computation, inventing different kinds of strategies and procedures, such as genetic algorithms or genetic programming, which imitated natural biological processes. The development of evolutionary computation has essentially influenced computational science and information processing technology. As Garzon writes [

Here we consider evolution controlled by evolutionary algorithms and modeled by evolutionary automata and machines studied in [

Algorithmic information theory is based on the concept of Kolmogorov, or algorithmic, complexity, which provides means to measure the intrinsic information related to objects via their algorithmic description length. In turn, algorithmic or Kolmogorov complexity is based on appropriate classes of Turing machines [

Evolutionary information theory stems from the assumption that evolution is performed by some means, which are modeled by abstract automata, algorithms or machines, which work in the domain of strings and perform evolutionary computations. The input string is a carrier of information about the output string, which represents the result of evolution. Based on these considerations, it is natural to define the information size of the output string

The algorithmic approach explicates an important property of information, connecting information to means used for accessing and utilizing information. Information is considered not as some inherent property of different objects but is related to algorithms that use, extract or produce this information. In this context, a system (person) with more powerful algorithms for information extraction and management can get more information from the same carrier and use this information in a better way than a system that has weaker algorithms and more limited abilities. This correlates with the conventional understanding of information.

Evolutionary information reflects aspects and properties of information related to evolutionary processes. Many information processes, such as software development or computer information processing, have an evolutionary nature. The evolutionary approach explicates important properties of information, connecting it to natural computations and biological systems. At the same time, in the context of pancomputationalism or digital physics (cf., for example, [

Developing the main ideas of algorithmic information theory in the direction of evolutionary processes, here we introduce and study two kinds of evolutionary information: evolutionary information necessary to develop a constructive object by a given system of evolutionary algorithms (evolutionary automata) and evolutionary information in an object, e.g., in a text that allows making simpler development of another object by a given system of evolutionary algorithms (evolutionary automata). Respectively, we have two basic evolutionary information measures: the

This paper is organized as follows. In

Informally, the evolutionary information size of an object

In

In

In Conclusion, open problems for further research are suggested.

The author is grateful to the anonymous reviewers for their useful comments.

Evolutionary computations are artificial intelligence processes based on natural selection and evolution. Evolutionary computations are directed by evolutionary algorithms and performed by evolutionary machines. In technical terms, an evolutionary algorithm is a probabilistic search algorithm directed by the chosen fitness function. To formalize this concept in mathematically rigorous terms, a formal algorithmic model of evolutionary computation—an
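The fitness-directed probabilistic search described above can be illustrated with a minimal genetic-algorithm sketch on the toy OneMax problem (maximize the number of 1s in a bit string). All names and parameter values below are illustrative choices for the sketch, not taken from the paper.

```python
import random

def one_max(bits):
    """Fitness function: the number of 1s in the bit string."""
    return sum(bits)

def evolve(length=20, pop_size=30, generations=60, mutation_rate=0.05, seed=0):
    """Minimal genetic algorithm: selection, crossover and mutation
    directed by the chosen fitness function."""
    rng = random.Random(seed)
    population = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        # Selection: keep the fitter half of the population (elitism).
        population.sort(key=one_max, reverse=True)
        parents = population[: pop_size // 2]
        # Crossover and mutation produce the next generation.
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, length)
            child = a[:cut] + b[cut:]
            child = [bit ^ 1 if rng.random() < mutation_rate else bit for bit in child]
            children.append(child)
        population = parents + children
    return max(population, key=one_max)

best = evolve()
```

Because the fitter half of each generation is preserved, the best fitness in the population never decreases, which is the simplest form of the monotone improvement that a fitness function is meant to induce.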

Let

the goal of the

the automaton

the first population/generation

for all

Each automaton

Note that

We denote the class of all general evolutionary

The desirable search condition is the optimum of the fitness performance measure

Let us consider some examples of evolutionary

We denote the class of all general evolutionary finite automata by

It is possible to take as

Turing machines

Simple inductive Turing machines are the abstract automata (models of algorithms) closest to Turing machines. The difference between them is that a Turing machine always gives the final result after a finite number of steps and then stops, or at least informs when the result is obtained. Inductive Turing machines also give the final result after a finite number of steps, but in contrast to Turing machines, they do not always stop the process of computation or inform when the final result is obtained. In some cases they do this, while in other cases they continue their computation even though the final result has already been produced. Namely, when the content of the output tape of a simple inductive Turing machine stops changing forever, that content is the final result.
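This stabilization criterion can be illustrated with a toy simulation (illustrative code, not the paper's formalism): the "machine" below rewrites its output tape at every step while approximating floor(sqrt(n)); once the tape stops changing, its stable content is the final result, although the machine never halts or signals completion.

```python
def inductive_isqrt_trace(n, steps):
    """Simulate `steps` steps of a machine computing floor(sqrt(n))
    inductively: at each step it writes its current guess to the
    output tape and never halts or announces a result."""
    tape = 0          # content of the output tape
    trace = []
    for _ in range(steps):
        # Refine the guess while it is still too small.
        if (tape + 1) ** 2 <= n:
            tape += 1
        trace.append(tape)
    return trace

trace = inductive_isqrt_trace(10, 20)
# After finitely many steps the tape stops changing; its stable
# content, 3 = floor(sqrt(10)), is the final result, even though
# the machine keeps running.
```

An observer who has seen only finitely many steps can never be certain that the current tape content is final; this is precisely what distinguishes inductive computation from Turing computation.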

We denote the class of all general evolutionary inductive Turing machines by

Recall that inductive Turing machines with recursive memory are called

We denote the class of all general evolutionary inductive Turing machines of order _{n}

When the search condition is satisfied, then the ELTM

We denote the class of all general evolutionary limit Turing machines of the first order by

For instance, there are unrestricted evolutionary Turing machines when

Using different classes

^{Q}

It gives us the following types of evolutionary

1. When the type _{n}

2. When the type

Some classes of bounded evolutionary

3. When the type

Some classes of periodic evolutionary

4. A sequence {_{i}_{1}, _{2}, _{3}, …, _{k}_{k}_{+i};

5. When for each sequence

6. When for each sequence

7. When for each sequence

There is a natural inclusion of these classes:
_{n}

Definitions imply the following result.

Let us consider two classes

^{Q}^{P}

As it is possible to simulate any

Proposition is proved.

As any periodic general evolutionary

It is proved that it is possible to simulate any Turing machine by some inductive Turing machine [

Another condition on evolutionary machines determines their mode of functioning or computation. There are three

1.

2.

3.

There are also three

1. In the

2. In the

3. In the

In [

The new definition given here is more flexible: when the outcomes of all level automata of an evolutionary machine coincide with the transfer outputs of those automata, the new definitions reduce to the previously used ones.

Effective modes can be also local. There are three

1. In the

2. In the

3. In the

Local modes of evolutionary

Local and global modes are orthogonal to the three traditional modes of computing automata:

Existence of different modes of computation shows that the same algorithmic structure of an evolutionary automaton/machine

There are also three

1. The

2. The

3. The

Evolutionary

We denote the class of all basic evolutionary machines with level automata from

As any basic evolutionary

Similar to the class

_{n}

There is a natural inclusion of these classes:
_{n}

Note that all the modes of evolutionary computation considered above are used both in general and in basic evolutionary

Many results demonstrate that evolutionary

_{q}. One of these tapes is used for the transition output and the other one stores the outcome.

The Turing machine _{q} works in the following way. Taking the input word, _{q} processes it by the rules of the machine _{q} writes the same output in both output tapes and stops functioning. Thus, _{q} is a Turing machine because it either halts and gives the result or does not produce any output (result).

On the next step of building _{q}. So, the Turing machine

Theorem is proved.

As simple inductive Turing machines are more powerful than Turing machines [

At the same time, some classes of evolutionary

Let us assume that the result is proved for all (

By the principle of induction, theorem is proved.

Proof is similar to the proof of Theorem 2.2.

These results show that in some cases, evolutionary computations do not add power to computing devices.

_{E }_{E}

Now let us assume that the sequence {_{H}_{H}

Part (a) is proved.

Proof of part (b) is similar. Theorem is proved.

_{1} (_{1}) of all basic (almost) periodic evolutionary finite automata with the period 1.

_{1} (_{1}) of all basic (almost) periodic evolutionary Turing machines with the period 1.

Now let us consider general evolutionary machines.

Proof is similar to the proof of Theorem 2.4.

Proof directly follows from Theorem 2.5 as any periodic evolutionary finite automaton with the period 1 is a cellular automaton.

One-dimensional cellular automata are functionally equivalent to Turing machines, while Turing machines are more powerful than finite automata [

Proof is similar to the proof of Theorem 2.1.

Classes

Let

Indeed, any automaton from the class

For general classes of automata/machines, it is possible to find many more tentatively inherited properties in [

For finding properties of evolutionary information, we need the following property of inductive Turing machines.

Let us consider two inductive Turing machines

Their sequential composition _{°} _{°} _{°} _{°}

Now let us build the inductive Turing machine _{0} and _{0} computationally equivalent to the inductive Turing machines _{0}. The output tape of the machine _{0}.

In addition, _{0} and _{0} in the following way. Whenever the machine _{0} writes its new partial result in its output tape, the machine _{0} makes its next step of computation and the machine _{0} makes its next step of computation. When the compared words are different, the machine _{0} into the working tape of _{0}. After this the machine _{0} erases everything from its working and output tapes, writes 1 into its output tape, changes 1 to 0, erases 0 and makes first step of computation with the new input written by the machine _{0} makes its next step of computation.

Now we can show that the inductive Turing machine _{°} _{0 }does not stop changing. Thus, by construction of _{0} also produces the same result _{0 }stops changing after some step of computation. Consequently, all comparisons of the machine _{0}, which will perform all computations with the input _{0 }also does not stop changing and _{0 }does not give the result. Thus, by construction of _{0} starting with the input _{°}

Theorem is proved.

Theorems 2.6, 2.2 and 2.3 give the following result.

Let

For instance, a universal Turing machine

We define a universal evolutionary automaton/algorithm/machine as an automaton/algorithm/machine that can simulate all machines/automata/algorithms from

The pair (^{l}^{(u)}0

where ^{n}

_{u}_{u}

For the standard coding <·,·>, the constant _{u}

In this context, an automaton/algorithm/machine

It is also possible to give a dual definition: an automaton/algorithm/machine

Note that existence of a universal automaton in the class

In other words, a basic universal evolutionary Turing machine can simulate behavior of an arbitrary BETM

In other words, a basic universal evolutionary inductive Turing machine can simulate behavior of an arbitrary BEITM

Indeed, it is possible to construct the code of an evolutionary _{°} _{°} _{°} ... _{°} _{°} … where the symbol _{°} does not belong to the language

Note that the code

Definition 3.1 gives properties of universal evolutionary

A universal evolutionary

Note that although components of the evolutionary

For simulation by _{°} _{°} _{°} ... _{°} _{°} … where the symbol _{° }does not belong to the language

Consequently, the evolutionary

(b) Taking a universal automaton (machine)

Theorem is proved.

As there are universal Turing machines, Theorem 3.1 implies the following result.

As there are universal inductive Turing machines of the first order [

_{1}

As there are universal inductive Turing machines of order

_{n}

As there are universal limit Turing machines of order

_{n}

Theorem 3.1 and Proposition 3.1 also imply the following result.

_{°} _{°} _{°} ... _{°} _{° }does not belong to the language

Taking a bounded evolutionary _{°} _{°} _{°} ... _{°} _{° }does not belong to

In the case of an almost periodic evolutionary

Proposition is proved.

Proof is similar to the proof of Proposition 3.2.

Proof is similar to the proof of Theorem 3.1.

As there are universal Turing machines, Theorem 3.2 implies the following result.

As there are universal inductive Turing machines of the first order [

_{1} of all basic evolutionary inductive Turing machines of the first order has a basic universal evolutionary inductive Turing machine.

As there are universal inductive Turing machines of order

_{n}

As there are universal limit Turing machines of order

_{n} of all basic evolutionary limit Turing machines of order

Theorem 3.2 and Proposition 3.1 also imply the following result.

By construction, the universal general evolutionary automaton in

_{1} of all periodic general evolutionary automata in _{1}.

The same is true for basic evolutionary automata.

_{1} of all periodic basic evolutionary automata in _{1}.

The situation with universal machines in arbitrary classes of bounded evolutionary automata is more complicated: some of these classes have universal machines and some do not.

Indeed, by Theorem 2.3, any general bounded evolutionary

In a similar way, by Theorem 2.2, any basic bounded evolutionary

As the following example demonstrates, the condition of containing all sequential compositions is essential for Theorem 3.5.

_{1}_{2}_{3} … _{n}, it gives the word _{1}_{2}_{3} … _{n−1} as its output. We define _{0}, ε), _{0 }, ε), (_{0}, ε), (_{0}, ε)} is the set of states of _{0}, ε) is the start state,_{0}, ε) → (_{0}, ε), ε;_{0}, ε) → (_{0}, 0), ε;_{0}, ε) → (_{0}, 1), ε;_{0}, 0) → (_{0}, 0), 0;_{0}, 0) → (_{0}, 1), 0;_{0}, 1) → (_{0}, 0), 1;_{0}, 1) → (_{0}, 1), 1;_{0}, 1) → (_{0}, ε), ε;_{0}, 0) → (_{0}, ε), ε;

The expression _{0}_{0}_{0}, _{0},

_{A}

Let us consider bounded basic evolutionary _{A}_{A}_{1}_{2}_{3} … _{n}_{1}_{2}_{3} … _{n}_{−1} as its transition output and the empty word ε as its outcome. Then when _{1}_{2}_{3}…_{n}_{−1} as its input and produces the word _{1}_{2}_{3}…_{n}_{−2} as its transition output and the empty word ε as its outcome. This process continues until the automaton _{1 }as its input. Then _{1}.

Any evolutionary machine in the class _{A}_{A}_{A}_{n}_{+1} = {_{1}_{2}_{3}…_{n}a_{n}_{+1} and on any longer input word, the machine _{1 }on the input word _{A}_{A}
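Since the transition rules of this example are not fully legible in this copy, the idea of the construction, as far as it can be recovered from the surrounding text, can be sketched as follows (an illustrative reconstruction, not the paper's exact automaton): each level automaton deletes the last letter of its input word, so a bounded evolutionary machine with a fixed number of levels can produce an outcome only for words up to that length, and no single machine in the class can handle words of all lengths.

```python
def run(word, levels):
    """Sketch of a bounded evolutionary machine whose level automata
    each delete the last letter of the input word."""
    for _ in range(levels):
        if len(word) == 1:
            return word      # this level emits the outcome and stops
        word = word[:-1]     # otherwise it emits a transition output
    return None              # levels exhausted: no outcome is produced
```

A machine with k levels succeeds on words of length at most k but fails on any longer word, which is the obstruction to the existence of a universal machine in such a class.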

Here we introduce and study evolutionary information necessary to develop a constructive object by a given system of evolutionary algorithms (evolutionary automata/machines). It is called

Let us consider a class

_{A}(_{A}(_{A}_{A}
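Assuming the evolutionary information size is defined in the standard style of algorithmic information theory (an assumption, since the formula is not legible in this copy), with l(p) denoting the length of a word p, the definition reads:

```latex
\mathrm{EIS}_A(x) \;=\; \min\{\, l(p) : A(p) = x \,\}
```

with EIS_A(x) undefined (or taken as infinite) when no word p satisfies A(p) = x.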

Informally, the quantity of evolutionary information about an object

However, people, computers and social organizations use systems (classes) of automata/algorithms and not a single automaton/algorithm. To define the quantity of information about an object

Taking a universal automaton

_{H}_{H}_{H}

Note that if there are no words _{H}

Informally, quantity of evolutionary information about an object

Note that when _{H}_{A}

_{H}

_{H}

It is possible to show that evolutionary information size (evolutionary information about an object x) with respect to the class

_{AU}such that for any object/word _{U}_{A}_{AU}_{H}_{U}

_{A}_{0})_{U}_{0})

where <> is a coding of pairs of words. By Condition L, there is a number _{v}_{0}>) ≤ _{0}) + _{c}_{(A)}_{0}>) = _{U}_{0}) ≤ _{0}>) = _{0}) + _{c}_{(A)}_{0}) ≤ _{0}) + _{c}_{(A)}_{U}_{A}_{AU}_{AU}_{c(A)}. For the standard coding <·,·>, _{c(A)} = 2_{H}(

Theorem is proved.

Inequality (1) defines the relation ≼ on functions, which is called the
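Spelled out, the additive-domination relation ≼ on functions is the standard one (a reconstruction of inequality (1) under that assumption):

```latex
f \preccurlyeq g \quad \Longleftrightarrow \quad \exists\, c > 0 \;\; \forall x : \;\; f(x) \le g(x) + c
```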

Indeed, for any function

Besides, if

Thus, Theorem 4.1 means that information size with respect to a universal automaton/machine in

Theorems 3.3, 3.4, 4.1 and Proposition 3.3 imply the following result.

_{EU}(

The evolutionary information size EIS_{PGEAK}_{EU}

The evolutionary information size EIS_{PBEAK}_{EU}

As the class of all Turing machines has universal Turing machines and a finitary simulation coding, Corollary 4.1 implies the following result.

The evolutionary information size EIS_{PGETM}_{EU}

The evolutionary information size EIS_{PBETM}_{EU}

As the class of all inductive Turing machines has universal inductive Turing machines and a finitary simulation coding [

The evolutionary information size EIS_{PGEITM}_{EU}

The evolutionary information size EIS_{PBEITM}_{EU}

Theorem 4.1 and Proposition 3.3 imply the following result.

The evolutionary information size EIS_{BGEAK}_{EU}

The evolutionary information size EIS_{BBEAK}_{EU}

As the class of all Turing machines has universal Turing machines, is closed with respect to the sequential composition and has a finitary simulation coding, Corollary 4.4 implies the following result.

The evolutionary information size EIS_{BGEITM}_{EU}

The evolutionary information size EIS_{BBEITM}_{EU}

As the class of all multitape inductive Turing machines has universal inductive Turing machines, is closed with respect to the sequential composition (Theorem 2.3) and has a finitary simulation coding [

There are specific relations between information sizes relative to different classes of automata/algorithms.

_{H}_{Q}

As it is possible to consider any automaton from

_{K}_{BBETM}_{PBETM}_{APBETM}_{APBEITM}_{BBETM}_{BBEITM}_{PBEITM}_{APBEITM}

_{K}_{BGETM}_{PGETM}_{APGETM}_{APGEITM}_{BGETM}_{BGEITM}_{PGEITM}_{APGEITM}

Inequality ≼ defines the relation ≍ on functions, which is called the

Indeed, as the relation ≼ is reflexive and transitive, the relation ≍ is also reflexive and transitive. In addition, by definition, the relation ≍ is symmetric.
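In symbols, assuming the standard definition of equivalence up to an additive constant:

```latex
f \asymp g \quad \Longleftrightarrow \quad f \preccurlyeq g \;\text{ and }\; g \preccurlyeq f
```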

_{UV} such that for any object/word _{U}_{V}_{UV}

This shows that evolutionary information size (evolutionary information about an object

_{U}(_{V}(_{U}(_{V}(

_{U}_{V}_{UV}_{V}_{U}_{VU}_{U}_{V}_{V}_{y}

It means that the quantities EIQ_{U}_{V}

Theorem is proved.

Definition 4.2 and properties of universal evolutionary algorithms imply the following result.

_{H}

Lemma 4.1 and Proposition 4.4 show that evolutionary information size with respect to each of the classes

_{BGETM}

_{BGEITM}

_{PGETM}

_{PGEITM}

_{BBETM}

_{BBEITM}

_{PBETM}

_{PBEITM}

Let us study other properties of the evolutionary information size.

_{1}, _{2}, _{3}, …, _{k}_{1}), _{2}), _{3}), …, _{k}_{i}_{1}, _{2}, _{3}, …, _{k}

Lemma is proved.

Lemma 4.2 implies the following result.

_{H}

Indeed, taking a universal evolutionary automaton/machine _{H}_{U}_{U}_{H}_{U}_{H}

_{PGETM}

_{PGEITM}

_{PBETM}

_{PBEITM}

There are interesting relations between the length of a word and the evolutionary information size of the same word.

_{H}_{H}_{H}_{H}_{H }_{H}_{UH}

Note that similar relations exist for other information sizes, such as recursive information size [

It is interesting to compare evolutionary information size EIS_{PBETM}_{TM}

_{TM}_{PBETM}

_{PBETM}_{TM}

At the same time, by Theorem 18 from [_{TM}_{ITM}_{TM}_{PBETM}

Infinitely many elements

Theorem is proved.

Theorem 4.4 means that evolution can essentially reduce information size of objects because for infinitely many objects, their information size defined by periodic basic evolutionary Turing machines is essentially less than their information size defined by Turing machines.

_{2}(EIS_{TM}_{PBETM}

_{TM}^{1/n} > EIS_{PBETM}

Here we introduce and study the

To define evolutionary information in an object, we consider the set ^{l}^{(u)}0^{l}^{(v)}0

To study relative information size and relative quantity of information, we need Condition L introduced in

_{v}_{v}

For the coding of pairs dual to the standard coding <·,·>, the constant _{v}

As before,

^{L}_{A}^{L}_{A}^{L}_{A}^{L}_{A}^{R}_{A}^{R}_{A}^{R}_{A}^{R}_{A}

Informally, the relative quantity of evolutionary information about an object
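If the right and left relative information sizes are defined in the standard conditional-complexity style, differing only in the position of the condition y inside the coded pair, the definitions would read as follows. Both formulas, and in particular the assignment of the names "right" and "left" to the two positions of y, are reconstructions under that assumption, since the originals are not legible in this copy:

```latex
\mathrm{EIS}^{R}_{A}(x \mid y) \;=\; \min\{\, l(p) : A(\langle p, y\rangle) = x \,\},
\qquad
\mathrm{EIS}^{L}_{A}(x \mid y) \;=\; \min\{\, l(p) : A(\langle y, p\rangle) = x \,\}
```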

_{AU} such that for any object/word _{A}^{R}_{A}_{y}

_{A}_{0})^{R}_{A}_{0})_{y}_{0} from _{0}) ≤ _{0}>) = _{0}) + _{y}_{A}^{R}_{A}_{y}

Proposition is proved.

_{A}(^{R}_{A}(

We can see that, in the general case, the right relative evolutionary information size of an object

^{R}_{A}^{R}_{A}^{R}_{A}^{R}_{A}^{R}_{A}^{L}_{A}^{L}_{A}^{L}_{A}^{L}_{A}^{L}_{A}

Informally, the quantity of evolutionary information in an object/word

Here we encounter the problem of ^{L}_{A}^{L}_{A}^{L}_{A}

In essence, there are three meanings of negative information:

(1) Information about something negative;

(2) Information expressed with a negation of something;

(3) Information that has a negative measure.

Here we study evolutionary measures of information. Consequently, we are interested in the third meaning. In the traditional approach to information, it is assumed that the measure, e.g., the quantity, of information is always positive. However, recently, in their exploration of quantum information, researchers came to the conclusion that there is quantum information with negative measure [

The general theory of information also permits different types of information with negative measures [

Evolutionary information theory, as well as algorithmic information theory, explains how information can be negative. Indeed, information in a word (an object)

Misleading information can also increase the information size of an object because it demands additional information for automata or algorithms to detect the misleading content, to eliminate it and to guide the automata or algorithms in the right direction. Thus, it becomes understandable that algorithmic information in general, and evolutionary information in particular, can be negative in many situations.

Evolutionary information theory, as well as algorithmic information theory, also gives a grounded solution to the information paradox that Hintikka called the “scandal of deduction” [

To understand the situation, recall that deduction is both an evolutionary process of, and a tool for, evolutionary knowledge integration. Thus, it is natural to use evolutionary information theory for making sense of the “scandal of deduction”.

According to this theory, extraction of information from a message (statement) is an evolutionary process, while information content of the message (statement) depends not only on the message itself but mostly on algorithms used for information extraction.

To illustrate this principle of evolutionary information theory, let us consider the following situation that involved Mr. B and Mr. F.

Two men, Mr. B and Mr. F, are sitting at a train station not far from London. Little Mr. F feels very important. He has read various articles and even some books about information, and he has even written something in that area. So, he thinks he knows everything about information.

Eventually a third man comes and asks, “When does the train to London come?” Little Mr. F starts thinking how best to share his wisdom by advising the man to look at the schedule. But before Mr. F is ready, Mr. B says: “The train to London either comes at 8 p.m. or does not come at 8 p.m.” Then he gets up and leaves the station. The third man follows Mr. B.

Little Mr. F is shocked: why, instead of saying something reasonable, did that passenger utter a sentence that was not informative at all? Indeed, according to MTI (the Mathematical Theory of Information), TWSI (the Theory of Weakly Semantic Information) and TSSI (the Theory of Strongly Semantic Information), tautologies are not informative. Little Mr. F read this in the book “The Philosophy of Information”.

Maybe, little Mr. F thinks, the goal of the response was to mock the third man, and now the third man is going to punish that stupid passenger. Little Mr. F is happy with this thought.

However, the third man does get some information from the tautological statement of Mr. B because, after leaving the station, he asks Mr. B, “Are you Mr. James Bond?” “Yes, I am,” answers Mr. B. “Then,” says the third man, “let’s go to my car.”

This imaginary episode shows that information is a more complicated phenomenon than many think, and even tautologies can be informative depending on the context.

As we did before, it is necessary to go from the quantity of evolutionary information with respect to one algorithm to the quantity of evolutionary information with respect to systems of algorithms because people, computers and social organizations use systems (classes) of automata/algorithms and not a single automaton/algorithm. To define the quantity of information in an object

Taking a universal automaton

^{R}_{H}^{R}_{H}^{R}_{H}_{H}^{L}_{A}^{L}_{A}^{L}_{H}^{L}_{U}

Examples of quantities of evolutionary information about an object

Note that if there are no words ^{R}_{H}^{L}_{H}

Informally, quantity of evolutionary information about an object _{H}_{U}

Observe that when _{H}_{A}

It is possible to show that relative evolutionary information size (relative evolutionary information about an object

_{AUy} such that for any object/word ^{R}_{U}^{R}_{A}_{AUy}^{L}_{U}^{L}_{A}_{AUy}

^{R}_{A}_{0})^{R}_{U}_{0})_{y}_{c}_{(A) }such that for any word _{0 }, _{0 }, _{c}_{(A) }≤ _{0}) + _{y}_{c}_{(A)} = _{0}) + _{AUy}_{0}, ^{R}_{U}_{0}) ≤ _{0 }, _{0}) + _{AUy}_{0}) ≤ _{0}) + _{AUy}^{R}_{U}^{R}_{A}_{AUy}_{AUy}^{L}_{H}(

Proof of the second part is similar.

Theorem is proved.

There are various relations between relative and absolute evolutionary information sizes. For instance, Proposition 5.1 implies the following result.

_{H}^{R}_{H}

As the class of all Turing machines satisfies all necessary conditions [

_{BGETM}^{R}_{BGETM}_{PGETM}^{R}_{PGETM}_{APGETM}^{R}_{APGETM}

_{BBETM}^{R}_{BBETM}_{PBETM}^{R}_{PBETM}_{APBETM}^{R}_{APBETM}

As the class of all inductive Turing machines satisfies all necessary conditions [

_{BGEITM}^{R}_{BGEITM}_{PGEITM}^{R}_{PGEITM}_{APGEITM}^{R}_{APGEITM}

_{BBEITM}^{R}_{BBEITM}_{PBEITM}^{R}_{PBEITM}_{APBEITM}^{R}_{APBEITM}

To compare left relative and absolute evolutionary information sizes, we use conditions introduced in [

_{v}_{v}

_{v}_{°} _{v}

_{v}_{v}_{v}

_{°} _{v}

Let us assume that the class _{v}_{v}_{λ} from

_{AU} such that for any object/word ^{L}_{U}^{L}_{U}_{Uy}

_{λ} _{°} V _{°} A_{x} _{°} E_{x}

By the listed properties of the class

Given the input <_{λ} converts the word <_{x}_{x}_{x}_{x}_{x}

This allows us to build the evolutionary automaton/machine ^{L}_{H}^{L}_{U}

By construction, the evolutionary ^{L}_{A}_{U}_{H}

By Theorem 5.1, for the evolutionary automaton/machine _{EUy}^{L}_{U}^{L}_{A}_{EUy}^{L}_{U}_{U}_{EUy}

Proposition is proved.

As EIS^{L}_{U}^{L}_{H}

^{L}_{H}_{H}

As the class of all Turing machines satisfies all necessary conditions [

^{L}_{BGETM}_{BGETM}^{L}_{PGETM}_{PGETM}^{L}_{APGETM}_{APGETM}

^{L}_{BBETM}_{BBETM}^{L}_{PBETM}_{PBETM}^{L}_{APBETM}_{APBETM}

As the class of all inductive Turing machines satisfies all necessary conditions [

^{L}_{BGEITM}_{BGEITM}^{L}_{PGEITM}_{PGEITM}^{L}_{APGEITM}_{APGEITM}

^{L}_{BBEITM}_{BBEITM}^{L}_{PBEITM}_{PBEITM}^{L}_{APBEITM}_{APBEITM}

Let us find what relations exist between right relative evolutionary information size and left relative evolutionary information size of the same object. To do this, we consider the function

_{sw}

For instance, the code <^{l}^{(u)}0^{l}^{(v)}0
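The coding of pairs is not fully legible in this copy; the residue 1^{l(u)}0 … 1^{l(v)}0 suggests a self-delimiting coding in which each component is prefixed by its length in unary followed by a 0 marker. Under that assumption, the coding and the switching function can be sketched as follows (illustrative code):

```python
def encode_pair(u, v):
    """One possible self-delimiting pair coding <u, v>: each component
    is prefixed by its length in unary (1s) followed by a 0 marker."""
    return "1" * len(u) + "0" + u + "1" * len(v) + "0" + v

def decode_pair(code):
    """Inverse of encode_pair."""
    def take(s):
        n = s.index("0")              # the unary prefix ends at the first 0
        return s[n + 1 : n + 1 + n], s[n + 1 + n :]
    u, rest = take(code)
    v, rest = take(rest)
    assert rest == ""                 # the whole code must be consumed
    return u, v

def f_sw(code):
    """Switching function: maps the code of <u, v> to the code of <v, u>."""
    u, v = decode_pair(code)
    return encode_pair(v, u)
```

Because the length prefix consists only of 1s, the first 0 unambiguously terminates it, so decoding is well defined even when the components themselves contain 0s.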

_{sw}_{° }

^{R}_{U}(^{L}_{U}(^{R}_{U}(^{L}_{U}(

_{sw}_{° } _{sw}_{°}

Given the input <_{sw}^{R}_{D}^{L}_{U}

At the same time, by Theorem 5.1, for the evolutionary automaton/machine _{DUy}^{R}_{U}^{R}_{D}_{DUy}^{R}_{U}^{L}_{U}_{DUy}^{L}_{U}^{R}_{U}_{GUy}_{Guy }^{R}_{U}^{L}_{U}

Proposition is proved.

Optimality results allow us to define the quantities EIS^{R}_{H}^{L}_{H}

^{R}_{H}^{R}_{H}^{R}_{U}^{R}_{U}_{H}^{L}_{A}^{L}_{H}^{L}_{U}^{L}_{U}_{H}

Informally, quantity of evolutionary information in an object/word

As EIS^{L}_{U}^{L}_{H}^{R}_{U}^{R}_{H}

^{R}_{H}^{L}_{H}

Note that when the class _{H}_{A}

If the class

^{L}_{H}

As the class of all Turing machines satisfies all necessary conditions [

^{L}_{BGETM}^{L}_{PGETM}^{L}_{APGETM}

^{L}_{BBETM}^{L}_{PBETM}^{L}_{APBETM}

As the class of all inductive Turing machines satisfies all necessary conditions [

^{L}_{BGEITM}^{L}_{PGEITM}^{L}_{APGEITM}

^{L}_{BBEITM}^{L}_{PBEITM}^{L}_{APBEITM}

It is possible to show that evolutionary information in an object

^{L}_{U}(^{L}_{V}(^{R}_{U}(^{R}_{V}(

^{L}_{U}_{U}^{L}_{U}^{L}_{V}_{V}^{L}_{V}_{U}_{V}_{UV}_{V}_{U}_{V}_{y}^{L}_{U}^{L}_{V}_{V}_{y}_{UV}_{V}_{y}^{L}_{V}^{L}_{U}

Thus, the quantities EIQ^{L}_{U}^{L}_{V}

Proof for the quantities EIQ^{R}_{U}^{R}_{V}

Theorem is proved.

Theorem 5.2 shows additive invariance of the evolutionary information in an object

However, it is necessary to remark that evolutionary information is additively invariant only as a function. For an individual object, the choice of different universal algorithms for defining evolutionary information can result in large differences in the estimated evolutionary information for this object.

To demonstrate applicability of evolutionary information theory to all material processes, in this section, we describe modeling physical evolution with evolutionary machines. In physics, evolution is often called

The standard interpretations and applications of evolutionary machines and computations include optimization and modeling of biological and social processes [

According to the basic physical theory called _{i}_{i}_{i}_{i}_{i }

Every quantum state corresponds to a spin network. The mathematics that describes the quantum states of spatial volumes and areas provides rules for how the nodes and lines (links) can be connected and what numbers can be assigned to them. For instance, when a node has three links, the labels (colors) on the links satisfy the well-known Clebsch-Gordan condition: each color is not larger than the sum of the other two, and the sum of the three colors is even [
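The stated condition on a three-link node is easy to express as a small check (an illustrative helper, not part of the paper):

```python
def admissible_node(a, b, c):
    """Check the Clebsch-Gordan condition for a three-link node of a
    spin network: each link color is at most the sum of the other two,
    and the total of the three colors is even."""
    triangle = a <= b + c and b <= a + c and c <= a + b
    even_total = (a + b + c) % 2 == 0
    return triangle and even_total
```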

Subatomic particles, such as quarks or electrons, correspond to certain types of nodes in a spin network, and are represented by adding more labels on nodes [

This shows that spin networks are special cases of named sets, namely they are labeled networks or labeled graphs, while matter is represented in loop quantum gravity by naming (labeling) operations [

Material particles change their positions and interact with one another and with physical fields, while physical fields change their characteristics. These changes are naturally modeled by naming operations, such as renaming and reinterpreting [

Interestingly, there are computational models that work with labeled networks or labeled graphs. The first of such models is called a

One more type of models that represent dynamics in labeled networks is

The third type of models that represent dynamics in labeled networks comprises Turing machines with a structured memory, inductive Turing machines with a structured memory and limit Turing machines with a structured memory [

Note that Petri net modeling of spin network dynamics is different from the realization of these processes by (evolutionary) Kolmogorov algorithms, (evolutionary) Turing machines with a structured memory, (evolutionary) inductive Turing machines with a structured memory and (evolutionary) limit Turing machines with a structured memory. While a computational machine or a Kolmogorov algorithm has a unified system of rules for changing spin networks, Petri nets work guided by local rules. These local rules represent local dynamics of spin networks. However, it is possible to show that (evolutionary) Turing machines with a structured memory can model both (evolutionary) Petri nets and (evolutionary) Kolmogorov algorithms [

Using universal evolutionary machines/automata, we introduced and studied several measures of evolutionary information,

This brings us to the following problems:

Evolution is very important in biology, being directed by genetic processes. This gives us a natural direction for applying evolutionary information theory.

The role of social evolution in society makes the following problem principal.

The general theory of information [

Spaces of information operators are built and studied based on functional analysis. Another mathematical model for the general theory of information utilizes category theory. Thus, we have a similar problem in the context of algebraic categories.

Evolutionary computations in general and genetic algorithms in particular are efficient tools for optimization [

Evolutionary automata are utilized for modeling and exploration of collaboration processes [

In this paper, the main concern was evolutionary information for classes of evolutionary automata/algorithms that have universal automata/algorithms.

Evolutionary computations have been used for simulation of communication and language evolution [

It is possible to put this problem in a more general context.