Article

Entropy and the Self-Organization of Information and Value

1 Leibniz Institute for Baltic Sea Research, 18119 Rostock, Germany
2 Institute of Physics, Humboldt University, 12489 Berlin, Germany
* Author to whom correspondence should be addressed.
Entropy 2016, 18(5), 193; https://doi.org/10.3390/e18050193
Submission received: 18 March 2016 / Revised: 4 May 2016 / Accepted: 16 May 2016 / Published: 19 May 2016
(This article belongs to the Special Issue Information and Self-Organization)

Abstract:
Adam Smith, Charles Darwin, Rudolf Clausius, and Léon Brillouin considered certain “values” as key quantities in their descriptions of market competition, natural selection, thermodynamic processes, and information exchange, respectively. None of those values can be computed from elementary properties of the particular object they are attributed to, but rather values represent emergent, irreducible properties. In this paper, such values are jointly understood as information values in certain contexts. For this aim, structural information is distinguished from symbolic information. While the first can be associated with arbitrary physical processes or structures, the latter requires conventions which govern encoding and decoding of the symbols which form a message. As a value of energy, Clausius’ entropy is a universal measure of the structural information contained in a thermodynamic system. The structural information of a message, in contrast to its meaning, can be evaluated by Shannon’s entropy of communication. Symbolic information is found only in the realm of life, such as in animal behavior, human sociology, science, or technology, and is often cooperatively valuated by competition. Ritualization is described here as a universal scenario for the self-organization of symbols by which symbolic information emerges from structural information in the course of evolution processes. Emergent symbolic information exhibits the novel fundamental code symmetry which prevents the meaning of a message from being reducible to the physical structure of its carrier. While symbols turn arbitrary during the ritualization transition, their structures preserve information about their evolution history.

1. Introduction

Entropy is an emergent physical quantity with respect to the underlying microscopic dynamics. The thermodynamic energy of a body can be expressed as a function of the coordinates and momenta of the particles it consists of, while its entropy cannot [1]. The irreversible growth of entropy requires a different symmetry of its mathematical description, namely that of a semigroup, in contrast to that of the reversible microscopic motion, mathematically described by a group [2,3,4]. The universal increase of entropy is generally accompanied by a devaluation of energy and, although not always, by the decomposition of macroscopic structures [5]. Counteracting this tendency, self-organization and evolution of dissipative open systems require that, along with the flux of energy passing through the system, more entropy must be exported across the boundary of the system than is imported plus inevitably produced internally. This rigorous constraint imposed by the 2nd Law of Thermodynamics holds, without exception, for any living organism or machine, and in particular also for any macroscopic information-processing device.
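The entropy balance stated above can be condensed into one line. The sketch below is illustrative only (the notation and all numbers are assumptions, not from the text): the rate of change of the system's entropy is the sum of non-negative internal production and a net exchange flux across the boundary, and self-organization requires the export to overcompensate the production.

```python
# Minimal sketch of the entropy balance of an open system (notation and
# numbers are hypothetical): dS/dt = P + F, with internal production P >= 0
# (2nd Law) and net exchange flux F (negative = net export across boundary).

def entropy_change(production: float, exchange: float) -> float:
    """Rate of change dS/dt of the system's entropy."""
    assert production >= 0.0, "2nd Law: internal production is non-negative"
    return production + exchange

# A dissipative structure can maintain or lower its entropy only if the
# net export compensates (at least) the internal production:
dS = entropy_change(production=2.0, exchange=-3.0)  # exports more than it produces
assert dS < 0  # entropy of the system decreases; structure formation possible
```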
A comprehensive axiomatic information theory was recently proposed by Burgin [6,7,8]. Information is seen in close relation to entropy and thermodynamics [9,10] by various renowned scientists such as Boltzmann [11], Maxwell [12], Schrödinger [13], Shannon [14], Brillouin [15], Stratonovich [16], Volkenstein [17], Haken [18], Landauer [19], Ayres [20], or Hawking [21]. However, to the present day, the information specialists among physicists or engineers, philosophers, journalists or secret services have not yet agreed on a generally accepted, comprehensive and rigorous definition of what “information” actually is [22,23], and the perspectives taken may vary substantially from one expert to another. This somewhat unsatisfactory situation is not as exceptional as it may superficially seem; for example, a proper formal definition of “physics” is perhaps a similarly difficult task. But, unlike the case of “physics”, as soon as we ask for measurable quantities such as the amount or the value of “information”, a reasonable answer can hardly be given unless the kind of information meant is clearly specified. In this paper, we shall focus on two forms of information, structural and symbolic information [24,25,26], which are similar to the concepts of environmental and semantic information, respectively, as introduced by Floridi [23]. Section 3 and Section 4 will address their qualitative differences in more detail. In particular, as a concept borrowed from behavioral biology, we here generalize the term ritualization as a universal transition [24,26,27,28] by which symbolic information emerges from structural information in the course of natural evolution history, see Section 4.
For the relation between energy and entropy, as well as between structural and symbolic information, emergence is an important paradigm that reflects the hierarchy of current scientific description levels [29]. For instance, human mental cognition cannot be explained in terms of animal behavior, which in turn may not be derived from the equations of thermodynamics, which, again, constitute a different quality than the laws of quantum mechanics, whose Schrödinger function does not result as a logical consequence from elementary-particle physics, etc. It makes no sense to ask for the entropy of a single particle, or whether a carbon atom turns alive as soon as it belongs to an organism. “Physical systems require many levels of models, some formally irreducible to one another, but… the higher levels of the hierarchy must have emerged from lower levels. Life must have emerged from the physical world” [30]. Ritualization is such a transition to emergent properties; in various cases this process may be traced step by step in evolution history [24,25,26].
As a logical category, an emergent property may be defined as a property that is “novel and robust relative to some natural comparison class” [31]. Less rigorously stated, the term “emergence is broadly used to assign certain properties to features we observe in nature that have certain dependence on more basic phenomena (and/or elements), but are in some way independent from them and ultimately cannot be reduced to those other basic interactions between the basic elements” [32]. Emergent properties express in a scientifically more rigorous way the proverbial fact that “the whole is more than the sum of its parts”. Typical features of emergent order parameters are [24]:
  • Emergent properties are certain integrals of the microscopic motion under given initial and boundary conditions.
  • Emergent properties cannot be expressed as functions of the microscopic variables.
  • Emergent properties represent holistic, irreducible properties of the system.
  • Emergent properties may possess different symmetries than the underlying equations.
  • Emergent properties are consistent with the fundamental laws of physics.
The concept of values constitutes a central paradigm in biology and the humanities, such as the values of life and ethics [33], but values already exist within physics [34]. While certain values may be immeasurable and qualitative, such as use values of commodities or scientific values of discoveries, here we consider only values that are quantitative, irreducible measures of emergent properties, such as exchange values (prices) of commodities. The semantic meaning of a message and its associated value, in contrast to its syntactic and token structure and complexity, are central but so far only incompletely understood aspects of information theory [15,18,23,35,36,37]. Similar to thermodynamic entropy as a value of thermodynamic energy, see Section 2, information entropy can be understood as the value of structural information. In contrast, symbolic information is subject to self-organized valuation in the context of competition and selection processes, where the meaning of the particular information is materialized in the form of physical structures and functions. Here, self-organization means a “process in which individual subunits achieve, through their co-operative interactions, states characterized by new, emergent properties transcending the properties of their constitutive parts”. In turn, evolution can be characterized “as a potentially unlimited succession of self-organization steps, each beginning with an instability (mutation, innovation, invention) of the previously established parent regime” [24]. Important examples of valuation as a driving force of evolution processes are selective values (quantitatively expressed as survival rates) in biology and exchange values (quantitatively expressed as market prices) in economy [24,28,38,39], as will be outlined in Section 5 and Section 6.
The paper is organized as follows. In Section 2, we discuss the emergent character of Clausius’ thermodynamic entropy and its role as a value. In the course of interdisciplinary scientific advance, the concept of entropy has proved much more general than being just a thermal property of physical bodies; in Section 3, we ask about the relation between the Clausius and Shannon entropies, and about the valuation of structural information by the latter. Ritualization as a fundamental transition process from structural to symbolic information is briefly introduced in Section 4. As key examples of values of symbolic information in the context of self-organization and evolution processes, Section 5 and Section 6, respectively, emphasize the emergent character of selective values in biology and of exchange values in economy. By ritualization, metallic commodity money, which constitutes a reference exchange value, evolved into symbolic paper money that represents information about that value. In Section 6, the emergence of money as a symbol is briefly presented as a particularly instructive and transparent transition process.

2. Entropy as a Value of Energy

Entropy, a term originally introduced by Rudolf Clausius in his lecture given in Zürich, Switzerland, on 24 April 1865, has become a key concept in many disciplines of science. It is worth quoting here his original arguments for the invention of the word [5] (p. 390):
If one looks for a term to denote S, one could, similar to what is said about the quantity U that it is the heat and work content of the body, say about the quantity S that it is the transformation content of the body. Since I prefer to borrow the name for quantities so important for science from the old languages to permit their unaltered use in all languages, I suggest to name the quantity S the entropy of the body, after the Greek word η τροπή, the transformation. Intentionally I have formed the word entropy as similar as possible to the word energy because the two quantities denoted by those words are so closely related with respect to their physical meaning that a certain similarity in their nomenclature appears appropriate to me (English translation from [24]).
—Rudolf Clausius [5]
In the sense of this reasoning, entropy has been regarded as the value of energy by Clausius and Ostwald [40,41].
Entropy is an emergent property in so far as it cannot be reduced to a related property of a single particle or to the microscopic laws of motion. “Erst… das ungeregelte Durcheinanderschwirren sehr vieler Atome schafft die Vorbedingung… für die Existenz einer endlichen Entropie und einer endlichen Temperatur” (Only… the chaotic flying around of very many atoms establishes the preconditions… for the existence of a finite entropy and a finite temperature) [42] (p. 116). Planck’s deep insight into the statistical nature of thermodynamic entropy, applied to electromagnetic radiation processes, sparked a scientific revolution in physics, namely his discovery of the correct black-body radiation law and of the existence of a quantum of action (together with the first determination of the value of Boltzmann’s constant from Equation (4)).
As is evident from Planck’s quotation above, the ease of immediate physical sensation of temperature at our fingertips in daily life [12] belies temperature’s emergent physical nature as the thermodynamic conjugate of its closest relative, entropy:
T = ∂U/∂S.  (1)
In simple cases such as classical gas theory, temperature may be identified with the mean kinetic energy of the particles [12]. However, for example, the temperature of monochromatic radiation [43,44,45,46], as derived from Equation (1) and expressed in energy units,
k_B T = hν / [ln(n + 1) − ln(n)],  (2)
differs from the kinetic energy of the photons, E = hν, of which the radiation consists. Here, h is Planck’s constant, ν is the frequency, k_B is Boltzmann’s constant, and n is the mean photon number at the given frequency, according to Bose–Einstein statistics [45]:
n = [exp(hν/(k_B T)) − 1]^(−1).  (3)
Similarly, the entropy flux of thermal radiation, (4/3)·J/T, in relation to the heat flux, J, differs from the common expression for the entropy flux, J/T, of molecular heat conduction [43,44,47]. As an emergent property, temperature is the statistical weight by which energy is valuated in terms of entropy.
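As a numerical illustration (the chosen frequency and temperature are assumptions for this sketch), the two formulas above can be checked to be mutually inverse, and the resulting thermal energy scale k_B·T indeed differs from the photon energy hν:

```python
import math

# Numerical check (illustrative) that the Bose-Einstein occupation and the
# radiation-temperature formula above are mutually inverse, and that k_B*T
# differs from the photon energy h*nu itself.

h = 6.62607015e-34      # Planck constant, J s
kB = 1.380649e-23       # Boltzmann constant, J/K

def mean_photon_number(nu: float, T: float) -> float:
    """Bose-Einstein occupation of a mode at frequency nu."""
    return 1.0 / (math.exp(h * nu / (kB * T)) - 1.0)

def radiation_temperature(nu: float, n: float) -> float:
    """Temperature of monochromatic radiation from the mean photon number."""
    return h * nu / (kB * (math.log(n + 1.0) - math.log(n)))

nu, T = 5.0e14, 6000.0                 # visible light, roughly solar temperature
n = mean_photon_number(nu, T)
assert abs(radiation_temperature(nu, n) - T) < 1e-6 * T   # exact round trip
assert abs(h * nu - kB * T) > kB * T    # k_B*T is not the photon energy here
```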
The meaning of the common thermodynamic “caloric” concept of entropy may equivalently be expressed in terms of distance from equilibrium, of the value of energy, or of relative phase-space occupation, in addition to the usual concept of entropy as a measure of disorder. The entropy difference between equilibrium and non-equilibrium states of a system is a quantity which measures the distance from equilibrium or the (work) value of the energy contained in a system. This lowering of entropy [24] is, similar to the negentropy [13,15], directly related to the unoccupied fraction of the Boltzmann energy shell in the phase space. As can be shown [41], any non-equilibrium distribution is restricted to certain occupied parts of the energy hypersurface where the microscopic motion of the particles takes place. Boltzmann’s statistical entropy formula,
S_eq(U, V) = k_B ln W,  (4)
for the equilibrium entropy, expressed by the number W of available microstates with the total energy U in a volume V, may be generalized by a formula for the non-equilibrium entropy in the form:
S(U, V, X) = k_B ln Ω(X).  (5)
Here, Ω(X) ≤ W is the fraction of the energy shell that is occupied by the system under certain non-equilibrium conditions, X.
Therefore, relaxation to equilibrium implies a spreading of the random distribution and a decrease of our knowledge of the microstates. The 2nd Law of Thermodynamics tells us that entropy can be produced by irreversible processes but never destroyed. Since entropy is a measure of the value of energy, this leads to the formulation that the distance from equilibrium and the work value of energy in isolated systems cannot increase spontaneously. In other words, the entropy lowering, δS = S_eq − S(X), is a Lyapunov function expressing the tendency of devaluation of energy [41].
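A toy calculation may illustrate this Lyapunov behavior (all microstate numbers below are hypothetical): as the occupied part Ω(X) of the energy shell spreads toward W during relaxation, the entropy lowering δS = k_B ln(W/Ω) shrinks monotonically to zero:

```python
import math

kB = 1.380649e-23  # Boltzmann constant, J/K

# Illustrative sketch (toy numbers): Boltzmann's formula and its
# non-equilibrium generalization above imply an entropy lowering
# dS = S_eq - S(X) = kB * ln(W / Omega(X)) >= 0 whenever the state
# occupies only a part Omega(X) <= W of the energy shell.

def entropy_lowering(W: float, Omega: float) -> float:
    assert 0.0 < Omega <= W
    return kB * math.log(W / Omega)

W = 1e6                            # microstates on the energy shell (toy)
occupied = [1e2, 1e3, 1e5, 1e6]    # Omega(X) spreading during relaxation
lowerings = [entropy_lowering(W, O) for O in occupied]

# Lyapunov behavior: dS is non-negative and decreases monotonically to zero.
assert all(d >= 0 for d in lowerings)
assert lowerings == sorted(lowerings, reverse=True)
assert lowerings[-1] == 0.0
```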
This understanding of dissipating structural information differs from other approaches which consider physical states in general as representing “information”, and the laws of reversible microscopic dynamics as laws that express “information conservation” [19,21,48]. The virtual paradox of theoretical microscopic reversibility on the one hand and the observed practical macroscopic irreversibility on the other has a long tradition of famous physical disputes between alternative positions brought forward by Poincaré, Einstein, Zermelo, Prigogine, and many other scientists. The related apparent contradiction between conservation and destruction of information, which is seen here as an indicator of the emergent character of entropy, is sometimes regarded as an “information paradox” that may perhaps result from mutually inconsistent or insufficiently rigorous alternative definitions of “information”.
Information carriers are physical structures which typically also carry energy and entropy. May entropy as a value of energy also serve as a value of information in such cases? This implicated question will be addressed in the following section.

3. Entropy as a Value of Structural Information

The close relation between thermodynamic entropy and information was first revealed by the investigations of Maxwell [12] and Szilard [49]. Recently, Eigen [50] returned to the question of whether an equation “information = entropy” may be justified, and came to an affirmative conclusion, at least as far as the amount of information is concerned. The associated question of the information value is unsettled and subject to ongoing scientific discussion [17,28,35,51]; it aims at a suitable quantitative measure of the impact of information on dynamical processes. Already Brillouin [15] had stated (quotation from Eigen [50]):
The present theory of information completely ignores the value of the information handled, transmitted or processed… At any rate, one point is immediately obvious: any criterion for value will result in an evaluation of the information received. This is equivalent to selecting the information according to a certain figure of merit.
—Léon Brillouin [15]
In the following sections, we shall return to certain evolutionary aspects of this problem.
In contrast to the values of symbolic information with respect to semantics, the Shannon entropy [14] of a message is a value of its structural information that is unrelated to the meaning of the message. Originally, this entropy is defined as a statistical measure of the information source with respect to the probabilities of the occurrences of certain symbols. If we imagine the communication channel as a long conveyor belt which carries the symbols issued successively by the source, the entropy of the transmitter is equivalent to the information entropy of the sequence of symbols strung together along the belt. Symbols are physical structures, and a non-trivial sequence of symbols may be regarded as a non-equilibrium structure with a lowered entropy, as described in the previous section. Thus, insofar as the sequence of symbols possesses a well-defined thermodynamic entropy, the Shannon entropy represents a certain part of the entropy lowering associated with the particular physical incarnation of the chain of symbols.
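For a concrete illustration, Shannon's entropy of a source can be estimated from the observed symbol frequencies. The sketch below (with invented toy messages) shows that the value depends only on the statistics of the symbols, not on any meaning:

```python
import math
from collections import Counter

# Shannon entropy H = -sum p_i * log2(p_i), in bits per symbol, estimated
# here from the symbol frequencies of a finite message; a minimal sketch.

def shannon_entropy(message: str) -> float:
    counts = Counter(message)
    total = len(message)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# The value reflects only symbol statistics, never the meaning carried:
assert shannon_entropy("abab") == 1.0   # two equiprobable symbols: 1 bit each
assert shannon_entropy("aaaa") == 0.0   # no uncertainty, no information
assert shannon_entropy("baba") == shannon_entropy("abab")  # same statistics
```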
In all processes of transferring and processing information, some amount of entropy necessarily flows. While the quantity of the related Shannon entropy flux may be extremely small compared to the thermal background entropy [41], this part is far from irrelevant. As an example in a similar context, Max Planck [1] had emphasized that, relative to the vast amount of Einstein’s relativistic rest energy of a macroscopic body, merely the tiny changes related to heat or work are of any practical importance in thermodynamics. Those who argue that the waves on an ocean or the surface structures of the terrestrial crust are “without relevance” because of their smallness compared to the planet’s dimensions may endeavour to cross the sea or a mountain range without modern technology. It is not the quantity that matters here, but the quality. Also, while the entropy related to the ink distribution of a printed letter may be small compared to the thermal background entropy of the paper sheet, Shannon’s information entropy of a modulated laser beam in vacuum [52,53] can be assumed to be of a similar order of magnitude as Planck’s thermodynamic entropy of a monochromatic ray of light [42,44], provided that both are expressed in comparable units. In these two cases of structurally different information carriers, the actual message transferred may be exactly the same.
As a result of the arbitrariness of the symbols chosen for coding a piece of information (namely, the fundamental code symmetry of symbolic information [24,26,27], see also the following section), one and the same semantic message may be associated with very different Shannon entropies, depending on the code or language used, on the degree of its compression or redundancy, etc. Sequences with high complexity may equally be very rich (e.g., reversibly compressed) or very poor (e.g., random noise) with respect to embedded semantic information [54]. Proper differentiation between the distinct roles and values of symbolic and structural information is important to prevent incorrect or misleading conclusions from being drawn [37]. Brillouin’s distinction [15] that “information is an absolute quantity which has the same numerical value for any observer” while “the human value on the other hand would necessarily be a relative quantity, and would have different values for different observers” may be associated with the different values of structural and symbolic information.

4. Self-Organization of Symbolic Information

While structural information is physical, symbolic information is conventional. The receiver of symbolic information transforms an arriving symbol into the meaning of the symbol, using the convention implemented. Symbol and meaning are coupled but independent physical structures or processes, mutually associated by the convention. In the case of structural information, the structure and its meaning are identical, and no convention is involved. In contrast, even though symbolic information is always accompanied by structural information, the relation between the two is specified arbitrarily in the form of a convention.
A dynamic system may serve as a simple physical example. Let its boundary or initial conditions represent the symbol, and the associated particular attractor the meaning of the symbol. The physically implemented convention consists of the system’s dynamical laws (such as usually displayed in the form of its bifurcation diagram) which define the attractor approached as a function of the external condition imposed. If the symbol is a mechanical push of a switch, let the meaning of the message be the room’s light going on or off. A more intelligent switch may also respond to symbols in the form of acoustic or optical or any other signals, by materializing the same meaning, namely turning on the light. The meaning of symbolic information is not reducible to the structural information of the carrier symbols; in this sense, symbolic information is an emergent property.
A symbol may be replaced by a different symbol, such as taken from another language, another alphabet of written letters, another number system, or be transmitted by an electrical wire rather than an optical cable, without affecting the meaning of the message transferred, if only the convention implemented at transmitter and receiver is changed in accord with the set of symbols used. This fundamental and universal property of symbolic information is its code symmetry, or coding invariance. Symbols are “energy-degenerate” and “related to a referent by an arbitrary code” [30].
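The code symmetry can be made tangible in a toy model (both codes below are invented for illustration): as long as transmitter and receiver change their convention in accord, entirely different physical symbols carry the same meaning to the receiver.

```python
# Toy illustration of code symmetry (hypothetical codes): the same meaning
# can ride on entirely different symbols, provided that transmitter and
# receiver share the implemented convention. Swapping the code changes the
# carrier structure but not the message.

code_a = {"light_on": "1", "light_off": "0"}              # electrical pulses
code_b = {"light_on": "clap", "light_off": "whistle"}     # acoustic signals

def transmit(meaning: str, code: dict) -> str:
    return code[meaning]                      # encode meaning into a symbol

def receive(symbol: str, code: dict) -> str:
    decode = {v: k for k, v in code.items()}  # the implemented convention
    return decode[symbol]                     # recover the meaning

# Different physical symbols, identical meaning at the receiver:
for code in (code_a, code_b):
    assert receive(transmit("light_on", code), code) == "light_on"
```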
In the course of natural and social evolution, numerous symbolic information systems emerged by self-organization, starting from the genetic code and neuronal networks, and ending up at human languages and digital devices [24,25,26,27,28]. Traditionally, ritualization is the term used for this transition process, first introduced a century ago by Huxley [55,56] in behavioral biology, which, beyond its original restriction to that discipline, may quite generally describe the gradual changes by which structural information spawns symbolic information and gives birth to the novel code symmetry. Along this transition, the symbols initially resemble the structures they originate from, and retain this trace of their evolution history until this record is little by little erased by fluctuations or other changes permitted by the code symmetry, such as selective drive [26].
The evolution of human spoken language is an important example of a ritualization transition. Sound waves produced involuntarily by human babies when breathing and suckling represent structural information which is intuitively analyzed and occasionally reproduced by their mothers. By a positive feedback loop, such sounds form the root of the spoken symbolic language which we automatically acquire during our infancy. Subsequently in evolution history, this language developed into a powerful, indispensable communication tool that characterizes and empowers the human species to an unrivalled extent. Historically, the very first information-processing system was the genetic expression machinery of early life, where RNA or DNA triplets evolved as symbols for associated amino-acid sequences of proteins to be mounted in a definite succession. However, the problem of how exactly life appeared on Earth, and of the way in which symbolic information spawned from non-symbolic, native information, is extremely complicated and only incompletely understood.
We know that the existence of all living beings is intimately connected with information processing and valuation. These concepts are considered here as the central aspect of life. On the other hand, Shannon’s concept of information is a product of the technical sciences and was devised as a technical “signal theory” [36]. However, this does not mean that information is a merely technical concept. Information appeared on our planet first in the context of life, at a time when no technicians or engineers were available yet who might have designed it. We may define a living system as a “natural, ordered and information-processing macroscopic system with an evolutionary history”. This definition may even be used as a criterion for decisions. Imagine the crew of a space ship far from our home planet observing unknown objects moving in space, sending signals and performing maneuvers; should the crew approach them with the respect due to living objects? The detection of extra-terrestrial symbolic information would be a strong indication of alien life forms, as was prematurely concluded when the first extremely regular pulsar signals were discovered by astronomers. The emergence of symbolic information may “distinguish the living from the lifeless”, as the problem was raised by Pearson in 1892 [30].
We consider information processing as a special, high form of self-organization and a necessary condition for Darwinian evolution. Symbolic information is an emergent property, but there are several unsettled problems to be addressed in this context. Genuine information is symbolic information, which needs a source that creates signals or symbols, a carrier to store or transport it, and finally a receiver that knows the meaning of the message and transforms it into the structure or function for which the message is a blueprint. In this way, symbolic information is always related to an ultimate purpose connected with valuation. How exactly, and in what detail, did information emerge by self-organization? We do not even possess an explicit numerical or experimental tutorial simulation example for the emergence of symbols from physical structures, but we may investigate for this purpose, for instance, the evolution history of symbolic money or of digital computers [26], see also Section 6.
Information-processing systems exist only in the context of life and its descendants, such as animal behavior, human sociology, science, technology, etc. The details of how life appeared, and of the way symbolic information developed out of non-symbolic, native information, are hidden behind the veils of ancient history. Other, later examples of the self-organization of information are much easier to study, and this has been done in various studies of behavioral biology on the evolutionary transition from use activities of animals to signal activities.
A closer view of this transition process reveals rather general features which may be considered as a universal route to information processing [24]. When a process or a structure undergoes the ritualization transition, the original full form of structural appearance is successively reduced to a representation by symbols, together with the building-up of a processing machinery which is still capable of reacting to the symbol as if its complete original were still there. At the end of the transition, the physical properties of the symbolic representation no longer depend on the physical properties of its origin, and this new symmetry makes drift and diversification of symbols possible because of their neutral stability and code symmetry. In the form of their structural information, the symbols typically preserve traces of their evolution history, such as the onomatopoetic origin discernible in spoken words of various languages, or the pictograms from which written languages evolved, such as Chinese characters, cuneiform signs, and even Latin letters and Arabic numerals [24,26]. The structural information of natural languages offers detailed preserved records of the evolution of human societies, and comparative language analyses provide insight into the progress of natural sciences, historical events and ethnographic relations.
The main reason for the emergence of various symbolic information systems in the course of natural evolution, despite their complexity and vulnerability, is found in their valuation, in the selective advantage their users gained over their competitors [57]. Among these superior properties of symbolic information are the capabilities of largely loss-free copying and of safe long-term storage [24]. Symbols and their meaning are different physical structures and are differently affected by changing external conditions. This beneficial aspect of symbolic information is nicely demonstrated by the flowers of Death Valley or the Atacama Desert, where the seeds, as carriers of the symbolic information, may resist extreme conditions that are fatal to the information’s meaning, namely the phenotype of the living plant.

5. Selective Values in Biology

In the biological sciences, the value concept was introduced by Darwin and Wallace. Sewall Wright [58,59] developed the idea of a fitness landscape (value landscape), which was subsequently worked out by many authors. Selective values are key elements of the mathematical models for competition between biological individuals and of the evolution processes investigated by Fisher [60] and Eigen [61]. Fisher’s famous “fundamental law of natural selection” [62] states that the change of a phenotypical trait of a species is proportional to the variation of that trait within the population considered, times the rate at which the selective value grows with that trait, that is, times the local slope of the fitness landscape [28,63,64]. In recent years, many new results on the structure of fitness landscapes have been obtained by Peter Schuster and his coworkers in Vienna. Qualitatively different evolution behavior is found depending on the correlation length of the fitness landscape, varying between the extremes of random and smooth structures [24,63,65].
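Fisher's law can be checked in a minimal replicator model (the fitness values and frequencies below are hypothetical): under the dynamics dx_i/dt = x_i (f_i − ⟨f⟩), the rate of increase of the mean fitness equals the fitness variance of the population.

```python
# Toy replicator dynamics (hypothetical fitness values) illustrating Fisher's
# "fundamental law": the rate of increase of the mean fitness equals the
# fitness variance of the population, d<f>/dt = Var(f).

def step(freqs, fitness, dt):
    """One Euler step of dx_i/dt = x_i * (f_i - <f>)."""
    mean_f = sum(x * f for x, f in zip(freqs, fitness))
    return [x + dt * x * (f - mean_f) for x, f in zip(freqs, fitness)]

fitness = [1.0, 1.5, 2.0]
freqs = [0.5, 0.3, 0.2]
dt = 1e-5

mean0 = sum(x * f for x, f in zip(freqs, fitness))
var0 = sum(x * (f - mean0) ** 2 for x, f in zip(freqs, fitness))

freqs1 = step(freqs, fitness, dt)
mean1 = sum(x * f for x, f in zip(freqs1, fitness))

# The numerical derivative of the mean fitness matches the variance:
assert abs((mean1 - mean0) / dt - var0) < 1e-3
```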
In simple cases such as the growth rate of chemical micro-reactors, fitness or selective values can be derived from the chemical reaction kinetics of the internal metabolism, in the form similar to an eigenvalue of the rate equations [24,66]. At this stage of molecular evolution, fitness is still a value of the structural information of the replicating protocell. After ritualization and the emergence of symbolic genetic information, fitness becomes a semantic value of the symbolic information, quantifying the meaning of the genetic message which the cell has inherited from its parent cell.
Selective values in biology express the value of the genetic information, of the "experience" accumulated over many generations in the struggle for survival. Selective values can be measured in the form of population numbers of a species in comparison to its competitors. Except for simple pre-biological cases, selective values cannot be computed from the physical or chemical properties of the genetic chain molecule of an individual or a species; rather, selective values are emergent properties. Because this emergent character is easily misinterpreted, Darwin's "survival of the fittest" has sometimes been criticized as a tautology [67,68]. The survival of a fish depends on the presence of water with dissolved oxygen at suitable temperature and pressure conditions, on the presence of food and the absence of predators and toxic solutes, on its ability to hide from enemies, to catch prey, and to mate and produce offspring, etc. The complexity of these influencing factors, together with the unpredictable change of environmental conditions due to other species, climate, or natural catastrophes, makes selective values irreducible to merely the physics and chemistry of a given organism, let alone to just the physical DNA structure of nucleotides which represents the genetic information.

6. Exchange Values in Economics

In the social sciences, the concept of values was first introduced in an economic context by Adam Smith in the 18th century. Smith [69] distinguished two meanings of the word "value": the "use value", which expresses the usefulness of a thing to its owner, and the "exchange value", which expresses the owner's ability to obtain other things in return via barter. The two aspects are not necessarily related to one another. While the exchange value may be quantified in the form of market prices, the use value is essentially qualitative in nature. Adam Smith's fundamental ideas were later elaborated by Ricardo, Marx, Schumpeter, and many other economists.
Karl Marx [70] emphasized the emergent character of exchange values: "Exchange values of commodities… cannot be either a geometrical, a physical, a chemical, or any other natural property of commodities". The valuation of a good can be understood as a self-organized competition process on the market [38,39,71]. Initially and primarily, the exchange value refers only to the relation between two particular persons, a seller and a purchaser, with respect to the commodity of their common interest; it is not a fixed price tag attached to the good, independent of the presence, absence, or kind of humans involved. On a market, the interests of sellers and buyers may vary greatly from one to another, and the same good may represent very different individual exchange values to each bargaining pair [72]. However, sellers and buyers may compare the different offers available on the market in order to sell at the highest or to buy at the lowest exchange value. As a result of this cooperative interaction, a uniform market price for a certain commodity is established by self-organization, forming a compromise of interests in the form of an average exchange value. Otherwise, cyclic exchange processes would be possible on the market, during which some participants would systematically gain and others lose. The market price then appears as if it were an intrinsic property of the commodity itself, independent of the special interests of one seller or another buyer. This price, the result of a cooperative valuation process, represents the exchange value of a commodity with respect to a given market. In turn, this value constitutes an information value with respect to the symbolic information in the form of the producer's skills and experience, his recipes, patents, construction plans, or simply his knowledge of how and where to obtain something precious.
Even information itself may appear as a commodity with the “information value” as its price [73].
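The self-organized convergence toward a uniform market price can be mimicked by a deliberately simple simulation (all numbers are assumptions; this is an illustration, not a model from the paper): private reservation prices differ widely from person to person, yet a posted price adjusted by excess demand settles at a collective compromise value that no individual participant dictates.

```python
import random

random.seed(1)
# Private valuations of one good: buyers' maximum prices and sellers'
# minimum prices (uniform distributions chosen arbitrarily).
buyers = [random.uniform(5, 15) for _ in range(200)]
sellers = [random.uniform(0, 10) for _ in range(200)]

price = 2.0                                     # arbitrary initial quote
for _ in range(500):
    demand = sum(b >= price for b in buyers)    # buyers willing at this price
    supply = sum(s <= price for s in sellers)   # sellers willing at this price
    price += 0.001 * (demand - supply)          # excess demand raises the price

print(round(price, 2))   # settles near the supply-demand equilibrium
```

The final price is an emergent order parameter of the whole ensemble: it cannot be read off any single valuation, but once established it appears to every participant as if it were a property of the commodity itself.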
Exchange of property between two owners is primarily a mental process of the participants rather than a physical process affecting the items of the deal; if appropriate, the things may be symbolically handed over from one to the other. Direct binary exchange of useful objects may be difficult when they have very different values to their owners, or are available at different times or locations [69,74]. In such cases, a widely accepted, durable, and preferably countable intermediate exchange good, commonly known as money, is temporarily used to confirm the barter.
With the introduction of coins, physical objects representing exchange values underwent a ritualization transition from commodity money to fiat money, that is, to symbols for values [26,27], during which the handing over of valuable objects such as precious metals was replaced by the transfer of information about the mutually agreed price in the form of coins and paper money. Schumpeter [74] described this historical transition by the statement that "not only that names and appearances of coins often indicate a meaning of a commodity, … not only that often the coin verifiably replaced commodity money used at the same place at earlier times; even the transition of commodity money to the coin is detectable step by step", and Marx [70] explained that "in the metallic money tags the purely symbolic character is somewhat hidden; in paper money that character is obvious" (English translation from [26]). The preservation of this trace of the evolution history of symbols for exchange values, namely of metallic coins and banknotes, in the form of their names, shapes, etc., is characteristic of the ritualization transition. Today we even observe that the traditional information transfer in the form of circulating banknotes is increasingly replaced by digital information transfer, i.e., electronic payments using the internet and omnipresent computers. Politicians speculate about a complete abolition of any form of "physical money" in favor of "digital money". However, symbolic information about property can represent real ownership only as long as it is guaranteed by a commonly accepted, reliable authority such as a state or a king. "Property and law are born and must die together", as Bentham [75] formulated it. As an emergent social property, the meaning of money symbols, namely the nominal values of a certain currency, is known to change dramatically in times of social unrest, natural catastrophes, or economic crises.
As a widespread countermeasure, private persons and governments tend to hoard gold which, by physical conservation laws, is protected against sudden disappearance, in contrast to symbolic money.
With respect to the market economy, two levels of emergent properties and symbolic information are considered here. The first level is the exchange value of a commodity, which is the value of the symbolic information that the owner or producer of the commodity exploited in order to make the good available for sale. The second level is fiat money which, after a ritualization transition in the course of social evolution, replaced commodity money. Fiat money is symbolic information about personal property and is used to represent exchange values of goods.

7. Discussion and Conclusions

Emergent properties are encountered in various fields of science. The striking irreducibility of higher-level quantities or models to properties available from the lower-level description is still only unsatisfactorily understood, despite numerous published attempts at explanation. Famous examples of emergent properties include entropy as a value of energy in physics, values of symbolic information, selective values in biology, and exchange values in economics. While a rigorous general definition of "value" is difficult, the examples described in the foregoing sections demonstrate common properties irrespective of their different backgrounds, such as their emergent character and their fundamental relevance as order parameters that express extremum principles and the irreversibility of the dynamics.
Besides values, symbols are also emergent phenomena, irreducible to their particular physical properties. In the course of natural and social evolution, symbols have frequently appeared by self-organization as the result of ritualization processes. Symbols are closely related to information, and the way symbols develop from non-symbolic physical or other basic structures offers detailed insights into the nature of emergence.
From the viewpoint of physics, it is reasonable to distinguish between emergent symbolic information on the one hand, based on conventions and associated with life, meaning and a purpose, and structural information on the other hand, associated with arbitrary physical structures or processes. Ritualization is the transition by which emergent symbolic information gradually separates from the original structural information in the course of evolution. All ritualization processes have in common that after the transition, arbitrary symbols are produced and recognized by information processing systems such as transmitters and receivers in the sense of Shannon. The fundamental code symmetry of symbols implies that their structural information constitutes a record of the evolution history of the information-processing system.
In contrast to our approach, we note for completeness that hypotheses have also been suggested in the literature which understand information as the only genuine, fundamental "substance" of which our world actually consists. In that view, our observable reality would be nothing but a "cosmic hologram" generated from that "source code" and projected onto our senses and measuring devices [76]. Baggott [77] plausibly criticized such concepts of information as being more of a religious, philosophical, or metaphysical nature than classical physical science based on, and verified against, observational facts. This assessment applies similarly to explanations of information as a result of divine interference [78,79]. In this paper, "information" and the related concept of "values" are not considered intrinsic physical terms, despite the fact that transfer of information and value is always accompanied by a transfer of fundamental physical quantities such as energy and entropy. In this sense, information is an emergent quantity that cannot be reduced to physics alone, although it is subject to and consistent with fundamental physical laws. Information-processing systems are the result of cooperative self-organization processes in the course of natural evolution.
Information may be quantified by values. An emergent physical value of structural information is entropy. Values are among the most relevant emergent properties, such as fitness in the sense of Darwin or market prices in economics, which evaluate the symbolic information exploited to bring an individual or a commodity, respectively, into existence. Competition between biological species or between producers of commodities is always based on some kind of valuation which drives and controls evolution. The concepts of values and fitness landscapes used to describe competition and evolution are rather abstract and qualitative. Our point of view is that values are emergent properties of subsystems (species, commodities) in certain dynamical contexts. Values express the essence of biological, ecological, economic, or social properties and relations with respect to the dynamics of the system; they represent fundamental order parameters of self-organized cooperative processes in complex multi-particle systems in the sense of Haken's "Synergetics" [80].
Selective values in biology express the value of genetic information, accumulated since the very beginning of life by inheritance and modified by mutations and sexual crossover. By counting the survival rates of competing species, selective values can be measured, at least in principle. The emergent character of selective values precludes computing fitness merely from the physical or chemical properties of a given organism.
Exchange values in economics express the value of human mental information: of the knowledge, skill, and experience of producers, such as the symbolic information stored in construction plans or inventors' patents. By comparing the success or profit of competing producers, exchange values can be measured in the form of market prices and quantities of sold commodities. These examples of information values are consistent with Brillouin's [14] requirement that "any criterion for value will result in an evaluation of the information received", in the form of measurable physical structures that result from the processing of the given symbolic information.

Acknowledgments

The authors express their sincere gratitude to several friends and former colleagues for many fruitful discussions and collaboration on entropy and information, such as Andreas Engel, Jan Freund, Miguel A. Jiménez-Montaño, Joachim Pelkowski, Lutz Schimansky-Geier, Jürn W. P. Schmelzer, Frank Schweitzer, and Igor M. Sokolov. Personal discussions with several pioneers of statistical physics and information theory such as Michael Conrad, Dmitri S. Chernavsky, Manfred Eigen, Hermann Haken, Roman S. Ingarden, Yuri L. Klimontovich, Gregoire Nicolis, Ilya Prigogine, Yuri M. Romanovsky, Ruslan L. Stratonovich, Günter Tembrock, Mikhail V. Volkenstein, and Dmitri N. Zubarev had invaluable influence on the development of our understanding as presented in this paper. The authors are also grateful for valuable suggestions of the reviewers.

Author Contributions

Rainer Feistel and Werner Ebeling wrote the paper, and they contributed equally to this article. Both authors have read and approved the final manuscript.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Planck, M. Über Neuere Thermodynamische Theorien. (Nernstsches Wärmetheorem und Quanten-Hypothese). Berichte der Deutschen Chemischen Gesellschaft 1912, 45, 5–23. (In German) [Google Scholar] [CrossRef]
  2. Prigogine, I. Time, Structure and Fluctuations. Available online: http://www.nobelprize.org/nobel_prizes/chemistry/laureates/1977/prigogine-lecture.pdf (accessed on 17 May 2016).
  3. Alberti, P.M.; Uhlmann, A. Dissipative Motion in State Spaces; Teubner: Leipzig, Germany, 1981. [Google Scholar]
  4. Broer, H.; Takens, F. Dynamical Systems and Chaos; Springer: New York, NY, USA, 2011. [Google Scholar]
  5. Clausius, R. Ueber verschiedene für die Anwendung bequeme Formen der Hauptgleichungen der mechanischen Wärmetheorie. Annalen der Physik 1865, 201, 353–400. (In German) [Google Scholar] [CrossRef]
  6. Burgin, M. Information Theory: A Multifaceted Model of Information. Entropy 2003, 5, 146–160. [Google Scholar] [CrossRef]
  7. Burgin, M. Theory of Information: Fundamentality, Diversity and Unification; World Scientific: Singapore, Singapore, 2010. [Google Scholar]
  8. Burgin, M. Information: Concept Clarification and Theoretical Representation. Commun. Capital. Crit. 2011, 9, 347–357. [Google Scholar]
  9. Ebeling, W.; Freund, J.; Schweitzer, F. Komplexe Strukturen: Entropie und Information; Teubner, B.G., Ed.; Springer: Leipzig, Germany, 1998. [Google Scholar]
  10. Ebeling, W.; Volkenstein, M.V. Entropy and the evolution of biological information. Physica A 1990, 163, 398–402. [Google Scholar] [CrossRef]
  11. Boltzmann, L. Ueber die Beziehung zwischen dem zweiten Hauptsatze der mechanischen Wärmetheorie und der Wahrscheinlichkeitsrechnung, respective den Sätzen über das Wärmegleichgewicht. Sitzb. d. Kaiserlichen Akademie der Wissenschaften 1877, 76, 373–435. Available online: http://users.polytech.unice.fr/~leroux/boltztrad.pdf (accessed on 17 May 2016). [Google Scholar]
  12. Maxwell, J.C. Theory of Heat; Longmans, Green and Co.: London, UK; New York, NY, USA, 1888. [Google Scholar]
  13. Schrödinger, E. What is Life? The Physical Aspect of the Living Cell; Cambridge University Press: Cambridge, UK, 1944. [Google Scholar]
  14. Brillouin, L. Science and Information Theory; Academic Press: New York, NY, USA, 1956. [Google Scholar]
  15. Shannon, C.E. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423, 623–656. [Google Scholar] [CrossRef]
  16. Stratonovich, R.L. Information Theory; Sovyetskoye Radio: Moscow, USSR, 1975. (In Russian) [Google Scholar]
  17. Volkenstein, M.V. Entropy and Information; Birkhäuser: Basel, Switzerland, 2009. [Google Scholar]
  18. Haken, H. Information and Self-Organization; Springer: Berlin, Germany, 1988. [Google Scholar]
  19. Landauer, R. Information is Physical. Phys. Today 1991, 44, 23–29. [Google Scholar] [CrossRef]
  20. Ayres, R.U. Information, Entropy, and Progress: A New Evolutionary Paradigm; AIP Press: Woodbury, NY, USA, 1994. [Google Scholar]
  21. Hawking, S. The Universe in a Nutshell; Bantam Books: New York, NY, USA, 2001. [Google Scholar]
  22. Hofkirchner, W. (Ed.) The Quest for a Unified Theory of Information; Gordon and Breach: Amsterdam, The Netherlands, 1999.
  23. Floridi, L. Information: A Very Short Introduction; Oxford University Press: Oxford, UK, 2010. [Google Scholar]
  24. Feistel, R.; Ebeling, W. Physics of Self-Organization and Evolution, 1st ed.; Wiley: Weinheim, Germany, 2011. [Google Scholar]
  25. Ebeling, W.; Feistel, R. Self-Organization of Symbols and Information. In Chaos, Information Processing and Paradoxical Games: The Legacy of John S Nicolis; Nicolis, G., Basios, V., Eds.; World Scientific: Singapore, Singapore, 2015; pp. 141–184. [Google Scholar]
  26. Feistel, R. Emergence of Symbolic Information by the Ritualisation Transition. In Information Studies and the Quest for Transdisciplinarity; Burgin, M., Hofkirchner, W., Eds.; World Scientific: Singapore, Singapore, 2016; in press. [Google Scholar]
  27. Feistel, R. Ritualisation und die Selbstorganisation der Information. In Selbstorganisation—Jahrbuch für Komplexität in der Natur-, Sozial- und Geisteswissenschaften; Duncker & Humblot: Berlin, Germany, 1990; Volume 1, pp. 83–98. (In German) [Google Scholar]
  28. Ebeling, W.; Feistel, R. Theory of self-organization: the role of entropy, information and value. J. Non-Equilib. Thermodyn. 1992, 17, 303–332. [Google Scholar]
  29. List, C. Free will, determinism, and the possibility of doing otherwise. Noûs 2014, 48, 156–178. [Google Scholar] [CrossRef] [Green Version]
  30. Pattee, H.H. The Physics of Symbols: Bridging the Epistemic Cut. Biosystems 2001, 60, 5–21. [Google Scholar] [CrossRef]
  31. Butterfield, J. Laws, causation and dynamics at different levels. Interface Focus 2012, 2, 101–114. [Google Scholar] [CrossRef] [PubMed]
  32. Fuentes, M.A. Complexity and the Emergence of Physical Properties. Entropy 2014, 16, 4489–4496. [Google Scholar] [CrossRef]
  33. Planck, M. Vom Wesen der Willensfreiheit; Johann Ambrosius Barth: Leipzig, Germany, 1948. [Google Scholar]
  34. Ebeling, W. Value in Physics and Self-Organization. Nat. Soc. Thought 2006, 19, 133–143. [Google Scholar]
  35. Stratonovich, R.L. On the problem of the valuability of information. In Thermodynamics and Regulation of Biological Processes; Lamprecht, I., Zotin, A.I., Eds.; De Gruyter: Berlin, Germany, 1984. [Google Scholar]
  36. Haken, H.; Portugali, J. Information Adaptation: The Interplay Between Shannon Information and Semantic Information in Cognition; Springer: Heidelberg/Berlin, Germany, 2015. [Google Scholar]
  37. Lochmann, D. Information und der Entropie-Irrtum; Shaker Verlag: Aachen, Germany, 2012. [Google Scholar]
  38. Feistel, R. On the Value Concept in Economy. In Models of Self-Organization in Complex. Systems MOSES; Ebeling, W., Peschel, M., Weidlich, W., Eds.; Akademie-Verlag: Berlin, Germany, 1991; pp. 37–44. [Google Scholar]
  39. Kaldasch, J. Evolutionary model of an anonymous consumer durable market. Physica A 2011, 390, 2692–2715. [Google Scholar] [CrossRef]
  40. Schöpf, H.G. Rudolf Clausius. Ein Versuch, ihn zu verstehen. Annalen der Physik 1984, 496, 185–207. (In German) [Google Scholar] [CrossRef]
  41. Ebeling, W. On the relation between various entropy concepts and the valoric interpretation. Physica A 1992, 182, 108–120. [Google Scholar] [CrossRef]
  42. Planck, M. Theorie der Wärmestrahlung: Vorlesungen; Johann Ambrosius Barth: Leipzig, Germany, 1966. (In German) [Google Scholar]
  43. Planck, M. Vorlesungen über die Theorie der Wärmestrahlung; Johann Ambrosius Barth: Leipzig, Germany, 1906. (In German) [Google Scholar]
  44. Pelkowski, J. On the Clausius-Duhem Inequality and Maximum Entropy Production in a Simple Radiating System. Entropy 2014, 16, 2291–2308. [Google Scholar] [CrossRef]
  45. Kittel, C. Thermal Physics; Wiley: New York, NY, USA, 1969. [Google Scholar]
  46. Kabelac, S.; Conrad, R. Entropy generation during the interaction of thermal radiation with a surface. Entropy 2012, 14, 717–735. [Google Scholar] [CrossRef]
  47. Feistel, R. Entropy flux and entropy production of stationary black-body radiation. J. Non-Equilib. Thermodyn. 2011, 36, 131–139. [Google Scholar] [CrossRef]
  48. Zhang, B.; Cai, Q.-Y.; Zhan, M.-S.; You, L. Information conservation is fundamental: Recovering the lost information in Hawking radiation. Int. J. Mod. Phys. D 2013, 22, 1341014. [Google Scholar] [CrossRef]
  49. Szilard, L. Über die Entropievermehrung in einem thermodynamischen System bei Eingriffen intelligenter Wesen. Zeitschrift für Physik A 1929, 53, 840–856. [Google Scholar] [CrossRef]
  50. Eigen, M. From Strange Simplicity to Complex Familiarity; Oxford University Press: Oxford, UK, 2013. [Google Scholar]
  51. Nicolis, J.S.; Bassos, V. Chaos, Information Processing and Paradoxical Games: The Legacy of John S Nicolis; World Scientific: Singapore, Singapore, 2015. [Google Scholar]
  52. Mungan, C.E. Radiation thermodynamics with applications to lasing and fluorescent cooling. Am. J. Phys. 2005, 73, 315–322. [Google Scholar] [CrossRef]
  53. Erkmen, B.I.; Moision, B.E.; Birnbaum, K.M. The classical capacity of single-mode free-space optical communication: A review. IPN Prog. Rep. 2009, 15, 42–179. Available online: http://ipnpr.jpl.nasa.gov/progress_report/42–179/179C.pdf (accessed on 17 May 2016). [Google Scholar]
  54. Jiménez-Montaño, M.A.; Feistel, R.; Diez-Martínez, O. On the information hidden in signals and macromolecules. Nonlinear Dyn. Psychol. Life Sci. 2004, 8, 445–478. [Google Scholar]
  55. Huxley, J. The Courtship-Habits of the Great Crested Grebe (Podiceps cristatus); with an Addition to the Theory of Sexual Selection. J. Zool. 1914, 84, 491–562. [Google Scholar]
  56. Huxley, J. The ritualization of behaviour in animals and man. Philos. Trans. R. Soc. 1966, 251, 249–269. [Google Scholar] [CrossRef]
  57. Bickerton, D. Adam’s Tongue; Hill and Wang: New York, NY, USA, 2009. [Google Scholar]
  58. Wright, S. Evolution in Mendelian Populations. Genetics 1931, 16, 97–159. [Google Scholar] [PubMed]
  59. Wright, S. The roles of mutation, inbreeding, crossbreeding and selection in evolution. In Proceedings of the Sixth International Congress of Genetics, Ithaca, New York, NY, USA, 1932; Brooklyn Botanic Garden: New York, NY, USA, 1932; Volume I, pp. 356–366. Available online: http://www.esp.org/books/6th-congress/facsimile/title3.html (accessed on 16 March 2016). [Google Scholar]
  60. Fisher, R.A. The Genetical Theory of Natural Selection; Clarendon Press: Oxford, UK, 1930. [Google Scholar]
  61. Eigen, M. The self-organization of matter and the evolution of biological macromolecules. Naturwiss 1971, 58, 465–523. [Google Scholar] [CrossRef] [PubMed]
  62. Wilson, E.O.; Bossert, W.H. Einführung in die Populationsbiologie; Springer: Berlin/Heidelberg, Germany, 1973. (In German) [Google Scholar]
  63. Feistel, R.; Ebeling, W. Models of Darwinian Processes and Evolutionary Principles. BioSystems 1982, 15, 291–299. [Google Scholar] [CrossRef]
  64. Lamprecht, I.; Zotin, A.I. (Eds.) Thermodynamics and Regulation of Biological Processes; De Gruyter: Berlin, Germany, 1984.
  65. Ebeling, W.; Engel, A.; Esser, B.; Feistel, R. Diffusion and Reaction in Random Media and Models of Evolutionary Processes. J. Stat. Phys. 1984, 37, 369–384. [Google Scholar] [CrossRef]
  66. Feistel, R.; Romanovsky, Yu.M.; Vasiliev, V.A. Evolyutsiya Gipertsiklov Eigena, Protekayushchikh v Koatservatakh (Evolution of Eigen’s Hypercycles Existing in Coacervates). Biofizika 1980, 25, 882–887. (In Russian) [Google Scholar]
  67. Eigen, M.; McCaskill, J.; Schuster, P. The Molecular Quasi-Species. In Advances in Chemical Physics; Prigogine, I., Rice, S.A., Eds.; Wiley: Hoboken, NJ, USA, 1989; Volume 75, pp. 149–264. [Google Scholar]
  68. Volkenstein, M.V. Physical Approaches to Biological Evolution; Springer: Heidelberg/Berlin, Germany, 1994. [Google Scholar]
  69. Smith, A. Der Wohlstand der Nationen; Deutscher Taschenbuch Verlag: München, Germany, 1999. (In German) [Google Scholar]
  70. Marx, K. Das Kapital: Kritik der Politischen Oekonomie; Dietz-Verlag: Berlin, Germany, 1951. [Google Scholar]
  71. Ebeling, W.; Feistel, R. Physik der Selbstorganisation und Evolution; Akademie-Verlag: Berlin, Germany, 1982. [Google Scholar]
  72. Bocharova, S.P. Influence of the Information Value of Objects on the Level of Involuntary Memorization. Sov. Psychol. 1969, 7, 28–36. [Google Scholar] [CrossRef]
  73. Value of information. Available online: https://en.wikipedia.org/wiki/value_of_information (accessed on 17 May 2016).
  74. Schumpeter, J.A. Das Wesen des Geldes: Aus dem Nachlaß Herausgegeben und mit Einer Einführung Versehen; Vandenhoeck & Ruprecht: Göttingen, Germany, 2008. [Google Scholar]
  75. Bentham, J. Principles of the Civil Code. Available online: https://www.laits.utexas.edu/poltheory/bentham/pcc/pcc.pa01.c08.html (accessed on 17 May 2016).
  76. Bousso, R. Holography in General Space-times. J. High Energy Phys. 1999, 1999. [Google Scholar] [CrossRef]
  77. Baggott, J. Farewell to Reality; Constable & Robinson: London, UK, 2013. [Google Scholar]
  78. Elsberry, W.; Shallit, J. Information theory, evolutionary computation, and Dembski’s complex specified information. Synthese 2011, 178, 237–270. [Google Scholar] [CrossRef]
  79. Abel, D.L. The capabilities of chaos and complexity. Int. J. Mol. Sci. 2009, 10, 247–291. [Google Scholar] [CrossRef] [PubMed]
  80. Haken, H. Synergetics: An Introduction; Springer: Berlin/Heidelberg, Germany, 1978. [Google Scholar]

Share and Cite

MDPI and ACS Style

Feistel, R.; Ebeling, W. Entropy and the Self-Organization of Information and Value. Entropy 2016, 18, 193. https://doi.org/10.3390/e18050193
