Article

Information Dynamics in Complex Systems Negates a Dichotomy between Chance and Necessity

College of Pharmacy, University of Cincinnati, Cincinnati, OH 45267-0514, USA
Information 2020, 11(5), 245; https://doi.org/10.3390/info11050245
Submission received: 9 March 2020 / Revised: 1 May 2020 / Accepted: 1 May 2020 / Published: 2 May 2020
(This article belongs to the Section Information Theory and Methodology)

Abstract

Entropy increases in the execution of linear physical processes. At equilibrium, all uncertainty about the future is removed and information about the past is lost. Complex systems, on the other hand, can lead to the emergence of order, sustain uncertainty about the future, and generate new information to replace all old information about the system in finite time. The Kolmogorov–Sinai entropy for events and the Kolmogorov–Chaitin complexity for strings of numbers both approximate Shannon’s entropy (an indicator for the removal of uncertainty), indicating that information production is equivalent to the degree of complexity of an event. Thus, in the execution of non-linear processes, information entropy is inseparably tied to thermodynamic entropy. Therein, the critical decision points (bifurcations), which can exert lasting impact on the evolution of the future (the “butterfly effect”), defy the definition of being either born from randomness or from determination. Nevertheless, their information evolution and degree of complexity are amenable to measurement and can meaningfully replace the dichotomy of chance versus necessity. Common anthropomorphic perceptions do not accurately account for the transient durability of information, the potential for major consequences by small actions, or the absence of a discernible opposition between coincidence and inevitability.

The terms “chance” and “necessity” are part of the daily vocabulary. However, they have been difficult to define rigorously [1]. Even the delineation of what is lawful and what is random has been subject to debate. Karl R. Popper, writing about the differentiation of planetary movements as lawful and throws of dice as chance events, observed that, thus far, prognoses of planetary orbits have been successful while prognoses of individual throws of dice have not. Extending a rationale developed earlier by Kurt Gödel, Popper demonstrated that self-prediction from within a system is impossible, and he was consequently led to become an indeterminist [2].
The concept of randomness may be characterized in various ways as an event without a perceivable design (Appendix A(1)). The absence of design can be reflected in the “random” mechanism of generation or in the “random” pattern of output.
- Uncertainty in the mechanism that generates specific outcomes may produce randomness. The throwing of dice is such a mechanism of uncertainty. Likewise, a string of numbers is random if it is generated by an unpredictable mechanism [3]. The randomness resides in the disorder of the generating process.
- The arrangement of a pattern or a string of numbers may reflect randomness. A string is random because there is no inherent design that allows the prediction of consecutive digits or because there are no simple rules that enable the description of the string in compressed form [4]. The randomness lies in the arrangement itself.
Information (the removal of uncertainty, Appendix A(2)) and complexity (non-linearity, not amenable to closed-form solutions) are intricately linked to our everyday understanding of chance, but unlike “chance” or “randomness” these physical entities can be subjected to rigorous definition and analysis. Whereas no information is yielded by equilibrium (stasis), information flow (more accurately information evolution [5]) is tied to an occurrence (Section 1). The complexity of that occurrence underlies the amount of new information generated (Section 2 and Section 3), and it produces emergent phenomena (Section 4). At the bifurcations that are inherent in non-linear systems, infinitesimal influences can yield dramatically divergent results (Section 4), but any analysis of deterministic versus indeterministic foundations for complex processes is bound to be unsuccessful (Section 5 and Section 6). Substantial implications derive from the transience of information, the potential for major consequences by small actions at decision points (bifurcations), and the breakdown of the dichotomy between chance and necessity.

1. Information Is the Removal of Uncertainty

The information content of an event is, in principle, quantifiable with the algorithms developed by Claude Shannon [6]. An event with a certain outcome (a probability of 1) yields no new information. The information content of a process is interpretable as the removal of the uncertainty that exists before its execution. This uncertainty constitutes the random or chance component inherent in the process. With its execution, the random element is removed, and new information is communicated.
In Shannon’s model, communication takes place between a sender and a receiver via a channel, the capacity of which is a critical determinant that is calculated from its noise characteristics. For all communication rates below channel capacity, the probability of error can in principle become arbitrarily small. However, theoretically optimized communication schemes may be computationally impractical. Random processes have an irreducible complexity, below which the signal cannot be compressed. The ultimate data compression is dubbed the entropy (Appendix A(3) and Appendix A(4)). Entropy and mutual information are functions of the probability distributions that underlie the process of communication.
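To make this quantification concrete, the following minimal Python sketch (added here as an illustration; it is not part of the original article and assumes only the standard library) computes the Shannon entropy of a discrete probability distribution. A certain outcome yields zero bits, a fair coin toss yields one bit, and a fair die yields log2(6), about 2.585 bits, of removed uncertainty per throw.

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy H = -sum(p * log2(p)) in bits; outcomes with p = 0 contribute nothing."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A certain outcome (probability 1) removes no uncertainty.
print(shannon_entropy([1.0]))          # 0.0 bits
# A fair coin toss removes one bit of uncertainty.
print(shannon_entropy([0.5, 0.5]))     # 1.0 bit
# A biased coin conveys less than one bit per toss.
print(shannon_entropy([0.9, 0.1]))     # ~0.469 bits
# A fair six-sided die removes log2(6) bits per throw.
print(shannon_entropy([1/6] * 6))      # ~2.585 bits
```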

2. Information Has a Finite Lifespan

A swinging pendulum must eventually come to a halt. Once at rest, the history of its swing cannot be reconstructed by any means. The second law of thermodynamics states that a closed system progresses toward a state of maximal entropy. Once this state is reached, all information on the path that led to it is lost, as is any uncertainty about its future. Thermodynamics (Appendix A(5)) has introduced the asymmetry of time irreversibility into physical law, and it has established the inescapability of losing information about the past (a form of “collective forgetting”).
The energy of physical systems is describable on the macro-scale, which in classical mechanics is completely intelligible and can be captured with deterministic algorithms, and on the micro-scale of thermal motion, which is randomly distributed and can be expressed only as averages over large numbers. In the epochs preceding non-linear systems research, the states of the micro-scales were assumed to be uniform or stochastic and to constitute a lower limit of feasible explanation [7]. An adjustment of this concept is required in non-linear events that arise far from equilibrium, where energy is believed to emerge from the micro-scales and affect the macroscopic outcome [8].
In dynamics, motion can be depicted as a flow along trajectories in a phase space that often maps momentum versus position variables (sometimes velocity versus position) [9,10]. This phase space has as many dimensions as the system has degrees of freedom. In laminar flow, motion is governed by boundary and initial conditions, and no new information is generated, as outlined above. In turbulent flow, information is continuously generated by the flow itself through an exchange between the micro- and macro-scales. The average rate of information production (bits of information per unit time, denoted as λ̄; the term is identical to the Lyapunov characteristic exponent [11]) is a measure for the rate of divergence of nearby trajectories (Appendix A(6)). The transition of a system from laminar to turbulent behavior is reflected in a change of the Lyapunov characteristic exponent from negative to positive, corresponding to a change of the system from an information sink to an information source, even if this turbulent behavior is governed by simple system equations. The Lyapunov characteristic exponent is a system state function that remains invariant under coordinate transformations. The information generated anew by turbulent systems precludes prediction past a certain time, when new information has accumulated to entirely displace the initial data. “New information is continuously being injected into the macroscopic degrees of freedom of the world by every puff of wind and every swirl of water” [8]. The loss of information about the initial state of a system is a physical reality, whether the occurring process strives toward equilibrium or toward emergence.
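As a concrete illustration of how the sign of λ̄ separates information sinks from information sources, the following sketch (added here; it is not part of the original article, assumes only the Python standard library, and uses the one-dimensional logistic map of Section 6 as a stand-in for a flow) estimates the Lyapunov characteristic exponent as the orbit average of ln|f′(x)|. In the periodic regime the estimate is negative (information sink); at k = 4 it is positive and close to ln 2 per iteration (information source).

```python
import math

def lyapunov_logistic(k, x0=0.2, n_transient=1_000, n_iter=100_000):
    """Estimate the Lyapunov exponent of x_{t+1} = k*x_t*(1-x_t) as the
    orbit average of ln|f'(x_t)| = ln|k*(1 - 2*x_t)|, in nats per iteration."""
    x = x0
    for _ in range(n_transient):      # discard the transient approach to the attractor
        x = k * x * (1 - x)
    total = 0.0
    for _ in range(n_iter):
        x = k * x * (1 - x)
        total += math.log(abs(k * (1 - 2 * x)))
    return total / n_iter

print(lyapunov_logistic(2.8))   # negative: stable fixed point, information sink
print(lyapunov_logistic(3.2))   # negative: stable period-2 cycle, information sink
print(lyapunov_logistic(4.0))   # positive, ~ln 2 = 0.693: chaotic, information source
```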

3. Information and Complexity Are Equivalent

An occurrence in phase space, state space, or event space is characterized by its occupancy of the attributed space. In essence, such an analysis of occurrences in abstract space detaches the definition of dimensions from the axes of concrete space and relieves its restriction to whole numbers. The application of the Lyapunov characteristic exponent λ̄ to such a conceptual space yields a measure for the information flow (or the information evolution) associated with each step in the execution of the event (Appendix A(7)). It lays the foundation for the calculation of a fractal dimensionality, the Lyapunov dimension, that characterizes the information evolution of the event in its entirety. This dimensionality gives an estimate of the Kolmogorov–Sinai entropy [12], which defines the information change inherent in the execution of the process. The Lyapunov dimension D_KY represents an upper bound for the information dimension of the system (Appendix A(8)).
$$D_{KY} = k + \frac{\sum_{i=1}^{k} \bar{\lambda}_i}{\left|\bar{\lambda}_{k+1}\right|}$$
where k is the maximum integer such that the sum of the k largest exponents is non-negative, and λ̄_i denotes the Lyapunov characteristic exponents. The insertion of the Lyapunov characteristic exponents into this formula connects the complexity of a process, the execution of which impacts thermodynamic entropy development, to its information entropy.
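For illustration (a minimal sketch added to this text, assuming only the Python standard library; the exponents used below are approximate literature values for the Lorenz attractor and serve purely to show the arithmetic), the Lyapunov dimension can be evaluated directly from a Lyapunov spectrum:

```python
def kaplan_yorke_dimension(exponents):
    """Lyapunov (Kaplan-Yorke) dimension D_KY = k + sum(k largest exponents) / |lambda_{k+1}|,
    where k is the largest integer for which the partial sum is non-negative."""
    lams = sorted(exponents, reverse=True)
    partial, k = 0.0, 0
    for lam in lams:
        if partial + lam < 0:
            break
        partial += lam
        k += 1
    if k == len(lams):        # all exponents together sum to a non-negative value
        return float(k)
    return k + partial / abs(lams[k])

# Approximate Lyapunov spectrum of the Lorenz attractor (illustrative values only):
print(kaplan_yorke_dimension([0.906, 0.0, -14.57]))   # ~2.06, a fractal dimension
```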
As non-linear systems dynamics studies algorithms that do not have closed-form solutions (Appendix A(9)), its analyses draw heavily on information theory [13]. Complexity has been measured in simple strings of 0s and 1s. A string is random if its shortest description is obtained by writing it out in its entirety—the shortest description of the string is the string itself. The more compressible the string, the less random it is. Connecting complexity and information content, the Kolmogorov–Chaitin complexity (K) of a string is approximately equal to the Shannon entropy (H), if the sequence of the string under study is drawn at random from a distribution that has the entropy H [14]. Specifically, for almost all infinite sequences produced by a stationary process, the growth rate of the Kolmogorov–Chaitin complexity is equal to the Shannon entropy rate. More broadly, diverse measures for information change, entropy and dimensionality in conceptual space are interdependent (Table 1), suggesting that these modalities of assessment largely equate to each other.
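The correspondence between incompressibility and randomness can also be illustrated numerically if a general-purpose compressor is accepted as a crude, computable stand-in for the (uncomputable) Kolmogorov–Chaitin complexity. The sketch below is an added illustration under that assumption and uses only Python's standard library: data generated by a short rule compress to a small fraction of their length, whereas random data are essentially incompressible.

```python
import os
import zlib

def compressed_fraction(data: bytes) -> float:
    """Compressed size relative to original size: values near 0 indicate strong
    regularity (a short description exists), values near 1 indicate randomness."""
    return len(zlib.compress(data, level=9)) / len(data)

n = 100_000
ordered = b"01" * (n // 2)      # fully described by a rule much shorter than the string
random_data = os.urandom(n)     # no description appreciably shorter than the data itself

print(f"ordered data: {compressed_fraction(ordered):.4f}")      # close to 0
print(f"random data:  {compressed_fraction(random_data):.4f}")  # close to (or slightly above) 1
```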

4. Complexity Drives Emergence via Critical Decision Points

Complex systems are characterized by non-periodic flow with sensitive dependence on the starting conditions, emergence of properties from non-equilibrium states, and generation of new information to fully replace old information. Hence, there are limits to how precisely we can describe the evolution of non-linear processes. Uncertainty about the future grows rapidly. Further, in the progression of complex systems there exist decision points, at which infinitesimal influences can propel events onto one of several very different future paths. Proceeding in this manner, the emergence of ordered structures from non-equilibrium conditions has been described in physics [15], chemistry [16,17] and evolution [18].
States in thermodynamic equilibrium (or states that equate to a minimal entropy production in the linear thermodynamics of non-equilibrium) are stable states that are, in principle, reversible. Yet, irreversible processes play a fundamental and constructive role in the physical world. The laws of irreversible processes [10] have embedded dynamics in a more comprehensive formalism that includes unstable states. Non-equilibrium can lead to dissipative structures, wherein fluctuations introduce a stochastic description into the macroscopic level. Instabilities far from equilibrium are essential elements for emerging systems [19].
Evolution has been investigated as occurring in a conceptual state space, the shape of which is defined by the distribution of properties across an ensemble (a fitness landscape; Appendix A(10)). The initiating event for every step in evolution is a change (a mutation). Once a mutation has taken place, its penetration of the population is subjected to the rule of selection. However, simple and complex systems can exhibit powerful self-organization, and the effects of mutation and selection are diminished when operating on systems that have their own robust self-ordered properties. Spontaneous order is maintained despite selection, not because of it. Selection may only be able to mitigate the tendency for adaptive processes to become trapped on continuously lower local optima of fitness as complexity increases. While the selective force is stronger than the mutational force below a critical complexity of an organism, above this critical complexity, the dispersing mutational pressure increases, and the population falls from the global optimum to a suboptimal stationary steady state. Hence, the outcome of selection is context dependent. It drives or maintains complex systems on the boundary between order and chaos (sometimes dubbed “the edge of chaos” or “the onset of chaos”). At that boundary (“the poised state”), systems are best able to coordinate complex tasks and evolve in a complex environment [18].
A lack of periodicity is very common in nature. It is one of the distinguishing features of complex systems, for which progression and advanced states are unpredictable [20]. Events that are governed by non-linear differential equations have traditionally been considered deterministic on the basis of their driving algorithms, even though the slightest immeasurable deviations in their initial states can lead to dramatically different outcomes. For a finite system of ordinary differential equations representing forced dissipative flow, frequently, all of its solutions are ultimately confined within the same bounds. For these equations, non-periodic solutions cannot readily be determined, except by numerical procedures. The evolution of dissipative systems under these equations is commonly modeled with trajectories in phase space. Prediction of the sufficiently distant future is not feasible by any method, unless the initial conditions are known exactly (an impossible feat, as detailed in Section 5 below). There is an eventual necessity for any bounded system of finite dimensionality to come arbitrarily close to acquiring a state it has previously assumed. Only if the system is stable will its future development then remain arbitrarily close to its past history, and it will be quasi-periodic. Unstable systems display the now-famous “butterfly effect”: one flap of a butterfly’s wings may change the future course of the weather in a place far away. “The result […] implies that two states differing by imperceptible amounts may eventually evolve into two considerably different states. If, then, there is any error whatever in observing the present state–and in any real system such errors seem inevitable—an acceptable prediction of an instantaneous state in the distant future may well be impossible.” [21].
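The butterfly effect can be made tangible with a short numerical experiment (an illustration added to this text, not part of the original article; it assumes the numpy package is available). Two copies of the Lorenz equations are integrated from initial states that differ by one part in a billion; the separation grows roughly exponentially until, after a finite time, the two states bear no resemblance to each other.

```python
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance the Lorenz equations by one fourth-order Runge-Kutta step."""
    def f(s):
        x, y, z = s
        return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])
    k1 = f(state)
    k2 = f(state + 0.5 * dt * k1)
    k3 = f(state + 0.5 * dt * k2)
    k4 = f(state + dt * k3)
    return state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

a = np.array([1.0, 1.0, 1.0])
b = a + np.array([1e-9, 0.0, 0.0])   # an imperceptible difference in the initial state

for step in range(1, 4001):
    a, b = lorenz_step(a), lorenz_step(b)
    if step % 1000 == 0:
        # The separation grows roughly exponentially until it saturates at the attractor size.
        print(f"t = {step * 0.01:5.1f}   separation = {np.linalg.norm(a - b):.3e}")
```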

5. At the Critical Decision Points, Randomness Versus Determination Is Undecidable

It is in principle impossible to assess whether decision points, such as bifurcations, are governed by deterministic or indeterministic mechanisms. Phase space models have a minimal discernible resolution (Appendix A(11)), beyond which description is infeasible [8]. Thermodynamic considerations striving to incorporate irreversibility into physical law have developed event space models, wherein internal time and entropy become operators. The resolution of the description is restrained by the eigenvectors and eigenvalues of the operators [10].
Non-linear systems research often describes events as trajectories in phase space. Such phase space is divisible into blocks of minimum size that represent states. Their expansions constitute the limits to the precision of obtainable knowledge. Progressive attempts to more precisely measure the system eventually lead to the microscopic scale, wherein the uncertainty principle assures that the more accurately the position in phase space is determined, the less accurately the velocity can be assessed [22]. The analogous relationship exists between energy and time. Two trajectories become indistinguishable after they have approached each other below the distance of a block size. At the heart of non-linear systems are bifurcation points, where a system can evolve either toward one state or toward another, yielding very different outcomes. The infinite accuracy of measurement at the bifurcation point that would be required to predict which state a system will assume is impossible. Hence, the unpredictability of complex systems is rooted in the physical limit on the precision of obtainable knowledge [23].
In the vicinity of decision points (bifurcations), predictability under the law of large numbers is not valid anymore. While the molecular interactions in chemical reactions or mechanical motion far from equilibrium do not change from those in equilibrium, they do become dependent on global conditions. The transition from the time-reversible formulas of chemistry or mechanics to the algorithms of emerging processes is accomplished through a unique form of a non-local transformation, in which the homogeneity of the space-time structure is eliminated and both entropy and time become operators in an event space. This transition involves an internal time that is derived from the uncertainty associated with the trajectories in unstable dynamic systems. The transformation leads to a spatiotemporally non-local description [19]. Intelligibility is limited to the eigenvectors and eigenvalues of the operators.

6. Complexity and Chance Are Mirror-Images of the Same Events

Any discrete process that maps the interval onto itself and whose reverse process is double- or multiple-valued fulfills the Li and Yorke criterion for chaos. Sequences of numbers obtained by iteration of such processes (t indicates iteration steps) are mirror images of paths of random walks in the reverse, branched processes [24]. A deterministic system may be defined as one whose future and past are both unique functions of the present. On this basis, the difference equation (the logistic map equation) is
$$x_{t+1} = k \cdot x_t \cdot (1 - x_t),$$
the inverse being
$$x_t = \frac{1}{2} \pm \frac{1}{2}\sqrt{1 - \frac{4\,x_{t+1}}{k}},$$
which is not deterministic because there are two different pasts of equal probability, even though the future is a unique function of the present. Iteration of the latter equation is still possible if rules are defined for choosing one of the two values. Generally, for any one-dimensional map, what is unstable in the iteration of the forward process is stable in the iteration of the inverse process, and vice versa. If iteration of the inverse equation is performed with random choices at each step, there will be no convergence toward a point or a cycle. The mirror image of this iteration will be a chaotic sequence produced by the initial difference equation: if the last number of a sequence produced by the inverse equation, with random decision making at each iteration, is used as the first number of an iteration of the initial difference equation, the same sequence is obtained in reverse. That is, the same sequence of numbers can be produced by a stochastic as well as by a deterministic process. Chaos in a one-dimensional difference equation can be viewed as a reversed random walk [25].
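This equivalence can be demonstrated numerically. The sketch below (an added illustration, not taken from the cited work; it assumes only the Python standard library and uses k = 4) iterates the inverse equation with a random choice of branch at every step and then iterates the forward difference equation from the last value obtained. The deterministic forward orbit retraces the stochastically generated sequence in reverse, up to accumulating floating-point error.

```python
import math
import random

k = 4.0

def forward(x):
    """Deterministic logistic map: x_{t+1} = k*x_t*(1 - x_t)."""
    return k * x * (1 - x)

def inverse(x_next):
    """Stochastic inverse: pick one of the two equally probable preimages at random."""
    root = 0.5 * math.sqrt(1 - 4 * x_next / k)
    return 0.5 + root if random.random() < 0.5 else 0.5 - root

# Random walk backward through the branched inverse process.
backward = [0.3]
for _ in range(10):
    backward.append(inverse(backward[-1]))

# Deterministic forward iteration from the last value of the backward walk.
x = backward[-1]
recovered = [x]
for _ in range(10):
    x = forward(x)
    recovered.append(x)

# The forward orbit reproduces the backward walk in reverse order.
for stochastic, deterministic in zip(reversed(backward), recovered):
    print(f"{stochastic:.12f}   {deterministic:.12f}")
```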

7. Conclusions

The present analysis shows that the perception of a dichotomy between determined and undetermined elements of the world is an artifact, generated by the human mind. A distinction between a “random” occurrence and a complex “deterministic” occurrence is impossible to make, as it has no correlate in nature. Models in phase space (typically plotting momentum versus position) or event space (an operator space) inherently have insufficient resolution to enable an assessment in favor of one or the other (Section 5). Strings of numbers can reflect deterministic chaos (a form of complexity) or randomness, depending on the order in which they are encountered (Section 6). By contrast, we can measure the new information produced by an event (through the Lyapunov characteristic exponent) and the degree of complexity inherent in the event (through the information dimension) to obtain relevant, quantitative readouts (Section 3). Rather than a dichotomy between chance and necessity, quantifiable information evolution and information dimension, as well as entropy, are suitable scientific measures for the complexity of an occurrence, which can be interpreted as being reflective of the “ordered” and “chance” components that contribute to this occurrence [5].
The universe is one coherent entity. Nevertheless, the traditional, reductionist description of nature has typically categorized observations and created opposites that are purportedly mutually independent, thus generating partial entities of the world that have been treated as detached from one another. Such categorizations had the benefit of achieving tractability through a quasi-linearization of descriptions [26]. The construct of a dichotomy between necessary events and chance events enabled us—especially in the eras preceding computers and complexity research—to make meaningful observations about parts of the world. However, it has given rise to centuries of discussions about world views that span the entire spectrum from deterministic fatalism to indeterministic nihilism. The results from non-linear systems research must lead to the conclusion that the dichotomous perception itself is untenable. There is no meaningful answer to the question whether a complex event has originated in chance or necessity. Only readouts for its extent of complexity are meaningful assessments.
Human societal value systems are often rooted in the idea of name recognition as a modality of acknowledgement for achievement. For fame we compete. Yet, the present analysis has shown that information generally has a finite lifespan. Eventually, name recognition—like all information—will be replaced and the name will turn into smoke. Societies are emergent systems wherein new information will eventually entirely replace old information. By contrast, minuscule actions at decision points (bifurcations) of complex systems can have an irreversible and durable impact on the future. In keeping with natural laws, actions—even small representations thereof—may deserve priority over attribution of credit as the measure for contributions to progress.

Funding

This research received no external funding.

Conflicts of Interest

The author declares no conflict of interest.

Appendix A

(1). For a discussion of chance versus randomness see Eagle [27]. As the present discussion negates the physical possibility of resolving the dichotomy between chance and necessity, the more subtle distinction between chance and randomness becomes moot. Information flow (or information evolution) and complexity (fractal dimensionality) are amenable to measurements and offer well-suited replacements for the difficult-to-define terms “chance”, “randomness”, “necessity” and “determination” in the description of nature.
(2). Randomness is often explicated as unpredictability [28]. Hence, uncertainty is a reflection of the random components of an event.
(3). The formal convergence between the Shannon formula for entropy in information theory and the Boltzmann formula for thermodynamic entropy has given rise to extensive debates over meaningful connections between them. They have a common origin in probability theory, and they need to satisfy common requirements such as additivity. “If one accepts the probabilistic interpretation of the entropy, and agrees on the meaning of Shannon’s information, then the interpretation of the thermodynamic entropy as thermodynamic information becomes inevitable” [29].
Other formulations of entropy have been proposed, which measure complexity in dynamical systems (Kolmogorov–Sinai entropy; see Section 3) or in strings of numbers (Kolmogorov–Chaitin complexity). They share with the Shannon entropy the dependence on information. While Kolmogorov–Chaitin complexity is concerned with the content of information, Shannon entropy is concerned with the missing information.
(4). The Rényi entropy [30] is a generalization of the Shannon entropy. In the limit where the order approaches 1, the Rényi definition of entropy converges to the Shannon definition (a brief numerical sketch of this convergence is appended at the end of this appendix). An attractor for which the Rényi dimensions are not all equal exhibits multifractal structure. Thus, the Rényi dimension connects uncertainty to the dimensionality of the space (state space, phase space or event space), in which it is measured.
(5). The concept of entropy was developed in the 1850s by Rudolf Clausius, who described it as the transformation content (dissipative use of energy) of a thermodynamic system or working body of chemical species during a change of state. Much of the mathematics for the field of thermodynamics was then formulated in the 1870s by Ludwig Boltzmann and J. Willard Gibbs.
(6). Stable periodic orbits have a Lyapunov characteristic exponent (λ̄) that is negative. Only in the hypothetical case of λ̄ = 0 can transients persist indefinitely and information on the underlying perturbation of the system be preserved.
(7). To be consistent with the second law of thermodynamics, proper equations of motion must allow for the existence of an attractor. This is not the case for Newton’s laws or for Lagrangian and Hamiltonian mechanics, but it is achievable through the redefinition of entropy and an internal time in the form of operators. This description of dynamics in terms of operators dramatically reshapes space-time [10,19]. However, due to the reliance on the eigenvectors and eigenvalues of the operators for time and thermodynamic entropy, an assessment of the information evolution associated with an event now requires algorithms that are independent of trajectories in phase space. To achieve this, a reformulation of the Lyapunov characteristic exponent has been applied to the vector-based event space [5].
(8). Sinai and Kolmogorov built on the Shannon concept of information entropy to determine a measure for the complexity of the motion taking place in a dynamical system. Pesin later demonstrated that, when the Kolmogorov–Sinai entropy is greater than 0, the dynamical system will display non-linearity (chaos). According to Pesin’s theorem [31], the sum of all the positive Lyapunov exponents gives an estimate of the Kolmogorov–Sinai entropy.
(9). In non-periodic flow, closed-form predictions are impossible because the information they would represent simply does not exist prior to the operation of the mechanism.
(10). The movement over a fitness landscape is an alternative model to the progression of a trajectory in phase space.
(11). “As soon as any trajectories approach within some distance ∆h of each other, they will become indistinguishable. In any physical implementation of the system ∆h may vary depending on the accuracy of the instrument measuring the system position, the thermal motion of the system, or many other factors. However, even with “perfect” instruments and at absolute zero, ∆h can never be reduced to zero. The Uncertainty Principle assures us that there is a minimum block size in phase space, which is a physical constant of Nature. Should two orbits arrive within such a block, they are no longer distinguishable, and the information represented in their separate origins is lost.” [8].
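As a numerical complement to note (4), the following minimal sketch (an added illustration assuming only the Python standard library) evaluates the Rényi entropy H_alpha = (1/(1 - alpha)) · log2(Σ p_i^alpha) for orders approaching 1 and shows the convergence to the Shannon entropy.

```python
import math

def renyi_entropy(probabilities, alpha):
    """Rényi entropy of order alpha (alpha > 0, alpha != 1), in bits."""
    return math.log2(sum(p ** alpha for p in probabilities if p > 0)) / (1 - alpha)

def shannon_entropy(probabilities):
    """Shannon entropy in bits, the limit of the Rényi entropy as alpha -> 1."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

p = [0.5, 0.25, 0.125, 0.125]
for alpha in (0.5, 0.9, 0.99, 0.999, 2.0):
    print(f"alpha = {alpha:6.3f}   H_alpha = {renyi_entropy(p, alpha):.4f} bits")
print(f"Shannon entropy (alpha -> 1 limit): {shannon_entropy(p):.4f} bits")
```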

References

  1. Ford, J. How random is a coin toss? Phys. Today 1983, 36, 40–47. [Google Scholar] [CrossRef]
  2. Popper, K.R. The Open Universe: An Argument for Indeterminism, 2nd ed.; Rowman and Littlefield: Totowa, NJ, USA, 1982. [Google Scholar]
  3. Eagle, A. Randomness is unpredictability. Br. J. Philos. Sci. 2005, 56, 749–790. [Google Scholar] [CrossRef]
  4. Chaitin, G. The Unknowable; Springer: Singapore, 1999. [Google Scholar]
  5. Weber, G.F. Information gain in event space reflects chance and necessity components of an event. Information 2019, 10, 358. [Google Scholar] [CrossRef] [Green Version]
  6. Shannon, C.E. A mathematical theory of communication. Bell Syst. Technol. J. 1948, 27, 379–423, 623–656. [Google Scholar] [CrossRef] [Green Version]
  7. McKelvey, B. Thwarting Faddism at the Edge of Chaos; European Institute for Advanced Studies in Management Workshop on Complexity and Organization: Brussels, Belgium, 1998. [Google Scholar]
  8. Shaw, R. Strange attractors, chaotic behavior, and information flow. Z. Nat. 1981, 36, 80–112. [Google Scholar] [CrossRef]
  9. Thompson, J.M.T.; Stewart, H.B. Nonlinear Dynamics and Chaos; John Wiley and Sons: Chichester, UK; New York, NY, USA; Brisbane, Australia; Toronto, ON, Canada; Singapore, 1986. [Google Scholar]
  10. Prigogine, I. From Being to Becoming: Time and Complexity in the Physical Sciences; W.H. Freeman and Company: New York, NY, USA, 1980. [Google Scholar]
  11. Oseledets, V.I. A multiplicative ergodic theorem. Characteristic Ljapunov exponents of dynamical systems. Trans. Mosc. Math. Soc. 1968, 19, 197–231. [Google Scholar]
  12. Sinai, Y.G. On the Notion of Entropy of a Dynamical System. Dokl. Russ. Acad. Sci. 1959, 124, 768–771. [Google Scholar]
  13. Weber, G.F. Dynamic knowledge—A century of evolution. Sociol. Mind 2013, 3, 268–277. [Google Scholar] [CrossRef] [Green Version]
  14. Kolmogorov, A.N. Logical basis for information theory and probability theory. IEEE Trans. Inf. Theory 1968, 14, 662–664. [Google Scholar] [CrossRef] [Green Version]
  15. Prigogine, I.; Stengers, I. Order Out of Chaos; Bantam Books: Toronto, ON, Canada; New York, NY, USA; London, UK; Sydney, Australia, 1984. [Google Scholar]
  16. Belusov, R.P. Periodically acting reaction and its mechanism. In Сбoрник Рефератoв Пo Радиoциoннoй Медицине 1958; Collection of Abstracts on Radiation Medicine 1958; Medgiz: Moscow, Russia, 1959; pp. 145–147. [Google Scholar]
  17. Zhabotinsky, A.M. Periodical process of oxidation of malonic acid solution. Biophysics 1964, 9, 306–311. [Google Scholar]
  18. Kauffman, S.A. The Origins of Order: Self-Organization and Selection in Evolution; Oxford University Press: New York, NY, USA; Oxford, UK, 1993. [Google Scholar]
  19. Prigogine, I. Dissipative structures, dynamics, and entropy. Int. J. Quantum Chem. 1975, 9, 443–456. [Google Scholar] [CrossRef]
  20. Favre, A.; Guitton, H.; Guitton, J.; Lichnerowicz, A.; Wolff, E. Chaos and Determinism: Turbulence as a Paradigm for Complex Systems Converging toward Final States; The Johns Hopkins University Press: Baltimore, MD, USA, 1988. [Google Scholar]
  21. Lorenz, E.N. Deterministic nonperiodic flow. J. Atmos. Sci. 1963, 20, 130–141. [Google Scholar] [CrossRef] [Green Version]
  22. Heisenberg, W. Über den anschaulichen Inhalt der quantentheoretischen Kinematik und Mechanik. Z. Phys. 1927, 43, 172–198, [German]. [Google Scholar] [CrossRef]
  23. Heisenberg, W. Der Teil und das Ganze. Gespräche im Umkreis der Atomphysik, 8th ed.; Deutscher Taschenbuch Verlag: München, Germany, 1984. [Google Scholar]
  24. Degn, H. Discrete chaos is reversed random walk. Phys. Rev. A 1982, 26, 711–712. [Google Scholar] [CrossRef]
  25. Olsen, L.F.; Degn, H. Chaos in biological systems. Q. Rev. Biophys. 1985, 18, 165–225. [Google Scholar] [CrossRef] [PubMed]
  26. Milanowski, P.; Carter, T.J.; Weber, G.F. Enzyme catalysis and the outcome of biochemical reactions. J. Proteom. Bioinform. 2013, 6, 132–141. [Google Scholar] [CrossRef] [Green Version]
  27. Eagle, A. Chance versus randomness. In The Stanford Encyclopedia of Philosophy (Spring 2019 Edition); Zalta, E.N., Ed.; Stanford University: Stanford, CA, USA, 2019; Available online: https://plato.stanford.edu/archives/spr2019/entries/chance-randomness/ (accessed on 1 February 2020).
  28. Berkovitz, J.; Frigg, R.; Kronz, F. The Ergodic Hierarchy, Randomness and Chaos. Stud. Hist. Philos. Mod. Phys. 2006, 37, 661–691. [Google Scholar] [CrossRef] [Green Version]
  29. Ben-Naim, A. A Farewell to Entropy: Statistical Thermodynamics Based on Information; World Scientific Publishing Corporation: Singapore, 2008. [Google Scholar]
  30. Rényi, A. On measures of information and entropy. In Proceedings of the 4th Berkeley Symposium on Mathematics, Statistics and Probability, Statistical Laboratory, University of California, Oakland, CA, USA, 30 June–30 July 1960; pp. 547–561. [Google Scholar]
  31. Pesin, Y. Dimension Theory in Dynamical Systems: Contemporary Views and Applications. Chicago Lectures in Mathematics; University of Chicago Press: Chicago, IL, USA, 1997. [Google Scholar]
Table 1. Entropy, information and dimension. Characteristics describing the measures of complexity. The far-right column describes connections among the diverse measures.
| Measure | Formalism | Area | Description | Explanation | Interpretation | Commonalities |
| --- | --- | --- | --- | --- | --- | --- |
| Boltzmann entropy | $S = -k_B \sum_i p_i \ln p_i$ | thermodynamics | entropy = degree of disorder | dissipation of energy, arrow of time | | thermodynamic entropy equivalent to thermodynamic information |
| Shannon entropy | $S = -\sum_i P_i \log P_i = -E_P[\log P]$ | communication | entropy = ultimate data compression | removal of the uncertainty that exists before communication | uncertainty constitutes the random or chance component inherent in this process | with Boltzmann: common origin in probability theory, common requirements |
| Rényi entropy | $D_\alpha = \lim_{\varepsilon \to 0} \frac{1}{\alpha - 1} \frac{\log \sum_i p_i^\alpha}{\log \varepsilon}$ | abstract space | uncertainty = dimension | connects uncertainty to the dimensionality of the space in which it is measured | quantifies diversity, uncertainty or randomness of a system | generalization of the Shannon entropy |
| Kolmogorov–Sinai entropy | $H_{KS} = \lim_{\varepsilon \to 0} \lim_{t \to \infty} \frac{H_t}{t}$ | dynamical systems | information change inherent in the execution of a process | metric invariant of a dynamical system | maximum capacity of information that can be generated by a dynamical system | essentially Shannon entropy per unit time |
| Kolmogorov–Chaitin complexity | $K_T(s) = \min\{\lvert p \rvert : T(p) = s\}$ | number strings | complexity = non-compressibility | connects complexity and information content | reflective of the content of information | growth rate often equal to Shannon entropy rate |
| Lyapunov dimension | $D_{KY} = k + \frac{\sum_{i=1}^{k} \bar{\lambda}_i}{\lvert \bar{\lambda}_{k+1} \rvert}$ | abstract space | upper bound for the information dimension of a system | estimates fractal dimension of attractors | function of the Lyapunov characteristic exponents | estimate of the Kolmogorov–Sinai entropy |
| information dimension | $D_1 = \lim_{\varepsilon \to 0} \frac{\log p_\varepsilon}{\log \varepsilon}$ | probability | fractal dimension of a probability distribution | information measure for random vectors in Euclidean space | measure for the fractal dimension of a probability distribution | characterizes growth rate of Shannon entropy with fine-graining of space |
| Mandelbrot dimension | $D = \log_\varepsilon N = \frac{\log N}{\log \varepsilon}$ | geometry | statistical index of complexity | ratio of the change in detail to the change in scale | measure for the space-filling capacity of a pattern | similar to box-counting dimension |
