Information Dynamics in Complex Systems Negates a Dichotomy between Chance and Necessity
Abstract
- Uncertainty in the mechanism that generates specific outcomes may produce randomness. The throwing of dice is such a mechanism of uncertainty. Likewise, a string of numbers is random if it is generated by an unpredictable mechanism [3]. The randomness resides in the disorder of the generating process.
- The arrangement of a pattern or a string of numbers may itself reflect randomness. A string is random because no inherent design allows the prediction of consecutive digits, or because no simple rules enable a compressed description of the string [4]. The randomness lies in the arrangement itself.
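The two notions of randomness above can be illustrated computationally. The following is a minimal Python sketch (not from the paper): process randomness is mimicked by a deterministic but practically unpredictable chaotic map, and product randomness is probed with compressed length as a stand-in for Kolmogorov–Chaitin complexity, which is itself uncomputable. The function names `logistic_bits` and `compressed_ratio` are illustrative choices, not terminology from the source.

```python
import zlib

def logistic_bits(x0, n):
    """Process randomness: generate n bits from the fully deterministic
    logistic map x -> 4x(1-x), whose sensitive dependence on initial
    conditions makes the output unpredictable in practice."""
    x, bits = x0, []
    for _ in range(n):
        x = 4.0 * x * (1.0 - x)
        bits.append('1' if x > 0.5 else '0')
    return ''.join(bits)

def compressed_ratio(s):
    """Product randomness proxy: a string with inherent design admits a
    short description, so it compresses well; compressed length serves
    as an upper-bound estimate of its algorithmic complexity."""
    data = s.encode('ascii')
    return len(zlib.compress(data, 9)) / len(data)

ordered = '01' * 500                          # simple rule: "repeat 01"
chaotic = logistic_bits(0.123456789, 1000)    # no evident shorter description

# The patterned string compresses far better than the chaotic one.
print(compressed_ratio(ordered))
print(compressed_ratio(chaotic))
```

The contrast is the point: both strings come from short deterministic programs, yet only the chaotic one resists compression, which is why the generating-process view and the arrangement view of randomness can diverge.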
1. Information is the Removal of Uncertainty
2. Information Has a Finite Lifespan
3. Information and Complexity are Equivalent
4. Complexity Drives Emergence via Critical Decision Points
5. At the Critical Decision Points, Randomness Versus Determination Is Undecidable
6. Complexity and Chance Are Mirror-Images of the Same Events
7. Conclusions
Funding
Conflicts of Interest
Appendix A
References
- Ford, J. How random is a coin toss? Phys. Today 1983, 36, 40–47.
- Popper, K.R. The Open Universe: An Argument for Indeterminism, 2nd ed.; Rowman and Littlefield: Totowa, NJ, USA, 1982.
- Eagle, A. Randomness is unpredictability. Br. J. Philos. Sci. 2005, 56, 749–790.
- Chaitin, G. The Unknowable; Springer: Singapore, 1999.
- Weber, G.F. Information gain in event space reflects chance and necessity components of an event. Information 2019, 10, 358.
- Shannon, C.E. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423, 623–656.
- McKelvey, B. Thwarting Faddism at the Edge of Chaos; European Institute for Advanced Studies in Management Workshop on Complexity and Organization: Brussels, Belgium, 1998.
- Shaw, R. Strange attractors, chaotic behavior, and information flow. Z. Naturforsch. A 1981, 36, 80–112.
- Thompson, J.M.T.; Stewart, H.B. Nonlinear Dynamics and Chaos; John Wiley and Sons: Chichester, UK, 1986.
- Prigogine, I. From Being to Becoming: Time and Complexity in the Physical Sciences; W.H. Freeman and Company: New York, NY, USA, 1980.
- Oseledets, V.I. A multiplicative ergodic theorem. Characteristic Ljapunov exponents of dynamical systems. Trans. Mosc. Math. Soc. 1968, 19, 197–231.
- Sinai, Y.G. On the notion of entropy of a dynamical system. Dokl. Russ. Acad. Sci. 1959, 124, 768–771.
- Weber, G.F. Dynamic knowledge—A century of evolution. Sociol. Mind 2013, 3, 268–277.
- Kolmogorov, A.N. Logical basis for information theory and probability theory. IEEE Trans. Inf. Theory 1968, 14, 662–664.
- Prigogine, I.; Stengers, I. Order Out of Chaos; Bantam Books: Toronto, ON, Canada, 1984.
- Belousov, B.P. Periodically acting reaction and its mechanism. In Collection of Abstracts on Radiation Medicine 1958; Medgiz: Moscow, Russia, 1959; pp. 145–147. [In Russian]
- Zhabotinsky, A.M. Periodical process of oxidation of malonic acid solution. Biophysics 1964, 9, 306–311.
- Kauffman, S.A. The Origins of Order: Self-Organization and Selection in Evolution; Oxford University Press: New York, NY, USA, 1993.
- Prigogine, I. Dissipative structures, dynamics, and entropy. Int. J. Quantum Chem. 1975, 9, 443–456.
- Favre, A.; Guitton, H.; Guitton, J.; Lichnerowicz, A.; Wolff, E. Chaos and Determinism: Turbulence as a Paradigm for Complex Systems Converging toward Final States; The Johns Hopkins University Press: Baltimore, MD, USA, 1988.
- Lorenz, E.N. Deterministic nonperiodic flow. J. Atmos. Sci. 1963, 20, 130–141.
- Heisenberg, W. Über den anschaulichen Inhalt der quantentheoretischen Kinematik und Mechanik. Z. Phys. 1927, 43, 172–198. [In German]
- Heisenberg, W. Der Teil und das Ganze. Gespräche im Umkreis der Atomphysik, 8th ed.; Deutscher Taschenbuch Verlag: München, Germany, 1984. [In German]
- Degn, H. Discrete chaos is reversed random walk. Phys. Rev. A 1982, 26, 711–712.
- Olsen, L.F.; Degn, H. Chaos in biological systems. Q. Rev. Biophys. 1985, 18, 165–225.
- Milanowski, P.; Carter, T.J.; Weber, G.F. Enzyme catalysis and the outcome of biochemical reactions. J. Proteom. Bioinform. 2013, 6, 132–141.
- Eagle, A. Chance versus randomness. In The Stanford Encyclopedia of Philosophy (Spring 2019 Edition); Zalta, E.N., Ed.; Stanford University: Stanford, CA, USA, 2019. Available online: https://plato.stanford.edu/archives/spr2019/entries/chance-randomness/ (accessed on 1 February 2020).
- Berkovitz, J.; Frigg, R.; Kronz, F. The ergodic hierarchy, randomness and chaos. Stud. Hist. Philos. Mod. Phys. 2006, 37, 661–691.
- Ben-Naim, A. A Farewell to Entropy: Statistical Thermodynamics Based on Information; World Scientific Publishing Corporation: Singapore, 2008.
- Rényi, A. On measures of information and entropy. In Proceedings of the 4th Berkeley Symposium on Mathematics, Statistics and Probability, Statistical Laboratory, University of California, Oakland, CA, USA, 30 June–30 July 1960; pp. 547–561.
- Pesin, Y. Dimension Theory in Dynamical Systems: Contemporary Views and Applications; Chicago Lectures in Mathematics; University of Chicago Press: Chicago, IL, USA, 1997.
Measure | Formalism | Description | Explanation | Interpretation | Commonalities |
---|---|---|---|---|---|
Boltzmann entropy | thermodynamics | entropy = degree of disorder | dissipation of energy, arrow of time | thermodynamic entropy equivalent to thermodynamic information | |
Shannon entropy | communication | entropy = ultimate data compression | removal of the uncertainty that exists before communication | uncertainty constitutes the random or chance component inherent in this process | with Boltzmann: common origin in probability theory, common requirements |
Rényi entropy | abstract space | uncertainty = dimension | connects uncertainty to the dimensionality of the space in which it is measured | quantifies the diversity, uncertainty, or randomness of a system | generalization of the Shannon entropy |
Kolmogorov-Sinai entropy | dynamical systems | information change inherent in the execution of a process | metric invariant of a dynamical system | maximum capacity of information that can be generated by a dynamical system | essentially Shannon entropy per unit time |
Kolmogorov-Chaitin complexity | number strings | complexity = non-compressibility | connects complexity and information content | reflective of the content of information | growth rate often equal to the Shannon entropy rate |
Lyapunov dimension | abstract space | upper bound for the information dimension of a system | estimates the fractal dimension of attractors | function of the Lyapunov characteristic exponents | estimate of the Kolmogorov-Sinai entropy |
information dimension | probability | fractal dimension of a probability distribution | information measure for random vectors in Euclidean space | measure for the fractal dimension of a probability distribution | characterizes the growth rate of the Shannon entropy with fine-graining of the space |
Mandelbrot dimension | geometry | statistical index of complexity | ratio of the change in detail to the change in scale | measure for the space-filling capacity of a pattern | similar to the box-counting dimension |
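Two relationships from the table can be checked numerically: the Rényi entropy generalizes the Shannon entropy (it converges to it as its order approaches 1), and for orders greater than 1 it never exceeds it. A minimal Python sketch, assuming base-2 logarithms (entropy in bits); the helper names `shannon` and `renyi` are illustrative, not from the paper.

```python
import math

def shannon(p):
    """Shannon entropy H(p) = -sum_i p_i log2 p_i, in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def renyi(p, a):
    """Renyi entropy H_a(p) = log2(sum_i p_i^a) / (1 - a), for a >= 0, a != 1.
    In the limit a -> 1 it converges to the Shannon entropy."""
    return math.log2(sum(pi ** a for pi in p)) / (1.0 - a)

p = [0.5, 0.25, 0.125, 0.125]

print(shannon(p))        # 1.75 bits for this distribution
print(renyi(p, 0.999))   # close to the Shannon value (a near 1)
print(renyi(p, 2.0))     # collision entropy, never above the Shannon value
```

Running it shows the order-0.999 Rényi entropy agreeing with the Shannon value to within a few thousandths of a bit, which is the sense in which the table calls it a generalization of the Shannon entropy.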
© 2020 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Weber, G.F. Information Dynamics in Complex Systems Negates a Dichotomy between Chance and Necessity. Information 2020, 11, 245. https://doi.org/10.3390/info11050245