Entropy increases as linear physical processes unfold. At equilibrium, all uncertainty about the future is removed and information about the past is lost. Complex systems, in contrast, can give rise to emergent order, sustain uncertainty about the future, and generate new information that replaces all old information about the system in finite time. The Kolmogorov–Sinai entropy for dynamical processes and the Kolmogorov–Chaitin complexity for strings of numbers both approximate Shannon’s entropy (a measure of the removal of uncertainty), indicating that information production is equivalent to the degree of complexity of an event. Thus, in the execution of non-linear processes, information entropy is inseparably tied to thermodynamic entropy. Within such processes, critical decision points (bifurcations), which can exert a lasting impact on future evolution (the “butterfly effect”), defy classification as either random or deterministic in origin. Nevertheless, their information evolution and degree of complexity are amenable to measurement and can meaningfully replace the dichotomy of chance versus necessity. Common anthropomorphic perceptions do not accurately account for the transient durability of information, the potential for small actions to have major consequences, or the absence of a discernible opposition between coincidence and inevitability.
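The tie between nonlinear dynamics and information production can be illustrated with a toy sketch (an illustrative assumption, not an example from the article): a binary coarse-graining of the logistic map yields near-maximal empirical Shannon entropy per symbol in the chaotic regime, where the system keeps generating new information, and near-zero entropy in a periodic regime, where the future is fully predictable.

```python
import math
from collections import Counter

def shannon_entropy(symbols):
    """Shannon entropy (bits per symbol) of the empirical symbol distribution."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def logistic_symbols(r, x0=0.4, steps=10000, burn=100):
    """Iterate x -> r*x*(1-x) and emit 1 if x > 0.5 else 0 (a binary partition)."""
    x = x0
    out = []
    for i in range(burn + steps):
        x = r * x * (1 - x)
        if i >= burn:
            out.append(1 if x > 0.5 else 0)
    return out

# Chaotic regime (r = 4): the binary partition keeps producing new information,
# so the per-symbol entropy approaches 1 bit (the Kolmogorov-Sinai rate, ln 2 nats).
h_chaotic = shannon_entropy(logistic_symbols(4.0))

# Periodic regime (r = 3.2, a period-2 orbit): no new information is produced,
# so the per-symbol entropy collapses toward zero.
h_periodic = shannon_entropy(logistic_symbols(3.2))
```

Here the positive entropy rate of the chaotic orbit is exactly the sense in which old information about the initial condition is replaced by new information in finite time, while the periodic orbit removes all uncertainty about its own future.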
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.