Information flow for occurrences in phase space can be assessed by applying the Lyapunov characteristic exponent (multiplicative ergodic theorem), which is positive for non-linear systems that act as information sources and negative for events that constitute information sinks. Attempts to unify the reversible descriptions of dynamics with the irreversible descriptions of thermodynamics have replaced phase space models with event space models. The introduction of operators for time and entropy in lieu of traditional trajectories has consequently limited the knowable details of systems governed by such depictions to eigenvectors and eigenvalues. In this setting, a modified Lyapunov characteristic exponent for vector spaces can serve as a descriptor for the evolution of information, reflecting the associated extent of undetermined features. This novel application of the multiplicative ergodic theorem leads directly to the formulation of a dimension that measures the information gain attributable to an occurrence. It thus provides a readout for the magnitudes of chance and necessity that contribute to an event. Related algorithms express a unification of information content, degree of randomness, and complexity (fractal dimension) in event space.
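The sign criterion stated above can be illustrated numerically. The following is a minimal sketch, not taken from the article, that estimates the Lyapunov characteristic exponent of the logistic map x → r·x·(1−x) by averaging log|f′(x)| along a trajectory. At r = 4 the map is chaotic and the exact exponent is ln 2 (an information source); at r = 2.5 the trajectory settles on a fixed point and the exponent is negative (an information sink). The function name and parameter choices here are illustrative assumptions.

```python
import math

def lyapunov_logistic(r, x0, n_iter, n_burn):
    """Estimate the Lyapunov exponent of the logistic map x -> r*x*(1-x)
    as the trajectory average of log|f'(x)|, where f'(x) = r*(1 - 2x)."""
    x = x0
    for _ in range(n_burn):               # discard the transient
        x = r * x * (1.0 - x)
    total = 0.0
    for _ in range(n_iter):
        total += math.log(abs(r * (1.0 - 2.0 * x)))  # local stretching rate
        x = r * x * (1.0 - x)
    return total / n_iter

# Chaotic regime: exponent should approach ln 2 ~ 0.693 (information source).
lam_chaotic = lyapunov_logistic(4.0, 0.3, 200_000, 1_000)
# Stable regime: trajectory converges to a fixed point, exponent negative
# (information sink).
lam_stable = lyapunov_logistic(2.5, 0.3, 200_000, 1_000)
```

The positive estimate quantifies the rate at which initial-condition information is lost (the system generates new information), while the negative estimate marks contraction onto an attractor.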
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.