Editorial Comment on the Special Issue of “Information in Dynamical Systems and Complex Systems”
Abstract
1. Introduction
- Information flow. Transfer entropy in particular has gained a great deal of interest in recent years: it compares, through a Kullback–Leibler divergence, the distribution of future states conditioned on past states with and without access to the past of another stochastic process (a formula is sketched just after this list). Recent work, however, suggests possible misinterpretations arising from the use of transfer entropy for causality inference.
- Causality, and information signatures of causation. A central question in science is what causes outcomes of interest; this becomes even more important when the goal is to affect and control those outcomes. From the perspective of information flow, the questions of causation and causal inference become particularly pointed.
- Symmetries and reversibility, which may be exploited in special circumstances to deepen our understanding of causality and structure, as well as of clustering.
- Scales and hierarchies, which lead to an understanding of relevance and of nested topological partitions when defining the scale at which information symbols are determined.
- Variational and optimization principles of information in physics, in particular maximum entropy principles that lead to an understanding of underlying laws.
- Randomness, structure and causality. In some sense randomness may be described as external and unmodelled effects, which we may interpret, in the present context, as unknown information. This leads to hidden states and hidden processes, including methods such as hidden Markov models and, more generally, Bayesian inference. Viewed in terms of the information content of a dynamical system, this perspective should yield a better understanding.
- Measures and metrics of complexity and information content. The phrase “complexity” is commonly used for a wide variety of systems, behaviors, and processes, and yet a commonly agreed description of what the phrase means is lacking.
- Physical laws as information filters or algorithms. Physical laws lead to evolution equations, which, from the perspective of this discussion, carry one information state to a new information state; physical laws may therefore be described either as algorithms or as information filters that translate states.
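As a point of reference for the information flow item above, transfer entropy (following Schreiber [21]) is commonly written as follows; this is a lag-one sketch, and longer histories or embedded states are standard variations.

```latex
% Transfer entropy from process Y to process X (lag-one sketch,
% following Schreiber [21]): the Kullback-Leibler divergence between
% the transition probabilities of X with and without the past of Y.
T_{Y \to X} \;=\; \sum_{x_{t+1},\,x_t,\,y_t} p(x_{t+1}, x_t, y_t)\,
  \log \frac{p(x_{t+1} \mid x_t, y_t)}{p(x_{t+1} \mid x_t)}
```

The asymmetry of this quantity, with T_{Y→X} ≠ T_{X→Y} in general, is what makes it attractive as a directed measure, and also what invites the causal misinterpretations noted above.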
- Can we develop a general mechanistic description of what renders a real complex system different from a large but perhaps simpler system (particularly from an information theory perspective)?
- This question could eventually lead in many directions across many scientific fields, and the group agreed that this was a problem worth emphasizing. Indeed, the concept is woven deeply into several of the papers here. If there is a unifying theme, then the paper [3] suggests a skeleton or coarse structure across parameters. On the other hand, the causal structure of [24] suggests connections between many elements; when such structure is hierarchical, the system should perhaps be considered complex, meaning multi-scaled, as opposed to simply large and homogeneous. Finally, the question of finitary versus infinitary examples in [13] suggests an enumerable difference between different types of complex systems, and the information anatomy [1] allows a detailed description of how past, present, and future influence each other; when also conditioned across spatial scales, this could be the basis of such an interpretation.
- Can physical laws be defined in an algorithmic and information theoretic manner?
- Clearly this will be a deeply important connection between physics, whose laws describe the governance of measurable quantities, and the knowledge of the states that are governed by those laws. Since knowledge of states represents information, the physical laws that evolve these states constitute a flow of information, and the information anatomy [1] perhaps gets to the bottom of the story of how past, present, and future are coupled. Can the physics be inferred directly from the information theory? This would be a new paradigm for discovery that we have not yet attained, but the question remains extremely important. Partial steps have been made, in the sense that causal inference is discussed in [24], and a parametric approach for assumed model forms is the classic filtering problem discussed in [14].
- Identify engineering applications, especially those that benefit directly from an information theory perspective and its methods. How can this perspective impact design? Can specific control methods be developed that benefit from it?
- This is the engineering counterpart of the previous question, if engineering means designing a system with desired properties. The connection between physics and information theory, and an algorithmic formalism between them, suggests that for any desired engineering design a good understanding of the information theory should be beneficial. Simply from an observational standpoint, the study of fish (agents) and a robot [23] has this engineering perspective, in that the experimental design allows studies of how adjusting the robot's behavior changes the agents' behaviors, and of how this influences and is influenced by the information flow between them. This hints at a possible engineering approach to this general question in the future.
- Can group behaviors and cooperative behaviors, such as those of animals and humans, be better understood in terms of information theoretic descriptions? What role do hierarchical structures play?
- Clearly the paper [23] tackles this question directly by demonstrating that group behavior and cooperation have an underlying information flow that can be measured and adjusted; a computational sketch of such a measurement follows this item. What remains now is to consider hierarchical aspects of group behaviors, perhaps at coarse grains.
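To make "measured" concrete, below is a minimal sketch of a plug-in transfer entropy estimator for symbolized time series. It is a generic illustration rather than the specific procedure of [23]; the binary symbols, lag-one histories, and toy coupled sequences are simplifying assumptions.

```python
from collections import Counter
from math import log2
import random

def transfer_entropy(x, y):
    """Plug-in estimate of the lag-one transfer entropy T_{Y->X}, in bits,
    for two equal-length sequences of discrete symbols x and y."""
    n = len(x) - 1
    triples = Counter(zip(x[1:], x[:-1], y[:-1]))  # (x_{t+1}, x_t, y_t)
    pairs_xy = Counter(zip(x[:-1], y[:-1]))        # (x_t, y_t)
    pairs_xx = Counter(zip(x[1:], x[:-1]))         # (x_{t+1}, x_t)
    singles = Counter(x[:-1])                      # x_t
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        # Compare p(x_{t+1} | x_t, y_t) against p(x_{t+1} | x_t).
        p_joint = c / n
        p_cond_full = c / pairs_xy[(x0, y0)]
        p_cond_self = pairs_xx[(x1, x0)] / singles[x0]
        te += p_joint * log2(p_cond_full / p_cond_self)
    return te

# Toy example: y drives x with a one-step delay, so T_{Y->X} >> T_{X->Y}.
random.seed(0)
y = [random.randint(0, 1) for _ in range(10000)]
x = [0] + [y[t] if random.random() < 0.8 else random.randint(0, 1)
           for t in range(9999)]
print(transfer_entropy(x, y))  # substantially positive
print(transfer_entropy(y, x))  # near zero
```

The same plug-in construction extends to conditioning on further processes, which is exactly the direction taken by causation entropy in [24,25].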
- Can methods designed to identify causal influences be adapted to further adjust and define control strategies for complex systems in biological, social, physical and engineering contexts?
- Like the previous question, this asks how the information theory perspective can be turned around to address engineering questions of design. The causal inference studied in [24] gives formal definitions and methods for what it means for control strategies to causally influence states (the defining quantity is sketched below), but it does not yet describe the inverse problem necessary for the engineering control strategy asked about here. How can information flow be a design feature used to reach desired and controlled states? This will surely be a question for future technological innovation. That said, with the information flow studies of [23] and the information anatomy discussed in [1], all of the elements are here to at least well define the question. The design of a solution will likely lead to new technologies not previously dreamed of.
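For reference, the central quantity of [24,25] is causation entropy, which generalizes transfer entropy by conditioning on a further set of processes; the lag-one form below is a sketch of the definition.

```latex
% Causation entropy from Y to X conditioned on a set of processes Z
% (lag-one sketch, in the spirit of [24,25]): the additional reduction
% in uncertainty about the future of X provided by Y beyond Z.
C_{Y \to X \mid \mathbf{Z}} \;=\; H(X_{t+1} \mid \mathbf{Z}_t)
  \;-\; H(X_{t+1} \mid \mathbf{Z}_t, Y_t)
```

When \mathbf{Z} consists of the past of X alone, this reduces to the transfer entropy sketched earlier; conditioning on larger sets is what allows [24,25] to distinguish direct from indirect influences.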
- Is there a minimal information description of a dynamical system that will facilitate engineering design? Does an approximate description of the formal language suffice for approximate modeling, leading to faster and easier design?
- This question underlies essentially all of the papers herein, and it was central to our discussion at the workshop. The information in a description may be interpreted as imposing a partition of states on a stochastic system, and once the partition is imposed (see, for example, Reference [29]), the measure on states may follow, from which any information theoretic quantity may be computed; a small computational sketch of this point of view follows this item. Therefore the question of a minimal information description maps to a question of an infimum of the relevant functional, over all possible partitions and with respect to all measures on each partition. This is clearly computationally impossible to approach by brute force, but the concept seems likely to lie at the heart of understanding a process. It is interesting to note that in topological dynamics there exists a variational description of the topological entropy, which can be described in terms of the “maximal entropy measure” [30,31]; in that setting this corresponds to a positive answer to this question. On the other hand, this same question may be interpreted constructively: how much information is needed to build or design a given dynamical system? This brings the question closer to that of Kolmogorov complexity [32,33], which, specialized to the question here, asks how much information is needed to encode the algorithm that produces the observed data. That question is in general unanswered.
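As a small computational illustration of the partition point of view, the sketch below symbolizes a trajectory of the fully chaotic logistic map with a threshold partition, in the spirit of the threshold-crossing analysis of [29], and estimates the entropy rate from block entropies; the map, threshold, and block lengths are illustrative choices.

```python
from collections import Counter
from math import log2

def block_entropy(symbols, k):
    """Shannon entropy, in bits, of the empirical distribution of k-blocks."""
    blocks = Counter(tuple(symbols[i:i + k])
                     for i in range(len(symbols) - k + 1))
    n = sum(blocks.values())
    return -sum((c / n) * log2(c / n) for c in blocks.values())

# Trajectory of the fully chaotic logistic map x -> 4x(1 - x).
x, traj = 0.4, []
for _ in range(100000):
    x = 4.0 * x * (1.0 - x)
    traj.append(x)

# Impose a two-symbol threshold partition at x = 1/2.
symbols = [1 if v > 0.5 else 0 for v in traj]

# Entropy-rate estimates h_k = H(k+1) - H(k) over increasing block lengths.
for k in range(1, 6):
    print(k, block_entropy(symbols, k + 1) - block_entropy(symbols, k))
```

For this map the partition at x = 1/2 is generating, and the estimates approach the known entropy rate of 1 bit per symbol; a misplaced threshold can produce a significantly different answer, which is the central caution of [29].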
Acknowledgments
Conflicts of Interest
References
1. Marzen, S.; Crutchfield, J.P. Information anatomy of stochastic equilibria. Entropy 2014, 16, 4713–4748.
2. James, R.G.; Ellison, C.J.; Crutchfield, J.P. Anatomy of a bit: Information in a time series observation. Chaos 2011, 21, 037109.
3. Bush, J.; Mischaikow, K. Coarse dynamics for coarse modeling: An example from population biology. Entropy 2014, 16, 3379–3400.
4. Arai, Z.; Kalies, W.; Kokubu, H.; Mischaikow, K.; Oka, H.; Pilarczyk, P. A database schema for the analysis of global dynamics of multiparameter systems. SIAM J. Appl. Dyn. Syst. 2009, 8, 757–789.
5. Bush, J.; Gameiro, M.; Harker, S.; Kokubu, H.; Mischaikow, K.; Obayashi, I.; Pilarczyk, P. Combinatorial-topological framework for the analysis of global dynamics. Chaos 2012, 22, 047508.
6. Pethel, S.D.; Hahs, D.W. Exact significance test for mutual information. Entropy 2014, 16, 2839–2849.
7. Hlaváčková-Schindler, K.; Paluš, M.; Vejmelka, M.; Bhattacharya, J. Causality detection based on information-theoretic approaches in time series analysis. Phys. Rep. 2007, 441, 1–46.
8. Kraskov, A.; Stögbauer, H.; Grassberger, P. Estimating mutual information. Phys. Rev. E 2004, 69, 066138.
9. Tsimpiris, A.; Vlachos, I.; Kugiumtzis, D. Nearest neighbor estimate of conditional mutual information in feature selection. Expert Syst. Appl. 2012, 39, 12697.
10. Vejmelka, M.; Paluš, M. Inferring the directionality of coupling with conditional mutual information. Phys. Rev. E 2008, 77, 026214.
11. Vlachos, I.; Kugiumtzis, D. Nonuniform state-space reconstruction and coupling detection. Phys. Rev. E 2010, 82, 016207.
12. Pethel, S.D.; Hahs, D.W. Exact significance test for Markov order. Physica D 2014, 269, 42–47.
13. Travers, N.F.; Crutchfield, J.P. Infinite excess entropy processes with countable-state generators. Entropy 2014, 16, 1396–1413.
14. Giffin, A.; Urniezius, R. Simultaneous state and parameter estimation using maximum relative entropy with nonhomogenous differential equation constraints. Entropy 2014, 16, 4974–4991.
15. Su, R.-Q.; Lai, Y.-C.; Wang, X. Identifying chaotic FitzHugh–Nagumo neurons using compressive sensing. Entropy 2014, 16, 3889–3902.
16. Candès, E.J.; Tao, T. Decoding by linear programming. IEEE Trans. Inform. Theory 2005, 51, 4203–4215.
17. Candès, E.J.; Tao, T. Near-optimal signal recovery from random projections: Universal encoding strategies? IEEE Trans. Inform. Theory 2006, 52, 5406–5425.
18. Candès, E.J.; Romberg, J.K.; Tao, T. Stable signal recovery from incomplete and inaccurate measurements. Comm. Pure Appl. Math. 2006, 59, 1207–1223.
19. FitzHugh, R. Impulses and physiological states in theoretical models of nerve membrane. Biophys. J. 1961, 1, 445–466.
20. Nagumo, J.; Arimoto, S.; Yoshizawa, S. An active pulse transmission line simulating nerve axon. Proc. IRE 1962, 50, 2061–2070.
21. Schreiber, T. Measuring information transfer. Phys. Rev. Lett. 2000, 85, 461.
22. Paluš, M.; Komárek, V.; Hrnčíř, Z.; Štěrbová, K. Synchronization as adjustment of information rates: Detection from bivariate time series. Phys. Rev. E 2001, 63, 046211.
23. Butail, S.; Ladu, F.; Spinello, D.; Porfiri, M. Information flow in animal-robot interactions. Entropy 2014, 16, 1315–1330.
24. Sun, J.; Cafaro, C.; Bollt, E.M. Identifying coupling structure in complex systems through the optimal causation entropy principle. Entropy 2014, 16, 3416–3433.
25. Sun, J.; Bollt, E.M. Causation entropy identifies indirect influences, dominance of neighbors and anticipatory couplings. Physica D 2014, 267, 49–57.
26. Sun, J.; Taylor, D.; Bollt, E.M. Causal network inference by optimal causation entropy. 2014, arXiv:1401.7574.
27. Elowitz, M.B.; Leibler, S. A synthetic oscillatory network of transcriptional regulators. Nature 2000, 403, 335–338.
28. Paluš, M. Cross-scale interactions and information transfer. Entropy 2014, submitted.
29. Bollt, E.M.; Stanford, T.; Lai, Y.-C.; Życzkowski, K. What symbolic dynamics do we get with a misplaced partition? On the validity of threshold crossing analysis of chaotic time-series. Physica D 2001, 154, 259.
30. Bollt, E.M.; Santitissadeekorn, N. Applied and Computational Measurable Dynamics; Society for Industrial and Applied Mathematics: Philadelphia, PA, USA, 2013.
31. Parry, W. Intrinsic Markov chains. Trans. Amer. Math. Soc. 1964, 112, 55–66.
32. Chaitin, G.J. On the length of programs for computing finite binary sequences. J. ACM 1966, 13, 547–569.
33. Kolmogorov, A.N. Three approaches to the quantitative definition of information. Probl. Inf. Transm. 1965, 1, 1–7.
© 2014 by the authors; licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution license (http://creativecommons.org/licenses/by/3.0/).