
Information Theory in Complex Systems

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory, Probability and Statistics".

Deadline for manuscript submissions: closed (28 February 2019) | Viewed by 44655

Special Issue Editors


Guest Editors

Prof. Dr. Karoline Wiesner
Department of Physics and Astronomy, University of Potsdam, 14476 Potsdam, Germany
Interests: applied information theory; complex systems; physics of information

Dr. Rick Quax
Computational Science, Faculty of Science, The University of Amsterdam, The Netherlands
Interests: information theory; statistical mechanics; complex systems; complex networks; dynamics on networks; theory of computation; theoretical computer science; formal languages; information geometry

Special Issue Information

Dear Colleagues,

Complex systems are ubiquitous in the natural and engineered worlds. Canonical examples include self-assembling materials, the Earth's climate, single- and multi-cellular organisms, the brain, and coupled socio-economic and socio-technical systems. The use of Shannon information theory to study the behavior of such systems, and to explain and predict their dynamics, has gained significant attention, from both theoretical and experimental viewpoints. There have been many advances in applying Shannon theory to complex systems, including correlation analyses for spatial and temporal data and construction and clustering techniques for complex networks. Progress has often been driven by application areas such as genetics, the neurosciences, and the Earth sciences.

The application of Shannon theory to data from real-world complex systems is often hindered by a lack of stationarity and of sufficient statistics. Further progress on this front calls for new statistical techniques based on Shannon information theory, for the refinement of known techniques, and for an improved understanding of the meaning of entropy in complex systems. Contributions addressing any of these issues are very welcome.

This Special Issue aims to be a forum for the presentation of new and improved techniques of information theory for complex systems. In particular, the analysis and interpretation of real-world natural and engineered complex systems with the help of statistical tools based on Shannon information theory fall within the scope of this Special Issue.
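As a minimal illustration of the kind of statistical tool this Special Issue solicits, the sketch below computes a plug-in (maximum-likelihood) mutual information estimate for two discrete data series. The function name and data are illustrative only; the well-known small-sample bias of this estimator is precisely the sufficient-statistics problem raised in the paragraphs above.

```python
import math
from collections import Counter

def plug_in_mutual_information(xs, ys):
    """Plug-in (maximum-likelihood) estimate of I(X;Y) in bits
    from paired discrete observations."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))   # empirical joint counts
    px = Counter(xs)             # empirical marginal counts
    py = Counter(ys)
    # sum over observed pairs of p(x,y) * log2( p(x,y) / (p(x) p(y)) )
    return sum((c / n) * math.log2((c * n) / (px[x] * py[y]))
               for (x, y), c in pxy.items())

# Perfectly correlated binary series carry exactly 1 bit of mutual information
xs = [0, 1, 0, 1, 0, 1, 0, 1]
print(plug_in_mutual_information(xs, xs))  # 1.0
```

For short, nonstationary records this estimator is biased upward, which is one reason the editorial calls for new and refined techniques.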

Prof. Dr. Karoline Wiesner
Dr. Rick Quax
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and written in good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • complex systems
  • data analysis
  • statistics
  • information theoretic techniques
  • complex networks
  • physics
  • chemistry
  • biology
  • earth sciences
  • social sciences
  • applications

Published Papers (11 papers)


Research

Jump to: Review

20 pages, 4127 KiB  
Article
Dual-Space Information Modeling of Socio-Economic Systems under Information Asymmetry
by Lei Bao and Joseph Fritchman
Entropy 2019, 21(5), 528; https://doi.org/10.3390/e21050528 - 24 May 2019
Viewed by 3287
Abstract
Information definitions across many disciplines commonly treat information as a physical-world entity, and information measures are used, undistinguished from other physical variables, to model physical systems. Building on previous work, this research explicitly defines information as a unique category of entity that is created by intelligent agents to represent aspects of the physical world but is not itself part of it. This leads to the dual-space information modeling (DSIM) framework, which clearly distinguishes an information space from the physically based material space. The separation of information and material spaces allows new insight and flexibility in modeling complex systems. Here, DSIM-based agent models are applied to study the impact of information asymmetry on marketing behaviors. The paper demonstrates the effectiveness of the DSIM framework in the modeling process and shows how emergent behavior from these systems is encapsulated in the dual space.
(This article belongs to the Special Issue Information Theory in Complex Systems)

31 pages, 2297 KiB  
Article
Optimal Microbiome Networks: Macroecology and Criticality
by Jie Li and Matteo Convertino
Entropy 2019, 21(5), 506; https://doi.org/10.3390/e21050506 - 17 May 2019
Cited by 20 | Viewed by 7093
Abstract
The human microbiome is an extremely complex ecosystem, considering the number of bacterial species, their interactions, and their variability over space and time. Here, we untangle this complexity for Irritable Bowel Syndrome (IBS), the most prevalent functional gastrointestinal disorder in human populations. Based on a novel information-theoretic network inference model, we detected potential species interaction networks that are functionally and structurally different for healthy and unhealthy individuals. Healthy networks are characterized by a neutral, symmetrical pattern of species interactions and a scale-free topology, in contrast to the random topology of unhealthy networks. We detected an inverse scaling relationship between a species' total outgoing information flow, indicative of node interactivity, and its relative species abundance (RSA). The top ten interacting species are also the least relatively abundant for the healthy microbiome and the most detrimental. These findings support the idea of a diminishing role of network hubs and suggest that hubs should be defined by total outgoing information flow rather than by node degree. Macroecologically, the healthy microbiome is characterized by the highest Pareto total species diversity growth rate, the lowest species turnover, and the smallest variability of RSA across species. This result challenges current views that posit a universal association between healthy states and the highest absolute species diversity in ecosystems. Additionally, we show that the transitory microbiome is unstable and that microbiome criticality is not necessarily located at the phase transition between healthy and unhealthy states. We stress the importance of considering portfolios of interacting pairs, rather than single-node dynamics, when characterizing the microbiome, and of ranking these pairs in terms of the interactions (i.e., species collective behavior) that shape the transition from healthy to unhealthy states.
The macroecological characterization of the microbiome is useful for public health and for disease diagnosis and etiognosis, while species-specific analyses can detect beneficial species, leading to personalized design of pre- and probiotic treatments and microbiome engineering.

11 pages, 2238 KiB  
Article
Spatial Organization of the Gene Regulatory Program: An Information Theoretical Approach to Breast Cancer Transcriptomics
by Guillermo de Anda-Jáuregui, Jesús Espinal-Enriquez and Enrique Hernández-Lemus
Entropy 2019, 21(2), 195; https://doi.org/10.3390/e21020195 - 19 Feb 2019
Cited by 12 | Viewed by 3394
Abstract
Gene regulation may be studied from an information-theoretic perspective. Gene regulatory programs are representations of the complete regulatory phenomenon associated with each biological state. In diseases such as cancer, these programs exhibit major alterations, which have been associated with the spatial organization of the genome into chromosomes. In this work, we analyze intrachromosomal (cis-) and interchromosomal (trans-) gene regulatory programs in order to assess the differences that arise in the context of breast cancer. Using information-theoretic approaches, we find that it is possible to differentiate cis- and trans-regulatory programs in terms of the changes they exhibit in the breast cancer context, indicating that in breast cancer there is a loss of trans-regulation. Finally, we use these programs to reconstruct a possible spatial relationship between chromosomes.

12 pages, 3017 KiB  
Article
A Robust Adaptive Filter for a Complex Hammerstein System
by Guobing Qian, Dan Luo and Shiyuan Wang
Entropy 2019, 21(2), 162; https://doi.org/10.3390/e21020162 - 9 Feb 2019
Cited by 9 | Viewed by 2558
Abstract
The Hammerstein adaptive filter using the maximum correntropy criterion (MCC) has been shown to be more robust to outliers than filters using the traditional mean square error (MSE) criterion. As there has been no report on robust Hammerstein adaptive filters in the complex domain, in this paper we extend the robust Hammerstein adaptive filter under MCC to the complex domain and propose the Hammerstein maximum complex correntropy criterion (HMCCC) algorithm. The new Hammerstein adaptive filter can thus directly handle complex-valued data. Additionally, we analyze the stability and steady-state mean square performance of HMCCC. Simulations illustrate that the proposed HMCCC algorithm converges in impulsive noise environments and achieves higher accuracy and a faster convergence speed than the Hammerstein complex least mean square (HCLMS) algorithm.
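For readers unfamiliar with MCC, the following is a hedged, real-valued sketch of an MCC-type LMS update for linear system identification. It is not the authors' complex-valued Hammerstein algorithm; the function name and all parameter values are illustrative. The key idea is that the Gaussian correntropy kernel suppresses updates driven by impulsive outliers.

```python
import numpy as np

def mcc_lms(x, d, num_taps=4, mu=0.05, sigma=1.0):
    """Adaptive FIR filter trained under the maximum correntropy
    criterion (MCC): the kernel exp(-e^2 / (2*sigma^2)) shrinks
    updates driven by outlier errors toward zero."""
    w = np.zeros(num_taps)
    for n in range(num_taps - 1, len(x)):
        u = x[n - num_taps + 1:n + 1][::-1]   # most recent sample first
        e = d[n] - w @ u                      # a-priori error
        kernel = np.exp(-e**2 / (2 * sigma**2))
        w += mu * kernel * e * u              # MCC gradient-ascent step
    return w

rng = np.random.default_rng(0)
x = rng.standard_normal(5000)
h = np.array([0.5, -0.3, 0.2, 0.1])           # unknown FIR system
d = np.convolve(x, h)[:len(x)]                # desired (noiseless) signal
d[::200] += 50.0                              # sparse impulsive outliers
w = mcc_lms(x, d)                             # w converges close to h
```

An MSE-criterion LMS run on the same data would be dragged off target by each outlier, whereas here the kernel is essentially zero for errors of magnitude 50, so those samples are effectively ignored.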

12 pages, 968 KiB  
Article
Characterizing Complex Networks Using Entropy-Degree Diagrams: Unveiling Changes in Functional Brain Connectivity Induced by Ayahuasca
by Aline Viol, Fernanda Palhano-Fontes, Heloisa Onias, Draulio B. de Araujo, Philipp Hövel and Gandhi M. Viswanathan
Entropy 2019, 21(2), 128; https://doi.org/10.3390/e21020128 - 30 Jan 2019
Cited by 22 | Viewed by 4830
Abstract
With the aim of further advancing the understanding of the human brain’s functional connectivity, we propose a network metric that we term the geodesic entropy. This metric quantifies the Shannon entropy of the distribution of distances to a specific node from all other nodes. It characterizes the influence exerted on a specific node using statistics of the overall network structure. The measurement and characterization of this structural information has the potential to greatly improve our understanding of sustained activity and other emergent behaviors in networks. We apply this method to study how the psychedelic infusion Ayahuasca affects the functional connectivity of the human brain in the resting state. We show that the geodesic entropy is able to differentiate functional networks of the human brain associated with two different states of consciousness in the awake resting state: (i) the ordinary state and (ii) a state altered by ingestion of Ayahuasca. The functional brain networks of subjects in the altered state have, on average, a larger geodesic entropy than in the ordinary state. Finally, we discuss why the geodesic entropy may yield further valuable insights into the study of the human brain and other empirical networks.
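The definition in the abstract (Shannon entropy of the shortest-path distance distribution around a node) is simple enough to sketch directly. The function name and example graphs below are illustrative, not taken from the paper.

```python
import math
from collections import Counter, deque

def geodesic_entropy(adj, node):
    """Shannon entropy (bits) of the shortest-path distance
    distribution from `node` to every other reachable node."""
    dist = {node: 0}
    q = deque([node])
    while q:                                  # breadth-first search
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    counts = Counter(d for n, d in dist.items() if n != node)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total)
                for c in counts.values())

# Path graph 0-1-2-3: from node 0, the distances 1, 2, 3 are equally
# likely, so the geodesic entropy is log2(3) ≈ 1.585 bits
path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
print(geodesic_entropy(path, 0))
```

A hub of a star graph, by contrast, sees every other node at distance 1 and has geodesic entropy zero, which matches the intuition that the metric captures how structurally "spread out" the rest of the network looks from a given node.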

14 pages, 1592 KiB  
Article
Information Geometric Duality of ϕ-Deformed Exponential Families
by Jan Korbel, Rudolf Hanel and Stefan Thurner
Entropy 2019, 21(2), 112; https://doi.org/10.3390/e21020112 - 24 Jan 2019
Cited by 10 | Viewed by 3802
Abstract
In the world of generalized entropies—which, for example, play a role in physical systems with sub- and super-exponential phase space growth per degree of freedom—there are two ways of implementing constraints in the maximum entropy principle: linear and escort constraints. Both appear naturally in different contexts. Linear constraints appear, e.g., in physical systems where additional information about the system is available through higher moments. Escort distributions appear naturally in the context of multifractals and information geometry. It was shown recently that there exists a fundamental duality that relates both approaches on the basis of the corresponding deformed logarithms (deformed-log duality). Here, we show that there exists another duality that arises in the context of information geometry, relating the Fisher information of ϕ-deformed exponential families that correspond to linear constraints (as studied by J. Naudts) to those based on escort constraints (as studied by S.-I. Amari). We explicitly demonstrate this information geometric duality for the case of the (c,d)-entropy, which covers all situations compatible with the first three Shannon–Khinchin axioms and includes Shannon, Tsallis, Anteneodo–Plastino entropy, and many more as special cases. Finally, we discuss the relation between the deformed-log duality and the information geometric duality and note that the escort distributions arising in these two dualities are generally different and coincide only in the case of the Tsallis deformation.

24 pages, 3039 KiB  
Article
Information Dynamics in Urban Crime
by Miguel Melgarejo and Nelson Obregon
Entropy 2018, 20(11), 874; https://doi.org/10.3390/e20110874 - 14 Nov 2018
Cited by 2 | Viewed by 4476
Abstract
Information production in both space and time has been highlighted as one of the elements that shape the footprint of complexity in natural and socio-technical systems. However, information production in urban crime has barely been studied. This work addresses the problem by using multifractal analysis to characterize the spatial information scaling in urban crime reports, and nonlinear processing tools to study the temporal behavior of this scaling. Our results suggest that information scaling in urban crime exhibits dynamics that evolve on low-dimensional chaotic attractors, and that this can be observed at several spatio-temporal scales, although some scales are more favorable than others. This evidence has practical implications for defining the characteristic scales at which to approach urban crime from available data, and it supports theoretical perspectives on the complexity of urban crime.

14 pages, 269 KiB  
Article
Probability Mass Exclusions and the Directed Components of Mutual Information
by Conor Finn and Joseph T. Lizier
Entropy 2018, 20(11), 826; https://doi.org/10.3390/e20110826 - 28 Oct 2018
Cited by 12 | Viewed by 2863
Abstract
Information is often described as a reduction of uncertainty associated with a restriction of possible choices. Despite appearing in Hartley’s foundational work on information theory, there is a surprising lack of a formal treatment of this interpretation in terms of exclusions. This paper addresses the gap by providing an explicit characterisation of information in terms of probability mass exclusions. It then demonstrates that different exclusions can yield the same amount of information and discusses the insight this provides about how information is shared amongst random variables—lack of progress in this area is a key barrier preventing us from understanding how information is distributed in complex systems. The paper closes by deriving a decomposition of the mutual information which can distinguish between differing exclusions; this provides surprising insight into the nature of directed information.
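A small numerical illustration of the claim that different exclusions can yield the same amount of information: the pointwise mutual information i(s;t) = log2 p(t|s)/p(t) depends only on these two probabilities, not on which outcomes a source event excluded. This is a generic sketch, not the decomposition derived in the paper.

```python
import math

def pointwise_mi(p_t_given_s, p_t):
    # i(s;t) = log2( p(t|s) / p(t) ): information event s provides about t
    return math.log2(p_t_given_s / p_t)

# Target T is uniform over {a, b, c, d}, so p(t) = 1/4 for each outcome.
# Source event s1 excludes {c, d}; given s1, T is uniform on {a, b}.
# Source event s2 excludes {b, d}; given s2, T is uniform on {a, c}.
# Different probability mass is excluded, yet for the outcome t = a
# both events provide exactly the same amount of information: 1 bit.
i1 = pointwise_mi(p_t_given_s=0.5, p_t=0.25)
i2 = pointwise_mi(p_t_given_s=0.5, p_t=0.25)
print(i1, i2)  # 1.0 1.0
```

Distinguishing which mass was excluded, and in which direction, is precisely what the paper's decomposition of the mutual information is designed to capture.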

13 pages, 1099 KiB  
Article
Intrinsic Computation of a Monod-Wyman-Changeux Molecule
by Sarah Marzen
Entropy 2018, 20(8), 599; https://doi.org/10.3390/e20080599 - 11 Aug 2018
Cited by 2 | Viewed by 3365
Abstract
Causal states are minimal sufficient statistics of prediction of a stochastic process, their coding cost is called statistical complexity, and the implied causal structure yields a sense of the process’ “intrinsic computation”. We discuss how statistical complexity changes with slight changes to the underlying model; in this case, a biologically motivated dynamical model, that of a Monod-Wyman-Changeux molecule. Perturbations to kinetic rates cause the statistical complexity to jump from finite to infinite. The same is not true for the excess entropy (the mutual information between past and future) or for the molecule’s transfer function. We discuss the implications of this for the relationship between the intrinsic and functional computation of biological sensory systems.
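The quantity at stake can be illustrated with a textbook example rather than the Monod-Wyman-Changeux model itself: for a process with a known finite set of causal states, the statistical complexity is the Shannon entropy of the stationary distribution over those states. The sketch below applies this to the Golden Mean Process (binary sequences with no two consecutive 1s); the function names are illustrative, not the paper's computation.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def golden_mean_statistical_complexity(p):
    """Statistical complexity C_mu = H[causal-state distribution] for the
    Golden Mean Process: causal state A emits 1 with probability p (moving
    to state B) or 0 (staying in A); state B must emit 0 (returning to A)."""
    pi_a = 1.0 / (1.0 + p)       # stationary probability of state A
    pi_b = p / (1.0 + p)         # stationary probability of state B
    return shannon_entropy([pi_a, pi_b])

print(golden_mean_statistical_complexity(0.5))  # log2(3) - 2/3 ≈ 0.918 bits
```

The abstract's point is that for some models this construction does not stay so tame: an arbitrarily small perturbation of the kinetic rates can push the number of causal states, and hence the coding cost, from finite to infinite.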

15 pages, 3927 KiB  
Article
A Novel Index Based on Binary Entropy to Confirm the Spatial Expansion Degree of Urban Sprawl
by Zhen Chen, Yinkang Zhou and Xiaobin Jin
Entropy 2018, 20(8), 559; https://doi.org/10.3390/e20080559 - 27 Jul 2018
Viewed by 3179
Abstract
The phenomenon of urban sprawl has received much attention. Accurately confirming the spatial expansion degree of urban sprawl (SEDUS) is a prerequisite for controlling urban sprawl. However, there is no reliable metric to accurately measure SEDUS. In this paper, based on binary entropy, we propose a new index, named the spatial expansion degree index (SEDI), to overcome this difficulty. The study shows that the new index can accurately determine SEDUS and that, compared with other commonly used measures, it has an obvious advantage in measuring SEDUS. The new index belongs to the second-order metrics of point pattern analysis and greatly extends the concept of entropy. It can also be applied to other spatial differentiation research from a broader perspective. Although the new index is influenced by the scaling problem, the differences between scales are small, so provided the partition scheme is kept the same throughout the analysis, the new index is a quite robust method for measuring SEDUS.
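The SEDI itself involves a partition-based construction not reproduced here; the snippet below shows only the binary entropy building block the index is based on, with illustrative values. For a grid cell developed with probability p, the entropy is maximal (1 bit) when development is as likely as not and zero when the outcome is certain.

```python
import math

def binary_entropy(p):
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), the entropy in bits
    of a Bernoulli(p) variable, with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# A half-developed cell is maximally uncertain; a mostly empty one is not
print(binary_entropy(0.5), binary_entropy(0.1))  # 1.0 0.468995...
```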

Review

Jump to: Research

13 pages, 991 KiB  
Review
Transients as the Basis for Information Flow in Complex Adaptive Systems
by William Sulis
Entropy 2019, 21(1), 94; https://doi.org/10.3390/e21010094 - 20 Jan 2019
Cited by 5 | Viewed by 3565
Abstract
Information is the fundamental currency of naturally occurring complex adaptive systems, whether they are individual organisms or collective social insect colonies. Information appears to be more important than energy in determining the behavior of these systems. However, it is not the quantity of information but rather its salience or meaning which is significant. Salience is not, in general, associated with instantaneous events but rather with spatio-temporal transients of events. This requires a shift in theoretical focus from instantaneous states towards spatio-temporal transients as the proper object for studying information flow in naturally occurring complex adaptive systems. A primitive form of salience appears in simple complex systems models in the form of transient induced global response synchronization (TIGoRS). Sparse random samplings of spatio-temporal transients may induce stable collective responses from the system, establishing a stimulus–response relationship between the system and its environment, with the system parsing its environment into salient and non-salient stimuli. In the presence of TIGoRS, an embedded complex dynamical system becomes a primitive automaton, modeled as a Sulis machine.
