Special Issue "Information Theory in Complex Systems"

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory, Probability and Statistics".

Deadline for manuscript submissions: 28 February 2019

Special Issue Editors

Guest Editor
Prof. Dr. Karoline Wiesner

School of Mathematics, University of Bristol, Bristol, BS8 1TW, UK
Interests: applied information theory; complex systems; physics of information
Guest Editor
Dr. Rick Quax

Computational Science, Faculty of Science, University of Amsterdam, The Netherlands
Interests: information theory; statistical mechanics; complex systems; complex networks; dynamics on networks; theory of computation; theoretical computer science; formal languages; information geometry

Special Issue Information

Dear Colleagues,

Complex systems are ubiquitous in the natural and engineered worlds. Canonical examples include self-assembling materials, the Earth's climate, single- and multi-cellular organisms, the brain, and coupled socio-economic and socio-technical systems. The use of Shannon information theory to study the behavior of such systems, and to explain and predict their dynamics, has gained significant attention from both theoretical and experimental viewpoints. There have been many advances in applying Shannon theory to complex systems, including correlation analyses for spatial and temporal data and construction and clustering techniques for complex networks. Progress has often been driven by application areas such as genetics, the neurosciences, and the Earth sciences.

The application of Shannon theory to data from real-world complex systems is often hindered by a lack of stationarity and of sufficient statistics. Further progress on this front calls for new statistical techniques based on Shannon information theory, for the refinement of known techniques, and for an improved understanding of the meaning of entropy in complex systems. Contributions addressing any of these issues are very welcome.

This Special Issue aims to be a forum for the presentation of new and improved techniques of information theory for complex systems. In particular, the analysis and interpretation of real-world natural and engineered complex systems with the help of statistical tools based on Shannon information theory fall within the scope of this Special Issue.

Prof. Dr. Karoline Wiesner
Dr. Rick Quax
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • complex systems
  • data analysis
  • statistics
  • information theoretic techniques
  • complex networks
  • physics
  • chemistry
  • biology
  • earth sciences
  • social sciences
  • applications

Published Papers (8 papers)


Research

Jump to: Review

Open Access Article A Robust Adaptive Filter for a Complex Hammerstein System
Entropy 2019, 21(2), 162; https://doi.org/10.3390/e21020162
Received: 22 December 2018 / Revised: 2 February 2019 / Accepted: 6 February 2019 / Published: 9 February 2019
PDF Full-text (887 KB)
Abstract
The Hammerstein adaptive filter using the maximum correntropy criterion (MCC) has been shown to be more robust to outliers than those using the traditional mean square error (MSE) criterion. As there is no report on robust Hammerstein adaptive filters in the complex domain, in this paper we extend the robust Hammerstein adaptive filter under MCC to the complex domain and propose the Hammerstein maximum complex correntropy criterion (HMCCC) algorithm. The new Hammerstein adaptive filter can thus directly handle complex-valued data. Additionally, we analyze the stability and steady-state mean square performance of HMCCC. Simulations illustrate that the proposed HMCCC algorithm is convergent in an impulsive noise environment, and achieves higher accuracy and a faster convergence speed than the Hammerstein complex least mean square (HCLMS) algorithm. Full article
(This article belongs to the Special Issue Information Theory in Complex Systems)
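As a rough illustration of the correntropy idea behind this paper, the sketch below implements a simplified linear complex-valued MCC filter, not the paper's full Hammerstein HMCCC algorithm; the function name and parameters are ours. A Gaussian kernel of the error exponentially down-weights large (outlier) errors in an LMS-style update, which is the source of MCC's robustness:

```python
import numpy as np

def mcc_lms(x, d, order=4, mu=0.05, sigma=1.0):
    """LMS-style complex adaptive filter whose error term is weighted by a
    Gaussian (correntropy) kernel, so impulsive outliers barely move the
    weights. Simplified linear sketch, not the HMCCC algorithm itself."""
    w = np.zeros(order, dtype=complex)
    n = len(x)
    y = np.zeros(n, dtype=complex)
    for k in range(order, n):
        u = x[k - order:k][::-1]                 # regressor, most recent first
        y[k] = w @ u                             # filter output
        e = d[k] - y[k]                          # a priori error
        g = np.exp(-abs(e) ** 2 / (2 * sigma ** 2))  # correntropy weight in (0, 1]
        w += mu * g * e * np.conj(u)             # down-weighted LMS update
    return w, y
```

In the MSE limit (sigma large, so g ≈ 1) this reduces to the ordinary complex LMS update; for small sigma, an impulsive error makes g collapse toward zero and the update is effectively skipped.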
Open Access Article Characterizing Complex Networks Using Entropy-Degree Diagrams: Unveiling Changes in Functional Brain Connectivity Induced by Ayahuasca
Entropy 2019, 21(2), 128; https://doi.org/10.3390/e21020128
Received: 21 December 2018 / Revised: 13 January 2019 / Accepted: 30 January 2019 / Published: 30 January 2019
PDF Full-text (968 KB) | HTML Full-text | XML Full-text | Supplementary Files
Abstract
With the aim of further advancing the understanding of the human brain’s functional connectivity, we propose a network metric which we term the geodesic entropy. This metric quantifies the Shannon entropy of the distance distribution to a specific node from all other nodes. It allows one to characterize the influence exerted on a specific node considering statistics of the overall network structure. The measurement and characterization of this structural information has the potential to greatly improve our understanding of sustained activity and other emergent behaviors in networks. We apply this method to study how the psychedelic infusion Ayahuasca affects the functional connectivity of the human brain in the resting state. We show that the geodesic entropy is able to differentiate functional networks of the human brain associated with two different states of consciousness in the awake resting state: (i) the ordinary state and (ii) a state altered by ingestion of Ayahuasca. The functional brain networks from subjects in the altered state have, on average, a larger geodesic entropy compared to the ordinary state. Finally, we discuss why the geodesic entropy may bring even further valuable insights into the study of the human brain and other empirical networks. Full article
(This article belongs to the Special Issue Information Theory in Complex Systems)
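The abstract defines the geodesic entropy of a node as the Shannon entropy of the distribution of shortest-path distances from all other nodes. A minimal sketch of that quantity for an unweighted graph (the function name and the adjacency-dict representation are ours, not the paper's):

```python
from collections import deque, Counter
from math import log2

def geodesic_entropy(adj, source):
    """Shannon entropy (in bits) of the distribution of shortest-path
    (geodesic) distances from all other reachable nodes to `source`.
    `adj` maps each node to an iterable of its neighbours."""
    dist = {source: 0}
    q = deque([source])
    while q:                                  # breadth-first search
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    counts = Counter(d for n, d in dist.items() if n != source)
    total = sum(counts.values())
    return -sum(c / total * log2(c / total) for c in counts.values())
```

For example, on the path graph 0-1-2-3 the distances from node 0 are {1, 2, 3}, one node at each, so the entropy is log2(3) ≈ 1.585 bits; at the centre of a star every distance is 1 and the entropy is 0.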

Open Access Article Information Geometric Duality of ϕ-Deformed Exponential Families
Entropy 2019, 21(2), 112; https://doi.org/10.3390/e21020112
Received: 24 December 2018 / Revised: 11 January 2019 / Accepted: 16 January 2019 / Published: 24 January 2019
PDF Full-text (1592 KB) | HTML Full-text | XML Full-text
Abstract
In the world of generalized entropies—which, for example, play a role in physical systems with sub- and super-exponential phase space growth per degree of freedom—there are two ways for implementing constraints in the maximum entropy principle: linear and escort constraints. Both appear naturally in different contexts. Linear constraints appear, e.g., in physical systems, when additional information about the system is available through higher moments. Escort distributions appear naturally in the context of multifractals and information geometry. It was shown recently that there exists a fundamental duality that relates both approaches on the basis of the corresponding deformed logarithms (deformed-log duality). Here, we show that there exists another duality that arises in the context of information geometry, relating the Fisher information of ϕ-deformed exponential families that correspond to linear constraints (as studied by J. Naudts) to those that are based on escort constraints (as studied by S.-I. Amari). We explicitly demonstrate this information geometric duality for the case of (c,d)-entropy, which covers all situations that are compatible with the first three Shannon–Khinchin axioms and that include Shannon, Tsallis, Anteneodo–Plastino entropy, and many more as special cases. Finally, we discuss the relation between the deformed-log duality and the information geometric duality and mention that the escort distributions arising in these two dualities are generally different and only coincide for the case of the Tsallis deformation. Full article
(This article belongs to the Special Issue Information Theory in Complex Systems)
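A concrete instance of the deformed logarithms and escort distributions the abstract refers to is the standard Tsallis q-deformation (a textbook example, not taken from this paper; the abstract notes it is the one case where the two dualities' escort distributions coincide):

```latex
% Tsallis q-deformed logarithm (recovers \ln x in the limit q \to 1)
\ln_q(x) = \frac{x^{1-q} - 1}{1 - q}
% Escort distribution of order q associated with a distribution p_i
P_i^{(q)} = \frac{p_i^{\,q}}{\sum_j p_j^{\,q}}
```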

Open Access Article Information Dynamics in Urban Crime
Entropy 2018, 20(11), 874; https://doi.org/10.3390/e20110874
Received: 28 September 2018 / Revised: 1 November 2018 / Accepted: 6 November 2018 / Published: 14 November 2018
PDF Full-text (3039 KB) | HTML Full-text | XML Full-text
Abstract
Information production in both space and time has been highlighted as one of the elements that shape the footprint of complexity in natural and socio-technical systems. However, information production in urban crime has barely been studied. This work addresses this problem by using multifractal analysis to characterize the spatial information scaling in urban crime reports and nonlinear processing tools to study the temporal behavior of this scaling. Our results suggest that information scaling in urban crime exhibits dynamics that evolve in low-dimensional chaotic attractors, and this can be observed at several spatio-temporal scales, although some are more favorable than others. This evidence has practical implications for defining the characteristic scales at which to approach urban crime from available data and supports theoretical perspectives on the complexity of urban crime. Full article
(This article belongs to the Special Issue Information Theory in Complex Systems)

Open Access Article Probability Mass Exclusions and the Directed Components of Mutual Information
Entropy 2018, 20(11), 826; https://doi.org/10.3390/e20110826
Received: 26 September 2018 / Revised: 22 October 2018 / Accepted: 23 October 2018 / Published: 28 October 2018
Cited by 1 | PDF Full-text (269 KB) | HTML Full-text | XML Full-text
Abstract
Information is often described as a reduction of uncertainty associated with a restriction of possible choices. Despite appearing in Hartley’s foundational work on information theory, there is a surprising lack of a formal treatment of this interpretation in terms of exclusions. This paper addresses the gap by providing an explicit characterisation of information in terms of probability mass exclusions. It then demonstrates that different exclusions can yield the same amount of information and discusses the insight this provides about how information is shared amongst random variables—lack of progress in this area is a key barrier preventing us from understanding how information is distributed in complex systems. The paper closes by deriving a decomposition of the mutual information which can distinguish between differing exclusions; this provides surprising insight into the nature of directed information. Full article
(This article belongs to the Special Issue Information Theory in Complex Systems)
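Since the paper builds on the mutual information between random variables, the sketch below computes I(X;Y) in bits from a discrete joint distribution. This is the standard textbook definition, not the paper's directed decomposition; the function name and the dict representation are ours:

```python
from math import log2

def mutual_information(p_xy):
    """Mutual information I(X;Y) in bits from a joint distribution given
    as a dict {(x, y): probability}. Marginals are computed by summing
    the joint over the other variable."""
    px, py = {}, {}
    for (x, y), p in p_xy.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    # I(X;Y) = sum_{x,y} p(x,y) log2[ p(x,y) / (p(x) p(y)) ]
    return sum(p * log2(p / (px[x] * py[y]))
               for (x, y), p in p_xy.items() if p > 0)
```

For independent variables the sum vanishes, while two perfectly correlated fair binary variables share exactly 1 bit.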

Open Access Article Intrinsic Computation of a Monod-Wyman-Changeux Molecule
Entropy 2018, 20(8), 599; https://doi.org/10.3390/e20080599
Received: 12 July 2018 / Revised: 6 August 2018 / Accepted: 10 August 2018 / Published: 11 August 2018
PDF Full-text (1099 KB) | HTML Full-text | XML Full-text
Abstract
Causal states are minimal sufficient statistics of prediction of a stochastic process, their coding cost is called statistical complexity, and the implied causal structure yields a sense of the process’ “intrinsic computation”. We discuss how statistical complexity changes with slight changes to the underlying model; in this case, a biologically motivated dynamical model, that of a Monod-Wyman-Changeux molecule. Perturbations to kinetic rates cause statistical complexity to jump from finite to infinite. The same is not true for excess entropy, the mutual information between past and future, or for the molecule’s transfer function. We discuss the implications of this for the relationship between the intrinsic and functional computation of biological sensory systems. Full article
(This article belongs to the Special Issue Information Theory in Complex Systems)

Open Access Article A Novel Index Based on Binary Entropy to Confirm the Spatial Expansion Degree of Urban Sprawl
Entropy 2018, 20(8), 559; https://doi.org/10.3390/e20080559
Received: 18 May 2018 / Revised: 25 June 2018 / Accepted: 25 July 2018 / Published: 27 July 2018
PDF Full-text (3927 KB) | HTML Full-text | XML Full-text | Supplementary Files
Abstract
The phenomenon of urban sprawl has received much attention. Accurately confirming the spatial expansion degree of urban sprawl (SEDUS) is a prerequisite to controlling urban sprawl. However, there is no reliable metric to accurately measure SEDUS. In this paper, based on binary entropy, we propose a new index, named the spatial expansion degree index (SEDI), to overcome this difficulty. The study shows that the new index can accurately determine SEDUS and, compared with other commonly used measures, has an obvious advantage in measuring SEDUS. The new index belongs to the second-order metrics of point pattern analysis and greatly extends the concept of entropy. It can also be applied to other spatial differentiation research from a broader perspective. Although the new index is influenced by the scaling problem, the differences between scales are small, so provided the partition scheme is held fixed throughout the analysis, the new index is a robust method for measuring SEDUS. Full article
(This article belongs to the Special Issue Information Theory in Complex Systems)
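The binary (Shannon) entropy on which the proposed SEDI is built has a simple closed form; a minimal sketch of that building block (the SEDI itself is the paper's construct and is not reproduced here):

```python
from math import log2

def binary_entropy(p):
    """Shannon entropy H(p) = -p log2 p - (1-p) log2 (1-p) of a binary
    variable with success probability p, in bits. By convention
    0 log2 0 = 0, so H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)
```

H(p) peaks at 1 bit for p = 0.5 (maximal uncertainty) and falls to 0 as p approaches 0 or 1.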

Review

Jump to: Research

Open Access Review Transients as the Basis for Information Flow in Complex Adaptive Systems
Entropy 2019, 21(1), 94; https://doi.org/10.3390/e21010094
Received: 31 December 2018 / Revised: 17 January 2019 / Accepted: 19 January 2019 / Published: 20 January 2019
PDF Full-text (991 KB) | HTML Full-text | XML Full-text
Abstract
Information is the fundamental currency of naturally occurring complex adaptive systems, whether they are individual organisms or collective social insect colonies. Information appears to be more important than energy in determining the behavior of these systems. However, it is not the quantity of information but rather its salience or meaning which is significant. Salience is not, in general, associated with instantaneous events but rather with spatio-temporal transients of events. This requires a shift in theoretical focus from instantaneous states towards spatio-temporal transients as the proper object for studying information flow in naturally occurring complex adaptive systems. A primitive form of salience appears in simple complex systems models in the form of transient induced global response synchronization (TIGoRS). Sparse random samplings of spatio-temporal transients may induce stable collective responses from the system, establishing a stimulus–response relationship between the system and its environment, with the system parsing its environment into salient and non-salient stimuli. In the presence of TIGoRS, an embedded complex dynamical system becomes a primitive automaton, modeled as a Sulis machine. Full article
(This article belongs to the Special Issue Information Theory in Complex Systems)

Entropy EISSN 1099-4300 Published by MDPI AG, Basel, Switzerland