Special Issue "Information Theory in Complex Systems"

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory, Probability and Statistics".

Deadline for manuscript submissions: 31 December 2018

Special Issue Editors

Guest Editor
Prof. Dr. Karoline Wiesner

School of Mathematics, University of Bristol, Bristol, BS8 1TW, UK
Interests: applied information theory; complex systems; physics of information
Guest Editor
Dr. Rick Quax

Computational Science, Faculty of Science, University of Amsterdam, The Netherlands
Interests: information theory; statistical mechanics; complex systems; complex networks; dynamics on networks; theory of computation; theoretical computer science; formal languages; information geometry

Special Issue Information

Dear Colleagues,

Complex systems are ubiquitous in the natural and engineered worlds. Canonical examples include self-assembling materials, the Earth's climate, single- and multi-cellular organisms, the brain, and coupled socio-economic and socio-technical systems. The use of Shannon information theory to study the behavior of such systems, and to explain and predict their dynamics, has gained significant attention from both theoretical and experimental viewpoints. There have been many advances in applying Shannon theory to complex systems, including correlation analyses for spatial and temporal data and construction and clustering techniques for complex networks. Progress has often been driven by the application areas, such as genetics, the neurosciences, and the Earth sciences.

The application of Shannon theory to data from real-world complex systems is often hindered by the frequent lack of stationarity and of sufficient statistics. Further progress on this front calls for new statistical techniques based on Shannon information theory, for the refinement of known techniques, and for an improved understanding of the meaning of entropy in complex systems. Contributions addressing any of these issues are very welcome.
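The estimation problem alluded to here can be made concrete with a small sketch (illustrative only, not tied to any particular contribution): the naive plug-in entropy estimator is biased downward when the sample size is small relative to the alphabet size, which is exactly the regime many real-world complex-systems datasets occupy.

```python
import numpy as np

def plugin_entropy(samples, base=2):
    """Plug-in (maximum-likelihood) Shannon entropy estimate."""
    _, counts = np.unique(samples, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p)) / np.log(base)

# Samples from a fair 8-sided die (true entropy: 3 bits).
# With few samples the plug-in estimate is biased downward.
rng = np.random.default_rng(0)
small = plugin_entropy(rng.integers(0, 8, size=30))
large = plugin_entropy(rng.integers(0, 8, size=30000))
```

The small-sample estimate undershoots the true value of 3 bits, while the large-sample estimate essentially recovers it; bias-corrected estimators are one response to this, and the lack of stationarity mentioned above compounds the problem because pooling samples over time is no longer legitimate.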

This Special Issue aims to be a forum for the presentation of new and improved techniques of information theory for complex systems. In particular, the analysis and interpretation of real-world natural and engineered complex systems with the help of statistical tools based on Shannon information theory fall within the scope of this Special Issue.

Prof. Dr. Karoline Wiesner
Dr. Rick Quax
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1500 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • complex systems
  • data analysis
  • statistics
  • information-theoretic techniques
  • complex networks
  • physics
  • chemistry
  • biology
  • earth sciences
  • social sciences
  • applications

Published Papers (4 papers)


Research

Open Access Article: Information Dynamics in Urban Crime
Entropy 2018, 20(11), 874; https://doi.org/10.3390/e20110874
Received: 28 September 2018 / Revised: 1 November 2018 / Accepted: 6 November 2018 / Published: 14 November 2018
Abstract
Information production in both space and time has been highlighted as one of the elements that shapes the footprint of complexity in natural and socio-technical systems. However, information production in urban crime has barely been studied. This work copes with this problem by using multifractal analysis to characterize the spatial information scaling in urban crime reports and nonlinear processing tools to study the temporal behavior of this scaling. Our results suggest that information scaling in urban crime exhibits dynamics that evolve in low-dimensional chaotic attractors, and this can be observed in several spatio-temporal scales, although some of them are more favorable than others. This evidence has practical implications in terms of defining the characteristic scales to approach urban crime from available data and supporting theoretical perspectives about the complexity of urban crime.
(This article belongs to the Special Issue Information Theory in Complex Systems)
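The multifractal characterization rests on how coarse-grained Rényi entropies scale with resolution. The sketch below illustrates the generic box-counting estimate of a generalized dimension D_q (an illustration of the general technique, with names of our own choosing; it is not the authors' analysis pipeline).

```python
import numpy as np

def renyi_entropy(points, eps, q=2.0):
    """Rényi entropy of order q of a 2-D point set,
    coarse-grained into boxes of side eps (natural log)."""
    boxes = np.floor(points / eps).astype(int)
    _, counts = np.unique(boxes, axis=0, return_counts=True)
    p = counts / counts.sum()
    if q == 1.0:
        return -np.sum(p * np.log(p))
    return np.log(np.sum(p ** q)) / (1.0 - q)

def generalized_dimension(points, q=2.0, scales=(0.02, 0.04, 0.08, 0.16)):
    """Estimate D_q as the slope of Rényi entropy vs. -log(eps)."""
    h = [renyi_entropy(points, e, q) for e in scales]
    slope, _ = np.polyfit(-np.log(scales), h, 1)
    return slope

# Uniform random points fill the plane, so D_2 should be close to 2;
# a clustered (multifractal) point pattern would give D_2 < 2.
rng = np.random.default_rng(1)
pts = rng.random((20000, 2))
d2 = generalized_dimension(pts)
```

A spectrum of D_q over many orders q, rather than a single dimension, is what signals multifractality in a spatial point pattern such as geolocated crime reports.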

Open Access Article: Probability Mass Exclusions and the Directed Components of Mutual Information
Entropy 2018, 20(11), 826; https://doi.org/10.3390/e20110826
Received: 26 September 2018 / Revised: 22 October 2018 / Accepted: 23 October 2018 / Published: 28 October 2018
Abstract
Information is often described as a reduction of uncertainty associated with a restriction of possible choices. Despite appearing in Hartley’s foundational work on information theory, there is a surprising lack of a formal treatment of this interpretation in terms of exclusions. This paper addresses the gap by providing an explicit characterisation of information in terms of probability mass exclusions. It then demonstrates that different exclusions can yield the same amount of information and discusses the insight this provides about how information is shared amongst random variables—lack of progress in this area is a key barrier preventing us from understanding how information is distributed in complex systems. The paper closes by deriving a decomposition of the mutual information which can distinguish between differing exclusions; this provides surprising insight into the nature of directed information.
(This article belongs to the Special Issue Information Theory in Complex Systems)

Open Access Article: Intrinsic Computation of a Monod-Wyman-Changeux Molecule
Entropy 2018, 20(8), 599; https://doi.org/10.3390/e20080599
Received: 12 July 2018 / Revised: 6 August 2018 / Accepted: 10 August 2018 / Published: 11 August 2018
Abstract
Causal states are minimal sufficient statistics of prediction of a stochastic process, their coding cost is called statistical complexity, and the implied causal structure yields a sense of the process’ “intrinsic computation”. We discuss how statistical complexity changes with slight changes to the underlying model: in this case, a biologically-motivated dynamical model, that of a Monod-Wyman-Changeux molecule. Perturbations to kinetic rates cause statistical complexity to jump from finite to infinite. The same is not true for excess entropy, the mutual information between past and future, or for the molecule’s transfer function. We discuss the implications of this for the relationship between intrinsic and functional computation of biological sensory systems.
(This article belongs to the Special Issue Information Theory in Complex Systems)
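Statistical complexity is the Shannon entropy of the stationary distribution over causal states. A minimal sketch for a textbook two-state epsilon-machine, the golden-mean process, which is our example rather than the MWC model studied in the paper:

```python
import numpy as np

def statistical_complexity(T):
    """Shannon entropy (bits) of the stationary distribution of a
    causal-state transition matrix T (rows sum to 1)."""
    # Stationary distribution: left eigenvector of T for eigenvalue 1.
    vals, vecs = np.linalg.eig(T.T)
    pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
    pi = pi / pi.sum()
    log_pi = np.log2(pi, where=pi > 0, out=np.zeros_like(pi))
    return -np.sum(pi * log_pi)

# Epsilon-machine of the golden-mean process (no two consecutive 0s):
# state A emits 0 or 1 with probability 1/2; state B always emits 1.
T = np.array([[0.5, 0.5],
              [1.0, 0.0]])
c_mu = statistical_complexity(T)   # stationary pi = (2/3, 1/3)
```

The paper's point is that this quantity, unlike the excess entropy, can diverge under arbitrarily small perturbations of the model's rates, because the perturbed process may require infinitely many causal states.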

Open Access Article: A Novel Index Based on Binary Entropy to Confirm the Spatial Expansion Degree of Urban Sprawl
Entropy 2018, 20(8), 559; https://doi.org/10.3390/e20080559
Received: 18 May 2018 / Revised: 25 June 2018 / Accepted: 25 July 2018 / Published: 27 July 2018
Abstract
The phenomenon of urban sprawl has received much attention. Accurately confirming the spatial expansion degree of urban sprawl (SEDUS) is a prerequisite to controlling urban sprawl. However, there is no reliable metric to accurately measure SEDUS. In this paper, based on binary entropy, we propose a new index named the spatial expansion degree index (SEDI) to overcome this difficulty. The study shows that the new index can accurately determine SEDUS and, compared with other commonly used measures, has an obvious advantage in measuring SEDUS. The new index belongs to the second-order metrics of point pattern analysis and greatly extends the concept of entropy. It can also be applied to other spatial differentiation research from a broader perspective. Although the new index is influenced by the scaling problem, the differences between scales are small, so provided the partition scheme is held fixed throughout the analysis, the new index is a robust method for measuring SEDUS.
(This article belongs to the Special Issue Information Theory in Complex Systems)
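The building block named in the title, binary entropy, is easy to state. The sketch below computes it per grid cell for a hypothetical built-up-fraction map (an illustration of the ingredient only; the paper's SEDI construction is not reproduced here).

```python
import numpy as np

def binary_entropy(p):
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), with H(0) = H(1) = 0."""
    p = np.asarray(p, dtype=float)
    h = np.zeros_like(p)
    mask = (p > 0) & (p < 1)
    pm = p[mask]
    h[mask] = -pm * np.log2(pm) - (1 - pm) * np.log2(1 - pm)
    return h

# Built-up fraction of each cell in a coarse partition of a city:
# 0.5 (maximally mixed land use) scores 1 bit; 0 or 1 scores 0.
fractions = np.array([0.0, 0.25, 0.5, 1.0])
h = binary_entropy(fractions)
```

The scale dependence the abstract warns about enters through the partition: coarser cells aggregate more land, shifting the per-cell fractions and hence the entropies, which is why a fixed partition scheme matters for comparability.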
