Article

Complexity as Causal Information Integration

by Carlotta Langer 1,* and Nihat Ay 1,2,3
1 Max Planck Institute for Mathematics in the Sciences, 04103 Leipzig, Germany
2 Faculty of Mathematics and Computer Science, University of Leipzig, PF 100920, 04009 Leipzig, Germany
3 Santa Fe Institute, Santa Fe, NM 87501, USA
* Author to whom correspondence should be addressed.
Entropy 2020, 22(10), 1107; https://doi.org/10.3390/e22101107
Received: 21 August 2020 / Revised: 25 September 2020 / Accepted: 27 September 2020 / Published: 30 September 2020
(This article belongs to the Special Issue Entropy: The Scientific Tool of the 21st Century)
Complexity measures in the context of the Integrated Information Theory of consciousness aim to quantify the strength of the causal connections between different neurons. This is done by minimizing the KL-divergence between a full system and one without causal cross-connections. Various measures have been proposed and compared in this setting. We discuss a class of information-geometric measures that assess the intrinsic causal cross-influences in a system. One promising candidate among these measures, denoted by ΦCIS, is based on conditional independence statements and satisfies all of the properties that have been postulated as desirable. Unfortunately, it does not have a graphical representation, which makes it less intuitive and difficult to analyze. We propose an alternative approach using a latent variable that models a common exterior influence. This leads to a measure, ΦCII (Causal Information Integration), that satisfies all of the required conditions. Our measure can be calculated using an iterative information-geometric algorithm, the em-algorithm; we are therefore able to compare its behavior to existing integrated information measures.
Keywords: complexity; integrated information; causality; conditional independence; em-algorithm
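The abstract mentions that the proposed measure is computed with the em-algorithm, which alternately projects between a data distribution and a latent-variable model manifold, monotonically decreasing the KL-divergence. The following is a minimal illustrative sketch in Python/NumPy, not the authors' implementation: it fits a toy latent-class model (a mixture of product distributions over two binary variables, with a binary latent variable) to a fixed target distribution. The target distribution, variable names, and iteration count are our own assumptions for illustration.

```python
import numpy as np

def kl(p, q):
    """KL-divergence D(p || q) for strictly positive q on the support of p."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Toy target joint distribution over two binary variables (X1, X2).
p = np.array([[0.4, 0.1],
              [0.1, 0.4]])

# Latent-class model: q(x1, x2) = sum_z pi[z] * a[z, x1] * b[z, x2],
# where Z is a binary latent variable (a stand-in for a common exterior influence).
rng = np.random.default_rng(0)
pi = np.array([0.5, 0.5])            # mixture weights over Z
a = rng.dirichlet([1.0, 1.0], size=2)  # a[z] = distribution of X1 given Z = z
b = rng.dirichlet([1.0, 1.0], size=2)  # b[z] = distribution of X2 given Z = z

def model(pi, a, b):
    """Marginal q(x1, x2) of the latent-class model."""
    return np.einsum('z,zi,zj->ij', pi, a, b)

history = []
for _ in range(500):
    history.append(kl(p, model(pi, a, b)))
    # e-step: posterior q(z | x1, x2), then combine with the target p
    joint_z = np.einsum('z,zi,zj->zij', pi, a, b)
    post = joint_z / joint_z.sum(axis=0, keepdims=True)
    w = post * p                      # w[z, i, j] = p(i, j) * q(z | i, j)
    # m-step: re-estimate the model parameters from the mixed distribution w
    pi = w.sum(axis=(1, 2))
    a = w.sum(axis=2) / pi[:, None]
    b = w.sum(axis=1) / pi[:, None]

print(kl(p, model(pi, a, b)))         # final KL-divergence after the em iterations
```

Each iteration cannot increase the KL-divergence to the target, which is the defining property of the em-algorithm exploited in the paper; the actual measure is obtained from the minimized divergence rather than from the fitted parameters themselves.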
Figure 1
MDPI and ACS Style

Langer, C.; Ay, N. Complexity as Causal Information Integration. Entropy 2020, 22, 1107. https://doi.org/10.3390/e22101107

AMA Style

Langer C, Ay N. Complexity as Causal Information Integration. Entropy. 2020; 22(10):1107. https://doi.org/10.3390/e22101107

Chicago/Turabian Style

Langer, Carlotta, and Nihat Ay. 2020. "Complexity as Causal Information Integration" Entropy 22, no. 10: 1107. https://doi.org/10.3390/e22101107

