Open Access Article

Evaluating Approximations and Heuristic Measures of Integrated Information

Brain Signalling Group, Department of Physiology, Institute of Basic Medicine, University of Oslo, Sognsvannsveien 9, 0315 Oslo, Norway
Department of Psychiatry, University of Wisconsin, Madison, WI 53719, USA
Department of Mathematics and Statistics, Brock University, St. Catharines, ON L2S 3A1, Canada
Author to whom correspondence should be addressed.
Entropy 2019, 21(5), 525;
Received: 8 March 2019 / Revised: 16 May 2019 / Accepted: 22 May 2019 / Published: 24 May 2019
(This article belongs to the Special Issue Integrated Information Theory)
Integrated information theory (IIT) proposes a measure of integrated information, termed Phi (Φ), to capture the level of consciousness of a physical system in a given state. Unfortunately, calculating Φ itself is currently possible only for very small model systems and far from computable for the kinds of systems typically associated with consciousness (brains). Here, we considered several proposed heuristic measures and computational approximations, some of which can be applied to larger systems, and tested whether they correlate well with Φ. While these measures and approximations capture intuitions underlying IIT and some have had success in practical applications, it has not been shown that they actually quantify the type of integrated information specified by the latest version of IIT and, thus, whether they can be used to test the theory. In this study, we evaluated these approximations and heuristic measures on how well they estimated the Φ values of model systems, not on the basis of practical or clinical considerations. To do this, we simulated networks consisting of 3–6 binary linear threshold nodes randomly connected with excitatory and inhibitory connections. For each system, we then constructed the system’s state transition probability matrix (TPM) and generated observed data over time from all possible initial conditions. We then calculated Φ, approximations to Φ, and measures based on state differentiation, coalition entropy, state uniqueness, and integrated information. Our findings suggest that Φ can be approximated closely in small binary systems by using one or more of the readily available approximations (r > 0.95), though without major reductions in computational demands. Furthermore, the maximum value of Φ across states (a state-independent quantity) correlated strongly with measures of signal complexity (LZ, rs = 0.722), decoder-based integrated information (Φ*, rs = 0.816), and state differentiation (D1, rs = 0.827).
These measures could allow for the efficient estimation of a system’s capacity for high Φ or function as accurate predictors of low- (but not high-)Φ systems. While it is uncertain whether these results extend to larger systems or to systems with other dynamics, we stress that measures intended as practical alternatives to Φ should, at a minimum, be rigorously tested in an environment where the ground truth can be established.
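The simulation setup described above can be illustrated with a minimal sketch: a small network of binary linear threshold nodes with random excitatory/inhibitory weights, whose deterministic update rule is enumerated over all initial states to yield the state transition probability matrix (TPM). The weight distribution and threshold here are illustrative assumptions, not the paper's exact parameterization.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
n = 3  # the study used networks of 3-6 nodes

# Random connection weights; sign encodes excitatory (+) vs. inhibitory (-).
# (Hypothetical magnitudes -- the paper's connectivity scheme may differ.)
W = rng.choice([-1.0, 1.0], size=(n, n)) * rng.uniform(0.5, 1.0, size=(n, n))

def step(state):
    # Linear threshold update: a node fires iff its weighted input exceeds 0.
    return (W @ state > 0).astype(int)

# Enumerate all 2^n binary states and record each deterministic transition.
states = list(itertools.product([0, 1], repeat=n))
tpm = np.zeros((2**n, 2**n))
for i, s in enumerate(states):
    nxt = tuple(int(x) for x in step(np.array(s)))
    tpm[i, states.index(nxt)] = 1.0  # deterministic: probability 1

# Each row of a state transition probability matrix must sum to 1.
assert np.allclose(tpm.sum(axis=1), 1.0)
```

A TPM in this state-by-state form is the standard input for Φ calculations (e.g., via the PyPhi package), and iterating `step` from every initial state generates the observed time-series data from which the heuristic measures are computed.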
Keywords: integrated information theory; differentiation; integration; complexity; consciousness; computational; IIT; Phi
Figure 1

MDPI and ACS Style

Sevenius Nilsen, A.; Juel, B.E.; Marshall, W. Evaluating Approximations and Heuristic Measures of Integrated Information. Entropy 2019, 21, 525.

