Open Access Article
Entropy 2018, 20(3), 173; https://doi.org/10.3390/e20030173

Efficient Algorithms for Searching the Minimum Information Partition in Integrated Information Theory

1. Araya, Inc., Toranomon 15 Mori Building, 2-8-10 Toranomon, Minato-ku, Tokyo 105-0001, Japan
2. Graduate School of Engineering, Kobe University, 1-1 Rokkodai-cho, Nada-ku, Kobe-shi, Hyogo 657-8501, Japan
3. RIKEN Brain Science Institute, 2-1 Hirosawa, Wako City, Saitama 351-0198, Japan
* Authors to whom correspondence should be addressed.
Received: 18 December 2017 / Revised: 26 February 2018 / Accepted: 27 February 2018 / Published: 6 March 2018
(This article belongs to the Special Issue Information Theory in Neuroscience)

Abstract

The ability to integrate information in the brain is considered an essential property for cognition and consciousness. Integrated Information Theory (IIT) hypothesizes that the amount of integrated information (Φ) in the brain is related to the level of consciousness. IIT proposes that, to quantify information integration in a system as a whole, integrated information should be measured across the partition of the system at which the information loss caused by partitioning is minimized, called the Minimum Information Partition (MIP). The computational cost of exhaustively searching for the MIP grows exponentially with system size, making it difficult to apply IIT to real neural data. It has previously been shown that, if a measure of Φ satisfies a mathematical property called submodularity, the MIP can be found in polynomial time by an optimization algorithm. However, although the first version of Φ is submodular, the later versions are not. In this study, we empirically explore to what extent the algorithm can be applied to the non-submodular measures of Φ by evaluating its accuracy on simulated data and real neural data. We find that the algorithm identifies the MIP almost perfectly even for the non-submodular measures. Our results show that the algorithm allows us to measure Φ in large systems within a practical amount of time.
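Stated compactly (in our notation, not the authors': \(\mathcal{P}\) denotes the set of candidate partitions, and any normalization of Φ across partitions is omitted), the MIP described verbally above is

\mathrm{MIP} \;=\; \operatorname*{arg\,min}_{P \in \mathcal{P}} \, \Phi(P),

where Φ(P) is the information lost when the system is cut along partition P; the exhaustive search mentioned above evaluates Φ(P) for every candidate P, and the number of candidates grows exponentially with the number of system elements.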
Keywords: integrated information theory; integrated information; minimum information partition; submodularity; Queyranne’s algorithm; consciousness
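The polynomial-time search referred to in the abstract relies on Queyranne's algorithm for minimizing a symmetric submodular set function over nontrivial bipartitions. The sketch below is a minimal, illustrative Python implementation of that generic algorithm, not the authors' code: the names queyranne, pendent_pair, and cut are our own, and in the IIT setting the evaluation function passed in would compute the (possibly non-submodular) Φ of a candidate bipartition.

```python
from itertools import chain

def pendent_pair(f, groups):
    """Return the last two groups of a maximum-adjacency-style ordering.

    groups is a list of disjoint frozensets of original elements; f maps a
    frozenset of original elements to a real number.
    """
    ordering = [groups[0]]
    rest = list(groups[1:])
    while rest:
        merged = frozenset(chain.from_iterable(ordering))
        # Pick the group u minimizing f(W ∪ u) - f(u), where W is the union
        # of the groups chosen so far.
        u = min(rest, key=lambda g: f(merged | g) - f(g))
        ordering.append(u)
        rest.remove(u)
    return ordering[-2], ordering[-1]

def queyranne(f, elements):
    """Minimize a symmetric submodular function f over nontrivial subsets.

    Uses n - 1 pendent-pair computations, i.e. O(n^3) evaluations of f.
    Returns (best_value, best_subset).
    """
    groups = [frozenset([e]) for e in elements]
    candidates = []
    while len(groups) > 1:
        t, u = pendent_pair(f, groups)
        candidates.append((f(u), set(u)))  # u is a candidate minimizer side
        groups = [g for g in groups if g not in (t, u)] + [t | u]  # merge t and u
    return min(candidates, key=lambda c: c[0])

if __name__ == "__main__":
    # Toy check: the cut function of a weighted graph is symmetric and
    # submodular, so the algorithm recovers its minimum (nontrivial) cut.
    edges = {frozenset({0, 1}): 3.0, frozenset({1, 2}): 0.5,
             frozenset({2, 3}): 2.0, frozenset({0, 3}): 0.5}

    def cut(s):
        return sum(w for e, w in edges.items() if len(e & s) == 1)

    value, part = queyranne(cut, [0, 1, 2, 3])
    print(part, value)  # expect {0, 1} or {2, 3}, with cut value 1.0
```

For a submodular Φ (the mutual-information-based version), the returned bipartition is guaranteed to be the MIP; for the non-submodular measures studied in the paper, the same procedure is applied heuristically and, per the abstract, identifies the MIP almost perfectly in practice.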
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Share & Cite This Article

MDPI and ACS Style

Kitazono, J.; Kanai, R.; Oizumi, M. Efficient Algorithms for Searching the Minimum Information Partition in Integrated Information Theory. Entropy 2018, 20, 173.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.

Entropy EISSN 1099-4300, published by MDPI AG, Basel, Switzerland.