Open Access Article
Entropy 2017, 19(2), 85; doi:10.3390/e19020085

Quantifying Synergistic Information Using Intermediate Stochastic Variables

1 Computational Science Lab, University of Amsterdam, 1098 XH Amsterdam, The Netherlands
2 The Institute for Advanced Study, University of Amsterdam, Oude Turfmarkt 147, 1012 GC Amsterdam, The Netherlands
3 Advanced Computing Lab, ITMO University, Kronverkskiy pr. 49, 197101 Saint Petersburg, Russia
4 Complexity Institute, Nanyang Technological University, 639673 Singapore, Singapore
This paper is an extended version of our paper published in the Conference on Complex Systems, Amsterdam, The Netherlands, 19–22 September 2016.
* Author to whom correspondence should be addressed.
Academic Editor: J. A. Tenreiro Machado
Received: 1 November 2016 / Revised: 16 February 2017 / Accepted: 19 February 2017 / Published: 22 February 2017
(This article belongs to the Section Complexity)

Abstract

Quantifying synergy among stochastic variables is an important open problem in information theory. Information synergy occurs when multiple sources together predict an outcome variable better than the sum of single-source predictions. It is an essential phenomenon in biology, such as in neuronal networks and cellular regulatory processes, where different information flows integrate to produce a single response, but it also arises in social cooperation processes and in statistical inference tasks in machine learning. Here we propose a measure of synergistic entropy and synergistic information from first principles. The proposed measure relies on so-called synergistic random variables (SRVs), which are constructed to have zero mutual information about individual source variables but non-zero mutual information about the complete set of source variables. We prove several basic and desired properties of our measure, including bounds and additivity properties. In addition, we prove several important consequences of our measure, including the fact that different types of synergistic information may co-exist between the same sets of variables. A numerical implementation is provided, which we use to demonstrate that synergy is associated with resilience to noise. Our measure may be a marked step forward in the study of multivariate information theory and its numerous applications.
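The defining property of an SRV — zero mutual information with each individual source, non-zero mutual information with the sources jointly — can be illustrated with the textbook XOR example (this sketch is illustrative and is not the authors' numerical implementation): for two independent fair bits X1, X2 and Y = X1 XOR X2, each source alone reveals nothing about Y, yet together they determine it completely, so Y acts as an SRV for (X1, X2).

```python
import itertools
import math

def mutual_information(pairs):
    """I(A;B) in bits, estimated from a list of equiprobable (a, b) outcomes."""
    n = len(pairs)
    p_ab, p_a, p_b = {}, {}, {}
    for a, b in pairs:
        p_ab[(a, b)] = p_ab.get((a, b), 0.0) + 1.0 / n
        p_a[a] = p_a.get(a, 0.0) + 1.0 / n
        p_b[b] = p_b.get(b, 0.0) + 1.0 / n
    return sum(p * math.log2(p / (p_a[a] * p_b[b]))
               for (a, b), p in p_ab.items())

# Enumerate all four equiprobable joint outcomes (x1, x2, y) with y = x1 XOR x2.
outcomes = [(x1, x2, x1 ^ x2)
            for x1, x2 in itertools.product([0, 1], repeat=2)]

i_y_x1 = mutual_information([(y, x1) for x1, x2, y in outcomes])
i_y_x2 = mutual_information([(y, x2) for x1, x2, y in outcomes])
i_y_joint = mutual_information([(y, (x1, x2)) for x1, x2, y in outcomes])

print(i_y_x1, i_y_x2, i_y_joint)  # 0.0 0.0 1.0
```

All of the one-bit information about Y here is synergistic: I(Y;X1) = I(Y;X2) = 0 while I(Y;X1,X2) = 1 bit, which is precisely the kind of contribution the proposed measure is designed to quantify.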
Keywords: synergy; synergistic information; synergistic entropy; information theory; stochastic variables; higher order information
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Quax, R.; Har-Shemesh, O.; Sloot, P.M.A. Quantifying Synergistic Information Using Intermediate Stochastic Variables. Entropy 2017, 19, 85.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.

Entropy EISSN 1099-4300, published by MDPI AG, Basel, Switzerland