Entropy 2014, 16(4), 1985-2000; doi:10.3390/e16041985
Article

Intersection Information Based on Common Randomness

Virgil Griffith 1,*, Edwin K. P. Chong 2, Ryan G. James 3, Christopher J. Ellison 4 and James P. Crutchfield 5,6
1 Computation and Neural Systems, Caltech, Pasadena, CA 91125, USA
2 Department of Electrical & Computer Engineering, Colorado State University, Fort Collins, CO 80523, USA
3 Department of Computer Science, University of Colorado, Boulder, CO 80309, USA
4 Center for Complexity and Collective Computation, Wisconsin Institute for Discovery, University of Wisconsin-Madison, Madison, WI 53715, USA
5 Complexity Sciences Center and Physics Department, University of California Davis, Davis, CA 95616, USA
6 Santa Fe Institute, 1399 Hyde Park Rd, Santa Fe, NM 87501, USA
* Author to whom correspondence should be addressed.
Received: 25 October 2013 / Revised: 27 March 2014 / Accepted: 28 March 2014 / Published: 4 April 2014
(This article belongs to the Special Issue Entropy Methods in Guided Self-Organization)

Abstract

The introduction of the partial information decomposition generated a flurry of proposals for defining an intersection information that quantifies how much of “the same information” two or more random variables specify about a target random variable. As of yet, none is wholly satisfactory. A palatable measure of intersection information would provide a principled way to quantify slippery concepts, such as synergy. Here, we introduce an intersection information measure based on the Gács–Körner common random variable that is the first to satisfy the coveted target monotonicity property. Our measure is imperfect, too, and we suggest directions for improvement.
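For orientation, here is a minimal sketch, in LaTeX notation, of the Gács–Körner construction the abstract refers to; the symbol $X_1 \wedge X_2$ for the common random variable and the exact form of the proposed measure are assumptions that should be checked against the full text. The Gács–Körner common information of two sources $X_1$ and $X_2$ is the entropy of the most informative random variable that is a deterministic function of each source individually:

\[
  C_{\mathrm{GK}}(X_1; X_2) \;=\; \max_{Q} H(Q)
  \quad \text{subject to} \quad H(Q \mid X_1) = H(Q \mid X_2) = 0 ,
\]

with a maximizing $Q$ denoted $X_1 \wedge X_2$, the common random variable. One natural intersection information built from it, sketched here rather than quoted from the paper, is the information this shared variable carries about the target $Y$:

\[
  I_{\cap}^{\mathrm{GK}}(X_1, X_2 \rightarrow Y) \;=\; I(X_1 \wedge X_2 \,;\, Y) .
\]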
Keywords: intersection information; partial information decomposition; lattice; Gács–Körner; synergy; redundant information
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Griffith, V.; Chong, E.K.P.; James, R.G.; Ellison, C.J.; Crutchfield, J.P. Intersection Information Based on Common Randomness. Entropy 2014, 16, 1985-2000.
