Intersection Information Based on Common Randomness
Abstract
The introduction of the partial information decomposition generated a flurry of proposals for defining an intersection information that quantifies how much of “the same information” two or more random variables specify about a target random variable. As of yet, none is wholly satisfactory. A palatable measure of intersection information would provide a principled way to quantify slippery concepts, such as synergy. Here, we introduce an intersection information measure based on the Gács-Körner common random variable that is the first to satisfy the coveted target monotonicity property. Our measure is imperfect, too, and we suggest directions for improvement.
Cite This Article
Griffith, V.; Chong, E.K.P.; James, R.G.; Ellison, C.J.; Crutchfield, J.P. Intersection Information Based on Common Randomness. Entropy 2014, 16, 1985-2000.