Open Access Article

Quantifying Redundant Information in Predicting a Target Random Variable

by Virgil Griffith 1,* and Tracey Ho 2

1 School of Computing, National University of Singapore, Singapore 119077, Singapore
2 Computer Science and Electrical Engineering, Caltech, Pasadena, CA 91125, USA
* Author to whom correspondence should be addressed.
Academic Editor: Rick Quax
Entropy 2015, 17(7), 4644-4653; https://doi.org/10.3390/e17074644
Received: 18 March 2015 / Revised: 24 June 2015 / Accepted: 26 June 2015 / Published: 2 July 2015
(This article belongs to the Special Issue Information Processing in Complex Systems)
We consider the problem of defining a measure of redundant information that quantifies how much common information two or more random variables specify about a target random variable. We discuss desired properties of such a measure and propose new measures with some of these desirable properties.
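To make the object of study concrete: given predictor variables X1, X2 and a target Y, a redundancy measure should capture the information about Y that X1 and X2 carry in common. One simple candidate that appears in the partial-information literature is the minimum of the individual mutual informations, min(I(X1;Y), I(X2;Y)); the sketch below computes it for the AND gate with uniform inputs. This is only an illustrative baseline, not the measure proposed in the article, and the helper function names are ours.

```python
import math
from collections import Counter
from itertools import product

def mutual_information(pairs):
    """I(A;B) in bits, estimated from equally weighted (a, b) samples."""
    n = len(pairs)
    p_ab = Counter(pairs)
    p_a = Counter(a for a, _ in pairs)
    p_b = Counter(b for _, b in pairs)
    mi = 0.0
    for (a, b), count in p_ab.items():
        p = count / n
        mi += p * math.log2(p / ((p_a[a] / n) * (p_b[b] / n)))
    return mi

# Example target: Y = X1 AND X2, with (X1, X2) uniform over {0,1}^2.
samples = [(x1, x2, x1 & x2) for x1, x2 in product([0, 1], repeat=2)]

i1 = mutual_information([(x1, y) for x1, _, y in samples])  # I(X1;Y)
i2 = mutual_information([(x2, y) for _, x2, y in samples])  # I(X2;Y)

# Naive redundancy baseline: each input alone gives ~0.311 bits about Y,
# so min(I1, I2) ~ 0.311 bits -- even though X1 and X2 are independent,
# which is one reason defining redundancy properly is subtle.
redundancy_baseline = min(i1, i2)
```

The AND gate is a standard stress test for such measures: by symmetry I(X1;Y) = I(X2;Y), yet how much of that information is genuinely "the same" information is exactly the question a redundancy measure must settle.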
Keywords: synergy; information theory; complex systems; irreducibility; synergistic information; intersection-information
MDPI and ACS Style

Griffith, V.; Ho, T. Quantifying Redundant Information in Predicting a Target Random Variable. Entropy 2015, 17, 4644-4653.
