Article

Information-Theoretic Inference of Common Ancestors

by Bastian Steudel 1 and Nihat Ay 1,2,3,*
1 Max Planck Institute for Mathematics in the Sciences, Inselstraße 22, 04103 Leipzig, Germany
2 Faculty of Mathematics and Computer Science, University of Leipzig, PF 100920, 04009 Leipzig, Germany
3 Santa Fe Institute, 1399 Hyde Park Road, Santa Fe, NM 87501, USA
* Author to whom correspondence should be addressed.
Academic Editor: Rick Quax
Entropy 2015, 17(4), 2304-2327; https://doi.org/10.3390/e17042304
Received: 12 February 2015 / Revised: 29 March 2015 / Accepted: 1 April 2015 / Published: 16 April 2015
(This article belongs to the Special Issue Information Processing in Complex Systems)
A directed acyclic graph (DAG) partially represents the conditional independence structure among observations of a system if the local Markov condition holds, that is if every variable is independent of its non-descendants given its parents. In general, there is a whole class of DAGs that represents a given set of conditional independence relations. We are interested in properties of this class that can be derived from observations of a subsystem only. To this end, we prove an information-theoretic inequality that allows for the inference of common ancestors of observed parts in any DAG representing some unknown larger system. More explicitly, we show that a large amount of dependence in terms of mutual information among the observations implies the existence of a common ancestor that distributes this information. Within the causal interpretation of DAGs, our result can be seen as a quantitative extension of Reichenbach’s principle of common cause to more than two variables. Our conclusions are also valid for non-probabilistic observations, such as binary strings, since we state the proof for an axiomatized notion of “mutual information” that includes the stochastic as well as the algorithmic version.
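The abstract's central claim connects the mutual information among observed variables to a hidden common ancestor. As a toy illustration only (this is not the paper's inequality), the sketch below computes the exact mutual information between two binary observations that both copy a hidden ancestor Z through independent noisy channels; the flip probability `eps` and the helper names are assumptions of this sketch:

```python
import math
from itertools import product

def mutual_information(joint):
    """Exact mutual information (in bits) from a joint pmf {(x, y): p}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    mi = 0.0
    for (x, y), p in joint.items():
        if p > 0:
            mi += p * math.log2(p / (px[x] * py[y]))
    return mi

def common_cause_joint(eps):
    """Joint pmf of (X1, X2): hidden ancestor Z ~ Bernoulli(1/2);
    each Xi copies Z, flipped independently with probability eps."""
    joint = {}
    for z in (0, 1):
        for x, y in product((0, 1), repeat=2):
            p1 = (1 - eps) if x == z else eps
            p2 = (1 - eps) if y == z else eps
            joint[(x, y)] = joint.get((x, y), 0.0) + 0.5 * p1 * p2
    return joint

print(mutual_information(common_cause_joint(0.0)))  # perfect copies of Z: 1.0 bit
print(mutual_information(common_cause_joint(0.5)))  # pure noise, Z invisible: 0.0 bits
```

A strong common ancestor yields high mutual information among its observed descendants; the paper proves a converse-style quantitative statement for arbitrarily many observed variables in any DAG.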
Keywords: information theory; common cause principle; directed acyclic graphs; Bayesian nets; causality; mutual information; Kolmogorov complexity

MDPI and ACS Style

Steudel, B.; Ay, N. Information-Theoretic Inference of Common Ancestors. Entropy 2015, 17, 2304-2327. https://doi.org/10.3390/e17042304

AMA Style

Steudel B, Ay N. Information-Theoretic Inference of Common Ancestors. Entropy. 2015; 17(4):2304-2327. https://doi.org/10.3390/e17042304

Chicago/Turabian Style

Steudel, Bastian, and Nihat Ay. 2015. "Information-Theoretic Inference of Common Ancestors." Entropy 17, no. 4: 2304-2327. https://doi.org/10.3390/e17042304

