Open Access Article
Entropy 2015, 17(4), 2304-2327; doi:10.3390/e17042304

Information-Theoretic Inference of Common Ancestors

Bastian Steudel 1 and Nihat Ay 1,2,3,*
1 Max Planck Institute for Mathematics in the Sciences, Inselstraße 22, 04103 Leipzig, Germany
2 Faculty of Mathematics and Computer Science, University of Leipzig, PF 100920, 04009 Leipzig, Germany
3 Santa Fe Institute, 1399 Hyde Park Road, Santa Fe, NM 87501, USA
* Author to whom correspondence should be addressed.
Academic Editor: Rick Quax
Received: 12 February 2015 / Revised: 29 March 2015 / Accepted: 1 April 2015 / Published: 16 April 2015
(This article belongs to the Special Issue Information Processing in Complex Systems)

Abstract

A directed acyclic graph (DAG) partially represents the conditional independence structure among observations of a system if the local Markov condition holds, that is, if every variable is independent of its non-descendants given its parents. In general, there is a whole class of DAGs that represents a given set of conditional independence relations. We are interested in properties of this class that can be derived from observations of a subsystem only. To this end, we prove an information-theoretic inequality that allows for the inference of common ancestors of observed parts in any DAG representing some unknown larger system. More explicitly, we show that a large amount of dependence, in terms of mutual information, among the observations implies the existence of a common ancestor that distributes this information. Within the causal interpretation of DAGs, our result can be seen as a quantitative extension of Reichenbach’s principle of common cause to more than two variables. Our conclusions are also valid for non-probabilistic observations, such as binary strings, since we state the proof for an axiomatized notion of “mutual information” that includes the stochastic as well as the algorithmic version.
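As a toy illustration of the abstract’s central claim (a minimal sketch, not code from the paper: the binary model, the flip probability eps, and the helper mutual_information are illustrative choices), the following Python snippet computes I(X;Y) exactly for two noisy copies X and Y of a hidden common ancestor Z, and shows that the dependence disappears once the shared ancestor is removed:

```python
import numpy as np

def mutual_information(pxy):
    """Mutual information I(X;Y) in bits, computed from a joint distribution table."""
    px = pxy.sum(axis=1, keepdims=True)   # marginal of X, shape (2, 1)
    py = pxy.sum(axis=0, keepdims=True)   # marginal of Y, shape (1, 2)
    mask = pxy > 0
    return float(np.sum(pxy[mask] * np.log2(pxy[mask] / (px @ py)[mask])))

# Hidden common ancestor Z ~ Bernoulli(0.5); X and Y are conditionally
# independent noisy copies of Z, each flipped with probability eps.
eps = 0.1
pxy = np.zeros((2, 2))
for z in (0, 1):
    for x in (0, 1):
        for y in (0, 1):
            px_given_z = 1 - eps if x == z else eps
            py_given_z = 1 - eps if y == z else eps
            pxy[x, y] += 0.5 * px_given_z * py_given_z

print(f"I(X;Y) with common ancestor Z: {mutual_information(pxy):.3f} bits")

# Without the shared ancestor, X and Y are sampled independently with the
# same marginals, and the mutual information drops to zero.
p_indep = np.outer(pxy.sum(axis=1), pxy.sum(axis=0))
print(f"I(X;Y) without it:             {mutual_information(p_indep):.3f} bits")
```

With eps = 0.1 the first value comes out to roughly 0.32 bits while the second is exactly zero; the paper’s inequality turns this qualitative picture into a quantitative criterion for arbitrarily many observed variables in an unknown DAG.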
Keywords: information theory; common cause principle; directed acyclic graphs; Bayesian nets; causality; mutual information; Kolmogorov complexity
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Steudel, B.; Ay, N. Information-Theoretic Inference of Common Ancestors. Entropy 2015, 17, 2304-2327.
