Open Access Article
Entropy 2014, 16(5), 2839-2849;

Exact Test of Independence Using Mutual Information

Army RDECOM, RDMR-WDS-WO, Redstone Arsenal, AL 35898, USA
Torch Technologies, Inc., Huntsville, AL 35802, USA
Author to whom correspondence should be addressed.
Received: 18 February 2014 / Revised: 15 May 2014 / Accepted: 20 May 2014 / Published: 23 May 2014
(This article belongs to the Special Issue Information in Dynamical Systems and Complex Systems)


Using a recently discovered method for producing random symbol sequences with prescribed transition counts, we present an exact null hypothesis significance test (NHST) for mutual information between two random variables, where the null hypothesis is that the mutual information is zero (i.e., that the variables are independent). The exact tests reported in the literature assume that the data samples for each variable are sequentially independent and identically distributed (iid). In general, time series data have dependencies (Markov structure) that violate this assumption. The algorithm given in this paper is the first exact significance test of mutual information that takes the Markov structure into account. When the Markov order is not known or is indefinite, an exact test is used to determine an effective Markov order.
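The paper's exact test draws surrogate sequences that preserve the observed Markov transition counts; that algorithm is given in the full text. As a minimal sketch of the general surrogate-testing idea only, the snippet below implements the simpler iid case the abstract contrasts against: a plug-in mutual-information estimate and a shuffle-based significance test. All function names are illustrative, not taken from the paper.

```python
import math
import random
from collections import Counter

def mutual_information(x, y):
    """Plug-in estimate of I(X;Y) in bits from two equal-length symbol sequences."""
    n = len(x)
    px, py, pxy = Counter(x), Counter(y), Counter(zip(x, y))
    mi = 0.0
    for (a, b), c in pxy.items():
        p_ab = c / n
        # p_ab * log2( p_ab / (p_a * p_b) ), with the marginals expanded
        mi += p_ab * math.log2(p_ab * n * n / (px[a] * py[b]))
    return mi

def permutation_test(x, y, n_surrogates=1000, seed=0):
    """p-value for H0: I(X;Y) = 0, using iid shuffles of y as surrogates.

    Note: shuffling destroys any Markov structure, which is exactly the
    limitation the paper's transition-count-preserving surrogates remove.
    """
    rng = random.Random(seed)
    observed = mutual_information(x, y)
    y = list(y)
    count = 0
    for _ in range(n_surrogates):
        rng.shuffle(y)
        if mutual_information(x, y) >= observed:
            count += 1
    # add-one correction keeps the estimate valid and conservative
    return observed, (count + 1) / (n_surrogates + 1)
```

For strongly dependent sequences (e.g., `y` a copy of `x`) the returned p-value is near its minimum of `1/(n_surrogates + 1)`, while for independent sequences it is large, since the plug-in estimate is nonnegative even under the null.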
Keywords: mutual information; significance test; surrogate data
This is an open access article distributed under the Creative Commons Attribution License (CC BY 3.0).

Cite This Article

MDPI and ACS Style

Pethel, S.D.; Hahs, D.W. Exact Test of Independence Using Mutual Information. Entropy 2014, 16, 2839-2849.

Entropy EISSN 1099-4300, published by MDPI AG, Basel, Switzerland