Entropy 2014, 16(5), 2839-2849; doi:10.3390/e16052839
Article

Exact Test of Independence Using Mutual Information

S. D. Pethel 1,* and D. W. Hahs 2
1 Army RDECOM, RDMR-WDS-WO, Redstone Arsenal, AL 35898, USA
2 Torch Technologies, Inc., Huntsville, AL 35802, USA
* Author to whom correspondence should be addressed.
Received: 18 February 2014 / Revised: 15 May 2014 / Accepted: 20 May 2014 / Published: 23 May 2014
(This article belongs to the Special Issue Information in Dynamical Systems and Complex Systems)

Abstract

Using a recently discovered method for producing random symbol sequences with prescribed transition counts, we present an exact null hypothesis significance test (NHST) for mutual information between two random variables, the null hypothesis being that the mutual information is zero (i.e., independence). The exact tests reported in the literature assume that data samples for each variable are sequentially independent and identically distributed (iid). In general, time series data have dependencies (Markov structure) that violate this condition. The algorithm given in this paper is the first exact significance test of mutual information that takes into account the Markov structure. When the Markov order is not known or indefinite, an exact test is used to determine an effective Markov order.
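
For illustration, the following is a minimal sketch (not the authors' algorithm) of a surrogate-based significance test for mutual information: it computes the plug-in mutual information of two symbol sequences and derives a Monte Carlo p-value from random permutations of one sequence. The iid permutation step is precisely the assumption the paper removes; the exact test described above instead draws surrogates uniformly from the set of sequences sharing the observed Markov transition counts. The function names (mutual_information, mi_significance) and all parameter choices are illustrative only.

# Minimal sketch, assuming a permutation-based null (NOT the paper's
# transition-count-preserving surrogates).
import numpy as np

def mutual_information(x, y):
    # Plug-in (maximum-likelihood) mutual information, in bits,
    # of two equal-length symbol sequences.
    x = np.asarray(x)
    y = np.asarray(y)
    _, xi = np.unique(x, return_inverse=True)
    _, yi = np.unique(y, return_inverse=True)
    joint = np.zeros((xi.max() + 1, yi.max() + 1))
    np.add.at(joint, (xi, yi), 1)          # joint symbol counts
    joint /= joint.sum()                   # joint probabilities
    px = joint.sum(axis=1, keepdims=True)  # marginal of x
    py = joint.sum(axis=0, keepdims=True)  # marginal of y
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (px @ py)[nz])))

def mi_significance(x, y, num_surrogates=1000, rng=None):
    # Monte Carlo p-value for H0: I(X;Y) = 0, using iid permutation
    # surrogates of x (valid only if the samples are sequentially iid).
    rng = np.random.default_rng(rng)
    observed = mutual_information(x, y)
    exceed = 0
    for _ in range(num_surrogates):
        x_surr = rng.permutation(x)
        if mutual_information(x_surr, y) >= observed:
            exceed += 1
    # Add-one correction keeps the estimated p-value strictly positive.
    return observed, (exceed + 1) / (num_surrogates + 1)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.integers(0, 2, 500)
    noise = (rng.random(500) < 0.1).astype(int)
    y = x ^ noise  # y agrees with x ~90% of the time, so H0 should be rejected
    mi, p = mi_significance(x, y, rng=1)
    print(f"I(X;Y) = {mi:.3f} bits, p = {p:.3f}")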
Keywords: mutual information; significance test; surrogate data
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

MDPI and ACS Style

Pethel, S.D.; Hahs, D.W. Exact Test of Independence Using Mutual Information. Entropy 2014, 16, 2839-2849.
