Entropy 2014, 16(5), 2839-2849; doi:10.3390/e16052839
Article

Exact Test of Independence Using Mutual Information

Shawn D. Pethel 1,* and Daniel W. Hahs 2
Received: 18 February 2014; in revised form: 15 May 2014 / Accepted: 20 May 2014 / Published: 23 May 2014
(This article belongs to the Special Issue Information in Dynamical Systems and Complex Systems)
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Abstract: Using a recently discovered method for producing random symbol sequences with prescribed transition counts, we present an exact null hypothesis significance test (NHST) for mutual information between two random variables, the null hypothesis being that the mutual information is zero (i.e., independence). The exact tests reported in the literature assume that the data samples for each variable are sequentially independent and identically distributed (iid). In general, time series data have dependencies (Markov structure) that violate this condition. The algorithm given in this paper is the first exact significance test of mutual information that takes Markov structure into account. When the Markov order is unknown or indefinite, an exact test is used to determine an effective Markov order.
Keywords: mutual information; significance test; surrogate data
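The paper's exact test draws surrogate sequences uniformly from the set of sequences sharing the observed transition counts, thereby preserving Markov structure under the null. As a rough point of reference only, the sketch below shows the conventional Monte Carlo permutation test of independence via mutual information, which is valid only under the iid assumption the abstract critiques; it is not the authors' algorithm, and all function names are hypothetical.

```python
import numpy as np

def mutual_information(x, y):
    """Plug-in estimate of mutual information (in bits) between two
    discrete symbol sequences of equal length."""
    x, y = np.asarray(x), np.asarray(y)
    mi = 0.0
    for xv in np.unique(x):
        for yv in np.unique(y):
            pxy = np.mean((x == xv) & (y == yv))  # joint probability
            if pxy > 0:
                px = np.mean(x == xv)             # marginal of x
                py = np.mean(y == yv)             # marginal of y
                mi += pxy * np.log2(pxy / (px * py))
    return mi

def permutation_test_mi(x, y, num_surrogates=1000, seed=None):
    """Monte Carlo permutation test of independence using mutual
    information as the test statistic.

    NOTE: random shuffling is a valid null model only for iid data
    (Markov order 0). The paper's exact test instead samples surrogates
    uniformly from all sequences with the same transition counts as x,
    which preserves higher-order Markov structure.
    """
    rng = np.random.default_rng(seed)
    observed = mutual_information(x, y)
    count = sum(
        mutual_information(rng.permutation(x), y) >= observed
        for _ in range(num_surrogates)
    )
    # Include the observed statistic itself so the p-value is never zero.
    p_value = (count + 1) / (num_surrogates + 1)
    return observed, p_value
```

For iid data this shuffle-based test is standard; the contribution of the paper is an exact analogue for Markov-dependent data, where naive shuffling destroys the very dependencies the null hypothesis must preserve.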

