Exact Test of Independence Using Mutual Information
Abstract: Using a recently discovered method for producing random symbol sequences with prescribed transition counts, we present an exact null hypothesis significance test (NHST) for mutual information between two random variables, the null hypothesis being that the mutual information is zero (i.e., independence). The exact tests reported in the literature assume that data samples for each variable are sequentially independent and identically distributed (iid). In general, time series data have dependencies (Markov structure) that violate this condition. The algorithm given in this paper is the first exact significance test of mutual information that takes into account the Markov structure. When the Markov order is unknown or indefinite, an exact test is used to determine an effective Markov order.
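The iid-assuming exact tests that the abstract contrasts against are typically permutation (shuffle) tests. As a minimal sketch of that baseline — not the authors' transition-count-preserving algorithm, and with function names of our own choosing — the following Python computes a plug-in mutual information estimate from symbol counts and a shuffle-based p-value:

```python
import random
from collections import Counter
from math import log

def mutual_information(x, y):
    """Plug-in estimate of I(X;Y) in nats from two equal-length symbol sequences."""
    n = len(x)
    px = Counter(x)
    py = Counter(y)
    pxy = Counter(zip(x, y))
    mi = 0.0
    for (a, b), c in pxy.items():
        # c/n is the joint frequency; compare it against the product of marginals
        mi += (c / n) * log((c * n) / (px[a] * py[b]))
    return mi

def permutation_test(x, y, num_shuffles=1000, seed=0):
    """Shuffle-based NHST for independence. Valid only for iid samples:
    shuffling destroys any serial (Markov) structure in y, which is the
    limitation the paper's prescribed-transition-count surrogates address."""
    rng = random.Random(seed)
    observed = mutual_information(x, y)
    y = list(y)  # work on a copy so the caller's data is untouched
    count = 0
    for _ in range(num_shuffles):
        rng.shuffle(y)  # breaks any X-Y association under the null
        if mutual_information(x, y) >= observed:
            count += 1
    # add-one correction keeps the estimated p-value away from exactly zero
    return observed, (count + 1) / (num_shuffles + 1)
```

Because the shuffle destroys within-sequence dependence as well as cross-sequence association, applying this test to Markov data inflates the false-positive rate; the surrogate sequences described in the abstract preserve transition counts precisely to avoid that.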
Pethel, Shawn D.; Hahs, Daniel W. "Exact Test of Independence Using Mutual Information." Entropy 2014, 16(5), 2839-2849.