Uncovering Discrete Non-Linear Dependence with Information Theory
Abstract: In this paper, we model discrete time series as discrete Markov processes of arbitrary order and derive the approximate distribution of the Kullback-Leibler divergence between a known transition probability matrix and its sample estimate. We introduce two new information-theoretic measures: information memory loss and information codependence structure. The former measures the memory content within a Markov process and determines its optimal order. The latter assesses the codependence among Markov processes. Both measures are evaluated on toy examples and applied to high-frequency foreign exchange data, focusing on the 2008 financial crisis and the 2010/2011 Euro crisis.
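The core quantity in the abstract, the Kullback-Leibler divergence between a known transition matrix and its sample estimate, can be illustrated with a minimal sketch. The transition matrix `P`, the chain length, and the weighting of rows by empirical state frequencies below are illustrative assumptions, not the paper's exact construction:

```python
import numpy as np

rng = np.random.default_rng(0)

# Known first-order transition matrix of a 2-state Markov chain (illustrative values).
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

def simulate(P, n, rng):
    """Simulate n steps of a Markov chain with transition matrix P, starting in state 0."""
    states = np.empty(n, dtype=int)
    states[0] = 0
    for t in range(1, n):
        states[t] = rng.choice(len(P), p=P[states[t - 1]])
    return states

def empirical_transition_matrix(states, k):
    """Estimate the k x k transition matrix from observed one-step transition counts."""
    counts = np.zeros((k, k))
    for a, b in zip(states[:-1], states[1:]):
        counts[a, b] += 1
    return counts / counts.sum(axis=1, keepdims=True)

def kl_divergence(P, Q, weights):
    """Weighted sum of row-wise KL divergences D(P_i || Q_i), weights summing to 1."""
    mask = P > 0
    rows = np.where(mask, P * np.log(P / np.where(mask, Q, 1.0)), 0.0).sum(axis=1)
    return float(weights @ rows)

states = simulate(P, 100_000, rng)
P_hat = empirical_transition_matrix(states, 2)
freq = np.bincount(states, minlength=2) / len(states)  # empirical state frequencies
d = kl_divergence(P, P_hat, freq)
print(d)  # non-negative, shrinking toward 0 as the sample grows
```

As the sample size grows, `P_hat` converges to `P` and the divergence concentrates near zero; the paper's contribution is the approximate distribution of this sample divergence, which the sketch above only estimates pointwise.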
Golub, A.; Chliamovitch, G.; Dupuis, A.; Chopard, B. Uncovering Discrete Non-Linear Dependence with Information Theory. Entropy 2015, 17, 2606-2623.