Open Access Article
Entropy 2015, 17(5), 2606-2623; doi:10.3390/e17052606

Uncovering Discrete Non-Linear Dependence with Information Theory

1 Olsen Ltd, Eierbrechtstrasse 50, 8053 Zürich, Switzerland
2 Computer Science Department, University of Geneva, rte de Drize 7, 1227 Carouge, Switzerland
* Author to whom correspondence should be addressed.
Academic Editor: Rick Quax
Received: 27 February 2015 / Revised: 21 April 2015 / Accepted: 22 April 2015 / Published: 23 April 2015
(This article belongs to the Special Issue Information Processing in Complex Systems)

Abstract

In this paper, we model discrete time series as discrete Markov processes of arbitrary order and derive the approximate distribution of the Kullback-Leibler divergence between a known transition probability matrix and its sample estimate. We introduce two new information-theoretic measurements: information memory loss and information codependence structure. The former measures the memory content within a Markov process and determines its optimal order. The latter assesses the codependence among Markov processes. Both measurements are evaluated on toy examples and applied to high-frequency foreign exchange data, with a focus on the 2008 financial crisis and the 2010/2011 Euro crisis.
Keywords: Markov process; Kullback-Leibler divergence; information theory
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
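A minimal sketch of the basic quantity the abstract refers to, assuming a first-order binary Markov chain: it estimates a transition probability matrix from a simulated symbol sequence and computes the Kullback-Leibler divergence between the known matrix and its sample estimate. The function names, the toy transition matrix and the stationary weighting are illustrative assumptions, not taken from the paper, and the paper's approximate distribution of this divergence is not reproduced here.

    import numpy as np

    def estimate_transition_matrix(seq, n_states):
        # Empirical first-order transition probabilities from a symbol sequence.
        counts = np.zeros((n_states, n_states))
        for a, b in zip(seq[:-1], seq[1:]):
            counts[a, b] += 1
        row_sums = counts.sum(axis=1, keepdims=True)
        # Rows that were never visited fall back to the uniform distribution.
        return np.divide(counts, row_sums,
                         out=np.full_like(counts, 1.0 / n_states),
                         where=row_sums > 0)

    def kl_divergence(p, q, weights):
        # Row-wise KL divergences D(p_i || q_i), weighted by the chain's
        # stationary distribution (an illustrative weighting choice).
        ratio = np.where(p > 0, p / q, 1.0)  # log(1) = 0 where p_ij = 0
        rows = (p * np.log(ratio)).sum(axis=1)
        return float(np.dot(weights, rows))

    # Known two-state transition matrix and a sample path simulated from it.
    P = np.array([[0.9, 0.1],
                  [0.2, 0.8]])
    rng = np.random.default_rng(0)
    seq = [0]
    for _ in range(5000):
        seq.append(rng.choice(2, p=P[seq[-1]]))

    P_hat = estimate_transition_matrix(seq, n_states=2)
    stationary = np.array([2.0 / 3.0, 1.0 / 3.0])  # stationary distribution of P
    print("Sample estimate of the transition matrix:")
    print(P_hat)
    print("KL(P || P_hat):", kl_divergence(P, P_hat, stationary))

With 5000 observations the divergence is close to zero; shortening the sample path or increasing the order of the chain makes the estimation error, and hence the divergence, grow, which is the effect the paper quantifies analytically.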

Share & Cite This Article

MDPI and ACS Style

Golub, A.; Chliamovitch, G.; Dupuis, A.; Chopard, B. Uncovering Discrete Non-Linear Dependence with Information Theory. Entropy 2015, 17, 2606-2623.

