Open Access Article

Learning from Complex Systems: On the Roles of Entropy and Fisher Information in Pairwise Isotropic Gaussian Markov Random Fields

Computing Department, Federal University of São Carlos, Rod. Washington Luiz, km 235, São Carlos, SP, Brazil
Entropy 2014, 16(2), 1002-1036; https://doi.org/10.3390/e16021002
Received: 4 December 2013 / Accepted: 30 January 2014 / Published: 18 February 2014
(This article belongs to the Special Issue Information Geometry)
Markov random field models are powerful tools for the study of complex systems. However, little is known about how the interactions between the elements of such systems are encoded, especially from an information-theoretic perspective. In this paper, our goal is to elucidate the connection between Fisher information, Shannon entropy, information geometry and the behavior of complex systems modeled by isotropic pairwise Gaussian Markov random fields. We propose analytical expressions to compute local and global versions of these measures using Besag's pseudo-likelihood function, characterizing the system's behavior through its Fisher curve, a parametric trajectory across the information space that provides a geometric representation for the study of complex systems in which temperature deviates from infinity. Computational experiments show how the proposed tools can be useful in extracting relevant information from complex patterns. The obtained results quantify and support our main conclusion: in terms of information, moving towards higher entropy states (A → B) is different from moving towards lower entropy states (B → A), since the Fisher curves are not the same, given a natural orientation (the direction of time).
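To make Besag's pseudo-likelihood concrete, the following is a minimal sketch (not the paper's implementation) assuming the standard local conditional of an isotropic pairwise Gaussian Markov random field: each site is Gaussian with mean μ + β Σ_{j∈η_i}(x_j − μ) over its 4-neighborhood and variance σ². The function names, the grid shape, and the 4-neighborhood choice are illustrative assumptions.

```python
import numpy as np

def local_conditional_mean(field, i, j, mu, beta):
    """Mean of the Gaussian local conditional at site (i, j):
    mu + beta * sum over 4-neighbors of (x_k - mu).
    (Assumed standard isotropic pairwise GMRF form.)"""
    H, W = field.shape
    m = mu
    for di, dj in [(-1, 0), (1, 0), (0, -1), (0, 1)]:
        ni, nj = i + di, j + dj
        if 0 <= ni < H and 0 <= nj < W:
            m += beta * (field[ni, nj] - mu)
    return m

def log_pseudo_likelihood(field, mu, sigma2, beta):
    """Besag's log-pseudo-likelihood: the sum of the log local
    conditional densities over all sites (replaces the intractable
    joint likelihood of the random field)."""
    H, W = field.shape
    total = 0.0
    for i in range(H):
        for j in range(W):
            m = local_conditional_mean(field, i, j, mu, beta)
            total += (-0.5 * np.log(2 * np.pi * sigma2)
                      - (field[i, j] - m) ** 2 / (2 * sigma2))
    return total

# Usage sketch: with beta = 0 the sites decouple, so the
# pseudo-likelihood reduces to the ordinary i.i.d. Gaussian
# log-likelihood -- a quick sanity check.
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, (8, 8))
lp = log_pseudo_likelihood(x, mu=0.0, sigma2=1.0, beta=0.0)
iid = float(np.sum(-0.5 * np.log(2 * np.pi) - x ** 2 / 2))
print(abs(lp - iid) < 1e-9)
```

Maximizing this function over (μ, σ², β) is maximum pseudo-likelihood estimation, the estimator the paper's analytical expressions for entropy and Fisher information are built on.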
Keywords: Markov random fields; information theory; Fisher information; entropy; maximum pseudo-likelihood estimation
MDPI and ACS Style

Levada, A. Learning from Complex Systems: On the Roles of Entropy and Fisher Information in Pairwise Isotropic Gaussian Markov Random Fields. Entropy 2014, 16, 1002-1036. https://doi.org/10.3390/e16021002
