Fisher Information Properties
Abstract: A set of Fisher information properties is presented in order to draw a parallel with similar properties of Shannon differential entropy. Already-known properties are presented together with new ones, which include: (i) a generalization of mutual information for Fisher information; (ii) a new proof that Fisher information increases under conditioning; (iii) a proof that Fisher information decreases in Markov chains; and (iv) an upper bound on the estimation error in terms of Fisher information. This last result is especially important because it complements Fano's inequality, which gives a lower bound for the estimation error, by showing that Fisher information can be used to define an upper bound for this error. In this way, it is shown that Shannon differential entropy, which quantifies the behavior of the random variable, and Fisher information, which quantifies the internal structure of the density function that defines the random variable, can together be used to characterize the estimation error.
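The link between Fisher information and estimation error described above has a classical, concrete instance in the Cramér–Rao inequality, which lower-bounds the variance of any unbiased estimator by the reciprocal of the Fisher information. The sketch below is a minimal numerical illustration of that classical fact, not a reproduction of the paper's new bound; it assumes Gaussian samples with known variance, for which the Fisher information about the mean is 1/sigma^2 and the sample mean attains the bound.

```python
import numpy as np

# Illustration (assumed setup, not the paper's result): X ~ N(theta, sigma^2).
# Fisher information about the mean theta is I(theta) = 1/sigma^2, and the
# Cramér–Rao inequality states Var(theta_hat) >= 1/(n * I(theta)) for any
# unbiased estimator based on n i.i.d. samples. The sample mean is efficient,
# i.e., it attains this bound.
rng = np.random.default_rng(0)
sigma, n, trials = 2.0, 50, 20000

fisher_info = 1.0 / sigma**2        # I(theta) for a Gaussian mean
cr_bound = 1.0 / (n * fisher_info)  # lower bound on estimator variance

samples = rng.normal(loc=0.0, scale=sigma, size=(trials, n))
est_var = samples.mean(axis=1).var()  # empirical variance of the sample mean

print(f"Cramér–Rao bound:     {cr_bound:.4f}")
print(f"Sample-mean variance: {est_var:.4f}")
```

Over many trials the empirical variance of the sample mean matches the bound sigma^2/n closely, showing how a larger Fisher information (smaller sigma) translates directly into a smaller achievable estimation error.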
Zegers, P. Fisher Information Properties. Entropy 2015, 17, 4918-4939.