Entropy 2014, 16(7), 3670-3688; doi:10.3390/e16073670
Article

Extending the Extreme Physical Information to Universal Cognitive Models via a Confident Information First Principle

by Xiaozhao Zhao 1, Yuexian Hou 1,2,*, Dawei Song 1,3 and Wenjie Li 2
Received: 25 March 2014 / Revised: 6 June 2014 / Accepted: 20 June 2014 / Published: 1 July 2014
(This article belongs to the Special Issue Information Geometry)
Abstract: The principle of extreme physical information (EPI) can be used to derive many known laws and distributions in theoretical physics by extremizing the physical information loss K, i.e., the difference between the observed Fisher information I and the intrinsic information bound J of the physical phenomenon being measured. However, for complex cognitive systems of high dimensionality (e.g., human language processing and image recognition), the information bound J can be excessively large relative to I (J ≫ I) owing to insufficient observations, which leads to serious over-fitting in the derivation of cognitive models. Moreover, no exact invariance principle has been established that gives rise to the bound information in universal cognitive systems, which limits the direct application of EPI. To narrow the gap between I and J, we propose a confident-information-first (CIF) principle that lowers the information bound J by preserving confident parameters and ruling out unreliable or noisy parameters of the probability density function being measured. The confidence of each parameter is assessed by its contribution to the expected Fisher information distance between the physical phenomenon and its observations. Furthermore, given a specific parametric representation, this contribution can often be assessed directly by the Fisher information, which is connected to the inverse variance of any unbiased estimator of the parameter via the Cramér–Rao bound. We then consider dimensionality reduction in the parameter spaces of binary multivariate distributions, and show that the single-layer Boltzmann machine without hidden units (SBM) can be derived using the CIF principle. An illustrative experiment shows how the CIF principle improves density estimation performance.
Keywords: information geometry; Boltzmann machine; Fisher information; parametric reduction
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

MDPI and ACS Style

Zhao, X.; Hou, Y.; Song, D.; Li, W. Extending the Extreme Physical Information to Universal Cognitive Models via a Confident Information First Principle. Entropy 2014, 16, 3670-3688.

Entropy EISSN 1099-4300, published by MDPI AG, Basel, Switzerland.