Open Access Article
Entropy 2015, 17(6), 3838-3856; doi:10.3390/e17063838

The Fisher Information as a Neural Guiding Principle for Independent Component Analysis

Rodrigo Echeveste, Samuel Eckmann and Claudius Gros
Institute for Theoretical Physics, Goethe University Frankfurt, 60438 Frankfurt, Germany
* Author to whom correspondence should be addressed.
Received: 27 February 2015 / Revised: 28 May 2015 / Accepted: 5 June 2015 / Published: 9 June 2015
(This article belongs to the Special Issue Information Theoretic Incentives for Cognitive Systems)

Abstract

The Fisher information constitutes a natural measure for the sensitivity of a probability distribution with respect to a set of parameters. An implementation of the stationarity principle for synaptic learning in terms of the Fisher information results in a Hebbian self-limiting learning rule for synaptic plasticity. In the present work, we study the dependence of the solutions to this rule on the moments of the input probability distribution and find a preference for non-Gaussian directions, making it a suitable candidate for independent component analysis (ICA). We confirm in a numerical experiment that a neuron trained under these rules is able to find the independent components in the non-linear bars problem. The specific form of the plasticity rule depends on the transfer function used, becoming a simple cubic polynomial of the membrane potential for the case of the rescaled error function. The cubic learning rule is also an excellent approximation for other transfer functions, such as the standard sigmoidal, and can be used to show analytically that the proposed plasticity rules are selective for directions in the space of presynaptic neural activities characterized by a negative excess kurtosis.
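A note on the quantities involved: for a density p(y; θ), the Fisher information is F(θ) = E[(∂θ ln p(y; θ))²], the expected squared sensitivity of the log-likelihood to the parameters. To make the abstract's cubic rule concrete, the sketch below implements a generic cubic self-limiting Hebbian update, Δw ∝ (x − x³) y with membrane potential x = w·y; this toy form, its constants, and the input statistics are illustrative assumptions, not the rule derived in the paper. It checks that the weight vector aligns with a sub-Gaussian (negative excess kurtosis) input direction, the selectivity the abstract describes.

import numpy as np

# Illustrative sketch only: a generic cubic self-limiting Hebbian update,
# dw = eta * (x - x^3) * y, standing in for the paper's rule; the constants
# and the toy input statistics are assumptions made for this demo.

rng = np.random.default_rng(0)
n_inputs = 16
eta = 1e-3                                   # learning rate
w = rng.normal(scale=0.1, size=n_inputs)     # synaptic weights

# One sub-Gaussian source (unit-variance uniform, excess kurtosis -6/5)
# embedded in weak Gaussian background noise.
direction = rng.normal(size=n_inputs)
direction /= np.linalg.norm(direction)

for _ in range(50_000):
    s = rng.uniform(-np.sqrt(3.0), np.sqrt(3.0))        # uniform source, variance 1
    y = s * direction + 0.1 * rng.normal(size=n_inputs) # presynaptic activities
    x = w @ y                                            # membrane potential
    w += eta * (x - x**3) * y                            # cubic self-limiting Hebbian step

alignment = abs(w @ direction) / np.linalg.norm(w)
print(f"alignment with the sub-Gaussian direction: {alignment:.3f}")

On average this toy update performs gradient ascent on E[x²]/2 − E[x⁴]/4, so among directions of equal variance it penalizes large fourth moments and therefore favors negative excess kurtosis, consistent with the analytical selectivity stated in the abstract.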
Keywords: Fisher information; guiding principle; excess kurtosis; objective functions; synaptic plasticity; Hebbian learning; independent component analysis

This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Echeveste, R.; Eckmann, S.; Gros, C. The Fisher Information as a Neural Guiding Principle for Independent Component Analysis. Entropy 2015, 17, 3838-3856.

