Open Access Article
Entropy 2015, 17(7), 4701-4743; doi:10.3390/e17074701

Asymptotic Description of Neural Networks with Correlated Synaptic Weights

INRIA Sophia Antipolis Méditerranée, 2004 Route des Lucioles, Sophia Antipolis, 06410, France
* Author to whom correspondence should be addressed.
Academic Editor: Giorgio Kaniadakis
Received: 13 February 2015 / Revised: 23 May 2015 / Accepted: 23 June 2015 / Published: 6 July 2015
(This article belongs to the Special Issue Entropic Aspects in Statistical Physics of Complex Systems)

Abstract

We study the asymptotic law of a network of interacting neurons as the number of neurons becomes infinite. Given a completely connected network in which the synaptic weights are Gaussian correlated random variables, we describe the limiting law of the network. We introduce the process-level empirical measure of the trajectories of the solutions to the equations of the finite network of neurons and the averaged law (with respect to the synaptic weights) of those trajectories. The main result of this article is that the image law through the empirical measure satisfies a large deviation principle with a good rate function, which is shown to have a unique global minimum. Our analysis of the rate function also allows us to characterize the limit measure as the image of a stationary Gaussian measure defined on a transformed set of trajectories.
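The finite-network setup the abstract refers to can be illustrated with a minimal simulation sketch. The discrete-time dynamics, the tanh firing-rate function, and the Gaussian-bump covariance kernel below are illustrative assumptions for this sketch only; the paper's model and its covariance structure for the synaptic weights are more general.

```python
import numpy as np

rng = np.random.default_rng(0)

N, T = 200, 100          # number of neurons, number of time steps
gamma, sigma = 0.9, 1.0  # leak factor, synaptic weight scale

# Correlated Gaussian synaptic weights: filter white Gaussian noise
# along the presynaptic index with a circular Gaussian kernel, so that
# Cov(J[i, j], J[i, k]) decays with the circular distance between j
# and k (an assumed stationary covariance, chosen for illustration).
white = rng.standard_normal((N, N))
kernel = np.exp(-0.5 * (np.arange(N) - N // 2) ** 2 / 5.0 ** 2)
kernel /= np.linalg.norm(kernel)
J = sigma / np.sqrt(N) * np.real(
    np.fft.ifft(np.fft.fft(white, axis=1)
                * np.fft.fft(np.fft.ifftshift(kernel)), axis=1))

f = np.tanh                  # assumed firing-rate function
V = np.zeros((T, N))         # membrane potential trajectories
for t in range(T - 1):
    noise = rng.standard_normal(N)
    V[t + 1] = gamma * V[t] + J @ f(V[t]) + noise

# Crude proxy for the empirical measure of trajectories: statistics
# pooled across the N neurons' sample paths.
print("mean:", V.mean(), "std:", V.std())
```

As the abstract describes, the object of study is the empirical measure of the N trajectories `V[:, i]` as N grows; the large deviation principle concerns how this random measure concentrates on its unique limit.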
Keywords: large deviations; good rate function; stationary Gaussian processes; stationary measures; spectral representations; neural networks; firing rate neurons; correlated synaptic weights
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


Share & Cite This Article

MDPI and ACS Style

Faugeras, O.; MacLaurin, J. Asymptotic Description of Neural Networks with Correlated Synaptic Weights. Entropy 2015, 17, 4701-4743.


Entropy EISSN 1099-4300, published by MDPI AG, Basel, Switzerland