Open Access Article

Asymptotic Description of Neural Networks with Correlated Synaptic Weights

INRIA Sophia Antipolis Méditerranée, 2004 Route des Lucioles, Sophia Antipolis, 06410, France
* Author to whom correspondence should be addressed.
Academic Editor: Giorgio Kaniadakis
Entropy 2015, 17(7), 4701-4743; https://doi.org/10.3390/e17074701
Received: 13 February 2015 / Revised: 23 May 2015 / Accepted: 23 June 2015 / Published: 6 July 2015
(This article belongs to the Special Issue Entropic Aspects in Statistical Physics of Complex Systems)
We study the asymptotic law of a network of interacting neurons as the number of neurons becomes infinite. Given a completely connected network in which the synaptic weights are Gaussian correlated random variables, we describe the limit law of the network when the number of neurons goes to infinity. We introduce the process-level empirical measure of the trajectories of the solutions to the equations of the finite network and the averaged law (with respect to the synaptic weights) of these trajectories. The main result of this article is that the image law through the empirical measure satisfies a large deviation principle with a good rate function that is shown to have a unique global minimum. Our analysis of the rate function also allows us to characterize the limit measure as the image of a stationary Gaussian measure defined on a transformed set of trajectories.
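As a rough guide to the objects named in the abstract (the notation below is illustrative and may differ from the paper's), the process-level empirical measure of the $N$ neuronal trajectories $V^1,\dots,V^N$ and the large deviation principle satisfied by its law $Q_N$ take the schematic form
\[
  \hat{\mu}_N \;=\; \frac{1}{N}\sum_{i=1}^{N} \delta_{V^i},
\]
\[
  \limsup_{N\to\infty} \frac{1}{N}\log Q_N\bigl(\hat{\mu}_N \in F\bigr) \;\le\; -\inf_{\mu\in F} H(\mu),
  \qquad
  \liminf_{N\to\infty} \frac{1}{N}\log Q_N\bigl(\hat{\mu}_N \in O\bigr) \;\ge\; -\inf_{\mu\in O} H(\mu),
\]
for closed sets $F$ and open sets $O$ of probability measures on trajectory space, where $H$ is the good rate function. The uniqueness of the global minimizer $\mu_*$ of $H$, with $H(\mu_*)=0$, is what yields convergence of $\hat{\mu}_N$ to the limit measure characterized in the paper.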
Keywords: large deviations; good rate function; stationary Gaussian processes; stationary measures; spectral representations; neural networks; firing rate neurons; correlated synaptic weights
MDPI and ACS Style

Faugeras, O.; MacLaurin, J. Asymptotic Description of Neural Networks with Correlated Synaptic Weights. Entropy 2015, 17, 4701-4743.
