Open Access Article

Modeling of a Neural System Based on Statistical Mechanics

1 Department of Global Medical Science, Sungshin Women’s University, Seoul 01133, Korea
2 Department of Physics and Astronomy and Center for Theoretical Physics, Seoul National University, Seoul 08826, Korea
* Author to whom correspondence should be addressed.
Entropy 2018, 20(11), 848; https://doi.org/10.3390/e20110848
Received: 28 August 2018 / Revised: 29 October 2018 / Accepted: 2 November 2018 / Published: 5 November 2018
(This article belongs to the Special Issue Statistical Mechanics of Neural Networks)
PDF [874 KB, uploaded 15 November 2018]
Abstract

The minimization of a free energy is often regarded as the key principle in understanding how the brain works and how the brain structure forms. In particular, a statistical-mechanics-based neural network model is expected to allow one to interpret many aspects of neural firing and learning processes in terms of general concepts and mechanisms in statistical physics. Nevertheless, the definition of the free energy in a neural system is usually an intricate problem without an evident solution. After the pioneering work by Hopfield, several statistical-mechanics-based models have suggested a variety of definitions of the free energy or the entropy in a neural system. Among those, the recently proposed Feynman machine defines the free energy of a neural system via the Feynman path-integral formulation with an explicit time variable. In this study, we first give a brief review of the previous relevant models, highlighting their problematic aspects, and examine how the Feynman machine overcomes several weak points of those models and derives the firing and learning rules in a (biological) neural system as the extremum state of the free energy. Specifically, the model reveals that the biological learning mechanism known as spike-timing-dependent plasticity is related to the free-energy minimization principle. Both computing and learning mechanisms in the Feynman machine are based on the exact spike timings of neurons, as in a biological neural system. We discuss the consequences of adopting an explicit time variable in modeling a neural system and of applying the free-energy minimization principle to understanding phenomena in the brain.
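The abstract traces the free-energy viewpoint back to Hopfield's model, in which recall is energy minimization: symmetric weights define an energy that asynchronous spin-like updates can only lower. The sketch below is a generic textbook Hopfield network written for illustration (the function names, pattern, and parameter choices are ours, not from the paper, and this is not the Feynman-machine model itself):

```python
import numpy as np

# Minimal Hopfield-network sketch: binary neurons s_i = ±1, symmetric
# Hebbian weights with zero diagonal, and the energy E = -1/2 s^T W s,
# which in-place (asynchronous) updates never increase.

def train_hebbian(patterns):
    """Hebbian weights W_ij = (1/N) * sum_p x_i^p x_j^p, zero diagonal."""
    n = patterns.shape[1]
    w = patterns.T @ patterns / n
    np.fill_diagonal(w, 0.0)
    return w

def energy(w, s):
    return -0.5 * s @ w @ s

def recall(w, s, sweeps=5):
    """Deterministic asynchronous updates; each flip lowers (or keeps) E."""
    s = s.copy()
    for _ in range(sweeps):
        for i in range(len(s)):
            s[i] = 1 if w[i] @ s >= 0 else -1
    return s

pattern = np.array([1, -1, 1, -1, 1, -1, 1, -1])
w = train_hebbian(pattern[None, :])
noisy = pattern.copy()
noisy[0] *= -1                      # corrupt one bit
restored = recall(w, noisy)         # relaxes back to the stored pattern
```

Recall here is literally descent to an extremum of an energy function, which is the structural idea the paper generalizes to a free energy with an explicit time variable.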
Keywords: neural network model; statistical mechanics; free-energy minimization principle
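The abstract states that spike-timing-dependent plasticity (STDP) is related to free-energy minimization. For readers unfamiliar with the rule, the following is a standard pair-based STDP window with generic textbook parameters (the exponential form and the values of `a_plus`, `a_minus`, and `tau` are common illustrative choices, not taken from the paper):

```python
import math

def stdp_dw(dt, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Weight change for spike-time difference dt = t_post - t_pre (in ms).

    Pre-before-post (dt > 0) potentiates the synapse; post-before-pre
    (dt < 0) depresses it; the magnitude decays exponentially with |dt|.
    """
    if dt > 0:
        return a_plus * math.exp(-dt / tau)
    if dt < 0:
        return -a_minus * math.exp(dt / tau)
    return 0.0

# Causal pairing strengthens, anti-causal pairing weakens the synapse.
print(stdp_dw(10.0) > 0, stdp_dw(-10.0) < 0)
```

The dependence on exact spike timings, rather than on firing rates alone, is what the abstract emphasizes as the shared feature between STDP and the Feynman machine's explicit-time formulation.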
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Cite This Article

MDPI and ACS Style

Cho, M.W.; Choi, M.Y. Modeling of a Neural System Based on Statistical Mechanics. Entropy 2018, 20, 848.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.

Entropy EISSN 1099-4300, published by MDPI AG, Basel, Switzerland.