Memory Storage Capacity in Recurrent Neural Networks
A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory, Probability and Statistics".
Deadline for manuscript submissions: closed (18 September 2022)
Special Issue Editors
Interests: artificial and biological neural networks; computational neuroscience; systems neuroscience; information theory; sensory processing; decision making
Interests: neural networks; collective behavior; machine learning; computational models; combinatorial problems
Interests: molecular biophysics; image recognition; molecular interaction; graph theory; machine learning in bioscience
Special Issue Information
Dear Colleagues,
A neural network (NN) is an ensemble of simple, highly interconnected analog signal-processing units. Recurrent neural networks (RNNs) are a general class of neural networks whose node connections are defined by a bidirectional “coupling matrix”, allowing loops among neurons. This architecture produces recursive dynamics in which the current network state depends on previous states. RNNs exhibit a wide range of temporal behavior: depending on the topology of their connectivity, they can generate steady states, limit cycles, and quasi-periodic or chaotic orbits. A memory is defined as a set of activation states (the patterns of activation of the neurons at time t) that remains unchanged under network iteration.
Hopfield neural networks are the simplest RNN architecture; they are widely used as storage devices capable of storing patterns and can model “associative memory”. The attractor states of these networks can be considered “stored patterns”: given sufficient time, a Hopfield network maps an input activation pattern (stimulus) to an output activation pattern represented by a steady state or by a limit cycle composed of several states. Hopfield RNNs can therefore store stimulus–response associations and serve as a model of how biological neural networks store and recall behaviors in response to given stimuli.
The storage capacity limit of Hopfield RNNs was established by Amit, Gutfreund, and Sompolinsky for the specific case of a multi-dyadic coupling matrix, which corresponds to the simplest learning strategy (Hebbian learning). This limit grows linearly with the network size N: for the retrieval error probability to remain low, the number of stored memories must not exceed roughly 14% of N. This strongly limits the application of such networks for information storage. By contrast, randomly generated coupling matrices, without any imposed dyadic structure, exhibit an exponentially large number of memory states. The optimal storage problem thus remains open, and how to achieve optimal storage is still the subject of active research.
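To make the setup above concrete, the following minimal sketch (an illustration only, not part of this call) stores a few random ±1 patterns with the Hebbian rule J_ij = (1/N) Σ_μ ξ_i^μ ξ_j^μ and lets the network relax from a corrupted probe to a fixed point. The network size, pattern load, and asynchronous update schedule are assumptions chosen for illustration.

```python
# Minimal sketch of a Hopfield network with Hebbian learning.
# N, P, and the update schedule are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

N = 200                      # number of neurons
P = int(0.10 * N)            # stored patterns, kept below the ~0.14 N Hebbian limit
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian coupling matrix: J_ij = (1/N) * sum_mu xi_i^mu xi_j^mu, with zero diagonal
J = patterns.T @ patterns / N
np.fill_diagonal(J, 0.0)

def recall(state, n_sweeps=20):
    """Asynchronous updates until the state stops changing (a fixed point)."""
    state = state.copy()
    for _ in range(n_sweeps):
        prev = state.copy()
        for i in rng.permutation(N):
            state[i] = 1 if J[i] @ state >= 0 else -1
        if np.array_equal(state, prev):
            break
    return state

# Corrupt a stored pattern by flipping 10% of its units, then let the network relax.
probe = patterns[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
probe[flip] *= -1
retrieved = recall(probe)
overlap = (retrieved @ patterns[0]) / N
print(f"overlap with stored pattern: {overlap:.2f}")   # close to 1.0 below capacity
```

In this toy setting, raising P toward and beyond roughly 0.14 N causes the retrieval overlap to degrade, which is the Hebbian capacity limit discussed above.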
This Special Issue on the Hopfield RNN and its storage capacity invites researchers to present state-of-the-art approaches, focusing on how modifications of the traditional Hopfield and Hebbian architecture affect network behavior and storage capacity, with the objective of finding more efficient memory strategies.
The topics relevant to this Special Issue include but are not limited to the following:
- Theory of Hopfield RNNs;
- New information theories based on novel learning strategies;
- Optimization of Hopfield RNN architectures;
- RNN models of memory storage;
- RNNs for brain-inspired machine learning and biological modeling.
Dr. Viola Folli
Dr. Giorgio Gosti
Dr. Edoardo Milanetti
Guest Editors
Manuscript Submission Information
Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.
Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.
Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.
Keywords
neural networks;
Hopfield model;
learning and memory;
collective dynamics;
structural connectivity;
machine learning;
complexity and information theory;
storage capacity
Benefits of Publishing in a Special Issue
- Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
- Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
- Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
- External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
- e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.
Further information on MDPI's Special Issue policies can be found here.