# Beyond the Maximum Storage Capacity Limit in Hopfield Recurrent Neural Networks


## Abstract


## 1. Introduction

## 2. How to Use Autapses to Overcome the Hebbian Memory Bound

## 3. Discrete Time Hopfield RNN

## 4. Storing Patterns

## 5. Simulation Results

## 6. Conclusions

## Author Contributions

## Funding

## Acknowledgments

## Conflicts of Interest

## Abbreviations

RNN: Recurrent Neural Network


**Figure 1.** Sketch of the state space of a Hopfield RNN. Each point of $\Omega$ is one of the ${2}^{N}$ possible binary vectors. Among these states, ${2}^{\gamma N}$ with $\gamma = 0.29$ may be steady states and belong to ${\Omega}_{TE}$. (**A**) In the standard approach to memory storage, each stable state is a single memory state or a spurious memory ${\overline{\xi}}^{\left(\mu \right)}\in {\Omega}_{TE}$. (**B**) In the approach proposed here, each memory element is represented by a neighborhood of vectors ${\overline{\xi}}^{(\mu ,l)}\in {\Omega}_{TE}$ surrounding the "seed" vector ${\overline{\xi}}^{\left(\mu \right)}$. This neighborhood is the collection of all vectors ${\overline{\xi}}^{\left(\nu \right)}$ that differ from ${\overline{\xi}}^{\left(\mu \right)}$ in fewer than k bits; it is denoted with the shorthand ${V}_{k}^{\left(\mu \right)}=\{{\overline{\xi}}^{\left(\nu \right)}\in \Omega \mid d({\overline{\xi}}^{\left(\mu \right)},{\overline{\xi}}^{\left(\nu \right)})<k\}$.
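The neighborhood ${V}_{k}^{(\mu)}$ in the caption is a Hamming ball of radius $k$ (with strict inequality $d < k$) around the seed vector. A minimal sketch of how such a neighborhood can be enumerated for a $\pm 1$ state vector, assuming small $N$ and $k$ so that exhaustive enumeration is feasible (the function name `hamming_ball` is illustrative, not from the paper):

```python
from itertools import combinations

import numpy as np

def hamming_ball(seed, k):
    """Enumerate all +/-1 vectors at Hamming distance strictly less than k
    from the seed, i.e. the neighborhood V_k defined in the caption."""
    seed = np.asarray(seed)
    n = len(seed)
    neighbors = []
    for d in range(k):  # distances 0, 1, ..., k-1 (strict inequality d < k)
        for flips in combinations(range(n), d):
            v = seed.copy()
            v[list(flips)] *= -1  # flip the chosen components
            neighbors.append(v)
    return neighbors

# Example: radius-2 ball around a 5-component seed contains
# C(5,0) + C(5,1) = 6 vectors (the seed plus its 5 one-bit flips).
ball = hamming_ball([1, -1, 1, 1, -1], 2)
```

The size of the ball grows as $\sum_{d<k} \binom{N}{d}$, which is why the paper's neighborhood-based memories cover exponentially many states of ${\Omega}_{TE}$ while keeping only the seed vectors as explicit memories.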

**Figure 2.** Given the neighborhood ${V}_{k}^{\left(\mu \right)}$ of each memory ${\overline{\xi}}^{\left(\mu \right)}$, the dashed lines show how the retrieval rate $R\left({d}_{{\overline{\xi}}^{\left(\mu \right)}}\right)$ changes for a sample of neighbors at distance d from ${\overline{\xi}}^{\left(\mu \right)}$. The red line shows the average of $R\left({d}_{{\overline{\xi}}^{\left(\mu \right)}}\right)$ over all ${\overline{\xi}}^{\left(\mu \right)}$. The vertical line marks the $\beta =4\%$ limit, which corresponds to the neighborhood bound $k=8$.

**Figure 3.** Given each memory ${\overline{\xi}}^{\left(\mu \right)}$ and its neighborhood ${V}_{k}^{\left(\mu \right)}$, the dashed lines show the average distance from ${\overline{\xi}}^{\left(\mu \right)}$ of the attractors reached by neighbors at distance ${d}_{{\overline{\xi}}^{\left(\mu \right)}}$ from ${\overline{\xi}}^{\left(\mu \right)}$. The red line shows the average of $D\left({d}_{{\overline{\xi}}^{\left(\mu \right)}}\right)$ over all ${\overline{\xi}}^{\left(\mu \right)}$. The vertical and horizontal lines mark the $k=8$ limit and the neighborhood bound, respectively.

**Figure 4.** Average ${\langle R\left({d}_{{\overline{\xi}}^{\left(\mu \right)}}\right)\rangle}_{r,{\overline{\xi}}^{\left(\mu \right)}}$ computed over eight Hopfield RNN replicas for different N, with $\beta =4\%$ and $P={P}_{c}$. The vertical line marks the $\beta =4\%$ limit and the neighborhood bound.

**Figure 5.** Mean neighborhood retrieval rate R for eight Hopfield RNN replicas for different N values, averaged over all memory states ${\overline{\xi}}^{\left(\mu \right)}$, with $\beta =4\%$ and $P={P}_{c}$. The horizontal line represents optimal recall of all states in the neighborhood.
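The retrieval rate plotted in the figures above can be estimated by running the discrete-time Hopfield dynamics from corrupted copies of a stored pattern and counting how often the dynamics relaxes back to that pattern. A minimal sketch, assuming the standard Hebbian rule with synchronous $\pm 1$ updates; the `autapses` flag (whether the diagonal self-couplings discussed in Section 2 are kept) and all function names are illustrative, not the authors' code:

```python
import numpy as np

def hebbian_weights(patterns, autapses=True):
    """Hebbian couplings J = (1/N) sum_mu xi^mu (xi^mu)^T.
    Keeping the diagonal retains the self-couplings (autapses)."""
    X = np.asarray(patterns, dtype=float)
    N = X.shape[1]
    J = X.T @ X / N
    if not autapses:
        np.fill_diagonal(J, 0.0)
    return J

def run_to_attractor(J, state, max_steps=100):
    """Synchronous +/-1 dynamics s <- sign(J s) until a fixed point is reached."""
    s = np.asarray(state, dtype=float)
    for _ in range(max_steps):
        s_new = np.where(J @ s >= 0, 1.0, -1.0)
        if np.array_equal(s_new, s):
            break
        s = s_new
    return s

def retrieval_rate(J, seed, probes):
    """Fraction of probe states whose attractor is the seed pattern."""
    hits = sum(np.array_equal(run_to_attractor(J, p), seed) for p in probes)
    return hits / len(probes)

# Demo: a single stored pattern (N = 8) is recovered from every
# one-bit corruption, so the neighborhood retrieval rate is 1.0.
pat = np.array([1, -1, 1, 1, -1, 1, -1, -1])
J = hebbian_weights([pat])
probes = [np.where(np.arange(8) == i, -pat, pat) for i in range(8)]
rate = retrieval_rate(J, pat, probes)  # rate == 1.0
```

Averaging this rate over all stored memories and over network replicas, as a function of the probe distance $d$, gives curves of the kind shown in Figures 2, 4 and 5.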

© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Gosti, G.; Folli, V.; Leonetti, M.; Ruocco, G.
Beyond the Maximum Storage Capacity Limit in Hopfield Recurrent Neural Networks. *Entropy* **2019**, *21*, 726.
https://doi.org/10.3390/e21080726
