
Open Access Article
Entropy 2017, 19(7), 310; doi:10.3390/e19070310

Comparing Information-Theoretic Measures of Complexity in Boltzmann Machines

Maxinder S. Kanwal, Joshua A. Grochow and Nihat Ay *
1 Department of Electrical Engineering and Computer Science, University of California, Berkeley, CA 94720, USA
2 Departments of Computer Science and Mathematics, University of Colorado, Boulder, CO 80309, USA
3 Santa Fe Institute, Santa Fe, NM 87501, USA
4 Max Planck Institute for Mathematics in the Sciences, 04103 Leipzig, Germany
5 Faculty of Mathematics and Computer Science, University of Leipzig, 04009 Leipzig, Germany
* Author to whom correspondence should be addressed.
Received: 30 April 2017 / Revised: 19 June 2017 / Accepted: 23 June 2017 / Published: 3 July 2017
(This article belongs to the Special Issue Information Geometry II)

Abstract

In the past three decades, many theoretical measures of complexity have been proposed to help understand complex systems. In this work, for the first time, we place these measures on a level playing field to explore the qualitative similarities and differences between them, as well as their shortcomings. Specifically, using the Boltzmann machine architecture (a fully connected recurrent neural network) with uniformly distributed weights as our model of study, we numerically measure how complexity changes as a function of network dynamics and network parameters. We also apply an extension of one such information-theoretic measure of complexity to understand incremental Hebbian learning in Hopfield networks, a fully recurrent architecture for autoassociative memory. Over the course of Hebbian learning, the total information flow reflects a natural upward trend in complexity as the network attempts to learn more and more patterns.
Keywords: complexity; information integration; information geometry; Boltzmann machine; Hopfield network; Hebbian learning
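
The abstract summarizes two numerical experiments: measuring complexity on Boltzmann machines with uniformly distributed weights, and tracking a complexity measure during incremental Hebbian learning in a Hopfield network. The following Python sketch is a minimal illustration under stated assumptions, not the authors' code: it substitutes plain multi-information (the sum of the units' marginal entropies minus the joint entropy) for the paper's family of measures, computes the equilibrium distribution by exact enumeration rather than sampling, and uses an arbitrary network size, temperature, learning rate, and pattern count.

# Toy stand-in for the paper's experiments (all parameters illustrative).
import itertools
import numpy as np

def entropy(q):
    """Shannon entropy, in bits, of a probability vector."""
    q = q[q > 0]
    return -np.sum(q * np.log2(q))

def boltzmann_distribution(W, beta=1.0):
    """Exact equilibrium distribution of a Boltzmann machine with
    energy E(x) = -(1/2) x^T W x over states x in {-1, +1}^n."""
    n = W.shape[0]
    states = np.array(list(itertools.product([-1, 1], repeat=n)))
    energies = -0.5 * np.einsum('si,ij,sj->s', states, W, states)
    p = np.exp(-beta * energies)
    return states, p / p.sum()

def multi_information(states, p):
    """I(X) = sum_i H(X_i) - H(X); zero iff the units are independent."""
    marginals = sum(
        entropy(np.array([p[states[:, i] == 1].sum(),
                          p[states[:, i] == -1].sum()]))
        for i in range(states.shape[1])
    )
    return marginals - entropy(p)

rng = np.random.default_rng(0)
n = 6

# Experiment 1: Boltzmann machine with uniformly distributed weights.
W = rng.uniform(-1.0, 1.0, size=(n, n))
W = np.triu(W, 1)
W = W + W.T                                   # symmetric coupling, zero diagonal
states, p = boltzmann_distribution(W)
print(f"uniform weights: multi-information = {multi_information(states, p):.3f} bits")

# Experiment 2: incremental Hebbian learning in a Hopfield network.
# Store one random pattern at a time and track the complexity of the
# resulting equilibrium distribution as more memories are imprinted.
W = np.zeros((n, n))
for k in range(1, 6):
    xi = rng.choice([-1, 1], size=n)          # new pattern to memorize
    W += np.outer(xi, xi) / n                 # Hebbian weight update
    np.fill_diagonal(W, 0.0)
    states, p = boltzmann_distribution(W)
    print(f"{k} pattern(s) stored: multi-information = "
          f"{multi_information(states, p):.3f} bits")

Because the state space is enumerated exactly (2^n states), the sketch is only feasible for small networks; it is meant to convey the shape of the experiment, so that one can check whether this toy stand-in echoes the upward trend in complexity that the abstract reports for total information flow.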

This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Kanwal, M.S.; Grochow, J.A.; Ay, N. Comparing Information-Theoretic Measures of Complexity in Boltzmann Machines. Entropy 2017, 19, 310.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.
