Comparing Information-Theoretic Measures of Complexity in Boltzmann Machines
Abstract

In the past three decades, many theoretical measures of complexity have been proposed to help understand complex systems. In this work, for the first time, we place these measures on a level playing field to explore their qualitative similarities and differences, as well as their shortcomings. Specifically, using the Boltzmann machine architecture (a fully connected recurrent neural network) with uniformly distributed weights as our model of study, we numerically measure how complexity changes as a function of network dynamics and network parameters. We also apply an extension of one such information-theoretic measure of complexity to understand incremental Hebbian learning in Hopfield networks, a fully recurrent architecture used as a model of autoassociative memory. Over the course of Hebbian learning, the total information flow reflects a natural upward trend in complexity as the network attempts to learn more and more patterns.
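The incremental Hebbian learning mentioned above can be sketched briefly. In a Hopfield network, each new ±1 pattern is stored by adding its outer product to the weight matrix (with no self-connections), and recall proceeds by iterating a sign-threshold update. The following is a minimal illustration under standard textbook conventions, not the paper's implementation; the function names and the example pattern are ours.

```python
import numpy as np

def hebbian_increment(W, pattern):
    """One incremental Hebbian update: add the outer product of a
    +/-1 pattern (scaled by 1/n), keeping a zero diagonal so that
    units have no self-connections."""
    p = np.asarray(pattern, dtype=float)
    n = p.size
    W = W + np.outer(p, p) / n
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, state, steps=10):
    """Synchronous recall dynamics: s <- sign(W s), iterated."""
    s = np.asarray(state, dtype=float)
    for _ in range(steps):
        s = np.where(W @ s >= 0, 1.0, -1.0)
    return s

# Store one pattern incrementally, then recover it from a noisy cue.
n = 8
W = np.zeros((n, n))
pattern = np.array([1, -1, 1, 1, -1, -1, 1, -1])
W = hebbian_increment(W, pattern)
noisy = pattern.copy()
noisy[0] *= -1  # corrupt one bit
print(np.array_equal(recall(W, noisy), pattern))  # True
```

Each additional pattern is stored by another call to `hebbian_increment`, which is the incremental setting in which the paper tracks total information flow.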
Kanwal, M.S.; Grochow, J.A.; Ay, N. Comparing Information-Theoretic Measures of Complexity in Boltzmann Machines. Entropy 2017, 19, 310.