Article

The World as a Neural Network

by Vitaly Vanchurin
Department of Physics, University of Minnesota, Duluth, MN 55812, USA
Entropy 2020, 22(11), 1210; https://doi.org/10.3390/e22111210
Received: 9 September 2020 / Revised: 19 October 2020 / Accepted: 23 October 2020 / Published: 26 October 2020
(This article belongs to the Section Statistical Physics)
Abstract: We discuss the possibility that the entire universe, at its most fundamental level, is a neural network. We identify two different types of dynamical degrees of freedom: “trainable” variables (e.g., the bias vector or weight matrix) and “hidden” variables (e.g., the state vector of neurons). We first consider the stochastic evolution of the trainable variables to argue that near equilibrium their dynamics are well approximated by the Madelung equations (with the free energy representing the phase) and, further away from equilibrium, by the Hamilton–Jacobi equations (with the free energy representing Hamilton's principal function). This shows that the trainable variables can indeed exhibit classical and quantum behaviors, with the state vector of neurons representing the hidden variables. We then study the stochastic evolution of the hidden variables by considering $D$ non-interacting subsystems with average state vectors $\bar{x}_1, \ldots, \bar{x}_D$ and an overall average state vector $\bar{x}_0$. In the limit where the weight matrix is a permutation matrix, the dynamics of the $\bar{x}_\mu$ can be described in terms of relativistic strings in an emergent $(D+1)$-dimensional Minkowski space-time. If the subsystems are minimally interacting, with interactions described by a metric tensor, then the emergent space-time becomes curved. We argue that the entropy production in such a system is a local function of the metric tensor, which should be determined by the symmetries of the Onsager tensor. It turns out that a very simple and highly symmetric Onsager tensor leads to entropy production described by the Einstein–Hilbert term. This shows that the learning dynamics of a neural network can indeed exhibit approximate behaviors described by both quantum mechanics and general relativity. We also discuss the possibility that the two descriptions are holographic duals of each other.
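For reference, the Madelung equations invoked above take their standard hydrodynamic form, shown below; the identification of the phase with the free energy of the trainable variables follows the abstract's own description. This is the textbook form of the equations, not a reproduction of the paper's derivation.

```latex
% Madelung (hydrodynamic) form of the Schroedinger equation, obtained by
% writing \psi = \sqrt{\rho}\, e^{iS/\hbar}; per the abstract, the role of
% the phase S is played by the free energy of the trainable variables.
\begin{align}
  \partial_t \rho + \nabla \cdot \Big( \rho\, \frac{\nabla S}{m} \Big) &= 0, \\
  \partial_t S + \frac{|\nabla S|^2}{2m} + V
    - \frac{\hbar^2}{2m}\, \frac{\nabla^2 \sqrt{\rho}}{\sqrt{\rho}} &= 0.
\end{align}
% Dropping the \hbar^2 (quantum potential) term leaves the classical
% Hamilton--Jacobi equation, matching the far-from-equilibrium regime
% described in the abstract.
```

As a minimal sketch of the two kinds of degrees of freedom the abstract distinguishes, the toy model below evolves the hidden variables (a neuron state vector) under a fixed update rule while the trainable variables (weight matrix and bias vector) undergo Langevin-type stochastic gradient descent. The network size, tanh activation, self-consistency objective, learning rate, and effective temperature are all illustrative assumptions, not the paper's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hidden variables: the state vector of neurons, x.
# Trainable variables: the weight matrix w and the bias vector b.
N = 16
x = rng.normal(size=N)
w = rng.normal(size=(N, N)) / np.sqrt(N)
b = np.zeros(N)

def forward(x, w, b):
    """Hidden-variable update: neurons respond to weighted input plus bias."""
    return np.tanh(w @ x + b)

# Langevin-type learning: gradient descent on a toy self-consistency loss
# L = 0.5 * ||forward(x) - x||^2, plus thermal noise, so the trainable
# variables evolve stochastically and relax toward equilibrium.
eta, T = 1e-2, 1e-4  # learning rate and effective temperature (illustrative)
for step in range(2000):
    f = forward(x, w, b)
    g = (f - x) * (1.0 - f**2)       # gradient of the loss through tanh
    w -= eta * np.outer(g, x) + np.sqrt(2 * eta * T) * rng.normal(size=w.shape)
    b -= eta * g + np.sqrt(2 * eta * T) * rng.normal(size=b.shape)
    x = f                            # hidden variables evolve alongside
```

The noise amplitude $\sqrt{2\eta T}$ is the standard Euler discretization of overdamped Langevin dynamics, which is what makes the trainable variables' evolution "stochastic" in the sense used above.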
Keywords: general relativity; machine learning; quantum mechanics; thermodynamics of learning
MDPI and ACS Style

Vanchurin, V. The World as a Neural Network. Entropy 2020, 22, 1210. https://doi.org/10.3390/e22111210

AMA Style

Vanchurin V. The World as a Neural Network. Entropy. 2020; 22(11):1210. https://doi.org/10.3390/e22111210

Chicago/Turabian Style

Vanchurin, Vitaly. 2020. "The World as a Neural Network." Entropy 22, no. 11: 1210. https://doi.org/10.3390/e22111210

Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.
