Adiabatic Quantum Computation Applied to Deep Learning Networks
1 Department of Computer Science, University of Southern California, Los Angeles, CA 90089, USA
2 Information Sciences Institute, University of Southern California, Marina del Rey, CA 90292, USA
3 Department of Electrical Engineering, University of Southern California, Los Angeles, CA 90089, USA
4 Computational Data Analytics Group, Oak Ridge National Laboratory, Oak Ridge, TN 37830, USA
5 Department of Electrical Engineering & Computer Science, University of Tennessee, Knoxville, TN 37996, USA
* Author to whom correspondence should be addressed.
Entropy 2018, 20(5), 380; https://doi.org/10.3390/e20050380
Received: 6 April 2018 / Revised: 15 May 2018 / Accepted: 16 May 2018 / Published: 18 May 2018
(This article belongs to the Special Issue Quantum Foundations: 90 Years of Uncertainty)
Training deep learning networks is a difficult task due to computational complexity, and this is traditionally handled by simplifying network topology to enable parallel computation on graphical processing units (GPUs). However, the emergence of quantum devices allows reconsideration of complex topologies. We illustrate a particular network topology that can be trained to classify MNIST data (an image dataset of handwritten digits) and neutrino detection data using a restricted form of adiabatic quantum computation known as quantum annealing performed by a D-Wave processor. We provide a brief description of the hardware and how it solves Ising models, how we translate our data into the corresponding Ising models, and how we use available expanded topology options to explore potential performance improvements. Although we focus on the application of quantum annealing in this article, the work discussed here is just one of three approaches we explored as part of a larger project that considers alternative means for training deep learning networks. The other approaches involve using a high performance computing (HPC) environment to automatically find network topologies with good performance and using neuromorphic computing to find a low-power solution for training deep learning networks. Our results show that our quantum approach can find good network parameters in a reasonable time despite increased network topology complexity; that HPC can find good parameters for traditional, simplified network topologies; and that neuromorphic computers can use low power memristive hardware to represent complex topologies and parameters derived from other architecture choices.
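The abstract notes that the D-Wave processor works by solving Ising models. As a minimal illustration of what that means, the sketch below defines the standard Ising energy function and finds its ground state by brute force on a tiny instance; the couplings `h` and `J` are arbitrary example values, not taken from the paper, and a real annealer would reach the same minimum through its physical dynamics rather than enumeration.

```python
# Hedged sketch: the Ising energy that quantum annealers minimize.
# The fields h and couplings J below are illustrative, not from the paper.
from itertools import product

def ising_energy(spins, h, J):
    """E(s) = sum_i h_i s_i + sum_{i<j} J_ij s_i s_j, with s_i in {-1, +1}."""
    e = sum(h[i] * s for i, s in enumerate(spins))
    e += sum(Jij * spins[i] * spins[j] for (i, j), Jij in J.items())
    return e

# Tiny 3-spin instance (arbitrary example values).
h = [0.5, -0.2, 0.1]
J = {(0, 1): -1.0, (1, 2): 0.8}

# Brute-force ground-state search over all 2^3 spin configurations.
best = min(product([-1, 1], repeat=3), key=lambda s: ising_energy(s, h, J))
print(best, ising_energy(best, h, J))  # → (-1, -1, 1) -2.0
```

Training a network on such hardware amounts to encoding the learning problem's cost function into `h` and `J` so that low-energy spin configurations correspond to good network parameters.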
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
MDPI and ACS Style
Liu, J.; Spedalieri, F.M.; Yao, K.-T.; Potok, T.E.; Schuman, C.; Young, S.; Patton, R.; Rose, G.S.; Chamka, G. Adiabatic Quantum Computation Applied to Deep Learning Networks. Entropy 2018, 20, 380. https://doi.org/10.3390/e20050380