Article

Compressing Deep Networks by Neuron Agglomerative Clustering

1 Department of Computer Science and Technology, Ocean University of China, Qingdao 266100, China
2 Innovation Center, Ocean University of China, Qingdao 266100, China
3 Department of Computer Science and Engineering, Indian Institute of Technology Roorkee, Roorkee 247667, Uttarakhand, India
4 Department of Electrical and Electronic Engineering, Xi’an Jiaotong-Liverpool University, Suzhou 215123, China
* Author to whom correspondence should be addressed.
Sensors 2020, 20(21), 6033; https://doi.org/10.3390/s20216033
Received: 4 September 2020 / Revised: 3 October 2020 / Accepted: 16 October 2020 / Published: 23 October 2020
In recent years, deep learning models have achieved remarkable success in applications such as pattern recognition, computer vision, and signal processing. However, high-performance deep architectures often come with large storage requirements and long computation times, which make it difficult to deploy many deep neural networks (DNNs), especially in scenarios where computing resources are limited. In this paper, to tackle this problem, we introduce a method for compressing the structure and parameters of DNNs based on neuron agglomerative clustering (NAC). Specifically, we use the agglomerative clustering algorithm to find similar neurons; these similar neurons, together with the connections linked to them, are then merged. Using NAC, the number of parameters and the storage footprint of DNNs are greatly reduced, without the support of an extra library or hardware. Extensive experiments demonstrate that NAC is very effective for agglomerating neurons in both fully connected and convolutional layers, the common building blocks of DNNs, while delivering similar or even higher network accuracy. In particular, on the benchmark CIFAR-10 and CIFAR-100 datasets, after using NAC to compress the parameters of the original VGGNet by 92.96% and 81.10%, respectively, the resulting compact networks still outperform the original ones.
Keywords: deep learning; network compression; neurons; feature maps; agglomerative clustering
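To illustrate the idea behind the method, the following is a minimal sketch (not the authors' exact procedure) of merging similar neurons in one fully connected layer: neurons are clustered agglomeratively on their incoming-weight vectors, each cluster is collapsed into a single neuron, and the outgoing connections of the merged neurons are combined. The function name `compress_fc_layer` and the mean/sum merging rules are illustrative assumptions.

```python
# Illustrative sketch of neuron agglomeration for one fully connected
# layer. Assumptions (not from the paper): clusters are represented by
# the mean of their incoming weights and biases, and outgoing weights
# are summed so the downstream signal is approximately preserved when
# the merged neurons behave alike.
import numpy as np
from sklearn.cluster import AgglomerativeClustering

def compress_fc_layer(W_in, b, W_out, n_clusters):
    """Merge similar neurons of a hidden layer.

    W_in:  (n_neurons, n_inputs)   incoming weights
    b:     (n_neurons,)            biases
    W_out: (n_outputs, n_neurons)  outgoing weights
    Returns (W_in', b', W_out') with only n_clusters neurons.
    """
    # Group neurons whose incoming-weight vectors are similar.
    labels = AgglomerativeClustering(n_clusters=n_clusters).fit_predict(W_in)

    W_in_c = np.zeros((n_clusters, W_in.shape[1]))
    b_c = np.zeros(n_clusters)
    W_out_c = np.zeros((W_out.shape[0], n_clusters))
    for k in range(n_clusters):
        idx = labels == k
        W_in_c[k] = W_in[idx].mean(axis=0)      # representative incoming weights
        b_c[k] = b[idx].mean()                  # representative bias
        W_out_c[:, k] = W_out[:, idx].sum(axis=1)  # combine outgoing connections
    return W_in_c, b_c, W_out_c
```

If the merged neurons are exact duplicates, the compressed layer reproduces the original layer's output exactly; for merely similar neurons the output is approximated, which is why the paper reports accuracy after compression.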
Wang, L.-N.; Liu, W.; Liu, X.; Zhong, G.; Roy, P.P.; Dong, J.; Huang, K. Compressing Deep Networks by Neuron Agglomerative Clustering. Sensors 2020, 20, 6033. https://doi.org/10.3390/s20216033

