Open Access Article

Compressing Deep Networks by Neuron Agglomerative Clustering

Department of Computer Science and Technology, Ocean University of China, Qingdao 266100, China
Innovation Center, Ocean University of China, Qingdao 266100, China
Department of Computer Science and Engineering, Indian Institute of Technology Roorkee, Roorkee 247667, Uttarakhand, India
Department of Electrical and Electronic Engineering, Xi’an Jiaotong-Liverpool University, Suzhou 215123, China
Author to whom correspondence should be addressed.
Sensors 2020, 20(21), 6033;
Received: 4 September 2020 / Revised: 3 October 2020 / Accepted: 16 October 2020 / Published: 23 October 2020
In recent years, deep learning models have achieved remarkable success in various applications, such as pattern recognition, computer vision, and signal processing. However, high-performance deep architectures often demand large storage space and long computation time, which makes it difficult to fully exploit many deep neural networks (DNNs), especially in scenarios where computing resources are limited. In this paper, to tackle this problem, we introduce a method for compressing the structure and parameters of DNNs based on neuron agglomerative clustering (NAC). Specifically, we use the agglomerative clustering algorithm to find similar neurons; these similar neurons and the connections linked to them are then merged together. With NAC, the number of parameters and the storage space of DNNs are greatly reduced, without the support of an extra library or hardware. Extensive experiments demonstrate that NAC is very effective for neuron agglomeration in both fully connected and convolutional layers, which are common building blocks of DNNs, delivering similar or even higher network accuracy. On the benchmark CIFAR-10 and CIFAR-100 datasets, NAC compresses the parameters of the original VGGNet by 92.96% and 81.10%, respectively, while the compact networks obtained still outperform the originals.
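The neuron-merging idea described in the abstract can be sketched in a few lines: treat each neuron's incoming weight vector as a point, agglomeratively cluster those points, then replace each cluster with one neuron whose incoming weights are the cluster mean and whose outgoing weights are summed so the next layer's pre-activations are approximately preserved. The `agglomerate_neurons` helper below is a hypothetical illustration of this scheme, not the authors' exact NAC procedure.

```python
import numpy as np

def agglomerate_neurons(W_in, W_out, n_clusters):
    """Merge similar neurons in one fully connected layer.

    W_in  : (n_neurons, n_inputs)  incoming weights, one row per neuron
    W_out : (n_next, n_neurons)    outgoing weights to the next layer

    Hypothetical sketch: greedy agglomerative clustering on the rows of
    W_in using distance between cluster means (not the paper's exact
    linkage or similarity measure).
    """
    clusters = [[i] for i in range(W_in.shape[0])]
    while len(clusters) > n_clusters:
        # find the pair of clusters whose mean incoming-weight
        # vectors are closest in Euclidean distance
        best, best_d = None, np.inf
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                ca = W_in[clusters[a]].mean(axis=0)
                cb = W_in[clusters[b]].mean(axis=0)
                d = np.linalg.norm(ca - cb)
                if d < best_d:
                    best, best_d = (a, b), d
        a, b = best
        clusters[a] += clusters.pop(b)   # merge the closest pair
    # merged incoming weights: average within each cluster;
    # merged outgoing weights: sum over the cluster, so the next
    # layer's inputs are approximately unchanged
    W_in_c = np.stack([W_in[c].mean(axis=0) for c in clusters])
    W_out_c = np.stack([W_out[:, c].sum(axis=1) for c in clusters], axis=1)
    return W_in_c, W_out_c
```

For example, a layer of four neurons forming two near-duplicate pairs collapses to two neurons, shrinking both weight matrices while keeping the layer's output close to the original. The O(n³) pairwise search is kept deliberately simple for clarity; a practical implementation would use a precomputed distance matrix or an off-the-shelf clustering routine.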
Keywords: deep learning; network compression; neurons; feature maps; agglomerative clustering

MDPI and ACS Style

Wang, L.-N.; Liu, W.; Liu, X.; Zhong, G.; Roy, P.P.; Dong, J.; Huang, K. Compressing Deep Networks by Neuron Agglomerative Clustering. Sensors 2020, 20, 6033.

Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.
