Training Small Networks for Scene Classification of Remote Sensing Images via Knowledge Distillation

State Key Laboratory of Information Engineering in Surveying, Mapping and Remote Sensing, Wuhan University, Wuhan 430079, China
* Author to whom correspondence should be addressed.
Remote Sens. 2018, 10(5), 719; https://doi.org/10.3390/rs10050719
Received: 3 April 2018 / Revised: 29 April 2018 / Accepted: 3 May 2018 / Published: 7 May 2018
Scene classification, which aims to identify the land-cover categories of remotely sensed image patches, is a fundamental task in remote sensing image analysis. Deep-learning-based algorithms are widely applied to scene classification and achieve remarkable performance, but these deep models are computationally expensive and time-consuming. In this paper, we therefore introduce knowledge distillation, currently a mainstream model compression method, into remote sensing scene classification to improve the performance of smaller and shallower network models. Our knowledge distillation training method makes the high-temperature softmax output of a small, shallow student model match that of a large, deep teacher model. In our experiments, we evaluate this training method for remote sensing scene classification on four public datasets: AID, UCMerced, NWPU-RESISC, and EuroSAT. The results show that the proposed training method is effective, increasing the overall accuracy of small, shallow models by 3% on AID, 5% on UCMerced, and 1% on NWPU-RESISC and EuroSAT. We further explore the performance of the student model on small and unbalanced datasets. Our findings indicate that knowledge distillation can improve the performance of small network models on datasets with lower spatial resolution, more categories, and fewer training samples.
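The abstract describes the distillation mechanism only at a high level. For illustration, the temperature-scaled matching objective (in the standard formulation of Hinton et al., 2015) can be sketched in PyTorch as below; the temperature T, the weighting alpha, and all function names are illustrative assumptions, not values or code taken from the paper.

```python
# Minimal sketch of temperature-scaled knowledge distillation in PyTorch.
# Hyperparameters (T, alpha) are illustrative assumptions, not the paper's settings.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Combine a soft-target matching term with the usual hard-label loss."""
    # Soften both output distributions with temperature T; the KL divergence
    # pulls the student's softened predictions toward the teacher's.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # rescale gradients, as in the standard formulation
    # Ordinary cross-entropy against the ground-truth scene labels.
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss

# Example usage with random logits for a hypothetical 30-class scene dataset:
if __name__ == "__main__":
    student = torch.randn(8, 30, requires_grad=True)
    teacher = torch.randn(8, 30)  # in practice, the frozen teacher's outputs
    y = torch.randint(0, 30, (8,))
    loss = distillation_loss(student, teacher, y)
    loss.backward()
    print(loss.item())
```

A high temperature flattens both softmax distributions, so the student is also trained on the teacher's relative probabilities for incorrect classes, which carry the "dark knowledge" that plain hard-label training discards.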
Keywords: knowledge distillation; scene classification; convolutional neural networks (CNNs); remote sensing; deep learning

MDPI and ACS Style

Chen, G.; Zhang, X.; Tan, X.; Cheng, Y.; Dai, F.; Zhu, K.; Gong, Y.; Wang, Q. Training Small Networks for Scene Classification of Remote Sensing Images via Knowledge Distillation. Remote Sens. 2018, 10, 719.
