Open Access Article

AMN: Attention Metric Network for One-Shot Remote Sensing Image Scene Classification

School of Electronic Information, Wuhan University, Wuhan 430079, China
* Author to whom correspondence should be addressed.
Remote Sens. 2020, 12(24), 4046; https://doi.org/10.3390/rs12244046
Received: 31 October 2020 / Revised: 1 December 2020 / Accepted: 8 December 2020 / Published: 10 December 2020
(This article belongs to the Special Issue Advanced Artificial Intelligence and Deep Learning for Remote Sensing)
In recent years, deep neural network (DNN)-based scene classification methods have achieved promising performance. However, their data-driven training strategy requires a large number of labeled samples, so DNN-based methods cannot solve the scene classification problem when only a few labeled images are available. As the number and variety of scene images continue to grow, the cost and difficulty of manual annotation also increase. It is therefore important to address scene classification with only a few labeled samples. In this paper, we propose an attention metric network (AMN) within the few-shot learning (FSL) framework to improve the performance of one-shot scene classification. AMN is composed of a self-attention embedding network (SAEN) and a cross-attention metric network (CAMN). In SAEN, we apply spatial attention and channel attention to the feature maps to obtain rich features of scene images. In CAMN, we propose a novel cross-attention mechanism that highlights the features most relevant to each category and improves similarity measurement. We further develop a loss function that combines mean square error (MSE) loss with multi-class N-pair loss; it promotes intra-class similarity and inter-class variance of the embedding features and also improves the similarity measurement results. Experiments on the NWPU-RESISC45 dataset and the RSD-WHU46 dataset demonstrate that our method achieves state-of-the-art results on one-shot remote sensing image scene classification tasks.
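
To make the combined loss described above concrete, the following is a minimal, hypothetical PyTorch sketch of an MSE plus multi-class N-pair objective. It is not the authors' implementation; the function name combined_loss, the weighting factor lambda_npair, the dot-product similarity, and the assumption that the MSE term compares similarity scores against one-hot targets are all illustrative assumptions.

    # Hypothetical sketch of a combined MSE + multi-class N-pair objective
    # (illustrative only; not the authors' implementation).
    import torch
    import torch.nn.functional as F

    def combined_loss(sim_scores, targets, anchors, positives, lambda_npair=1.0):
        # sim_scores: (B, C) predicted similarity between each query and each class prototype.
        # targets:    (B, C) one-hot indicators of the true class for each query.
        # anchors:    (N, D) embeddings of the anchor (query) samples, one per class.
        # positives:  (N, D) embeddings of the positive sample for each anchor;
        #             positives[i] matches anchors[i], the remaining N-1 act as negatives.

        # MSE between similarity scores and one-hot targets: pushes the score of the
        # true class toward 1 and the scores of the other classes toward 0.
        mse = F.mse_loss(sim_scores, targets)

        # Multi-class N-pair loss in softmax form: each anchor should be more similar
        # to its own positive than to the positives of the other classes in the batch.
        logits = anchors @ positives.t()                       # (N, N) pairwise similarities
        labels = torch.arange(anchors.size(0), device=anchors.device)
        npair = F.cross_entropy(logits, labels)

        return mse + lambda_npair * npair

Here the N-pair term is written in its softmax (cross-entropy) form; in practice the balance between the two terms would be tuned on a validation split.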
Keywords: remote sensing image scene classification; few-shot learning; cross-attention mechanism; metric network; multi-class N-pair loss

Graphical abstract

MDPI and ACS Style

Li, X.; Pu, F.; Yang, R.; Gui, R.; Xu, X. AMN: Attention Metric Network for One-Shot Remote Sensing Image Scene Classification. Remote Sens. 2020, 12, 4046. https://doi.org/10.3390/rs12244046

AMA Style

Li X, Pu F, Yang R, Gui R, Xu X. AMN: Attention Metric Network for One-Shot Remote Sensing Image Scene Classification. Remote Sensing. 2020; 12(24):4046. https://doi.org/10.3390/rs12244046

Chicago/Turabian Style

Li, Xirong, Fangling Pu, Rui Yang, Rong Gui, and Xin Xu. 2020. "AMN: Attention Metric Network for One-Shot Remote Sensing Image Scene Classification" Remote Sensing 12, no. 24: 4046. https://doi.org/10.3390/rs12244046
