Open Access Article

Matching RGB and Infrared Remote Sensing Images with Densely-Connected Convolutional Neural Networks

by Ruojin Zhu 1, Dawen Yu 1, Shunping Ji 1,* and Meng Lu 2
1 School of Remote Sensing and Information Engineering, Wuhan University, 129 Luoyu Road, Wuhan 430079, China
2 Department of Physical Geography, Faculty of Geoscience, Utrecht University, Princetonlaan 8, 3584 CB Utrecht, The Netherlands
* Author to whom correspondence should be addressed.
Remote Sens. 2019, 11(23), 2836; https://doi.org/10.3390/rs11232836
Received: 17 October 2019 / Revised: 21 November 2019 / Accepted: 26 November 2019 / Published: 29 November 2019
(This article belongs to the Section Remote Sensing Image Processing)
We develop a deep learning-based method for matching an RGB (red, green, and blue) image and an infrared image captured from satellite sensors. The method includes a convolutional neural network (CNN) that compares an RGB and infrared image pair and a template searching strategy that searches, within a search window in the target image, for the point corresponding to a given point in the reference image. A densely-connected CNN is developed to extract common features from different spectral bands. The network consists of a series of densely-connected convolutions, to make full use of low-level features, and an augmented cross-entropy loss to avoid model overfitting. The network takes band-wise concatenated RGB and infrared images as input and outputs a similarity score for the RGB and infrared image pair. For a given reference point, the similarity scores within the search window are calculated pixel by pixel, and the pixel with the highest score becomes the matching candidate. Experiments on a satellite RGB and infrared image dataset demonstrated that our method improved the matching rate (the ratio of successfully matched points to all reference points) by more than 75% over conventional methods such as SURF, RIFT, and PSO-SIFT, and by more than 10% compared to other recent CNN-based structures. Our experiments also demonstrated the high performance and generalization ability of our method when applied to multitemporal remote sensing images and close-range images.
Keywords: image matching; convolutional neural network; remote sensing image; template matching
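
The sketch below illustrates the pipeline described in the abstract: a densely-connected CNN scores a band-wise concatenated RGB and infrared patch pair, and a pixel-by-pixel template search over a search window keeps the highest-scoring location. It is a minimal, hypothetical PyTorch illustration, not the authors' implementation; the class names, layer widths, patch size, and search radius are assumptions, and the paper's augmented cross-entropy loss and training procedure are omitted.

```python
# Minimal sketch (assumed hyperparameters, not the authors' released code) of a
# densely-connected similarity network plus window-based template search.
import torch
import torch.nn as nn


class DenseBlock(nn.Module):
    """Each convolution sees the concatenation of all preceding feature maps."""
    def __init__(self, in_ch: int, growth: int, n_layers: int):
        super().__init__()
        self.layers = nn.ModuleList()
        ch = in_ch
        for _ in range(n_layers):
            self.layers.append(nn.Sequential(
                nn.Conv2d(ch, growth, kernel_size=3, padding=1),
                nn.BatchNorm2d(growth),
                nn.ReLU(inplace=True)))
            ch += growth
        self.out_channels = ch

    def forward(self, x):
        feats = [x]
        for layer in self.layers:
            feats.append(layer(torch.cat(feats, dim=1)))
        return torch.cat(feats, dim=1)


class SimilarityNet(nn.Module):
    """Scores a band-wise concatenated RGB (3ch) + infrared (1ch) patch pair."""
    def __init__(self):
        super().__init__()
        self.dense = DenseBlock(in_ch=4, growth=16, n_layers=4)
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.head = nn.Linear(self.dense.out_channels, 1)

    def forward(self, rgb_patch, ir_patch):
        x = torch.cat([rgb_patch, ir_patch], dim=1)   # band-wise concatenation
        x = self.pool(self.dense(x)).flatten(1)
        return self.head(x).squeeze(1)                # similarity score (logit)


def match_point(net, rgb, ir, ref_xy, patch=64, search=32):
    """Search a (2*search+1)^2 window in the infrared image for the best match
    to the RGB patch centred at ref_xy; assumes the window stays inside the image.
    rgb: (3, H, W) tensor, ir: (1, H, W) tensor."""
    half = patch // 2
    cx, cy = ref_xy
    rgb_patch = rgb[:, cy - half:cy + half, cx - half:cx + half].unsqueeze(0)
    best_score, best_xy = -float("inf"), ref_xy
    net.eval()
    with torch.no_grad():
        # Pixel-by-pixel scoring within the search window, as in the abstract.
        for dy in range(-search, search + 1):
            for dx in range(-search, search + 1):
                x, y = cx + dx, cy + dy
                ir_patch = ir[:, y - half:y + half, x - half:x + half].unsqueeze(0)
                score = net(rgb_patch, ir_patch).item()
                if score > best_score:
                    best_score, best_xy = score, (x, y)
    return best_xy, best_score
```

In practice the search loop would be batched rather than run one patch at a time, but the exhaustive per-pixel scoring above mirrors the template searching strategy the abstract describes.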
MDPI and ACS Style

Zhu, R.; Yu, D.; Ji, S.; Lu, M. Matching RGB and Infrared Remote Sensing Images with Densely-Connected Convolutional Neural Networks. Remote Sens. 2019, 11, 2836.

