Satellite Image Super-Resolution via Multi-Scale Residual Deep Neural Network
1 Hubei Key Laboratory of Intelligent Robot, School of Computer Science and Engineering, Wuhan Institute of Technology, Wuhan 430205, China
2 School of Computer Science, Wuhan University, Wuhan 430072, China
3 School of Computer Science and Technology, Harbin Institute of Technology, Harbin 150001, China
* Author to whom correspondence should be addressed.
Remote Sens. 2019, 11(13), 1588; https://doi.org/10.3390/rs11131588
Received: 13 May 2019 / Revised: 28 June 2019 / Accepted: 2 July 2019 / Published: 4 July 2019
(This article belongs to the Special Issue Robust Multispectral/Hyperspectral Image Analysis and Classification)
Satellite remote sensing images are used increasingly widely, but the images observed by satellite sensors are frequently of low resolution (LR) and therefore cannot fully meet the requirements of object identification and analysis. To fully exploit the multi-scale characteristics of objects in remote sensing images, this paper presents a multi-scale residual neural network (MRNN). MRNN leverages the multi-scale nature of satellite images to accurately reconstruct high-frequency information for super-resolution (SR) of satellite imagery. Patches of different sizes are first extracted from the LR satellite image to match objects of different scales. Large-, middle-, and small-scale deep residual neural networks are designed to simulate differently sized receptive fields, acquiring relatively global, contextual, and local information for prior representation. A fusion network then refines the information from the different scales. MRNN fuses the complementary high-frequency information from the differently scaled networks to reconstruct the desired high-resolution satellite object image, in line with human visual experience (“look in multi-scale to see better”). Experimental results on the SpaceNet satellite image and NWPU-RESISC45 databases show that the proposed approach outperforms several state-of-the-art SR algorithms in terms of both objective and subjective image quality.
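The abstract describes a three-branch residual architecture with a fusion stage. Below is a minimal sketch of that idea, assuming PyTorch; the module names, kernel sizes, channel widths, and block counts are illustrative assumptions and do not reproduce the authors' exact MRNN configuration.

```python
import torch
import torch.nn as nn

class ScaleBranch(nn.Module):
    """One residual branch; the kernel size emulates a small/middle/large receptive field."""
    def __init__(self, kernel_size, channels=64, num_blocks=4):
        super().__init__()
        pad = kernel_size // 2
        self.head = nn.Conv2d(1, channels, kernel_size, padding=pad)
        self.blocks = nn.ModuleList([
            nn.Sequential(
                nn.Conv2d(channels, channels, kernel_size, padding=pad),
                nn.ReLU(inplace=True),
                nn.Conv2d(channels, channels, kernel_size, padding=pad),
            )
            for _ in range(num_blocks)
        ])
        self.tail = nn.Conv2d(channels, 1, kernel_size, padding=pad)

    def forward(self, x):
        feat = self.head(x)
        for block in self.blocks:
            feat = feat + block(feat)      # local residual connections
        return self.tail(feat)             # branch-wise high-frequency estimate

class MRNNSketch(nn.Module):
    """Three branches with different receptive fields, fused and added to the
    interpolated LR input (global residual learning). Hypothetical configuration."""
    def __init__(self):
        super().__init__()
        self.small = ScaleBranch(kernel_size=3)       # local detail
        self.middle = ScaleBranch(kernel_size=5)      # contextual information
        self.large = ScaleBranch(kernel_size=7)       # relatively global information
        self.fusion = nn.Conv2d(3, 1, kernel_size=1)  # refine and merge the three estimates

    def forward(self, lr_upsampled):
        # lr_upsampled: LR image already interpolated to the HR grid, shape (N, 1, H, W)
        branches = torch.cat([
            self.small(lr_upsampled),
            self.middle(lr_upsampled),
            self.large(lr_upsampled),
        ], dim=1)
        residual = self.fusion(branches)
        return lr_upsampled + residual    # reconstructed HR image
```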
Keywords:
satellite imagery; super-resolution; residual network; multi-scale image; convolutional neural network
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
MDPI and ACS Style
Lu, T.; Wang, J.; Zhang, Y.; Wang, Z.; Jiang, J. Satellite Image Super-Resolution via Multi-Scale Residual Deep Neural Network. Remote Sens. 2019, 11, 1588. https://doi.org/10.3390/rs11131588
AMA Style
Lu T, Wang J, Zhang Y, Wang Z, Jiang J. Satellite Image Super-Resolution via Multi-Scale Residual Deep Neural Network. Remote Sensing. 2019; 11(13):1588. https://doi.org/10.3390/rs11131588
Chicago/Turabian Style
Lu, Tao, Jiaming Wang, Yanduo Zhang, Zhongyuan Wang, and Junjun Jiang. 2019. "Satellite Image Super-Resolution via Multi-Scale Residual Deep Neural Network" Remote Sensing 11, no. 13: 1588. https://doi.org/10.3390/rs11131588
Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.