Open Access Article

A Sparse Dictionary Learning-Based Adaptive Patch Inpainting Method for Thick Clouds Removal from High-Spatial Resolution Remote Sensing Imagery

by Fan Meng 1, Xiaomei Yang 1,2,*, Chenghu Zhou 1 and Zhi Li 3,4
1 State Key Laboratory of Resources and Environmental Information System, Institute of Geographic Sciences and Natural Resources Research, Chinese Academy of Sciences, Beijing 100101, China
2 Jiangsu Center for Collaborative Innovation in Geographical Information Resource Development and Application, Nanjing 210023, China
3 State Key Laboratory of Desert and Oasis Ecology, Xinjiang Institute of Ecology and Geography, Chinese Academy of Sciences, Urumqi 830011, China
4 University of Chinese Academy of Sciences, Beijing 100049, China
* Author to whom correspondence should be addressed.
Sensors 2017, 17(9), 2130; https://doi.org/10.3390/s17092130
Received: 7 July 2017 / Revised: 11 September 2017 / Accepted: 13 September 2017 / Published: 15 September 2017
(This article belongs to the Special Issue Analysis of Multispectral and Hyperspectral Data)
Cloud cover is inevitable in optical remote sensing (RS) imagery owing to observation conditions, which limits the availability of RS data. It is therefore of great significance to reconstruct the ground information contaminated by clouds. This paper presents a sparse dictionary learning-based image inpainting method for adaptively recovering, patch by patch, the missing information corrupted by thick clouds. A feature dictionary was learned from exemplars in the cloud-free regions and later utilized to infer the missing patches via sparse representation. To maintain the coherence of structures, structure sparsity was introduced to encourage missing patches lying on image structures to be filled in first. The patch inpainting optimization model was formulated under an adaptive neighborhood-consistency constraint and solved by a modified orthogonal matching pursuit (OMP) algorithm. Based on these ideas, a thick-cloud removal scheme was designed and applied to images with simulated and real clouds. Comparisons and experiments show that our method not only keeps structures and textures consistent with the surrounding ground information, but also produces few smoothing or blocking artifacts, making it more suitable for removing clouds from high-spatial-resolution RS imagery with salient structures and abundant textures.
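The core pipeline described in the abstract, learning a feature dictionary from cloud-free patches and then sparse-coding the observed pixels of a cloud-covered patch with OMP to synthesize its missing pixels, can be sketched as follows. This is a minimal illustration under assumptions, not the authors' implementation: it omits the structure-sparsity filling order and the adaptive neighborhood-consistency constraint, and the function names, patch layout, and scikit-learn usage are chosen only for demonstration.

```python
# Hypothetical sketch of dictionary-learning-based patch inpainting (not the
# paper's code). Patches are vectorized; the cloud mask marks observed pixels.
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning
from sklearn.linear_model import orthogonal_mp

def learn_dictionary(cloud_free_patches, n_atoms=256, sparsity=8):
    """Learn a feature dictionary from cloud-free exemplar patches
    (array of shape n_patches x patch_dim)."""
    dico = MiniBatchDictionaryLearning(
        n_components=n_atoms,
        transform_algorithm="omp",
        transform_n_nonzero_coefs=sparsity,
    )
    dico.fit(cloud_free_patches)
    return dico.components_            # shape: n_atoms x patch_dim

def inpaint_patch(patch, observed_mask, dictionary, sparsity=8):
    """Fill the missing pixels of one vectorized patch.
    `observed_mask` is True where pixels are cloud-free."""
    D_obs = dictionary[:, observed_mask].T   # observed rows of the dictionary
    y_obs = patch[observed_mask]
    # Sparse-code the observed pixels only, then synthesize the full patch.
    coefs = orthogonal_mp(D_obs, y_obs,
                          n_nonzero_coefs=min(sparsity, y_obs.size))
    full = dictionary.T @ coefs
    patch_out = patch.copy()
    patch_out[~observed_mask] = full[~observed_mask]  # keep observed pixels
    return patch_out
```

In the paper's scheme, patches along the cloud boundary would be prioritized by structure sparsity and reconstructed under the neighborhood-consistency constraint; the sketch above only shows the plain dictionary-learning and OMP reconstruction step.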
Keywords: sparse representation; dictionary learning; image inpainting; thick clouds removal; high resolution remote sensing image
MDPI and ACS Style

Meng, F.; Yang, X.; Zhou, C.; Li, Z. A Sparse Dictionary Learning-Based Adaptive Patch Inpainting Method for Thick Clouds Removal from High-Spatial Resolution Remote Sensing Imagery. Sensors 2017, 17, 2130.

Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.
