Open Access Article
Remote Sens. 2018, 10(2), 196;

Learning a Dilated Residual Network for SAR Image Despeckling

School of Geodesy and Geomatics, Wuhan University, Wuhan 430079, China
International School of Software, Wuhan University, Wuhan 430079, China
School of Resource and Environmental Science, Wuhan University, Wuhan 430079, China
School of Resources and Environmental Engineering, Anhui University, Hefei 230000, China
Author to whom correspondence should be addressed.
Received: 13 November 2017 / Revised: 17 January 2018 / Accepted: 24 January 2018 / Published: 29 January 2018
(This article belongs to the Collection Learning to Understand Remote Sensing Images)


In this paper, to break the limit of the traditional linear models for synthetic aperture radar (SAR) image despeckling, we propose a novel deep learning approach that learns a non-linear end-to-end mapping between noisy and clean SAR images with a dilated residual network (SAR-DRN). SAR-DRN is based on dilated convolutions, which enlarge the receptive field while maintaining the filter size and layer depth in a lightweight structure. In addition, skip connections and a residual learning strategy are added to the despeckling model to preserve image details and mitigate the vanishing gradient problem. In both quantitative and visual assessments, the proposed method outperforms the traditional and state-of-the-art despeckling methods, especially for strong speckle noise.
Keywords: SAR image; despeckling; dilated convolution; skip connection; residual learning
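The receptive-field property that the abstract attributes to dilated convolutions can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; the kernel weights, the loop-based convolution, and the example dilation schedule below are placeholders chosen for clarity:

```python
import numpy as np

def dilated_conv2d(image, kernel, dilation=1):
    """'Same'-padded 2D convolution of a single-channel image with a
    square kernel whose taps are spread apart by the dilation factor.

    A dilation of d leaves d-1 gaps between kernel taps, so a k x k
    kernel covers an effective ((k-1)*d + 1)-pixel window with no extra
    weights -- this is how dilated layers enlarge the receptive field
    while keeping the filter size fixed.
    """
    k = kernel.shape[0]
    eff = (k - 1) * dilation + 1      # effective window size of one layer
    pad = eff // 2
    padded = np.pad(image, pad, mode="edge")
    out = np.zeros_like(image, dtype=float)
    for i in range(image.shape[0]):
        for j in range(image.shape[1]):
            for u in range(k):
                for v in range(k):
                    out[i, j] += kernel[u, v] * padded[i + u * dilation,
                                                       j + v * dilation]
    return out

def receptive_field(kernel_size, dilations):
    """Receptive field of a stride-1 stack of dilated conv layers."""
    rf = 1
    for d in dilations:
        rf += (kernel_size - 1) * d
    return rf

# A hypothetical 7-layer stack of 3x3 kernels with a symmetric dilation
# schedule already sees a 33x33 neighbourhood around each pixel:
print(receptive_field(3, [1, 2, 3, 4, 3, 2, 1]))  # -> 33
```

In the residual-learning setup the abstract describes, such a stack would predict the speckle component, which is then removed from the noisy input to yield the despeckled estimate, rather than regressing the clean image directly.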


This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


MDPI and ACS Style

Zhang, Q.; Yuan, Q.; Li, J.; Yang, Z.; Ma, X. Learning a Dilated Residual Network for SAR Image Despeckling. Remote Sens. 2018, 10, 196.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.

Remote Sens. EISSN 2072-4292, published by MDPI AG, Basel, Switzerland