Effective SAR Image Despeckling Using Noise-Guided Transformer and Multi-Scale Feature Fusion
Highlights
- A novel SAR image despeckling method is proposed, which incorporates a dual-branch network for noise estimation and coarse despeckling, along with a noise-guided Transformer for refinement.
- The model uses multi-scale fusion with grouped pooling attention (GPA) and context-aware fusion (CAF), along with deformable convolutions and masked self-attention for region-specific improvements.
- Separating noise estimation and despeckling improves both noise suppression and the preservation of fine details, especially in areas with varying noise.
- Experiments on synthetic and real SAR images show our method outperforms existing approaches, providing a strong solution for SAR applications in noisy conditions.
Abstract
1. Introduction
- (1) We propose a novel dual-branch architecture that decouples coarse despeckling from noise estimation, enabling each branch to specialize in its respective task. This decoupling improves noise suppression accuracy while preserving structural image details. The proposed grouped pooling attention (GPA) and context-aware fusion (CAF) modules perform multi-scale contextual integration, allowing the model to combine local details with global contextual information. A bidirectional feature interaction mechanism between the two branches further improves both noise estimation accuracy and despeckling performance.
- (2) We introduce a noise-guided Transformer subnet that uses the learned coarse despeckling map and noise map from the dual-branch subnet as prior knowledge. By incorporating deformable convolutions and a learnable attention mask, the Transformer subnet captures complex long-range dependencies and selectively focuses on relevant image regions, enhancing despeckling in areas with varying noise levels.
- (3) Extensive experiments on both synthetic and real-world SAR datasets demonstrate the superiority of the proposed method over existing state-of-the-art approaches. The results show significant improvements in both quantitative metrics and visual quality, underscoring the robustness and generalization ability of the method across different noise levels and types of SAR images.
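As a rough illustration of the noise-guided masked attention idea in contribution (2), the sketch below biases single-head attention scores with a per-token noise estimate. The function names, the additive-bias form, and the numpy setting are our own assumptions for exposition, not the paper's exact module.

```python
import numpy as np

def softmax(z, axis=-1):
    # Numerically stable softmax along the given axis.
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def noise_guided_attention(x, noise_map, wq, wk, wv, alpha=1.0):
    # x: (n, d) token features; noise_map: (n,) estimated speckle level
    # per token. Subtracting alpha * noise_map from each key's column of
    # scores softly masks attention toward heavily corrupted tokens --
    # one simple way to realize a "learnable attention mask".
    q, k, v = x @ wq, x @ wk, x @ wv
    scores = q @ k.T / np.sqrt(k.shape[-1])
    scores = scores - alpha * noise_map[None, :]
    return softmax(scores) @ v
```

With a zero noise map (or alpha = 0) this reduces to plain scaled dot-product attention, so the noise prior only modulates, never replaces, the content-based attention.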
2. Related Works
2.1. Traditional Despeckling Methods
2.2. Deep Learning-Based Despeckling Methods
3. Proposed Method
3.1. Dual-Branch Subnet for Coarse Despeckling and Noise Estimation
3.2. Noise-Guided Despeckling Subnet
3.3. Loss Function and Implementation Details
| Algorithm 1: Noise-Aware Transformer-Based Network for SAR Despeckling |
|---|
| Input: speckled image Y, clean image X, speckle noise N. Initialization: all network parameters. |
4. Experimental Results
4.1. Comparison Methods and Objective Evaluation Metrics
- (1) PSNR measures the ratio between the maximum possible power of a signal and the power of the distortion (noise); a higher PSNR generally indicates better despeckling performance and less image distortion:
$$\mathrm{PSNR} = 10 \log_{10} \frac{I_{\max}^{2}}{\mathrm{MSE}},$$
where $I_{\max}$ is the maximum possible pixel value and MSE is the mean squared error
$$\mathrm{MSE} = \frac{1}{MN} \sum_{i=1}^{M} \sum_{j=1}^{N} \left( y_{i,j} - x_{i,j} \right)^{2},$$
where $y$ and $x$ represent the evaluated image and the clean reference image, respectively, and $M$ and $N$ denote the height and width of the image in pixels.
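A direct numpy implementation of these two formulas (illustrative; the 8-bit dynamic range default is our assumption):

```python
import numpy as np

def mse(x, y):
    # Mean squared error between reference image x and test image y.
    x = np.asarray(x, dtype=np.float64)
    y = np.asarray(y, dtype=np.float64)
    return np.mean((x - y) ** 2)

def psnr(x, y, max_val=255.0):
    # Peak signal-to-noise ratio in dB; higher means less distortion.
    m = mse(x, y)
    return float("inf") if m == 0 else 10.0 * np.log10(max_val ** 2 / m)
```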
- (2) SSIM is a perceptual metric that compares the luminance, contrast, and structural similarity between the reference and the processed image. SSIM values range from 0 to 1, with values closer to 1 indicating higher structural similarity. The SSIM index is defined as
$$\mathrm{SSIM}(x, y) = \frac{(2\mu_{x}\mu_{y} + C_{1})(2\sigma_{xy} + C_{2})}{(\mu_{x}^{2} + \mu_{y}^{2} + C_{1})(\sigma_{x}^{2} + \sigma_{y}^{2} + C_{2})},$$
where $\mu_{x}$ and $\mu_{y}$ are the average intensities of $x$ and $y$, $\sigma_{x}^{2}$ and $\sigma_{y}^{2}$ are their variances, $\sigma_{xy}$ is the covariance between $x$ and $y$, and $C_{1}$ and $C_{2}$ are small constants that stabilize the division when the denominators are weak.
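A minimal sketch of this formula, evaluated once over the whole image; standard implementations instead average it over sliding local (typically Gaussian-weighted 11×11) windows, so this simplified variant will not match library SSIM values exactly:

```python
import numpy as np

def ssim_global(x, y, max_val=255.0):
    # Single-window SSIM with the usual constants C1 = (0.01 * L)^2 and
    # C2 = (0.03 * L)^2, where L is the dynamic range.
    c1 = (0.01 * max_val) ** 2
    c2 = (0.03 * max_val) ** 2
    x = np.asarray(x, dtype=np.float64)
    y = np.asarray(y, dtype=np.float64)
    mu_x, mu_y = x.mean(), y.mean()
    var_x, var_y = x.var(), y.var()
    cov_xy = ((x - mu_x) * (y - mu_y)).mean()
    num = (2 * mu_x * mu_y + c1) * (2 * cov_xy + c2)
    den = (mu_x ** 2 + mu_y ** 2 + c1) * (var_x + var_y + c2)
    return num / den
```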
- (3) ENL evaluates the noise suppression capability and is typically computed within selected homogeneous regions of interest (RoIs) in the image:
$$\mathrm{ENL} = \frac{\mu^{2}}{\sigma^{2}},$$
where $\mu$ and $\sigma$ are the mean and standard deviation of the pixel intensities in the selected homogeneous region. Higher ENL values imply better speckle suppression.
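The ratio above is a one-liner over an RoI. For fully developed L-look intensity speckle on a uniform area, the ENL approaches L, which is why larger values indicate stronger smoothing:

```python
import numpy as np

def enl(region):
    # Equivalent number of looks over a homogeneous region: mean^2 / variance.
    region = np.asarray(region, dtype=np.float64)
    return region.mean() ** 2 / region.var()
```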
- (4) TCR quantifies the relative strength of the target signal compared with the surrounding clutter. A higher TCR indicates that the target signal is significantly stronger than the clutter, leading to better detection and recognition of the target. The TCR index is defined as
$$\mathrm{TCR} = 10 \log_{10} \frac{\mu_{T}}{\mu_{C}},$$
where $\mu_{T}$ and $\mu_{C}$ denote the mean pixel intensities of the target and clutter regions, respectively.
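A sketch of the dB-ratio form of TCR over two user-selected regions; this is a common definition, and the exact RoI conventions (e.g., peak target value versus mean) vary between papers:

```python
import numpy as np

def tcr(target, clutter):
    # Target-to-clutter ratio in dB: mean target intensity over mean
    # clutter intensity (one common form; RoI definitions vary).
    t = np.mean(np.asarray(target, dtype=np.float64))
    c = np.mean(np.asarray(clutter, dtype=np.float64))
    return 10.0 * np.log10(t / c)
```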
- (5) NIQE is a no-reference metric that estimates perceptual image quality based on statistical features derived from natural scenes. It does not require a reference image and is computed by comparing the distribution of features from the test image with those learned from a corpus of high-quality natural images. Lower NIQE scores indicate better perceptual quality.
- (6) BRISQUE is another no-reference quality metric that evaluates the spatial naturalness of an image using features derived from local image statistics. It relies on a machine learning model trained on human-rated image datasets. As with NIQE, lower BRISQUE scores correspond to higher perceptual quality.
4.2. Experiments on Synthetic SAR Images
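Synthetic speckled images for such experiments are commonly generated with the multiplicative model Y = X · N, where N follows a unit-mean Gamma distribution whose shape parameter is the number of looks L (the L = 1, 2, 4 settings in the results tables). A minimal sketch, assuming this standard fully developed speckle model:

```python
import numpy as np

def add_speckle(clean, looks=4, seed=None):
    # Multiplicative speckle: Y = X * N with N ~ Gamma(L, 1/L), so N has
    # unit mean and variance 1/L (L-look intensity speckle).
    rng = np.random.default_rng(seed)
    n = rng.gamma(shape=looks, scale=1.0 / looks, size=np.shape(clean))
    return np.asarray(clean, dtype=np.float64) * n
```

Smaller L gives stronger speckle (L = 1 is the single-look, exponentially distributed case), which is why all methods score lowest at L = 1 in the tables.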
4.3. Experiments on Real SAR Images
5. Discussion
6. Conclusions
Supplementary Materials
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
1. Baraha, S.; Sahoo, A.K. Synthetic aperture radar image and its despeckling using variational methods: A Review of recent trends. Signal Process. 2023, 212, 109156.
2. Zhang, T.; Zhang, X.; Gao, G. Divergence to Concentration and Population to Individual: A Progressive Approaching Ship Detection Paradigm for Synthetic Aperture Radar Remote Sensing Imagery. IEEE Trans. Aerosp. Electron. Syst. 2025; early access.
3. Ke, H.; Ke, X.; Zhang, Z.; Chen, X.; Xu, X.; Zhang, T. SLA-Net: A Novel Sea–Land Aware Network for Accurate SAR Ship Detection Guided by Hierarchical Attention Mechanism. Remote Sens. 2025, 17, 3576.
4. Xue, W.; Ai, J.; Zhu, Y.; Chen, J.; Zhuang, S. AIS-FCANet: Long-Term AIS Data Assisted Frequency-Spatial Contextual Awareness Network for Salient Ship Detection in SAR Imagery. IEEE Trans. Aerosp. Electron. Syst. 2025, 61, 15166–15171.
5. Ai, J.; Mao, Y.; Luo, Q.; Jia, L.; Xing, M. SAR Target Classification Using the Multikernel-Size Feature Fusion-Based Convolutional Neural Network. IEEE Trans. Geosci. Remote Sens. 2022, 60, 5214313.
6. Chen, Y.; Shen, Y.; Duan, C.; Wang, Z.; Mo, Z.; Liang, Y.; Zhang, Q. Robust and Efficient SAR Ship Detection: An Integrated Despeckling and Detection Framework. Remote Sens. 2025, 17, 580.
7. Cardona-Mesa, A.A.; Vásquez-Salazar, R.D.; Travieso-González, C.M.; Gómez, L. Comparative Analysis of Despeckling Filters Based on Generative Artificial Intelligence Trained with Actual Synthetic Aperture Radar Imagery. Remote Sens. 2025, 17, 828.
8. Fracastoro, G.; Magli, E.; Poggi, G.; Scarpa, G.; Valsesia, D.; Verdoliva, L. Deep learning methods for synthetic aperture radar image despeckling: An overview of trends and perspectives. IEEE Geosci. Remote Sens. Mag. 2021, 9, 29–51.
9. An, X.; Zeng, H.; Li, Z.; Yang, W.; Xiong, W.; Wang, Y.; Liu, Y. SAR Images Despeckling Using Subaperture Decomposition and Non-Local Low-Rank Tensor Approximation. Remote Sens. 2025, 17, 2716.
10. Fang, J.; Mao, T.; Bo, F.; Hao, B.; Zhang, N.; Hu, S.; Lu, W.; Wang, X. A SAR image-despeckling method based on HOSVD using tensor patches. Remote Sens. 2023, 15, 3118.
11. Bo, F.; Ma, X.; Cen, Y.; Hu, S. SAR Image Speckle Reduction Based on Nuclear Norm Minus Frobenius Norm Regularization. IEEE Trans. Geosci. Remote Sens. 2024, 62, 5227915.
12. Deledalle, C.A.; Denis, L.; Tupin, F. Iterative weighted maximum likelihood denoising with probabilistic patch-based weights. IEEE Trans. Image Process. 2009, 18, 2661–2672.
13. Parrilli, S.; Poderico, M.; Angelino, C.V.; Verdoliva, L. A nonlocal SAR image denoising algorithm based on LLMMSE wavelet shrinkage. IEEE Trans. Geosci. Remote Sens. 2011, 50, 606–616.
14. Singh, P.; Diwakar, M.; Shankar, A.; Shree, R.; Kumar, M. A Review on SAR Image and its Despeckling. Arch. Comput. Methods Eng. 2021, 28, 4633–4653.
15. Shen, Y.; Chen, Y.; Wang, Y.; Ma, L.; Zhang, X. DATNet: Dynamic Adaptive Transformer Network for SAR Image Denoising. Remote Sens. 2025, 17, 3031.
16. Chierchia, G.; Cozzolino, D.; Poggi, G.; Verdoliva, L. SAR image despeckling through convolutional neural networks. In Proceedings of the 2017 IEEE International Geoscience and Remote Sensing Symposium (IGARSS), Fort Worth, TX, USA, 23–28 July 2017; pp. 5438–5441.
17. Lin, C.; Qiu, C.; Jiang, H.; Zou, L. A Deep Neural Network Based on Prior-Driven and Structural Preserving for SAR Image Despeckling. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2023, 16, 6372–6392.
18. Cheng, L.; Guo, Z.; Li, Y.; Xing, Y. Two-Stream Multiplicative Heavy-Tail Noise Despeckling Network With Truncation Loss. IEEE Trans. Geosci. Remote Sens. 2023, 61, 5213817.
19. Guo, Y.; Lu, Y.; Liu, R.W.; Zhu, F. Blind Image Despeckling Using a Multiscale Attention-Guided Neural Network. IEEE Trans. Artif. Intell. 2024, 5, 205–216.
20. Liu, S.; Zhang, L.; Tian, S.; Hu, Q.; Li, B.; Zhang, Y. MFAENet: A Multi-Scale Feature Adaptive Enhancement Network for SAR Image Despeckling. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2023, 16, 10420–10433.
21. Pan, Y.; Zhong, L.; Chen, J.; Li, H.; Zhang, X.; Pan, B. SAR image despeckling based on denoising diffusion probabilistic model and swin transformer. Remote Sens. 2024, 16, 3222.
22. Guo, Z.; Hu, W.; Zheng, S.; Zhang, B.; Zhou, M.; Peng, J.; Yao, Z.; Feng, M. Efficient Conditional Diffusion Model for SAR Despeckling. Remote Sens. 2025, 17, 2970.
23. Liu, S.; Lei, Y.; Zhang, L.; Li, B.; Hu, W.; Zhang, Y.D. MRDDANet: A Multiscale Residual Dense Dual Attention Network for SAR Image Denoising. IEEE Trans. Geosci. Remote Sens. 2022, 60, 5214213.
24. Thakur, R.K.; Maji, S.K. AGSDNet: Attention and Gradient-Based SAR Denoising Network. IEEE Geosci. Remote Sens. Lett. 2022, 19, 4506805.
25. Wang, X.; Wu, Y.; Shi, C.; Yuan, Y.; Zhang, X. ANED-Net: Adaptive Noise Estimation and Despeckling Network for SAR Image. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2024, 17, 4036–4051.
26. Liu, S.; Tian, S.; Zhao, Y.; Hu, Q.; Li, B.; Zhang, Y.D. LG-DBNet: Local and Global Dual-Branch Network for SAR Image Denoising. IEEE Trans. Geosci. Remote Sens. 2024, 62, 5205515.
27. Wang, C.; Zheng, R.; Zhu, J.; Xu, W.; Li, X. A Practical SAR Despeckling Method Combining Swin Transformer and Residual CNN. IEEE Geosci. Remote Sens. Lett. 2024, 21, 4001205.
28. Xiao, S.; Zhang, S.; Huang, L.; Wang, W.Q. Trans-NLM Network for SAR Image Despeckling. IEEE Trans. Geosci. Remote Sens. 2024, 62, 5211912.
29. Bo, F.; Lu, W.; Wang, G.; Zhou, M.; Wang, Q.; Fang, J. A blind SAR image despeckling method based on improved weighted nuclear norm minimization. IEEE Geosci. Remote Sens. Lett. 2022, 19, 4515305.
30. Lee, J.S. Digital image enhancement and noise filtering by use of local statistics. IEEE Trans. Pattern Anal. Mach. Intell. 1980, PAMI-2, 165–168.
31. Kuan, D.; Sawchuk, A.; Strand, T.; Chavel, P. Adaptive restoration of images with speckle. IEEE Trans. Acoust. Speech Signal Process. 1987, 35, 373–383.
32. Frost, V.S.; Stiles, J.A.; Shanmugan, K.S.; Holtzman, J.C. A Model for Radar Images and Its Application to Adaptive Digital Filtering of Multiplicative Noise. IEEE Trans. Pattern Anal. Mach. Intell. 1982, PAMI-4, 157–166.
33. Xu, L.; Liu, P.; Jin, Y.Q. A New Nonlocal Iterative Trilateral Filter for SAR Images Despeckling. IEEE Trans. Geosci. Remote Sens. 2024, 62, 5213319.
34. Aranda-Bojorges, G.; Ponomaryov, V.; Reyes-Reyes, R.; Sadovnychiy, S.; Cruz-Ramos, C. Clustering-Based 3-D-MAP Despeckling of SAR Images Using Sparse Wavelet Representation. IEEE Geosci. Remote Sens. Lett. 2022, 19, 4018005.
35. Penna, P.A.; Mascarenhas, N.D. SAR speckle nonlocal filtering with statistical modeling of HAAR wavelet coefficients and stochastic distances. IEEE Trans. Geosci. Remote Sens. 2019, 57, 7194–7208.
36. Buades, A.; Coll, B.; Morel, J.M. A non-local algorithm for image denoising. In Proceedings of the 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR’05), San Diego, CA, USA, 20–25 June 2005; Volume 2, pp. 60–65.
37. Wang, G.; Bo, F.; Chen, X.; Lu, W.; Hu, S.; Fang, J. A collaborative despeckling method for SAR images based on texture classification. Remote Sens. 2022, 14, 1465.
38. Zhang, J.; Chen, J.; Yu, H.; Yang, D.; Xu, X.; Xing, M. Learning an SAR Image Despeckling Model Via Weighted Sparse Representation. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2021, 14, 7148–7158.
39. Liang, Y.; Yang, X.; Tan, W.; Wang, Z.; Huang, P.; Yang, J. Ratio-Based Multitemporal SAR Image Despeckling With Low-Rank Approximation. IEEE Geosci. Remote Sens. Lett. 2024, 21, 4000105.
40. Guan, J.; Liu, R.; Tian, X.; Tang, X.; Li, S. Robust SAR Image Despeckling by Deep Learning From Near-Real Datasets. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2024, 17, 2963–2979.
41. Wang, P.; Zhang, H.; Patel, V.M. SAR Image Despeckling Using a Convolutional Neural Network. IEEE Signal Process. Lett. 2017, 24, 1763–1767.
42. Zhang, Q.; Yuan, Q.; Li, J.; Yang, Z.; Ma, X. Learning a dilated residual network for SAR image despeckling. Remote Sens. 2018, 10, 196.
43. Bai, Y.; Xiao, Y.; Hou, X.; Li, Y.; Shang, C.; Shen, Q. SAR Image Despeckling with Residual-in-Residual Dense Generative Adversarial Network. In Proceedings of the 2023 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Rhodes Island, Greece, 4–10 June 2023; pp. 1–5.
44. Bo, F.; Jin, Y.; Ma, X.; Cen, Y.; Hu, S.; Li, Y. SemDNet: Semantic-Guided Despeckling Network for SAR Images. Expert Syst. Appl. 2025, 296, 129200.
45. Dalsasso, E.; Denis, L.; Tupin, F. SAR2SAR: A Semi-Supervised Despeckling Algorithm for SAR Images. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2021, 14, 4321–4329.
46. Dalsasso, E.; Denis, L.; Tupin, F. As If by Magic: Self-Supervised Training of Deep Despeckling Networks With MERLIN. IEEE Trans. Geosci. Remote Sens. 2022, 60, 4704713.
47. Lin, H.; Zhuang, Y.; Huang, Y.; Ding, X. Unpaired Speckle Extraction for SAR Despeckling. IEEE Trans. Geosci. Remote Sens. 2023, 61, 5201014.
48. Molini, A.B.; Valsesia, D.; Fracastoro, G.; Magli, E. Speckle2Void: Deep self-supervised SAR despeckling with blind-spot convolutional neural networks. IEEE Trans. Geosci. Remote Sens. 2021, 60, 5204017.
49. Bo, F.; Ma, X.; Hu, S.; An, G.; Li, Y.; Cen, Y. Speckle-Driven Unsupervised Despeckling for SAR Images. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2025, 18, 13023–13034.
50. Deng, J.W.; Li, M.D.; Chen, S.W. Sublook2Sublook: A Self-Supervised Speckle Filtering Framework for Single SAR Images. IEEE Trans. Geosci. Remote Sens. 2024, 62, 5211613.
51. Zhang, T.; Zhang, X.; Shi, J.; Wei, S. HyperLi-Net: A hyper-light deep learning network for high-accurate and high-speed ship detection from synthetic aperture radar imagery. ISPRS J. Photogramm. Remote Sens. 2020, 167, 123–153.
52. Zhang, T.; Zhang, X.; Liu, C.; Shi, J.; Wei, S.; Ahmad, I.; Zhan, X.; Zhou, Y.; Pan, D.; Li, J.; et al. Balance learning for ship detection from synthetic aperture radar remote sensing imagery. ISPRS J. Photogramm. Remote Sens. 2021, 182, 190–207.
53. Zhang, T.; Zhang, X.; Ke, X.; Liu, C.; Xu, X.; Zhan, X.; Wang, C.; Ahmad, I.; Zhou, Y.; Pan, D.; et al. HOG-ShipCLSNet: A novel deep learning network with hog feature fusion for SAR ship classification. IEEE Trans. Geosci. Remote Sens. 2021, 60, 5210322.
54. Cozzolino, D.; Parrilli, S.; Scarpa, G.; Poggi, G.; Verdoliva, L. Fast Adaptive Nonlocal SAR Despeckling. IEEE Geosci. Remote Sens. Lett. 2014, 11, 524–528.
55. Deledalle, C.A.; Denis, L.; Tabti, S.; Tupin, F. MuLoG, or How to Apply Gaussian Denoisers to Multi-Channel SAR Speckle Reduction? IEEE Trans. Image Process. 2017, 26, 4389–4403.
56. Mittal, A.; Soundararajan, R.; Bovik, A.C. Making a “completely blind” image quality analyzer. IEEE Signal Process. Lett. 2012, 20, 209–212.
57. Mittal, A.; Moorthy, A.K.; Bovik, A.C. No-reference image quality assessment in the spatial domain. IEEE Trans. Image Process. 2012, 21, 4695–4708.
| Method | SAR-9, L = 1 | | SAR-9, L = 2 | | SAR-9, L = 4 | | SSAR, L = 1 | | SSAR, L = 2 | | SSAR, L = 4 | |
| | PSNR | SSIM | PSNR | SSIM | PSNR | SSIM | PSNR | SSIM | PSNR | SSIM | PSNR | SSIM |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| SAR-BM3D [13] | 25.21 | 0.7334 | 27.03 | 0.7991 | 28.75 | 0.8540 | 27.09 | 0.7739 | 28.92 | 0.8240 | 30.64 | 0.8670 |
| FANS [54] | 25.07 | 0.7066 | 26.98 | 0.7865 | 28.83 | 0.8487 | 27.05 | 0.7571 | 28.91 | 0.8136 | 30.66 | 0.8603 |
| MulogB [55] | 25.06 | 0.7162 | 26.82 | 0.7846 | 28.68 | 0.8460 | 27.09 | 0.7676 | 28.87 | 0.8170 | 30.63 | 0.8619 |
| SAR-NNFN [11] | 25.07 | 0.7047 | 27.16 | 0.7927 | 29.03 | 0.8548 | 27.10 | 0.7505 | 29.12 | 0.8167 | 30.98 | 0.8658 |
| SDUDNet [49] | 25.61 | 0.7305 | 27.51 | 0.8024 | 29.19 | 0.8562 | 28.08 | 0.7934 | 29.37 | 0.8263 | 30.85 | 0.8662 |
| SemDNet [44] | 26.10 | 0.7612 | 27.62 | 0.8072 | 29.30 | 0.8596 | 28.03 | 0.7938 | 29.58 | 0.8335 | 30.98 | 0.8736 |
| PDSNet [17] | 25.88 | 0.7334 | 27.32 | 0.7900 | 28.74 | 0.8378 | 27.73 | 0.7688 | 29.27 | 0.8171 | 30.73 | 0.8566 |
| HTNet [18] | 25.66 | 0.7202 | 27.10 | 0.7767 | 28.52 | 0.8269 | 27.54 | 0.7609 | 29.02 | 0.8064 | 30.42 | 0.8464 |
| MSANN [19] | 25.81 | 0.7395 | 27.24 | 0.7984 | 28.58 | 0.8421 | 27.49 | 0.7689 | 28.93 | 0.8188 | 30.18 | 0.8541 |
| MFAENet [20] | 26.05 | 0.7528 | 27.68 | 0.8104 | 29.29 | 0.8593 | 28.01 | 0.7875 | 29.61 | 0.8332 | 31.11 | 0.8713 |
| LGDBNet [26] | 24.08 | 0.6529 | 26.97 | 0.7824 | 28.87 | 0.8458 | 24.95 | 0.6237 | 28.34 | 0.7754 | 30.67 | 0.8565 |
| Proposed | 26.15 | 0.7638 | 27.67 | 0.8166 | 29.32 | 0.8637 | 28.09 | 0.7936 | 29.60 | 0.8365 | 31.13 | 0.8750 |
| Category | Method | NIQE ↓ | BRISQUE ↓ | ENL ↑ | TCR ↑ | Time (s) ↓ |
|---|---|---|---|---|---|---|
| Non-learning | SAR-BM3D [13] | 5.62 | 41.08 | 25.87 | 3.20 | 42.5 |
| | FANS [54] | 5.76 | 37.84 | 44.62 | 2.58 | 1.62 |
| | MulogB [55] | 6.00 | 39.88 | 85.43 | 1.80 | 10.71 |
| | SAR-NNFN [11] | 7.09 | 40.40 | 136.25 | 3.39 | 51.12 |
| Supervised | SemDNet [44] | 5.22 | 27.34 | 67.38 | 5.44 | 0.09 |
| | PDSNet [17] | 5.04 | 32.26 | 72.51 | 5.35 | 1.22 |
| | HTNet [18] | 5.01 | 33.88 | 78.39 | 4.94 | 1.38 |
| | MSANN [19] | 4.91 | 26.88 | 56.74 | 5.08 | 0.86 |
| | MFAENet [20] | 5.16 | 28.17 | 72.58 | 5.87 | 0.06 |
| | LGDBNet [26] | 4.75 | 28.79 | 56.43 | 4.21 | 0.25 |
| Unsupervised | SAR2SAR [45] | 5.03 | 25.80 | 158.39 | 2.20 | 1.13 |
| | Speckle2Void [48] | 5.01 | 34.33 | 64.27 | 4.35 | 1.09 |
| | MERLIN [46] | 7.26 | 28.00 | 70.58 | 4.97 | 0.84 |
| | SDUDNet [49] | 4.71 | 25.47 | 83.24 | 5.62 | 0.04 |
| | Proposed | 4.32 | 24.46 | 133.71 | 6.86 | 0.61 |
| | | | SAR-9 | | Real SAR | |
| Noise Branch | Coarse Branch | DMSA | PSNR | SSIM | NIQE | BRISQUE |
|---|---|---|---|---|---|---|
| × | × | × | 26.05 | 0.7552 | 6.25 | 33.16 |
| ✓ | × | × | 26.10 | 0.7594 | 5.04 | 28.52 |
| ✓ | ✓ | × | 26.12 | 0.7611 | 4.85 | 26.09 |
| ✓ | ✓ | ✓ | 26.15 | 0.7638 | 4.32 | 24.46 |
| | | SAR-9 | | Real SAR | |
| GPA | CAF | PSNR | SSIM | NIQE | BRISQUE |
|---|---|---|---|---|---|
| × | × | 26.07 | 0.7523 | 7.36 | 30.75 |
| ✓ | × | 26.09 | 0.7571 | 6.65 | 26.19 |
| × | ✓ | 26.13 | 0.7604 | 5.62 | 25.73 |
| ✓ | ✓ | 26.15 | 0.7638 | 4.32 | 24.46 |
| | SAR-9 | | Real SAR | |
| | PSNR | SSIM | NIQE | BRISQUE |
|---|---|---|---|---|
| 2 | 26.11 | 0.7573 | 5.07 | 24.13 |
| 4 | 26.15 | 0.7638 | 4.32 | 24.46 |
| 6 | 26.17 | 0.7629 | 4.93 | 25.41 |
| 8 | 26.12 | 0.7588 | 5.25 | 25.83 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Zhang, L.; Zheng, L.; Wen, Y.; Zhang, F.; Bo, F.; Cen, Y. Effective SAR Image Despeckling Using Noise-Guided Transformer and Multi-Scale Feature Fusion. Remote Sens. 2025, 17, 3863. https://doi.org/10.3390/rs17233863