Article

Hyperspectral and Multispectral Remote Sensing Image Fusion Based on a Retractable Spatial–Spectral Transformer Network

1 College of Geo-Exploration Science and Technology, Jilin University, No. 938 Ximinzhu Street, Changchun 130026, China
2 College of Software, Jilin University, No. 2699 Qianjin Street, Changchun 130012, China
3 Hangzhou Institute of Technology, Xidian University, No. 177 Lvxue Road, Hangzhou 311231, China
* Author to whom correspondence should be addressed.
Remote Sens. 2025, 17(12), 1973; https://doi.org/10.3390/rs17121973
Submission received: 2 April 2025 / Revised: 30 May 2025 / Accepted: 3 June 2025 / Published: 6 June 2025
(This article belongs to the Section Remote Sensing Image Processing)

Abstract

Hyperspectral and multispectral remote sensing image fusion is an effective approach for generating hyperspectral images with high spatial resolution, overcoming the physical limitations of individual sensors. However, transformer-based fusion methods constrained by the local window self-attention mechanism often extract global information and coordinate contextual features insufficiently. Moreover, fusion that emphasizes the heterogeneous spatial–spectral characteristics of the two modalities can significantly enhance the robustness of the joint representation of multi-source data. To address these issues, this study proposes a hyperspectral and multispectral remote sensing image fusion method based on a retractable spatial–spectral transformer network (RSST), introducing the attention retractable mechanism into remote sensing image fusion. A gradient spatial–spectral recovery block is further incorporated to mitigate the limited interaction between tokens and the loss of spatial–spectral edge information. Experiments across multiple scales demonstrate that RSST offers significant advantages over existing mainstream image fusion algorithms.
Keywords: image fusion; hyperspectral image; multispectral image; transformer; retractable attention; gradient spatial–spectral recovery
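
The abstract describes a retractable attention mechanism, which in the related literature alternates dense local-window self-attention with sparse attention over tokens sampled on a strided grid. As an illustration only, the PyTorch sketch below shows one such alternating block; since the full text is not reproduced here, the module names, tensor shapes, window size, and the use of nn.MultiheadAttention are assumptions, not the authors' implementation, and the gradient spatial–spectral recovery block is not sketched.

```python
# Minimal sketch (not the RSST implementation): one "retractable" block that alternates
# dense local-window attention with sparse strided-grid attention. All sizes are illustrative.
import torch
import torch.nn as nn


def window_partition(x, ws):
    # (B, H, W, C) -> (num_windows*B, ws*ws, C): tokens grouped into local windows (dense branch)
    B, H, W, C = x.shape
    x = x.view(B, H // ws, ws, W // ws, ws, C)
    return x.permute(0, 1, 3, 2, 4, 5).reshape(-1, ws * ws, C)


def window_reverse(win, ws, B, H, W):
    # inverse of window_partition
    C = win.shape[-1]
    x = win.view(B, H // ws, W // ws, ws, ws, C)
    return x.permute(0, 1, 3, 2, 4, 5).reshape(B, H, W, C)


def grid_partition(x, ws):
    # (B, H, W, C) -> groups of tokens sampled with stride H//ws and W//ws (sparse branch),
    # so every attention group contains tokens spread over the whole feature map
    B, H, W, C = x.shape
    x = x.view(B, ws, H // ws, ws, W // ws, C)
    return x.permute(0, 2, 4, 1, 3, 5).reshape(-1, ws * ws, C)


def grid_reverse(win, ws, B, H, W):
    # inverse of grid_partition
    C = win.shape[-1]
    x = win.view(B, H // ws, W // ws, ws, ws, C)
    return x.permute(0, 3, 1, 4, 2, 5).reshape(B, H, W, C)


class RetractableBlock(nn.Module):
    """One dense (local window) plus one sparse (strided grid) self-attention step."""

    def __init__(self, dim=64, heads=4, window=8):
        super().__init__()
        self.window = window
        self.attn_dense = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.attn_sparse = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm1, self.norm2 = nn.LayerNorm(dim), nn.LayerNorm(dim)

    def forward(self, x):  # x: (B, H, W, C); H and W divisible by the window size
        B, H, W, C = x.shape
        # dense step: attention restricted to each local window (fine spatial-spectral detail)
        w = window_partition(self.norm1(x), self.window)
        w = self.attn_dense(w, w, w, need_weights=False)[0]
        x = x + window_reverse(w, self.window, B, H, W)
        # sparse step: attention among tokens sampled across the whole image (global context)
        g = grid_partition(self.norm2(x), self.window)
        g = self.attn_sparse(g, g, g, need_weights=False)[0]
        return x + grid_reverse(g, self.window, B, H, W)


if __name__ == "__main__":
    feats = torch.randn(1, 32, 32, 64)       # fused HSI/MSI feature map (illustrative size)
    print(RetractableBlock()(feats).shape)   # torch.Size([1, 32, 32, 64])
```

In this reading, the dense step models local spatial–spectral structure within each window, while the sparse step lets tokens distributed across the entire image interact, which is how the limited token interactions mentioned in the abstract would be extended.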

Share and Cite

MDPI and ACS Style

He, Y.; Li, H.; Zhang, M.; Liu, S.; Zhu, C.; Xin, B.; Wang, J.; Wu, Q. Hyperspectral and Multispectral Remote Sensing Image Fusion Based on a Retractable Spatial–Spectral Transformer Network. Remote Sens. 2025, 17, 1973. https://doi.org/10.3390/rs17121973


