Multimodal Data Fusion for Synthetic Aperture Radar (SAR) Image Processing

A special issue of Remote Sensing (ISSN 2072-4292). This special issue belongs to the section "Remote Sensing Image Processing".

Deadline for manuscript submissions: 31 July 2026

Special Issue Editors


Guest Editor
College of Electronic Science and Engineering, National University of Defense Technology, Changsha 410073, China
Interests: SAR target detection; SAR target classification; SAR image processing with deep learning

Guest Editor
School of Information and Communication Engineering, University of Electronic Science and Technology of China, Chengdu 611731, China
Interests: radar signal processing; machine learning

Guest Editor
College of Information Communication, National University of Defense Technology, Changsha 410073, China
Interests: target detection and recognition; few-shot learning; domain adaptive learning; intelligent interpretation of SAR and optical remote-sensing images

Special Issue Information

Dear Colleagues,

Remote Sensing is pleased to announce a Special Issue dedicated to multimodal data fusion for synthetic aperture radar (SAR) image processing. SAR technology, renowned for its all-weather imaging capability, faces challenges in complex environments because its distinctive scattering mechanism yields limited texture information. Integrating SAR with complementary modalities, such as optical, infrared, hyperspectral, and automatic identification system (AIS) data, promises transformative advances in remote sensing applications.

This Special Issue seeks original research contributions exploring multimodal fusion techniques, including the following:

  • Optical–SAR fusion: Enhancing object detection and scene understanding by combining SAR's all-weather robustness with optical imagery's high-resolution detail.
  • Infrared–SAR fusion: Improving target recognition in low-visibility conditions by leveraging thermal signatures from infrared data.
  • Hyperspectral–SAR fusion: Enabling material classification and environmental monitoring through spectral–SAR synergy.
  • AIS–SAR fusion: Advancing maritime surveillance by fusing SAR ship detection with AIS-derived vessel motion data.

Key applications include fusion-based detection, classification, image enhancement, and precise geolocation. We encourage submissions addressing novel methodologies, fusion frameworks, and real-world applications such as urban monitoring, disaster response, and maritime security.

We encourage submissions of both regular research papers and reviews on topics including, but not limited to, the following:

  • Multimodal fusion architectures (e.g., deep learning, feature-level fusion);
  • Cross-modal domain adaptation and transfer learning;
  • Fusion applications in object detection, terrain classification, and change analysis;
  • Performance evaluation and benchmark datasets.

This Special Issue aims to foster interdisciplinary collaboration, bridging remote sensing, computer vision, and SAR image processing.

Dr. Ronghui Zhan
Dr. Zongyong Cui
Dr. Shiqi Chen
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 250 words) can be sent to the Editorial Office for assessment.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Remote Sensing is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2700 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • SAR
  • ship targets
  • target tracking
  • object detection
  • SAR interpretation
  • SAR recognition

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • Reprint: MDPI Books provides the opportunity to republish successful Special Issues in book format, both online and in print.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (1 paper)


Research

33 pages, 3891 KB  
Article
Correlation and Semantic Prior-Guided Multi-Scale Cross-Modal Interaction Network for SAR-OPT Image Fusion
by Xiaoyang Hou, Lingxi Zhou, Chenguo Feng, Hao Cha, Yang Liu, Liguo Liu and Haibo Liu
Remote Sens. 2026, 18(7), 975; https://doi.org/10.3390/rs18070975 - 24 Mar 2026
Abstract
Synthetic aperture radar (SAR) and optical (OPT) image fusion aims to leverage their complementary information to obtain a more comprehensive representation of ground objects. However, significant discrepancies exist between the two modalities in terms of imaging mechanisms and feature distributions. Consequently, existing multi-modal image fusion methods struggle to achieve robust cross-modal feature alignment and deep semantic consistency between the fused results and the source modalities. To address the above challenges, this paper proposes a correlation and semantic prior-guided multi-scale cross-modal interaction network (CSP-MCIN) for effective SAR-OPT image fusion. Specifically, CSP-MCIN first employs two modality-specific encoders based on ResNet-18 to extract low-level details and high-level semantic features from SAR and OPT images, respectively. Subsequently, a multi-scale interactive decoder integrating cross-modal Transformers and gated fusion units is constructed to align and aggregate semantic and detail information from both encoders. Finally, to strengthen source-modality representations, a novel loss function combining a pixel-domain correlation loss and a CLIP-guided semantic consistency loss is designed and optimized under a PCGrad-based multi-objective optimization scheme. Experimental results on public SAR-OPT image datasets demonstrate that the proposed CSP-MCIN achieves superior fusion performance and computational efficiency compared with state-of-the-art approaches.
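The gated fusion units mentioned in the abstract can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the feature dimension, the gate parameters `w` and `b`, and the use of a single vector per modality are illustrative assumptions. A sigmoid gate computed from the concatenated SAR and optical features forms a per-channel convex combination of the two modalities, so each fused channel lies between its SAR and optical inputs.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_fusion(f_sar, f_opt, w, b):
    """Blend SAR and optical feature vectors with a learned gate.

    g = sigmoid(w @ [f_sar; f_opt] + b) weights each channel of the
    SAR features; (1 - g) weights the optical features.
    """
    g = sigmoid(w @ np.concatenate([f_sar, f_opt]) + b)
    return g * f_sar + (1.0 - g) * f_opt

# Toy example: 4-channel features with random (untrained) gate weights.
rng = np.random.default_rng(0)
d = 4
f_sar = rng.standard_normal(d)
f_opt = rng.standard_normal(d)
w = rng.standard_normal((d, 2 * d))  # hypothetical gate projection
b = np.zeros(d)

fused = gated_fusion(f_sar, f_opt, w, b)
print(fused.shape)  # prints (4,)
```

Because the gate is a sigmoid, each output channel is a convex combination of the corresponding SAR and optical channels; in the full network such units operate on multi-scale feature maps rather than single vectors.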