
City Future: The Innovative Fusion of Artificial Intelligence and Multi-Source Remote Sensing Data

A special issue of Remote Sensing (ISSN 2072-4292). This special issue belongs to the section "Urban Remote Sensing".

Deadline for manuscript submissions: 16 April 2026

Special Issue Editors


Guest Editor
Faculty of Geosciences and Engineering, Southwest Jiaotong University, Chengdu, China
Interests: remote sensing; multi-modal data; data fusion; land-use classification; change detection

Guest Editor
College of Geoscience and Surveying Engineering, China University of Mining and Technology (Beijing), Beijing, China
Interests: urban big data; remote sensing; deep learning; data fusion; environmental monitoring

Guest Editor
Alba Regia Technical Faculty, Obuda University, Budapest, Hungary
Interests: remote sensing; digital photogrammetry; urban ecology studies; 3D modeling

Guest Editor
Institute for Earth Observation, Eurac Research, Bolzano, Italy
Interests: multi-source data fusion; remote sensing; generative model; deep learning; data augmentation; super-resolution; land cover mapping; onboard AI

Guest Editor Assistant
Aerospace Information Research Institute, Chinese Academy of Sciences, Beijing, China
Interests: remote sensing; urban environment analysis; deep learning; climate change

Special Issue Information

Dear Colleagues,

The rapid evolution of artificial intelligence (AI), coupled with the explosive growth of multi-source remote sensing data, is profoundly transforming the way we perceive, analyze, and manage cities. In recent years, the integration of data from optical, SAR, LiDAR, hyperspectral, and social sensing platforms has provided unprecedented opportunities to monitor urban dynamics across multiple spatial and temporal scales. Meanwhile, breakthroughs in deep learning, self-supervised learning, and foundation models have enabled machines to extract complex spatial patterns and semantic knowledge from massive heterogeneous data sources, bridging the gap between observation and intelligent interpretation. As global cities face growing challenges such as climate change, resource scarcity, and population expansion, the fusion of AI and remote sensing offers a powerful path toward intelligent urban governance, sustainable development, and resilient city design.

This Special Issue aims to provide an international forum for sharing innovative theories, methodologies, and applications that harness AI to unlock the full potential of multi-source remote sensing data for future cities. We welcome original research focusing on the synergy between AI and remote sensing for urban studies. Topics of interest include, but are not limited to, the following:

  • AI-driven fusion and interpretation of multi-source data such as optical, SAR, LiDAR, hyperspectral, and social sensing data;
  • Foundation models and self-supervised learning for large-scale urban analysis;
  • Urban land use and land cover mapping, change detection, and 3D reconstruction;
  • Data reliability assessment and uncertainty modeling in heterogeneous urban datasets;
  • Dynamic monitoring of urban infrastructure, transportation, and environment;
  • Intelligent urban planning, risk assessment, and sustainable development supported by remote sensing.

Dr. Maofan Zhao
Dr. Shouhang Du
Prof. Dr. Jancsó Tamás
Dr. Abhishek Singh
Guest Editors

Dr. Jiahao Wu
Guest Editor Assistant

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, you can proceed to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 250 words) can be sent to the Editorial Office for assessment.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Remote Sensing is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2700 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • artificial intelligence
  • urban remote sensing
  • multi-source remote sensing
  • foundation models
  • land use and change detection
  • 3D reconstruction

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • Reprint: MDPI Books provides the opportunity to republish successful Special Issues in book format, both online and in print.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (1 paper)


Research

22 pages, 3276 KB  
Article
AFR-CR: An Adaptive Frequency Domain Feature Reconstruction-Based Method for Cloud Removal via SAR-Assisted Remote Sensing Image Fusion
by Xiufang Zhou, Qirui Fang, Xunqiang Gong, Shuting Yang, Tieding Lu, Yuting Wan, Ailong Ma and Yanfei Zhong
Remote Sens. 2026, 18(2), 201; https://doi.org/10.3390/rs18020201 - 8 Jan 2026
Abstract
Optical imagery is often contaminated by clouds to varying degrees, which greatly affects the interpretation and analysis of images. Synthetic Aperture Radar (SAR) can penetrate clouds and mist, and a common strategy for SAR-assisted cloud removal is to fuse SAR and optical data and use deep learning networks to reconstruct cloud-free optical imagery. However, existing methods do not fully consider frequency-domain characteristics during feature integration, resulting in blurred edges in the generated cloud-free optical images. To address this problem, an adaptive frequency domain feature reconstruction-based cloud removal method is proposed. The method comprises four sequential stages. First, shallow features are extracted by fusing optical and SAR images. Second, a Transformer-based encoder captures multi-scale semantic features. Third, the Frequency Domain Decoupling Module (FDDM), using a Dynamic Mask Generation mechanism, explicitly decomposes features into low-frequency structures and high-frequency details, suppressing cloud interference while preserving surface textures. Finally, the Cross-Frequency Reconstruction Module (CFRM) enables robust information interaction via transposed cross-attention, ensuring precise fusion and reconstruction. Experiments on the M3R-CR dataset confirm that the proposed approach achieves the best results on all four evaluated metrics, surpassing eight other state-of-the-art methods and demonstrating its effectiveness in SAR-optical fusion for cloud removal.
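To make the frequency-domain decoupling idea more concrete, the sketch below splits a 2-D feature map into low- and high-frequency components with a fixed radial FFT mask. This is only an illustration of the general principle: the paper's FDDM learns its decomposition through a Dynamic Mask Generation mechanism, so the cutoff radius, mask shape, and function names here are assumptions rather than the authors' implementation.

    # Minimal sketch of frequency-domain feature decoupling (illustrative only).
    # Assumption: a fixed radial low-pass mask stands in for the learned
    # Dynamic Mask Generation used by the paper's FDDM.
    import numpy as np

    def split_frequency(feature: np.ndarray, cutoff: float = 0.1):
        """Split a 2-D feature map into low- and high-frequency components.

        feature: (H, W) array; cutoff: fraction of the spectrum radius
        treated as "low frequency".
        """
        h, w = feature.shape
        # Move to the frequency domain, zero frequency at the centre.
        spectrum = np.fft.fftshift(np.fft.fft2(feature))

        # Radial low-pass mask centred on the zero-frequency component.
        yy, xx = np.ogrid[:h, :w]
        dist = np.sqrt((yy - h / 2) ** 2 + (xx - w / 2) ** 2)
        low_mask = dist <= cutoff * min(h, w)

        # Low frequencies capture smooth structure; high frequencies keep
        # edges and surface texture.
        low = np.fft.ifft2(np.fft.ifftshift(spectrum * low_mask)).real
        high = np.fft.ifft2(np.fft.ifftshift(spectrum * ~low_mask)).real
        return low, high

    if __name__ == "__main__":
        feat = np.random.rand(64, 64).astype(np.float32)
        low, high = split_frequency(feat, cutoff=0.1)
        # The two components sum back to the original feature map.
        assert np.allclose(low + high, feat, atol=1e-5)

In the actual method, the low- and high-frequency branches produced by such a decomposition are what the CFRM subsequently fuses via transposed cross-attention; the split shown here is only the simplest fixed-mask analogue of that first step.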
