Article

Physics-Informed Deep Learning for 3D Wind Field Retrieval of Open-Ocean Typhoons

1 Department of Civil and Airport Engineering, Nanjing University of Aeronautics and Astronautics, Liyang 213300, China
2 Jiangsu University Key Laboratory of Dynamic Multi-Hazard Protection in Civil Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing 211106, China
3 School of Sustainable Design, University of Toyama, Toyama 930-8555, Japan
4 State Grid Electric Power Research Institute, Nari Group Corporation, Nanjing 211106, China
5 Institute of Hypergravity Science and Technology, Zhejiang University, Hangzhou 310058, China
* Author to whom correspondence should be addressed.
Remote Sens. 2025, 17(23), 3825; https://doi.org/10.3390/rs17233825
Submission received: 7 October 2025 / Revised: 22 November 2025 / Accepted: 24 November 2025 / Published: 26 November 2025
(This article belongs to the Section AI Remote Sensing)

Abstract

Accurate retrieval of three-dimensional (3D) typhoon wind fields over the open ocean remains a critical challenge due to observational gaps and physical inconsistencies in existing methods. Using multi-channel data from the Himawari-8/9 geostationary satellites, this study proposes a physics-informed deep learning framework for high-resolution 3D wind field reconstruction of open-ocean typhoons. A convolutional neural network was designed to establish an end-to-end mapping from 16-channel satellite imagery to the 3D wind field across 16 vertical levels. To enhance physical consistency, the continuity equation, which enforces mass conservation, was embedded as a strong constraint in the loss function. Four experimental scenarios were designed to evaluate the contributions of the multi-channel data and the physical constraint. Results demonstrate that the full model, integrating both the visible/infrared channels and the physical constraint, achieved the best performance, with mean absolute errors of 2.73 m/s and 2.54 m/s for the U- and V-wind components, respectively, corresponding to improvements of 29.6% (U) and 21.6% (V) over the baseline infrared-only model, with notable error reductions in high-wind regions (>20 m/s). The approach effectively captures fine-scale structures such as eyewalls and spiral rainbands while maintaining vertical physical coherence, offering a robust foundation for typhoon monitoring and reanalysis.
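
As an illustrative sketch only, not the authors' implementation, the physics-informed objective described above can be expressed as a data-fitting term plus a penalty on the continuity-equation residual. The small PyTorch example below assumes that all three wind components (U, V, W) are predicted on a regular grid; the network depth and widths, the grid spacings dx, dy, dz, and the penalty weight lambda_pde are hypothetical placeholders, not values taken from the paper.

import torch
import torch.nn.functional as F


class WindRetrievalCNN(torch.nn.Module):
    """Toy stand-in for the end-to-end mapping: 16 satellite channels in,
    three wind components on 16 vertical levels out. Layer widths and depth
    are placeholders, not the architecture reported in the paper."""

    def __init__(self, in_channels=16, levels=16, components=3):
        super().__init__()
        self.levels, self.components = levels, components
        self.net = torch.nn.Sequential(
            torch.nn.Conv2d(in_channels, 64, 3, padding=1), torch.nn.ReLU(),
            torch.nn.Conv2d(64, 64, 3, padding=1), torch.nn.ReLU(),
            torch.nn.Conv2d(64, components * levels, 1),
        )

    def forward(self, x):                      # x: (B, 16, H, W) satellite imagery
        out = self.net(x)                      # (B, components*levels, H, W)
        b, _, h, w = out.shape
        return out.view(b, self.components, self.levels, h, w)


def continuity_residual(u, v, w, dx, dy, dz):
    """Central-difference estimate of du/dx + dv/dy + dw/dz on interior points.
    u, v, w have shape (B, levels, H, W); dx, dy, dz are grid spacings in metres."""
    dudx = (u[..., :, 2:] - u[..., :, :-2]) / (2.0 * dx)     # x = last axis
    dvdy = (v[..., 2:, :] - v[..., :-2, :]) / (2.0 * dy)     # y = third axis
    dwdz = (w[:, 2:, :, :] - w[:, :-2, :, :]) / (2.0 * dz)   # z = level axis
    # Crop each term to the shared interior so the shapes align.
    return (dudx[:, 1:-1, 1:-1, :]
            + dvdy[:, 1:-1, :, 1:-1]
            + dwdz[:, :, 1:-1, 1:-1])


def physics_informed_loss(pred, target, dx, dy, dz, lambda_pde=0.1):
    """MAE data term (matching the reported metric) plus a mass-continuity penalty."""
    data_loss = F.l1_loss(pred, target)
    u, v, w = pred[:, 0], pred[:, 1], pred[:, 2]
    pde_loss = continuity_residual(u, v, w, dx, dy, dz).pow(2).mean()
    return data_loss + lambda_pde * pde_loss

Whether the vertical component is retrieved directly, the actual grid geometry, and how strictly the continuity constraint is enforced are details of the paper that this sketch does not attempt to reproduce.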
Keywords: typhoon; deep learning; satellite; 3D wind field reconstruction

Citation: Zhang, X.; Zhang, T.; Ke, S.; He, H.; Zhang, R.; Miao, Y.; Liang, T. Physics-Informed Deep Learning for 3D Wind Field Retrieval of Open-Ocean Typhoons. Remote Sens. 2025, 17, 3825. https://doi.org/10.3390/rs17233825
