Robust Synthesis Weather Radar from Satellite Imagery: A Light/Dark Classification and Dual-Path Processing Approach
Highlights
- A dual-path U-Net is proposed for weather radar synthesis.
- The network automatically removes noise from visible-light observations.
- An illumination-aware classification network is fused with the radar synthesis model.
- Automatic noise removal improves the accuracy of weather radar synthesis.
- Fusing the illumination-aware classification network yields better radar synthesis under varying lighting conditions.
Abstract
1. Introduction
2. Materials and Methods
2.1. Illumination-Induced Challenges in Visible-Light-Based Radar Synthesis
2.2. Data and Preprocessing
2.2.1. Himawari-8 Imagery
2.2.2. CREF
2.2.3. Data Processing
2.3. Model
- Single-Channel 2D Feature Map Extraction. The six visible-light channels are extracted and averaged to form a single-channel 2D grayscale image that captures the overall brightness distribution while reducing input dimensionality.
- Light/Dark Separation Network. The grayscale image is passed through a lightweight convolutional classifier (two convolution layers, residual blocks, global-average pooling, and a softmax layer) that determines whether the input sample is noise-free (daytime) or noisy (nighttime).
- Feedback and Labeling. The classification result is used as a pseudo-label and fed back to guide automatic path selection in Stage II.
- Automatic Path Selection. If the sample is classified as noise-free, the visible and infrared data are stacked and sent to the multimodal synthesis path; if it is classified as noisy, only the infrared data are sent to the infrared-only synthesis path.
- Radar Synthesis Network. Both synthesis paths share the same U-Net structure for end-to-end CREF prediction and differ only in the number and type of input channels. The final output is a single-channel CREF map that maintains high synthesis quality under varying illumination conditions.
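The two-stage flow above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the count of four infrared channels and the brightness-threshold stand-in for the light/dark separation network are assumptions (the six visible-light channels follow the text).

```python
import numpy as np

def to_grayscale(vis):
    """Stage I input: average the six visible-light channels (H, W, 6)
    into a single-channel 2D brightness map."""
    return vis.mean(axis=-1)

def select_path(is_noisy, vis, ir):
    """Stage II path selection: noisy (nighttime) samples take the
    infrared-only path; noise-free (daytime) samples stack VIS + IR
    for the multimodal path."""
    if is_noisy:
        return ir                                  # infrared-only path
    return np.concatenate([vis, ir], axis=-1)      # multimodal path

# Toy sample: one 64x64 scene with 6 VIS and (assumed) 4 IR channels.
vis = np.random.rand(64, 64, 6).astype(np.float32)
ir = np.random.rand(64, 64, 4).astype(np.float32)

gray = to_grayscale(vis)        # (64, 64) input to the classifier
# Placeholder for the light/dark separation network: here the decision
# is faked with a simple mean-brightness threshold.
is_noisy = gray.mean() < 0.2

x = select_path(is_noisy, vis, ir)
# Daytime: x has 10 channels (6 VIS + 4 IR); nighttime: 4 IR channels.
```

Either tensor `x` then feeds the shared U-Net, which differs between paths only in its input channel count.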
2.3.1. Stage I: Light/Dark Separation
- Single-Channel 2D Feature Map Extraction.
- Light/Dark Separation Network Structure.
- Feedback and Labeling of Noisy Data.
2.3.2. Stage II: Radar Synthesis Network Structure
- Automatic Path Selection Mechanism.
- Dual-Path Radar Synthesis Network Structure.
- Nighttime or noisy conditions:
- Daytime or noise-free conditions:
2.4. Model Training
2.4.1. Design of the Loss Function
2.4.2. Training of the Light-Dark Separation Module
2.4.3. Training for Radar Synthesis Stage
3. Results
3.1. Evaluation Metrics
3.2. Experimental Results
3.3. Ablation Experiments
4. Case Studies Exploring Temporal and Regional Variations in Illumination
4.1. Case Study I: Impact of Sunrise Time Across Seasons
4.2. Case Study II: Regional Variations in Illumination
4.3. Case Study III: Daytime Subset Comparison of Multimodal and Infrared-Only Models
5. Discussion and Conclusions
- (1) The illumination classification module enables the distinction between daytime and nighttime imagery, thereby mitigating noise contamination caused by visible-band degradation under low-light conditions. The results clearly show that nighttime noise significantly affects the quality of radar synthesis. Properly separating noisy nighttime visible-light data improves the accuracy of radar reconstruction, while retaining as much usable visible-light information as possible also helps enhance the synthesis quality. This demonstrates the feasibility and effectiveness of our illumination-based classification mechanism.
- (2) The multimodal input simultaneously exploits features from both infrared (IR) and visible (VIS) channels, allowing the model to capture the thermodynamic and optical structures of convective systems, which further improves the quality of synthesized radar reflectivity. The results show that radar synthesized solely from infrared channels still exhibits a certain degree of underestimation, particularly within intense convective regions. When visible-light information is incorporated, the cloud structures become more clearly defined, leading to more accurate radar synthesis in strong convective cores. According to related studies, this underestimation primarily arises from the limited penetration capability of infrared radiation, which prevents the retrieval of realistic reflectivity features from within deep convective clouds [45]. As a result, the model relies more heavily on cloud-top brightness temperature gradients while underrepresenting internal convective structures [46]. This further confirms the importance and necessity of using multimodal data in radar synthesis.
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Racah, E.; Beckham, C.; Maharaj, T.; Ebrahimi Kahou, S.; Prabhat, M.; Pal, C. ExtremeWeather: A large-scale climate dataset for semi-supervised detection, localization, and understanding of extreme weather events. In Proceedings of the 31st International Conference on Neural Information Processing Systems, Long Beach, CA, USA, 4–9 December 2017.
- Taszarek, M.; Brooks, H.E.; Czernecki, B. Sounding-derived parameters associated with convective hazards in Europe. Mon. Weather Rev. 2017, 145, 1511–1528.
- Kober, K.; Tafferner, A. Tracking and nowcasting of convective cells using remote sensing data from radar and satellite. Meteorol. Z. 2009, 1, 75–84.
- Burcea, S.; Cică, R.; Bojariu, R. Radar-derived convective storms’ climatology for the Prut River basin: 2003–2017. Nat. Hazards Earth Syst. Sci. 2019, 19, 1305–1318.
- Huang, Y.; Wang, J.; Cai, L.; Fan, Y.; Wang, H.; Zhang, T. Relationship between radar reflectivity thresholds and very low frequency/low frequency total lightning for thunderstorm identification. High Voltage 2024, 9, 1068–1080.
- Yang, L.; Zhao, Q.; Xue, Y.; Sun, F.; Li, J.; Zhen, X.; Lu, T. Radar composite reflectivity reconstruction based on FY-4A using deep learning. Sensors 2022, 23, 81.
- Liu, Z.; Min, M.; Li, J.; Sun, F.; Di, D.; Ai, Y.; Li, Z.; Qin, D.; Li, G.; Lin, Y.; et al. Local severe storm tracking and warning in pre-convection stage from the new generation geostationary weather satellite measurements. Remote Sens. 2019, 11, 383.
- Sun, L.; Zhuge, X.; Zhu, S. Geostationary Satellite-Based Overshooting Top Detections and Their Relationship to Severe Weather over Eastern China. Remote Sens. 2024, 16, 2015.
- Sun, F.; Li, B.; Min, M.; Qin, D. Deep learning-based radar composite reflectivity factor estimations from Fengyun-4A geostationary satellite observations. Remote Sens. 2021, 13, 2229.
- Enos, G.R.; Reagor, M.J.; Henderson, M.P.; Young, C.; Horton, K.; Birch, M.; Rigetti, C. Synthetic weather radar using hybrid quantum-classical machine learning. arXiv 2021, arXiv:2111.15605.
- Chen, H.; Chandrasekar, V.; Cifelli, R.; Xie, P. A machine learning system for precipitation estimation using satellite and ground radar network observations. IEEE Trans. Geosci. Remote Sens. 2019, 58, 982–994.
- Hsu, K.L.; Gao, X.; Sorooshian, S.; Gupta, H.V. Precipitation Estimation from Remotely Sensed Information Using Artificial Neural Networks. J. Appl. Meteorol. 1997, 36, 1176–1190.
- Sadeghi, M.; Asanjan, A.A.; Faridzad, M.; Nguyen, P.; Hsu, K.; Sorooshian, S.; Braithwaite, D. PERSIANN-CNN: Precipitation estimation from remotely sensed information using artificial neural networks–convolutional neural networks. J. Hydrometeorol. 2019, 20, 2273–2289.
- Wang, C.; Xu, J.; Tang, G.; Yang, Y.; Hong, Y. Infrared precipitation estimation using convolutional neural network. IEEE Trans. Geosci. Remote Sens. 2020, 58, 8612–8625.
- Kim, Y.; Hong, S. Hypothetical ground radar-like rain rate generation of geostationary weather satellite using data-to-data translation. IEEE Trans. Geosci. Remote Sens. 2023, 61, 1–14.
- Yao, J.; Du, P.; Zhao, Y.; Wang, Y. Simulating Nighttime Visible Satellite Imagery of Tropical Cyclones Using Conditional Generative Adversarial Networks. arXiv 2024, arXiv:2401.11679.
- Mills, S.; Weiss, S.; Liang, C. VIIRS day/night band (DNB) stray light characterization and correction. In Proceedings of the Earth Observing Systems XVIII, San Diego, CA, USA, 25–29 August 2013; Volume 8866, pp. 549–566.
- Min, M.; Zheng, J.; Zhang, P.; Hu, X.; Chen, L.; Li, X.; Huang, Y.; Zhu, L. A low-light radiative transfer model for satellite observations of moonlight and earth surface light at night. J. Quant. Spectrosc. Radiat. Transf. 2020, 247, 106954.
- Kim, J.H.; Ryu, S.; Jeong, J.; So, D.; Ban, H.J.; Hong, S. Impact of satellite sounding data on virtual visible imagery generation using conditional generative adversarial network. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2020, 13, 4532–4541.
- Wang, Z.; Román, M.O.; Kalb, V.L.; Miller, S.D.; Zhang, J.; Shrestha, R.M. Quantifying uncertainties in nighttime light retrievals from Suomi-NPP and NOAA-20 VIIRS Day/Night Band data. Remote Sens. Environ. 2021, 263, 112557.
- Bessho, K.; Date, K.; Hayashi, M.; Ikeda, A.; Imai, T.; Inoue, H.; Kumagai, Y.; Miyakawa, T.; Murata, H.; Ohno, T.; et al. An introduction to Himawari-8/9—Japan’s new-generation geostationary meteorological satellites. J. Meteorol. Soc. Jpn. Ser. II 2016, 94, 151–183.
- Govekar, P.; Griffin, C.; Embury, O.; Mittaz, J.; Beggs, H.M.; Merchant, C.J. Himawari-8 Sea Surface Temperature Products from the Australian Bureau of Meteorology. Remote Sens. 2024, 16, 3381.
- Sulistiyono, W.; Tuna, M.S.; Ramadhan, S.A. Pemanfaatan Data Satelit Himawari-8 Dalam Analisa Kejadian Hujan Lebat Di Jombang Tanggal 1–2 Februari 2021. Time Phys. 2024, 2, 31–40.
- Zhang, C.; Zhuge, X.; Yu, F. Development of a high spatiotemporal resolution cloud-type classification approach using Himawari-8 and CloudSat. Int. J. Remote Sens. 2019, 40, 6464–6481.
- Sun, H.; Wang, D.; Han, W.; Yang, Y. Quantifying the impact of aerosols on geostationary satellite infrared radiance simulations: A study with Himawari-8 AHI. Remote Sens. 2024, 16, 2226.
- Do, H.N.; Ngo, T.X.; Nguyen, A.H.; Nguyen, T.T.N. Precipitation Estimation from Himawari-8 Multiple Spectral Channels Using U-Net. In Proceedings of the 2023 15th International Conference on Knowledge and Systems Engineering (KSE), Ha Noi, Vietnam, 18–20 October 2023; pp. 1–6.
- Nishiyama, G.; Suzuki, Y.; Uno, S.; Aoki, S.; Iwanaka, T.; Imamura, T.; Fujii, Y.; Müller, T. Temporal Variation of Venus Brightness Temperature Seen by the Japanese Meteorological Satellite Himawari-8/9. In Proceedings of the Europlanet Science Congress, Berlin, Germany, 8–13 September 2024.
- Broomhall, M.A.; Majewski, L.J.; Villani, V.O.; Grant, I.F.; Miller, S.D. Correcting Himawari-8 Advanced Himawari Imager Data for the Production of Vivid True-Color Imagery. J. Atmos. Ocean. Technol. 2019, 36, 427–442.
- Chen, Y.; Chen, J.; Chen, D.; Xu, Z.; Sheng, J.; Chen, F. A simulated radar reflectivity calculation method in numerical weather prediction models. Weather Forecast. 2021, 36, 341–359.
- Doviak, R.J.; Zrnic, D.S. Doppler Radar & Weather Observations; Academic Press: Cambridge, MA, USA, 2014.
- Rauber, R.M.; Nesbitt, S.W. Radar Meteorology: A First Course; John Wiley & Sons: Hoboken, NJ, USA, 2018.
- Wen, Y.; Zhang, J.; Wang, D.; Peng, X.; Wang, P. A Quantitative Precipitation Estimation Method Based on 3D Radar Reflectivity Inputs. Symmetry 2024, 16, 555.
- Zhang, X.; Gao, F.; Wang, J.; Ye, Y. Evaluating a spatiotemporal shape-matching model for the generation of synthetic high spatiotemporal resolution time series of multiple satellite data. Int. J. Appl. Earth Obs. Geoinf. 2021, 104, 102545.
- Press, W.H. Numerical Recipes 3rd Edition: The Art of Scientific Computing; Cambridge University Press: Cambridge, UK, 2007.
- Han, J.; Kamber, M. Data Mining: Concepts and Techniques, 2nd ed.; Morgan Kaufmann: San Francisco, CA, USA, 2006.
- Rumelhart, D.E.; Hinton, G.E.; Williams, R.J. Learning representations by back-propagating errors. Nature 1986, 323, 533–536.
- Shi, X.; Gao, Z.; Lausen, L.; Wang, H.; Yeung, D.Y.; Wong, W.K.; Woo, W.C. Deep learning for precipitation nowcasting: A benchmark and a new model. In Proceedings of the 31st International Conference on Neural Information Processing Systems, Long Beach, CA, USA, 4–9 December 2017.
- Zhang, W.; Chen, H.; Han, L.; Zhang, R.; Ge, Y. Pixel-CRN: A new machine learning approach for convective storm nowcasting. IEEE Trans. Geosci. Remote Sens. 2023, 61, 1–12.
- Jähne, B. Digital Image Processing; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2005.
- Wang, W.; Zhong, X.; Su, Z.; Li, D.; Guo, Z. Signal-to-Noise Ratio Evaluation of Luojia 1-01 Satellite Nighttime Light Remote Sensing Camera Based on Time Sequence Images. Preprints 2019, 2019010088.
- Kingma, D.P.; Ba, J. Adam: A method for stochastic optimization. arXiv 2014, arXiv:1412.6980.
- Loshchilov, I.; Hutter, F. SGDR: Stochastic gradient descent with warm restarts. arXiv 2016, arXiv:1608.03983.
- Alzubaidi, L.; Bai, J.; Al-Sabaawi, A.; Santamaría, J.; Albahri, A.S.; Al-dabbagh, B.S.N.; Fadhel, M.A.; Manoufali, M.; Zhang, J.; Al-Timemy, A.H.; et al. A survey on deep learning tools dealing with data scarcity: Definitions, challenges, solutions, tips, and applications. J. Big Data 2023, 10, 46.
- Veenhuis, B.A.; Brill, K.F. On the Emergence of Frequency Bias from Accumulating or Disaggregating Bias-Corrected Quantitative Precipitation Forecasts. Weather Forecast. 2022, 37, 511–524.
- Kotarba, A.Z.; Wojciechowska, I. Satellite-based detection of deep-convective clouds: The sensitivity of infrared methods and implications for cloud climatology. Atmos. Meas. Tech. 2025, 18, 2721–2738.
- Kumar, P.A.; Anuradha, B.; Siddaiah, N. Comparison of convective clouds extraction based on satellite and RADAR data. J. Adv. Res. Dyn. Control. Syst. 2017, 9, 1715–1724.
- Sha, Y.; Sobash, R.A.; Gagne, D.J. Generative ensemble deep learning severe weather prediction from a deterministic convection-allowing model. Artif. Intell. Earth Syst. 2024, 3, e230094.
| Threshold (dBZ) | CSI_6 | CSI_12 | Avg_CSI |
|---|---|---|---|
| 10 | 0.5913 | 0.5959 | 0.5936 |
| 20 | 0.6410 | 0.7191 | 0.6801 |
| 30 | 0.6426 | 0.6816 | 0.6621 |
| 35 | 0.6013 | 0.6271 | 0.6115 |
| 40 | 0.3420 | 0.3372 | 0.3396 |
| Threshold (dBZ) | POD_6 | POD_12 | Avg_POD |
|---|---|---|---|
| 10 | 0.8384 | 0.6906 | 0.7645 |
| 20 | 0.8289 | 0.8869 | 0.8579 |
| 30 | 0.8300 | 0.8788 | 0.8544 |
| 35 | 0.7742 | 0.7976 | 0.7859 |
| 40 | 0.4412 | 0.4238 | 0.4325 |
| Threshold (dBZ) | FAR_6 | FAR_12 | Avg_FAR |
|---|---|---|---|
| 10 | 0.3358 | 0.1824 | 0.2591 |
| 20 | 0.2635 | 0.2071 | 0.2353 |
| 30 | 0.2599 | 0.2434 | 0.2516 |
| 35 | 0.2668 | 0.2458 | 0.2563 |
| 40 | 0.3406 | 0.3526 | 0.3466 |
| Threshold (dBZ) | FBias_6 | FBias_12 | Avg_FBias |
|---|---|---|---|
| 10 | 1.321 | 1.176 | 1.249 |
| 20 | 1.056 | 1.024 | 1.040 |
| 30 | 0.897 | 0.910 | 0.904 |
| 35 | 0.872 | 0.883 | 0.878 |
| 40 | 0.501 | 0.592 | 0.547 |
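The tables above report standard contingency-table verification scores. A minimal sketch of how CSI, POD, FAR, and frequency bias (FBias) are computed pixel-wise at a given dBZ threshold; the `>=` exceedance convention and the absence of zero-denominator handling are simplifying assumptions.

```python
import numpy as np

def categorical_scores(pred, obs, threshold):
    """Pixel-wise contingency-table scores at one dBZ threshold.
    hits: both exceed; misses: only observed exceeds;
    false alarms: only predicted exceeds."""
    p = pred >= threshold
    o = obs >= threshold
    hits = np.sum(p & o)
    misses = np.sum(~p & o)
    false_alarms = np.sum(p & ~o)
    csi = hits / (hits + misses + false_alarms)   # critical success index
    pod = hits / (hits + misses)                  # probability of detection
    far = false_alarms / (hits + false_alarms)    # false alarm ratio
    fbias = (hits + false_alarms) / (hits + misses)  # frequency bias
    return csi, pod, far, fbias

# Toy 2x2 reflectivity fields (dBZ) evaluated at the 20 dBZ threshold.
pred = np.array([[35.0, 10.0], [5.0, 45.0]])
obs = np.array([[30.0, 5.0], [25.0, 45.0]])
csi, pod, far, fbias = categorical_scores(pred, obs, 20)
```

FBias above 1 indicates over-forecasting of threshold exceedance, below 1 under-forecasting, which is why the values near 0.5 at 40 dBZ in the table signal underestimation of intense echoes.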
| Threshold (dBZ) | Model | CSI | POD | FAR | FBias |
|---|---|---|---|---|---|
| 10 | Our Model | 0.5936 | 0.7645 | 0.2591 | 1.249 |
| 10 | U-Net | 0.6030 | 0.7443 | 0.2362 | 1.321 |
| 20 | Our Model | 0.6800 | 0.8579 | 0.2353 | 1.040 |
| 20 | U-Net | 0.6389 | 0.8321 | 0.2671 | 1.152 |
| 30 | Our Model | 0.6621 | 0.8544 | 0.2516 | 0.904 |
| 30 | U-Net | 0.5795 | 0.8140 | 0.3219 | 0.744 |
| 35 | Our Model | 0.6142 | 0.7859 | 0.2563 | 0.878 |
| 35 | U-Net | 0.5191 | 0.7111 | 0.3176 | 0.701 |
| 40 | Our Model | 0.3509 | 0.4455 | 0.3540 | 0.547 |
| 40 | U-Net | 0.1898 | 0.2251 | 0.3386 | 0.434 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Zhang, W.; Ma, H.; Gan, Y.; Dong, J.; Pang, R.; Song, X.; Liu, C.; Liu, H. Robust Synthesis Weather Radar from Satellite Imagery: A Light/Dark Classification and Dual-Path Processing Approach. Remote Sens. 2025, 17, 3609. https://doi.org/10.3390/rs17213609

