Applying Machine Learning in Numerical Weather and Climate Modeling Systems
Abstract
1. Introduction
"Everything we think we know about the world is a model. Our models do have a strong congruence with the world. Our models fall far short of representing the real world fully."
Donella H. Meadows [1]
- The vast amounts of observational data available from satellites, in situ scientific measurements, and, in the future, Internet-of-Things devices are growing at a tremendous rate. Even now, only a small percentage of the available data is used in modern data assimilation systems (DASs). The problems with assimilating new data in DASs range from the growing time consumption (as data volumes increase) under limited computational resources to the need for new approaches for assimilating new types of data [4,5];
- The increasing requirements for accuracy and forecast horizon in numerical weather/climate modeling systems lead to growing model complexity, driven by higher horizontal and vertical resolutions and the correspondingly more complex model physics. Thus, global and regional modeling activities consume a tremendous amount of computing resources, which remains a significant challenge despite growing computing capabilities. Model ensemble systems already face a computational-resource problem that limits the resolution and/or the number of ensemble members in these systems [5];
- Model physics is the most computationally demanding part of numerical weather/climate modeling systems. As model resolutions increase, many subgrid physical processes that are currently parameterized become resolved and should be treated accordingly. However, the nature of these processes is not always understood well enough to describe them from first principles. With increasing model resolution, the scales of the subgrid processes that still require parameterization become ever smaller, and their parameterizations often become more time-consuming and sometimes less accurate because the underlying physical principles may not be fully understood [4,5];
- Current numerical weather and climate modeling systems (NWCMSs) produce forecasts of steadily improving accuracy. A major part of these improvements is due to the increase in supercomputing power, which has enabled higher model resolution, better descriptions of physics/chemistry/biology, and more comprehensive data assimilation [5]. Yet, the “demise of the ‘laws’ of Dennard and Moore” [6,7] indicates that this progress is unlikely to continue at the same pace because of the required growth in computing power. Moore’s law drove the economics of computing by stating that every 18 months the number of transistors on a chip would double at approximately equal cost; however, the cost per transistor has started to grow with the latest chip generations, indicating an end to this law. Because of these limitations, results produced by NWCMSs still contain errors of various origins, which makes post-processing (PP) correction of model output errors even more important [8]. Post-processing systems currently used in operational numerical weather prediction (NWP), such as Model Output Statistics (MOS) [9], are based on linear techniques (linear regressions). However, because optimal corrections of model outputs are nonlinear, correcting the biases of even regional fields requires introducing many millions of linear regressions in MOS [10,11], making such systems cumbersome and resource-consuming (a minimal sketch contrasting the two approaches follows this list).
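As referenced above, the sketch below contrasts a MOS-style linear regression with a small NN correcting the same forecast error. It is illustrative only: the synthetic error structure, the predictor choices, and all variable names are assumptions made for this example, not an operational MOS or PP configuration.

```python
# Illustrative sketch (not an operational MOS configuration): a MOS-style
# linear regression vs. a small neural network correcting the same synthetic,
# nonlinearly structured forecast error. All variable names and the "error"
# function are assumptions made for this example only.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
n_samples = 20_000

# Predictors: raw model 2 m temperature plus two auxiliary model fields.
t2m_model = rng.uniform(-20.0, 35.0, n_samples)
wind = rng.uniform(0.0, 20.0, n_samples)
humidity = rng.uniform(0.1, 1.0, n_samples)
X = np.column_stack([t2m_model, wind, humidity])

# Synthetic forecast error (observation minus forecast) with saturating and
# interaction terms that a single linear regression cannot fully remove.
error = (-2.0 * np.tanh(0.2 * t2m_model) + 0.15 * wind * humidity
         + rng.normal(0.0, 0.5, n_samples))

X_tr, X_te, e_tr, e_te = train_test_split(X, error, random_state=0)

linear_mos = LinearRegression().fit(X_tr, e_tr)          # MOS-style correction
nn_corrector = make_pipeline(                            # nonlinear ML correction
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(32, 32), activation="tanh",
                 max_iter=2000, random_state=0),
).fit(X_tr, e_tr)

print(f"raw forecast RMSE:          {np.sqrt(np.mean(e_te ** 2)):.3f} K")
for name, model in [("linear (MOS-style) PP", linear_mos),
                    ("nonlinear (NN) PP    ", nn_corrector)]:
    residual = e_te - model.predict(X_te)
    print(f"{name} RMSE: {np.sqrt(np.mean(residual ** 2)):.3f} K")
```

The point of the sketch is only that a single nonlinear corrector can remove error structure (saturation and interaction effects) that a single linear regression cannot, which is one reason operational MOS-type systems require very large numbers of separate regressions.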
2. ML for NWCMSs Background
2.1. ML Tools
5. Many NWCMS applications, from a mathematical point of view, may be considered as a mapping M, that is, a relationship between two vectors or two sets of parameters, X and Y, as follows:

Y = M(X),  X ∈ ℝⁿ, Y ∈ ℝᵐ,

where n and m are the dimensionalities of the input vector X and the output vector Y.
6. ML provides an all-purpose nonlinear fitting capability. Neural networks (NNs), the major ML tool used in such applications, are “universal approximators” [26] for complex multidimensional nonlinear mappings [27,28,29,30,31]. Such tools can be used, and have already been used, to develop a large variety of applications for NWCMSs; a minimal emulation sketch is given after this list.
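As a concrete, hedged illustration of the mapping view above, the sketch below trains a small feed-forward NN to emulate a known nonlinear mapping Y = M(X) from input–output pairs alone. The mapping M, the network size, and the sample counts are synthetic assumptions chosen for this example; they are not taken from any NWCMS component.

```python
# Minimal sketch: a feed-forward NN as a "universal approximator" emulating a
# known nonlinear mapping Y = M(X), X in R^n, Y in R^m. The mapping, network
# size, and sample counts are illustrative assumptions for this example only.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

def mapping_M(X: np.ndarray) -> np.ndarray:
    """Synthetic stand-in for an NWCMS mapping (e.g., a parameterization)."""
    y1 = np.sin(np.pi * X[:, 0]) * np.exp(-X[:, 1] ** 2)
    y2 = np.tanh(X[:, 2] + 0.5 * X[:, 3] ** 2)
    return np.column_stack([y1, y2])

rng = np.random.default_rng(0)
n_in, n_samples = 4, 20_000
X = rng.uniform(-1.0, 1.0, size=(n_samples, n_in))
Y = mapping_M(X)

X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, random_state=0)

# One hidden layer of tanh neurons: the classic universal-approximator setup.
emulator = MLPRegressor(hidden_layer_sizes=(64,), activation="tanh",
                        max_iter=3000, random_state=0)
emulator.fit(X_tr, Y_tr)

rmse = np.sqrt(np.mean((emulator.predict(X_te) - Y_te) ** 2))
print(f"Emulation RMSE on held-out inputs: {rmse:.4f}")
```

In an actual NWCMS application, X and Y would be the inputs and outputs of the component being emulated (e.g., a radiation parameterization or an observation operator), and the training pairs would come from runs of the original scheme.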
2.2. ML for NWCMS Specifics
3. Climate and Weather Systems
3.1. Systems and Subsystems
3.2. ML for NWCMS and Its Subsystems
4. Hybridization of ML with Traditional Numerical Modeling
4.1. ML for Data Assimilation
4.1.1. Fast ML Forward Models for Direct Assimilation of Satellite Measurements
4.1.2. Fast ML Observation Operators
4.1.3. Fast ML Models and Adjoints
4.1.4. Data Pre-Processing and Quality Control
4.2. ML for Model Physics
4.2.1. Fast ML Radiation
4.2.2. Fast and Better ML Microphysics
4.2.3. New ML Parameterizations
4.2.4. ML Full Physics
4.2.5. ML Weather and Climate Models
4.2.6. ML Stochastic Physics
4.2.7. ML Model Chemistry
4.3. ML for Post-Processing
5. Conclusions
Funding
Data Availability Statement
Conflicts of Interest
References
- Meadows, D.H. Thinking in Systems: A Primer; Chelsea Green Publishing Co.: Chelsea, VT, USA, 2008. [Google Scholar]
- Uccellini, L.W.; Spinrad, R.W.; Koch, D.M.; McLean, C.N.; Lapenta, W.M. EPIC as a Catalyst for NOAA’s Future Earth Prediction System. BAMS 2022, 103, E2246–E2264. [Google Scholar] [CrossRef]
- Zhu, Y.; Fu, B.; Guan, H.; Sinsky, E.; Yang, B. The Development of UFS Coupled GEFS for Subseasonal and Seasonal Forecasts. In Proceedings of the NOAA’s 47th Climate Diagnostics and Prediction Workshop Special Issue, Logan, UT, USA, 25–27 October 2022; NOAA: Washington, DC, USA, 2022; pp. 79–88. [Google Scholar] [CrossRef]
- Boukabara, S.-A.; Krasnopolsky, V.; Stewart, J.Q.; Maddy, E.S.; Shahroudi, N.; Hoffman, R.N. Leveraging modern artificial intelligence for remote sensing and NWP: Benefits and challenges. BAMS 2019, 100, ES473–ES491. [Google Scholar] [CrossRef]
- Bauer, P.; Thorpe, A.; Brunet, G. The quiet revolution of numerical weather prediction. Nature 2015, 525, 47–55. [Google Scholar] [CrossRef] [PubMed]
- Bauer, P.; Dueben, P.D.; Hoefler, T.; Quintino, T.; Schulthess, T.C.; Wedi, N.P. The digital revolution of Earth-system science. Nat. Comput. Sci. 2021, 1, 104–113. [Google Scholar] [CrossRef] [PubMed]
- Khan, H.N.; Hounshell, D.A.; Fuchs, E.R.H. Science and research policy at the end of Moore’s law. Nat. Electron. 2018, 1, 14–21. [Google Scholar] [CrossRef]
- Haupt, S.E.; Chapman, W.; Adams, S.V.; Kirkwood, C.; Hosking, J.S.; Robinson, N.H.; Lerch, S.; Subramanian, A.C. Towards implementing artificial intelligence post-processing in weather and climate: Proposed actions from the Oxford 2019 workshop. Philos. Trans. R. Soc. A 2021, 379, 20200091. [Google Scholar] [CrossRef]
- Carter, G.M.; Dallavalle, J.P.; Glahn, H.R. Statistical forecasts based on the National Meteorological Center’s numerical weather prediction system. Weather Forecast. 1989, 4, 401–412. [Google Scholar] [CrossRef]
- Gneiting, T.; Raftery, A.E.; Westveld, A.H.; Goldman, T. Calibrated probabilistic forecasting using ensemble model output statistics and minimum CRPS estimation. Mon. Weather Rev. 2005, 133, 1098–1118. [Google Scholar] [CrossRef]
- Wilks, D.S.; Hamill, T.M. Comparison of ensemble-MOS methods using GFS reforecasts. Mon. Weather Rev. 2007, 135, 2379–2390. [Google Scholar] [CrossRef]
- Christensen, H.; Zanna, L. Parametrization in Weather and Climate Models. Oxf. Res. Encycl. 2022. [Google Scholar] [CrossRef]
- Dueben, P.; Bauer, P.; Adams, S. Deep Learning to Improve Weather Predictions. In Deep Learning for the Earth Sciences: A Comprehensive Approach to Remote Sensing, Climate Science, and Geosciences; Camps-Valls, G., Tuia, D., Zhu, X.X., Reichstein, M., Eds.; John Wiley & Sons Ltd.: Hoboken, NJ, USA, 2021. [Google Scholar] [CrossRef]
- Hsieh, W.W. Introduction to Environmental Data Science; Cambridge University Press: Cambridge, UK, 2023; ISBN 978-1-107-06555-0. [Google Scholar]
- Krasnopolsky, V. The Application of Neural Networks in the Earth System Sciences. Neural Network Emulations for Complex Multidimensional Mappings; Atmospheric and Oceanic Science Library; Springer: Berlin/Heidelberg, Germany, 2013; Volume 46. [Google Scholar] [CrossRef]
- Dueben, P.D.; Bauer, P. Challenges and design choices for global weather and climate models based on machine learning. Geosci. Model Dev. 2018, 11, 3999–4009. [Google Scholar] [CrossRef]
- Bishop, C.M. Pattern Recognition and Machine Learning; Springer: New York, NY, USA, 2006; ISBN 0-387-31073-8. [Google Scholar]
- Cherkassky, V.; Muller, F. Learning from Data: Concepts, Theory, and Methods; Wiley: Hoboken, NJ, USA, 1998. [Google Scholar]
- Chevallier, F.; Morcrette, J.-J.; Chéruy, F.; Scott, N.A. Use of a neural-network-based longwave radiative transfer scheme in the ECMWF atmospheric model. Q. J. R. Meteorol. Soc. 2000, 126, 761–776. [Google Scholar]
- Krasnopolsky, V.M.; Fox-Rabinovitz, M.S.; Chalikov, D.V. New Approach to Calculation of Atmospheric Model Physics: Accurate and Fast Neural Network Emulation of Long Wave Radiation in a Climate Model. Mon. Weather Rev. 2005, 133, 1370–1383. [Google Scholar] [CrossRef]
- Rasp, S.; Pritchard, M.S.; Gentine, P. Deep learning to represent subgrid processes in climate models. Proc. Natl. Acad. Sci. USA 2018, 115, 9684–9689. [Google Scholar] [CrossRef] [PubMed]
- Brenowitz, N.D.; Perkins, W.A.; Nugent, J.M.; Wyatt-Meyer, O.; Clark, S.K.; Kwa, A.; Henn, B.; McGibbon, J.; Bretherton, C.S. Emulating Fast Processes in Climate Models. arXiv 2022, arXiv:2211.10774. [Google Scholar]
- Geer, A.J. Learning Earth System Models from Observations: Machine Learning or Data Assimilation? Philos. Trans. R. Soc. A 2021, 379, 20200089. [Google Scholar] [CrossRef]
- Belochitski, A.; Binev, P.; DeVore, R.; Fox-Rabinovitz, M.; Krasnopolsky, V.; Lamby, P. Tree Approximation of the Long Wave Radiation Parameterization in the NCAR CAM Global Climate Model. J. Comput. Appl. Math. 2011, 236, 447–460. [Google Scholar] [CrossRef]
- O’Gorman, P.A.; Dwyer, J.G. Using machine learning to parameterize moist convection: Potential for modeling of climate, climate change and extreme events. J. Adv. Model. Earth Syst. 2018, 10, 2548–2563. [Google Scholar] [CrossRef]
- Hornik, K. Approximation Capabilities of Multilayer Feedforward Network. Neural Netw. 1991, 4, 251–257. [Google Scholar] [CrossRef]
- Vapnik, V.N. The Nature of Statistical Learning Theory; Springer: New York, NY, USA, 1995; p. 189. [Google Scholar]
- Vapnik, V.N.; Kotz, S. Estimation of Dependences Based on Empirical Data (Information Science and Statistics); Springer: New York, NY, USA, 2006. [Google Scholar]
- Vapnik, V.N. Complete Statistical Theory of Learning. Autom. Remote Control. 2019, 80, 1949–1975. [Google Scholar] [CrossRef]
- Cybenko, G. Approximation by superposition of sigmoidal functions. Math. Control Signals Syst. 1989, 2, 303–314. [Google Scholar] [CrossRef]
- Funahashi, K. On the approximate realization of continuous mappings by neural networks. Neural Netw. 1989, 2, 183–192. [Google Scholar] [CrossRef]
- Chen, T.; Chen, H. Universal approximation to nonlinear operators by neural networks with arbitrary activation functions and its application to dynamical systems. IEEE Trans. Neural Netw. 1995, 6, 911–917. [Google Scholar] [CrossRef]
- Attali, J.-G.; Pagès, G. Approximations of functions by a multilayer perceptron: A new approach. Neural Netw. 1997, 6, 1069–1081. [Google Scholar] [CrossRef]
- Breiman, L. Random forests. Mach. Learn. 2001, 45, 5–32. [Google Scholar] [CrossRef]
- Rudin, C.; Chen, C.; Chen, Z.; Huang, H.; Semenova, L.; Zhong, C. Interpretable machine learning: Fundamental principles and 10 grand challenges. Statist. Surv. 2022, 16, 1–85. [Google Scholar] [CrossRef]
- Ahl, V.; Allen, T.F.H. Hierarchy Theory: A Vision, Vocabulary, and Epistemology; Columbia University Press: New York, NY, USA, 1996. [Google Scholar]
- Salthe, S.N. Evolving Hierarchical Systems Their Structure and Representation; Columbia University Press: New York, NY, USA, 1985. [Google Scholar]
- Krasnopolsky, V.; Belochitski, A. Using Machine Learning for Model Physics: An Overview. arXiv 2022, arXiv:2002.00416. [Google Scholar]
- Campos, R.M.; Krasnopolsky, V.; Alves, J.-H.; Penny, S.G. Improving NCEP’s global-scale wave ensemble averages using neural networks. Ocean Model. 2022, 149, 101617. [Google Scholar] [CrossRef]
- Düben, P.; Modigliani, U.; Pappenberger, F.; Bauer, P.; Wedi, N.; Baousis, V. Machine Learning at ECMWF: A Roadmap for the Next 10 Years. ECMWF Tech. Memo. 2021, 878. Available online: https://www.ecmwf.int/sites/default/files/elibrary/2021/19877-machine-learning-ecmwf-roadmap-next-10-years.pdf (accessed on 20 March 2024).
- Krasnopolsky, V.M.; Fox-Rabinovitz, M.S. Complex Hybrid Models Combining Deterministic and Machine Learning Components for Numerical Climate Modeling and Weather Prediction. Neural Netw. 2006, 19, 122–134. [Google Scholar] [CrossRef]
- Brenowitz, N.D.; Bretherton, C.S. Prognostic Validation of a Neural Network Unified Physics Parameterization. Geophys. Res. Lett. 2018, 45, 6289–6298. [Google Scholar] [CrossRef]
- Dong, R.; Leng, H.; Zhao, J.; Song, J.; Liang, S. A Framework for Four-Dimensional Variational Data Assimilation Based on Machine Learning. Entropy 2022, 24, 264. [Google Scholar] [CrossRef] [PubMed]
- Bauer, P.; Geer, A.J.; Lopez, P.; Salmond, D. Direct 4D-Var assimilation of all-sky radiances. Part I: Implementation. Q. J. R. Meteorol. Soc. 2010, 136, 1868–1885. [Google Scholar] [CrossRef]
- Mellor, G.L.; Ezer, T. A Gulf Stream model and an altimetry assimilation scheme. JGR 1991, 96C, 8779–8795. [Google Scholar] [CrossRef]
- Guinehut, S.; Le Traon, P.Y.; Larnicol, G.; Philipps, S. Combining Argo and remote-sensing data to estimate the ocean three-dimensional temperature fields—A first approach based on simulated observations. J. Mar. Syst. 2004, 46, 85–98. [Google Scholar] [CrossRef]
- Krasnopolsky, V.; Nadiga, S.; Mehra, A.; Bayler, E. Adjusting Neural Network to a Particular Problem: Neural Network-based Empirical Biological Model for Chlorophyll Concentration in the Upper Ocean. Appl. Comput. Intell. Soft Comput. 2018, 2018, 7057363. [Google Scholar] [CrossRef]
- Cheng, S.; Chen, J.; Anastasiou, C. Generalized Latent Assimilation in Heterogeneous Reduced Spaces with Machine Learning Surrogate Models. J. Sci. Comput. 2023, 94, 11. [Google Scholar] [CrossRef]
- Hatfield, S.; Chantry, M.; Dueben, P.; Lopez, P.; Geer, A.; Palmer, T. Building Tangent-Linear and Adjoint Models for Data Assimilation with Neural Networks. J. Adv. Model. Earth Syst. 2021, 13, e2021MS002521. [Google Scholar] [CrossRef]
- Maulik, R.; Rao, V.; Wang, J.; Mengaldo, G.; Constantinescu, E.; Lusch, B.; Balaprakash, P.; Foster, I.; Kotamarthi, R. Efficient high-dimensional variational data assimilation with machine-learned reduced-order models. Geosci. Model Dev. 2022, 15, 3433–3445. [Google Scholar] [CrossRef]
- Krasnopolsky, V.M. Reducing Uncertainties in Neural Network Jacobians and Improving Accuracy of Neural Network Emulations with NN Ensemble Approaches. Neural Netw. 2007, 20, 454–461. [Google Scholar] [CrossRef]
- Geer, A.J.; Lonitz, K.; Weston, P.; Kazumori, M.; Zhu, Y.; Liu, E.H.; Collard, A.; Bell, W.; Migliorini, S.; Chambon, P.; et al. All-sky satellite data assimilation at operational weather forecasting centres. Quart. J. Roy. Meteor. Soc. 2017, 144, 1191–1217. [Google Scholar] [CrossRef]
- Krasnopolsky, V.M.; Fox-Rabinovitz, M.S.; Belochitski, A. Decadal climate simulations using accurate and fast neural network emulation of full, long- and short-wave, radiation. Mon. Weather Rev. 2008, 136, 3683–3695. [Google Scholar] [CrossRef]
- Krasnopolsky, V.M.; Fox-Rabinovitz, M.S.; Hou, Y.T.; Lord, S.J.; Belochitski, A.A. Accurate and Fast Neural Network Emulations of Model Radiation for the NCEP Coupled Climate Forecast System: Climate Simulations and Seasonal Predictions. Mon. Weather Rev. 2010, 138, 1822–1842. [Google Scholar] [CrossRef]
- Krasnopolsky, V.; Belochitski, A.; Hou, Y.-T.; Lord, S.; Yang, F. Accurate and Fast Neural Network Emulations of Long and Short Wave Radiation for the NCEP Global Forecast System Model. NCEP Off. Note 2012, 471. Available online: http://www.lib.ncep.noaa.gov/ncepofficenotes/files/on471.pdf (accessed on 20 March 2024).
- Pal, A.; Mahajan, S.; Norman, M.R. Using deep neural networks as cost-effective surrogate models for Super-Parameterized E3SM radiative transfer. Geophys. Res. Lett. 2019, 46, 6069–6079. [Google Scholar] [CrossRef]
- Roh, S.; Song, H.-J. Evaluation of neural network emulations for radiation parameterization in cloud resolving model. Geophys. Res. Lett. 2020, 47, e2020GL089444. [Google Scholar] [CrossRef]
- Ukkonen, P.; Pincus, R.; Hogan, R.J.; Nielsen, K.P.; Kaas, E. Accelerating radiation computations for dynamical models with targeted machine learning and code optimization. J. Adv. Model. Earth Syst. 2020, 12, e2020MS002226. [Google Scholar] [CrossRef]
- Lagerquist, R.; Turner, D.; Ebert-Uphoff, I.; Stewart, J.; Hagerty, V. Using deep learning to emulate and accelerate a radiative-transfer model. J. Atmos. Ocean. Technol. 2021, 38, 1673–1696. [Google Scholar] [CrossRef]
- Belochitski, A.; Krasnopolsky, V. Robustness of neural network emulations of radiative transfer parameterizations in a state-of-the-art general circulation model. Geosci. Model Dev. 2021, 14, 7425–7437. [Google Scholar] [CrossRef]
- Khain, A.; Rosenfeld, D.; Pokrovsky, A. Aerosol impact on the dynamics and microphysics of deep convective clouds. Q. J. R. Meteorol. Soc. 2005, 131, 2639–2663. [Google Scholar] [CrossRef]
- Thompson, G.; Field, P.R.; Rasmussen, R.M.; Hall, W.D. Explicit Forecasts of Winter Precipitation Using an Improved Bulk Microphysics Scheme. Part II: Implementation of a New Snow Parameterization. Mon. Weather Rev. 2008, 136, 5095–5115. [Google Scholar] [CrossRef]
- Krasnopolsky, V.; Middlecoff, J.; Beck, J.; Geresdi, I.; Toth, Z. A Neural Network Emulator for Microphysics Schemes. In Proceedings of the AMS Annual Meeting, Seattle, WA, USA, 22–26 January 2017; Available online: https://ams.confex.com/ams/97Annual/webprogram/Paper310969.html (accessed on 20 March 2024).
- Jensen, A.; Weeks, C.; Xu, M.; Landolt, S.; Korolev, A.; Wolde, M.; DiVito, S. The prediction of supercooled large drops by a microphysics and a machine-learning model for the ICICLE field campaign. Weather Forecast. 2023, 38, 1107–1124. [Google Scholar] [CrossRef]
- Krasnopolsky, V.M.; Fox-Rabinovitz, M.S.; Belochitski, A.A. Using Ensemble of Neural Networks to Learn Stochastic Convection Parameterization for Climate and Numerical Weather Prediction Models from Data Simulated by Cloud Resolving Model. Adv. Artif. Neural Syst. 2013, 2013, 485913. [Google Scholar] [CrossRef]
- Schneider, T.; Lan, S.; Stuart, A.; Teixeira, J. Earth System Modeling 2.0: A Blueprint for Models That Learn from Observations and Targeted High-Resolution Simulations. Geophys. Res. Lett. 2017, 44, 396–417. [Google Scholar] [CrossRef]
- Gentine, P.; Pritchard, M.; Rasp, S.; Reinaudi, G.; Yacalis, G. Could Machine Learning Break the Convection Parameterization Deadlock? Geophys. Res. Lett. 2018, 45, 5742–5751. [Google Scholar] [CrossRef]
- Pal, A. Deep Learning Emulation of Subgrid-Scale Processes in Turbulent Shear Flows. Geophys. Res. Lett. 2020, 47, e2020GL087005. [Google Scholar] [CrossRef]
- Wang, L.-Y.; Tan, Z.-M. Deep Learning Parameterization of the Tropical Cyclone Boundary Layer. J. Adv. Model. Earth Syst. 2023, 15, e2022MS003034. [Google Scholar] [CrossRef]
- Krasnopolsky, V.M.; Lord, S.J.; Moorthi, S.; Spindler, T. How to Deal with Inhomogeneous Outputs and High Dimensionality of Neural Network Emulations of Model Physics in Numerical Climate and Weather Prediction Models. In Proceedings of the International Joint Conference on Neural Networks, Atlanta, GA, USA, 14–19 June 2009; pp. 1668–1673. Available online: https://ieeexplore.ieee.org/document/5178898 (accessed on 20 March 2024).
- Wang, X.; Han, Y.; Xue, W.; Yang, G.; Zhang, G.J. Stable climate simulations using a realistic general circulation model with neural network parameterizations for atmospheric moist physics and radiation processes. Geosci. Model Dev. 2022, 15, 3923–3940. [Google Scholar] [CrossRef]
- Scher, S. Toward data-driven weather and climate forecasting: Approximating a simple general circulation model with deep learning. Geophys. Res. Lett. 2018, 45, 616–622. [Google Scholar] [CrossRef]
- Scher, S.; Messori, G. Weather and climate forecasting with neural networks: Using GCMs with different complexity as study-ground. Geosci. Model Dev. Discuss. 2019, 12, 2797–2809. [Google Scholar] [CrossRef]
- Yik, W.; Silva, S.J.; Geiss, A.; Watson-Parris, D. Exploring Randomly Wired Neural Networks for Climate Model Emulation. arXiv 2022, arXiv:2212.03369v2. Available online: https://arxiv.org/pdf/2212.03369.pdf (accessed on 20 March 2024). [CrossRef]
- Pawar, S.; San, O. Equation-Free Surrogate Modeling of Geophysical Flows at the Intersection of Machine Learning and Data Assimilation. J. Adv. Model. Earth Syst. 2022, 14, e2022MS003170. [Google Scholar] [CrossRef]
- Schultz, M.G.; Betancourt, C.; Gong, B.; Kleinert, F.; Langguth, M.; Leufen, L.H.; Mozaffari, A.; Stadtler, S. Can deep learning beat numerical weather prediction? Philos. Trans. R. Soc. A 2021, 379, 20200097. [Google Scholar] [CrossRef]
- Wang, J.; Tabas, S.; Cui, L.; Du, J.; Fu, B.; Yang, F.; Levit, J.; Stajner, I.; Carley, J.; Tallapragada, V.; et al. Machine learning weather prediction model development for global ensemble forecasts at EMC. In Proceedings of the EGU24 AS5.5—Machine Learning and Other Novel Techniques in Atmospheric and Environmental Science: Application and Development, Vienna, Austria, 18 April 2024. [Google Scholar]
- Bi, K.; Xie, L.; Zhang, H.; Chen, X.; Gu, X.; Tian, Q. Accurate medium-range global weather forecasting with 3D neural networks. Nature 2023, 619, 533–538. [Google Scholar] [CrossRef] [PubMed]
- Kochkov, D.; Yuval, J.; Langmore, I.; Norgaard, P.; Smith, J.; Mooers, G.; Klower, M.; Lottes, J.; Rasp, S.; Duben, P.; et al. Neural general circulation models. arXiv 2023, arXiv:2311.07222. [Google Scholar]
- Xu, W.; Balaguru, K.; August, A.; Lalo, N.; Hodas, N.; DeMaria, M.; Judi, D. Deep Learning Experiments for Tropical Cyclone Intensity Forecasts. Weather. Forecast. 2021, 36, 1453–1470. [Google Scholar] [CrossRef]
- Krasnopolsky, V.M.; Fox-Rabinovitz, M.S.; Belochitski, A. Using neural network emulations of model physics in numerical model ensembles. In Proceedings of the 2008 IEEE International Joint Conference on Neural Networks (IEEE World Congress on Computational Intelligence), Hong Kong, China, 1–8 June 2008. [Google Scholar] [CrossRef]
- Kelp, M.M.; Tessum, C.W.; Marshall, J.D. Orders-of-magnitude speedup in atmospheric chemistry modeling through neural network-based emulation. arXiv 2018, arXiv:1808.03874. [Google Scholar]
- Schreck, J.S.; Becker, C.; Gagne, D.J.; Lawrence, K.; Wang, S.; Mouchel-Vallon, C.; Choi, J.; Hodzic, A. Neural network emulation of the formation of organic aerosols based on the explicit GECKO-A chemistry model. J. Adv. Model. Earth Syst. 2022, 14, e2021MS002974. [Google Scholar] [CrossRef]
- Sharma, H.; Shrivastava, M.; Singh, B. Physics informed deep neural network embedded in a chemical transport model for the Amazon rainforest. Clim. Atmos. Sci. 2023, 6, 28. [Google Scholar] [CrossRef]
- Geiss, A.; Ma, P.-L.; Singh, B.; Hardin, J.C. Emulating aerosol optics with randomly generated neural networks. Geosci. Model Dev. 2023, 16, 2355–2370. [Google Scholar] [CrossRef]
- Vannitsem, S.; Bremnes, J.B.; Demaeyer, J.; Evans, G.R.; Flowerdew, J.; Hemri, S.; Lerch, S.; Roberts, N.; Theis, S.; Atencia, A.; et al. Statistical Postprocessing for Weather Forecasts: Review, Challenges, and Avenues in a Big Data World. BAMS 2021, 102, E681. [Google Scholar] [CrossRef]
- Klein, W.H.; Lewis, B.M.; Enger, I. Objective prediction of five-day mean temperatures during winter. J. Meteor. 1959, 16, 672–682. [Google Scholar] [CrossRef]
- Glahn, H.R.; Lowry, D.A. The use of model output statistics (MOS) in objective weather forecasting. J. Appl. Meteor. 1972, 11, 1203–1211. [Google Scholar] [CrossRef]
- Hemri, S.; Scheuerer, M.; Pappenberger, F.; Bogner, K.; Haiden, T. Trends in the predictive performance of raw ensemble weather forecasts. Geophys. Res. Lett. 2014, 41, 9197–9205. [Google Scholar] [CrossRef]
- Grönquist, P.; Yao, C.; Ben-Nun, T.; Dryden, N.; Dueben, P.; Li, S.; Hoefler, T. Deep learning for post-processing ensemble weather forecasts. Philos. Trans. R. Soc. A 2021, 379, 20200092. [Google Scholar] [CrossRef]
- Bouallègue, Z.B.; Cooper, F.; Chantry, M.; Düben, P.; Bechtold, P.; Sandu, I. Statistical Modelling of 2 m Temperature and 10 m Wind Speed Forecast Errors. Mon. Weather Rev. 2023, 151, 897–911. [Google Scholar] [CrossRef]
- Rojas-Campos, R.; Wittenbrink, M.; Nieters, P.; Schaffernicht, E.J.; Keller, J.D.; Pipa, G. Postprocessing of NWP Precipitation Forecasts Using Deep Learning. Weather Forecast. 2023, 38, 487–497. [Google Scholar] [CrossRef]
- Benáček, P.; Farda, A.; Štěpánek, P. Postprocessing of Ensemble Weather Forecast Using Decision Tree–Based Probabilistic Forecasting Methods. Weather Forecast. 2023, 38, 69–82. [Google Scholar] [CrossRef]
- Krasnopolsky, V.; Lin, Y. A Neural Network Nonlinear Multimodel Ensemble to Improve Precipitation Forecasts over Continental US. Adv. Meteorol. 2012, 2012, 649450. [Google Scholar] [CrossRef]
- Wang, T.; Zhang, Y.; Zhi, X.; Ji, Y. Multi-Model Ensemble Forecasts of Surface Air Temperatures in Henan Province Based on Machine Learning. Atmosphere 2023, 14, 520. [Google Scholar] [CrossRef]
- Acharya, N.; Hall, K. A machine learning approach for probabilistic multi-model ensemble predictions of Indian summer monsoon rainfall. MAUSAM 2023, 74, 421–428. Available online: https://mausamjournal.imd.gov.in/index.php/MAUSAM/article/view/5997 (accessed on 20 March 2024). [CrossRef]
- Rodrigues, E.R.; Oliveira, I.; Cunha, R.; Netto, M. DeepDownscale: A deep learning strategy for high-resolution weather forecast. In Proceedings of the 2018 IEEE 14th International Conference on e-Science (e-Science), Amsterdam, The Netherlands, 29 October–1 November 2018; pp. 415–422. [Google Scholar]
- Li, L. Geographically weighted machine learning and downscaling for high-resolution spatiotemporal estimations of wind speed. Remote Sens. 2019, 11, 1378. [Google Scholar] [CrossRef]
- Höhlein, K.; Kern, M.; Hewson, T.; Westermann, R. A comparative study of convolutional neural network models for wind field downscaling. Meteorol. Appl. 2020, 27, e1961. [Google Scholar] [CrossRef]
- Sekiyama, T.T. Statistical Downscaling of Temperature Distributions from the Synoptic Scale to the Mesoscale Using Deep Convolutional Neural Networks. arXiv 2020, arXiv:2007.10839. [Google Scholar]
- Sebbar, B.-E.; Khabba, S.; Merlin, O.; Simonneaux, V.; Hachimi, C.E.; Kharrou, M.H.; Chehbouni, A. Machine-Learning-Based Downscaling of Hourly ERA5-Land Air Temperature over Mountainous Regions. Atmosphere 2023, 14, 610. [Google Scholar] [CrossRef]
- Agrawal, S.; Carver, R.; Gazen, C.; Maddy, E.; Krasnopolsky, V.; Bromberg, C.; Ontiveros, Z.; Russell, T.; Hickey, J.; Boukabara, S. A Machine Learning Outlook: Post-processing of Global Medium-range Forecasts. arXiv 2023, arXiv:2303.16301. [Google Scholar] [CrossRef]
- Krasnopolsky, V.M.; Fox-Rabinovitz, M.S.; Tolman, H.L.; Belochitski, A. Neural network approach for robust and fast calculation of physical processes in numerical environmental models: Compound parameterization with a quality control of larger errors. Neural Netw. 2008, 21, 535–543. [Google Scholar] [CrossRef] [PubMed]
- Poggio, T.; Banburski, A.; Liao, Q. Theoretical issues in deep networks. Proc. Natl. Acad. Sci. USA 2020, 117, 30039–30045. [Google Scholar] [CrossRef]
- Thompson, N.C.; Greenewald, K.; Lee, K.; Manso, G.F. The Computational Limits of Deep Learning. arXiv 2020, arXiv:2007.05558. [Google Scholar]