Article

Evaluation of the 3DVAR Operational Implementation of the Colombian Air Force for Aircraft Operations: A Case Study

by Jhon Edinson Hinestroza-Ramirez 1,2, Juan Ernesto Soto Barbosa 3, Andrés Yarce Botero 1,4,*, Danilo Andrés Suárez Higuita 1, Santiago Lopez-Restrepo 1, Lisseth Milena Cruz Ruiz 1, Valeria Sólorzano Araque 1, Andres Céspedes 3, Sara Lorduy Hernandez 3, Richard Caceres 3, Giovanni Jiménez-Sánchez 3 and Olga Lucia Quintero 1

1 Mathematical Modelling Research Group, Universidad EAFIT, Medellín 050022, Colombia
2 Grupo de Investigaciones Pedagógicas en el Área de las Matemáticas (INPEMA), Universidad Tecnológica del Chocó, Diego Luis Cordoba, Quibdó 270001, Colombia
3 Fuerza Aérea Colombiana (FAC), Bogotá 110911, Colombia
4 Department of Applied Mathematics, TU Delft, 2600 AA Delft, The Netherlands
* Author to whom correspondence should be addressed.
Climate 2023, 11(7), 153; https://doi.org/10.3390/cli11070153
Submission received: 14 June 2023 / Revised: 15 July 2023 / Accepted: 18 July 2023 / Published: 20 July 2023
(This article belongs to the Special Issue Extreme Weather Detection, Attribution and Adaptation Design)

Abstract: This manuscript introduces an exploratory case study of SIMFAC's (Sistema de Información Meteorológica de la Fuerza Aérea Colombiana) operational implementation of the Weather Research and Forecasting (WRF) model with a 3DVAR (three-dimensional variational) data assimilation scheme that provides meteorological information for military, public, and private aviation. In particular, it investigates whether the assimilation scheme in SIMFAC's implementation improves the prediction of the variables of interest compared to the implementation without data assimilation (CTRL). Consequently, this study compares SIMFAC's 3DVAR-WRF operational implementation in Colombia with a CTRL run with the same parameterization (without 3DVAR assimilation) against ground and satellite observations in two operational forecast windows. The simulations are as long as an operational run, and the evaluation is performed using the root mean square error, the mean fractional bias, the percent bias, the correlation factor, and metrics based on contingency tables. It also evaluates the model's results according to the regions of Colombia, accounting for the country's topographical differences. The findings reveal that, in general, the operational forecast (3DVAR) is similar to the CTRL without data assimilation, indicating the need for further improvement of the 3DVAR-WRF implementation.

1. Introduction

Colombia is located in the northwestern part of South America. This geographical location results in a topography influenced by the Andes mountain range, creating unique meteorological conditions that favor the occurrence of precipitation events [1,2,3]. The Colombian Air Force (FAC, Spanish acronym) understands the importance of forecasting meteorological conditions for the strategic planning and development of aircraft operations and has created the Sistema de Información Meteorológica de la Fuerza Aérea Colombiana (SIMFAC), a system to provide applied meteorological products, including aeronautical meteorology, synoptic meteorology, mesoscale processes, aeronautical climatology, cloud physics, numerical weather prediction, remote sensing, and meteorological surveillance, to the public forces (army, navy, FAC, and police), as well as the civil aviation sector.
One of the tools used by SIMFAC to provide meteorological products is the Weather Research and Forecasting (WRF) model, a mesoscale model widely used in numerical weather prediction (NWP) [4]. Various studies have been conducted in Colombia to explore the capabilities of this model. For instance, WRF downscaling was employed to analyze the complex region of the Cauca River [5]. Another study focused on the ability of the WRF model to replicate meteorological conditions in several cities throughout Colombia [6]. Furthermore, investigations were carried out using subkilometer horizontal resolution in the Aburrá Valley [7], which demonstrated an improved performance in terms of the surface temperature and wind direction but also revealed the challenge of overestimating the wind speed at such resolutions. In addition, sensitivity analyses were conducted on the model's planetary boundary layer (PBL) [8] and initial conditions [9], highlighting their impact on model outcomes. Notably, the identification of the low-level jet, an atmospheric phenomenon associated with hazardous winds and consequently a risk to aircraft, has also been documented [1].
Although the WRF is used to predict meteorological events, it is fundamental to recognize the inherent constraints and uncertainties associated with its output. Factors that contribute to these constraints, as discussed in [10,11], include the complex terrain characteristics, the horizontal resolution [12], the specification of the initial and boundary conditions (IC/BC) [13], the physical parameterizations [13,14,15], the representation of the topographic characteristics in the input data [14], or even that the mathematics are a simplification of reality, as highlighted in [16,17,18].
The preceding discussion emphasizes the weaknesses and potential of the WRF model and the need for its evaluation. One promising approach to enhancing the performance of the model in regional settings, particularly in the complex tropical Andes region (TAR), is data assimilation (DA) [19]. DA is a mathematical/statistical technique integrating the available information from various sources with a mathematical model [20,21]. Therefore, DA can potentially improve forecasts by reducing the accumulated errors [22]. Various DA techniques have been used within the framework of the WRF model. These include three-dimensional variational data assimilation (3DVAR) [23,24], four-dimensional variational data assimilation (4DVAR) [25,26,27], ensemble methods [28,29], and the ensemble transform Kalman filter–3DVAR (ETKF–3DVAR) system [30]. The operational implementation of the WRF in SIMFAC utilizes the 3DVAR method and assimilates METAR (Aviation Routine Weather Report), radio soundings, SYNOP, RADAR, and GOES-16 data.
The 3DVAR aims to improve the model’s initial conditions through observation [31] and reduce the errors propagating through time. Several studies have shown that the 3DVAR in the WRF improves the associated errors in the output. So, for example, ref. [32] used it in Polar WRF to predict the wind speed, temperature, and radiation, and ref. [33] showed that assimilating dropsonde and satellite data improved the forecast. However, if the uncertainty in the model output holds, it is necessary to assess the NWP implementations to determine whether the errors are controlled.
The study of the performance of the model is so important that the World Meteorological Organization (WMO) has designed comparison and verification mechanisms for the different products generated by Centers to provide information to their users [34]. In the same way, the European Center for Medium-Range Weather Forecasts (ECMWF) conducts daily evaluations of forecast performance to provide feedback to users and model developers [35]. Moreover, some research centers explore the potential for comparing different methods. For example, ref. [36] assessed the 4DVAR and 3DVAR DA techniques in a cyclone case study. The process of evaluation of the operational implementation usually includes comparing the model’s output with observations [37,38], the sensitivity analysis of the implementations [39], or comparing the ability of different models to represent a variable [40].
Despite the indicated importance of evaluating implementations of NWP models, there is as yet no evidence of research work to account for the operational implementation of the WRF model used by SIMFAC in its forecasts and the impact of the implemented DA scheme. This article discusses the outcomes of the operational forecast implementation of the 3DVAR-WRF (3DVAR from here on) model employed in SIMFAC, compared with a control run without 3DVAR (CTRL). This discussion covers two operational windows and focuses on the surface temperature, pressure, and precipitation variables. For the analysis of the outcome, we use the root mean square error (RMSE), the mean fractional bias (MFB), the correlation factor (Corr), the mean bias (MB), the normalized mean bias (NMB), the percent bias (PBIAS), the equitable threat score (ETS), and associated metrics of the contingency tables (the false alarm ratio and the probability of detection).
This paper is organized as follows: Section 2 presents a comprehensive overview of the WRF model, including the specific parameterization used in the SIMFAC WRF operational implementation and CTRL run. The data used in the assimilation process and the methods for evaluating the results obtained from the simulations are also described. Section 3 presents the results obtained from the ground-based and satellite data evaluations. Finally, in Section 4, we discuss the results and show the conclusions.

2. Model, Methods, and Data

This study focuses on SIMFAC's operational implementation of the WRF that employs 3DVAR. The 3DVAR data assimilation process seeks the maximum of a posterior probability density function, which is achieved by minimizing a cost function as in [41]:
$$ J(\mathbf{x}) = \frac{1}{2}\left(\mathbf{x}-\mathbf{x}_b\right)^{\mathrm{T}}\mathbf{B}^{-1}\left(\mathbf{x}-\mathbf{x}_b\right) + \frac{1}{2}\left(\mathbf{y}-\mathbf{y}_0\right)^{\mathrm{T}}\mathbf{R}^{-1}\left(\mathbf{y}-\mathbf{y}_0\right), \qquad (1) $$
where $\mathbf{x}_a = \arg\min_{\mathbf{x}} J(\mathbf{x})$ is the analysis to be found that minimizes the cost function given the data sources, $\mathbf{x}_b$ is the forecast for the problem of estimation or the first guess of the NWP model, and $\mathbf{y}_0$ is the assimilated observation. $\mathbf{B}$ and $\mathbf{R}$ are the background error covariance matrix and the observation error covariance matrix, respectively. Further, $\mathbf{y} = H(\mathbf{x})$ is the model-derived observation transformed from the analysis by the observation operator $H$.
The primary assumption in (1) is that the observation and background error covariances are described by Gaussian probability density functions (PDFs) with zero mean error. Non-Gaussian PDFs, due, for example, to nonlinear observation operators, are permitted using an appropriate nonquadratic version of the cost function. If a model state $\mathbf{x}$ has $n$ degrees of freedom (the number of grid points multiplied by the number of independent variables), calculating the whole background term $J_b$ of the cost function requires $\mathcal{O}(n^2)$ calculations. For a typical NWP model with $n^2 \in (10^{12}\text{--}10^{14})$, a direct solution is not feasible in the time interval allotted for DA in operational applications. One practical solution to reduce the computational cost is to calculate $J_b$ with respect to the control variables defined through the relationship $\mathbf{x} = \mathbf{U}\mathbf{v}$. The $\mathbf{U}$ transform is designed to non-dimensionalize the variational problem and permit efficient filtering techniques that approximate the full background error covariance matrix [42].
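To make the structure of Equation (1) concrete, the following numpy sketch evaluates the cost function on a toy problem. All sizes, covariance choices, and the linear observation operator below are hypothetical stand-ins (a real NWP state has far more degrees of freedom), not the operational configuration:

```python
import numpy as np

# Toy illustration of the 3DVAR cost function in Eq. (1). Dimensions,
# covariances, and H are hypothetical; H is assumed linear here.
rng = np.random.default_rng(0)
n, m = 5, 3                          # state / observation dimensions (illustrative)
x_b = rng.normal(size=n)             # background (first guess)
H = rng.normal(size=(m, n))          # linear observation operator
y_o = H @ x_b + rng.normal(scale=0.1, size=m)   # synthetic observations
B = 0.5 * np.eye(n)                  # background error covariance (toy, diagonal)
R = 0.1 * np.eye(m)                  # observation error covariance (toy, diagonal)

def J(x):
    """3DVAR cost of Eq. (1): background term + observation term."""
    dx, dy = x - x_b, H @ x - y_o
    return 0.5 * dx @ np.linalg.solve(B, dx) + 0.5 * dy @ np.linalg.solve(R, dy)

# For a linear H, the minimizing analysis has the closed form
#   x_a = x_b + (B^-1 + H^T R^-1 H)^-1 H^T R^-1 (y_o - H x_b)
A = np.linalg.inv(B) + H.T @ np.linalg.inv(R) @ H
x_a = x_b + np.linalg.solve(A, H.T @ np.linalg.inv(R) @ (y_o - H @ x_b))
```

By construction, the analysis $\mathbf{x}_a$ attains a cost no larger than the background $\mathbf{x}_b$; in an operational setting this minimization is done iteratively in the preconditioned control-variable space rather than via the closed form.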
The quality of a DA method depends on its analysis state, which relies on several components, including the impact of observations [43]. In the case of the 3DVAR method, the effect of observations depends on the background error statistics matrix B , which is crucial for the optimal initialization of the method and, therefore, for its overall success [43,44].
The WRF model estimates the background error statistics matrix using GEN_BE (Generalized Background Error Covariance Matrix Model), which assumes that the errors are Gaussian and calculated using control variables [45]. However, there are limitations to calculating this vital factor of the DA process, which can introduce uncertainty in the results. The SIMFAC 3DVAR-WRF, reviewed here, used the control variables (CV) $u$, $v$ (horizontal wind components in the x and y directions, respectively), $T$ (temperature), $P_s$ (surface pressure), and $RH_s$ (pseudo relative humidity), a set denominated CV7 [46,47]. The matrix $\mathbf{B}$ is defined as:
$$ \mathbf{B} = \overline{\boldsymbol{\epsilon}\,\boldsymbol{\epsilon}^{\mathrm{T}}} \approx \overline{\mathbf{x}'\,{\mathbf{x}'}^{\mathrm{T}}}, \qquad (2) $$
with $\boldsymbol{\epsilon} = \mathbf{x} - \mathbf{x}_b$ the background error; because $\boldsymbol{\epsilon}$ is unknown, the perturbed model states $\mathbf{x}'$ are used to calculate this matrix. The background error statistics matrix $\mathbf{B}$ in SIMFAC's implementation was estimated from one month of WRF forecasts (initialized at 00, 06, 12, and 18 UTC) of 12 h duration, with six hours shared between consecutive runs and $\mathbf{x}'$ taken as the difference between two consecutive outputs valid at the same time [47].
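The perturbation-based estimate in Equation (2) can be sketched as follows; the forecast pairs below are synthetic stand-ins for the overlapping WRF outputs described above, and all sizes and noise scales are illustrative only:

```python
import numpy as np

# Sketch of the perturbation-based estimate of B in Eq. (2), using
# synthetic pairs of overlapping forecasts valid at the same time.
rng = np.random.default_rng(1)
n_state, n_pairs = 4, 500            # state size and number of forecast pairs
f_long = rng.normal(size=(n_pairs, n_state))                        # e.g. 12 h runs
f_short = f_long + rng.normal(scale=0.3, size=(n_pairs, n_state))   # overlapping runs

x_prime = f_long - f_short           # perturbations x' (proxy for the unknown error)
x_prime -= x_prime.mean(axis=0)      # enforce the zero-mean assumption
B_est = (x_prime.T @ x_prime) / (n_pairs - 1)   # sample covariance approximating B
```

The resulting estimate is symmetric and positive semi-definite by construction, which is what the preconditioning transform $\mathbf{U}$ in the previous subsection relies on.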
The methodology employed in this case study involved using SIMFAC’s WRF model as implemented without any modifications in its physical parameterizations, data source, and, in general, architecture. The implementation incorporated the 3DVAR DA techniques to assimilate the observational data.

2.1. WRF Model Setup

The WRF model operational implementation performs four daily runs at 00 UTC, 06 UTC, 12 UTC, and 18 UTC, and its parameterization, described here, is based on the study by [48]. SIMFAC’s 3DVAR implementation involves two forecast horizons in cold-start mode. The first horizon, spanning 73 h, is utilized for aircraft operations planning, while the second horizon, lasting 6 h, is specifically for executing these operations.
The SIMFAC application assumes the nested domains shown in Figure 1. This domain configuration is designed to simulate and predict weather patterns in Colombia and the surrounding continental area [48].
The first domain (D01) covers the entirety of Colombia. The second domain (D02) extends beyond the country’s borders and excludes the San Andrés, Providencia, and Santa Catalina islands in the north of Colombia. The topography includes the Andean mountain range in the northwest region of South America.
The San Andrés, Providencia, and Santa Catalina islands belong to Colombia but are not included in the domain D02. Consequently, information regarding these locations is obtained from the output generated by domain D01. The distribution of the domains, in terms of grid number and horizontal resolution, is presented in Table 1.
The SIMFAC's WRF implementation provides IC/BC data to the model every 3 h, obtained from the Global Forecast System (GFS) [49] at a resolution of $0.25^{\circ} \times 0.25^{\circ}$; consequently, these are the data used in the experiment.
Table 2 presents the parameterization used in the SIMFAC operational implementation, which is based on the results of [47,48]. It includes the WSM six-class graupel scheme for microphysics, which is appropriate for high-resolution simulations when precipitation is of interest [50]. The Goddard scheme is used for short-wave radiation, considering both diffuse and direct solar radiation and cloud effects [51]. The Community Atmospheric Model (CAM) scheme is used for longwave radiation [52].
Moreover, the implementation uses the Mellor–Yamada–Janjic (Eta) TKE scheme for the PBL [53] and the Kain–Fritsch (new Eta) scheme for the cumulus option [54] for domain D01. The land surface model uses four soil layers, corresponding to the Noah land surface model. The urban canopy model is configured to activate shadow effects from neighboring points and slope effects. Refer to Table 2 for additional simulation setups.

2.2. The 3DVAR Implementation of SIMFAC

Various data sources are integrated into the SIMFAC’s 3DVAR operational implementation. These include the Geostationary Operational Environmental Satellite (GOES) 16, METAR, radio soundings, and radar [55].
METAR, SYNOP, and radio soundings used for the SIMFAC’s 3DVAR are provided by IDEAM (Instituto de Hidrología, Meteorología y Estudios Ambientales), a public institution in Colombia that offers scientific and technical support for studying nature and environmental resources. IDEAM operates stations to measure meteorological variables at 28 airports in Colombia, some of which provide frequent METAR information.
IDEAM also generates radio soundings and SYNOP. The first is retrieved from the Wyoming radio soundings database [56] and the other from the IDEAM database.
It is important to note that the SIMFAC's 3DVAR implementation assimilates the radiance of GOES-16 and the radial velocity and reflectivity of radar. All the steps followed in the implementation are depicted in Figure 2. The procedures follow the WRF user guide [4]. The process starts with the WPS processing the static data (boundary and initial conditions) and continues with the real module. When DA is used, the METAR, SYNOP, and radio sounding observations are organized and each transformed into a corresponding LITTLE_R file. After that, the data go to OBSPROC, where they are organized and filtered according to the domain and simulation time. In addition, information about altitude and pressure is retrieved based on the data. Finally, a text file is produced to be employed by the WRFDA module. PREPBUFR or BUFR checks the GOES-16 and radar information, following a process similar to OBSPROC, for the WRFDA module. Given the observations, the WRFDA module updates the boundary and initial conditions, and this information, together with that generated by the WPS, goes to the ARW, where the simulation is conducted.
In examining the SIMFAC’s implementation, it was possible to see that there was no bias correction when GOES-16 was assimilated; consequently, the implementation is exposed to systematic errors for this kind of data. Furthermore, it was observed that there was no defined warning when a low amount of information was assimilated or when one kind of data was unavailable. Additionally, it was observed that the bias accepted for the radar data was high. Consequently, the implementation admitted significant errors in this type of information, which could introduce further uncertainty into the forecast.

2.3. Evaluation of the Operational Implementation

The simulations in the experiment spanned the same duration as the 3DVAR operational implementation. While acknowledging that a comprehensive evaluation might be limited within a 73 h window, the obtained results were expected to contribute to the application utilized by SIMFAC's procedure to provide information for aircraft operations in Colombia.
The experiment for the evaluation of the case study consisted of two operational forecast windows (73 h long), where the model implementation was run with DA (3DVAR) and without DA (CTRL), and the results were compared with the actual observations. Both simulations used the same parameterization of the model given in Table 2.
The first simulation window corresponded to 18 August 2020, at 00:00:00, until 21 August 2020, at 00:00:00 UTC (W1), and the second simulation window was 1 September 2020, at 00:00:00, until 4 September 2020, at 00:00:00 UTC (W2).
The selection of the two windows, W1 and W2, offered a valuable opportunity to observe the model’s performance during a critical period of the year when Colombia experiences some of its highest precipitation accumulations. Additionally, these windows coincided with the transition from a drier period to a rainier one in the Andean region.
Each simulation followed the scheme shown in Figure 2 and comprised two stages. In the first stage, the SIMFAC architecture incorporated multiple observation sources (Section 2.2) through WRF-3DVAR directly before running the model. In the second stage, a control run (CTRL) was conducted without DA, where the BC/IC remained unchanged. However, the parameterization, static data, domain setup, and other essential components of the model remained identical to those of the 3DVAR run.

2.4. Ground-Based Data

The simulations obtained in Section 2.3 were compared with the actual surface temperature, surface pressure, and precipitation retrieved from IDEAM and IMERG (Integrated Multisatellite Retrievals for the Global Precipitation Measurement (GPM) mission). Particularly for the validation of precipitation, studies such as [47,57] have classified this variable into rain events for days where the accumulated precipitation is greater than 0.1 mm, or no rain when the accumulated precipitation is less than or equal to 0.1 mm. The categorization of precipitation is important for SIMFAC because the warning for aircraft is given in these terms, as the likelihood of rain or no rain. This operational need means we did not use the quantitative precipitation forecast (QPF), a verification method more commonly used in the literature.
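The rain/no-rain categorization at the 0.1 mm threshold is a one-line operation; the daily accumulations below are hypothetical values used only to illustrate the rule:

```python
import numpy as np

# Rain/no-rain categorization at the 0.1 mm threshold described above,
# applied to hypothetical daily accumulations (in mm).
daily_precip = np.array([0.0, 0.05, 0.1, 0.4, 12.3])
is_rain = daily_precip > 0.1         # rain event if accumulation exceeds 0.1 mm
# Note: an accumulation of exactly 0.1 mm falls in the "no rain" class.
```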
Furthermore, we used data from the IMERG database [58], which provides precipitation information every half hour with a spatial resolution of $0.1^{\circ} \times 0.1^{\circ}$.
IMERG data for spatiotemporal validation is valuable because it provides a high level of detail regarding the precipitation patterns over time and space. Some evaluations have shown that it is one of the gridded precipitation products with the best performance in the country ([59]). This product can help identify areas where the model may not accurately predict precipitation. Furthermore, using satellite data for validation is beneficial because it provides an independent source of information not affected by ground-based measurement limitations.
Using actual observations for comparison is crucial for this case study. By comparing the model’s output with the actual observations, the experiment was able to determine how closely the model’s predictions matched real-world conditions.

2.5. Statistical Performance Metrics

The evaluation metrics in this model study include the root mean square error (RMSE), the mean fractional bias (MFB), the correlation factor (CF) [60], and a contingency table that includes the false alarm ratio (FAR) and the probability of detection (POD) [23]. Additionally, it used the percent bias (PBIAS) and the success percentage, or percentage correct (PC), which is calculated as the ratio between the number of times the model successfully detects an event and the total number of occurrences of that event. The comparison was conducted by matching the observed data with the corresponding nearest-neighbor values obtained from the output of the WRF model.
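The nearest-neighbor pairing of a station with the model grid can be sketched as follows; the 1-D grid coordinates and the station location are hypothetical (the station point roughly mimics a location in the Colombian Andes), not the operational WRF grid:

```python
import numpy as np

# Nearest-neighbor pairing of a station with a model grid, as described
# above. Grid coordinates and the station location are hypothetical.
lats = np.linspace(4.0, 8.0, 5)          # grid latitudes: 4, 5, 6, 7, 8
lons = np.linspace(-76.0, -72.0, 5)      # grid longitudes: -76, -75, ..., -72
station_lat, station_lon = 6.17, -75.43

i = int(np.abs(lats - station_lat).argmin())   # nearest latitude index
j = int(np.abs(lons - station_lon).argmin())   # nearest longitude index
# The model value compared with the observation would then be wrf_field[i, j].
```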
The PBIAS is defined in Equation (3) as a percentage that indicates overestimation or underestimation:
$$ \mathrm{PBIAS}\,[\%] = \frac{\sum_{i=1}^{M}\left(y_i^{f}-y_i^{o}\right)}{\sum_{i=1}^{M} y_i^{o}} \times 100, \qquad (3) $$
where $y_i^{f}$ is the model simulation output, and $y_i^{o}$ is the observation.
The MFB, defined in Equation (4), is a statistical measure commonly used in assessing weather and climate models and represents the average of the normalized bias for each model–observation pair.
$$ \mathrm{MFB} = \frac{2}{M}\sum_{i=1}^{M}\frac{y_i^{f}-y_i^{o}}{y_i^{f}+y_i^{o}}. \qquad (4) $$
$M$ is the number of observations from all valid monitoring station data for the comparison period of interest. A zero value indicates perfect agreement between the model and observations, while positive or negative values indicate overestimation or underestimation, respectively. The MFB has a numerical range from $-2$ to $2$.
The RMSE represents the differences between the predicted values and the observed values (Equation (5)), defined as
$$ \mathrm{RMSE} = \sqrt{\frac{1}{M}\sum_{i=1}^{M}\left(y_i^{f}-y_i^{o}\right)^{2}}. \qquad (5) $$
The RMSE tries to penalize a high variance by giving errors with larger absolute values more weight than errors with smaller ones. Finally, the Pearson correlation coefficient measures the linear relationship between the simulated values and the observations, written as:
$$ CF = \frac{\sum_{i=1}^{M}\left(y_i^{f}-\overline{y^{f}}\right)\left(y_i^{o}-\overline{y^{o}}\right)}{\sqrt{\sum_{i=1}^{M}\left(y_i^{f}-\overline{y^{f}}\right)^{2}}\,\sqrt{\sum_{i=1}^{M}\left(y_i^{o}-\overline{y^{o}}\right)^{2}}}. \qquad (6) $$
A CF value close to 1.0 indicates a strong positive relationship, while values around 0.5 suggest a moderate relationship. Values below 0.3 indicate a weak relationship, and a negative value approaching $-1$ represents a strong inverse relationship. A value near 0 indicates a small or negligible relationship between the variables.
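The continuous metrics above transcribe directly into code; the sample forecast/observation values below are hypothetical and serve only to exercise the definitions:

```python
import numpy as np

# Direct transcriptions of the PBIAS, MFB, RMSE, and Pearson correlation
# definitions above, for forecast array f and observation array o.
def pbias(f, o):
    """Percent bias, Eq. (3): positive values indicate overestimation."""
    return np.sum(f - o) / np.sum(o) * 100.0

def mfb(f, o):
    """Mean fractional bias, Eq. (4); bounded between -2 and 2."""
    return 2.0 / len(f) * np.sum((f - o) / (f + o))

def rmse(f, o):
    """Root mean square error, Eq. (5)."""
    return np.sqrt(np.mean((f - o) ** 2))

def cf(f, o):
    """Pearson correlation coefficient (CF)."""
    fa, oa = f - f.mean(), o - o.mean()
    return np.sum(fa * oa) / np.sqrt(np.sum(fa**2) * np.sum(oa**2))

f = np.array([24.1, 25.0, 23.5, 26.2])   # hypothetical forecasts (degrees C)
o = np.array([23.8, 24.6, 23.9, 25.5])   # hypothetical observations (degrees C)
```

A perfect forecast gives PBIAS = MFB = RMSE = 0 and CF = 1, which is a quick sanity check on any implementation of these formulas.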
The precipitation is considered a categorical event, and its performance is evaluated using contingency tables, which provide information about hits, false alarms, and misses. Hits occur when the model predicts rain, and it does rain, whereas false alarms occur when the model predicts rain, but it does not rain. Misses occur when the model does not predict rain, but it does rain. Correct rejections are identified for observations where the model predicts no rain, and it does not rain [61]. A perfect model would place all counts in the hits and correct rejections cells, leaving the other cells at zero. Table 3 provides an example of the contingency tables used in this case study.
Contingency tables are an effective way to evaluate the quality of a model’s forecast. These tables can be used to calculate various verification measures for the categorical forecast performance’s probability of detection (POD) or sensitivity, the false alarm ratio (FAR), and other measures. These measures provide valuable insight into the model’s performance and can help identify areas for improvement.
The PC is a measure that indicates the percentage of forecasts the model got right. It is calculated by dividing the number of correct forecasts (hits + correct rejections) by the total number of forecasts. The POD, also known as the sensitivity, is the fraction of observed events that were correctly forecast. It is computed by dividing the number of hits by the total number of observed events (hits + misses).
On the other hand, the FAR is the ratio of false alarms to the total number of event forecasts. The best possible FAR is zero, which means no false alarms, while the worst is one, indicating that all forecasts were false alarms.
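Using the standard definitions of these scores, the computation from a contingency table is direct; the counts below are hypothetical, not taken from the experiment:

```python
# Contingency-table scores for rain/no-rain forecasts, using the standard
# definitions; the counts below are hypothetical.
hits, false_alarms, misses, correct_rejections = 42, 10, 8, 40
total = hits + false_alarms + misses + correct_rejections

pc = (hits + correct_rejections) / total    # percentage correct
pod = hits / (hits + misses)                # probability of detection (sensitivity)
far = false_alarms / (hits + false_alarms)  # false alarm ratio: 0 best, 1 worst
```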

3. Results

The results focus on the temperature, surface pressure, and precipitation due to the importance of these variables for aircraft. Pressure influences the lift of the aircraft through air density and is the basis for various flight instruments, such as the altimeter, the anemometer, and the variometer.
Figure 3 illustrates the assimilated data within the forecast window W1. Each point in Figure 3a–d represents a location where METAR, radio soundings, SYNOP, and radar data were retrieved, respectively. The number on the right side indicates the quantity of data utilized in the 3DVAR process. Additionally, Figure 3e,f depicts the radiances and brightness temperature retrieved from GOES-16, respectively. It can be observed from Figure 3a–d that the availability of data was relatively low in the lower part (Amazonian and Orinoquia regions). A similar pattern was observed for the window W2.

3.1. Evaluation of Surface Temperature and Pressure

This section discusses the results of the 3DVAR and CTRL model implementation for the surface temperature and pressure in the locations corresponding to the Cali and Rionegro international airports. The data used for analysis corresponded to the available METAR data for the operational forecast windows of the experiment, W1 and W2. It is important to note that the selected aerodromes were located in areas with unique weather conditions because of their proximity to large bodies of water or mountain ranges.
Figure 4 and Figure 5 present the temporal series of the surface temperature and pressure for the aerodromes in Barranquilla, Bogotá, Cali, and Rionegro (Colombia) in the operational forecast window W1. The intention was to see how well the model output (3DVAR and CTRL) fit the actual data. The observed stepped pattern can be attributed to the accuracy of the measurements (the data are reported only as integers).
As noted in Figure 4 and Figure 5, there was a substantial similarity between the temporal behavior of the surface temperature and the pressure for the 3DVAR and CTRL run during the forecast window. Sometimes, it was impossible to discern differences between the two implementations. Similar results were observed for the forecast window W2 (not shown).
The previous observation suggests that the 3DVAR did not significantly improve the model’s forecast capabilities for these variables, at least for the studied periods.
Furthermore, Figure 4 and Figure 5 highlight the differences between the observations and the simulations for both variables. This indicates that there may be significant errors in the model predictions, particularly for localized weather conditions such as those found at aerodromes. Accordingly, the observations may have an impact locally and close to the ground, which may fade immediately when the model forecast starts.
In general, while Figure 4 and Figure 5 suggest that there may be few differences between the 3DVAR and CTRL model implementations, they also highlight the need for further analysis and evaluation of the model's performance. For this reason, statistics such as the MFB, RMSE, and CF, as defined in Section 2.5, were also introduced. These statistics can provide valuable insights into the accuracy and reliability of the model's predictions and can be used to compare the model implementations.
Continuing with the previous discussion, Table 4 and Table 5 present the statistical performance metrics for the surface temperature and pressure variables in the aerodromes of Barranquilla, Bogotá, Cali, and Rionegro. The tables compare the performance of the 3DVAR implementation with the CTRL simulation during the selected operational forecast windows, W1 and W2.
In accordance with Table 4 and Table 5, it was impossible to generalize the behavior of the model for the locations analyzed. The results suggest that, on average, there was either an underestimation or overestimation of the temperature depending on where it was observed, although the MFB showed a better estimation for the temperature. The results for both variables indicated that the model representation was similar for the two forecast windows studied.
Additionally, Table 4 and Table 5 indicate that except for the pressure and the Cali location in the window W2, the model results had a high linear correlation for both the evaluated variables (temperature and pressure) and the simulation windows studied. However, the RMSE for the temperature suggested, on average, a lower model bias than for the pressure concerning the observations used.
The results in Table 4 and Table 5 suggest that both scenarios exhibited comparable statistical measures for the variables examined and the operational forecasts, W1 and W2, indicating that the performance of the 3DVAR implementation is similar to that of the CTRL simulation in terms of accuracy and precision for the operational forecast and the variables evaluated. However, it is essential to note that this observation was based on the specific experimental setup (and station spot measurements) and may not necessarily hold for other scenarios or variables. Therefore, further analysis and evaluation are required to generalize the model's performance for other applications and scenarios.

3.2. Evaluation of Precipitation

After analyzing the statistical metrics presented in Table 4 and Table 5, it is challenging to determine which implementation, 3DVAR or CTRL, performed better in terms of matching the observations or metrics due to the similarity or the minor differences in both the scenarios and the variables.
This section focuses on a composite variable, precipitation, which is influenced by various factors, including the temperature, pressure, and humidity.
Figure 6 illustrates the average accumulated precipitation over time and space for each domain throughout the 73-hour simulation. Additionally, it shows the difference between the 3DVAR and CTRL simulations for the selected operational windows, W1 and W2. This comparison helps assess the possible impact of DA on the model’s forecast outcomes.
Interestingly, Figure 6 shows that, on average, the temporal differences between the two implementations were similar for domain D01. However, larger spatial differences between the CTRL and 3DVAR implementations existed in forecast window W2. These spatial differences may be related to the incorporation of additional data through DA and may explain the temporal differences observed in forecast window W2 in domain D02.
Figure 7 displays the differences between the original ICs and those updated by the 3DVAR for precipitation near the first time step for W2 (see Section 2.2). In particular, the impact was more significant in areas with more ground-based stations and other data sources.
Figure 7a shows that the most significant differences occurred in the Andean and Amazonian regions. Furthermore, Figure 7c shows that the differences between the 3DVAR and CTRL persisted throughout most of the forecast window but became smaller toward its end, as expected: the boundary conditions were the same for both simulations, so over a long forecast horizon the effect of the ICs updated through the DA process vanished.
An evaluation of the IMERG database against the CTRL and 3DVAR implementations revealed that the 3DVAR implementation, on average, produced an IC closer to the database (see Figure 8d). However, Figure 8b,c demonstrates that the CTRL and 3DVAR results overestimated the precipitation across most of the domain, with the largest overestimations observed in the Amazonian and Andean regions.
Figure 9 shows the differences between the IMERG database and the 3DVAR (Figure 9a) and the CTRL (Figure 9b) runs. It also presents the MFB results (Figure 9c) for the entire domain at each time step to evaluate the total accumulated precipitation over the whole domain.
Figure 9a,b shows the similarity between the 3DVAR and CTRL runs; however, more bias was observed close to the Orinoquia and Amazonian regions. Furthermore, Figure 9c shows that, on average, the 3DVAR underestimated more than the CTRL at the beginning of the simulation; as in Figure 8, accumulating the total precipitation over the domain led to a decrease in the error.
When considering the overestimation of precipitation, it is necessary to acknowledge that this discrepancy could stem from various factors. These include the inherent complexity of the model, the representation of the atmospheric processes, the influence of the topography, and the persistent uncertainties in the input data and the IC/BC. Figure 10 presents a division of Colombia into five distinct regions: the Caribbean, Andean, Pacific, Orinoquia, and the Amazonian. These regions are delineated based on heterogeneous characteristics, considering factors such as topographic variations, climate patterns, and vegetation throughout the country. Previous research has indicated that the model’s performance can vary across these different regions.
Additionally, Figure 10 displays the success percentage of the operational implementation and CTRL models for the precipitation during the two operational windows selected, W1 and W2. Figure 10a,b presents the information for W1 and W2, respectively.
Figure 10 shows that the success percentages for the operational forecast windows were similar in the two simulation scenarios (3DVAR and CTRL). However, a slight advantage of the operational implementation over the CTRL run was evident for W1 in domain D01. In contrast, for domain D02, the CTRL outperformed the operational implementation. In W2, as illustrated in Figure 10b, the CTRL model performed better in both domains.
Figure 10 shows that the model successfully identified the precipitation with percentages exceeding 50% in all regions, except the Andean and Amazonian regions during August and the Pacific region for September. The Andean, Amazonian, and Pacific regions have unique weather patterns and topography, which could have influenced the accuracy of the model in identifying the precipitation. These results suggest the need to account for regional variations in climatic conditions when developing precipitation models, as these can significantly impact the model’s accuracy and performance.
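The per-region success percentages of this kind can be obtained by masking the domain by region and counting the grid cells where the rain/no-rain classification matches the observations. The following sketch assumes region masks are available as boolean arrays and uses an illustrative wet/dry threshold; both are hypothetical, not taken from the paper.

```python
import numpy as np

def region_success_percentage(model, obs, region_mask, threshold=0.1):
    """Percentage of grid cells inside a region where the model's
    rain/no-rain classification matches the observations."""
    fc = np.asarray(model) >= threshold   # forecast says "rain"
    ob = np.asarray(obs) >= threshold     # observation says "rain"
    correct = (fc == ob) & region_mask    # hits + correct rejections
    return float(100.0 * np.sum(correct) / np.sum(region_mask))
```

Repeating this over the five region masks for each forecast window yields the kind of bar summaries shown in Figure 10.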
Figure 11 compares the IMERG observations with the implementations of the 3DVAR and CTRL for W2 in the Caribbean, Pacific, Andean, Orinoquia, and Amazonian regions, as defined in Figure 10. The comparison is made in terms of the MFB.
Figure 11 shows significant differences between the model output and the observations for all regions; the overestimation of the 3DVAR was most critical in the Amazonian region, whereas the differences were less significant in the other regions. This indicates that both model runs tended to overestimate the precipitation in the studied operational forecast windows, W1 and W2. This observation is consistent with the time series in Figure 4 and Figure 5 and the statistical measures presented in Table 4 and Table 5, which showed that the CTRL and 3DVAR had similar performances in terms of the MFB and RMSE, while the correlation factor was relatively low.
The results in Figure 11 suggest that the CTRL and 3DVAR implementations were limited in accurately representing the precipitation in the studied regions for the operational forecast windows used, particularly in the Orinoquia and Amazonian regions. The limited availability of observation data, shown in Figure 3, may have contributed to the model’s inaccuracies. The procedure used to process the GOES-16 and radar data in the data assimilation can also contribute to the observed errors.
Further research may be necessary to improve the model’s representation of precipitation in these regions.
Finally, to obtain further evidence of the quality of the model forecast in representing the precipitation for the 3DVAR and CTRL implementations, contingency tables for the accumulated precipitation were computed for W1 and W2. Table 6 shows the POD and FAR.
The results shown in Table 6 were similar between the CTRL and 3DVAR for each window, W1 and W2; they indicate a high false alarm ratio and a low probability of detection, suggesting an important opportunity to improve both the operational implementation and the control run.
According to the previous results, the differences between the operational implementation and the CTRL were slight. The 3DVAR implementation offered an improvement in the second period, reducing the FAR and increasing the POD; however, these differences were small in the calculated statistics.
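The POD and FAR reported in Table 6 follow directly from the contingency counts defined in Table 3. The sketch below is an illustrative calculation, assuming precipitation is first converted to a binary event with a wet/dry threshold (the 0.1 mm default is hypothetical, not the operational choice).

```python
import numpy as np

def contingency_scores(model, obs, threshold=0.1):
    """Probability of detection (POD) and false alarm ratio (FAR)
    for precipitation treated as a binary yes/no event."""
    fc = np.asarray(model) >= threshold   # forecast says "rain"
    ob = np.asarray(obs) >= threshold     # observation says "rain"
    hits = int(np.sum(fc & ob))
    misses = int(np.sum(~fc & ob))
    false_alarms = int(np.sum(fc & ~ob))
    pod = hits / (hits + misses) if (hits + misses) else float("nan")
    far = false_alarms / (hits + false_alarms) if (hits + false_alarms) else float("nan")
    return pod, far
```

A perfect forecast gives POD = 1 and FAR = 0; the low POD and high FAR in Table 6 therefore quantify how often the runs miss observed rain or predict rain that does not occur.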

4. Discussion and Conclusions

This paper presented an exploratory case study of the 3DVAR-WRF used operationally in SIMFAC.
The 3DVAR output is used for aircraft operations and represents a significant undertaking to provide weather information and improve air safety in the country. The study used two operational forecasts, in W1 and W2, and two CTRL simulations conducted for this study over the same forecast windows, with identical model parameterizations and input data but with the 3DVAR module turned off. The operational (3DVAR) and non-operational (CTRL) simulations were compared against observations from the IDEAM and IMERG databases.
The results indicated that the 3DVAR and CTRL implementations analyzed tended to overestimate the METAR observations and the precipitation, regardless of the locations or operational forecast windows selected. The analysis also indicated that both implementations performed poorly regarding the POD and FAR for precipitation. These results suggest that the implementation needs to be inspected.
In particular, the results for the selected operational forecast windows did not show any significant improvement in the model with the 3DVAR implementation. At some locations, the CTRL was even slightly closer to the precipitation observations.
In general, the 3DVAR implementation did not show significant differences compared to the CTRL, possibly due to several aspects. One may be that, in practice, only some of the available information is being assimilated into the model. It is also possible that there are problems with the observation operators for variables such as the radar reflectivity and the satellite brightness temperature, which generally involve nonlinear relationships with the variables available in the model.
The evaluation represents a step forward in advancing our comprehension of the 3DVAR operational implementation’s capability to forecast weather patterns in Colombia. Nevertheless, further research and in-depth evaluation are necessary to overcome the remaining challenges and to comprehensively understand the model’s performance. This will enable us to make any necessary adjustments to the implementation to enhance the model’s accuracy. Such improvements are vital for supporting aviation decision makers, ensuring reliable weather information for aircraft operations.
The limitation of the operational implementation in the forecast windows studied may stem from several factors, such as the complexity of the model, which affects the representation of atmospheric processes. There could also be uncertainties in the input data and BC/IC. The overestimation of certain variables, including the surface temperature, pressure, and precipitation, could be partly attributed to the model’s spatial resolution, which may not capture the small-scale atmospheric features that affect these variables. Further studies could focus on refining the model’s parameterizations and improving the input data quality, especially in remote areas such as the Amazonian and Andean regions.
Another aspect that requires further analysis is the construction of the background error covariance matrix, which is a crucial step in the assimilation process. It is necessary to review how this matrix is estimated and whether the method used is the most appropriate. It is possible that the selection of control variables could be more optimal or that the assumption of Gaussian errors does not hold. Previous evaluations using ensembles of the operational implementation have shown that the covariances vary depending on the location, highlighting the need for further research. In this regard, the authors of [44] emphasized the importance of the background error covariance matrix B in the cost function (1) and in the minimization process. This matrix B must be specified by the user but is challenging to obtain or estimate due to the problem’s dimensionality.
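For reference, the cost function referred to as (1) is not reproduced in this section; in its standard 3DVAR form, with $\mathbf{x}_b$ the background state, $\mathbf{y}$ the observations, $H$ the observation operator, and $\mathbf{R}$ the observation error covariance matrix, it reads

```latex
J(\mathbf{x}) \;=\; \frac{1}{2}\,(\mathbf{x}-\mathbf{x}_b)^{\top}\mathbf{B}^{-1}(\mathbf{x}-\mathbf{x}_b)
\;+\; \frac{1}{2}\,\bigl(\mathbf{y}-H(\mathbf{x})\bigr)^{\top}\mathbf{R}^{-1}\bigl(\mathbf{y}-H(\mathbf{x})\bigr),
```

so any misspecification of B directly controls how far the analysis is allowed to depart from the background relative to the observations.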

Author Contributions

Conceptualization, J.E.H.-R., J.E.S.B. and O.L.Q.; methodology, J.E.H.-R., A.Y.B., J.E.S.B. and O.L.Q.; software, J.E.H.-R., J.E.S.B., A.Y.B., D.A.S.H., S.L.H. and S.L.-R.; formal analysis, J.E.H.-R., J.E.S.B., A.Y.B., D.A.S.H. and S.L.-R.; investigation, J.E.H.-R., A.Y.B. and O.L.Q.; resources, J.E.H.-R., A.C., R.C., G.J.-S. and O.L.Q.; Data curation, J.E.S.B., V.S.A. and L.M.C.R.; writing—original draft preparation, J.E.H.-R., A.Y.B. and O.L.Q.; writing—review and editing, J.E.H.-R., A.Y.B., D.A.S.H. and O.L.Q.; visualization, J.E.H.-R., J.E.S.B., D.A.S.H. and A.Y.B.; supervision, A.C., R.C., G.J.-S. and O.L.Q.; project administration, A.C., R.C., G.J.-S. and O.L.Q.; funding acquisition, O.L.Q. and J.E.H.-R. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Universidad EAFIT through the project “Sensitivity and Uncertainty of Numerical Weather Predictions Models” and by the Colombian Ministry of Science and Technology (MINCIENCIAS) under the Becas Bicentenario scholarship granted to the main author.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data are available upon request. However, access to codes or operational implementation is not available for security reasons.

Acknowledgments

The authors thank the Colombian Air Force (FAC) for supporting this research and facilitating the use of human and technical resources for its development. We thank the anonymous reviewers for their insightful comments on an earlier manuscript version.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
DA	Data assimilation
3DVAR	Three-dimensional variational data assimilation
WRF	Weather Research and Forecasting
TAR	Tropical Andes region
FAC	Colombian Air Force
SIMFAC	Sistema de Información Meteorológica de la Fuerza Aérea Colombiana
IMERG	Integrated multi-satellite retrievals for GPM
NWP	Numerical weather prediction
BC/IC	Boundary/initial conditions
ETKF–3DVAR	Ensemble transform Kalman filter–three-dimensional variational data assimilation
WMO	World Meteorological Organisation
HRRR	High-resolution rapid refresh
SNOTEL	Snowpack telemetry
4DVAR	Four-dimensional variational data assimilation
ECMWF	European Center for Medium-Range Weather Forecasts
GFS	Global Forecast System

References

  1. Jiménez-Sánchez, G.; Markowski, P.M.; Jewtoukoff, V.; Young, G.S.; Stensrud, D.J. The Orinoco Low-Level Jet: An Investigation of Its Characteristics and Evolution Using the WRF Model. J. Geophys. Res. Atmos. 2019, 124, 10696–10711.
  2. Mejía, J.F.; Yepes, J.; Henao, J.J.; Poveda, G.; Zuluaga, M.D.; Raymond, D.J.; Fuchs-Stone, Z. Towards a Mechanistic Understanding of Precipitation Over the Far Eastern Tropical Pacific and Western Colombia, One of the Rainiest Spots on Earth. J. Geophys. Res. Atmos. 2021, 126, 1–23.
  3. Cáceres León, R.H. Meteorología Aplicada a la Seguridad de las Operaciones Aéreas, 1st ed.; Number 1; Escuela de Postgrados de la Fuerza Aérea Colombiana: Bogotá, Colombia, 2017; p. 142.
  4. Skamarock, W.; Klemp, J.; Dudhia, J.; Gill, D.; Zhiquan, L.; Berner, J.; Wang, W.; Powers, J.; Duda, M.G.; Barker, D.M.; et al. A Description of the Advanced Research WRF Model Version 4. In NCAR Technical Note NCAR/TN-475+STR; National Center for Atmospheric Research: Boulder, CO, USA, 2019; p. 145.
  5. Posada-Marín, J.A.; Rendón, A.M.; Salazar, J.F.; Mejía, J.F.; Villegas, J.C. WRF downscaling improves ERA-Interim representation of precipitation around a tropical Andean valley during El Niño: Implications for GCM-scale simulation of precipitation over complex terrain. Clim. Dyn. 2018, 52, 3609–3629.
  6. Luna, M.G.; Ceron, L.C.B.; Clappier, A. Implementation and validation of the performance of meteorological modeling with WRF in Colombian cities. In Proceedings of the 2019 Congreso Colombiano y Conferencia Internacional de Calidad de Aire y Salud Publica, CASAP 2019, Barranquilla, Colombia, 14–16 August 2019; pp. 8–11.
  7. Henao, J.J.; Mejía, J.F.; Rendón, A.M.; Salazar, J.F. Sub-kilometer dispersion simulation of a CO tracer for an inter-Andean urban valley. Atmos. Pollut. Res. 2020, 11, 928–945.
  8. Arregocés, H.A.; Rojano, R.; Restrepo, G. Sensitivity analysis of planetary boundary layer schemes using the WRF model in Northern Colombia during 2016 dry season. Dyn. Atmos. Ocean. 2021, 96, 101261.
  9. Hinestroza-Ramirez, J.E.; Rengifo-Castro, J.D.; Quintero, O.L.; Botero, A.Y.; Rendon-Perez, A.M. Non-Parametric and Robust Sensitivity Analysis of the Weather Research and Forecast (WRF) Model in the Tropical Andes Region. Atmosphere 2023, 14, 23.
  10. Fernández-González, S.; Martín, M.L.; García-Ortega, E.; Merino, A.; Lorenzana, J.; Sánchez, J.L.; Valero, F.; Rodrigo, J.S. Sensitivity analysis of the WRF model: Wind-resource assessment for complex terrain. J. Appl. Meteorol. Climatol. 2018, 57, 733–753.
  11. Wu, C.; Luo, K.; Wang, Q.; Fan, J. Simulated potential wind power sensitivity to the planetary boundary layer parameterizations combined with various topography datasets in the weather research and forecasting model. Energy 2022, 239, 122047.
  12. Jee, J.B.; Kim, S. Sensitivity study on high-resolution WRF precipitation forecast for a heavy rainfall event. Atmosphere 2017, 8, 96.
  13. Jankov, I.; Gallus, W.A.; Segal, M.; Koch, S.E. Influence of initial conditions on the WRF-ARW Model QPF response to physical parameterization changes. Weather Forecast. 2007, 22, 501–519.
  14. Carvalho, D.; Rocha, A.; Gómez-Gesteira, M.; Santos, C. A sensitivity study of the WRF model in wind simulation for an area of high wind energy. Environ. Model. Softw. 2012, 33, 23–34.
  15. Carvalho, D.; Rocha, A.; Gómez-Gesteira, M.; Silva Santos, C. Sensitivity of the WRF model wind simulation and wind energy production estimates to planetary boundary layer parameterizations for onshore and offshore areas in the Iberian Peninsula. Appl. Energy 2014, 135, 234–246.
  16. Bolgiani, P.; Fernández-González, S.; Valero, F.; Merino, A.; García-Ortega, E.; Sánchez, J.L.; Martín, M.L. Numerical simulation of a heavy precipitation event in the vicinity of Madrid-Barajas International Airport: Sensitivity to initial conditions, domain resolution, and microphysics parameterizations. Atmosphere 2018, 9, 329.
  17. Falasca, S.; Gandolfi, I.; Argentini, S.; Barnaba, F.; Casasanta, G.; Di Liberto, L.; Petenko, I.; Curci, G. Sensitivity of near-surface meteorology to PBL schemes in WRF simulations in a port-industrial area with complex terrain. Atmos. Res. 2021, 264, 105824.
  18. Hakami, A.; Henze, D.K.; Seinfeld, J.H.; Singh, K.; Sandu, A.; Kim, S.; Byun; Li, Q. The Adjoint of CMAQ. Environ. Sci. Technol. 2007, 41, 7807–7817.
  19. Montoya, O.L.; Niño-Ruiz, E.D.; Pinel, N. On the mathematical modelling and data assimilation for air pollution assessment in the Tropical Andes. Environ. Sci. Pollut. Res. 2020, 27, 35993–36012.
  20. Carrassi, A.; Bocquet, M.; Bertino, L.; Evensen, G. Data assimilation in the geosciences: An overview of methods, issues, and perspectives. Wiley Interdiscip. Rev. Clim. Chang. 2018, 9, 1–51.
  21. Farchi, A.; Bocquet, M. On the Efficiency of Covariance Localisation of the Ensemble Kalman Filter Using Augmented Ensembles. Front. Appl. Math. Stat. 2019, 5, 1–15.
  22. Bannister, R.N. A review of operational methods of variational and ensemble-variational data assimilation. Q. J. R. Meteorol. Soc. 2017, 143, 607–633.
  23. Giannaros, C.; Kotroni, V.; Lagouvardos, K.; Giannaros, T.M.; Pikridas, C. Assessing the impact of GNSS ZTD data assimilation into the WRF modeling system during high-impact rainfall events over Greece. Remote Sens. 2020, 12, 383.
  24. Koopmans, S.; van Haren, R.; Theeuwes, N.; Ronda, R.; Uijlenhoet, R.; Holtslag, A.A.; Steeneveld, G.J. The set-up and evaluation of fine-scale data assimilation for the urban climate of Amsterdam. Q. J. R. Meteorol. Soc. 2023, 149, 171–191.
  25. Kawabata, T.; Kuroda, T.; Seko, H.; Saito, K. A Cloud-Resolving 4DVAR Assimilation Experiment for a Local Heavy Rainfall Event in the Tokyo Metropolitan Area. Mon. Weather Rev. 2011, 139, 1911–1931.
  26. Wang, H.; Sun, J.; Zhang, X.; Huang, X.Y.; Auligné, T. Radar Data Assimilation with WRF 4D-Var. Part I: System Development and Preliminary Testing. Mon. Weather Rev. 2013, 141, 2224–2244.
  27. Liu, J.; Bray, M.; Han, D. A study on WRF radar data assimilation for hydrological rainfall prediction. Hydrol. Earth Syst. Sci. 2013, 17, 3095–3110.
  28. Liu, C.; Xiao, Q.; Wang, B. An Ensemble-Based Four-Dimensional Variational Data Assimilation Scheme. Part II: Observing System Simulation Experiments with Advanced Research WRF (ARW). Mon. Weather Rev. 2009, 137, 1687–1704.
  29. Zhang, F.; Meng, Z.; Poterjoy, J. E3DVar: Coupling an Ensemble Kalman Filter with Three-Dimensional Variational Data Assimilation in a Limited-Area Weather Prediction Model and Comparison to E4DVar. Mon. Weather Rev. 2013, 141, 900–917.
  30. Wang, X.; Barker, D.M.; Snyder, C.; Hamill, T.M. A Hybrid ETKF–3DVAR Data Assimilation Scheme for the WRF Model. Part I: Observing System Simulation Experiment. Mon. Weather Rev. 2008, 136, 5116–5131.
  31. Lam, M.; Fung, J.C. Model sensitivity evaluation for 3DVAR data assimilation applied on WRF with a nested domain configuration. Atmosphere 2021, 12, 682.
  32. Kim, D.H.; Kim, H.M. Effect of data assimilation in the Polar WRF with 3DVAR on the prediction of radiation, heat flux, cloud, and near surface atmospheric variables over Svalbard. Atmos. Res. 2022, 272, 106155.
  33. Sun, W.; Liu, Z.; Davis, C.A.; Ralph, F.M.; Monache, L.D.; Zheng, M. Impacts of dropsonde and satellite observations on the forecasts of two atmospheric-river-related heavy rainfall events. Atmos. Res. 2022, 278, 106327.
  34. WMO. Manual on the Global Data-Processing and Forecasting System (WMO-No. 485): Annex IV to the WMO Technical Regulations; Number 485; World Meteorological Organization (WMO): Geneva, Switzerland, 2019; p. 148.
  35. ECMWF’s Portal. ECMWF’s Forecast Evaluation. Available online: https://www.ecmwf.int/en/research/modelling-and-prediction/forecast-evaluation (accessed on 11 April 2023).
  36. Tiwari, G.; Kumar, P. Predictive skill comparative assessment of WRF 4DVar and 3DVar data assimilation: An Indian Ocean tropical cyclone case study. Atmos. Res. 2022, 277, 106288.
  37. Agnihotri, G. Evaluation of operational forecasts from weather research and forecasting model during southwest monsoon 2011 using MET 3.0. Mausam 2015, 66, 423–432.
  38. Caron, M.; Steenburgh, W.J. Evaluation of recent NCEP operational model upgrades for cool-season precipitation forecasting over the western conterminous United States. Weather Forecast. 2020, 35, 857–877.
  39. Brunner, D.; Savage, N.; Jorba, O.; Eder, B.; Giordano, L.; Badia, A.; Balzarini, A.; Baró, R.; Bianconi, R.; Chemel, C.; et al. Comparative analysis of meteorological performance of coupled chemistry-meteorology models in the context of AQMEII phase 2. Atmos. Environ. 2015, 115, 470–498.
  40. Lian, J.; Wu, L.; Bréon, F.M.; Broquet, G.; Vautard, R.; Zaccheo, T.S.; Dobler, J.; Ciais, P. Evaluation of the WRF-UCM mesoscale model and ECMWF global operational forecasts over the Paris region in the prospect of tracer atmospheric transport modeling. Elementa 2018, 6, 64.
  41. Courtier, P.; Thépaut, J.N.; Hollingsworth, A. A strategy for operational implementation of 4D-Var, using an incremental approach. Q. J. R. Meteorol. Soc. 1994, 120, 1367–1387.
  42. Barker, D.M.; Huang, W.; Guo, Y.R.; Bourgeois, A.J.; Xiao, Q.N. A three-dimensional variational data assimilation system for MM5: Implementation and initial results. Mon. Weather Rev. 2004, 132, 897–914.
  43. Asch, M.; Bocquet, M.; Nodet, M. Data Assimilation: Methods, Algorithms, and Applications; Fundamentals of Algorithms; Society for Industrial and Applied Mathematics: Philadelphia, PA, USA, 2016; p. 310.
  44. Robert, C.; Durbiano, S.; Blayo, E.; Verron, J.; Blum, J.; Le Dimet, F.X. A reduced-order strategy for 4D-Var data assimilation. J. Mar. Syst. 2005, 57, 70–82.
  45. Descombes, G.; Auligné, T.; Vandenberghe, F.; Barker, D.M.; Barré, J. Generalized background error covariance matrix model (GEN-BE v2.0). Geosci. Model Dev. 2015, 8, 669–696.
  46. Sun, J.; Wang, H.; Tong, W.; Zhang, Y.; Lin, C.Y.; Xu, D. Comparison of the impacts of momentum control variables on high-resolution variational data assimilation and precipitation forecasting. Mon. Weather Rev. 2016, 144, 149–169.
  47. Cáceres, R.; Codina, B. Radar data assimilation impact over nowcasting a mesoscale convective system in Catalonia using the WRF model. Tethys 2018, 2018, 3–17.
  48. Cáceres, R. Impacto de la Asimilación Radar en el Pronóstico de Precipitación a Muy Corto Plazo Usando el Modelo WRF. Ph.D. Thesis, Universidad de Barcelona, Barcelona, Spain, 2018.
  49. NCEP GFS 0.25 Degree Global Forecast Grids Historical Archive. Available online: https://rda.ucar.edu/datasets/ds084.1/ (accessed on 19 July 2023).
  50. Hong, S.; Lim, J. The WRF single-moment 6-class microphysics scheme (WSM6). Asia-Pac. J. Atmos. Sci. 2006, 42, 129–151.
  51. Chou, M.D.; Suarez, M.J. Technical report series on global modeling and data assimilation. Volume 3: An efficient thermal infrared radiation parameterization for use in general circulation models. NASA Tech. Rep. NASA/TM-1999-10460 1994, 3, 102.
  52. Collins, W.; Rasch, P.J.; Boville, B.A.; McCaa, J.R.; Williamson, D.L.; Kiehl, J.T.; Bruce, B. Description of the NCAR Community Atmosphere Model (CAM 3.0); Technical Report; University Corporation for Atmospheric Research: Boulder, CO, USA, 2004.
  53. Zaviša, I.J. The Step-Mountain Eta Coordinate Model: Further Developments of the Convection, Viscous Sublayer, and Turbulence Closure Schemes. Mon. Weather Rev. 1994, 122, 927–945.
  54. Kain, J.S. The Kain–Fritsch Convective Parameterization: An Update. J. Appl. Meteorol. Climatol. 2004, 43, 170–181.
  55. NOAA. GOES-R Series Satellites. 2017. Available online: https://www.ncdc.noaa.gov/data-access/satellite-data/goes-r-series-satellites (accessed on 12 February 2020).
  56. UWYO. UWYO’s Sounding Portal. Available online: https://weather.uwyo.edu/upperair/sounding.html (accessed on 12 April 2023).
  57. Jiménez García, M. Validación de la Capacidad de Modelo WRF “Weather Research and Forecasting” para Pronósticar Lluvia Intensa, Usando el Método Orientado a Objetos y Tablas de Contingencia. Master’s Thesis, Universidad Nacional de Colombia, Bogotá, Colombia, 2014.
  58. Huffman, G.J.; Bolvin, D.T.; Braithwaite, D.; Hsu, K.L.; Joyce, R.J.; Kidd, C.; Nelkin, E.J.; Sorooshian, S.; Stocker, E.F.; Tan, J.; et al. Integrated multi-satellite retrievals for the global precipitation measurement (GPM) mission (IMERG). In Satellite Precipitation Measurement: Volume 1; Springer: Cham, Switzerland, 2020; pp. 343–353.
  59. Valencia, S.; Marín, D.E.; Gómez, D.; Hoyos, N.; Salazar, J.F.; Villegas, J.C. Spatio-temporal assessment of gridded precipitation products across topographic and climatic gradients in Colombia. Atmos. Res. 2023, 285, 106643.
  60. Lingard, J.; Labrador, L.; Brookes, D.; Fraser, A. Statistical Evaluation of the Input Meteorological Data Used for the UK Air Quality Forecast (UK-AQF); Technical Report 1; Department for Environment, Food and Rural Affairs: Harwell, Didcot, UK, 2013.
  61. Wilks, D.S. Statistical Methods in the Atmospheric Sciences; Elsevier: Amsterdam, The Netherlands, 2011.
Figure 1. Visual representation of the SIMFAC’s domains utilized in the WRF model over Colombia, superimposed on the topography of the region.
Figure 2. Scheme of the SIMFAC’s 3DVAR implementation and model evaluation procedure in the simulation window selected.
Figure 3. Observations used in the simulation of window W1. (a) METAR, (b) radio soundings, (c) SYNOP, (d) radar, (e) radiances from GOES-16, and (f) brightness temperature.
Figure 4. Representation of the temperature for the METAR data, 3DVAR, and CTRL for (a) Barranquilla, (b) Bogotá, (c) Cali, and (d) Rionegro in the forecast window W1.
Figure 5. Representation of the surface pressure for the METAR data, 3DVAR, and CTRL for (a) Barranquilla, (b) Bogotá, (c) Cali, and (d) Rionegro in the forecast window W1.
Figure 6. Spatiotemporal differences between the 3DVAR and CTRL for the two forecast windows (a) W1 and (b) W2 for the accumulated precipitation (accum. ppt) in D01 and D02.
Figure 7. Differences between the CTRL and 3DVAR implementation in W2, D01. (a) Difference in the precipitation (PPT) near the start of the simulation, (b) difference in the accumulated precipitation, (c) hourly precipitation (continuous), and accumulated precipitation (dotted lines).
Figure 8. Comparison between the observed precipitation in the IMERG database and the CTRL and 3DVAR output. (a) Accumulated precipitation from the IMERG data, (b) accumulated precipitation from the CTRL output, (c) accumulated precipitation from the 3DVAR output, and (d) percent bias of the precipitation for the 3DVAR and CTRL cases. The fields in (a–c) are the accumulated precipitation over the entire forecast window.
Figure 9. (a) Differences between the IMERG and 3DVAR, (b) differences between the IMERG and CTRL, and (c) the MFB for the CTRL and 3DVAR implementations with respect to the IMERG.
Figure 10. Comparison of the percentages of the success of the model to detect precipitation according to the regions in the operational windows. (a) Results in forecast window W1. (b) Results for forecast window W2.
Figure 11. The MFB for the September period in the different regions of Colombia: (a) Caribbean, (b) Pacific, (c) Andean, (d) Orinoquean, (e) Amazonian.
Table 1. Number of grid points per domain in the WRF model configuration.
| Domain | Latitude | Longitude | Grid |
|---|---|---|---|
| D01 (9 km) | (−8.42264, 17.5811) | (−85.1025, −60.8975) | 300 × 326 |
| D02 (3 km) | (−4.90034, 12.8275) | (−80.0936, −65.7439) | 532 × 661 |
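As a quick plausibility check on the Table 1 domains (not taken from the paper), the latitude span of each domain divided by its north–south grid count should land near the nominal spacing; the sketch below assumes the second grid dimension is the north–south one.

```python
# Hypothetical sanity check: latitude span over the (assumed) north-south
# grid count should approximate the nominal resolution of each domain.
def approx_dy_km(lat_min, lat_max, n_points, km_per_deg=111.0):
    """Rough north-south grid spacing in km (1 degree latitude ~ 111 km)."""
    return (lat_max - lat_min) * km_per_deg / n_points

d01 = approx_dy_km(-8.42264, 17.5811, 326)  # close to the nominal 9 km
d02 = approx_dy_km(-4.90034, 12.8275, 661)  # close to the nominal 3 km
```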
Table 2. The WRF model setup in the SIMFAC operational implementation.
| Group | Parameter | Selection in WRF |
|---|---|---|
| Domain settings | Coordinate system | Mercator |
| | Vertical levels | 42 |
| | Nesting | One-way, run using adaptive time steps |
| Physics settings | PBL scheme | MYJ |
| | Cumulus option | KF |
| | Longwave scheme | CAM |
| | Shortwave scheme | Goddard |
| | Microphysics | WSM 6-class graupel scheme |
| | Land surface | Noah Land Surface Model |
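The Table 2 physics choices map onto WRF namelist options roughly as sketched below. The option indices follow the WRF v4 documentation; the actual SIMFAC namelist is not shown in the paper and may differ.

```
&physics
 mp_physics         = 6,   ! WSM 6-class graupel microphysics
 cu_physics         = 1,   ! Kain-Fritsch cumulus
 ra_lw_physics      = 3,   ! CAM longwave
 ra_sw_physics      = 5,   ! Goddard shortwave
 bl_pbl_physics     = 2,   ! Mellor-Yamada-Janjic PBL
 sf_surface_physics = 2,   ! Noah land-surface model
/
&domains
 use_adaptive_time_step = .true.,
 feedback               = 0,   ! one-way nesting
/
```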
Table 3. Contingency table used to evaluate precipitation as a categorical event.
| Event Forecast | Observed: Yes | Observed: No | Marginal Total |
|---|---|---|---|
| Yes | Hit | False Alarm | Fc Yes |
| No | Miss | Correct Rejection | Fc No |
| Marginal Total | Obs Yes | Obs No | Sum Total |
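From the Table 3 cells, the detection scores reported later reduce to two ratios; a minimal sketch follows. The counts in the usage line are hypothetical, chosen only for illustration, not taken from the study.

```python
def pod_far(hits, misses, false_alarms):
    """Probability of detection and false-alarm ratio from contingency counts."""
    pod = hits / (hits + misses)                # observed events that were forecast
    far = false_alarms / (hits + false_alarms)  # "yes" forecasts that did not verify
    return pod, far

# Illustrative counts only (not the study's actual tallies):
pod, far = pod_far(hits=20, misses=110, false_alarms=115)
```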
Table 4. The RMSE, MFB, and CF metrics for the temperature and pressure in the forecast window W1.
| Variable | City | MFB (3DVAR) | RMSE (3DVAR) | CF (3DVAR) | MFB (CTRL) | RMSE (CTRL) | CF (CTRL) |
|---|---|---|---|---|---|---|---|
| Temperature (°C) | Barranquilla | 0.047 | 2.835 | 0.617 | 0.0489 | 2.817 | 0.619 |
| | Bogotá | 0.140 | 2.343 | 0.942 | 0.154 | 2.473 | 0.937 |
| | Cali | −0.004 | 1.687 | 0.836 | −0.009 | 1.622 | 0.851 |
| | Rionegro | −0.006 | 3.119 | 0.548 | −0.004 | 3.183 | 0.542 |
| Pressure (hPa) | Barranquilla | 0.445 | 113 | 0.714 | 0.419 | 108 | 0.726 |
| | Bogotá | −0.475 | 98.95 | 0.787 | −0.549 | 100 | 0.781 |
| | Cali | 0.294 | 125 | 0.798 | 0.313 | 122 | 0.817 |
| | Rionegro | 0.866 | 192 | 0.066 | 0.434 | 192 | 0.066 |
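The scores in Tables 4 and 5 can be sketched with their common textbook definitions; the paper's exact conventions (for instance, the MFB normalization) are assumed here rather than quoted from it.

```python
import numpy as np

def rmse(model, obs):
    """Root mean square error."""
    return float(np.sqrt(np.mean((model - obs) ** 2)))

def mfb(model, obs):
    """Mean fractional bias, bounded in [-2, 2]; 0 means no bias."""
    return float(np.mean(2.0 * (model - obs) / (model + obs)))

def cf(model, obs):
    """Correlation factor (Pearson r)."""
    return float(np.corrcoef(model, obs)[0, 1])
```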
Table 5. The RMSE, MFB and CF metrics for the temperature and pressure in the forecast window W2.
| Variable | City | MFB (3DVAR) | RMSE (3DVAR) | CF (3DVAR) | MFB (CTRL) | RMSE (CTRL) | CF (CTRL) |
|---|---|---|---|---|---|---|---|
| Temperature (°C) | Barranquilla | −0.026 | 1.535 | 0.810 | −0.028 | 1.589 | 0.800 |
| | Bogotá | 0.032 | 2.088 | 0.870 | 0.028 | 2.040 | 0.876 |
| | Cali | −0.029 | 3.577 | 0.733 | −0.033 | 3.596 | 0.730 |
| | Rionegro | −0.039 | 3.669 | 0.584 | −0.039 | 3.591 | 0.604 |
| Pressure (hPa) | Barranquilla | 0.518 | 91.672 | 0.878 | 0.4732 | 89.316 | 0.877 |
| | Bogotá | −1.145 | 84.516 | 0.882 | −0.621 | 83.177 | 0.884 |
| | Cali | 0.313 | 106.192 | −0.156 | 0.342 | 106.202 | −0.158 |
| | Rionegro | −1.219 | 183.950 | 0.111 | −0.907 | 1.839 | 0.1178 |
Table 6. Results of the contingency tables for the 3DVAR and CTRL simulations in the two windows of study, W1 and W2.
| Window | POD (3DVAR) | FAR (3DVAR) | POD (CTRL) | FAR (CTRL) |
|---|---|---|---|---|
| W1 | 0.154 | 0.852 | 0.154 | 0.846 |
| W2 | 0.556 | 0.737 | 0.500 | 0.750 |

Hinestroza-Ramirez, J.E.; Soto Barbosa, J.E.; Yarce Botero, A.; Suárez Higuita, D.A.; Lopez-Restrepo, S.; Cruz Ruiz, L.M.; Sólorzano Araque, V.; Céspedes, A.; Lorduy Hernandez, S.; Caceres, R.; et al. Evaluation of the 3DVAR Operational Implementation of the Colombian Air Force for Aircraft Operations: A Case Study. Climate 2023, 11, 153. https://doi.org/10.3390/cli11070153
