Accelerated Time and High-Resolution 3D Modeling of the Flow and Dispersion of Noxious Substances over a Gigantic Urban Area—The EMERGENCIES Project

Abstract: Accidental or malicious releases in the atmosphere are more likely to occur in built-up areas, where flow and dispersion are complex. The EMERGENCIES project aims to demonstrate the operational feasibility of three-dimensional simulation as a support tool for emergency teams and first responders. The simulation domain covers a gigantic urban area around Paris, France, and uses high-resolution metric grids. It relies on the PMSS modeling system to model the flow and dispersion over this gigantic domain and on the Code_Saturne model to simulate both the close vicinity and the inside of several buildings of interest. Accelerated time is achieved through the parallel algorithms of the models. Calculations rely on a two-step approach: the flow is computed in advance using meteorological forecasts, and then on-demand release scenarios are performed. Results obtained with actual meteorological mesoscale data and realistic releases occurring both inside and outside of buildings are presented and discussed. They prove the feasibility of operational use by emergency teams in cases of atmospheric release of hazardous materials.


Introduction
Many types of accidents, malicious actions, or terrorist attacks lead to the release of noxious gases into the atmosphere. Densely built-up and highly populated areas, such as industrial sites or urban districts, are the critical locations for such events. The health effects are most severe close to the source of emission. On a local scale, the buildings significantly influence the airflow and, consequently, the dispersion. Thus, atmospheric dispersion models need to account for the influence of the three-dimensional (3D) geometry of the buildings on the flow.
Up to now, first responders and emergency teams have essentially relied on modified Gaussian modeling. Most fast response modeling tools indeed use simplified flow formulations in the urban canopy and analytical concentration solutions with assumptions about the initial size of the plume and increased turbulence due to the urban environment: for example, see ADMS-Urban [1], PRIME [2], SCIPUFF [3], and the dispersion model tested by Hanna and Baja [4]. However, Gaussian approaches do not apply to the complex flow patterns in the vicinity of the release in built-up areas, where the most serious consequences of the release occur. Further away from the source, they cannot model specific effects due to buildings, such as the entrapment and subsequent release of noxious substances in street canyons, cavity zones behind buildings, etc. Field experiments, such as the Mock Urban Setting Test (MUST) field experiment [5], clearly demonstrated the importance of high-resolution modeling in complex built-up areas: critical effects, such as

The PSWIFT model computes the flow field in three steps:
• First guess computation starting from heterogeneous meteorological input data, mixing surface and profile measurements with mesoscale model outputs;
• Modification of the first guess using analytical zones defined around isolated buildings or within groups of buildings [14,15];
• Mass consistency with impermeability on the ground and buildings.
The PSWIFT model also incorporates a RANS solver [24,25]. It can be used as an alternative in the third step above. The RANS solver provides a more accurate pressure field: this is essential to derive the pressure on the facades of buildings and model the infiltration inside.
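To illustrate how a facade pressure field can drive an infiltration estimate, the sketch below applies a generic power-law crack-flow model, Q = C·|ΔP|^n, which is standard in building physics. The coefficient C, the exponent n, and the function name are illustrative assumptions, not PSWIFT internals.

```python
# Hedged sketch: generic power-law crack-flow model for infiltration,
# Q = C * |dP|^n, driven by a facade pressure difference such as the one
# computed by a RANS solver. C and n are generic building-leakage values,
# not PSWIFT parameters.

def infiltration_rate(delta_p_pa: float, C: float = 0.01, n: float = 0.65) -> float:
    """Volume flow rate (m^3/s) through a leakage path, signed by the
    pressure difference delta_p_pa (Pa): positive means inward flow."""
    sign = 1.0 if delta_p_pa >= 0.0 else -1.0
    return sign * C * abs(delta_p_pa) ** n

# Example: a 10 Pa overpressure on the windward facade
print(infiltration_rate(10.0))  # ~0.045 m^3/s into the building
```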
The PSPRAY model [23,26,27] is a parallelized LPDM [16] that takes obstacles into account. It simulates the dispersion of an airborne contaminant by following the trajectories of numerous numerical particles. The velocity of each virtual particle is the sum of a transport component and a turbulence component. The transport component is derived from the local average wind vector, while the turbulence component is derived from the stochastic scheme developed by Thomson [28], which solves a 3D form of the Langevin equation. This equation comprises a deterministic term, which depends on the Eulerian probability density function of the turbulent velocity and is determined from the Fokker-Planck equation, and a stochastic diffusion term, which is obtained from a Lagrangian structure function. The PSPRAY model treats elevated and ground-level emissions; instantaneous, continuous, or time-varying sources; plumes with arbitrarily oriented initial momentum; negative or positive buoyancy; and cloud spread at the ground due to gravity.
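For readers unfamiliar with LPDMs, the following minimal sketch integrates a 1D Langevin equation for a cloud of particles under homogeneous, stationary turbulence, the simplest case to which the Thomson scheme reduces. All numerical values are illustrative, not PSPRAY parameters.

```python
import numpy as np

# Hedged sketch: 1D Langevin dispersion for homogeneous, stationary
# turbulence. sigma_u (std dev of the turbulent velocity) and T_L
# (Lagrangian time scale) are illustrative values only.
rng = np.random.default_rng(0)
sigma_u, T_L, dt = 0.5, 100.0, 1.0          # m/s, s, s
U_mean = 3.0                                # mean transport component (m/s)
n_steps, n_particles = 600, 10_000

x = np.zeros(n_particles)                   # particle positions (m)
up = rng.normal(0.0, sigma_u, n_particles)  # turbulent velocity u' (m/s)

for _ in range(n_steps):
    # Deterministic drift (consistent with the Fokker-Planck equation in
    # the homogeneous case) plus the stochastic diffusion term b*dW.
    dW = rng.normal(0.0, np.sqrt(dt), n_particles)
    up += -(up / T_L) * dt + sigma_u * np.sqrt(2.0 / T_L) * dW
    x += (U_mean + up) * dt                 # transport + turbulence

print(f"centroid {x.mean():.0f} m, spread {x.std():.0f} m")
```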

Scalability
In order to allow for very large calculations and near-real-time calculations, the PMSS modeling system integrates parallelism using both weak and strong scalability [17].
The weak scalability relies on domain decomposition (DD). The domain is divided into tiles that are compatible in size with the memory available to a single core. The strong scalability is implemented differently within the PSWIFT and PSPRAY models. Within the PSWIFT model, strong scalability relies on the diagnostic property of the code: time frames are computed in parallel. Within the PSPRAY model, strong scalability is achieved by distributing the virtual particles among the computing cores. An example of a combination of strong and weak scaling is illustrated for the PSWIFT and PSPRAY models in Figure 1.
The PMSS modeling system also has the ability to handle multiple nested domains. The PSPRAY model can hence use a nested domain computed by the PSWIFT flow model or by another flow model such as, in our case, Code_Saturne (see Section 2.2). The results of this flow model on the nested domain must be stored in the same binary format as a PSWIFT calculation, and they must contain at least wind velocity and turbulence fields.
The PMSS modeling system has been validated both in scalar mode [22,26,27] and regarding parallel algorithms [17,20]. The parallel testing was performed in computational environments ranging from a multicore laptop to several hundred cores in a high-performance computing center.
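The tile bookkeeping behind the weak scalability can be sketched as follows. The one-grid-line overlap between neighboring tiles and all names are assumptions for illustration, not the actual PMSS implementation; with the EMERGENCIES grid dimensions introduced later, the sketch recovers the tile count reported for the Paris domain.

```python
from dataclasses import dataclass

# Hedged sketch of domain decomposition into fixed-size tiles,
# each small enough for the memory of a single core.

@dataclass
class Tile:
    ix: int   # tile column (west-east)
    iy: int   # tile row (south-north)
    i0: int   # first grid index along x
    j0: int   # first grid index along y
    nx: int   # grid points along x
    ny: int   # grid points along y

def decompose(nx_total: int, ny_total: int, tile_pts: int = 401) -> list[Tile]:
    """Split a large horizontal grid into tiles of at most tile_pts points
    per side, neighboring tiles sharing one line of grid points."""
    step = tile_pts - 1
    tiles = []
    for iy, j0 in enumerate(range(0, ny_total - 1, step)):
        for ix, i0 in enumerate(range(0, nx_total - 1, step)):
            tiles.append(Tile(ix, iy, i0, j0,
                              min(tile_pts, nx_total - i0),
                              min(tile_pts, ny_total - j0)))
    return tiles

# With the EMERGENCIES grid of 12,668 x 13,335 points (see Section 3),
# this yields the 1088 tiles (32 x 34) used over the Paris domain.
print(len(decompose(12_668, 13_335)))  # -> 1088
```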

Presentation of the Code_Saturne Model
In an urban area, one cannot exclude transfers of pollutants from outside to the inner part of buildings or, vice versa, pollution originating in a building and being transferred outside. Neither the diagnostic nor the momentum version of PSWIFT is appropriate to model the flow inside a building. Thus, computational fluid dynamics (CFD) is needed; more specifically, we used the Code_Saturne model [29], an open-source general-purpose and environmental application-oriented CFD model.
The Code_Saturne model (http://code-saturne.org, accessed on 17 May 2021) is an open-source CFD model developed at EDF R&D. Based on a finite volume method, it simulates incompressible or compressible, laminar and turbulent flows in complex 2D and 3D geometries. Code_Saturne solves the RANS equations for continuity, momentum, energy, and turbulence. The turbulence modeling uses eddy-viscosity models, such as k-epsilon, or second-moment closures. The time-marching scheme is based on a prediction of velocity followed by a pressure correction step. Additional details are provided in [29].
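The prediction/correction idea can be illustrated with a toy fractional-step (projection) solver. The sketch below uses a doubly periodic spectral discretization for brevity; this is only a minimal illustration of the fractional-step principle, not how Code_Saturne (a finite-volume model on general meshes) is implemented.

```python
import numpy as np

# Hedged sketch: one prediction/pressure-correction (projection) scheme on
# a doubly periodic 2D grid, solved spectrally for simplicity.
n, dt, nu = 64, 1e-2, 1e-2
L = 2.0 * np.pi
x = np.arange(n) * (L / n)
k = 2.0 * np.pi * np.fft.fftfreq(n, d=L / n)
KX, KY = np.meshgrid(k, k, indexing="ij")
K2 = KX**2 + KY**2
K2_safe = np.where(K2 == 0.0, 1.0, K2)     # avoid dividing by the mean mode

def ddx(f): return np.real(np.fft.ifftn(1j * KX * np.fft.fftn(f)))
def ddy(f): return np.real(np.fft.ifftn(1j * KY * np.fft.fftn(f)))
def lap(f): return np.real(np.fft.ifftn(-K2 * np.fft.fftn(f)))

# Divergence-free (Taylor-Green) initial velocity field
u = np.sin(x)[:, None] * np.cos(x)[None, :]
v = -np.cos(x)[:, None] * np.sin(x)[None, :]

for _ in range(100):
    # 1) Prediction: advance momentum without the pressure gradient;
    #    the predicted field u* is not divergence-free.
    u_s = u + dt * (-u * ddx(u) - v * ddy(u) + nu * lap(u))
    v_s = v + dt * (-u * ddx(v) - v * ddy(v) + nu * lap(v))
    # 2) Correction: solve lap(p) = div(u*)/dt, then project
    #    u = u* - dt * grad(p), which restores div(u) = 0.
    rhs_hat = np.fft.fftn(ddx(u_s) + ddy(v_s)) / dt
    p_hat = -rhs_hat / K2_safe
    p_hat[0, 0] = 0.0
    p = np.real(np.fft.ifftn(p_hat))
    u, v = u_s - dt * ddx(p), v_s - dt * ddy(p)

print(np.abs(ddx(u) + ddy(v)).max())  # divergence ~ machine precision
```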
The Code_Saturne model also has an atmospheric option [7]. The model has been used extensively, not only for atmospheric flow but also to model the flow within buildings, including the complicated setup of an intensive care hospital room [30].

The Computing Infrastructure
Modeling the flow and dispersion at high resolution over very large domains extending several tens of kilometers requires parallel domain decomposition using several hundreds or even thousands of cores of a supercomputer.
The simulations were carried out on a supercomputer consisting of 5040 B510 bull-X nodes, each with two eight-core Intel Sandy Bridge EP (E5-2680) processors at 2.7 GHz and with 64 GB of memory. The network is an InfiniBand QDR Full Fat Tree network. The file system offers 5 PB of disk storage.

The EMERGENCIES Project
The EMERGENCIES project aims to demonstrate the operational feasibility of accelerated time tracking of toxic atmospheric releases, be they accidental or intentional, in a large city and its buildings through 3D numerical simulation.
After describing the domain setup, we present the flow modeling and then the release scenarios.

Domain Setup
The modeling domain covers Greater Paris: it includes the City of Paris, the Hauts-de-Seine, the Seine-Saint-Denis, and the Val-de-Marne (see Figure 2). It extends to the airport of Orly, to the south, and to the airport of Roissy Charles de Gaulle, to the north. This geographic area is under the authority of the Paris Fire Brigade.
The simulation domain has a uniform horizontal resolution of 3 m and dimensions of 38.4 km × 40.8 km, leading to 12,668 × 13,335 points in a horizontal plane. The vertical grid has 39 points from the ground up to 1000 m, with a resolution of 1.5 m near the ground. The grid, therefore, has more than 6 billion points.
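A quick check of the grid size quoted above, using the values from the text:

```python
# Horizontal points x vertical levels, as stated in the text.
nx, ny, nz = 12_668, 13_335, 39
print(f"{nx * ny * nz:.4g} points")  # -> 6.588e+09, i.e., more than 6 billion
```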
The static data used for the modeling consist of the following:
• The topography at an original resolution of 25 m, taken from the BD Topo of the French National Geographical Institute (IGN);
• The land use data at the same gridded resolution, obtained from the CORINE Land Cover data [31];
• The building data, also taken from the BD Topo of IGN, consisting of 1.15 million polygons with height attributes.
These data have a disk footprint ranging from 1 GB for the topography or land use to 3 GB for the building data. Figure 3 presents a view of the topography and the building data.
This very large domain is treated using domain decomposition: the domain is divided into horizontal tiles of 401 × 401 grid points (see Figure 4). In total, 1088 tiles are used to cover the domain, 32 along the west-east axis and 34 along the south-north axis.
Three buildings of particular interest have been chosen: a museum (M), a train station (TS), and an administration building (A). Their locations are presented in Figure 5. They have been selected to demonstrate the capability of the system to provide relevant information regarding releases either propagating from the inner parts of buildings to the outside or occurring in the urban environment and then penetrating buildings. Three nested domains around M, TS, and A have been defined using a grid with a very high horizontal and vertical resolution both inside the buildings and in their vicinity. The characteristics of these nested domains are summarized in Table 1. The grid size is about 1 m horizontally and vertically.

Flow Simulations
The flow accounting for the buildings in the huge EMERGENCIES domain is downscaled from the forecasts of the mesoscale Weather Research and Forecasting (WRF) model [32]. These simulations run every day and require 100 computing cores for 2 h 10 min to simulate 72 h.
The WRF forecasts are extracted every hour on vertical profiles inside the huge urban simulation domain and provided as inputs to the PMSS modeling system. Thus, 24 timeframes of the microscale 3D flow around the buildings are computed by PSWIFT. Finally, the inflow conditions for flow and turbulence are extracted on the boundaries of the very high resolution nested domains to perform the Code_Saturne simulations.
These calculations can be performed routinely every day provided the computing resource is available. The flow and turbulence fields are then available every day in advance both for the full domain and the chosen nests. Accidental scenarios can then be simulated on an on-demand basis.

Dispersion Simulations
In the framework of the project, fictitious multiple attacks consisting of releases of substances with potential health consequences were considered. These substances could be radionuclides, chemical products, or pathogenic biological agents. Substances were assumed to be released inside or near the public buildings introduced in the previous section. The day of the release was chosen arbitrarily to be 23 October 2014. The releases were carried out in the morning, starting at 10 a.m. for the first one. The simulated period for the dispersion covers the 5 h between 10 a.m. and 3 p.m.
The flow conditions for 23 October 2014 show an average wind speed of 3 m/s at 20 m above ground with winds coming from the southwest.
Each release lasts 10 min, and the material emitted is considered to be either a gas without density effects or fine particulate matter, i.e., with aerodynamic diameters below 2.5 µm. One kilogram of material is emitted during each release. The first release occurs at 10 a.m. local time in the museum, the second at 11 a.m. inside the administration building, and the last starts at noon in front of the train station.
All these releases are purely fictional, but the intention is to apply the modeling system to a scenario that emergency teams and first responders could realistically face.
The nested modeling is handled in a two-way mode by the LPDM PSPRAY model within the PMSS modeling system: numerical particles move from the nested domains around the buildings where the attacks take place to the large domain encompassing the Paris area or, vice versa, from this large domain back to a nested domain.
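Conceptually, the two-way nesting amounts to sampling, at every time step, the flow of the finest domain that contains each particle. The sketch below illustrates this selection logic; the domain bounds, class names, and bounding-box test are hypothetical, not the PSPRAY data structures.

```python
from dataclasses import dataclass

# Hedged sketch of two-way nesting: each particle is advanced with the
# flow of the finest domain whose bounding box contains it.

@dataclass
class Domain:
    name: str
    xmin: float
    xmax: float
    ymin: float
    ymax: float
    zmax: float

@dataclass
class Particle:
    x: float
    y: float
    z: float

def active_domain(p: Particle, nests: list[Domain], large: Domain) -> Domain:
    """Return the finest domain whose bounding box contains particle p."""
    for dom in nests:                      # e.g., the M, TS, and A nests
        if (dom.xmin <= p.x < dom.xmax and dom.ymin <= p.y < dom.ymax
                and p.z < dom.zmax):
            return dom                     # sample the 1 m nest flow here
    return large                           # otherwise the 3 m Paris domain

# Illustrative coordinates only
paris = Domain("Paris", 0.0, 38_400.0, 0.0, 40_800.0, 1000.0)
nest_m = Domain("M", 10_000.0, 10_500.0, 12_000.0, 12_500.0, 120.0)
print(active_domain(Particle(10_100.0, 12_200.0, 5.0), [nest_m], paris).name)  # M
```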

Results and Discussion
In this section, the results obtained in the framework of the EMERGENCIES project are presented and discussed, taking into account the characteristics of the simulations and the associated computational resources (duration, CPU required, and storage). After describing the results for the flow simulations, we then present the results obtained for the dispersion before briefly discussing the constraints in rapidly and efficiently visualizing very large results.

Flow Simulations
After presenting the flow results for the large domain and the nested domains, we discuss the operational applicability of the system developed in the framework of the EMERGENCIES project.
The calculation for the flow on the whole domain using the PMSS modeling system requires a minimum of 1089 computing cores, i.e., one core per tile plus a master core. Time frames can be simulated in parallel. Several parallel configurations have been tested and are presented in Table 2. An example of the level of precision of the flow obtained throughout the whole domain is presented in Figure 6. The duration ranges from around 2 h 40 min using the minimum number of cores down to around 1 h 20 min when computing eight timeframes concurrently. Regarding the storage, a single time frame requires 200 GB for the whole domain, leading to around 8 TB for 24 timeframes.
When considering the simulation for the nested domains, the Code_Saturne CFD model has been set up using around 200 cores per nested domain and per time frame. It requires 14,400 cores to handle the 24 timeframes for each of the three domains. The simulation duration is then 100 min for the domain M, 69 min for the domain TS, and 64 min for the domain A. The storage per time frame is 281, 232, and 80 MB, respectively. The storage for the detailed computations in and around the buildings is low compared to the large urban domain covering Greater Paris, in contrast to the computational costs. Indeed, the CFD model is much more computationally intensive than the diagnostic approach used in the PMSS modeling system. An example of the flow both in the close vicinity of and inside the administration building is presented in Figure 7.
The flow modeling is intended for use on a daily basis: each day, a high-resolution flow modeling over the whole Paris area is simulated to be made available should any dispersion modeling of accidental or malevolent airborne releases be necessary.
After the WRF forecast modeling, from around 3 h down to no more than 1 h 20 min is required to downscale the flow over the large Parisian domain at 3 m resolution, depending on the number of cores dedicated (1089 and 8705, respectively).
Then, around 5000 cores are required to handle the 24 timeframes of each nest in less than 1 h 40 min. If the three very high resolution nested domains are all treated, around 10,000 cores need to be available for 2 h: 5000 cores to handle the domain TS for the first hour and the domain A for the second, and another 5000 cores to handle the domain M for 1 h 40 min.
Hence, the whole microscale flow modeling could be performed in 3 h 20 min using 10,000 cores.

Dispersion Simulations
After presenting the dispersion results for the large domain and the nested domains, we discuss the operational applicability of the system developed in the framework of the EMERGENCIES project.

The dispersion modeling benefits from two-way nested computations: numerical particles can move from the innermost 1 m resolution domains to the large domain over Paris and its vicinity, and vice versa. At each 5 s emission step, 40,000 numerical particles are emitted. Since there are three 10 min-long releases, 14.4 million particles are emitted. A parametric study regarding the number of cores is presented in Table 3. Concentrations are calculated every 10 min using 10 min averages.

Table 3. Influence of the number of cores on the duration of the dispersion simulation.

Number of Cores    Simulation Duration (min)
250                150
500                88
750                96
1000               120

Five hundred cores were retained for operational use. The simulation results produce 90 GB for 5 simulated hours. Eighty tiles of the large domain were reached by the plumes during the calculation, plus the three nested domains. Figure 8 illustrates the evolution of the plume inside, in the vicinity of, and further away from the museum.
In an operational context, the dispersion simulation would be activated on demand. Using 500 cores, the duration of the calculation, one and a half hours, has to be compared to the physical duration of five hours simulated here. The simulation is thus 3.3 times faster than reality. Moreover, the results can be analyzed on the fly, without requiring the whole simulation to be completed.

Visualization of the Simulation Results
Due to the large amount of data generated, the efficient visualization of data for operational use has proved to be a challenge in itself. After introducing a first attempt using a traditional scientific visualization application, we present a short overview of a more operative approach [33].
3D modeling results can be explored using 3D scientific visualization applications. Due to the large quantity of data, the viewer chosen has to handle heavy parallelization.
We chose the open-source software ParaView and developed a dedicated plugin. Nonetheless, interactive visualization using several hundred cores required several seconds or tens of seconds for refreshing: this is too slow for a user to navigate the results. Noninteractive image rendering has also been used (see Figure 1), but it would be ineffective for the user in an emergency situation.

Hence, we finally relied on a tiled web map approach. The results of the simulation are processed on the fly as they are produced [33]. Multiple vertical levels can be treated to provide views not solely at the ground but also in the volume at multiple heights above the ground. Processing takes a few minutes for each time frame since it consists only of a change of file format. Then, the results can be consulted in real time through a tile map service using any web browser. Figures 6, 8 and 9 use such an approach.
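Tiled web maps follow the standard "slippy map" z/x/y indexing. The snippet below shows the usual conversion from a geographic point to the tile that displays it; it only illustrates the tiling convention and does not reproduce the actual EMERGENCIES post-processing chain of [33].

```python
import math

# Hedged sketch: standard Web Mercator ("slippy map") tile indexing used
# by tile map services consumed in any web browser.
def latlon_to_tile(lat_deg: float, lon_deg: float, zoom: int) -> tuple[int, int, int]:
    """Return the (z, x, y) tile containing a geographic point."""
    n = 2 ** zoom
    xtile = int((lon_deg + 180.0) / 360.0 * n)
    lat = math.radians(lat_deg)
    ytile = int((1.0 - math.asinh(math.tan(lat)) / math.pi) / 2.0 * n)
    return zoom, xtile, ytile

print(latlon_to_tile(48.8566, 2.3522, 12))  # a tile covering central Paris
```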

Conclusions
The EMERGENCIES project proved it is feasible to provide emergency teams and first responders with high-resolution simulation results to support field decisions. The project showed that areas of responsibility extending over several tens of kilometers can be managed operationally. The domain, at a 3 m resolution, includes very high resolution modeling for specific buildings of interest, where both the outside and the inside of the buildings have been modeled at 1 m resolution.
The modeling is carried out in two steps: First, the modeling of the flow is computed in advance starting from mesoscale forecasts provided each day for the following day. Then, these flows are used for on-demand modeling of dispersion that can occur everywhere in the domain and at multiple locations.
The modeling of the flow for the 24 timeframes can be achieved in 3 h 20 min using 10,000 cores both for the large domain and the three nested domains. If 5000 cores are available, the large domain requires 1 h 30 min and each nested domain requires an additional duration ranging between 1 h and 1 h 40 min. Five hundred cores are then required to be ready for on-demand modeling of the dispersion: the dispersion is modeled 3.3 times faster than it occurs in real life.
This amount of computer power is large but not inconsistent with the means usually dedicated to crisis management.
The dispersion modeling requires fewer computing cores than the flow modeling. This allows the modeler to perform several dispersion calculations concurrently, particularly to create alternate scenarios for the source terms. This is of particular interest considering the uncertainties related to source term estimation. Successive scenarios may also be considered as information is updated and the emergency teams' comprehension of the situation improves.
A large amount of 3D data was produced: 200 GB for the flow and 90 GB for the dispersion. While data produced near the ground are of the utmost importance, the vertical distribution of concentration from the ground to a moderate elevation is also particularly relevant. Firstly, calculations of inflow and outflow exchanges for specific buildings of interest require such information. Secondly, the vertical distribution of concentration can support decisions on evacuation strategies for specific buildings, such as moving people toward the building rooftop rather than evacuating them at ground level into the neighboring streets.
One key aspect also put forward during the project is the difficulty of efficiently visualizing this huge amount of data produced. After focusing initially on a parallel scientific visualization application, we relied on on-the-fly data processing to create tiled web maps for the flow and the dispersion. These data can then be accessed in real time by the emergency team. This approach to visualizing data will be the focus of a dedicated paper.