Article

Transferring Fire Modelling Sciences into Augmented Reality: A Realistic and Safe Reconstructed Fire Scenario

1
School of Engineering, RMIT University, Melbourne, VIC 3000, Australia
2
School of Property Construction and Project Management, RMIT University, Melbourne, VIC 3000, Australia
3
Department of Building Environment and Energy Engineering, Hong Kong Polytechnic University, Hong Kong SAR, China
*
Author to whom correspondence should be addressed.
Fire 2025, 8(4), 132; https://doi.org/10.3390/fire8040132
Submission received: 3 February 2025 / Revised: 20 March 2025 / Accepted: 27 March 2025 / Published: 28 March 2025

Abstract

Fire emergencies present significant challenges to human safety, with evacuation success relying on situational awareness and informed decision-making. Traditional methods, such as rendered fire simulations and physical evacuation drills, often fail to capture the complexity of fire dynamics or provide realistic, immersive environments for evaluating human behaviour. To address these limitations, this study presents a novel augmented reality (AR) platform that integrates real-time, scientifically accurate fire dynamics simulations with immersive visualisations. Unlike existing approaches, the proposed AR workflow offers an end-to-end process, from geometry extraction, fire simulation, and data processing to visualisation in real-world settings. This enables a high-fidelity representation of flame structures and smoke layers, providing an interactive tool for studying evacuee behaviour. A primary survey was conducted to evaluate user perceptions and exit choice preferences in AR environments. Results showed that 77% of participants preferred AR over traditional simulations, citing its interactivity and improved situational awareness. The survey also confirmed that clear signage significantly influences evacuation decisions, with 71% choosing the nearest exit when the exit sign was visible, compared to 31% when obscured. These findings demonstrate the feasibility of AR for evaluating human behaviour in fire scenarios and highlight its potential as a safe, cost-effective tool for fire safety engineering and emergency preparedness.

Graphical Abstract

1. Introduction

Fire safety is a crucial aspect of modern infrastructure design in a wide range of projects, including buildings and tunnels [1]. In a fire, ensuring a safe evacuation for all occupants is a vital factor in protecting lives and property. One of the cornerstones of the performance-based fire engineering approach is the accurate assessment of the required safe escape time (RSET) and the available safe escape time (ASET). To achieve an acceptable fire safety level, the ASET must always exceed the RSET by a prudent safety margin to account for potential fire scenarios and uncertainties in predictions. This principle underscores the importance of integrating fire safety measures early in the design process to ensure occupants can evacuate safely during emergencies. By prioritising fire safety in design through precise escape time analysis, designers and engineers can create safer, more resilient structures that protect lives and adapt to the challenges of modern infrastructure. Since protecting human life is the primary aim of fire safety engineering, capturing human behaviour and its attributes during a fire event and the subsequent evacuation is crucial. During the evacuation, the behaviour of evacuees and their decision-making may be affected by the fire situation [2,3]. Therefore, it is essential to consider pertinent human factors, including individual decisions and parameters that characterise human behaviour [2,4,5].
Various factors, including the environment, social dynamics, and individual characteristics and traits, influence human behaviour [1,5,6]. Social factors involve interactions among occupants, such as their roles and responsibilities within the setting, whether as workers or family members [7,8,9]. Individual characteristics include knowledge, experience, familiarity with the environment, and physical condition during incidents [3,8,10,11]. Additionally, individual traits such as stress resistance, observational skills, judgment, and mobility also play a significant role in decision-making during emergencies [12]. Previous studies have explored these factors through questionnaires, simulations, and field experiments, providing a comprehensive understanding of how they influence behaviour during fire evacuations.
Environmental factors include aspects such as layout, visual access, and signage for identification or direction [13,14], as well as fire conditions like the presence of smoke and flames. Fire generates heat and toxic substances, while smoke reduces visibility, which poses significant risks to evacuees. These conditions can lead to compromised reactions, such as slowed movement or choosing longer escape routes.
In recent decades, field experimental research has been conducted to investigate the behaviour of evacuees under the influence of environmental factors. To replicate low visibility conditions during fire situations, early fire evacuation experiments often utilised artificial smoke [13,15,16,17,18]. In these experiments, participants were typically required to simulate an evacuation by physically following designated routes. While artificial smoke can mimic low visibility, as demonstrated in full-scale fire experiments, it lacks the natural buoyancy of hot smoke produced in real fires. Hot smoke is subject to buoyant forces, forming stratified layers above cooler air. Without this buoyant behaviour, artificial smoke fails to accurately emulate the stratification and visibility dynamics critical for understanding evacuee behaviour in real fire scenarios [19]. Additionally, in multiroom fire scenarios, artificial smoke fails to capture the movement and descent of the hot upper layer, further reducing the realism of the emulated fire situations [20]. These limitations indicate that artificial smoke alone is insufficient for accurately modelling the complex thermal dynamics involved in fire evacuations. Traditional fire training methods, including fire drills and classroom-based simulations, present notable challenges. Although fire drills provide practical experience, they are resource-intensive and difficult to organise frequently [16]. Furthermore, such drills do not always account for dynamic fire conditions or the influence of smoke and heat on evacuation behaviour [17]. Computational simulations and video-based fire training help visualise fire dynamics but lack participant interaction and immersive decision-making experiences [21].
Recently, virtual reality (VR) and augmented reality (AR) have emerged as low-cost, viable, and immersive alternatives for studying human behaviour during fire emergencies [22]. VR enables the reconstruction of fire scenarios using computer-generated visuals in a fully immersive environment, while AR blends virtual, computer-generated content with actual in situ surroundings. Both AR and VR have been applied in various studies related to fires, including evacuation procedures from buildings [21,23,24,25,26], understanding occupant behaviour during fire incidents [27,28,29,30], and enhancing fire training methodologies [31,32,33,34].
Compared to conventional field studies, VR and AR methodologies allow for the flexible construction of hypothetical fire conditions, ensuring consistent presentation of fire scenarios across all participants. More importantly, VR and AR provide a safe and immersive environment for participants to engage in hazardous scenarios, such as extreme fires, without exposure to physical danger. As a result, studies utilising VR and AR are considered ethically superior to field studies, as they replicate hazardous scenarios without risk to participants, simplifying the recruitment process. Additionally, VR and AR studies are generally more cost-effective than field studies, as the VR system setup can be reused indefinitely. These technologies also enable real-time feedback and precise measurements, such as body movement tracking, which can be easily monitored.
Nevertheless, there are instances where the VR experience may lack full immersion. Despite the interactivity of virtual environments, participants often rely on devices to navigate and interact within the virtual space or require a physical space of sufficient size to match the virtual environment. This requirement for tangible interaction can serve as a constant reminder to participants that they are engaged in an artificial scenario, potentially leading to biased behaviour. Furthermore, the VR approach is often labour-intensive, as creating highly detailed virtual environments that accurately mirror real-world settings requires substantial resources, including significant manpower.
Augmented reality (AR) offers a unique opportunity to explore human behaviour during evacuations, building upon the advantages of virtual reality (VR) [35]. The key strength of AR lies in its integration with the real environment. AR enables users to interact with real objects alongside virtual elements within the augmented environment, enhancing their perception and understanding of the real world [36]. Participants experience a heightened sense of immersion as they engage with real-time surroundings through AR. By leveraging the realism of the actual environment, participant reactions and behaviours can be measured with greater precision [37]. With mobile devices like smartphones and tablets becoming increasingly compact, affordable, and powerful, AR applications are becoming more practical for field use [38].
Despite these advantages, AR applications in fire safety training remain underexplored. Prior AR-based fire training studies have often focused on visualisation rather than detailed fire dynamics, limiting their applicability to performance-based fire safety training. Recent studies have demonstrated the potential of AR in enhancing emergency response training, particularly by simulating fire hazards and evacuation scenarios in real-world settings [22,23]. AR has also been used for improving situational awareness and decision-making during emergency evacuations, leveraging real-time hazard overlays and interactive user engagement [24,25]. Furthermore, research on AR-based human behaviour modelling has indicated its effectiveness in analysing evacuee responses under different levels of fire risk perception and environmental stressors [30,31]. These advancements highlight AR’s growing role in emergency research. This study addresses the remaining gap by developing a comprehensive workflow that integrates computational fluid dynamics (CFD)-based fire modelling into AR, spanning geometry extraction and fire simulation through data processing and visualisation in real-world settings. For the first time, this end-to-end approach ensures the accurate reconstruction of real-world fire scenarios, enabling realistic and immersive representations of fire dynamics and smoke behaviour while improving the accuracy and effectiveness of AR-based fire scenario visualisation. The subsequent sections explore the development of the AR application, evaluate its outcomes, and discuss its potential applications in understanding human behaviour during evacuations.
The initial phase involves establishing a methodology to integrate fire dynamics simulations into AR, accommodating both outdoor open environments and confined indoor spaces. This study’s findings demonstrate the feasibility and utility of the proposed AR platform as a transformative tool for fire safety training and evacuation planning. By offering real-time, immersive, and scientifically accurate visualisations, the platform paves the way for safer, more informed design and emergency response strategies.

2. Methodology

The primary purpose of this paper is to develop a systematic workflow to fuse high-fidelity transient fire predictions (e.g., flame shape, temperature, and smoke concentration) into an immersive AR environment. The workflow consists of four main steps: geometry dimension extraction, fire dynamics simulation, data processing, and AR visualisation. Detailed descriptions of each step are provided in the following sections.

2.1. Geometry Dimension Extraction

To ensure accurate predictions from fire dynamics simulation, it is essential to extract the geometry arrangement and precise dimensions of the targeted building or room for computational model construction. The geometry can be imported from a CAD/BIM model or created using the bundled geometry creation interface within the computational simulation packages. To develop a generic workflow applicable to a wide range of fire scenarios, two distinct cases were selected for this study, demonstrating the feasibility of fusing fire model predictions with the AR immersive experience. In Case 1, a large-scale external cladding fire incident was reconstructed based on the well-documented Lacrosse Building fire in Melbourne. The Lacrosse fire, which occurred on 25 November 2014 in Melbourne’s Docklands, was caused by an unextinguished cigarette on an eighth-floor balcony and spread rapidly due to combustible aluminium composite panels (ACP) [39]. According to official fire investigation reports, flames spread rapidly to the upper floors of the 21-storey building, leading to an urgent evacuation [40]. This study applies augmented reality (AR) to represent the fire dynamics of this event, integrating fire modelling data with immersive visualisation techniques to enhance the understanding of evacuation decision-making and environmental conditions during fire emergencies. In Case 2, a designed indoor fire scenario was developed to visualise flame structures and smoke propagation in a near-field indoor environment. The designed fire is located in a common student area on Level 4 of RMIT Building 8, connecting both Building 10 and Building 12.
In Case 1, this study utilises publicly available three-dimensional (3D) data from Google Earth [41] to capture the high-level complexity of large-scale structures and geometries (see Figure 1). Google Earth’s primary data sources include satellite and aerial imagery collected by commercial satellite companies, government agencies, and other organisations. A key challenge in the workflow is converting 3D data from the public domain into compatible file formats for CAD design packages such as CATIA [42], Blender [43], or SolidWorks [44]. To address this, the MIT-licensed open-source graphics debugger RenderDoc [45] was used in this study to extract 3D data from Google Earth. RenderDoc captures the desired frame and extracts 3D data from graphic applications. The captured frame, along with the embedded 3D data, is then converted and imported into Blender as a generic 3D model (e.g., STL format). A scaling process is performed in Blender to ensure that the model’s size aligns with the real-life geometry. Depending on computational requirements, the 3D model is subsequently imported into a CAD package (e.g., CATIA V5) for further editing, including cleaning up redundant surfaces and simplifying geometrical features for use in fire simulation models. Finally, the simplified 3D model is imported into PyroSim [46] for meshing and setting boundary conditions for fire simulations. In Case 2, the computational model is constructed directly within PyroSim, following a conventional fire simulation workflow. The model is based on actual dimensions obtained from provided digital drawings and in situ measurements.
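The Blender scaling step relies on a single uniform factor derived from one known real-world dimension, applied to all three axes so the model's proportions are preserved. As a minimal illustration of that calculation (outside Blender, with hypothetical numbers rather than the study's actual measurements):

```python
def uniform_scale_factor(model_extent_m: float, real_extent_m: float) -> float:
    """Scale factor mapping one known model dimension onto its real-world size.

    Applying the same factor to all three axes preserves the model's
    proportions, which is what the Blender scaling step relies on.
    """
    if model_extent_m <= 0:
        raise ValueError("model extent must be positive")
    return real_extent_m / model_extent_m

# Hypothetical example: the extracted mesh measures 0.85 units along the
# building's long facade, which is taken to be 68 m in reality.
scale = uniform_scale_factor(0.85, 68.0)
print(round(scale, 1))  # -> 80.0
```

The same factor would then be applied to the imported object's X, Y, and Z scale in Blender before export to the CAD package.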
To achieve more accurate results in computational simulations, settings such as meshing and boundary conditions must be appropriately configured. Data processing involves converting all simulation results into a usable format for AR visualisation. Finally, an Android APK application is created and deployed to operate in the actual environment.

2.2. Scientific Fire Dynamics Reconstruction

Reconstructing scientific fire scenario visualisations in an immersive AR environment requires high-fidelity fire dynamics simulations as a critical step in the workflow. This study uses the Fire Dynamics Simulator (FDS) (Version 6.7.9) for fire dynamics predictions, as it is a widely available open-source package for fire simulations. FDS is a computational fluid dynamics (CFD) software developed by the National Institute of Standards and Technology (NIST) to model fire and smoke spread in buildings and other structures. It employs the finite difference method to numerically solve the Navier–Stokes equations for fluid flow, coupled with thermodynamic equations for heat transfer and combustion chemistry. Notably, the proposed workflow is also compatible with other CFD or multiphysics simulation packages, such as ANSYS FLUENT, STAR-CCM+, and COMSOL, offering flexibility for different applications.

2.3. Computational Mesh Construction and Numerical Details

This section summarises the numerical details of fire simulations for both scenarios (i.e., Case 1 and Case 2). The fire simulation was conducted using the Very Large Eddy Simulation (VLES) mode in FDS. The subgrid-scale turbulence was modelled using the Deardorff SGS model, and near-wall turbulence effects were handled using the Wall-Adapting Local Eddy-Viscosity (WALE) model. Due to the wide adoption of FDS, this paper does not include specific details regarding mathematical models, turbulence and combustion closures, and radiation treatments. Interested readers are referred to the latest fire modelling articles and their references [47]. Multiple zones were created for both cases to ensure sufficient mesh resolution for resolving the fire source while maintaining manageable computational costs (see Figure 2). The cell size of each zone was determined based on general flow behaviour and its significance in fire dynamics. This multi-zone approach enables parallel computing using multi-core systems, accelerating simulation times while ensuring consistent and accurate results. In Case 1, finer cells with dimensions of 0.0625 m were used near the fire source to capture the fire spread along the aluminium composite cladding. Away from the fire, the mesh spacing gradually coarsened, with the largest cell dimensions reaching 1 m at the far-field boundaries. Similarly, for Case 2, finer cells with dimensions of 0.05 m × 0.05 m × 0.04 m were used near the fire source, while coarser cells of up to 0.4 m × 0.4 m × 0.3 m were applied in the far-field regions. Following NIST guidance, an appropriate fine mesh size can be estimated from the heat release rate and fire scenario via the characteristic fire diameter:

D* = (Q̇ / (ρ c_p T √g))^(2/5)

where D* is the characteristic fire diameter, Q̇ is the total heat release rate of the fire, ρ is the density of ambient air, c_p is the specific heat of ambient air, T is the temperature of ambient air, and g is the gravitational acceleration.
Case 1, involving aluminium composite cladding with a heat release rate of 1.5 MW, resulted in D* = 1.14 m. Case 2, with a heat release rate of 1 MW, resulted in D* = 0.93 m. For a fine mesh, D*/dx = 16, yielding dx ≈ 0.07 m for Case 1 and dx ≈ 0.06 m for Case 2. A refined mesh size was applied to ensure accurate fire dynamics representation in the simulation.
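These values can be reproduced with a short script. The ambient properties below (ρ = 1.204 kg/m³, c_p = 1.005 kJ/(kg·K), T = 293 K) are typical textbook assumptions and may differ slightly from those used in the study, so the computed diameters land near, but not exactly on, the quoted 1.14 m and 0.93 m:

```python
import math

def characteristic_fire_diameter(q_kw, rho=1.204, cp=1.005, t_inf=293.0, g=9.81):
    """Characteristic fire diameter D* = (Q / (rho * cp * T * sqrt(g)))**(2/5).

    q_kw is the total heat release rate in kW; with cp in kJ/(kg.K) the units
    reduce to metres.
    """
    return (q_kw / (rho * cp * t_inf * math.sqrt(g))) ** 0.4

for name, q_kw in [("Case 1", 1500.0), ("Case 2", 1000.0)]:
    d_star = characteristic_fire_diameter(q_kw)
    dx = d_star / 16.0  # 'fine' resolution per the D*/dx = 16 criterion
    print(f"{name}: D* = {d_star:.2f} m, dx = {dx:.3f} m")
```

The near-fire cell sizes actually used (0.0625 m for Case 1; 0.05 m for Case 2) sit at or below these estimates, satisfying the criterion.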
For Case 1, the fire scenario was modelled using visual references from news footage, post-incident reports, and previous studies to approximate the fire’s behaviour [39,40,48]. The combustion of polyethylene within aluminium composite panels (ACP) was assumed based on the known fire spread in the Lacrosse Building incident. Fire growth was estimated from observed flame spread and duration in available footage. The material properties and combustion reaction were modelled using the one-step decomposition model for low-density polyethylene (LDPE), which aligns with the known fire behaviour of ACP cladding [49]. Case 2 was a designed fire scenario used to study evacuation behaviour in a controlled setting. The fire involved a sofa with burning characteristics based on typical furniture materials. The combustion model was informed by established polyurethane fire properties [50]. For both cases, a static pressure of 1 atm was specified at all far-field boundaries. Transient fire prediction results are output in Plot3D file format, allowing field information (e.g., temperature, smoke concentration) to be extracted for data processing.
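For readers unfamiliar with FDS input files, the sketch below illustrates how a scenario of this kind is specified. It is a simplified, hypothetical stand-in, not the study's actual input file: the coordinates, mesh extents (chosen only so the cell sizes match the 0.05 m × 0.05 m × 0.04 m and 0.4 m × 0.4 m × 0.3 m values quoted above), reaction parameters, and yields are illustrative placeholders.

```
&HEAD CHID='case2_sketch', TITLE='Illustrative indoor sofa fire'/
&TIME T_END=600.0/

Fine mesh near the fire (0.05 x 0.05 x 0.04 m cells) and a coarser far field
(0.4 x 0.4 x 0.3 m cells), mirroring the multi-zone approach described above.
&MESH ID='fine',   IJK=80,80,50, XB=0.0,4.0, 0.0,4.0, 0.0,2.0/
&MESH ID='coarse', IJK=30,30,10, XB=4.0,16.0, 0.0,12.0, 0.0,3.0/

Illustrative polyurethane reaction (atom counts follow a common FDS example).
&REAC FUEL='POLYURETHANE', C=6.3, H=7.1, O=2.1, N=1.0, SOOT_YIELD=0.10/

A 1 m x 2 m burning top face at 500 kW/m2 gives a nominal 1 MW fire.
&SURF ID='BURNER', HRRPUA=500.0, COLOR='RED'/
&OBST XB=1.0,2.0, 1.0,3.0, 0.0,0.4, SURF_IDS='BURNER','INERT','INERT'/

Open far-field boundary at ambient (1 atm) pressure; Plot3D dumps every 5 s.
&VENT MB='XMAX', SURF_ID='OPEN'/
&DUMP PLOT3D_QUANTITY(1)='TEMPERATURE', DT_PL3D=5.0/
&TAIL/
```

Text outside the `&...​/` namelist groups is ignored by FDS, so the bare comment lines above are valid; PyroSim generates files of this form from its graphical interface.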

2.4. Data Processing

Similar to other simulation packages, FDS stores simulation results in specialised file formats that are not directly compatible with other platforms. To transfer results from FDS to an immersive AR visualisation platform, several steps are required to convert the original file format into a compatible format. Figure 3 illustrates the workflow for transferring simulated results into portable AR applications. The conversion process involves multiple file formats at different stages: .q files store fire simulation data from PyroSim, .xyz is used to represent simulation results as point cloud data, .x3d is a 3D graphics format for visualisation and rendering, and .fbx ensures proper mesh structure and material mapping for Unity integration. These formats collectively enable the seamless transfer and visualisation of fire dynamics in AR. As shown in Figure 3, the workflow involves multiple steps and utilises various software packages. First, after completing fire simulations in FDS, the results are imported into ParaView (Version 5.10.1) [51] for post-processing. In ParaView, field data from the simulation (e.g., temperature and smoke concentration) are used to generate three-dimensional iso-surface contours representing flame shapes or smoke levels at various temperatures. These 3D contours are then imported into Blender, where irrelevant geometrical features are decimated or simplified to reduce complexity and ensure compatibility. Finally, the processed 3D models from Blender are exported and imported into Unity3D (Version 2020.3.16f1) [52] for final setup and integration within the Unity environment before deployment to portable devices. Notably, this workflow is also compatible with other fire modelling packages (e.g., OpenFOAM, ANSYS) that can output simulation results in formats (e.g., CGNS) compatible with ParaView.
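As a small illustration of the time-sequence bookkeeping this conversion chain involves, the helper below orders per-timestep dump files by the time encoded in their names, so downstream tools receive frames in simulation order. The naming pattern is a hypothetical stand-in for the actual file names produced by the workflow:

```python
import re

def order_frames(filenames):
    """Sort per-timestep dump files (e.g. 'fire_t12.5.q') by the time in the name.

    Returns (time, filename) pairs in ascending time order so the AR animation
    steps through frames in the order they were simulated. Files that do not
    follow the '<name>_t<time>.q' convention are skipped.
    """
    pattern = re.compile(r"_t(\d+(?:\.\d+)?)\.q$")
    frames = []
    for name in filenames:
        match = pattern.search(name)
        if match:
            frames.append((float(match.group(1)), name))
    return sorted(frames)

dumps = ["fire_t10.0.q", "fire_t2.5.q", "fire_t0.0.q", "notes.txt"]
print(order_frames(dumps))
# -> [(0.0, 'fire_t0.0.q'), (2.5, 'fire_t2.5.q'), (10.0, 'fire_t10.0.q')]
```

Numeric parsing matters here: a plain lexicographic sort would place "t10.0" before "t2.5" and scramble the animation.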

2.5. Immersive Augmented Reality (AR) Platform

The visualisation of simulation results in a real-world environment is achieved using AR technology. To accurately represent real-time, dynamic fire simulations in AR, Unity3D was utilised for the augmented reality environment setup. The Vuforia engine (SDK) was integrated into Unity3D to enable target tracking, facilitating the augmentation of virtual visualisations onto real-life objects or environments.

2.5.1. Target Trackers for Recognition in Vuforia Engine

The Vuforia engine (SDK) is a widely used platform for developing portable augmented reality applications, allowing for seamless integration of virtual content into real-world environments through image recognition and object tracking. Vuforia enables Unity3D applications to recognise objects and spaces, facilitating the overlay of augmented content in real time. In the context of our AR fire simulation application, Vuforia plays a central role in enabling real-time interaction between the users and the AR environment. The accuracy of AR object placement is crucial in ensuring a realistic fire scenario. A well-aligned AR fire model enhances the user’s perception of fire severity and evacuation decisions by reinforcing the spatial realism of flames and smoke in relation to escape routes.
As illustrated in Figure 4, the process begins with the camera capturing input from the real-world scene, which is converted into raw frames. These frames are processed and sent to the target tracker, where Vuforia identifies and tracks custom-defined targets, such as building models or fire locations. The system uses local databases to match the scene with predefined targets. Once the model or area target is detected, Vuforia maintains the tracking of objects and updates the current state. The application logic then queries the current state, allowing it to render the augmented content, such as fire dynamics, on the display. This real-time interaction provides participants with an immersive experience, enhancing decision-making during fire evacuation scenarios by integrating scientifically accurate fire models into real-world environments.
Vuforia offers multiple recognition types of target trackers, each tailored to specific AR applications and environments. For the large-scale fire scenario in Case 1, the Vuforia model target feature was utilised as the recognition method. This feature is particularly suited for tracking architectural landmarks, where the target object must be geometrically rigid with stable surface features. The model target for Unity3D was generated using the Vuforia Model Target Generator. In this scenario, a CAD model created in CATIA V5 was imported into the Vuforia Model Target Generator to produce the model target, which was subsequently integrated into the Unity3D setup for AR tracking (see Figure 5a).
In contrast, the Vuforia area target feature was employed as the recognition method in Case 2. This method is particularly effective for tracking indoor environments, such as office areas and public spaces, by utilising a 3D scan to create an accurate model of the area. The area target requires an area with distinctive and stable objects that do not change frequently over time. Vuforia supports several scanning methods to create area targets, including ARKit-enabled devices with built-in LiDAR sensors, Matterport™ Pro2 3D cameras, NavVis M6 and VLX scanners, and Leica BLK360 and RTC360 scanners. In this case, the Vuforia Area Target Creator app, installed on an Apple iPad Pro with an inbuilt LiDAR scanner, was used to scan the indoor environment and generate the area target database (see Figure 5b). This app produces dataset files, meshes, and Unity packages, which are subsequently utilised in Unity3D for AR tracking.

2.5.2. Unity3D AR Environment Setup

To achieve real-time fire dynamics visualisation, all results generated from PyroSim are imported into Unity3D after data processing. Recreating the simulation results in Unity3D requires careful management of the spatial location and time sequence of the data. The results are imported into Unity3D as a series of individual 3D objects corresponding to different time instants, with each object positioned at its correct coordinates. A custom script was developed in Unity3D to control the sequential appearance of these 3D objects, forming an animation that accurately represents real-time fire dynamics.
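The sequencing logic of that script can be sketched in a language-neutral way. The Python below is a hypothetical analogue of the Unity C# script, not the script itself; it selects which pre-imported frame object should be visible at a given playback time:

```python
import bisect

def active_frame_index(frame_times, playback_time):
    """Index of the 3D frame object to display at playback_time.

    frame_times: ascending simulation times (s) of the imported 3D objects.
    The frame whose time is the latest one not exceeding playback_time is
    shown, mirroring how the Unity script toggles object visibility on each
    update. Returns None before the first frame (nothing shown yet).
    """
    if not frame_times or playback_time < frame_times[0]:
        return None
    return bisect.bisect_right(frame_times, playback_time) - 1

times = [0.0, 5.0, 10.0, 15.0]  # e.g. Plot3D outputs every 5 s
print(active_frame_index(times, 7.3))   # -> 1
print(active_frame_index(times, 15.0))  # -> 3
```

In Unity, the equivalent per-update step would deactivate the previously shown object and activate the one at the returned index, producing the animation described above.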
The Unity application is designed for mobile devices, with Android chosen as the target platform for the AR experience. The Unity3D settings are configured to target Android 9.0 ‘Pie’ (API level 28), and the application is built as an APK file. This file can be installed on Android devices, enabling users to launch the application and interact with its AR features.

2.6. Survey of AR User Perceptions and Its Impact on Evacuation Decision-Making

With the development of the AR fire visualisation application, a comprehensive survey was conducted to compare user perceptions of fire scenarios visualised through augmented reality with those presented using conventional rendered simulation results. More importantly, the survey aimed to evaluate how AR visualisation impacts participants’ evacuation decision-making, focusing on factors such as the visibility of exit signs and other dynamic environmental conditions.
The survey was structured into two parts and designed to assess participants’ experiences with immersive fire scenarios using AR technology. In the first part, participants compared their perception of fire scenarios presented through two methods: the AR application and conventional rendered fire simulation results. This section aimed to evaluate how AR influences user perception compared to traditional simulation methods. The second part focused on assessing the feasibility of using AR to reconstruct a fire scenario in an indoor environment. Participants were shown three perspectives of the same fire scenario, along with a visual map indicating the fire location, two possible exit routes, and participant positions. They were then tasked with selecting an exit route using a multiple-choice (MC) format, focusing on how the visibility of exit signs and dynamic environmental factors impacted their decision-making. A total of 239 participants were surveyed, with demographic information such as age, gender, and prior experience with evacuation or fire safety training collected. The tasks were designed to closely simulate real-life conditions, ensuring the data accurately reflected participant behaviour and decision-making during fire evacuation scenarios.

3. Results and Discussions

3.1. Case 1: Reconstructed Fire Scenario at Lacrosse Building

As discussed in the previous section, an APK application was installed on a mobile Android device (e.g., Samsung Galaxy Tab S5e) to run the AR application on-site. By pointing the built-in camera at the target object or environment, the reconstructed fire scene was augmented onto the Lacrosse Building. An Android tablet was chosen for the AR application due to its ease of use in capturing augmented videos for publication. Alternatively, a head-mounted device (e.g., Hololens 2) could be employed, offering users a fully immersive experience of the fire scenario from their own perspective. One of the key advantages of AR applications is their ability to overlay fire spread and smoke propagation directly onto real-world environments, providing users with an intuitive understanding of spatial relationships and reducing the complexity of data interpretation. The AR-based visualisation of the Lacrosse Building fire represents the observed fire progression based on available footage, images, and reports. The MFB investigation noted that the fire spread rapidly due to ACP cladding flammability and the stack effect in a high-rise structure, leading to significant external flame propagation [39]. Figure 6 shows the comparison between the real-life Lacrosse Building fire and the FDS simulation. The fire pattern, flame length, and vertical spread in the simulation closely align with the real-life fire, demonstrating similar fire growth behaviour along the ACP façade. This visual agreement supports the realism of the simulated scenario in capturing the key characteristics of the actual incident. The fire growth and smoke movement in the AR scenario align with visual patterns observed in news footage and official reports (see Figure 7), effectively illustrating how flames spread externally and how smoke accumulated during the incident. 
It is important to acknowledge that uncertainties in modelling parameters, material properties, and environmental conditions may contribute to discrepancies in fire predictions. However, the visualisation provides a structured representation of fire dynamics and their influence on evacuation decision-making. These findings underscore the potential of AR in fire safety training, emergency preparedness, and risk communication, particularly for large-scale fire events.
A survey was conducted with 239 participants recruited from diverse backgrounds, ensuring a mix of relevant demographics. The participants included individuals with varying levels of fire safety awareness and training. Specifically, 45% had undergone fire safety training, while 43% had experienced more than six fire drills, indicating a reasonable level of familiarity with fire safety protocols. The sample also comprised individuals with different educational backgrounds, with 48% holding a Bachelor’s degree, 16% a Master’s degree, and 18% having completed high school, reflecting a general population rather than exclusively fire safety professionals. Additionally, 53% of participants were aged between 20 and 30, a group likely to engage with AR-based fire safety training methods.
According to our survey results, 77% of the 239 participants preferred the AR application for visualising the reconstructed fire scenario, citing its ability to simplify interpretation. This strong preference highlights the effectiveness of AR in enhancing the comprehension of fire scenarios. Unlike traditional rendered videos, which are static and limited to predetermined viewpoints (or require complex viewpoint controls through a user interface), the AR application offers real-time, interactive visual cues. Participants can explore various perspectives and scenarios by moving around and viewing the fire from different angles, making AR a more dynamic and engaging tool for understanding fire dynamics.

3.2. Case 2: Choice of Exit Route Under an Indoor Fire Scenario

In Case 2, an indoor fire scenario was designed to demonstrate the visualisation of flame structures and smoke propagation in a near-field indoor environment. The primary aim of this study was to investigate how participants react and make evacuation decisions based on the visibility of exit signs and their position within the fire scenario. Figure 8 illustrates the schematic of the designed fire scenario and the three selected perspectives analysed in this study. During this part of the survey, participants were tasked with choosing between the nearest exit, which was partially obscured by fire or smoke, and a farther exit that remained unobstructed by fire or smoke.
For Perspective A, the AR application’s viewpoint represents an evacuee standing at location A, observing the fire and smoke development originating from the fire source (i.e., couches). Internal walls block the visibility of any exit signs from this location (see Figure 9 (a1–a6)). However, participants are informed of the nearest exit’s location through the map provided. For Perspective B, the viewpoint represents an evacuee seated on a couch near the fire source at location B. The distance to the nearest exit (Exit 1) is approximately the same as from location A. In this perspective, however, a clear exit sign above Exit 1 is visible (see Figure 9 (b1–b6)). For Perspective C, the viewpoint represents an evacuee standing on the opposite side of the area, with a clear view of both the exit door and the fire source (see Figure 8). In all perspectives, as the fire and smoke propagate, the exit signs become obstructed during the later stages of the scenario.
The survey design investigates how the visibility of exit signs, varying distances to exits, and reduced visibility caused by the smoke layer influence evacuees’ exit choices. To maintain a consistent visual perspective for all participants, different views of the fire and smoke dynamics were captured using an Android tablet and presented in the online survey. Figure 9 depicts the three captured videos, showcasing real-time simulated fire results at selected time steps.
Table 1 summarises participants’ preferences across the three surveyed perspectives. In Perspective A, where the nearest exit was partially obscured by fire, 31% of participants selected Exit 1 (the nearest exit), while 69% opted for Exit 2 (the further exit). This result indicates that fire obstruction caused hesitation or uncertainty, leading the majority to avoid the nearest exit. In Perspective B, where the nearest exit featured a visible exit sign, 48% of participants chose Exit 1, compared to 52% who selected Exit 2. The visibility of the exit sign appeared to partially alleviate hesitation, boosting participants’ confidence in choosing the closer exit. In Perspective C, where exit signs were prominently visible and no fire or smoke obstruction blocked Exit 1, 71% of participants selected the nearest exit, while only 29% opted for Exit 2. This strong preference highlights the critical role of clear exit sign visibility and the absence of fire obstruction in guiding participants toward the quickest escape route.
The progressive increase in participants choosing the nearest exit (Exit 1) across the three perspectives (31%, 48%, and 71%) highlights the critical role of clear signage and unobstructed paths in guiding evacuees during emergencies. These findings demonstrate that both visibility and fire obstruction significantly influence evacuation decision-making. Notably, the fire scenario (i.e., flame structure and smoke concentration) remained consistent across all perspectives, with only the viewing angle varying for participants. This design facilitated an isolated evaluation of the impact of exit signs and guiding information on decision-making, aligning with findings from prior research. Previous studies have shown that evacuees adjust their decisions based on dynamically changing environmental information [6]. The visibility of exit signs significantly influences evacuees’ preference for the nearest exit, even in the presence of smoke or other hazards. Several previous works on exit choice behaviour during evacuations indicate that clear signage reduces uncertainty, guides evacuees toward safer routes, and improves decision-making, even in smoke-filled environments [54,55,56,57]. Fire obstructions, such as flames or heavy smoke, create perceived risks that often steer evacuees toward alternative routes [55]. However, risky behaviours, such as taking shortcuts through smoke instead of opting for safer, longer routes, are also observed during evacuations and are driven by urgency and limited visibility [58]. The survey results align with these observations, showcasing the potential of transferring scientific fire simulations into the AR environment. The reconstructed fire scenarios presented in this study provide a realistic yet safe platform for studying human behaviour and improving fire safety training.
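The exit-choice shares discussed above follow directly from the raw counts in Table 1 (239 responses per perspective). A minimal sketch reproducing the reported percentages from those counts:

```python
# Exit-choice counts from Table 1: (Exit 1 = nearest, Exit 2 = further).
counts = {
    "A": (75, 164),
    "B": (115, 124),
    "C": (170, 69),
}

def nearest_exit_share(exit1, exit2):
    """Percentage of respondents choosing the nearest exit, rounded
    to the whole percentages reported in the text."""
    return round(100 * exit1 / (exit1 + exit2))

shares = {p: nearest_exit_share(*c) for p, c in counts.items()}
# shares == {"A": 31, "B": 48, "C": 71}
```

The monotone rise from A to C is the progression the text attributes to exit-sign visibility and the absence of fire obstruction.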

4. Limitations and Future Work

There are some limitations and challenges associated with using AR platforms for on-site visualisations, particularly in outdoor environments. Recognition errors and dimensional inaccuracies may arise from hardware constraints in capturing building dimensions, such as limited camera resolution, field of view, and sensor fidelity. Additionally, varying light conditions, such as direct sunlight, shadows, and reflections, can adversely affect the recognition process, as AR tracking relies on well-lit environments to recognise feature points accurately. This limitation was a key factor in conducting the AR demonstration during daytime rather than nighttime, as stable lighting conditions improve tracking reliability and ensure an accurate overlay of fire visualisations. However, outdoor settings present additional challenges, such as glare or rapid changes in ambient light, which may impact AR object stability and alignment.
The accuracy of AR object placement plays a crucial role in shaping user perception of fire severity and influencing their decision-making regarding escape routes. A more precise alignment between the AR-generated fire and the real-world environment enhances realism, potentially making users more aware of the fire’s severity and encouraging safer evacuation choices. However, any misalignment in AR object placement, such as flames appearing offset from their actual position, could misrepresent the fire’s spread and severity, potentially affecting evacuation decisions. Although recognition errors can occur due to hardware limitations, previous research evaluating AR marker tracking accuracy reported depth recognition errors within ±1.0% for distances up to 1.05 m, confirming that spatial accuracy remains high within typical indoor AR viewing distances [59]. These challenges, which have also been identified in previous studies [35,60,61], highlight the need for further improvements in AR-based fire scenario visualisations.
Future research will focus on refining fire scenario modelling and expanding user studies to enhance the accuracy and applicability of AR-based fire visualisation. Improvements in AR tracking algorithms and environmental adaptability could enhance spatial accuracy, especially in outdoor environments with dynamic lighting conditions. Additionally, future studies will extend participant surveys to analyse cognitive responses and assess the effectiveness of AR-based fire studies.
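To put the reported tracking accuracy into spatial terms, the following sketch estimates the worst-case overlay misplacement, assuming the ±1.0% depth error of [59] scales linearly with viewing distance; this scaling is an extrapolation beyond the tested 1.05 m range, not a result from this study:

```python
def worst_case_offset(distance_m, error_pct=1.0):
    """Worst-case depth misplacement (in metres) of an AR overlay,
    assuming the reported +/-1.0% depth-recognition error grows
    linearly with viewing distance (illustrative extrapolation)."""
    return distance_m * error_pct / 100.0

# At the tested 1.05 m range the bound is roughly 1 cm:
tested = worst_case_offset(1.05)  # 0.0105 m
```

Under this assumption, a flame overlaid on a façade viewed from 20 m could be misplaced by up to 0.2 m, which motivates the call above for improved tracking in large outdoor scenes.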

5. Conclusions

This study successfully developed an augmented reality (AR) platform to visualise transient fire dynamics in real time, providing a reliable and immersive approach to studying human behaviour and decision-making in realistic fire scenarios. By integrating advanced fire simulation techniques with AR technology, the platform offers a precise and interactive tool for analysing how fire structure, fire spread, and smoke propagation influence evacuation choices. A primary survey highlighted the advantages of AR applications over conventionally rendered visualisations, with 77% of participants favouring AR for its immersive experience and enhanced situational awareness. These findings underscore AR’s potential to revolutionise fire safety training and emergency preparedness by delivering realistic, interactive visualisations of fire dynamics.
Evacuation decisions were found to be strongly influenced by the visibility of exit signs and the presence of fire obstructions. For instance, 71% of participants in Perspective C chose the nearest exit when the exit sign was clearly visible, compared to only 31% in Perspective A, where the exit sign was obscured by internal walls. This progression emphasises the critical role of clear signage and unobstructed escape routes in improving evacuation efficiency. The study also supports previous observations that risky behaviours, such as taking shortcuts through smoke, are often driven by urgency and limited visibility.
The AR platform developed in this study bridges the gap between recent fire modelling and immersive training tools, offering a safe and realistic environment for analysing human behaviour and designing effective evacuation strategies. Expanding AR applications to diverse building types and fire scenarios will further validate its utility. By combining real-time fire dynamics with cutting-edge AR technology, this study makes a significant contribution to advancing fire safety engineering and emergency preparedness.

Author Contributions

J.C.S.W.: Conceptualisation, Software, Data curation, Resources, Formal analysis, Investigation, Visualisation, Methodology, Writing—original draft. P.S.P.W.: Supervision, Project administration. R.D.: Supervision, Project administration. A.C.Y.Y.: Supervision, Project administration. S.C.P.C.: Conceptualisation, Supervision, Project administration, Funding acquisition, Writing—review and editing. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, and approved by the Ethics Committee of RMIT University (Review reference: 2024-27150-23984, date of approval: 7 May 2024).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

No new data were created or analyzed in this study. Data sharing is not applicable to this article.

Conflicts of Interest

Sherman C.P. Cheung reports that financial support was provided by RMIT University. Jason C.S. Wong reports that financial support was provided by RMIT University. Peter S.P. Wong reports that financial support was provided by RMIT University. Raj Das reports that financial support was provided by RMIT University. Anthony C.Y. Yuen reports that financial support was provided by The Hong Kong Polytechnic University.

References

  1. Sime, J.D. An occupant response shelter escape time (ORSET) model. Saf. Sci. 2001, 38, 109–125. [Google Scholar] [CrossRef]
  2. Kuligowski, E.D. Human Behavior in Fire; Springer: New York, NY, USA, 2016; pp. 2070–2114. [Google Scholar] [CrossRef]
  3. Tong, D.; Canter, D. The decision to evacuate: A study of the motivations which contribute to evacuation in the event of fire. Fire Saf. J. 1985, 9, 257–265. [Google Scholar] [CrossRef]
  4. Galea, E.R.; Sharp, G.; Lawrence, P.; Dixon, A. Investigating the impact of occupant response time on computer simulations of the WTC North Tower evacuation. In Proceedings of the Interflam 2007 Conference Proceedings, London, UK, 3–5 September 2007. [Google Scholar]
  5. Wang, Y.; Kyriakidis, M.; Dang, V.N. Incorporating human factors in emergency evacuation: An overview of behavioral factors and models. Int. J. Disaster Risk Reduct. 2021, 60, 102254. [Google Scholar] [CrossRef]
  6. Kobes, M.; Helsloot, I.; de Vries, B.; Post, J.G. Building safety and human behaviour in fire: A literature review. Fire Saf. J. 2010, 45, 1–11. [Google Scholar] [CrossRef]
  7. Cornwell, B. Bonded fatalities: Relational and ecological dimensions of a fire evacuation. Sociol. Q. 2003, 44, 617–638. [Google Scholar] [CrossRef]
  8. Sandberg, A. Unannounced Evacuation of Large Retail Stores: An Evaluation of Human Behaviour and the Computer Model Simulex. Bachelor’s Thesis, Lund University, Lund, Sweden, 1997. [Google Scholar]
  9. Sime, J.D. Crowd psychology and engineering. Saf. Sci. 1995, 21, 1–14. [Google Scholar] [CrossRef]
  10. Kinateder, M.; Comunale, B.; Warren, W.H. Exit choice in an emergency evacuation scenario is influenced by exit familiarity and neighbor behavior. Saf. Sci. 2018, 106, 170–175. [Google Scholar] [CrossRef]
  11. Li, D.; Han, B. Behavioral effect on pedestrian evacuation simulation using cellular automata. Saf. Sci. 2015, 80, 41–55. [Google Scholar] [CrossRef]
  12. Proulx, G. A stress model for people facing a fire. J. Environ. Psychol. 1993, 13, 137–147. [Google Scholar] [CrossRef]
  13. Jin, T. Studies on human behavior and tenability in fire smoke. Fire Saf. Sci. 1997, 5, 3–21. [Google Scholar]
  14. Sime, J.D. Accidents and disasters: Vulnerability in the built environment. Saf. Sci. 1991, 14, 109–124. [Google Scholar] [CrossRef]
  15. Fridolf, K.; Ronchi, E.; Nilsson, D.; Frantzich, H. Movement speed and exit choice in smoke-filled rail tunnels. Fire Saf. J. 2013, 59, 8–21. [Google Scholar] [CrossRef]
  16. Kobes, M.; Helsloot, I.; de Vries, B.; Post, J.G.; Oberijé, N.; Groenewegen, K. Wayfinding during fire evacuation: An analysis of unannounced fire drills in a hotel at night. Build. Environ. 2010, 45, 537–548. [Google Scholar] [CrossRef]
  17. Ronchi, E.; Fridolf, K.; Frantzich, H.; Nilsson, D.; Walter, A.L.; Modig, H. A tunnel evacuation experiment on movement speed and exit choice in smoke. Fire Saf. J. 2018, 97, 126–136. [Google Scholar] [CrossRef]
  18. Seike, M.; Kawabata, N.; Hasegawa, M. Experiments of evacuation speed in smoke-filled tunnel. Tunn. Undergr. Space Technol. 2016, 53, 61–67. [Google Scholar] [CrossRef]
  19. Yang, D.; Hu, L.H.; Huo, R.; Jiang, Y.Q.; Liu, S.; Tang, F. Experimental study on buoyant flow stratification induced by a fire in a horizontal channel. Appl. Therm. Eng. 2010, 30, 872–878. [Google Scholar] [CrossRef]
  20. Cooper, L.Y.; Harkleroad, M.; Quintiere, J.; Rinkinen, W. An Experimental Study of Upper Hot Layer Stratification in Full-Scale Multiroom Fire Scenarios. J. Heat Transf. 1982, 104, 741–749. [Google Scholar] [CrossRef]
  21. Xia, X.; Li, N.; González, V.A. Exploring the influence of emergency broadcasts on human evacuation behavior during building emergencies using virtual reality technology. J. Comput. Civ. Eng. 2020, 35, 04020065. [Google Scholar] [CrossRef]
  22. Lee, J.-K.; Lee, S.; Kim, Y.-C.; Kim, S.; Hong, S.-W. Augmented virtual reality and 360 spatial visualization for supporting user-engaged design. J. Comput. Des. Eng. 2023, 10, 1047–1059. [Google Scholar] [CrossRef]
  23. Cao, L.; Lin, J.; Li, N. A virtual reality based study of indoor fire evacuation after active or passive spatial exploration. Comput. Hum. Behav. 2019, 90, 37–45. [Google Scholar] [CrossRef]
  24. Lorusso, P.; De Iuliis, M.; Marasco, S.; Domaneschi, M.; Cimellaro, G.P.; Villa, V. Fire Emergency Evacuation from a School Building Using an Evolutionary Virtual Reality Platform. Buildings 2022, 12, 223. [Google Scholar] [CrossRef]
  25. Lochhead, I.; Hedley, N. Communicating multilevel evacuation context using situated augmented reality. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2018, 4, 33–40. [Google Scholar] [CrossRef]
  26. Stigall, J.; Sharma, S. Evaluation of mobile augmented reality application for building evacuation. In Proceedings of the ISCA 28th International Conference on Software Engineering and Data Engineering, San Diego, CA, USA, 30 September–2 October 2019. [Google Scholar] [CrossRef]
  27. Kinateder, M.; Ronchi, E.; Nilsson, D.; Kobes, M.; Müller, M.; Pauli, P.; Mühlberger, A. Virtual reality for fire evacuation research. In Proceedings of the 2014 Federated Conference on Computer Science and Information Systems, Warsaw, Poland, 7–10 September 2014. [Google Scholar] [CrossRef]
  28. Arias, S.; Mossberg, A.; Nilsson, D.; Wahlqvist, J. A study on evacuation behavior in physical and virtual reality experiments. Fire Technol. 2022, 58, 817–849. [Google Scholar] [CrossRef]
  29. Mañas, E.L.; Plá, J.; Herrero, G.M.; Gervás, P. Augmented reality and indoors Wi-Fi positioning for conducting fire evacuation drills using mobile phones. In Proceedings of the 4th Symposium of Ubiquitous Computing and Ambient Intelligence UCAmI, Toledo, Spain, 2–5 December 2019. [Google Scholar]
  30. Russell, M.D.; Bonny, J.W.; Reed, R. Impact of Virtual Reality on Decision-Making and Risk Assessment during Fire Evacuation. Fire 2024, 7, 427. [Google Scholar] [CrossRef]
  31. Lu, X.; Yang, Z.; Xu, Z.; Xiong, C. Scenario simulation of indoor post-earthquake fire rescue based on building information model and virtual reality. Adv. Eng. Softw. 2020, 143, 102792. [Google Scholar] [CrossRef]
  32. Rüppel, U.; Schatz, K. Designing a BIM-based serious game for fire safety evacuation simulations. Adv. Eng. Inform. 2011, 25, 600–611. [Google Scholar] [CrossRef]
  33. Guo, Y.; Zhu, J.; Wang, Y.; Chai, J.; Li, W.; Fu, L.; Xu, B.; Gong, Y. A Virtual Reality Simulation Method for Crowd Evacuation in a Multiexit Indoor Fire Environment. ISPRS Int. J. Geo-Inf. 2020, 9, 750. [Google Scholar] [CrossRef]
  34. Xu, Z.; Lu, X.Z.; Guan, H.; Chen, C.; Ren, A.Z. A virtual reality based fire training simulator with smoke hazard assessment capacity. Adv. Eng. Softw. 2014, 68, 1–8. [Google Scholar] [CrossRef]
  35. Carmigniani, J.; Furht, B.; Anisetti, M.; Ceravolo, P.; Damiani, E.; Ivkovic, M. Augmented reality technologies, systems and applications. Multimed. Tools Appl. 2011, 51, 341–377. [Google Scholar] [CrossRef]
  36. Li, W.; Nee, A.Y.C.; Ong, S.K. A State-of-the-Art Review of Augmented Reality in Engineering Analysis and Simulation. Multimodal Technol. Interact. 2017, 1, 17. [Google Scholar] [CrossRef]
  37. Lovreglio, R.; Kinateder, M. Augmented reality for pedestrian evacuation research: Promises and limitations. Saf. Sci. 2020, 128, 104750. [Google Scholar] [CrossRef]
  38. Chi, H.-L.; Kang, S.-C.; Wang, X. Research trends and opportunities of augmented reality applications in architecture, engineering, and construction. Autom. Constr. 2013, 33, 116–122. [Google Scholar] [CrossRef]
  39. Badrock, G. Post-incident analysis report: Lacrosse Docklands fire, 25 November 2014. MATEC Web Conf. 2016, 46, 06002. [Google Scholar] [CrossRef]
  40. Victorian Building Authority (VBA). Investigation Report on the Lacrosse Apartments Fire. 2014. Available online: https://www.vba.vic.gov.au/__data/assets/pdf_file/0018/33606/VBA-MR-VBA-investigation-of-Lacrosse-Apartments-fire-progressing.pdf (accessed on 1 March 2023).
  41. Google Earth. Google Earth. Satellite Imagery and 3D Terrain Visualization. Google. 2023. Available online: https://www.google.com/earth/ (accessed on 1 March 2023).
  42. Dassault Systèmes. CATIA V5—Computer-Aided Three-Dimensional Interactive Application. 2023. Available online: https://www.3ds.com/products-services/catia/ (accessed on 1 March 2023).
  43. Blender Foundation. Blender—Open Source 3D Modeling Software. 2023. Available online: https://www.blender.org/ (accessed on 1 March 2023).
  44. Dassault Systèmes. SolidWorks—3D CAD Design Software. 2023. Available online: https://www.solidworks.com/ (accessed on 1 March 2023).
  45. Karlsson, B. RenderDoc (Version 1.16) [Software]. 2023. Available online: https://renderdoc.org/ (accessed on 1 March 2023).
  46. Thunderhead Engineering. PyroSim—Fire Dynamics Simulator (FDS) User Interface. 2023. Available online: https://www.thunderheadeng.com/pyrosim (accessed on 1 March 2023).
  47. Fang, X.; Yuen, A.C.; Yeoh, G.H.; Lee, E.W.; Cheung, S.C. Numerical study on using vortex flow to improve smoke exhaust efficiency in large-scale atrium fires. Indoor Built Environ. 2023, 32, 98–115. [Google Scholar] [CrossRef]
  48. ABC News. Residents Evacuated After Fire in Melbourne CBD Apartment Building. 2014. Available online: https://www.abc.net.au/news/2014-11-25/residents-evacuated-after-fire-in-melbourne-cbd-apartment-build/5914978 (accessed on 1 March 2023).
  49. Zhang, X.; Zhang, P.; Zhang, Y.; Zhang, L.; Zhang, J.; Zhang, H. Numerical Investigations on the Influencing Factors of Rapid Fire Spread of Flammable Cladding in a High-Rise Building. Fire 2022, 5, 149. [Google Scholar] [CrossRef]
  50. Babrauskas, V. The Cone Calorimeter. In SFPE Handbook of Fire Protection Engineering, 3rd ed.; NFPA; SFPE: Washington, DC, USA, 2002; Available online: https://rakandotcom.wordpress.com/wp-content/uploads/2016/05/sfpe.pdf (accessed on 1 March 2023).
  51. Kitware. ParaView—Open-Source Data Analysis and Visualization Software. 2023. Available online: https://www.paraview.org/ (accessed on 1 March 2023).
  52. Unity Technologies. Unity—Real-Time 3D Development Platform. 2023. Available online: https://unity.com/ (accessed on 1 March 2023).
  53. Nubia, R.; Garay Rairan, F.; Wilson, R.; Wilmer, P. Development of a mobile application in augmented reality to improve the communication field of autistic children at a Neurorehabilitar Clinic. In Proceedings of the 2015 Workshop on Engineering Applications—International Congress on Engineering (WEA), Bogota, Colombia, 28–30 October 2015. [Google Scholar] [CrossRef]
  54. Benthorn, L.; Frantzich, H. Fire alarm in a public building: How do people evaluate information and choose an evacuation exit? Fire Mater. 1999, 23, 311–315. [Google Scholar] [CrossRef]
  55. Fujii, K.; Sano, T.; Ohmiya, Y. Influence of lit emergency signs and illuminated settings on walking speeds in smoky corridors. Fire Saf. J. 2021, 120, 103026. [Google Scholar] [CrossRef]
  56. Yen, H.-H.; Lin, C.-H. Intelligent Evacuation Sign Control Mechanism in IoT-Enabled Multi-Floor Multi-Exit Buildings. Sensors 2024, 24, 1115. [Google Scholar] [CrossRef]
  57. Zhao, H.; Schwabe, A.; Schläfli, F.; Thrash, T.; Aguilar, L.; Dubey, R.K.; Karjalainen, J.; Hölscher, C.; Helbing, D.; Schinazi, V.R. Fire evacuation supported by centralised and decentralised visual guidance systems. Saf. Sci. 2022, 145, 105451. [Google Scholar] [CrossRef]
  58. Ronchi, E.; Nilsson, D. Fire evacuation in high-rise buildings: A review of human behaviour and modelling research. Fire Sci. Rev. 2013, 2, 7. [Google Scholar] [CrossRef]
  59. Haraguchi, H.; Miyahara, K. High accuracy and wide range recognition of micro AR markers with dynamic camera parameter control. Electronics 2023, 12, 4398. [Google Scholar] [CrossRef]
  60. Iqbal, M.Z.; Mangina, E.; Campbell, A.G. Current Challenges and Future Research Directions in Augmented Reality for Education. Multimodal Technol. Interact. 2022, 6, 75. [Google Scholar] [CrossRef]
  61. Lee, T.; Jung, C.; Lee, K.; Seo, S. A study on recognising multi-real world object and estimating 3D position in augmented reality. J. Supercomput. 2022, 78, 7509–7528. [Google Scholar] [CrossRef]
Figure 1. Workflow for large-scale building dimension extraction and model reconstruction. This schematic diagram illustrates the process of extracting structural dimensions from publicly available 3D data for fire simulation modelling. The workflow involves sourcing geometry data from Google Earth, extracting 3D data using RenderDoc (Version 1.16), converting and refining the model in Blender (Version 2.93.10) and CATIA V5 (Version 6R2022) and finally, importing it into PyroSim (Version 2022.2.0803) for mesh generation in fire dynamics simulations.
Figure 2. Isometric view of the computational models illustrating the geometric configuration and mesh distribution for Case 1 ((a) high-rise building scenario) and Case 2 ((b) indoor evacuation scenario). The block-like structures in Case 2 represent individual rooms within the indoor environment, which are assumed to be locked and unoccupied during the fire event, meaning smoke does not enter these spaces.
Figure 3. Workflow for converting and integrating fire dynamics simulation results into an augmented reality (AR) platform, illustrating the step-by-step process from simulation in PyroSim to final deployment in AR. File formats used in the workflow: .q (PyroSim simulation data), .xyz (point cloud format for 3D representation), .x3d (Extensible 3D Graphics format for visualisation), and .fbx (Autodesk Filmbox format for 3D model exchange and rendering in Unity).
Figure 4. Flow diagram of the end-to-end architecture using the Vuforia SDK, illustrating the process of generating and utilising model and area targets from CAD or scanned data, integrating them with the local target database, and rendering augmented content for AR applications [53].
Figure 5. Visualisation of Unity3D models reconstructed using Vuforia’s model target for AR tracking. (a) The high-rise building model used for Case 1 fire scenario visualisation. (b) The area target model for Case 2 represents the indoor environment where fire and evacuation simulations take place. The reconstruction captures key structural details to align the AR visualisations with real-world spatial dimensions.
Figure 6. A comparison between a real-world image of the Lacrosse Building fire in Melbourne (left) [39] and an FDS simulation at 250 s (right). Both illustrate the rapid vertical flame spread along the building façade, primarily driven by the combustible aluminium composite panel (ACP) cladding. The simulation effectively captures the fire dynamics observed in the actual incident, highlighting the role of façade materials in fire propagation.
Figure 7. Sequential snapshots of AR-based fire visualisation for the outdoor building cladding fire scenario (Case 1), illustrating the simulated fire spread progression from 6 to 238 s. The AR overlay, using Vuforia-based tracking, aligns the virtual fire dynamics with the real-world structure to enhance situational awareness.
Figure 8. Schematic floor plan of the indoor fire scenario, illustrating the spatial arrangement of key elements, including fire location, emergency exits (1 and 2), and couches as potential fire sources. The three marked perspectives (A, B, C) represent observer positions used for evaluating evacuation decision-making based on visibility conditions and fire proximity.
Figure 9. Captured snapshots of AR visualisation for the indoor fire scenario (Case 2), illustrating the progression of fire and smoke over time (2 to 58 s). The images depict fire spread and smoke accumulation from three different observer perspectives (A, B, and C), demonstrating how visibility conditions vary based on position and proximity to the fire source, which influences evacuation decision-making.
Table 1. Survey results for exit route preferences in Case 2 across three perspectives.
Survey results: participant preferences between the nearest and further exits across perspectives.

                          Perspective A    Perspective B    Perspective C
Exit 1 (nearest exit)     75 (31%)         115 (48%)        170 (71%)
Exit 2 (further exit)     164 (69%)        124 (52%)        69 (29%)
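The counts in Table 1 imply 239 respondents per perspective, with the reported percentages rounded to the nearest whole number. A minimal Python sketch (the dictionary layout is our own, not from the paper) reproduces those figures from the raw counts:

```python
# Raw exit-choice counts from Table 1: (Exit 1, Exit 2) per perspective.
counts = {"A": (75, 164), "B": (115, 124), "C": (170, 69)}

for perspective, (exit1, exit2) in counts.items():
    total = exit1 + exit2  # 239 respondents in each perspective
    share1 = round(100 * exit1 / total)
    share2 = round(100 * exit2 / total)
    print(f"Perspective {perspective}: Exit 1 {share1}%, Exit 2 {share2}%")
```

Running this prints 31%/69% for A, 48%/52% for B, and 71%/29% for C, matching the table and confirming that each perspective drew the same sample of 239 responses.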
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Wong, J.C.S.; Wong, P.S.P.; Das, R.; Yuen, A.C.Y.; Cheung, S.C.P. Transferring Fire Modelling Sciences into Augmented Reality: A Realistic and Safe Reconstructed Fire Scenario. Fire 2025, 8, 132. https://doi.org/10.3390/fire8040132
