Fire
  • Article
  • Open Access

12 February 2026

Developing an Integrated Command-and-Control Training Environment for Fire and Rescue Services: From GIS and UAV Data to Virtual Reality Simulation

Department of Fire Protection, Faculty of Wood Sciences and Technology, Technical University in Zvolen, T. G. Masaryka 24, 960 01 Zvolen, Slovakia
* Author to whom correspondence should be addressed.

Abstract

Effective command-and-control (C2) decision-making during emergency response relies on timely access to spatially accurate information and a clear understanding of evolving incident conditions. Traditional fire-service training methods provide limited opportunities to rehearse complex, high-risk, and large-scale incidents under realistic yet safe conditions. This exploratory pilot study presents the design and experimental evaluation of an integrated training environment that combines geographic information system (GIS) data, unmanned aerial vehicle (UAV) imagery, and immersive virtual reality (VR) simulations to support C2 training for fire-service incident commanders. The system was assessed through scenario-based exercises involving 23 active incident commanders across three representative emergency scenarios: wildland fire, hazardous materials transport accident, and flood response. The training scenarios were based on real geographic areas in central Slovakia, using authentic terrain, land-cover, infrastructure, and hydrological GIS layers to ensure spatial realism of the simulated emergency environments. Pre-training and post-training questionnaires were used to evaluate perceived training realism, preparedness for command tasks, decision-making confidence, and the perceived usefulness of digital spatial information tools. Results indicate a substantial post-training increase in perceived realism and preparedness, with a strong positive correlation between these variables (Spearman ρ = 0.71, p < 0.001). Participants reported improved confidence in assessing incident conditions, prioritizing operational tasks, and allocating resources under dynamically evolving scenarios. The study evaluates perceived spatial situational understanding derived from multi-source spatial information integration, rather than situational awareness measured directly with standardized psychometric instruments.
UAV imagery was found to be particularly valuable for rapid incident size-up, while GIS layers primarily supported spatial planning, hazard delineation, and resource coordination; VR served as a unifying platform for fusing these information sources into a coherent operational picture. Scenario-specific differences in tool usefulness were observed, reflecting the spatial and risk characteristics of each incident type. Overall, the findings indicate that integrated GIS–UAV–VR environments provide a realistic and scalable complement to traditional fire-service command training, enhancing spatially supported decision-making and preparedness for complex emergency response. Given the single-group pretest–posttest design, limited sample size, absence of a control group, and reliance on perceived evaluation measures, the results should be interpreted as indicative rather than as generalizable evidence of training effectiveness.

1. Introduction

Fire-service incident commanders operate in highly dynamic, time-critical, and information-intensive environments in which their decisions directly affect the safety of responders, civilians, and critical infrastructure. Effective command performance therefore depends not only on technical resources and operational experience but also on well-developed situational awareness, spatial understanding, and non-technical skills such as communication and coordination [1,2,3]. The growing complexity of emergency situations places increasing cognitive demands on commanders and requires corresponding advances in training methods.
In contemporary emergency response, commanders increasingly rely on geospatial information systems, aerial reconnaissance, and digital communication technologies to support operational decision-making [4,5,6]. Emergency scenarios are often spatially extensive, interdependent, and rapidly evolving, with significant uncertainty regarding hazard development and resource availability [6]. These trends substantially increase the volume, diversity, and tempo of information that must be processed during command, thereby raising the importance of effective information integration and spatial reasoning skills.
Traditional fire-service training is primarily based on tabletop exercises, field drills, and classroom-based instruction. While these approaches remain essential for procedural learning and tactical coordination, they exhibit inherent limitations with respect to scenario variability, spatial scale, safety constraints, repeatability, and cost efficiency. Many large-scale or high-risk emergency situations, such as wildland fires, hazardous materials transport accidents, or extensive floods, cannot be realistically reproduced under real conditions without considerable logistical effort and potential safety risks [4,5,6]. As a result, commanders often lack opportunities to systematically train complex spatial decision-making under controlled yet realistic conditions.
Recent advances in virtual reality (VR), unmanned aerial vehicles (UAVs), and geospatial data processing offer new possibilities to overcome these limitations. UAVs enable rapid aerial situational intelligence with high spatial and temporal resolution. Their use in forest fire monitoring and operational reconnaissance is well documented [7,8,9], while other studies emphasize their role in photogrammetry, terrain modelling, and disaster mapping using remote sensing methods [10,11]. VR and related immersive technologies have been applied in firefighter training and spatial navigation studies [12,13], as well as in evacuation research and analysis of human behavior under emergency conditions [14,15,16]. GIS provides a structured spatial context for hazard assessment and fire risk analysis [4,5] and is widely used for spatial planning and uncertainty assessment in wildland fire management [6]. Each of these technologies has demonstrated significant individual benefits, but in practical command education they are still frequently applied in isolation rather than within a fully integrated system that reflects real operational information flows.
Situational awareness is commonly understood as the perception and comprehension of environmental elements [1,2], together with the ability to project their future development in dynamic systems [17]. In real operations, situational awareness does not arise from any single data source, but from the continuous integration of visual, spatial, procedural, and contextual information. Consequently, in digital training environments, situational awareness should not be treated as an isolated directly measurable variable but instead as an emergent cognitive property resulting from multi-source information fusion and interactive decision-making processes [1,3,17].
Despite extensive research on VR-based training, UAV-supported reconnaissance, and GIS-based emergency management as separate technologies, their full integration into a unified C2 training environment for fire-service incident commanders remains insufficiently explored. Studies addressing GIS and fire modelling primarily focus on hazard assessment and spatial planning in wildland fire management [4,5], as well as on uncertainty analysis and risk evaluation [6]. Research on VR applications has largely examined firefighter training and spatial navigation [12,13], evacuation behavior and human response in emergency situations [14,15] and questions related to simulation fidelity and realism of virtual environments [16]. Investigations of UAV use concentrate mainly on aerial monitoring and operational reconnaissance in fire scenarios [7,8,9], while other authors emphasize photogrammetry and disaster mapping based on remote sensing techniques [10,11]. However, only a limited number of studies examine how integrated digital environments influence perceived training realism, preparedness for command tasks, decision-making confidence, and the cognitive processes associated with situational awareness in an operational command context.
Furthermore, there is still limited empirical evidence on how such integrated environments support incident commanders operating within established incident command structures and standard operating procedures [18].
Although prior studies have demonstrated the value of GIS-based hazard modelling for fire risk assessment and spatial planning [4,5] and for uncertainty analysis in wildland fire management [6], UAV-supported reconnaissance has been mainly investigated for forest fire monitoring and operational aerial observation [7,8,9] as well as for photogrammetry and disaster mapping applications [10,11]. VR-based emergency training has primarily focused on firefighter training and spatial navigation studies [12,13], evacuation behavior and emergency response research [14,15,16], and simulation fidelity assessment [19]. Despite these advances, these technologies are still commonly deployed independently and are seldom examined as components of a unified C2 training environment for fire-service incident commanders. Existing research rarely evaluates how multi-source spatial information—derived from GIS layers, UAV imagery, and immersive VR visualization—jointly influences perceived realism, preparedness, decision-making confidence, and the cognitive processes associated with situational awareness (spatial situational understanding), despite long-standing evidence that situational awareness emerges from the integration of diverse perceptual and contextual inputs [1,2,17]. Furthermore, most VR-focused fire-service studies emphasize behavioral training, evacuation modeling, or system capability assessments rather than command-level information synthesis and decision-making under realistic conditions [12,13,14,15,16,19].
This study addresses this gap by developing and experimentally evaluating an integrated GIS–UAV–VR C2 training environment specifically designed for fire-service incident commanders. The proposed environment combines real geospatial data to ensure spatial realism, UAV-derived imagery to provide aerial situational intelligence, and immersive VR visualization to support interactive scenario-based command training. Rather than directly quantifying situational awareness using psychometric instruments, this study interprets situational awareness as a derived cognitive effect emerging from the interaction between perceived training realism, preparedness for command tasks, decision-making confidence, and spatial information integration, in line with established situation awareness theory [1,17]. The study therefore aims to assess how the integrated environment influences perceived realism and preparedness, to examine changes in decision-making confidence, and to analyze how these effects vary across different emergency scenarios and levels of command experience. By combining design science development with empirical evaluation, this work contributes to the advancement of next-generation digital training environments for emergency command, building on existing knowledge related to non-technical skills [3], incident command systems [18], and simulation-based training in complex domains [12,13,14,15,16,19,20].
The primary aim of this study is to design and experimentally evaluate an integrated GIS–UAV–VR training environment for fire-service incident commanders that supports spatially informed command-and-control decision-making. The study examines how this integrated environment influences perceived training realism, preparedness for command tasks, decision-making confidence, and the cognitive processes associated with situational awareness. To clearly structure this investigation, the following research questions are defined:
  • (RQ1) How does an integrated GIS–UAV–VR training environment influence perceived training realism in fire-service command training?
  • (RQ2) How does the integrated spatial information environment affect perceived preparedness and decision-making confidence of incident commanders?
  • (RQ3) How do different spatial data sources (GIS layers and UAV imagery) contribute to information integration and derived situational awareness across different emergency scenarios?
These questions structure the design of the training environment and the empirical evaluation presented in this study.

2. Materials and Methods

2.1. Research Design

This study adopted a design science research approach combined with an empirical validation. The methodology consisted of two main phases:
  • the development of an integrated command-and-control (C2) training environment for fire services, and
  • experimental validation of the proposed environment through scenario-based training exercises involving active fire-service incident commanders.
The design science approach was chosen to ensure that the training environment reflects real operational workflows, technological constraints, and command responsibilities. Empirical validation was conducted to assess usability, perceived realism, preparedness for command tasks, and decision-support effects related to spatial information use and scenario structure.

2.2. Architecture of the Integrated Training Environment

The training environment was designed as a modular system composed of three interrelated layers: data acquisition, simulation, and C2 support.
The data acquisition layer integrated spatial and situational data from two primary sources:
  • Geographic information system (GIS) data, processed in ArcGIS Desktop 10.8.2 (Esri, Redlands, CA, USA):
GIS datasets included digital elevation models (DEM), land cover and vegetation layers, transportation networks, watercourses, and critical infrastructure elements. These datasets formed the spatial basis for scenario development and incident visualization.
  • Unmanned aerial vehicle (UAV) data, acquired with the DJI Matrice 350 RTK (SZ DJI Technology Co., Ltd., Shenzhen, China):
UAV-based photogrammetry was used to obtain high-resolution orthophotos and terrain models for selected training areas. UAV reconnaissance supported the generation of realistic spatial inputs, enabling commanders to work with data comparable to real incident conditions.
All spatial data were harmonized within a unified coordinate reference system (ETRS89/UTM zone 34N) and processed to ensure compatibility with the simulation and visualization components. GIS data comprised a digital elevation model (DEM) with 5 m spatial resolution, orthophotos with 0.5 m resolution, forestry data, transportation networks, watercourse layers, and critical infrastructure datasets (10 m spatial resolution) obtained from national geospatial data providers. UAV photogrammetric surveys were conducted over the selected training areas, producing orthophotos and surface models with a ground sampling distance of approximately 12 cm.
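As a toy illustration of this harmonization step, the sketch below maps the centre of a 5 m DEM cell to the covering pixel of a 12 cm UAV orthophoto, assuming both grids are already projected in ETRS89/UTM zone 34N. The function name, grid origins, and coordinate values are illustrative placeholders, not part of the study's actual processing pipeline.

```python
# Hypothetical sketch: indexing between a 5 m DEM grid and a 0.12 m UAV
# orthophoto grid that share the ETRS89/UTM zone 34N CRS. Origins and
# coordinates are illustrative, not the study's actual data.

DEM_RES = 5.0      # metres per DEM cell
UAV_RES = 0.12     # metres per orthophoto pixel (~12 cm GSD)

def dem_cell_to_uav_pixel(col, row, dem_origin, uav_origin):
    """Map the centre of a DEM cell to the covering UAV pixel index.

    Origins are (easting, northing) of the top-left grid corner in metres;
    northing decreases with increasing row index.
    """
    # easting/northing of the DEM cell centre
    e = dem_origin[0] + (col + 0.5) * DEM_RES
    n = dem_origin[1] - (row + 0.5) * DEM_RES
    # index of the UAV pixel containing that point
    px = int((e - uav_origin[0]) // UAV_RES)
    py = int((uav_origin[1] - n) // UAV_RES)
    return px, py

# Example: both grids share the same top-left corner (arbitrary UTM coords)
origin = (430_000.0, 5_400_000.0)
print(dem_cell_to_uav_pixel(0, 0, origin, origin))  # → (20, 20)
```

In practice this registration is performed by GIS software using ground control points, as described in Section 2.3.4, but the arithmetic above captures why a shared CRS makes datasets of differing resolution mutually indexable.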

2.3. Simulation and Virtual Reality Environment

2.3.1. Virtual Environment Design

The simulation layer transformed GIS- and UAV-derived data into immersive three-dimensional environments suitable for virtual reality (VR) training. Terrain geometry, infrastructure elements, and environmental features were reconstructed to reflect real-world spatial conditions.
The VR environment enabled trainees to:
  • observe incident development from multiple perspectives,
  • interpret spatial relationships and terrain constraints,
  • perform tactical decision-making under dynamically evolving conditions.
During training, participants were provided with layered spatial information directly within the VR environment. This included terrain models derived from the digital elevation model, orthophotos, transport networks, watercourses, critical infrastructure layers, hazard zones, and UAV-derived aerial imagery. These datasets were visualized as interactive overlays that could be activated or deactivated by the trainee, allowing incident commanders to combine aerial perspective with GIS-based analytical layers while assessing the scenario. This visual composition closely resembled the type of spatial information available during real emergency operations.
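The layer-toggling interaction described above can be sketched as a minimal state model. The layer names below mirror the datasets listed in the text, but the class and its behavior are purely illustrative, not the system's implementation.

```python
# Illustrative sketch of interactive overlay toggling in the VR scene:
# each spatial layer can be activated or deactivated independently, and
# the visible composition is the ordered set of active overlays.

class OverlayStack:
    LAYERS = ["terrain_dem", "orthophoto", "transport", "watercourses",
              "infrastructure", "hazard_zones", "uav_imagery"]

    def __init__(self):
        # assume the base terrain is always shown; analytical overlays
        # start hidden until the trainee activates them
        self.active = {name: (name == "terrain_dem") for name in self.LAYERS}

    def toggle(self, name):
        self.active[name] = not self.active[name]

    def visible(self):
        return [n for n in self.LAYERS if self.active[n]]

stack = OverlayStack()
stack.toggle("uav_imagery")    # trainee adds the aerial perspective
stack.toggle("hazard_zones")   # ...and the hazard delineation layer
print(stack.visible())         # → ['terrain_dem', 'hazard_zones', 'uav_imagery']
```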

2.3.2. Incident Dynamics and Scenario Logic

Incident progression within the VR environment was controlled through scripted scenario logic incorporating predefined events, time-dependent changes, and decision-triggered responses. Hazard development (fire spread, hazardous-material leakage, flood progression) was represented using simplified rule-based evolution rather than detailed physical simulation models. The primary focus was placed on command decision-making, prioritization, and coordination under changing conditions, rather than on high-fidelity physical hazard modelling. This approach ensured consistent scenario repeatability while preserving operationally realistic decision demands.
Scenario progression was not based on predefined game-like branching paths. Instead, an instructor supervised each training session and dynamically triggered scenario events in response to participant decisions and timing of actions. This approach allowed flexible adaptation of incident development while preserving overall scenario structure and repeatability. As a result, the training more closely resembled real command supervision rather than automated game logic, enabling realistic interaction between the trainee and the evolving emergency situation.
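The combination of simplified rule-based hazard evolution with instructor-triggered events could be sketched as follows. The event names, spread rates, and units are hypothetical placeholders, not values from the actual scenario scripts.

```python
# Minimal sketch of the scenario logic described above: hazard state
# advances by simple rules as simulated time passes, while an instructor
# triggers named events that modify the evolution (no automated branching).

class ScenarioState:
    def __init__(self):
        self.elapsed_min = 0
        self.hazard_radius_m = 50.0   # e.g. initial fire perimeter radius
        self.spread_rate = 2.0        # metres per simulated minute
        self.log = []                 # (time, event) pairs for debriefing

    def tick(self, minutes=1):
        """Advance the simplified rule-based hazard evolution."""
        self.elapsed_min += minutes
        self.hazard_radius_m += self.spread_rate * minutes

    def trigger(self, event):
        """Instructor-triggered events adapting the scenario at runtime."""
        if event == "wind_shift":
            self.spread_rate *= 1.5
        elif event == "suppression_effective":
            self.spread_rate *= 0.5
        self.log.append((self.elapsed_min, event))

state = ScenarioState()
state.tick(10)               # 10 simulated minutes of baseline spread
state.trigger("wind_shift")  # instructor escalates in response to trainee
state.tick(10)
print(round(state.hazard_radius_m, 1))  # → 100.0
```

The event log makes each run reproducible for debriefing while leaving the timing of events under instructor control, matching the supervision model described above.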

2.3.3. Conceptual Alignment with Cognitive Measures

In this study, situational awareness was not measured directly using standardized real-time assessment instruments such as SAGAT or SART; it serves as a theoretical reference framework rather than a directly measured construct. The evaluation instead focuses on perceived spatial situational understanding, inferred from participants’ assessments of training realism, preparedness for command tasks, decision-making confidence, and the reported effectiveness of spatial information integration. This derived character of the construct should be considered when comparing the present findings with studies employing direct situational awareness metrics.

2.3.4. Technical Implementation

The virtual training environment was implemented using a real-time 3D simulation engine, enabling interactive visualization of GIS- and UAV-derived spatial data in an immersive format. The system was realized as a fully immersive single-user VR environment using head-mounted display technology rather than desktop-based visualization. Participants interacted with the environment through the head-mounted display and handheld controllers, allowing free navigation, scene observation from multiple viewpoints, and selection of command actions through menu-based interfaces. Training sessions were conducted individually, with an instructor supervising scenario progression and decision logging. The system architecture allowed dynamic updating of spatial layers and scenario events during runtime, providing real-time feedback on command decisions, consistent with contemporary VR-based professional training systems.
The VR environment was developed using the Unity 3D engine (Unity Technologies, San Francisco, CA, USA) and deployed on Meta Quest 2 head-mounted displays with handheld motion controllers (Meta Platforms Inc., Menlo Park, CA, USA). The system operated at a refresh rate of 72 Hz with an approximate field of view of 90 degrees, ensuring smooth visualization and minimizing motion discomfort during training. Spatial datasets originating from GIS sources and UAV surveys were unified within the ETRS89/UTM zone 34N coordinate reference system. The digital elevation model (5 m resolution) served as the base terrain layer, while high-resolution UAV orthophotos and surface models (12 cm resolution) were georeferenced and spatially aligned using ground control points and coordinate transformation to ensure consistent spatial registration across datasets of differing precision. This allowed accurate visual fusion of terrain morphology, infrastructure layers, and aerial imagery within the immersive environment.

2.4. Command-and-Control Framework

The training environment was structured around established fire-service C2 principles and standard operating procedures (SOPs). Participants operated in the role of incident commanders and were required to:
  • assess situational information from GIS and UAV sources,
  • identify hazards and operational priorities,
  • allocate resources and coordinate response actions,
  • adapt decisions to changing incident conditions.
The system supported structured decision-making workflows consistent with incident command system (ICS) principles and national fire-service doctrine.

2.5. Scenario Development

Three representative operational scenarios were developed to reflect common but high-risk emergency situations faced by fire-service commanders:
  • Wildland fire:
The scenario focused on terrain influence, vegetation distribution, access limitations, and fire spread risk under changing environmental conditions.
  • Hazardous materials transport accident:
The scenario involved a traffic accident with hazardous substances, emphasizing risk zoning, protection of responders and civilians, and coordination with specialized units.
  • Flood response:
The scenario addressed large-area flooding, infrastructure impacts, evacuation planning, and coordination of resources across affected zones.
Each scenario included defined initial conditions, decision points, information updates, and evaluation criteria.
All three training scenarios were developed in close consultation with experienced incident commanders and senior training officers from the Fire and Rescue Service, ensuring that scenario structure, objectives, spatial constraints, and hazard dynamics reflected realistic operational practice. The scenarios were modeled on actual incident types frequently encountered within the national response system, specifically wildland fires in rugged terrain, hazardous-materials transportation accidents, and large-scale flood events. Where possible, real GIS layers were incorporated, including topography, land cover, transport networks, and hydrological features, to replicate authentic operational environments. Subject-matter experts reviewed and validated each scenario’s design to confirm that the decision points, tactical challenges, and resource demands corresponded to established response doctrine and command procedures. This validation ensured that the integrated GIS–UAV–VR training environment was grounded in operational fidelity and suitable for evaluating command decision-making under realistic conditions.

2.6. Study Area

The spatial datasets used to construct the training scenarios were derived from real geographic locations in central Slovakia, selected to represent typical operational environments encountered by the national Fire and Rescue Service. The study area includes mountainous forest terrain for the wildland fire scenario, a major transportation corridor for the hazardous-materials scenario, and a river-floodplain zone for the flood scenario. Using real geographic areas ensured that terrain morphology, infrastructure layout, and land-cover patterns reflected realistic operational conditions.
The selection of study areas and spatial datasets was based on criteria including terrain diversity, presence of transport infrastructure, hydrological features, and representation of environments typical for national fire-service operations. These criteria ensured that the spatial inputs used for scenario construction reflected realistic operational conditions encountered by incident commanders.

2.7. Participants

The experimental evaluation involved active fire and rescue service incident commanders participating in professional training exercises. All participants had operational command experience and were familiar with standard fire and rescue service procedures.
The participant group consisted of 23 incident commanders enrolled in the 36th Specialized Professional Training of Command Functions. This training program represents the official qualification pathway for command positions in the Slovak Fire and Rescue Service. Therefore, the sample corresponds directly to the target population for which the training environment is intended. While the sample size limits broad statistical generalization, the study aims to explore internal relationships and cognitive effects within a representative group of professional commanders rather than population-level inference.
Participant demographic and professional characteristics are summarized in Table 1.
Table 1. Demographic and professional characteristics of participating incident commanders.
Although wildland fire incidents constitute a frequent part of participants’ operational responsibilities, hazardous materials transport accidents and flood response events occur less regularly but remain formally within the scope of command duties of the Fire and Rescue Service. Incident commanders are required by national doctrine to be prepared to manage all three types of emergencies, even if some occur only occasionally in daily practice. The inclusion of these scenarios therefore reflects realistic command responsibilities rather than routine operational frequency and ensures that the training environment addresses both common and less frequent but high-risk incident types.
Table 1 summarizes the composition of the participant group involved in the experimental evaluation, including gender, age distribution, years of service, command positions, and prior exposure to GIS, UAV, and simulation technologies. These background variables were used to describe the sample and to support subsequent correlation and cluster analyses examining training effects in relation to experience level and technological familiarity.
The sample corresponds to the full cohort of incident commanders participating in the national command qualification training during the study period, rather than to a statistically pre-determined experimental group; this reflects the practical constraints of accessing highly specialized professional participants and supports the ecological validity of the study within its operational context. The gender homogeneity of the sample reflects the current demographic structure of command positions within the Slovak Fire and Rescue Service rather than intentional selection criteria. While these factors limit broader statistical generalization, the sample can be considered representative of the professional target group for which the training environment is intended.

2.8. Data Collection

2.8.1. Questionnaire Design

Data were collected using a structured questionnaire administered before and after participation in the training exercise. The empirical evaluation focused on:
  • perceived realism of the training environment,
  • perceived preparedness for command tasks,
  • decision-making confidence,
  • usefulness of scenario structure and digital tools,
  • and qualitative feedback on usability and information integration.
Items were rated using ordinal and categorical response formats. Questionnaire content was developed based on relevant literature and expert consultation.
The questionnaire consisted of Likert-scale and categorical items addressing perceived realism, preparedness, decision-making confidence, and perceived usefulness of digital tools. Example items included: “The training scenario realistically represented real operational conditions”; “The integration of GIS and UAV data improved my understanding of the incident situation”; “I felt confident in prioritizing operational tasks during the scenario”; and “After completing the training, I feel better prepared to perform incident command tasks”. Respondents rated these statements on a five-point Likert agreement scale ranging from strongly disagree to strongly agree.

2.8.2. Training Procedure

Participants completed the pre-training questionnaire, followed by scenario-based training using the integrated environment. After completing the scenarios, participants filled out the post-training questionnaire reflecting their experience.

2.9. Data Analysis

Quantitative data analysis included:
  • descriptive statistics of ordinal and categorical questionnaire variables,
  • Spearman’s and Pearson’s correlation coefficients to evaluate relationships between perceived realism, preparedness, scenario structure, and training characteristics,
  • principal component analysis (PCA) for dimensionality reduction and identification of dominant response patterns,
  • cluster analysis to identify participant groups based on experience level and perceived training benefits.
Pre/post differences were evaluated primarily for perceived preparedness and perceived realism.
Effects related to situational awareness were interpreted through correlation structures, regression dependencies, PCA projections, and cluster patterns, rather than through direct mean-score comparison. Qualitative data from open-ended responses were analyzed using thematic content analysis and further examined through cluster-based grouping.
Questionnaire responses were collected using five-point Likert-type ordinal scales. Consequently, non-parametric statistical methods were applied throughout the analysis. Pre-training and post-training differences were evaluated using the Wilcoxon signed-rank test. Relationships between key evaluation variables were examined using Spearman’s rank correlation coefficient. Principal component analysis (PCA) was applied to identify latent dimensions underlying participant responses, and hierarchical cluster analysis was used to classify participants according to similarity in response patterns. All statistical analyses were conducted in an exploratory manner to identify internal relationships within the participant group rather than to provide population-level inference.
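As an illustration of the tie-aware rank correlation appropriate for ordinal Likert data, the stdlib-only sketch below computes Spearman’s rho for two questionnaire variables. The paired ratings are hypothetical illustrative values, not the study’s data.

```python
# Illustrative sketch: Spearman's rank correlation with average ranks for
# ties, as required for five-point Likert data. Ratings are hypothetical.

def rank(values):
    """Assign average ranks, handling tied values."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # extend j to cover the whole tie group
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1          # average rank of the tie group
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman_rho(x, y):
    """Spearman's rho = Pearson correlation of the rank vectors."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical paired ratings: realism vs. preparedness (5-point Likert)
realism      = [4, 5, 3, 4, 5, 2, 4, 3]
preparedness = [4, 5, 3, 5, 4, 2, 4, 3]
print(round(spearman_rho(realism, preparedness), 2))  # → 0.84
```

In practice such coefficients are obtained from statistical software; the sketch only makes explicit why tie handling matters when many respondents choose the same Likert category.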

2.10. Ethical Considerations

The study was conducted in accordance with ethical principles for research involving human participants. Participation was voluntary, and all respondents were informed about the purpose of the study and the anonymous handling of data. The training activities posed no risk beyond standard professional training exercises.

3. Results

This section presents the results of the experimental validation of the integrated command-and-control (C2) training environment based on correlation analysis, multivariate statistical techniques, and qualitative feedback. The evaluation focuses on perceived training realism, preparedness for command tasks, scenario-specific usefulness of digital tools, and derived effects related to situational awareness (in the sense of improved spatial situational understanding) through spatial information integration.

3.1. Participant Characteristics and Training Background

The experimental evaluation involved active fire-service incident commanders with varying levels of operational and command experience. Participants represented different organizational units of the fire and rescue service and covered a broad range of command roles and years of service.
Most participants reported regular involvement in operational response and periodic participation in professional training exercises. While traditional field exercises and classroom-based instruction constituted the dominant forms of prior training experience, only a limited proportion of participants had previously encountered simulation-based or immersive digital training tools.
With respect to technological exposure, respondents indicated that their prior experience with GIS, UAV systems, and virtual simulation tools was generally limited and fragmented. This baseline condition is important for interpreting the post-training results, as it indicates that the integrated C2 training environment represented a novel training modality for the majority of participants.
Overall, the participant group can be characterized as operationally experienced but with limited prior exposure to immersive, digitally integrated training environments.

3.2. Perceived Training Realism and Preparedness Before and After Training

Table 2 presents pre-training and post-training ratings of perceived training realism. The Wilcoxon signed-rank test confirmed a statistically significant post-training increase in perceived realism (median increased from 3.0 to 4.0; p = 0.00075). This indicates that participants perceived the VR-based training as substantially closer to real operational conditions than previously experienced training formats.
Table 2. Pre- and post-training ratings of perceived training realism (n = 23).
Median values and interquartile ranges (IQR) are reported because of the ordinal nature of the Likert-type data; the statistical significance of pre/post differences was evaluated using the Wilcoxon signed-rank test.
Before the training, participants generally rated the realism of traditional training methods as moderate. While conventional exercises were acknowledged as operationally relevant, respondents frequently identified limitations related to restricted scenario variability, limited spatial scale, and insufficient representation of complex and dynamically evolving emergency conditions.
After participation in the simulation-based training, respondents reported a substantial increase in perceived realism, particularly in relation to (i) spatial representation of terrain and infrastructure, (ii) dynamic development of emergency scenarios, and (iii) integration of multiple information sources. As shown in Table 2, this improvement was accompanied by a corresponding increase in perceived preparedness for command decision-making. Participants reported higher confidence in their ability to assess incident conditions, prioritize operational tasks, and allocate resources under time pressure after completing the training.
Correlation analysis revealed that perceived realism and perceived preparedness were strongly associated (Spearman ρ = 0.71, p < 0.001). This finding suggests that training realism, enabled primarily by immersive VR visualization and the integration of GIS and UAV data, was closely tied to participants’ sense of operational readiness. The observed statistical relationships provide a quantitative basis for interpreting the training effects presented in this section.

3.3. Decision-Making Confidence and Perceived Usefulness of Digital Tools

Post-training questionnaire responses indicated that the integrated GIS–UAV–VR training environment substantially supported decision-making processes under complex emergency conditions. Participants consistently reported that the availability and integration of digital spatial information increased their confidence in tactical and operational decision-making.
Participants emphasized that UAV reconnaissance is not only valuable within the training environment but is already increasingly applied during real emergency operations in the fire service. The training scenario therefore reflected existing and expected future operational practice rather than introducing a tool used exclusively for training purposes.
The perceived contribution of GIS-based spatial layers was especially pronounced in the areas of:
  • terrain interpretation and accessibility assessment,
  • identification and delineation of hazard zones,
  • analysis of infrastructure exposure,
  • prioritization of operational sectors.
UAV-derived imagery was rated as particularly valuable during the initial incident size-up, where it supported rapid spatial orientation, verification of scene conditions, and detection of potential secondary hazards. Participants emphasized that the overhead perspective and near-real-time visual information significantly reduced uncertainty in the early phase of command decision-making.
The immersive VR visualization enabled the integration of multiple spatial and situational information sources into a single coherent operational picture. This facilitated continuous reassessment of operational priorities and improved understanding of the consequences of tactical decisions within dynamically evolving scenarios.
Overall, the results indicate that decision-making confidence increased primarily as a function of information integration and perceived training realism, rather than as an isolated effect of any single digital technology. The combined use of GIS, UAV, and VR tools created a unified cognitive framework that supported structured assessment, prioritization, and coordination during simulated emergency response.

3.4. Relationships Between Evaluation Variables

Spearman’s rank correlation analysis was performed to examine relationships among perceived realism, perceived preparedness for command tasks, decision-making confidence, and perceived usefulness of GIS, UAV, and VR information sources. Correlation coefficients were computed only for evaluation items collected on five-point Likert-type ordinal scales. Supplementary questionnaire items addressing tool preferences and scenario suggestions were recorded as categorical or open-ended responses and were therefore analyzed descriptively rather than included in the correlation calculations, ensuring consistency between data type and statistical method.
The correlation matrix is presented in Table 3.
Table 3. Spearman correlation matrix among key evaluation variables (n = 23).
The correlation analysis presented in Table 3 confirms a strong association between perceived training realism and perceived preparedness for command tasks. To further contextualize these cognitive evaluation outcomes, Table 4 and Figure 1 present descriptive results concerning participants’ preferred spatial information and visualization technologies. Preferences were collected using a multiple-choice questionnaire item. Percentages indicate the proportion of participants who selected a given technology. Respondents could select more than one option; therefore, the sum of selections may exceed the total number of participants. The strong preference for UAV-based reconnaissance imagery indicates a high perceived value of real-time overhead spatial data in supporting incident command decision-making.
Table 4. Participant-selected usefulness of spatial information tools (multiple selections permitted, n = 23).
Figure 1. Distribution of participant-selected spatial information and visualization technologies for command training (multiple selections permitted, n = 23).
Figure 1 illustrates participant preferences for spatial information and visualization technologies used in command training. UAV-based reconnaissance imagery was selected by 69.6% of participants, indicating strong interest in real-time aerial spatial data for incident command support. GIS spatial layers were selected by 39.1%, reflecting substantial demand for structured geospatial information in operational training. VR-based simulation and visualization was selected by 26.1%; although chosen less frequently than UAV and GIS tools, it was still recognized as a valuable component of command training because it integrated all spatial information sources into a coherent operational picture.
The pronounced preference for UAV reconnaissance imagery can be attributed to the intuitive visual nature of aerial information. Unlike GIS layers, which require familiarity with spatial data interpretation and layer abstraction, UAV imagery provides immediate, easily interpretable visual cues that do not depend heavily on prior geospatial training. Participants reported that overhead imagery allowed rapid situational orientation without advanced technical knowledge, which explains its dominant selection among spatial information tools.
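The reported shares can be reproduced from raw selection counts. In the sketch below, the counts (16, 9, and 6 of 23) are back-calculated from the published percentages and are therefore an assumption about the underlying tallies:

```python
# Hypothetical raw counts inferred from the reported percentages
# (16/23 ≈ 69.6 %, 9/23 ≈ 39.1 %, 6/23 ≈ 26.1 %); multiple selections allowed.
n = 23
selections = {
    "UAV reconnaissance imagery": 16,
    "GIS spatial layers": 9,
    "VR simulation": 6,
}

shares = {tool: round(100 * count / n, 1) for tool, count in selections.items()}
for tool, pct in shares.items():
    print(f"{tool}: {pct}%")
# Percentages need not sum to 100 because each participant could select several tools.
```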

3.5. Perceived Situational Awareness Based on Spatial Information Integration

In this study, situational awareness was not measured using standardized instruments. Instead, perceived situational awareness (in the sense of improved spatial situational understanding) was evaluated as a derived cognitive construct based on participants’ assessments of training realism, decision-making confidence, scenario structure, and the perceived contribution of spatial information sources. Rather than being treated as a standalone numerical variable, it was interpreted through patterns of information integration and statistically supported relationships among the key evaluation variables.
Participants consistently indicated that the combined use of UAV reconnaissance data and GIS spatial layers substantially enhanced their ability to interpret spatial relationships, assess hazards, and anticipate incident development. The availability of up-to-date overhead imagery was reported to significantly reduce uncertainty during the initial incident size-up, particularly in spatially complex and dynamically evolving environments.
The contribution of GIS-based terrain models, vegetation layers, transport networks, and hydrological features was reflected in improved spatial orientation and prioritization of operational sectors. Layered visualization of these datasets supported identification of terrain constraints, recognition of infrastructure vulnerabilities, and assessment of potential escalation zones.
The immersive VR-based visualization environment facilitated the fusion of spatial and situational information into a unified operational picture. This supported continuous reassessment of incident development and enhanced the cognitive linkage between spatial perception and tactical decision-making.
Overall, the results indicate that perceived situational awareness increased primarily as a consequence of integrated spatial information use, mediated through improved training realism and decision-making confidence. Situational awareness thus appeared as an emergent property of information fusion and scenario-based command training, rather than as a directly measured performance metric.

3.6. Scenario-Specific Results

Scenario-dependent differences in the perceived usefulness of digital tools and information integration were observed across the three experimental training scenarios. While the integrated GIS–UAV–VR environment was evaluated positively in all cases, the role of specific spatial data sources varied according to the operational characteristics of each scenario.

3.6.1. Wildland Fire Scenario

In the wildland fire scenario, participants indicated a strong reliance on terrain models, vegetation layers, and UAV overview imagery. The spatial distribution of vegetation, slope configuration, and accessibility of terrain were identified as the primary factors influencing tactical decision-making. UAV-derived overhead imagery supported rapid assessment of fire front development, identification of threatened areas, and evaluation of access routes for ground units. Participants reported that the integration of terrain and vegetation data significantly improved their ability to anticipate potential fire spread and recognize operational constraints.

3.6.2. Hazardous Materials Transport Accident

In the hazardous materials transport accident scenario, GIS-based risk zoning and infrastructure proximity layers were rated as particularly important. These datasets supported rapid identification of endangered zones, exposed infrastructure, and potential evacuation areas. UAV imagery was primarily used for verification of on-scene conditions and detection of secondary hazards, such as leaking containers, traffic congestion, or bystander exposure. The combined spatial overview enhanced situational interpretation under time-critical conditions and supported safer command decisions.

3.6.3. Flood Response Scenario

The flood response scenario emphasized the importance of a large-area spatial overview and coordination across multiple affected zones. Participants reported that GIS-based visualization of watercourses, flooded areas, transport networks, and critical infrastructure was essential for prioritization of response actions and allocation of limited resources. UAV-derived imagery supported the identification of infrastructure failures, inaccessible routes, and dynamically changing water levels. The scenario highlighted the value of integrated spatial information for strategic-level coordination under conditions of large-scale spatial uncertainty.

3.6.4. Scenario-Level Interpretation

The scenario-level interpretation does not represent a grading of scenarios according to risk level. Instead, it reflects differences in spatial structure, hazard dynamics, and command-information demands characteristic of each incident type. The scenarios differ primarily in how spatial information must be interpreted and prioritized during command decision-making. Consequently, the varying usefulness of GIS, UAV, and VR tools across scenarios results from these differing spatial and operational characteristics rather than from any ranking of scenario severity.
Overall, the scenario-specific results confirm that integrated spatial information support enhances command decision-making across different types of emergency situations, while the relative importance of individual data sources varies according to the spatial scale, dynamics, and risk structure of each scenario.

3.7. Cluster Analysis of Participant Responses

Cluster analysis was applied to identify dominant response patterns among participants with respect to experience level, perceived training benefits, and decision-making confidence. The main characteristics of the identified participant clusters are summarized in Table 5.
Table 5. Cluster characteristics of participant responses.
The first cluster consisted predominantly of highly experienced incident commanders with extensive operational practice and higher baseline confidence. These participants reported consistently high levels of perceived preparedness and training realism. Their relative improvement after training was more moderate; however, they emphasized the value of the integrated training environment mainly for advanced tactical rehearsal, testing of complex scenarios, and inter-agency coordination. For this group, the training environment primarily served as a high-fidelity rehearsal platform for demanding operational situations.
The second cluster comprised mainly less experienced commanders, including participants at earlier stages of their command careers. This group reported lower baseline confidence and preparedness but achieved substantially stronger relative gains after completing the simulation-based training. Participants in this cluster highlighted the importance of immersive spatial visualization, scenario repetition, and integrated information support for building command competence and decision-making confidence.
Despite these differences, both clusters reported overall positive effects of the integrated GIS–UAV–VR training environment. As summarized in Table 5, the cluster structure indicates that training realism, information integration, and scenario complexity interact with prior experience to shape perceived training outcomes. These results further support the relevance of adaptive, scenario-based digital training environments for heterogeneous groups of incident commanders.
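A minimal sketch of the clustering step, assuming Ward linkage on Euclidean distances over Likert-item response vectors; the two simulated groups and all numeric values are illustrative stand-ins, not the study's data:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Synthetic response matrix: 23 participants x 4 Likert items
# (e.g., realism, preparedness, confidence, tool usefulness).
rng = np.random.default_rng(7)
experienced = rng.integers(4, 6, size=(12, 4)).astype(float)  # ratings in {4, 5}
developing = rng.integers(2, 4, size=(11, 4)).astype(float)   # ratings in {2, 3}
responses = np.vstack([experienced, developing])

# Ward linkage builds the hierarchy; fcluster cuts it into two groups.
Z = linkage(responses, method="ward")
labels = fcluster(Z, t=2, criterion="maxclust")
print("cluster sizes:", np.bincount(labels)[1:])
```

Ward linkage is one common choice for interval-like data; with genuinely ordinal responses, a rank-based distance could be substituted in the same pipeline.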

3.8. Summary of Key Experimental Findings

The experimental validation of the integrated GIS–UAV–VR C2 training environment confirmed its relevance as an advanced tool for scenario-based training of fire-service incident commanders. The key findings derived from the empirical evaluation, correlation analysis, and qualitative interpretation can be summarized as follows:
  • The integrated training environment was perceived as usable, realistic, and operationally relevant, with a clear increase in perceived training realism after participation in the simulation-based exercises.
  • Simulation-based training led to a systematic increase in perceived preparedness and decision-making confidence, particularly in relation to spatially complex and dynamically evolving emergency situations.
  • Perceived situational awareness (in the sense of improved spatial situational understanding) increased as a derived cognitive effect of spatial information integration, supported by the combined use of GIS spatial layers, UAV-derived imagery, and immersive VR visualization.
  • Scenario-based training indicated that the relative importance of individual digital data sources is strongly scenario-dependent, while their integrated use consistently supports command decision-making across different types of emergencies.
  • Cluster analysis confirmed that commanders with different levels of operational experience benefit from the training environment in distinct but complementary ways, with developing commanders showing stronger relative gains and experienced commanders emphasizing the value of high-fidelity tactical rehearsal.
Collectively, the results indicate that the proposed integrated training environment provides consistent perceived cognitive and operational benefits, strengthens spatially supported decision-making, and represents a scalable platform for advanced command training under complex emergency conditions.

4. Discussion

Non-parametric comparison of paired questionnaire responses showed a statistically significant improvement in perceived training realism following participation in the VR-based command training environment (Wilcoxon signed-rank test, p = 0.00075). This result indicates that participants perceived the integrated GIS–UAV–VR training environment as substantially closer to real operational conditions than previously experienced training formats. The finding supports the assumption that immersive spatial information integration enhances subjective realism of incident command training.
This study investigated the impact of an integrated GIS–UAV–VR command-and-control (C2) training environment on perceived realism, preparedness, decision-making confidence, and derived situational awareness (in the sense of improved spatial situational understanding) of fire-service incident commanders. The results indicated consistently positive effects across all evaluated dimensions and operational scenarios. These findings align with broader trends reported in recent literature on immersive and simulation-based training for emergency services, which increasingly emphasize the role of digital environments in enhancing cognitive readiness, spatial reasoning, and command performance [21,22,23].

4.1. Training Realism and Preparedness in the Context of Immersive Training Research

The significant post-training increase in perceived realism and preparedness observed in this study is consistent with contemporary findings on immersive learning environments for fire and rescue training. Recent reviews indicate that VR- and extended reality (XR)-based systems enhance the subjective realism of training by enabling safe exposure to complex, dynamically evolving emergency conditions that are difficult to reproduce in live exercises [21,24]. The present study extends this knowledge by showing that realism is further strengthened when immersive visualization is combined with real GIS and UAV data, rather than relying solely on synthetic environments.
Preparedness gains observed after training reflect the ability of immersive environments to support experiential learning under time pressure and uncertainty, which are core characteristics of emergency command [22,25]. Unlike traditional classroom-based instruction, the integrated environment used in this study required participants to actively interpret evolving spatial information, prioritize tasks, and adapt resource allocation. This supports recent findings that immersive training improves not only declarative knowledge but also procedural and strategic decision competence [24,25,26].

4.2. Situational Awareness as a Product of Multi-Source Information Fusion

Situational awareness was treated as an emergent construct and was not assessed through direct measures such as SAGAT or SART; instead, it was inferred from correlated perceptual indicators reflecting information integration, spatial understanding, and confidence in command decision-making. This interpretation corresponds well with recent conceptual work emphasizing that situational awareness in digital training environments arises primarily through information fusion and interactive sensemaking, rather than through isolated perceptual metrics [27,28,29].
The integrated use of UAV imagery and GIS layers played a decisive role in this process. UAV data supported rapid perception and verification of on-scene conditions, while GIS layers enabled structured comprehension of terrain, infrastructure exposure, and risk zones. The ability to project future incident development was further enhanced through VR-based scenario evolution. Similar multi-source approaches have been highlighted in recent XR-based command platforms, where real-time geospatial overlays and dynamic simulation substantially improve cognitive mapping and predictive reasoning [23,30].

4.3. Contribution of UAV and GIS to Command Decision Support

The complementary roles of UAV reconnaissance and GIS analysis identified in this study strongly correspond with recent findings from disaster management and XR-supported command research. UAV imagery is widely recognized for its value in rapid damage assessment, hazard verification, and monitoring of inaccessible zones [31,32,33]. In this study, commanders explicitly reported reduced uncertainty during initial incident assessment when UAV imagery was available.
GIS, on the other hand, provided the analytical backbone for spatial decision-making, supporting risk zoning, accessibility analysis, and prioritization of protective measures. Recent studies confirm that GIS-based overlays within immersive environments significantly enhance command efficiency, particularly in multi-hazard scenarios [34,35,36]. The present study shows that the true operational benefit emerges not from either technology alone but from their synchronized use within a single immersive command interface.

4.4. Scenario-Specific Interpretation of Digital Tool Effectiveness

The scenario-dependent variation in perceived tool usefulness observed in this study reflects a key principle emphasized in recent simulation-based training research: digital support must be tailored to the spatial and risk structure of the incident type [24,37]. Wildland fire scenarios benefited primarily from terrain models, vegetation data, and wide-area aerial overview, which directly correspond to fire spread dynamics and accessibility constraints.
In hazardous materials transport accidents, GIS-based risk zoning and infrastructure proximity analysis were dominant, which is consistent with XR-based hazmat training systems focusing on exclusion zones, evacuation corridors, and responder safety envelopes [38,39]. Flood scenarios, characterized by spatial extent and cascading infrastructure impacts, emphasized the strategic value of wide-area geospatial visualization for evacuation planning and inter-agency coordination, in line with recent cloud-based simulation approaches for distributed disaster management [34,40].

4.5. Training Effects Across Experience Levels

Cluster analysis showed that both less experienced and highly experienced commanders benefited from the integrated training environment, though in different ways. Less experienced participants showed stronger relative gains in preparedness and confidence, which is consistent with findings from immersive learning research indicating that VR and simulation environments are particularly effective for accelerating early-stage competence development [25,41].
More experienced commanders emphasized the system’s value for advanced tactical rehearsal, rare scenario training, and coordination testing. This aligns with research showing that simulation-based environments serve not only as educational tools but also as strategic laboratories for expert-level decision validation and SOP refinement [26,42].

4.6. Practical Implications for Fire-Service Training Systems

From an applied perspective, the results confirm that integrated GIS–UAV–VR training environments represent a viable, scalable, and safe complement to live-fire and field-based training. Recent economic analyses likewise indicate that immersive training platforms enable repeatable exercises and a higher degree of training standardization [43,44,45]. The ability to log decisions, visualize consequences, and conduct structured after-action reviews further enhances training effectiveness.
The findings also support the emerging shift toward data-driven, adaptive, and cloud-enabled training ecosystems, where real geospatial data, remote participation, and AI-supported scenario generation are increasingly integrated [34,40,46].
Recent advances in adaptive VR-based training that integrate AI-driven content adaptation and real-time feedback demonstrate promising enhancements to trainee engagement and personalized learning pathways [47,48]. Empirical work by FLAIM Systems in 2024 [33,49] further illustrates how multimodal data fusion, combining physiological, behavioral, and spatial sensing, can support adaptive scenario pacing and assessment in immersive environments.
Unlike commercially available VR firefighter training platforms, which primarily target tactical firefighter skill development, the proposed system focuses specifically on incident commander decision-making and C2 processes. Its key novelty lies in the integration of real GIS layers and UAV-derived spatial data into an immersive command interface, enabling spatially grounded decision-making across multi-hazard scenarios. This distinguishes the present approach from existing systems that rely predominantly on synthetic or pre-modeled virtual environments without direct linkage to real geospatial datasets.
It is important to emphasize that the proposed system is not intended to replace commercially available VR firefighter training platforms. Commercial systems typically focus on tactical firefighter skills, procedural drills, and individual task training. In contrast, the system presented in this study addresses incident commander decision-making and spatial information integration at the command-and-control level. These approaches are complementary rather than interchangeable. The integrated GIS–UAV–VR environment is particularly suited for command training, scenario planning, and spatial decision rehearsal, while commercial platforms remain valuable for operational and procedural skill development.

4.7. Limitations

The methodological design intentionally prioritizes ecological validity and operational realism over strict experimental control. Consequently, the study focuses on perceptual and cognitive outcomes observed within a realistic training setting rather than on laboratory-style performance metrics. This approach was chosen to reflect authentic command conditions while acknowledging the limits of controlled experimental measurement.
Situational awareness was not measured using standardized instruments such as SAGAT or SART. The study therefore evaluates perceived spatial situational understanding rather than directly measured situational awareness in the sense defined by Endsley’s theory.
The study design did not include a traditional training control group for comparison with the integrated GIS–UAV–VR environment. As a result, the research does not quantify the incremental benefit of this approach relative to conventional training methods. Instead, the study provides exploratory evidence of perceived training value within the participant group. Future research should incorporate controlled comparisons or longitudinal performance tracking to more precisely evaluate the relative advantages of integrated digital training environments.
The evaluation relied primarily on self-reported perceptual measures of realism, preparedness, and decision-making confidence and did not include objective operational performance indicators such as response time, resource allocation efficiency, or error rates.
The limitations of this study are consistent with those reported in broader immersive training research. First, situational awareness was evaluated indirectly rather than through validated real-time situational awareness measurement instruments. Second, the study relied on self-reported questionnaire data, which may introduce subjective bias. Third, hazard dynamics were intentionally simplified to focus on command cognition rather than detailed physical modelling. Finally, long-term transfer of training effects to real operational performance was not directly assessed, which remains a key challenge in immersive training validation [24,41].
The sample size used in this study reflects the limited availability of certified incident commanders undergoing national command training during the study period. Consequently, the results are interpreted as exploratory evidence of training effects within a representative professional group rather than as population-level statistical generalization.

4.8. Future Research Directions

Future research should focus on integrating objective performance metrics, biometric monitoring, and real-time physiological feedback into command training environments, as increasingly explored in recent XR research [50,51,52]. Further work should also examine multi-user, inter-agency cloud-based training, AI-driven adaptive scenario generation, and long-term learning transfer through longitudinal field studies [40,46,53]. Finally, research into ethical design, accessibility, and psychological safety will remain essential for responsible deployment of highly realistic immersive training systems [54,55].

5. Conclusions

This study presented the design and experimental evaluation of an integrated GIS–UAV–VR command-and-control (C2) training environment for fire-service incident commanders. The results confirmed that the proposed training concept provides a high level of perceived realism and significantly supports preparedness for command decision-making across different emergency scenarios, including wildland fire, hazardous materials transport accidents, and flood response.
The findings indicate that situational awareness (in the sense of improved spatial situational understanding) emerges as a cognitive outcome of multi-source spatial information integration rather than as an isolated measurable variable. The combined use of UAV-derived aerial imagery and GIS-based spatial layers within an immersive VR environment enabled more effective interpretation of spatial relationships, faster initial incident assessment, improved hazard identification, and better anticipation of incident development. These effects were observed consistently across different levels of command experience.
Scenario-specific analyses further showed that the usefulness of individual digital tools depends on the spatial and risk characteristics of the incident type, confirming the importance of flexible, multi-hazard training environments for contemporary fire-service command education. Cluster analysis indicated that both less experienced and highly experienced commanders benefit from the integrated environment, though in different ways, supporting its applicability for both competence development and advanced tactical rehearsal.
From a practical perspective, the proposed training environment represents a safe, scalable, and cost-effective complement to traditional field-based exercises, particularly for spatially extensive and high-risk emergency scenarios that are difficult to reproduce under real conditions. The ability to repeatedly train complex command tasks under controlled but realistic conditions contributes directly to improved operational preparedness.
From a methodological standpoint, this study indicates the feasibility of evaluating immersive command training environments using a combination of perceived realism, preparedness, decision-making confidence, and derived situational awareness, rather than relying solely on direct situational awareness metrics. This approach reflects the real cognitive processes underlying emergency command and provides a robust framework for future training evaluation studies.
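As a minimal illustration of the evaluation approach described above, the sketch below computes the Spearman rank correlation used to relate perceived realism and preparedness (reported in this study as ρ = 0.71). The questionnaire scores in the usage example are hypothetical, and the pure-Python implementation (average ranks for ties, Pearson correlation of the ranks) is only a sketch; in practice a statistical package such as SciPy would also supply the p-value.

```python
def ranks(values):
    """Return 1-based ranks, averaging ranks over tied values."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # extend j over a run of tied values
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of the 1-based positions i+1 .. j+1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r


def spearman_rho(x, y):
    """Spearman rank correlation: Pearson correlation of the rank vectors.

    Assumes equal-length inputs with non-constant values (no guard for
    zero variance, to keep the sketch short).
    """
    rx, ry = ranks(x), ranks(y)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)


if __name__ == "__main__":
    # Hypothetical post-training Likert scores (1-5) for two perceived outcomes
    realism = [3, 4, 4, 5, 2, 5, 3, 4]
    preparedness = [2, 4, 5, 5, 3, 4, 3, 5]
    print(f"Spearman rho = {spearman_rho(realism, preparedness):.2f}")
```

The rank transformation makes the statistic appropriate for ordinal questionnaire data such as Likert scales, where Pearson correlation on the raw scores would assume interval-level measurement.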
The proposed GIS–UAV–VR command-training environment provides a scalable and realistic framework for improving situational awareness, decision-making confidence, and preparedness of fire service incident commanders.

Author Contributions

Conceptualization, A.M. and D.K.; methodology, D.H. and A.M.; software, D.H.; validation, D.H., A.M. and D.K.; formal analysis, D.K.; investigation, D.H.; resources, D.H.; data curation, D.H. and A.M.; writing—original draft preparation, D.H.; writing—review and editing, D.H., A.M. and D.K.; visualization, D.H.; supervision, D.K. and A.M.; project administration, D.K.; funding acquisition, D.K. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the Slovak Research and Development Agency under the contract No. APVV-22-0030.

Institutional Review Board Statement

Not applicable.

Data Availability Statement

The data presented in this study are available upon reasonable request from the corresponding author. The data are not publicly available due to restrictions related to the protection of participants’ privacy and the confidentiality of professional training procedures.

Acknowledgments

The authors would like to thank the participating fire service units and incident commanders for their engagement in the experimental training and evaluation. The authors also acknowledge the technical support provided during the preparation of the GIS datasets, UAV data acquisition, and the development of the virtual reality training environment. Special thanks are extended to the instructors and technical staff who assisted with the organization and execution of the simulation-based training sessions.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
C2	Command-and-Control
DEM	Digital Elevation Model
GIS	Geographic Information System
ICS	Incident Command System
IQR	Interquartile Range
PCA	Principal Component Analysis
SOP	Standard Operating Procedure
UAV	Unmanned Aerial Vehicle
VR	Virtual Reality
XR	Extended Reality

References

  1. Endsley, M.R. Toward a Theory of Situation Awareness in Dynamic Systems. Hum. Factors 1995, 37, 32–64. [Google Scholar] [CrossRef]
  2. Klein, G. Naturalistic Decision Making. Hum. Factors 2008, 50, 456–460. [Google Scholar] [CrossRef] [PubMed]
  3. Flin, R.; O’Connor, P.; Crichton, M. Safety at the Sharp End: A Guide to Non-Technical Skills; Ashgate: Farnham, UK, 2008. [Google Scholar] [CrossRef]
  4. Chuvieco, E.; Aguado, I.; Yebra, M.; Nieto, H.; Salas, J.; Martín, M.P.; Vilar, L.; Martínez, J.; Martín, S.; Ibarra, P.; et al. Development of a Framework for Fire Risk Assessment Using Remote Sensing and Geographic Information System Technologies. Ecol. Modell. 2010, 221, 46–58. [Google Scholar] [CrossRef]
  5. Finney, M.A. FARSITE: Fire Area Simulator—Model Development and Evaluation; U.S. Department of Agriculture: Ogden, UT, USA, 1998. [CrossRef]
  6. Thompson, M.P.; Calkin, D.E. Uncertainty and Risk in Wildland Fire Management. J. Environ. Manag. 2011, 92, 1895–1909. [Google Scholar] [CrossRef]
  7. Lattimer, B.Y.; Huang, X.; Delichatsios, M.A.; Levendis, Y.A.; Kochersberger, K.; Manzello, S.; Frank, P.; Jones, T.; Salvador, J.; Delgado, C.; et al. Use of Unmanned Aerial Systems in Outdoor Firefighting. Fire Technol. 2023, 59, 2961–2988. [Google Scholar] [CrossRef]
  8. Merino, L.; Caballero, F.; Martínez-de-Dios, J.R.; Maza, I.; Ollero, A. An Unmanned Aircraft System for Automatic Forest Fire Monitoring and Measurement. J. Intell. Robot. Syst. 2012, 65, 533–548. [Google Scholar] [CrossRef]
  9. Colomina, I.; Molina, P. Unmanned Aerial Systems for Photogrammetry and Remote Sensing: A Review. ISPRS J. Photogramm. Remote Sens. 2014, 92, 79–97. [Google Scholar] [CrossRef]
  10. Tang, L.; Shao, G. Drone remote sensing for forestry research and practices. J. For. Res. 2015, 26, 791–797. [Google Scholar] [CrossRef]
  11. Adams, S.M.; Friedland, C.J. A Survey of Unmanned Aerial Vehicle (UAS) Usage for Imagery Collection in Disaster Research and Management. In Proceedings of the 9th International Workshop on Remote Sensing for Disaster Response; Stanford University: Stanford, CA, USA, 2011. [Google Scholar]
  12. Bliss, J.P.; Tidwell, P.D.; Guest, M.A. The Effectiveness of Virtual Reality for Administering Spatial Navigation Training to Firefighters. Presence Teleoperators Virtual Environ. 1997, 6, 73–86. [Google Scholar] [CrossRef]
  13. Kinateder, M.; Ronchi, E.; Nilsson, D.; Kobes, M.; Müller, M.; Pauli, P. Virtual Reality for Fire Evacuation Research. Ann. Comput. Sci. Inf. Syst. 2014, 2, 313–321. [Google Scholar] [CrossRef]
  14. Juřík, V.; Uhlík, O.; Snopková, D.; Kvarda, O.; Apeltauer, T.; Apeltauer, J. Analysis of the Use of Behavioral Data from Virtual Reality for Calibration of Agent-Based Evacuation Models. Heliyon 2023, 9, e14275. [Google Scholar] [CrossRef] [PubMed]
  15. Cha, M.; Han, S.; Lee, J.; Choi, B. A Virtual Reality-Based Fire Training Simulator Integrated with Fire Dynamics Data. Fire Saf. J. 2012, 50, 12–24. [Google Scholar] [CrossRef]
  16. Çoban, M.; Bolat, Y.I.; Göksu, İ. The Potential of Immersive Virtual Reality to Enhance Learning: A Meta-Analysis. Educ. Res. Rev. 2022, 36, 100452. [Google Scholar] [CrossRef]
  17. Endsley, M.R.; Jones, D.G. Designing for Situation Awareness: An Approach to User-Centered Design, 3rd ed.; CRC Press: Boca Raton, FL, USA, 2025. [Google Scholar] [CrossRef]
  18. Neal, S.; Palen, L.; Anderson, K.; Anderson, C.; Harper, R. Disaster response training in immersive virtual environments: The role of individual and collaborative learning. Disasters 2014, 38, 437–458. [Google Scholar] [CrossRef]
  19. Ronchi, E.; Nilsson, D. Fire Evacuation in High-Rise Buildings: A Review of Human Behaviour and Modelling Research. Fire Sci. Rev. 2013, 2, 7. [Google Scholar] [CrossRef]
  20. Salas, E.; Wildman, J.L.; Piccolo, R.F. Using Simulation-Based Training to Enhance Management Education. Acad. Manag. Learn. Educ. 2009, 8, 559–573. [Google Scholar] [CrossRef]
  21. Hemmatjo, R.; Hajaghazadeh, M.; Allahyari, T.; Zare, S.; Kazemi, R. The Effects of Live-Fire Drills on Visual and Auditory Cognitive Performance among Firefighters. Ann. Glob. Health 2020, 86, 144. [Google Scholar] [CrossRef]
  22. Reis, V.; Neves, C. Application of Virtual Reality Simulation in Firefighter Training for the Development of Decision-Making Competences. In Proceedings of the 2019 International Symposium on Computers in Education (SIIE); IEEE: Tomar, Portugal, 2019; pp. 1–6. [Google Scholar] [CrossRef]
  23. Williams-Bell, F.M.; Kapralos, B.; Hogue, A.; Murphy, B.; Weckman, E. Using Serious Games and Virtual Simulation for Training in the Fire Service: A Review. Fire Technol. 2015, 51, 553–584. [Google Scholar] [CrossRef]
  24. Bellemans, M.; Lammens, D.; De Sloover, J.; De Vleeschauwer, T.; Schoofs, E.; Jordens, W.; Van Steenhuyse, B.; Mangelschots, J.; Selleri, S.; Hamesse, C.; et al. Training Firefighters in Virtual Reality. In Proceedings of the 2020 International Conference on 3D Immersion (IC3D 2020); Institute of Electrical and Electronics Engineers Inc.: Brussels, Belgium, 2020; p. 9376336. [Google Scholar] [CrossRef]
  25. Kinateder, M.; Müller, M.; Jost, M.; Mühlberger, A.; Pauli, P. Social Influence in a Virtual Tunnel Fire—Influence of Conflicting Information on Evacuation Behavior. Appl. Ergon. 2014, 45, 1649–1659. [Google Scholar] [CrossRef]
  26. Feng, Z.; González, V.A.; Amor, R.; Lovreglio, R.; Cabrera-Guerrero, G. Immersive Virtual Reality Serious Games for Evacuation Training and Research: A Systematic Literature Review. Comput. Educ. 2018, 127, 252–266. [Google Scholar] [CrossRef]
  27. Berthiaume, M.; Kinateder, M.; Emond, B.; Cooper, N.; Obeegadoo, I.; Lapointe, J.-F. Evaluation of a Virtual Reality Training Tool for Firefighters Responding to Transportation Incidents with Dangerous Goods. Educ. Inf. Technol. 2024, 29, 14929–14967. [Google Scholar] [CrossRef]
  28. Stefan, H.; Mortimer, M.; Horan, B. Evaluating the Effectiveness of Virtual Reality for Safety-Relevant Training: A Systematic Review. Virtual Real. 2023, 27, 2839–2869. [Google Scholar] [CrossRef]
  29. Caldas, O.I.; Sánchez, N.; Mauledeux, M.; Avilés, O.F.; Rodríguez-Guerrero, C.D. Leading Presence-Based Strategies to Manipulate User Experience in Virtual Reality Environments. Virtual Real. 2022, 26, 1507–1518. [Google Scholar] [CrossRef]
  30. Weidinger, J. What Is Known and What Remains Unexplored: A Review of the Firefighter Information Technologies Literature. Int. J. Disaster Risk Reduct. 2022, 78, 103115. [Google Scholar] [CrossRef]
  31. Fanfarová, A.; Mariš, L. Simulation Tool for Fire and Rescue Services. Procedia Comput. Sci. 2017, 108, 160–165. [Google Scholar] [CrossRef]
  32. Smith, D.L.; Horn, G.P.; Petruzzello, S.J.; Fahey, G.; Woods, J.; Fernhall, B. Clotting and Fibrinolytic Changes After Firefighting Activities. Med. Sci. Sports Exerc. 2014, 46, 448–454. [Google Scholar] [CrossRef]
  33. Oliveira, J.; Aires Dias, J.; Correia, R.; Pinheiro, R.; Reis, V.; Sousa, D.; Agostinho, D.; Simões, M.; Castelo-Branco, M. Exploring Immersive Multimodal Virtual Reality Training, Affective States, and Ecological Validity in Healthy Firefighters: Quasi-Experimental Study. JMIR Serious Games 2024, 12, e53683. [Google Scholar] [CrossRef]
  34. Mystakidis, S.; Besharat, J.; Papantzikos, G.; Christopoulos, A.; Stylios, C.; Agorgianitis, S.; Tselentis, D. Design, Development, and Evaluation of a Virtual Reality Serious Game for School Fire Preparedness Training. Educ. Sci. 2022, 12, 281. [Google Scholar] [CrossRef]
  35. Calandra, D.; Pratticò, F.G.; Migliorini, M.; Verda, V.; Lamberti, F. A Multi-Role, Multi-User, Multi-Technology Virtual Reality-Based Road Tunnel Fire Simulator for Training Purposes. In Proceedings of the 16th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (VISIGRAPP 2021)—GRAPP; SciTePress: Setúbal, Portugal, 2021; pp. 96–105. [Google Scholar] [CrossRef]
  36. Hoey, I. Exploring the Impact of VR on Firefighter Training with FLAIM’s Innovative Platform. 2024. Available online: https://internationalfireandsafetyjournal.com/exploring-the-impact-of-vr-on-firefighter-training-with-flaims-innovative-platform/ (accessed on 15 March 2025).
  37. Gronowski, A.; Arness, D.C.; Ng, J.; Qu, Z.; Lau, C.W.; Catchpoole, D.; Nguyen, Q.V. The Impact of Virtual and Augmented Reality on Presence, User Experience and Performance of Information Visualisation. Virtual Real. 2024, 28, 133. [Google Scholar] [CrossRef]
  38. Seo, S.; Park, H.; Koo, C. Impact of Interactive Learning Elements on Personal Learning Performance in Immersive Virtual Reality for Construction Safety Training. Expert Syst. Appl. 2024, 251, 124099. [Google Scholar] [CrossRef]
  39. Wheeler, S.G.; Hoermann, S.; Lukosch, S.; Lindeman, R.W. Design and Assessment of a Virtual Reality Learning Environment for Firefighters. Front. Comput. Sci. 2024, 6, 1274828. [Google Scholar] [CrossRef]
  40. FLAIM Systems. Firefighter Training. Available online: https://flaimsystems.com/firefighter-training (accessed on 7 March 2025).
  41. Bicalho, D.R.; Piedade, J.M.N.; Matos, J.F. The Use of Immersive Virtual Reality in Educational Practices in Higher Education: A Systematic Review. In Proceedings of the 25th International Symposium on Computers in Education (SIIE 2023); IEEE: Setúbal, Portugal, 2023. [Google Scholar] [CrossRef]
  42. Moro, C.; Štromberga, Z.; Raikos, A.; Stirling, A. The Effectiveness of Virtual and Augmented Reality in Health Sciences and Medical Anatomy. Anat. Sci. Educ. 2017, 10, 549–559. [Google Scholar] [CrossRef] [PubMed]
  43. Morenilla Rodriguez, R.M.; Cintora Sanz, A.M. FASTER Project First Responder Advanced Technologies for Safe and Efficient Emergency Response. Int. J. Integr. Care 2022, 22, 462. [Google Scholar] [CrossRef]
  44. Hancko, D.; Majlingova, A.; Kačíková, D. Integrating Virtual Reality, Augmented Reality, Mixed Reality, Extended Reality, and Simulation-Based Systems into Fire and Rescue Service Training: Current Practices and Future Directions. Fire 2025, 8, 228. [Google Scholar] [CrossRef]
  45. Fiore, S.M.; Wiltshire, T.J. Technology as Teammate: Examining the Role of External Cognition in Support of Team Cognitive Processes. Front. Psychol. 2016, 7, 1531. [Google Scholar] [CrossRef]
  46. Emaliyawati, E.; Ibrahim, K.; Trisyani, Y.; Nuraeni, A.; Sugiharto, F.; Miladi, Q.N.; Abdillah, H.; Christina, M.; Setiawan, D.R.; Sutini, T. Enhancing Disaster Preparedness Through Tabletop Disaster Exercises: A Scoping Review of Benefits for Health Workers and Students. Adv. Med. Educ. Pract. 2025, 16, 1–11. [Google Scholar] [CrossRef]
  47. Tadikonda, S.K.K. Cognitive Immersion: AI-Driven Frameworks for Enhanced Virtual Reality Experiences. World J. Adv. Res. Rev. 2025, 26, 479–487. [Google Scholar] [CrossRef]
  48. Uhl, J.C.; Zechner, O.; Baetzner, A.; Birrenbach, T.; Egger-Lampl, S.; Schrom-Feiertag, H.; Tscheligi, M. Mixed reality training for medical first responders: System evaluation and recommendations. Virtual Real. 2025, 29, 69. [Google Scholar] [CrossRef]
  49. Pillajo, E.; Mourgues, C.; Neyem, A.; González, V.A. An Interface Design Method Based on Situation Awareness and Immersive Analytics for Augmented and Mixed Reality Decision Support Systems in Construction. Appl. Sci. 2025, 15, 7820. [Google Scholar] [CrossRef]
  50. Kman, N.E.; Price, A.; Berezina-Blackburn, V.; Patterson, J.; Maicher, K.; Way, D.P.; McGrath, J.; Panchal, A.R.; Luu, K.; Oliszewski, A.; et al. First Responder Virtual Reality Simulator to Train and Assess Emergency Personnel for Mass Casualty Response. JACEP Open 2023, 4, e12903. [Google Scholar] [CrossRef]
  51. XVR Simulation. Crisis Management and Command Training Platform. Available online: https://medsimhealth.com/xvrsimulation/ (accessed on 7 March 2025).
  52. Alshowair, A.; Bail, J.; AlSuwailem, F.; Mostafa, A.; Abdel-Azeem, A. Use of Virtual Reality Exercises in Disaster Preparedness Training: A Scoping Review. SAGE Open Med. 2024, 12, 20503121241241936. [Google Scholar] [CrossRef]
  53. Tsai, M.-H.; Chang, Y.-L.; Shiau, J.-S.; Wang, S.-M. Exploring the Effects of a Serious Game-Based Learning Package for Disaster Prevention Education: The Case of Battle of Flooding Protection. Int. J. Disaster Risk Reduct. 2020, 43, 101393. [Google Scholar] [CrossRef]
  54. Wijkmark, C.H.; Metallinou, M.M.; Heldal, I. Remote Virtual Simulation for Incident Commanders—Cognitive Aspects. Appl. Sci. 2021, 11, 6434. [Google Scholar] [CrossRef]
  55. FLAIM Systems. Emergency Services|FLAIM Training Solutions. Available online: https://flaimsystems.com/emergency-services (accessed on 4 February 2026).
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
