2.1. Research Background
In the literature, a disaster is defined as a natural, technological, or human-induced event that causes physical, economic, and social losses for all or certain segments of society, halts or interrupts normal life and human activities, and exceeds the coping capacity of the affected society; in other words, a disaster is not the event itself but its consequences [
14]. A disaster also refers to a sudden incident, such as an accident or natural event, that leads to significant damage or loss of life [
15]. It is likewise described as an occurrence that causes major damage and loss to property or to people themselves [
16]. Disasters are described as threats to life, welfare, material assets, and the environment that arise from the excessive effects of either natural processes or technology [
14]. Disasters are classified as natural and human-made [
17]. Natural disasters are divided into two distinct categories: (1) slow-onset disasters and (2) rapid-onset disasters [
14]. Slow-onset disasters include severe cold, drought, famine, etc., while rapid-onset disasters include earthquakes, floods, landslides, avalanches, storms, tornadoes, volcanoes, fires, etc. [
14,
17]. Nuclear, biological, and chemical accidents, transportation accidents, industrial accidents, accidents caused by overcrowding, and situations involving migrants and displaced persons can be given as examples of man-made disasters [
17].
Disaster management is a collective process undertaken by society to prevent disasters, reduce their impacts, and respond effectively when they occur, with the ultimate goal of creating safer living environments for affected communities [
14]. It is an interdisciplinary field that brings together practitioners and academics, volunteers and professionals, NGOs, government agencies, and international organizations to coordinate efforts in saving lives and reducing losses under challenging conditions [
18]. The primary aim is to minimize the social and physical impacts of large-scale events by reducing loss of life and damage to infrastructure and property [
19,
20,
21]. Both natural and man-made disasters require timely assistance and mitigation measures to protect populations [
22]. Key objectives include preparedness, rapid response and recovery, efficient resource allocation, timely repair of damage, and minimizing negative impacts on society [
23]. Disaster management, therefore, encompasses two sets of activities: mitigation and preparedness before a disaster, and response and recovery afterward [
14]. These four phases—mitigation, preparedness, response, and recovery—constitute the widely recognized disaster management cycle [
22,
23,
24,
25].
The first two phases take place before the disaster occurs, while the last two phases take place after the disaster [
26]. The mitigation phase includes all actions designed to reduce, minimize, or eliminate the likelihood, impact, and consequences of future hazards, disasters, and emergencies, with long-term benefits. Preparedness begins when an emergency or disaster is expected or imminent; it involves measures taken to mitigate the effects of disasters, including providing the tools needed to increase the chances of survival and to minimize the financial losses of those who may be affected by a disaster or who assist those affected [
23,
25,
26]. The response phase encompasses emergency actions taken during and after the impact of a disaster to mitigate or eliminate the effects of events that have already occurred or are imminent, and to prevent further suffering and financial losses [
25,
26]. Recovery involves repairing damage, restoring services, rebuilding facilities, and helping victims return to a normal, or better, level of functioning after a disaster and its consequences [
23,
25,
26].
In recent years, digital tools have found a wide range of applications in disaster management studies. Among these tools, DTs offer various advantages. A DT is a software model representing a real-world object in its specific context [
27]. DTs are intelligent and constantly developing virtual or digital copies or representations of a real-world physical object, product, or process [
28,
29]. DT, also recognized as digital vision and digital mapping, is an advanced digital transformation technology that integrates multi-physics, multi-scale, and multi-disciplinary features [
30,
31]. A DT is a constantly changing, self-improving virtual model or simulation of a real-world subject or object that uses both real-time and historical data to represent the exact situation of its physical twin at any point in time [
32].
The idea underlying DT technology was first articulated by David Gelernter in 1991 with the publication of his book Mirror Worlds [
32]. The DT concept was later proposed in 2002 by Professor Michael W. Grieves, who presented a virtual prototype of a physical product in his Product Lifecycle Management course at the University of Michigan and named the concept the ‘Mirror Spaces Model’ [
32,
33,
34,
35]. In 2006, the name ‘Mirror Spaces Model’ was changed to ‘Information Mirroring Model’, and the term ‘DT’ was first used in the draft version of NASA’s technology roadmap in 2010 [
32].
Studies reveal that the DT was first applied in the aviation and aerospace industry and the military [
28,
32,
33,
35]. Furthermore, Crespi et al. [
27] stated that DT was designed for manufactured items such as airplanes, automobiles, industrial facilities, and urban environments.
DTs are applied in intelligent transportation, agriculture, education, industrial production, the metaverse, construction, smart city development, and energy, owing to the ongoing development of new information technologies, including the industrial internet, the Internet of Things, cloud computing, big data, artificial intelligence, and extended reality [
28,
30,
32,
33,
36].
Some studies state that the DT model has three main components: (1) the physical entity/space, (2) the virtual entity/space, and (3) the connection mechanism/data and software [
29,
30,
32]. Wu et al. [
31] stated that the data composition of DT encompasses product design, process, production, service, and retirement and scrap data. Beyond its structure, the DT plays key roles: (1) simulating, monitoring, diagnosing, predicting, and controlling how products form and behave in the real world, (2) fostering coordination across all stages of a product’s life cycle and production innovation, and (3) creating a data foundation for digital product records, quality tracking, and ongoing product development improvement [
34].
DTs are intended to increase competitiveness, productivity, and efficiency, and to ensure that all users and stakeholders have access to the DT model [
37,
38].
Previous research has revealed that DT technology has many advantages. Tao et al. [
36] indicated that benefits include increased visibility, reduced energy consumption, increased user engagement, reduced time-to-market, maintenance of optimal operations, and integration of information technologies. Singh et al. [
32] also stated that the benefits of DT include rapid prototyping and product redesign, cost-effectiveness, anticipation of problems and system planning, optimization of solutions and improved maintenance, accessibility, greater safety than the physical counterpart, reduced waste, documentation, communication, and training opportunities. Additionally, DT can connect the physical and virtual worlds in real time, which enables a more accurate and comprehensive assessment of unforeseen and unpredictable situations [
32].
Some studies mention the benefits of DT in the construction industry. Kineber et al. [
28] stated that DTs can optimize design, simulate the construction process, detect potential problems, monitor real-time performance throughout the building’s life cycle, support better decision making, and enable preventive predictions. The added benefits are that construction sites using DTs are more efficient, produce higher quality work, and achieve higher levels of health and safety performance. DT technologies improve the performance and sustainability of construction projects and the safety of construction workers [
29].
In the literature, the advantages as well as the challenges of DT are mentioned. Tao et al. [
36] pointed out that technology and infrastructure, standards and specifications, the cognitive and technical level of people, cost control and management, support tools, intellectual property rights, cybersecurity, and insufficient development of DT are among the challenges of DT. Singh et al. [
32] also noted that the novelty of technology, problems with data, lack of standards and regulations, time and cost, and life-cycle incompatibility are among the challenges.
DT tools and platforms can be grouped into four categories: modeling, simulation, operation, and interaction [
31]. The modeling category includes SolidWorks, 3D Studio Max, and AutoCAD; the simulation category includes Simulink, Twin Builder, Azure, 3D Experience, and Teamcenter; the operation category includes Plant Simulation, Eclipse Ditto, Watson IoT, and IBM; and the interaction category includes Unity 3D and the Predix Platform.
Recent debates have intensified about the distinction between a Building Information Model (BIM) and DT. BIM is a digital representation of the physical and functional characteristics of a building and helps to manage the design, delivery, construction, maintenance, and operation processes [
39,
40]. Tomek and Matějka [
41] defined BIM as a contemporary method of construction management that enables users to develop multidimensional, object-based parametric models as a tool for managing construction projects for their whole life cycle. Toprak and Demirkesen [
42] aimed to summarize the fundamental concepts of BIM, provide a comparative assessment using examples from geotechnical and infrastructure projects, and develop recommendations regarding potential integrations with software. The methodology involved a literature review, software comparison, case study analysis, and a mandatory screening. The results showed that BIM provides time and cost savings and improves coordination and safety; that it strengthens nD lifecycle management; that regulatory requirements accelerate BIM adoption; and that GIS/Lean integrations and IFC-based multi-scale models facilitate decision support, asset management, and visualization.
Beyond BIM, a DT is a dynamic and interactive model that digitizes a physical asset, such as a building or an infrastructure system, and uses analytics and real-time data to provide information on its performance, maintenance requirements, and potential problems while continuously updating itself [
43]. Deng et al. [
44] introduced a five-tier ladder taxonomy that illustrates the transformation from static BIM to autonomous DT; this evolution is shown in
Figure 2.
Ford and Wolf [
45] proposed and tested a conceptual Smart City Model based on Digital Twins (SCDT) for disaster management. The researchers emphasized that DT should include images that reflect community characteristics and component interactions and encompass multiple computer simulation models that combine different images and data to improve disaster management decisions. Furthermore, two threats (integration and fatigue risks) were identified that could be mitigated by focusing on disaster management during the development of SCDT. Fan et al. [
46] presented the Disaster City Digital Twin as a unifying paradigm for the integration of different research streams related to AI in disasters and crisis informatics. The DT paradigm consists of four main components: (1) multi-data sensing for data collection, (2) data integration and analytics, (3) multi-actor game-theoretic decision making, and (4) dynamic network analysis, which together enhance disaster response coordination and emergency management effectiveness.
Ariyachandra and Wedawatta [
47] aimed to reveal the evolution of the Digital Twin–Smart City (DTSC) approach in disaster risk management, map its applications and benefit/obstacle areas across the disaster life cycle, and propose a future research agenda. As a method, 72 articles identified through a systematic keyword search and filtering in Web of Science and Scopus between 2011 and 2021 were analyzed. The findings indicated that DTSC and its core technological components significantly improved early warning, situational awareness, resource allocation, and decision support; however, deficiencies in data quality, standards and sharing, regulation, privacy and fairness, multi-stakeholder coordination, IT/communication infrastructure resilience, and institutional and social capacity limited scalability. Therefore, real-time integration, ethical/privacy compliance, cost-effectiveness and impact measurement, and field validation in different contexts were identified as priority research gaps.
Lee et al. [
48] conducted a three-stage (DT application, classification of underground utility tunnel database, and algorithm application) study aimed at creating and implementing scenarios for identifying alterations in spatial entities, intending to forecast potential disasters and the growth of underground utility tunnels. Yu and He [
49] presented the scientific concept of DT-guided construction of Intelligent Disaster Prevention and Mitigation for Infrastructure (IDPMI). The researchers examined how IDPMI is implemented under DT standards and analyzed the progress and challenges related to each technology. The study emphasized that DT-related technology is being developed through smart design, construction, maintenance, and disaster management, and outlined a five-layer framework (data, object, technology, connection, and service) for the future development of DT-IDPMI.
Yun et al. [
50] proposed a DT software architecture that operates using similarity-based hybrid modeling for reliable disaster management systems. Using the 2016 North American forest fires and multi-channel environmental data as a case study, the researchers defined the context, dataset, and scenarios. The physics-based, data-driven, and proposed hybrid models were run under equal conditions and compared in the same scenarios. The hybrid model was significantly more accurate than the physics-based approach across all climate/scenario classes and produced lower error than the purely data-driven approach. Furthermore, similarity-based training data selection reduced error compared to random selection and provided more stable performance under data scarcity.
Kamari and Ham [
51] presented a new vision-based digital twinning and threat evaluation framework for disaster (hurricane) preparedness at construction sites. The study identified potential wind-borne debris at 2D and 3D levels using visual data obtained from construction sites and assessed the risks using kinetic energy analysis and site-based heat maps. Lee et al. [
52] focused on developing a methodology for implementing a DT of an underground utility tunnel, susceptible to disasters and accidents, and validating their methodology through real implementation. This study provided a step-by-step process to create the DT technology by integrating the three main layers: data acquisition (infrastructure and sensor DTs), digital modeling (BIM, CityGML, 3D grid), and service (disaster detection, prediction, and asset management).
Vuoto et al. [
43] provided an overview of the DT idea in the field of architecture, engineering, construction, and management, and suggested a prototype of the DT paradigm aimed at protecting the structural integrity of heritage structures. Based on the principles of the Venice Charter, the study emphasized real-time monitoring, minimal intervention, and rapid disaster response supported by digital technologies such as IoT, 3D scanning, robotics, BIM, and AI.
In summary, the literature shows that DT applications in disaster management are becoming widespread in various fields, from smart cities to manufacturing, infrastructure, and cultural heritage preservation [
27]. Common technologies such as IoT, sensors, 3D scanning, BIM, and artificial intelligence are being used; however, the real challenge lies in processing digital data through automated processes and converting it into actionable information. This emphasizes that DTs are not merely monitoring and prediction tools but also provide a strategic and holistic framework that enhances safety and resilience and improves decision making processes.
2.2. Research Method
This study adopts a case study methodology supported by a mixed-methods approach, combining spatial data analysis, simulation modeling, and qualitative assessment. The research design focuses on the development and implementation of an open-source digital twin model tailored for disaster management and sustainability evaluation in the Cayirova region of Turkey. The methodological framework is structured around three core components: data-driven modeling, disaster scenario simulation, and sustainability evaluation. This framework is operationalized through the development of a DT model that replicates the physical and social realities of the Cayirova urban environment. The study’s design allows for both qualitative and quantitative assessments of disaster risks and response strategies, providing a comprehensive basis for sustainable urban planning and policy development.
2.2.1. Hardware Selection
The hardware configuration for this study was carefully selected to balance computational performance, cost-effectiveness, energy efficiency, and scalability. Given the dual nature of the project—requiring both local development and remote accessibility—a hybrid hardware approach was adopted (
Figure 3). This comprises a high-performance personal workstation for model development, simulation, and data processing, and a cloud-based server architecture to support real-time access, scalability, and public demonstrations of the DT system.
The hybrid setup enables seamless integration between local and cloud environments. IoT sensors collect seismic, structural, environmental, and energy data, which are transmitted via LoRaWAN, Wi-Fi/5G, and MQTT protocols for preprocessing and simulation on the local workstation. The processed data are then synchronized with the cloud environment, where a PostGIS-based server hosted on AWS/DigitalOcean handles data storage, scenario analysis, and visualization through web dashboards and risk maps. This configuration supports multi-user access, remote monitoring, and efficient disaster information sharing through municipal and emergency management systems.
The local workstation, equipped with an AMD Ryzen 9 7950X CPU, NVIDIA RTX 4070 Ti GPU, 64 GB DDR5 RAM, and a 2 TB NVMe SSD, provides sufficient computing capacity for large-scale 3D modeling, spatial analysis, and high-fidelity simulation tasks. The cloud server, provisioned with 8 vCPUs, 32 GB RAM, and 1 TB SSD, offers dynamic scalability, remote accessibility, and off-site redundancy. Together, these components ensure stable real-time performance for digital twin operations and provide a flexible foundation for future extensions, including live IoT data integration and AI-driven risk assessment.
The hardware and cloud configurations were selected to ensure stable and efficient real-time performance during DT operations. The local workstation enables high-fidelity simulation and rendering in Unreal Engine, while the AWS-based PostGIS architecture supports seamless spatial data storage, query processing, and synchronization between local and cloud environments. This infrastructure, therefore, meets the operational requirements of the proposed framework without necessitating external high-performance computing resources.
2.2.2. Data Integration and Processing Pipeline
The integration workflow renders heterogeneous inputs interoperable at the building level, which serves as the anchor entity for exposure and risk estimation. Five input families are ingested—vector, raster, 3D, point-cloud, and tabular/demographic—and processed through a staged PostGIS schema (staging → curated → published). All spatial data are harmonized to a common reference system (UTM Zone 35N, WGS 84), units are standardized (length in meters, area in square meters), and timestamps are retained to support temporal auditing and refresh.
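As an illustration of this staging and harmonization step, a minimal Python sketch using GeoPandas and SQLAlchemy is given below; the connection string, file path, schema, and table names are placeholders rather than the project's actual configuration.

```python
# Minimal sketch: reproject an input layer to the project CRS (UTM Zone 35N, WGS 84)
# and load it into the staging schema of the PostGIS database.
# Assumptions: GeoPandas and SQLAlchemy are available; credentials and names are placeholders.
import geopandas as gpd
import pandas as pd
from sqlalchemy import create_engine

PROJECT_CRS = "EPSG:32635"  # UTM Zone 35N on the WGS 84 datum

engine = create_engine("postgresql+psycopg2://dt_user:***@localhost:5432/cayirova_dt")

def stage_vector_layer(path: str, table: str) -> gpd.GeoDataFrame:
    """Read a vector source, harmonize its CRS, time-stamp it, and stage it in PostGIS."""
    gdf = gpd.read_file(path)
    gdf = gdf.to_crs(PROJECT_CRS)                    # common reference system
    gdf["ingested_at"] = pd.Timestamp.now(tz="UTC")  # retained for temporal auditing
    gdf.to_postgis(table, engine, schema="staging", if_exists="replace", index=False)
    return gdf

if __name__ == "__main__":
    stage_vector_layer("data/building_footprints.gpkg", "building_footprints")
```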
Vector layers (building footprints, road centerlines, utilities, administrative boundaries, land use) undergo geometry validation to remove self-intersections and slivers, topology correction to enforce closed rings and planarity where required, and attribute enrichment from authoritative registries. Raster layers (DEM/DTM, satellite imagery, orthophotos, land-cover) are georeferenced, resampled to a project baseline resolution, and tiled for efficient retrieval. Point clouds (LiDAR and photogrammetric) are filtered, ground/non-ground classified, and surface models are derived to reconcile elevation with footprints. Three-dimensional city objects are converted to consistent formats and LoD conventions where available (e.g., CityGML/CityJSON), and their semantics are preserved and mapped to the project data dictionary (
Figure 4).
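For the vector footprints, the geometry-validation step can be expressed compactly as in the sketch below, assuming Shapely 2.x via GeoPandas; the sliver-area threshold is an illustrative value rather than a project constant.

```python
# Sketch: repair invalid footprint geometries and discard empties and slivers.
# Assumes Shapely >= 2.0 via GeoPandas; the area threshold is illustrative only.
import geopandas as gpd
from shapely.validation import make_valid

MIN_FOOTPRINT_AREA_M2 = 5.0  # assumed sliver threshold (project units are metres)

def clean_footprints(gdf: gpd.GeoDataFrame) -> gpd.GeoDataFrame:
    gdf = gdf.copy()
    # Repair self-intersections and other invalid rings.
    gdf["geometry"] = gdf.geometry.apply(
        lambda geom: geom if geom.is_valid else make_valid(geom)
    )
    # make_valid may return geometry collections; keep polygonal parts only.
    gdf = gdf.explode(index_parts=False)
    gdf = gdf[gdf.geometry.geom_type.isin(["Polygon", "MultiPolygon"])]
    # Drop empty geometries and slivers below the area threshold.
    gdf = gdf[~gdf.geometry.is_empty & (gdf.geometry.area >= MIN_FOOTPRINT_AREA_M2)]
    return gdf.reset_index(drop=True)
```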
Building inventories from municipal sources are normalized to controlled vocabularies for typology, lateral system, material, and occupancy; year-built and height attributes are range-checked and converted to canonical units. A stable identifier service issues or reconciles a Building_ID, and cross-references to parcel and address registries are maintained (Parcel_ID, Address_ID) to support multi-source joins and longitudinal updates.
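The fragment below illustrates, under simplifying assumptions, how attribute normalization and Building_ID reconciliation can be expressed; the vocabulary entries, value ranges, identifier format, and registry structure are hypothetical, not the project's controlled lists.

```python
# Sketch: normalize inventory attributes to controlled vocabularies and reconcile/issue
# stable Building_IDs. Vocabulary, ranges, and ID format are assumptions for illustration.
import uuid
import pandas as pd

OCCUPANCY_VOCAB = {          # illustrative controlled vocabulary (raw -> canonical)
    "konut": "residential",
    "residential": "residential",
    "ticari": "commercial",
    "commercial": "commercial",
    "karma": "mixed_use",
}

def normalize_inventory(df: pd.DataFrame, id_registry: dict) -> pd.DataFrame:
    df = df.copy()
    # Map free-text occupancy labels onto the controlled vocabulary.
    df["occupancy"] = (
        df["occupancy_raw"].str.strip().str.lower().map(OCCUPANCY_VOCAB).fillna("unknown")
    )
    # Range-check year built and convert height to canonical metres.
    df.loc[~df["year_built"].between(1900, 2025), "year_built"] = pd.NA
    df["height_m"] = pd.to_numeric(df["height_raw"], errors="coerce")
    # Reconcile against the identifier registry keyed on (Parcel_ID, Address_ID);
    # a new stable Building_ID is issued only when no prior match exists.
    keys = df["Parcel_ID"].astype(str) + "|" + df["Address_ID"].astype(str)
    df["Building_ID"] = [
        id_registry.setdefault(k, f"BLD-{uuid.uuid4().hex[:10]}") for k in keys
    ]
    return df
```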
Census and household statistics are aligned to a single reference year, variable definitions are harmonized, and metadata about the collection method and uncertainty are retained. To permit building-level inference while preserving official totals, areal units are pre-indexed with geometry hashes and linked to administrative hierarchies.
Integration proceeds in two steps. First, structural tables are joined to footprints by Building_ID; when absent, a spatial key is created using centroid-in-polygon with a small tolerance, falling back to nearest-neighbor along the street axis, with match confidence recorded. Many-to-one cases (e.g., multi-entrance buildings) are resolved by rules based on area and entrance counts; one-to-many cases (e.g., multi-parcel buildings) are retained via a bridge table to avoid information loss. Second, demographic variables are disaggregated from census units to buildings using population-weighted areal interpolation refined dasymetrically by floorspace and declared occupancy. Re-aggregation of building-level estimates to census boundaries reproduces official marginals within acceptable error, ensuring statistical coherence across scales.
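A simplified GeoPandas sketch of these two steps follows; it assumes inventory records carry point locations, and the tolerance, column names, and the use of floorspace as the sole dasymetric weight are illustrative. The nearest-neighbor fallback and the re-aggregation check are omitted for brevity.

```python
# Sketch of the two integration steps, assuming GeoDataFrames already in the project CRS.
# Column names, the tolerance, and the floorspace-only weighting are illustrative.
import geopandas as gpd

def spatial_key_join(footprints: gpd.GeoDataFrame,
                     inventory: gpd.GeoDataFrame,
                     tolerance_m: float = 2.0) -> gpd.GeoDataFrame:
    """Step 1 fallback: attach inventory records lacking a Building_ID to footprints
    via point-in-polygon within a small tolerance, recording match confidence."""
    buffered = footprints[["Building_ID", "geometry"]].copy()
    buffered["geometry"] = buffered.geometry.buffer(tolerance_m)
    joined = gpd.sjoin(inventory, buffered, how="left", predicate="within")
    joined["match_confidence"] = joined["Building_ID"].notna().map(
        {True: "spatial_within_tolerance", False: "unmatched"}
    )
    return joined

def disaggregate_population(census: gpd.GeoDataFrame,
                            buildings: gpd.GeoDataFrame) -> gpd.GeoDataFrame:
    """Step 2: population-weighted areal interpolation from census units to buildings,
    using floorspace as the dasymetric weight so that unit totals are preserved."""
    buildings = buildings.copy()
    buildings["floorspace"] = buildings.geometry.area * buildings["floor_count"].clip(lower=1)
    pts = buildings[["Building_ID", "floorspace", "geometry"]].copy()
    pts["geometry"] = pts.geometry.representative_point()
    pts = gpd.sjoin(pts, census[["census_id", "population", "geometry"]],
                    how="inner", predicate="within")
    unit_floorspace = pts.groupby("census_id")["floorspace"].transform("sum")
    pts["pop_est"] = pts["population"] * pts["floorspace"] / unit_floorspace
    return buildings.merge(pts[["Building_ID", "pop_est"]], on="Building_ID", how="left")
```

Because each building's estimate is its share of the unit's floorspace, summing estimates back to the census unit reproduces the official total, which is the coherence property described above.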
Automated range and domain checks guard against implausible values (e.g., negative height, construction year outside plausible bounds). Referential integrity is enforced across entity tables (buildings–parcels–blocks), and cross-layer consistency tests verify spatial logic (e.g., footprints within administrative limits; floor-area totals consistent with dwelling counts). Missing attributes are imputed hierarchically: deterministic fills from municipal archives where available; otherwise, hot-deck or model-based estimates using neighborhood, typology, and era predictors. Every ETL run produces a provenance ledger (source, transformation, version hash) and a versioned snapshot of the curated schema to ensure reproducibility.
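The range and domain checks can be encoded as declarative rules evaluated on every ETL run, as in the sketch below; the attribute bounds are illustrative, not the project's validated thresholds.

```python
# Sketch: declarative range/domain checks on the curated building table.
# The bounds are illustrative; violations feed the provenance ledger for review.
import pandas as pd

QC_RULES = {
    "height_m":    lambda s: s.between(2, 300),
    "year_built":  lambda s: s.between(1900, 2025),
    "floor_count": lambda s: s.between(1, 60),
}

def run_qc(buildings: pd.DataFrame) -> pd.DataFrame:
    """Return one row per violating record and rule (missing values are handled
    separately by the hierarchical imputation step)."""
    issues = []
    for column, rule in QC_RULES.items():
        bad = buildings[~rule(buildings[column]) & buildings[column].notna()]
        issues.append(pd.DataFrame({
            "Building_ID": bad["Building_ID"],
            "column": column,
            "value": bad[column],
            "check": "range/domain",
        }))
    return pd.concat(issues, ignore_index=True)
```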
The curated database exposes materialized views optimized for analytics and visualization: building-level exposure tables; precomputed risk surfaces; and 3D tiles/feature services for the dashboard and game-engine interfaces. This decouples heavy ETL from interactive use, enabling near real-time querying while supporting periodic refresh as new sources (e.g., IoT or updated imagery) arrive.
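As a sketch of how such a view might be published and refreshed, the following uses psycopg2 against the curated schema; the schema, view, and column names are assumptions rather than the implemented data model.

```python
# Sketch: publish and refresh a building-level exposure view in PostGIS.
# Schema, view, and column names are assumptions for illustration.
import psycopg2

DDL = """
CREATE MATERIALIZED VIEW IF NOT EXISTS published.building_exposure AS
SELECT b."Building_ID" AS building_id,
       b.geometry,
       b.occupancy,
       b.year_built,
       b.height_m,
       b.pop_est AS estimated_population
FROM curated.buildings AS b;
"""

def refresh_exposure_view(dsn: str) -> None:
    """Create the view if needed, then refresh it after each ETL run."""
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        cur.execute(DDL)
        cur.execute("REFRESH MATERIALIZED VIEW published.building_exposure;")

# Example usage (credentials are placeholders):
# refresh_exposure_view("dbname=cayirova_dt user=dt_user password=*** host=localhost")
```

Separating the one-off view creation from the per-run refresh is what decouples interactive dashboards and game-engine interfaces from the heavier ETL stages described above.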
Figure 5 shows an end-to-end integration and processing pipeline. Heterogeneous sources (vector, raster, 3D, point cloud, and tabular/demographic) are cleaned, standardized, and harmonized to a common CRS and data dictionary. Structural inventories are reconciled to footprints through stable identifiers and spatial joins; census variables are downscaled to buildings via population-weighted, dasymetrically refined interpolation. Quality control, referential integrity, and provenance logging precede publication of materialized views and 3D tiles to support analysis and visualization in the DT.
In order to enhance the capabilities of the DT for sustainable disaster management, this study proposes the integration of real-time data acquisition through the deployment of sensors and sensor networks. The incorporation of Internet of Things (IoT) technologies enables the DT to move beyond static simulations and dynamically adjust to evolving real-world conditions. Sensor nodes placed strategically across the Cayirova region monitor critical parameters such as ground motion, structural health, air quality, and environmental conditions. These sensors serve as primary data acquisition points, facilitating real-time monitoring of disaster indicators and contributing to a more responsive and intelligent disaster management platform.
The design of the sensor network includes diverse types of sensors tailored to specific hazards and monitoring needs. Seismic sensors (accelerometers) detect ground vibrations indicative of seismic activity; structural health monitoring sensors (strain gauges and displacement meters) measure the integrity of bridges and high-rise structures; environmental sensors capture temperature, humidity, air quality, and water levels; and energy meters track the performance of critical power infrastructure. Together, these components provide the DT with a continuous flow of physical-environmental data essential for early detection, situational awareness, and post-event assessment.
Data transmission from sensor nodes to the DT platform is facilitated through multiple communication technologies. For long-range and low-power communication, LoRaWAN (Long Range Wide Area Network) offers an energy-efficient solution, while Wi-Fi and 5G support bandwidth-intensive, high-density urban data streams. The MQTT protocol, a lightweight and efficient messaging standard, forms the backbone of real-time sensor communication, ensuring timely and reliable data delivery to the cloud infrastructure.
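To illustrate the MQTT path, a minimal subscriber on the ingestion side might look like the following; paho-mqtt 2.x is assumed, and the broker address, topic hierarchy, and payload fields are placeholders.

```python
# Sketch: minimal MQTT subscriber on the ingestion side (paho-mqtt >= 2.0 assumed).
# Broker address, topic hierarchy, and payload fields are placeholders.
import json
import paho.mqtt.client as mqtt

BROKER_HOST = "dt-broker.example.org"    # assumed broker endpoint
TOPIC = "cayirova/sensors/+/telemetry"   # e.g., cayirova/sensors/accel-017/telemetry

def on_connect(client, userdata, flags, reason_code, properties=None):
    client.subscribe(TOPIC, qos=1)       # QoS 1 for at-least-once delivery

def on_message(client, userdata, msg):
    reading = json.loads(msg.payload)
    # Hand the reading to the preprocessing/storage layer (e.g., the PostGIS writer).
    print(msg.topic, reading.get("sensor_id"), reading.get("value"))

client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
client.on_connect = on_connect
client.on_message = on_message
client.connect(BROKER_HOST, port=1883, keepalive=60)
client.loop_forever()
```

QoS 1 is chosen here because occasional duplicate readings are preferable to silently lost ones during disaster conditions; deduplication can occur downstream in the database layer.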
To optimize system performance, edge computing devices such as Raspberry Pi or NVIDIA Jetson Nano units act as IoT gateways. These devices perform preliminary data processing near the source, reducing transmission load and enabling faster local decision making. Edge-level filtering and aggregation ensure that the system remains operational even under intermittent network conditions, maintaining essential functions during disaster events.
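The edge-level behaviour described above reduces to windowed filtering and aggregation before publishing, as in the sketch below; the sensor driver is simulated, and the window length, plausibility bound, broker address, and topic naming are assumptions.

```python
# Sketch: edge-gateway loop (e.g., on a Raspberry Pi) that filters implausible readings,
# aggregates them over a window, and publishes one summary message per window.
# paho-mqtt >= 2.0 assumed; thresholds, window, broker, and topic are illustrative.
import json
import random
import statistics
import time
import paho.mqtt.client as mqtt

WINDOW_S = 10.0        # aggregation window in seconds
MAX_ABS_ACCEL = 20.0   # crude plausibility bound for acceleration readings (m/s^2)

def read_sensor() -> float:
    """Placeholder for the gateway's sensor driver; returns a simulated reading here."""
    return random.gauss(0.0, 0.5)

def run_gateway(sensor_id: str, broker: str = "dt-broker.example.org") -> None:
    client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
    client.connect(broker, 1883)
    client.loop_start()                       # network loop runs in a background thread
    buffer, window_start = [], time.monotonic()
    while True:
        value = read_sensor()
        if abs(value) <= MAX_ABS_ACCEL:       # edge-level filtering
            buffer.append(value)
        if time.monotonic() - window_start >= WINDOW_S and buffer:
            payload = {
                "sensor_id": sensor_id,
                "mean": statistics.fmean(buffer),
                "peak": max(buffer, key=abs),
                "n": len(buffer),
            }
            client.publish(f"cayirova/sensors/{sensor_id}/telemetry",
                           json.dumps(payload), qos=1)
            buffer, window_start = [], time.monotonic()
        time.sleep(0.1)                       # sampling interval

if __name__ == "__main__":
    run_gateway("accel-017")
```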
Figure 6 illustrates the vertical sensor and IoT data pipeline that supports the digital twin framework. It shows how sensor data are collected and preprocessed at the edge, transmitted through IoT communication networks, integrated within a cloud-based PostGIS environment, and visualized in the DT platform for monitoring and decision support. Feedback loops between end users and the DT further enhance adaptability, allowing configuration updates and control actions to be transmitted back to the sensor layer.
The integration of sensors and sensor networks presents several advantages. It significantly enhances the situational awareness of the DT, enabling real-time assessment of disaster impacts and supporting proactive decision making. Furthermore, it offers the opportunity to develop early warning systems that can provide critical lead time for emergency response. Continuous monitoring of key parameters contributes to long-term resilience building by identifying vulnerabilities before they lead to catastrophic failures. However, the deployment also introduces certain challenges. Initial costs for sensor procurement and installation may be substantial, and ongoing maintenance is required to ensure data quality and system reliability. Network resilience must also be considered, as communication infrastructures may be affected during disasters.