Review

Implementation Maturity Levels of Digital Twin Technology and Data Content Design for Flood Digital Twin

Department of Crisis Management, Faculty of Security Engineering, University of Žilina, Univerzitná 8215/1, 010 26 Žilina, Slovakia
* Author to whom correspondence should be addressed.
Smart Cities 2026, 9(2), 28; https://doi.org/10.3390/smartcities9020028
Submission received: 22 July 2024 / Revised: 13 January 2026 / Accepted: 27 January 2026 / Published: 6 February 2026

Highlights

What are the main findings?
  • This article proposes a new division of digital twin maturity levels into phases.
  • The article also proposes a categorization of data types and a specific way of obtaining, implementing and monitoring them for digital twin technology aimed at mitigating the risk and impact of floods.
What are the implications of the main finding?
  • The article describes essential elements for building proper, effective and rapid flood management tools.
  • The article gives recommendations on how the digital twin concept can help in the prevention and mitigation of floods.

Abstract

This study examines the potential of digital twin (DT) technology to strengthen urban security, with a specific focus on flood risk management in smart cities. A DT is understood as a virtual representation of real-world assets and processes, continuously synchronised with data from the physical environment. Building on an analysis of the existing DT literature and maturity assessment, identified operational requirements and the authors’ expertise in crisis management, this study proposes a structured set of DT maturity levels with stage boundary conditions and illustrative measurable indicators, and designs a maturity-driven data content model for a flood-oriented DT. The framework identifies essential data layers, sensing requirements and integration mechanisms necessary for representing hydrological, infrastructural and environmental conditions at operationally meaningful update frequencies. This study further outlines the conceptual architecture of a flood DT and discusses its potential to support prediction, situational awareness and decision making across crisis management phases. By providing recommendations for DT implementation and highlighting opportunities for future development, this study contributes to ongoing efforts to enhance the resilience and safety of urban areas through advanced digital technologies.

1. Introduction

In the current period of digital revolution and data transformation, characterised by the transition from mechanical and analogue systems to automated digital electronics [1], the role of data and the implementation of the digital twin (DT) concept are becoming increasingly important. Although the DT concept was initially applied mainly for industrial purposes [2], it has gradually expanded into a variety of other domains, including crisis management (CM). Crisis management focuses, among other things, on the prevention of, and preparedness for, emergencies affecting cities and their inhabitants [3]. DT technology offers significant potential to enhance resilience against disasters by using information and communication technologies to monitor urban activity, detect changes in the behaviour of observed phenomena, and provide timely predictions of future developments [4].
Several authors have addressed questions related to the applicability of DTs in crisis management within smart cities, contributing valuable insights to the field. Examples include the works of Ariyachandra (2023) [4], Park (2022) [5], Hyun et al. (2024) [6] and Khan et al. (2022) [7]. The topic of DT implementation is not new and has been discussed for many years. Most studies focus on specific applications within particular scenarios, which deepens domain-specific expertise but also highlights the multidisciplinary and complex nature of this field, which requires sustained attention and research. For example, Habib et al. (2025) [8] examined the enhancement of smart city resilience during earthquakes, and Park et al. (2024) [9] explored the use of DTs in flood modelling.
At present, however, it is equally important to address the broader question of DT development itself. This includes the progression from initial stages to more advanced levels of autonomy. These developmental aspects, along with the latest technological advances and the potential integration of artificial intelligence, form a central theme of this study. By proposing a structured set of DT development phases and designing corresponding identification questions, we aim to support the recognition of recurring developmental patterns. This may help establish initial development goals, technical specifications and the creation of DT-enabled products suitable for real-world use.
Within the context of the Slovak Republic, the practical implementation of DT concepts for improving urban safety, particularly in the field of disaster management, is still largely absent, both in theory and in practice. There is currently a lack of DT-based solutions aimed at increasing urban resilience and community protection. For this reason, this study focuses specifically on flood protection and proposes a structured approach to DT data content and design. Analyses and risk assessments relating to emergencies in Slovakia have traditionally relied on expert judgment and conventional methods. The integration of DTs and artificial intelligence, therefore, represents an important opportunity to modernise and improve these processes. This study aims to contribute to the ongoing academic and professional discussion by providing assessment-oriented insights that link multidisciplinary perspectives on DT maturity, data integration and crisis management decision support. It may be viewed as an extension of the authors’ previous work on digital twins for enhancing the security and resilience of smart cities by advancing a staged DT maturity framework and translating it into practical requirements for flood risk management. Recent work in the Slovak context shows that the successful deployment of advanced digital technologies for safety and resilience must build on both technical and organisational foundations. Kollarova et al. proposed a conceptual model of key security and privacy aspects for smart cities in Slovakia, emphasising that sustainable smart city solutions depend not only on IoT infrastructures and data flows but also on appropriate norms, policies and meaningful human control that preserve the trust and engagement of residents [10].

2. Materials and Methods

The proposed DT maturity levels were examined in relation to existing DT maturity and assessment models and mapped to the conceptual architecture presented in Figure 1. This comparison made it possible to identify shared developmental characteristics as well as conceptual gaps that are particularly relevant to crisis management applications. The formulation of the flood-oriented DT data content model was shaped by a maturity-driven requirements logic and practical operational requirements, and informed by the authors’ long-term experience with advanced simulation platforms, including MIKE Flood, ArcGIS and VR-Forces. This tool-informed perspective supported the specification of data layers, model coupling needs, and decision-loop considerations required for a flood-oriented DT. These tools, which support hydrodynamic and geospatial modelling as well as scenario-based representation, provided valuable insight into the types of data, model structures and behavioural characteristics necessary to represent flood dynamics in a way that is meaningful for crisis decision making.
The methodological approach used in this study combined an extensive analysis of the domestic and international literature and domain expertise in crisis management, simulation technologies and smart city development. This analytical process enabled the identification of the current state of DT use, the limitations that inhibit wider adoption and the future needs that should guide research priorities. Feasibility considerations and the specific characteristics of the Slovak environment were also taken into account, with the intention of creating a framework that could be used locally and adapted by neighbouring countries with similar conditions. This study further examined available options for sensory data provision and assessed their relevance, availability and capacity to support real-time DT functionality. Methods applied throughout included analysis, synthesis, and criterion-based comparison; professional observations were used to contextualise requirements and to check practical feasibility of the proposed data and modelling elements. The resulting framework is intended to support developers, decision makers and other stakeholders by clarifying the opportunities, capabilities and strategic purpose of adopting a flood-oriented DT. It highlights the organisational, technological and analytical elements required for effective deployment and reflects the growing potential of DT systems to strengthen resilience, enhance preparedness and improve crisis management practice in smart city environments.
The literature used in this study consisted of peer-reviewed sources to ensure traceability of concepts and reproducibility of comparisons. This study also takes into account the increasingly important role of artificial intelligence in the development and operation of digital twins, particularly in relation to prediction, optimisation and decision support within crisis management. Coherence between the reviewed studies was ensured through systematic comparison with attention to conceptual consistency and relevance to the objectives of the proposed DT maturity and flood DT design framework. Priority was given to research demonstrating operational applicability, including studies that specify staged capability progressions, measurable requirements, or validation and governance considerations relevant to crisis management and flood risk decision support.
A differential analysis was conducted to enable a structured comparison of digital twin maturity models. The literature set was identified in Scopus using the following constraints: document type “article”, English language, and publication years 2021–2025. The initial search string entered in Scopus was “Digital Twin Maturity Levels”. The initial query returned 66 records. Because the objective was to compare maturity approaches with demonstrated scientific visibility, the initial screening prioritised highly cited publications.
Highly cited digital twin publications from 2021 to 2025 (Scopus) were additionally screened, and papers were considered eligible if they explicitly proposed or operationalised a maturity model or staging framework. Where the most cited set did not contain a sufficient number of maturity-focused papers, additional highly cited maturity-model publications were included until the final corpus reached N = 7.
Eligibility screening followed the criteria (C1 to C11) defined in Section 3.2. Publications were included in the assessment only if they defined DT maturity through explicit stages/levels with a clear progression logic and/or provided a maturity assessment framework with operational elements, such as rubrics, indicators, questionnaires, or a scoring method that could be coded against Table 2. Publications were excluded if they addressed DT only at the level of definitions, general benefits, architectures, applications, or reviews without an explicit maturity staging or assessment instrument, because such studies cannot be evaluated consistently using the criteria-based coding scheme.
Overall, 40 publications were screened in full text against the inclusion/exclusion rules prior to coding. Although additional digital twin publications were identified, only seven studies met the inclusion criteria of providing explicit maturity staging and/or a maturity assessment framework suitable for criterion-based assessment. The screening was not expanded further because the remaining candidate publications had zero citations at the time of analysis; citation count was therefore used as a practical threshold to maintain focus on contributions with measurable scholarly uptake within the defined scope.
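As an illustration only, the screening logic described above can be sketched in a few lines of code; the records, field names and citation figures below are hypothetical and do not reproduce the actual Scopus export:

```python
# Hypothetical bibliographic records; real screening used the Scopus export.
records = [
    {"title": "DT maturity staging for assets", "citations": 41, "has_maturity_model": True},
    {"title": "DT architecture review",         "citations": 18, "has_maturity_model": False},
    {"title": "Uncited staging proposal",       "citations": 0,  "has_maturity_model": True},
]

def screen(recs: list, min_citations: int = 1) -> list:
    """Keep cited publications that propose an explicit maturity model."""
    eligible = [r for r in recs
                if r["has_maturity_model"] and r["citations"] >= min_citations]
    return sorted(eligible, key=lambda r: r["citations"], reverse=True)

print([r["title"] for r in screen(records)])  # ['DT maturity staging for assets']
```

The citation threshold and the maturity-model flag correspond to the two screening rules stated above; in practice each flag would be assigned during full-text review.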

3. Results

A DT is a technology that consists of a computerised digital model of specific physical elements or processes that are interconnected by data links, in which real-time data synchronisation takes place between physical and virtual environments.
The literature describes several definitions for the term DT: “A digital twin is a set of computer-generated models that map a physical object into a virtual space. Both physical and virtual elements exchange information to monitor, simulate, predict, diagnose, and control the state and behaviour of the physical object in the virtual space” [14]. Another definition presents DT as follows: “A dynamic virtual representation of a physical object or system, usually in multiple phases of its life cycle. It uses real data, simulation or machine learning models combined with data analysis to enable understanding, learning and reasoning. DT can be used to answer “what if” questions and should be able to represent knowledge in an intuitive way” [15].
As a virtual digital model of an asset, process or service in the real world, a DT creates a connection between the physical and digital worlds. This connection and communication between reality and digitised functions enable real-time monitoring, intuitive understanding of situations, and the ability to identify problems before they occur. Simulation and predictive modelling in the DT facilitate optimal decision making through a portfolio of models, analyses, scenarios, etc. [16].
As several definitions of the DT appear in the literature, what is terminologically satisfactory for one field of activity may be unsatisfactory for another. A similar situation can be expected in the field of security and crisis management. For this reason, when proposing a definition of a DT for crisis management purposes, it is advisable to emphasise clarity, operational relevance and consistency with established crisis management terminology.
A possible recommended definition is as follows: A digital twin in crisis management is a virtual representation of real-world objects, events and processes that directly correspond to conditions in the physical environment. Its mechanisms support a continuous two-way flow of information that enables the monitoring and management of the real environment. The digital twin reacts to deviations or emerging instability in the monitored system, and these changes are immediately reflected in its virtual state. Through real-time assessments, modelling functions and scenario analyses, the digital twin provides early warning and supports the prediction of crisis developments or security breaches.
The basic parts of a DT, as a virtual replica of physical systems, were originally defined as consisting of a physical object part, a virtual object part, and the interconnection between them [17]. Other authors gradually started to discuss parts such as two-way communication and feedback mechanisms, which are also expected to be common features of different types of DT [1].
The basic parts of the DT aimed at security and monitoring of the real city environment to identify potential risks should consist of the following:
  • Physical part is the real city environment, such as infrastructure, public spaces, critical assets, people and traffic flows, in which actual events and processes take place and which is represented by the DT.
  • Virtual part or virtual representation is an analogous description, a logical model of the asset, and represents the transformed real environment in digital form.
  • Data-driven services increase the convenience, reliability and productivity of the system.
  • Connections characterise the digital links and transmission mechanism between data sources, enabling the transfer and control of data from the real environment of the physical part to the virtual environment.
  • Services provide functions such as simulation, decision making, monitoring and control of the physical object, as well as a means of storing data.
  • Technologies are installed in the real environment and serve as the means of data collection.
  • State is the specific condition in which a unique physical asset or process is located at a specific time.
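To make the interplay of these parts concrete, the following minimal sketch (with entirely hypothetical class and field names) shows a physical asset, its virtual representation, one direction of the data connection, and a simple monitoring service:

```python
from dataclasses import dataclass, field

@dataclass
class PhysicalAsset:
    """Physical part: a real-world asset observed by sensors (hypothetical)."""
    asset_id: str
    water_level_m: float  # example monitored quantity

@dataclass
class VirtualTwin:
    """Virtual part: a digital representation mirroring the asset's state."""
    asset_id: str
    state: dict = field(default_factory=dict)

    def sync(self, asset: PhysicalAsset) -> None:
        # Connection: one direction of the two-way data link.
        self.state["water_level_m"] = asset.water_level_m

    def check_anomaly(self, threshold_m: float) -> bool:
        # Service: a simple monitoring rule evaluated on the virtual state.
        return self.state.get("water_level_m", 0.0) > threshold_m

asset = PhysicalAsset("gauge-01", water_level_m=2.4)
twin = VirtualTwin("gauge-01")
twin.sync(asset)
print(twin.check_anomaly(threshold_m=2.0))  # True: level exceeds threshold
```

A full DT would add the reverse control link, persistent storage and richer services; the sketch only fixes the vocabulary of the bullet list above.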
Due to its popularity in many areas, DT is also finding widespread application in smart cities and crisis management. The use of smart maps and geospatial virtual infrastructure through DT can contribute to the overall prosperity of cities and their inhabitants. According to Pramotedham, CEO of Esri Singapore, “only with a digital twin can government agencies effectively analyse what can be done with data to improve the lives of citizens, create economic opportunities and revitalise a closer community” [18]. It, therefore, has great potential for CM in the security field [19]. These technologies enable the exchange of information, the monitoring of public spaces, the resolution of security challenges and the enhancement of urban security and safety. Knowledge of the possibilities for applying DTs, supported by the available literature, should therefore be considered key.

3.1. Maturity Levels of Digital Twins

Several different terms are used in the literature to assess and determine the stage of development of DTs. In the available literature, we found terms like Maturity Levels of the DT (DT Levels), DT Classes [13,20] or DT Integration Levels, DT Classification, etc., all of which attempt to characterise the functionality of different levels of the DT [1].
DT maturity levels are used to assess and categorise the level of development and implementation of DT within the environment into which it is to be deployed. The levels provide a structured approach to assessing the maturity of the development of DT initiatives [21] based on defined criteria and stages of progress. The current maturity level of a DT deployer is defined by one of the maturity stages, which represents a level based on a defined stage of progress according to the criteria. The maturity levels consist of several maturity stages, each representing a progressive stage of DT development.
Each maturity level adds functionality to the DT, enabling progression to more complex stages and streamlining decision-making processes by automating sequential tasks and fully exploiting the DT's potential.
Several authors have dealt with DT maturity levels in their contributions, defining their phases and characteristics. As the topic is still relatively new, DTs are increasingly used across various areas, and no established framework exists for defining the individual phases of DT maturity, the following overview presents selected authors who have addressed this issue, together with their proposed phases of DT maturity levels (Figure 1).
Figure 1 shows the definition of different types of DT levels by different authors. Individual levels are divided in terms of content into categories:
  • No Twin
  • Data Collection
  • Analysis
  • Prediction
  • Optimization
  • Autonomy
Due to the inconsistency in the available literature, we also present our own proposal for DT maturity levels. Table 1 provides comprehensive information that should not differ depending on the specific application area. The following overview, into which the principles of the knowledge pyramid are integrated, defines the hierarchical framework of the different DT maturity levels.
The elaborated DT maturity-level design according to the above model consists of a total of seven levels. Other authors recognise and define, in most cases, five or six levels. In the proposal, level zero is referred to as preparation and initialization, as the level before the start of DT development. The other levels, from the Conceptualization Phase to the Autonomous Phase, are the levels during which development progresses and the DT is gradually improved technologically. The design incorporates the knowledge pyramid principle into the conceptualization stage.
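Assuming the seven levels proposed above, the strictly sequential progression can be captured in a short sketch; the identifier names are our own shorthand for the stage labels:

```python
from enum import IntEnum

class DTMaturity(IntEnum):
    """Shorthand identifiers for the seven proposed maturity levels (0-6)."""
    PREPARATION_INITIALIZATION = 0
    CONCEPTUALIZATION = 1
    DEVELOPMENT_DESCRIPTIVE = 2
    INTEGRATIVE_DIAGNOSTIC = 3
    PREDICTIVE_ANALYTICAL = 4
    OPTIMISATION = 5
    AUTONOMOUS = 6

def next_level(current: DTMaturity):
    """Progression is strictly sequential; returns None once autonomy is reached."""
    return DTMaturity(current + 1) if current < DTMaturity.AUTONOMOUS else None

print(next_level(DTMaturity.PREDICTIVE_ANALYTICAL).name)  # OPTIMISATION
```

Encoding the levels as an ordered type makes the "no skipped stages" assumption explicit, which is the premise of the phase-by-phase questions that follow.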
The following description briefly characterises the different stages of DT maturity, specific to each level, followed by questions that should be answered, especially before but also during the implementation of these phases, before moving on to the next level. The characteristics of each level provide a way to better understand the current level of DT development.
The preparation and initialization phase can be considered the zero level of DT maturity, the design level before the actual start of technology development. The phase begins with the identification of a problem or need, i.e., the intent to improve a certain condition or introduce a change, and is expected to start from a proper understanding of the project output, from the development and implementation of the DT through to its use.
Questions:
  • Is DT the answer to the identified problem or is there a need for a change in CM?
  • What should be the application area in which DT is to be implemented for CM decision support purposes?
  • For what purpose is DT to be used in the selected CM domain?
In the case of the practical creation of DT focused on a specific environment fulfilling certain functions, it is necessary to establish selection criteria that will take into account the overall concept of the proposal. In this case, the final choice and decision would be to create a flood DT.
The conceptualization phase is the first maturity level of the DT. It is mainly characterised by the creation of virtual digital models of the physical assets and the environment, as well as the descriptive analysis of the physical assets that correspond to the real environment. The model is based on specified data and parameters, representing the properties and state of reality. At this stage, the linking of the models to the real environment is not fully developed. To link the DT, either sensors already implemented in the real environment, which have not yet been used for this purpose but are a suitable solution, are considered, or new sensors capable of interconnecting the two environments are introduced.
Questions:
  • What are the primary data sources needed to create a digital model?
  • Are the data needed to create the digital visualization and DT model available, complete and reliable?
  • Is a virtual digital model of the real environment developed?
  • Are there opportunities for real-time integration of virtual and real environments?
In the case of a flood DT, a virtual digital model of the selected environment, or of an environment suitable for monitoring and predicting flood activity, would be created at this stage. The model takes into account specific data and parameters representing the properties and actual state of reality.
The development and descriptive phase is the second level of DT maturity. It focuses on capturing, collecting, and visualizing data from real-world environments. This level consists of updating the data in the DT according to a specified update frequency, using real-time statistics, ensuring an accurate representation of the current state of the physical twin.
Questions:
  • Are sensor data streams integrated in real time into the DT?
  • How often is the DT updated with real-time data?
  • What is the level of connectivity coverage between the virtual and real environments?
  • To what extent is the DT connected and synchronised with the physical twin?
This phase represents the collection of specific types of data focused on the early-stage DT according to the proposed framework.
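A minimal sketch of the update mechanism described for this phase is shown below; `read_sensor` is a hypothetical stand-in for a real sensor stream, and the update frequency is controlled by a single period parameter:

```python
import time

def read_sensor(t: int) -> dict:
    """Hypothetical sensor read: returns a timestamped state snapshot."""
    return {"timestamp": t, "water_level_m": 1.0 + 0.1 * t}

def run_updates(n_cycles: int, period_s: float = 0.0) -> list:
    """Refresh the twin's state n_cycles times at the configured frequency."""
    states = []
    for t in range(n_cycles):
        states.append(read_sensor(t))  # update the DT's state snapshot
        time.sleep(period_s)           # period_s sets the update frequency
    return states

history = run_updates(n_cycles=3)
print(len(history))  # 3
```

In an operational DT the loop would be event-driven or scheduled by the platform, and `period_s` would be chosen to match the operationally meaningful update frequency discussed in the abstract.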
In the integrative diagnostic phase, the DT, thanks to IoT coverage, establishes a connection with sensors transmitting real-time data, enabling condition monitoring and the detection of potential anomalies and problems. Incorporating historical data and prescriptive information into databases sets the stage for the use of advanced analytics and machine learning, with progressive recognition of real-world behavioural patterns.
Questions:
  • Is the visualization of historical data and information in DT complete?
  • Is a real-time synchronous link established to transfer current data?
  • Are algorithms for analytics, generation and machine learning implemented in the system?
The main essence of the phase is the creation of a connection between the real and virtual environment of the flood DT.
The predictive, analytical phase. Here, organisations use historical and real-time data, predictive analytics and machine learning to forecast future states and system performance, either through data-driven models or models based on physical processes, using defined thresholds. At the fourth stage of maturity, the DT has the prerequisites for creating scenarios of potential future developments by examining real situations in virtual space.
Questions:
  • Are recommendations and regulations implemented in DT, and how often are they updated?
  • To what extent does DT use predictive analysis techniques?
  • Can DT predict potential scenarios?
The scenarios created in the DT at this stage are fully capable of simulating the flood situation with regard to future developments in the form of the most diverse scenarios. When creating scenarios, past events are taken into account in the form of historical flood data integrated in the previous phases, as well as the current situation from the real environment.
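As a simple illustration of threshold-based prediction (not the article's actual models), a linear extrapolation of recent gauge readings against a warning threshold might look as follows; all values are hypothetical and in centimetres:

```python
def forecast_level(history_cm: list, steps_ahead: int) -> float:
    """Extrapolate the last observed trend forward by steps_ahead updates."""
    trend = history_cm[-1] - history_cm[-2]
    return history_cm[-1] + trend * steps_ahead

def flood_warning(history_cm: list, threshold_cm: float, steps_ahead: int) -> bool:
    """Raise a warning if the forecast exceeds the set threshold."""
    return forecast_level(history_cm, steps_ahead) > threshold_cm

readings_cm = [100, 120, 140]             # rising water level (hypothetical)
print(forecast_level(readings_cm, 3))      # 200
print(flood_warning(readings_cm, 190, 3))  # True
```

A mature flood DT would replace the linear trend with hydrodynamic or machine learning models; the sketch only shows how set thresholds turn a forecast into an actionable signal.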
The optimisation phase. In this phase, the DT is able to use prescriptive analysis and allows different scenarios, generated from the captured current data, to be explored while taking possible future directions into account. At maturity level 5, the DT can provide actionable insights and recommendations to support decision outcomes and resource optimisation.
Questions:
  • To what extent does our DT use prescriptive analysis techniques?
  • Can DT effectively simulate different scenarios?
  • Is DT capable of continuous optimization, learning and improvement?
Based on the knowledge made possible by various scenarios and other analytical tools, it is already possible to react to the potential danger in the real environment within this phase, through the overall optimization of the flood-oriented system.
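A prescriptive step of this kind can be sketched as a comparison of candidate response scenarios; the scenarios and cost/damage figures below are purely illustrative:

```python
# Hypothetical response scenarios with illustrative cost and damage figures.
scenarios = {
    "no_action":       {"cost": 0,   "expected_damage": 500},
    "deploy_barriers": {"cost": 120, "expected_damage": 150},
    "evacuate_zone_a": {"cost": 300, "expected_damage": 40},
}

def best_scenario(options: dict) -> str:
    """Prescriptive step: recommend the scenario with the lowest total impact."""
    return min(options, key=lambda k: options[k]["cost"] + options[k]["expected_damage"])

print(best_scenario(scenarios))  # deploy_barriers
```

Real prescriptive analysis would weight criteria beyond a simple sum (e.g., lives at risk, uncertainty of the forecast), but the structure of comparing simulated scenarios remains the same.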
The autonomous phase. This phase corresponds to the sixth maturity level, in which the DT is fully functional and ready for testing within the environment under investigation or, in the next step, capable of deployment and use. Advanced AI and machine learning algorithms are integrated, and the automation of the system improves. Within this phase, the DT should be ready to implement continuous evaluation and improvement of autonomous performance. The DT achieves comprehensive coverage and integrates with user stakeholders. With real-time data collected and linked to historical data, the DT achieves the ability to derive the optimal scenario, supporting informed, real-time decisions that lead to the execution of the necessary actions in the real environment.
Questions:
  • Is the DT fully autonomous and capable of evaluating and learning from changing data in real time, according to which it can infer the optimal scenario to support decision making and execution of necessary actions?
  • Is the DT integrated among user stakeholders?
In its autonomous phase of maturity levels, the flood digital twin is capable of real-time prediction, monitoring and analysis of flood risks, automatically generating proposals for measures to prevent damage and protect the population. Thanks to the integration of sensor data and artificial intelligence, the model can simulate the development of the flood situation, warn the relevant authorities and activate rescue systems in time, thereby effectively protecting the lives and property of citizens.
Achieving the autonomous phase is currently the highest form of DT. A DT development approach based on characterizing and understanding the maturity levels of DT will provide a transition between phases and allow for a better understanding of the system. Starting from the basic levels, the emphasis is on information replication, gradually moving towards the implementation of higher levels, where interaction, autonomy and innovation elements are integrated. At each stage, it is essential to pay attention to and highlight the associated challenges that the level poses.
The implementation of a DT for CM requires a gradual roll-out, which can be achieved by forming a strategic process. The steps of the process consider the succession of the different DT maturity levels, emphasizing the importance of each phase. In this way, technology deployment will be systematic and comprehensive, ensuring that the DT evolves in synchrony with the stated needs and objectives.
The strategic approach should be designed to be flexible and adaptable, to support the implementation planning process [22] to match the different dynamics of each phase of DT maturity. The duration of each level phase may not be identical and may depend on many factors [23]. Various factors, such as technical requirements, integration with existing systems, resource availability, government regulation, and so on, can influence the actual pace of implementation.
Although the maturity stages are described conceptually, the maturity classification becomes defensible only when each stage is linked to observable and measurable indicators. Assessment-oriented maturity models in the digital twin literature, therefore, operationalise maturity through indicator sets, rubrics, and measurable thresholds rather than stage labels alone, enabling consistent classification and comparison across implementations. However, the indicator set cannot be fully universal because digital twins differ by domain, purpose, and operational constraints; some indicators remain broadly applicable (e.g., data update frequency and integration latency, verification/validation practices, interoperability capability, and decision-loop integration), while others must be defined in a domain-specific manner (e.g., model skill metrics, uncertainty expressions, automation boundaries, or operator intervention modes that are meaningful for the specific system). For this reason, indicators should be specified and refined before implementation, as part of requirements and design, so that data, modelling, governance, and operational workflows are aligned with what will later be measured and audited. In this article, the indicator principle is implemented by defining a general indicator logic at the framework level and by illustrating, in Section 3.3, how the indicator set should be tailored and finalised for a specific digital twin application prior to implementation [24,25].
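The indicator principle can be illustrated with a small sketch in which a stage is claimed only when all of its measurable indicators are met; the indicator names and thresholds are hypothetical examples, not a normative set:

```python
# Hypothetical indicator sets per stage; thresholds are illustrative only.
STAGE_INDICATORS = {
    2: {"update_interval_s": lambda v: v <= 3600,   # at least hourly refresh
        "sensor_coverage":   lambda v: v >= 0.5},
    3: {"update_interval_s": lambda v: v <= 60,     # near-real-time sync
        "sensor_coverage":   lambda v: v >= 0.8,
        "anomaly_detection": lambda v: v is True},
}

def assess_stage(measurements: dict) -> int:
    """Return the highest stage whose indicators are all demonstrably met."""
    achieved = 1  # baseline: a conceptual model exists
    for stage in sorted(STAGE_INDICATORS):
        checks = STAGE_INDICATORS[stage]
        if all(name in measurements and ok(measurements[name])
               for name, ok in checks.items()):
            achieved = stage
        else:
            break  # progression is sequential: stop at the first unmet stage
    return achieved

print(assess_stage({"update_interval_s": 300, "sensor_coverage": 0.7}))  # 2
```

The sequential stop mirrors the staged progression logic above: a deployment cannot claim a higher level while a lower level's indicators remain unmet, which is what makes the classification defensible and auditable.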

3.2. Differential Analysis of Existing DT Maturity Models

This subsection establishes a systematic framework for comparing digital twin (DT) maturity models. Rather than providing a descriptive list of existing models, the comparison applies a common set of criteria and a transparent scoring rubric to evaluate each selected maturity framework in a consistent manner. The goal is to identify how models differ in their maturity logic, the DT capabilities they emphasise (e.g., data integration, modelling, analytics, interoperability), and the extent to which they provide practical assessment guidance rather than only conceptual stage labels.
The eleven criteria (C1–C11) were selected because they represent the core components that determine whether a digital twin maturity model is comparable to other maturity models, internally consistent as a staged framework, and usable for assessment in practice rather than remaining a purely conceptual typology. Each criterion corresponds to an aspect of maturity modelling that affects either (a) the interpretability of the stages, (b) technical capability progression, or (c) implementability and accountability of maturity claims.
C1–C2 establish comparability and scope. Maturity models are frequently proposed for specific domains (e.g., industrial assets, cities, infrastructure) and are designed for different decision contexts (strategic planning, benchmarking, investment prioritization, operational decision support). Without an explicit purpose and domain (C1), and without a clearly defined unit of analysis and scope (C2), maturity levels cannot be interpreted consistently across studies, and “maturity” may refer to fundamentally different objects (a single asset, a system, or a system-of-systems). These two criteria, therefore, ensure that any cross-model comparison starts from a transparent statement of what the model is intended to evaluate and at what system scale.
C3 assesses the quality of the staged maturity construct itself. Many maturity models use stage labels but not all define what evidence is required to claim a stage. C3, therefore, tests whether a model provides explicit stage boundaries to avoid ambiguous interpretation of stages. Models that define boundary conditions allow for consistent classification and reduce subjectivity when different evaluators apply the same model.
C4–C6 capture three foundational capability progressions that recur across digital twin implementations: data, models, and forward-looking analytics. Data integration and timeliness (C4) are fundamental because maturity progression typically involves moving from static representations to regularly updated states and, ultimately, near-real-time or real-time synchronisation. Modelling and simulation integration (C5) is essential because digital twins differ from static digital representations by their ability to support simulation, calibration, and coupling of models with observed data. Prediction and scenario capability (C6) captures whether maturity includes progression toward forecasting and structured “what-if” scenario analysis, which is a central motivation for digital twins in planning and operations. These criteria are included because they distinguish models that describe a digital twin as a visual or data integration platform from those that define maturity in terms of analytical and simulation capability growth.
C7–C10 address requirements that determine whether maturity claims remain credible and deployable beyond a single technical prototype. Validation and uncertainty (C7) are included because digital twin outputs increasingly support decisions. Therefore, maturity progression should reflect stronger verification, validation, and uncertainty management practices rather than only more data or more automation. Decision loop and automation (C8) capture how maturity changes the relationship between the digital twin and decision making from monitoring and advisory roles to automated recommendations or closed-loop actions with appropriate oversight. Interoperability and standards (C9) are included because digital twin ecosystems commonly require integration across tools, data providers, and organisational boundaries. Models that only describe internal integration may not generalise to multi-stakeholder settings. Governance and accountability (C10) are included because higher maturity levels often imply broader operational impact, data sharing, and coordination. Therefore, roles, responsibilities, stewardship, and coordination mechanisms become maturity-relevant conditions rather than external context.
Finally, C11 assesses whether a maturity model is operationalised as an assessment instrument. A central limitation of many maturity proposals is that they offer conceptual stages and guiding questions but do not provide measurable indicators, scoring rules, or diagnostic procedures. C11, therefore, separates maturity models that can be implemented as a repeatable assessment from those that remain purely conceptual. This ensures that the comparison can distinguish conceptual contributions from assessment-grade tools and supports consistent application in empirical settings.
To support replicability and reduce subjective interpretation, each maturity model is coded using the criteria defined in Table 2 and assigned a score on a 0–2 scale: 0 = absent, 1 = partially addressed, and 2 = explicitly defined with structured guidance. This approach enables a comparative analysis that highlights convergences, divergences, and systematic gaps across maturity models and provides a clear basis for positioning the contribution of the proposed approach relative to prior work.
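As a minimal illustration of this scoring logic, the 0–2 rubric can be expressed as a simple scoring routine. The criterion identifiers follow Table 2 and the example coding is taken from Table 3; the unweighted aggregation and the derived coverage ratio are assumptions introduced here for clarity, not part of any cited model.

```python
# Illustrative sketch of the 0-2 rubric scoring (0 = absent, 1 = partial,
# 2 = explicitly defined). The unweighted sum and coverage ratio are our
# assumptions; Table 2 prescribes the criteria, not an aggregation rule.

CRITERIA = [f"C{i}" for i in range(1, 12)]  # C1..C11

def total_score(scores):
    """Sum the 0-2 scores over all eleven criteria."""
    return sum(scores[c] for c in CRITERIA)

def coverage(scores):
    """Share of the maximum attainable score (11 criteria x 2 points)."""
    return total_score(scores) / (2 * len(CRITERIA))

# Example: the coding of reference [24] as reported in Table 3.
model_24 = dict(zip(CRITERIA, [1, 2, 2, 2, 2, 2, 1, 0, 2, 2, 1]))
print(total_score(model_24), round(coverage(model_24), 2))
```

Such a routine makes the comparison reproducible: any evaluator applying the same coding reaches the same score, which is precisely the property that distinguishes assessment-grade models from conceptual typologies.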
All assessed works articulate a progression logic, in which maturity increases with added capabilities (for example, connectivity and synchronisation, bidirectional interaction, higher levels of analytics, and increased autonomy). This convergence explains why a superficial comparison based on naming stages can be misleading: the stage labels vary, but the underlying capability ladders are often similar. Consequently, novelty must be demonstrated through how maturity is operationalised (measured), how the maturity boundaries are defined, and what additional constructs are introduced that are missing or inconsistently treated in existing models.
A key outcome of the differential analysis is the clear separation of the literature into two methodological families. The first family consists of maturity models that are “assessment-grade”, meaning they provide explicit rubrics or indicators and a scoring procedure that allows a digital twin implementation to be assigned a maturity level in a repeatable way. These approaches typically define evaluation dimensions and detailed criteria, and they translate qualitative maturity concepts into measurable assessment items (for example, through multi-criteria decision methods or questionnaires) [24,26,27,28,29]. The second family consists of conceptual maturity frameworks and review-based maturity spectra that provide strong high-level maturity logic and positioning of maturity in the field but do not provide a complete diagnostic instrument for scoring a specific implementation [25,30]. This distinction is important because a systematic comparison should not only describe how many levels a model has but must also demonstrate whether the model can actually be used as an assessment tool and what evidence supports its applicability.
In terms of measurability and diagnostic application, the assessment-grade models are consistently stronger because they operationalise maturity using either (i) explicit rubrics with weighted scoring and aggregation methods or (ii) questionnaire-based maturity scoring mapped to maturity levels [24,26,27,28,29]. This makes them suitable for benchmarking and gap analysis because they can produce an interpretable maturity score and, in many cases, identify improvement priorities. By contrast, conceptual frameworks typically clarify the conceptual meaning of maturity and propose capability thresholds at each level, but they are less directly usable for assessment because they do not provide full measurement instruments (for example, indicator sets, scoring rules, and data collection protocols) [25]. Review-based spectra offer evidence about typical maturity positions in the published literature and identify common patterns and limitations, but they are usually descriptive rather than diagnostic; they classify cases rather than provide an instrument that can be administered to an organisation to compute a maturity score [30]. This differential result supports a concrete methodological conclusion: a contribution that claims novelty in maturity staging must show how it improves measurability, not only how it renames stages.
Interoperability emerges from the assessment as one of the strongest differentiators among maturity approaches and a key area where the literature remains inconsistent. Some models frame interoperability as the defining feature of the highest maturity state (i.e., maturity of a digital twin within an ecosystem of interconnected digital twins) [25], while others incorporate interoperability within broader technical or organisational rubrics (for example, as data sharing, standards adoption, or system integration criteria) [26,29]. Several quantitative maturity assessments focus primarily on internal system capabilities and treat interoperability implicitly or as a secondary aspect of integration rather than as a dedicated maturity boundary [24,27,28]. This inconsistency matters because it shows that although interoperability is widely recognised, it is not uniformly operationalised across maturity models. Therefore, an important evidence-based gap is the absence of a widely adopted, measurable construct that captures interoperability readiness at the digital twin ecosystem level and connects it to assessment indicators.
A second differentiating dimension concerns socio-technical and governance aspects. While organisational readiness, trust, and human factors are acknowledged across the literature, they are not consistently translated into measurable maturity criteria. Some frameworks embed governance, trust, and accountability concepts explicitly in their rubric structures (for example, via dimensions that include purpose, trust, and functional governance elements) [26], and human-centred dimensions are central in maturity assessment when the application domain is inherently human-in-the-loop [28]. In contrast, several quantitative models emphasise technical capability progression and provide limited explicit governance indicators in the maturity scoring structure [24,27]. The review-based city digital twin maturity spectrum highlights the importance of participation, institutionalisation, and broader socio-technical integration but primarily as a research and implementation challenge rather than as a standardised measurement instrument [30]. This result supports a second evidence-based gap: governance and organisational readiness are present as concepts but not consistently operationalised as maturity indicators across models, limiting comparability and practical diagnosis.
Finally, the assessment clarifies how data requirements and data handling are treated in the maturity literature. Across models, data availability, synchronisation, and integration are consistently implied as maturity markers, but there is substantial variability in how explicitly models specify measurable data requirements (for example, data flow structure, update frequency, quality expectations, and assessment of data inflow/outflow). Questionnaire-based and rubric-based assessment models tend to handle data more explicitly because their assessment instruments require concrete evaluation items about data structure and operational flows [26,29]. Conceptual maturity frameworks link maturity levels to notions such as real-time synchronisation, bidirectional interaction, and automation but often do not specify minimum data requirements or a measurement procedure for verifying that those conditions are met [25]. Review-based maturity spectra provide evidence about what data and technologies typically appear at different maturity positions but do not establish a standard minimum dataset or a reproducible scoring mechanism for assessing a specific case [30]. This differential analysis, therefore, supports a third gap: while data are central to maturity progression, explicit, reproducible data requirement specification and verification are uneven across the maturity literature.
Overall, the criteria-based assessment demonstrates that existing maturity approaches differ less in the existence of staged levels and more in (i) whether they provide assessment instruments and measurable indicators, (ii) how they treat interoperability as either a top-level ecosystem capability or a within-system integration attribute, and (iii) how consistently they operationalise governance and data requirements. These findings provide a clear basis for positioning this study’s contribution; rather than claiming novelty purely through new stage labels, the contribution should be articulated in terms of strengthened operationalisation (measurable indicators and diagnostic tool capability), clearer boundary conditions (what must be evidenced to claim a stage), and explicit treatment of the differentiating gaps revealed by the assessment (interoperability readiness, socio-technical governance indicators, and maturity-linked data requirements) [24,25,26,27,28,29,30].

3.3. Flood Digital Twin Maturity-Driven Specification and Assessment

A flood digital twin can be defined as a dynamic digital representation of a flood-prone physical environment (catchment, watershed, or city) that continuously integrates multi-source observations with modelling and decision support functions to support preparedness, response, and recovery. Recent flood digital twin research converges on layered architectures that combine data ingestion from sensors and external products, processing and integration (including geospatial and hydrometeorological fusion), simulation and analytics (hydrologic/hydrodynamic modelling and data-driven methods), and decision support for operational actions such as warning, response planning, and infrastructure operation. Systematic and application-focused studies also show an increasing tendency to integrate remote sensing products, Internet of Things sensing, and artificial intelligence modules to improve the timeliness, reliability, and actionability of flood warnings and response measures [5,31,32,33,34,35,36]. At the same time, published implementations show that DT is applied to systems that differ substantially in maturity, spanning from static or primarily visualisation-oriented digital representations to near-real-time forecasting systems coupled with hydrologic–hydrodynamic simulations and early warning services [30,32,36]. For this reason, Table 1, Table 2 and Table 3 are applied as a structured basis to specify what constitutes maturity in a flood DT, what minimum data and modelling capabilities are required at each stage, and what evidence is necessary to claim a given maturity level. The maturity levels in Table 1 are operationalised for floods by:
  • Defining stage boundary conditions (entry/exit requirements) consistent with Table 3;
  • Mapping each maturity level to concrete data layers from Table 2 (meteorological, hydrological, geomorphological/land cover, infrastructure, building/asset, demographic and economic layers);
  • Identifying measurable indicators and evidence artifacts.
Table 3. Assessment of the criteria.
Article   C1  C2  C3  C4  C5  C6  C7  C8  C9  C10  C11
[24]       1   2   2   2   2   2   1   0   2    2    1
[26]       1   2   2   2   2   2   2   2   1    1    1
[25]       2   2   1   0   1   1   2   1   1    1    0
[27]       1   2   2   2   2   2   2   1   2    2    2
[28]       1   2   2   2   2   2   2   2   2    1    2
[30]       2   2   1   1   2   1   1   1   1    1    1
[29]       1   2   2   2   1   2   2   1   2    2    2
This approach is aligned with assessment-grade maturity frameworks in the broader digital twin literature, which emphasise measurable indicators, scoring or rubrics, and explicit conditions for maturity classification [24,25,26,27,28,29,30]. The resulting specification ensures that the flood digital twin maturity concept is applicable as a diagnostic instrument for benchmarking and gap identification.
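The stage boundary logic behind this diagnostic use can be sketched as follows. The level names, condition labels and data-layer names below are illustrative placeholders introduced for this sketch, not the authoritative content of Tables 1–4; the key point is that a level is claimed only when all cumulative entry conditions are evidenced.

```python
# Minimal sketch of cumulative stage boundary conditions: a maturity level
# is claimed only if its conditions AND all lower-level conditions are met.
# Condition and layer names are illustrative placeholders.

FLOOD_DT_STAGES = {
    1: {"conditions": {"static_geospatial_baseline", "documented_data_ownership"},
        "data_layers": {"terrain", "hydrography", "exposure"}},
    2: {"conditions": {"periodic_observation_updates"},
        "data_layers": {"meteorological", "hydrological"}},
    3: {"conditions": {"near_real_time_integration", "calibration_routines"},
        "data_layers": set()},
}

def claimed_level(evidence, available_layers):
    """Return the highest level whose cumulative requirements are evidenced."""
    level = 0
    for lvl in sorted(FLOOD_DT_STAGES):
        stage = FLOOD_DT_STAGES[lvl]
        if stage["conditions"] <= evidence and stage["data_layers"] <= available_layers:
            level = lvl
        else:
            break  # a gap at a lower level blocks all higher claims
    return level
```

The `break` encodes the boundary-condition principle discussed above: missing evidence at one stage prevents a higher maturity claim regardless of capabilities elsewhere.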
At level 0–1, the flood digital twin is primarily an organisational and representational capability. The objective is to define the flood decision context and build a coherent static geospatial foundation (terrain, hydrography, built environment exposure) with documented data ownership and governance. The design logic reflects that operational flood digital twins depend on a stable baseline representation before dynamic data streams can be meaningfully assimilated, calibrated, and validated, and this baseline-first principle is consistent with flood digital twin platforms and frameworks that begin with digital representation and data structuring prior to real-time integration and model coupling [5,32,36,37,38,39,40].
At levels 2–3, maturity is defined by the ability to ingest and synchronise meteorological and hydrological observations at operationally meaningful update frequencies, progressing from periodic updates (level 2) to near-real-time integration plus diagnostic and calibration/assimilation routines (level 3). This is the stage at which the flood digital twin becomes more than a digital map: it starts to function as a continuously updated operational representation where sensor pipelines, event logging, anomaly or fault detection, and calibration practices provide the basis for credible forecasting and operational reliability. Operational digital twin implementations for stormwater and drainage systems, rainfall-data integration frameworks, and data-driven early prediction work illustrate this coupling of live sensing with simulation and analytics as a defining characteristic of higher-maturity systems [30,33,41,42,43,44].
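A minimal sketch of such a level-2/3 ingestion step is given below: an observation enters the twin state only if it is fresh enough and passes a simple plausibility test, and rejected observations are logged for the diagnostic and fault-detection routines mentioned above. The thresholds and field names are illustrative assumptions, not values prescribed by the cited implementations.

```python
# Sketch of a near-real-time ingestion check with staleness and range tests.
# Thresholds are illustrative placeholders to be set per deployment.

MAX_AGE_S = 900                    # freshness target: no older than 15 min
WATER_LEVEL_RANGE = (0.0, 12.0)    # plausible gauge reading range in metres

def ingest(obs, now_s, state, log):
    """Update twin state from one observation; log rejections for diagnostics."""
    age = now_s - obs["timestamp_s"]
    if age > MAX_AGE_S:
        log.append(("stale", obs["sensor_id"], age))
        return False
    lo, hi = WATER_LEVEL_RANGE
    if not (lo <= obs["water_level_m"] <= hi):
        log.append(("out_of_range", obs["sensor_id"], obs["water_level_m"]))
        return False
    state[obs["sensor_id"]] = obs["water_level_m"]
    return True
```

The rejection log is itself maturity-relevant evidence: it documents that event logging and anomaly detection are operating, which is part of what distinguishes level 3 from level 2.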
At level 4, the flood digital twin becomes predictive and assessment critical. Forecasting and scenario exploration must be validated against observed events, and uncertainty must be communicated explicitly. In flood contexts, uncertainty is not an optional feature because rainfall variability, intermittent sensing, model structural limits, and input data quality directly affect forecast confidence and downstream response decisions. Empirical studies demonstrate that rainfall data treatment and real-time quality control can materially influence hazard interpretation, while systematic syntheses emphasise that early warning usefulness depends on robust modelling workflows and uncertainty-aware outputs rather than forecasts alone [31,32,34,35,41,44].
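One way to make the level-4 validation requirement concrete is to score forecasts against observed events with a standard hydrological skill metric such as the Nash–Sutcliffe efficiency (NSE = 1 is a perfect fit; NSE ≤ 0 means the forecast is no better than the observed mean). The acceptance threshold below is an illustrative assumption; the metric choice itself would be fixed as a domain-specific indicator during requirements definition, as argued in Section 3.1.

```python
# Sketch of forecast skill validation using the Nash-Sutcliffe efficiency.
# The 0.7 acceptance threshold is an illustrative placeholder.

def nse(observed, forecast):
    """Nash-Sutcliffe efficiency of a forecast series against observations."""
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - f) ** 2 for o, f in zip(observed, forecast))
    variance = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / variance

def forecast_accepted(observed, forecast, threshold=0.7):
    """Accept the forecast only if skill exceeds the agreed threshold."""
    return nse(observed, forecast) >= threshold
```

Reporting the skill score alongside the forecast, rather than the forecast alone, is one concrete way of making uncertainty explicit to downstream response decisions.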
At levels 5–6, maturity is defined less by modelling and more by decision integration and governance. The flood digital twin must produce actionable recommendations (level 5) and, where appropriate, support partially automated or closed-loop actions (level 6) with auditability, human override, interoperability arrangements, and accountable governance. City-scale flood digital twin frameworks demonstrate this evolution by integrating flood analytics with infrastructure layers to support operational decisions, while governance-oriented reviews and platform governance studies emphasise that stakeholder roles, institutional readiness, transparency, and accountability become decisive when advanced analytics is embedded into real decision workflows and cross-organisation data exchange [33,41,42,45,46,47,48].
Finally, the maturity-driven flood digital twin specification aligns directly with testbed selection and deployment feasibility. The selected environment must provide sufficient variation in flood mechanisms (e.g., rainfall-driven flooding, drainage failure, river overtopping), measurable rainfall intensity and water-level variability, and adequate data availability and quality to support real-time analysis, simulation, and validation. As shown by operational flood digital twin studies, data reliability and continuity are not only technical prerequisites but practical constraints that determine whether higher-maturity capabilities (near-real-time integration, validated forecasting, and decision-loop integration) can be achieved and sustained [32,34,35].
To operationalise the maturity assessment for a flood digital twin, Table 4 translates the assessment-grade maturity phases into flood-specific entry conditions, exit criteria, indicators, and evidence artifacts. The logic follows the progression reported across the flood and water-sector digital twin literature. Implementations typically start from a high-quality static geospatial baseline (terrain, hydrography, drainage/river networks, exposure layers) and progressively add operational data pipelines, near-real-time synchronisation, validated forecasting, and finally decision workflows and (where appropriate) automated control, while also increasing requirements for governance and accountability as decision impact grows [5,32,34,36,37,38,39,41,42,44,47].

3.4. Conceptual Design of Data Sensor Provisioning of Flood Digital Twin

Floods are described as Slovakia’s most frequent type of emergency event; in many cases they negatively affect the lives of residents and destroy local infrastructure. Floods can affect towns and cities directly or indirectly, and their negative impacts extend beyond buildings and roads to greatly hinder the economic and social activity of towns and cities.
Given the increasing risk of flooding around the world, which many experts attribute to climate change [23], many towns and cities are not sufficiently resilient to such risks. Proper, effective and rapid flood management tools [49] are essential to avoid, reduce and mitigate likely flood risks. This study develops a conceptual specification of a flood digital twin and does not report a deployed prototype or an implemented test site. Instead, Slovakia is used as a motivating operational context to structure requirements, while evidence for feasibility and typical implementation patterns is drawn from published flood digital twin systems and frameworks. Recent studies document operationally oriented flood digital twins that couple multi-source observations with hydrologic and hydrodynamic modelling, data assimilation or quality control, and early warning workflows; these provide empirical grounding for the staged data and capability requirements proposed here. Consequently, the purpose is to translate the maturity logic into a practical design roadmap (data, sensing, integration, modelling, and governance requirements) that can be instantiated for any selected watershed or city in future implementation work.
A flood DT, as a possible solution that uses real-time data and simulation models [50,51] to support decision making on flood response and the management of water resources and the economy, needs to combine a variety of datasets to evaluate the current and future flood situation [52]. A flood DT requires the incorporation of data from different domains and the creation of data layers to fulfil the desired purpose. As a comprehensive system, the flood DT should be able to integrate different, mutually dependent data layers to provide accurate and up-to-date models and scenarios of flood risk in a specific area. The data layers of the flood DT are key to the functioning of the technology and to the comprehensiveness and accuracy required for CM purposes and objectives. The following data layers are considered:
  • Meteorological data;
  • Hydrological data;
  • Infrastructure data;
  • Building construction data;
  • Geomorphological data and land cover data;
  • Demographic, economic and industry data;
  • Underground infrastructure data;
  • Geophysical data of subsurface structures.
Table 5 provides an overview of the different data layers and the types of data that should be included in the flood digital twin. In addition to outlining the data categories, the table also indicates how data in each layer can be obtained. These data layers require a robust information technology infrastructure, the standardisation of data formats and their interconnection, as well as mechanisms for data sharing, relationship evaluation and collaboration among stakeholders and users. The data layers within the flood digital twin are characterised by strong interdependencies. These interdependencies arise from the interaction between the real environment and the virtual model in real time, where dynamic data reflecting flood-related phenomena influence the behaviour of the DT and interact with its more stable, foundational layers.
Table 5 is comprehensive and should be interpreted as a superset of candidate layers rather than a fixed checklist. For implementation planning, the layers need to be prioritised into a minimum viable dataset (MVP) aligned with the intended decision context and the targeted maturity level. Published flood digital twin implementations consistently indicate that an operational MVP must first secure a stable geospatial baseline for the flood domain (terrain, hydrography and drainage/river network, land cover, and exposure of buildings and critical infrastructure), and a core set of dynamic observations that drive flood hazard dynamics (rainfall inputs and water level or discharge observations). This baseline MVP logic supports subsequent calibration, validation, and trustworthy forecasting, and it is consistent with flood digital twin frameworks that begin with a coherent digital representation and structured data foundations before expanding toward real-time integration and advanced analytics [5,30,32,33,34,35,37,43].
Beyond availability, data quality and acquisition feasibility determine whether higher-maturity capabilities can be achieved. Flood digital twin studies emphasise that rainfall products, sensor reliability, and online quality control materially influence predicted hazards and warning usefulness, especially under uncertainty and data gaps. Therefore, each prioritised MVP layer should be accompanied by explicit quality requirements (e.g., spatial accuracy class for terrain and exposure layers, timeliness and latency targets for rainfall and water-level observations, completeness thresholds, metadata and provenance, and continuity expectations during extreme events).
Practical acquisition challenges include heterogeneous formats and ownership, intermittent sensor availability, access restrictions, and cross-organisation coordination needs; governance-oriented flood risk reviews further note that institutional readiness, accountability, and transparent decision processes become increasingly decisive as data sharing and analytics are embedded into operational workflows [32,34,35,45,46].
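The per-layer quality requirements described above could be operationalised as a simple readiness check: each MVP layer declares completeness and latency targets, and a layer is accepted only when its current metrics meet them. The concrete target values and layer names below are illustrative placeholders introduced for this sketch, not normative requirements.

```python
# Sketch of MVP layer readiness gating. Completeness is the fraction of
# expected records present; latency is the age of the newest record.
# Static layers (e.g., terrain) have no latency target.

MVP_REQUIREMENTS = {
    "terrain":     {"min_completeness": 0.99, "max_latency_s": None},
    "rainfall":    {"min_completeness": 0.95, "max_latency_s": 600},
    "water_level": {"min_completeness": 0.95, "max_latency_s": 300},
}

def layer_ready(layer, completeness, latency_s=None):
    """True if the layer currently satisfies its MVP quality requirements."""
    req = MVP_REQUIREMENTS[layer]
    if completeness < req["min_completeness"]:
        return False
    if req["max_latency_s"] is not None:
        return latency_s is not None and latency_s <= req["max_latency_s"]
    return True
```

Making the targets explicit in this way also supports the auditability argued for in Section 3.2: a maturity claim tied to an MVP layer can be verified against logged metrics rather than asserted.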

3.5. Selection of the Flood Digital Twin Test Environment

When selecting a test environment for an innovative DT-based flood management platform in the territory of Slovakia, the selection can be based on historical data on the most frequent occurrences of floods in that territory.
Based on the available data, it is possible to determine the number of flood events recorded in individual Slovak municipalities. However, when selecting a test environment for the flood digital twin, it is necessary to consider not only the feasibility of implementation but also the suitability of the territory in relation to the availability of relevant information. This suitability should be assessed according to several conditions, which can be grouped into the following categories: geographical, hydrological, data-related and other contextual factors.
Geographical conditions:
  • The terrain should correspond to a realistic test environment, characterised by its size, flatness, type and heterogeneity.
  • Watercourses should be considered in the selection of the test environment because of the diversity of watercourses (rivers, streams, lakes or reservoirs). The selection conditions should consider the size of the water bodies, the gradients of the watercourses, or the presence of dams, weirs or water management structures.
  • The built environment of the test environment should include sufficiently built-up, populated urban areas, as well as industrial areas, to serve the overall purpose of the flood DT.
Hydrological conditions:
  • The test environment should allow for the occurrence of different flood types (floods caused by changes in rainfall intensity, flash floods, failure of drainage systems, levees or other control devices, etc.).
  • The rainfall intensity of the selected test environment should be available as data for analysing, predicting and simulating floods in real time; the environment should provide data on different levels of rainfall intensity (from moderate rainfall to heavy persistent storms).
  • The water surface elevation of the selected test environment should be available as data for analysing, predicting and simulating flooding in real time; the environment should provide data on water surface elevation variability (from average levels to inundation and persistent flooding).
Data conditions:
  • The data availability of the selected test environment should provide sufficient data inputs for the flood DT.
  • Data quality considers the accuracy, availability and completeness of the data and the compatibility of data formats with the flood DT.
Other conditions:
  • Availability of resources such as funds, personnel and time in selecting a test environment.
  • Restrictions of various kinds and types on the choice of test environment.
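The suitability conditions above could be combined into a simple weighted screening of candidate test environments. The per-group scores (0–1) and the weights below are illustrative assumptions to be agreed with stakeholders; the data-related group is weighted highest here to reflect the data-availability arguments made in Section 3.3.

```python
# Sketch of weighted testbed screening across the four condition groups.
# Weights are illustrative assumptions and should sum to 1.

WEIGHTS = {"geographical": 0.25, "hydrological": 0.30, "data": 0.35, "other": 0.10}

def suitability(scores):
    """Weighted suitability in [0, 1]; higher means a better candidate."""
    return sum(WEIGHTS[g] * scores[g] for g in WEIGHTS)

def rank_candidates(candidates):
    """Order candidate names from most to least suitable."""
    return sorted(candidates, key=lambda name: suitability(candidates[name]),
                  reverse=True)
```

Even such a coarse screening makes the selection transparent and repeatable across municipalities, which supports the transfer of contextual knowledge discussed below.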
Creating a flood digital twin is a challenging task from an implementation perspective. Such a system requires a large volume of input data and the ability to represent the complex dependencies that exist between interconnected urban infrastructure systems [53]. Ongoing technological advances and the increasing availability of data indicate strong potential for the future development of digital twins in the area of flood forecasting for municipalities and cities. For this reason, the selection of an appropriate testbed is a crucial step in the development process. A well-chosen test environment can serve as a prototype for other municipalities and jurisdictions, helping to ensure accurate and reliable DT functionality and supporting the transfer of contextual knowledge.

3.6. Potential Users of the Flood Digital Twin

Potential users who would have access to the outputs of the flood digital twin include organisations, institutions and designated individuals authorised by law or by a competent authority to carry out activities related to flood monitoring, prevention and protection; the management of watercourses and water resources; and overall flood risk management within the territory of the Slovak Republic. Table 6 provides an overview of the main official institutions responsible for the current organisation and governance of flood risk in Slovakia, presented in English translation. These institutions, along with their authorised employees and other competent persons, may, therefore, be considered the primary direct users of the flood digital twin.

3.7. Recommendations for the Introduction of a Digital Twin

Addressing the question of what a particular DT will do, how it will do it, and what its features and functions are requires communication between developers, users and other stakeholders. The key components of the DT must be discussed early in the development process. The resulting DT will depend, among other things, on the final goal, as well as on any legal, technological and overall resource considerations and on the various decisions taken in the design phase. In this section, we, therefore, discuss other key aspects in addition to those mentioned above and propose recommendations for the successful implementation of the concept in practice, in terms of DT development.
Recommendations regarding the challenges in implementing DT in CM would cover the following key areas:
  • Area identification, risk assessment and quantitative and qualitative indicators.
Before deciding to develop a DT, it is important to select a specific area that will benefit from this technology in terms of CM. For this reason, it is recommended to develop an overview of the most frequent MUs for the studied environment, or to delineate the area based on an area analysis or on the vulnerability of the sites.
Once the area for which the DT will be developed has been determined, it is necessary to identify all possible adverse events, revealing their causes as well as their external and internal risk factors and other influencing factors. Finally, risk analysis, based on the identified quantitative and qualitative indicators and on estimates of the probability of occurrence and the impacts of adverse events, yields a risk weighting.
The DT system, in its autonomous operational stage, should be able to predict, simulate and provide early warning of increases in the level of risk being monitored, based on predetermined parameters. These parameters should allow the DT system to identify specific risks that need attention and suggest ways to deal with them. If the level of risk increases, the DT system should immediately inform managers about the situation through alert mechanisms. In this way, it provides the necessary early warning so that managers can react immediately, make decisions and take action to minimise or mitigate the risk.
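As a minimal illustration of the alerting behaviour described above, the following Python sketch compares a monitored indicator against predefined thresholds and returns the highest alert level exceeded. The level names, threshold values and the `Alert` structure are illustrative assumptions, not part of the proposed DT specification.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Alert:
    indicator: str
    value: float
    level: str  # e.g. "watch", "warning", "emergency" (assumed names)

def evaluate_risk(indicator: str, value: float, thresholds: dict) -> Optional[Alert]:
    """Return the highest alert level whose threshold the value meets or
    exceeds, or None if the monitored value is below all thresholds."""
    triggered = None
    # Walk thresholds from lowest to highest so the last match is the highest level.
    for level, limit in sorted(thresholds.items(), key=lambda kv: kv[1]):
        if value >= limit:
            triggered = Alert(indicator, value, level)
    return triggered

# Example: a water level of 412 cm exceeds the assumed "warning" threshold.
levels = {"watch": 300.0, "warning": 400.0, "emergency": 500.0}
alert = evaluate_risk("water_level_cm", 412.0, levels)   # Alert(..., level="warning")
calm = evaluate_risk("water_level_cm", 120.0, levels)    # None
```

In an autonomous DT, a function of this kind would run continuously on incoming sensor streams, with the returned alert feeding the notification mechanisms that inform managers.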
  • Communication links, coverage and sensors
DT works on the principle of transferring data from the real environment to a virtual model, fostering the creation of entities to support decision making. The interconnections created form an extensive communication system, including the real-time transmission of data originating from a wide range of sensors and IoT devices.
In the development and implementation of a DT capable of bidirectional communication, it is advisable to focus on the following areas:
- Existing sensors;
- The introduction of new sensors and actuators;
- Communication coverage and transmission.
Using existing sensors capable of interfacing with and transferring data from the real environment to the DT environment is a possible initial way of establishing a link; it does not require modifying existing physical installations, only creating a connection to an already-existing data infrastructure and transferring its data to the DT. Another option is to reuse already-installed and connected sensors, which may require modification once their suitability for this infrastructure has been assessed. A third case is the complete deployment of new sensors and actuators, which would constitute a new DT data infrastructure.
New sensor devices and controllers can also be introduced into an existing DT data infrastructure. In this case, feeder modules that can incorporate a new set of sensors may require modification of existing software as well as the development of new functions that manage the new sensors with the existing ones [3].
Communication coverage requires special attention in the development and implementation of a DT, since building adequate coverage is an additional challenge in ensuring reliable transmission of data from the real environment.
In particular, the Internet of Things (IoT), with wireless sensor networks and devices designed to capture and transmit information directly to DT cloud applications, will support the communication link between the DT and the real environment.
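To make the sensor-to-DT link concrete, the following sketch wraps a reading from a field sensor into a uniform message for publication to a DT data platform. The field names and the MQTT-style topic scheme (`dt/flood/...`) are illustrative assumptions; a real deployment would follow the ingestion schema of the chosen platform.

```python
import json
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class SensorReading:
    sensor_id: str
    quantity: str          # e.g. "water_level_cm", "rainfall_mm" (assumed names)
    value: float
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

def to_ingest_message(reading: SensorReading, topic_prefix: str = "dt/flood"):
    """Serialise a reading as a (topic, JSON payload) pair for publication
    to the DT cloud application, e.g. over MQTT."""
    topic = f"{topic_prefix}/{reading.sensor_id}/{reading.quantity}"
    payload = json.dumps({"v": reading.value, "t": reading.timestamp})
    return topic, payload

topic, payload = to_ingest_message(SensorReading("gauge-01", "water_level_cm", 287.5))
# topic -> "dt/flood/gauge-01/water_level_cm"
```

Keeping the message format uniform in this way is what allows readings from existing sensors, retrofitted sensors and newly deployed devices to feed the same DT data infrastructure.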
  • Cybersecurity
Components involved in the functioning of DT include hardware, software, sensors, and a robust database of data and information. As with other software technologies, there is a potential risk of misuse of the DT or of damage to the components ensuring its function. A cyberattack, security breach, infiltration, or damage to sensory devices and other components or software could result in an undesired modification of the DT. DT misuse could lead to the creation of a false, distorted image of reality, which could in turn enable attempts at deception in crisis situations, influence public opinion, and spread fake information. DT can be exploited for various forms of cybercrime and unauthorised surveillance. Data protection, privacy, and cybersecurity must, therefore, play a key role [53]. Developers and users of DT should acknowledge these types of risks and pay considerable attention to prevention, both during development and in use. They should systematically implement security measures to minimise the attack surface for cyberattacks and misuse of the DT. Developing and implementing sophisticated security measures, algorithms, and mechanisms for detecting and responding to cyberthreats will be an integral part of the development of autonomous digital transformation, strengthening the system’s ability to adapt to various situations without human intervention and to anticipate and address potential problems before they become critical.
  • Testing and Personnel Training
Before deployment, it is necessary to test and verify the DT to ensure its accuracy, reliability and functionality. DT models can be verified and validated in the testing phase by defining goals and criteria for simulating and forecasting future developments in the studied environments and their processes, and by proposing dedicated verification methods.
The realism and representativeness of data reflecting real-world dynamic conditions are also verified during the testing phase, in terms of consistency, completeness, accuracy, and sufficient coverage, scope and frequency of the variables used for scenario creation.
During the testing and verification of DT models and scenarios, controlling and documenting the results of their analysis, and interpreting those results in relation to the goals and criteria, helps draw conclusions; the resulting recommendations can then be used for decision making, optimisation or innovation purposes.
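The data-quality dimensions listed above (completeness, consistency and update frequency) can be checked automatically during the testing phase. The sketch below applies three simple checks to a timestamped sensor series; the threshold values and indicator names are assumptions chosen for the example, not prescribed criteria.

```python
def check_series(timestamps, values, expected_step_s=600, max_gap_factor=3):
    """Return simple quality indicators for a sensor time series:
    missing values (completeness), oversized gaps (frequency) and
    out-of-order timestamps (consistency)."""
    issues = {
        "missing_values": sum(1 for v in values if v is None),
        "gap_violations": 0,
        "non_monotonic_time": 0,
    }
    for earlier, later in zip(timestamps, timestamps[1:]):
        delta = later - earlier
        if delta <= 0:
            issues["non_monotonic_time"] += 1
        elif delta > expected_step_s * max_gap_factor:
            issues["gap_violations"] += 1
    issues["passed"] = not any(issues[k] for k in
                               ("missing_values", "gap_violations",
                                "non_monotonic_time"))
    return issues

# A series sampled every 10 min (600 s) with one missing value and one long gap:
report = check_series([0, 600, 1200, 4800], [1.0, 1.1, None, 1.3])
```

Reports of this kind can be logged alongside the verification goals and criteria, providing the documented evidence of data readiness that higher DT maturity levels require.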
In addition to focusing on the aforementioned key areas, the development of DT itself will go through maturity phases (Table 1), each representing a separate challenge for DT, with the aim of achieving its autonomous form.
The creation and implementation of a DT in CM constitute a complex process, in which it is important to gain support from experts and involve relevant stakeholders throughout, from the start to the creation and use of a fully functional and autonomous system. We can conclude that, after successful implementation, DT technology will serve in the field of CM primarily as a tool that helps create a safer and more efficient environment through risk anticipation and subsequent management. This functionality has the potential to improve the prevention and detection of possible problems and, overall, to increase the security of places, while supporting technological progress and social development.

3.8. Integrated Architecture of a Flood-Oriented DT

The architecture presented in Figure 2 offers a structured overview of the key components that form a digital twin designed to support crisis management in smart cities, with particular emphasis on flood-related emergencies. It demonstrates how conditions in the physical environment are continuously observed through a wide range of sensing technologies and heterogeneous data sources, which are subsequently integrated within a unified data platform. These data streams supply the virtual twin with high-quality inputs that allow hydrodynamic models, geospatial representations and state estimation techniques to generate an accurate and continuously updated picture of the evolving situation. The inclusion of artificial intelligence further strengthens the analytical capacity of the system by enabling diagnostic, predictive and optimisation functions that align with advanced stages of digital twin maturity. The upper layer translates these insights into practical decision support, facilitating timely warnings, coordinated crisis response and post-event learning for authorities and stakeholders. This architecture illustrates a coherent framework that interconnects data, models, analytics and decision processes, and it represents the conceptual foundation for the digital twin approach explored in this study.
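The layered flow described for Figure 2 (sensing, data platform, virtual twin, decision support) can be sketched end to end in a few lines. The class names, the trivial mean-based state estimator and the warning threshold are all illustrative assumptions standing in for the hydrodynamic models and decision rules of a real flood DT.

```python
class DataPlatform:
    """Integration layer: collects the latest observation per data source."""
    def __init__(self):
        self.latest = {}

    def ingest(self, source: str, value: float):
        self.latest[source] = value

class VirtualTwin:
    """Virtual-twin layer. For illustration, the twin's water-level state is
    the mean of currently available gauge observations; a real flood DT would
    use hydrodynamic modelling and state estimation here."""
    def estimate(self, platform: DataPlatform) -> float:
        obs = list(platform.latest.values())
        return sum(obs) / len(obs)

def decision_layer(level_cm: float, warning_threshold_cm: float = 400.0) -> str:
    """Decision-support layer: translate the estimated state into an action."""
    return "issue_warning" if level_cm >= warning_threshold_cm else "monitor"

# Physical layer feeds the platform; the twin estimates; the top layer decides.
platform = DataPlatform()
platform.ingest("gauge-01", 390.0)
platform.ingest("gauge-02", 430.0)
state = VirtualTwin().estimate(platform)   # 410.0
action = decision_layer(state)             # "issue_warning"
```

The value of the architecture lies precisely in this separation: each layer can mature independently (more sensors, better models, richer decision rules) without redesigning the others.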

4. Discussion

Representing the real environment within a virtual digital twin (DT) space is a fundamental challenge, particularly as digital technologies continue to advance and become increasingly integrated into everyday urban life. The ability of a DT to accurately mirror physical conditions depends on the quality, relevance and structure of the data available to it. The availability of such data is directly influenced by the level of technological deployment within the monitored territory. Sensors, therefore, play a central role, as they provide the real-time information needed to construct, update and validate the DT. For this reason, the selection of appropriate sensing technologies requires careful consideration during system design. Decisions must be made regarding the types of data necessary for a particular municipality, the formats in which these data should be collected and stored, and the broader implications for interoperability and long-term system maintenance. A clear understanding of how sensors contribute to the DT, and of how they function within broader technological ecosystems, is essential for ensuring the safe, reliable and effective use of digital technologies in crisis management. From a maturity perspective, this challenge is not only technical but also assessment related. Higher DT maturity requires evidence that data are timely, integrated, and fit for purpose (e.g., validated update frequency, traceable provenance, and stable operational pipelines), rather than simply “available”. This is consistent with assessment-grade maturity approaches that treat data readiness, interoperability and governance as differentiating requirements between maturity stages, especially when DT outputs are intended to influence operational decisions [24,25,26,27,28,29,30].
Many generic DT frameworks identify a physical entity, virtual model and connection as basic components, with variations that often incorporate data layers, analytical services or communication infrastructures. Hu (2021) [54] provides a detailed review of DT architectures and notes that most rely on these three foundational elements. In this respect, the proposed structural components of the DT aimed at security and urban monitoring are aligned with established conceptualisations in the literature. Similarly, Botín-Sanabria et al. (2022) [55] described sensors, communication interfaces, the DT core, data processing capabilities and visualisation services as essential components, which correspond closely with the decomposition adopted in this study. However, crisis management use cases impose additional constraints that are often under-specified in generic architectures, such as decision relevance, operational timeliness, and accountability. These constraints are reflected in maturity models that explicitly distinguish between static digital representations and DTs capable of near-real-time integration, validated forecasting, and decision-loop integration [24,25,26,27,28,29,30]. In the flood domain, published DT frameworks and implementations similarly indicate that operational value depends on a reliable chain from sensing and integration to modelling and warning workflows, rather than on visualisation alone [5,31,32,33,34,35].
The efficiency and practicality of introducing sensory devices must also be considered. A single physical environment may be monitored using a variety of sensors that capture temperature, humidity, water level, movement, sound and many other variables. These data sources, and the sensors that produce them, may support the creation of multiple DTs that represent different aspects of the same environment or serve the needs of different independent stakeholders. Further discussion is required regarding how sensory components, or complete DT systems, may be linked to wider technological infrastructures. Digital twins should not exist in isolation. Instead, they should be embedded within broader analytical and operational systems that support modelling, control, optimisation and strategic planning. This integration will influence not only the structure of the DT itself but also the functional capabilities it can provide. For flood-oriented DTs, the literature indicates that such integration typically progresses from a high-quality geospatial baseline (terrain, hydrography, drainage and river networks, exposure layers) toward the operational ingestion of rainfall and water-level or discharge observations and then to model-based forecasting and early warning systems [5,31,32,33,34,35,43,44]. This staged progression supports a practical interpretation of data layers as implementation priorities. A minimum viable dataset enables descriptive monitoring, while higher maturity requires additional layers and stronger integration practices to support calibration, validation, uncertainty handling and operational reliability [32,34,35,43,44].
Legal, ethical and security considerations are equally important when creating and deploying DTs, particularly in the context of crisis management. Issues such as personal data protection, cybersecurity, system reliability and the potential misuse of information must be addressed from the outset [10]. A well-informed discussion of these aspects is essential for developing guidelines, standards and governance mechanisms that protect individual and societal interests. Such measures will support the safe and successful implementation of DT technology and strengthen its role in enhancing crisis management and public safety. This governance requirement strengthens rather than weakens the case for maturity-driven design. As DT maturity increases toward stronger decision-loop integration, interoperability across organisations, and potentially semi-automated actions, the need for clear roles, auditability, and accountable coordination mechanisms increases accordingly [45,46]. In flood risk management, stakeholder- and governance-focused reviews emphasise that institutional readiness and transparent responsibility allocation are decisive for sustainable implementation, because operational decisions must remain traceable and legitimate even when supported by advanced analytics [45,46].
Sensors and early warning systems play a crucial role in enabling a flood digital twin to respond effectively to evolving conditions in the real environment. Recent research highlights that integrating IoT sensing, environmental monitoring and real-time analytics significantly improves the accuracy and responsiveness of flood prediction and early warning capabilities, thereby enhancing preparedness and crisis decision making [56]. At the same time, the deployment of such sensor-rich and data-intensive systems requires careful consideration of security, privacy and ethical dimensions. Studies on smart city infrastructures stress that safeguarding data integrity, ensuring cybersecurity resilience and establishing responsible governance mechanisms are essential for maintaining trust and supporting the sustainable use of advanced technologies such as digital twins [57]. Together, these perspectives underline that the successful implementation of a flood DT relies not only on high-quality sensing and early warning but also on robust security measures across all aspects of the system.

5. Conclusions

In conclusion, the rapidly changing technological environment [5,58] and the increasing demands placed on urban safety highlight the need for flexible and innovative solutions to support decision making in crisis management. Developing proposals for the implementation and practical deployment of digital transformation through digital twin (DT) technology within the urban context remains a significant challenge in domestic conditions. The ability of a DT to mirror multiple functions of the city using real-time data is one of its most valuable features, enabling users to respond quickly to changes in the physical environment and the issues that arise from them. DTs can provide tools for operational optimisation and autonomy, support the resolution of infrastructure and transport network challenges, anticipate potential municipal problems and contribute to raising the overall quality of life in urban settings. It is, therefore, essential to embrace this challenge and adapt to the dynamic digital landscape, while ensuring that safety and innovation remain central to the functioning of modern cities. DT technology offers access to information that would traditionally require physical inspection, thereby enhancing the efficiency and precision of urban management.
Addressing questions related to DT functions and capabilities requires close collaboration between developers, stakeholders and end users. A range of considerations must be taken into account, including legal and regulatory requirements, specific user needs, available resources and the strategic objectives of implementation. Experience from the Slovak crisis management context further indicates that information systems only become operationally effective when responsibilities, coordination mechanisms and practical implementation constraints are explicitly addressed, rather than treated as implicit assumptions [59]. These aspects should guide both the development and the future research directions of DT technologies. Further research should focus on specifying the input data required for DT operation and determining the appropriate types of sensors needed to collect and monitor such data in real time, depending on the type of municipal unit. DT-based data can support the simulation of current and future scenarios, providing users with a comprehensive description of the environment and deepening their understanding of its physical entities. In this study, the proposed maturity levels and data-layer specification provide a structured basis for this progression by clarifying what capabilities and evidence are required at successive stages of DT development and use.
This study, therefore, aimed to define DT technology, describe the exchange of information between the physical and virtual environments, introduce technologies relevant to DT development, outline DT maturity levels as progressive stages towards increasing automation and provide recommendations for the gradual introduction of DTs. To address the need for methodological transparency and non-intuitive maturity classification, the maturity stages were benchmarked against established DT maturity and assessment frameworks and expressed through observable criteria and indicator examples that can be adapted to the specific DT domain and governance setting. In a dynamic and continually evolving technological context, it is essential to monitor emerging challenges and adopt innovative approaches to advance urban safety and support crisis management decision making. Improved flood risk modelling and early warning systems, such as those enabled by a flood digital twin, can significantly reduce health hazards associated with flood events and enhance community resilience. The use of DT-based prediction and warning systems, thus, not only supports infrastructure and crisis management but also contributes directly to protecting public health in urban settings [57].

Limitations and Future Work

This study is conceptual and does not report a deployed prototype or a validated test site. Accordingly, the proposed flood DT specification should be interpreted as a maturity-driven design roadmap that can be instantiated for a selected watershed or city in future implementation work [5,31,32,33,34,35]. A key next step is empirical instantiation: selecting a specific municipality or watershed, documenting existing sensors and data availability, and applying the maturity assessment to identify gaps in data readiness, integration capability, modelling workflow, and governance arrangements. In parallel, the indicator examples associated with the maturity levels should be refined into an application-specific operational indicator set for the chosen context and then evaluated through retrospective event replay and stakeholder exercises, consistent with assessment-grade maturity approaches that emphasise measurable indicators, evidence artifacts, and stage boundary conditions.

Author Contributions

Conceptualization, J.R. and B.H.; methodology, D.C. and K.N.; validation, J.R., D.C. and B.H.; formal analysis, K.N.; investigation, J.R., B.H., D.C. and K.N.; resources, B.H. and D.C.; data curation, B.H. and J.R.; writing—original draft preparation, J.R. and B.H.; writing—review and editing, D.C.; visualization, K.N.; supervision, J.R.; funding acquisition, J.R. All authors have read and agreed to the published version of the manuscript.

Funding

Funded by the EU NextGenerationEU through the Recovery and Resilience Plan for Slovakia under the project No. 17R05-04-V01-00005.

Data Availability Statement

Data are contained within the article.

Conflicts of Interest

The authors declare no conflicts of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

References

  1. Metcalfe, B.; Boshuizen, C.H.; Bulens, J.; Koehorst, J.J. Digital twin maturity levels: A theoretical framework for defining capabilities and goals in the life and environmental sciences. F1000Research 2023, 12, 961. [Google Scholar] [CrossRef]
  2. Milner, G. South Korean City Uses a Digital Twin to Meet Challenges. ArcUser. Summer 2021. Available online: https://www.esri.com/about/newsroom/arcuser/south-korean-city-uses-a-digital-twin-to-meet-challenges (accessed on 7 July 2024).
  3. Nleya, S.M.; Velempini, M. Industrial Metaverse: A Comprehensive Review, Environmental Impact, and Challenges. Appl. Sci. 2024, 14, 5736. [Google Scholar] [CrossRef]
  4. Ariyachandra, M.R.; Wedawatta, G. Digital Twin Smart Cities for Disaster Risk Management: A Review of Evolving Concepts. Sustainability 2023, 15, 11910. [Google Scholar] [CrossRef]
  5. Park, D.; You, H. A Digital Twin Dam and Watershed Management Platform. Water 2023, 15, 2106. [Google Scholar] [CrossRef]
  6. Hyun, C.-T.; Lee, S.; Jin, C. Digital Twin for Disaster Management in Smart City. In Advances in Information and Communication; Arai, K., Ed.; Lecture Notes in Networks and Systems; Springer Nature: Cham, Switzerland, 2024; Volume 919, pp. 627–641. ISBN 978-3-031-53959-6. [Google Scholar]
  7. Khan, M.S.; Chinnaiyan, R.; Balachandar, S.; Ali Ibrahim, S.J.; Chakravarthy, N.S.K.; Kalaiarasan, C.; Divya, R. Centralized and Reliable Digital Twin Models for Smart City’s Buildings Protection during Disaster. In Proceedings of the 2022 International Conference on Computational Modelling, Simulation and Optimization (ICCMSO); IEEE: Pathum Thani, Thailand, 2022; pp. 226–229. [Google Scholar]
  8. Habib, A.; Habib, M.; Bashir, B.; Bachir, H. Exploring the Sustainability Benefits of Digital Twin Technology in Achieving Resilient Smart Cities During Strong Earthquake Events. Arab. J. Sci. Eng. 2025, 50, 16869–16883. [Google Scholar] [CrossRef]
  9. Park, S.; Kim, J.; Kim, Y.; Kang, J. Participatory Framework for Urban Pluvial Flood Modeling in the Digital Twin Era. Sustain. Cities Soc. 2024, 108, 105496. [Google Scholar] [CrossRef]
  10. Kollarova, M.; Granak, T.; Strelcova, S.; Ristvej, J. Conceptual Model of Key Aspects of Security and Privacy Protection in a Smart City in Slovakia. Sustainability 2023, 15, 6926. [Google Scholar] [CrossRef]
  11. Wilking, F.; Schleich, B.; Wartzack, S. Digital twins-definitions, classes and business scenarios for different industry sectors. Proc. Des. Soc. 2021, 1, 1293–1302. [Google Scholar] [CrossRef]
  12. Kharche, V. Digital Twin Maturity. LinkedIn Pulse. 2022. Available online: https://www.linkedin.com/pulse/digital-twin-maturity-vinayak-kharche/ (accessed on 7 July 2024).
  13. Rao, A. Evolution of Digital Twins. Open Data Science Conference, May 2022. Available online: https://odsc.com/blog/evolution-of-digital-twins/ (accessed on 12 July 2024).
  14. Segovia, M.; Garcia-Alfaro, J. Design, Modeling and Implementation of Digital Twins. Sensors 2022, 22, 5396. [Google Scholar] [CrossRef]
  15. Stanford-Clark, A.; Frank-Schultz, E.; Harris, M. What are Digital Twins? IBM Developer. 2019. Available online: https://developer.ibm.com/articles/digital-twins-and-the-internet-of-things/ (accessed on 7 July 2024).
  16. Grieves, M. Digital twin: Manufacturing Excellence Through Virtual factory Replication. White Pap. 2014, 1, 1–7. Available online: https://www.researchgate.net/publication/275211047_Digital_Twin_Manufacturing_Excellence_through_Virtual_Factory_Replication (accessed on 7 July 2024).
  17. Miskins, C. How Digital Twin Will be Utilized to Create Smart Cities. 2019. Available online: https://www.challenge.org/insights/digital-twins-and-smart-cities/ (accessed on 7 July 2024).
  18. Ibrahim, M.; Rjabtšikov, V.; Gilbert, R. Overview of Digital Twin Platforms for EV Applications. Sensors 2023, 23, 1414. [Google Scholar] [CrossRef] [PubMed]
  19. Kubasakova, I.; Kubanova, J.; Benco, D.; Kadlecová, D. Implementation of Automated Guided Vehicles for the Automation of Selected Processes and Elimination of Collisions between Handling Equipment and Humans in the Warehouse. Sensors 2024, 24, 1029. [Google Scholar] [CrossRef]
  20. Deleu, R. The Digital Twin Maturity Model. 2023. Available online: https://di-phy-innovations.com/#what-is-digital-twin (accessed on 7 July 2024).
  21. Simak, L. Crisis Management in Public Administration, 2nd ed.; FSI UNIZA: Žilina, Slovakia, 2016; p. 263. [Google Scholar]
  22. Empl, P.; Pernul, G. Digital-Twin-Based Security Analytics for the Internet of Things. Information 2023, 14, 95. [Google Scholar] [CrossRef]
  23. Nofal, O.M.; De Lindt, J.W.V. Understanding Flood Risk in the Context of Community Resilience Modeling for the Built Environment: Research Needs and Trends. Sustain. Resilient Infrastruct. 2020, 7, 171–178. [Google Scholar] [CrossRef]
  24. Hu, W.; Fang, J.; Zhang, T.; Liu, Z.; Tan, J. A New Quantitative Digital Twin Maturity Model for High-End Equipment. J. Manuf. Syst. 2023, 66, 248–259. [Google Scholar] [CrossRef]
  25. Klar, R.; Arvidsson, N.; Angelakis, V. Digital Twins’ Maturity: The Need for Interoperability. IEEE Syst. J. 2024, 18, 713–724. [Google Scholar] [CrossRef]
  26. Chen, L.; Xie, X.; Lu, Q.; Parlikad, A.K.; Pitt, M.; Yang, J. Gemini Principles-Based Digital Twin Maturity Model for Asset Management. Sustainability 2021, 13, 8224. [Google Scholar] [CrossRef]
  27. Li, T.; Rui, Y.; Zhao, S.; Zhang, Y.; Zhu, H.; Li, X. A Quantitative Digital Twin Maturity Model for Underground Infrastructure Based on D-ANP. Tunn. Undergr. Space Technol. 2024, 146, 105612. [Google Scholar] [CrossRef]
  28. Liu, X.; Li, G.; Xiang, F.; Tao, B.; Jiang, G. Expert Opinion Aggregation-Based Decision Support for Human-Robot Collaboration Digital Twin Maturity Assessment. J. Ind. Inf. Integr. 2024, 42, 100710. [Google Scholar] [CrossRef]
  29. Wei, Y.; Lei, Z.; Altaf, S. An Off-Site Construction Digital Twin Assessment Framework Using Wood Panelized Construction as a Case Study. Buildings 2022, 12, 566. [Google Scholar] [CrossRef]
  30. Masoumi, H.; Shirowzhan, S.; Eskandarpour, P.; Pettit, C.J. City Digital Twins: Their Maturity Level and Differentiation from 3D City Models. Big Earth Data 2023, 7, 1–36. [Google Scholar] [CrossRef]
  31. Rápalo, L.M.C.; Gomes, M.N., Jr.; Mendiondo, E.M. Developing an open-source flood forecasting system adapted to data-scarce regions: A digital twin coupled with hydrologic-hydrodynamic simulations. J. Hydrol. 2024, 644, 131929. [Google Scholar] [CrossRef]
  32. Hlal, M.; Baraka Munyaka, J.-C.; Chenal, J.; Azmi, R.; Diop, E.B.; Bounabi, M.; Ebnou Abdem, S.A.; Almouctar, M.A.S.; Adraoui, M. Digital Twin Technology for Urban Flood Risk Management: A Systematic Review of Remote Sensing Applications and Early Warning Systems. Remote Sens. 2025, 17, 3104. [Google Scholar] [CrossRef]
  33. Kaynak, S.; Kaynak, B.; Mermer, O.; Demir, I. City-Scale Digital Twin Framework for Flood Impact Analysis: Integrating Urban Infrastructure and Real-Time Data Analytics. Urban Clim. 2025, 64, 102640. [Google Scholar] [CrossRef]
  34. Kim, Y.; Oh, J.; Bartos, M. Stormwater Digital Twin with Online Quality Control Detects Urban Flood Hazards under Uncertainty. Sustain. Cities Soc. 2025, 118, 105982. [Google Scholar] [CrossRef]
  35. Green, A.C.; Lewis, E.; Tong, X.; Wardle, R. A Framework for Incorporating Rainfall Data into a Flooding Digital Twin. J. Hydrol. 2025, 656, 132893. [Google Scholar] [CrossRef]
  36. Ge, C.; Qin, S. Urban Flooding Digital Twin System Framework. Syst. Sci. Control Eng. 2025, 13, 2460432. [Google Scholar] [CrossRef]
  37. Chen, C.; Han, Y.; Galinski, A.; Calle, C.; Carney, J.; Ye, X.; van Westen, C. Integrating Urban Digital Twin with Cloud-Based Geospatial Dashboard for Coastal Resilience Planning: A Case Study in Florida. J. Plan. Educ. Res. 2025, 45, 743–759. [Google Scholar] [CrossRef]
  38. Henriksen, H.J.; Schneider, R.; Koch, J.; Ondracek, M.; Troldborg, L.; Seidenfaden, I.K.; Kragh, S.J.; Bøgh, E.; Stisen, S. A New Digital Twin for Climate Change Adaptation, Water Management, and Disaster Risk Reduction (HIP Digital Twin). Water 2023, 15, 25. [Google Scholar] [CrossRef]
  39. Pal, D.; Marttila, H.; Ala-Aho, P.; Lotsari, E.; Ronkanen, A.K.; Gonzales-Inca, C.; Croghan, D.; Korppoo, M.; Kämäri, M.; van Rooijen, E.; et al. Blueprint Conceptualization for a River Basin’s Digital Twin. Hydrol. Res. 2025, 56, 197–212. [Google Scholar] [CrossRef]
  40. Ghorbani Bam, P.; Rezaei, N.; Roubanis, A.; Austin, D.; Austin, E.; Tarroja, B.; Takacs, I.; Villez, K.; Rosso, D. Digital Twin Applications in the Water Sector: A Review. Water 2025, 17, 2957. [Google Scholar] [CrossRef]
  41. Fan, C.; Hou, J.; Li, X.; Song, G.; Yang, Y.; Liang, X.; Zhou, Q.; Imran, M.; Chen, G.; Wang, Z.; et al. Efficient Urban Flood Control and Drainage Management Framework Based on Digital Twin Technology and Optimization Scheduling Algorithm. Water Res. 2025, 282, 123711. [Google Scholar] [CrossRef]
  42. Fadmastuti, M.; Nowak, D.; Crompvoets, J. Flood Data Platform Governance: Identifying the Technological and Socio-Technical Approach(es) Differences. Environ. Sci. Policy 2024, 162, 103938. [Google Scholar] [CrossRef]
  43. Bartos, M.; Kerkez, B. Pipedream: An Interactive Digital Twin Model for Natural and Urban Drainage Systems. Environ. Model. Softw. 2021, 144, 105120. [Google Scholar] [CrossRef]
  44. Ghaith, M.; Yosri, A.; El-Dakhakhni, W. Synchronization-Enhanced Deep Learning Early Flood Risk Predictions: The Core of Data-Driven City Digital Twins for Climate Resilience Planning. Water 2022, 14, 3619. [Google Scholar] [CrossRef]
  45. Bakhtiari, V.; Piadeh, F.; Chen, A.S.; Behzadian, K. Stakeholder Analysis in the Application of Cutting-Edge Digital Visualisation Technologies for Urban Flood Risk Management: A Critical Review. Expert Syst. Appl. 2024, 236, 121426. [Google Scholar] [CrossRef]
  46. Bhanye, J. Flood-tech Frontiers: Smart but Just? A Systematic Review of AI-Driven Urban Flood Adaptation and Associated Governance Challenges. Discover Glob. Soc. 2025, 3, 59. [Google Scholar] [CrossRef]
  47. Bitencourt, J.; Osho, J.; Wooley, A.; Harris, G. Do You Trust Digital Twins? A Framework to Support the Development of Trusted Digital Twins through Verification and Validation. Int. J. Prod. Res. 2025, 1–21. [Google Scholar] [CrossRef]
  48. Expósito, A.; Díez Cebollero, E. How the Digital Revolution Is Reshaping Water Management and Policy: A Focus on Spain. Util. Policy 2025, 96, 102020. [Google Scholar] [CrossRef]
  49. Ersan, M.; Irmák, E. Development and Integration of a Digital Twin Model for a Real Hydroelectric Power Plant. Sensors 2024, 24, 4174. [Google Scholar] [CrossRef]
  50. Somanath, S.; Naserentin, V.; Eleftheriou, O.; Sjölie, D.; Wästberg, B.S.; Logg, A. Towards Urban Digital Twins: A Workflow for Procedural Visualization Using Geospatial Data. Remote Sens. 2024, 16, 1939. [Google Scholar] [CrossRef]
  51. Cai, Z.; Wang, Y.; Zhang, D.; Wen, L.; Liu, H.; Xiong, Z.; Wajid, K.; Feng, R. Digital Twin Modeling for Hydropower System Based on Radio Frequency Identification Data Collection. Electronics 2024, 13, 2576. [Google Scholar] [CrossRef]
  52. Zhu, J.; Sun, B.; Jia, L.; Hu, H. From Sensing Technology towards Digital Twin in Applications. Inventions 2024, 9, 43. [Google Scholar] [CrossRef]
  53. Lovecek, T.; Ristvej, J. Quantitative Assessment Parameters of the Protection Level of National Strategic Sites in the EU. In Risk Analysis VII: Simulation and Hazard Mitigation; WIT Transactions on Information and Communication Technologies; WIT Press: Southampton, UK, 2010; Volume 43, pp. PI69–PI80. [Google Scholar] [CrossRef]
  54. Hu, W.; Zhang, T.; Deng, X.; Liu, Z.; Tan, J. Digital Twin: A State-of-the-Art Review of Its Enabling Technologies, Applications and Challenges. JIMSE 2021, 2, 1–34. [Google Scholar] [CrossRef]
  55. Botín-Sanabria, D.M.; Mihaita, A.-S.; Peimbert-García, R.E.; Ramírez-Moreno, M.A.; Ramírez-Mendoza, R.A.; Lozoya-Santos, J.D.J. Digital Twin Technology Challenges and Applications: A Comprehensive Review. Remote Sens. 2022, 14, 1335. [Google Scholar] [CrossRef]
  56. Chovanec, D.; Kollár, B.; Halúsková, B.; Kubás, J.; Pawęska, M.; Ristvej, J. A Component-Based Approach to Early Warning Systems: A Theoretical Model. Appl. Sci. 2025, 15, 3218. [Google Scholar] [CrossRef]
  57. Kubás, J.; Bugánová, K.; Polorecká, M.; Petrlová, K.; Stolínová, A. Citizens’ Preparedness to Deal with Emergencies as an Important Component of Civil Protection. IJERPH 2022, 19, 830. [Google Scholar] [CrossRef]
  58. Holla, K. Complex model for risk assessment of industrial processes. IDRiM J. 2014, 4, 93–102. Available online: https://www.idrimjournal.com/api/v1/articles/11687-complex-model-for-risk-assessment-of-industrial-processes.pdf (accessed on 7 July 2024). [CrossRef]
  59. Ristvej, J.; Sokolová, Ľ.; Staračková, J.; Ondrejka, R.; Lacinák, M. Experiences with Implementation of Information Systems within Preparation to Deal with Crisis Situations in Terms of Crisis Management and Building Resilience in the Slovak Republic. In Proceedings of the 2017 International Carnahan Conference on Security Technology (ICCST), Madrid, Spain, 2–5 October 2017; pp. 1–6. [Google Scholar] [CrossRef]
Figure 1. Overview of the different types of DT maturity levels [1,11,12,13].
Figure 2. Architecture of the flood DT for crisis management (CM).
Table 1. Phases of digital twin maturity levels.
| Level 0 | Level 1 | Level 2 | Level 3 | Level 4 | Level 5 | Level 6 |
| Preparatory and initialization phase | Conceptualisation phase | Development, descriptive phase | Integrative diagnostic phase | Predictive, analytical phase | Optimisation phase | Autonomous phase |
| Proposal | Data: a set of values that represent the properties or state of an asset | Information: structuring of data into a coherent, organised format | Knowledge: processing of structured data using previous experience | Understanding: scenario analysis through case studies | Insight: optimising and developing new relationships | Wisdom: feedback and system control based on understanding |

Guiding questions: Know what? (understanding the need); Know how? (understanding relationships); Appreciate why? (understanding the rules); Know why? (understanding the principles).
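The level-to-phase mapping of Table 1 can be encoded as a small lookup structure. This is a minimal illustrative sketch: the phase names and DIKW-style labels come from the table, but the class and function names are our own and are not part of the published framework.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class MaturityLevel:
    level: int        # 0..6, as in Table 1
    phase: str        # phase name from Table 1
    dikw_stage: str   # Proposal -> Data -> ... -> Wisdom ladder

MATURITY_LEVELS = [
    MaturityLevel(0, "Preparatory and initialization phase", "Proposal"),
    MaturityLevel(1, "Conceptualisation phase", "Data"),
    MaturityLevel(2, "Development, descriptive phase", "Information"),
    MaturityLevel(3, "Integrative diagnostic phase", "Knowledge"),
    MaturityLevel(4, "Predictive, analytical phase", "Understanding"),
    MaturityLevel(5, "Optimisation phase", "Insight"),
    MaturityLevel(6, "Autonomous phase", "Wisdom"),
]

def stage_for(level: int) -> str:
    """Return the DIKW-style label for a given maturity level."""
    return MATURITY_LEVELS[level].dikw_stage
```

Such an encoding makes the ordering explicit and lets downstream tooling (e.g., an assessment dashboard) refer to levels by number while reporting the phase name.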
Table 2. Definition of criteria for literature assessment.
| Code | Criterion (short label) | Definition | Score 0 (absent) | Score 1 (partial) | Score 2 (explicit/strong) |
| C1 | Purpose and domain | Whether the model defines the target domain (e.g., industry, smart cities, hazards) and intended decision context. | Domain/purpose not stated. | Domain stated, decision context unclear. | Domain + decision context clearly defined. |
| C2 | Unit of analysis and scope | What the model assesses: a single asset, a single system, or a city/system. | Asset-only or scope unclear. | System-level (single system). | System-of-systems/city. |
| C3 | Stage logic and boundaries | Whether stages have clear "entry/exit" conditions (what must be true to claim a stage). | Labels only; no boundaries. | Some boundaries mentioned. | Boundary conditions explicitly defined by stage. |
| C4 | Data integration and timeliness | Whether maturity includes progression from static data to periodic updates to near-real-time/real-time streams. | Data timeliness not addressed. | Periodic updates mentioned. | Real-time/streaming addressed with integration guidance. |
| C5 | Modelling and simulation integration | How modelling capability evolves (visualization only vs. scenario/physics-based/hybrid simulation). | No modelling (visual-only). | Modelling mentioned but not staged. | Explicit staged modelling integration (e.g., calibration, coupling). |
| C6 | Prediction and scenario capability | Whether forecasting and "what-if" scenario capability is defined as maturity increases. | Absent. | Mentioned without staged capability. | Explicit staged predictive/scenario progression. |
| C7 | Validation and uncertainty | Whether verification/validation and uncertainty management are required as maturity increases. | Absent. | Qualitative mention only. | Explicit V&V and uncertainty handling practices. |
| C8 | Decision-loop and automation | Whether the DT maturity includes how outputs influence decisions (human-in-loop vs. automation). | No decision loop. | Decision support claimed, not staged. | Explicit staged automation/decision-loop integration. |
| C9 | Interoperability and standards | Whether maturity includes interoperability across tools/organisations (APIs, standards). | Absent. | Mentioned as a general need. | Explicit interoperability dimension or staged requirements. |
| C10 | Governance and accountability | Whether maturity includes roles, responsibilities, data stewardship, and coordination mechanisms. | Absent. | Stakeholders mentioned, no structure. | Explicit governance/accountability requirements by maturity. |
| C11 | Operationalization (measurable indicators) | Whether the model provides measurable indicators, scoring, or an assessment instrument (not only guiding questions). | No indicators/tools. | Guiding questions/checklists only. | Indicators + scoring/rubric/diagnostic tool provided. |
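The C1–C11 rubric of Table 2 can be operationalised as a simple scorer: each criterion is rated 0 (absent), 1 (partial), or 2 (explicit/strong), giving a 22-point maximum. The following sketch is our own illustration; the paper defines only the criteria and score anchors, not this scoring function or its names.

```python
# The eleven criterion codes from Table 2.
CRITERIA = [f"C{i}" for i in range(1, 12)]

def rubric_score(ratings: dict) -> tuple:
    """Sum 0/1/2 ratings over C1-C11; missing criteria count as 0 (absent).

    Returns (total, coverage), where coverage is the fraction of the
    22-point maximum achieved.
    """
    for code, value in ratings.items():
        if code not in CRITERIA:
            raise ValueError(f"unknown criterion {code}")
        if value not in (0, 1, 2):
            raise ValueError(f"rating for {code} must be 0, 1 or 2")
    total = sum(ratings.get(c, 0) for c in CRITERIA)
    return total, total / (2 * len(CRITERIA))

# Example: a model strong on purpose and scope, partial on stage logic
# and data timeliness, silent on everything else.
score, coverage = rubric_score({"C1": 2, "C2": 2, "C3": 1, "C4": 1})
```

Treating unrated criteria as 0 matches the rubric's "absent" anchor and makes partially assessed models directly comparable.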
Table 4. Assessment-grade phases of flood digital twin maturity levels.
Level 0
- Core objective: Define the flood decision context and confirm institutional and data readiness for a flood digital twin.
- Entry conditions: Flood mechanism and decision context defined (preparedness/response/recovery); unit of analysis defined; stakeholders and responsible organisations identified; initial inventory of baseline datasets and monitoring assets completed; access constraints documented.
- Exit criteria: Concept and staged implementation plan approved; governance roles assigned; minimum baseline dataset agreed; data-sharing pathways defined for core datasets.
- Measurable indicators (e.g.): Use case defined; stakeholder map completeness; baseline data inventory coverage; datasets with defined access pathway; governance roles assigned.
- Evidence artifacts: Concept note; stakeholder map; requirements document; initial architecture sketch; baseline data register; monitoring asset inventory; draft data-sharing notes/SOP outline.
- Supporting literature: [32,42,45,46,48]

Level 1
- Core objective: Build a coherent static flood-relevant representation to support assimilation, modelling, and validation.
- Entry conditions: Terrain model assembled; hydrography and drainage/river network compiled; land cover layer prepared; exposure layers prepared (buildings, critical infrastructure); metadata and coordinate harmonisation completed.
- Exit criteria: Baseline validated for completeness and spatial consistency; authoritative baseline and update responsibilities defined; integration plan for dynamic observations specified.
- Measurable indicators (e.g.): DEM resolution/vertical accuracy class documented; network completeness; exposure coverage (buildings/critical assets mapped); metadata completeness.
- Evidence artifacts: Versioned geodatabase/GIS project; metadata catalogue; data dictionary; baseline QA report; lineage/provenance documentation; baseline flood-relevant maps.
- Supporting literature: [5,32,36,37,38,39]

Level 2
- Core objective: Establish systematic ingestion and descriptive monitoring.
- Entry conditions: Operational pipeline exists for rainfall and water level/discharge; dashboards or maps exist; event logging initiated.
- Exit criteria: Periodic updates reliable; basic automated QA implemented (range checks, missingness, sensor status); operational KPIs defined and reviewed.
- Measurable indicators (e.g.): Update frequency; ingestion success rate; data completeness; sensor uptime; observations passing QA; median end-to-end latency.
- Evidence artifacts: Ingestion configuration; pipeline logs; QA rule set and reports; dashboard exports; KPI definitions; event log repository.
- Supporting literature: [34,35,43,44]

Level 3
- Core objective: Synchronise multi-source observations with low latency and add diagnostics and calibration/assimilation routines.
- Entry conditions: Multi-source integration implemented (e.g., gauges, radar/satellite products); synchronisation rules defined; anomaly/fault detection enabled; diagnostic analytics available.
- Exit criteria: Near-real-time linkage demonstrated (latency target met); diagnostic routines tested on historical events; calibration/assimilation routines established (e.g., rainfall blending, rating curves, drainage parameters).
- Measurable indicators (e.g.): End-to-end latency; fused sources; anomaly detection coverage; calibration/assimilation frequency; documented data lineage.
- Evidence artifacts: Data fusion workflow; synchronisation logs; anomaly detection documentation; calibration/assimilation reports; replay diagnostic cases; lineage documentation.
- Supporting literature: [34,35,41,44]

Level 4
- Core objective: Provide validated forecasts and scenario exploration with explicit uncertainty communication for early warning.
- Entry conditions: Predictive model coupled; validation plan defined (events, metrics); scenario set defined; uncertainty method selected.
- Exit criteria: Forecast skill demonstrated on observed events; uncertainty outputs published with forecasts; scenario library maintained and updated; drift/performance monitoring initiated; warning thresholds linked to outputs.
- Measurable indicators (e.g.): Forecast horizon; skill metrics (e.g., hit rate/false alarm rate); forecasts with uncertainty bounds; time to update forecast after new data; scenario library size and refresh cadence.
- Evidence artifacts: Model documentation; calibration/validation reports; uncertainty specification; scenario library; forecast dashboards; warning threshold rationale; post-event evaluation summaries; drift reports.
- Supporting literature: [31,32,34,35,41,42,43,44]

Level 5
- Core objective: Generate actionable recommendations and support response/operations planning.
- Entry conditions: Decision objectives defined (risk reduction, service continuity, safety); constraints encoded (capacity, regulations); prescriptive method implemented (rules/optimisation); decision workflow agreed.
- Exit criteria: Recommendations tested in exercises or events; workflow institutionalised; measurable benefit demonstrated; continuous improvement loop defined.
- Measurable indicators (e.g.): Recommendation generation time; adoption rate; benefit metrics (e.g., response time reduction); constraint coverage; decision traceability (% of recommendations with justification).
- Evidence artifacts: Decision SOP; prescriptive/optimisation model specification; recommendation logs; exercise/after-action reports; benefit evaluation report; change log for updates.
- Supporting literature: [5,33,41,45,48]

Level 6
- Core objective: Enable partial automation/closed-loop control with human override, auditability, interoperability, and accountability.
- Entry conditions: Actuation pathways exist (e.g., pump/gate scheduling, dynamic alerts); override protocol defined; audit trail implemented; interoperability interfaces and access controls operational; governance agreed.
- Exit criteria: Continuous monitoring and drift controls demonstrated; interoperability validated across organisations; audit and accountability mechanisms functioning for semi-automated actions; performance monitored.
- Measurable indicators (e.g.): Automated actions (% with override); override response time; audit completeness (% of actions traceable); interoperability uptime/downtime during events.
- Evidence artifacts: Control and audit logs; override playbook; interface/API specifications; access control policy; inter-agency agreements; continuous monitoring dashboards; drift reports; periodic audit summaries.
- Supporting literature: [5,33,37,42,44,45,46,47,48]
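The stage-gate logic of Table 4 can be sketched as a check that a flood DT claims level N only when the exit criteria of every level up to N are satisfied. The indicator names and thresholds below are invented examples standing in for the table's qualitative criteria, not the paper's exact KPIs.

```python
# One illustrative exit check per level; a real assessment would evaluate
# the full criterion sets of Table 4, not a single proxy indicator.
EXIT_CHECKS = {
    0: lambda kpi: kpi.get("governance_roles_assigned", False),
    1: lambda kpi: kpi.get("baseline_validated", False),
    2: lambda kpi: kpi.get("ingestion_success_rate", 0.0) >= 0.95,
    3: lambda kpi: kpi.get("end_to_end_latency_min", float("inf")) <= 15,
    4: lambda kpi: kpi.get("forecasts_with_uncertainty", 0.0) >= 0.9,
    5: lambda kpi: kpi.get("recommendation_adoption_rate", 0.0) > 0.0,
    6: lambda kpi: kpi.get("audit_completeness", 0.0) >= 0.99,
}

def achieved_level(kpi: dict) -> int:
    """Highest level whose exit criteria, and those of all lower levels, hold.

    Returns -1 when even the Level 0 exit criteria are not met.
    """
    level = -1
    for n in range(7):
        if not EXIT_CHECKS[n](kpi):
            break
        level = n
    return level
```

The "no skipped levels" rule reflects the table's structure: predictive capability (Level 4) is meaningless without the validated baseline and ingestion pipelines of Levels 1–2.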
Table 5. The data layers of the flood digital twin.
Meteorological data
- Types of data: meteorological data and information: precipitation, air temperature, air pressure, humidity, wind, snow, air quality, historical data.
- Possibility of obtaining data: use of data sources from existing stations, meteorological forecasts, barometers, rain gauges, anemometers, meteorological radars, meteorological maps, meteorological satellites.

Hydrological data
- Types of data: hydrological data and information: water level, flow, inflow, outflow, water temperature, ice phenomena, water quality, historical data.
- Possibility of obtaining data: use of data sources from existing stations, hydrological forecasts, hydrological maps, satellite and aerial imagery, radar, sensing equipment, status of water structures.

Infrastructure data
- Types of data: data and information on the layout of constructed utility infrastructures: transport network (road, rail, water, bridges, ports), energy network, communication network.
- Possibility of obtaining data: existing documentation and records, maps, satellite images, traffic maps, GPS.

Building construction data
- Types of data: data and information on built-up areas: residential houses, hospitals, schools, social care facilities, stadiums, parks, industrial plants, municipal/district offices.
- Possibility of obtaining data: existing cadastral maps, plan records, databases, urbanisation drawings, laser scanners, photogrammetry, GPS, existing 2D/3D visualizations.

Geomorphological and land cover data
- Types of data: data and field information: digital elevation models (DEM: elevation), distribution and size of river areas, relief features, slopes, valleys, wetlands, forests, vegetation.
- Possibility of obtaining data: laser scanning (LiDAR), satellite imagery, aerial imagery, GPS, integration of existing topographic maps.

Demographic, economic and industry data
- Types of data: data and information on the demographic and economic distribution of the territory: demographic distribution and population density of the selected area, delineation of industrial zones, economic activity.
- Possibility of obtaining data: regularly updated existing records, statistical surveys, map documents.

Underground infrastructure data
- Types of data: data and information on the location of constructed subsurface utility infrastructures: water pipes, sewers, gas pipes, optical and telecommunication networks, drainage facilities.
- Possibility of obtaining data: existing documentation and map records of subsurface utilities, electromagnetic induction equipment, radars, magnetometers, laser scanning, thermographs.

Geophysical data of subsurface structures
- Types of data: data and information on subsurface structures, soil types and groundwater: soil maps (soil physical properties, structure composition), geological maps (rock types, permeability, groundwater presence and quality, landslides), hydrological groundwater maps, historical data.
- Possibility of obtaining data: integration of existing map data, sensors, seismographs, GPR, probes, GPS, historical records.
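The data layers of Table 5 can be captured in a machine-readable registry, which a flood DT platform would use to track what each layer contains and where it comes from. This is a hedged sketch covering three representative categories; the `update` field is our own addition to distinguish streaming observations from largely static baseline layers, and is not part of the published table.

```python
# Illustrative excerpt of a flood-DT data-layer registry (Table 5).
FLOOD_DT_LAYERS = {
    "meteorological": {
        "types": ["precipitation", "air temperature", "air pressure",
                  "humidity", "wind", "snow", "air quality"],
        "sources": ["existing stations", "meteorological forecasts",
                    "rain gauges", "anemometers", "meteorological radars",
                    "meteorological satellites"],
        "update": "streaming",  # assumption: continuous ingestion needed
    },
    "hydrological": {
        "types": ["water level", "flow", "inflow", "outflow",
                  "water temperature", "ice phenomena", "water quality"],
        "sources": ["existing stations", "hydrological forecasts",
                    "satellite and aerial imagery", "radar"],
        "update": "streaming",
    },
    "geomorphological_land_cover": {
        "types": ["digital elevation model", "river areas", "relief",
                  "slopes", "valleys", "wetlands", "forests", "vegetation"],
        "sources": ["LiDAR", "satellite imagery", "aerial imagery",
                    "topographic maps"],
        "update": "static",  # assumption: periodic re-survey, not streaming
    },
}

def streaming_layers(registry: dict) -> list:
    """Layers requiring continuous ingestion rather than periodic refresh."""
    return [name for name, spec in registry.items()
            if spec["update"] == "streaming"]
```

Separating streaming from static layers mirrors the maturity progression of Table 4: Levels 0–1 assemble the static baseline, while Levels 2–3 add the streaming pipelines.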
Table 6. Leading institutions of the current organisation and management of flood risk in the Slovak Republic.
| Institution | Administrative level | Flood-risk function | Key datasets relevant to flood digital twin |
| Ministry of Environment of the Slovak Republic | Central state administration | National flood policy and governance; coordination | Policy and regulatory framework; strategic plans; national reporting and coordination rules |
| Slovak Water Management Enterprise | District offices | River basin operation; hydraulic structures; flood protection measures | River network and hydraulic structure inventories; operational regimes; maintenance and intervention records |
| Water Research Institute | Public research institute | Methods support; modelling/analysis support | Modelling methods; data quality guidance; analytical studies |
| Slovak Hydrometeorological Institute | Central specialised agency | Hydrometeorological monitoring; forecasting; warning information | Rainfall observations/products; hydrological monitoring; forecasts; warning thresholds and bulletins |
| Water Management Construction | State enterprise/contractor | Construction and maintenance of flood protection infrastructure | Flood protection infrastructure records; works/upgrade documentation |
| Ministry of Agriculture and Rural Development of the Slovak Republic | Central state administration | Land and water management policy influencing runoff/drainage | Land management and water/soil policy context; agriculture-related runoff/drainage measures |
| Forests of the Slovak Republic | State organisation | Forest management affecting runoff/erosion and retention | Forest management layers; interventions; land-cover updates relevant to hydrology |
| Forests and Estates Ulič | Sectoral/local organisation | Local forest estate management | Local forestry management data relevant to runoff/erosion |
| Hydromelioration | State organisation | Drainage/melioration infrastructure operation | Drainage/irrigation networks; asset status; operational records |
| National Parks | Protected area authority | Protected-area land management constraints | Protected-area land management and restrictions affecting measures |
| Ministry of the Interior of the Slovak Republic | Central state administration | Preparedness; crisis planning; civil protection coordination; response procedures | Emergency and response plans; preparedness procedures; operational coordination records |
| Integrated Rescue System (MoI SR) | Central operational coordination | Dispatch/coordination for emergency response; multi-agency operational coordination | Incident handling and coordination workflows; operational communication/dispatch information |
| District offices (regional) | Local state administration | State coordination; enforcement; regional situational reporting | Regional situation reporting; coordination outputs; directives |
| District offices (district) | Local state administration | Local incident coordination and response support | Event logs (local); local coordination actions; reporting; local crisis plans |
| Higher territorial units | Regional self-government | Regional planning; resource coordination | Regional plans; resource allocation |
| Municipalities | Local self-government | Local measures; local asset management; citizen communication | Local exposure layers; local reporting; local infrastructure actions |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Ristvej, J.; Halúsková, B.; Nováková, K.; Chovanec, D. Implementation Maturity Levels of Digital Twin Technology and Data Content Design for Flood Digital Twin. Smart Cities 2026, 9, 28. https://doi.org/10.3390/smartcities9020028

