Article

An Immersive Hydroinformatics Framework with Extended Reality for Enhanced Visualization and Simulation of Hydrologic Data

by Uditha Herath Mudiyanselage 1,2, Eveline Landes Gonzalez 1,2, Yusuf Sermet 2,* and Ibrahim Demir 3,4

1 Computer Science, University of Iowa, Iowa City, IA 52242, USA
2 IIHR Hydroscience & Engineering, University of Iowa, Iowa City, IA 52242, USA
3 River-Coastal Science and Engineering, Tulane University, New Orleans, LA 70118, USA
4 ByWater Institute, Tulane University, New Orleans, LA 70118, USA
* Author to whom correspondence should be addressed.
Appl. Sci. 2025, 15(10), 5278; https://doi.org/10.3390/app15105278
Submission received: 13 March 2025 / Revised: 18 April 2025 / Accepted: 27 April 2025 / Published: 9 May 2025
(This article belongs to the Special Issue AI-Enhanced 4D Geospatial Monitoring for Healthy and Resilient Cities)

Abstract:
This study introduces a novel framework that applies extended reality (XR) systems to hydrology, focusing in particular on immersive visualization of hydrologic data for enhanced environmental planning and decision making. The study details the shift from traditional 2D data visualization methods in hydrology to more advanced XR technologies, including virtual and augmented reality. Unlike static 2D maps or charts that require cross-referencing disparate data sources, this system consolidates real-time, multivariate datasets, such as streamflow, precipitation, and terrain, into a single interactive, spatially contextualized 3D environment. Immersive information systems facilitate dynamic interaction with real-time hydrological and meteorological datasets for various stakeholders and use cases, and pave the way for metaverse and digital twin systems. This system, accessible via web browsers and XR devices, allows users to navigate a 3D representation of the continental United States. The paper presents the methodology and system architecture, addresses current limitations in hydrological visualization, and discusses the challenges and future directions for extending the framework's applicability to a wider range of environmental management and disaster response scenarios. Future application potential includes climate resilience planning, immersive disaster preparedness training, and public education, where stakeholders can explore scenario-based outcomes within a virtual space to support real-time or anticipatory decision making.

1. Introduction

Historically, the representation and interpretation of spatial and environmental data have been predominantly grounded in 2D visualization practices [1]. These methods, commonly involving static graphical displays such as charts, maps, or plots, formed the foundation for data communication and analysis. By leveraging visual elements like color, shape, and layout, these 2D visualizations enabled researchers and analysts to identify patterns, detect changes, and derive insights from complex datasets, especially when presented on flat surfaces like paper or screens [2,3]. With the advancement of computational technologies and increasing research demands, hydrological studies transitioned from empirical and qualitative approaches to the application of physical and statistical models, enabling more rigorous and quantitative analysis of hydrological processes [4].
It is noteworthy that as computational methodologies gained prominence, the evolution of data visualization techniques was also significantly influenced by technological advancements [5]. An illuminating review of this transformation can be found in the work of Ladson et al. [6], which not only elucidates the process of creating such visualizations but also proposes avenues for enhancement. Despite the prevalence of graphical methods in the field, a paradigm shift occurred with the ascendancy of data-driven analyses [7].
Hydrological data is inherently complex, characterized by multifaceted attributes, including multivariate properties, temporal dependencies, and diverse origins spanning in situ observations, models, and simulations [8]. Compounded by ever-growing volumes of collected data, this intricate nature has presented formidable challenges to researchers and decision-makers [8]. The voluminous and multivariate nature of these datasets has made the extraction of critical insights a daunting task, thus hindering the potential to apply the data effectively [9].
In the realm of environmental management and planning, hydrological data plays a critical role in supporting intricate decision-making processes, with a pronounced emphasis on the social dimensions of water resource management [10]. The social dimension refers to the interaction between the water environment and human society, including how public attitudes, behaviors, and online activity inform and respond to water-related challenges [11]. A comprehensive hydroinformatics framework necessitates the engagement of a diverse set of stakeholders, including water utility operators, urban planners, researchers, technology providers, policy makers, and citizens, each contributing unique perspectives and benefiting from enhanced decision making, transparency, and system resilience [12]. Given this critical focus on comprehension and stakeholder engagement, effective representation of data assumes paramount significance in this field.
This study introduces an innovative framework utilizing extended reality (XR) to transform the visualization of hydrologic data in environmental planning and decision making. This framework leverages the Unity Engine and Mapbox Maps Software Development Kit (SDK) version 2.1.1 as the base and WebXR exporter and Virtual Reality Tool Kit (VRTK) SDK for dynamic interaction with real-time hydrological and meteorological data. This system, accessible via web browsers and VR devices, enables users to navigate a 3D representation of the continental United States, interactively engaging with detailed environmental data from federal agencies like the United States Geological Survey (USGS) and the National Oceanic and Atmospheric Administration (NOAA). By integrating these diverse, spatially referenced data streams into a unified XR interface, the system ensures the relevance, accessibility, and contextualization of hydrological information for a variety of user groups.
The project represents a significant departure from conventional 2D visualization paradigms in hydrology, offering a more intuitive, immersive, and interactive experience. By leveraging XR, the framework elevates users’ ability to comprehend spatiotemporal dynamics, such as changes in stream discharge, precipitation, or terrain conditions, through visually enriched environments. These enhancements promote deeper insight into hydrological phenomena and more informed decision making, especially in scenarios where real-time situational awareness and stakeholder collaboration are critical. This is made possible through the integration of reliable, high-resolution spatial data from Mapbox and real-time hydrological and meteorological inputs from USGS and NOAA, which collectively provide an accurate and context-rich foundation for immersive visualization. XR enhances hydrological data visualization by enabling users to explore dynamic, 3D representations of environmental systems that respond in real time to user input and live data streams. The framework’s use of interactive 3D terrain, fluid simulations, and weather overlays creates a spatially and temporally accurate environment where users can observe hydrologic processes as they unfold. Immersion is achieved and maintained through visual realism such as animated rainfall and snowfall, flowing water behavior, and contextualized sensor data. It is further reinforced by intuitive navigation controls and responsive UI elements.
The remainder of this paper is organized as follows. Section 2 discusses the methods used to create the virtual environment. Section 3 presents the products of our development along with a discussion of strengths and weaknesses. Section 4 concludes the paper and highlights plans for future research.

2. Materials and Methods

2.1. Related Work

A promising avenue that has gained prominence in recent years is the adoption of XR systems. While XR systems have witnessed substantial adoption in various realms [13,14], there still remains untapped potential for diverse applications in the hydrological sciences. Creating virtual realities often necessitates the use of visualization construction tools, which, while valuable, have been critiqued for their limitations in user manipulation and control [15]. Furthermore, the utilization of visualization-based software libraries has mitigated the complexities of creating graphical visualizations, albeit at the cost of requiring a certain degree of familiarity with technological systems [15].
Augmented and virtual reality applications present an opportunity to develop fully immersive experiences to recreate real-time and historical disaster scenarios [16] in a controlled experimental environment that allows repetition. In the environmental sciences, applications of XR technology have proliferated with a strong emphasis on education. This can be exemplified by the Virtual Hydrology Observatory, which is designed to augment student learning by offering immersive field experiences from within the classroom setting [17]. Other projects extend the concept to virtual field trips, enabling users to explore geographic locations that are relevant and representative of current topics of study. Such is the case with VR4Hydro, which allows for digital tours within the catchment of the Gersprenz river in an immersive fashion [18]. XR technology is also being explored for its utility in hydrological visualizations, as evidenced by Hedley et al., who integrate augmented reality technology into the visualization of geographic and hydrological information, facilitating real-time data exploration while navigating a modeled terrain [19].
In the realm of XR technologies for environmental disaster management and response, significant strides have been made. Sermet and Demir [20] presented Flood Action VR, a virtual reality tool designed for disaster awareness and emergency training, integrating real-time weather data and gamification strategies. They further expanded their research by emphasizing the integration of XR in environmental education, disaster management, and data visualization to enhance decision support and public awareness [21].
Building on these efforts, HoloFlood was developed as an XR framework that simulates flood events and presents damage assessments and population impact visualizations through holographic projections [22]. This system laid the foundation for GeospatialVR, a generalized framework for immersive visualization of diverse environmental hazards such as floods and wildfires, designed to support environmental planning and emergency response [23].
Another notable contribution is from Demiray et al. [24], who developed FloodGame, an interactive 3D serious game on flood mitigation for disaster awareness and education. This web-based game focuses on educating users about flood mitigation strategies, using realistic 3D environments to foster community resilience and engagement. Further, XR technology has been effectively used in data visualization and situational analysis, as illustrated by Ready et al. [25], Haynes et al. [26], and Macchione et al. [27], showcasing its utility in presenting complex environmental data in an accessible format.
In immersive disaster simulation, Kawai et al. [28] introduced an AR-based tsunami evacuation drill system that challenges users to navigate disaster scenarios within time constraints. Iguchi et al. [29] developed a gamified disaster evacuation training framework, enhancing disaster response education by immersing users in various scenarios. Finally, Wang et al. [30] utilized VR to simulate floods, integrating hydrological models into a 3D environment, providing a realistic experience of flood scenarios. Recent studies have underscored the importance of user immersion in XR environments, showing that enhanced immersion can improve cognitive performance while reducing simulation-related discomfort [31]. Ensuring high immersion is particularly vital in environmental applications, where the risk of evoking uncanny or disjointed user experiences can undermine system effectiveness [32]. Immersive XR systems for hydrological data are thus increasingly being designed not only to visualize data but also to create intuitive and accessible workflows for users across disciplines.

2.2. Gaps and Limitations in Previous Approaches

While the adoption of XR technology in the field of hydrological visualization has been steadily increasing [33], the predominant focus of these endeavors has been the enhancement of educational experiences. Many of these initiatives are tailored for classroom settings and involve providing students with some means of visualizing and exploring hydrological processes. Notably, a dearth of efforts exists concerning the development of digital systems for hydrological visualization that directly cater to the support of environmental policy formulation, strategic planning, and informed decision making.
Furthermore, there appears to be a limited number of models that not only facilitate data visualization but also foster face-to-face interactions centered around the data. Augmented reality emerges as a compelling candidate for enabling such interactions [20]. AR’s unique capacity to encourage collaborative engagement surpasses the capabilities of VR enhancements, making it a particularly promising avenue [19]. Moreover, AR integrates into pre-established workflows, a characteristic that further enhances its appeal in the context of hydrological visualization [19].

2.3. Scope and Purpose

This project endeavors to enhance the immersive visualization of hydrological processes with a specific focus on the continental United States. The primary objective is to enable the visualization of hydrological phenomena and pertinent information throughout the region, thereby providing valuable insights for stakeholders involved in water resources planning. These stakeholders encompass a broad spectrum of individuals and organizations impacted by hydrological events.
While the project primarily caters to planners and decision-makers, its utility extends to a wider audience. The application dynamically generates 3D models to represent terrain and relevant infrastructure, adapting in real time as users navigate the interface to display terrain tiles visualizing the landscape of the continental United States. All 3D modeling and generation for this process are executed locally, leveraging client-side hardware resources and eliminating the need for resource-intensive server-side preprocessing. The application draws upon data from sensors maintained by the United States Geological Survey (USGS) to present real-time information on local weather conditions, stream height, and discharge. By implementing this framework, we contribute to an ongoing endeavor to revolutionize the field of hydrological visualization, facilitating collaborative planning and response to climate-related disasters.
The innovation of this research lies in its departure from static, 2D graphs and maps used in hydrology, which often separate spatial, temporal, and contextual information across multiple platforms. In contrast, this XR-based framework consolidates multivariate, real-time data into a single immersive environment, allowing users to engage with complex hydrological systems in situational context. Unlike traditional systems that rely on abstract representations and require prior domain expertise for interpretation, the proposed solution offers intuitive, spatially embedded interactions that improve accessibility, comprehension, and stakeholder engagement and positions the framework as a foundation for future applications in digital twin environments, climate resilience planning, immersive training, and public education on environmental risks.
Through the development of this application, we aim to address the following key questions: (a) What data sources are most effective for creating immersive hydrological visualizations? (b) How can XR be used to enhance hydrological data visualizations? (c) How can user immersion be created, maintained, and leveraged in data visualization? (d) What are the future prospects and potential improvements that can be made in the use of XR systems for hydrological data visualization?

2.4. Data Sources

This section explores the data sources that significantly contribute to the functionality and efficacy of the immersive hydroinformatics framework. It outlines the process of selecting appropriate map providers, detailing the criteria and considerations that informed this selection, including the evaluation of features such as data availability, geospatial calculation capabilities, and cross-platform compatibility. Furthermore, it presents an overview of the integration of real-time sensor data, explicating the methodologies employed to incorporate and visualize hydrological and meteorological data from third-party providers.

2.4.1. Review of Map Providers

In the field of interactive geospatial applications, selecting an appropriate map provider is crucial. This is because the available SDKs and libraries offer different capabilities and cater to different requirements. Providers offer a range of services, including the provision of a high-precision WGS84 globe (a global coordinate reference system that defines the Earth’s shape and location using latitude and longitude), geospatial calculations, cross-platform compatibility (such as Unity and Unreal game engines and JavaScript), open-source geospatial data standards, and data pipelines for geospatial information, such as 3D vector tiles. Moreover, they typically feature active community forums where users can find collaborative support for solving problems.
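As an illustration of the kind of geospatial calculation these SDKs provide, the great-circle distance between two WGS84 latitude/longitude points can be computed with the haversine formula. The sketch below is independent of any provider's SDK and uses a mean-Earth-radius approximation.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0088):
    """Great-circle distance between two WGS84 lat/lon points, in kilometers."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# Iowa City, IA to New Orleans, LA (roughly 1300 km)
print(round(haversine_km(41.6611, -91.5302, 29.9511, -90.0715), 1))
```

Provider SDKs wrap this kind of computation, along with projections and tile addressing, behind higher-level APIs.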
During the project’s planning phase, several map providers were evaluated by developing prototypes. The candidates, namely Cesium, Mapbox, and Google’s Photorealistic 3D Tiles, were assessed based on a range of features and services provided, such as availability of data, capability of geospatial calculations, and cross-platform compatibility.
Cesium: Cesium, a prominent map provider, extends beyond its mapping services to offer various geospatial functionalities. Notably, it has pioneered the development of a standardized format for 3D geospatial objects known as 3D vector tiles. Furthermore, Cesium Ion, an associated service, provides hosting for these vector tiles and streamlines the process of programmatically converting datasets into 3D vector tiles. Unity and Unreal developers can visualize these datasets through Cesium’s dedicated SDK. Additionally, Cesium offers certain basic 3D data for free, such as OpenStreetMap Building data.
Mapbox: Mapbox, another map provider, stands out for its extensive capabilities. It offers 3D building data and the ability to dynamically load and view any location with precision by providing longitude and latitude coordinates. Beyond basic mapping, Mapbox includes valuable features such as live traffic data and bathymetry. Users can choose from various terrain options, including flat terrain and terrains with elevation data. Mapbox demonstrates high precision in coordinate calculations, allowing for accurate placement of pointers and markers. Nevertheless, Mapbox has faced criticism regarding the frequency of SDK updates; at the time of development, the last release dated back to 2019. This has necessitated manual adjustments when integrating the SDK into new projects, such as deleting deprecated components and replacing deprecated methods. Despite this, Mapbox continues to provide support through its forums and offers a free tier suitable for prototype development.
Google’s Photorealistic 3D Tiles: A unique offering for geospatial data comes from Google in the form of Photorealistic 3D Tiles. This dataset represents a 3D mesh model of the world, created from satellite imagery and Google Street View. Though experimental during the time of development, it exhibited impressive smoothness and precision. This dataset adheres to Cesium’s open standard and is hosted on Cesium’s platform, requiring Google’s map API with an authentication key for access. One of its standout features lies in its remarkable precision when placing different objects on the terrain. These objects integrate with the landscape, even following its elevations. Google’s Photorealistic 3D Tiles are observed to offer the best support for accurate visualization of spatial data that supports environmental decision making and disaster response, while providing limited capabilities to modify individual entities.
In choosing a map provider for the application, Mapbox emerged as the preferred choice for several compelling reasons. Foremost, Mapbox boasts an abundance of data, ensuring a rich and comprehensive foundation for our visualization needs. The platform’s free tier allows for substantial exploration and development work. Moreover, Mapbox’s intuitive functions cater specifically to geospatial-based calculations and data manipulation, streamlining the development process and enhancing the system’s capabilities. Finally, the strong community support surrounding Mapbox also weighed in its favor.

2.4.2. Spatiotemporal Data

Incorporating real-time sensor data is an important aspect of the application’s functionality, particularly for the purpose of visualizing hydrological data. To enable this, various REST APIs were integrated, facilitating the display of real-time data from third-party providers while accurately geolocating it within the application’s terrain. The application relies not only on data fetched from external APIs but also on information available through the map provider.
USGS Discharge and Weather API: One of the key components of the hydrological data integration is the utilization of the United States Geological Survey (USGS) REST API for water bodies. This API serves as a vital resource for accessing real-time data concerning water bodies across the United States, precisely pinpointed to their geographic locations. When a user loads a specific location within the application, the USGS API is invoked to retrieve pertinent information about the water bodies encompassed within a square area of 10 square kilometers. Subsequently, the application instantiates markers that are positioned at the exact geographic coordinates of these water bodies. These markers actively display dynamic data, including critical parameters such as discharge rate and flow velocity.
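The exact query construction is internal to the application; as an illustration, the public USGS Instantaneous Values REST service accepts bounding-box queries of the kind described above. In this hedged sketch, the function name and the `half_width_deg` parameter are hypothetical conveniences, not part of the USGS API.

```python
from urllib.parse import urlencode

USGS_IV_URL = "https://waterservices.usgs.gov/nwis/iv/"

def usgs_bbox_query(lat, lon, half_width_deg=0.05):
    """Build a USGS Instantaneous Values request URL for active gauges inside
    a small bounding box centered on (lat, lon).
    Parameter codes: 00060 = discharge (cfs), 00065 = gage height (ft)."""
    params = {
        "format": "json",
        # bBox order is west, south, east, north
        "bBox": f"{lon - half_width_deg:.4f},{lat - half_width_deg:.4f},"
                f"{lon + half_width_deg:.4f},{lat + half_width_deg:.4f}",
        "parameterCd": "00060,00065",
        "siteStatus": "active",
    }
    return USGS_IV_URL + "?" + urlencode(params)

print(usgs_bbox_query(41.6611, -91.5302))
```

The JSON response geolocates each site, which is what allows the application to instantiate markers at the exact coordinates of the water bodies.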
In addition to hydrological data, real-time weather information plays a crucial role in enhancing the application’s functionality. To this end, a weather API was thoughtfully integrated, offering continuous updates on meteorological conditions. Users can readily access this real-time weather data through a dedicated weather panel within the application. The weather panel is designed with user-friendliness in mind, presenting an array of weather-related information, including temperature and current weather conditions. These details are presented with an intuitive and visually engaging interface, featuring distinct icons that correspond to various weather conditions.
Incorporating Data Provided by Mapbox: Within the application, we leverage Mapbox’s diverse dataset, offering users the choice between terrain views, including both street and satellite perspectives, depending on their preferences. This versatility extends to the representation of urban landscapes, with a rich assortment of detailed building data that further enhances the user experience. To ensure users have an enhanced understanding of urban conditions, we integrate live traffic data from Mapbox. This dynamic feature displays traffic information on roadways, categorizing congestion levels into three distinct tiers: severe, moderate, and low. This aids users in planning their hydrological exploration routes effectively and efficiently. Additionally, we do not limit our geographical scope to land; the application also incorporates bathymetry data, extending our visualization capabilities to underwater landscapes, further broadening our understanding of hydrological systems and the interconnectedness of water bodies.

2.5. System Architecture

The system architecture of the application under consideration primarily hinges on the Unity game engine, specifically Unity version 2022.3. Unity is chosen for its exceptional support of lightweight applications and cross-platform capabilities, which aligns with our vision of creating a lightweight and device-agnostic framework. Figure 1 presents the overview of the system architecture.
This application is designed to be accessible through web browsers to ensure broad accessibility. Moreover, the application offers immersive experiences through various VR devices. To enable this compatibility, the “WebXR exporter” SDK for Unity is utilized. Rigorous testing has been conducted with devices such as Oculus Quest and Magic Leap to verify compatibility. Furthermore, the application harnesses the VRTK to integrate the controllers of the VR device. The VRTK SDK’s device-agnostic design is instrumental in capturing inputs from different controller types. This design choice was made to ensure a standardized and seamless experience for users across various VR environments.
The system architecture of the application reflects a methodical blend of advanced technology and inclusivity. Unity, combined with the WebXR exporter and VRTK, forms the foundation for a versatile, immersive, and user-friendly environment. This architectural approach aims to deliver a comprehensive and standardized user experience.
Game Manager Component: In the context of Unity applications, a game manager serves as a central component responsible for managing the application’s states and logic. It acts as a coordinator, ensuring seamless communication between various modules of the game. In our application, the game manager module plays a crucial role in controlling the available features. While the invocation of third-party APIs is delegated to a dedicated module, it is always initiated through the game manager. This design choice is attributed to the dynamic loading of the map, where the need for API invocation is contingent on specific conditions. Moreover, the game manager module also houses the control logic necessary to handle the dynamic zooming feature. This logic enables the application to respond to user-initiated zooming actions by adjusting the map’s level of detail, enhancing visuals when the camera is focused on smaller areas, and reducing image quality when zooming out to save on performance.
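The altitude-to-detail mapping described above can be sketched in a few lines. The zoom range, clamp thresholds, and logarithmic interpolation below are illustrative assumptions, not the project's actual values.

```python
import math

def level_of_detail(camera_height_m, min_zoom=4, max_zoom=16,
                    near_m=500.0, far_m=2_000_000.0):
    """Map camera altitude to a map zoom level: low altitude yields a high
    zoom (more terrain detail); high altitude yields a low zoom to save on
    performance."""
    h = min(max(camera_height_m, near_m), far_m)  # clamp to the valid range
    # Logarithmic interpolation: t = 0 at near_m, t = 1 at far_m.
    t = (math.log(h) - math.log(near_m)) / (math.log(far_m) - math.log(near_m))
    return round(max_zoom - t * (max_zoom - min_zoom))
```

In the actual system, the game manager applies this kind of mapping in response to user-initiated zooming and forwards the resulting level of detail to the map module.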
WebXR Exporter and Multiplatform Support: The WebXR Exporter is an open-source SDK that allows developers to export a WebGL build of a Unity project. This application utilized the WebXR Exporter to produce the final build for direct use by other platforms. This build is hosted on a static web host, making it accessible to PCs and VR devices through web browsers. Hosting the finished build in this way reduced the necessity of having separate builds catered to the particular needs of individual devices and platforms.
Dynamic Location Update: The application offers the ability to update the map dynamically in response to user requests. Users can input longitude and latitude values, enabling the application to load any desired location, even locations outside of the continental United States. This functionality is achieved through the utilization of the AbstractMapScript.UpdateMap() method, a feature provided by Mapbox. Notably, this method streamlines the process by eliminating the need for map re-initialization. Furthermore, it facilitates the conversion of longitude and latitude values into world coordinates. The application also allows for dynamic zooming capabilities. Users can navigate the scene both horizontally and vertically, effectively zooming in or out. The application intelligently adjusts the level of detail displayed based on the user’s zoom level. Similar to the map updating function, this dynamic zooming feature is also managed by the AbstractMapScript.UpdateMap() method provided by Mapbox.
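The conversion from longitude/latitude to world coordinates rests on the Web Mercator projection that Mapbox tiles use. A minimal, SDK-independent sketch of that projection (not the Mapbox implementation itself) is:

```python
import math

def lat_lon_to_pixels(lat_deg, lon_deg, zoom, tile_size=256):
    """Project a WGS84 longitude/latitude pair to Web Mercator pixel
    coordinates at the given zoom level (the projection used by map tiles).
    The origin is the top-left corner of the world map."""
    scale = tile_size * (2 ** zoom)
    x = (lon_deg + 180.0) / 360.0 * scale
    lat_rad = math.radians(lat_deg)
    y = (1.0 - math.log(math.tan(lat_rad) + 1.0 / math.cos(lat_rad))
         / math.pi) / 2.0 * scale
    return x, y
```

In the Unity application, the SDK performs an analogous conversion internally and additionally offsets the result relative to the map's root transform to obtain game-world coordinates.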

2.6. Immersive Hydrological Visualizations

This section discusses the strategies used to create a sense of immersion in the digital environment. It first describes the methods used to create immersive precipitation, and then moves on to the use of real-time fluid simulation. Lastly, it details the visualization of critical hydrological infrastructure.

2.6.1. Rainfall and Snowfall Animations

Rainfall and snowfall animations were created using Unity’s built-in particle system. Rain particles were simulated by giving the particles a 3D start size that varied within a range along the Y axis. The particles were then given a linear velocity over time that could take on a range of values, allowing for variability between the velocities of different particles. The snow particle system was created with the same method used to create the rainfall particle system, though range values for the particle size and velocity were adjusted to allow the look and behavior of the particle system to conform more to that of snowfall.
These particle systems were simulated within the world space, rather than the local space. Doing this allowed for the particle system to be moved without affecting the positions of particles after they have been emitted. This was particularly important because the particle systems are attached to the main camera, and thus, move within the world as the main camera moves. This was done to increase the presence of the particles without needing to simulate the particles over an incredibly large range within the game space.
A C# class was created to control the relevant visual settings of the particle systems. This control was achieved by attaching the class as a script component to the appropriate particle systems. Serialized variables within this class allowed for easy modification of the particle system’s emission rate over time, simulation speed, maximum particle count, and transform component axis-rotation values.
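The ranged-sampling approach described above can be sketched outside Unity as follows. All numeric ranges are illustrative placeholders, not the project's tuned values.

```python
import random
from dataclasses import dataclass

@dataclass
class ParticlePreset:
    """Tunable ranges analogous to the Unity particle-system settings
    described in the text (all numbers here are illustrative)."""
    size_y_range: tuple       # 3D start size along the Y axis
    fall_speed_range: tuple   # downward linear velocity over lifetime (m/s)

RAIN = ParticlePreset(size_y_range=(0.3, 0.8), fall_speed_range=(8.0, 12.0))
SNOW = ParticlePreset(size_y_range=(0.05, 0.15), fall_speed_range=(0.5, 2.0))

def spawn_particle(preset, rng=random):
    """Sample one particle's start size and fall speed uniformly within the
    preset's ranges, giving per-particle variability."""
    return {
        "size_y": rng.uniform(*preset.size_y_range),
        "fall_speed": rng.uniform(*preset.fall_speed_range),
    }
```

Switching between the rain and snow presets mirrors how the same Unity particle system was reused with adjusted size and velocity ranges.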

2.6.2. Water Animation

Many solutions were considered for water simulation within the virtual environment. Initially, water had been simulated by using a plane with animated normal maps and reflections to give the plane the illusion of waves, ripples, and flow. Although this method of water simulation looked realistic and supported user immersion, it was fundamentally flawed. This method did not support our vision of visualizing complex hydrological processes because the plane that was employed could not be compelled to behave like an amorphous body of water. For this reason, this method was abandoned, and we decided to pursue real-time fluid simulation solutions.
Water simulation within the virtual environment was achieved using the off-the-shelf Zibra Liquids asset. This asset allowed for real-time fluid simulation, which could be used to visualize fluid behaviors within the virtual environment. To fully integrate this asset into our digital environment, we had to first resolve incompatibilities between the Maps SDK and the Zibra Liquids asset. All colliding object instances must be defined within the Unity editor before running the fluid simulation. As a result, dynamically generated object instances, such as the tiles produced by our terrain generation method, cannot be added to the array of meshes for which complex collision interactions are calculated. This problem was partially circumvented by disabling the instance of the fluid simulation, programmatically adding the newly generated terrain tiles to the array of objects for which collisions would be processed, and then enabling the fluid simulation instance. A new problem arose, however, when the terrain became too complex to be simulated using the collision systems provided by the plugin. For these reasons, the integration of real-time fluid simulation is a feature still under development.

2.7. User Experience, Interaction, and User Control

The web application, accessible through a PC and Virtual Reality (VR) devices, has been designed to ensure seamless user interaction across different platforms. The VRTK is used predominantly for managing inputs from VR devices, while the Unity input system handles keyboard-based interactions, complemented by independently developed modules for input handling. Users can navigate freely through the scene, switching from an aerial view to a closer, ground-level view on command. The WASD keys on the keyboard facilitate movement within the scene, following conventions set by other immersive digital experiences. For VR devices, the thumbstick controller is configured for navigation, and users have the option to teleport between locations, further enhancing the interactive experience of visualizing hydrologic data.
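A minimal version of the keyboard navigation described above can be sketched with Unity's default input axes (W/S and A/D map to "Vertical" and "Horizontal"). This is illustrative only; the actual project combines VRTK with the Unity input system and custom modules.

```csharp
// Illustrative WASD fly-camera sketch using Unity's default input axes.
using UnityEngine;

public class FlyCamera : MonoBehaviour
{
    [SerializeField] private float moveSpeed = 25f;

    private void Update()
    {
        float strafe = Input.GetAxis("Horizontal"); // A/D
        float forward = Input.GetAxis("Vertical");  // W/S

        // Frame-rate-independent movement relative to the camera's facing.
        Vector3 delta = (transform.forward * forward + transform.right * strafe)
                        * moveSpeed * Time.deltaTime;
        transform.position += delta;
    }
}
```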
The primary user interface serves as a central hub for users to orchestrate all activities within the application. This interface allows the user to input the latitude and longitude of a specific location and dynamically load the corresponding area. Additionally, the interface includes toggle options that allow users to access and view USGS information panels, as well as real-time traffic and weather information. Two types of information panels are used in the application for data comprehension: panels displaying USGS data and panels showing weather data. The USGS information panel, designed as a prefabricated element (prefab), is instantiated at each sensor location within the virtual environment.
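Instantiating a panel prefab at each sensor location, as described above, follows a standard Unity pattern. The component and field names below are assumptions for illustration, and converting a latitude/longitude pair to a world position depends on the Maps SDK in use.

```csharp
// Illustrative sketch of spawning an information-panel prefab at a sensor
// location. Names are hypothetical; assumes the world position has already
// been derived from the sensor's latitude/longitude via the Maps SDK.
using UnityEngine;

public class SensorPanelSpawner : MonoBehaviour
{
    [SerializeField] private GameObject usgsPanelPrefab;

    public GameObject SpawnPanel(Vector3 sensorWorldPosition, string stationLabel)
    {
        GameObject panel = Instantiate(
            usgsPanelPrefab, sensorWorldPosition, Quaternion.identity);

        // Hypothetical text element on the prefab showing the station data.
        panel.GetComponentInChildren<TextMesh>().text = stationLabel;
        return panel;
    }
}
```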

3. Results and Discussion

This study presents the development of an immersive hydroinformatics framework integrating XR technologies. The framework, leveraging the Unity engine alongside the Mapbox Maps SDK, renders a detailed 3D representation of the continental United States and incorporates real-time data from the USGS API and a weather API. This facilitates an interactive environment for users to engage with hydrological and meteorological data, aiming to enhance understanding of complex hydrological phenomena through visualization. Designed for accessibility, the system runs within a web browser and is compatible with various VR devices, thereby supporting a wide user base, including researchers, decision-makers, and educators.

3.1. Mapbox Maps SDK Integration

Mapbox’s rich and diverse dataset has supported the presented application’s functionality and visualization capabilities and proved to be an effective data source for visualizing the continental United States (Figure 2). Placement of models representing infrastructure is highly accurate, creating an effectively representative scale model of real space. There were a few instances where the Maps SDK inhibited the display of data from other sources, and one notable instance where the Maps SDK presented an issue stemming from incompatibilities with the fluid simulation plugin. Another concern regarding the Maps SDK is the potential for future support: as of the writing of this paper, only a few major updates that support our future vision for the project have been made to the Maps SDK.

3.2. USGS API for Discharge and Weather Data

By integrating the USGS weather and discharge APIs, the application provides users with instant access to crucial hydrological data, enhancing their understanding of the surrounding water bodies (Figure 3). Additionally, the inclusion of bathymetry data, visualized as raster data, offers a comprehensive view of underwater terrain features, further enriching the users’ spatial awareness and comprehension of aquatic ecosystems (Figure 4). Stream characteristics data are presented in a simple and effective format, and markers allow users to see where the data originate, improving spatial comprehension. By offering up-to-the-minute meteorological insights, the application equips users with invaluable information, enabling informed decision making and effective planning. Our vision is that, by visualizing relevant data, both in-person and remote planning will be enhanced by a thorough comprehension of environmental conditions. Future work on this system will focus on improving the graphics of the information panels used to present the data, with significant effort made to unify all stylistic aspects of the project so that stylistic inconsistencies do not diminish user immersion.
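Fetching real-time discharge data of the kind displayed in the panels can be sketched against the public USGS Instantaneous Values web service. The site number 05454500 and parameter code 00060 (discharge) are example values, and JSON parsing is omitted; this is not the project's actual data-access code.

```csharp
// Hedged sketch: fetching real-time discharge from the USGS Instantaneous
// Values service inside Unity. Site/parameter values are examples only.
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

public class DischargeFetcher : MonoBehaviour
{
    private const string Url =
        "https://waterservices.usgs.gov/nwis/iv/"
        + "?format=json&sites=05454500&parameterCd=00060";

    private IEnumerator Start()
    {
        using (UnityWebRequest req = UnityWebRequest.Get(Url))
        {
            yield return req.SendWebRequest();

            if (req.result == UnityWebRequest.Result.Success)
            {
                // JSON payload containing the latest discharge reading,
                // to be parsed and shown on an information panel.
                Debug.Log(req.downloadHandler.text);
            }
        }
    }
}
```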

3.3. Immersive Visualization Systems

The presence of rainfall (Figure 5) and snowfall (Figure 6) particle systems is not crucial to the display of hydrological data, but it is important for creating and maintaining user immersion. These systems give the digital environment a sense of realism and dynamism, allowing users to become more immersed in the simulated world than they could without dynamic weather [32]. User immersion is a key consideration in this model, as it has been found to increase cognitive performance and reduce the adverse effects of simulation use [31]. The visualization of precipitation in this project is currently a static, visual phenomenon and does not have complex interactions with the terrain or with other data visualization and simulation systems. Future work on this system will prioritize implementing default visualizations that correspond to real-world weather data, thus allowing users to see local weather conditions in real time and then configure these visualizations if they choose to do so. Additionally, future work will aim to connect the weather visualizations to the fluid simulation system, allowing for a more seamless and holistic simulation of complex hydrological interactions.
The precipitation configuration script (Figure 7) allows users to control aspects of the particle system that are relevant to the graphical attributes of the precipitation. By implementing this script, other factors of the particle system that are not relevant can be stripped, reducing the likelihood of users becoming confused when trying to adjust the precipitation attributes. Allowing users to control these graphical attributes should allow for visualization to be catered to the needs of the user and their specific scenario. Additionally, this script acts as the foundation for implementing control of these graphical systems during runtime. Future work on this project will make these attributes accessible to users within the game world through a control panel user interface. This would allow these attributes to be configurable on the fly, without needing to be initialized before runtime or modified in the Unity editor. The configuration script is easily modifiable and can be changed to accommodate other visualizations that use Unity’s particle system.
The Zibra Liquids plugin provided effective real-time fluid simulation. The fluid is simple to configure and can be controlled well via scripting. The major issue with using this plugin was the inability to add collisions to the simulation at runtime. This limitation, although partially resolved, has been a significant setback to the progress of this application because the terrain provided by the Maps SDK is entirely dynamically generated. Additionally, because terrain tiles can display a wide variety of topographic features, the prebuilt collision boxes provided were often not enough to create entirely emergent visualizations of hydrological processes. The ‘Neural Colliders’ offered by the plugin allow for the generation of more complex collision meshes, but they are better suited to facilitating collisions with key set-piece assets. Because these neural colliders must be created through the Zibra AI servers, it is not practical to generate such collision meshes for each terrain tile at runtime.
Another concern for fluid simulation was considerations of scale. The Zibra Liquids plugin simulates fluids using a large number of particles. Increasing the volume of fluid to be simulated increases the particle count, thereby increasing the need for computer processing power. To simulate large-scale hydrological processes such as floods, the volume of fluid must be strategically scaled in such a way that an accurate representation of the event can be created while also operating within the limitations of available hardware. How these simulations can be most effectively scaled is a consideration for future work on this project. As the development of this application progresses, it may become necessary to utilize fluid simulations provided through other means. We expect that the generalized nature of the project will allow for seamless integration of new simulation systems, thus allowing this project to be expanded upon with the addition of other environmental visualizations.

3.4. User Interaction and Control

An important aspect of the user experience is the application’s accessibility. Throughout development, we have worked to make this application available across as many modes of use as possible. This was greatly facilitated by the use of the WebXR exporter for creating the WebGL build. The WebGL build exhibited remarkable adaptability, accommodating a wide range of VR devices. Because WebXR is device-agnostic by nature, developers can bypass the need to incorporate additional packages for supporting particular devices (such as the Oculus Integration package). Using a unified build greatly increases compatibility and reduces development complexity, but it comes at the cost of limiting the use of device-specific technology: the application may run on many devices, but it is unlikely to make the most of the hardware it is deployed on.
Within the application, users are given the ability to move their position in three dimensions via controls or a web interface, allowing for seamless navigation of the generated scenes. This design choice ensures a consistent and user-friendly experience that aims to accommodate users and their viewing needs and preferences. Currently, the application best supports web and VR users with controls built in to accommodate these methods of viewing. To fully achieve our goal of a device-agnostic framework, an effort must still be made to allow for user interaction for AR users. To achieve this, a user interface will be crafted and rendered within the world space and combined with the capabilities of the VRTK to provide users with intuitive and accessible control over the application.
In addition to these needs, we must acknowledge the absence of control mechanisms optimized for mobile devices, such as touch-based interfaces and mobile browser compatibility. This gap in the system’s design restricts the accessibility and usability of the application on widely used mobile platforms. Additionally, the VR component of the system does not support activation via mobile VR headsets, further limiting its applicability in mobile contexts. These shortcomings, while notable, also present opportunities for enhancement and refinement that shall be addressed in subsequent developments of the project.

3.5. Evaluation of Findings

The development and implementation of the immersive hydroinformatics framework yielded insights that directly respond to the research questions posed in this paper. Regarding the question about the most effective data sources for creating immersive hydrological visualizations, the integration of the USGS API and weather API proved to be paramount. The selection of Mapbox as the map provider, as detailed in the consideration of various map providers, facilitated a rich and dynamic visualization platform. This choice was validated by the accurate rendering of hydrological phenomena and the seamless integration of real-time data, pointing to the effectiveness of these data sources in enhancing the immersive experience.
In response to focusing on how XR can be utilized to enhance hydrological data visualizations, the use of the Unity engine and Mapbox Maps SDK, alongside VR devices, created a multi-dimensional and interactive environment. The application’s ability to dynamically generate 3D models and visualize real-time data within an immersive setting exemplifies the potential of XR technologies in transforming hydrological data visualization.
Concerning the development, maintenance, and leverage of user immersion in data visualization, the incorporation of real-time weather conditions and fluid simulations significantly contributed to user engagement. The visualization of rainfall and snowfall animations, as well as the simulation of water behavior using the Zibra Liquids asset, not only enhanced the realism of the environment but also facilitated a deeper understanding of hydrological processes, thereby addressing the role of immersion in data visualization. Table 1 provides an overview of the core components of the proposed framework in terms of their core value and strengths.
Lastly, with respect to identifying future prospects and potential improvements in the use of XR systems for hydrological data visualization, the discussion on integrating additional hydrological phenomena, such as percolation and soil moisture, into the framework suggests a pathway for expanding the application’s capabilities. Moreover, the emphasis on refining user interfaces and ensuring cross-platform compatibility highlights the ongoing need to enhance accessibility and user experience. These findings underscore the framework’s capacity to address the initial research questions, demonstrating the considerable promise of XR technologies in hydrological visualization and the potential for future advancements in this field.

4. Conclusions

The development of this framework marks significant progress toward our vision for an immersive information system. By integrating the Maps SDK with USGS stream data into the Unity engine, hydrological information is presented within its spatial context, enhancing understanding of local conditions. Users can traverse the continental United States virtually and visualize hydrological data from streams with USGS sensors. This includes stream conditions and real-time updates on local weather parameters such as temperature, wind speed, direction, and precipitation. Dynamic weather visualizations add a sense of dynamism to the digital environment.
Ongoing efforts focus on refining the user experience, emphasizing an intuitive and accessible interface adaptable to diverse devices while maintaining functionality. Future development priorities include ensuring compatibility across all data visualization and simulation systems. A crucial objective is the integration of real-time fluid simulations with dynamically generated terrain to depict intricate hydrological processes. Future phases will incorporate additional facets of the hydrological cycle, such as percolation and the accumulation of precipitation particles, into the fluid simulation. We also plan to engage with key stakeholders to demonstrate the system and collect feedback. This feedback will help assess the usability, clarity, and relevance of the immersive visualizations and guide further improvements to better support real-world environmental planning and decision-making needs.
A key consideration in the project’s design is its openness and generalizability. While currently focused on hydrological data, the framework allows users to incorporate new data APIs for spatial contextualization. Users can adapt the project to showcase various information, from drought patterns and forest fire coverage to soil erosion and transport. This versatility extends the framework’s utility, enabling researchers to explore disaster simulations and visualize diverse natural disasters, including tornadoes, fires, and hurricanes, fostering applications in data visualization, environmental planning, education, and disaster training.

5. Declaration of Generative AI and AI-Assisted Technologies in the Writing Process

During the preparation of this work, the authors used ChatGPT (4o) to improve the flow of the text, correct any potential grammatical errors, and improve the writing. After using this tool, the authors reviewed and edited the content as needed and take full responsibility for the content of the publication.

Author Contributions

U.H.M.: Software, Investigation, Writing—original draft, Visualization, Methodology, Validation. E.L.G.: Software, Investigation, Writing—original draft, Visualization, Methodology, Validation. Y.S.: Conceptualization, Supervision, Methodology, Validation, Writing—review and editing. I.D.: Conceptualization, Supervision, Validation, Writing—review and editing, Project administration, Funding acquisition. All authors have read and agreed to the published version of the manuscript.

Funding

This project was funded by the National Oceanic and Atmospheric Administration (NOAA) via a cooperative agreement with the University of Alabama (NA22NWS4320003) awarded to the Cooperative Institute for Research to Operations in Hydrology (CIROH).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data are contained within the article.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Vuckovic, M.; Schmidt, J.; Ortner, T.; Cornel, D. Combining 2D and 3D Visualization with Visual Analytics in the Environmental Domain. Information 2022, 13, 7. [Google Scholar] [CrossRef]
  2. Tsou, M.H.; Mejia, C. Beyond mapping: Extend the role of cartographers to user interface designers in the Metaverse using virtual reality, augmented reality, and mixed reality. Cartogr. Geogr. Inf. Sci. 2023, 51, 659–673. [Google Scholar] [CrossRef]
  3. Horbiński, T.; Zagata, K. View of Cartography in Video Games: Literature Review and Examples of Specific Solutions. KN-J. Cartogr. Geogr. Inf. 2022, 72, 117–128. [Google Scholar] [CrossRef]
  4. Miao, C.; Hu, J.; Moradkhani, H.; Destouni, G. Hydrological research evolution: A large language model-based analysis of 310,000 studies published globally between 1980 and 2023. Water Resour. Res. 2024, 60, e2024WR038077. [Google Scholar] [CrossRef]
  5. Liang, R.; Huang, C.; Zhang, C.; Li, B.; Saydam, S.; Canbulat, I. Exploring the Fusion Potentials of Data Visualization and Data Analytics in the Process of Mining Digitalization. IEEE Access 2023, 11, 40608–40628. [Google Scholar] [CrossRef]
  6. Ladson, A. Visualising hydrologic data. In Proceedings of the Hydrology and Water Resources Symposium, Melbourne, Australia, 3–6 December 2018; pp. 456–470. [Google Scholar]
  7. Koldasbayeva, D.; Tregubova, P.; Gasanov, M.; Zaytsev, A.; Petrovskaia, A.; Burnaev, E. Challenges in data-driven geo-spatial modeling for environmental research and practice. Nat. Commun. 2024, 15, 10700. [Google Scholar] [CrossRef]
  8. Kehrer, J. Interactive Visual Analysis of Multi-Faceted Scientific Data. Ph.D. Dissertation, The University of Bergen, Bergen, Norway, 2011. [Google Scholar]
  9. Chalh, R.; Bakkoury, Z.; Ouazar, D.; Hasnaoui, M.D. Big Data Open Platform for Water Resources Management. In Proceedings of the Cloud Technologies and Applications (CloudTech), International Conference on IEEE, Marrakech, Morocco, 2–4 June 2015; pp. 1–8. [Google Scholar]
  10. Xu, H.; Berres, A.; Liu, Y.; Allen-Dumas, M.R.; Sanyal, J. An overview of visualization and visual analytics applications in water resources management. Environ. Model. Softw. 2022, 153, 105396. [Google Scholar] [CrossRef]
  11. Chen, Y.; Han, D. Big data and hydroinformatics. J. Hydroinform. 2016, 18, 599–614. [Google Scholar] [CrossRef]
  12. Makropoulos, C.; Savić, D.A. Urban Hydroinformatics: Past, Present and Future. Water 2019, 11, 1959. [Google Scholar] [CrossRef]
  13. Hughes, C.E.; Stapleton, C.B.; Hughes, D.E.; Smith, E.M. Mixed reality in education, entertainment, and training. IEEE Comput. Graph. Appl. 2005, 25, 24–30. [Google Scholar] [CrossRef]
  14. Cheok, A.D.; Haller, M.; Fernando, O.N.N.; Wijesena, J.P. Mixed Reality Entertainment and Art. Int. J. Virtual Real. 2009, 8, 83–90. [Google Scholar] [CrossRef]
  15. Korkut, E.H.; Surer, E. Visualization in virtual reality: A systematic review. Virtual Real. 2023, 27, 1447–1480. [Google Scholar] [CrossRef]
  16. Yildirim, E.; Just, C.; Demir, I. Flood risk assessment and quantification at the community and property level in the State of Iowa. Int. J. Disaster Risk Reduct. 2022, 77, 103106. [Google Scholar] [CrossRef]
  17. Su, S.; Cruz-Neira, C.; Habib, E.; Gerndt, A. Virtual Hydrology Observatory: An Immersive Visualization of Hydrology Modeling. In Proceedings of the Engineering Reality of Virtual Reality 2009, San Jose, CA, USA, 18–22 January 2009; Volume 7238. [Google Scholar] [CrossRef]
  18. Grosser, P.F.; Xia, Z.; Alt, J.; Rüppel, U.; Schmalz, B. Virtual field trips in hydrological field laboratories: The potential of virtual reality for conveying hydrological engineering content. Educ. Inf. Technol. 2023, 28, 6977–7003. [Google Scholar] [CrossRef] [PubMed]
  19. Hedley, N.R.; Billinghurst, M.; Postner, L.; May, R.; Kato, H. Explorations in the Use of Augmented Reality for Geographic Visualization. Presence Teleoper. Virtual Environ. 2002, 11, 119–133. [Google Scholar] [CrossRef]
  20. Sermet, Y.; Demir, I. Flood action VR: A Virtual Reality Framework for Disaster Awareness and Emergency Response Training. In Proceedings of the ACM SIGGRAPH 2019 Posters, New York, NY, USA, 28 July 2019; Association for Computing Machinery: New York, NY, USA, 2019; pp. 1–2. [Google Scholar] [CrossRef]
  21. Sermet, Y.; Demir, I. Virtual and Augmented Reality Applications for Environmental Science Education and Training. In New Perspectives on Virtual and Augmented Reality; Routledge: Abingdon, UK, 2020; pp. 261–275. [Google Scholar] [CrossRef]
  22. Sermet, Y. Knowledge Generation and Communication in Intelligent and Immersive Systems: A Case Study on Flooding; The University of Iowa: Iowa City, IA, USA, 2020. [Google Scholar]
  23. Sermet, Y.; Demir, I. GeospatialVR: A web-based virtual reality framework for collaborative environmental simulations. Comput. Geosci. 2022, 159, 105010. [Google Scholar] [CrossRef]
  24. Demiray, B.Z.; Sermet, Y.; Yildirim, E.; Demir, I. FloodGame: An Interactive 3D Serious Game on Flood Mitigation for Disaster Awareness and Education. Environ. Model. Softw. 2025, 188, 106418. [Google Scholar] [CrossRef]
  25. Ready, M.; Dwyer, T.; Haga, J.H. Immersive Visualization of Big Data for River Disaster Management; Edinburgh University Press: Edinburgh, UK, 2018. [Google Scholar]
  26. Haynes, P.; Hehl-Lange, S.; Lange, E. Mobile Augmented Reality for Flood Visualization. Environ. Model. Softw. 2018, 109, 380–389. [Google Scholar] [CrossRef]
  27. Macchione, F.; Costabile, P.; Costanzo, C.; De Santis, R. Moving to 3-D flood hazard maps for enhancing risk communication. Environ. Model. Softw. 2019, 111, 510–522. [Google Scholar] [CrossRef]
  28. Kawai, J.; Mitsuhara, H.; Shishibori, M. Tsunami evacuation drill system using smart glasses. Procedia Comput. Sci. 2015, 72, 329–336. [Google Scholar] [CrossRef]
  29. Iguchi, K.; Mitsuhara, H.; Shishibori, M. Evacuation Instruction Training System Using Augmented Reality and a Smartphone-based Head Mounted Display. In Proceedings of the 2016 3rd International Conference on Information and Communication Technologies for Disaster Management (ICT-DM), Vienna, Austria, 13–15 December 2016; IEEE: New York, NY, USA, 2016; pp. 1–6. [Google Scholar]
  30. Wang, C.; Hou, J.; Miller, D.; Brown, I.; Jiang, Y. Flood risk management in sponge cities: The role of integrated simulation and 3D visualization. Int. J. Disaster Risk Reduct. 2019, 39, 101139. [Google Scholar] [CrossRef]
  31. Mostajeran, F.; Fischer, M.; Steinicke, F.; Kühn, S. Effects of exposure to immersive computer-generated virtual nature and control environments on affect and cognition. Sci. Rep. 2023, 13, 220. [Google Scholar] [CrossRef] [PubMed]
  32. Roberts, S.; Patterson, D. Virtual Weather Systems. In Proceedings of the Australasian Computer Science Week Multiconference, Geelong, Australia, 30 January–3 February 2017. [Google Scholar] [CrossRef]
  33. Demir, I.; Sermet, Y.; Rink, K. Next generation visualization and communication systems for earth science using immersive reality and serious gaming. Front. Earth Sci. 2022, 10, 1101538. [Google Scholar] [CrossRef]
Figure 1. System architecture and components with sample datasets and use cases.
Figure 2. 3D map view generated via Mapbox along with input and information panel.
Figure 3. Weather information from the USGS Weather API being displayed in the user view along with stream conditions from the USGS Discharge API.
Figure 4. Bathymetry data for lakes and water bodies.
Figure 5. Rainfall particle system at: (a) low; and (b) high intensity configurations.
Figure 6. Snowfall particle system at: (a) low; and (b) high intensity configurations.
Figure 7. Configurable attributes of the configuration script.
Table 1. Summary of Tools and SDKs considered, with their functionality and key benefits.
Tool/SDK/API | Functionality/Use Case | Strengths
Unity Engine | Core development platform for 3D visualization and simulation | Well documented; strong XR support
Mapbox Maps SDK | Terrain, street, and satellite map rendering with 3D buildings, traffic, and bathymetry support | High geospatial accuracy; rich dataset (3D buildings, live traffic, terrain); intuitive coordinate-based API; free tier
USGS Discharge and Weather API | Real-time streamflow and weather data | Up-to-date hydrologic and meteorological data; RESTful access; reliable source
Weather API (3rd party) | Provides temperature, precipitation, wind speed, etc. | Real-time weather information for immersive visual context
Zibra Liquids Plugin | Real-time fluid simulation | Easy integration; modifiable via script; realistic visuals
WebXR Exporter for Unity | WebGL builds for browser and VR compatibility | Device-agnostic; eliminates the need for platform-specific builds
Virtual Reality Tool Kit (VRTK) | Handles VR controller input and interactions | Device-agnostic controller support; simplifies input mapping
Cesium | Map provider evaluated during planning phase | 3D Tiles standard; OpenStreetMap integration; cross-platform support
Google Photorealistic 3D Tiles | Experimental photogrammetric mesh used for terrain | High-precision visuals; follows elevation contours; based on satellite/Street View data
Unity Particle System | Rainfall/snowfall visual effects | Lightweight weather visualization; customizable attributes; real-time adjustments possible
Unity C# Scripts | Custom interaction logic for particles, weather, and UI controls | Fully modifiable; exposes attributes to UI or runtime panels