Search Results (177)

Search Parameters:
Keywords = hurricane prediction

21 pages, 979 KiB  
Article
AI-Enhanced Coastal Flood Risk Assessment: A Real-Time Web Platform with Multi-Source Integration and Chesapeake Bay Case Study
by Paul Magoulick
Water 2025, 17(15), 2231; https://doi.org/10.3390/w17152231 - 26 Jul 2025
Viewed by 273
Abstract
A critical gap exists between coastal communities’ need for accessible flood risk assessment tools and the availability of sophisticated modeling, which remains limited by technical barriers and computational demands. This study introduces three key innovations through Coastal Defense Pro: (1) the first operational web-based AI ensemble for coastal flood risk assessment integrating real-time multi-agency data, (2) an automated regional calibration system that corrects systematic model biases through machine learning, and (3) browser-accessible implementation of research-grade modeling previously requiring specialized computational resources. The system combines Bayesian neural networks with optional LSTM and attention-based models, implementing automatic regional calibration and multi-source elevation consensus through a modular Python architecture. Real-time API integration achieves >99% system uptime with sub-3-second response times via intelligent caching. Validation against Hurricane Isabel (2003) demonstrates correction from 197% overprediction (6.92 m predicted vs. 2.33 m observed) to accurate prediction through automated identification of a Chesapeake Bay-specific reduction factor of 0.337. Comprehensive validation against 15 major storms (1992–2024) shows substantial improvement over standard methods (RMSE = 0.436 m vs. 2.267 m; R2 = 0.934 vs. −0.786). Economic assessment using NACCS fragility curves demonstrates 12.7-year payback periods for flood protection investments. The open-source Streamlit implementation democratizes access to research-grade risk assessment, transforming months-long specialist analyses into immediate browser-based tools without compromising scientific rigor. Full article
(This article belongs to the Special Issue Coastal Flood Hazard Risk Assessment and Mitigation Strategies)
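The calibration step can be illustrated with the figures quoted in the abstract: dividing the observed Hurricane Isabel water level by the raw prediction recovers the reported Chesapeake Bay factor of roughly 0.337. The sketch below is a minimal illustration of that idea, not the Coastal Defense Pro code; the function and variable names are invented for the example.

```python
# Illustrative sketch (not the paper's code): deriving and applying a
# regional reduction factor from observed vs. predicted surge heights.
import numpy as np

def regional_reduction_factor(predicted_m, observed_m):
    """Least-squares scale factor mapping raw model output to observations."""
    predicted = np.asarray(predicted_m, dtype=float)
    observed = np.asarray(observed_m, dtype=float)
    return float(np.dot(predicted, observed) / np.dot(predicted, predicted))

# Hurricane Isabel (2003) figures quoted in the abstract:
raw_prediction = 6.92   # m, uncalibrated ensemble output
observation = 2.33      # m, observed water level
factor = regional_reduction_factor([raw_prediction], [observation])
print(f"Chesapeake Bay factor ~ {factor:.3f}")                    # ~0.337
print(f"Calibrated prediction: {raw_prediction * factor:.2f} m")  # ~2.33 m
```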

20 pages, 25345 KiB  
Article
Mangrove Damage and Early-Stage Canopy Recovery Following Hurricane Roslyn in Marismas Nacionales, Mexico
by Samuel Velázquez-Salazar, Luis Valderrama-Landeros, Edgar Villeda-Chávez, Cecilia G. Cervantes-Rodríguez, Carlos Troche-Souza, José A. Alcántara-Maya, Berenice Vázquez-Balderas, María T. Rodríguez-Zúñiga, María I. Cruz-López and Francisco Flores-de-Santiago
Forests 2025, 16(8), 1207; https://doi.org/10.3390/f16081207 - 22 Jul 2025
Viewed by 1063
Abstract
Hurricanes are powerful tropical storms that can severely damage mangrove forests through tree uprooting, sediment erosion, and saltwater intrusion, disrupting their critical role in coastal protection and biodiversity. After a hurricane, evaluating mangrove damage helps prioritize rehabilitation efforts, as these ecosystems play a key ecological role in coastal regions. Thus, we analyzed the defoliation of mangrove forest canopies and their early recovery, approximately 2.5 years after the landfall of Category 3 Hurricane Roslyn in October 2022 in Marismas Nacionales, Mexico. The following mangrove traits were analyzed: (1) the yearly time series of the Combined Mangrove Recognition Index (CMRI) standard deviation from 2020 to 2025, (2) the CMRI rate of change (slope) following the hurricane’s impact, and (3) the canopy height model (CHM) before and after the hurricane using satellite and UAV-LiDAR data. Hurricane Roslyn caused a substantial decrease in canopy cover, resulting in a loss of 47,202 ha, which represents 82.8% of the total area of 57,037 ha. The CMRI standard deviation indicated early signs of canopy recovery in one-third of the mangrove-damaged areas 2.5 years post-impact. The CMRI slope indicated that areas near the undammed rivers had a maximum recovery rate of 0.05 CMRI units per month, corresponding to a predicted canopy recovery of ~2.5 years. However, most mangrove areas exhibited CMRI rates between 0.01 and 0.03 CMRI units per month, anticipating a recovery time between 40 months (approximately 3.4 years) and 122 months (roughly 10 years). Unfortunately, most of the already degraded Laguncularia racemosa forests displayed a negative CMRI slope, suggesting a lack of canopy recovery so far. Additionally, the CHM showed a significant median difference of 3.3 m in the canopy height of fringe-type Rhizophora mangle and Laguncularia racemosa forests after the hurricane’s landfall. Full article
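As a rough worked example of how the quoted recovery times follow from a CMRI slope, dividing an assumed canopy deficit of about 1.2 CMRI units by monthly rates of 0.01–0.03 gives roughly 120 and 40 months; the deficit value is an assumption chosen to match the reported range, not a figure from the paper.

```python
# Back-of-the-envelope recovery-time estimate from a CMRI slope.
# The canopy deficit (~1.2 CMRI units) is an assumed value chosen so the
# quoted 0.01-0.03 units/month rates reproduce the 40-122 month range.
def months_to_recover(cmri_deficit, slope_per_month):
    if slope_per_month <= 0:
        return float("inf")  # negative slope: no recovery indicated
    return cmri_deficit / slope_per_month

deficit = 1.2  # assumed post-hurricane CMRI loss (varies by zone in practice)
for slope in (0.03, 0.01):
    print(f"slope {slope:.2f}/month -> ~{months_to_recover(deficit, slope):.0f} months")
```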

17 pages, 2928 KiB  
Article
Hybrid Machine Learning Model for Hurricane Power Outage Estimation from Satellite Night Light Data
by Laiyin Zhu and Steven M. Quiring
Remote Sens. 2025, 17(14), 2347; https://doi.org/10.3390/rs17142347 - 9 Jul 2025
Viewed by 321
Abstract
Hurricanes can cause massive power outages and pose significant disruptions to society. Accurately monitoring hurricane power outages will improve predictive models and guide disaster emergency management. However, many challenges exist in obtaining high-quality data on hurricane power outages. We systematically evaluated machine learning (ML) approaches to reconstruct historical hurricane power outages based on high-resolution (1 km) satellite night light observations from the Defense Meteorological Satellite Program (DMSP) and other ancillary information. We found that the two-step hybrid model significantly improved model prediction performance by capturing a substantial portion of the uncertainty in the zero-inflated data. In general, the classification and regression tree-based machine learning models (XGBoost and random forest) demonstrated better performance than the logistic and CNN models in both binary classification and regression models. For example, the xgb+xgb model has 14% less RMSE than the log+cnn model, and the R-squared value is 25 times larger. The Interpretable ML (SHAP value) identified geographic locations, population, and stable and hurricane night light values as important variables in the XGBoost power outage model. These variables also exhibit meaningful physical relationships with power outages. Our study lays the groundwork for monitoring power outages caused by natural disasters using satellite data and machine learning (ML) approaches. Future work should aim to improve the accuracy of power outage estimations and incorporate more hurricanes from the recently available Visible Infrared Imaging Radiometer Suite (VIIRS) Day/Night Band (DNB) data. Full article
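The two-step hybrid design can be sketched as a hurdle model: a classifier first decides whether any outage occurs, and a regressor trained only on positive cases estimates its magnitude. The snippet below is a hedged illustration with synthetic data and generic XGBoost settings, not the authors' configuration.

```python
# Hedged sketch of a two-step ("hurdle") model for zero-inflated outage data:
# a classifier predicts whether any outage occurs, then a regressor predicts
# its magnitude on the positive cases. Feature columns are illustrative.
import numpy as np
from xgboost import XGBClassifier, XGBRegressor

def fit_two_step(X, y):
    has_outage = (y > 0).astype(int)
    clf = XGBClassifier(n_estimators=300, max_depth=6).fit(X, has_outage)
    reg = XGBRegressor(n_estimators=300, max_depth=6).fit(X[y > 0], y[y > 0])
    return clf, reg

def predict_two_step(clf, reg, X):
    # Expected outage = P(outage) * conditional magnitude
    return clf.predict_proba(X)[:, 1] * reg.predict(X)

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))   # e.g., night light change, population, location
y = np.where(rng.random(500) < 0.7, 0, rng.gamma(2.0, 3.0, 500))  # zero-inflated target
clf, reg = fit_two_step(X, y)
print(predict_two_step(clf, reg, X[:3]))
```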

36 pages, 5039 KiB  
Article
Flood Risk Forecasting: An Innovative Approach with Machine Learning and Markov Chains Using LIDAR Data
by Luigi Bibbò, Giuliana Bilotta, Giuseppe M. Meduri, Emanuela Genovese and Vincenzo Barrile
Appl. Sci. 2025, 15(13), 7563; https://doi.org/10.3390/app15137563 - 5 Jul 2025
Viewed by 466
Abstract
In recent years, the world has seen a significant increase in extreme weather events, such as floods, hurricanes, and storms, which have caused extensive damage to infrastructure and communities. These events result from natural phenomena and human-induced factors, including climate change and natural climate variations. For instance, the floods in Europe in 2024 and the hurricanes in the United States have highlighted the vulnerability of urban and rural areas. These extreme events are often unpredictable and pose considerable challenges for spatial planning and risk management. This study explores an innovative approach that employs machine learning and Markov chains to enhance spatial planning and predict flood risk areas. By utilizing data such as weather records, land use and land cover (LULC) information, topographic LIDAR data, and advanced predictive models, the study aims to identify the most vulnerable areas and provide recommendations for risk mitigation. The results indicate that integrating these technologies can improve forecasting accuracy, thereby supporting more informed decisions in land management. Given the effects of climate change and the increasing frequency of extreme events, adopting advanced forecasting and planning tools is crucial for protecting communities and reducing economic and social damage. This method was applied to the Calopinace area, also known as the Calopinace River or Fiumara della Cartiera, which crosses Reggio Calabria and is notorious for its historical floods. It can serve as part of an early warning system, enabling alerts to be issued throughout the monitored area. Furthermore, it can be integrated into existing emergency protocols, thereby enhancing the effectiveness of disaster response. Future research could investigate incorporating additional data and AI techniques to improve accuracy. Full article
(This article belongs to the Section Green Sustainable Science and Technology)
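A minimal sketch of the Markov-chain component, assuming flood exposure is tracked as discrete states whose probabilities are propagated through a transition matrix estimated from historical LULC maps; the states and probabilities below are illustrative only, not values from the Calopinace case study.

```python
# Minimal Markov-chain sketch: propagating land-cover / flood-state
# probabilities through a row-stochastic transition matrix.
import numpy as np

states = ["dry", "flood-prone", "flooded"]
# P[i, j] = P(next state j | current state i), illustrative values
P = np.array([
    [0.90, 0.08, 0.02],
    [0.30, 0.55, 0.15],
    [0.10, 0.40, 0.50],
])

def propagate(initial, P, steps):
    dist = np.asarray(initial, dtype=float)
    for _ in range(steps):
        dist = dist @ P
    return dist

# Start from a cell that is currently dry and look five steps ahead
print(dict(zip(states, np.round(propagate([1.0, 0.0, 0.0], P, steps=5), 3))))
```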

16 pages, 1456 KiB  
Article
Informing Disaster Recovery Through Predictive Relocation Modeling
by Chao He and Da Hu
Computers 2025, 14(6), 240; https://doi.org/10.3390/computers14060240 - 19 Jun 2025
Viewed by 338
Abstract
Housing recovery represents a critical component of disaster recovery, and accurately forecasting household relocation decisions is essential for guiding effective post-disaster reconstruction policies. This study explores the use of machine learning algorithms to improve the prediction of household relocation in the aftermath of disasters. Leveraging data from 1304 completed interviews conducted as part of the Displaced New Orleans Residents Survey (DNORS) following Hurricane Katrina, we evaluate the performance of Logistic Regression (LR), Random Forest (RF), and Weighted Support Vector Machine (WSVM) models. Results indicate that WSVM significantly outperforms LR and RF, particularly in identifying the minority class of relocated households, achieving the highest F1 score. Key predictors of relocation include homeownership, extent of housing damage, and race. By integrating variable importance rankings and partial dependence plots, the study also enhances interpretability of machine learning outputs. These findings underscore the value of advanced predictive models in disaster recovery planning, particularly in geographically vulnerable regions like New Orleans where accurate relocation forecasting can guide more effective policy interventions. Full article
(This article belongs to the Special Issue Machine Learning and Statistical Learning with Applications 2025)
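A weighted SVM of the kind described can be approximated with class weights set inversely proportional to class frequencies, so relocated (minority-class) households carry more weight in the fit. The sketch below uses synthetic data in place of the DNORS survey and is not the authors' implementation.

```python
# Hedged sketch of the weighted-SVM step: class weights counteract the
# imbalance between relocated and returned households. Features are placeholders.
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score
from sklearn.datasets import make_classification

X, y = make_classification(n_samples=1304, n_features=10, weights=[0.8, 0.2],
                           random_state=0)  # minority class ~ relocated households
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

wsvm = SVC(kernel="rbf", class_weight="balanced")  # weights inverse to class frequency
wsvm.fit(X_tr, y_tr)
print("F1 (relocated class):", round(f1_score(y_te, wsvm.predict(X_te)), 3))
```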

12 pages, 1842 KiB  
Article
Optimization of Sustainable Seismic Retrofit by Developing an Artificial Neural Network
by Hafiz Asfandyar Ahmed and Waqas Arshad Tanoli
Buildings 2025, 15(12), 2065; https://doi.org/10.3390/buildings15122065 - 16 Jun 2025
Viewed by 380
Abstract
Reinforced concrete structures often require retrofitting due to damage caused by natural disasters such as earthquakes, floods, or hurricanes; deterioration from aging; or exposure to harsh environmental conditions. Retrofitting strategies may involve adding new structural elements like shear walls, dampers, or base isolators, as well as strengthening the existing components using methods such as reinforced concrete, steel, or fiber-reinforced polymer jacketing. Selecting the most appropriate retrofit method can be complex and is influenced by various factors, including initial cost, long-term maintenance, environmental impact, and overall sustainability. This study proposes utilizing an artificial neural network (ANN) to predict sustainable and cost-effective seismic retrofit solutions. By training the ANN with a comprehensive dataset that includes jacket thickness, material specifications, reinforcement details, and key sustainability indicators (economic and environmental factors), the model was able to recommend optimized retrofit designs. These designs include ideal values for jacket thickness, concrete strength, and the configuration of reinforcement bars, aiming to minimize both costs and environmental footprint. A major focus of this research was identifying the optimal number of neurons in the hidden layers of the ANN. While the number of input and output neurons is defined by the dataset, determining the right configuration for hidden layers is critical for performance. The study found that networks with one or two hidden layers provided more reliable and efficient results compared to more complex architectures, achieving a total regression value of 0.911. These findings demonstrate that a well-tuned ANN can serve as a powerful tool for designing sustainable seismic retrofit strategies, helping engineers make smarter decisions more quickly and efficiently. Full article
(This article belongs to the Section Building Structures)
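The hidden-layer search can be sketched by cross-validating multilayer perceptrons of increasing depth and comparing their regression scores; the data, layer widths, and solver settings below are placeholders, not the study's retrofit dataset or architecture.

```python
# Sketch of the hidden-layer comparison with a generic MLP regressor;
# the retrofit dataset and its columns are not reproduced here.
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import cross_val_score
from sklearn.datasets import make_regression

X, y = make_regression(n_samples=400, n_features=8, noise=10.0, random_state=1)

for hidden in [(16,), (32, 16), (64, 32, 16)]:   # 1, 2, and 3 hidden layers
    ann = MLPRegressor(hidden_layer_sizes=hidden, max_iter=3000, random_state=1)
    r2 = cross_val_score(ann, X, y, cv=5, scoring="r2").mean()
    print(f"hidden layers {hidden}: mean R^2 = {r2:.3f}")
```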

31 pages, 5067 KiB  
Review
Passive Microwave Imagers, Their Applications, and Benefits: A Review
by Nazak Rouzegari, Mohammad Bolboli Zadeh, Claudia Jimenez Arellano, Vesta Afzali Gorooh, Phu Nguyen, Huan Meng, Ralph R. Ferraro, Satya Kalluri, Soroosh Sorooshian and Kuolin Hsu
Remote Sens. 2025, 17(9), 1654; https://doi.org/10.3390/rs17091654 - 7 May 2025
Viewed by 1055
Abstract
Passive Microwave Imagers (PMWIs) aboard meteorological satellites have been instrumental in advancing the understanding of Earth’s atmospheric and surface processes, providing invaluable data for weather forecasting, climate monitoring, and environmental research. This review examines the relevance, applications, and benefits of PMWI data, focusing on their practical use and benefits to society rather than the specific techniques or algorithms involved in data processing. Specifically, it assesses the impact of PMWI data on Tropical Cyclone (TC) intensity and structure, global precipitation and extreme events, flood prediction, the effectiveness of tropical storm and hurricane watches, fire severity and carbon emissions, weather forecasting, and drought mitigation. Additionally, it highlights the importance of PMWIs in hydrometeorological and real-time applications, emphasizing their current usage and potential for improvement. Key recommendations from users include expanding satellite networks for more frequent global coverage, reducing data latency, and enhancing resolution to improve forecasting accuracy. Despite the notable benefits, challenges remain, such as a lack of direct research linking PMWI data to broader societal outcomes, the time-intensive process of correlating PMWI use with measurable societal impacts, and the indirect links between PMWI and improved weather forecasting and disaster management. This study provides insights into the effectiveness and limitations of PMWI data, stressing the importance of continued research and development to maximize their contribution to disaster preparedness, climate resilience, and global weather forecasting. Full article

25 pages, 2867 KiB  
Article
Unmasking Media Bias, Economic Resilience, and the Hidden Patterns of Global Catastrophes
by Fahim Sufi and Musleh Alsulami
Sustainability 2025, 17(9), 3951; https://doi.org/10.3390/su17093951 - 28 Apr 2025
Cited by 1 | Viewed by 616
Abstract
The increasing frequency and destructiveness of natural disasters necessitate scalable, transparent, and timely analytical frameworks for risk reduction. Traditional disaster datasets—curated by intergovernmental bodies such as EM-DAT and UNDRR—face limitations in spatial granularity, temporal responsiveness, and accessibility. This study addresses these limitations by introducing a novel, AI-enhanced disaster intelligence framework that leverages 19,130 publicly available news articles from 453 global sources between September 2023 and March 2025. Using OpenAI’s GPT-3.5 Turbo model for disaster classification and metadata extraction, the framework transforms unstructured news text into structured variables across five key dimensions: severity, location, media coverage, economic resilience, and casualties. Hypotheses were tested using statistical modeling, geospatial aggregation, and time series analysis. Findings confirm a modest but significant correlation between severity and casualties (ρ = 0.12, p < 10⁻⁶⁰), and a stronger spatial correlation between average regional severity and impact (ρ = 0.31, p < 10⁻¹⁰). Media amplification bias was empirically demonstrated: hurricanes received the most coverage (5599 articles), while under-reported earthquakes accounted for over 3 million deaths. Economic resilience showed a statistically significant but weak protective effect on fatalities (β = −0.024, p = 0.041). Disaster frequency increased substantially over time (slope η1 = 53.17, R2 = 0.32), though severity remained stable. GPT-based classification achieved a high average F1-score (0.91), demonstrating robust semantic accuracy, though not mortality prediction. This study validates the feasibility of using AI-curated, open-access news data for empirical hypothesis testing in disaster science, offering a sustainable alternative to closed datasets and enabling real-time policy feedback loops, particularly for vulnerable, data-scarce regions. Full article
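The severity–casualty test is a rank correlation; a minimal reproduction on synthetic data is shown below (the news-derived dataset itself is not part of this listing, and the simulated values will not match the reported ρ).

```python
# Minimal reproduction of a Spearman rank-correlation test of the kind
# reported above, on synthetic severity/casualty data.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(42)
severity = rng.integers(1, 6, size=19130)        # 1-5 severity scale per article
casualties = rng.poisson(lam=severity * 2.0)     # weakly linked to severity
rho, p = spearmanr(severity, casualties)
print(f"Spearman rho = {rho:.2f}, p = {p:.1e}")
```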

20 pages, 501 KiB  
Article
Regulator Theory, Natural Hazards, and Climate Change
by Geoff Kaine and Vic Wright
Sustainability 2025, 17(7), 2979; https://doi.org/10.3390/su17072979 - 27 Mar 2025
Viewed by 476
Abstract
Climate change is increasing variability in environmental conditions and the frequency and severity of natural hazards such as hurricanes, floods, and wildfires. In this paper, we use general systems theory to describe how disaster management systems are composed of four types of system regulators (aggregation, passive, error control, and anticipation) that are deployed to provide protection from natural hazards. We argue that climate change, by changing causal relationships in the environment and thereby reducing the predictability of related hazards and altering exposure to them, is likely to require that disaster management systems be restructured by changing the combinations of system regulators that are employed to prevent or mitigate disasters. This leads to the conclusion that one of the keys to developing effective policies to support adaptation to climate change and to promote sustainability hinges on understanding how disaster management systems can be interpreted as mechanisms for regulating exposure and vulnerability to minimise the threats from natural hazards. Consequently, developing methods for interpreting and modelling system regulators in disaster management systems is an important next step. Full article

28 pages, 29565 KiB  
Article
AI-Driven Global Disaster Intelligence from News Media
by Fahim Sufi and Musleh Alsulami
Mathematics 2025, 13(7), 1083; https://doi.org/10.3390/math13071083 - 26 Mar 2025
Cited by 1 | Viewed by 1047
Abstract
Open-source disaster intelligence (OSDI) is crucial for improving situational awareness, disaster preparedness, and real-time decision-making. Traditional OSDI frameworks often rely on social media data, which are susceptible to misinformation and credibility issues. This study proposes a novel AI-driven framework utilizing automated data collection from 444 large-scale online news portals, including CNN, BBC, CBS News, and The Guardian, to enhance data reliability. Over a 514-day period (27 September 2023 to 26 February 2025), 1.25 million news articles were collected, of which 17,884 were autonomously classified as disaster-related using Generative Pre-Trained Transformer (GPT) models. The analysis identified 185 distinct countries and 6068 unique locations, offering unprecedented geospatial and temporal intelligence. Advanced clustering and predictive analytics techniques, including K-means, DBSCAN, seasonal decomposition (STL), Fourier transform, and ARIMA, were employed to detect geographical hotspots, cyclical patterns, and temporal dependencies. The ARIMA (2, 1, 2) model achieved a mean squared error (MSE) of 823,761, demonstrating high predictive accuracy. Key findings highlight that the USA (6548 disasters), India (1393 disasters), and Australia (1260 disasters) are the most disaster-prone countries, while hurricanes/typhoons/cyclones (5227 occurrences), floods (3360 occurrences), and wildfires (2724 occurrences) are the most frequent disaster types. The framework establishes a comprehensive methodology for integrating geospatial clustering, temporal analysis, and multimodal data processing in OSDI. By leveraging AI automation and diverse news sources, this study provides a scalable, adaptable, and ethically robust solution for proactive disaster management, improving global resilience and preparedness. Full article
(This article belongs to the Topic Soft Computing and Machine Learning)
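The forecasting step can be sketched with a standard ARIMA(2, 1, 2) fit on a daily disaster-count series; the series below is synthetic, so the error will not match the reported MSE.

```python
# Hedged sketch of an ARIMA(2,1,2) forecast on a daily disaster-count series;
# the 514-day news-derived series is replaced by a synthetic one.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(7)
counts = pd.Series(30 + 10 * np.sin(np.arange(514) * 2 * np.pi / 30)
                   + rng.normal(0, 5, 514))      # seasonal-ish daily counts

train, test = counts[:450], counts[450:]
model = ARIMA(train, order=(2, 1, 2)).fit()
forecast = model.forecast(steps=len(test))
print("MSE:", round(mean_squared_error(test, forecast), 1))
```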

29 pages, 8824 KiB  
Article
Toward Reliable Post-Disaster Assessment: Advancing Building Damage Detection Using You Only Look Once Convolutional Neural Network and Satellite Imagery
by César Luis Moreno González, Germán A. Montoya and Carlos Lozano Garzón
Mathematics 2025, 13(7), 1041; https://doi.org/10.3390/math13071041 - 23 Mar 2025
Viewed by 778
Abstract
Natural disasters continuously threaten populations worldwide, with hydrometeorological events standing out due to their unpredictability, rapid onset, and significant destructive capacity. However, developing countries often face severe budgetary constraints and rely heavily on international support, limiting their ability to implement optimal disaster response strategies. This study addresses these challenges by developing and implementing YOLOv8-based deep learning models trained on high-resolution satellite imagery from the Maxar GeoEye-1 satellite. Unlike prior studies, we introduce a manually labeled dataset, consisting of 1400 undamaged and 1200 damaged buildings, derived from pre- and post-Hurricane Maria imagery. This dataset has been publicly released, providing a benchmark for future disaster assessment research. Additionally, we conduct a systematic evaluation of optimization strategies, comparing SGD with momentum, RMSProp, Adam, AdaMax, NAdam, and AdamW. Our results demonstrate that SGD with momentum outperforms Adam-based optimizers in training stability, convergence speed, and reliability across higher confidence thresholds, leading to more robust and consistent disaster damage predictions. To enhance usability, we propose deploying the trained model via a REST API, enabling real-time damage assessment with minimal computational resources, making it a low-cost, scalable tool for government agencies and humanitarian organizations. These findings contribute to machine learning-based disaster response, offering an efficient, cost-effective framework for large-scale damage assessment and reinforcing the importance of model selection, hyperparameter tuning, and optimization functions in critical real-world applications. Full article
(This article belongs to the Special Issue Mathematical Methods and Models Applied in Information Technology)
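A training run of the kind described, using the SGD-with-momentum optimizer that performed best, might look like the sketch below; the dataset file, checkpoint, and hyperparameters are placeholders rather than the study's actual configuration.

```python
# Hedged sketch of a YOLOv8 training setup with SGD + momentum;
# "buildings.yaml" is a placeholder for the released pre-/post-Hurricane Maria
# dataset, not its actual filename, and hyperparameters are illustrative.
from ultralytics import YOLO

model = YOLO("yolov8s.pt")        # pretrained checkpoint as a starting point
model.train(
    data="buildings.yaml",        # damaged / undamaged building classes
    epochs=100,
    imgsz=640,
    optimizer="SGD",
    lr0=0.01,
    momentum=0.937,
)
metrics = model.val()             # evaluate on the validation split
```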

31 pages, 14382 KiB  
Article
Spatiotemporal Modeling of Connected Vehicle Data: An Application to Non-Congregate Shelter Planning During Hurricane-Pandemics
by Davison Elijah Tsekeni, Onur Alisan, Jieya Yang, O. Arda Vanli and Eren Erman Ozguven
Appl. Sci. 2025, 15(6), 3185; https://doi.org/10.3390/app15063185 - 14 Mar 2025
Viewed by 726
Abstract
The growing complexity of natural disasters, intensified by climate change, has amplified the challenges of managing emergency shelter demand. Accurate shelter demand forecasting is crucial to optimize resource allocation, prevent overcrowding, and ensure evacuee safety, particularly during concurrent disasters like hurricanes and pandemics. Real-time decision-making during evacuations remains a significant challenge due to dynamic evacuation behaviors and evolving disaster conditions. This study introduces a spatiotemporal modeling framework that leverages connected vehicle data to predict shelter demand using data collected during Hurricane Sally (September 2020) across Santa Rosa, Escambia, and Okaloosa counties in Florida, USA. Using Generalized Additive Models (GAMs) with spatial and temporal smoothing, integrated with GIS tools, the framework captures non-linear evacuation patterns and predicts shelter demand. The GAM outperformed the baseline Generalized Linear Model (GLM), achieving a Root Mean Square Error (RMSE) of 6.7791 and a correlation coefficient (CORR) of 0.8593 for shelters on training data, compared to the GLM’s RMSE of 12.9735 and CORR of 0.1760. For lodging facilities, the GAM achieved an RMSE of 4.0368 and CORR of 0.5485, improving upon the GLM’s RMSE of 4.6103 and CORR of 0.2897. While test data showed moderate declines in performance, the GAM consistently offered more accurate and interpretable results across both facility types. This integration of connected vehicle data with spatiotemporal modeling enables real-time insights into evacuation dynamics. Visualization outputs, like spatial heat maps, provide actionable data for emergency planners to allocate resources efficiently, enhancing disaster resilience and public safety during complex emergencies. Full article
(This article belongs to the Special Issue Big Data Applications in Transportation)
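A GAM with spatial and temporal smooth terms can be sketched as below; pygam is used here as a stand-in (the abstract does not state the software), and the feature layout and synthetic demand series are assumptions.

```python
# Illustrative GAM with a spatial tensor smooth and a temporal smooth;
# not the paper's formulation, and the data are synthetic.
import numpy as np
from pygam import LinearGAM, s, te

rng = np.random.default_rng(3)
n = 1000
lon, lat = rng.uniform(-87.6, -86.2, n), rng.uniform(30.3, 30.9, n)  # Panhandle-like extent
hour = rng.uniform(0, 96, n)                                         # hours since evacuation order
demand = (50 + 20 * np.exp(-((lon + 87) ** 2 + (lat - 30.6) ** 2) * 10)
          + 10 * np.sin(hour / 24 * 2 * np.pi) + rng.normal(0, 3, n))

X = np.column_stack([lon, lat, hour])
gam = LinearGAM(te(0, 1) + s(2)).fit(X, demand)   # tensor smooth over space, smooth over time
pred = gam.predict(X)
print("in-sample RMSE:", round(float(np.sqrt(np.mean((demand - pred) ** 2))), 3))
```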

19 pages, 27702 KiB  
Article
Low-Cost, LiDAR-Based, Dynamic, Flood Risk Communication Viewer
by Debra F. Laefer, Evan O’Keeffe, Kshitij Chandna, Kim Hertz, Jing Zhu, Raul Lejano, Anh Vu Vo, Michela Bertolotto and Ulrich Ofterdinger
Remote Sens. 2025, 17(4), 592; https://doi.org/10.3390/rs17040592 - 9 Feb 2025
Cited by 1 | Viewed by 1282
Abstract
This paper proposes a flood risk visualization method that is (1) readily transferable, (2) hyperlocal, (3) computationally inexpensive, and (4) geometrically accurate. This proposal is for risk communication, to provide high-resolution, three-dimensional flood visualization at the sub-meter level. The method couples a laser scanning point cloud with algorithms that produce textured floodwaters, achieved through compounding multiple sine functions in a graphics shader. This hyperlocal approach to visualization is enhanced by the ability to portray changes in (i) water color, (ii) texture, and (iii) motion (including dynamic heights) for various flood prediction scenarios. Through decoupling physics-based predictions from the visualization, a dynamic flood risk viewer was produced with modest processing resources involving only a single quad-core processor with a frequency around 4.30 GHz and no graphics card. The system offers several major advantages. (1) The approach enables its use on a browser or with inexpensive virtual reality hardware and, thus, promotes local dissemination for flood risk communication, planning, and mitigation. (2) The approach can be used for any scenario where water interfaces with the built environment, including inside of pipes. (3) When tested for a coastal inundation scenario from a hurricane, 92% of the neighborhood participants found it to be more effective in communicating flood risk than traditional 2D mapping flood warnings provided by governmental authorities. Full article
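The compounded-sine idea behind the textured floodwater surface can be sketched outside a shader by summing a few directional sine waves over a heightfield; the amplitudes, frequencies, and directions below are arbitrary choices, not the viewer's parameters.

```python
# Sketch of the compounded-sine idea in NumPy rather than a graphics shader;
# wave parameters are arbitrary illustrative values.
import numpy as np

def water_height(x, y, t, base_level=1.5):
    """Flood surface height (m): a base flood level plus layered sine waves."""
    h = base_level
    for amp, freq, speed, direction in [
        (0.05, 0.8, 1.2, (1.0, 0.3)),
        (0.03, 2.1, 0.7, (0.2, 1.0)),
        (0.015, 4.5, 2.0, (-0.6, 0.8)),
    ]:
        phase = freq * (direction[0] * x + direction[1] * y) + speed * t
        h = h + amp * np.sin(phase)
    return h

xx, yy = np.meshgrid(np.linspace(0, 50, 256), np.linspace(0, 50, 256))
surface = water_height(xx, yy, t=0.0)   # heightfield a shader would render per frame
print(surface.min(), surface.max())
```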

28 pages, 5914 KiB  
Article
Predicting Landslide Deposit Zones: Insights from Advanced Sampling Strategies in the Ilopango Caldera, El Salvador
by Laura Paola Calderon-Cucunuba, Abel Alexei Argueta-Platero, Tomás Fernández, Claudio Mercurio, Chiara Martinello, Edoardo Rotigliano and Christian Conoscenti
Land 2025, 14(2), 269; https://doi.org/10.3390/land14020269 - 27 Jan 2025
Viewed by 839
Abstract
In landslide susceptibility modeling, research has predominantly focused on predicting landslides by identifying predisposing factors, often using inventories primarily based on the highest points of landslide crowns. However, a significant challenge arises when the transported mass impacts human activities directly, typically occurring in the deposition areas of these phenomena. Therefore, identifying the terrain characteristics that facilitate the transport and deposition of displaced material in affected areas is equally crucial. This study aimed to evaluate the predictive capability of identifying where displaced material might be deposited by using different inventories of specific parts of a landslide, including the source area, intermediate area, and deposition area. A sample segmentation was conducted that included inventories of these distinct parts of the landslide in the hydrographic basin of Lake Ilopango, which experienced debris flows and debris floods triggered by heavy rainfall from Hurricane Ida in November 2009. Given the extensive set of variables extracted for this evaluation (20 variables), the Induced Smoothed (IS) version of the Least Absolute Shrinkage and Selection Operator (LASSO) methodology was employed to determine the significance of each variable within the datasets. Additionally, the Multivariate Adaptive Regression Splines (MARS) algorithm was used for modeling. Our findings revealed that models developed using the deposition area dataset were more effective compared with those based on the source area dataset. Furthermore, the accuracy of models using deposition area data surpassed that of models using data from both the source and intermediate areas. Full article
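The variable-selection step can be approximated with a cross-validated LASSO (the plain estimator, not the Induced Smoothed variant, and without the MARS stage); the 20 predictors below are synthetic stand-ins for the terrain variables.

```python
# Hedged sketch of LASSO-based variable selection only; synthetic data
# replace the 20 terrain predictors used in the study.
import numpy as np
from sklearn.linear_model import LassoCV
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(800, 20))                   # 20 candidate terrain variables
y = 2.0 * X[:, 0] - 1.5 * X[:, 3] + 0.5 * X[:, 7] + rng.normal(0, 1, 800)

lasso = LassoCV(cv=5).fit(StandardScaler().fit_transform(X), y)
selected = [i for i, c in enumerate(lasso.coef_) if abs(c) > 1e-6]
print("retained predictors:", selected)
```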

23 pages, 22589 KiB  
Article
Landslide Prediction Validation in Western North Carolina After Hurricane Helene
by Sophia Lin, Shenen Chen, Ryan A. Rasanen, Qifan Zhao, Vidya Chavan, Wenwu Tang, Navanit Shanmugam, Craig Allan, Nicole Braxtan and John Diemer
Geotechnics 2024, 4(4), 1259-1281; https://doi.org/10.3390/geotechnics4040064 - 14 Dec 2024
Cited by 3 | Viewed by 2523
Abstract
Hurricane Helene triggered 1792 landslides across western North Carolina and has caused damage to 79 bridges to date. Helene hit western North Carolina days after a low-pressure system dropped up to 254 mm of rain in some locations (e.g., Asheville Regional Airport). The already waterlogged region experienced devastation as significant additional rainfall occurred during Helene, where some areas, like Asheville, North Carolina, received an additional 356 mm of rain (National Weather Service, 2024). In this study, machine learning (ML)-generated multi-hazard landslide susceptibility maps are compared to the documented landslides from Helene. The landslide models use the North Carolina landslide database, soil survey, rainfall, USGS digital elevation model (DEM), and distance to rivers to create the landslide variables. From the DEM, slope and aspect factors are computed. Because recent research in western North Carolina suggests fault movement is destabilizing slopes, distance to fault was also incorporated as a predictor variable. Finally, soil types were used as a wildfire predictor variable. In total, 4794 landslides were used for model training. Random Forest and logistic regression machine learning algorithms were used to develop the landslide susceptibility map. Furthermore, landslide susceptibility was also examined with and without consideration of wildfires. Ultimately, this study indicates heavy rainfall and debris-laden floodwaters were critical in triggering both landslides and scour, posing a dual threat to bridge stability. Field investigations from Hurricane Helene revealed that bridge damage was concentrated at bridge abutments, with scour and sediment deposition exacerbating structural vulnerability. We evaluated the assumed flooding potential (AFP) of damaged bridges in the study area, finding that bridges with lower AFP values were particularly vulnerable to scour and submersion during flood events. Differentiating between landslide-induced and scour-induced damage is essential for accurately assessing risks to infrastructure. The findings emphasize the importance of comprehensive hazard mapping to guide infrastructure resilience planning in mountainous regions. Full article
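The susceptibility-modeling step can be sketched by fitting Random Forest and logistic regression classifiers to landslide/non-landslide samples and mapping the predicted probability; the predictors below are synthetic stand-ins for the DEM-, soil-, and rainfall-derived variables.

```python
# Hedged sketch of the susceptibility models; synthetic data replace the
# NC landslide database, DEM derivatives, soils, and rainfall inputs.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(4)
n = 4794 * 2                               # landslide + non-landslide samples
X = rng.normal(size=(n, 6))                # slope, aspect, rainfall, soil, dist-to-river, dist-to-fault
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(0, 1, n) > 0.8).astype(int)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=4)

for name, clf in [("Random Forest", RandomForestClassifier(n_estimators=300, random_state=4)),
                  ("Logistic Regression", LogisticRegression(max_iter=1000))]:
    clf.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
    print(f"{name}: AUC = {auc:.3f}")      # susceptibility = predicted probability per cell
```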
