Special Issue "The Application of Artificial Intelligence in Hydrology"

A special issue of Water (ISSN 2073-4441). This special issue belongs to the section "Hydrology".

Deadline for manuscript submissions: closed (30 September 2021) | Viewed by 15609

Special Issue Editor

Dr. Gonzalo Astray
Guest Editor
Physical Chemistry Department, Universidade de Vigo, Vigo, Spain
Interests: machine learning; physical chemistry; hydrology; food technology; palynology; solar radiation

Special Issue Information

Dear Colleagues,

Over the last few decades, the use of artificial intelligence (AI) has increased markedly across a wide variety of research fields. These models are powerful tools for extracting information that would otherwise be very complicated or impossible to obtain. AI models, together with the large amount of hydrological data currently available, provide ideal conditions for creating tools to manage water supply, predict floods and droughts, monitor water quality, optimize irrigation schemes, manage dams, determine carbonate saturation, evaluate sedimentation processes, and model contaminant transport, among other tasks. AI models of all kinds, from the simplest to the most complex, such as random forests or neural networks, therefore allow us to expand existing knowledge about the complex water system.

The aim of this Special Issue on “The Application of Artificial Intelligence in Hydrology” is to present the state of the art related (but not limited) to the study of the movement, distribution, and management of water in nature.

We invite authors to submit research articles, reviews, communications, and concept papers that demonstrate the high potential of artificial intelligence in the hydrological field.

Dr. Gonzalo Astray
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Water is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2200 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Artificial intelligence
  • Machine learning
  • Big data/Cloud computing
  • Monitoring/Modelling/Prediction/Optimization
  • Flow prediction
  • Water quality
  • Water supply
  • Management
  • Risk assessment
  • Multidisciplinary research

Published Papers (10 papers)


Research

Article
Multiple-Depth Soil Moisture Estimates Using Artificial Neural Network and Long Short-Term Memory Models
Water 2021, 13(18), 2584; https://doi.org/10.3390/w13182584 - 18 Sep 2021
Cited by 3 | Viewed by 953
Abstract
Accurate prediction of soil moisture is important yet challenging in various disciplines, such as agricultural systems, hydrology studies, and ecosystem studies. However, many data-driven models are used to simulate and predict soil moisture at only a single depth. To predict soil moisture at depths of 100, 200, 500, and 1000 mm from the surface based on weather and soil characteristic data, this study designed two data-driven models: artificial neural networks and long short-term memory models. The developed models were applied to predict daily soil moisture up to 6 days ahead at four depths in the Eagle Lake Observatory in California, USA. The overall results showed that the long short-term memory model provides better predictive performance than the artificial neural network model for all depths. The root mean square error of the predicted soil moisture from both models is lower than 2.0, and the correlation coefficient is 0.80–0.97 for the artificial neural network model and 0.90–0.98 for the long short-term memory model. In addition, monthly evaluation results showed that soil moisture predicted by the data-driven models is highly useful for analyzing effects on the water cycle during both wet and dry seasons. The prediction results can be used as basic data in numerous fields, such as hydrological, agricultural, and environmental studies.
(This article belongs to the Special Issue The Application of Artificial Intelligence in Hydrology)
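
The long short-term memory model at the heart of this comparison is built from gated recurrent cells. As a rough illustration (not the authors' implementation), the sketch below runs a single NumPy LSTM cell over a toy input sequence and maps the final hidden state to four outputs, one per soil depth; all sizes, weights, and names are arbitrary stand-ins:

```python
import numpy as np

def lstm_step(x, h, c, W, U, b):
    """One LSTM time step: x is the input, (h, c) the hidden/cell state.
    W, U, b hold stacked weights for the input (i), forget (f),
    output (o), and candidate (g) gates."""
    z = W @ x + U @ h + b                   # stacked pre-activations
    n = h.size
    i = 1 / (1 + np.exp(-z[0:n]))           # input gate
    f = 1 / (1 + np.exp(-z[n:2*n]))         # forget gate
    o = 1 / (1 + np.exp(-z[2*n:3*n]))       # output gate
    g = np.tanh(z[3*n:4*n])                 # candidate cell state
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new

rng = np.random.default_rng(0)
n_in, n_hid = 5, 8                          # e.g. weather features -> hidden units
W = rng.normal(size=(4 * n_hid, n_in))
U = rng.normal(size=(4 * n_hid, n_hid))
b = np.zeros(4 * n_hid)
h = c = np.zeros(n_hid)
for t in range(10):                         # run over a 10-step toy sequence
    x = rng.normal(size=n_in)
    h, c = lstm_step(x, h, c, W, U, b)
moisture_pred = h @ rng.normal(size=(n_hid, 4))  # linear head: 4 soil depths
print(moisture_pred.shape)                  # (4,)
```

In a trained model, the weights would be fitted to the weather and soil data rather than drawn at random.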

Article
floodGAN: Using Deep Adversarial Learning to Predict Pluvial Flooding in Real Time
Water 2021, 13(16), 2255; https://doi.org/10.3390/w13162255 - 18 Aug 2021
Cited by 6 | Viewed by 1543
Abstract
Using machine learning for pluvial flood prediction tasks has gained growing attention in recent years. In particular, data-driven models using artificial neural networks show promising results, shortening the computation times of physically based simulations. However, recent approaches have used mainly conventional fully connected neural networks, which were (a) restricted to spatially uniform precipitation events and (b) limited to a small amount of input data. In this work, a deep convolutional generative adversarial network was developed to predict pluvial flooding caused by spatially heterogeneous rainfall events. The model developed, floodGAN, is based on an image-to-image translation approach whereby the model learns to generate 2D inundation predictions conditioned on heterogeneous rainfall distributions through the minimax game of two adversarial networks. The training data for the floodGAN model were generated using a physically based hydrodynamic model. To evaluate the performance and accuracy of the floodGAN model, multiple tests were conducted using both synthetic events and a historic rainfall event. The results demonstrate that the proposed floodGAN model is up to 10⁶ times faster than the hydrodynamic model and promising in terms of accuracy and generalizability. Therefore, it bridges the gap between detailed flood modelling and real-time applications such as end-to-end early warning systems.
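
The minimax game described in the abstract can be made concrete with a toy stand-in: a "generator" maps a rainfall vector to an inundation map, and a "discriminator" scores (rainfall, map) pairs. The linear models, sizes, and names below are hypothetical, purely to show how the two adversarial losses are formed; the real convolutional networks and gradient updates are omitted:

```python
import numpy as np

rng = np.random.default_rng(1)
sigmoid = lambda z: 1 / (1 + np.exp(-z))

# Toy stand-ins: rainfall field r (condition) and a "true" depth map,
# both flattened to vectors for simplicity.
r = rng.random(16)                  # heterogeneous rainfall "image"
d_real = rng.random(16)             # hydrodynamic-model inundation depths

# Hypothetical linear generator G and discriminator D.
Wg = rng.normal(scale=0.1, size=(16, 16))
wd = rng.normal(scale=0.1, size=32)

d_fake = Wg @ r                                      # G(r): predicted depth map
p_real = sigmoid(wd @ np.concatenate([r, d_real]))   # D(r, real map)
p_fake = sigmoid(wd @ np.concatenate([r, d_fake]))   # D(r, generated map)

# Minimax objectives: D maximizes log p_real + log(1 - p_fake),
# while G is trained to fool D (here via maximizing log p_fake).
loss_D = -(np.log(p_real) + np.log(1 - p_fake))
loss_G = -np.log(p_fake)
print(round(float(loss_D), 3), round(float(loss_G), 3))
```

Training alternates gradient steps on these two losses until the generator's maps are indistinguishable from the hydrodynamic model's output.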

Article
Flood Stage Forecasting Using Machine-Learning Methods: A Case Study on the Parma River (Italy)
Water 2021, 13(12), 1612; https://doi.org/10.3390/w13121612 - 08 Jun 2021
Cited by 7 | Viewed by 1548
Abstract
Real-time river flood forecasting models can be useful for issuing flood alerts and reducing or preventing inundations. To this end, machine-learning (ML) methods are becoming increasingly popular thanks to their low computational requirements and to their reliance on observed data only. This work aimed to evaluate the ML models’ capability of predicting flood stages at a critical gauge station, using mainly upstream stage observations, though downstream levels should also be included to consider backwater, if present. The case study selected for this analysis was the lower stretch of the Parma River (Italy), and the forecast horizon was extended up to 9 h. The performances of three ML algorithms, namely Support Vector Regression (SVR), MultiLayer Perceptron (MLP), and Long Short-term Memory (LSTM), were compared herein in terms of accuracy and computational time. Up to 6 h ahead, all models provided sufficiently accurate predictions for practical purposes (e.g., Root Mean Square Error < 15 cm, and Nash–Sutcliffe Efficiency coefficient > 0.99), while peak levels were poorly predicted for longer lead times. Moreover, the results suggest that the LSTM model, despite requiring the longest training time, is the most robust and accurate in predicting peak values, and it should be preferred for setting up an operational forecasting system.
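
The two accuracy measures quoted above, the root mean square error and the Nash–Sutcliffe efficiency, are straightforward to compute. A minimal sketch with made-up stage data (metres):

```python
import numpy as np

def rmse(obs, sim):
    """Root mean square error between observed and simulated stages."""
    return float(np.sqrt(np.mean((obs - sim) ** 2)))

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 means the model
    does no better than predicting the mean of the observations."""
    return float(1 - np.sum((obs - sim) ** 2) / np.sum((obs - np.mean(obs)) ** 2))

# Hypothetical hourly stage observations and a forecast for the same hours.
obs = np.array([1.2, 1.5, 2.1, 2.8, 3.0, 2.6, 2.0, 1.6])
sim = np.array([1.1, 1.4, 2.0, 2.9, 3.1, 2.5, 2.1, 1.5])
print(round(rmse(obs, sim), 3), round(nse(obs, sim), 3))
```

With every forecast off by 10 cm, this toy example meets the paper's RMSE < 15 cm criterion while still scoring a high NSE.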

Article
A Comparative Analysis of Hidden Markov Model, Hybrid Support Vector Machines, and Hybrid Artificial Neural Fuzzy Inference System in Reservoir Inflow Forecasting (Case Study: The King Fahd Dam, Saudi Arabia)
Water 2021, 13(9), 1236; https://doi.org/10.3390/w13091236 - 29 Apr 2021
Cited by 2 | Viewed by 836
Abstract
The precise prediction of the streamflow of reservoirs is of considerable importance for many activities relating to water resource management, such as reservoir operation and flood and drought control and protection. This study aimed to develop and evaluate the applicability of a hidden Markov model (HMM) and two hybrid models, i.e., the support vector machine-genetic algorithm (SVM-GA) and artificial neural fuzzy inference system-genetic algorithm (ANFIS-GA), for reservoir inflow forecasting at the King Fahd dam, Saudi Arabia. The results obtained by the HMM were compared with those of the two hybrid models, ANFIS-GA and SVM-GA, and with those of individual SVM and ANFIS models, based on performance evaluation indicators and visual inspection. The comparison revealed that the ANFIS-GA and ANFIS models provided superior results for forecasting monthly inflow with satisfactory accuracy in both training (R² = 0.924, 0.857) and testing (R² = 0.842, 0.810). The performance evaluation results for the developed models showed that the GA-induced improvement in the ANFIS and SVM forecasts was matched by an approximately 25% decrease in RMSE and around a 13% increase in Nash–Sutcliffe efficiency. The promising accuracy of the proposed models demonstrates their potential for monthly inflow forecasting in this semiarid region.
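
The genetic-algorithm step shared by the two hybrid models (SVM-GA, ANFIS-GA) searches a hyperparameter space by selection, crossover, and mutation. A minimal sketch, with a toy quadratic error standing in for the expensive train-and-evaluate step the paper's GA would actually run:

```python
import random

def fitness(c):
    """Toy stand-in for model error as a function of one hyperparameter c;
    in the paper, evaluating fitness means training and scoring an SVM
    or ANFIS model with that hyperparameter."""
    return (c - 3.7) ** 2

random.seed(42)
pop = [random.uniform(0, 10) for _ in range(20)]    # initial hyperparameter guesses
for gen in range(40):
    pop.sort(key=fitness)
    parents = pop[:10]                              # selection: keep the best half
    children = []
    for _ in range(10):
        a, b = random.sample(parents, 2)
        child = (a + b) / 2                         # crossover: blend two parents
        child += random.gauss(0, 0.1)               # mutation: small perturbation
        children.append(child)
    pop = parents + children
best = min(pop, key=fitness)
print(round(best, 2))
```

Because the best individuals are carried over each generation, the lowest error never increases, and the population converges toward the optimum of the toy error surface.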

Article
Scenario-Based Real-Time Flood Prediction with Logistic Regression
Water 2021, 13(9), 1191; https://doi.org/10.3390/w13091191 - 25 Apr 2021
Cited by 4 | Viewed by 895
Abstract
This study proposed a real-time flood extent prediction method to shorten the time from flood occurrence to alert issuance. The method uses logistic regression to generate a flood probability discriminant for each grid cell constituting the study area and then predicts the flood extent from the amount of runoff caused by rainfall. To generate the flood probability discriminant for each grid cell, a two-dimensional (2D) flood inundation model was verified using Typhoon Chaba, which caused great damage to the study area in 2016. Then, 100 probability rainfall scenarios were created by combining return period, duration, and time distribution using past observed rainfall data, and rainfall-runoff-inundation relation databases were built for each scenario by applying hydrodynamic and hydrological models. A flood probability discriminant based on logistic regression was generated for each grid cell using whether the cell was flooded (1 or 0) for each runoff amount in the database. When a runoff amount is input to the generated discriminant, the coefficients yield the flood probability for the target cell, so the flood extent is quickly predicted. The proposed method predicted the flood extent in a few seconds and showed high accuracy, 83.6–98.4% for scenario rainfall and 74.4–99.1% for actual rainfall.
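
At prediction time, the per-grid discriminant described above reduces to evaluating a fitted logistic function of runoff for every grid cell. A minimal sketch with hypothetical coefficients (in the study these would be fitted offline from the rainfall-runoff-inundation scenario database):

```python
import math

def flood_probability(runoff, b0, b1):
    """Logistic discriminant for one grid cell: P(flooded | runoff)."""
    return 1 / (1 + math.exp(-(b0 + b1 * runoff)))

# Hypothetical per-cell coefficients; each cell gets its own (b0, b1).
grid_coeffs = {"cell_a": (-8.0, 0.04), "cell_b": (-12.0, 0.04)}

runoff = 250.0  # runoff amount for an incoming event (arbitrary units)
flood_map = {cell: flood_probability(runoff, b0, b1) >= 0.5
             for cell, (b0, b1) in grid_coeffs.items()}
print(flood_map)
```

Because prediction is just one sigmoid evaluation per cell, the flood extent for the whole grid can be produced in seconds, which is the point of the method.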

Article
Improving Radar-Based Rainfall Forecasts by Long Short-Term Memory Network in Urban Basins
Water 2021, 13(6), 776; https://doi.org/10.3390/w13060776 - 12 Mar 2021
Cited by 4 | Viewed by 1018
Abstract
Radar-based rainfall forecasts, produced by widely used extrapolation algorithms, are popular in precipitation nowcasting systems for lead times of up to six hours. Nevertheless, the reliability of rainfall forecasts for heavy rain events gradually declines with lead time due to the lack of predictability. Recently, data-driven approaches have been commonly implemented for hydrological problems. In this research, data-driven models were developed based on data obtained from a radar forecasting system named the McGill Algorithm for Precipitation nowcasting by Lagrangian Extrapolation (MAPLE) and from ground rain gauges. The data covered thirteen urban stations in five metropolitan cities in South Korea. The twenty-five MAPLE data points surrounding each rain station were utilized as the model input, and the observed rainfall at the corresponding gauges was used as the model output. The results showed the superior capability of the long short-term memory (LSTM) network in improving 180-min rainfall forecasts at the stations, based on a comparison of five different data-driven models: multiple linear regression (MLR), multivariate adaptive regression splines (MARS), multi-layer perceptron (MLP), a basic recurrent neural network (RNN), and LSTM. Although the model still underestimated extreme rainfall values at some examined stations, this study proved that the LSTM can provide reliable performance. This model can serve as an optional method for improving rainfall forecasts at stations in urban basins.
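
Building the model input described above amounts to slicing, for each gauge, the 5 × 5 block of MAPLE forecast values centred on the station. A minimal NumPy sketch with a toy forecast grid (the field size and station location are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(7)
maple = rng.random((60, 60))          # one MAPLE forecast field (toy grid)

def neighborhood_features(field, row, col, k=2):
    """Extract the (2k+1) x (2k+1) block of forecast values around a
    gauge location as a flat feature vector (25 values for k=2)."""
    return field[row - k:row + k + 1, col - k:col + k + 1].ravel()

x = neighborhood_features(maple, 30, 30)   # input vector for one station
print(x.shape)
```

Each station's 25-value vector then feeds the compared models (MLR, MARS, MLP, RNN, LSTM), with the gauge observation as the training target.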

Article
Deep Learning with Long Short Term Memory Based Sequence-to-Sequence Model for Rainfall-Runoff Simulation
Water 2021, 13(4), 437; https://doi.org/10.3390/w13040437 - 08 Feb 2021
Cited by 13 | Viewed by 1388
Abstract
Accurate runoff prediction is one of the important tasks in various fields such as agriculture, hydrology, and environmental studies. Recently, with massive improvements in computational systems and hardware, deep learning-based approaches have been applied for more accurate runoff prediction. In this study, a long short-term memory model with a sequence-to-sequence structure was applied for hourly runoff predictions from 2015 to 2019 in the Russian River basin, California, USA. The proposed model was used to predict hourly runoff with lead times of 1–6 h using runoff data observed at upstream stations. The model was evaluated in terms of event-based performance using statistical metrics including root mean square error, Nash–Sutcliffe efficiency, peak runoff error, and peak time error. The results show that the proposed model outperforms support vector machine and conventional long short-term memory models. In addition, the model has the best predictive ability for runoff events, which means it can be effective for developing short-term flood forecasting and warning systems. The results of this study demonstrate that the deep learning-based approach to hourly runoff forecasting has high predictive power and that the sequence-to-sequence structure is an effective method to improve prediction results.
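
Two of the event-based metrics mentioned, peak runoff error and peak time error, compare the magnitude and timing of the hydrograph peaks. A minimal sketch with made-up hourly hydrographs:

```python
import numpy as np

def peak_errors(obs, sim):
    """Event-based metrics: relative peak runoff error (%) and
    peak timing error in time steps (here, hours)."""
    i_obs, i_sim = int(np.argmax(obs)), int(np.argmax(sim))
    peak_err = 100 * (sim[i_sim] - obs[i_obs]) / obs[i_obs]
    time_err = i_sim - i_obs
    return float(peak_err), time_err

# Hypothetical hourly hydrographs for one runoff event (m^3/s).
obs = np.array([10, 25, 80, 140, 120, 70, 40, 20], dtype=float)
sim = np.array([12, 22, 70, 126, 130, 75, 45, 18], dtype=float)
print(peak_errors(obs, sim))
```

Here the simulated peak is about 7% low and arrives one hour late; small values of both metrics are what make a model useful for flood warning.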

Article
Hydrometeorological Drought Forecasting in Hyper-Arid Climates Using Nonlinear Autoregressive Neural Networks
Water 2020, 12(9), 2611; https://doi.org/10.3390/w12092611 - 18 Sep 2020
Cited by 8 | Viewed by 985
Abstract
Drought forecasting is an essential component of efficient water resource management that helps water planners mitigate the severe consequences of water shortages. This is especially important in hyper-arid climates, where drought consequences are more drastic due to the limited water resources and harsh environments. This paper presents a data-driven approach based on an artificial neural network algorithm for predicting droughts. Initially, the observed drought events in the State of Kuwait were tested for autocorrelation using the correlogram test. Due to the cyclic nature of the observed drought time series, nonlinear autoregressive neural networks (NARs) were used to predict the occurrence of drought events, with the Levenberg–Marquardt algorithm used to train the NAR models. This approach was tested for the forecasting of 12- and 24-month droughts using the recently developed precipitation index (PI). Four statistical measures were used to assess the model’s performance during training and validation. The performance metrics indicated that the drought prediction was reliable, with Nash–Sutcliffe values of 0.761–0.878 during the validation period. Additionally, the computed R² values for model forecasts ranged from 0.784 to 0.883, which indicates the quality of the model predictions. These findings contribute to the development of more efficient drought forecasting tools for use by water managers in hyper-arid regions.
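
The autoregressive idea behind the NAR models, predicting the drought index from its own lagged values, can be illustrated with a linear stand-in fitted by least squares; the paper's NAR network replaces this linear map with a neural one trained by Levenberg–Marquardt. The series and order below are toy choices:

```python
import numpy as np

# Toy cyclic "drought index" standing in for the paper's precipitation
# index (PI): an annual sinusoid plus noise, at monthly resolution.
rng = np.random.default_rng(3)
t = np.arange(240)                                        # 20 years of months
pi_ser = np.sin(2 * np.pi * t / 12) + 0.1 * rng.normal(size=t.size)

p = 12                                                    # autoregressive order
X = np.column_stack([pi_ser[i:i - p] for i in range(p)])  # 12 lagged inputs
y = pi_ser[p:]                                            # value to predict
coef, *_ = np.linalg.lstsq(X, y, rcond=None)              # least-squares AR fit

pred = X @ coef
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - np.mean(y)) ** 2)
print(round(float(r2), 3))                                # in-sample R^2
```

Because the toy series is strongly cyclic, even the linear AR(12) fit explains most of its variance, which mirrors why the correlogram test in the paper motivated an autoregressive model.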

Article
Utility of Artificial Neural Networks in Modeling Pan Evaporation in Hyper-Arid Climates
Water 2020, 12(5), 1508; https://doi.org/10.3390/w12051508 - 25 May 2020
Cited by 15 | Viewed by 1733
Abstract
Evaporation is the major water-loss component of the hydrologic cycle and thus requires efficient management. This study aims to model daily pan evaporation rates in hyper-arid climates using artificial neural networks (ANNs). Hyper-arid climates are characterized by harsh environmental conditions where annual precipitation rates do not exceed 3% of annual evaporation rates. For the first time, ANNs were applied to model such climatic conditions in the State of Kuwait. Pan evaporation data from 1993 to 2015 were normalized to a 0–1 range to boost ANN performance, and the ANN structure was optimized by testing various meteorological input combinations. Levenberg–Marquardt algorithms were used to train the ANN models. The proposed ANN was satisfactorily efficient in modeling pan evaporation in these hyper-arid climatic conditions. The Nash–Sutcliffe coefficients ranged from 0.405 to 0.755 over the validation period. Mean air temperatures and average wind speeds were identified as the meteorological variables that most influenced the ANN performance. A sensitivity analysis showed that the number of hidden layers did not significantly impact the ANN performance. The ANN models demonstrated considerable bias in predicting high pan evaporation rates (>25 mm/day). The proposed modeling method may assist water managers in Kuwait and other hyper-arid regions in establishing resilient water-management plans.
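
The 0–1 normalization applied to the pan evaporation data is plain min–max scaling. A minimal sketch, keeping the bounds so predictions can be mapped back to mm/day (the values are made up):

```python
import numpy as np

def minmax_scale(x):
    """Scale a meteorological series to the 0-1 range, as done before
    feeding inputs to the ANN; keep min/max to invert predictions later."""
    lo, hi = float(np.min(x)), float(np.max(x))
    return (x - lo) / (hi - lo), lo, hi

# Hypothetical daily pan evaporation values (mm/day).
evap = np.array([4.0, 12.5, 25.0, 18.3, 7.1])
scaled, lo, hi = minmax_scale(evap)
restored = scaled * (hi - lo) + lo         # invert back to physical units
print(scaled.min(), scaled.max())
```

Scaling all inputs to a common range keeps no single meteorological variable from dominating the network's weighted sums during training.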

Review

Review
Big Data Analytics and Its Role to Support Groundwater Management in the Southern African Development Community
Water 2020, 12(10), 2796; https://doi.org/10.3390/w12102796 - 09 Oct 2020
Cited by 10 | Viewed by 2340
Abstract
Big data analytics (BDA) is a novel concept focusing on leveraging large volumes of heterogeneous data through advanced analytics to drive information discovery. This paper aims to highlight the potential role BDA can play in improving groundwater management in the Southern African Development Community (SADC) region in Africa. Through a review of the literature, this paper defines the concepts of big data, big data sources in groundwater, big data analytics, and big data platforms and frameworks, and how they can be used to support groundwater management in the SADC region. BDA may support groundwater management in the SADC region by filling in data gaps and transforming these data into useful information. In recent times, machine learning and artificial intelligence have stood out as novel tools for data-driven modeling. Managing big data from collection to information delivery requires critical application of selected tools, techniques, and methods. Hence, in this paper we present a conceptual framework that can be used to manage the implementation of BDA in a groundwater management context. We then highlight challenges limiting the application of BDA, which include technological constraints and institutional barriers. In conclusion, the paper shows that sufficient big data exist in the groundwater domain and that BDA tools are available for groundwater sciences, thereby providing the basis to further explore data-driven sciences in groundwater management.
