Special Issue "The Application of Artificial Intelligence in Hydrology"

A special issue of Water (ISSN 2073-4441). This special issue belongs to the section "Hydrology and Hydrogeology".

Deadline for manuscript submissions: 30 September 2021.

Special Issue Editor

Dr. Gonzalo Astray
Guest Editor
Physical Chemistry Department, Universidade de Vigo, Vigo, Spain
Interests: machine learning; physical chemistry; hydrology; food technology; palynology; solar radiation

Special Issue Information

Dear Colleagues,

Over the last few decades, the use of artificial intelligence (AI) has increased markedly across a wide variety of research fields. These models are powerful tools for extracting information that would otherwise be very complicated or impossible to obtain. AI models, together with the large amount of hydrological data currently available, provide the ideal conditions for creating tools aimed at managing water supply, predicting floods and droughts, monitoring water quality, optimizing irrigation schemes, managing dams, determining carbonate saturation, evaluating sedimentation processes, and modeling contaminant transport, among other tasks. AI models of every kind, from the simplest to the most complex, such as random forests or neural networks, therefore make it possible to expand existing knowledge about the complex water system.
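
As a minimal illustration of such a model (a sketch with synthetic data and hypothetical variable names, not a prescription for any particular study), a random forest can be trained to predict streamflow from a few hydrometeorological drivers:

```python
# Minimal sketch: a random forest regressor for a generic hydrological
# prediction task (e.g., streamflow from meteorological drivers).
# All data here are synthetic; the variable names are hypothetical.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
rainfall = rng.gamma(2.0, 5.0, n)          # mm/day
temperature = rng.normal(15.0, 8.0, n)     # deg C
antecedent_flow = rng.gamma(3.0, 10.0, n)  # m^3/s, previous-day flow
# Synthetic target loosely driven by the inputs plus noise
streamflow = (0.6 * rainfall + 0.8 * antecedent_flow
              - 0.3 * temperature + rng.normal(0, 2.0, n))

X = np.column_stack([rainfall, temperature, antecedent_flow])
X_train, X_test, y_train, y_test = train_test_split(
    X, streamflow, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print("R^2 on held-out data:", model.score(X_test, y_test))
```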

The aim of this Special Issue on “The Application of Artificial Intelligence in Hydrology” is to present the state of the art related (but not limited) to the study of the movement, distribution, and management of water in nature.

We invite authors to submit research articles, reviews, communications, and concept papers that demonstrate the high potential of artificial intelligence in the hydrological field.

Dr. Gonzalo Astray
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as they are accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Water is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2000 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Artificial intelligence
  • Machine learning
  • Big data/Cloud computing
  • Monitoring/Modelling/Prediction/Optimization
  • Flow prediction
  • Water quality
  • Water supply
  • Management
  • Risk assessment
  • Multidisciplinary research

Published Papers (7 papers)


Research

Open Access Article
A Comparative Analysis of Hidden Markov Model, Hybrid Support Vector Machines, and Hybrid Artificial Neural Fuzzy Inference System in Reservoir Inflow Forecasting (Case Study: The King Fahd Dam, Saudi Arabia)
Water 2021, 13(9), 1236; https://doi.org/10.3390/w13091236 - 29 Apr 2021
Abstract
The precise prediction of reservoir streamflow is of considerable importance for many water resource management activities, such as reservoir operation and flood and drought control and protection. This study aimed to develop and evaluate the applicability of a hidden Markov model (HMM) and two hybrid models, i.e., the support vector machine–genetic algorithm (SVM-GA) and the artificial neural fuzzy inference system–genetic algorithm (ANFIS-GA), for reservoir inflow forecasting at the King Fahd dam, Saudi Arabia. The results obtained by the HMM were compared with those of the two hybrid models, ANFIS-GA and SVM-GA, and with those of the individual SVM and ANFIS models, based on performance evaluation indicators and visual inspection. The comparison revealed that the ANFIS-GA and ANFIS models provided superior results for forecasting monthly inflow, with satisfactory accuracy in both the training (R2 = 0.924 and 0.857) and testing (R2 = 0.842 and 0.810) periods. The performance evaluation also showed that the GA-induced improvement in the ANFIS and SVM forecasts corresponded to an approximately 25% decrease in RMSE and around a 13% increase in Nash–Sutcliffe efficiency. The promising accuracy of the proposed models demonstrates their potential for monthly inflow forecasting in this semiarid region.
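
A minimal sketch of the SVM-GA idea follows: a toy genetic algorithm searches SVR hyperparameters (C, gamma, epsilon) for one-step-ahead monthly inflow forecasting. The inflow series, lag structure, and GA settings are synthetic stand-ins, not the paper's data or configuration.

```python
# Toy genetic algorithm tuning SVR hyperparameters for inflow forecasting.
# Everything here (data, lags, population size) is an illustrative assumption.
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
months = np.arange(240)  # 20 years of synthetic monthly inflow
inflow = 50 + 30 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 5, 240)

lags = 3  # predict next month from the previous three months
X = np.column_stack([inflow[i:len(inflow) - lags + i] for i in range(lags)])
y = inflow[lags:]

def fitness(params):
    """Mean cross-validated R^2 of an SVR with the given hyperparameters."""
    C, gamma, eps = params
    return cross_val_score(SVR(C=C, gamma=gamma, epsilon=eps),
                           X, y, cv=3, scoring="r2").mean()

# Random initial population, log-uniform over plausible ranges
pop = [(10 ** rng.uniform(-1, 3), 10 ** rng.uniform(-4, 0),
        10 ** rng.uniform(-2, 1)) for _ in range(12)]

for _ in range(10):  # generations
    pop.sort(key=fitness, reverse=True)
    parents = pop[:4]  # elitist selection: keep the best individuals
    children = []
    while len(parents) + len(children) < 12:
        a, b = rng.choice(4, 2, replace=False)
        # Geometric crossover in log space plus Gaussian mutation
        children.append(tuple(
            np.exp((np.log(parents[a][i]) + np.log(parents[b][i])) / 2
                   + rng.normal(0, 0.3)) for i in range(3)))
    pop = parents + children

best = max(pop, key=fitness)
print("best (C, gamma, epsilon):", best, "CV R^2: %.3f" % fitness(best))
```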

Open Access Article
Scenario-Based Real-Time Flood Prediction with Logistic Regression
Water 2021, 13(9), 1191; https://doi.org/10.3390/w13091191 - 25 Apr 2021
Abstract
This study proposed a real-time flood extent prediction method to shorten the time from flood occurrence to alert issuance. The method uses logistic regression to generate a flood probability discriminant for each grid cell constituting the study area and then predicts the flood extent from the amount of runoff caused by rainfall. To generate the discriminant for each grid cell, a two-dimensional (2D) flood inundation model was first verified against Typhoon Chaba, which caused great damage to the study area in 2016. Then, 100 probability rainfall scenarios were created by combining return periods, durations, and time distributions from past observed rainfall data, and rainfall-runoff-inundation relation databases were built for each scenario by applying hydrodynamic and hydrological models. A logistic regression discriminant was fitted for each grid cell using whether the cell was flooded (1 or 0) at each runoff amount in the database. When a runoff amount is input to the fitted discriminant, its coefficients yield the flood probability for the target cell, so the flood extent can be predicted quickly. The proposed method predicted the flood extent within a few seconds and showed high accuracy of 83.6–98.4% and 74.4–99.1% for scenario rainfall and actual rainfall, respectively.
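
The per-grid discriminant can be sketched as follows: one logistic regression per grid cell maps basin runoff to a flood probability, so a single forecast runoff value is evaluated across all cells in one fast pass. The runoff values, thresholds, and flood labels below are synthetic stand-ins for the scenario database.

```python
# Sketch: one logistic-regression flood discriminant per grid cell.
# Synthetic scenario database; thresholds and noise are assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
runoff = rng.uniform(10, 500, 100).reshape(-1, 1)  # m^3/s, 100 scenarios
n_grids = 4
# Each grid floods above its own (hidden) runoff threshold, with noise
thresholds = np.array([120.0, 220.0, 310.0, 450.0])
flooded = (runoff + rng.normal(0, 30, (100, n_grids)) > thresholds).astype(int)

discriminants = []
for g in range(n_grids):
    clf = LogisticRegression()
    clf.fit(runoff, flooded[:, g])  # flooded state (1 or 0) vs. runoff
    discriminants.append(clf)

# Real-time step: given a forecast runoff, evaluate every grid at once
q_forecast = np.array([[260.0]])
probs = [clf.predict_proba(q_forecast)[0, 1] for clf in discriminants]
print("flood probability per grid:", np.round(probs, 3))
```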

Open Access Article
Improving Radar-Based Rainfall Forecasts by Long Short-Term Memory Network in Urban Basins
Water 2021, 13(6), 776; https://doi.org/10.3390/w13060776 - 12 Mar 2021
Abstract
Radar-based rainfall forecasts, produced by extrapolation algorithms, are widely used in precipitation nowcasting systems for lead times of up to six hours. Nevertheless, the reliability of these forecasts gradually declines with lead time for heavy rain events due to limited predictability. Recently, data-driven approaches have been widely applied to hydrological problems. In this research, data-driven models were developed from the output of a radar forecasting system, the McGill Algorithm for Precipitation nowcasting by Lagrangian Extrapolation (MAPLE), and from ground rain gauges at thirteen urban stations in five metropolitan cities in South Korea. The twenty-five MAPLE data points surrounding each rain station were used as the model input, and the observed rainfall at the corresponding gauge was used as the model output. A comparison of five data-driven models, including multiple linear regression (MLR), multivariate adaptive regression splines (MARS), the multi-layer perceptron (MLP), a basic recurrent neural network (RNN), and long short-term memory (LSTM), showed that the LSTM network was superior at improving 180-min rainfall forecasts at the stations. Although the model still underestimated extreme rainfall values at some of the examined stations, this study showed that the LSTM can provide reliable performance and can serve as an optional method for improving station rainfall forecasts in urban basins.
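
A rough sketch of this post-processing setup follows: an LSTM maps sequences of the 25 MAPLE grid values around a gauge to the observed gauge rainfall. The 25-feature input follows the abstract; the sequence length, network size, and data are placeholder assumptions.

```python
# Sketch: LSTM post-processing of radar nowcasts at a gauge.
# Random placeholder data; sequence length is an assumption.
import numpy as np
from tensorflow import keras

n_samples, seq_len, n_maple = 1000, 18, 25  # 18 steps assumed, 25 MAPLE points
X = np.random.rand(n_samples, seq_len, n_maple).astype("float32")
y = np.random.rand(n_samples, 1).astype("float32")  # observed gauge rainfall

model = keras.Sequential([
    keras.layers.Input(shape=(seq_len, n_maple)),
    keras.layers.LSTM(64),
    keras.layers.Dense(1, activation="relu"),  # rainfall is non-negative
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=64, verbose=0)
```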

Open Access Article
Deep Learning with Long Short Term Memory Based Sequence-to-Sequence Model for Rainfall-Runoff Simulation
Water 2021, 13(4), 437; https://doi.org/10.3390/w13040437 - 8 Feb 2021
Abstract
Accurate runoff prediction is an important task in fields such as agriculture, hydrology, and environmental studies. Recently, with massive improvements in computational systems and hardware, deep learning-based approaches have been applied for more accurate runoff prediction. In this study, a long short-term memory model with a sequence-to-sequence structure was applied to hourly runoff prediction from 2015 to 2019 in the Russian River basin, California, USA. The proposed model was used to predict hourly runoff with lead times of 1–6 h using runoff data observed at upstream stations. The model was evaluated in terms of event-based performance using statistical metrics including the root mean square error, Nash–Sutcliffe efficiency, peak runoff error, and peak time error. The results show that the proposed model outperforms support vector machine and conventional long short-term memory models. In addition, it has the best predictive ability for runoff events, which means it can be effective for developing short-term flood forecasting and warning systems. These results demonstrate that the deep learning-based approach has high predictive power for hourly runoff forecasting and that the sequence-to-sequence structure is an effective way to improve prediction results.
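
A compact encoder-decoder of the kind described can be sketched in Keras: the encoder reads past upstream runoff and the decoder emits a six-step forecast. The 24 h input window, layer sizes, and data are placeholder assumptions, not the paper's configuration.

```python
# Sketch: sequence-to-sequence LSTM for multi-step (1-6 h) runoff forecasts.
# Random placeholder data; window lengths and sizes are assumptions.
import numpy as np
from tensorflow import keras

past_len, lead, n_feat = 24, 6, 3  # 24 h of inputs from 3 upstream stations
X = np.random.rand(512, past_len, n_feat).astype("float32")
y = np.random.rand(512, lead, 1).astype("float32")

inputs = keras.layers.Input(shape=(past_len, n_feat))
_, state_h, state_c = keras.layers.LSTM(64, return_state=True)(inputs)
# Repeat the encoder summary as the decoder input at every lead time
decoder_in = keras.layers.RepeatVector(lead)(state_h)
decoder = keras.layers.LSTM(64, return_sequences=True)(
    decoder_in, initial_state=[state_h, state_c])
outputs = keras.layers.TimeDistributed(keras.layers.Dense(1))(decoder)

model = keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=64, verbose=0)
```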

Open Access Article
Hydrometeorological Drought Forecasting in Hyper-Arid Climates Using Nonlinear Autoregressive Neural Networks
Water 2020, 12(9), 2611; https://doi.org/10.3390/w12092611 - 18 Sep 2020
Cited by 1
Abstract
Drought forecasting is an essential component of efficient water resource management that helps water planners mitigate the severe consequences of water shortages. This is especially important in hyper-arid climates, where drought consequences are more drastic due to limited water resources and harsh environments. This paper presents a data-driven approach based on an artificial neural network algorithm for predicting droughts. Initially, the observed drought events in the State of Kuwait were tested for autocorrelation using a correlogram test. Owing to the cyclic nature of the observed drought time series, nonlinear autoregressive neural networks (NARs) were used to predict the occurrence of drought events, with the Levenberg–Marquardt algorithm used to train the NAR models. The approach was tested for forecasting 12- and 24-month droughts using the recently developed precipitation index (PI). Four statistical measures were used to assess model performance during training and validation. The performance metrics indicated that the drought predictions were reliable, with Nash–Sutcliffe values of 0.761–0.878 during the validation period. Additionally, the computed R2 values for the model forecasts ranged between 0.784 and 0.883, indicating the quality of the model predictions. These findings contribute to the development of more efficient drought forecasting tools for water managers in hyper-arid regions.
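
A NAR network can be sketched as a feedforward net predicting a drought index from its own lagged values. Note that scikit-learn trains with L-BFGS or Adam rather than the Levenberg–Marquardt algorithm used in the paper, and the index series and lag count below are synthetic assumptions.

```python
# Sketch: nonlinear autoregressive (NAR) prediction of a drought index.
# Synthetic monthly series; trained with L-BFGS, not Levenberg-Marquardt.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(3)
t = np.arange(360)  # 30 years of synthetic monthly index values
index = np.sin(2 * np.pi * t / 12) + 0.3 * rng.standard_normal(360)

lags = 12  # predict from the previous 12 months of the index itself
X = np.column_stack([index[i:len(index) - lags + i] for i in range(lags)])
y = index[lags:]

split = 300
nar = MLPRegressor(hidden_layer_sizes=(16,), solver="lbfgs",
                   max_iter=2000, random_state=0)
nar.fit(X[:split], y[:split])
pred = nar.predict(X[split:])

# Nash-Sutcliffe efficiency on the held-out validation period
nse = 1 - np.sum((y[split:] - pred) ** 2) / np.sum(
    (y[split:] - y[split:].mean()) ** 2)
print("validation Nash-Sutcliffe efficiency:", round(nse, 3))
```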

Open Access Article
Utility of Artificial Neural Networks in Modeling Pan Evaporation in Hyper-Arid Climates
Water 2020, 12(5), 1508; https://doi.org/10.3390/w12051508 - 25 May 2020
Cited by 8
Abstract
Evaporation is the major water-loss component of the hydrologic cycle and thus requires efficient management. This study aims to model daily pan evaporation rates in hyper-arid climates using artificial neural networks (ANNs). Hyper-arid climates are characterized by harsh environmental conditions in which annual precipitation does not exceed 3% of annual evaporation. For the first time, ANNs were applied to model such climatic conditions in the State of Kuwait. Pan evaporation data from 1993–2015 were normalized to a 0–1 range to boost ANN performance, and the ANN structure was optimized by testing various combinations of meteorological inputs. The Levenberg–Marquardt algorithm was used to train the ANN models. The proposed ANN was satisfactorily efficient in modeling pan evaporation under these hyper-arid climatic conditions, with Nash–Sutcliffe coefficients ranging from 0.405 to 0.755 over the validation period. Mean air temperature and average wind speed were identified as the meteorological variables that most influenced ANN performance. A sensitivity analysis showed that the number of hidden layers did not significantly affect ANN performance. The ANN models showed considerable bias in predicting high pan evaporation rates (>25 mm/day). The proposed modeling method may assist water managers in Kuwait and other hyper-arid regions in establishing resilient water-management plans.
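
The preprocessing and input-selection steps described above can be sketched as follows: meteorological inputs and pan evaporation are scaled to 0–1, and candidate input combinations are compared by validation score. The data, variable names, and candidate inputs are synthetic stand-ins.

```python
# Sketch: 0-1 normalization plus brute-force input-combination testing
# for an ANN pan-evaporation model. All data are synthetic assumptions.
import itertools
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import MinMaxScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
n = 800
data = {
    "t_mean": rng.normal(30, 8, n),  # mean air temperature, deg C
    "wind": rng.gamma(2, 2, n),      # average wind speed, m/s
    "rh": rng.uniform(5, 60, n),     # relative humidity, %
}
pan_evap = (0.5 * data["t_mean"] + 1.5 * data["wind"]
            - 0.1 * data["rh"] + rng.normal(0, 1.5, n))

y = MinMaxScaler().fit_transform(pan_evap.reshape(-1, 1)).ravel()
best = None
for k in (1, 2, 3):
    for combo in itertools.combinations(data, k):
        X = MinMaxScaler().fit_transform(
            np.column_stack([data[v] for v in combo]))
        X_tr, X_va, y_tr, y_va = train_test_split(X, y, random_state=0)
        score = MLPRegressor(hidden_layer_sizes=(8,), solver="lbfgs",
                             max_iter=2000, random_state=0
                             ).fit(X_tr, y_tr).score(X_va, y_va)
        if best is None or score > best[0]:
            best = (score, combo)
print("best input combination:", best[1], "R^2:", round(best[0], 3))
```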

Review

Open Access Editor's Choice Review
Big Data Analytics and Its Role to Support Groundwater Management in the Southern African Development Community
Water 2020, 12(10), 2796; https://doi.org/10.3390/w12102796 - 9 Oct 2020
Abstract
Big data analytics (BDA) is a novel concept focused on leveraging large volumes of heterogeneous data through advanced analytics to drive information discovery. This paper highlights the potential role BDA can play in improving groundwater management in the Southern African Development Community (SADC) region in Africa. Through a review of the literature, the paper defines the concepts of big data, big data sources in groundwater, big data analytics, and big data platforms and frameworks, and explains how they can support groundwater management in the SADC region. BDA may support groundwater management in the region by filling data gaps and transforming these data into useful information. In recent times, machine learning and artificial intelligence have stood out as novel tools for data-driven modeling. Managing big data from collection to information delivery requires the careful application of selected tools, techniques, and methods; hence, this paper presents a conceptual framework that can be used to manage the implementation of BDA in a groundwater management context. It then highlights the challenges limiting the application of BDA, which include technological constraints and institutional barriers. In conclusion, the paper shows that sufficient big data exist in the groundwater domain and that BDA tools exist for use in the groundwater sciences, thereby providing a basis for further exploring data-driven approaches in groundwater management.
