Special Issue "Machine Learning Applied to Hydraulic and Hydrological Modelling"

A special issue of Water (ISSN 2073-4441). This special issue belongs to the section "Hydraulics".

Deadline for manuscript submissions: 30 September 2019.

Special Issue Editors

Guest Editor
Dr. Vasilis Bellos

School of Rural and Surveying Engineering, National Technical University of Athens, 9 Iroon Polytechniou, 15780, Zografou, Greece
Interests: hydrodynamic modelling; flood modelling; urban catchment studies; integrated modelling; hydrological modelling; model reduction
Guest Editor
Dr. Juan Pablo Carbajal

Eawag, Urban Water Management, Überlandstrasse 133, CH-8600 Dübendorf, Switzerland

Special Issue Information

Dear Colleagues,

The computational power available nowadays allows us to tackle simulation challenges in hydraulic and hydrological modelling, at scales that were impossible a few decades ago. Even so, the time required by these simulations is still too long for many scientific and engineering applications, such as decision support systems, flood warning systems, the design or optimization of hydraulic structures, calibration of model parameters, uncertainty quantification, and real-time model-based control.

To address this issue, developing fast surrogate models that increase simulation speed is a promising strategy: It does not require a large investment in new hardware or software, and the same tools can be used to solve very different problems. The field of Machine Learning offers a rich library of methods for building surrogate models, many of which have already been used successfully in hydraulic and hydrological modelling.

For this Special Issue, we invite research works that incorporate Machine Learning techniques in hydraulic and hydrological modelling, including (but not restricted to):

-   Artificial Science, in which the relation between inputs and outputs is learned from data alone (also known as data-driven methods).

-   Scientific Numerical Modelling, such as simplified numerical models, model calibration (system identification) or optimization, renormalized models, up- or downscaled models, coarse models, etc.

-   Emulation, where a fast emulator is trained on data generated by a slow simulator (a minimal sketch of this idea follows the list).
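
To make the emulation idea concrete, here is a minimal, hedged sketch in Python: a Gaussian process regressor is fitted to a handful of runs of a hypothetical `slow_simulator` (a stand-in for any expensive hydraulic or hydrological model) and the cheap emulator is then queried wherever many evaluations are needed.

```python
# Minimal emulation sketch: replace a slow simulator with a fast surrogate.
# "slow_simulator" is a hypothetical stand-in for an expensive numerical code
# (e.g., a 2D flood model); the design points and kernel are illustrative only.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def slow_simulator(x):
    # Placeholder for an expensive model run.
    return np.sin(3.0 * x) + 0.1 * x ** 2

# 1. Design of experiments: a small set of expensive simulator runs.
X_train = np.linspace(0.0, 2.0, 15).reshape(-1, 1)
y_train = slow_simulator(X_train).ravel()

# 2. Fit the emulator on the simulator's input/output pairs.
emulator = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), alpha=1e-6)
emulator.fit(X_train, y_train)

# 3. Query the cheap emulator (with uncertainty) wherever many evaluations are
#    needed, e.g., calibration, uncertainty quantification, real-time control.
X_new = np.linspace(0.0, 2.0, 200).reshape(-1, 1)
y_pred, y_std = emulator.predict(X_new, return_std=True)
```

Any regression method with a usable uncertainty estimate could play the same role; the Gaussian process here is only one common choice.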

Dr. Vasilis Bellos
Dr. Juan Pablo Carbajal
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Water is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Published Papers (11 papers)


Research

Open Access Article
Random Forest Ability in Regionalizing Hourly Hydrological Model Parameters
Water 2019, 11(8), 1540; https://doi.org/10.3390/w11081540
Received: 29 May 2019 / Revised: 9 July 2019 / Accepted: 22 July 2019 / Published: 25 July 2019
Abstract
This study investigated the potential of random forest (RF) algorithms for regionalizing the parameters of an hourly hydrological model. The relationships between model parameters and climate/landscape catchment descriptors were multidimensional and exhibited nonlinear features. In this case, machine-learning tools offered the option of efficiently handling such relationships using a large sample of data. The performance of the regionalized model using RF was assessed in comparison with local calibration and two benchmark regionalization approaches. Two catchment sets were considered: (1) A target pseudo-ungauged catchment set was composed of 120 urban ungauged catchments and (2) 2105 gauged American and French catchments were used for constructing the RF. By using pseudo-ungauged urban catchments, we aimed at assessing the potential of the RF to detect the specificities of the urban catchments. Results showed that RF-regionalized models allowed for slightly better streamflow simulations on ungauged sites compared with benchmark regionalization approaches. Yet, constructed RFs were weakly sensitive to the urbanization features of the catchments, which prevents their use in straightforward scenarios of the hydrological impacts of urbanization.
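
As a hedged illustration of the regionalization step described in this abstract (the arrays, descriptor counts, and hyperparameters below are hypothetical placeholders, not the authors' data or setup), a multi-output random forest can map catchment descriptors to calibrated model parameters and then be queried for an ungauged catchment:

```python
# Sketch: regionalizing hydrological model parameters with a random forest.
# "descriptors" and "calibrated_params" are synthetic; in the paper they would
# be climate/landscape descriptors and locally calibrated parameters of an
# hourly hydrological model for the gauged catchments.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n_gauged, n_descriptors, n_params = 2105, 8, 4

descriptors = rng.normal(size=(n_gauged, n_descriptors))    # catchment descriptors
calibrated_params = rng.normal(size=(n_gauged, n_params))   # locally calibrated parameters

# One multi-output forest: descriptors -> model parameters.
rf = RandomForestRegressor(n_estimators=500, min_samples_leaf=5, random_state=0)
rf.fit(descriptors, calibrated_params)

# Transfer parameters to a (pseudo-)ungauged catchment from its descriptors only.
ungauged_descriptors = rng.normal(size=(1, n_descriptors))
regionalized_params = rf.predict(ungauged_descriptors)
```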

Open Access Article
Seepage Comprehensive Evaluation of Concrete Dam Based on Grey Cluster Analysis
Water 2019, 11(7), 1499; https://doi.org/10.3390/w11071499
Received: 17 June 2019 / Revised: 8 July 2019 / Accepted: 17 July 2019 / Published: 19 July 2019
Abstract
Most concrete dams have seepage problems to some degree, so it is a common strategy to maintain ongoing monitoring and take timely repair measures. In order to grasp the real operation state of dam seepage, it is vital to analyze the measured data of each monitoring indicator and establish an appropriate prediction equation. However, dam seepage states under the load and environmental influences are very complicated, involving various monitoring indicators and multiple monitoring points of each indicator. For the purpose of maintaining the temporal continuity and spatial correlation of monitoring objects, this paper used a multi-indicator grey clustering analysis model to explore the grey correlation among various indicators, and realized a comprehensive evaluation of a dam seepage state by computation of the clustering coefficient. The case study shows that the proposed method can be successfully applied to the health monitoring of concrete dam seepage.
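
For readers unfamiliar with grey clustering, here is a simplified sketch of a generic grey clustering evaluation, not necessarily the paper's exact formulation: each indicator gets a whitenization weight function per grey class, and an object is assigned to the class with the largest weighted clustering coefficient. All values, weights, and class bounds below are hypothetical.

```python
# Simplified grey clustering sketch with triangular whitenization weight
# functions; indicator values are assumed normalized to [0, 1].
import numpy as np

def triangular(x, left, center, right):
    """Triangular whitenization weight function."""
    if x <= left or x >= right:
        return 0.0
    if x <= center:
        return (x - left) / (center - left)
    return (right - x) / (right - center)

# Hypothetical: 3 seepage indicators, 3 grey classes ("normal", "alert", "abnormal").
class_params = [            # (left, center, right) per class, shared across
    (0.0, 0.2, 0.5),        # indicators here only to keep the sketch short
    (0.2, 0.5, 0.8),
    (0.5, 0.8, 1.0),
]
indicator_weights = np.array([0.5, 0.3, 0.2])   # indicator weights, summing to 1

x = np.array([0.35, 0.55, 0.25])                # normalized values for one object

# Clustering coefficient per class = weighted sum of whitenization values.
coefficients = [
    sum(w * triangular(v, *p) for v, w in zip(x, indicator_weights))
    for p in class_params
]
assigned_class = int(np.argmax(coefficients))   # seepage state with the largest coefficient
```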

Open Access Article
Application of Long Short-Term Memory (LSTM) Neural Network for Flood Forecasting
Water 2019, 11(7), 1387; https://doi.org/10.3390/w11071387
Received: 28 May 2019 / Revised: 4 July 2019 / Accepted: 5 July 2019 / Published: 5 July 2019
Abstract
Flood forecasting is an essential requirement in integrated water resource management. This paper suggests a Long Short-Term Memory (LSTM) neural network model for flood forecasting, where the daily discharge and rainfall were used as input data. Moreover, characteristics of the data sets which may influence the model performance were also of interest. As a result, the Da River basin in Vietnam was chosen, and two different combinations of input data sets from before 1985 (when the Hoa Binh dam was built) were used for one-, two-, and three-day-ahead flowrate forecasting at Hoa Binh Station. The predictive ability of the model is quite impressive: The Nash–Sutcliffe efficiency (NSE) reached 99%, 95%, and 87% for the three forecasting cases, respectively. The findings of this study suggest a viable option for flood forecasting on the Da River in Vietnam, where the river basin stretches between many countries and downstream flows (Vietnam) may fluctuate suddenly due to flood discharge from upstream hydroelectric reservoirs.
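
A minimal sketch of an LSTM flowrate forecaster of the kind described here, written in PyTorch; the window length, hidden size, and training data are illustrative assumptions, not the paper's configuration.

```python
# Sketch of an LSTM discharge forecaster trained on daily [rainfall, discharge]
# windows; everything is synthetic and sized only for illustration.
import torch
import torch.nn as nn

class FloodLSTM(nn.Module):
    def __init__(self, n_features=2, hidden=64, horizon=1):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, horizon)    # 1-, 2-, or 3-day-ahead discharge

    def forward(self, x):                          # x: (batch, time, features)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])            # use the last hidden state

x = torch.randn(32, 30, 2)                         # 32 samples, 30-day input windows
y = torch.randn(32, 1)                             # next-day discharge (synthetic)

model = FloodLSTM()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for _ in range(10):                                # a few illustrative epochs
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
```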

Open Access Article
Concrete Dam Displacement Prediction Based on an ISODATA-GMM Clustering and Random Coefficient Model
Water 2019, 11(4), 714; https://doi.org/10.3390/w11040714
Received: 15 February 2019 / Revised: 23 March 2019 / Accepted: 2 April 2019 / Published: 6 April 2019
Cited by 1
Abstract
Displacement data modelling is of great importance for the safety control of concrete dams. Commonly used artificial intelligence methods model the displacement data at each monitoring point individually, i.e., the correlations between monitoring points are overlooked, which leads to over-fitting and limits the generalization of the model. A novel model combining iterative self-organizing data analysis and Gaussian mixture model (ISODATA-GMM) clustering with the random coefficient method is proposed in this article, which takes the temporal-spatial correlation among the monitoring points into account. By building models for all the points simultaneously, the random coefficient model improves generalization by reducing the number of free model variables. Since the random coefficient model assumes that the data follow normal distributions, an ISODATA-GMM clustering algorithm is used to classify the measuring points into several groups according to their temporal and spatial characteristics, so that each group follows one distribution. The resulting model therefore has a stronger generalization ability.
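
As a hedged sketch of the clustering step only (the features are hypothetical summaries of each point's displacement series, and simple BIC model selection stands in for the ISODATA split/merge logic used in the paper), monitoring points can be grouped with a Gaussian mixture model before fitting one regression per group:

```python
# Sketch: grouping dam monitoring points with a Gaussian mixture model so that
# each group can then be modelled by a single (random coefficient) regression.
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
n_points = 60
features = np.column_stack([
    rng.normal(size=n_points),     # e.g., mean displacement of the point
    rng.normal(size=n_points),     # e.g., seasonal amplitude
    rng.uniform(size=n_points),    # e.g., position along the dam axis
    rng.uniform(size=n_points),    # e.g., elevation of the point
])

X = StandardScaler().fit_transform(features)

# Pick the number of groups by BIC (a simple stand-in for ISODATA, which
# adjusts the cluster count by splitting and merging).
best = min(
    (GaussianMixture(n_components=k, random_state=0).fit(X) for k in range(2, 7)),
    key=lambda gmm: gmm.bic(X),
)
groups = best.predict(X)           # one regression model is then built per group
```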

Open Access Article
A Performance Comparison of Machine Learning Algorithms for Arced Labyrinth Spillways
Water 2019, 11(3), 544; https://doi.org/10.3390/w11030544
Received: 29 January 2019 / Revised: 8 March 2019 / Accepted: 13 March 2019 / Published: 16 March 2019
Abstract
Labyrinth weirs provide an economic option for flow control structures in a variety of applications, including as spillways at dams. The cycles of labyrinth weirs are typically placed in a linear configuration. However, numerous projects place labyrinth cycles along an arc to take advantage of reservoir conditions and dam alignment, and to reduce construction costs such as narrowing the spillway chute. Practitioners must optimize more than 10 geometric variables when developing a head–discharge relationship. This is typically done using the following tools: empirical relationships, numerical modeling, and physical modeling. This study applied a new tool, machine learning, to the analysis of the geometrically complex arced labyrinth weirs. In this work, both neural networks (NN) and random forests (RF) were employed to estimate the discharge coefficient for this specific type of weir with the results of physical modeling experiments used for training. Machine learning results are critiqued in terms of accuracy, robustness, interpolation, applicability, and new insights into the hydraulic performance of arced labyrinth weirs. Results demonstrate that NN and RF algorithms can be used as a unique expression for curve fitting, although neural networks outperformed random forest when interpolating among the tested geometries.
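
A hedged sketch of the kind of NN-versus-RF comparison this abstract describes; the synthetic features stand in for the weir's geometric and flow variables, and the target for the measured discharge coefficient, so only the workflow (not the data or results) reflects the paper.

```python
# Sketch: comparing a neural network and a random forest for predicting a
# discharge coefficient from geometric/flow variables (all data synthetic).
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
X = rng.uniform(size=(300, 10))                          # >10 geometric variables in practice
y = 0.3 + 0.5 * X[:, 0] * X[:, 1] + 0.05 * rng.normal(size=300)   # synthetic coefficient

models = {
    "NN": make_pipeline(StandardScaler(),
                        MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000)),
    "RF": RandomForestRegressor(n_estimators=300, random_state=0),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="r2")   # accuracy proxy
    print(name, scores.mean())
```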

Open Access Article
Subdaily Rainfall Estimation through Daily Rainfall Downscaling Using Random Forests in Spain
Water 2019, 11(1), 125; https://doi.org/10.3390/w11010125
Received: 5 December 2018 / Revised: 30 December 2018 / Accepted: 7 January 2019 / Published: 11 January 2019
Cited by 2
Abstract
Subdaily rainfall data, though essential for applications in many fields, is not as readily available as daily rainfall data. In this work, regression approaches that use atmospheric data and daily rainfall statistics as predictors are evaluated to downscale daily-to-subdaily rainfall statistics on more than 700 hourly rain gauges in Spain. We propose a new approach based on machine learning techniques that improves the downscaling skill of previous methodologies. Results are grouped by climate types (following the Köppen–Geiger classification) to investigate possible missing explanatory variables in the analysis. The methodology is then used to improve the ability of Poisson cluster models to simulate hourly rainfall series that mimic the statistical behavior of the observed ones. This approach can be applied for the study of extreme events and for daily-to-subdaily precipitation disaggregation in any location of Spain where daily rainfall data are available.

Open Access Article
Using Adjacent Buoy Information to Predict Wave Heights of Typhoons Offshore of Northeastern Taiwan
Water 2018, 10(12), 1800; https://doi.org/10.3390/w10121800
Received: 2 November 2018 / Revised: 26 November 2018 / Accepted: 6 December 2018 / Published: 7 December 2018
Abstract
In the northeastern sea area of Taiwan, typhoon-induced long waves often cause rogue waves that endanger human lives. Therefore, having the ability to predict wave height during the typhoon period is critical. The Central Weather Bureau maintains the Longdong and Guishandao buoys in the northeastern sea area of Taiwan to conduct long-term monitoring and collect oceanographic data. However, records have often become lost and the buoys have suffered other malfunctions, causing a lack of complete information concerning wind-generated waves. The goal of the present study was to determine the feasibility of using information collected from the adjacent buoy to predict waves. In addition, the effects of various factors such as the path of a typhoon on the prediction accuracy of data from both buoys are discussed herein. This study established a prediction model, and two scenarios were used to assess the performance: Scenario 1 included information from the adjacent buoy and Scenario 2 did not. An artificial neural network was used to establish the wave height prediction model. The research results demonstrated that (1) Scenario 1 achieved superior performance with respect to absolute errors, relative errors, and efficiency coefficient (CE) compared with Scenario 2; (2) the CE of Longdong (0.802) was higher than that of Guishandao (0.565); and (3) various types of typhoon paths were observed by examining each typhoon. The present study successfully determined the feasibility of using information from the adjacent buoy to predict waves. In addition, the effects of various factors such as the path of a typhoon on the prediction accuracy of both buoys were also discussed.
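
A hedged sketch of the two-scenario comparison idea: the same artificial neural network is trained once with and once without features from an adjacent buoy. The feature choices and the synthetic target below are assumptions for illustration, not the study's data.

```python
# Sketch: predicting wave height at one buoy with and without adjacent-buoy features.
import numpy as np
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
n = 1000
local = rng.normal(size=(n, 3))        # e.g., local wind speed, pressure, past wave height
adjacent = rng.normal(size=(n, 2))     # e.g., adjacent buoy wave height and period
target = (local @ np.array([0.5, 0.2, 0.8])
          + adjacent @ np.array([0.6, 0.1])
          + rng.normal(scale=0.1, size=n))

scenarios = {
    "Scenario 1 (with adjacent buoy)": np.hstack([local, adjacent]),
    "Scenario 2 (local data only)": local,
}
for name, X in scenarios.items():
    X_tr, X_te, y_tr, y_te = train_test_split(X, target, random_state=0)
    ann = make_pipeline(StandardScaler(),
                        MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000))
    ann.fit(X_tr, y_tr)
    print(name, r2_score(y_te, ann.predict(X_te)))
```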

Open Access Article
A Machine Learning Approach to Evaluating the Damage Level of Tooth-Shape Spur Dikes
Water 2018, 10(11), 1680; https://doi.org/10.3390/w10111680
Received: 11 October 2018 / Revised: 10 November 2018 / Accepted: 14 November 2018 / Published: 17 November 2018
Abstract
Little research has been done on the application of machine learning approaches to evaluating the damage level of river training structures on the Yangtze River. In this paper, two machine learning approaches to evaluating the damage level of spur dikes with tooth-shaped structures are proposed: a supervised support vector machine (SVM) model and an unsupervised model combining a Kohonen neural network with an SVM model (KNN-SVM). It was found that the supervised SVM model predicted the damage level of the validation samples with high accuracy, and the unsupervised data-mining KNN-SVM model agreed well with the empirical evaluation result. It is shown that both machine learning approaches could become effective tools to evaluate the damage level of spur dikes and other river training structures.
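
A hedged sketch of the supervised branch only: an SVM classifier mapping hypothetical hydraulic/structural features to a damage level. Note that "KNN-SVM" in this abstract refers to a Kohonen neural network combined with an SVM, which would add a self-organizing-map clustering step before training and is not shown here.

```python
# Sketch: supervised SVM classification of spur-dike damage levels from
# synthetic features (flow velocity, scour depth, etc. in a real application).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(4)
X = rng.normal(size=(200, 5))            # features of each dike/event (synthetic)
y = rng.integers(0, 3, size=200)         # damage level: 0 light, 1 moderate, 2 severe

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
svm = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
svm.fit(X_tr, y_tr)
print("validation accuracy:", svm.score(X_te, y_te))
```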

Open Access Article
Assessment of Machine Learning Techniques for Monthly Flow Prediction
Water 2018, 10(11), 1676; https://doi.org/10.3390/w10111676
Received: 11 October 2018 / Revised: 9 November 2018 / Accepted: 14 November 2018 / Published: 17 November 2018
Cited by 4
Abstract
Monthly flow predictions provide an essential basis for efficient decision-making regarding water resource allocation. In this paper, the performance of different popular data-driven models for monthly flow prediction is assessed to detect the appropriate model. The considered methods include feedforward neural networks (FFNNs), time delay neural networks (TDNNs), radial basis function neural networks (RBFNNs), recurrent neural networks (RNNs), a grasshopper optimization algorithm (GOA)-based support vector machine (SVM), and a K-nearest neighbors (KNN) model. For this purpose, the performance of each model is evaluated in terms of several residual metrics using a monthly flow time series for two real case studies with different flow regimes. The results show that the KNN outperforms the different neural network configurations for the first case study, whereas the RBFNN model has better performance for the second case study in terms of the correlation coefficient. According to the accuracy of the results, in the first case study with more input features, the KNN model is recommended for short-term predictions, and for the second case with a smaller number of input features but more training observations, the RBFNN model is suitable.
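
As one hedged example of the data-driven models compared here, a K-nearest-neighbors regressor can predict monthly flow from lagged flows; the lag count and the synthetic series below are assumptions for illustration only.

```python
# Sketch: KNN regression for monthly flow prediction from lagged flows,
# evaluated with time-series cross-validation (all data synthetic).
import numpy as np
from sklearn.model_selection import TimeSeriesSplit, cross_val_score
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(5)
flow = np.abs(rng.normal(size=240))      # 20 years of monthly flows (synthetic)

lags = 3
X = np.column_stack([flow[i:len(flow) - lags + i] for i in range(lags)])  # 3 lagged flows
y = flow[lags:]                                                           # next month's flow

knn = KNeighborsRegressor(n_neighbors=5, weights="distance")
scores = cross_val_score(knn, X, y, cv=TimeSeriesSplit(n_splits=5), scoring="r2")
print(scores.mean())
```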

Open Access Article
Integration of a Parsimonious Hydrological Model with Recurrent Neural Networks for Improved Streamflow Forecasting
Water 2018, 10(11), 1655; https://doi.org/10.3390/w10111655
Received: 11 September 2018 / Revised: 3 November 2018 / Accepted: 9 November 2018 / Published: 14 November 2018
Cited by 3
Abstract
This study applied a GR4J model in the Xiangjiang and Qujiang River basins for rainfall-runoff simulation. Four recurrent neural networks (RNNs)—the Elman recurrent neural network (ERNN), echo state network (ESN), nonlinear autoregressive exogenous inputs neural network (NARX), and long short-term memory (LSTM) network—were applied in predicting discharges. The performances of models were compared and assessed, and the best two RNNs were selected and integrated with the lumped hydrological model GR4J to forecast the discharges; meanwhile, uncertainties of the simulated discharges were estimated. The generalized likelihood uncertainty estimation method was applied to quantify the uncertainties. The results show that the LSTM and NARX better captured the time-series dynamics than the other RNNs. The hybrid models improved the prediction of high, median, and low flows, particularly in reducing the bias of underestimation of high flows in the Xiangjiang River basin. The hybrid models reduced the uncertainty intervals by more than 50% for median and low flows, and increased the cover ratios for observations. The integration of a hydrological model with a recurrent neural network considering long-term dependencies is recommended in discharge forecasting.

Open Access Article
Least Squares Support Vector Machines to Predict the Stability Number of Rubble-Mound Breakwaters
Water 2018, 10(10), 1452; https://doi.org/10.3390/w10101452
Received: 31 August 2018 / Revised: 11 October 2018 / Accepted: 11 October 2018 / Published: 15 October 2018
Abstract
In coastal engineering, empirical formulas grounded on experimental works regarding the stability of breakwaters have been developed. In recent years, soft computing tools such as artificial neural networks and fuzzy models have started to be employed to diminish the time and cost spent in these experimental works. To predict the stability number of rubble-mound breakwaters, the least squares version of support vector machines (LSSVM) is used here, as it can be considered an alternative to the diverse soft computing techniques. The LSSVM models are operated with seven parameters selected by Mallows' Cp approach, namely breakwater permeability, damage level, wave number, slope angle, water depth, significant wave height in front of the structure, and peak wave period. The LSSVM models have shown superior accuracy (correlation coefficient (CC) of 0.997) compared with the artificial neural network (ANN), fuzzy logic (FL), and genetic programming (GP) models implemented in the related literature. As a result, this study provides a practical way for readers to estimate the stability number of rubble-mound breakwaters with improved accuracy.
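
For readers unfamiliar with LSSVM, here is a hedged NumPy sketch of its standard dual formulation (a ridge-regularized kernel system solved as one linear system). The seven synthetic predictors only stand in for the parameters listed in the abstract; the hyperparameters and data are assumptions, not the paper's setup.

```python
# Sketch of least-squares SVM (LSSVM) regression with an RBF kernel.
# Dual system: [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

rng = np.random.default_rng(6)
X = rng.normal(size=(100, 7))            # seven standardized predictors (synthetic)
y = rng.normal(size=100)                 # stability number (synthetic)

gamma, sigma = 10.0, 1.0
K = rbf_kernel(X, X, sigma)
n = len(y)

A = np.zeros((n + 1, n + 1))
A[0, 1:] = 1.0
A[1:, 0] = 1.0
A[1:, 1:] = K + np.eye(n) / gamma
b_alpha = np.linalg.solve(A, np.concatenate(([0.0], y)))
b, alpha = b_alpha[0], b_alpha[1:]

# Prediction for new inputs: f(x) = sum_i alpha_i k(x, x_i) + b
X_new = rng.normal(size=(5, 7))
y_pred = rbf_kernel(X_new, X, sigma) @ alpha + b
```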
