# Simulating Reservoir Operation Using a Recurrent Neural Network Algorithm


## Abstract


## 1. Introduction

… × 10^{9} m^{3}; the regulating storage is 6.46 × 10^{9} m^{3}; and the total installed capacity is 13.86 GW. Combined with previous research results, in this study we choose three deep learning models, i.e., the RNN and two architectures developed from it, long short-term memory (LSTM) and the gated recurrent unit (GRU), to build the reservoir operation model and predict the reservoir outflow [20,21,22]. The goals of this study are (1) to examine the effect of the parameter settings on the simulation performance of the three models; (2) to explore the applicability of deep learning models to reservoir operation simulation; and (3) to analyze the rationality of the selected influence factors and the importance of each influence factor in reservoir operation decisions.

## 2. Methodology

The input data X ($x_1, x_2, \ldots, x_i, \ldots, x_{n_1}$) and output data Y ($y_1, y_2, \ldots, y_k, \ldots, y_{n_3}$) are connected by the hidden layer H ($h_1, h_2, \ldots, h_j, \ldots, h_{n_2}$), where $n_1$, $n_2$, and $n_3$ represent the total numbers of inputs, hidden neurons, and outputs, respectively. The numbers of input layer and output layer nodes are determined by the input and output data of the model. As an important parameter affecting the performance of the model, the number of hidden layer nodes is usually determined by a trial and error method during training. In this paper, the weight coefficient matrices are represented by W (for example, $W_{xh}$ represents the weight coefficient matrix from the input layer to the hidden layer), the offset vectors are represented by b ($b_h$ represents the offset vector of the hidden layer), and the activation function is represented by f(·). The learning and training process of the RNN is as follows:

$$h_t = f(W_{xh} x_t + W_{hh} h_{t-1} + b_h) \quad (1)$$

$$y_t = W_{hy} h_t + b_y \quad (2)$$
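The recurrence of Equations (1) and (2) can be sketched in a few lines of NumPy. The shapes below (3 inputs, 4 hidden units, 2 outputs) and the tanh activation are illustrative assumptions, not the paper's trained configuration:

```python
import numpy as np

def rnn_step(x_t, h_prev, W_xh, W_hh, W_hy, b_h, b_y, f=np.tanh):
    """One time step of a simple recurrent network.

    Implements the paper's Eq. (1) hidden-state update and
    Eq. (2) linear output layer.
    """
    h_t = f(W_xh @ x_t + W_hh @ h_prev + b_h)  # Eq. (1): hidden state update
    y_t = W_hy @ h_t + b_y                     # Eq. (2): output layer
    return h_t, y_t
```

Running the step over a sequence simply feeds each returned `h_t` back in as `h_prev` for the next time step, which is how the network carries information forward in time.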

The forget gate outputs a value $f_t$ between 0 and 1, based on the previous moment's output $h_{t-1}$ and the current input $x_t$, to decide whether to let the information $C_{t-1}$ produced at the previous moment pass or partially pass, as depicted by Equation (3):

$$f_t = \sigma(W_{xf} x_t + W_{hf} h_{t-1} + b_f) \quad (3)$$

The input gate then produces a value $i_t$, which controls how much new information can be added to the state. Later, the values produced by these two parts are combined to update the cell state:

$$i_t = \sigma(W_{xi} x_t + W_{hi} h_{t-1} + b_i) \quad (4)$$

$$C_t = f_t \times C_{t-1} + i_t \times \tanh(W_{xc} x_t + W_{hc} h_{t-1} + b_c) \quad (5)$$

Finally, the $C_t$ value is rescaled to the range −1 to 1 through tanh and multiplied by the output of the sigmoid gate, so that only the target parts are output:

$$o_t = \sigma(W_{xo} x_t + W_{ho} h_{t-1} + b_o) \quad (6)$$

$$h_t = o_t \times \tanh(C_t) \quad (7)$$
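Equations (3)–(7) together form one LSTM time step. A minimal NumPy sketch follows; the dict-of-matrices layout keyed by the paper's subscripts is an implementation convenience, not the paper's code:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, C_prev, W, b):
    """One LSTM time step following Equations (3)-(7).

    W and b are dicts keyed like the paper's subscripts,
    e.g. W["xf"] is the input-to-forget-gate weight matrix.
    """
    f_t = sigmoid(W["xf"] @ x_t + W["hf"] @ h_prev + b["f"])  # Eq. (3): forget gate
    i_t = sigmoid(W["xi"] @ x_t + W["hi"] @ h_prev + b["i"])  # Eq. (4): input gate
    C_t = f_t * C_prev + i_t * np.tanh(
        W["xc"] @ x_t + W["hc"] @ h_prev + b["c"])            # Eq. (5): state update
    o_t = sigmoid(W["xo"] @ x_t + W["ho"] @ h_prev + b["o"])  # Eq. (6): output gate
    h_t = o_t * np.tanh(C_t)                                  # Eq. (7): hidden output
    return h_t, C_t
```

Note how the forget and input gates jointly decide, element by element, how much old state to keep and how much new candidate information to write, which is what lets the LSTM retain information over long sequences.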

The GRU merges the LSTM gates into an update gate $z_t$ and a reset gate $r_t$; the candidate state $h_t'$ and the output $h_t$ are computed as:

$$z_t = \sigma(x_t U_z + h_{t-1} W_z) \quad (8)$$

$$r_t = \sigma(x_t U_r + h_{t-1} W_r) \quad (9)$$

$$h_t' = \tanh(x_t U + r_t W h_{t-1}) \quad (10)$$

$$h_t = z_t h_{t-1} + (1 - z_t) h_t' \quad (11)$$
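The GRU step, Equations (8)–(11), can be sketched the same way. Two assumptions: the reset term in Equation (10) is implemented in the standard way, with $r_t$ applied elementwise to $h_{t-1}$ before the weight matrix, and the mixing in Equation (11) follows the paper's convention of weighting $h_{t-1}$ by $z_t$:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gru_step(x_t, h_prev, U_z, W_z, U_r, W_r, U, W):
    """One GRU time step following Equations (8)-(11).

    Uses the paper's row-vector convention (x_t U, h_{t-1} W).
    """
    z_t = sigmoid(x_t @ U_z + h_prev @ W_z)          # Eq. (8): update gate
    r_t = sigmoid(x_t @ U_r + h_prev @ W_r)          # Eq. (9): reset gate
    h_cand = np.tanh(x_t @ U + (r_t * h_prev) @ W)   # Eq. (10): candidate state
    h_t = z_t * h_prev + (1.0 - z_t) * h_cand        # Eq. (11): paper's mixing order
    return h_t
```

Compared with the LSTM sketch above, the GRU keeps a single state vector and two gates instead of a separate cell state and three gates, which is why it tends to train faster at comparable accuracy.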

## 3. Model Building

## 4. Results and Discussion

#### 4.1. Comparison of Parameter Sensitivity and Model Performance

Model performance is evaluated with the root mean square error (RMSE), the RMSE-observations standard deviation ratio (RSR), and the Nash-Sutcliffe efficiency (NSE), in their standard forms [29,30]:

$$\mathrm{RMSE} = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(s_i - o_i\right)^2}$$

$$\mathrm{RSR} = \frac{\sqrt{\sum_{i=1}^{n}\left(s_i - o_i\right)^2}}{\sqrt{\sum_{i=1}^{n}\left(o_i - \overline{o_i}\right)^2}}$$

$$\mathrm{NSE} = 1 - \frac{\sum_{i=1}^{n}\left(s_i - o_i\right)^2}{\sum_{i=1}^{n}\left(o_i - \overline{o_i}\right)^2}$$

where $s_i$ represents the model simulation value, $o_i$ represents the observed value, and $\overline{o_i}$ represents the average of the observed values. The indices RMSE and RSR are valuable because they indicate the error in the units (or squared units) of the constituent of interest. The NSE is a normalized statistic that determines the relative magnitude of the residual variance compared to the measured data variance [29]. As suggested by previous studies, an RMSE value less than half the standard deviation of the observed data may be considered low; if RSR < 0.7 and NSE > 0.5, model performance can be considered satisfactory; and an NSE value < 0.0 indicates that the mean observed value is a better predictor than the simulated value, i.e., unacceptable performance [30,31,32].
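The three indices are straightforward to compute. A minimal sketch (the population standard deviation, ddof = 0, is an assumption; the paper does not state which variant it uses):

```python
import numpy as np

def evaluate(sim, obs):
    """Return (RMSE, RSR, NSE) for simulated vs. observed series."""
    sim = np.asarray(sim, dtype=float)
    obs = np.asarray(obs, dtype=float)
    err = sim - obs
    rmse = np.sqrt(np.mean(err ** 2))
    rsr = rmse / np.std(obs)  # population std (ddof=0), an assumption
    nse = 1.0 - np.sum(err ** 2) / np.sum((obs - obs.mean()) ** 2)
    return rmse, rsr, nse
```

A perfect simulation gives RMSE = 0, RSR = 0, and NSE = 1; a simulation no better than the observed mean gives NSE = 0.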

#### 4.2. The Applicability of the Model in Reservoir Operation

#### 4.2.1. Model Performance Index

#### 4.2.2. Model Economic Index

#### 4.3. Importance Analysis of Input Factors

#### 4.4. Model Generalization Analysis

## 5. Conclusions

## Author Contributions

## Funding

## Conflicts of Interest

## References

1. Hossain, M.S.; El-Shafie, A. Intelligent systems in optimizing reservoir operation policy: A review. Water Resour. Manag. **2013**, 27, 3387–3407.
2. Lu, D.; Wang, B.; Wang, Y.; Zhou, H.; Liang, Q.; Peng, Y.; Roskilly, T. Optimal operation of cascade hydropower stations using hydrogen as storage medium. Appl. Energy **2015**, 137, 56–63.
3. Fang, G.H.; Guo, Y.X.; Huang, X.F.; Rutten, M.; Yuan, Y. Combining grey relational analysis and a Bayesian model averaging method to derive monthly optimal operating rules for a hydropower reservoir. Water **2018**, 10, 1099.
4. Haimes, Y.Y.; Hall, W.A. Sensitivity, responsivity, stability and irreversibility as multiple objectives in civil systems. Adv. Water Resour. **1977**, 1, 71–81.
5. Reed, P.M.; Hadka, D.; Herman, J.D.; Kasprzyk, J.R.; Kollat, J.B. Evolutionary multiobjective optimization in water resources: The past, present, and future. Adv. Water Resour. **2013**, 51, 438–456.
6. Chaves, P.; Tsukatani, T.; Kojiri, T. Operation of storage reservoir for water quality by using optimization and artificial intelligence techniques. Math. Comput. Simul. **2004**, 67, 419–432.
7. Yang, T.; Gao, X.; Sorooshian, S.; Li, X. Simulating California reservoir operation using the classification and regression-tree algorithm combined with a shuffled cross-validation scheme. Water Resour. Res. **2016**, 52, 1626–1651.
8. Chaves, P.; Chang, F.J. Intelligent reservoir operation system based on evolving artificial neural networks. Adv. Water Resour. **2008**, 31, 926–936.
9. Jain, S.K.; Das, A.; Srivastava, D.K. Application of ANN for reservoir inflow prediction and operation. J. Water Resour. Plan. Manag. **1999**, 125, 263–271.
10. Cortes, C.; Vapnik, V. Support-vector networks. Mach. Learn. **1995**, 20, 273–297.
11. Aboutalebi, M.; Haddad, O.B.; Loaiciga, H. Optimal monthly reservoir operation rules for hydropower generation derived with SVR-NSGAII. J. Water Resour. Plan. Manag. **2015**, 141, 04015029.
12. Ji, C.M.; Zhou, T.; Huang, H.T. Operating rules derivation of Jinsha reservoirs system with parameter calibrated support vector regression. Water Resour. Manag. **2014**, 28, 2435–2451.
13. Khalil, A.; McKee, M.; Kemblowski, M.; Asefa, T. Sparse Bayesian learning machine for real-time management of reservoir releases. Water Resour. Res. **2005**, 41, 4844–4847.
14. Su, J.; Wang, X.; Liang, Y.; Chen, B. GA-based support vector machine model for the prediction of monthly reservoir storage. J. Hydrol. Eng. **2013**, 19, 1430–1437.
15. Lin, J.; Cheng, C.; Chau, K. Using support vector machines for long-term discharge prediction. Hydrol. Sci. J. **2006**, 51, 599–612.
16. Yang, T.; Asanjan, A.A.; Faridzad, M.; Hayatbini, N.; Gao, X.; Sorooshian, S. An enhanced artificial neural network with a shuffled complex evolutionary global optimization with principal component analysis. Inf. Sci. **2016**, 418, 302–316.
17. LeCun, Y.; Bengio, Y.; Hinton, G. Deep learning. Nature **2015**, 521, 436.
18. Girshick, R.; Donahue, J.; Darrell, T.; Malik, J. Rich feature hierarchies for accurate object detection and semantic segmentation. In Proceedings of the 2014 IEEE Conference on Computer Vision and Pattern Recognition, Columbus, OH, USA, 24–27 June 2014; pp. 580–587.
19. Yuan, C.; Wu, C.C.; Shen, C.H.; Lee, H.Y. Unsupervised Learning of Audio Segment Representations Using Sequence-to-Sequence Recurrent Neural Networks. Available online: https://arxiv.org/abs/1603.00982v1 (accessed on 24 April 2019).
20. Chung, J.; Gulcehre, C.; Cho, K.H.; Bengio, Y. Empirical evaluation of gated recurrent neural networks on sequence modeling. arXiv **2014**, arXiv:1412.3555.
21. Gers, F.A.; Schmidhuber, J.; Cummins, F. Learning to forget: Continual prediction with LSTM. Neural Comput. **2000**, 12, 2451–2471.
22. Asanjan, A.A.; Yang, T.; Hsu, K.; Sorooshian, S.; Lin, J.; Peng, Q. Short-term precipitation forecast based on the PERSIANN system and LSTM recurrent neural networks. J. Geophys. Res. Atmos. **2018**, 123, 12–543.
23. Hochreiter, S.; Bengio, Y.; Frasconi, P.; Schmidhuber, J. Gradient flow in recurrent nets: The difficulty of learning long-term dependencies. In A Field Guide to Dynamical Recurrent Neural Networks; IEEE Press: Piscataway, NJ, USA, 2001; pp. 237–243.
24. Sak, H.; Senior, A.; Beaufays, F. Long Short-Term Memory Based Recurrent Neural Network Architectures for Large Vocabulary Speech Recognition. Available online: https://arxiv.org/abs/1402.1128v1 (accessed on 24 April 2019).
25. Werbos, P.J. Backpropagation through time: What it does and how to do it. Proc. IEEE **1990**, 78, 1550–1560.
26. Official Website of China Three Gorges Corporation. Available online: http://www.ctg.com.cn/ (accessed on 23 April 2019).
27. National Meteorological Information Center Home Page. Available online: https://data.cma.cn/ (accessed on 23 April 2019).
28. Keskar, N.S.; Mudigere, D.; Nocedal, J.; Smelyanskiy, M.; Tang, P.P.T. On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima. Available online: https://arxiv.org/abs/1609.04836 (accessed on 24 April 2019).
29. Nash, J.E.; Sutcliffe, J.V. River flow forecasting through conceptual models part I—A discussion of principles. J. Hydrol. **1970**, 10, 282–290.
30. Moriasi, D.N.; Arnold, J.G.; Van Liew, M.W.; Bingner, R.L.; Harmel, R.D.; Veith, T.L. Model evaluation guidelines for systematic quantification of accuracy in watershed simulations. Trans. ASABE **2007**, 50, 885–900.
31. Singh, J.; Knapp, H.V.; Arnold, J.G.; Demissie, M. Hydrological modeling of the Iroquois River watershed using HSPF and SWAT. J. Am. Water Resour. Assoc. **2005**, 41, 343–360.
32. Yang, T.; Asanjan, A.A.; Welles, E.; Gao, X.; Sorooshian, S.; Liu, X. Developing reservoir monthly inflow forecasts using artificial intelligence and climate phenomenon information. Water Resour. Res. **2017**, 53, 2786–2812.
33. Yao, X. Evolving artificial neural networks. Proc. IEEE **1999**, 87, 1423–1447.
34. Moody, J.O.; Antsaklis, P.J. The dependence identification neural network construction algorithm. IEEE Trans. Neural Netw. **1996**, 7, 3–15.
35. Zhu, F.; Zhong, P.-A.; Sun, Y.; Xu, B. Selection of criteria for multi-criteria decision making of reservoir flood control operation. J. Hydroinf. **2017**, 19, 558–571.
36. Kewley, R.H.; Embrechts, M.J.; Breneman, C. Data strip mining for the virtual design of pharmaceuticals with neural networks. IEEE Trans. Neural Netw. **2000**, 11, 668–679.
37. Zhang, C.; Bengio, S.; Hardt, M.; Recht, B.; Vinyals, O. Understanding deep learning requires rethinking generalization. arXiv **2016**, arXiv:1611.03530.

**Figure 2.** Recurrent neural network (**a**) and folded computational diagram (**b**), where the blue square indicates a delay of one time step.

**Figure 9.** Comparison of predicted and observed outflow using the RNN (**a**), LSTM (**b**), and GRU (**c**) algorithms.

**Figure 10.** Comparison of reservoir daily water level changes between the observed values (yellow) and the results calculated with RNN (green), LSTM (blue), and GRU (red).

**Figure 11.** Comparison of reservoir daily water level changes between the observed values (yellow) and the results calculated with RNN (green), LSTM (blue), and GRU (red).

**Figure 13.** Relative importance of the input variables under different algorithms (**a**) and different operating seasons (**b**).

| Input Factors Category | Input Factors | Unit |
|---|---|---|
| Time information | Month of a year | -- |
| | Hour of a day | h |
| Runoff and water level information | Reservoir inflow | m^{3}/s |
| | Reservoir outflow (previous moment) | m^{3}/s |
| | Maximum controlled operating level | m |
| | Minimum controlled operating level | m |
| Meteorological information | Precipitation | mm |
| | Evaporation | mm |
| | Relative humidity | % |
| | Air temperature | °C |
| Hydropower firm output information | Hydropower firm output | MW |
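The eleven factors above form the model input at each time step. A minimal sketch of assembling them into an input vector; the field names are hypothetical, and the paper's actual encoding and normalization are not reproduced here:

```python
import numpy as np

# Hypothetical field names for the 11 input factors in the table above.
FEATURES = [
    "month", "hour",
    "inflow_m3s", "outflow_prev_m3s",
    "max_level_m", "min_level_m",
    "precip_mm", "evap_mm", "rel_humidity_pct", "air_temp_c",
    "firm_output_mw",
]

def make_input_vector(record):
    """Stack one time step's influence factors into the model input x_t."""
    return np.array([record[name] for name in FEATURES], dtype=float)
```

In practice, each raw factor would also be scaled (e.g., min-max normalized) before being fed to the recurrent models, since the factors span very different numeric ranges.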

| Algorithm | Whole Year RMSE (m^{3}/s) | Whole Year RSR | Whole Year NSE | Time Consumption (s) | Flood Season RMSE (m^{3}/s) | Flood Season RSR | Flood Season NSE | Water Supply Season RMSE (m^{3}/s) | Water Supply Season RSR | Water Supply Season NSE |
|---|---|---|---|---|---|---|---|---|---|---|
| RNN | 472.66 | 0.188 | 0.964 | 7.87 | 558.88 | 0.294 | 0.913 | 392.74 | 0.353 | 0.875 |
| LSTM | 447.99 | 0.178 | 0.968 | 14.86 | 523.81 | 0.275 | 0.924 | 385.85 | 0.347 | 0.880 |
| GRU | 466.61 | 0.185 | 0.965 | 12.39 | 551.87 | 0.291 | 0.915 | 395.40 | 0.355 | 0.874 |

© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Zhang, D.; Peng, Q.; Lin, J.; Wang, D.; Liu, X.; Zhuang, J.
Simulating Reservoir Operation Using a Recurrent Neural Network Algorithm. *Water* **2019**, *11*, 865.
https://doi.org/10.3390/w11040865
