# Groundwater Level Prediction with Deep Learning Methods


## Abstract


## 1. Introduction

**Figure 1.** GAN model (source: Pan et al. [16]).

## 2. Materials and Methods

#### 2.1. Case Study

#### 2.2. Methodology

#### 2.2.1. Water Level Prediction Data Preprocessing

#### 2.2.2. Generative Adversarial Network

The generator network produces X_{fake}, which approximates real values, while both X_{fake} and X_{real} are used as inputs to the discriminator network. The training objective is a binary classification task aimed at distinguishing between X_{real} and X_{fake}.
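The discriminator's binary classification objective described above can be illustrated with a plain binary cross-entropy. This is a minimal pure-Python sketch, not the authors' implementation; the prediction and label values are hypothetical.

```python
import math

def bce_loss(preds, labels):
    """Binary cross-entropy: the discriminator's objective when classifying
    inputs as real (label 1) or generated (label 0)."""
    eps = 1e-12  # guard against log(0)
    return -sum(y * math.log(p + eps) + (1 - y) * math.log(1 - p + eps)
                for p, y in zip(preds, labels)) / len(preds)

# Hypothetical discriminator outputs: probability that each sample is real.
preds = [0.9, 0.8, 0.2, 0.1]
labels = [1, 1, 0, 0]  # first two samples are X_real, last two are X_fake
loss = bce_loss(preds, labels)
```

Training the generator then amounts to pushing the discriminator's predictions on X_{fake} toward the "real" label, which drives the imputed values toward the distribution of the observed data.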

#### 2.2.3. Long Short-Term Memory

#### 2.2.4. Convolutional Neural Network

#### 2.2.5. Objective Function

The following loss functions were considered for training, with the RMSE and R^{2} serving as evaluation indicators:

- MAE

- MSE

- Log-cosh loss
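The three candidate loss functions above can be written directly from their standard definitions. This is a pure-Python sketch; the observed and simulated values are illustrative only.

```python
import math

def mae(y_true, y_pred):
    """Mean absolute error: linear penalty, robust to outliers."""
    return sum(abs(p - t) for t, p in zip(y_true, y_pred)) / len(y_true)

def mse(y_true, y_pred):
    """Mean squared error: quadratic penalty, emphasises large errors."""
    return sum((p - t) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def log_cosh(y_true, y_pred):
    """Log-cosh loss: behaves like MSE/2 for small errors and like
    |error| - log(2) for large ones, so it is less outlier-sensitive."""
    return sum(math.log(math.cosh(p - t))
               for t, p in zip(y_true, y_pred)) / len(y_true)

observed = [1.0, 2.0, 3.0]
simulated = [1.1, 1.9, 3.2]
```

For small residuals such as these, log-cosh sits between zero and the MSE, which is the property that makes it attractive when occasional spikes occur in groundwater records.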

## 3. Results

#### 3.1. The Results of Imputation

#### 3.1.1. Hyperparameter Settings of GAN

- Adam

- ReLU

#### 3.1.2. Imputation Results

#### 3.1.3. Practical Imputation Case Discussion

#### 3.2. Prediction Results

#### 3.2.1. Hyperparameter Settings of CNN and LSTM

#### 3.2.2. Results of Prediction

The root-mean-square error (RMSE), mean absolute error (MAE), coefficient of determination (R^{2}), scatter index (SI), and BIAS were used to assess the performance of the models. The formulas for these statistical parameters are provided in the literature [64,65,66].
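As an illustration, the five indicators can be computed with their common textbook definitions. The exact formulas used by the authors follow [64,65,66], so this pure-Python sketch is representative rather than authoritative.

```python
import math

def evaluate(obs, sim):
    """Common formulations of the five indicators (RMSE, MAE, R2, SI, BIAS);
    the paper's exact definitions follow the cited literature."""
    n = len(obs)
    mean_obs = sum(obs) / n
    rmse = math.sqrt(sum((s - o) ** 2 for o, s in zip(obs, sim)) / n)
    mae = sum(abs(s - o) for o, s in zip(obs, sim)) / n
    ss_res = sum((s - o) ** 2 for o, s in zip(obs, sim))
    ss_tot = sum((o - mean_obs) ** 2 for o in obs)
    r2 = 1.0 - ss_res / ss_tot            # coefficient of determination
    si = rmse / mean_obs                  # scatter index: RMSE normalised by the mean
    bias = sum(s - o for o, s in zip(obs, sim)) / n
    return {"RMSE": rmse, "MAE": mae, "R2": r2, "SI": si, "BIAS": bias}
```

A positive BIAS indicates systematic overestimation of the water level, and SI expresses the RMSE relative to the mean observation, which is what allows comparison across stations with different water-level magnitudes.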

- Univariate and Seq2val

The R^{2} values of the four models were all as high as 0.99 (Table 7). The accuracy of the CNN was not lower than that of LSTM, which indicates that the CNN has strong potential for predicting one-dimensional data. Seq2val performed worse than univariate, suggesting that both methods handle multivariate input less well. It is speculated that, owing to the large difference in parameter magnitudes, rainfall was weighted too heavily, producing a slight shift that is reflected in the RMSE.

- Seq2seq

Up to T = 5, the R^{2} was as high as 0.95 (Table 8), which indicated good performance. Although the R^{2} for T = 5–7 remained above 0.9, the RMSE was not excellent; it is speculated that where the curve changed sharply, the endpoint simulation became slightly extreme. After T = 7, the evaluation indices gradually deteriorated, and the accuracy of the CNN decreased faster than that of LSTM, indicating that LSTM is still comparatively good at remembering long-term information. Overall, the simulation accuracy of both models is good, with the CNN slightly better. This confirms that the CNN can also be used as a tool for sequence data and has further potential in the field of hydrology.

LSTM has high accuracy on the smooth segments but shows slightly extreme behavior at the simulated endpoints, whereas the CNN evaluation indices are better. Nevertheless, both CNN and LSTM are suitable tools for simulating groundwater, with R^{2} values as high as 0.98 or more. At T = 5, both models are worse at simulating extreme values, and there is a shift in the smooth sections, resulting in a rapid increase in the RMSE. This is especially the case for Station 2, which is more affected by noise: the simulation curve of LSTM fluctuates gradually, but the R^{2} remains steady. Overall, both models can capture trend changes, indicating a fairly good ability to track variations. The simulation at T = 10, however, deviates significantly, with underestimation in the rising-water section and overestimation in the receding-water section. The offset of LSTM at Station 2 is particularly severe, with overestimation as high as 0.69 m. Therefore, the models cannot make reliable multistep forecasts 10 days in advance.

## 4. Conclusions

## Author Contributions

## Funding

## Data Availability Statement

## Acknowledgments

## Conflicts of Interest

## References

- Yu, P.-S.; Yang, T.-C.; Wu, C.-K. Impact of climate change on water resources in southern Taiwan. J. Hydrol. **2002**, 260, 161–175.
- Chen, C.-H.; Wang, C.-H.; Hsu, Y.-J.; Yu, S.-B.; Kuo, L.-C. Correlation between groundwater level and altitude variations in land subsidence area of the Choshuichi Alluvial Fan, Taiwan. Eng. Geol. **2010**, 115, 122–131.
- Tran, D.-H.; Wang, S.-J. Land subsidence due to groundwater extraction and tectonic activity in Pingtung Plain, Taiwan. Proc. IAHS **2020**, 382, 361–365.
- WRA (Water Resources Agency). Report of the Monitoring, Investigating and Analyzing of Land Subsidence in Taiwan (1/4); Ministry of Economic Affairs, Executive Yuan: Taipei, Taiwan, 2001. (In Chinese)
- Lo, W.C.; Borja, R.I.; Deng, J.H.; Lee, J.W. Analytical Solution of Soil Deformation and Fluid Pressure Change for a Two-Layer System with an Upper Unsaturated Soil and a Lower Saturated Soil under External Loading. J. Hydrol. **2020**, 588, 124997.
- Lo, W.C.; Sposito, G.; Lee, J.W.; Chu, H. One-Dimensional Consolidation in Unsaturated Soils under Cyclic Loading. Adv. Water Resour. **2016**, 91, 122–137.
- Lo, W.C.; Lee, J.W. Effect of Water Content and Soil Texture on Consolidation in Unsaturated Soils. Adv. Water Resour. **2016**, 82, 52–69.
- Lo, W.C.; Sposito, G.; Majer, E. Analytical decoupling of poroelasticity equations for acoustic wave propagation and attenuation in a porous medium containing two immiscible fluids. J. Eng. Math. **2009**, 64, 219–235.
- Dimiduk, D.M.; Holm, E.A.; Niezgoda, S.R. Perspectives on the Impact of Machine Learning, Deep Learning, and Artificial Intelligence on Materials, Processes, and Structures Engineering. Integr. Mater. Manuf. Innov. **2018**, 7, 157–172.
- Buuren, S.V.; Groothuis-Oudshoorn, K. MICE: Multivariate Imputation by Chained Equations in R. J. Stat. Softw. **2011**, 45, 1–67.
- Stekhoven, D.J.; Bühlmann, P. MissForest—Non-parametric missing value imputation for mixed-type data. Bioinformatics **2012**, 28, 112–118.
- Candès, E.J.; Recht, B. Exact Matrix Completion via Convex Optimization. Found. Comput. Math. **2009**, 9, 717–772.
- Vincent, P.; Larochelle, H.; Bengio, Y.; Manzagol, P.-A. Extracting and composing robust features with denoising autoencoders. In Proceedings of the 25th International Conference, New York, NY, USA, 5 July 2008.
- Goodfellow, I.; Pouget-Abadie, J.; Mirza, M.; Xu, B.; Warde-Farley, D.; Ozair, S.; Courville, A.; Bengio, Y. Generative Adversarial Networks. In Advances in Neural Information Processing Systems; Curran Associates, Inc.: New York, NY, USA, 2014; Volume 27.
- Yoon, J.; Jordon, J.; Schaar, M. GAIN: Missing Data Imputation using Generative Adversarial Nets. In Proceedings of the 35th International Conference on Machine Learning, Stockholm, Sweden, 10–15 July 2018; pp. 5689–5698.
- Pan, T.; Chen, J.; Zhang, T.; Liu, S.; He, S.; Lv, H. Generative adversarial network in mechanical fault diagnosis under small sample: A systematic review on applications and future perspectives. ISA Trans. **2022**, 128, 1–10.
- Francesconi, W.; Srinivasan, R.; Pérez-Miñana, E.; Willcock, S.P.; Quintero, M. Using the Soil and Water Assessment Tool (SWAT) to model ecosystem services: A systematic review. J. Hydrol. **2016**, 535, 625–636.
- De Almeida Bressiani, D.; Gassman, P.W.; Fernandes, J.G.; Garbossa, L.H.P.; Srinivasan, R.; Bonumá, N.B.; Mendiondo, E.M. Review of Soil and Water Assessment Tool (SWAT) applications in Brazil: Challenges and prospects. Int. J. Agric. Biol. Eng. **2015**, 8, 3.
- Srinivasan, R.; Arnold, J.G.; Jones, C.A. Hydrologic Modelling of the United States with the Soil and Water Assessment Tool. Int. J. Water Resour. Dev. **1998**, 14, 315–325.
- Condon, L.E.; Kollet, S.; Bierkens, M.F.P.; Fogg, G.E.; Maxwell, R.M.; Hill, M.C.; Fransen, H.-J.H.; Verhoef, A.; van Loon, A.F.; Sulis, M.; et al. Global Groundwater Modeling and Monitoring: Opportunities and Challenges. Water Resour. Res. **2021**, 57, e2020WR029500.
- Hughes, J.D.; Langevin, C.D.; Banta, E.R. Documentation for the MODFLOW 6 framework. In U.S. Geological Survey Techniques and Methods; United States Geological Survey: Reston, VA, USA, 2017; Volume 6, p. 40.
- Kirchner, J.W. Getting the right answers for the right reasons: Linking measurements, analyses, and models to advance the science of hydrology. Water Resour. Res. **2006**, 42.
- McDonnell, J.J.; Sivapalan, M.; Vaché, K.; Dunn, S.; Grant, G.; Haggerty, R.; Hinz, C.; Hooper, R.; Kirchner, J.; Roderick, M.L.; et al. Moving beyond heterogeneity and process complexity: A new vision for watershed hydrology. Water Resour. Res. **2007**, 43.
- Tada, T.; Beven, K.J. Hydrological model calibration using a short period of observations. Hydrol. Process. **2012**, 26, 883–892.
- Ojha, R.; Ramadas, M.; Govindaraju, R.S. Current and Future Challenges in Groundwater. I: Modeling and Management of Resources. J. Hydrol. Eng. **2013**, 20, A4014007.
- Mohanty, S.; Jha, M.K.; Kumar, A.; Panda, D.K. Comparative evaluation of numerical model and artificial neural network for simulating groundwater flow in Kathajodi-Surua Inter-basin of Odisha, India. J. Hydrol. **2013**, 495, 38–51.
- Saberi-Movahed, F.; Najafzadeh, M.; Mehrpooya, A. Receiving More Accurate Predictions for Longitudinal Dispersion Coefficients in Water Pipelines: Training Group Method of Data Handling Using Extreme Learning Machine Conceptions. Water Resour. Manag. **2020**, 34, 529–561.
- Abrahart, R.J.; Anctil, F.; Coulibaly, P.; Dawson, C.W.; Mount, N.J.; See, L.M.; Shamseldin, A.Y.; Solomatine, D.P.; Toth, E.; Wilby, R.L. Two decades of anarchy? Emerging themes and outstanding challenges for neural network river forecasting. Prog. Phys. Geogr. Earth Environ. **2012**, 36, 480–513.
- Neal, A.L.; Gupta, H.V.; Kurc, S.A.; Brooks, P.D. Modeling moisture fluxes using artificial neural networks: Can information extraction overcome data loss? Hydrol. Earth Syst. Sci. **2011**, 15, 359–368.
- Tsai, M.-J.; Abrahart, R.J.; Mount, N.J.; Chang, F.-J. Including spatial distribution in a data-driven rainfall-runoff model to improve reservoir inflow forecasting in Taiwan. Hydrol. Process. **2014**, 28, 1055–1070.
- Chang, F.-J.; Chang, L.-C.; Huang, C.-W.; Kao, I.-F. Prediction of monthly regional groundwater levels through hybrid soft-computing techniques. J. Hydrol. **2016**, 541, 965–976.
- Nourani, V.; Mousavi, S. Spatiotemporal groundwater level modeling using hybrid artificial intelligence-meshless method. J. Hydrol. **2016**, 536, 10–25.
- Taormina, R.; Chau, K.-W.; Sethi, R. Artificial neural network simulation of hourly groundwater levels in a coastal aquifer system of the Venice lagoon. Eng. Appl. Artif. Intell. **2012**, 25, 1670–1676.
- Mohanty, S.; Jha, M.K.; Raul, S.K.; Panda, R.K.; Sudheer, K.P. Using Artificial Neural Network Approach for Simultaneous Forecasting of Weekly Groundwater Levels at Multiple Sites. Water Resour. Manag. **2015**, 29, 5521–5532.
- Adamowski, J.; Chan, H.F. A wavelet neural network conjunction model for groundwater level forecasting. J. Hydrol. **2011**, 407, 28–40.
- Amini, M.; Abbaspour, K.C.; Johnson, C.A. A comparison of different rule-based statistical models for modeling geogenic groundwater contamination. Environ. Model. Softw. **2010**, 25, 1650–1657.
- Fijani, E.; Nadiri, A.A.; Asghari Moghaddam, A.; Tsai, F.T.-C.; Dixon, B. Optimization of DRASTIC method by supervised committee machine artificial intelligence to assess groundwater vulnerability for Maragheh-Bonab plain aquifer, Iran. J. Hydrol. **2013**, 503, 89–100.
- Nolan, B.T.; Fienen, M.N.; Lorenz, D.L. A statistical learning framework for groundwater nitrate models of the Central Valley, California, USA. J. Hydrol. **2015**, 531, 902–911.
- Tapoglou, E.; Trichakis, I.C.; Dokou, Z.; Nikolos, I.K.; Karatzas, G.P. Groundwater-level forecasting under climate change scenarios using an artificial neural network trained with particle swarm optimization. Hydrol. Sci. J. **2014**, 59, 1225–1239.
- Tremblay, L.; Larocque, M.; Anctil, F.; Rivard, C. Teleconnections and interannual variability in Canadian groundwater levels. J. Hydrol. **2011**, 410, 178–188.
- Udny Yule, G. On a method of investigating periodicities in disturbed series, with special reference to Wolfer's sunspot numbers. Phil. Trans. R. Soc. Lond. **1927**, 226, 267–298.
- Box, G.E.P.; Pierce, D.A. Distribution of residual autocorrelations in autoregressive-integrated moving average time series models. J. Am. Stat. Assoc. **1970**, 65, 1509–1526.
- Drucker, H.; Burges, C.J.C.; Kaufman, L.; Smola, A.; Vapnik, V. Support Vector Regression Machines. In Advances in Neural Information Processing Systems; Curran Associates, Inc.: New York, NY, USA, 1996; Volume 9.
- Li, X.; Bai, R. Freight Vehicle Travel Time Prediction Using Gradient Boosting Regression Tree. In Proceedings of the 2016 15th IEEE International Conference on Machine Learning and Applications (ICMLA), Anaheim, CA, USA, 18–20 December 2016; pp. 1010–1015.
- Ke, G.; Meng, Q.; Finley, T.; Wang, T.; Chen, W.; Ma, W.; Ye, Q.; Liu, T.-Y. LightGBM: A Highly Efficient Gradient Boosting Decision Tree. In Advances in Neural Information Processing Systems; Curran Associates, Inc.: New York, NY, USA, 2017; Volume 30.
- Qin, Y.; Song, D.; Chen, H.; Cheng, W.; Jiang, G.; Cottrell, G. A Dual-Stage Attention-Based Recurrent Neural Network for Time Series Prediction. arXiv **2017**, arXiv:1704.02971.
- LeCun, Y.; Bengio, Y.; Hinton, G. Deep learning. Nature **2015**, 521, 436–444.
- Malekzadeh, M.; Kardar, S.; Saeb, K.; Shabanlou, S.; Taghavi, L. A Novel Approach for Prediction of Monthly Ground Water Level Using a Hybrid Wavelet and Non-Tuned Self-Adaptive Machine Learning Model. Water Resour. Manag. **2019**, 33, 1609–1628.
- Cortes, C.; Vapnik, V. Support-vector networks. Mach. Learn. **1995**, 20, 273–297.
- Hopfield, J.J. Neural Networks and Physical Systems with Emergent Collective Computational Abilities. Proc. Natl. Acad. Sci. USA **1982**, 79, 2554–2558.
- Hochreiter, S.; Schmidhuber, J. Long Short-Term Memory. Neural Comput. **1997**, 9, 1735–1780.
- Li, Y.; Zhu, Z.; Kong, D.; Han, H.; Zhao, Y. EA-LSTM: Evolutionary attention-based LSTM for time series prediction. Knowl.-Based Syst. **2019**, 181, 104785.
- LeCun, Y.; Boser, B.; Denker, J.S.; Henderson, D.; Howard, R.E.; Hubbard, W.; Jackel, L.D. Backpropagation Applied to Handwritten Zip Code Recognition. Neural Comput. **1989**, 1, 541–551.
- LeCun, Y.; Bengio, Y. Convolutional Networks for Images, Speech, and Time-Series. In Handbook of Brain Theory and Neural Networks; MIT Press: Cambridge, MA, USA, 1995.
- WRA (Water Resources Agency). Preliminary Analyses of Groundwater Hydrology in the Choshui Alluvial Fan, Groundwater Monitoring Network Program Phase I; Ministry of Economic Affairs: Taipei, Taiwan, 1997. (In Chinese)
- Chen, W.-F.; Yuan, P.B. A preliminary study on sedimentary environments of Choshui fan-delta. J. Geol. Soc. China **1999**, 42, 269–288.
- WRA (Water Resources Bureau). Summary Report of Groundwater Monitoring Network Plan in Taiwan, Phase I (1992–1998); Ministry of Economic Affairs: Taipei, Taiwan, 1999. (In Chinese)
- Hsu, S.-K. Plan for a groundwater monitoring network in Taiwan. Hydrogeol. J. **1998**, 6, 405–415.
- Hamilton, W.L.; Ying, R.; Leskovec, J. Representation Learning on Graphs: Methods and Applications. IEEE Data Eng. Bull. **2017**, 40, 52–74.
- Oxford, R.M.; Daniel, L.G. Basic Cross-Validation: Using the "Holdout" Method To Assess the Generalizability of Results. Res. Sch. **2001**, 8, 83–89.
- Fukushima, K. Neocognitron: A self-organizing neural network model for a mechanism of pattern recognition unaffected by shift in position. Biol. Cybern. **1980**, 36, 193–202.
- Ackermann, N. Introduction to 1D Convolutional Neural Networks in Keras for Time Sequences. Available online: https://reurl.cc/y7XG58 (accessed on 20 January 2021).
- Kingma, D.; Ba, J. Adam: A Method for Stochastic Optimization. In Proceedings of the International Conference on Learning Representations, Banff, AB, Canada, 14–16 April 2014.
- Najafzadeh, M.; Etemad-Shahidi, A.; Lim, S.Y. Scour prediction in long contractions using ANFIS and SVM. Ocean Eng. **2016**, 111, 128–135.
- Najafzadeh, M.; Barani, G.A. Comparison of group method of data handling based genetic programming and back propagation systems to predict scour depth around bridge piers. Sci. Iran. **2011**, 18, 1207–1213.
- Ayoubloo, M.K.; Etemad-Shahidi, A.; Mahjoobi, J. Evaluation of regular wave scour around a circular pile using data mining approaches. Appl. Ocean Res. **2010**, 32, 34–39.

**Figure 3.** Monthly average rainfall (the black dots are the rainfall from 2002 to 2021, and the blue bars are the 20-year average rainfall).

**Figure 10.** Difference between 1D CNN and 2D CNN (source: Ackermann [62]).

**Figure 11.** Conv1D structure (source: Xi et al. [59]).

**Figure 13.** Imputation results of GAN (RMSE; unit: m). The blue area indicates good imputation performance, while the boundary between the white and red areas represents the limit of data imputation in this study.

**Figure 14.** Supplementary results of fan top (Stations 1–3), fan central (Stations 4–6), and fan tail (Stations 7–9) water level stations.

| | Training | Validation | Test | Total |
|---|---|---|---|---|
| Groundwater level (m) | 5113 | 731 | 1461 | 7305 |
| Precipitation (mm) | 5113 | 731 | 1461 | 7305 |

| Optimizer | Activation | Loss Function |
|---|---|---|
| Adam | ReLU | MSE |

| Iterations | Alpha | Hint Rate |
|---|---|---|
| 100,000 | 100 | 0.9 |
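The hint rate of 0.9 follows the GAIN formulation [15], in which part of the observed/missing mask is revealed to the discriminator. A minimal sketch, assuming the GAIN convention of a per-entry Bernoulli reveal with 0.5 marking undisclosed entries; the function name and mask values are illustrative, not the authors' code.

```python
import random

def hint_matrix(mask, hint_rate, rng=None):
    """GAIN-style hint: each entry of the observed/missing mask is revealed
    to the discriminator with probability hint_rate; hidden entries get 0.5."""
    rng = rng or random.Random(0)
    return [[m if rng.random() < hint_rate else 0.5 for m in row]
            for row in mask]

mask = [[1, 0, 1], [0, 1, 1]]  # 1 = observed value, 0 = missing value
hints = hint_matrix(mask, hint_rate=0.9)
```

A high hint rate such as 0.9 gives the discriminator almost complete knowledge of which entries were imputed, which in turn forces the generator to produce values close to the true data distribution.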

| Optimizer | Activation | Loss Function |
|---|---|---|
| Adam | ReLU | MSE |

| Name | Dimension | Parameter Number |
|---|---|---|
| Input layer | (None, 3, 1) | 0 |
| LSTM | (None, 128) | 66,560 |
| Dense | (None, 100) | 12,900 |
| Dense | (None, 5) | 505 |
| Total parameters: 79,965 | | |
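The parameter numbers can be reproduced from the layer dimensions with standard Keras-style counting; note that a Dense layer mapping 128 units to 100 requires 128 × 100 + 100 = 12,900 weights, which is what makes the reported total of 79,965 consistent. A quick sketch:

```python
def lstm_params(input_dim, units):
    """An LSTM layer has four gates, each with input weights,
    recurrent weights, and a bias vector."""
    return 4 * (units * (input_dim + units) + units)

def dense_params(in_dim, out_dim):
    """A fully connected layer: weight matrix plus bias."""
    return in_dim * out_dim + out_dim

total = lstm_params(1, 128) + dense_params(128, 100) + dense_params(100, 5)
```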

| Optimizer | Activation | Loss Function |
|---|---|---|
| Adam | ReLU | MSE |

| Filters | Kernel Size | Stride |
|---|---|---|
| 64 | 2 | 1 |

| Name | Dimension | Parameter Number |
|---|---|---|
| Input layer | (None, 3, 1) | 0 |
| Conv1D | (None, 3, 64) | 192 |
| Conv1D | (None, 3, 64) | 8256 |
| MaxPooling1D | (None, 2, 64) | 0 |
| Conv1D | (None, 2, 64) | 8256 |
| Conv1D | (None, 2, 64) | 8256 |
| MaxPooling1D | (None, 1, 64) | 0 |
| Flatten | (None, 64) | 0 |
| Dense | (None, 50) | 3250 |
| Dense | (None, 10) | 510 |
| Total parameters: 28,720 | | |
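The Conv1D parameter counts likewise follow from kernel size × input channels × filters, plus one bias per filter, and sum to the reported total of 28,720. A quick sketch:

```python
def conv1d_params(kernel_size, in_channels, filters):
    """A Conv1D layer: one kernel per filter spanning all input channels,
    plus a bias per filter."""
    return kernel_size * in_channels * filters + filters

def dense_params(in_dim, out_dim):
    """A fully connected layer: weight matrix plus bias."""
    return in_dim * out_dim + out_dim

total = (conv1d_params(2, 1, 64)        # first Conv1D: 192
         + 3 * conv1d_params(2, 64, 64) # three further Conv1D layers: 8256 each
         + dense_params(64, 50)         # 3250
         + dense_params(50, 10))        # 510
```

Pooling and flatten layers contribute no parameters, which is why the CNN, despite being deeper, has far fewer parameters than the LSTM model.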

| | RMSE (m) | MAE (m) | R^{2} | SI | BIAS |
|---|---|---|---|---|---|
| Univariate–CNN | 0.007 | 0.005 | 0.998 | 0.0505 | −0.0037 |
| Univariate–LSTM | 0.008 | 0.005 | 0.997 | 0.1265 | −0.0086 |
| Seq2val–CNN | 0.0321 | 0.0194 | 0.9981 | 0.0506 | 0.0155 |
| Seq2val–LSTM | 0.0508 | 0.0342 | 0.9955 | 0.1856 | 0.0245 |

| | Seq2seq–CNN | | | Seq2seq–LSTM | | |
|---|---|---|---|---|---|---|
| | RMSE (m) | MAE (m) | R^{2} | RMSE (m) | MAE (m) | R^{2} |
| T = 1 | 0.0304 | 0.0206 | 0.9984 | 0.0434 | 0.0347 | 0.9966 |
| T = 2 | 0.0631 | 0.0430 | 0.9932 | 0.0706 | 0.0511 | 0.9913 |
| T = 3 | 0.0972 | 0.0660 | 0.9838 | 0.1015 | 0.0700 | 0.9821 |
| T = 4 | 0.1311 | 0.0879 | 0.9705 | 0.1338 | 0.0921 | 0.9690 |
| T = 5 | 0.1634 | 0.1100 | 0.9529 | 0.1666 | 0.1151 | 0.9522 |
| T = 6 | 0.1951 | 0.1331 | 0.9323 | 0.1956 | 0.1347 | 0.9320 |
| T = 7 | 0.2252 | 0.1543 | 0.9087 | 0.2256 | 0.1559 | 0.9089 |
| T = 8 | 0.2533 | 0.1737 | 0.8811 | 0.2538 | 0.1760 | 0.8810 |
| T = 9 | 0.2803 | 0.1926 | 0.8482 | 0.2811 | 0.1951 | 0.8501 |
| T = 10 | 0.3049 | 0.2097 | 0.8124 | 0.3058 | 0.2125 | 0.8160 |

Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

## Share and Cite

Chen, H.-Y.; Vojinovic, Z.; Lo, W.; Lee, J.-W. Groundwater Level Prediction with Deep Learning Methods. *Water* **2023**, *15*, 3118. https://doi.org/10.3390/w15173118