# Prediction of Precipitation Based on Recurrent Neural Networks in Jingdezhen, Jiangxi Province, China


## Abstract


## 1. Introduction

## 2. Overview of Study Area

Jingdezhen City lies between 116°57′ E and 117°42′ E longitude and 28°44′ N and 29°56′ N latitude.

The return period of the flood caused by the extreme rainfall was 20 years on the main stream of the Chengjiang River, and reached 50 years on some river sections.

## 3. Materials and Methods

#### 3.1. Data Sources

#### 3.2. Selecting Meteorological Variables

#### 3.2.1. Binarizing Categorical Features

#### 3.2.2. Normalizing the Records

Suppose X_{ij} (i = 1, 2, …, n; j = 1, 2, …, m) denotes the value of index j in record i. Because the indicators differ in magnitude and units, the records must be normalized, as described in Equation (1):

$$x_{ij} = \frac{X_{ij} - X_{\mathrm{mean}j}}{X_{\mathrm{max}j} - X_{\mathrm{min}j}} \tag{1}$$

where X_{meanj}, X_{maxj}, and X_{minj} are the mean, the maximum, and the minimum of index j over all records, respectively, and x_{ij} is the normalized value.
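As a concrete sketch, the column-wise normalization of Equation (1) can be written as follows (a minimal pure-Python illustration; the function and variable names are my own):

```python
def normalize(column):
    """Mean-normalize one indicator column: (x - mean) / (max - min)."""
    mean_v = sum(column) / len(column)
    value_range = max(column) - min(column)
    return [(x - mean_v) / value_range for x in column]

# e.g. three records of one indicator
print(normalize([0.0, 5.0, 10.0]))  # [-0.5, 0.0, 0.5]
```

After this step every indicator is centered on its mean and scaled to a comparable range, so no single variable dominates the model because of its units.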

#### 3.2.3. Calculating the CCC

Suppose a = {a_{1}, a_{2}, …, a_{n}} is the normalized precipitation vector and b = {b_{1}, b_{2}, …, b_{n}} is one of the normalized meteorological vectors. To calculate the CCC between precipitation and the meteorological variable, Equation (2) should be employed:
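Equation (2) is not reproduced here, so as an illustration only, the sketch below assumes a common formulation in which the two normalized vectors are correlated at every cyclic shift; the function name is hypothetical:

```python
def circular_cross_correlation(a, b):
    """Mean product of a with b cyclically shifted by each lag k (illustrative form)."""
    n = len(a)
    return [sum(a[i] * b[(i + k) % n] for i in range(n)) / n for k in range(n)]

# the peak of the lag profile indicates where the two series align best
corr = circular_cross_correlation([1, 0, 0, 0], [0, 1, 0, 0])
```

In this toy case the largest value appears at lag 1, reflecting that the second series is the first one shifted by a single step.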

#### 3.3. Constructing the LSTM Model Architecture

Denote x = {x_{1}, x_{2}, …, x_{T}} as the input of the LSTM, where x_{t} $\left(t\in T\right)$ represents the meteorological vector observed at time step t; the dimension of the input data equals d. The output of the LSTM is the predicted precipitation at time step t + 1, denoted as $\widehat{{y}_{t}}$; the dimension of the output equals 1. The actual rainfall value is denoted as ${y}_{t}$. Because the interval between two records is 3 h, the time step is set to 3 h, meaning the LSTM predicts the precipitation over the next 3 h.
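The shaping of the data into input windows and next-step targets can be sketched as follows (pure Python; the names are illustrative, and the LSTM training itself, done in a deep-learning framework, is not shown):

```python
def make_windows(feature_vectors, rainfall, T):
    """Pair each window of T consecutive meteorological vectors
    with the rainfall observed at the following time step."""
    X, y = [], []
    for t in range(len(feature_vectors) - T):
        X.append(feature_vectors[t:t + T])  # one sample of shape (T, d)
        y.append(rainfall[t + T])           # next-step precipitation target
    return X, y
```

Each sample is therefore a (T, d) matrix of past observations, and the label is the precipitation recorded one 3-h step later.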

#### 3.4. Optimization of the LSTM Model

#### 3.4.1. Performing Cross-Validation on the LSTM Model

- Take one group as the hold-out or test dataset.
- Take the remaining groups as a training dataset.
- Fit the LSTM model on the training dataset and evaluate it on the test dataset.
- Retain the evaluation score.
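The steps above amount to k-fold cross-validation; the index splitting can be sketched as follows (a minimal illustration with hypothetical names):

```python
def k_fold_splits(n_samples, k):
    """Yield (train_idx, test_idx) pairs; each group serves once as the test set."""
    fold = n_samples // k
    for i in range(k):
        start = i * fold
        stop = start + fold if i < k - 1 else n_samples  # last fold takes the remainder
        test = list(range(start, stop))
        train = [j for j in range(n_samples) if j < start or j >= stop]
        yield train, test
```

For each pair, the LSTM would be fitted on `train`, evaluated on `test`, and the score retained; averaging the k scores gives a more stable estimate than a single split.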

#### 3.4.2. Determining the Input Variables with the Greatest Contribution

#### 3.4.3. Evaluating the Performance

TS_{1} represents the proportion of samples forecasted correctly, counting accurate predictions of both precipitation and no precipitation. TS_{2} represents the proportion of samples in which precipitation itself is predicted accurately. TS_{3} is the average of TS_{1} and TS_{2}. The TS scores are defined in Equations (12)–(14):

$$TS_{1} = \frac{N_{1} + N_{2}}{N_{1} + N_{2} + N_{3} + N_{4}} \tag{12}$$

$$TS_{2} = \frac{N_{1}}{N_{1} + N_{3} + N_{4}} \tag{13}$$

$$TS_{3} = \frac{TS_{1} + TS_{2}}{2} \tag{14}$$

where N_{1} is the number of samples that correctly predict precipitation, N_{2} is the number of samples that correctly forecast no precipitation, N_{3} is the number of samples that predict no precipitation when it actually rains, and N_{4} is the number of samples that predict precipitation when there is actually none.
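Given the four counts, the scores follow directly (a sketch assuming TS_{1} is overall accuracy and TS_{2} the threat score for the precipitation class, as the definitions above suggest):

```python
def ts_scores(n1, n2, n3, n4):
    """n1: hits, n2: correct negatives, n3: misses, n4: false alarms."""
    ts1 = (n1 + n2) / (n1 + n2 + n3 + n4)  # all correct forecasts
    ts2 = n1 / (n1 + n3 + n4)              # skill on precipitation only
    ts3 = (ts1 + ts2) / 2
    return ts1, ts2, ts3
```

Note that TS_{2} excludes the correct no-rain forecasts, so a model that always predicts "no precipitation" scores well on TS_{1} but zero on TS_{2}; averaging them in TS_{3} balances the two views.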

## 4. Results and Discussion

#### 4.1. Experimental Setup

#### 4.2. Meteorological Variables Analysis

#### 4.3. Preliminary LSTM Model for Precipitation Prediction

#### 4.4. Quantification of Relative Importance of Input Variables

#### 4.5. Best LSTM Model for Precipitation Prediction

#### 4.6. Scope of Application

## 5. Conclusions

- Quantifying the relative importance of the input variables improved precipitation-prediction accuracy by removing input variables with weak correlations. With the help of the circular cross-correlation analysis and the quantification of variable importance, the LSTM model can predict precipitation well.
- The LSTM model’s performance with different numbers of hidden neurons was not significantly different. The results show that a large number of hidden neurons did not always lead to better performance. The best LSTM (5,5,1) model showed satisfactory prediction performance with RMSE values of 42.28 mm, 42.03 mm, and 41.72 mm, and SMAPE values of 14.19%, 14.28%, and 14.17% for the training, validation, and testing data sets, respectively.
- HLC, PT, T, AP, and RH are the most important predictors of precipitation in Jingdezhen City. The optimal model's predictions of higher precipitation values were acceptable, but it requires sufficient meteorological data. Future studies should improve the models to predict precipitation in rural areas where less data are available.

## Author Contributions

## Funding

## Acknowledgments

## Conflicts of Interest


Meteorological Variable | Description | Unit | Type
---|---|---|---
T | Air temperature at 2 m height above the earth's surface | °C | Numerical
Td | Dewpoint temperature at a height of 2 m above the earth's surface | °C | Numerical
Tn | Minimum air temperature during the past period (not exceeding 12 h) | °C | Numerical
Tx | Maximum air temperature during the past period (not exceeding 12 h) | °C | Numerical
AP | Atmospheric pressure at weather station level | mmHg | Numerical
PT | Change in atmospheric pressure over the last three hours | mmHg | Numerical
RH | Relative humidity at a height of 2 m above the earth's surface | % | Numerical
WS | Mean wind speed at a height of 10–12 m above the earth's surface over the 10 min preceding the observation | m/s | Numerical
WD | Mean wind direction at a height of 10–12 m above the earth's surface over the 10 min preceding the observation | - | Categorical
WSx | Maximum wind speed at a height of 10–12 m above the earth's surface over the 10 min preceding the observation | m/s | Numerical
TCC | Total cloud cover | - | Categorical
HLC | Height of the base of the lowest clouds | m | Categorical
AC | Amount of all the clouds | - | Categorical
PRCP | Amount of precipitation | mm | Numerical

Wind Direction | Label Encoding
---|---
East | 1
West | 2
South | 3
North | 4
Southeast | 5
Southwest | 6
Northeast | 7
Northwest | 8

East | West | South | North | Southeast | Southwest | Northeast | Northwest | One-Hot Encoding
---|---|---|---|---|---|---|---|---
1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 10000000
0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 01000000
0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 00100000
0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 00010000
0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 00001000
0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 00000100
0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 00000010
0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 00000001
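The two encodings tabulated above can be produced as follows (a minimal sketch; the direction order follows the label-encoding table, and the function names are my own):

```python
DIRECTIONS = ["East", "West", "South", "North",
              "Southeast", "Southwest", "Northeast", "Northwest"]

def label_encode(direction):
    """1-based integer label following the table order."""
    return DIRECTIONS.index(direction) + 1

def one_hot_encode(direction):
    """8-dimensional one-hot vector for the wind direction."""
    vec = [0] * len(DIRECTIONS)
    vec[DIRECTIONS.index(direction)] = 1
    return vec
```

One-hot encoding avoids imposing an artificial ordering on the eight directions, which is why it is preferred over the integer labels as model input.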

Meteorological Variable | Circular Cross-Correlation
---|---
T | 0.2363
Td | 0.4475
Tn | 0.0853
Tx | 0.0369
AP | 0.1835
PT | −0.4768
RH | 0.2176
WS | 0.1867
WD | −0.0163
WSx | 0.0932
TCC | 0.2841
HLC | −0.3548
AC | 0.1294

**Table 5.** The performance on the training, validation, and testing parts of the LSTM model with the nine selected input variables.

Number of Hidden Neurons | Training RMSE (mm) | Validation RMSE (mm) | Testing RMSE (mm) | Training SMAPE (%) | Validation SMAPE (%) | Testing SMAPE (%) | TS_{1} | TS_{2} | TS_{3}
---|---|---|---|---|---|---|---|---|---
2 | 46.45 | 49.95 | 48.26 | 17.49 | 19.48 | 19.37 | 0.82 | 0.58 | 0.70
3 | 49.51 | 54.26 | 53.51 | 17.49 | 20.05 | 21.74 | 0.79 | 0.59 | 0.69
4 | 50.23 | 51.33 | 54.64 | 17.30 | 20.38 | 20.96 | 0.77 | 0.55 | 0.66
5 | 51.31 | 53.98 | 54.27 | 16.99 | 18.81 | 20.28 | 0.75 | 0.56 | 0.66
6 | 48.19 | 49.61 | 47.61 | 16.28 | 18.01 | 18.36 | 0.81 | 0.60 | 0.71
7 | 50.43 | 51.29 | 53.23 | 16.95 | 19.52 | 18.69 | 0.74 | 0.61 | 0.68
8 | 48.95 | 52.27 | 54.71 | 17.48 | 20.64 | 18.16 | 0.80 | 0.59 | 0.70
9 | 50.23 | 53.62 | 54.37 | 16.30 | 20.63 | 18.70 | 0.77 | 0.60 | 0.69
10 | 49.71 | 51.91 | 54.71 | 19.63 | 18.65 | 20.64 | 0.79 | 0.57 | 0.68

**Table 6.** The performance on the training, validation, and testing parts of the LSTM model with all 13 input variables.

Number of Hidden Neurons | Training RMSE (mm) | Validation RMSE (mm) | Testing RMSE (mm) | Training SMAPE (%) | Validation SMAPE (%) | Testing SMAPE (%) | TS_{1} | TS_{2} | TS_{3}
---|---|---|---|---|---|---|---|---|---
2 | 49.23 | 49.89 | 50.39 | 17.97 | 19.62 | 20.05 | 0.77 | 0.57 | 0.67
3 | 48.92 | 50.82 | 51.08 | 17.57 | 19.01 | 19.43 | 0.79 | 0.53 | 0.66
4 | 53.51 | 52.38 | 54.16 | 19.05 | 21.84 | 21.54 | 0.68 | 0.51 | 0.60
5 | 52.48 | 56.09 | 54.88 | 18.24 | 18.95 | 20.35 | 0.70 | 0.49 | 0.60
6 | 50.62 | 51.13 | 50.18 | 19.97 | 20.56 | 19.94 | 0.75 | 0.55 | 0.65
7 | 49.32 | 49.35 | 50.41 | 18.84 | 19.03 | 19.71 | 0.77 | 0.55 | 0.66
8 | 48.36 | 49.76 | 48.89 | 17.51 | 18.61 | 18.65 | 0.80 | 0.59 | 0.70
9 | 53.62 | 53.49 | 54.71 | 20.20 | 20.94 | 21.45 | 0.65 | 0.52 | 0.59
10 | 53.73 | 52.76 | 54.04 | 21.42 | 19.85 | 20.64 | 0.68 | 0.50 | 0.59

**Table 7.** The performance on the training, validation, and testing parts of the LSTM model with five input variables (height of the lowest clouds (HLC), pressure tendency (PT), temperature (T), atmospheric pressure (AP), and relative humidity (RH)).

Number of Hidden Neurons | Training RMSE (mm) | Validation RMSE (mm) | Testing RMSE (mm) | Training SMAPE (%) | Validation SMAPE (%) | Testing SMAPE (%) | TS_{1} | TS_{2} | TS_{3}
---|---|---|---|---|---|---|---|---|---
2 | 43.25 | 43.27 | 43.38 | 15.69 | 15.83 | 16.06 | 0.82 | 0.59 | 0.71
3 | 43.25 | 43.15 | 43.10 | 15.91 | 15.75 | 15.24 | 0.83 | 0.58 | 0.71
4 | 42.93 | 43.10 | 42.92 | 15.60 | 15.53 | 15.66 | 0.85 | 0.61 | 0.73
5 | 42.28 | 42.03 | 41.72 | 14.19 | 14.28 | 14.17 | 0.87 | 0.61 | 0.74
6 | 42.39 | 42.25 | 41.96 | 14.16 | 14.57 | 14.63 | 0.89 | 0.63 | 0.76
7 | 42.55 | 42.18 | 42.93 | 15.40 | 14.68 | 14.65 | 0.88 | 0.63 | 0.76
8 | 42.44 | 42.93 | 43.03 | 15.32 | 14.99 | 14.93 | 0.88 | 0.61 | 0.75
9 | 43.04 | 42.99 | 42.94 | 15.26 | 14.91 | 15.65 | 0.85 | 0.60 | 0.73
10 | 43.49 | 42.72 | 43.68 | 15.05 | 15.45 | 15.30 | 0.82 | 0.57 | 0.70

Model | RMSE (mm) | SMAPE (%) | TS_{1} | TS_{2} | TS_{3}
---|---|---|---|---|---
ARMA | 55.73 | 22.56 | 0.73 | 0.45 | 0.59
MARS | 53.46 | 23.35 | 0.71 | 0.47 | 0.59
BPNN | 43.00 | 20.39 | 0.81 | 0.54 | 0.68
SVM | 47.49 | 19.18 | 0.76 | 0.49 | 0.53
GA | 42.94 | 18.80 | 0.80 | 0.55 | 0.68
LSTM | 41.84 | 15.17 | 0.87 | 0.61 | 0.74

© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Kang, J.; Wang, H.; Yuan, F.; Wang, Z.; Huang, J.; Qiu, T.
Prediction of Precipitation Based on Recurrent Neural Networks in Jingdezhen, Jiangxi Province, China. *Atmosphere* **2020**, *11*, 246.
https://doi.org/10.3390/atmos11030246
