On the Application of Long Short-Term Memory Neural Network for Daily Forecasting of PM2.5 in Dakar, Senegal (West Africa)
Abstract
1. Introduction
2. Model Development
2.1. Data Preprocessing
2.1.1. Study Area
2.1.2. Data Processing and Organization
2.2. Model Architecture, Training, and Conception
2.2.1. Methods
2.2.2. LSTM Model
2.2.3. Configuration of the LSTM Model
2.3. Selection of the Optimal LSTM Model
2.3.1. Sequence Length
2.3.2. Batch Size
2.3.3. Determination of the Number of Layers and Neurons
2.3.4. Iteration
2.3.5. Hyperparameters
- The ADAM optimizer, introduced by Kingma and Ba [58], is well known for its performance on deep learning models, especially recurrent architectures such as LSTM. It combines the benefits of root mean square propagation (RMSProp) and momentum-based stochastic gradient descent (SGD), using bias-corrected estimates of the first and second moments of the gradient to adjust the learning rate dynamically. This adaptability ensures fast and stable convergence even with complex or noisy gradients.
- A dropout rate of 0.2 proved effective in preventing overfitting. This method, introduced by Hinton et al. and formalized by Srivastava et al. [59], randomly disables 20% of the neurons at each training step. This reduces co-dependencies between neurons and promotes better generalization while maintaining a balance between model complexity and robustness. The value of 0.2 was chosen as a compromise that preserves performance while minimizing the risk of overfitting.
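The two mechanisms above can be sketched in a few lines of NumPy. This is an illustrative reconstruction of the standard update rules, not the authors' training code; the function names (`adam_step`, `dropout`) and the default hyperparameter values are our own choices, apart from the paper's dropout rate of 0.2.

```python
import numpy as np

def adam_step(theta, g, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One ADAM update: momentum-style first moment plus RMSProp-style
    second moment, both bias-corrected before the parameter update."""
    m = beta1 * m + (1 - beta1) * g          # first moment (mean of gradients)
    v = beta2 * v + (1 - beta2) * g ** 2     # second moment (uncentered variance)
    m_hat = m / (1 - beta1 ** t)             # bias correction for early steps
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

def dropout(activations, rate=0.2, rng=None):
    """Inverted dropout: zero out `rate` of the units at random and rescale
    the survivors so the expected activation is unchanged."""
    if rng is None:
        rng = np.random.default_rng(0)
    mask = (rng.random(activations.shape) >= rate) / (1.0 - rate)
    return activations * mask

# A constant positive gradient should push the parameter downward.
theta, m, v = np.array([1.0]), np.zeros(1), np.zeros(1)
for t in range(1, 101):
    theta, m, v = adam_step(theta, np.array([0.5]), m, v, t)
print(theta[0] < 1.0)  # True
```

With a dropout rate of 0.2, each surviving activation is scaled by 1/0.8 = 1.25 so that the layer's expected output matches its test-time behavior.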
2.3.6. Model Architecture
2.3.7. Model Adjustment
2.3.8. Impact of Sequence Length on the Model’s Predictive Performance
2.3.9. Architecture and Configuration of the LSTM Model
3. Results and Discussion
3.1. Model Performance over a 10-Day Period
3.2. Comparative Analysis with the ARIMA Data Assimilation Model
3.3. Forecasting Using Data from Automatic Stations as Input
3.4. Characterization of LSTM Model Performance Zones Based on Computational Resource
4. Conclusions and Outlook
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
ARIMA | Autoregressive Integrated Moving Average |
LSTM | Long Short-Term Memory |
GPU | Graphics Processing Unit |
TPU | Tensor Processing Unit |
RMSE | Root Mean Square Error |
PM | Particulate Matter |
CGQA | Air Quality Management Center (from the French Centre de Gestion de la Qualité de l'Air) |
RNN | Recurrent Neural Network |
RMSProp | Root Mean Square Propagation |
ADAM | Adaptive Moment Estimation |
References
- World Health Organization. 2021. Available online: https://www.who.int/fr/news-room/spotlight/how-air-pollution-is-destroying-our-health (accessed on 1 April 2025).
- Cohen, A.J.; Brauer, M.; Burnett, R.; Anderson, H.R.; Frostad, J.; Estep, K.; Balakrishnan, K.; Brunekreef, B.; Dandona, L.; Dandona, R.; et al. Estimates and 25-year trends of the global burden of disease attributable to ambient air pollution: An analysis of data from the Global Burden of Diseases Study 2015. Lancet 2017, 389, 1907–1918.
- Hayes, R.B.; Lim, C.; Zhang, Y.; Cromar, K.; Shao, Y.; Reynolds, H.R.; Silverman, D.T.; Jones, R.R.; Park, Y.; Jerrett, M.; et al. PM2.5 air pollution and cause-specific cardiovascular disease mortality. Int. J. Epidemiol. 2020, 49, 25–35.
- Brook, R.D.; Rajagopalan, S.; Pope, C.A., 3rd; Brook, J.R.; Bhatnagar, A.; Diez-Roux, A.V.; Holguin, F.; Hong, Y.; Luepker, R.V.; Mittleman, M.A.; et al. Particulate matter air pollution and cardiovascular disease: An update to the scientific statement from the American Heart Association. Circulation 2010, 121, 2331–2378.
- Pope, C.A., III; Dockery, D.W. Health effects of fine particulate air pollution: Lines that connect. J. Air Waste Manag. Assoc. 2006, 56, 709–742.
- Pope, C.A., III; Dockery, D.W. Air pollution and life expectancy in China and beyond. Proc. Natl. Acad. Sci. USA 2013, 110, 12861–12862.
- Liousse, C.; Galy-Lacaux, C. Urban pollution in West Africa. Meteorology 2010, 2010, 45–49.
- Gueye, A.; Drame, M.S.; Diallo, M.; Ngom, B.; Ndao, D.N.; Faye, A.; Pusede, S. PM2.5 Hotspot Identification in Dakar area: An Innovative IoT and Mapping-Based Approach for Effective Air Quality Management. In Proceedings of the IEEE 9th International Conference on Smart Instrumentation, Measurement and Applications (ICSIMA), Kuala Lumpur, Malaysia, 17–18 October 2023; pp. 302–307.
- Venter, Z.S.; Aunan, K.; Chowdhury, S.; Lelieveld, J. COVID-19 lockdowns cause global air pollution declines. Proc. Natl. Acad. Sci. USA 2020, 117, 18984–18990.
- Baklanov, A.; Grimmond, C.S.B.; Carlson, D.; Terblanche, D.; Tang, X.; Bouchet, V.; Lee, B.; Langendijk, G.; Kolli, R.K.; Hovsepyan, A. From urban meteorology, climate and environment research to integrated city services. Urban Clim. 2018, 23, 330–341.
- Keita, S.; Liousse, C.; Yoboué, V.; Dominutti, P.; Guinot, B.; Assamoi, E.M.; Borbon, A.; Haslett, S.L.; Bouvier, L.; Colomb, A.; et al. Particle and VOC emission factor measurements for anthropogenic sources in West Africa. Atmos. Chem. Phys. 2018, 18, 7691–7708.
- Doumbia, T.; Liousse, C.; Ouafo-Leumbe, M.R.; Ndiaye, S.A.; Gardrat, E.; Galy-Lacaux, C.; Zouiten, C.; Yoboué, V.; Granier, C. Source apportionment of ambient particulate matter (PM) in two Western African urban sites (Dakar in Senegal and Bamako in Mali). Atmosphere 2023, 14, 684.
- Gueye, A.; Drame, M.S.; Niang, S.A.A.; Diallo, M.; Toure, M.D.; Niang, D.N.; Talla, K. Enhancing PM2.5 Predictions in Dakar Through Automated Data Integration into a Data Assimilation Model. Aerosol. Sci. Eng. 2024, 8, 402–413.
- Hyndman, R.J.; Athanasopoulos, G. Forecasting: Principles and Practice. OTexts. 2018. Available online: https://otexts.com/fpp2/ (accessed on 10 January 2025).
- Goyal, P.; Chan, A.T.; Jaiswal, N. Statistical models for the prediction of respirable suspended particulate matter in urban cities. Atmos. Environ. 2006, 40, 2068–2077.
- Chaloulakou, A.; Kassomenos, P.; Spyrellis, N.; Demokritou, P.; Koutrakis, P. Measurements of PM10 and PM2.5 particle concentrations in Athens, Greece. Atmos. Environ. 2003, 37, 649–660.
- Sandu, A.; Daescu, D.N.; Carmichael, G.R.; Chai, T. Adjoint sensitivity analysis of regional air quality models. J. Comput. Phys. 2005, 204, 222–252.
- Taylor, J.W. Short-term electricity demand forecasting using double seasonal exponential smoothing. J. Oper. Res. Soc. 2003, 54, 799–805.
- Dominici, F.; McDermott, A.; Daniels, M.; Zeger, S.L.; Samet, J.M. Mortality among residents of 90 cities. In Revised Analyses of Time-Series Studies of Air Pollution and Health; Health Effects Institute: Boston, MA, USA, 2003.
- Tsai, Y.T.; Zeng, Y.R.; Chang, Y.S. Air pollution forecasting using RNN with LSTM. In Proceedings of the 2018 IEEE 16th Intl Conf on Dependable, Autonomic and Secure Computing, 16th Intl Conf on Pervasive Intelligence and Computing, 4th Intl Conf on Big Data Intelligence and Computing and Cyber Science and Technology Congress (DASC/PiCom/DataCom/CyberSciTech), Athens, Greece, 12–15 August 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 1074–1079.
- Hochreiter, S.; Schmidhuber, J. Long short-term memory. Neural Comput. 1997, 9, 1735–1780.
- Athira, V.; Geetha, P.; Vinayakumar, R.; Soman, K.P. Deepairnet: Applying recurrent networks for air quality prediction. Procedia Comput. Sci. 2018, 132, 1394–1403.
- Youness, J.; Driss, M. LSTM Deep Learning vs ARIMA Algorithms for Univariate Time Series Forecasting: A case study. In Proceedings of the 2022 8th International Conference on Optimization and Applications (ICOA), Genoa, Italy, 6–7 October 2022; IEEE: Piscataway, NJ, USA, 2022; pp. 1–4.
- Zhou, H.; Zhang, Y.; Yang, L.; Liu, Q.; Yan, K.; Du, Y. Short-term photovoltaic power forecasting based on long short term memory neural network and attention mechanism. IEEE Access 2019, 7, 78063–78074.
- Hu, M.; Li, W.; Yan, K.; Ji, Z.; Hu, H. Modern machine learning techniques for univariate tunnel settlement forecasting: A comparative study. Math. Probl. Eng. 2019, 2019, 7057612.
- Cao, Y.; Zhou, X.; Yan, K. Deep learning neural network model for tunnel ground surface settlement prediction based on sensor data. Math. Probl. Eng. 2021, 2021, 9488892.
- Alimissis, A.; Philippopoulos, K.; Tzanis, C.G.; Deligiorgi, D. Spatial estimation of urban air pollution with the use of artificial neural network models. Atmos. Environ. 2018, 191, 205–213.
- Elangasinghe, M.A.; Singhal, N.; Dirks, K.N.; Salmond, J.A. Development of an ANN–based air pollution forecasting system with explicit knowledge through sensitivity analysis. Atmos. Pollut. Res. 2014, 5, 696–708.
- Pardo, E.; Malpica, N. Air quality forecasting in Madrid using long short-term memory networks. In Proceedings of the International Work-Conference on the Interplay Between Natural and Artificial Computation, Corunna, Spain, 19–23 June 2017; Springer International Publishing: Cham, Switzerland, 2017; pp. 232–239.
- Song, X.; Huang, J.; Song, D. Air quality prediction based on LSTM-Kalman model. In Proceedings of the 2019 IEEE 8th Joint International Information Technology and Artificial Intelligence Conference (ITAIC), Chongqing, China, 24–26 May 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 695–699.
- Bai, Y.; Li, Y.; Zeng, B.; Li, C.; Zhang, J. Hourly PM2.5 concentration forecast using stacked autoencoder model with emphasis on seasonality. J. Clean. Prod. 2019, 224, 739–750.
- Li, S.; Xie, G.; Ren, J.; Guo, L.; Yang, Y.; Xu, X. Urban PM2.5 concentration prediction via attention-based CNN–LSTM. Appl. Sci. 2020, 10, 1953.
- Zhang, L.; Liu, P.; Zhao, L.; Wang, G.; Zhang, W.; Liu, J. Air quality predictions with a semi-supervised bidirectional LSTM neural network. Atmos. Pollut. Res. 2021, 12, 328–339.
- Rahimzad, M.; Moghaddam Nia, A.; Zolfonoon, H.; Soltani, J.; Danandeh Mehr, A.; Kwon, H.H. Performance comparison of an LSTM-based deep learning model versus conventional machine learning algorithms for streamflow forecasting. Water Resour. Manag. 2021, 35, 4167–4187.
- Song, X.; Liu, Y.; Xue, L.; Wang, J.; Zhang, J.; Wang, J.; Jiang, L.; Cheng, Z. Time-series well performance prediction based on Long Short-Term Memory (LSTM) neural network model. J. Pet. Sci. Eng. 2020, 186, 106682.
- Wang, H.; Yang, Z.; Yu, Q.; Hong, T.; Lin, X. Online reliability time series prediction via convolutional neural network and long short term memory for service-oriented systems. Knowl.-Based Syst. 2018, 159, 132–147.
- Bui, T.C.; Le, V.D.; Cha, S.K. A deep learning approach for forecasting air pollution in South Korea using LSTM. arXiv 2018, arXiv:1804.07891.
- Xayasouk, T.; Lee, H.; Lee, G. Air Pollution Prediction Using Long Short-Term Memory (LSTM) and Deep Autoencoder (DAE) Models. Sustainability 2020, 12, 2570.
- Yang, R.; Singh, S.K.; Tavakkoli, M.; Amiri, N.; Yang, Y.; Karami, M.A.; Rai, R. CNN-LSTM deep learning architecture for computer vision-based modal frequency detection. Mech. Syst. Signal Process. 2020, 144, 106885.
- Hewamalage, H.; Bergmeir, C.; Bandara, K. Recurrent neural networks for time series forecasting: Current status and future directions. Int. J. Forecast. 2021, 37, 388–427.
- Bengio, Y.; Simard, P.; Frasconi, P. Learning long-term dependencies with gradient descent is difficult. IEEE Trans. Neural Netw. 1994, 5, 157–166.
- Zhong, S.; Zhang, K.; Bagheri, M.; Burken, J.G.; Gu, A.; Li, B.; Ma, X.; Marrone, B.L.; Ren, Z.J.; Schrier, J.; et al. Machine learning: New ideas and tools in environmental science and engineering. Environ. Sci. Technol. 2021, 55, 12741–12754.
- LeCun, Y.; Bottou, L.; Bengio, Y.; Haffner, P. Gradient-based learning applied to document recognition. Proc. IEEE 1998, 86, 2278–2324.
- Goodfellow, I.; Bengio, Y.; Courville, A. Deep Learning; MIT Press: Cambridge, MA, USA, 2016; Available online: https://www.deeplearningbook.org (accessed on 10 January 2025).
- Kochkina, E.; Liakata, M.; Augenstein, I. Turing at SemEval-2017 Task 8: Sequential approach to rumour stance classification with branch-LSTM. arXiv 2017, arXiv:1704.07221.
- Kim, T.Y.; Cho, S.B. Web traffic anomaly detection using C-LSTM neural networks. Expert Syst. Appl. 2018, 106, 66–76.
- Zargar, S. Introduction to Sequence Learning Models: RNN, LSTM, GRU; Department of Mechanical and Aerospace Engineering, North Carolina State University: Raleigh, NC, USA, 2021.
- Krause, B.; Lu, L.; Murray, I.; Renals, S. Multiplicative LSTM for sequence modelling. arXiv 2016, arXiv:1609.07959.
- Shankar, R.; Sarojini, B.K.; Mehraj, H.; Kumar, A.S.; Neware, R.; Singh Bist, A. Impact of the learning rate and batch size on NOMA system using LSTM-based deep neural network. J. Def. Model. Simul. 2023, 20, 259–268.
- Choudhury, N.A.; Soni, B. An adaptive batch size-based-CNN-LSTM framework for human activity recognition in uncontrolled environment. IEEE Trans. Ind. Inform. 2023, 19, 10379–10387.
- Lakretz, Y.; Kruszewski, G.; Desbordes, T.; Hupkes, D.; Dehaene, S.; Baroni, M. The emergence of number and syntax units in LSTM language models. arXiv 2019, arXiv:1903.07435.
- Salman, A.G.; Heryadi, Y.; Abdurahman, E.; Suparta, W. Single layer & multi-layer long short-term memory (LSTM) model with intermediate variables for weather forecasting. Procedia Comput. Sci. 2018, 135, 89–98.
- Hastomo, W.; Karno, A.S.B.; Kalbuana, N.; Meiriki, A. Characteristic parameters of epoch deep learning to predict COVID-19 data in Indonesia. J. Phys. Conf. Ser. 2021, 1933, 012050.
- You, Y.; Hseu, J.; Ying, C.; Demmel, J.; Keutzer, K.; Hsieh, C.J. Large-batch training for LSTM and beyond. In Proceedings of the International Conference for High Performance Computing, Networking, Storage and Analysis, Denver, CO, USA, 17–22 November 2019; pp. 1–16.
- Reimers, N.; Gurevych, I. Optimal hyperparameters for deep LSTM-networks for sequence labeling tasks. arXiv 2017, arXiv:1707.06799.
- Greff, K.; Srivastava, R.K.; Koutník, J.; Steunebrink, B.R.; Schmidhuber, J. LSTM: A search space odyssey. IEEE Trans. Neural Netw. Learn. Syst. 2016, 28, 2222–2232.
- Freeman, B.S.; Taylor, G.; Gharabaghi, B.; Thé, J. Forecasting air quality time series using deep learning. J. Air Waste Manag. Assoc. 2018, 68, 866–886.
- Kingma, D.P.; Ba, J. Adam: A method for stochastic optimization. arXiv 2014, arXiv:1412.6980.
- Srivastava, N.; Hinton, G.; Krizhevsky, A.; Sutskever, I.; Salakhutdinov, R. Dropout: A simple way to prevent neural networks from overfitting. J. Mach. Learn. Res. 2014, 15, 1929–1958. Available online: http://jmlr.org/papers/v15/srivastava14a.html (accessed on 20 March 2025).
- Haque, A.; Rahman, S. Short-term electrical load forecasting through heuristic configuration of regularized deep neural network. Appl. Soft Comput. 2022, 122, 108877.
- Rodola, G. Psutil Documentation. Psutil. 2020. Available online: https://psutil.readthedocs.io/en/latest (accessed on 1 February 2025).
- Hadri, S.; Naitmalek, Y.; Najib, M.; Bakhouya, M.; Fakhri, Y.; Elaroussi, M. A comparative study of predictive approaches for load forecasting in smart buildings. Procedia Comput. Sci. 2019, 160, 173–180.
- Li, R.; Li, Z.; Gao, W.; Ding, W.; Xu, Q.; Song, X. Diurnal, seasonal, and spatial variation of PM2.5 in Beijing. Sci. Bull. 2015, 60, 387–395.
- Siami-Namini, S.; Tavakoli, N.; Namin, A.S. A Comparison of ARIMA and LSTM in Forecasting Time Series. In Proceedings of the 2018 17th IEEE International Conference on Machine Learning and Applications (ICMLA), Orlando, FL, USA, 17–20 December 2018; pp. 1394–1401.
- Li, W.; Yi, L.; Yin, X. Real time air monitoring, analysis and prediction system based on internet of things and LSTM. In Proceedings of the 2020 International Conference on Wireless Communications and Signal Processing (WCSP), Nanjing, China, 21–23 October 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 188–194.
- Mao, W.; Wang, W.; Jiao, L.; Zhao, S.; Liu, A. Modeling air quality prediction using a deep learning approach: Method optimization and evaluation. Sustain. Cities Soc. 2021, 65, 102567.
- Huang, R.; Wei, C.; Wang, B.; Yang, J.; Xu, X.; Wu, S.; Huang, S. Well performance prediction based on Long Short-Term Memory (LSTM) neural network. J. Pet. Sci. Eng. 2022, 208, 109686.
Parameter | Description
---|---
Sequence length | The number of time steps considered for each input sample, enabling the modeling of temporal dependencies in the data
Number of layers | The number of LSTM layers in the model, influencing the depth of learning and the ability to capture temporal relationships
Neurons per layer | The number of neurons per layer controls the complexity and modeling capacity of the network, enabling it to identify complex relationships
Hyperparameter | Description
---|---
Batch size | The number of samples processed in each iteration. A larger batch size allows for a more stable gradient estimate.
Number of epochs | The number of times the model passes through the entire training dataset, thereby influencing the level of learning. It is used in combination with early stopping to prevent overfitting.
Dropout rate | The percentage of neurons temporarily deactivated during each pass to limit overfitting and improve the model's generalization.
Activation function | The nonlinear function applied to the neurons, typically ReLU [45] or tanh [46], which enhances the model's ability to represent complex data.
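As an illustration of the sequence-length parameter described above, a daily PM2.5 series can be cut into overlapping input windows, each carrying `seq_len` time steps of history. This sliding-window helper is a generic sketch (the function and variable names are ours), not the authors' preprocessing code.

```python
import numpy as np

def make_sequences(series, seq_len):
    """Turn a 1-D series into (samples, seq_len) inputs X and
    next-step targets y for a recurrent model."""
    X = np.array([series[i:i + seq_len] for i in range(len(series) - seq_len)])
    y = series[seq_len:]
    return X, y

pm25 = np.arange(100, dtype=float)       # stand-in for a daily PM2.5 series
X, y = make_sequences(pm25, seq_len=20)  # seq_len=20 matches the retained model
print(X.shape, y.shape)                  # (80, 20) (80,)
```

Each input sample thus sees 20 past days, and its target is the value on day 21; longer windows give the model more context at the cost of fewer training samples and longer runtimes.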
Sequence Length | Batch Size | Epochs | Optimizer | Number of Neurons | RMSE
---|---|---|---|---|---
30 | 16 | 500 | Adam | 500 | 7.44
10 | 16 | 500 | Adam | 500 | 7.62
40 | 16 | 500 | Adam | 500 | 8.02
20 | 16 | 500 | Adam | 500 | 8.11
Sequence Length | RMSE | Mean Error | Std Error | R2 | Runtime (min) | CPU (GB)
---|---|---|---|---|---|---
10 | 3.10 | 0.25 | 2.70 | 0.988 | 37 | 0.38
20 | 3.00 | 0.10 | 2.29 | 0.989 | 44 | 0.54
30 | 3.41 | −0.07 | 2.60 | 0.988 | 52 | 0.72
40 | 3.11 | −0.35 | 2.33 | 0.900 | 145 | 0.43
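The RMSE and R2 values reported in these tables follow their standard definitions and can be computed directly; the helper below is a generic sketch rather than the authors' evaluation script, with illustrative sample data.

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root mean square error: square root of the mean squared residual."""
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def r2(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    return float(1.0 - ss_res / ss_tot)

y_true = np.array([10.0, 20.0, 30.0, 40.0])   # illustrative observations
y_pred = np.array([12.0, 18.0, 33.0, 38.0])   # illustrative forecasts
print(round(rmse(y_true, y_pred), 3), round(r2(y_true, y_pred), 3))  # 2.291 0.958
```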
Sequence Length | Epochs | Batch Size | Neurons (1st Layer) | Neurons (2nd Layer) | Optimizer | Dropout
---|---|---|---|---|---|---
20 | 500 | 16 | 500 | 50 | ADAM | 0.2
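To gauge the size of the retained architecture, the standard LSTM parameter-count formula, 4 × (n_units × (n_inputs + n_units) + n_units), can be applied to each layer. The assumption of a univariate input (one feature, the PM2.5 value) is ours, so the totals below are indicative only.

```python
def lstm_params(n_inputs, n_units):
    """Trainable weights in one LSTM layer: the 4 gates (input, forget,
    cell, output) each have an input matrix, a recurrent matrix, and a bias."""
    return 4 * (n_units * (n_inputs + n_units) + n_units)

layer1 = lstm_params(n_inputs=1, n_units=500)   # univariate input assumed
layer2 = lstm_params(n_inputs=500, n_units=50)  # fed by layer 1's 500 units
print(layer1, layer2)  # 1004000 110200
```

Under that assumption the first layer dominates with roughly 1.0 million weights, which is consistent with the runtimes of tens of minutes reported below on CPU hardware.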
Day | R2 (ARIMA) | R2 (LSTM) | RMSE (ARIMA) | RMSE (LSTM)
---|---|---|---|---
Day 1 | 0.80 | 0.95 | 14.99 | 7.63
Day 2 | 0.75 | 0.94 | 17.42 | 7.74
Day 3 | 0.69 | 0.94 | 18.17 | 7.75
Day 4 | 0.70 | 0.94 | 18.41 | 7.81
Day 5 | 0.69 | 0.93 | 18.46 | 7.82
Day 6 | 0.68 | 0.93 | 18.64 | 7.91
Day 7 | 0.65 | 0.92 | 20.52 | 8.15
Model | Max Diff | Min Diff | Mean Diff | RMSE | R2
---|---|---|---|---|---
LSTM | 0.49 | 3.06 | 0.31 | 5.18 | 0.97
ARIMA | 16 | 4.45 | 4.8 | 13.16 | 0.79
Processor | Type | System RAM | Disk | Runtime | CPU Memory Used | RMSE | R2
---|---|---|---|---|---|---|---
Colab T4 GPU | NVIDIA T4 graphics processor | 15.0 GB | 112.6 GB | 5 min 40 s | 1.23 GB | 3.48 | 0.98
Colab TPU v2-8 | Tensor Processing Unit | 334.6 GB | 225.3 GB | 25 min | 2.31 GB | 2.88 | 0.99
Kaggle (CPU) | Server | 32.2 GB | 61.86 GB | 37 min 21 s | 0.89 GB | 2.92 | 0.99
Local server (CPU) | Computer | 16 GB | 500 GB | 41 min 31 s | 0.41 GB | 3.00 | 0.99
Colab server (CPU) | Server | 12.7 GB | 107.7 GB | 79 min 50 s | 0.79 GB | 3.05 | 0.98
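The authors cite the psutil library for collecting resource figures like those above. A dependency-free way to obtain comparable numbers uses only the Python standard library (`time` for wall-clock runtime, `tracemalloc` for peak Python heap usage); this is a rough sketch with a toy workload, not the authors' instrumentation, and `tracemalloc` sees only Python-level allocations rather than total process memory.

```python
import time
import tracemalloc

def profile(fn, *args):
    """Run fn(*args) and return (result, runtime in seconds, peak Python
    memory in MB as seen by tracemalloc)."""
    tracemalloc.start()
    t0 = time.perf_counter()
    result = fn(*args)
    runtime = time.perf_counter() - t0
    _, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return result, runtime, peak / 1e6

# Toy workload standing in for a training run.
total, secs, peak_mb = profile(lambda n: sum(range(n)), 1_000_000)
print(total, round(secs, 4), round(peak_mb, 3))
```

For full-process figures matching the table (RAM, CPU memory used), psutil's `Process.memory_info()` would be the closer tool, at the cost of an external dependency.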
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Gueye, A.; Niang, S.A.A.; Diallo, I.; Dramé, M.S.; Diallo, M.; Younous, A.A. On the Application of Long Short-Term Memory Neural Network for Daily Forecasting of PM2.5 in Dakar, Senegal (West Africa). Sustainability 2025, 17, 5421. https://doi.org/10.3390/su17125421