# Solar Irradiance Forecasting Based on Deep Learning Methodologies and Multi-Site Data


## Abstract


## 1. Introduction

#### 1.1. Motivation of the Study

#### 1.2. Problem Statement

#### 1.3. Related Work

#### 1.4. Contributions of the Study

- Time series models based on deep learning methodologies were implemented to forecast the daily solar irradiance of two locations in India using the historical data collected.
- The models were developed from single-location univariate solar irradiance data as well as from data collected at multiple locations.
- The accuracy, performance and reliability of the models were investigated using standard performance evaluation metrics and rolling window evaluation.
- The feature importance of nearby locations for forecasting target-location solar irradiance was analyzed and compared using 36 years of solar irradiance data obtained from NASA.
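The rolling window evaluation mentioned in these contributions can be sketched as follows. This is an illustrative outline only: the `rolling_window_eval` helper, the persistence forecaster, and the window/horizon sizes are hypothetical placeholders, not the configuration used in the paper.

```python
import numpy as np

def rolling_window_eval(series, window, horizon, forecast_fn):
    """Evaluate a forecaster by sliding a fixed-size training window over
    the series and scoring the next `horizon` observations each time."""
    errors = []
    for start in range(0, len(series) - window - horizon + 1, horizon):
        train = series[start:start + window]
        actual = series[start + window:start + window + horizon]
        pred = forecast_fn(train, horizon)
        errors.append(np.mean((np.asarray(pred) - actual) ** 2))
    return float(np.mean(errors))  # mean MSE across all windows

# usage: a naive "persistence" forecaster that repeats the last value
naive = lambda train, h: [train[-1]] * h
data = np.sin(np.linspace(0, 20, 500))  # stand-in for a daily irradiance series
score = rolling_window_eval(data, window=100, horizon=7, forecast_fn=naive)
```

Because every test segment is out-of-sample, this gives a more honest picture of reliability than a single train/test split.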

## 2. Materials and Methods

#### 2.1. Data Collection

#### 2.2. Data Selection

#### 2.2.1. Pearson Correlation

#### 2.2.2. Spearman Correlation

#### 2.2.3. XGBoost

#### 2.3. Forecast Methodology

#### 2.3.1. LSTM

#### 2.3.2. Bidirectional LSTM

#### 2.3.3. GRU

#### 2.3.4. CNN LSTM

#### 2.3.5. Attention LSTM

where $h_{t-1}$ is the hidden state and $s_{t-1}$ the cell state. $W_{a}$, $W_{x}$ and $b_{a}$ are the attention weights and bias. As can be observed from Figure 9, after the input layer there is an encoder layer that processes the input and forwards it to the attention layer. Here, $\sigma$ represents the softmax function applied to the score. The attention layer processes the input according to the equations stated above and then feeds its output to the decoder, which passes it on to the output layer. The encoder and decoder layers are simply DNN hidden layers and nodes that perform typical sequential learning over the complete process. The additional benefit is the computation of relevant information extracted through the scoring function, which aids in managing long-term dependencies through the introduction of a single attention layer.
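The encoder–attention–decoder flow described above can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the additive scoring function, the vector dimensions, and the `attention` helper below are assumptions made for illustration.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attention(encoder_states, query, W_a, b_a):
    """Additive-attention sketch: score each encoder state against the
    decoder query, softmax the scores, and return the weighted context.
    W_a and b_a play the role of the attention weights and bias."""
    # score_t = tanh(W_a . [h_t ; query] + b_a), one scalar per time step
    scores = np.array([
        float(np.tanh(W_a @ np.concatenate([h, query]) + b_a))
        for h in encoder_states
    ])
    alphas = softmax(scores)  # the sigma(score) step in the text
    context = (alphas[:, None] * encoder_states).sum(axis=0)
    return context, alphas

# toy usage: 5 encoder states and a query, all of dimension 4
rng = np.random.default_rng(0)
H = rng.normal(size=(5, 4))
q = rng.normal(size=4)
W_a = rng.normal(size=8)  # maps the concatenated [h; q] (dim 8) to a scalar
b_a = 0.0
ctx, a = attention(H, q, W_a, b_a)
```

The softmax guarantees the attention weights sum to one, so the context vector is a convex combination of encoder states — distant time steps can contribute directly instead of being squeezed through the recurrence.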

#### 2.4. Performance Evaluation Metrics

## 3. Results and Discussion

#### 3.1. Test Results

#### 3.2. Analysis of Diversity and Robustness

#### 3.3. Feature Importance Analysis

#### 3.4. Comparison for Different Horizons

## 4. Conclusions

## Author Contributions

## Funding

## Acknowledgments

## Conflicts of Interest

## Appendix A

## References

- Heng, J.; Wang, J.; Xiao, L.; Lu, H. Research and application of a combined model based on frequent pattern growth algorithm and multi-objective optimization for solar radiation forecasting. Appl. Energy
**2017**, 208, 845–866. [Google Scholar] [CrossRef] - Amrouche, B.; Le Pivert, X. Artificial neural network based daily local forecasting for global solar radiation. Appl. Energy
**2014**, 130, 333–341. [Google Scholar] [CrossRef] - Amrouche, B.; Sicot, L.; Guessoum, A.; Belhamel, M. Experimental analysis of the maximum power point’s properties for four photovoltaic modules from different technologies: Monocrystalline and polycrystalline silicon, CIS and CdTe. Sol. Energy Mater. Sol. Cells
**2013**. [Google Scholar] [CrossRef] - Lubitz, W. Effect of manual tilt adjustments on incident irradiance on fixed and tracking solar panels. Appl. Energy
**2011**, 88, 1710–1719. [Google Scholar] [CrossRef] - Su, Y.; Chan, L.; Shu, L.; Tsui, K. Real-time prediction models for output power and efficiency of grid-connected solar photovoltaic systems. Appl. Energy
**2012**, 93, 319–326. [Google Scholar] [CrossRef] - Thapar, V.; Agnihotri, G.; Sethi, V.K. Estimation of Hourly Temperature at a Site and its Impact on Energy Yield of a PV Module. Int. J. Green Energy
**2012**, 9, 553–572. [Google Scholar] [CrossRef] - Thapar, V. A revisit to solar radiation estimations using sunshine duration: Analysis of impact of these estimations on energy yield of a PV generating system. Energy Sources Part A Recovery Util. Environ. Eff.
**2019**, 1–25. [Google Scholar] [CrossRef] - Tsay, R.S. Analysis of Financial Time Series, 2nd ed.; John Wiley & Sons: New York, NY, USA, 2005. [Google Scholar] [CrossRef]
- Mateo, F.; Carrasco, J.; Sellami, A.; Millán-Giraldo, M.; Domínguez, M.; Soria-Olivas, E. Machine learning methods to forecast temperature in buildings. Expert Syst. Appl.
**2013**, 40, 1061–1068. [Google Scholar] [CrossRef] - Box, G.; Jenkins, G.; Reinsel, G.; Ljung, G. Time Series Analysis: Forecasting & Control; John Wiley & Sons: New York, NY, USA, 2015. [Google Scholar] [CrossRef]
- Li, Y.; Su, Y.; Shu, L. An ARMAX model for forecasting the power output of a grid connected photovoltaic system. Renew. Energy
**2014**, 66, 78–89. [Google Scholar] [CrossRef] - Zhang, P. Time series forecasting using a hybrid ARIMA and neural network model. Neurocomputing
**2003**, 50, 159–175. [Google Scholar] [CrossRef] - Brockwell, P.; Davis, R. Introduction to Time Series and Forecasting, 2nd ed.; Springer: Berlin/Heidelberg, Germany, 2002. [Google Scholar] [CrossRef]
- Kariniotakis, G. Renewable Energy Forecasting: From Models to Applications; Woodhead Publishing: Cambridge, UK, 2017. [Google Scholar]
- Aburto, L.; Weber, R. Improved supply chain management based on hybrid demand forecasts. Appl. Soft Comput. J.
**2007**, 7, 136–144. [Google Scholar] [CrossRef] - Creal, D.; Koopman, S.; Lucas, A. Generalized autoregressive score models with applications. J. Appl. Econom.
**2013**, 28, 777–795. [Google Scholar] [CrossRef] [Green Version] - Neves, C.; Fernandes, C.; Hoeltgebaum, H. Five different distributions for the Lee–Carter model of mortality forecasting: A comparison using GAS models. Insur. Math. Econ.
**2017**, 75, 48–57. [Google Scholar] [CrossRef] - Belmahdi, B.; Louzazni, M.; Bouardi, A.E. One month-ahead forecasting of mean daily global solar radiation using time series models. Optik
**2020**, 219, 165207. [Google Scholar] [CrossRef] - Yagli, G.M.; Yang, D.; Srinivasan, D. Automatic hourly solar forecasting using machine learning models. Renew. Sustain. Energy Rev.
**2019**, 105, 487–498. [Google Scholar] [CrossRef] - Alzahrani, A.; Shamsi, P.; Ferdowsi, M.; Dagli, C. Solar irradiance forecasting using deep recurrent neural networks. In Proceedings of the 2017 IEEE 6th International Conference on Renewable Energy Research and Applications (ICRERA), San Diego, CA, USA, 5–8 November 2017; pp. 988–994. [Google Scholar] [CrossRef]
- Rumelhart, D.; Hinton, G.; Williams, R. Learning representations by back-propagating errors. Nature
**1986**, 323, 533–536. [Google Scholar] [CrossRef] - Hochreiter, S.; Schmidhuber, J. Long Short-Term Memory. Neural Comput.
**1997**, 9, 1735–1780. [Google Scholar] [CrossRef] - Srivastava, S.; Lessmann, S. A comparative study of LSTM neural networks in forecasting day-ahead global horizontal irradiance with satellite data. Sol. Energy
**2018**, 162, 232–247. [Google Scholar] [CrossRef] - Qing, X.; Niu, Y. Hourly day-ahead solar irradiance prediction using weather forecasts by LSTM. Energy
**2018**, 148, 461–468. [Google Scholar] [CrossRef] - Wang, Y.; Shen, Y.; Mao, S.; Chen, X.; Zou, H. LASSO & LSTM Integrated Temporal Model for Short-term Solar Intensity Forecasting. IEEE Internet Things J.
**2018**. [Google Scholar] [CrossRef] - Shih, S.; Sun, F.; Lee, H.Y. Temporal pattern attention for multivariate time series forecasting. Mach. Learn.
**2019**, 108, 1421–1441. [Google Scholar] [CrossRef] [Green Version] - Li, S.; Jin, X.; Xuan, Y.; Zhou, X.; Chen, W.; Wang, Y.X.; Yan, X. Enhancing the Locality and Breaking the Memory Bottleneck of Transformer on Time Series Forecasting. Adv. Neural Inf. Process. Syst.
**2020**, 5243–5253. Available online: http://papers.nips.cc/paper/8766-enhancing-the-locality-and-breaking-the-memory-bottleneck-of-transformer-on-time-series-forecasting.pdf (accessed on 2 November 2020). - Greff, K.; Srivastava, R.; Koutnik, J.; Steunebrink, B.; Schmidhuber, J. LSTM: A Search Space Odyssey. IEEE Trans. Neural Netw. Learn. Syst.
**2017**, 28, 2222–2232. [Google Scholar] [CrossRef] [Green Version] - Graves, A.; Schmidhuber, J. Framewise phoneme classification with bidirectional LSTM and other neural network architectures. Neural Netw.
**2005**, 18, 602–610. [Google Scholar] [CrossRef] - Cho, K.; van Merrienboer, B.; Gulcehre, C.; Bahdanau, D.; Bougares, F.; Schwenk, H.; Bengio, Y. Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation. arXiv
**2014**, arXiv:1406.1078. [Google Scholar] - Geron, A. Hands-on Machine Learning with Scikit-Learn and TensorFlow: Concepts, Tools, and Techniques to Build Intelligent Systems; O’Reilly Media: Sebastopol, CA, USA, 2017. [Google Scholar]
- LeCun, Y.; Haffner, P.; Bottou, L.; Bengio, Y. Object recognition with gradient-based learning. In Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Springer: Berlin/Heidelberg, Germany, 1999; Volume 1681, pp. 319–345. [Google Scholar] [CrossRef]
- Sutskever, I.; Vinyals, O.; Le, Q.V. Sequence to Sequence Learning with Neural Networks. Adv. Neural Inf. Process. Syst. 2014. Available online: https://papers.nips.cc/paper/5346-sequence-to-sequence-learning-with-neural-networks.pdf (accessed on 2 November 2020).
- Brahma, B.; Wadhvani, R. Time Series Forecasting: A Comparison of Deep Neural Network Techniques. Solid State Technol.
**2020**, 63, 1747–1761. [Google Scholar] - Graves, A.; Wayne, G.; Danihelka, I. Neural Turing Machines. arXiv
**2014**, arXiv:1410.5401. [Google Scholar] - Tashman, L.J. Out-of-sample tests of forecasting accuracy: An analysis and review. Int. J. Forecast.
**2000**, 16, 437–450. [Google Scholar] [CrossRef] - Barrow, D.K.; Crone, S.F. Cross-validation aggregation for combining autoregressive neural network forecasts. Int. J. Forecast.
**2016**, 32, 1120–1137. [Google Scholar] [CrossRef] [Green Version] - Guermoui, M.; Melgani, F.; Gairaa, K.; Mekhalfi, M.L. A comprehensive review of hybrid models for solar radiation forecasting. J. Clean. Prod.
**2020**, 258, 120357. [Google Scholar] [CrossRef] - Hajirahimi, Z.; Khashei, M. Hybrid structures in time series modeling and forecasting: A review. Eng. Appl. Artif. Intell.
**2019**, 86, 83–106. [Google Scholar] [CrossRef]

**Figure 2.** Point data and multi-site data collected for location 1. Adapted from the POWER Data Access Viewer by the NASA Langley Research Center (LaRC) POWER Project funded through the NASA Earth Science/Applied Science Program (https://power.larc.nasa.gov/data-access-viewer/).

**Figure 3.** Point data and multi-site data collected for location 2. Adapted from the POWER Data Access Viewer by the NASA Langley Research Center (LaRC) POWER Project funded through the NASA Earth Science/Applied Science Program (https://power.larc.nasa.gov/data-access-viewer/).

**Figure 7.** Long short-term memory (LSTM) cell. Adapted from "LSTM: A Search Space Odyssey" by K. Greff, R. K. Srivastava, J. Koutník, B. R. Steunebrink and J. Schmidhuber, 2017, IEEE Transactions on Neural Networks and Learning Systems, 28(10), pp. 2222–2232.

**Figure 8.** Gated recurrent unit (GRU) cell. Adapted from "Handling Long Sequences" in A. Geron, Hands-On Machine Learning with Scikit-Learn, Keras, and TensorFlow (p. 519), 2019, O'Reilly Media, Inc., Sebastopol, CA.

**Figure 9.** Attention mechanism. Adapted from "Time Series Forecasting: A Comparison of Deep Neural Network Techniques" by B. Brahma and R. Wadhvani, 2020, Solid State Technology, 63(6), pp. 1747–1761.

| Location # | Target Location Coordinates | Number of Enclosed Regional Sites | Enclosed Site Coordinates |
|---|---|---|---|
| 1 | 23.25991, 77.41261 | 12 | 23.9645, 78.2255; 22.7947, 76.5556 |
| 2 | 22.71961, 75.85771 | 15 | 23.4011, 76.7314; 22.1450, 74.9077 |

| Statistic | Value for Location 1 | Value for Location 2 |
|---|---|---|
| Total Observations | 13,000 | 13,000 |
| Date Range | 1983–2019 | 1983–2019 |
| Minimum | 0.2 | 0.16 |
| Maximum | 8.41 | 8.66 |
| Mean | 5.09 | 5.17 |
| Standard Deviation | 1.42 | 1.38 |

| Metric | Equation |
|---|---|
| MSE | $\frac{1}{N}\sum_{n=1}^{N}\left(E(D_{n}\mid \overrightarrow{X_{n}})-F(\overrightarrow{X_{n}},\overrightarrow{w})\right)^{2}$ |
| RMSE | $\sqrt{\frac{1}{N}\sum_{n=1}^{N}\left(E(D_{n}\mid \overrightarrow{X_{n}})-F(\overrightarrow{X_{n}},\overrightarrow{w})\right)^{2}}$ |
| ${R}^{2}$ | $1-\frac{\sum_{n=1}^{N}\left(E(D_{n}\mid \overrightarrow{X_{n}})-F(\overrightarrow{X_{n}},\overrightarrow{w})\right)^{2}}{\sum_{n=1}^{N}\left(E(D_{n}\mid \overrightarrow{X_{n}})-\overline{E(D\mid \overrightarrow{X})}\right)^{2}}$ |

Here $\overline{E(D\mid \overrightarrow{X})}$ denotes the mean of the observed values over all $N$ samples, so the ${R}^{2}$ denominator is the total variance of the observations.
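The metrics in the table can be computed directly. The `evaluate` helper below is an illustrative implementation (not the authors' code) in which `actual` plays the role of $E(D_n \mid \overrightarrow{X_n})$ and `predicted` that of $F(\overrightarrow{X_n}, \overrightarrow{w})$:

```python
import numpy as np

def evaluate(actual, predicted):
    """Return (MSE, RMSE, R^2) for a pair of observation/forecast series."""
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    mse = float(np.mean((actual - predicted) ** 2))
    rmse = float(np.sqrt(mse))
    ss_res = np.sum((actual - predicted) ** 2)        # residual sum of squares
    ss_tot = np.sum((actual - actual.mean()) ** 2)    # total sum of squares
    r2 = float(1.0 - ss_res / ss_tot)
    return mse, rmse, r2

# toy usage on four daily irradiance observations vs. forecasts
mse, rmse, r2 = evaluate([5.0, 5.2, 4.8, 5.1], [4.9, 5.3, 4.7, 5.2])
```

Note that RMSE shares the units of the target series, while ${R}^{2}$ is dimensionless and at most 1.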

| Model | MSE (Single) | RMSE (Single) | ${R}^{2}$ (Single) | MSE (Multi) | RMSE (Multi) | ${R}^{2}$ (Multi) |
|---|---|---|---|---|---|---|
| LSTM | 9.721 | 9.859 | 68.62 | 9.581 | 9.788 | 69.06 |
| GRU | 9.825 | 9.912 | 68.28 | 9.193 | 9.588 | 70.32 |
| CNN | 9.714 | 9.856 | 68.64 | 9.213 | 9.598 | 70.25 |
| Bidir | 9.617 | 9.806 | 68.95 | 9.094 | 9.536 | 70.64 |
| Attention | 9.610 | 9.803 | 68.97 | 9.399 | 9.695 | 69.65 |

| Model | MSE (Single) | RMSE (Single) | ${R}^{2}$ (Single) | MSE (Multi) | RMSE (Multi) | ${R}^{2}$ (Multi) |
|---|---|---|---|---|---|---|
| LSTM | 13.19 | 11.48 | 57.41 | 12.79 | 11.31 | 58.68 |
| GRU | 13.28 | 11.52 | 57.13 | 12.93 | 11.37 | 58.23 |
| CNN | 13.22 | 11.50 | 57.31 | 12.94 | 11.37 | 58.23 |
| Bidir | 13.20 | 11.49 | 57.36 | 12.78 | 11.30 | 58.74 |
| Attention | 13.21 | 11.49 | 57.34 | 12.69 | 11.26 | 59.01 |

| Model | MSE (Single) | RMSE (Single) | ${R}^{2}$ (Single) | MSE (Multi) | RMSE (Multi) | ${R}^{2}$ (Multi) |
|---|---|---|---|---|---|---|
| LSTM | 15.48 | 12.44 | 50.24 | 15.19 | 12.33 | 50.98 |
| GRU | 15.41 | 12.41 | 50.50 | 16.25 | 12.75 | 47.57 |
| CNN | 15.05 | 12.26 | 51.66 | 18.37 | 13.55 | 40.75 |
| Bidir | 15.16 | 12.31 | 51.28 | 14.84 | 12.18 | 52.12 |
| Attention | 15.23 | 12.34 | 51.05 | 14.38 | 11.99 | 53.56 |

| Model | MSE (Single) | RMSE (Single) | ${R}^{2}$ (Single) | MSE (Multi) | RMSE (Multi) | ${R}^{2}$ (Multi) |
|---|---|---|---|---|---|---|
| LSTM | 7.911 | 8.894 | 72.29 | 7.678 | 8.762 | 73.11 |
| GRU | 7.869 | 8.871 | 72.44 | 8.106 | 9.003 | 71.61 |
| CNN | 7.937 | 8.909 | 72.21 | 8.317 | 9.120 | 70.86 |
| Bidir | 7.794 | 8.828 | 72.71 | 7.610 | 8.724 | 73.34 |
| Attention | 7.866 | 8.869 | 72.45 | 7.631 | 8.735 | 73.27 |

| Model | MSE (Single) | RMSE (Single) | ${R}^{2}$ (Single) | MSE (Multi) | RMSE (Multi) | ${R}^{2}$ (Multi) |
|---|---|---|---|---|---|---|
| LSTM | 10.63 | 10.31 | 62.75 | 10.50 | 10.27 | 63.06 |
| GRU | 10.56 | 10.27 | 63.01 | 10.67 | 10.33 | 62.64 |
| CNN | 10.68 | 10.34 | 62.57 | 10.93 | 10.45 | 61.72 |
| Bidir | 10.76 | 10.37 | 62.30 | 10.50 | 10.25 | 63.21 |
| Attention | 10.79 | 10.38 | 62.21 | 10.69 | 10.34 | 62.55 |

| Model | MSE (Single) | RMSE (Single) | ${R}^{2}$ (Single) | MSE (Multi) | RMSE (Multi) | ${R}^{2}$ (Multi) |
|---|---|---|---|---|---|---|
| LSTM | 12.51 | 11.18 | 56.34 | 12.61 | 11.23 | 55.89 |
| GRU | 13.07 | 11.43 | 54.38 | 12.98 | 11.39 | 54.59 |
| CNN | 13.40 | 11.57 | 53.25 | 15.12 | 12.29 | 47.11 |
| Bidir | 12.83 | 11.32 | 55.23 | 12.54 | 11.19 | 56.14 |
| Attention | 12.65 | 11.25 | 55.84 | 12.45 | 11.16 | 56.44 |

| Function | 1 | 2 | 4 | 5 | 8 |
|---|---|---|---|---|---|
| Pearson | 87.4959 | 90.5483 | 87.2493 | 91.7756 | 91.9108 |
| Spearman | 88.9031 | 91.3695 | 88.2469 | 92.2826 | 92.3385 |
| XGBoost | 0.3709 | 21.3245 | 0.1822 | 34.5608 | 43.5614 |

| Function | 1 | 4 | 11 | 12 | 14 |
|---|---|---|---|---|---|
| Pearson | 92.5453 | 91.9138 | 86.2631 | 91.2246 | 87.3032 |
| Spearman | 92.5622 | 91.9727 | 87.4589 | 91.7827 | 88.6667 |
| XGBoost | 37.8932 | 41.0277 | 0.3307 | 20.4562 | 0.2919 |
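Correlation scores of the kind reported in these tables can be computed as below. The series are synthetic stand-ins for illustration, not the NASA data, and the XGBoost importance step is omitted to keep the sketch dependency-free:

```python
import numpy as np

def pearson_pct(x, y):
    """Pearson correlation between two series, as a percentage."""
    return float(np.corrcoef(x, y)[0, 1]) * 100

def spearman_pct(x, y):
    """Spearman correlation: Pearson on the rank-transformed series."""
    rx = np.argsort(np.argsort(x))  # ranks of x
    ry = np.argsort(np.argsort(y))  # ranks of y
    return pearson_pct(rx, ry)

# synthetic stand-ins: a target-site series and one correlated nearby site
rng = np.random.default_rng(1)
target = rng.normal(5.1, 1.4, 1000)
site = 0.9 * target + rng.normal(0, 0.5, 1000)

p = pearson_pct(site, target)
s = spearman_pct(site, target)
```

Pearson captures linear association while Spearman captures any monotone relationship, which is why the two rows above track each other closely while the tree-based XGBoost importances rank the sites quite differently.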

Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Brahma, B.; Wadhvani, R.
Solar Irradiance Forecasting Based on Deep Learning Methodologies and Multi-Site Data. *Symmetry* **2020**, *12*, 1830.
https://doi.org/10.3390/sym12111830
