
Multiple Site Intraday Solar Irradiance Forecasting by Machine Learning Algorithms: MGGP and MLP Neural Networks

1
School of Electrical, Mechanical, and Computer Engineering, Federal University of Goias (UFG), Goiania 74605-010, Brazil
2
Department of Energy, Politecnico di Milano, 20156 Milano, Italy
*
Authors to whom correspondence should be addressed.
Energies 2020, 13(11), 3005; https://doi.org/10.3390/en13113005
Received: 6 April 2020 / Revised: 7 May 2020 / Accepted: 29 May 2020 / Published: 11 June 2020
(This article belongs to the Special Issue Solar and Wind Power and Energy Forecasting)

Abstract
The forecasting of solar irradiance for photovoltaic power generation is an important tool for the integration of intermittent renewable energy sources (RES) into electrical utility grids. This study evaluates two machine learning (ML) algorithms for intraday solar irradiance forecasting: multigene genetic programming (MGGP) and the multilayer perceptron (MLP) artificial neural network (ANN). MGGP is a white-box evolutionary algorithm and a novel approach in the field. Persistence, MGGP and MLP were compared in forecasting irradiance at six locations, for horizons from 15 to 120 min, in order to compare these methods over a wide range of reliable results. The assessment of exogenous inputs indicates that the use of additional weather variables improves irradiance forecastability, yielding improvements of 5.68% in mean absolute error (MAE) and 3.41% in root mean square error (RMSE). It was also verified that iterative predictions improve MGGP accuracy. The obtained results show that location, forecast horizon and error metric definition affect model accuracy dominance. Both the Haurwitz and Ineichen clear sky models were implemented, and the results denote a low influence of these models on the prediction accuracy of multivariate ML forecasting. In a broad perspective, MGGP provided more accurate and robust results in single prediction cases, with faster solutions, while ANN provided more accurate results for ensemble forecasting, although with higher complexity and additional computational effort.

1. Introduction

The increased penetration of renewable energy sources (RES) in power systems has created a complex challenge for electric grid management [1,2,3], mainly due to highly intermittent sources such as solar irradiation and wind [4,5]. Climatic variations instantly influence the electric power generation of wind farms and photovoltaic (PV) systems and may put at risk the balance between load demand and power supply in the electrical grid. In this context, weather forecasting stands out as an important tool to guide the operation of electric power utility grids in the presence of intermittent RES [6,7].
Solar energy forecasting is normally classified in terms of two different forecasting horizons, namely intraday forecasting, from a few minutes to 6 h ahead, and day-ahead forecasting, where predictions are performed for the next day, as defined by the International Energy Agency (IEA) in their Solar Forecasting State of the Art Report [8]. The IEA report also states that statistical techniques such as time-series machine learning provide good results in the intraday context, while physical models based on numerical weather prediction (NWP) provide good results in the day-ahead context.
Numerous contributions by different authors have reviewed the field of solar forecasting research [9,10]. The performance of a given method recurrently varies with circumstances, and a variety of machine learning (ML) and statistical methods have been studied for intraday solar predictions; currently, ML is probably the most employed approach [9]. Research specifically focused on ML techniques was reviewed by Voyant et al. [11], who pointed out that the accuracy and robustness of ML forecasts depend on the training method and on the metric used to evaluate predictions [9]. Ranking these methods in the literature is a complex task due to the influence of the distinct data sets studied, time steps, forecasting horizons and performance indicators [11]. As a result, and also considering the significant errors observed in solar forecast results, this work investigates additional ML forecast methods at multiple locations in order to provide further results to support the solar forecasting research field [12].
Concerning ML methods applied to solar forecasting, artificial neural networks (ANN), support vector machines (SVM), k-nearest neighbors (kNN), random forest (RF) and gradient boosted regression (GBR) are the most employed techniques [13,14,15,16,17,18,19]. In particular, in short-term intraday solar forecasting applications, autoregressive methods of the autoregressive integrated moving average (ARIMA) class are also frequently used as statistical method to forecast normalized indexes of irradiance [20]; moreover, frequency domain models are commonly used in solar irradiance forecasting [21].
Studies on intraday solar forecasting also differ from each other in terms of the category of data input. The most frequent approaches are time-series point forecasts, where meteorological measurements are used as input [13,14,15,17,21], sky imagers from satellite or from ground [22,23,24], experiments combining multiple time-series based on spatio-temporal forecasts [25,26] and hybrid approaches combining different statistical methods or multiple data acquisition systems [27].
This study proposes multigene genetic programming (MGGP) as a novel state-of-the-art ML method applied to solar irradiance forecasting. A comparison between MGGP and the multilayer perceptron (MLP) artificial neural network (ANN) is carried out on the same basis for intraday irradiance forecasting at multiple locations. Forecasts are based on meteorological historical data to execute predictions in horizons from 15 to 120 min ahead. The accuracy, robustness, advantages and disadvantages of each ML method are highlighted in order to support future research in the field. The results used in the comparative study were achieved by the implementation of both ML methods in the Matlab® programming platform.

2. Databases

Six locations were evaluated in this study, as shown in Figure 1 and reported in the following paragraphs.

2.1. Goiania, Brazil

The database from Goiania was acquired using a meteorological station installed at the Federal University of Goias (UFG) in Goiania—the capital city of the State of Goias, Midwest Brazil—whose coordinates are latitude −16.67 (Southern Hemisphere) and longitude −49.24 (west); the station is located 749 m above sea level [28]. The three-year database, sampled every 60 s from August 2015 to July 2018, is available at https://sites.google.com/site/sfvemcufg/weather-station [28]. The database is subject to rigorous data quality control with monthly inspection of the equipment, and data are acquired and stored in a datalogger. Table 1 presents the configuration of the weather station setup for solar applications and research. The measured global horizontal irradiation was, respectively, 1929 kWh/m², 1913 kWh/m² and 1924 kWh/m² for each 12-month sequence from August 2015 to July 2018.

2.2. Milan, Italy

The database from Milan was acquired using a meteorological station installed at the SolarTech Lab of the Politecnico di Milano in Milan, Italy, whose coordinates are latitude +45.50 (Northern Hemisphere) and longitude +9.24 (east); the station is located 120 m above sea level [30]. The database was sampled every 60 s for 26 months, from September 2016 to October 2018 [30].

2.3. SURFRAD-US

The other four databases were obtained from sites in the United States of America, collected from the USA National Oceanic and Atmospheric Administration (NOAA) Surface Radiation Network (SURFRAD). The coordinates of these sites are as follows: Desert Rock, latitude +36.62 (north), longitude −116.02 (west), altitude 1007 m; Pennsylvania State University (PSU), latitude +40.72 (north), longitude −77.93 (west), altitude 376 m; Bondville, latitude +40.05 (north), longitude −88.37 (west), altitude 213 m; and Sioux Falls, latitude +43.73 (north), longitude −96.62 (west), altitude 473 m. These databases are public domain and have also been analyzed in a previous study of irradiance forecasting [21]. This study used weather data sets from NOAA’s SURFRAD averaged per minute from January 2013 to December 2015. Quality control of the measurements was performed by identifying and removing inconsistent values. The yearly global horizontal irradiations from January 2013 to December 2015 were, respectively, 2025, 2055 and 1916 kWh/m² for Desert Rock; 1317, 1350 and 1366 kWh/m² for PSU; 1438, 1458 and 1438 kWh/m² for Bondville; and 1410, 1384 and 1437 kWh/m² for Sioux Falls.

3. Data Processing

3.1. Normalization

Independent of whether forecasts are performed with the use of artificial intelligence methods or classical regressions, the data processing strategy and input–output scheme play a key role in developing improved forecasts. The first data processing strategy considered global horizontal irradiance (G) as a target value, combining past values of irradiance and weather variables in addition to deterministic variables (in order to capture temporal trends in datasets) [31].
The proposed approach was refined by adopting a data processing strategy that forecasts normalized indexes in order to remove seasonality in solar data, yielding faster ML algorithm convergence for irradiance forecasting. Values measured at night and at solar elevations (h) below 5° were discarded. Normalization of solar data is performed by applying Equation (1), where k_t* is the so-called clear sky index, G is the observed global horizontal irradiance (GHI) and G_clr is the theoretical clear sky irradiance.
k_t* = G / G_clr    (1)
Clear sky irradiance models in the literature range from simple functions of extraterrestrial irradiance to complex approaches that take numerous measured atmospheric parameters into account. The Haurwitz and Ineichen–Perez models were found to be simple and sufficiently accurate, and have been systematically employed to evaluate meteorological data from a large number of sites in the USA [32].
The Haurwitz clear sky irradiance model was developed in 1945 and is given by Equation (2), where θ z is the solar zenith angle (complementary to the solar elevation angle h). The constants 1098 and −0.057 were obtained by the least-squares method in order to fit measured cloudless sky irradiance data from a site in the USA to a theoretical curve based on a zenith angle exponential function. The exponential function is decreased by a factor proportional to cos θ z from sunrise to sunset.
G_clr = 1098 · cos θ_z · exp(−0.057 / cos θ_z)    (2)
The solar zenith angle is defined as the angle between the zenithal axis and the line to the sun. Thus, this angle varies instantly, according to the rotation movement of the Earth. The cosine of the solar zenith angle is obtained from Equation (3), where δ is the declination angle, ϕ is the latitude of the weather station location, and ω is the sun hour angle. A detailed definition and calculation of solar geometry variables is provided in [33].
cos θ_z = cos ϕ · cos δ · cos ω + sin ϕ · sin δ    (3)
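The normalization pipeline of Equations (1)–(3) can be sketched in a few lines. This is a minimal illustration, not the authors' implementation; the function and variable names are chosen here for clarity:

```python
import math

def cos_zenith(lat_deg, decl_deg, hour_angle_deg):
    """Cosine of the solar zenith angle, Eq. (3)."""
    phi, delta, omega = (math.radians(v) for v in (lat_deg, decl_deg, hour_angle_deg))
    return math.cos(phi) * math.cos(delta) * math.cos(omega) + math.sin(phi) * math.sin(delta)

def haurwitz_clear_sky(cos_theta_z):
    """Haurwitz clear-sky GHI in W/m^2, Eq. (2); zero when the sun is below the horizon."""
    if cos_theta_z <= 0.0:
        return 0.0
    return 1098.0 * cos_theta_z * math.exp(-0.057 / cos_theta_z)

def clear_sky_index(ghi, g_clr):
    """Clear-sky index k_t*, Eq. (1); guarded against division by zero."""
    return ghi / g_clr if g_clr > 0.0 else 0.0
```

For instance, at solar noon on the equator at an equinox (cos θ_z = 1) the Haurwitz model yields roughly 1037 W/m².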
Ineichen–Perez clear sky irradiance uses the optical air mass ratio (AM), atmospheric turbidity and the altitude of the location in clear sky irradiance modeling [34]. The Ineichen–Perez G_clr is calculated by Equation (4), where G_o is the extraterrestrial irradiance, h is the solar elevation angle, a_1, a_2, f_h1 and f_h2 are constant functions of the local altitude, T_L is the Linke turbidity factor and AM is the optical air mass ratio. The constants in Equations (5) and (6) were added empirically by Ineichen and Perez to improve previous clear sky models, which were logarithmically dependent on the Linke turbidity factor and limited to specific locations and zenith angles. T_L was obtained in this study from a map of monthly averaged values for each site [29]. In order to avoid discontinuities in the T_L and G_clr calculations, a daily fitting procedure was used, as presented in [35,36]. Figure 2 presents an example of the daily T_L fitting for Desert Rock.
G_clr = a_1 · G_o · sin h · exp[−a_2 · AM · (f_h1 + f_h2 · (T_L − 1))]    (4)
a_1 = 5.09·10^−5 · altitude + 0.868    (5)
a_2 = 3.92·10^−5 · altitude + 0.0387    (6)
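Equations (4)–(6) can likewise be sketched as follows. The altitude-correction terms f_h1 = exp(−altitude/8000) and f_h2 = exp(−altitude/1250) follow the standard Ineichen–Perez formulation and are an assumption here, since the text does not spell them out:

```python
import math

def ineichen_clear_sky(g_on, elev_deg, altitude_m, linke_turbidity, air_mass):
    """Ineichen-Perez clear-sky GHI in W/m^2, Eqs. (4)-(6).

    g_on: extraterrestrial irradiance G_o in W/m^2.
    elev_deg: solar elevation angle h in degrees.
    air_mass: optical air mass ratio AM.
    """
    sin_h = math.sin(math.radians(elev_deg))
    if sin_h <= 0.0:
        return 0.0                                    # sun below the horizon
    a1 = 5.09e-5 * altitude_m + 0.868                 # Eq. (5)
    a2 = 3.92e-5 * altitude_m + 0.0387                # Eq. (6)
    f_h1 = math.exp(-altitude_m / 8000.0)             # assumed standard corrections
    f_h2 = math.exp(-altitude_m / 1250.0)
    return a1 * g_on * sin_h * math.exp(
        -a2 * air_mass * (f_h1 + f_h2 * (linke_turbidity - 1.0)))
```

At sea level with T_L = 3, a 60° solar elevation and AM ≈ 1.15, the model returns on the order of 900 W/m², a plausible mid-latitude clear-sky value.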
Figure 3 presents an example of the data normalization. It is possible to observe how the normalized index removes daily and yearly seasonality and emphasizes the influence of both clouds and solar potential instantaneous variabilities.

3.2. Data Statistics

A general overview of the solar variability and statistics of each site is presented in Table 2, obtained by applying the Ineichen clear sky model to the 15 min averaged point databases. The results in Table 2 show that the training, validation and testing datasets present similar means and standard deviations for k_t*—an important requirement for implementing ML forecasting models. Desert Rock exhibits more clear sky conditions than the other locations, presenting the highest mean k_t* with the lowest standard deviations, while Milan presents the highest variabilities (σ).

3.3. Data Relations

The ML methods use a “multivariate” input data structure, as defined in [17], to forecast k_t* (single output), exploiting relations based on past values of the output, past values of weather variables and deterministic solar variables. Irradiances are then obtained by multiplying the normalized index outputs back by the respective clear sky irradiances.
The inputs are as follows:
- k_t*(−5)…(−60): the 12 past values of k_t* in 5 min average time windows.
- T_a(−5)…(−60): the 12 past values of ambient temperature in °C.
- W_s(−5)…(−60): the 12 past values of wind speed in m/s.
- H_r(−5)…(−60): the 12 past values of relative humidity in %.
- p_a(−5)…(−60): the 12 past values of atmospheric pressure in mbar.
- h: the solar elevation angle of the forecast time window in radians, varying from around 0.0873 (5°) to 1.5708 (90°).
- t_s: the time difference with respect to sunrise in minutes.
- ω_s: the solar time angle in radians.
- “Day”: the day of the forecast interval. Days of the year are counted starting one day after the winter solstice and ending on the winter solstice of the next year. This definition was adopted to follow the solar cycle from the day of lowest irradiance levels, since the traditional day count has no direct mathematical relation to the evolution of solar variables throughout the year.
- “Month”: the month of the forecast interval, varying from 1 to 12.
Solar deterministic variables are calculated by deterministic mathematical models that represent the solar time-based behavior of solar quantities. These variables are computed in solar time instead of legal time and are directly proportional to the irradiance and its indexes. The data processing methods presented in this section yielded the most accurate results when used by the authors for the analyzed databases.
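As a minimal sketch of the input scheme above, one forecast sample could be assembled as follows; the function and argument names are hypothetical:

```python
def build_features(kt_hist, ta_hist, ws_hist, hr_hist, pa_hist,
                   elev_rad, t_sunrise_min, omega_s_rad, day, month):
    """Assemble one multivariate input vector: 12 past values of each
    measured series (5 min averages) plus the deterministic solar variables."""
    assert all(len(h) == 12 for h in (kt_hist, ta_hist, ws_hist, hr_hist, pa_hist))
    return (list(kt_hist) + list(ta_hist) + list(ws_hist) + list(hr_hist)
            + list(pa_hist) + [elev_rad, t_sunrise_min, omega_s_rad, day, month])
```

Each sample therefore has 5 × 12 + 5 = 65 inputs, paired with a single k_t* target at the forecast horizon.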

4. Forecast Methods

4.1. Genetic Programming

Genetic programming (GP) is an artificial intelligence technique which was originally proposed by Koza [37] in the evolutionary computation field; it is considered as an extension of genetic algorithms. GP is inspired by population genetics and biological evolution at the population level [38] (Algorithm 1). GP has proved to be competitive in time series forecasting in relation to other statistical techniques based on artificial intelligence, such as ANN and the support vector machine (SVM) [39,40,41]. It has been applied in numerous studies of predictions of natural resources—e.g., hydrology [42,43]—and has also been applied to daily or monthly solar irradiance forecasting in PV power systems [44,45,46].
MGGP and MLP neural networks were compared to other forecasting methods, considering that an ANN comprises adjustable parameters that can be trained with optimization techniques to solve classification and regression problems, while GP is a stochastic optimization method. When GP is used to build a mathematical model from sampled data with the aim of predicting future values, it is called symbolic regression (SR). GP models are typically described as in Equation (7), where y is the observed output variable, ŷ is the predicted output, and x_1 … x_n are the observed input variables. In contrast to other soft computing methodologies, such as feed-forward ANNs and SVMs, trained GP models are plain constitutive equations that can be implemented without a specific software environment in any modern programming language.
ŷ = f(x_1, …, x_n)    (7)
GP models can be classified into three different categories according to their mathematical model complexity: naive SR, when the model requires only one gene to relate input data with output data; scaled SR, when the model employs one gene associated to a bias term to relate input and output data; and multigene SR, when the GP uses multiple genes and a bias term to relate input and output data (Figure 4).
Algorithm 1: Genetic programming pseudocode
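As a concrete, simplified illustration of the evolutionary loop in Algorithm 1, the following toy sketch evolves expression trees with tournament selection, elitism and subtree mutation. Crossover and multigene weighting are omitted for brevity, so this is a didactic assumption-laden sketch, not the authors' MGGP implementation:

```python
import math
import random

# Function set: (name, arity, callable)
FUNCS = [("add", 2, lambda a, b: a + b),
         ("mul", 2, lambda a, b: a * b),
         ("tanh", 1, math.tanh)]

def random_tree(depth, n_vars):
    """Grow a random expression tree; terminals are input variables or constants."""
    if depth == 0 or random.random() < 0.3:
        if random.random() < 0.5:
            return ("x", random.randrange(n_vars))   # input variable
        return ("c", random.uniform(-1.0, 1.0))      # random constant
    name, arity, _ = random.choice(FUNCS)
    return (name,) + tuple(random_tree(depth - 1, n_vars) for _ in range(arity))

def evaluate(tree, x):
    tag = tree[0]
    if tag == "x":
        return x[tree[1]]
    if tag == "c":
        return tree[1]
    fn = next(f for n, _, f in FUNCS if n == tag)
    return fn(*(evaluate(child, x) for child in tree[1:]))

def fitness_rmse(tree, X, y):
    """RMSE fitness, as in Eq. (9)."""
    errs = [(evaluate(tree, xi) - yi) ** 2 for xi, yi in zip(X, y)]
    return math.sqrt(sum(errs) / len(errs))

def mutate(tree, n_vars):
    """Replace a randomly chosen subtree with a freshly grown one."""
    if tree[0] in ("x", "c") or random.random() < 0.3:
        return random_tree(2, n_vars)
    i = random.randrange(1, len(tree))
    return tree[:i] + (mutate(tree[i], n_vars),) + tree[i + 1:]

def gp_run(X, y, pop=40, gens=20, tour=4, n_elite=2):
    """Evolve a population and return the best individual found."""
    n_vars = len(X[0])
    population = [random_tree(3, n_vars) for _ in range(pop)]
    for _ in range(gens):
        ranked = sorted(population, key=lambda t: fitness_rmse(t, X, y))
        nxt = ranked[:n_elite]                               # elitism
        while len(nxt) < pop:
            contenders = random.sample(population, tour)     # tournament selection
            winner = min(contenders, key=lambda t: fitness_rmse(t, X, y))
            nxt.append(mutate(winner, n_vars))
        population = nxt
    return min(population, key=lambda t: fitness_rmse(t, X, y))
```

Running `gp_run` on a small sample of (inputs, k_t* target) pairs returns the lowest-RMSE expression tree found, which in a full MGGP implementation would be extended with crossover and least-squares gene weighting.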
Figure 4 illustrates a population individual and a multigene GP model, using Equation (8), where a bias term d_0 is added to two genes with weights d_1 and d_2 in a tree structure. The terms “plus”, “times”, “square root” and “tanh” are known as node functions. Both the weights and the nodes are obtained in the GP training procedure.
ŷ = d_0 + d_1 · (0.41·x_1 + tanh(x_2·x_3)) + d_2 · (0.45·x_3 + x_2)    (8)
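A multigene model such as Equation (8) evaluates as a weighted sum of gene outputs plus a bias. A minimal sketch, with illustrative (not fitted) weights d_0, d_1, d_2:

```python
import math

def multigene_predict(x1, x2, x3):
    """Evaluate the two-gene model of Eq. (8); weights are illustrative only."""
    d0, d1, d2 = 0.1, 0.8, 0.3              # bias and gene weights (assumed values)
    gene1 = 0.41 * x1 + math.tanh(x2 * x3)  # first gene tree
    gene2 = 0.45 * x3 + x2                  # second gene tree
    return d0 + d1 * gene1 + d2 * gene2
```

In actual MGGP training, d_0, d_1 and d_2 are fitted by least squares against the training data rather than fixed by hand.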
GP evolves a population of candidate solutions (population size) in multiple generations by the application of genetic operators with a tournament selection of best individuals. A crossover operation exchanges genes between individuals to assess possible structural improvements of individuals. Mutation is a fine adjustment operation that changes pieces or entire genes into new, random ones to evaluate a possible structural improvement in terms of fitness. Bias and gene weights of individuals are then optimized in terms of least root mean square errors applied to training data according to Equation (9). Applying an elitism strategy with a given elitism rate, a percentage of best fitness solutions is stored over generations. Based on these procedures, GP evaluates thousands of possible regression structures with optimized weights to relate inputs and outputs. Table 3 summarizes the parameters adopted in GP, which are considered again in Section 6.
s* = min √( (1/N_samp) · Σ_{i=1}^{N_samp} (y_i − ŷ_i)² )    (9)
The dynamics of GP solutions are characterized by generalization ability, providing solutions that are both accurate and robust in training and on other datasets. ANN, on the other hand, is highly prone to overfitting, which is usually controlled by a validation step named early stopping, while GP does not require a validation step during SR model training. Figure 5 presents the performance of the best individuals evolved over the generations of the GP forecasts; the robustness of the solutions carries over from the training to the validation datasets. The MGGP models were implemented in GPTIPS 2, an open-source GP platform for Matlab® [47].

4.2. Artificial Neural Networks

A feed-forward multilayer perceptron (MLP) neural network architecture was applied in this analysis, with one hidden layer of 10 neurons using the hyperbolic tangent sigmoid transfer function. The neural networks were trained with the Levenberg–Marquardt algorithm, including early stopping, implemented in Matlab® using the neural networks toolbox. The employed set of ANN attributes was previously validated for intraday solar forecasting [14].

4.3. Ensemble Forecasts

Ensemble forecast models are convenient to build from multiple ML runs and tend to improve forecast accuracy [48]. The ensemble forecast is given by Equation (10), where N_trial is the number of trials of the given ML method. In this analysis, the internal parameters of GP and ANN did not vary between trials, and 10 trials were performed to produce each ensemble, according to the methodology described in [48].
Ĝ_ens = (1/N_trial) · Σ_{i=1}^{N_trial} Ĝ_i    (10)
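Equation (10) is a plain average over trials; a minimal sketch:

```python
def ensemble_forecast(trial_forecasts):
    """Average point forecasts over N_trial independent ML runs, Eq. (10).

    trial_forecasts: list of N_trial forecast series, each a list of values
    aligned on the same time steps.
    """
    n_trial = len(trial_forecasts)
    return [sum(step) / n_trial for step in zip(*trial_forecasts)]
```

For example, averaging two trial series [1.0, 2.0] and [3.0, 4.0] gives the ensemble series [2.0, 3.0].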

4.4. Iterative Forecasts

Rana et al. [17] evaluated a forecast method where predictions for instant t+1 are iteratively fed back as inputs to the predictions for instant t+2. In their study of PV power forecasting with ANN ensembles and SVM, the iterative method did not improve forecasts. In this work, however, the iterative method was tested with GP and yielded improved forecasting results, while for ANNs no significant improvement was achieved by iteration, in line with [17]. Therefore, the results reported here were obtained using iterative predictions for MGGP.

4.5. Persistence

Persistence forecasting is a common benchmark technique used to evaluate intraday solar forecasting. Persistence forecasting can be computed by Equation (11), where G ^ ( t + Δ T ) is the persistence forecast and Δ T is the forecast horizon, which varies from 15 to 120 min; k t * ( t ) is the present clear sky index; and G c l r ( t + Δ T ) is the clear sky irradiance at the horizon of the forecast.
Ĝ(t + ΔT) = k_t*(t) · G_clr(t + ΔT)    (11)
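A minimal sketch of the smart persistence benchmark of Equation (11):

```python
def persistence_forecast(kt_now, g_clr_future):
    """Smart persistence, Eq. (11): hold the clear-sky index constant
    over the horizon and rescale by the future clear-sky irradiance."""
    return kt_now * g_clr_future
```

For instance, with a present clear-sky index of 0.8 and a clear-sky irradiance of 900 W/m² at the forecast horizon, the persistence forecast is 720 W/m².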

5. Error Metrics

Although there are many error metrics used in the field of solar forecasting, this study assumed the most traditional metrics in the literature: the root mean squared error (RMSE) given by Equation (12), the mean absolute error (MAE) computed by Equation (13) and the forecast skill given by Equation (14).
RMSE = √( (1/N_samp) · Σ_{i=1}^{N_samp} (y_i − ŷ_i)² )    (12)
MAE = (1/N_samp) · Σ_{i=1}^{N_samp} |y_i − ŷ_i|    (13)
s = 1 − RMSE_forecast / RMSE_persistence    (14)
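The three metrics of Equations (12)–(14) translate directly into code; a minimal sketch:

```python
import math

def rmse(y, y_hat):
    """Root mean squared error, Eq. (12)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(y, y_hat)) / len(y))

def mae(y, y_hat):
    """Mean absolute error, Eq. (13)."""
    return sum(abs(a - b) for a, b in zip(y, y_hat)) / len(y)

def forecast_skill(rmse_forecast, rmse_persistence):
    """Forecast skill, Eq. (14): positive skill means the model beats persistence."""
    return 1.0 - rmse_forecast / rmse_persistence
```

A model with half the RMSE of persistence thus has a forecast skill of 0.5.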
While MAE is a linear error metric, RMSE is a quadratic error metric that penalizes large forecast errors more heavily. RMSE is of particular interest when evaluating RES forecasting, since ramp behavior is a relevant issue in PV power system operation.

6. Results and Discussion

6.1. GP Tuning

Initial simulations were intended to analyze the influence of GP parameters in forecast accuracy and robustness. The analysis of parametric influence is known as the parameter tuning of evolutionary algorithms (EAs), as described in [49]. Parameter tuning is by nature an optimization task comprising multiple variables (parameters). In current analyses of multiple horizon forecasts, each forecast horizon at each location consists of a different problem to be tuned. In order to reduce the number of simulations to assess GP parameters, this study considered prior knowledge from other studies to seek good parameter choices to perform a lower number of simulations. Therefore, parameter assessment was carried out only for one forecast horizon using the dataset from Goiania station. Thus, parameter settings from Goiania were used in forecasting models for other sites.
Lima et al. [50] performed a systematic analysis of GP indicating that the population size, number of generations and tree size are the main parameters influencing fitness, while genetic operators have a lower influence. Increasing the size limit of the regression functions tends to improve fitness; however, an excessively large size limit leads to bloat (function size growth without fitness improvement) [51]. Bloat can be mitigated by using realistic elitism rates [52]. In summary, lower tournament sizes and lower elitism rates lead to a higher diversity of solutions.
According to the literature review and some former analyses of irradiance forecasting, the maximum number of genes was set at 5, the tree depth at 4, the number of generations at 150 and population size at 300. These parameters presented a good trade-off between complexity and fitness improvement. Figure 6 presents the improvement of solution fitness in the validation dataset from Goiania station versus the increase in complexity (increasing the maximum number of genes).
Genetic operators were analyzed through multiple simulations for a forecast horizon of 90 min ahead, as this is a demanding time window for prediction and consequently presents high variability across algorithm simulations. The results for accuracy and robustness are given in Figure 7. The number of generations was lowered to 50 during these tests in order to obtain higher variability of results. The best accuracy and robustness (standard deviation over multiple simulations) were achieved with higher mutation rates, lower tournament sizes and lower elitism rates. Therefore, we selected the setting with the lowest RMSE: κ = 6, p_m = 0.12 and elitism rate = 0.30.

6.2. Assessment of Exogenous Input

ANN and GP were executed for all previously defined locations and forecast horizons, both considering and neglecting the weather variables Hr, Ta, Ws and pa. The error improvement index Improv_error, defined in Equation (15), assesses the improvement yielded by the addition of weather variables for a given error metric, where error_univ is the forecast error obtained from past values of k_t* with the sole addition of deterministic variables, and error_multiv is the forecast error obtained by also including weather variables. It is worth highlighting that the deterministic variables alone already improve forecasts based merely on past values of k_t*.
Improv_error = (error_univ − error_multiv) / error_univ · 100%    (15)
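Equation (15) reduces to a one-line percentage computation; a minimal sketch:

```python
def improvement_pct(error_univ, error_multiv):
    """Eq. (15): percentage error reduction obtained by adding exogenous
    weather inputs to the univariate (k_t* plus deterministic) model."""
    return (error_univ - error_multiv) / error_univ * 100.0
```

For example, reducing an error of 100 to 94.32 corresponds to the 5.68% MAE improvement reported in the text.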
Improvements were calculated both in terms of MAE and RMSE, as shown in Figure 8. The graphs represent typical behaviors: weather variables generally improve forecastability at all locations, by up to 5.68% in terms of MAE and 3.41% in terms of RMSE, although at some locations negative improvements were obtained for the shorter forecast horizons from 15 to 60 min. Since, overall, the addition of weather variables tends to improve forecastability at all locations, the results obtained by the multivariate forecasts are reported.

6.3. Specific Results

Complete results for each forecast horizon and location are presented in Appendix A. The most accurate results are shown in bold for both the single and ensemble forecast comparisons. Model accuracy dominance depends on the location, forecast horizon and error metric, as summarized in Figure 9. The results point toward ANN as the most accurate method for short horizons and GP as the most accurate for longer horizons, where it also predominantly improves robustness. Furthermore, location attributes have proven to affect model dominance. Figure 10 presents the forecast accuracies of both methods at the Goiania station, where the most accurate results were obtained by ANN, and Figure 11 displays the results for the Desert Rock station, where the most accurate results were obtained by GP.
Both the GP and ANN methods were consistently improved, in terms of both error metrics, by employing an ensemble strategy for each forecast horizon and location. ANN showed the more significant improvement and superior accuracy with the ensemble strategy in most cases, as summarized for model accuracy dominance in Figure 12. GP_ens led to the most accurate results in eight cases out of 48, while ANN_ens yielded the most accurate results in 23 cases out of 48. GP_ens achieved the most accurate results at the Milan station for horizons from 15 to 45 min and from 105 to 120 min using MAE as the reference metric. At the Desert Rock station, GP_ens attained the lowest RMSE for horizons from 30 to 120 min. At the Bondville station, GP_ens accomplished the lowest RMSE for horizons from 90 to 120 min and the lowest MAE for horizons from 75 to 120 min. At the PSU station, GP_ens led to the lowest MAE and RMSE for horizons from 105 to 120 min. At the Sioux Falls station, GP_ens yielded the lowest RMSE for horizons from 105 to 120 min and the lowest MAE for the 45 min horizon and horizons from 90 to 120 min.
ANN_ens proved consistently effective in the forecasts for the Goiania weather station, as expected: the lower variation in sunshine duration throughout the year leads to a dataset that is less biased toward overfitting, since night-period points are excluded from the dataset during the processing stage.
Comparing the results obtained with the Haurwitz and Ineichen clear sky index forecasts, it can be concluded that Ineichen k_t* persistence produces lower errors than Haurwitz for most locations and prediction horizons. Nevertheless, as the AI methods used here are improved by exogenous inputs, no clear dominance of either clear sky model was observed in the GP and ANN results.

6.4. Generic Results

The computation of averages over multiple results is widely employed as a procedure to achieve reliable generalized results, according to Rana et al. [17], although the use of averages does not diminish the importance of specific results. MAE and RMSE averages over all forecast horizons and locations were calculated in order to carry out a generic evaluation of the accuracy of GP and ANN; the results are presented in Figure 13. The average robustness in terms of MAE and RMSE was determined similarly, and the results are presented in Figure 14. From the generalized results, it can be concluded that: GP presents more accurate and robust forecast results than ANN for single forecasts; the ensemble strategy improves ANN forecasts more significantly than GP; the ANN ensemble generally presents the most accurate results; and both models produce similar forecastability, with little difference in terms of accuracy. This indicates that GP can provide faster, reliable and accurate predictions with lower computational complexity, while ANN can provide more accurate predictions at the cost of higher complexity and a more time-demanding strategy.
A general comparison of clear sky indexes across the sites is given in Table 4. The analysis shows that the difference between the Haurwitz k_t* and Ineichen k_t* forecast results is negligible, confirming the low influence of the clear sky model on the accuracy of multivariate forecast results.

6.5. Regression Functions

Equation (16) presents an example of a regression function developed to forecast k_t*(15), comprising a combination of the deterministic variable ω_s with previous values of k_t* and the weather variables T_a and H_r. The algorithm proved efficient in selecting suitable variables to achieve accurate and robust models with generalization ability. The variables selected to develop the regressions for the Goiania station are listed in Table 5.
k_t*(15) = 0.535 + 0.98 tanh(k_t*(5)) − 0.0049 [T_a(45) + ω_s · k_t*(20)]
           − 0.142 [e^(k_t*(35)) · k_t*(50) + cos(k_t*(20))] − 0.00141 H_r(5)
           + 0.0244 [e^(ω_s) − k_t*(5) − k_t*(35)] + 0.00249 ω_s · k_t*(20) · e^(e^(ω_s))   (16)
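A multigene model of this form is a bias term plus a weighted sum of evolved gene outputs, ŷ = b0 + Σ b_i·g_i(x), with the weights fitted by least squares during evolution. The sketch below evaluates a hypothetical two-gene model; the genes and weights are invented for illustration and are not those of Equation (16).

```python
import math

# Each gene is a function of the input features; MGGP combines them as
#   y_hat = b0 + b1*g1(x) + b2*g2(x) + ...
genes = [
    lambda x: math.tanh(x["kt_5"]),               # hypothetical gene 1
    lambda x: x["Ta_45"] + x["ws"] * x["kt_20"],  # hypothetical gene 2
]
weights = [0.5, 0.9, -0.005]  # [b0, b1, b2], illustrative values

def predict(x):
    # Weighted sum of gene outputs plus the bias term
    y = weights[0]
    for b, gene in zip(weights[1:], genes):
        y += b * gene(x)
    return y

x = {"kt_5": 0.8, "kt_20": 0.75, "Ta_45": 25.0, "ws": 0.4}
y_hat = predict(x)
```

Because the fitted model is just such a closed-form expression, it can be re-implemented in any language without the training environment, which is the portability advantage noted in the conclusions.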

6.6. Comparison with the State-of-the-Art

A recent analysis of intraday solar irradiance forecasting at the SURFRAD weather stations was carried out using regression and frequency domain models [21]. A direct comparison of the results obtained by the regression, frequency domain and MGGP models is presented in Table 6. Reikard and Hansen [21] analyzed forecasts for the same years, based on the same historical data used here. Although the dataset subsets used in each analysis are not guaranteed to be identical, the direct comparison supports the soundness of the GP prediction results.

6.7. Machine Learning Algorithm Training Speed

Training machine learning algorithms to optimize results and accuracy is normally a time-consuming task. Table 7 presents a comparison of the average training times (in minutes) measured for the Goiania station for each forecast horizon; similar results were obtained for the other stations. Although MGGP has been demonstrated to be more robust for single forecasts, it trains more slowly than the ANN. Improvements to the MGGP parameter tuning strategy should be considered in future studies in order to speed up MGGP training.

7. Conclusions

Machine learning algorithms are extensively adopted techniques for solar forecasting. This study proposed and evaluated multigene genetic programming (MGGP), a white-box machine learning algorithm that is novel in this field, for intraday solar irradiance forecasting. MGGP derives analytical regression functions that can be implemented in any modern programming language on basic hardware, without a specific software environment, and it consistently demonstrated data generalization ability, providing robust and reliable solutions. The MGGP algorithm and a state-of-the-art MLP artificial neural network (ANN) were applied to datasets from six locations in three countries in order to compare results for forecast horizons from 15 to 120 min.
Data processing strategies were carefully analyzed in terms of input and output alternatives. Initial simulations were carried out for solar irradiance forecasting using 15 min time windows as input data. Five minute time-window data and the Haurwitz and Ineichen clear sky indexes were then considered, combined with deterministic solar variables and weather variables, in order to identify the data processing strategy that yields the best forecast accuracy.
The computation of MAE and RMSE as error metrics showed that the location, the forecast horizon and the choice of error metric all affect which model is dominant in terms of accuracy. MGGP and ANN typically yielded similar and consistent results. For single forecasts, MGGP produced more accurate and robust results than ANN. Ensemble forecasting significantly improved both MGGP and ANN predictions, although it improved ANN more extensively than MGGP. Regarding ensemble forecasts, MGGP was more accurate than ANN at fewer locations and forecast horizons, presenting the best forecast skill at the Desert Rock station, and it predominantly achieved more accurate predictions at the longer forecast horizons, from 90 to 120 min ahead, for different localities.
Based on a direct comparison with other state-of-the-art forecasting methods applied to the same locations in the USA, MGGP presented a relevant reduction in error and proved to be a reliable and accurate approach for the analyzed localities.
The attributes of a locality were shown to affect model dominance, indicating that both MGGP and ANN can be applied to different locations. Future investigations may address hybridization strategies, ML algorithm improvements, advanced data processing strategies applied to MGGP forecasting and improvements in parameter tuning to increase MGGP's training speed, as well as additional analyses of other solar parameters with the aim of improving the accuracy and performance of forecasting techniques.

Author Contributions

Conceptualization, G.M.d.P., S.P.P. and M.M.; methodology, G.M.d.P., S.P.P. and M.M.; software, G.M.d.P., S.P.P., B.P.A. and M.M.; validation, G.M.d.P., E.G.M. and M.M.; formal analysis, G.M.d.P. and M.M.; investigation, G.M.d.P., S.P.P. and S.L.; resources, E.G.M., S.P.P., M.M. and S.L.; data curation, E.G.M., S.P.P., M.M. and S.L.; writing—original draft preparation, G.M.d.P.; writing—review and editing, B.P.A., E.G.M., S.P.P., M.M. and S.L.; visualization, G.M.d.P. and M.M.; supervision, B.P.A. and S.L.; project administration, S.P.P., B.P.A. and M.M.; funding acquisition, G.M.d.P., B.P.A. and S.P.P. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Fundação de Amparo à Pesquisa do Estado de Goiás grant number 154/2017.

Acknowledgments

The authors thank the Federal University of Goias (UFG), through the Electrical, Mechanical and Computer Engineering School (EMC), for providing access to and scientific usage of the research data. The authors also thank Espora Energetica S/A, the proponent company of the R&D project related to this work, and the respective cooperative companies, Transenergia Sao Paulo S/A, Transenergia Renovavel S/A and Caldas Novas Transmissao S/A. The authors also thank Politecnico di Milano and the National Oceanic and Atmospheric Administration for providing access to and scientific usage of their weather data.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
AM: Air mass
ANN: Artificial neural network
ARIMA: Autoregressive integrated moving average
EA: Evolutionary algorithms
GBR: Gradient boosted regression
GP: Genetic programming
IEA: International Energy Agency
kNN: k-Nearest neighbors
MAE: Mean absolute error
MGGP: Multigene genetic programming
ML: Machine learning
MLP: Multilayer perceptron
NOAA: National Oceanic and Atmospheric Administration
NWP: Numerical weather prediction
PSU: Pennsylvania State University
PV: Photovoltaic
RES: Renewable energy sources
RF: Random forest
RMSE: Root mean squared error
SURFRAD: Surface Radiation Network
SVM: Support vector machine

Appendix A. Errors Obtained for Each Location, ML Algorithm and Forecast Horizon

Table A1. Forecast errors for Goiania. Persist: persistence; FH: forecast horizon (min); s: forecast skill over persistence (%). Columns 3–7: Haurwitz k_t*; columns 8–12: Ineichen k_t*.

| FH | Method | s | RMSE | σRMSE | MAE | σMAE | s | RMSE | σRMSE | MAE | σMAE |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 15 | Persist | | 120.67 | | 64.13 | | | 120.64 | | 63.85 | |
| | GP | 14.51 | 103.16 | 0.28 | 60.32 | 0.29 | 14.29 | 103.40 | 0.37 | 60.09 | 0.62 |
| | ANN | 15.46 | 102.02 | 0.27 | 59.97 | 0.44 | 15.08 | 102.45 | 0.37 | 60.19 | 0.45 |
| | GPens | 14.90 | 102.68 | | 59.89 | | 14.53 | 103.12 | | 59.83 | |
| | ANNens | 16.05 | 101.30 | | 59.09 | | 15.87 | 101.50 | | 59.16 | |
| 30 | Persist | | 151.11 | | 85.42 | | | 150.81 | | 84.59 | |
| | GP | 13.75 | 130.34 | 0.52 | 82.36 | 0.61 | 13.56 | 130.36 | 0.29 | 82.31 | 0.37 |
| | ANN | 14.87 | 128.64 | 0.59 | 80.49 | 0.72 | 14.49 | 128.95 | 0.39 | 81.02 | 0.90 |
| | GPens | 14.11 | 129.79 | | 81.84 | | 13.94 | 129.78 | | 81.81 | |
| | ANNens | 15.52 | 127.65 | | 79.34 | | 15.28 | 127.76 | | 79.54 | |
| 45 | Persist | | 163.39 | | 96.35 | | | 162.72 | | 94.92 | |
| | GP | 15.46 | 138.13 | 0.31 | 90.12 | 0.36 | 14.93 | 138.43 | 0.44 | 90.94 | 0.28 |
| | ANN | 15.65 | 137.82 | 0.37 | 89.29 | 0.40 | 15.03 | 138.26 | 0.50 | 89.86 | 0.65 |
| | GPens | 15.56 | 137.96 | | 89.67 | | 15.06 | 138.21 | | 90.56 | |
| | ANNens | 16.40 | 136.60 | | 87.99 | | 16.03 | 136.64 | | 88.43 | |
| 60 | Persist | | 170.94 | | 103.64 | | | 169.82 | | 101.44 | |
| | GP | 16.13 | 143.36 | 0.40 | 96.00 | 0.42 | 15.92 | 142.78 | 0.26 | 95.69 | 0.46 |
| | ANN | 16.21 | 143.22 | 0.75 | 93.98 | 1.21 | 15.77 | 143.04 | 0.50 | 94.52 | 0.69 |
| | GPens | 16.45 | 142.82 | | 95.61 | | 16.07 | 142.53 | | 95.44 | |
| | ANNens | 17.03 | 141.83 | | 92.46 | | 16.36 | 142.04 | | 93.90 | |
| 75 | Persist | | 178.00 | | 110.33 | | | 176.44 | | 107.51 | |
| | GP | 17.36 | 147.11 | 0.45 | 99.50 | 0.41 | 16.70 | 146.98 | 0.25 | 99.56 | 0.41 |
| | ANN | 17.34 | 147.15 | 0.66 | 98.45 | 0.79 | 16.42 | 147.47 | 0.52 | 98.81 | 0.50 |
| | GPens | 17.62 | 146.65 | | 98.97 | | 16.80 | 146.79 | | 99.35 | |
| | ANNens | 18.14 | 145.71 | | 96.98 | | 17.25 | 146.00 | | 97.42 | |
| 90 | Persist | | 185.30 | | 117.29 | | | 183.28 | | 113.91 | |
| | GP | 18.86 | 150.35 | 0.40 | 102.47 | 0.45 | 18.18 | 149.96 | 0.40 | 102.16 | 0.63 |
| | ANN | 19.08 | 149.95 | 0.43 | 101.11 | 0.68 | 18.03 | 150.23 | 0.28 | 101.49 | 0.88 |
| | GPens | 19.09 | 149.91 | | 102.13 | | 18.35 | 149.65 | | 101.73 | |
| | ANNens | 19.88 | 148.46 | | 99.57 | | 18.84 | 148.76 | | 99.91 | |
| 105 | Persist | | 192.48 | | 123.01 | | | 190.04 | | 119.03 | |
| | GP | 20.71 | 152.61 | 0.35 | 104.21 | 0.53 | 19.91 | 152.21 | 0.36 | 104.13 | 0.35 |
| | ANN | 20.79 | 152.45 | 0.50 | 102.52 | 0.56 | 19.45 | 153.09 | 0.81 | 103.82 | 1.30 |
| | GPens | 20.94 | 152.16 | | 103.84 | | 20.07 | 151.91 | | 103.82 | |
| | ANNens | 21.46 | 151.18 | | 101.29 | | 20.20 | 151.65 | | 102.47 | |
| 120 | Persist | | 199.62 | | 128.73 | | | 196.80 | | 124.08 | |
| | GP | 22.65 | 154.40 | 0.27 | 106.04 | 0.32 | 21.58 | 154.33 | 0.37 | 105.85 | 0.49 |
| | ANN | 22.27 | 155.17 | 0.92 | 105.71 | 1.01 | 21.12 | 155.24 | 0.77 | 106.49 | 1.00 |
| | GPens | 22.80 | 154.10 | | 105.79 | | 21.77 | 153.96 | | 105.50 | |
| | ANNens | 23.09 | 153.52 | | 104.23 | | 21.73 | 154.03 | | 105.26 | |
Table A2. Forecast errors for Milan. Persist: persistence; FH: forecast horizon (min); s: forecast skill over persistence (%). Columns 3–7: Haurwitz k_t*; columns 8–12: Ineichen k_t*.

| FH | Method | s | RMSE | σRMSE | MAE | σMAE | s | RMSE | σRMSE | MAE | σMAE |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 15 | Persist | | 76.02 | | 37.43 | | | 75.97 | | 37.33 | |
| | GP | 11.63 | 67.18 | 0.63 | 34.22 | 0.38 | 11.35 | 67.34 | 0.63 | 34.06 | 0.30 |
| | ANN | 12.54 | 66.48 | 1.09 | 35.11 | 0.31 | 12.06 | 66.80 | 1.04 | 36.11 | 0.25 |
| | GPens | 12.23 | 66.72 | | 33.84 | | 11.95 | 66.89 | | 33.69 | |
| | ANNens | 14.81 | 64.76 | | 34.11 | | 14.01 | 65.32 | | 34.18 | |
| 30 | Persist | | 99.41 | | 51.12 | | | 99.29 | | 50.80 | |
| | GP | 10.80 | 88.67 | 0.32 | 48.95 | 0.44 | 9.62 | 89.74 | 0.35 | 50.18 | 0.38 |
| | ANN | 10.86 | 88.61 | 0.71 | 50.23 | 0.51 | 9.24 | 90.12 | 1.80 | 51.49 | 1.09 |
| | GPens | 11.12 | 88.36 | | 48.65 | | 10.12 | 89.24 | | 49.73 | |
| | ANNens | 12.62 | 86.87 | | 48.90 | | 11.82 | 87.55 | | 48.92 | |
| 45 | Persist | | 110.31 | | 59.34 | | | 110.12 | | 58.76 | |
| | GP | 10.70 | 98.50 | 0.20 | 56.46 | 0.42 | 9.18 | 100.01 | 0.38 | 56.92 | 0.52 |
| | ANN | 10.40 | 98.83 | 0.64 | 58.27 | 0.64 | 8.51 | 100.75 | 0.81 | 59.04 | 0.65 |
| | GPens | 10.87 | 98.32 | | 56.25 | | 9.46 | 99.70 | | 56.63 | |
| | ANNens | 12.01 | 97.06 | | 56.94 | | 11.02 | 97.99 | | 57.17 | |
| 60 | Persist | | 120.03 | | 66.08 | | | 119.82 | | 65.28 | |
| | GP | 11.58 | 106.12 | 0.22 | 62.04 | 0.49 | 11.09 | 106.52 | 0.35 | 62.78 | 0.39 |
| | ANN | 11.55 | 106.16 | 0.61 | 63.80 | 0.79 | 10.48 | 107.27 | 0.78 | 63.56 | 0.47 |
| | GPens | 11.83 | 105.83 | | 61.75 | | 11.52 | 106.02 | | 62.70 | |
| | ANNens | 12.87 | 104.58 | | 62.50 | | 11.91 | 105.54 | | 61.69 | |
| 75 | Persist | | 128.66 | | 72.05 | | | 128.54 | | 71.04 | |
| | GP | 12.80 | 112.18 | 0.34 | 67.43 | 0.32 | 12.36 | 112.66 | 0.65 | 67.20 | 0.50 |
| | ANN | 12.67 | 112.36 | 0.90 | 67.97 | 0.72 | 11.76 | 113.42 | 1.48 | 69.28 | 1.15 |
| | GPens | 12.44 | 112.65 | | 67.22 | | 12.50 | 112.47 | | 67.03 | |
| | ANNens | 13.94 | 110.72 | | 66.63 | | 13.35 | 111.38 | | 67.63 | |
| 90 | Persist | | 136.10 | | 77.66 | | | 136.15 | | 76.32 | |
| | GP | 13.60 | 117.59 | 0.40 | 70.97 | 0.29 | 13.93 | 117.18 | 0.66 | 71.16 | 0.49 |
| | ANN | 13.52 | 117.69 | 0.53 | 72.62 | 1.02 | 13.39 | 117.91 | 0.85 | 72.66 | 1.37 |
| | GPens | 13.87 | 117.22 | | 70.78 | | 14.35 | 116.61 | | 70.86 | |
| | ANNens | 14.79 | 115.97 | | 71.25 | | 15.08 | 115.62 | | 70.74 | |
| 105 | Persist | | 142.26 | | 82.92 | | | 142.50 | | 81.48 | |
| | GP | 14.60 | 121.49 | 0.52 | 74.05 | 0.35 | 14.60 | 121.69 | 0.44 | 74.92 | 0.45 |
| | ANN | 13.64 | 122.85 | 2.41 | 76.34 | 1.21 | 14.16 | 122.33 | 0.89 | 76.22 | 0.70 |
| | GPens | 14.77 | 121.26 | | 73.85 | | 14.77 | 121.46 | | 74.71 | |
| | ANNens | 15.45 | 120.29 | | 74.41 | | 15.71 | 120.11 | | 74.56 | |
| 120 | Persist | | 147.60 | | 87.74 | | | 148.21 | | 86.14 | |
| | GP | 15.70 | 124.43 | 0.36 | 76.90 | 0.28 | 15.95 | 124.58 | 0.36 | 78.46 | 1.07 |
| | ANN | 15.03 | 125.42 | 1.21 | 79.81 | 1.64 | 15.57 | 125.14 | 1.01 | 79.72 | 1.26 |
| | GPens | 15.83 | 124.24 | | 76.73 | | 16.13 | 124.30 | | 78.28 | |
| | ANNens | 16.25 | 123.61 | | 78.47 | | 16.97 | 123.06 | | 78.15 | |
Table A3. Forecast errors for Desert Rock. Persist: persistence; FH: forecast horizon (min); s: forecast skill over persistence (%). Columns 3–7: Haurwitz k_t*; columns 8–12: Ineichen k_t*.

| FH | Method | s | RMSE | σRMSE | MAE | σMAE | s | RMSE | σRMSE | MAE | σMAE |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 15 | Persist | | 78.00 | | 34.46 | | | 77.90 | | 33.59 | |
| | GP | 11.96 | 68.68 | 0.14 | 31.64 | 0.18 | 11.94 | 68.60 | 0.23 | 31.81 | 0.23 |
| | ANN | 11.98 | 68.66 | 0.25 | 31.85 | 0.56 | 11.98 | 68.57 | 0.29 | 31.84 | 0.51 |
| | GPens | 12.26 | 68.44 | | 31.29 | | 12.28 | 68.34 | | 31.58 | |
| | ANNens | 13.24 | 67.68 | | 31.02 | | 12.96 | 67.81 | | 31.27 | |
| 30 | Persist | | 101.32 | | 47.74 | | | 101.05 | | 45.88 | |
| | GP | 11.45 | 89.72 | 0.57 | 45.24 | 0.41 | 11.80 | 89.12 | 0.09 | 44.82 | 0.45 |
| | ANN | 10.74 | 90.44 | 0.71 | 44.93 | 0.43 | 10.78 | 90.16 | 0.53 | 44.83 | 0.37 |
| | GPens | 11.77 | 89.39 | | 44.73 | | 11.91 | 89.01 | | 44.68 | |
| | ANNens | 11.60 | 89.57 | | 44.15 | | 11.62 | 89.31 | | 44.33 | |
| 45 | Persist | | 111.75 | | 55.36 | | | 111.24 | | 52.51 | |
| | GP | 12.09 | 98.24 | 0.28 | 50.87 | 0.35 | 11.89 | 98.02 | 1.41 | 50.91 | 1.08 |
| | ANN | 10.98 | 99.49 | 0.46 | 50.94 | 1.00 | 10.95 | 99.06 | 0.22 | 51.10 | 0.67 |
| | GPens | 12.21 | 98.11 | | 50.72 | | 12.47 | 97.37 | | 50.47 | |
| | ANNens | 11.85 | 98.51 | | 50.00 | | 11.53 | 98.41 | | 50.51 | |
| 60 | Persist | | 118.32 | | 61.63 | | | 117.51 | | 57.80 | |
| | GP | 13.05 | 102.88 | 0.32 | 55.24 | 0.38 | 13.13 | 102.09 | 0.15 | 54.46 | 0.16 |
| | ANN | 11.81 | 104.34 | 1.21 | 56.51 | 1.01 | 11.28 | 104.26 | 0.29 | 54.99 | 0.47 |
| | GPens | 13.17 | 102.74 | | 55.11 | | 13.19 | 102.02 | | 54.41 | |
| | ANNens | 12.77 | 103.20 | | 55.35 | | 11.90 | 103.53 | | 54.47 | |
| 75 | Persist | | 124.48 | | 67.03 | | | 123.35 | | 62.16 | |
| | GP | 14.46 | 106.48 | 0.11 | 59.12 | 0.10 | 14.25 | 105.77 | 0.11 | 58.47 | 0.34 |
| | ANN | 13.34 | 107.88 | 0.33 | 59.70 | 1.50 | 12.74 | 107.63 | 0.47 | 58.80 | 0.60 |
| | GPens | 14.56 | 106.36 | | 58.42 | | 14.32 | 105.69 | | 57.81 | |
| | ANNens | 14.06 | 106.98 | | 58.31 | | 13.41 | 106.81 | | 57.42 | |
| 90 | Persist | | 129.33 | | 71.66 | | | 127.87 | | 65.93 | |
| | GP | 15.18 | 109.70 | 0.20 | 61.55 | 0.36 | 14.77 | 108.98 | 0.15 | 60.57 | 0.24 |
| | ANN | 13.87 | 111.40 | 0.76 | 61.60 | 0.91 | 12.81 | 111.50 | 0.80 | 61.12 | 0.65 |
| | GPens | 15.25 | 109.60 | | 61.47 | | 14.84 | 108.90 | | 60.44 | |
| | ANNens | 14.72 | 110.30 | | 60.67 | | 13.55 | 110.55 | | 59.99 | |
| 105 | Persist | | 133.39 | | 75.55 | | | 131.57 | | 68.98 | |
| | GP | 16.06 | 111.97 | 0.14 | 63.15 | 0.22 | 15.36 | 111.36 | 0.15 | 63.13 | 0.27 |
| | ANN | 14.22 | 114.42 | 0.70 | 63.20 | 0.89 | 13.68 | 113.57 | 0.39 | 63.42 | 0.60 |
| | GPens | 16.11 | 111.91 | | 63.49 | | 15.45 | 111.24 | | 62.52 | |
| | ANNens | 14.99 | 113.40 | | 62.11 | | 14.34 | 112.70 | | 62.02 | |
| 120 | Persist | | 137.62 | | 79.56 | | | 135.42 | | 72.19 | |
| | GP | 16.83 | 114.46 | 0.20 | 66.09 | 0.25 | 15.85 | 113.96 | 0.15 | 65.14 | 0.29 |
| | ANN | 15.60 | 116.15 | 0.56 | 65.59 | 0.52 | 14.36 | 115.98 | 0.42 | 65.54 | 1.16 |
| | GPens | 16.91 | 114.35 | | 65.98 | | 15.91 | 113.88 | | 65.07 | |
| | ANNens | 16.24 | 115.27 | | 64.86 | | 15.07 | 115.01 | | 64.83 | |
Table A4. Forecast errors for Pennsylvania State University. Persist: persistence; FH: forecast horizon (min); s: forecast skill over persistence (%). Columns 3–7: Haurwitz k_t*; columns 8–12: Ineichen k_t*.

| FH | Method | s | RMSE | σRMSE | MAE | σMAE | s | RMSE | σRMSE | MAE | σMAE |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 15 | Persist | | 94.43 | | 51.76 | | | 94.37 | | 51.49 | |
| | GP | 12.78 | 82.36 | 0.44 | 47.47 | 0.34 | 12.99 | 82.11 | 0.28 | 47.23 | 0.14 |
| | ANN | 14.42 | 80.82 | 0.29 | 47.49 | 0.45 | 13.86 | 81.29 | 0.29 | 48.17 | 0.53 |
| | GPens | 13.31 | 81.86 | | 47.10 | | 13.39 | 81.73 | | 46.92 | |
| | ANNens | 14.98 | 80.28 | | 47.00 | | 14.49 | 80.69 | | 47.57 | |
| 30 | Persist | | 118.31 | | 68.46 | | | 118.13 | | 67.85 | |
| | GP | 10.71 | 105.64 | 0.26 | 66.56 | 0.28 | 10.46 | 105.78 | 0.26 | 66.02 | 0.28 |
| | ANN | 11.70 | 104.47 | 0.17 | 65.93 | 0.61 | 11.54 | 104.49 | 0.36 | 66.47 | 0.37 |
| | GPens | 11.05 | 105.23 | | 66.23 | | 10.77 | 105.40 | | 65.78 | |
| | ANNens | 12.34 | 103.72 | | 65.28 | | 12.29 | 103.62 | | 65.66 | |
| 45 | Persist | | 131.38 | | 78.48 | | | 131.05 | | 77.51 | |
| | GP | 12.32 | 115.19 | 0.23 | 74.60 | 0.40 | 11.85 | 115.52 | 0.21 | 74.30 | 0.29 |
| | ANN | 13.07 | 114.21 | 0.28 | 74.80 | 0.63 | 12.71 | 114.39 | 0.23 | 74.52 | 0.45 |
| | GPens | 12.61 | 114.82 | | 74.26 | | 12.06 | 115.24 | | 74.01 | |
| | ANNens | 13.76 | 113.30 | | 74.07 | | 13.44 | 113.44 | | 73.69 | |
| 60 | Persist | | 138.63 | | 84.75 | | | 138.19 | | 83.45 | |
| | GP | 12.54 | 121.25 | 0.06 | 79.96 | 0.17 | 12.17 | 121.37 | 0.29 | 80.29 | 0.31 |
| | ANN | 13.11 | 120.45 | 0.36 | 79.94 | 0.87 | 12.67 | 120.69 | 0.41 | 80.03 | 0.72 |
| | GPens | 12.68 | 121.06 | | 79.78 | | 12.39 | 121.06 | | 80.00 | |
| | ANNens | 13.68 | 119.66 | | 79.28 | | 13.42 | 119.64 | | 79.17 | |
| 75 | Persist | | 144.19 | | 90.43 | | | 143.68 | | 88.84 | |
| | GP | 12.65 | 125.95 | 0.11 | 84.92 | 0.21 | 12.40 | 125.86 | 0.86 | 84.80 | 0.97 |
| | ANN | 12.88 | 125.62 | 0.48 | 84.87 | 0.98 | 12.57 | 125.63 | 0.42 | 84.71 | 0.40 |
| | GPens | 12.86 | 125.65 | | 84.71 | | 12.77 | 125.33 | | 84.45 | |
| | ANNens | 13.57 | 124.63 | | 84.10 | | 13.27 | 124.61 | | 83.81 | |
| 90 | Persist | | 150.58 | | 96.09 | | | 150.03 | | 94.27 | |
| | GP | 13.34 | 130.50 | 0.36 | 89.32 | 0.20 | 13.63 | 129.59 | 0.29 | 89.07 | 0.37 |
| | ANN | 13.27 | 130.59 | 0.61 | 89.37 | 0.98 | 13.06 | 130.43 | 0.50 | 89.50 | 0.68 |
| | GPens | 13.52 | 130.23 | | 89.09 | | 13.77 | 129.37 | | 88.89 | |
| | ANNens | 14.13 | 129.30 | | 88.42 | | 13.89 | 129.19 | | 88.57 | |
| 105 | Persist | | 156.99 | | 101.08 | | | 156.42 | | 99.04 | |
| | GP | 14.97 | 133.48 | 0.13 | 92.24 | 0.21 | 14.69 | 133.44 | 0.38 | 92.10 | 0.43 |
| | ANN | 14.38 | 134.42 | 0.23 | 92.99 | 0.94 | 13.85 | 134.75 | 0.45 | 93.37 | 0.53 |
| | GPens | 15.14 | 133.22 | | 92.13 | | 14.90 | 133.11 | | 91.71 | |
| | ANNens | 15.08 | 133.31 | | 92.23 | | 14.67 | 133.46 | | 92.44 | |
| 120 | Persist | | 164.37 | | 106.50 | | | 163.82 | | 104.31 | |
| | GP | 16.23 | 137.70 | 0.21 | 95.99 | 0.15 | 16.21 | 137.27 | 0.22 | 95.95 | 0.22 |
| | ANN | 15.50 | 138.90 | 0.82 | 97.39 | 0.93 | 15.08 | 139.12 | 0.74 | 97.70 | 1.02 |
| | GPens | 16.36 | 137.48 | | 95.85 | | 16.37 | 137.00 | | 95.73 | |
| | ANNens | 16.25 | 137.66 | | 96.57 | | 15.91 | 137.75 | | 96.73 | |
Table A5. Forecast errors for Bondville. Persist: persistence; FH: forecast horizon (min); s: forecast skill over persistence (%). Columns 3–7: Haurwitz k_t*; columns 8–12: Ineichen k_t*.

| FH | Method | s | RMSE | σRMSE | MAE | σMAE | s | RMSE | σRMSE | MAE | σMAE |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 15 | Persist | | 81.28 | | 43.45 | | | 81.20 | | 43.01 | |
| | GP | 10.43 | 72.81 | 0.77 | 41.51 | 0.35 | 10.85 | 72.39 | 0.41 | 41.05 | 0.15 |
| | ANN | 11.74 | 71.74 | 0.49 | 40.87 | 0.59 | 11.10 | 72.18 | 0.41 | 41.26 | 0.34 |
| | GPens | 11.31 | 72.09 | | 41.05 | | 11.30 | 72.02 | | 40.84 | |
| | ANNens | 12.71 | 70.95 | | 40.22 | | 11.92 | 71.52 | | 40.71 | |
| 30 | Persist | | 101.49 | | 57.67 | | | 101.25 | | 56.79 | |
| | GP | 9.97 | 91.37 | 0.47 | 56.08 | 0.56 | 10.23 | 90.89 | 0.28 | 56.38 | 0.29 |
| | ANN | 10.16 | 91.18 | 0.48 | 55.92 | 0.73 | 9.57 | 91.56 | 0.29 | 56.50 | 0.21 |
| | GPens | 10.35 | 90.98 | | 55.75 | | 10.52 | 90.61 | | 56.15 | |
| | ANNens | 10.94 | 90.38 | | 55.32 | | 10.41 | 90.71 | | 55.78 | |
| 45 | Persist | | 113.22 | | 66.80 | | | 112.82 | | 65.39 | |
| | GP | 9.83 | 102.09 | 0.52 | 64.38 | 0.48 | 10.28 | 101.22 | 0.42 | 64.37 | 0.53 |
| | ANN | 10.61 | 101.20 | 0.32 | 63.27 | 0.68 | 9.82 | 101.74 | 0.89 | 64.49 | 1.34 |
| | GPens | 10.26 | 101.60 | | 63.92 | | 10.56 | 100.91 | | 64.15 | |
| | ANNens | 11.37 | 100.35 | | 62.57 | | 10.74 | 100.71 | | 63.72 | |
| 60 | Persist | | 121.55 | | 73.53 | | | 120.99 | | 71.70 | |
| | GP | 11.65 | 107.39 | 0.35 | 69.37 | 0.42 | 11.16 | 107.48 | 0.46 | 69.76 | 0.79 |
| | ANN | 11.38 | 107.73 | 0.33 | 69.37 | 0.48 | 10.72 | 108.02 | 0.37 | 69.98 | 0.54 |
| | GPens | 12.06 | 106.90 | | 68.94 | | 11.58 | 106.97 | | 69.30 | |
| | ANNens | 12.15 | 106.79 | | 68.64 | | 11.53 | 107.04 | | 69.15 | |
| 75 | Persist | | 127.95 | | 79.19 | | | 127.21 | | 77.04 | |
| | GP | 12.01 | 112.58 | 0.11 | 73.78 | 0.20 | 11.19 | 112.97 | 0.30 | 73.98 | 0.32 |
| | ANN | 11.52 | 113.21 | 0.49 | 74.30 | 0.83 | 10.62 | 113.71 | 1.01 | 74.83 | 1.01 |
| | GPens | 12.22 | 112.31 | | 73.47 | | 11.43 | 112.67 | | 73.70 | |
| | ANNens | 12.29 | 112.23 | | 73.56 | | 11.62 | 112.43 | | 73.77 | |
| 90 | Persist | | 134.49 | | 84.73 | | | 133.56 | | 82.22 | |
| | GP | 13.22 | 116.71 | 0.19 | 77.69 | 0.21 | 12.15 | 117.33 | 0.22 | 78.67 | 0.28 |
| | ANN | 12.24 | 118.03 | 0.51 | 78.32 | 0.74 | 11.40 | 118.33 | 0.71 | 78.93 | 0.48 |
| | GPens | 13.37 | 116.51 | | 77.49 | | 12.29 | 117.14 | | 78.49 | |
| | ANNens | 12.99 | 117.02 | | 77.53 | | 12.44 | 116.94 | | 77.77 | |
| 105 | Persist | | 140.34 | | 89.83 | | | 139.23 | | 86.79 | |
| | GP | 13.52 | 121.37 | 0.32 | 81.88 | 0.39 | 12.21 | 122.23 | 0.22 | 82.32 | 0.45 |
| | ANN | 12.56 | 122.71 | 0.69 | 82.06 | 0.52 | 11.90 | 122.66 | 0.55 | 82.70 | 0.45 |
| | GPens | 13.87 | 120.87 | | 81.01 | | 12.43 | 121.93 | | 82.10 | |
| | ANNens | 13.40 | 121.53 | | 81.18 | | 12.96 | 121.19 | | 81.49 | |
| 120 | Persist | | 145.88 | | 95.01 | | | 144.58 | | 91.41 | |
| | GP | 14.42 | 124.84 | 0.35 | 85.95 | 0.31 | 12.96 | 125.85 | 0.72 | 86.30 | 0.55 |
| | ANN | 13.31 | 126.46 | 0.77 | 86.47 | 1.35 | 12.54 | 126.46 | 0.55 | 86.82 | 0.76 |
| | GPens | 14.59 | 124.59 | | 85.07 | | 13.25 | 125.42 | | 85.79 | |
| | ANNens | 14.45 | 124.80 | | 85.30 | | 13.59 | 124.93 | | 85.66 | |
Table A6. Forecast errors for Sioux Falls. Persist: persistence; FH: forecast horizon (min); s: forecast skill over persistence (%). Columns 3–7: Haurwitz k_t*; columns 8–12: Ineichen k_t*.

| FH | Method | s | RMSE | σRMSE | MAE | σMAE | s | RMSE | σRMSE | MAE | σMAE |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 15 | Persist | | 75.47 | | 40.51 | | | 75.40 | | 40.06 | |
| | GP | 10.09 | 67.85 | 0.30 | 38.17 | 0.24 | 10.13 | 67.76 | 0.23 | 37.82 | 0.19 |
| | ANN | 12.25 | 66.23 | 0.20 | 38.02 | 0.18 | 11.70 | 66.58 | 1.04 | 38.41 | 0.25 |
| | GPens | 10.46 | 67.58 | | 37.91 | | 10.32 | 67.61 | | 37.70 | |
| | ANNens | 13.04 | 65.63 | | 37.55 | | 12.76 | 65.78 | | 37.78 | |
| 30 | Persist | | 95.39 | | 54.00 | | | 95.22 | | 53.11 | |
| | GP | 9.22 | 86.60 | 0.10 | 52.86 | 0.17 | 8.71 | 86.93 | 0.28 | 52.97 | 0.25 |
| | ANN | 9.51 | 86.32 | 0.45 | 53.15 | 0.51 | 9.17 | 86.49 | 0.26 | 52.82 | 0.42 |
| | GPens | 9.31 | 86.51 | | 52.77 | | 9.20 | 86.46 | | 52.77 | |
| | ANNens | 10.38 | 85.49 | | 52.56 | | 9.95 | 85.75 | | 52.27 | |
| 45 | Persist | | 107.46 | | 62.87 | | | 107.17 | | 61.44 | |
| | GP | 9.56 | 97.18 | 0.21 | 60.86 | 0.42 | 9.72 | 96.75 | 0.09 | 60.69 | 0.42 |
| | ANN | 9.84 | 96.89 | 0.51 | 61.69 | 0.69 | 9.57 | 96.91 | 0.30 | 61.03 | 0.48 |
| | GPens | 9.79 | 96.94 | | 60.61 | | 10.00 | 96.45 | | 60.46 | |
| | ANNens | 10.62 | 96.05 | | 61.13 | | 10.23 | 96.20 | | 60.52 | |
| 60 | Persist | | 116.51 | | 69.81 | | | 116.12 | | 67.95 | |
| | GP | 10.59 | 104.17 | 0.11 | 66.92 | 0.23 | 10.42 | 104.03 | 0.17 | 67.09 | 0.30 |
| | ANN | 10.16 | 104.68 | 0.38 | 67.47 | 0.33 | 10.20 | 104.27 | 0.45 | 67.05 | 0.46 |
| | GPens | 10.72 | 104.02 | | 66.80 | | 10.52 | 103.90 | | 66.96 | |
| | ANNens | 10.98 | 103.72 | | 66.74 | | 10.98 | 103.37 | | 66.39 | |
| 75 | Persist | | 123.64 | | 75.45 | | | 123.21 | | 73.35 | |
| | GP | 11.18 | 109.82 | 0.20 | 71.76 | 0.33 | 11.01 | 109.64 | 0.14 | 71.93 | 0.39 |
| | ANN | 10.92 | 110.14 | 0.58 | 72.21 | 0.86 | 10.76 | 109.95 | 0.33 | 71.58 | 0.67 |
| | GPens | 11.37 | 109.58 | | 71.56 | | 11.18 | 109.43 | | 71.73 | |
| | ANNens | 11.75 | 109.11 | | 71.49 | | 11.51 | 109.03 | | 70.92 | |
| 90 | Persist | | 130.98 | | 81.05 | | | 130.57 | | 78.71 | |
| | GP | 12.47 | 114.65 | 0.19 | 75.83 | 0.15 | 12.31 | 114.49 | 0.11 | 75.46 | 0.35 |
| | ANN | 11.84 | 115.48 | 0.42 | 76.45 | 1.00 | 11.60 | 115.42 | 0.27 | 76.37 | 0.76 |
| | GPens | 12.65 | 114.42 | | 75.62 | | 12.42 | 114.35 | | 75.33 | |
| | ANNens | 12.71 | 114.34 | | 75.66 | | 12.43 | 114.34 | | 75.59 | |
| 105 | Persist | | 138.49 | | 86.65 | | | 138.10 | | 83.89 | |
| | GP | 13.92 | 119.20 | 0.15 | 79.62 | 0.34 | 13.70 | 119.18 | 0.25 | 79.31 | 0.21 |
| | ANN | 12.90 | 120.62 | 0.20 | 80.50 | 0.59 | 12.99 | 120.17 | 0.50 | 80.39 | 0.79 |
| | GPens | 14.12 | 118.94 | | 79.41 | | 13.94 | 118.85 | | 79.10 | |
| | ANNens | 13.81 | 119.36 | | 79.65 | | 13.88 | 118.94 | | 79.52 | |
| 120 | Persist | | 143.72 | | 91.37 | | | 143.38 | | 88.30 | |
| | GP | 14.59 | 122.75 | 0.32 | 82.76 | 0.20 | 13.98 | 123.34 | 0.92 | 83.06 | 1.03 |
| | ANN | 13.71 | 124.02 | 0.63 | 84.05 | 0.48 | 13.34 | 124.26 | 0.60 | 83.99 | 0.81 |
| | GPens | 14.76 | 122.50 | | 82.52 | | 14.34 | 122.82 | | 82.70 | |
| | ANNens | 14.52 | 122.85 | | 83.29 | | 14.23 | 122.97 | | 83.11 | |

References

1. Liang, H.; Tamang, A.K.; Zhuang, W.; Shen, X.S. Stochastic information management in smart grid. IEEE Commun. Surv. Tutor. 2014, 16, 1746–1770.
2. Bagheri, A.; Zhao, C.; Qiu, F.; Wang, J. Resilient transmission hardening planning in a high renewable penetration era. IEEE Trans. Power Syst. 2019, 34, 873–882.
3. Lahon, R.; Gupta, C.P. Energy management of cooperative microgrids with high-penetration renewables. IET Renew. Power Gener. 2018, 12, 680–690.
4. Lauret, P.; Perez, R.; Aguiar, L.M.; Tapachès, E.; Diagne, H.M.; David, M. Characterization of the intraday variability regime of solar irradiation of climatically distinct locations. Sol. Energy 2016, 125, 99–110.
5. Shahriari, M.; Blumsack, S. Scaling of wind energy variability over space and time. Appl. Energy 2017, 195, 572–585.
6. Bakirtzis, E.A.; Biskas, P.N. Multiple time resolution stochastic scheduling for systems with high renewable penetration. IEEE Trans. Power Syst. 2017, 32, 1030–1040.
7. Du, E.; Zhang, N.; Hodge, B.; Wang, Q.; Lu, Z.; Kang, C.; Kroposki, B.; Xia, Q. Operation of a high renewable penetrated power system with CSP plants: A look-ahead stochastic unit commitment model. IEEE Trans. Power Syst. 2019, 34, 140–151.
8. Pelland, S.; Remund, J.; Keissl, J.; Oozeki, T.; Brabandere, K.D. Photovoltaic and Solar Forecasting: State of the Art; Tech. Rep.; International Energy Agency: Paris, France, 2013.
9. Yang, D.; Kleissl, J.; Gueymard, C.A.; Pedro, H.T.; Coimbra, C.F. History and trends in solar irradiance and PV power forecasting: A preliminary assessment and review using text mining. Sol. Energy 2018, 168, 60–101.
10. Diagne, M.; David, M.; Lauret, P.; Boland, J.; Schmutz, N. Review of solar irradiance forecasting methods and a proposition for small-scale insular grids. Renew. Sustain. Energy Rev. 2013, 27, 65–76.
11. Voyant, C.; Notton, G.; Kalogirou, S.; Nivet, M.-L.; Paoli, C.; Motte, F.; Fouilloy, A. Machine learning methods for solar radiation forecasting: A review. Renew. Energy 2017, 105, 569–582.
12. Sperati, S.; Alessandrini, S.; Pinson, P.; Kariniotakis, G. The "weather intelligence for renewable energies" benchmarking exercise on short-term forecasting of wind and solar power generation. Energies 2015, 8, 9594–9619.
13. Li, J.; Ward, J.K.; Tong, J.; Collins, L.; Platt, G. Machine learning for solar irradiance forecasting of photovoltaic system. Renew. Energy 2016, 90, 542–553.
14. Pedro, H.T.; Coimbra, C.F. Short-term irradiance forecastability for various solar micro-climates. Sol. Energy 2015, 122, 587–602.
15. Benali, L.; Notton, G.; Fouilloy, A.; Voyant, C.; Dizene, R. Solar radiation forecasting using artificial neural network and random forest methods: Application to normal beam, horizontal diffuse and global components. Renew. Energy 2019, 132, 871–884.
16. Persson, C.; Bacher, P.; Shiga, T.; Madsen, H. Multi-site solar power forecasting using gradient boosted regression trees. Sol. Energy 2017, 150, 423–436.
17. Rana, M.; Koprinska, I.; Agelidis, V.G. Univariate and multivariate methods for very short-term solar photovoltaic power forecasting. Energy Convers. Manag. 2016, 121, 380–390.
18. Zeng, J.; Qiao, W. Short-term solar power prediction using a support vector machine. Renew. Energy 2013, 52, 118–127.
19. Dolara, A.; Grimaccia, F.; Leva, S.; Mussetta, M.; Ogliari, E. A physical hybrid artificial neural network for short term forecasting of PV plant power output. Energies 2015, 8, 1138–1153.
20. Nobre, A.M.; Severiano, C.A.; Karthik, S.; Kubis, M.; Zhao, L.; Martins, F.R.; Pereira, E.B.; Rüther, R.; Reindl, T. PV power conversion and short-term forecasting in a tropical, densely-built environment in Singapore. Renew. Energy 2016, 94, 496–509.
21. Reikard, G.; Hansen, C. Forecasting solar irradiance at short horizons: Frequency and time domain models. Renew. Energy 2019, 135, 1270–1290.
22. Yang, H.; Kurtz, B.; Nguyen, D.; Urquhart, B.; Chow, C.W.; Ghonima, M.; Kleissl, J. Solar irradiance forecasting using a ground-based sky imager developed at UC San Diego. Sol. Energy 2014, 103, 502–524.
23. Chow, C.W.; Urquhart, B.; Lave, M.; Dominguez, A.; Kleissl, J.; Shields, J.; Washom, B. Intra-hour forecasting with a total sky imager at the UC San Diego solar energy testbed. Sol. Energy 2011, 85, 2881–2893.
24. Dong, Z.; Yang, D.; Reindl, T.; Walsh, W.M. Satellite image analysis and a hybrid ESSS/ANN model to forecast solar irradiance in the tropics. Energy Convers. Manag. 2014, 79, 66–73.
25. Bessa, R.J.; Trindade, A.; Miranda, V. Spatial-temporal solar power forecasting for smart grids. IEEE Trans. Ind. Inform. 2015, 11, 232–241.
26. Gutierrez-Corea, F.-V.; Manso-Callejo, M.-A.; Moreno-Regidor, M.-P.; Manrique-Sancho, M.-T. Forecasting short-term solar irradiance based on artificial neural networks and data from neighboring meteorological stations. Sol. Energy 2016, 134, 119–131.
27. Aguiar, L.M.; Pereira, B.; Lauret, P.; Díaz, F.; David, M. Combining solar irradiance measurements, satellite-derived data and a numerical weather prediction model to improve intra-day solar forecasting. Renew. Energy 2016, 97, 599–610.
28. Solar Lab EMC/UFG. Federal University of Goias, Brazil. Available online: https://sites.google.com/site/sfvemcufg/ (accessed on 4 June 2020).
29. SoDa. Solar Energy Services for Professionals. France, 2017. Available online: http://www.soda-pro.com/home;jsessionid=B032D33B0AD3460B741E14E41CC46BE2 (accessed on 4 June 2020).
30. SolarTech Lab. Politecnico di Milano, Italy. Available online: http://www.solartech.polimi.it/ (accessed on 4 June 2020).
31. De Paiva, G.M.; Pimentel, S.P.; Leva, S.; Mussetta, M. Intelligent approach to improve genetic programming based intra-day solar forecasting models. In Proceedings of the 2018 IEEE Congress on Evolutionary Computation (CEC), Rio de Janeiro, Brazil, 8–13 July 2018; pp. 1–8.
32. Reno, M.J.; Hansen, C.W.; Stein, J.S. Global Horizontal Irradiance Clear Sky Models: Implementation and Analysis; Tech. Rep.; Sandia National Laboratories: Albuquerque, NM, USA, 2012.
33. Duffie, J.A.; Beckman, W.A. Solar Engineering of Thermal Processes; John Wiley & Sons, Inc.: Hoboken, NJ, USA, 2013.
34. Ineichen, P.; Perez, R. A new airmass independent formulation for the Linke turbidity coefficient. Sol. Energy 2002, 73, 151–157.
35. Ineichen, P. Comparison of eight clear sky broadband models against 16 independent data banks. Sol. Energy 2006, 80, 468–478.
36. Engerer, N.; Mills, F. KPV: A clear-sky index for photovoltaics. Sol. Energy 2014, 105, 679–693.
37. Koza, J.R. Genetic Programming: On the Programming of Computers by Means of Natural Selection; MIT Press: Cambridge, MA, USA, 1992.
38. Brownlee, J. Clever Algorithms: Nature-Inspired Programming Recipes, 1st ed.; Lulu Press, Inc.: Morrisville, NC, USA, 2011.
39. Lee, Y.-S.; Tong, L.-I. Forecasting time series using a methodology based on autoregressive integrated moving average and genetic programming. Knowl.-Based Syst. 2011, 24, 66–72.
40. Garg, A.; Sriram, S.; Tai, K. Empirical analysis of model selection criteria for genetic programming in modeling of time series system. In Proceedings of the 2013 IEEE Conference on Computational Intelligence for Financial Engineering & Economics (CIFEr), Singapore, 16–19 April 2013; IEEE: Piscataway, NJ, USA; pp. 90–94.
41. Mehr, A.D.; Kahya, E.; Olyaie, E. Streamflow prediction using linear genetic programming in comparison with a neuro-wavelet technique. J. Hydrol. 2013, 505, 240–249.
42. Ghorbani, M.A.; Khatibi, R.; Mehr, A.D.; Asadi, H. Chaos-based multigene genetic programming: A new hybrid strategy for river flow forecasting. J. Hydrol. 2018, 562, 455–467.
43. Mehr, A.D.; Jabarnejad, M.; Nourani, V. Pareto-optimal MPSA-MGGP: A new gene-annealing model for monthly rainfall forecasting. J. Hydrol. 2019, 571, 406–415.
44. Russo, M.; Leotta, G.; Pugliatti, P.; Gigliucci, G. Genetic programming for photovoltaic plant output forecasting. Sol. Energy 2014, 105, 264–273.
45. Pan, I.; Pandey, D.S.; Das, S. Global solar irradiation prediction using a multi-gene genetic programming approach. J. Renew. Sustain. Energy 2013, 5, 063129.
46. Ghimire, S.; Deo, R.C.; Downs, N.J.; Raj, N. Global solar radiation prediction by ANN integrated with European Centre for Medium-Range Weather Forecast fields in solar rich cities of Queensland, Australia. J. Clean. Prod. 2019, 216, 288–310.
47. Searson, D.; Leahy, D.; Willis, M. GPTIPS: An open source genetic programming toolbox for multigene symbolic regression. In Proceedings of the International Multiconference of Engineers and Computer Scientists, Hong Kong, 17–19 March 2010; IAENG: Hong Kong, 2010; pp. 77–80.
48. Leva, S.; Dolara, A.; Grimaccia, F.; Mussetta, M.; Ogliari, E. Analysis and validation of 24 hours ahead neural network forecasting of photovoltaic output power. Math. Comput. Simul. 2017, 131, 88–100.
49. Eiben, A.; Smit, S. Parameter tuning for configuring and analyzing evolutionary algorithms. Swarm Evol. Comput. 2011, 1, 19–31.
50. Lima, E.B.; Pappa, G.L.; Almeida, J.M.; Gonçalves, M.A.; Meira, W. Tuning genetic programming parameters with factorial designs. In Proceedings of the IEEE Congress on Evolutionary Computation, Barcelona, Spain, 18–23 July 2010; IEEE: Piscataway, NJ, USA; pp. 1–8.
51. Poli, R.; Langdon, W.B.; McPhee, N.F. (with contributions by J.R. Koza). A Field Guide to Genetic Programming; Lulu Press, Inc.: Morrisville, NC, USA, 2008.
52. Poli, R.; McPhee, N.F.; Vanneschi, L. Elitism reduces bloat in genetic programming. In Proceedings of the 10th Annual Conference on Genetic and Evolutionary Computation, Atlanta, GA, USA, 8–12 July 2008; Association for Computing Machinery: New York, NY, USA; pp. 1343–1344.
Figure 1. Locations of the weather stations under analysis presented on a world map, adapted from [29].
Figure 2. Daily fit (blue line) of the monthly averaged Linke turbidity values (red dots) for Desert Rock.
Figure 3. Data normalization from yearly (left) and daily (right) perspectives for Goiania.
Figure 4. Example of a multigene symbolic regression (SR) model presented in a tree structure.
Figure 5. Fitness of the best GP solution s* measured by the k_t* RMSE for the training and validation datasets.
Figure 6. Influence of the maximum number of genes on the fitness of the best solutions, evaluated for the Goiania validation dataset.
Figure 7. Influence of tournament size (κ), mutation rate (p_m) and elitism rate (elit) on the accuracy and robustness (RMSE standard deviation) of the validation dataset from the Goiania site.
Figure 8. Improvements (%) of multivariate forecasting according to mean absolute error (MAE) and RMSE, for GP (dark red and dark blue bars) and for the artificial neural network (ANN) (orange and light blue bars).
Figure 9. Model accuracy dominance by location and forecast horizon in single forecasts. GP/ANN indicates cases in which accuracy dominance depends on the error metric evaluated.
Figure 10. Accuracy of persistence, GP and ANN according to RMSE (left) and MAE (right) for Goiania, showing the dominance of ANN.
Figure 11. Accuracy of persistence, GP and ANN according to RMSE (left) and MAE (right) for Desert Rock, showing the dominance of GP.
Figure 12. Model accuracy dominance by location and forecast horizon in ensemble forecasts. GP/ANN indicates cases in which accuracy dominance depends on the error metric evaluated.
Figure 13. General accuracies of GP, ANN, GP ensemble and ANN ensemble for all sites according to RMSE values (left-hand graphs) and MAE values (right-hand graphs).
Figure 14. Comparison of the general robustness of GP and ANN single forecasts according to MAE and RMSE.
Table 1. Description of the equipment used at the Federal University of Goias (UFG) weather station, the parameters they measure and their accuracy and range of operation.

| Equipment | Parameter Measured | Information |
|---|---|---|
| Pyranometer Hukseflux LP02 | Global horizontal irradiance | Second class ISO 9060; calibrated in-field uncertainty of ±5%, calibration uncertainty < 1.8% |
| R. M. Young Wind 03002 | Wind speed | Range 0 to 50 m/s, accuracy of ±0.5 m/s |
| | Wind direction | Accuracy of ±5% |
| Texas Electronics TB-2012M | Atmospheric pressure | Calibration range 878 to 1080 mbar; uncertainty of ±1.3 mbar |
| Texas Electronics TTH-1315 | Ambient temperature | Operating range −40 °C to +60 °C; accuracy of ±0.3 °C |
| | Relative humidity | Operating range 0–100%; accuracy of ±1.5% RH |
| Texas Electronics TR-525I | Rainfall | Accuracy of ±1% |
| Datalogger Campbell Scientific CR800X | Automatic data acquisition | |
Table 2. Data statistics of the training, validation and test datasets for each location: N_samp is the number of samples of each dataset, μ is the average Ineichen k_t* and σ is the standard deviation of k_t*.

| Location | Training N_samp | μ | σ | Validation N_samp | μ | σ | Testing N_samp | μ | σ |
|---|---|---|---|---|---|---|---|---|---|
| Goiania | 25,813 | 0.7379 | 0.3042 | 11,367 | 0.7458 | 0.2983 | 11,163 | 0.7423 | 0.3022 |
| Milan | 17,828 | 0.8544 | 0.3843 | 7969 | 0.8069 | 0.3897 | 7944 | 0.7999 | 0.4194 |
| Desert Rock | 25,959 | 0.9139 | 0.2380 | 10,929 | 0.9133 | 0.2451 | 10,865 | 0.9025 | 0.2458 |
| Pennsylvania | 25,706 | 0.6741 | 0.3534 | 10,998 | 0.6260 | 0.3492 | 11,177 | 0.6604 | 0.3572 |
| Bondville | 25,935 | 0.7246 | 0.3593 | 10,818 | 0.6974 | 0.3660 | 11,005 | 0.7197 | 0.3478 |
| Sioux Falls | 25,839 | 0.7579 | 0.3455 | 10,898 | 0.7476 | 0.3594 | 10,708 | 0.7638 | 0.3353 |
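The μ and σ columns of Table 2 are statistics of the clear-sky index k_t* = GHI / GHI_clear, i.e. the measured global horizontal irradiance normalized by the modelled clear-sky irradiance. A minimal sketch of how such an index can be computed over a dataset (the helper name and the night-filtering threshold are illustrative assumptions, not the paper's exact preprocessing):

```python
import numpy as np

def clear_sky_index(ghi, ghi_clear, min_clear=20.0):
    """Clear-sky index k_t* = measured GHI / modelled clear-sky GHI.
    Samples with very low clear-sky irradiance (dawn/dusk/night) are
    dropped to avoid division blow-ups; the 20 W/m^2 threshold is an
    illustrative choice."""
    ghi = np.asarray(ghi, dtype=float)
    ghi_clear = np.asarray(ghi_clear, dtype=float)
    day = ghi_clear > min_clear
    return ghi[day] / ghi_clear[day]

kt = clear_sky_index([400.0, 850.0, 5.0], [500.0, 1000.0, 10.0])
print(kt.mean(), kt.std())  # dataset statistics, as mu and sigma in Table 2
```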
Table 3. Summary of genetic programming (GP) simulation parameters.

| Parameter | Adopted Setting |
|---|---|
| Node functions | +, −, ·, /, x², tanh, exp, √x, e^(−x), sin, cos |
| Population size | 300 |
| Maximum generations | 150 |
| Maximum number of genes | 5 |
| Maximum tree depth | 4 |
| Tournament size (κ) | 6 |
| Lexicographic selection | True |
| Elitism fraction | 0.3 |
| Fitness function | Root mean squared error (RMSE) |
| Crossover probability (p_c) | 0.88 |
| Mutation probability (p_m) | 0.12 |
| High-level crossover probability | 0.5 |
| Ephemeral random constants (ERC) range | −10 to +10 |
| ERC probability at creating nodes | 0.2 |
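Table 3 specifies a tournament size of κ = 6 with lexicographic selection, which breaks fitness ties in favour of smaller trees (a standard bloat-control device, cf. refs. [51,52]). A minimal sketch of that selection rule, with each individual modelled as a `(fitness, n_nodes)` tuple (an illustrative encoding, not the paper's implementation):

```python
import random

def tournament_select(population, k=6, rng=random):
    """Tournament selection with lexicographic parsimony pressure:
    draw k individuals at random and return the one with the lowest
    fitness (RMSE); exact fitness ties go to the smaller tree."""
    contenders = rng.sample(population, k)
    return min(contenders, key=lambda ind: (ind[0], ind[1]))

# With k equal to the population size every individual enters the
# tournament, so the outcome here is deterministic:
pop = [(0.42, 17), (0.35, 30), (0.35, 9), (0.50, 4)]
best = tournament_select(pop, k=len(pop))
print(best)  # (0.35, 9): same RMSE as (0.35, 30), but fewer nodes
```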
Table 4. Generalized accuracies for Haurwitz k_t* and Ineichen k_t* forecasts.

| Clear-sky model | RMSE | σ_RMSE | MAE | σ_MAE |
|---|---|---|---|---|
| Haurwitz | 111.87 | 0.44 | 70.22 | 0.54 |
| Ineichen | 111.93 | 0.47 | 70.33 | 0.55 |
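Whichever clear-sky model is used, the forecast chain is the same: the model predicts the clear-sky index, which is re-scaled by the clear-sky irradiance at the target time. The persistence baseline does exactly this with a constant index; a sketch (function name is illustrative):

```python
def smart_persistence(ghi_now, ghi_clear_now, ghi_clear_future):
    """Clear-sky persistence forecast: assume the clear-sky index
    k_t* = GHI / GHI_clear stays constant over the horizon, so
    GHI(t + h) ~= k_t*(t) * GHI_clear(t + h)."""
    kt_now = ghi_now / ghi_clear_now
    return kt_now * ghi_clear_future

# 80% of clear-sky now; the clear-sky model gives 600 W/m^2 at t + h:
print(smart_persistence(400.0, 500.0, 600.0))  # 480.0
```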
Table 5. Variables selected by the GP regression models according to the forecast horizon for Goiania.

| Forecast Horizon | Variables Selected |
|---|---|
| 15 min | ω_s, k_t*(5, 20, 35, 50), H_r(5), T_a(40) |
| 30 min | t_s, ω_s, k̂_t*(15), k_t*(5), p_a(25), H_r(40) |
| 45 min | ω_s, k̂_t*(30), H_r(5, 35, 40), T_a(20), p_a(60) |
| 60 min | t_s, h, k̂_t*(15, 45), H_r(15), T_a(40), k_t*(45) |
| 75 min | ω_s, h, k̂_t*(60), p_a(5, 10, 20), T_a(10, 55), H_r(10, 15), W_s(60) |
| 90 min | ω_s, k̂_t*(30, 45, 75) |
| 105 min | ω_s, k̂_t*(45, 60, 90) |
| 120 min | Month, k̂_t*(105), k_t*(25), H_r(30, 35), T_a(40) |
Table 6. Comparison of state-of-the-art methods applied to intraday solar irradiance forecasting for Surface Radiation Network (SURFRAD) weather stations (best values in bold).

| F.H. (min) | Method | Desert Rock RMSE | Desert Rock MAE | Pennsylv. SU RMSE | Pennsylv. SU MAE | Bondville RMSE | Bondville MAE | Sioux Falls RMSE | Sioux Falls MAE |
|---|---|---|---|---|---|---|---|---|---|
| 15 | Regression | 84.4 | 51.4 | 89.1 | 55.3 | 81.1 | 49.3 | 70.9 | 44.9 |
| 15 | Freq. Domain | 84.2 | 51.0 | 91.0 | 56.1 | 82.5 | 50.1 | 73.9 | 46.5 |
| 15 | GP_ens | **68.3** | **31.6** | **81.7** | **46.9** | **72.0** | **40.8** | **67.6** | **37.7** |
| 30 | Regression | 105.6 | 66.6 | 112.6 | 74.1 | 102.3 | 67.6 | 91.5 | 59.7 |
| 30 | Freq. Domain | 108.1 | 63.0 | 112.0 | 73.2 | 102.2 | 66.9 | 92.1 | 60.3 |
| 30 | GP_ens | **89.0** | **44.7** | **105.4** | **65.8** | **90.6** | **56.2** | **86.5** | **52.8** |
| 45 | Regression | 119.9 | 76.5 | 127.3 | 87.1 | 116.9 | 80.3 | 106.3 | 71.3 |
| 45 | Freq. Domain | 119.1 | 71.7 | 125.1 | 86.1 | 114.5 | 78.8 | 106.6 | 69.4 |
| 45 | GP_ens | **97.4** | **50.5** | **115.2** | **74.0** | **100.9** | **64.2** | **96.5** | **60.5** |
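The GP_ens rows are ensemble forecasts, i.e. combinations of several independently trained runs. The simplest combination rule is an element-wise average of the member predictions; a sketch under that assumption (the paper's exact combination rule may differ):

```python
import numpy as np

def ensemble_mean(member_predictions):
    """Average forecasts from independently trained runs, element-wise
    over the forecast samples. Input shape: (n_members, n_samples)."""
    return np.asarray(member_predictions, dtype=float).mean(axis=0)

runs = [[510.0, 320.0],   # predictions of run 1 (W/m^2)
        [530.0, 300.0],   # run 2
        [520.0, 310.0]]   # run 3
print(ensemble_mean(runs))  # [520. 310.]
```

Averaging tends to cancel run-to-run variance, which is consistent with the ensemble columns of Table 6 outperforming the single-model baselines.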
Table 7. Comparison of the training time required for each machine learning (ML) method, evaluated for the Goiania dataset (best values in bold).

| ML Method | F.H. = 15 | 30 | 45 | 60 | 75 | 90 | 105 | 120 |
|---|---|---|---|---|---|---|---|---|
| GP | 3.62 | 3.36 | 3.24 | 3.40 | 3.50 | 3.71 | 3.43 | 3.42 |
| ANN | 0.89 | 0.47 | 0.44 | 0.34 | 0.35 | 0.45 | 0.39 | 0.35 |
Mendonça de Paiva, G.; Pires Pimentel, S.; Pinheiro Alvarenga, B.; Gonçalves Marra, E.; Mussetta, M.; Leva, S. Multiple Site Intraday Solar Irradiance Forecasting by Machine Learning Algorithms: MGGP and MLP Neural Networks. Energies 2020, 13, 3005. https://doi.org/10.3390/en13113005