# SeMiner: A Flexible Sequence Miner Method to Forecast Solar Time Series


## Abstract


## 1. Introduction

- Few works consider the historical evolution of solar time series;
- Few works explicitly employ astrophysicists' knowledge in the actual forecasting process;

- The pre-processing method must take into account the evolution of solar data to perform accurate forecasting;
- The pre-processing method must be able to determine the most significant periods containing the solar time sub-series that best influence the forecasting process;
- The pre-processing method must be flexible enough to take into account an astrophysics specialist's knowledge;
- Finally, the pre-processing method must be optimized to perform its tasks as fast as possible, causing minimum delay in the entire forecasting process;

## 2. Related Work

## 3. Sequence Miner (SeMiner) Description

#### 3.1. The Series to Sequence (SS) Algorithm

**Definition 1.**

**Definition 2.**

**Definition 3.**

**Definition 4.**

**Definition 5.**

**Definition 6.**

**Definition 7.**

**Definition 8.**

**Definition 9.**

#### 3.1.1. Series to Sequence (SS) Algorithm: Sequential Version

^{2} of X-ray intensity, and at this instant it also produced a solar flare of class B. The SS algorithm uses this time series to map observed values to solar events that occurred after these observations.

Algorithm 1: The Series to Sequence (SS) algorithm.
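The windowing scheme behind the SS algorithm can be sketched as follows. This is an illustrative Python reconstruction, not the paper's C implementation: the function name and the class-ordering string `"ABCMX"` used to pick the strongest flare in the future window are our assumptions.

```python
def series_to_sequence(series, n, s, f, step=1):
    """Sliding-window sketch of the SS transformation.

    series: list of (x_ray_value, flare_class) observations
    n: current-window size, s: jump size, f: future-window size (in observations)
    Returns (features, label) pairs: the current-window values labeled with
    the strongest flare class observed in the future window.
    """
    total = n + s + f
    sequences = []
    for start in range(0, len(series) - total + 1, step):
        current = [v for v, _ in series[start:start + n]]
        future = [c for _, c in series[start + n + s:start + total]]
        # Flare classes ordered by increasing intensity: A < B < C < M < X
        label = max(future, key="ABCMX".index)
        sequences.append((current, label))
    return sequences

# Observations of Table 1: (X-ray value, flare class), one every 5 min
table1 = [
    (3.65e-7, "B"), (3.92e-7, "B"), (4.09e-7, "B"), (4.04e-7, "B"),
    (3.92e-7, "B"), (3.94e-7, "B"), (3.84e-7, "B"), (3.80e-7, "B"),
    (3.80e-7, "B"), (3.83e-6, "C"), (3.84e-7, "B"), (3.90e-7, "B"),
    (6.47e-7, "B"), (6.75e-7, "B"), (5.24e-7, "B"),
]
sequences = series_to_sequence(table1, n=4, s=4, f=4)
```

With n = s = f = 4 and step = 1, the first sequence is labeled C because the C-class flare at t = 45 falls inside its future window, reproducing the labels of Table 4.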

#### 3.1.2. Series to Sequence (SS) Algorithm: Parallel Version

- $qty\_of\_windows=(timeSeriesSize-totalWindowSize+1)/step$
- $NUMBEROFTHREADSPERBLOCK=1024$

Algorithm 2: SS kernel function algorithm.
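The window count above determines the CUDA launch geometry: one thread per window, 1024 threads per block. A small Python sketch of the sizing arithmetic (identifier names are illustrative; the actual kernel is written in CUDA C):

```python
import math

# One thread per window; the paper fixes 1024 threads per block.
THREADS_PER_BLOCK = 1024

def grid_dimensions(time_series_size, total_window_size, step=1):
    """Compute the number of windows and the number of CUDA blocks needed.

    total_window_size is the sum of the current window, jump, and
    future window sizes (in observations).
    """
    qty_of_windows = (time_series_size - total_window_size + 1) // step
    blocks = math.ceil(qty_of_windows / THREADS_PER_BLOCK)
    return qty_of_windows, blocks

# Example: 3 years of 5-min observations (3 * 365 * 288 = 315,360)
# with n = s = f = 288, hence a total window of 864 observations.
qty, blocks = grid_dimensions(time_series_size=315_360, total_window_size=864)
```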

## 4. Experiments

- The first set of experiments considered the intensity of X-rays in the 1–8 Angstrom band as input, generating a set of sequences. It is important to mention that it aimed to forecast the background level of X-ray flux instead of solar flare events. The method was tested using different classification methods and data sizes, forecasting one day in advance. The highest accuracy obtained in those tests was 91.3%, with a True Positive Rate of 94.3% and a True Negative Rate of 86.5%, using the IBK classifier (a variation of the k-nearest neighbor method, KNN). These results show a strong balance between True Positive (TP) and True Negative (TN) rates, a desirable feature considering that the solar data relevant to this task is very imbalanced. It also corroborates that the KNN method may achieve near-optimal results in predicting future events given present values. Similar results have been reported in other application domains: for example, the method for predicting demand for natural gas and energy cost savings described in [26] also employs KNN, obtaining very low error rates between estimated and actual events.
- In the second set of experiments, feature selection methods were applied. The Relief Attribute Evaluation [27] and StarMiner [28] methods were employed in SeMiner to forecast actual solar flares and to select the X-ray subseries belonging to the most relevant periods for solar flare forecasting. This approach produced good accuracy with balanced TP and TN rates, and achieved a high-performance speedup. The developed method reached an accuracy of 72.7%, with a TP rate of 70.9% and a TN rate of 79.7%. Another contribution of this work was the possibility of analyzing the most significant X-ray subseries for the forecasting method, considering that feature selection can help choose the best time intervals to be used by the forecasting module. In those experiments, it was found that the best time intervals for an X-ray subseries lay within two days of the current observations, comprising both the initial and final periods of the first day and the remaining 16 h of the second day. This result corroborates the empirical opinion of the expert astrophysicist, who assumes that the analysis of two days of data is enough to predict possible future solar flares.
- The last set of experiments performed was concerned with the parallelized version of the SS algorithm. It was found that the optimized method developed in CUDA for execution using GPUs runs about four times faster than the original algorithm, developed in pure C language. Overall, the solution adopted exhibits good potential to optimize this kind of software application.
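The IBK classifier that achieved the best results above is an instance-based k-nearest-neighbor method. A minimal pure-Python 1-NN sketch of the idea, applied to SS-style sequences (an illustrative reconstruction, not Weka's IBk; the query values are made up):

```python
def predict_1nn(train, query):
    """Minimal 1-nearest-neighbor classifier, the idea behind IBk/KNN.

    train: list of (features, label) pairs; query: list of feature values.
    Returns the label of the closest training sequence (Euclidean distance).
    """
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(train, key=lambda item: dist(item[0], query))[1]

# Toy training set shaped like Table 4: four X-ray values per sequence,
# labeled with the flare class of the corresponding future window.
train = [
    ([3.65e-7, 3.92e-7, 4.09e-7, 4.04e-7], "C"),
    ([3.92e-7, 4.09e-7, 4.04e-7, 3.92e-7], "C"),
    ([4.09e-7, 4.04e-7, 3.92e-7, 3.94e-7], "B"),
    ([4.04e-7, 3.92e-7, 3.94e-7, 3.84e-7], "B"),
]
label = predict_1nn(train, [3.66e-7, 3.91e-7, 4.08e-7, 4.05e-7])
```

The query is closest to the first training sequence, so it inherits its label, C.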

#### 4.1. Experiment 1: X-ray Background Level Forecasting

1. 10-fold cross-validation: the dataset is divided into ten folds. The first fold is used as the test set and the remaining nine as the training set; then the second fold is used as the test set and the remaining nine as the training set, and so on.
2. Fixed dataset split: 67% for training and 33% for testing.
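The two evaluation schemes above can be sketched as follows (illustrative code, not the Weka setup used in the experiments; function names are our own):

```python
def ten_fold_splits(n_items, k=10):
    """Yield (test_indices, train_indices) pairs for k-fold cross-validation."""
    indices = list(range(n_items))
    fold_size = n_items // k
    for i in range(k):
        # The i-th fold is held out for testing; the rest is the training set.
        test = indices[i * fold_size:(i + 1) * fold_size]
        train = indices[:i * fold_size] + indices[(i + 1) * fold_size:]
        yield test, train

def percentage_split(n_items, train_fraction=0.67):
    """Fixed split: the first 67% for training, the remaining 33% for testing."""
    cut = int(n_items * train_fraction)
    return list(range(cut)), list(range(cut, n_items))
```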

#### 4.2. Experiment 2: Solar Flare Forecasting

#### 4.3. Experiment 3: SS Parallel Optimizations

- Processor: Intel i5;
- RAM: 8 GB;
- Graphics card: GeForce 960X
  - Number of CUDA cores: 1024
  - Memory: 2 GB

- Operating System: Windows 10;
- Application running during the test: Eclipse IDE

## 5. Method Limitations and Future Directions

- (a) There is no consensus in solar space research regarding which set of solar features fully describes the events that determine the occurrence of solar flares and changes in the X-ray background level emitted by the Sun. In the method described in this paper, we have used the X-ray levels emitted by the Sun to predict future solar events. However, other works have considered different aspects, such as magnetic features captured from the Sun, or features related to the geometry of sunspots found in the Sun's chromosphere layer. Therefore, we intend to improve our method by using other features, expecting to increase the accuracy for longer forecasting horizons;
- (b) Another limitation of our method resides in the fact that, for the optimized version, it relies on a single server with just one GPU card. Given the characteristics of the adopted parallel implementation, this architecture limits the data size the method can handle, and hence the time series length. One of the future aims of this project is to use data from at least eleven years, as this period encompasses one full solar cycle and may produce a learning model more suitable for middle- or long-term forecasting. One solar cycle is the period over which solar activity rises to a maximum and regresses to a minimum. Implementation changes in the parallel version of the algorithm, or a more powerful parallel architecture, should allow us to fully exploit a much larger dataset and possibly yield better results for longer forecasting horizons;
- (c) The third limitation of our method is that, as the size of the window increases, the method becomes more demanding in terms of computing time. Thus, as the forecasting horizon increases, the time needed to build the prediction model also increases;
- (d) Finally, according to astrophysicists' empirical findings, the analysis of the last two days of X-ray flux may indicate a possible solar event within a maximum horizon of three days. Our method was developed based on this observation, so if the horizon exceeds three days, the method's accuracy decreases.

## 6. Conclusions

## Acknowledgments

## Author Contributions

## Conflicts of Interest

## References

- Tsurutani, B.T.; Verkhoglyadova, O.P.; Mannucci, A.J.; Lakhina, G.S.; Li, G.; Zank, G.P. A brief review of “solar flare effects” on the ionosphere. Radio Sci.
**2009**, 44. [Google Scholar] [CrossRef] - Basu, S.; Basu, S.; MacKenzie, E.; Bridgwood, C.; Valladares, C.E.; Groves, K.M.; Carrano, C. Specification of the occurrence of equatorial ionospheric scintillations during the main phase of large magnetic storms within solar cycle 23. Radio Sci.
**2010**, 45, 1–15. [Google Scholar] [CrossRef] - Winter, L.M.; Balasubramaniam, K. Using the maximum X-ray flux ratio and X-ray background to predict solar flare class. Space Weather
**2015**, 13, 286–297. [Google Scholar] [CrossRef] - McIntosh, P.S. The classification of sunspot groups. Sol. Phys.
**1990**, 125, 251–267. [Google Scholar] [CrossRef] - Gallagher, P.T.; Moon, Y.J.; Wang, H. Active-Region Monitoring and Flare Forecasting—I. Data Processing and First Results. Sol. Phys.
**2002**, 209, 171–183. [Google Scholar] [CrossRef] - Barnes, G.; Leka, K.D.; Schumer, E.A.; Della-Rose, D.J. Probabilistic forecasting of solar flares from vector magnetogram data. Space Weather
**2007**, 5. [Google Scholar] [CrossRef] - Ahmed, O.W.; Qahwaji, R.; Colak, T.; Higgins, P.A.; Gallagher, P.T.; Bloomfield, D.S. Solar Flare Prediction Using Advanced Feature Extraction, Machine Learning, and Feature Selection. Sol. Phys.
**2013**, 283, 157–175. [Google Scholar] [CrossRef] - Nishizuka, N.; Sugiura, K.; Kubo, Y.; Den, M.; Watari, S.; Ishii, M. Solar Flare Prediction Model with Three Machine-learning Algorithms using Ultraviolet Brightening and Vector Magnetograms. Astrophys. J.
**2017**, 835, 156. [Google Scholar] [CrossRef] - Bobra, M.G.; Couvidat, S. Solar Flare Prediction Using SDO/HMI Vector Magnetic Field Data with a Machine-Learning Algorithm. Astrophys. J.
**2015**, 798, 135. [Google Scholar] [CrossRef] - Li, R.; Zhu, J. Solar flare forecasting based on sequential sunspot data. Res. Astron. Astrophys.
**2013**, 13, 1118–1126. [Google Scholar] [CrossRef] - Yu, D.; Huang, X.; Hu, Q.; Zhou, R.; Wang, H.; Cui, Y. Short-Term Solar Flare Prediction Using Multiresolution Predictors. Astrophys. J.
**2010**, 709, 321–326. [Google Scholar] [CrossRef] - Zavvari, A.; Islam, M.T.; Anwar, R.; Abidin, Z.Z. Solar Flare M-Class Prediction Using Artificial Intelligence Techniques. J. Theor. Appl. Inf. Technol.
**2015**, 74, 63–67. [Google Scholar] - Qahwaji, R.; Colak, T. Automatic Short-Term Solar Flare Prediction Using Machine Learning and Sunspot Associations. Sol. Phys.
**2007**, 241, 195–211. [Google Scholar] [CrossRef] [Green Version] - Wang, H.; Cui, Y.; Li, R.; Zhang, L.; Han, H. Solar flare forecasting model supported with artificial neural network techniques. Adv. Space Res.
**2008**, 42, 1464–1468. [Google Scholar] [CrossRef] - Zhang, X.; Liu, J.; Wang, Q. Image feature extraction for solar flare prediction. In Proceedings of the 2011 4th International Congress on Image and Signal, Shanghai, China, 15–17 October 2011; pp. 910–914. [Google Scholar]
- Huang, X.; Yu, D.; Hu, Q.; Wang, H.; Cui, Y. Short-Term Solar Flare Prediction Using Predictor Teams. Sol. Phys.
**2010**, 263, 175–184. [Google Scholar] [CrossRef] - Yu, D.; Huang, X.; Wang, H.; Cui, Y. Short-Term Solar Flare Prediction Using a Sequential Supervised Learning Method. Sol. Phys.
**2009**, 255, 91–105. [Google Scholar] [CrossRef] - Yuan, Y.; Shih, F.Y.; Jing, J.; Wang, H.M. Automated flare forecasting using a statistical learning technique. Res. Astron. Astrophys.
**2010**, 10, 785–796. [Google Scholar] [CrossRef] - Colak, T.; Qahwaji, R. Automated Solar Activity Prediction: A hybrid computer platform using machine learning and solar imaging for automated prediction of solar flares. Space Weather
**2009**, 7. [Google Scholar] [CrossRef] - Innocenti, M.E.; Johnson, A.; Markidis, S.; Amaya, J.; Deca, J.; Olshevsky, V.; Lapenta, G. Progress towards physics based space weather forecasting with exascale computing. Adv. Eng. Softw.
**2017**, 111, 3–17. [Google Scholar] [CrossRef] - Wells, B.E.; Singh, N.; Somarouthu, T. Parallel kinetic particle in cell code simulation of astrophysical plasmas affecting magnetic reconnection (non-reviewed). In Proceedings of the IEEE SoutheastCon 2008, Huntsville, AL, USA, 3–6 April 2008; p. 261. [Google Scholar]
- Volberg, O.; Toth, G.; Gombosi, T.I.; Stout, Q.F.; Powell, K.G.; De Zeeuw, D.; Ridley, A.J.; Kane, K.; Hansen, K.C.; Chesney, D.R.; et al. A high performance framework for sun to earth space weather modeling. In Proceedings of the 19th IEEE International Parallel and Distributed Processing Symposium (IPDPS), Denver, CO, USA, 4–8 April 2005. [Google Scholar]
- Clauer, C.R.; Gombosi, T.I.; Zeenw, D.L.D.; Ridley, A.J.; Powell, K.G.; Leer, B.V.; Stout, Q.F.; Groth, C.P.T.; Holzer, T.E. High performance computer methods applied to predictive space weather simulations. IEEE Trans. Plasma Sci.
**2000**, 28, 1931–1937. [Google Scholar] [CrossRef] - Poli, G.; Llapa, E.; Cecatto, J.; Saito, J.; Peters, J.; Ramanna, S.; Nicoletti, M. Solar flare detection system based on tolerance near sets in a GPU CUDA framework. Knowl. Based Syst.
**2014**, 70, 345–360. [Google Scholar] [CrossRef] - Xenopoulos, P.; Daniel, J.; Matheson, M.; Sukumar, S. Big data analytics on HPC architectures: Performance and cost. In Proceedings of the 2016 IEEE International Conference on Big Data (Big Data), Washington, DC, USA, 5–8 December 2016; pp. 2286–2295. [Google Scholar]
- Rodger, J.A. A fuzzy nearest neighbor neural network statistical model for predicting demand for natural gas and energy cost savings in public buildings. Expert Syst. Appl.
**2014**, 41, 1813–1829. [Google Scholar] [CrossRef] - Kira, K.; Rendell, L.A. A practical approach to feature selection. In Proceedings of the ninth international workshop on Machine learning, Aberdeen, UK, 1–3 July 1992; pp. 249–256. [Google Scholar]
- Ribeiro, M.X.; Balan, A.G.R.; Felipe, J.C.; Traina, A.J.M.; Traina, C. Mining statistical association rules to select the most relevant medical image features. In Mining Complex Data; Springer: Berlin/Heidelberg, Germany, 2009; pp. 113–131. [Google Scholar]
- Hall, M.; Frank, E.; Holmes, G.; Pfahringer, B.; Reutemann, P.; Witten, I. The WEKA data mining software: An update. SIGKDD Explor.
**2009**, 11, 10–18. [Google Scholar] [CrossRef] - Quinlan, J.R.; Ross, J. C4.5: Programs for machine learning; Morgan Kaufmann Publishers: Burlington, MA, USA, 1993; p. 302. [Google Scholar]
- Aha, D.W.; Kibler, D.; Albert, M.K. Instance-Based Learning Algorithms. Mach. Learn.
**1991**, 6, 37–66. [Google Scholar] [CrossRef] - Besnard, P.; Hanks, S. Uncertainty in Artificial Intelligence: Proceedings of the Eleventh Conference (1995): 18–20 August 1995; Morgan Kaufmann Publishers: San Francisco, CA, USA, 1995; p. 591. [Google Scholar]
- Holte, R.C. Very Simple Classification Rules Perform Well on Most Commonly Used Datasets. Mach. Learn.
**1993**, 11, 63–90. [Google Scholar] [CrossRef] - Schölkopf, B.; Burges, C.J.C.; Smola, A.J. Advances in Kernel Methods: Support Vector Learning; MIT Press: Cambridge, MA, USA, 1999; p. 376. [Google Scholar]
- Fawcett, T. ROC Graphs: Notes and Practical Considerations for Researchers; Technical Report; HP Laboratories: Palo Alto, CA, USA, 2004. [Google Scholar]

| t | X(t) | Solar Flare Class |
|---|---|---|
| 0 | 3.65 $\times 10^{-7}$ | B |
| 5 | 3.92 $\times 10^{-7}$ | B |
| 10 | 4.09 $\times 10^{-7}$ | B |
| 15 | 4.04 $\times 10^{-7}$ | B |
| 20 | 3.92 $\times 10^{-7}$ | B |
| 25 | 3.94 $\times 10^{-7}$ | B |
| 30 | 3.84 $\times 10^{-7}$ | B |
| 35 | 3.80 $\times 10^{-7}$ | B |
| 40 | 3.80 $\times 10^{-7}$ | B |
| 45 | 3.83 $\times 10^{-6}$ | C |
| 50 | 3.84 $\times 10^{-7}$ | B |
| 55 | 3.90 $\times 10^{-7}$ | B |
| 60 | 6.47 $\times 10^{-7}$ | B |
| 65 | 6.75 $\times 10^{-7}$ | B |
| 70 | 5.24 $\times 10^{-7}$ | B |

With a step of one observation, Windows 1, 2 and 3 start at t = 5, 10 and 15, respectively; the roles below are shown for Window 0.

| t | X(t) | Class of Solar Flare | Role in Window 0 |
|---|---|---|---|
| 0 | 3.65 $\times 10^{-7}$ | B | Current window |
| 5 | 3.92 $\times 10^{-7}$ | B | Current window |
| 10 | 4.09 $\times 10^{-7}$ | B | Current window |
| 15 | 4.04 $\times 10^{-7}$ | B | Current window |
| 20 | 3.92 $\times 10^{-7}$ | B | Jump |
| 25 | 3.94 $\times 10^{-7}$ | B | Jump |
| 30 | 3.84 $\times 10^{-7}$ | B | Jump |
| 35 | 3.80 $\times 10^{-7}$ | B | Jump |
| 40 | 3.80 $\times 10^{-7}$ | B | Future window |
| 45 | 3.83 $\times 10^{-6}$ | C | Future window |
| 50 | 3.84 $\times 10^{-7}$ | B | Future window |
| 55 | 3.90 $\times 10^{-7}$ | B | Future window |
| 60 | 6.47 $\times 10^{-7}$ | B | |
| 65 | 6.75 $\times 10^{-7}$ | B | |
| 70 | 5.24 $\times 10^{-7}$ | B | |

In each window, f1–f4 form the current window, f5–f8 the jump, and f9–f12 the future window.

| Window | f1 | f2 | f3 | f4 | f5 | f6 | f7 | f8 | f9 | f10 | f11 | f12 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 3.65 $\times 10^{-7}$ (B) | 3.92 $\times 10^{-7}$ (B) | 4.09 $\times 10^{-7}$ (B) | 4.04 $\times 10^{-7}$ (B) | 3.92 $\times 10^{-7}$ (B) | 3.94 $\times 10^{-7}$ (B) | 3.84 $\times 10^{-7}$ (B) | 3.80 $\times 10^{-7}$ (B) | 3.80 $\times 10^{-7}$ (B) | 3.83 $\times 10^{-6}$ (C) | 3.84 $\times 10^{-7}$ (B) | 3.90 $\times 10^{-7}$ (B) |
| 1 | 3.92 $\times 10^{-7}$ (B) | 4.09 $\times 10^{-7}$ (B) | 4.04 $\times 10^{-7}$ (B) | 3.92 $\times 10^{-7}$ (B) | 3.94 $\times 10^{-7}$ (B) | 3.84 $\times 10^{-7}$ (B) | 3.80 $\times 10^{-7}$ (B) | 3.80 $\times 10^{-7}$ (B) | 3.83 $\times 10^{-6}$ (C) | 3.84 $\times 10^{-7}$ (B) | 3.90 $\times 10^{-7}$ (B) | 6.47 $\times 10^{-7}$ (B) |
| 2 | 4.09 $\times 10^{-7}$ (B) | 4.04 $\times 10^{-7}$ (B) | 3.92 $\times 10^{-7}$ (B) | 3.94 $\times 10^{-7}$ (B) | 3.84 $\times 10^{-7}$ (B) | 3.80 $\times 10^{-7}$ (B) | 3.80 $\times 10^{-7}$ (B) | 3.83 $\times 10^{-6}$ (C) | 3.84 $\times 10^{-7}$ (B) | 3.90 $\times 10^{-7}$ (B) | 6.47 $\times 10^{-7}$ (B) | 6.75 $\times 10^{-7}$ (B) |
| 3 | 4.04 $\times 10^{-7}$ (B) | 3.92 $\times 10^{-7}$ (B) | 3.94 $\times 10^{-7}$ (B) | 3.84 $\times 10^{-7}$ (B) | 3.80 $\times 10^{-7}$ (B) | 3.80 $\times 10^{-7}$ (B) | 3.83 $\times 10^{-6}$ (C) | 3.84 $\times 10^{-7}$ (B) | 3.90 $\times 10^{-7}$ (B) | 6.47 $\times 10^{-7}$ (B) | 6.75 $\times 10^{-7}$ (B) | 5.24 $\times 10^{-7}$ (B) |

| Solar Sequence | f1 | f2 | f3 | f4 | Class |
|---|---|---|---|---|---|
| 0 | 3.65 $\times 10^{-7}$ | 3.92 $\times 10^{-7}$ | 4.09 $\times 10^{-7}$ | 4.04 $\times 10^{-7}$ | C |
| 1 | 3.92 $\times 10^{-7}$ | 4.09 $\times 10^{-7}$ | 4.04 $\times 10^{-7}$ | 3.92 $\times 10^{-7}$ | C |
| 2 | 4.09 $\times 10^{-7}$ | 4.04 $\times 10^{-7}$ | 3.92 $\times 10^{-7}$ | 3.94 $\times 10^{-7}$ | B |
| 3 | 4.04 $\times 10^{-7}$ | 3.92 $\times 10^{-7}$ | 3.94 $\times 10^{-7}$ | 3.84 $\times 10^{-7}$ | B |

**Table 5.** Setup of the SeMiner experiments. Test ID is the number of the performed experiment; n (current window size), d (window step), s (jump) and f (size of the future window) are measured in numbers of observations; $\Delta t$ (period of time) is measured in minutes; $\left|R\right|$ is the number of generated sequences.

| Test ID | Test and Training Set Division Method | Class Definition of Solar Event | $\Delta t$ | n | s | f | d | #Observations | Data Period (Days) | $\left|R\right|$ |
|---|---|---|---|---|---|---|---|---|---|---|
| 1 | Cross-validation: 10 folds | C, M and X | 5 | 288 | 288 | 288 | 1 | 2880 | 10 | 2016 |
| 2 | Cross-validation: 10 folds | C, M and X | 5 | 288 | 288 | 288 | 1 | 8640 | 30 | 7776 |
| 3 | Cross-validation: 10 folds | C, M and X | 5 | 288 | 288 | 288 | 1 | 17,280 | 60 | 16,416 |
| 4 | Percentage split: 67% for training data | C, M and X | 5 | 288 | 288 | 288 | 1 | 105,120 | 365 | 104,256 |
| 5 | Percentage split: 67% for training data | C, M and X | 5 | 288 | 288 | 288 | 1 | 210,240 | 730 | 209,376 |

All tests used 1 year of data (2014) for training, 6 months (2015) for testing, and the classifiers J48, IBK, NaiveBayes, OneR, and SVM (Weka-SMO-Polykernel).

| Phase.Test ID | Feature Selection Method | Current Window | Jump | Future Window |
|---|---|---|---|---|
| 1.1 | Not used | 1 | 1 | 1 |
| 2.2 | StarMiner | 1 | 1 | 1 |
| 3.3 | Relief Attribute Evaluation | 1 | 1 | 1 |
| 1.4 | Not used | 2 | 1 | 1 |
| 2.5 | StarMiner | 2 | 1 | 1 |
| 3.6 | Relief Attribute Evaluation | 2 | 1 | 1 |

| Configuration Attribute | Description | Possible Values |
|---|---|---|
| Feature Selection | The feature selection method used. | Not used; StarMiner; Relief Attribute Evaluation |
| Current window | The number of days of the window used to compose the sequence that will be labeled with the "future" class. | Integer value |
| Jump | The number of observations that will be considered to build the next sequence. | Integer value |
| Future window | The number of days of the window used to look for the "future" class of the previously defined "Current window". | Integer value |
| Data Interval/Training | The interval of the data used during the training phase of the classification method used in the experiments. | 1 year (2014) |
| Data Interval/Testing | The interval of the data used during the testing phase of the experiments. | 6 months (2015) |
| Classification Method | The data mining classification method used during the validation of the proposed forecasting method. | J48, IBK, NaiveBayes, OneR, SVM (Weka-SMO-Polykernel) |

| SeMiner Implementation | Data Volume | Current Window | Jump | Future Window | Step | Mean Execution Time (s) | Size of Output File |
|---|---|---|---|---|---|---|---|
| Pure C | 3 years (2013–2015) | 288 | 288 | 288 | 1 | 208.5 | 1.09 GB |
| With CUDA | 3 years (2013–2015) | 288 | 288 | 288 | 1 | 47.8 | 1.09 GB |

© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Díscola Junior, S.L.; Cecatto, J.R.; Merino Fernandes, M.; Xavier Ribeiro, M.
SeMiner: A Flexible Sequence Miner Method to Forecast Solar Time Series. *Information* **2018**, *9*, 8.
https://doi.org/10.3390/info9010008
