Multisite Long-Term Photovoltaic Forecasting Model Based on VACI
Abstract
1. Introduction
- Proposing an improved model architecture—variable–adaptive channel-independent architecture, which combines the advantages of increased training data volume and high robustness brought by channel independence and the benefits of enhanced representation performance from channel dependence;
- Presenting a specific scheme for the variable–adaptive channel-independent architecture by introducing a deep tree-structured multi-scale gated component into the backbone of the forecasting model, and subsequently demonstrating its effectiveness through ablation experiments;
- Developing a forecasting model called DTMGNet based on the deep tree-structured multi-scale gated component, which outperforms existing advanced methods on the solar energy benchmark dataset, achieving SOTA performance.
2. Related Work
2.1. PV Power Forecasting Task
2.2. Long-Term Series Forecasting (LTSF)
2.3. Multivariate Time Series Forecasting (MTSF)
3. Preliminaries
4. Proposed Method
4.1. Variable–Adaptive Channel-Independent Architecture (VACI)
4.2. DTMGNet
4.2.1. Instance Normalization
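The StandardScaler/InvertedScaler steps in Algorithms 1 and 2 correspond to per-instance normalization: each input window is standardized along the time axis, and the saved statistics are used to restore the scale of the prediction. A minimal NumPy sketch of this idea (the function names `instance_norm` and `inverted_scaler` are illustrative, not the authors' code):

```python
import numpy as np

def instance_norm(x, eps=1e-5):
    """Standardize each channel of one lookback window along time.
    x: array of shape (C, L). Returns the normalized window and the
    statistics needed to invert the transform later."""
    mean = x.mean(axis=-1, keepdims=True)
    std = x.std(axis=-1, keepdims=True)
    return (x - mean) / (std + eps), (mean, std)

def inverted_scaler(y, stats, eps=1e-5):
    """Map a prediction y of shape (C, O) back to the original scale."""
    mean, std = stats
    return y * (std + eps) + mean

x = np.array([[1.0, 2.0, 3.0, 4.0],
              [10.0, 10.0, 10.0, 10.0]])
xn, stats = instance_norm(x)
x_back = inverted_scaler(xn, stats)
print(np.allclose(x_back, x))  # True
```

The `eps` term guards against constant channels (zero standard deviation), such as a PV site producing no power overnight.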
4.2.2. Backbone
4.2.3. DTM Block
Algorithm 1: DTM block

Require: input lookback time series X ∈ R^(L×C); input length L; number of channels C; embedding dimension E; number of blocks N.

01: X_i = StandardScaler(X^T)  {X_i ∈ R^(C×L)}
02: ▷ Tree-structured multi-scale processing
    for i in {1, ..., N}:
03:   XR_i = Reshape(X_i)  {XR_i ∈ R^(C×j×L/j), j = 2^i}
      X1_i = XR_i[:, 1::2, :]  {X1_i ∈ R^(C×j×L/2j)}
      X2_i = XR_i[:, ::2, :]   {X2_i ∈ R^(C×j×L/2j)}
04:   ▷ Two branch blocks are adopted:
05:   for XB_i in {X1_i, X2_i}:
06:     ▷ LayerNorm is adopted to reduce attribute discrepancies.
07:     XN_i = LayerNorm(XB_i)  {XN_i ∈ R^(C×j×L/2j)}
08:     ▷ The MLP performs a nonlinear transformation along the temporal dimension.
09:     XF_i = MLP(XN_i)  {XF_i ∈ R^(C×j×E)}
10:     ▷ Dropout is applied to the output stream.
11:     XD_i = Dropout(XF_i)  {XD_i ∈ R^(C×j×E)}
12:     ▷ An internal gate mechanism is added.
13:     XG_i = G(Linear(XD_i), MLP(XD_i))  {XG_i ∈ R^(C×j×L/2j)}
14:   end for
15: Output: O = Reshape(Concat(XG1_i, XG2_i))  {O ∈ R^(C×L)}
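The core reshape-and-split step of the DTM block (lines 02–03 above) can be sketched in NumPy. This is an illustration under stated assumptions, not the authors' implementation: `dtm_split` is a hypothetical helper, and the interleaved slicing along the group axis here yields arrays of shape (C, j/2, L/j), which have the same element count as the R^(C×j×L/2j) tensors written in the algorithm.

```python
import numpy as np

def dtm_split(x, level):
    """One level of the tree-structured multi-scale split.
    x: array of shape (C, L); level i partitions time into j = 2**i groups,
    then separates the even- and odd-indexed groups."""
    C, L = x.shape
    j = 2 ** level
    xr = x.reshape(C, j, L // j)   # group the time axis: (C, j, L/j)
    x_odd = xr[:, 1::2, :]         # odd-indexed groups
    x_even = xr[:, ::2, :]         # even-indexed groups
    return x_even, x_odd

# Toy input: C = 2 channels, L = 16 time steps.
x = np.arange(2 * 16, dtype=float).reshape(2, 16)
even, odd = dtm_split(x, level=2)  # j = 4 groups of length 4
print(even.shape, odd.shape)       # (2, 2, 4) (2, 2, 4)
```

Each deeper level doubles j, so the two branches see the series at progressively finer temporal granularity before the LayerNorm/MLP/gating stack is applied.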
4.2.4. Gating Mechanism
4.2.5. Feed-Forward Layer
Algorithm 2: DTMGNet architecture

Require: input lookback time series X ∈ R^(L×C); input length L; number of channels C; prediction length O; embedding dimension E; number of blocks N.

01: X_1 = StandardScaler(X^T)  {X_1 ∈ R^(C×L)}
02: for i in {1, ..., N}:  ▷ Run through the backbone.
03:   C_i = DTMBlock(X_i)  {C_i ∈ R^(C×L)}
04:   ▷ A gate mechanism is added to the input stream.
05:   XG_i = G(X_i, C_i)  {XG_i ∈ R^(C×L)}
06:   ▷ The feed-forward layer performs a nonlinear transformation.
07:   XF_i = FeedForward(XG_i)  {XF_i ∈ R^(C×L)}
08:   ▷ Subtract the previously learned output.
09:   X_(i+1) = XF_i − X_i  {X_(i+1) ∈ R^(C×L)}
10: end for
11: ▷ A linear layer is adopted as the prediction head.
12: O_N = Linear(X_(N+1))  {O_N ∈ R^(C×O)}
13: Output: O = InvertedScaler(O_N^T)  {final prediction O ∈ R^(O×C)}
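The backbone loop above (gate, feed-forward, residual subtraction) can be sketched as follows. This is a shape-level illustration only: `dtm_block`, `gate`, and `feed_forward` are simplified stand-ins for the paper's components, and the sigmoid-modulated gate is an assumed form of G, not necessarily the authors' exact gating mechanism.

```python
import numpy as np

rng = np.random.default_rng(0)

def gate(a, b):
    # Assumed gating G(a, b): sigmoid of one stream modulates the other.
    return (1.0 / (1.0 + np.exp(-a))) * b

def dtm_block(x):
    # Placeholder for the DTM block; any shape-preserving transform works here.
    return np.tanh(x)

def feed_forward(x, w):
    # Simple shape-preserving nonlinear transform (ReLU after a linear map).
    return np.maximum(0.0, x @ w)

C, L, N = 3, 8, 2                    # channels, lookback length, block count
x = rng.standard_normal((C, L))      # X_1 ∈ R^(C×L) after instance normalization
w = rng.standard_normal((L, L)) * 0.1

for _ in range(N):
    c = dtm_block(x)                 # C_i = DTMBlock(X_i)
    xg = gate(x, c)                  # XG_i = G(X_i, C_i)
    xf = feed_forward(xg, w)         # XF_i = FeedForward(XG_i)
    x = xf - x                       # X_(i+1) = XF_i - X_i
print(x.shape)  # (3, 8)
```

Because every step preserves the (C, L) shape, blocks can be stacked to arbitrary depth, which is what the deep stacking experiments in Section 5.4 vary.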
5. Experimental Validation and Analysis
5.1. Experimental Setup
5.2. Experimental Results
5.3. Ablation Study and Analysis
5.4. Deep Stacking Experiments and Analysis
6. Conclusions
7. Limitations and Future Directions
- Optimizing the model processing architecture to enhance computational efficiency;
- Exploring the application of data augmentation techniques in multivariate and long-term prediction by mining time series data characteristics;
- Integrating online learning mechanisms by collecting the latest time series data to continuously optimize the model in real-time.
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- BP. Statistical Review of World Energy 2021; BP: London, UK, 2021. Available online: https://www.bp.com/content/dam/bp/business-sites/en/global/corporate/pdfs/energy-economics/statistical-review/bp-stats-review-2021-full-report.pdf (accessed on 1 March 2023).
- Halder, P.; Paul, N.; Joardder, M.U.; Sarker, M. Energy scarcity and potential of renewable energy in Bangladesh. Renew. Sustain. Energy Rev. 2015, 51, 1636–1649. [Google Scholar] [CrossRef]
- Gielen, D.; Gorini, R.; Leme, R.; Prakash, G.; Wagner, N.; Janeiro, L.; Collins, S.; Kadir, M.; Asmelash, E.; Ferroukhi, R. World Energy Transitions Outlook: 1.5 °C Pathway; International Renewable Energy Agency: Abu Dhabi, United Arab Emirates, 2021; Available online: https://www.irena.org/publications/2021/Jun/World-Energy-Transitions-Outlook (accessed on 1 March 2023).
- Attanayake, K.; Wickramage, I.; Samarasinghe, U.; Ranmini, Y.; Ehalapitiya, S.; Jayathilaka, R.; Yapa, S. Renewable energy as a solution to climate change: Insights from a comprehensive study across nations. PLoS ONE 2024, 19, e0299807. [Google Scholar] [CrossRef] [PubMed]
- Obaideen, K.; Olabi, A.G.; Al Swailmeen, Y.; Shehata, N.; Abdelkareem, M.A.; Alami, A.H.; Rodriguez, C.; Sayed, E.T. Solar energy: Applications, trends analysis, bibliometric analysis and research contribution to sustainable development goals (SDGs). Sustainability 2023, 15, 1418. [Google Scholar] [CrossRef]
- Mellit, A.; Kalogirou, S. Artificial intelligence and internet of things to improve efficacy of diagnosis and remote sensing of solar photovoltaic systems: Challenges, recommendations and future directions. Renew. Sustain. Energy Rev. 2021, 143, 110889. [Google Scholar] [CrossRef]
- Ekström, J.; Koivisto, M.; Millar, J.; Mellin, I.; Lehtonen, M. A statistical approach for hourly photovoltaic power generation modeling with generation locations without measured data. Sol. Energy 2016, 132, 173–187. [Google Scholar] [CrossRef]
- Piccolo, D. A distance measure for classifying ARIMA models. J. Time Ser. Anal. 1990, 11, 153–164. [Google Scholar] [CrossRef]
- Gardner, E.S., Jr. Exponential smoothing: The state of the art. J. Forecast. 1985, 4, 1–28. [Google Scholar] [CrossRef]
- Li, X.; Wang, K.; Wang, W.; Li, Y. A multiple object tracking method using Kalman filter. In Proceedings of the 2010 IEEE International Conference on Information and Automation, Harbin, China, 20–23 June 2010; pp. 1862–1866. [Google Scholar]
- Liang, D.; Zhang, H.; Yuan, D.; Zhang, B.; Zhang, M. Minusformer: Improving Time Series Forecasting by Progressively Learning Residuals. arXiv 2024, arXiv:2402.02332. [Google Scholar]
- Zsiborács, H.; Pintér, G.; Vincze, A.; Birkner, Z.; Baranyai, N.H. Grid balancing challenges illustrated by two European examples: Interactions of electric grids, photovoltaic power generation, energy storage and power generation forecasting. Energy Rep. 2021, 7, 3805–3818. [Google Scholar] [CrossRef]
- Faraji, J.; Hashemi-Dezaki, H.; Ketabi, A. Multi-year load growth-based optimal planning of grid-connected microgrid considering long-term load demand forecasting: A case study of Tehran, Iran. Sustain. Energy Technol. Assess. 2020, 42, 100827. [Google Scholar] [CrossRef]
- Zhou, H.; Zhang, S.; Peng, J.; Zhang, S.; Li, J.; Xiong, H.; Zhang, W. Informer: Beyond efficient transformer for long sequence time-series forecasting. Proc. AAAI Conf. Artif. Intell. 2021, 35, 11106–11115. [Google Scholar] [CrossRef]
- Han, L.; Ye, H.-J.; Zhan, D.-C. The capacity and robustness trade-off: Revisiting the channel independent strategy for multivariate time series forecasting. IEEE Trans. Knowl. Data Eng. 2024, 1–14. [Google Scholar] [CrossRef]
- Cai, W.; Liang, Y.; Liu, X.; Feng, J.; Wu, Y. Msgnet: Learning multi-scale inter-series correlations for multivariate time series forecasting. Proc. AAAI Conf. Artif. Intell. 2024, 38, 11141–11149. [Google Scholar] [CrossRef]
- Mohamad Radzi, P.N.L.; Akhter, M.N.; Mekhilef, S.; Mohamed Shah, N. Review on the application of photovoltaic forecasting using machine learning for very short- to long-term forecasting. Sustainability 2023, 15, 2942. [Google Scholar] [CrossRef]
- Wu, Y.-K.; Huang, C.-L.; Phan, Q.-T.; Li, Y.-Y. Completed review of various solar power forecasting techniques considering different viewpoints. Energies 2022, 15, 3320. [Google Scholar] [CrossRef]
- Niccolai, A.; Dolara, A.; Ogliari, E. Hybrid PV power forecasting methods: A comparison of different approaches. Energies 2021, 14, 451. [Google Scholar] [CrossRef]
- Chapman, L.; Thornes, J.E. The use of geographical information systems in climatology and meteorology. Prog. Phys. Geogr. 2003, 27, 313–330. [Google Scholar] [CrossRef]
- Lin, C.; Mao, X.; Qiu, C.; Zou, L. DTCNet: Transformer-CNN Distillation for Super-Resolution of Remote Sensing Image. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2024, 17, 11117–11133. [Google Scholar] [CrossRef]
- Gaboitaolelwe, J.; Zungeru, A.M.; Yahya, A.; Lebekwe, C.K.; Vinod, D.N.; Salau, A.O. Machine learning based solar photovoltaic power forecasting: A review and comparison. IEEE Access 2023, 11, 40820–40845. [Google Scholar] [CrossRef]
- Holt, C.A.; Shobe, W.M. Reprint of: Price and quantity collars for stabilizing emission allowance prices: Laboratory experiments on the EU ETS market stability reserve. J. Environ. Econ. Manag. 2016, 80, 69–86. [Google Scholar] [CrossRef]
- Zang, H.; Cheng, L.; Ding, T.; Cheung, K.W.; Liang, Z.; Wei, Z.; Sun, G. Hybrid method for short-term photovoltaic power forecasting based on deep convolutional neural network. IET Gener. Transm. Distrib. 2018, 12, 4557–4567. [Google Scholar] [CrossRef]
- Campos, F.D.; Sousa, T.C.; Barbosa, R.S. Short-Term Forecast of Photovoltaic Solar Energy Production Using LSTM. Energies 2024, 17, 2582. [Google Scholar] [CrossRef]
- Li, R.; Wang, M.; Li, X.; Qu, J.; Dong, Y. Short-term photovoltaic prediction based on CNN-GRU optimized by improved similar day extraction, decomposition noise reduction and SSA optimization. IET Renew. Power Gener. 2024, 18, 908–928. [Google Scholar] [CrossRef]
- Lai, G.; Chang, W.-C.; Yang, Y.; Liu, H. Modeling long-and short-term temporal patterns with deep neural networks. In Proceedings of the 41st International ACM SIGIR Conference on Research & Development in Information Retrieval, Ann Arbor, MI, USA, 8–12 July 2018; pp. 95–104. [Google Scholar]
- Wu, H.; Xu, J.; Wang, J.; Long, M. Autoformer: Decomposition transformers with auto-correlation for long-term series forecasting. Adv. Neural Inf. Process. Syst. 2021, 34, 22419–22430. [Google Scholar]
- Zhou, T.; Ma, Z.; Wen, Q.; Wang, X.; Sun, L.; Jin, R. Fedformer: Frequency enhanced decomposed transformer for long-term series forecasting. In Proceedings of the International Conference on Machine Learning, Baltimore, MD, USA, 17–23 July 2022; pp. 27268–27286. [Google Scholar]
- Nie, Y.; Nguyen, N.H.; Sinthong, P.; Kalagnanam, J. A time series is worth 64 words: Long-term forecasting with transformers. arXiv 2022, arXiv:2211.14730. [Google Scholar]
- Lin, S.; Lin, W.; Wu, W.; Wang, S.; Wang, Y. Petformer: Long-term time series forecasting via placeholder-enhanced transformer. arXiv 2023, arXiv:2308.04791. [Google Scholar]
- Liu, Y.; Hu, T.; Zhang, H.; Wu, H.; Wang, S.; Ma, L.; Long, M. Itransformer: Inverted transformers are effective for time series forecasting. arXiv 2023, arXiv:2310.06625. [Google Scholar]
- Lira, H.; Martí, L.; Sanchez-Pi, N. A graph neural network with spatio-temporal attention for multi-sources time series data: An application to frost forecast. Sensors 2022, 22, 1486. [Google Scholar] [CrossRef] [PubMed]
- Liu, M.; Zeng, A.; Chen, M.; Xu, Z.; Lai, Q.; Ma, L.; Xu, Q. Scinet: Time series modeling and forecasting with sample convolution and interaction. Adv. Neural Inf. Process. Syst. 2022, 35, 5816–5828. [Google Scholar]
- Wu, H.; Hu, T.; Liu, Y.; Zhou, H.; Wang, J.; Long, M. Timesnet: Temporal 2D-variation modeling for general time series analysis. arXiv 2022, arXiv:2210.02186. [Google Scholar]
- Lin, S.; Lin, W.; Wu, W.; Zhao, F.; Mo, R.; Zhang, H. Segrnn: Segment recurrent neural network for long-term time series forecasting. arXiv 2023, arXiv:2308.11200. [Google Scholar]
- Huang, Q.; Shen, L.; Zhang, R.; Ding, S.; Wang, B.; Zhou, Z.; Wang, Y. Crossgnn: Confronting noisy multivariate time series via cross interaction refinement. Adv. Neural Inf. Process. Syst. 2023, 36, 46885–46902. [Google Scholar]
- Shao, Z.; Zhang, Z.; Wang, F.; Wei, W.; Xu, Y. Spatial-temporal identity: A simple yet effective baseline for multivariate time series forecasting. In Proceedings of the 31st ACM International Conference on Information & Knowledge Management, Atlanta, GA, USA, 17–21 October 2022; pp. 4454–4458. [Google Scholar]
- Zeng, A.; Chen, M.; Zhang, L.; Xu, Q. Are transformers effective for time series forecasting? Proc. AAAI Conf. Artif. Intell. 2023, 37, 11121–11128. [Google Scholar] [CrossRef]
- Yi, K.; Zhang, Q.; Fan, W.; Wang, S.; Wang, P.; He, H.; An, N.; Lian, D.; Cao, L.; Niu, Z. Frequency-domain MLPs are more effective learners in time series forecasting. Adv. Neural Inf. Process. Syst. 2023, 36, 76656–76679. [Google Scholar]
- Das, A.; Kong, W.; Leach, A.; Mathur, S.; Sen, R.; Yu, R. Long-term forecasting with tide: Time-series dense encoder. arXiv 2023, arXiv:2304.08424. [Google Scholar]
- Ekambaram, V.; Jati, A.; Nguyen, N.; Sinthong, P.; Kalagnanam, J. Tsmixer: Lightweight mlp-mixer model for multivariate time series forecasting. In Proceedings of the 29th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, Long Beach, CA, USA, 6–10 August 2023; pp. 459–469. [Google Scholar]
- Xu, Z.; Zeng, A.; Xu, Q. FITS: Modeling Time Series with 10k Parameters. arXiv 2023, arXiv:2307.03756. [Google Scholar]
- Wang, S.; Wu, H.; Shi, X.; Hu, T.; Luo, H.; Ma, L.; Zhang, J.Y.; Zhou, J. Timemixer: Decomposable multiscale mixing for time series forecasting. arXiv 2024, arXiv:2405.14616. [Google Scholar]
- Zhao, L.; Shen, Y. Rethinking Channel Dependence for Multivariate Time Series Forecasting: Learning from Leading Indicators. arXiv 2024, arXiv:2401.17548. [Google Scholar]
- He, K.; Zhang, X.; Ren, S.; Sun, J. Deep residual learning for image recognition. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016; pp. 770–778. [Google Scholar]
- Lin, S.; Lin, W.; Wu, W.; Chen, H.; Yang, J. SparseTSF: Modeling Long-term Time Series Forecasting with 1k Parameters. arXiv 2024, arXiv:2405.00946. [Google Scholar]
- Zhang, J.; Wang, J.; Qiang, W.; Xu, F.; Zheng, C.; Sun, F.; Xiong, H. Intriguing Properties of Positional Encoding in Time Series Forecasting. arXiv 2024, arXiv:2404.10337. [Google Scholar]
- Ni, R.; Lin, Z.; Wang, S.; Fanti, G. Mixture-of-Linear-Experts for Long-term Time Series Forecasting. In Proceedings of the International Conference on Artificial Intelligence and Statistics, Valencia, Spain, 2–4 May 2024; pp. 4672–4680. [Google Scholar]
- Goerg, G. Forecastable component analysis. In Proceedings of the International Conference on Machine Learning, Atlanta, GA, USA, 17–19 June 2013; pp. 64–72. [Google Scholar]
- Paszke, A.; Gross, S.; Massa, F.; Lerer, A.; Bradbury, J.; Chanan, G.; Killeen, T.; Lin, Z.; Gimelshein, N.; Antiga, L. Pytorch: An imperative style, high-performance deep learning library. Adv. Neural Inf. Process. Syst. 2019, 32, 8026–8037. [Google Scholar]
- Kingma, D.P.; Ba, J. Adam: A method for stochastic optimization. arXiv 2014, arXiv:1412.6980. [Google Scholar]
- Zhang, Y.; Yan, J. Crossformer: Transformer utilizing cross-dimension dependency for multivariate time series forecasting. In Proceedings of the Eleventh International Conference on Learning Representations, Kigali, Rwanda, 1–5 May 2023. [Google Scholar]
- Wang, H.; Peng, J.; Huang, F.; Wang, J.; Chen, J.; Xiao, Y. Micn: Multi-scale local and global context modeling for long-term series forecasting. In Proceedings of the Eleventh International Conference on Learning Representations, Virtual Event, 25–29 April 2022. [Google Scholar]
- Zhou, T.; Ma, Z.; Wen, Q.; Sun, L.; Yao, T.; Yin, W.; Jin, R. Film: Frequency improved legendre memory model for long-term time series forecasting. Adv. Neural Inf. Process. Syst. 2022, 35, 12677–12690. [Google Scholar]
- Liu, Y.; Wu, H.; Wang, J.; Long, M. Non-stationary transformers: Exploring the stationarity in time series forecasting. Adv. Neural Inf. Process. Syst. 2022, 35, 9881–9893. [Google Scholar]
| Dataset | Dim | Dataset Size (Train, Val, Test) | Frequency | Forecastability |
|---|---|---|---|---|
| Solar Energy | 137 | (36601, 5161, 10417) | 10 min | 0.33 |
| Model | MSE (96) | MAE (96) | MSE (192) | MAE (192) | MSE (336) | MAE (336) | MSE (720) | MAE (720) | MSE (Avg) | MAE (Avg) | 1st Count |
|---|---|---|---|---|---|---|---|---|---|---|---|
| DTMGNet | 0.167 | 0.215 | 0.194 | 0.241 | 0.197 | 0.247 | 0.199 | 0.250 | 0.189 | 0.238 | 8 |
| TimeMixer | 0.167 | 0.220 | 0.187 | 0.249 | 0.200 | 0.258 | 0.215 | 0.250 | 0.192 | 0.244 | 3 |
| iTransformer | 0.193 | 0.243 | 0.217 | 0.269 | 0.227 | 0.280 | 0.239 | 0.298 | 0.219 | 0.272 | 0 |
| PatchTST | 0.224 | 0.278 | 0.253 | 0.298 | 0.273 | 0.306 | 0.272 | 0.308 | 0.256 | 0.298 | 0 |
| TimesNet | 0.219 | 0.314 | 0.231 | 0.322 | 0.246 | 0.337 | 0.280 | 0.363 | 0.244 | 0.334 | 0 |
| Crossformer | 0.181 | 0.240 | 0.196 | 0.252 | 0.216 | 0.243 | 0.220 | 0.256 | 0.204 | 0.248 | 1 |
| MICN | 0.188 | 0.252 | 0.215 | 0.280 | 0.222 | 0.267 | 0.226 | 0.264 | 0.213 | 0.266 | 0 |
| FiLM | 0.320 | 0.339 | 0.360 | 0.362 | 0.398 | 0.375 | 0.399 | 0.368 | 0.369 | 0.361 | 0 |
| DLinear | 0.289 | 0.377 | 0.319 | 0.397 | 0.352 | 0.415 | 0.356 | 0.412 | 0.329 | 0.400 | 0 |
| FEDformer | 0.201 | 0.304 | 0.237 | 0.337 | 0.254 | 0.362 | 0.280 | 0.397 | 0.243 | 0.350 | 0 |
| Stationary | 0.321 | 0.380 | 0.346 | 0.369 | 0.357 | 0.387 | 0.335 | 0.384 | 0.340 | 0.380 | 0 |
| Autoformer | 0.456 | 0.446 | 0.588 | 0.561 | 0.595 | 0.588 | 0.733 | 0.633 | 0.593 | 0.557 | 0 |
| Informer | 0.200 | 0.247 | 0.220 | 0.251 | 0.260 | 0.287 | 0.244 | 0.301 | 0.231 | 0.272 | 0 |
| Model | MSE (96) | MAE (96) | MSE (192) | MAE (192) | MSE (336) | MAE (336) | MSE (720) | MAE (720) | MSE (Avg) | MAE (Avg) |
|---|---|---|---|---|---|---|---|---|---|---|
| DTMGNet | 0.167 | 0.215 | 0.194 | 0.241 | 0.197 | 0.247 | 0.199 | 0.250 | 0.189 | 0.238 |
| Normal Gate | 0.181 | 0.227 | 0.197 | 0.245 | 0.203 | 0.252 | 0.202 | 0.252 | 0.196 | 0.244 |
| Without Adaption | 0.180 | 0.231 | 0.193 | 0.246 | 0.200 | 0.252 | 0.200 | 0.253 | 0.193 | 0.245 |
| Depth | MSE (96) | MAE (96) | MSE (192) | MAE (192) | MSE (336) | MAE (336) | MSE (720) | MAE (720) | MSE (Avg) | MAE (Avg) |
|---|---|---|---|---|---|---|---|---|---|---|
| 1 | 0.184 | 0.233 | 0.196 | 0.245 | 0.203 | 0.251 | 0.206 | 0.254 | 0.197 | 0.246 |
| 2 | 0.175 | 0.223 | 0.195 | 0.243 | 0.201 | 0.250 | 0.201 | 0.251 | 0.193 | 0.242 |
| 3 | 0.173 | 0.218 | 0.195 | 0.242 | 0.199 | 0.248 | 0.199 | 0.250 | 0.191 | 0.240 |
| 4 | 0.167 | 0.215 | 0.194 | 0.241 | 0.197 | 0.247 | 0.199 | 0.250 | 0.189 | 0.238 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Feng, S.; Chen, R.; Huang, M.; Wu, Y.; Liu, H. Multisite Long-Term Photovoltaic Forecasting Model Based on VACI. Electronics 2024, 13, 2806. https://doi.org/10.3390/electronics13142806