Ultra-Short-Term Power Prediction for Distributed Photovoltaics Based on Time-Series LLMs
Abstract
1. Introduction
- (1) A dedicated LLM reprogramming framework. We propose an efficient approach that freezes the pre-trained LLM parameters while redesigning only the input and output adaptation layers. This transforms numerical PV time-series data into embedding-compatible representations for direct processing by the LLM, which greatly reduces training costs, mitigates the need for large labeled datasets, and facilitates effective few-shot learning.
- (2) A Prompt-as-Prefix (PaP) based cross-modal prompting mechanism. We design a PaP strategy to integrate meteorological information, statistical features, and task instructions into textual prompts. These prompts are fused with the reprogrammed time-series embeddings as prefix tokens, allowing the LLM to fully leverage its pre-trained global reasoning capacity to improve forecasting accuracy and stability across multiple PV stations and under varying weather conditions.
- (3) A cross-attention data reconstruction module for numerical-text modality alignment. We propose a module that maps normalized time-series slices into the LLM’s semantic space and aligns them with pre-trained word embeddings via multi-head cross-attention. This approach effectively captures nonlinear dependencies and complex spatio-temporal patterns, enabling accurate ultra-short-term PV predictions without retraining the entire LLM.
2. Materials and Methods
2.1. Problem Formulation and Feature Analysis
2.2. LLM Prediction Methodology
2.2.1. Transformer Model
2.2.2. Pre-Training and Fine-Tuning Paradigm
2.2.3. Adapting LLMs for Time-Series Forecasting
- Embedding-visible Adaptation: This approach leverages the LLM as a powerful feature extractor. It entails designing a specialized input layer to convert numerical time-series data into vector embeddings that the LLM can process. The model is then fine-tuned to recognize temporal patterns within these new “numerical tokens.”
- Text-visible Adaptation: By contrast, this method attempts to convert numerical data directly into natural language descriptions. It then utilizes the LLM’s inherent text comprehension and reasoning capabilities, guided by prompting, to generate forecasts.
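To make the embedding-visible route concrete, here is a minimal sketch assuming a frozen GPT-2 backbone from Hugging Face `transformers`; the class and parameter names are illustrative, not the paper's implementation.

```python
import torch
import torch.nn as nn
from transformers import GPT2Model

class EmbeddingVisibleAdapter(nn.Module):
    """Embedding-visible adaptation: numeric patches become 'numerical
    tokens' via a trainable linear layer, then pass through a frozen LLM."""

    def __init__(self, patch_len: int = 32, d_model: int = 768):
        super().__init__()
        self.backbone = GPT2Model.from_pretrained("gpt2")
        for p in self.backbone.parameters():      # the pre-trained LLM stays frozen
            p.requires_grad = False
        self.input_proj = nn.Linear(patch_len, d_model)  # trainable input layer

    def forward(self, patches: torch.Tensor) -> torch.Tensor:
        # patches: (batch, num_patches, patch_len) -> (batch, num_patches, d_model)
        tokens = self.input_proj(patches)
        # bypass the text tokenizer and feed embeddings directly to the LLM
        return self.backbone(inputs_embeds=tokens).last_hidden_state
```

The text-visible route would instead serialize the same numbers into a natural-language sentence and feed it through the ordinary tokenizer.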

3. The Proposed Solar-LLM Framework
3.1. Overall Structure
3.2. Prompt-as-Prefix Module
- Dataset context: The dataset context includes background information related to distributed PV power prediction, such as the data source, collection frequency, geographic location, and environmental conditions. For example, “This dataset comes from the historical power generation records of a PV power plant with a geographic location of 30° north latitude and 120° east longitude. The frequency of data collection is hourly, and the power generation and corresponding weather conditions, including sunny, cloudy, and rainy days, were recorded for the past year.” This information helps the LLM establish a basic understanding of the data and clarify its physical meaning and practical application scenarios.
- Mission Directive: The mission directive specifies the objectives and requirements of the current prediction task. For example, “Please predict the change in PV power over the next 48 h based on historical power generation data and weather conditions over the past week.” This enables the LLM to adapt to different downstream tasks and ensures that the model is tuned to specific forecasting needs.
- Statistical description: The statistical description covers trends, cyclical characteristics, fluctuation ranges, and time delays in the data. For example: “Daily generation power shows significant daily cyclical variations, with higher power in the morning and afternoon and lower power at midday and in the evening. Power generation over the past month shows significant weekend fluctuations, with higher power on weekdays and lower on weekends.” Textualizing the PV power data according to its characteristics helps the LLM better understand and process the data; a minimal prompt-assembly sketch follows this list.
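The sketch below assembles the three components into one prefix string; the field names, coordinates, and phrasing are illustrative placeholders rather than the paper's exact template.

```python
def build_pap_prompt(latitude: str, longitude: str, horizon_h: int,
                     stats: dict) -> str:
    """Assemble a Prompt-as-Prefix string from dataset context,
    mission directive, and statistical description."""
    dataset_context = (
        f"This dataset comes from historical generation records of a PV plant "
        f"at {latitude} N, {longitude} E, collected hourly with weather labels "
        f"(sunny, cloudy, rainy) over the past year."
    )
    mission_directive = (
        f"Predict the change in PV power over the next {horizon_h} h based on "
        f"historical power generation data and weather conditions."
    )
    statistical_description = (
        f"Trend: {stats['trend']}. Daily cycle: {stats['cycle']}. "
        f"Fluctuation range: {stats['min']:.3f} to {stats['max']:.3f}."
    )
    return " ".join([dataset_context, mission_directive, statistical_description])

# The resulting string is tokenized, and its embeddings are prepended as
# prefix tokens ahead of the reprogrammed time-series embeddings.
prompt = build_pap_prompt("30°", "120°", 48,
                          {"trend": "rising", "cycle": "single daily peak",
                           "min": 0.012, "max": 0.874})
```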

3.3. Data Reconstruction Module
3.3.1. Time-Series Patching
3.3.2. Time-Series Data Reprogramming
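A minimal sketch of the patching and reprogramming steps, assuming a Time-LLM-style [23] prototype mechanism: overlapping patches of the series act as cross-attention queries over a compressed bank of text "prototypes" derived from the frozen word-embedding matrix. Patch length 32, stride 16, 8 heads, and d = 768 follow the parameter table below; `n_proto` and the attribute names are assumptions.

```python
import torch
import torch.nn as nn

class PatchReprogramming(nn.Module):
    """Patching + cross-attention reprogramming: time-series patches attend
    over text-prototype embeddings so the output lands in the LLM's
    word-embedding space."""

    def __init__(self, word_emb: torch.Tensor, patch_len: int = 32,
                 stride: int = 16, d_model: int = 768, n_heads: int = 8,
                 n_proto: int = 1000):
        super().__init__()
        self.patch_len, self.stride = patch_len, stride
        self.register_buffer("word_emb", word_emb)        # frozen (vocab, d_model)
        self.proto_proj = nn.Linear(word_emb.size(0), n_proto)  # vocab -> prototypes
        self.patch_embed = nn.Linear(patch_len, d_model)  # numeric -> semantic space
        self.cross_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, series_len) -> overlapping patches (batch, n_patches, patch_len)
        patches = x.unfold(-1, self.patch_len, self.stride)
        queries = self.patch_embed(patches)
        # compress the vocabulary into a small prototype bank (n_proto, d_model)
        protos = self.proto_proj(self.word_emb.T).T
        kv = protos.unsqueeze(0).expand(queries.size(0), -1, -1)
        aligned, _ = self.cross_attn(queries, kv, kv)     # align with text modality
        return aligned                                    # LLM-compatible embeddings
```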
3.4. Output Projection
- (1) Flattening: This process compresses a multidimensional data structure into a one-dimensional structure to simplify processing by subsequent fully connected or other layers. For distributed photovoltaic power prediction tasks, flattening helps convert complex prediction results into a more manageable form. When the prediction task involves multiple time steps and features, the results must be converted into standard prediction data. If the input data $X$ has dimension $(T, F)$, where $T$ is the number of time steps and $F$ is the number of features at each step, then $T \cdot F$ standard prediction data points need to be output. The flattening process is
$$X_{\mathrm{flat}} = \mathrm{Flatten}(X) \in \mathbb{R}^{T \cdot F},$$
where $X_{\mathrm{flat}}$ is the data after flattening, a one-dimensional array.
- (2) Linear projection: Linear projection uses a weight matrix and a bias vector to map the flattened data to the target feature space through a linear transformation, producing the final predicted result $Y^*$:
$$Y^* = W X_{\mathrm{flat}} + b,$$
where $W \in \mathbb{R}^{T \times (T \cdot F)}$ is the weight matrix and $b$ is the bias vector that adjusts the output value.
- (3) Output result mapping: Through flattening and linear projection, complex prediction results are converted into a standardized output format that is easy to process. The output result is $Y^*$, which represents the model’s prediction of the future $T$-step photovoltaic power generation; a minimal sketch of this output head follows.

3.5. Parameter-Efficient Fine-Tuning Strategy
4. Results and Discussion
4.1. Experimental Setup and Operating Environment
4.2. Parameter Design
4.3. Evaluation Metrics
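For reference, a minimal implementation of the three reported metrics under their standard definitions (the paper's normalization of power values is not reproduced here):

```python
import numpy as np

def evaluate(y_true: np.ndarray, y_pred: np.ndarray) -> dict:
    """MAE, RMSE, and R2 as used in the result tables."""
    err = y_true - y_pred
    mae = float(np.mean(np.abs(err)))
    rmse = float(np.sqrt(np.mean(err ** 2)))
    ss_res = float(np.sum(err ** 2))
    ss_tot = float(np.sum((y_true - np.mean(y_true)) ** 2))
    return {"MAE": mae, "RMSE": rmse, "R2": 1.0 - ss_res / ss_tot}
```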
4.4. Performance Comparison
4.5. Small Sample Learning Ability Analysis
4.6. Ablation Study
4.7. Comparison with Traditional Time-Series Forecasting Models
4.8. Discussion on Solar-LLM Generalizability
5. Conclusions
- (1) Solar-LLM achieves notable reductions in MAE and RMSE, together with higher R², compared with traditional TCN, CNN-BiLSTM, Informer, and GCN-LSTM models, a result of its carefully designed input and output layers. Specifically, in 15 min clear-sky forecasting, Solar-LLM achieves a 43.6% reduction in MAE and a 43.9% reduction in RMSE compared to TCN; compared to Informer, the MAE is reduced by 39.7% and the RMSE by 48.3%. Similarly, in 4 h rainy-day forecasting, Solar-LLM shows a 13.5% reduction in MAE and a 0.7% reduction in RMSE compared to CNN-BiLSTM, and a 6.3% MAE and 9.0% RMSE reduction compared to Informer. These figures indicate the model’s proficiency in PV forecasting tasks and its consistently high performance across forecasting horizons. It also exhibits strong small-sample learning capability, and its broad applicability and superiority in PV forecasting tasks were validated experimentally.
- (2) The prompt-prefix module of Solar-LLM measurably improves prediction performance. The task description in the prompt not only conveys the prediction task to the LLM but also incorporates statistical information about the input data, facilitating the LLM’s reasoning and computation. Ablation experiments show that using prompts reduces MAE across all weather types and prediction horizons, with reductions ranging from 3.03% to 15.91%, further validating the effectiveness of the prompt-prefix module.
- (3) The data reconstruction module in Solar-LLM enhances prediction performance. Its cross-attention mechanism effectively integrates the time-series data with the LLM’s pre-trained word embeddings, giving the model a deeper understanding of the data and stronger feature extraction. The experimental results indicate that data reconstruction reduces MAE across all weather types and forecast horizons, with relative improvements in individual experiments ranging from approximately 1.54% to 15.09% and an average relative improvement of about 7.7% across the nine experimental groups. These quantitative results further validate that integrating spatiotemporal features enhances both the model’s representation of the data and its prediction accuracy.
- (4) Experiments demonstrate that Solar-LLM delivers excellent performance in ultra-short-term PV output forecasting, with high reliability, rapid adaptability, and controllable fine-tuning. This enables timely and accurate prediction of PV output fluctuations, enhancing grid robustness and optimizing dispatch strategies for distributed PV generation. By providing reliable forecasts, Solar-LLM improves the efficiency of electricity market operations and reduces reliance on traditional peaking resources; its forecasting capability lets operators optimize resource allocation and maintain grid stability amid renewable energy fluctuations. Future work will focus on integrating it into broader grid control architectures and exploring its potential in microgrid management and energy storage optimization, contributing to a more resilient and sustainable energy future.
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Sun, R.; Wang, L.; Wang, Y.; Ding, R.; Xu, H.; Wang, J.; Li, Q. Ultra-short-term Forecasting of Photovoltaic Power Generation Output Based on Digital Twin. Power Syst. Technol. 2021, 45, 1258–1264. [Google Scholar]
- Wang, B.; Lyu, Y.; Chen, Z.; Zhao, Q.; Zhang, Z.; Tian, J. Mechanistic-data Hybrid-driven Short-term Power Forecasting of Distributed Photovoltaic Considering Information Time Delay. Autom. Electr. Power Syst. 2022, 46, 67–74. [Google Scholar]
- Wang, X.; Ai, X.; Wang, T. Short-term Prediction of Photovoltaic Power with Small Samples Based on Instance Transfer Learning. Acta Energiae Solaris Sin. 2024, 45, 325–333. [Google Scholar] [CrossRef]
- Golestaneh, F.; Pinson, P.; Gooi, H.B. Very short-term nonparametric probabilistic forecasting of renewable energy generation—With application to solar energy. IEEE Trans. Power Syst. 2016, 31, 3850–3863. [Google Scholar] [CrossRef]
- Li, F.; Wang, L.; Zhao, J.; Zhang, J.; Zhang, S.; Tian, Y. A Short-Term Power Forecasting Method for Distributed Photovoltaic Based on Weather Fusion and LSTM Network. Electr. Power 2022, 55, 149–154. [Google Scholar]
- Wang, H.; Ju, R.; Dong, Y. Interval Forecasting of Distributed Photovoltaic Power Based on Spatiotemporal Correlation Characteristics and B-LSTM Model. Electr. Power 2024, 57, 74–80. [Google Scholar]
- Zhang, K.; Ma, L.; Zhang, T.; Zhong, H.; Tan, W.; Wei, Y.; Lin, Z. A Spatiotemporal Collaborative Probabilistic Forecasting Method for Distributed Photovoltaic Output Based on iDGA-LSTM. Autom. Electr. Power Syst. 2025, 49, 128–138. [Google Scholar]
- Khan, W.; Walker, S.; Zeiler, W. Improved solar photovoltaic energy generation forecast using deep learning-based ensemble stacking approach. Energy 2022, 240, 122812. [Google Scholar] [CrossRef]
- Ye, L.; Chen, Z.; Zhao, Y.; Zhu, Q.W. A Photovoltaic Power Generation Output Forecasting Model Based on Genetic Algorithm-Fuzzy Radial Basis Function Neural Network. Autom. Electr. Power Syst. 2015, 39, 16–22. [Google Scholar]
- Nastić, F.; Jurišević, N.; Nikolić, D.; Končalović, D. Harnessing open data for hourly power generation forecasting in newly commissioned photovoltaic power plants. Energy Sustain. Dev. 2024, 81, 101512. [Google Scholar] [CrossRef]
- Liu, Y.; Zhang, H.; Li, C.; Huang, X.; Wang, J.; Long, M. Timer: Generative pre-trained transformers are large time series models. arXiv 2024, arXiv:2402.02368. [Google Scholar]
- Das, A.; Kong, W.; Sen, R.; Zhou, Y. A decoder-only foundation model for time-series forecasting. arXiv 2024, arXiv:2310.10688v4. [Google Scholar]
- Wu, T.; Ling, Q. STELLM: Spatio-temporal enhanced pre-trained large language model for wind speed forecasting. Appl. Energy 2024, 375, 124034. [Google Scholar] [CrossRef]
- Rasul, K.; Ashok, A.; Williams, A.R.; Khorasani, A.; Adamopoulos, G.; Bhagwatkar, R.; Biloš, M.; Ghonia, H.; Hassen, N.; Schneider, A. Lag-Llama: Towards foundation models for time series forecasting. arXiv 2023, arXiv:2310.08278. [Google Scholar]
- Ma, Z.; Mei, G. A hybrid attention-based deep learning approach for wind power prediction. Appl. Energy 2022, 323, 119608. [Google Scholar] [CrossRef]
- Zheng, R.; Li, G.; Han, B.; Wang, K.; Peng, D. Day-ahead Power Forecasting of Distributed Photovoltaic Power Generation Based on Weighted Extended Daily Feature Matrix. Electr. Power Autom. Equip. 2022, 42, 99–105. [Google Scholar]
- Liu, Y.J.; Chen, Y.L.; Liu, J.Y.; Zhang, X.M.; Wu, X.Y.; Kong, W.Z. Distributed photovoltaic power generation day-ahead prediction based on ensemble learning. China Electr. Power 2022, 55, 38–45. [Google Scholar]
- Vaswani, A.; Shazeer, N.; Parmar, N.; Uszkoreit, J.; Jones, L.; Gomez, A.N.; Kaiser, Ł.; Polosukhin, I. Attention is all you need. arXiv 2017, arXiv:1706.03762. [Google Scholar]
- Yue, Z.; Ye, X.; Liu, R. A Review of Pre-training Technologies Based on Language Models. J. Chin. Inf. Process. 2021, 35, 15–29. [Google Scholar]
- Cai, R.; Ge, J.; Sun, Z.; Hu, B.; Xu, Y.; Sun, Z. Review of the Development of AI Pre-trained Large-scale Models. J. Chin. Comput. Syst. 1–12. Available online: http://kns.cnki.net/kcms/detail/21.1106.tp.20240510.1900.010.html (accessed on 21 August 2024).
- Sun, K.; Luo, X.; Luo, Y. Review on the Applications of Pre-trained Language Models. Comput. Sci. 2023, 50, 176–184. [Google Scholar]
- Zhang, H.; Yan, J.; Liu, Y.; Gao, Y.; Han, S.; Li, L. Multi-source and temporal attention network for probabilistic wind power prediction. IEEE Trans. Sustain. Energy 2021, 12, 2205–2218. [Google Scholar] [CrossRef]
- Wang, S.; Ma, L.; Chu, Z.; Zhang, J.Y.; Shi, X.; Chen, P.-Y.; Liang, Y.; Li, Y.-F.; Pan, S.; Wen, Q. Time-llm: Time series forecasting by reprogramming large language models. arXiv 2023, arXiv:2310.01728. [Google Scholar]
- Wu, H.; Meng, K.; Fan, D.; Zhang, Z.; Liu, Q. Multistep short-term wind speed forecasting using transformer. Energy 2022, 261, 125231. [Google Scholar] [CrossRef]
- Liu, X.; Zhou, J. Short-term wind power forecasting based on multivariate/multi-step LSTM with temporal feature attention mechanism. Appl. Soft Comput. 2024, 150, 111050. [Google Scholar] [CrossRef]
- Xue, H.; Salim, F.D. Utilizing language models for energy load forecasting. In Proceedings of the 10th ACM International Conference on Systems for Energy-Efficient Buildings, Cities, and Transportation, Istanbul, Turkey, 15–16 November 2023; pp. 224–227. [Google Scholar]
- Ilharco, G.; Wortsman, M.; Gadre, S.Y.; Song, S.; Hajishirzi, H.; Kornblith, S.; Farhadi, A.; Schmidt, L. Patching open-vocabulary models by interpolating weights. Adv. Neural Inf. Process. Syst. 2022, 35, 29262–29277. [Google Scholar]
- Asudani, D.S.; Nagwani, N.K.; Singh, P. Impact of word embedding models on text analytics in deep learning environment: A review. Artif. Intell. Rev. 2023, 56, 10345–10425. [Google Scholar] [CrossRef] [PubMed]
- Yao, T.; Wang, J.; Wu, H.; Zhang, P.; Li, S.; Wang, Y.; Chi, X.; Shi, M. A photovoltaic power output dataset: Multi-source photovoltaic power output dataset with Python toolkit. Sol. Energy 2021, 230, 122–130. [Google Scholar] [CrossRef]
- Tang, Y.G.; Yang, K.; Zhang, S.J.; Zhang, Z. Photovoltaic power forecasting: A hybrid deep learning model incorporating transfer learning strategy. Renew. Sustain. Energy Rev. 2022, 162, 112473. [Google Scholar] [CrossRef]
- Zhao, H.; Zhu, D.; Yang, Y.; Li, Q.; Zhang, E. Study on photovoltaic power forecasting model based on peak sunshine hours and sunshine duration. Energy Sci. Eng. 2023, 11, 4570–4580. [Google Scholar] [CrossRef]
- Wang, F.; Mi, Z.Q.; Zhen, Z.; Yang, G.; Zhou, H.M. A classification prediction method for photovoltaic power plant generation power based on weather state pattern recognition. Chin. J. Electr. Eng. 2013, 33, 75–82. [Google Scholar]
- Yang, J.H.; Zhang, L.; Liang, F.Z.; Yang, Y.J.; Zhang, W. A Prediction Method for Surface Solar Radiation Based on Secondary Weather Classification. Distrib. Energy 2024, 9, 54–63. [Google Scholar]
- Zhou, X.; Pang, C.; Zeng, X.; Jiang, L.; Chen, Y. A short-term power prediction method based on temporal convolutional network in virtual power plant photovoltaic system. IEEE Trans. Instrum. Meas. 2023, 72, 1–10. [Google Scholar] [CrossRef]
- Tang, C.; Zhang, Y.; Wu, F.; Tang, Z. An improved cnn-bilstm model for power load prediction in uncertain power systems. Energies 2024, 17, 2312. [Google Scholar] [CrossRef]
- Phan, Q.T.; Wu, Y.K.; Phan, Q.D. An innovative hybrid model combining informer and K-Means clustering methods for invisible multisite solar power estimation. IET Renew. Power Gener. 2024, 18, 4318–4333. [Google Scholar] [CrossRef]
- Chen, H.; Zhu, M.; Hu, X.; Wang, J.; Sun, Y.; Yang, J. Research on short-term load forecasting of new-type power system based on GCN-LSTM considering multiple influencing factors. Energy Rep. 2023, 9, 1022–1031. [Google Scholar] [CrossRef]
- Vaz, A.G.R.; Elsinga, B.; van Sark, W.G.J.H.M.; Brito, M. An artificial neural network to assess the impact of neighbouring photovoltaic systems in power forecasting in Utrecht, the Netherlands. Renew. Energy 2016, 85, 631–641. [Google Scholar] [CrossRef]
- Gensler, A.; Henze, J.; Sick, B.; Raabe, N. Deep Learning for solar power forecasting-An approach using AutoEncoder and LSTM Neural Networks. In Proceedings of the 2016 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Budapest, Hungary, 9–12 October 2016; pp. 2858–2865. [Google Scholar]
- Bouzerdoum, M.; Mellit, A.; Massi Pavan, A. A hybrid model (SARIMA-SVM) for short-term power forecasting of a small-scale grid-connected photovoltaic plant. Sol. Energy 2013, 98, 226–235. [Google Scholar] [CrossRef]

| Models | Reference | Architectural Paradigm | Training Requirement | Spatio-Temporal Correlation | Small-Sample Learning | Long-Term Dependencies | Cross-Modal Fusion |
|---|---|---|---|---|---|---|---|
| Machine/Ensemble Learning Models | [1,4,8] | Classical statistical/ML | Task-specific training | × | × | × | × |
| Deep Learning Models | [3] | Task-specific DL | Task-specific training | × | √ | × | × |
| | [10] | Task-specific DL | Task-specific training | × | × | √ | × |
| | [6,7] | Task-specific DL | Task-specific training | √ | × | √ | × |
| LLM-based Models | [11,12,14] | Native time-series foundation model | Large-scale pre-training | × | × | √ | × |
| | [13] | Native time-series foundation model | Two-stage pre-training | √ | × | √ | × |
| Our work | — | Reprogramming pre-trained LLM | Lightweight fine-tuning | √ | √ | √ | √ |
| Influencing Factor | Ri | Influencing Factor | Ri |
|---|---|---|---|
| Temperature | 0.536 | Distance to adjacent stations | 0.712 |
| Humidity | −0.423 | Historical PV power | 0.793 |
| Solar irradiance | 0.921 | Historical power consumption | 0.217 |
| Air velocity | 0.161 | Inverter efficiency | 0.255 |
| Air visibility | 0.143 | Historical grid exchange power | 0.402 |
| Module | Parameter State | Key Components |
|---|---|---|
| Pre-trained LLM core | Frozen | All Transformer parameters |
| Data reconstruction module | Fine-tuned | |
| Cross-attention mechanism | Fine-tuned | |
| Prompt embeddings | Fine-tuned | Vector representations of task context and statistical features |
| Output projection layer | Fine-tuned | |
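A minimal sketch of this freezing strategy, mirroring the frozen/fine-tuned split in the table above; the attribute names (`backbone`, `reprogram`, `prompt_emb`, `head`) are hypothetical stand-ins for the modules listed there.

```python
import torch

def trainable_parameters(model: torch.nn.Module) -> list:
    """Freeze the pre-trained LLM core; return only adapter parameters."""
    for p in model.backbone.parameters():          # all Transformer parameters
        p.requires_grad = False
    params = []
    for module in (model.reprogram, model.prompt_emb, model.head):
        params.extend(p for p in module.parameters() if p.requires_grad)
    return params

# The optimizer then updates only the lightweight adaptation layers:
# optimizer = torch.optim.Adam(trainable_parameters(solar_llm), lr=1e-3)
```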
| Classification | Prediction Model | Parameters | Value/Type |
|---|---|---|---|
| Our work | Solar-LLM | Base model | GPT-2 |
| | | Feature dimension | 768 |
| | | Patch length | 32 |
| | | Stride | 16 |
| | | Hidden layer feature dimension | 32 |
| | | Attention heads | 8 |
| Hybrid models | CNN-BiLSTM | CNN kernel | 8 × 1 |
| | | BiLSTM hidden layer feature dimension | 32 |
| | GCN-LSTM | GCN/LSTM layers | 2 |
| | | LSTM hidden layer feature dimension | 32 |
| | | GCN node embedding dimension | 16 |
| CNN-based | TCN | Layers | 2 |
| | | Hidden layer feature dimension | 32 |
| | | Kernel size | 16 |
| Transformer-based | Informer | Encoder/decoder layers | 2 |
| | | Attention heads | 8 |
| | | Attention network feature dimension | 32 |
| | | Feed-forward network feature dimension | 64 |
| Weather Type | Prediction Model | MAE (15 min) | RMSE (15 min) | R² (15 min) | MAE (1 h) | RMSE (1 h) | R² (1 h) | MAE (4 h) | RMSE (4 h) | R² (4 h) |
|---|---|---|---|---|---|---|---|---|---|---|
| Sunny | Solar-LLM | 0.044 | 0.060 | 0.964 | 0.056 | 0.079 | 0.938 | 0.074 | 0.110 | 0.877 |
| | TCN | 0.078 | 0.107 | 0.884 | 0.072 | 0.108 | 0.880 | 0.096 | 0.150 | 0.769 |
| | CNN-BiLSTM | 0.058 | 0.090 | 0.917 | 0.065 | 0.102 | 0.896 | 0.103 | 0.162 | 0.730 |
| | Informer | 0.073 | 0.116 | 0.865 | 0.067 | 0.109 | 0.879 | 0.081 | 0.132 | 0.820 |
| | GCN-LSTM | 0.059 | 0.097 | 0.903 | 0.061 | 0.099 | 0.902 | 0.093 | 0.143 | 0.787 |
| Cloudy | Solar-LLM | 0.048 | 0.080 | 0.899 | 0.064 | 0.109 | 0.814 | 0.087 | 0.142 | 0.687 |
| | TCN | 0.084 | 0.131 | 0.730 | 0.079 | 0.130 | 0.734 | 0.090 | 0.146 | 0.680 |
| | CNN-BiLSTM | 0.064 | 0.106 | 0.822 | 0.070 | 0.120 | 0.771 | 0.112 | 0.167 | 0.563 |
| | Informer | 0.086 | 0.148 | 0.660 | 0.081 | 0.140 | 0.691 | 0.089 | 0.151 | 0.658 |
| | GCN-LSTM | 0.066 | 0.113 | 0.800 | 0.068 | 0.121 | 0.769 | 0.108 | 0.169 | 0.565 |
| Rainy | Solar-LLM | 0.037 | 0.065 | 0.878 | 0.057 | 0.098 | 0.714 | 0.090 | 0.151 | 0.476 |
| | TCN | 0.083 | 0.133 | 0.464 | 0.082 | 0.136 | 0.451 | 0.096 | 0.156 | 0.438 |
| | CNN-BiLSTM | 0.060 | 0.105 | 0.659 | 0.068 | 0.119 | 0.576 | 0.104 | 0.152 | 0.406 |
| | Informer | 0.078 | 0.138 | 0.397 | 0.081 | 0.143 | 0.366 | 0.096 | 0.166 | 0.410 |
| Weather Type | Prediction Model | MAE (25% Training Data) | MAE (50% Training Data) | MAE (100% Training Data) |
|---|---|---|---|---|
| Sunny | Solar-LLM | 0.048 | 0.047 | 0.044 |
| | TCN | 0.152 | 0.096 | 0.078 |
| | CNN-BiLSTM | 0.073 | 0.063 | 0.058 |
| | Informer | 0.140 | 0.090 | 0.073 |
| | GCN-LSTM | 0.081 | 0.072 | 0.059 |
| Cloudy | Solar-LLM | 0.051 | 0.049 | 0.048 |
| | TCN | 0.110 | 0.097 | 0.084 |
| | CNN-BiLSTM | 0.097 | 0.076 | 0.064 |
| | Informer | 0.129 | 0.100 | 0.086 |
| | GCN-LSTM | 0.086 | 0.074 | 0.066 |
| Rainy | Solar-LLM | 0.041 | 0.038 | 0.037 |
| | TCN | 0.119 | 0.100 | 0.083 |
| | CNN-BiLSTM | 0.102 | 0.073 | 0.060 |
| | Informer | 0.122 | 0.096 | 0.078 |
| | GCN-LSTM | 0.075 | 0.067 | 0.059 |
| Weather Type | Forecast Period | MAE (with Prompts) | MAE (without Prompts) | RI (%) |
|---|---|---|---|---|
| Sunny | 15 min | 0.044 | 0.048 | 8.33 |
| | 1 h | 0.056 | 0.059 | 5.08 |
| | 4 h | 0.074 | 0.080 | 7.50 |
| Cloudy | 15 min | 0.048 | 0.052 | 7.69 |
| | 1 h | 0.064 | 0.066 | 3.03 |
| | 4 h | 0.087 | 0.097 | 10.31 |
| Rainy | 15 min | 0.037 | 0.044 | 15.91 |
| | 1 h | 0.057 | 0.065 | 12.31 |
| | 4 h | 0.090 | 0.101 | 10.89 |
| Weather Type | Forecast Period | MAE (with Reprogramming) | MAE (without Reprogramming) | RI (%) |
|---|---|---|---|---|
| Sunny | 15 min | 0.044 | 0.049 | 10.20 |
| | 1 h | 0.056 | 0.059 | 5.08 |
| | 4 h | 0.074 | 0.077 | 3.90 |
| Cloudy | 15 min | 0.048 | 0.054 | 11.11 |
| | 1 h | 0.064 | 0.065 | 1.54 |
| | 4 h | 0.087 | 0.092 | 5.43 |
| Rainy | 15 min | 0.037 | 0.043 | 13.95 |
| | 1 h | 0.057 | 0.059 | 3.39 |
| | 4 h | 0.090 | 0.106 | 15.09 |
| Prediction Model | MAE (15 min) | RMSE (15 min) | R² (15 min) | MAE (1 h) | RMSE (1 h) | R² (1 h) | MAE (4 h) | RMSE (4 h) | R² (4 h) |
|---|---|---|---|---|---|---|---|---|---|
| Solar-LLM | 0.045 | 0.068 | 0.946 | 0.059 | 0.091 | 0.901 | 0.087 | 0.135 | 0.781 |
| BP | 0.079 | 0.104 | 0.853 | 0.089 | 0.113 | 0.814 | 0.128 | 0.162 | 0.616 |
| ELM | 0.084 | 0.108 | 0.834 | 0.098 | 0.125 | 0.776 | 0.146 | 0.185 | 0.508 |
| SVR | 0.060 | 0.079 | 0.906 | 0.068 | 0.091 | 0.879 | 0.098 | 0.128 | 0.757 |
| Weather Type | Model | MAE | RMSE | R² |
|---|---|---|---|---|
| Sunny | Solar-LLM | 0.079 | 0.118 | 0.864 |
| | TCN | 0.112 | 0.168 | 0.718 |
| | CNN-BiLSTM | 0.121 | 0.181 | 0.687 |
| | Informer | 0.098 | 0.151 | 0.776 |
| | GCN-LSTM | 0.110 | 0.163 | 0.742 |
| Cloudy | Solar-LLM | 0.094 | 0.153 | 0.662 |
| | TCN | 0.109 | 0.166 | 0.613 |
| | CNN-BiLSTM | 0.128 | 0.189 | 0.504 |
| | Informer | 0.105 | 0.171 | 0.608 |
| | GCN-LSTM | 0.123 | 0.185 | 0.521 |
| Rainy/Snowy | Solar-LLM | 0.101 | 0.168 | 0.453 |
| | TCN | 0.118 | 0.179 | 0.374 |
| | CNN-BiLSTM | 0.124 | 0.182 | 0.359 |
| | Informer | 0.113 | 0.189 | 0.368 |
| | GCN-LSTM | 0.109 | 0.174 | 0.402 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content. |
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).