Explainable Multi-Frequency Long-Term Spectrum Prediction Based on GC-CNN-LSTM
Abstract
1. Introduction
- Multi-frequency long-term spectrum prediction was achieved with a CNN-LSTM model, which was benchmarked against LSTM, GRU, CNN, Transformer, and CNN-LSTM-Attention models to assess the strengths and weaknesses of current prediction methods.
- The GC-CNN-LSTM method provides a global time–frequency interpretation of the model: model inputs are selected according to the feature importance that Grad-CAM yields in both the time and frequency domains.
- The proposed scheme was evaluated through extensive simulations. The results show that the GC-CNN-LSTM algorithm effectively improves spectrum-prediction accuracy and offers better interpretability than SHAP.
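The Grad-CAM weighting behind the time–frequency importance scores can be sketched as follows. This is a minimal NumPy illustration, assuming the gradients of the prediction output with respect to a Conv1D feature map have already been obtained; the function name and array shapes are illustrative, not the paper's implementation:

```python
import numpy as np

def grad_cam_1d(activations, gradients):
    """Grad-CAM importance for a 1D feature map.

    activations: (time_steps, channels) feature map from a Conv1D layer.
    gradients:   (time_steps, channels) gradients of the model output
                 with respect to that feature map.
    Returns a (time_steps,) importance curve normalized to [0, 1].
    """
    # Channel weights: global-average-pool the gradients over time.
    weights = gradients.mean(axis=0)                # (channels,)
    # Weighted sum of channels, then ReLU to keep positive evidence only.
    cam = np.maximum(activations @ weights, 0.0)    # (time_steps,)
    # Normalize so importance maps are comparable across inputs.
    if cam.max() > 0:
        cam = cam / cam.max()
    return cam

rng = np.random.default_rng(0)
acts = rng.random((32, 16))
grads = rng.standard_normal((32, 16))
importance = grad_cam_1d(acts, grads)
print(importance.shape)  # (32,)
```

Applying the same weighting along the frequency axis of the feature map yields the frequency-domain importance used for input selection.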
2. Problem Modeling
3. Model Construction
3.1. CNN-LSTM
3.2. Improved Grad-CAM
3.3. Evaluation Metrics
3.4. Simulation Setup
4. Simulation Results and Analysis
4.1. CNN-LSTM Model Prediction Results
4.2. Explainability Analysis
4.2.1. SHAP Explanation Results
4.2.2. GC-CNN-LSTM Explanation Results
4.3. Efficiency Analysis
4.4. Discussion
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
Appendix A. Model Architectures
Appendix A.1. LSTM, GRU, CNN, CNN-LSTM, and CNN-LSTM-Attention
| Module | Layer Type | Parameters | Activation Function |
|---|---|---|---|
| Input | Input | input_shape = (seq_len, freqs) | |
| 1st Conv Block | Conv1D | filters = 64, kernel_size = 3 | ReLU |
| | MaxPooling1D | pool_size = 2 | |
| | Dropout | rate = 0.2 | |
| 2nd Conv Block | Conv1D | filters = 32, kernel_size = 3 | ReLU |
| | MaxPooling1D | pool_size = 2 | |
| | Dropout | rate = 0.2 | |
| 1st LSTM | LSTM | units = 64, return_sequences = True | Tanh |
| | Dropout | rate = 0.2 | |
| 2nd LSTM | LSTM | units = 32, return_sequences = False | Tanh |
| | Dropout | rate = 0.2 | |
| Fully Connected | Dense | units = 64 | ReLU |
| Output | Dense | output_size = pred_steps × pred_freqs | Linear |
| Compile | Optimizer | Adam(learning_rate = 0.001) | |
| | Loss | MSE | |
| Training Settings | Epochs | 70 | |
| | Batch Size | 32 | |
| | Validation Split | 0.2 | |
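The architecture above can be sketched directly in Keras. The layer hyperparameters come from the table; the input and output dimensions (seq_len = 64, freqs = 120, pred_steps = 4, pred_freqs = 10) are placeholder values, not the paper's settings:

```python
import tensorflow as tf
from tensorflow.keras import layers, models

seq_len, freqs = 64, 120          # assumed input dimensions
pred_steps, pred_freqs = 4, 10    # placeholder output dimensions

model = models.Sequential([
    layers.Input(shape=(seq_len, freqs)),
    # 1st Conv block
    layers.Conv1D(filters=64, kernel_size=3, activation="relu"),
    layers.MaxPooling1D(pool_size=2),
    layers.Dropout(0.2),
    # 2nd Conv block
    layers.Conv1D(filters=32, kernel_size=3, activation="relu"),
    layers.MaxPooling1D(pool_size=2),
    layers.Dropout(0.2),
    # Stacked LSTMs (tanh is the Keras default activation)
    layers.LSTM(64, return_sequences=True),
    layers.Dropout(0.2),
    layers.LSTM(32, return_sequences=False),
    layers.Dropout(0.2),
    # Head
    layers.Dense(64, activation="relu"),
    layers.Dense(pred_steps * pred_freqs),  # linear output
])
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.001),
              loss="mse")
```

Per the training settings in the table, fitting would then use `model.fit(..., epochs=70, batch_size=32, validation_split=0.2)`.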
Appendix A.2. Transformer
| Module | Layer Type | Parameters | Activation Function |
|---|---|---|---|
| Input | Input | input_shape = (seq_len, freqs) | |
| Preprocessing | Positional Encoding | Custom function | |
| | Dropout | rate = 0.2 | |
| Core Block | Transformer Encoder | head_size = 128, num_heads = 4, ff_dim = 8 | ReLU |
| Sequence Pooling | GlobalAveragePooling1D | -- | |
| | Dropout | rate = 0.2 | |
| Fully Connected | Dense | units = 64 | |
| Output | Dense | output_size = pred_steps × pred_freqs | Linear |
| Compile | Optimizer | Adam(learning_rate = 0.001) | |
| | Loss | MSE | |
| Training Settings | Epochs | 70 | |
| | Batch Size | 32 | |
| | Validation Split | 0.2 | |
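A Keras sketch of the Transformer baseline follows. The encoder hyperparameters come from the table (head_size maps to Keras's `key_dim`); the input/output dimensions are the same placeholder values as above, and since the paper's positional encoding is a custom function not shown here, a standard sinusoidal encoding is substituted as a stand-in:

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

seq_len, freqs = 64, 120          # assumed input dimensions
pred_steps, pred_freqs = 4, 10    # placeholder output dimensions

def sinusoidal_pe(length, depth):
    """Standard sinusoidal positional encoding (stand-in for the paper's custom function)."""
    pos = np.arange(length)[:, None].astype("float32")
    i = np.arange(depth)[None, :].astype("float32")
    angle = pos / np.power(10000.0, 2.0 * np.floor(i / 2.0) / depth)
    return np.where(i % 2 == 0, np.sin(angle), np.cos(angle)).astype("float32")

pe = sinusoidal_pe(seq_len, freqs)

inputs = layers.Input(shape=(seq_len, freqs))
x = layers.Lambda(lambda t: t + pe)(inputs)   # add positional encoding
x = layers.Dropout(0.2)(x)

# Transformer encoder block: self-attention + feed-forward,
# each with a residual connection and layer normalization.
attn = layers.MultiHeadAttention(num_heads=4, key_dim=128)(x, x)
x = layers.LayerNormalization()(x + attn)
ff = layers.Dense(8, activation="relu")(x)    # ff_dim = 8
ff = layers.Dense(freqs)(ff)
x = layers.LayerNormalization()(x + ff)

x = layers.GlobalAveragePooling1D()(x)
x = layers.Dropout(0.2)(x)
x = layers.Dense(64)(x)
outputs = layers.Dense(pred_steps * pred_freqs)(x)  # linear output

model = tf.keras.Model(inputs, outputs)
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.001),
              loss="mse")
```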
| Evaluation Metrics | LSTM | GRU | CNN | Transformer | CNN-LSTM | CNN-LSTM-Attention |
|---|---|---|---|---|---|---|
| RMSE | 2.0518 | 2.0553 | 2.2210 | 2.0753 | 1.9932 | 1.9816 |
| MAPE (%) | 8.78 | 8.92 | 8.85 | 8.49 | 8.46 | 8.34 |
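The two metrics reported in the tables can be computed as below. This is a generic sketch; the paper's exact aggregation over frequencies and prediction horizons is not specified here, and the sample arrays are illustrative only:

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root-mean-square error."""
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def mape(y_true, y_pred):
    """Mean absolute percentage error, in percent (assumes y_true != 0)."""
    return float(np.mean(np.abs((y_true - y_pred) / y_true)) * 100)

y_true = np.array([20.0, 25.0, 30.0, 35.0])
y_pred = np.array([21.0, 24.0, 31.0, 33.0])
print(rmse(y_true, y_pred))  # ≈ 1.3229
print(mape(y_true, y_pred))  # ≈ 4.5119
```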
| Number of Retained Frequencies | 15 | 30 | 45 | 60 | 75 | 90 | 105 | 120 |
|---|---|---|---|---|---|---|---|---|
| RMSE | 2.0131 | 1.9969 | 1.9906 | 1.9871 | 1.9762 | 1.976 | 1.9779 | 1.9795 |
| RMSE ↓ (%) | −1.00 | −0.19 | 0.13 | 0.31 | 0.85 | 0.86 | 0.77 | 0.69 |
| MAPE (%) | 8.31 | 8.22 | 8.22 | 8.18 | 8.16 | 8.21 | 8.2 | 8.24 |
| MAPE ↓ (%) | 1.77 | 2.84 | 2.84 | 3.31 | 3.55 | 2.96 | 3.07 | 2.60 |
| Metric | CNN-LSTM (1550 MHz) | GC-CNN-LSTM (1550 MHz) | CNN-LSTM (1750 MHz) | GC-CNN-LSTM (1750 MHz) |
|---|---|---|---|---|
| RMSE | 1.0442 | 1.0092 (3.35% ↓) | 1.416 | 1.358 (4.07% ↓) |
| MAPE (%) | 5.07 | 4.73 (6.71% ↓) | 5.67 | 5.46 (3.70% ↓) |
| Metric | t-Statistic | p-Value | 95% Confidence Interval | Average Improvement |
|---|---|---|---|---|
| RMSE | 6.247 | <0.001 | [4.26%, 8.18%] | 6.22% |
| MAPE | 3.760 | 0.00018 | [2.03%, 6.48%] | 4.26% |
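The significance test above is a Welch's t-test, which does not assume equal variances between the two groups being compared. A SciPy sketch with synthetic per-run error values (the numbers below are illustrative, not the paper's measurements):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Hypothetical per-run RMSE samples for the baseline and improved models.
baseline = rng.normal(loc=2.00, scale=0.03, size=20)
improved = rng.normal(loc=1.88, scale=0.03, size=20)

# equal_var=False selects Welch's t-test rather than Student's t-test.
t_stat, p_value = stats.ttest_ind(baseline, improved, equal_var=False)
print(f"t = {t_stat:.3f}, p = {p_value:.2e}")
```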
| Model | LSTM | GRU | CNN | Transformer | CNN-LSTM | CNN-LSTM-Attention | SHAP-CNN-LSTM | GC-CNN-LSTM |
|---|---|---|---|---|---|---|---|---|
| Time (s) | 20.89 | 21.29 | 13.05 | 56.16 | 15.93 | 24.87 | 15.2 | 13.8 |
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Xu, W.; Zhang, J.; Su, Z.; Jia, L. Explainable Multi-Frequency Long-Term Spectrum Prediction Based on GC-CNN-LSTM. Electronics 2025, 14, 3530. https://doi.org/10.3390/electronics14173530