Article

Mathematical and Machine Learning Innovations for Power Systems: Predicting Transformer Oil Temperature with Beluga Whale Optimization-Based Hybrid Neural Networks

Chongqing University-University of Cincinnati Joint Co-Op Institute, Chongqing University, Chongqing 400044, China
* Author to whom correspondence should be addressed.
† These authors contributed equally to this work.
Mathematics 2025, 13(11), 1785; https://doi.org/10.3390/math13111785
Submission received: 20 April 2025 / Revised: 24 May 2025 / Accepted: 24 May 2025 / Published: 27 May 2025

Abstract

Power transformers are vital components of power systems, and oil temperature is a key indicator of their operating condition. This study proposes an advanced hybrid neural network model, BWO-TCN-BiGRU-Attention, to predict the top-oil temperature of transformers. The model was validated on temperature data from power transformers in two Chinese regions, achieving MAEs of 0.5258 and 0.9995, MAPEs of 2.75% and 2.73%, and RMSEs of 0.6353 and 1.2158, significantly outperforming mainstream methods such as ELM, PSO-SVR, Informer, CNN-BiLSTM-Attention, and CNN-GRU-Attention. In tests spanning spring, summer, autumn, and winter, the model's MAPE was 2.75%, 3.44%, 3.93%, and 2.46% for Transformer 1, and 2.73%, 2.78%, 3.07%, and 2.05% for Transformer 2, respectively, indicating that it maintains low prediction errors even under significant seasonal temperature variations. The model also performed well at both time granularities tested: for Transformer 1, MAPE was 2.75% at a 1 h interval and 2.98% at a 15 min interval; for Transformer 2, it was 2.73% at 1 h and further improved to 2.16% at 15 min. These results show that the model adapts to different seasons and retains good prediction performance on high-frequency data, providing reliable technical support for the safe and stable operation of power systems.
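The error scores reported above (MAE, MAPE, RMSE) follow their standard definitions. A minimal sketch of how such scores would be computed from paired observed and predicted top-oil temperatures (function names, variable names, and the toy data are illustrative, not from the paper):

```python
from math import sqrt

def mae(actual, predicted):
    """Mean absolute error: average magnitude of the prediction errors."""
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

def mape(actual, predicted):
    """Mean absolute percentage error, in percent; assumes no zero actuals."""
    return 100 * sum(abs((a - p) / a) for a, p in zip(actual, predicted)) / len(actual)

def rmse(actual, predicted):
    """Root mean squared error: penalizes large deviations more than MAE."""
    return sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual))

# Toy example: hourly top-oil temperatures (hypothetical values)
observed = [42.0, 45.5, 48.2, 51.0]
forecast = [41.2, 46.1, 47.5, 52.3]
print(round(mae(observed, forecast), 4),
      round(mape(observed, forecast), 4),
      round(rmse(observed, forecast), 4))
```

MAPE is the headline metric in the seasonal and granularity comparisons above because it is scale-free, which makes scores comparable across transformers operating at different temperature levels.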
Keywords: power transformer; oil temperature prediction; hybrid neural network; beluga whale optimization; power system; artificial intelligence

Share and Cite

MDPI and ACS Style

Liu, J.; Hou, Z.; Liu, B.; Zhou, X. Mathematical and Machine Learning Innovations for Power Systems: Predicting Transformer Oil Temperature with Beluga Whale Optimization-Based Hybrid Neural Networks. Mathematics 2025, 13, 1785. https://doi.org/10.3390/math13111785

AMA Style

Liu J, Hou Z, Liu B, Zhou X. Mathematical and Machine Learning Innovations for Power Systems: Predicting Transformer Oil Temperature with Beluga Whale Optimization-Based Hybrid Neural Networks. Mathematics. 2025; 13(11):1785. https://doi.org/10.3390/math13111785

Chicago/Turabian Style

Liu, Jingrui, Zhiwen Hou, Bowei Liu, and Xinhui Zhou. 2025. "Mathematical and Machine Learning Innovations for Power Systems: Predicting Transformer Oil Temperature with Beluga Whale Optimization-Based Hybrid Neural Networks" Mathematics 13, no. 11: 1785. https://doi.org/10.3390/math13111785

APA Style

Liu, J., Hou, Z., Liu, B., & Zhou, X. (2025). Mathematical and Machine Learning Innovations for Power Systems: Predicting Transformer Oil Temperature with Beluga Whale Optimization-Based Hybrid Neural Networks. Mathematics, 13(11), 1785. https://doi.org/10.3390/math13111785

Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.
