Review

Artificial Intelligence Algorithms for Hybrid Electric Powertrain System Control: A Review

School of Mechanical Engineering, Beijing Institute of Technology, Beijing 100081, China
* Author to whom correspondence should be addressed.
Energies 2025, 18(8), 2018; https://doi.org/10.3390/en18082018
Submission received: 12 March 2025 / Revised: 28 March 2025 / Accepted: 11 April 2025 / Published: 14 April 2025
(This article belongs to the Section E: Electric Vehicles)

Abstract:
With the accelerating depletion of fossil fuels and the growing severity of air pollution, hybrid electric powertrain systems have become a research hotspot in transportation, owing to their ability to improve fuel economy and reduce emissions. However, optimizing the control of these systems is challenging, as it involves multi-power-source coordination, adaptation to dynamic operating conditions, and real-time energy distribution. Traditional control methods, whether rule-based or optimization-based, often lack global optimality and adaptability. In recent years, artificial intelligence algorithms have provided new solutions for the intelligent control of hybrid electric powertrain systems through their powerful nonlinear modeling, data-driven optimization, and adaptive learning capabilities. This paper systematically reviews the research progress of artificial intelligence algorithms in hybrid electric powertrain systems. First, the architecture classification of hybrid electric powertrain systems is introduced. Second, the advantages and disadvantages of rule-based and optimization-based energy management strategies are summarized. Then, existing research on the application of artificial intelligence algorithms in hybrid electric powertrain systems is systematically reviewed, and the advantages, disadvantages, and specific applications of the various algorithms are analyzed in detail. Finally, future directions for applying artificial intelligence algorithms in hybrid electric powertrain systems are discussed.

1. Introduction

With the increasing depletion of fossil fuels and the growing prominence of air pollution, governments and companies around the world are accelerating the pace of research and application of green transportation technologies. Traditional fuel vehicles’ dependence on fossil fuels not only exacerbates the energy crisis but also leads to severe air pollution and greenhouse gas emissions [1,2]. To meet these challenges, new energy vehicles have become a key direction for solving energy and environmental problems [3]. As an important component of new energy vehicles, hybrid electric powertrain systems can further reduce vehicle operating energy consumption and improve energy utilization through the coordinated work of multiple energy sources [4,5,6].
Hybrid electric powertrain systems have been widely used in hybrid electric vehicles (HEVs), fuel cell hybrid electric vehicles (FCHEVs), plug-in hybrid electric vehicles (PHEVs), and other applications [7,8,9,10]. HEVs combine the power of an internal combustion engine (ICE) with the zero-emission characteristics of an electric motor, giving them better fuel efficiency than traditional ICE vehicles. PHEVs have larger batteries than HEVs and can be connected to the power grid for charging [11]. The distinguishing feature of fuel cell vehicles is that they use fuel cells (FCs) together with batteries or supercapacitors (SCs) as power sources [12].
Regardless of the type of vehicle that uses a hybrid electric powertrain system, an energy management strategy (EMS) is an important technology to improve its performance [13,14]. The core of an EMS is to coordinate the power distribution between different power sources according to different driving scenarios and power requirements to minimize energy consumption and emissions. However, there are many challenges in designing an efficient EMS. The complexity of the hybrid electric powertrain system, the coordinated operation of multiple energy sources, and the response to changes in driving conditions all place higher requirements on the adaptability and real-time performance of the EMS [15].
The energy management strategies of hybrid electric powertrain systems can be divided into rule-based, optimization-based, and artificial intelligence-based [16]. Rule-based energy management strategies can be divided into deterministic rule-based and fuzzy logic rule-based. Reference [17] proposed a hybrid control strategy based on fuzzy logic for PHEVs, established a controller with driving conditions as influencing factors, and verified the efficiency of the proposed controller through simulation. Reference [18] proposed a deterministic rule-based energy management strategy for hybrid electric vehicles, which determines the engine and motor operating modes through set thresholds. However, this strategy uses fixed thresholds to achieve power distribution and cannot achieve optimal performance.
In addition, optimization-based methods such as dynamic programming and genetic algorithms that rely on a complete driving cycle require the entire driving cycle information to be known in advance in order to solve the global optimal solution. However, in practical applications, the future driving cycle cannot be obtained in advance, and the global optimal strategy design has a large computational burden, which leads to poor real-time control performance [19].
In recent years, a growing body of research has shown that artificial intelligence algorithms have great potential in hybrid electric powertrain systems. Li et al. proposed an energy management strategy based on deep reinforcement learning, which exploits the fact that deep reinforcement learning does not rely on future driving information and generalizes well. Simulation results show that, without any future driving information, its fuel economy is only 3.5% lower than that of dynamic programming and better than that of a model predictive control strategy that relies on accurate prediction. The study also shows that the method is not limited by the powertrain configuration and is applicable to various HEV models, demonstrating that the deep reinforcement learning strategy has clear advantages in learning ability, optimality, and generalization [20]. Reference [21] proposed a reinforcement learning-based energy management strategy for power distribution between the engine and motor and compared it with energy management based on dynamic programming. The results showed that the reinforcement learning-based strategy has better adaptability and learning ability.
In 2017, the authors of [22] reviewed the modeling and control of hybrid electric vehicles, mainly introducing offline and online control strategies for HEVs; reinforcement learning and deep reinforcement learning control methods were not covered. In 2023, the authors of [23] reviewed the modeling and energy management strategies of hybrid electric vehicles, introducing configuration classifications and energy management strategies, but learning-based energy management strategies were not described in detail, nor were the various learning-based algorithms analyzed. In 2021, the authors of [24] reviewed the energy management algorithms of hybrid electric vehicles from the perspective of machine learning, but the discussion of neural network-based algorithms was not comprehensive.
In recent years, artificial intelligence algorithms have been widely applied to hybrid electric powertrain vehicles. This paper discusses these applications in detail, organizing them into six parts, each described in depth with the latest research results included. The purpose of reviewing the existing literature and summarizing the relevant artificial intelligence algorithms is to provide a reference for future research. First, rule-based and optimization-based energy management strategies are outlined and their shortcomings summarized. Second, the previous literature is analyzed by algorithm family, covering the application of supervised learning, unsupervised learning, reinforcement learning, neural networks, deep learning, and deep reinforcement learning in hybrid electric powertrain systems. Finally, the existing literature is summarized and future research directions for hybrid electric powertrain systems are outlined in light of the current state of research.

2. Architectures of Hybrid Electric Powertrain Systems

Hybrid electric powertrain systems are composed of multiple power sources that work together mechanically and electrically to achieve efficient energy distribution and multi-mode drive [4,5]. Hybrid electric powertrain system applications include hybrid electric vehicles (HEVs), plug-in hybrid electric vehicles (PHEVs), fuel cell electric vehicles (FCEVs), etc. [7,25]. Hybrid electric vehicles combine an internal combustion engine and an electric motor [11]. Plug-in hybrid electric vehicles have larger batteries than ordinary HEVs, and the vehicle can be connected to the power grid to charge the battery [26]. Fuel cell electric vehicles combine fuel cells and batteries or supercapacitor storage technology as energy [27].
Whether in a hybrid electric vehicle, a plug-in hybrid electric vehicle, or a fuel cell hybrid electric vehicle, the hybrid electric drive systems they use can be divided into three main types, namely series, parallel, and series–parallel, according to the mechanical connection method and power transmission path between the powertrain components [28].

2.1. Series Hybrid Electric Powertrain Systems

The series hybrid powertrain system is shown in Figure 1. The electric energy generated by the generator is transmitted to the motor via the inverter, and the mechanical energy output by the motor is transmitted to the vehicle transmission device to drive the vehicle. During the entire process, the generator and the transmission device have no mechanical connection and do not directly participate in the vehicle drive. Common generators mainly include auxiliary power units composed of internal combustion engines and generators, fuel cells, or other power sources with high energy density [29,30].
The new model released by Li Auto in April 2024 is equipped with an extended-range plug-in hybrid system that optimizes the traditional series architecture. It consists of dual motors and a 1.5 L turbocharged four-cylinder engine, with the two motors placed at the front and rear of the vehicle for front and rear drive. The vehicle is driven only by the electric motors; the engine drives the generator to produce electricity and does not participate directly in driving the vehicle.

2.2. Parallel Hybrid Electric Powertrain System

In the parallel hybrid powertrain configuration, the engine and the motor are two independent power sources, each of which can drive the vehicle on its own to achieve pure electric or pure engine drive; their power can also be coupled to drive the vehicle jointly. In addition, the optional operating modes of parallel hybrid electric vehicles include charging while driving and regenerative braking, meeting the drive requirements under different working conditions. Figure 2 shows the P2 configuration of the parallel hybrid powertrain. According to the position of the motor, parallel systems can be divided into P0–P4 configurations: in P0 the motor is located at the front end of the engine; in P1 the motor is located after the engine and before the clutch; in P2 the motor is located between the clutch and the transmission; in P3 the motor is located at the output end of the transmission; and in P4 the motor is located on another axle and directly drives the wheels [31].
In 2022, Changan Automobile launched the Blue Whale hybrid electric system, which adopts an innovative parallel P2 configuration and applies three-clutch integration technology to the electric drive transmission. By integrating the clutch that controls the engine access with the traditional dual clutch inside the motor, the power system is highly integrated and lightweight.

2.3. Series–Parallel Hybrid Electric Powertrain System

The series–parallel hybrid electric powertrain system combines the series and parallel power transmission arrangements, and includes the series–parallel hybrid drive system and the power-split hybrid drive system, as shown in Figure 3 and Figure 4. The series–parallel hybrid drive system can switch flexibly between pure electric drive mode and parallel drive mode through the engagement and disengagement of the clutch. The power-split hybrid drive system uses a planetary gear set as the power distribution device, connecting the engine, motor, and generator. In this configuration, part of the engine's output power is transmitted mechanically to the wheels to drive the vehicle, and the remaining power is transmitted to the generator for energy conversion. The generator converts mechanical energy into electrical energy, which can be used to drive the motor or stored in the power battery according to the vehicle's operating conditions. By decoupling engine speed from wheel speed, this configuration allows the engine to operate near its efficient region, enabling higher overall performance [32].
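The kinematics of such a power-split device follow the standard planetary gear (Willis) relation S·ω_sun + R·ω_ring = (S + R)·ω_carrier. A minimal sketch of that constraint is shown below; the tooth counts and speeds are illustrative numbers, not taken from any production vehicle:

```python
# Kinematic constraint of a simple planetary gear set used as the power-split
# device: S*w_sun + R*w_ring = (S + R)*w_carrier, where S and R are the sun and
# ring gear tooth counts. Engine -> carrier, generator -> sun, output -> ring.

def generator_speed(w_engine_rpm: float, w_ring_rpm: float,
                    sun_teeth: int = 30, ring_teeth: int = 78) -> float:
    """Solve the Willis equation for the sun-gear (generator) speed."""
    s, r = sun_teeth, ring_teeth
    return ((s + r) * w_engine_rpm - r * w_ring_rpm) / s

# At standstill (ring = 0 rpm) the generator absorbs all engine speed,
# scaled by (S + R) / S.
print(generator_speed(1000.0, 0.0))  # 3600.0 for S=30, R=78
```

Because the sun-gear speed is a free variable, the engine (carrier) speed can be chosen independently of the wheel (ring) speed, which is exactly the decoupling the power-split configuration exploits.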
In May 2024, BYD launched a model equipped with the latest fifth-generation DM-i plug-in hybrid electric system, which uses dual planetary gears and conventional differentials, equipped with an engine and dual motors, with electric drive as the core. The overall structure of the system is simpler, and the dual-motor coordinated drive is achieved through the addition of a dual-clutch design. In December 2024, Geely released a new model equipped with the latest Thor EM-i plug-in hybrid electric system, which consists of dual motors, an engine, and a single-speed transmission, optimizing power output.

3. Classification of Energy Management Strategies

In a hybrid electric powertrain system, the main function of the energy management strategy is to coordinate the energy distribution between multiple power sources and regulate the charging and discharging process of the battery pack to meet the system’s adaptation to the driving environment and power requirements and improve the system’s performance [33,34]. According to the design method and principle of the strategy, the current energy management strategies of hybrid electric powertrain systems can be divided into three categories: rule-based, optimization-based, and artificial intelligence-based [16], as shown in Figure 5.

4. Rule-Based Energy Management Strategy

Rule-based energy management strategies are divided into deterministic rule-based and fuzzy rule-based strategies [35]. Deterministic rule-based control strategies are designed based on the designer’s engineering experience, basic equations, and analytical equations. Rules are formulated according to different vehicle speeds, engine torque, and battery SOC conditions to meet power requirements. This strategy uses a fixed threshold value, so it lacks flexibility when facing variable driving conditions [36].
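As an illustration of such fixed-threshold logic, the sketch below implements a toy power split; every threshold value is an assumption chosen for demonstration, not a calibrated parameter from any cited strategy:

```python
def rule_based_split(p_demand_kw: float, soc: float,
                     soc_low: float = 0.3, p_ev_max_kw: float = 20.0):
    """Deterministic rules: return (engine_kw, motor_kw).

    Illustrative thresholds: below soc_low the engine also charges the
    battery; below p_ev_max_kw the vehicle drives electrically.
    """
    if soc <= soc_low:
        # Charge-sustaining: engine covers demand plus 5 kW of charging.
        return p_demand_kw + 5.0, -5.0
    if p_demand_kw <= p_ev_max_kw:
        # Low demand with sufficient charge: pure electric drive.
        return 0.0, p_demand_kw
    # High demand: engine held at its threshold, motor supplies the rest.
    return p_ev_max_kw, p_demand_kw - p_ev_max_kw
```

Because every branch uses a hard threshold, the split changes abruptly at the boundaries, which is exactly the inflexibility under variable driving conditions noted above.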
The energy management strategy based on fuzzy logic rules fuzzifies the corresponding deterministic control rules and represents them with a fuzzy rule base [37]. The EMS based on fuzzy logic rules is shown in Figure 6. The input vectors include vehicle speed, battery SOC, and required torque, and the output vector is engine torque. Compared with the energy management strategy based on deterministic rules, the fuzzy logic strategy does not require an accurate mathematical model and has better adaptability and robustness. However, since the fuzzy logic control strategy still relies on expert experience during design, it has limitations [38].
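A minimal Sugeno-style sketch of such a fuzzy controller is shown below; the membership breakpoints and rule outputs are invented for illustration (a real design would use calibrated membership functions over speed, SOC, and required torque):

```python
# Toy Sugeno-type fuzzy controller for the engine's share of demanded torque.
# All breakpoints and rule outputs are illustrative assumptions.

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_engine_share(soc, demand):
    """soc, demand in [0, 1]; returns the engine fraction of demanded torque."""
    soc_low  = tri(soc, -0.01, 0.0, 0.6)
    soc_high = tri(soc, 0.4, 1.0, 1.01)
    dem_low  = tri(demand, -0.01, 0.0, 0.6)
    dem_high = tri(demand, 0.4, 1.0, 1.01)
    # Rule firing strengths (min-AND) paired with Sugeno singleton outputs.
    rules = [
        (min(soc_low, dem_low),   0.6),  # low SOC, low demand  -> engine charges
        (min(soc_low, dem_high),  1.0),  # low SOC, high demand -> engine only
        (min(soc_high, dem_low),  0.0),  # high SOC, low demand -> electric only
        (min(soc_high, dem_high), 0.5),  # high SOC, high demand -> blended
    ]
    num = sum(w * y for w, y in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0

ev_only = fuzzy_engine_share(0.9, 0.1)   # full battery, light load
eng_only = fuzzy_engine_share(0.1, 0.9)  # depleted battery, heavy load
```

Unlike the hard thresholds of a deterministic strategy, the weighted-average defuzzification blends the rules smoothly between operating regions.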
Reference [39] proposed a rule-based control strategy for plug-in hybrid electric vehicles that uses trip and driving information as the basis for strategy formulation, and systematically evaluated control performance using fuel consumption and engine start–stop frequency as indicators. Reference [40] proposed a thermostat control strategy for series HEVs, in which the engine is switched on and off, thermostat-like, to manage power distribution. Reference [41] proposed a power-follower energy management strategy for HEVs that achieves power distribution by separating power demand frequency bands, and verified its effectiveness through simulation. Reference [42] established a state machine-based energy management strategy that formulates the operating control strategy from the vehicle’s required torque and the battery state of charge. Reference [43] proposed a fuzzy logic rule-based energy management strategy for HEV energy distribution that adopts a priority power strategy and saved 19.8% of fuel consumption. Reference [44] proposed a switching fuzzy logic control strategy based on the identification and prediction of traffic conditions to achieve energy distribution among multiple power sources.
Reference [45] proposed a fuzzy logic control strategy based on driving cycle identification, which takes driving cycle identification into account and distributes the power output of the engine and motor through a fuzzy torque distribution controller. Reference [46] proposed a fuzzy logic control strategy for PHEVs, where a fuzzy controller was used to distribute the torque between the engine and the motor, and a genetic algorithm was used to optimize the membership function. The results showed that the strategy maintained the stability of battery charging and discharging. Reference [47] proposed a hybrid power-following fuzzy control strategy for fuel cell hybrid electric vehicles (FCHEVs). The power-following strategy module was optimized through a fuzzy logic control strategy to achieve efficient operation of the fuel cell. The results showed that the hydrogen consumption in the hydrogen fuel cell was reduced by 13.5%.
The common advantages of these strategies are strong real-time performance, low computational burden, high stability, and easy implementation in commercial vehicles. However, the rules of rule-based energy management strategies are fixed, lacking adaptability to variable working conditions, and the control parameters require extensive experimental calibration, which drives up development costs. In addition, because the rules are highly subjective, it is difficult to achieve globally optimal fuel economy, and the rules must be readjusted whenever new driving conditions or powertrain changes arise, resulting in poor robustness [48,49,50].

5. Energy Management Strategy Based on Optimization

Energy management strategies based on optimization are mainly divided into global optimization and instantaneous optimization [51]. Instantaneous optimization strategies make decisions using only the information available at the current moment, but in practice their performance depends heavily on accurate knowledge of traffic conditions that cannot be obtained precisely [46]. Global optimization requires information about the entire driving cycle, which limits its practical applicability [19]. Common global optimization methods include dynamic programming (DP) [52], genetic algorithms (GAs) [53], and Pontryagin’s Minimum Principle (PMP) [54]. Common instantaneous optimization methods include the equivalent consumption minimization strategy (ECMS) [55] and model predictive control (MPC) [56].
The dynamic programming algorithm can find the global optimal solution when the driving information is known in advance, but it requires complete driving cycle information, which is unavailable during actual vehicle operation [57]. Hu et al. [58] used dynamic programming to solve the energy coordination problem between the engine and the battery; the results show that it reduced fuel consumption by 7.3%. Du et al. [59] studied an EMS based on the PMP algorithm, and the results showed that it reduced battery degradation. Reference [60] proposed a PMP-based energy management strategy for fuel cell hybrid electric vehicles and verified its effectiveness in reducing hydrogen consumption.
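The backward recursion at the heart of DP-based energy management can be sketched over a discretized SOC grid. Everything below (the demand profile, grid resolution, linear fuel model, battery capacity, and terminal SOC penalty) is an illustrative assumption chosen for brevity, not a vehicle model from the cited studies:

```python
# Toy backward dynamic programming for a power split, minimizing fuel while
# penalizing terminal SOC deviation. One control step corresponds to 1 s.

def dp_energy_split(demand_kw, soc0=0.6, dt_h=1.0 / 3600.0, batt_kwh=1.5):
    soc_grid = [i / 100 for i in range(20, 91)]   # feasible SOC: 0.20 ... 0.90
    engine_opts = [0.0, 10.0, 20.0, 30.0]         # candidate engine powers (kW)
    fuel_per_kwh = 0.25                            # assumed linear fuel rate (L/kWh)

    def nearest(soc):
        return min(range(len(soc_grid)), key=lambda i: abs(soc_grid[i] - soc))

    n = len(demand_kw)
    # Terminal cost: penalize ending far from the initial SOC (charge-sustaining).
    cost = [abs(s - soc0) * 100.0 for s in soc_grid]
    policy = []
    for k in range(n - 1, -1, -1):                # backward over time steps
        new_cost, step_policy = [], []
        for s in soc_grid:
            best, best_u = float("inf"), 0.0
            for p_eng in engine_opts:
                p_bat = demand_kw[k] - p_eng      # battery covers the remainder
                s_next = s - p_bat * dt_h / batt_kwh
                if not 0.2 <= s_next <= 0.9:      # SOC bounds
                    continue
                c = fuel_per_kwh * p_eng * dt_h + cost[nearest(s_next)]
                if c < best:
                    best, best_u = c, p_eng
            new_cost.append(best)
            step_policy.append(best_u)
        cost, policy = new_cost, [step_policy] + policy
    return cost[nearest(soc0)], policy

cost0, policy = dp_energy_split([20.0, 25.0, 20.0])
```

The nested loops over steps, states, and controls make the real-time limitation concrete: cost grows with cycle length, grid resolution, and control candidates, and the whole demand profile must be known before the first step can be solved.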
In addition, the instantaneous optimization-based method includes the equivalent consumption minimization strategy. Sun et al. [61] designed a new ECMS to dynamically adjust the equivalent factor, segmented the driving cycle based on the vehicle stop time, and optimized the equivalent factor of different driving cycles. Compared with the rule-based method, the use of this ECMS reduced fuel consumption by 18.7%. Reference [62] proposed an energy management strategy based on an adaptive equivalent consumption minimization strategy, which optimizes the equivalent factor according to the predicted power to achieve stable battery charging maintenance and suboptimal fuel consumption. Simulation results show that this strategy can maintain power within the driving cycle.
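The core of an ECMS can be shown in a few lines: at each instant, choose the split that minimizes fuel plus equivalence-factor-weighted battery energy. The linear fuel model below is a deliberate oversimplification (with a linear model the optimum always lands at an extreme; real engine maps make intermediate splits optimal), and the numbers are illustrative:

```python
# One-step equivalent consumption minimization: pick the engine power that
# minimizes fuel rate plus s * (equivalent fuel rate of battery power).

def ecms_step(p_demand_kw, s_equiv=2.5, fuel_per_kwh=0.25):
    candidates = [p_demand_kw * f / 10 for f in range(11)]  # 0% .. 100% engine
    def cost(p_eng):
        p_bat = p_demand_kw - p_eng            # battery discharge power (kW)
        fuel = fuel_per_kwh * p_eng            # engine fuel rate (L/h), linear toy
        return fuel + s_equiv * fuel_per_kwh * p_bat  # equivalent fuel rate
    return min(candidates, key=cost)

full_engine = ecms_step(30.0, s_equiv=2.5)   # battery "expensive": engine covers all
all_electric = ecms_step(30.0, s_equiv=0.5)  # battery "cheap": pure electric drive
```

The example makes the role of the equivalence factor visible: it alone decides how the strategy trades battery energy against fuel, which is why the adaptive-ECMS work above focuses on tuning it online.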
Bonab et al. [63] proposed an EMS based on MPC, which uses dynamic programming to optimize speed data and then combines MPC to establish a control strategy. Compared with the rule-based EMS, it reduces fuel consumption by 12.81%. Reference [64] proposed an energy management strategy based on model predictive control (MPC) for the hybrid power system of fuel cell electric vehicles. The speed prediction model predicts the results and then imports them into the model predictive control model for control. The results show that the total operating cost of this strategy is reduced by 3.74%.
The genetic algorithm (GA) is an optimization method inspired by biological evolution mechanisms, and its process is shown in Figure 7. Compared with traditional optimization methods, the genetic algorithm has stronger robustness and adaptability. Reference [65] studied a new energy management strategy optimization method for fuel cell hybrid electric vehicles. This strategy optimizes fuzzy logic parameters through the genetic algorithm. The results show that this strategy improves fuel utilization by 28%. Fan et al. [66] proposed an adaptively regulated EMS, established a control strategy based on the synergy of rules and equivalent consumption minimization strategy, and used the genetic algorithm to optimize the threshold in the rule-based strategy and the equivalent factor in the ECMS. The results show that compared with dynamic programming (DP), this EMS reduces fuel consumption by 14.8%.
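A bare-bones GA in the spirit of the parameter tuning by Fan et al. can be sketched as follows. The two decision variables (an engine-on threshold and an ECMS equivalence factor) and the quadratic stand-in fitness function are assumptions for illustration; a real study would evaluate each candidate with a drive-cycle simulation:

```python
import random

def fitness(params):
    threshold, s_factor = params
    # Stand-in for a drive-cycle simulation: pretend fuel use is minimized
    # at threshold = 15 kW and s = 2.0 (an assumed optimum).
    return -((threshold - 15.0) ** 2 + 10.0 * (s_factor - 2.0) ** 2)

def genetic_optimize(pop_size=30, generations=60, seed=42):
    rng = random.Random(seed)
    pop = [(rng.uniform(0, 40), rng.uniform(0, 5)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]           # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            w = rng.random()                     # blend crossover
            child = (w * a[0] + (1 - w) * b[0], w * a[1] + (1 - w) * b[1])
            if rng.random() < 0.2:               # Gaussian mutation
                child = (child[0] + rng.gauss(0, 1.0),
                         child[1] + rng.gauss(0, 0.2))
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = genetic_optimize()
```

Selection, crossover, and mutation correspond directly to the stages in Figure 7; only the fitness evaluation changes when the toy objective is replaced by a vehicle simulation.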
Genetic algorithms can also be combined with some machine learning algorithms to achieve better results. For example, genetic algorithms combined with K-means clustering algorithms can optimize the clustering centers of K-means algorithms [67], and combined with neural networks to optimize the structure of neural networks [68]. Therefore, genetic algorithms are widely used in fields such as hybrid electric powertrain systems.

6. Artificial Intelligence Algorithms

Artificial intelligence algorithms solve problems by simulating and implementing human intelligent behavior. As hybrid powertrains become more complex and countries place higher demands on vehicle performance and emissions, the application of artificial intelligence algorithms is becoming increasingly widespread [69]. This paper divides the related research into six parts for detailed discussion: supervised learning, unsupervised learning, neural networks, deep learning, reinforcement learning, and deep reinforcement learning.
Artificial intelligence algorithms are applied to hybrid electric powertrain systems to dynamically adjust the power output of the engine and motor to achieve better control strategies, thereby improving fuel efficiency and reducing emissions [70]. In terms of battery management, the application of artificial intelligence algorithms can accurately predict battery status, optimize charging strategies, and further extend battery life [71]. Artificial intelligence algorithms can also predict vehicle speed [72]. In addition, artificial intelligence algorithms can be combined with intelligent transportation systems to use advanced communication technologies to achieve interaction between vehicles and infrastructure, and achieve better control effects [73].

6.1. Unsupervised Learning

Hybrid powertrains have multiple power sources, and it is of great significance to develop efficient energy management strategies to control the hybrid powertrain and maximize the performance of the hybrid powertrain [4]. Unsupervised learning is a machine learning method that does not require explicit labeling of data during the training process, unlike supervised learning. Unsupervised learning automatically identifies patterns and categories in data by analyzing their intrinsic distribution and relationships [74].
In the energy management strategy of hybrid electric vehicles, unsupervised learning is mainly used for driving mode classification and data dimensionality reduction. Common methods include K-means clustering and principal component analysis (PCA).
K-means clustering is a simple but effective algorithm. Zhang et al. [75] used K-means clustering to classify HEV driving modes, reducing fuel consumption and electricity consumption. Sun et al. [76] proposed an adaptive EMS based on an ECMS for hybrid electric buses. The classified road segment method used K-means clustering, and the constructed Markov transition matrix was used to predict speed. The predicted vehicle speed and traffic information were then used in the ECMS to dynamically adjust the equivalent factor. The results showed that the fuel economy was improved.
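A plain K-means pass over per-segment statistics shows the grouping idea. The two-feature data below are synthetic stand-ins for real drive-cycle statistics (here: mean speed and speed standard deviation), and the implementation is deliberately minimal:

```python
import random

# Plain K-means: assign points to the nearest center, then move each center
# to the mean of its cluster, and repeat.

def kmeans(points, k=2, iters=50, seed=0):
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda c: sum((a - b) ** 2
                                                for a, b in zip(p, centers[c])))
            clusters[i].append(p)
        centers = [tuple(sum(dim) / len(c) for dim in zip(*c)) if c else centers[j]
                   for j, c in enumerate(clusters)]
    return centers, clusters

# Two obvious driving modes: urban (slow, stop-and-go) vs highway (fast, steady).
urban = [(15.0 + i, 8.0) for i in range(5)]
highway = [(95.0 + i, 2.0) for i in range(5)]
centers, clusters = kmeans(urban + highway, k=2)
```

The random initial centers in `rng.sample` are exactly the weak point discussed next: a poor draw can leave the algorithm in a local optimum, which is what GA-based center seeding tries to avoid.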
The K-means clustering algorithm is sensitive to the selection of initial cluster centers and is prone to falling into local optima. Genetic algorithms can search the solution space for better solutions by simulating natural selection and genetic mechanisms; combining K-means with a genetic algorithm that optimizes the cluster centers can improve the clustering result. Li et al. [77] applied a genetic algorithm to optimize the initial clustering centers of K-means, used the optimized K-means to identify driving states, and combined it with the equivalent consumption minimization strategy to establish an energy management strategy. The results showed lower fuel consumption than the ECMS alone, with the engine operating near its optimal efficiency curve.

In addition to K-means clustering, another commonly used method is principal component analysis (PCA). Chang et al. [78] used PCA to reduce the dimensionality of motion sequence features, extracting the most representative features to simplify the data, and then classified the reduced motion sequences. Liu et al. [79] took PHEVs as the research object and established an EMS based on PCA and a GA: the driving condition data were classified into four categories using PCA, the GA was used to optimize the control action values, and the driving conditions were paired with the corresponding optimal control actions. The results showed performance close to that of the globally optimal method.

6.2. Supervised Learning

Supervised learning involves building a predictive model with generalization ability by using labeled training datasets to learn the intrinsic mapping relationship between input features and output labels. In the field of hybrid electric vehicles, supervised learning can be used not only to identify and classify driving modes and conditions but also for fault diagnosis. It can also predict future power requirements based on driving and road condition information [80].
Support vector machines (SVMs) can perform well even with few samples, effectively process high-dimensional data, and reduce the risk of overfitting in complex tasks, so they are widely used. An SVM-based fault detection method has been proposed in which diagnosis consists of three parts: data preprocessing, an SVM multi-classifier, and identification of the sample fault category. As shown in Figure 8, samples were collected from the vehicle model, the feature signals were transmitted to the diagnosis model through the Simulink module and the S-function, and the samples were then input into the SVM multi-classifier. The simulation results showed that vehicle faults could be accurately identified [81].
In addition, Zhuang et al. [82] optimized the energy management of HEVs using the DP method and obtained the global optimal operating point. The mode transition diagram was extracted from these operating points through data preprocessing using SVM, which was then applied to establish a real-time control strategy in an ECMS. The results showed that fuel economy and driving comfort were improved. Ji et al. [83] proposed an anomaly detection algorithm based on a one-class SVM for hybrid electric vehicles and verified its effectiveness in detecting anomalies using real vehicle data.
Shi et al. [84] adopted a support vector machine method to achieve more accurate recognition of driving cycles and then optimized it using a particle swarm algorithm. Based on this, they built a driving cycle recognition model using a support vector machine. This model was then applied in the equivalent fuel consumption minimization strategy. The results showed that both battery life and fuel economy were improved. The least squares support vector machine (LSSVM) is also used in hybrid electric vehicles. For example, Zhuang et al. [85] proposed a control strategy based on the least squares support vector machine (LSSVM) and trained the LSSVM using a dataset consisting of global optimal results. Simulation results showed the effectiveness of the strategy.
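As a compact illustration of the classification machinery involved, the sketch below trains a tiny linear SVM by sub-gradient descent on the regularized hinge loss, separating synthetic "urban" and "highway" feature vectors. The data, features, and hyper-parameters are invented for demonstration; the cited studies use tuned (often kernel or least-squares) SVMs:

```python
import random

def train_linear_svm(xs, ys, lam=0.001, lr=0.01, epochs=500, seed=1):
    """Minimize lam/2*|w|^2 + mean hinge loss by stochastic sub-gradient steps."""
    rng = random.Random(seed)
    w, b = [0.0, 0.0], 0.0
    data = list(zip(xs, ys))
    for _ in range(epochs):
        rng.shuffle(data)
        for x, y in data:
            margin = y * (w[0] * x[0] + w[1] * x[1] + b)
            if margin < 1:
                # Sub-gradient of lam*|w|^2/2 + max(0, 1 - margin)
                w = [wi - lr * (lam * wi - y * xi) for wi, xi in zip(w, x)]
                b += lr * y
            else:
                w = [wi * (1 - lr * lam) for wi in w]  # shrinkage only
    return w, b

def predict(w, b, x):
    return 1 if w[0] * x[0] + w[1] * x[1] + b >= 0 else -1

# Features: (mean speed / 100, stop frequency); labels: +1 urban, -1 highway.
xs = [(0.2, 0.9), (0.25, 0.8), (0.3, 0.85), (0.9, 0.1), (0.85, 0.2), (0.95, 0.15)]
ys = [1, 1, 1, -1, -1, -1]
w, b = train_linear_svm(xs, ys)
```

The hinge loss only updates on points inside the margin, which is why the trained boundary ends up determined by a few "support" samples rather than the whole dataset.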
In addition to the SVM algorithm, other supervised learning algorithms, such as decision trees and random forests, are also applied in the energy management strategy of HEVs. Wang et al. [86] encoded human expert knowledge into a decision tree and then used it to initialize the weights and biases of a neural network to establish an EMS. The results showed that energy consumption was reduced compared to the rule-based EMS.
The prediction process of a random forest is divided into three stages: bootstrapping, decision tree training, and aggregation, as shown in Figure 9. In a random forest, each decision tree is trained independently, and the final prediction is determined by aggregating the outputs of all the trees: for regression problems, the average of the trees’ predictions is used, while for classification problems a voting mechanism selects the category with the most votes [87]. Lu et al. [88] studied the EMS of a PHEV and established a random forest classification model. Dynamic programming was used to obtain the global optimal solution for the driving cycle data and form a dataset; the random forest was trained to select input features for the PHEV, and a neural network control strategy was established. The results showed that the strategy reduced energy consumption.
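The three-stage pipeline (bootstrap, train, aggregate) can be sketched with depth-1 trees (decision stumps) as deliberately simple stand-ins for full decision trees; the four labeled samples are synthetic placeholders:

```python
import random

def train_stump(data):
    """Pick the (feature, threshold, sign) split with the fewest errors."""
    best = None
    for f in range(len(data[0][0])):
        for x, _ in data:
            t = x[f]
            for sign in (1, -1):
                preds = [sign if xi[f] >= t else -sign for xi, _ in data]
                errs = sum(p != y for p, (_, y) in zip(preds, data))
                if best is None or errs < best[0]:
                    best = (errs, f, t, sign)
    _, f, t, sign = best
    return lambda x: sign if x[f] >= t else -sign

def random_forest(data, n_trees=15, seed=0):
    rng = random.Random(seed)
    trees = []
    for _ in range(n_trees):
        boot = [rng.choice(data) for _ in data]   # stage 1: bootstrap sample
        trees.append(train_stump(boot))           # stage 2: independent training
    def vote(x):                                  # stage 3: majority-vote aggregation
        return 1 if sum(t(x) for t in trees) >= 0 else -1
    return vote

# Synthetic samples: feature 0 separates class -1 (low) from class +1 (high).
data = [((0.1, 0.5), -1), ((0.2, 0.4), -1), ((0.8, 0.6), 1), ((0.9, 0.3), 1)]
forest = random_forest(data)
```

Each stump sees a different bootstrap resample, so individual trees disagree near the class boundary, and the vote averages those disagreements away.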

6.3. Neural Networks

A neural network (NN) is a mathematical model that simulates how biological nervous systems process information. By building many interconnected nodes, it mimics the collaborative functioning of nervous systems, learning complex patterns and features in the given data, and performing tasks such as classification and prediction. Neural networks can achieve a high degree of accuracy when handling complex and nonlinear problems and possess excellent learning and generalization capabilities [89].
In hybrid electric powertrain systems, there are several challenges, such as vehicle speed prediction, driving mode recognition, and fuel consumption optimization. These problems are highly nonlinear and are influenced by many uncertain factors, making them more complicated. Traditional algorithms often suffer from low computational efficiency and poor accuracy in addressing these issues. In contrast, neural networks, with their flexible architecture and powerful data learning capabilities, can extract complex patterns from large datasets and establish high-dimensional mapping relationships between nonlinear inputs and outputs. This makes neural networks widely used in the field of hybrid electric powertrain systems [90,91].
Huo et al. [92] proposed an energy management strategy based on an artificial neural network (ANN) for plug-in hybrid electric vehicles. The DP results were used as training sets, and the artificial neural network was then trained to achieve a torque distribution that reduces emissions. The evaluation results showed that the selected ANN models could achieve optimization effects comparable to the optimal solution of DP. An ECMS with fixed equivalent factors, by contrast, cannot achieve the best control performance for the vehicle; neural networks allow the equivalent factors to be adjusted dynamically based on changes in the driving environment and control requirements.
Xie et al. [93] proposed an improved ECMS based on an artificial neural network. The difference from the traditional method was that the artificial neural network was used to dynamically adjust the equivalent factor. An artificial neural network model with three inputs was constructed and trained. Simulations were performed by setting different initial SOC values. The results showed that the proposed ANN-ECMS method could provide better fuel economy in plug-in hybrid electric vehicles.
A back-propagation (BP) neural network is a multi-layer feed-forward neural network that propagates the calculated error backward to adjust the weights and biases of the network. Its advantage is strong adaptability, but its disadvantages are also obvious: it is prone to local optima and vanishing gradients. Its structure is shown in Figure 10. Shen et al. [94] proposed a vehicle speed prediction method based on a back-propagation neural network and a Markov chain. The Markov model was used to obtain the speed and prediction error, which were then fitted using the BP neural network. The results showed that compared with the adaptive equivalent consumption minimization strategy (A-ECMS) and the ECMS, this method effectively improved fuel economy.
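The forward pass and backward weight correction can be sketched for a single-input network with one hidden layer. The network size, learning rate, and training task here are arbitrary assumptions for illustration, not the configuration of [94].

```python
import math
import random

def train_bp(samples, hidden=4, lr=0.1, epochs=2000, seed=1):
    """Train a 1-input, one-hidden-layer BP network by gradient descent."""
    rng = random.Random(seed)
    w1 = [rng.uniform(-1, 1) for _ in range(hidden)]   # input -> hidden weights
    b1 = [0.0] * hidden
    w2 = [rng.uniform(-1, 1) for _ in range(hidden)]   # hidden -> output weights
    b2 = 0.0
    for _ in range(epochs):
        for x, t in samples:
            # forward pass
            h = [math.tanh(w1[i] * x + b1[i]) for i in range(hidden)]
            y = sum(w2[i] * h[i] for i in range(hidden)) + b2
            err = y - t
            # backward pass: propagate the error to every weight and bias
            for i in range(hidden):
                gh = err * w2[i] * (1.0 - h[i] ** 2)   # hidden-layer gradient
                w2[i] -= lr * err * h[i]
                w1[i] -= lr * gh * x
                b1[i] -= lr * gh
            b2 -= lr * err
    def predict(x):
        h = [math.tanh(w1[i] * x + b1[i]) for i in range(hidden)]
        return sum(w2[i] * h[i] for i in range(hidden)) + b2
    return predict
```

Because the initial weights are random and the loss surface is non-convex, different seeds can land in different local optima, which is exactly the weakness noted above.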
Back-propagation neural networks can also be combined with genetic algorithms. Genetic algorithms can optimize the architecture of neural networks to enhance their generalization capabilities, improve prediction accuracy by adjusting weights, and reduce training time by optimizing initial parameters. Therefore, using genetic algorithms in neural network training and optimization can achieve better results. Tang et al. [95] integrated a GA with back-propagation neural networks to develop a hybrid electric vehicle neural network model for cold start emission prediction. The weights and thresholds of the BP neural network were optimized by the GA. The prediction results of the model were studied by setting different cold start temperatures and inputting different parameters. The results demonstrated that the GA-BP algorithm model could accurately predict hydrocarbon, nitrogen oxides, and particulate matter number emissions in hybrid electric vehicles.
Fuzzy neural networks are intelligent systems that combine fuzzy logic and neural networks, leveraging the advantages of both. Chen et al. [96] proposed a power control method based on a compensatory fuzzy neural network (CFNN). This strategy optimized the convergence efficiency by adjusting the dynamic step size mechanism. In addition, the sample data selection, the fuzzy sets of input and output, and the determination of fuzzy rules were taken as key factors, and the impact of these factors on the performance of the algorithm was systematically analyzed. Zhang et al. [97] proposed a neural network-based fuzzy energy management strategy grounded in driving cycle recognition. The neural network was trained using driving cycle data to enable pattern recognition, and the training results were fed into the fuzzy controller to optimize the membership functions. The results demonstrated that this method could effectively apply optimal control strategies across various driving cycles, with superior robustness and performance compared to traditional fuzzy control strategies.
Wang et al. [98] proposed an EMS based on subtractive clustering and an adaptive fuzzy neural network (AFNN). First, the topological structure of the fuzzy neural network was determined by the subtractive clustering algorithm. Then, a hybrid optimization algorithm combining the back-propagation algorithm and the least squares method was introduced to optimize both the antecedent and consequent parameters of the fuzzy neural network. Finally, the fuzzy membership function and rule set of the neural network were utilized to achieve power distribution control for hybrid electric vehicles. The results indicated that the power battery operated 454 times at a current of 8 A, while the supercapacitor operated noticeably more often in the high-current range of 12 A to 16 A, protecting the power battery from large-current discharge.

6.4. Deep Learning

The core of deep learning is to process large-scale data by building a neural network with multiple hidden layers, thereby automatically extracting and learning complex features. Deep learning can effectively solve problems such as driving cycle identification, vehicle speed prediction, and energy management optimization in hybrid vehicles, and can usually achieve better performance than traditional methods. In the field of hybrid electric vehicles, common deep learning models include long short-term memory neural networks (LSTM), convolutional neural networks (CNNs), and Recursive Neural Networks (RNNs) [99,100].
The long short-term memory network (LSTM) is an enhanced version of the traditional recurrent neural network. It retains the advantages of traditional recurrent neural networks in processing and predicting time series data while introducing “memory cells” and gating mechanisms, including forget gates, input gates, and output gates. The principle is illustrated in Figure 11. Xia et al. [101] proposed a real-time predictive EMS based on an LSTM neural network. The LSTM neural network served as the foundation and was trained using vehicle speed profiles from multiple representative driving cycles. A model predictive control framework incorporating DP was designed to enable real-time optimization of power distribution between the internal combustion engine and the electric motor. As a result, the proposed strategy achieved improved fuel economy compared to the A-ECMS. Bao et al. [102] proposed a vehicle power split hybrid system energy management algorithm based on an LSTM network. The neural network was trained using data obtained from DP, and then the trained neural network was applied to a light commercial vehicle model. The results showed that the LSTM model could still adapt well and make accurate predictions when faced with driving cycle data that had never been encountered before, demonstrating good generalization ability and real-time control performance.
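The gating mechanism can be written out for a single scalar LSTM cell. The gate parameters below are hand-picked saturating values chosen purely to demonstrate the forget gate's effect on the memory cell; they are not trained weights.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def lstm_step(x, h_prev, c_prev, p):
    """One scalar LSTM step; p maps each gate to its (w_x, w_h, bias) weights."""
    f = sigmoid(p["f"][0] * x + p["f"][1] * h_prev + p["f"][2])    # forget gate
    i = sigmoid(p["i"][0] * x + p["i"][1] * h_prev + p["i"][2])    # input gate
    o = sigmoid(p["o"][0] * x + p["o"][1] * h_prev + p["o"][2])    # output gate
    g = math.tanh(p["g"][0] * x + p["g"][1] * h_prev + p["g"][2])  # candidate
    c = f * c_prev + i * g          # memory cell: keep old content, write new
    h = o * math.tanh(c)            # hidden state passed to the next time step
    return h, c

# With the forget gate saturated open and the input gate shut,
# the cell carries its memory through the step almost unchanged
keep = {"f": (0, 0, 10), "i": (0, 0, -10), "o": (0, 0, 10), "g": (1, 0, 0)}
h, c = lstm_step(0.0, h_prev=0.0, c_prev=1.0, p=keep)
```

This gated additive update of `c` is what lets an LSTM retain information over long speed sequences without the vanishing gradients of a plain recurrent network.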
Han et al. [103] proposed a vehicle speed prediction model based on CB-LSTM, in which the convolutional neural network (CNN) module was used to implement feature extraction. The Bi-LSTM module was used to extract time series features from the output of the CNN layer and finally generate prediction results. Compared with the classic BP network, the CB-LSTM prediction model significantly improved the accuracy of short-term vehicle speed prediction.
Convolutional neural networks (CNNs) are deep learning models specifically designed to process structured data, such as images or grid-like data. They extract local features through convolutional operations and progressively combine these features across multiple layers to accomplish complex pattern recognition tasks. The principle of a CNN is illustrated in Figure 12. Chen et al. [104] studied a driving cycle prediction method based on a CNN. By clustering the driving data, the data were divided into different types of driving cycles, and a CNN was then used to predict these different types of cycles. In the process of training the neural network, a deep neural network was further constructed based on the extraction of key information such as vehicle speed and battery SOC. The results showed that this method improved fuel economy by 14.68%.
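The local-feature extraction a CNN performs can be shown in one dimension on a speed trace. The speed samples and the difference kernel below are invented for illustration; a trained CNN would learn its kernels from data.

```python
def conv1d(signal, kernel):
    """Valid 1-D convolution: slide the kernel and take local dot products."""
    k = len(kernel)
    return [sum(signal[i + j] * kernel[j] for j in range(k))
            for i in range(len(signal) - k + 1)]

def relu(xs):
    # Nonlinearity: keep only positive responses
    return [max(0.0, x) for x in xs]

def max_pool(xs, size=2):
    """Downsample by keeping the strongest response in each window."""
    return [max(xs[i:i + size]) for i in range(0, len(xs) - size + 1, size)]

# A central-difference kernel responds to acceleration in a speed trace;
# pooling then summarizes where the vehicle was accelerating
speeds = [0, 0, 5, 10, 15, 15, 15, 10, 5, 0]   # hypothetical speed samples
accel_features = max_pool(relu(conv1d(speeds, [-1, 0, 1])))
```

Stacking several such convolution–pooling layers is what lets a CNN turn raw speed or grid data into higher-level driving-pattern features.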
Xie et al. [105] proposed a driving mode recognition method based on a CNN and kernel principal component analysis (KPCA). By integrating KPCA into the standard CNN architecture to reduce model parameters, the generalization capability for small sample datasets was enhanced. Subsequently, feature extraction was performed on the two-dimensional dataset. The results demonstrated that the recognition accuracy of this method surpassed that of traditional driving mode recognition algorithms relying on manual feature extraction.
Recursive Neural Networks (RNNs) are neural networks used to process tree-like or graph-like data. Their advantage is that they can handle hierarchical relationships and capture complex dependencies in data, but they impose strict requirements on the structure of the input data, need a pre-defined tree structure, and their training process is relatively complicated. Han et al. [106] proposed a novel real-time energy management strategy, RNN-A-ECMS, to address the challenges of multi-objective optimization in plug-in hybrid electric vehicles. The strategy employed PMP and particle swarm optimization to solve multi-objective optimization problems offline. Historical traffic data were optimized using DP, and the state-of-charge trajectory information was utilized to train the RNN. Additionally, a proportional–integral controller was implemented to adjust the equivalent factor. Simulation tests demonstrated that this approach effectively reduced fuel consumption and extended battery lifespan.
Millo et al. [107] proposed an energy management strategy based on a deep RNN. The driving data were generated using a digital twin model, and the global optimal solution generated by DP was used as a training dataset, which was then used to train the neural network. The experimental results showed that carbon dioxide emissions were significantly reduced compared with rule-based control strategies and an ECMS. Wu [108] proposed a predictive EMS based on an RNN and a radial basis function neural network and applied it to multi-mode plug-in hybrid electric vehicles. A pattern recognition network was introduced and trained using the optimization results of DP. For battery power modeling, the RNN was trained using the optimal result of PMP. The strategy was tailored to the multi-mode powertrain PHEV adopted in that research. To verify it, the strategy was compared with DP-based and ECMS-based strategies over multiple vehicle test cycles, with the maximum cost of each cycle obtained by the ECMS used as the cost baseline. The proposed algorithm saved 22.07%, 27.10%, 12.2%, 32.20%, 22.67%, and 34.68% in 13 SC03, 6 UDDS, 5 WLTC, 8 WVUSUB, 6 KM1, and 13 KM2 cycles, respectively. Compared with DP, the corresponding excess costs were 25.36%, 21.88%, 18.21%, 19.05%, 22.75%, and 23.57%, respectively.

6.5. Reinforcement Learning

In the field of hybrid electric vehicles, in addition to supervised and unsupervised learning, reinforcement learning is also a commonly used machine learning method. A reinforcement learning system consists of two parts: the agent and the environment. The agent explores the environment by trying different actions and then adjusts its strategy based on feedback from the environment. When an action leads to a positive reward, that action is encouraged; when it leads to a penalty, its frequency of occurrence is suppressed. By constantly trying different actions and evaluating their effects, the agent gradually learns the behavior that maximizes long-term reward in the current environment. This mechanism enables reinforcement learning to autonomously learn and optimize strategies without explicit guidance, thereby achieving goal optimization in complex dynamic environments [109]. The interaction between the agent and the environment during learning is shown in Figure 13.
Reinforcement learning has been widely applied in hybrid electric vehicle research, with algorithms such as Q-Learning and the Dyna algorithm being particularly prominent. Among these, Q-Learning is the most extensively utilized. Yang et al. [110] investigated the application of reinforcement learning to energy management within a model predictive control (MPC) framework. An LSTM network was employed for speed prediction and integrated with the hybrid electric vehicle prediction model to serve as the environment model for reinforcement learning. The approach enabled learning and selecting the optimal action within the prediction horizon, but only the action corresponding to the current state was executed. Yang et al. [111] used the Dyna algorithm to establish an EMS for HEVs. This method used actual experience to build a model, combining the advantages of direct reinforcement learning and model-based planning. The results showed that the algorithm improved the learning speed. Du et al. [112] proposed Dyna-H, a new reinforcement learning method that combined heuristic planning with the agent and showed higher computational efficiency and better optimization performance than DP and traditional Dyna in energy management problems.
Musa et al. designed an energy management strategy based on the Q-Learning algorithm. The Q-Learning algorithm was used to adjust the start timing and output torque of the internal combustion engine in a hybrid electric vehicle, as shown in Figure 14. The results showed that the fuel economy was improved compared with the DP [113]. Ahmadian et al. [114] developed an EMS based on the Q-Learning algorithm for hybrid electric vehicles. To address the optimization challenges of fuel consumption and battery lifespan, an online reward function was innovatively incorporated. This method eliminated the need for prior knowledge of driving cycles and avoided complex and precise modeling processes. Simulation results demonstrated that, compared to rule-based energy management strategies, this approach reduced fuel consumption by 1.25% and extended battery life by 65%.
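The tabular Q-Learning update underlying these strategies can be sketched on a toy problem. The SOC buckets, actions, and reward numbers below are invented for illustration and bear no relation to the controllers of [113] or [114]; they merely mimic the trade-off of rewarding electric drive while penalizing deep discharge.

```python
import random

def q_learning(states, actions, step_fn, episodes=1000,
               alpha=0.1, gamma=0.9, eps=0.2, seed=0):
    """Tabular Q-Learning: act, observe the reward, nudge Q toward the TD target."""
    rng = random.Random(seed)
    q = {(s, a): 0.0 for s in states for a in actions}
    for _ in range(episodes):
        s = states[0]
        for _ in range(20):                                   # short episodes
            a = (rng.choice(actions) if rng.random() < eps    # explore
                 else max(actions, key=lambda b: q[(s, b)]))  # exploit
            s2, r = step_fn(s, a, rng)
            target = r + gamma * max(q[(s2, b)] for b in actions)
            q[(s, a)] += alpha * (target - q[(s, a)])         # TD update
            s = s2
    return q

# Hypothetical toy problem: SOC buckets 0-3; electric drive is rewarded while
# charge remains, and the engine restores charge at a small fuel penalty
def toy_step(soc, action, rng):
    if action == "motor":
        return max(soc - 1, 0), (1.0 if soc > 1 else -1.0)
    return min(soc + 1, 3), 0.1

q = q_learning([0, 1, 2, 3], ["motor", "engine"], toy_step)
```

After training, the learned Q-table prefers the engine at low SOC and the motor at high SOC, which is the qualitative behavior an EMS expects from this update rule.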
By utilizing the efficient exploration and learning capabilities of Q-Learning, an approximate optimal solution can be found quickly in the rolling optimization process of an MPC-based EMS, reducing computational complexity while improving real-time performance and fuel economy. Chen et al. [115] designed a hybrid strategy combining reinforcement learning and stochastic model predictive control. Using the Q-Learning algorithm, a reinforcement learning controller was trained with driving power distribution data under different driving cycles, and the trained Q-table was used to calculate the rolling optimization process in MPC. Double Q-Learning (DQL) is an improved version of Q-Learning that effectively solves the problem of Q-value overestimation in traditional Q-Learning and improves convergence efficiency. Double Q-Learning can likewise be applied to the rolling optimization problem in MPC. Chen et al. [116] proposed an EMS based on MPC to achieve a reasonable distribution of engine and motor power. Two trained Q-tables were established based on the Double Q-Learning algorithm to solve the rolling optimization process in MPC. The results showed that this strategy could achieve good fuel economy and improve the adaptability of the EMS to dynamic driving environments.
Shen et al. [117] used the Double Delayed Q-learning (DDQL) algorithm in MPC to establish a PHEV energy management strategy. This method constructed the energy allocation problem as a nonlinear optimal control model and adopted a strategy that combined prediction and optimization to achieve efficient control. A vehicle speed prediction model was constructed using a convolutional neural network and embedded in the model predictive control framework. In addition, the Double Delayed Q-Learning algorithm was introduced to solve the rolling optimization process of MPC. The results showed that compared with the MPC-based EMS, driving environment changes had less impact on the Double Delayed Q-Learning fusion MPC strategy, achieving good fuel economy and higher computational efficiency.

6.6. Deep Reinforcement Learning

Deep reinforcement learning is a combination of reinforcement learning and deep learning, which uses deep learning to process high-dimensional state spaces and action spaces. The Q-Learning method has obvious limitations when dealing with high-dimensional problems because it is highly dependent on the discretization of state and action spaces, which leads to a sharp increase in computational complexity in high-dimensional or continuous spaces. To solve this problem, researchers combined the advantages of deep learning and proposed the Deep Q-Network (DQN). DQN leverages deep neural networks as function approximators to directly process high-dimensional continuous state spaces, effectively addressing the limitations of traditional Q-Learning in handling high-dimensional problems.
By combining reinforcement learning with deep learning, DQN not only improves the generalization ability of the algorithm but can also cope with more complex and realistic control and decision-making problems. Figure 15 is a structural diagram of the energy management strategy based on the DQN algorithm [118].
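The key difference from tabular Q-Learning is that DQN trains a network to regress toward bootstrapped targets. The target computation can be written compactly; the `q_target` lambda below is a toy stand-in for a frozen target network and is purely illustrative.

```python
def dqn_targets(batch, q_target, gamma=0.99):
    """Build DQN regression targets y = r + gamma * max_a' Q_target(s', a').

    batch holds (state, action, reward, next_state, done) transitions,
    e.g. sampled from an experience-replay buffer; q_target returns the
    frozen target network's Q-values for every action in a state.
    """
    ys = []
    for s, a, r, s2, done in batch:
        bootstrap = 0.0 if done else gamma * max(q_target(s2))
        ys.append(r + bootstrap)   # the online network regresses toward y
    return ys

# Toy stand-in for a target network over two actions
q_target = lambda s: [s, 2.0 * s]
targets = dqn_targets([(0.0, 0, 1.0, 1.0, False),
                       (0.0, 0, 1.0, 1.0, True)], q_target, gamma=0.5)
```

Keeping `q_target` frozen between periodic synchronizations, together with replaying stored transitions, is what stabilizes the otherwise divergence-prone combination of bootstrapping and function approximation.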
Tang et al. [119] designed an energy and emission collaborative optimization strategy with DQN as the core. Under the premise of combining emission optimization objectives, two distributed deep reinforcement learning techniques were used to design an energy management system. Wu et al. [120] applied the Deep Q-Learning (DQL) algorithm to the EMS of hybrid electric buses. The strategy used a neural network to estimate the action value function and was trained to optimize performance. Experimental results showed that compared with traditional Q-Learning, this strategy performed better in terms of training speed and convergence efficiency. Wang et al. [121] proposed a new energy management strategy to reduce the energy consumption of hybrid electric vehicles. The strategy adopted an enhanced parameterized Deep Q-Network (PDQN) and applied it to the proposed energy management strategy that integrated the battery aging mechanism. Qi et al. [122] proposed a hierarchical reinforcement learning EMS based on an improved version of the Deep Q-Learning algorithm (DQL-H). Compared with other reinforcement learning algorithms, this approach significantly improved training efficiency while achieving lower fuel consumption. Du et al. [123] designed a learning framework of the Double Deep Q-Learning (DDQL) algorithm for the EMS of a series hybrid electric tracked vehicle with the goal of solving the energy management optimization problem. Compared with previous deep reinforcement learning, it achieved higher training speed and less energy consumption.
Shi et al. [124] studied an adaptive ECMS plug-in hybrid electric vehicle energy management strategy improved by deep reinforcement learning. The strategy took the battery SOC and periodically predicted driving cycle information as input parameters and dynamically corrected the equivalent factor of ECMS based on these inputs through a Double Deep Q-Network (DDQN). The corrected ECMS further adjusted the engine torque demand and powertrain transmission ratio in real time and used a BP neural network to predict the average speed in the driving cycle information to achieve more accurate prediction of future driving information. The results indicated that the EMS reduced engine fuel consumption.
The Deep Deterministic Policy Gradient (DDPG) algorithm is based on the actor–critic architecture: it uses deep neural networks to construct a policy network (actor) and a value network (critic) and trains them with stochastic gradient descent. The policy network generates actions based on the current state, while the value network evaluates the value of the current state and action. The schematic diagram of the DDPG algorithm is shown in Figure 16 [125].
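The two DDPG ingredients that differ from DQN, the deterministic actor inside the critic target and the slowly tracking target networks, can be sketched as follows. The toy actor and critic functions in the usage note are invented for illustration.

```python
def ddpg_critic_target(r, s2, actor_target, critic_target, gamma=0.99):
    """Critic target r + gamma * Q'(s', mu'(s')): instead of a max over
    actions (impossible in a continuous action space), the target actor
    supplies the next action deterministically."""
    a2 = actor_target(s2)
    return r + gamma * critic_target(s2, a2)

def soft_update(target_ws, online_ws, tau=0.005):
    """Polyak averaging: target weights drift slowly toward the online
    weights, keeping the regression targets stable."""
    return [tau * w + (1.0 - tau) * wt
            for wt, w in zip(target_ws, online_ws)]
```

For example, with `actor_target = lambda s: s / 2` and `critic_target = lambda s, a: s + a`, a transition with `r = 1.0`, `s2 = 2.0`, and `gamma = 0.5` yields the target `1.0 + 0.5 * (2.0 + 1.0)`.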
Zhang et al. [126] developed a two-layer energy management strategy that combined deep reinforcement learning with driving state recognition for hierarchical collaborative control. In the upper layer, a sequence network was utilized to extract temporal features of vehicle speed, while a multi-objective reward function was designed in the lower layer. The Deep Deterministic Policy Gradient (DDPG) algorithm was employed to dynamically adjust the energy allocation strategy in real time. This energy management strategy improved fuel economy. Ruan et al. [127] applied the DDPG algorithm to the energy management of plug-in hybrid electric vehicles to achieve a reasonable distribution of double-motor driving force. The energy management was divided into two layers. The DDPG algorithm was introduced and trained in the upper layer architecture. In the lower layer architecture, the system selected and determined the most appropriate driving mode based on the energy consumption optimization strategy. This EMS achieved better power distribution performance.
He et al. [128] applied the DDPG algorithm to design a double-layer prediction EMS. The DDPG algorithm was used in the upper structure of the model and trained with actual speed data. In the lower structure, the short-term speed prediction capability of the deep neural network and the optimization mechanism of MPC were combined to track the SOC reference trajectory. Simulation results showed that the strategy achieved approximate performance of the global optimal algorithm and had excellent robustness and flexible adaptability to prediction errors. Ma et al. [129] proposed an improved DDPG algorithm for hybrid electric tracked vehicles. Based on a time-varying weighting factor, an online update framework was constructed to dynamically update network parameters using recent operating data, optimizing the model built on DDPG. Compared with the unimproved DDPG, fuel economy improved by 6%. In comparison to the energy management strategy based on DP, this strategy achieved suboptimal fuel economy while improving calculation speed. The EMS structure based on DDPG is shown in Figure 17 [118].
Twin Delayed Deep Deterministic Policy Gradient (TD3) is an improved version of the DDPG algorithm that introduces twin Q-value networks with delayed policy updates and smoothing regularization of the target policy. Zhou et al. [130] established an EMS based on the TD3 algorithm; the principal structure of TD3 is shown in Figure 18. A heuristic rule-based controller was combined with TD3 for improvement, and a hybrid experience replay method was developed to address external environmental interference. Yazar et al. [131] used the TD3 algorithm in the design of an EMS for HEVs, which showed better performance than existing EMS designs based on Q-Learning, Q-Network, and DDPG algorithms.
Huang et al. [132] designed a new EMS based on the TD3 algorithm. The EMS jointly aimed to reduce fuel consumption and slow battery degradation, thereby lowering operating costs. The results showed that battery capacity loss was reduced by 57.54%, with a 2.12% increase in fuel consumption. Its training efficiency and learning ability were better than those of the DDPG algorithm.
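Two of the TD3 modifications, target policy smoothing and the twin-critic minimum, appear directly in its target computation; the third, delaying actor updates relative to critic updates, happens in the training loop. The sketch below uses toy actor and critic functions that are invented for illustration.

```python
import random

def td3_target(r, s2, actor_t, critic1_t, critic2_t,
               gamma=0.99, noise_std=0.0, clip=0.5, rng=None):
    """TD3 target: smooth the target action with clipped noise, then
    bootstrap from the smaller of the twin critics to curb the
    overestimation that a single critic suffers from."""
    rng = rng or random.Random(0)
    eps = max(-clip, min(clip, rng.gauss(0.0, noise_std)))  # policy smoothing
    a2 = actor_t(s2) + eps
    return r + gamma * min(critic1_t(s2, a2), critic2_t(s2, a2))
```

Because the target always takes the pessimistic critic, an error in either Q-network is less likely to be amplified through bootstrapping.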
In addition, the soft actor–critic (SAC) is another deep reinforcement learning algorithm used in energy management strategies for hybrid electric vehicles. Huang et al. [133] proposed an intelligent EMS based on SAC and improved the algorithm by applying the prioritized experience replay technique. Huang et al. [134] established a deep reinforcement learning EMS for a power-split HEV. Using a data-driven approach, a specific driving cycle was constructed based on real data from a specific urban area, and an intelligent energy management strategy using an improved soft actor–critic algorithm with a new experience replay method was proposed. Simulation results showed that the fuel consumption and pollutant emissions of the proposed EMS were lower than those of the EMS using the standard SAC.
Li et al. [135] designed a novel model-free energy management strategy based on the SAC algorithm. The maximum entropy framework was introduced to achieve multiple power efficiency optimizations and better adaptability to driving cycles. Simulation results showed that, compared with the traditional EMS, it saved 4.37% in energy consumption and improved adaptability to different driving cycles. Li et al. [136] proposed a new EMS that combined intelligent transportation systems. They used a long short-term memory neural network to predict the speed of the preceding vehicle, combined the predictive cruise control strategy with the adaptive cruise control strategy, and established the EMS using SAC, as shown in Figure 19.
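The maximum entropy framework mentioned in [135] changes what "value" means: SAC augments the expected return with an entropy bonus so the policy is paid to stay stochastic. A minimal sketch for a discrete policy (the action names and Q-values are invented for illustration; SAC proper works with continuous actions):

```python
import math

def soft_value(policy, q, alpha=0.2):
    """Soft state value V(s) = sum_a pi(a|s) * (Q(s,a) - alpha * log pi(a|s)).
    The -alpha * log pi term is the entropy bonus: spreading probability
    mass raises V(s), so exploration is rewarded, not just tolerated."""
    return sum(p * (q[a] - alpha * math.log(p))
               for a, p in policy.items() if p > 0)

q = {"motor": 1.0, "engine": 1.0}              # hypothetical Q-values
v_stochastic = soft_value({"motor": 0.5, "engine": 0.5}, q)
v_greedy = soft_value({"motor": 1.0, "engine": 0.0}, q)
```

With equal Q-values, the uniform policy scores higher than the greedy one, which is why SAC retains stochasticity and adapts more readily to unseen driving cycles.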
In this paper, artificial intelligence algorithms in hybrid electric powertrain systems are divided into six parts: unsupervised learning, supervised learning, neural networks, deep learning, reinforcement learning, and deep reinforcement learning. The advantages and disadvantages of various algorithms are summarized, as shown in Table 1.

7. Summary and Outlook

This paper categorizes and elaborates on the latest research advancements in artificial intelligence algorithms for hybrid electric powertrain systems control. First, the traditional methods and their advantages and limitations are introduced, and then the specific application of artificial intelligence algorithms in hybrid electric powertrain systems is introduced. Then, the artificial intelligence algorithms used in hybrid electric powertrain systems are divided into unsupervised learning, supervised learning, neural networks, deep learning, reinforcement learning, and deep reinforcement learning. The advantages and disadvantages of each algorithm are summarized.
The primary advantage of unsupervised learning lies in its ability to uncover hidden patterns and structures within data without requiring extensive labeled datasets. However, it imposes high demands on data quality. This approach is predominantly applied in tasks such as driving mode classification and data dimensionality reduction. The primary advantage of supervised learning is its ability to deliver high-accuracy predictions and classifications when ample labeled data are available. However, its drawback lies in the tendency to overfit when data are insufficient, which compromises generalization performance. Supervised learning plays a critical role in applications such as fault diagnosis, driving cycle identification, and vehicle speed prediction.
Neural networks possess powerful nonlinear modeling capabilities, enabling them to handle complex relationships within high-dimensional data. However, this method is highly sensitive to initial weights, as different initial values can significantly impact training outcomes. Additionally, insufficient training data may lead to overfitting. Neural networks are widely applied in vehicle speed prediction, driving mode recognition, emission prediction, and fuel consumption optimization. More specifically, they can also be utilized to dynamically adjust the equivalent factor in the equivalent consumption minimization strategy.
Deep learning can extract more complex features and patterns through its multi-layer neural network architecture, demonstrating significant advantages in applications such as fault diagnosis, battery state estimation, vehicle speed prediction, driving mode recognition, and enhancing driving comfort. Reinforcement learning does not require the prior establishment of an accurate environmental model and exhibits strong adaptive capabilities. Specifically, it offers significant advantages in optimizing energy distribution between internal combustion engines and electric motors, refining driving strategies, and managing battery systems.
Deep reinforcement learning integrates the strengths of deep learning and reinforcement learning, utilizing deep neural networks to process state, action, and reward information. It addresses the limitations of traditional reinforcement learning in handling high-dimensional continuous state spaces and is applied in tasks such as energy distribution and battery management.
In the future, novel artificial intelligence algorithms will be developed to enable control systems to achieve superior performance in complex and dynamic scenarios. Additionally, with the rapid advancement of intelligent transportation systems and vehicle networking technologies, the application of artificial intelligence algorithms will allow for more accurate predictions of driving behavior and traffic condition changes. Furthermore, the integration of artificial intelligence algorithms with intelligent networking technologies will drive innovation in vehicle energy management systems and contribute significantly to the evolution of intelligent transportation systems.

Author Contributions

Conceptualization, D.Z. and B.L.; methodology, B.L.; validation, B.L.; formal analysis, W.F.; investigation, D.Z.; resources, J.T.; data curation, W.F.; writing—original draft preparation, D.Z.; writing—review and editing, D.Z. and B.L.; visualization, L.L.; supervision, B.L.; project administration, J.T. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Conflicts of Interest

The authors declare no conflict of interest.

References

  15. Tran, D.D.; Vafaeipour, M.; El Baghdadi, M.; Barrero, R.; Van Mierlo, J.; Hegazy, O. Thorough state-of-the-art analysis of electric and hybrid vehicle powertrains: Topologies and integrated energy management strategies. Renew. Sustain. Energy Rev. 2020, 119, 109596. [Google Scholar] [CrossRef]
  16. Jui, J.J.; Ahmad, M.A.; Molla, M.I.; Rashid, M.I.M. Optimal energy management strategies for hybrid electric vehicles: A recent survey of machine learning approaches. J. Eng. Res. 2024, 12, 454–467. [Google Scholar] [CrossRef]
  17. Denis, N.; Dubois, M.R.; Desrochers, A. Fuzzy-based blended control for the energy management of a parallel plug-in hybrid electric vehicle. IET Intell. Transp. Syst. 2015, 9, 30–37. [Google Scholar] [CrossRef]
  18. Banvait, H.; Anwar, S.; Chen, Y. A rule-based energy management strategy for plug-in hybrid electric vehicle (PHEV). In Proceedings of the 2009 American Control Conference, St. Louis, MO, USA, 10–12 June 2009; IEEE: Piscataway, NJ, USA, 2009; pp. 3938–3943. [Google Scholar]
  19. Xie, S.; He, H.; Peng, J. An energy management strategy based on stochastic model predictive control for plug-in hybrid electric buses. Appl. Energy 2017, 196, 279–288. [Google Scholar] [CrossRef]
  20. Li, Y.; He, H.; Peng, J.; Wang, H. Deep reinforcement learning-based energy management for a series hybrid electric vehicle enabled by history cumulative trip information. IEEE Trans. Veh. Technol. 2019, 68, 7416–7430. [Google Scholar] [CrossRef]
  21. Liu, T.; Zou, Y.; Liu, D.; Sun, F. Reinforcement learning of adaptive energy management with transition probability for a hybrid electric tracked vehicle. IEEE Trans. Ind. Electron. 2015, 62, 7837–7846. [Google Scholar] [CrossRef]
  22. Enang, W.; Bannister, C. Modelling and control of hybrid electric vehicles (A comprehensive review). Renew. Sustain. Energy Rev. 2017, 74, 1210–1239. [Google Scholar] [CrossRef]
  23. Cao, Y.; Yao, M.; Sun, X. An overview of modelling and energy management strategies for hybrid electric vehicles. Appl. Sci. 2023, 13, 5947. [Google Scholar] [CrossRef]
  24. Song, C.; Kim, K.; Sung, D.; Kim, K.; Yang, H.; Lee, H.; Cho, G.Y.; Cha, S.W. A review of optimal energy management strategies using machine learning techniques for hybrid electric vehicles. Int. J. Automot. Technol. 2021, 22, 1437–1452. [Google Scholar] [CrossRef]
  25. Selvakumar, S.G. Electric and hybrid vehicles–A comprehensive overview. In Proceedings of the 2021 IEEE 2nd International Conference On Electrical Power and Energy Systems (ICEPES), Bhopal, India, 10–11 December 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 1–6. [Google Scholar]
  26. Cheng, H.; Wang, L.; Xu, L.; Ge, X.; Yang, S. An integrated electrified powertrain topology with SRG and SRM for plug-in hybrid electrical vehicle. IEEE Trans. Ind. Electron. 2019, 67, 8231–8241. [Google Scholar] [CrossRef]
  27. Fathabadi, H. Fuel cell hybrid electric vehicle (FCHEV): Novel fuel cell/SC hybrid power generation system. Energy Convers. Manag. 2018, 156, 192–201. [Google Scholar] [CrossRef]
  28. Zhuang, W.; Li, S.; Zhang, X.; Kum, D.; Song, Z.; Yin, G.; Ju, F. A survey of powertrain configuration studies on hybrid electric vehicles. Appl. Energy 2020, 262, 114553. [Google Scholar] [CrossRef]
  29. Chen, B.; Evangelou, S.A.; Lot, R. Series hybrid electric vehicle simultaneous energy management and driving speed optimization. IEEE/ASME Trans. Mechatron. 2019, 24, 2756–2767. [Google Scholar] [CrossRef]
  30. Veerendra, A.S.; Mohamed, M.R.; Leung, P.K.; Shah, A.A. Hybrid power management for fuel cell/supercapacitor series hybrid electric vehicle. Int. J. Green Energy 2021, 18, 128–143. [Google Scholar] [CrossRef]
  31. Yang, Y.; Hu, X.; Pei, H.; Peng, Z. Comparison of power-split and parallel hybrid powertrain architectures with a single electric machine: Dynamic programming approach. Appl. Energy 2016, 168, 683–690. [Google Scholar] [CrossRef]
  32. Wang, Y.; Wang, X.; Sun, Y.; You, S. Model predictive control strategy for energy optimization of series-parallel hybrid electric vehicle. J. Clean. Prod. 2018, 199, 348–358. [Google Scholar] [CrossRef]
  33. Xie, S.; Hu, X.; Qi, S.; Tang, X.; Lang, K.; Xin, Z.; Brighton, J. Model predictive energy management for plug-in hybrid electric vehicles considering optimal battery depth of discharge. Energy 2019, 173, 667–678. [Google Scholar] [CrossRef]
  34. Sarvaiya, S.; Ganesh, S.; Xu, B. Comparative analysis of hybrid vehicle energy management strategies with optimization of fuel economy and battery life. Energy 2021, 228, 120604. [Google Scholar] [CrossRef]
  35. Chrenko, D.; Gan, S.; Gutenkunst, C.; Kriesten, R.; Le Moyne, L. Novel classification of control strategies for hybrid electric vehicles. In Proceedings of the 2015 IEEE Vehicle Power and Propulsion Conference (VPPC), Montreal, QC, Canada, 19–22 October 2015; IEEE: Piscataway, NJ, USA, 2015; pp. 1–6. [Google Scholar]
  36. Wirasingha, S.G.; Emadi, A. Classification and review of control strategies for plug-in hybrid electric vehicles. IEEE Trans. Veh. Technol. 2010, 60, 111–122. [Google Scholar] [CrossRef]
  37. Yin, H.; Zhou, W.; Li, M.; Ma, C.; Zhao, C. An adaptive fuzzy logic-based energy management strategy on battery/ultracapacitor hybrid electric vehicles. IEEE Trans. Transp. Electrif. 2016, 2, 300–311. [Google Scholar] [CrossRef]
  38. Panday, A.; Bansal, H.O. A review of optimal energy management strategies for hybrid electric vehicle. Int. J. Veh. Technol. 2014, 2014, 160510. [Google Scholar] [CrossRef]
  39. Padmarajan, B.V.; McGordon, A.; Jennings, P.A. Blended rule-based energy management for PHEV: System structure and strategy. IEEE Trans. Veh. Technol. 2015, 65, 8757–8762. [Google Scholar] [CrossRef]
  40. Kim, M.; Jung, D.; Min, K. Hybrid thermostat strategy for enhancing fuel economy of series hybrid intracity bus. IEEE Trans. Veh. Technol. 2014, 63, 3569–3579. [Google Scholar] [CrossRef]
  41. Peng, C.; Feng, F.; Xiao, Y.; Liang, W. Multi-working points Power follower based energy management strategy for series hybrid electric vehicle. J. Phys. Conf. Ser. 2020, 1601, 022039. [Google Scholar] [CrossRef]
  42. Shi, D.; Wang, S.; Pisu, P.; Chen, L.; Wang, R.; Wang, R. Modeling and optimal energy management of a power split hybrid electric vehicle. Sci. China Technol. Sci. 2017, 60, 713–725. [Google Scholar] [CrossRef]
  43. Mohd Sabri, M.F.; Danapalasingam, K.A.; Rahmat, M.F. Improved fuel economy of through-the-road hybrid electric vehicle with fuzzy logic-based energy management strategy. Int. J. Fuzzy Syst. 2018, 20, 2677–2692. [Google Scholar] [CrossRef]
  44. Montazeri-Gh, M.; Mahmoodi-K, M. Optimized predictive energy management of plug-in hybrid electric vehicle based on traffic condition. J. Clean. Prod. 2016, 139, 935–948. [Google Scholar] [CrossRef]
  45. Wu, J.; Zhang, C.H.; Cui, N.X. Fuzzy energy management strategy for a hybrid electric vehicle based on driving cycle recognition. Int. J. Automot. Technol. 2012, 13, 1159–1167. [Google Scholar] [CrossRef]
  46. Dawei, M.; Yu, Z.; Meilan, Z.; Risha, N. Intelligent fuzzy energy management research for a uniaxial parallel hybrid electric vehicle. Comput. Electr. Eng. 2017, 58, 447–464. [Google Scholar] [CrossRef]
  47. Zou, K.; Luo, W.; Lu, Z. Real-Time Energy Management Strategy of Hydrogen Fuel Cell Hybrid Electric Vehicles Based on Power Following Strategy–Fuzzy Logic Control Strategy Hybrid Control. World Electr. Veh. J. 2023, 14, 315. [Google Scholar] [CrossRef]
  48. Shabbir, W.; Evangelou, S.A. Threshold-changing control strategy for series hybrid electric vehicles. Appl. Energy 2019, 235, 761–775. [Google Scholar] [CrossRef]
  49. Lü, X.; Wu, Y.; Lian, J.; Zhang, Y.; Chen, C.; Wang, P.; Meng, L. Energy management of hybrid electric vehicles: A review of energy optimization of fuel cell hybrid power system based on genetic algorithm. Energy Convers. Manag. 2020, 205, 112474. [Google Scholar] [CrossRef]
  50. Liu, Y.; Gao, J.; Qin, D.; Zhang, Y.; Lei, Z. Rule-corrected energy management strategy for hybrid electric vehicles based on operation-mode prediction. J. Clean. Prod. 2018, 188, 796–806. [Google Scholar] [CrossRef]
  51. Wu, J.; Ruan, J.; Zhang, N.; Walker, P.D. An optimized real-time energy management strategy for the power-split hybrid electric vehicles. IEEE Trans. Control Syst. Technol. 2018, 27, 1194–1202. [Google Scholar] [CrossRef]
  52. Wang, R.; Lukic, S.M. Dynamic programming technique in hybrid electric vehicle optimization. In Proceedings of the 2012 IEEE International Electric Vehicle Conference, Greenville, SC, USA, 4–8 March 2012; IEEE: Piscataway, NJ, USA, 2012; pp. 1–8. [Google Scholar]
  53. Desai, C.; Williamson, S.S. Optimal design of a parallel hybrid electric vehicle using multi-objective genetic algorithms. In Proceedings of the 2009 IEEE Vehicle Power and Propulsion Conference, Dearborn, MI, USA, 7–11 September 2009; IEEE: Piscataway, NJ, USA, 2009; pp. 871–876. [Google Scholar]
  54. Wang, Y.; Wu, Z.; Chen, Y.; Xia, A.; Guo, C.; Tang, Z. Research on energy optimization control strategy of the hybrid electric vehicle based on Pontryagin’s minimum principle. Comput. Electr. Eng. 2018, 72, 203–213. [Google Scholar] [CrossRef]
  55. Rezaei, A.; Burl, J.B.; Zhou, B. Estimation of the ECMS equivalent factor bounds for hybrid electric vehicles. IEEE Trans. Control Syst. Technol. 2017, 26, 2198–2205. [Google Scholar] [CrossRef]
  56. Kim, T.S.; Manzie, C.; Sharma, R. Model predictive control of velocity and torque split in a parallel hybrid vehicle. In Proceedings of the 2009 IEEE International Conference on Systems, Man and Cybernetics, San Antonio, TX, USA, 11–14 October 2009; IEEE: Piscataway, NJ, USA, 2009; pp. 2014–2019. [Google Scholar]
  57. Montazeri-Gh, M.; Mahmoodi-K, M. Development a new power management strategy for power split hybrid electric vehicles. Transp. Res. Part D Transp. Environ. 2015, 37, 79–96. [Google Scholar] [CrossRef]
  58. Hu, J.; Li, J.; Hu, Z.; Xu, L.; Ouyang, M. Power distribution strategy of a dual-engine system for heavy-duty hybrid electric vehicles using dynamic programming. Energy 2021, 215, 118851. [Google Scholar] [CrossRef]
  59. Du, J.; Zhang, X.; Wang, T.; Song, Z.; Yang, X.; Wang, H.; Ouyang, M.; Wu, X. Battery degradation minimization oriented energy management strategy for plug-in hybrid electric bus with multi-energy storage system. Energy 2018, 165, 153–163. [Google Scholar] [CrossRef]
  60. Yao, Z.; Shao, R.; Zhan, S.; Mo, R.; Wu, Z. Energy management strategy for fuel cell hybrid electric vehicles using Pontryagin’s minimum principle and dynamic SoC planning. Energy Sources Part A Recovery Util. Environ. Eff. 2024, 46, 5112–5513. [Google Scholar] [CrossRef]
  61. Sun, X.; Chen, Z.; Han, S.; Tian, X.; Jin, Z.; Cao, Y.; Xue, M. Adaptive real-time ECMS with equivalent factor optimization for plug-in hybrid electric buses. Energy 2024, 304, 132014. [Google Scholar] [CrossRef]
  62. Zeng, T.; Zhang, C.; Zhang, Y.; Deng, C.; Hao, D.; Zhu, Z.; Ran, H.; Cao, D. Optimization-oriented adaptive equivalent consumption minimization strategy based on short-term demand power prediction for fuel cell hybrid vehicle. Energy 2021, 227, 120305. [Google Scholar] [CrossRef]
  63. Bonab, S.A.; Emadi, A. MPC-based energy management strategy for an autonomous hybrid electric vehicle. IEEE Open J. Ind. Appl. 2020, 1, 171–180. [Google Scholar] [CrossRef]
  64. Quan, S.; Wang, Y.X.; Xiao, X.; He, H.; Sun, F. Real-time energy management for fuel cell electric vehicle using speed prediction-based model predictive control considering performance degradation. Appl. Energy 2021, 304, 117845. [Google Scholar] [CrossRef]
  65. Mazouzi, A.; Hadroug, N.; Alayed, W.; Hafaifa, A.; Iratni, A.; Kouzou, A. Comprehensive optimization of fuzzy logic-based energy management system for fuel-cell hybrid electric vehicle using genetic algorithm. Int. J. Hydrogen Energy 2024, 81, 889–905. [Google Scholar] [CrossRef]
  66. Fan, L.; Wang, Y.; Wei, H.; Zhang, Y.; Zheng, P.; Huang, T.; Li, W. A GA-based online real-time optimized energy management strategy for plug-in hybrid electric vehicles. Energy 2022, 241, 122811. [Google Scholar] [CrossRef]
  67. Kapil, S.; Chawla, M.; Ansari, M.D. On K-means data clustering algorithm with genetic algorithm. In Proceedings of the 2016 Fourth International Conference on Parallel, Distributed and Grid Computing (PDGC), Waknaghat, India, 22–24 December 2016; IEEE: Piscataway, NJ, USA, 2016; pp. 202–206. [Google Scholar]
  68. Ding, S.; Su, C.; Yu, J. An optimizing BP neural network algorithm based on genetic algorithm. Artif. Intell. Rev. 2011, 36, 153–162. [Google Scholar] [CrossRef]
  69. Belhadj, S.; Belmokhtar, K.; Ghedamsi, K. Improvement of Energy Management Control Strategy of Fuel Cell Hybrid Electric Vehicles Based on Artificial Intelligence Techniques. J. Eur. Des Systèmes Autom. 2019, 52, 541–550. [Google Scholar] [CrossRef]
  70. Xiong, R.; Cao, J.; Yu, Q. Reinforcement learning-based real-time power management for hybrid energy storage system in the plug-in hybrid electric vehicle. Appl. Energy 2018, 211, 538–548. [Google Scholar] [CrossRef]
  71. Tang, X.; Zhang, J.; Pi, D.; Lin, X.; Grzesiak, L.M.; Hu, X. Battery health-aware and deep reinforcement learning-based energy management for naturalistic data-driven driving scenarios. IEEE Trans. Transp. Electrif. 2021, 8, 948–964. [Google Scholar] [CrossRef]
  72. Lemieux, J.; Ma, Y. Vehicle speed prediction using deep learning. In Proceedings of the 2015 IEEE Vehicle Power and Propulsion Conference (VPPC), Montreal, QC, Canada, 19–22 October 2015; IEEE: Piscataway, NJ, USA, 2015; pp. 1–5. [Google Scholar]
  73. Chen, Z.; Wu, S.; Shen, S.; Liu, Y.; Guo, F.; Zhang, Y. Co-optimization of velocity planning and energy management for autonomous plug-in hybrid electric vehicles in urban driving scenarios. Energy 2023, 263, 126060. [Google Scholar] [CrossRef]
  74. James, G.; Witten, D.; Hastie, T.; Tibshirani, R.; Taylor, J. Unsupervised learning. In An Introduction to Statistical Learning: With Applications in Python; Springer International Publishing: Cham, Switzerland, 2023; pp. 503–556. [Google Scholar]
  75. Zhang, J.; Chu, L.; Wang, X.; Guo, C.; Fu, Z.; Zhao, D. Optimal energy management strategy for plug-in hybrid electric vehicles based on a combined clustering analysis. Appl. Math. Model. 2021, 94, 49–67. [Google Scholar] [CrossRef]
  76. Sun, X.; Cao, Y.; Jin, Z.; Tian, X.; Xue, M. An adaptive ECMS based on traffic information for plug-in hybrid electric buses. IEEE Trans. Ind. Electron. 2022, 70, 9248–9259. [Google Scholar] [CrossRef]
  77. Li, S.; Hu, M.; Gong, C.; Zhan, S.; Qin, D. Energy management strategy for hybrid electric vehicle based on driving condition identification using KGA-means. Energies 2018, 11, 1531. [Google Scholar] [CrossRef]
  78. Chang, C.; Zhao, W.; Wang, C.; Song, Y. A novel energy management strategy integrating deep reinforcement learning and rule based on condition identification. IEEE Trans. Veh. Technol. 2022, 72, 1674–1688. [Google Scholar] [CrossRef]
  79. Liu, T.; Yu, H.; Guo, H.; Qin, Y.; Zou, Y. Online energy management for multimode plug-in hybrid electric vehicles. IEEE Trans. Ind. Inform. 2018, 15, 4352–4361. [Google Scholar] [CrossRef]
  80. Ye, F.; Zhai, X. Research on energy management strategy of diesel hybrid electric vehicle based on decision tree CART algorithm. In IOP Conference Series: Materials Science and Engineering; IOP Publishing: Bristol, UK, 2019; Volume 677, p. 032076. [Google Scholar]
  81. Liu, F.; Liu, B.; Zhang, J.; Wan, P.; Li, B. Fault mode detection of a hybrid electric vehicle by using support vector machine. Energy Rep. 2023, 9, 137–148. [Google Scholar] [CrossRef]
  82. Zhuang, W.; Zhang, X.; Li, D.; Wang, L.; Yin, G. Mode shift map design and integrated energy management control of a multi-mode hybrid electric vehicle. Appl. Energy 2017, 204, 476–488. [Google Scholar] [CrossRef]
  83. Ji, Y.; Lee, H. Event-based anomaly detection using a one-class SVM for a hybrid electric vehicle. IEEE Trans. Veh. Technol. 2022, 71, 6032–6043. [Google Scholar] [CrossRef]
  84. Shi, Q.; Qiu, D.; He, L.; Wu, B.; Li, Y. Support vector machine–based driving cycle recognition for dynamic equivalent fuel consumption minimization strategy with hybrid electric vehicle. Adv. Mech. Eng. 2018, 10, 1687814018811020. [Google Scholar] [CrossRef]
  85. Liu, Y.; Zhang, Y.; Yu, H.; Nie, Z.; Liu, Y.; Chen, Z. A novel data-driven controller for plug-in hybrid electric vehicles with improved adaptabilities to driving environment. J. Clean. Prod. 2022, 334, 130250. [Google Scholar] [CrossRef]
  86. Wang, H.; Arjmandzadeh, Z.; Ye, Y.; Zhang, J.; Xu, B. FlexNet: A warm start method for deep reinforcement learning in hybrid electric vehicle energy management applications. Energy 2024, 288, 129773. [Google Scholar] [CrossRef]
  87. Adedeji, B.P. Energy parameter modeling in plug-in hybrid electric vehicles using supervised machine learning approaches. E-Prime Adv. Electr. Eng. Electron. Energy 2024, 8, 100584. [Google Scholar] [CrossRef]
  88. Lu, Z.; Tian, H.; Li, R.; Tian, G. Neural network energy management strategy with optimal input features for plug-in hybrid electric vehicles. Energy 2023, 285, 129399. [Google Scholar] [CrossRef]
  89. Adedeji, B.P. A multivariable output neural network approach for simulation of plug-in hybrid electric vehicle fuel consumption. Green Energy Intell. Transp. 2023, 2, 100070. [Google Scholar] [CrossRef]
  90. Faria, J.; Pombo, J.; Calado, M.D.R.; Mariano, S. Power management control strategy based on artificial neural networks for standalone PV applications with a hybrid energy storage system. Energies 2019, 12, 902. [Google Scholar] [CrossRef]
  91. Lin, X.; Wang, Z.; Wu, J. Energy management strategy based on velocity prediction using back propagation neural network for a plug-in fuel cell electric vehicle. Int. J. Energy Res. 2021, 45, 2629–2643. [Google Scholar] [CrossRef]
  92. Huo, D.; Meckl, P. Power management of a plug-in hybrid electric vehicle using neural networks with comparison to other approaches. Energies 2022, 15, 5735. [Google Scholar] [CrossRef]
  93. Xie, S.; Hu, X.; Qi, S.; Lang, K. An artificial neural network-enhanced energy management strategy for plug-in hybrid electric vehicles. Energy 2018, 163, 837–848. [Google Scholar] [CrossRef]
  94. Shen, P.; Zhao, Z.; Zhan, X.; Li, J.; Guo, Q. Optimal energy management strategy for a plug-in hybrid electric commercial vehicle based on velocity prediction. Energy 2018, 155, 838–852. [Google Scholar] [CrossRef]
  95. Tang, D.; Zhang, Z.; Hua, L.; Pan, J.; Xiao, Y. Prediction of cold start emissions for hybrid electric vehicles based on genetic algorithms and neural networks. J. Clean. Prod. 2023, 420, 138403. [Google Scholar] [CrossRef]
  96. Chen, H.; Wu, G. Compensation fuzzy neural network power management strategy for hybrid electric vehicle. Tongji Daxue Xuebao J. Tongji Univ. 2009, 37, 525–530. [Google Scholar]
  97. Zhang, Q.; Fu, X. A neural network fuzzy energy management strategy for hybrid electric vehicles based on driving cycle recognition. Appl. Sci. 2020, 10, 696. [Google Scholar] [CrossRef]
  98. Wang, Q.; Luo, Y. Research on a new power distribution control strategy of hybrid energy storage system for hybrid electric vehicles based on the subtractive clustering and adaptive fuzzy neural network. Clust. Comput. 2022, 25, 4413–4422. [Google Scholar] [CrossRef]
  99. Estrada, P.M.; de Lima, D.; Bauer, P.H.; Mammetti, M.; Bruno, J.C. Deep learning in the development of energy management strategies of hybrid electric vehicles: A hybrid modeling approach. Appl. Energy 2023, 329, 120231. [Google Scholar] [CrossRef]
  100. Alaoui, C. Hybrid vehicle energy management using deep learning. In Proceedings of the 2019 International Conference on Intelligent Systems and Advanced Computing Sciences (ISACS), Taza, Morocco, 26–27 December 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 1–5. [Google Scholar]
  101. Xia, J.; Wang, F.; Xu, X. A predictive energy management strategy for multi-mode plug-in hybrid electric vehicle based on long short-term memory neural network. IFAC-Pap. 2021, 54, 132–137. [Google Scholar] [CrossRef]
  102. Bao, S.; Tang, S.; Sun, P.; Wang, T. LSTM-based energy management algorithm for a vehicle power-split hybrid powertrain. Energy 2023, 284, 129267. [Google Scholar] [CrossRef]
  103. Han, S.; Zhang, F.; Xi, J.; Ren, Y.; Xu, S. Short-term vehicle speed prediction based on convolutional bidirectional lstm networks. In Proceedings of the 2019 IEEE Intelligent Transportation Systems Conference (ITSC), Auckland, New Zealand, 27–30 October 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 4055–4060. [Google Scholar]
  104. Chen, Z.; Yang, C.; Fang, S. A convolutional neural network-based driving cycle prediction method for plug-in hybrid electric vehicles with bus route. IEEE Access 2019, 8, 3255–3264. [Google Scholar] [CrossRef]
  105. Xie, L.; Tao, J.; Zhang, Q.; Zhou, H. CNN and KPCA-based automated feature extraction for real time driving pattern recognition. IEEE Access 2019, 7, 123765–123775. [Google Scholar] [CrossRef]
  106. Han, L.; Jiao, X.; Zhang, Z. Recurrent neural network-based adaptive energy management control strategy of plug-in hybrid electric vehicles considering battery aging. Energies 2020, 13, 202. [Google Scholar] [CrossRef]
  107. Millo, F.; Rolando, L.; Tresca, L.; Pulvirenti, L. Development of a neural network-based energy management system for a plug-in hybrid electric vehicle. Transp. Eng. 2023, 11, 100156. [Google Scholar] [CrossRef]
  108. Wu, Y.; Zhang, Y.; Li, G.; Shen, J.; Chen, Z.; Liu, Y. A predictive energy management strategy for multi-mode plug-in hybrid electric vehicles based on multi neural networks. Energy 2020, 208, 118366. [Google Scholar] [CrossRef]
  109. Lin, X.; Wang, Y.; Bogdan, P.; Chang, N.; Pedram, M. Reinforcement learning based power management for hybrid electric vehicles. In Proceedings of the 2014 IEEE/ACM International Conference on Computer-Aided Design (ICCAD), San Jose, CA, USA, 2–6 November 2014; IEEE: Piscataway, NJ, USA, 2014; pp. 33–38. [Google Scholar]
  110. Yang, N.; Ruan, S.; Han, L.; Liu, H.; Guo, L.; Xiang, C. Reinforcement learning-based real-time intelligent energy management for hybrid electric vehicles in a model predictive control framework. Energy 2023, 270, 126971. [Google Scholar] [CrossRef]
  111. Yang, N.; Han, L.; Xiang, C.; Liu, H.; Hou, X. Energy management for a hybrid electric vehicle based on blended reinforcement learning with backward focusing and prioritized sweeping. IEEE Trans. Veh. Technol. 2021, 70, 3136–3148. [Google Scholar] [CrossRef]
  112. Du, G.; Zou, Y.; Zhang, X.; Liu, T.; Wu, J.; He, D. Deep reinforcement learning based energy management for a hybrid electric vehicle. Energy 2020, 201, 117591. [Google Scholar] [CrossRef]
  113. Musa, A.; Anselma, P.G.; Belingardi, G.; Misul, D.A. Energy Management in Hybrid Electric Vehicles: A Q-Learning Solution for Enhanced Drivability and Energy Efficiency. Energies 2023, 17, 62. [Google Scholar] [CrossRef]
  114. Ahmadian, S.; Tahmasbi, M.; Abedi, R. Q-learning based control for energy management of series-parallel hybrid vehicles with balanced fuel consumption and battery life. Energy AI 2023, 11, 100217. [Google Scholar] [CrossRef]
  115. Chen, Z.; Mi, C.C.; Xia, B.; You, C. Stochastic model predictive control for energy management of power-split plug-in hybrid electric vehicles based on reinforcement learning. Energy 2020, 211, 118931. [Google Scholar] [CrossRef]
  116. Chen, Z.; Gu, H.; Shen, S.; Shen, J. Energy management strategy for power-split plug-in hybrid electric vehicle based on MPC and double Q-learning. Energy 2022, 245, 123182. [Google Scholar] [CrossRef]
  117. Shen, S.; Gao, S.; Liu, Y.; Zhang, Y.; Shen, J.; Chen, Z.; Lei, Z. Real-time energy management for plug-in hybrid electric vehicles via incorporating double-delay Q-learning and model prediction control. IEEE Access 2022, 10, 131076–131089. [Google Scholar] [CrossRef]
  118. Mei, P.; Karimi, H.R.; Xie, H.; Chen, F.; Huang, C.; Yang, S. A deep reinforcement learning approach to energy management control with connected information for hybrid electric vehicles. Eng. Appl. Artif. Intell. 2023, 123, 106239. [Google Scholar] [CrossRef]
  119. Tang, X.; Chen, J.; Liu, T.; Qin, Y.; Cao, D. Distributed deep reinforcement learning-based energy and emission management strategy for hybrid electric vehicles. IEEE Trans. Veh. Technol. 2021, 70, 9922–9934. [Google Scholar] [CrossRef]
  120. Wu, J.; He, H.; Peng, J.; Li, Y.; Li, Z. Continuous reinforcement learning of energy management with deep Q network for a power split hybrid electric bus. Appl. Energy 2018, 222, 799–811. [Google Scholar] [CrossRef]
  121. Wang, H.; He, H.; Bai, Y.; Yue, H. Parameterized deep Q-network based energy management with balanced energy economy and battery life for hybrid electric vehicles. Appl. Energy 2022, 320, 119270. [Google Scholar] [CrossRef]
  122. Qi, C.; Zhu, Y.; Song, C.; Yan, G.; Xiao, F.; Zhang, X.; Wang, D.; Cao, J.; Song, S. Hierarchical reinforcement learning based energy management strategy for hybrid electric vehicle. Energy 2022, 238, 121703. [Google Scholar] [CrossRef]
  123. Du, G.; Zou, Y.; Zhang, X.; Guo, L.; Guo, N. Energy management for a hybrid electric vehicle based on prioritized deep reinforcement learning framework. Energy 2022, 241, 122523. [Google Scholar] [CrossRef]
  124. Shi, D.; Xu, H.; Wang, S.; Hu, J.; Chen, L.; Yin, C. Deep reinforcement learning based adaptive energy management for plug-in hybrid electric vehicle with double deep Q-network. Energy 2024, 305, 132402. [Google Scholar] [CrossRef]
  125. Feng, Z.; Wang, C.; An, J.; Zhang, X.; Liu, X.; Ji, X.; Kang, L.; Quan, W. Emergency fire escape path planning model based on improved DDPG algorithm. J. Build. Eng. 2024, 95, 110090. [Google Scholar] [CrossRef]
  126. Zhang, D.; Li, J.; Guo, N.; Liu, Y.; Shen, S.; Wei, F.; Chen, Z.; Zheng, J. Adaptive deep reinforcement learning energy management for hybrid electric vehicles considering driving condition recognition. Energy 2024, 313, 134086. [Google Scholar] [CrossRef]
  127. Ruan, J.; Wu, C.; Liang, Z.; Liu, K.; Li, B.; Li, W.; Li, T. The application of machine learning-based energy management strategy in a multi-mode plug-in hybrid electric vehicle, part II: Deep deterministic policy gradient algorithm design for electric mode. Energy 2023, 269, 126792. [Google Scholar] [CrossRef]
  128. He, H.; Huang, R.; Meng, X.; Zhao, X.; Wang, Y.; Li, M. A novel hierarchical predictive energy management strategy for plug-in hybrid electric bus combined with deep deterministic policy gradient. J. Energy Storage 2022, 52, 104787. [Google Scholar] [CrossRef]
  129. Ma, Z.; Huo, Q.; Zhang, T.; Hao, J.; Wang, W. Deep deterministic policy gradient based energy management strategy for hybrid electric tracked vehicle with online updating mechanism. IEEE Access 2021, 9, 7280–7292. [Google Scholar] [CrossRef]
  130. Zhou, J.; Xue, S.; Xue, Y.; Liao, Y.; Liu, J.; Zhao, W. A novel energy management strategy of hybrid electric vehicle via an improved TD3 deep reinforcement learning. Energy 2021, 224, 120118. [Google Scholar] [CrossRef]
  131. Yazar, O.; Coskun, S.; Li, L.; Zhang, F.; Huang, C. Actor-critic TD3-based deep reinforcement learning for energy management strategy of HEV. In Proceedings of the 2023 5th International Congress on Human-Computer Interaction, Optimization and Robotic Applications (HORA), Istanbul, Turkey, 8–10 June 2023; IEEE: Piscataway, NJ, USA, 2023; pp. 1–6. [Google Scholar]
  132. Huang, R.; He, H.; Zhao, X.; Wang, Y.; Li, M. Battery health-aware and naturalistic data-driven energy management for hybrid electric bus based on TD3 deep reinforcement learning algorithm. Appl. Energy 2022, 321, 119353. [Google Scholar] [CrossRef]
  133. Huang, R.; He, H.; Su, Q. Smart energy management for hybrid electric bus via improved soft actor-critic algorithm in a heuristic learning framework. Energy 2024, 309, 133091. [Google Scholar] [CrossRef]
  134. Huang, R.; He, H. Naturalistic data-driven and emission reduction-conscious energy management for hybrid electric vehicle based on improved soft actor-critic algorithm. J. Power Sources 2023, 559, 232648. [Google Scholar] [CrossRef]
  135. Li, T.; Cui, W.; Cui, N. Soft actor-critic algorithm-based energy management strategy for plug-in hybrid electric vehicle. World Electr. Veh. J. 2022, 13, 193. [Google Scholar] [CrossRef]
  136. Li, C.; Xu, X.; Zhu, H.; Gan, J.; Chen, Z.; Tang, X. Research on car-following control and energy management strategy of hybrid electric vehicles in connected scene. Energy 2024, 293, 130586. [Google Scholar] [CrossRef]
Figure 1. Structure of series hybrid electric powertrain system.
Figure 2. Structure of parallel hybrid electric powertrain system.
Figure 3. Series–parallel hybrid electric powertrain system.
Figure 4. Power-split hybrid electric powertrain system.
Figure 5. Energy management strategy for HEPS.
Figure 6. Schematic diagram of fuzzy logic rule-based EMS.
Figure 7. Schematic diagram of genetic algorithm.
Figure 8. SVM-based fault identification and diagnosis framework.
Figure 9. Flowchart of random forest.
Figure 10. BP neural network structure diagram.
Figure 11. Principle structure of LSTM.
Figure 12. Principle structure of convolutional neural network.
Figure 13. Agent and environment interaction in reinforcement learning.
Figure 14. Application of Q-Learning in hybrid electric vehicles.
Figure 15. Energy management strategy based on DQN algorithm.
Figure 16. DDPG algorithm schematic (N*: N sampled transitions).
Figure 17. EMS structure based on DDPG.
Figure 18. TD3 structure diagram (Notation: Q* and S* represent target networks).
Figure 19. Energy management strategy based on SAC algorithm.
Table 1. Comparison of artificial intelligence algorithms.

| Artificial Intelligence Algorithms | Algorithm Examples | Advantages | Disadvantages | References |
|---|---|---|---|---|
| Unsupervised Learning | K-means, PCA | No need for labeled data; identifies hidden patterns | Hard to interpret; unclear evaluation metrics | [75,76,77,78,79] |
| Supervised Learning | SVM, LSSVM, decision tree, random forest | High accuracy; interpretable | Requires large labeled datasets | [81,82,83,84,85,86,87,88] |
| Neural Networks | BPNN, fuzzy neural network | Handles complex nonlinear relationships | Prone to overfitting on limited training data | [92,93,94,95,96,97,98] |
| Deep Learning | LSTM, CNN, RNN | Utilizes deep architectures to model complex nonlinear system relationships | Requires a large amount of data | [101,102,103,104,105,106,107,108] |
| Reinforcement Learning | Q-Learning, Dyna, Double Q-Learning, Double Delayed Q-Learning | Adapts to dynamic environments; optimizes long-term rewards | Lengthy training time | [110,111,112,113,114,115,116,117] |
| Deep Reinforcement Learning | Deep Q-Network (DQN), Deep Q-Learning (DQL), Double Deep Q-Learning (DDQL), Double Deep Q-Network (DDQN), DDPG, TD3, SAC | Suitable for high-dimensional state spaces | Lengthy training time; hyperparameter sensitivity affects reproducibility | [118,119,120,121,122,123,124,125,126,127,128,129,130,131,132,133,134,135,136] |
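To make the reinforcement learning entries in Table 1 concrete, the following is a minimal sketch of the tabular Q-Learning update rule, Q(s, a) ← Q(s, a) + α[r + γ·maxₐ′ Q(s′, a′) − Q(s, a)]. The discretization into SOC bins and power-split actions, along with all names and parameter values, is purely illustrative and not taken from any of the reviewed studies.

```python
import numpy as np

# Hypothetical discretization: 5 battery-SOC bins (states) and
# 3 power-split choices (actions); names and sizes are illustrative only.
N_STATES, N_ACTIONS = 5, 3
ALPHA, GAMMA = 0.1, 0.9  # learning rate and discount factor

Q = np.zeros((N_STATES, N_ACTIONS))

def q_update(Q, s, a, r, s_next):
    """One tabular Q-Learning step:
    Q(s,a) <- Q(s,a) + alpha * [r + gamma * max_a' Q(s',a') - Q(s,a)]."""
    td_target = r + GAMMA * np.max(Q[s_next])
    Q[s, a] += ALPHA * (td_target - Q[s, a])
    return Q

# Single illustrative transition: from SOC bin 2, taking action 1 yields
# reward -0.5 (e.g. a negative fuel-consumption cost) and lands in bin 3.
Q = q_update(Q, s=2, a=1, r=-0.5, s_next=3)
```

Repeating such updates over many driving-cycle transitions is what gives the tabular methods in the table their long training times, and it is the need to replace the table with a function approximator in high-dimensional state spaces that motivates the deep reinforcement learning row.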

Zhong, D.; Liu, B.; Liu, L.; Fan, W.; Tang, J. Artificial Intelligence Algorithms for Hybrid Electric Powertrain System Control: A Review. Energies 2025, 18, 2018. https://doi.org/10.3390/en18082018
