Review

Transformer-Based Deep Learning Models for State of Charge and State of Health Estimation of Li-Ion Batteries: A Survey Study

Department of Mechanical Engineering, McMaster University, Hamilton, ON L8S 4L8, Canada
*
Author to whom correspondence should be addressed.
Energies 2024, 17(14), 3502; https://doi.org/10.3390/en17143502
Submission received: 26 June 2024 / Revised: 10 July 2024 / Accepted: 15 July 2024 / Published: 17 July 2024
(This article belongs to the Section E: Electric Vehicles)

Abstract

The global transportation system’s need for electrification is driving research efforts to overcome the drawbacks of battery electric vehicles (BEVs). The accurate and reliable estimation of the states of charge (SOC) and health (SOH) of Li-Ion batteries (LIBs) is crucial for the widespread adoption of BEVs. Transformers, cutting-edge deep learning (DL) models, are demonstrating promising capabilities in addressing various sequence-processing problems. This manuscript presents a thorough survey study of previous research papers that introduced modifications in the development of Transformer-based architectures for the SOC and SOH estimation of LIBs. This study also highlights approximately 15 different real-world datasets that have been utilized for training and testing these models. A comparison is made between the architectures, addressing each state using the root mean square error (RMSE) and mean absolute error (MAE) metrics.

1. Introduction

Automotive manufacturers are undergoing an extensive electrification process, transitioning from fuel-powered vehicles to electric ones [1]. The goal is to reduce greenhouse gas emissions and keep the increase in global average temperature below 2 °C relative to pre-industrial levels [2,3]. Battery electric vehicles (BEVs) are expected to surpass internal combustion engine (ICE) vehicles by 2035 in many countries, especially for light-duty vehicles, in order to achieve net-zero emissions (the amount of greenhouse gases added equals the amount removed) during the second half of the century [4,5].
Nevertheless, the main weakness of BEVs that use Li-Ion batteries (LIBs) is their short driving range, which leads to range anxiety for BEV users. Alleviating this issue requires higher-energy-density battery packs, more efficient electric motors, and more reliable state estimation methods, since the crucial states of the battery, the state of charge (SOC) and state of health (SOH), cannot be measured directly. The SOC indicates the remaining battery charge ($Q_t$) as a percentage of the full capacity ($C$) at a specific age. Using the Coulomb counting (CC) concept [6,7], the SOC can be expressed mathematically as in Equations (1) and (2), given an initial quantity of charge ($Q_i$). However, this is not a practical method for a battery management system (BMS), as it lacks aging considerations and may accumulate significant error [8].
$Q_t = Q_i + \int_{t_0}^{t} I(t)\, dt$    (1)
$SOC_t = \dfrac{Q_t}{C} \times 100$    (2)
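As a concrete illustration, Equations (1) and (2) amount to a running sum of the measured current (a minimal sketch with variable names of our choosing, not code from any reviewed study; a real BMS would also need sensor-bias correction and an aging-aware capacity):

```python
import numpy as np

def coulomb_count_soc(current_a, dt_s, q_init_c, capacity_c):
    """Integrate current (A, charge-positive) over time to track SOC (%)."""
    # Cumulative charge in coulombs: Q_t = Q_i + integral of I dt
    q_t = q_init_c + np.cumsum(current_a) * dt_s
    return 100.0 * q_t / capacity_c

# 3 Ah cell (10,800 C) starting at 50% SOC, discharged at 1 A for 1 h
soc = coulomb_count_soc(np.full(3600, -1.0), dt_s=1.0,
                        q_init_c=5400.0, capacity_c=10800.0)
```

Because the integral never sees the true state, any current-sensor offset accumulates linearly over time, which is the cumulative-error problem noted above.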
The SOH is an index that quantifies the age of the battery. In most studies, the SOH refers to the battery's capacity fade ($SOH_C$) and/or the increase in internal Ohmic resistance ($SOH_R$) [9]. Equations (3) and (4) express both definitions, where $C_t$ and $R_t$ are the battery's current full capacity and resistance, $C_{fresh}$ and $R_{fresh}$ are the nominal capacity and resistance of the fresh cell, and $R_{EOL}$ is the resistance at the end of life (EOL).
$SOH_C = \dfrac{C_t}{C_{fresh}} \times 100$    (3)
$SOH_R = \dfrac{R_{EOL} - R_t}{R_{EOL} - R_{fresh}} \times 100$    (4)
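In code, Equations (3) and (4) are one-liners (a hypothetical sketch; function names and example numbers are ours, chosen for illustration):

```python
def soh_capacity(c_t, c_fresh):
    """Eq. (3): SOH from capacity fade, as a percentage of the fresh capacity."""
    return 100.0 * c_t / c_fresh

def soh_resistance(r_t, r_fresh, r_eol):
    """Eq. (4): SOH from resistance growth; 100% when fresh, 0% at end of life."""
    return 100.0 * (r_eol - r_t) / (r_eol - r_fresh)

# Example: a 2.0 Ah fresh cell now delivering 1.7 Ah, and a resistance that
# has grown from 50 mOhm toward a 100 mOhm EOL limit (illustrative values)
soh_c = soh_capacity(1.7, 2.0)             # ~85%
soh_r = soh_resistance(70.0, 50.0, 100.0)  # 60%
```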
The estimation of both states has been the focus of numerous research studies. It is typically performed using either a mathematical model based on the physical and chemical nature of the battery, or an artificial intelligence (AI) model trained on a large amount of experimental data so that it generalizes to new, unseen scenarios. Model-based approaches use either the equivalent circuit model (ECM) or the electrochemical model (EcM) together with a state observer that estimates the unmeasured states (such as SOC and SOH) from measured quantities such as the terminal voltage [10,11,12,13,14,15,16]. However, the EcM has many fitting parameters [17], which leads to a daunting model identification process, and the ECM, being a reduced lumped model, requires identification over the entire operating range of SOC, SOH, and temperature [18,19]. ECM complexity varies from a simple RINT model that involves only an internal resistance to a generalized nonlinear (GNL) model that possesses several RC branches and a self-discharge resistance [20].
For data-driven estimation methods, deep learning (DL) sequence models have repeatedly demonstrated high accuracy in modeling time-series problems [21,22]. Different DL architectures have been tested on the battery state estimation problem [23,24], ranging from the basic multilayer perceptron (MLP) to recurrent neural networks (RNNs), bidirectional RNNs, and long short-term memory (LSTM) networks [25,26,27,28,29].
However, LSTMs and RNNs still struggle with long-sequence problems because of their sequential nature and memory constraints, which dramatically increase complexity [30]. This paper summarizes previous studies that built estimators on the Transformer model [30], a state-of-the-art DL architecture that relies on attention mechanisms to process long sequences. A thorough survey is conducted using the Google Scholar and Engineering Village search engines [31,32] to include all research papers that proposed Transformer-based estimators for the SOC and SOH of LIBs. A comparative analysis is presented for all the models that addressed each state, using the root mean square error (RMSE) and mean absolute error (MAE) evaluation metrics. This paper is organized as follows: Section 2 gives a brief interpretation of the Transformer model; Section 3 presents an overview of the datasets' structure and preparation; Section 4 summarizes the studies on SOC estimation; Section 5 reviews the studies on SOH estimation; and Section 6 concludes this manuscript.

2. An Overview of the Transformer Model

This section discusses the development of the Transformer model [30] and its improved modeling capabilities for time-series problems.
Figure 1 visualizes the overall Transformer model; the architecture is composed of two towers: an encoder and a decoder. To prepare the data for the encoder, an embedding step maps each feature vector in the sequence to a larger dimension $d_{embed}$, producing tokens that enhance the contextual understanding of the data. Because all tokens in a sequence are embedded simultaneously, the model can process large amounts of data in parallel, which significantly improves time efficiency compared to recurrent models that process data sequentially. For non-numerical problems such as natural language processing (NLP), a word embedding converts each word to a corresponding vector based on a language-specific dictionary. The embedded tokens (vectors) are then modified by the positional encoding layer. This replaces recurrence: position vectors, obtained from sine and cosine functions of different frequencies, are added to the tokens. The resulting tokens encode the relative or absolute positions in the sequence, improving the overall sequence representation. The output of data embedding and positional encoding has the shape $(batch\_size \times d_{embed} \times sequence\_length)$, where the sequence length defines how far the model can look into the past to recognize the context of the input data.
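The sinusoidal positional encoding described above can be sketched in a few lines of NumPy (an illustrative implementation following the formulation in [30]; function and variable names are ours):

```python
import numpy as np

def positional_encoding(seq_len, d_embed):
    """Sinusoidal position vectors: even dimensions use sine, odd dimensions
    use cosine, with geometrically spaced frequencies across the embedding."""
    pos = np.arange(seq_len)[:, None]             # (seq_len, 1) token positions
    i = np.arange(0, d_embed, 2)[None, :]         # even dimension indices
    angle = pos / np.power(10000.0, i / d_embed)  # (seq_len, d_embed // 2)
    pe = np.zeros((seq_len, d_embed))
    pe[:, 0::2] = np.sin(angle)
    pe[:, 1::2] = np.cos(angle)
    return pe

pe = positional_encoding(seq_len=50, d_embed=64)  # added to the embedded tokens
```

Because each position gets a distinct pattern of phases, the attention layers can recover relative order without any recurrence.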
After the positionally encoded tokens enter the encoder, the attention mechanism plays its role. The Transformer uses multiple attention layers in parallel, known as multi-head attention. Each attention head relies on the scaled dot-product attention algorithm: the model forms query (Q), key (K), and value (V) vectors for each embedded token by multiplying it by three learnable matrices, $W_Q$, $W_K$, and $W_V$. The query of each token is multiplied by the keys of all tokens in the sequence to measure the strength of the contextual relationship between that token and every other one. This product is scaled and passed through a softmax function, whose output weighs the value of each token; the weighted sum of all values gives the attention vector of the token under investigation. The attention can be expressed as in Equation (5) [30]:
$Attention(Q, K, V) = softmax\!\left(\dfrac{QK^T}{\sqrt{d_k}}\right)V$    (5)
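Equation (5) translates directly into NumPy (a toy single-head sketch with random data; the shapes and names are illustrative only):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Eq. (5): softmax(Q K^T / sqrt(d_k)) V for a single attention head."""
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # (n_q, n_k): query-to-key similarity
    # Numerically stable row-wise softmax over the key axis
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V               # weighted sum of value vectors

rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((6, 8)) for _ in range(3))
out = scaled_dot_product_attention(Q, K, V)  # (6, 8): one vector per query token
```

The $\sqrt{d_k}$ scaling keeps the dot products from saturating the softmax as the key dimension grows, which is why it appears in the denominator of Equation (5).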
This process is repeated for all the attention heads (eight heads in the original model), which can process the input in parallel since they do not depend on each other; the multiple heads help in learning different relationships in the data. The final attention representation is the concatenation of the outputs of all heads.
A feedforward neural network (FNN) then generates the encoder output, whose keys and values are used by the decoder. The encoder's goal is to convert input tokens into attention maps that closely capture the sequence context. The decoder's architecture is similar to the encoder's, but it has two stages of multi-head attention: the second stage uses the keys and values generated by the encoder, together with a query produced by the first stage's attention over the previous outputs. Both the encoder and the decoder stack this process N times.

3. Datasets

This section provides a brief overview of the structure of the datasets used for training and testing Transformer-based models for SOC and SOH estimation. The data used to train ML/DL models for estimating the various states of LIBs are obtained from lab experiments on real batteries or from operational EVs over long periods. These datasets consist of parameters that can be measured directly from the battery, usually the terminal voltage ($V_T$), the current intensity ($I$) passing through it, and the average temperature of its lateral surfaces ($T$). The majority of studies in this survey use open-source datasets from laboratory experiments. In these experiments, a temperature chamber soaks the battery at a specific temperature, and a cycler connected to the battery supplies a specific current profile corresponding to a standard charging, discharging, or drive cycle scenario. The cycler measures the voltage response of the battery, and multiple thermocouples track the temperature at various lateral points. The same procedure is repeated for multiple combinations of temperatures and current profiles similar to real operating conditions. The $V_T$, $I$, and $T$ measurements of a dataset are used as the features, or variates, from which the SOC or SOH is inferred.
To obtain the target values (labels) for the SOC, the cycler performs Coulomb counting, relying on accurate current sensors and a known initial state. The cell manufacturer specifies maximum and minimum values for $V_T$ that correspond to 100% and 0% SOC; the cycler uses these values to determine the full and empty states. Figure 2 shows an example of the structure and flow of data for SOC estimation. The $V_T$, $I$, and $T$ plots are obtained for the Urban Dynamometer Driving Schedule (UDDS) drive cycle at 25 °C from the LG 18650HG2 dataset [33]. The Transformer model processes the previous sequence length of data to understand its context and infer the SOC.
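This flow from raw measurements to model input can be sketched as a sliding-window slicing step (a minimal hypothetical version with synthetic data; real pipelines also normalize the features before training):

```python
import numpy as np

def make_windows(features, targets, seq_len):
    """Slice a (time, n_features) record into overlapping windows so the model
    sees the past seq_len samples of V_T, I, T when inferring each SOC label."""
    X = np.stack([features[i:i + seq_len]
                  for i in range(len(features) - seq_len + 1)])
    y = targets[seq_len - 1:]  # label aligned with each window's last sample
    return X, y                # X: (n_windows, seq_len, n_features)

# Synthetic V_T, I, T record and a fabricated SOC label, for illustration only
t = np.arange(1000)
vit = np.column_stack([np.sin(t / 50), np.cos(t / 70),
                       np.full(1000, 25.0)])
soc = np.linspace(100.0, 20.0, 1000)
X, y = make_windows(vit, soc, seq_len=100)
```

Each row of `X` is then embedded and positionally encoded as described in Section 2.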
Regarding the SOH, aging experiments are required to obtain the datasets. In these experiments, a fresh cell undergoes numerous charging and discharging cycles until it reaches a preset EOL criterion for capacity or internal resistance. An example is the dataset produced by the NASA Ames Prognostics Center of Excellence (PCoE) [34], obtained by cycling multiple Li-Ion 18650-sized batteries through successive constant-current charging and discharging cycles until they reach EOL at 70% of the rated fresh capacity. Figure 3 displays the voltage and temperature response of cell B0005 for each discharge cycle across its whole life (168 cycles). Each cycle corresponds to a single capacity value (i.e., SOH value). Aging clearly changes the behavior of the battery under the same current profile: the voltage drops faster, and the temperature reaches higher values because of the increased internal resistance. In most of the reviewed studies, a feature extraction method is applied to the cycle data to obtain a feature-to-SOH representation that is then processed by the Transformer-based model.
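A minimal example of such per-cycle feature extraction (the feature choices and names are ours, for illustration; the reviewed studies use their own, often learned, features):

```python
import numpy as np

def cycle_features(voltage, temperature, time_s):
    """Toy per-cycle features of the kind fed to Transformer SOH estimators."""
    return np.array([
        time_s[-1] - time_s[0],  # discharge duration shrinks as the cell ages
        voltage.mean(),
        voltage.min(),
        temperature.max(),       # hotter cycles hint at resistance growth
    ])

# Stack one feature vector per cycle -> (n_cycles, n_features) for the model
cycles = [(np.linspace(4.2, 2.7, 500),   # synthetic voltage trace (V)
           np.linspace(25.0, 33.0, 500), # synthetic temperature trace (deg C)
           np.arange(500.0))             # time stamps (s)
          for _ in range(3)]
F = np.stack([cycle_features(v, T, t) for v, T, t in cycles])
```

The resulting feature-per-cycle sequence plays the same role for SOH that the raw $V_T$, $I$, $T$ window plays for SOC.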
In this section, samples of the reviewed datasets are presented and discussed for SOC and SOH estimation. This study includes 15 open-source datasets that have similar structures to the discussed ones.

4. State of Charge Estimation

This section presents a survey of previous studies aimed at improving SOC estimation for LIBs through Transformer-based models. A summary of all the reviewed studies is presented as a comparison in Table 1.
In a study by Hannan et al. [35], a deep learning model based on the Transformer architecture was presented. The model was trained using a self-supervised learning (SSL) approach on the LG 18650HG2 dataset [33]. SSL allowed the model to be trained on only 20% of the training data for 5 epochs while still achieving a testing RMSE as low as 1.9% SOC. SSL also gave the model another impressive capability: the weights learned in the training phase could be transferred to make predictions for a different cell chemistry. The proposed model could estimate the SOC with an RMSE of 0.9% and an MAE of 0.44% at constant ambient temperature, whereas the errors reached an RMSE of 1.19% and an MAE of 0.7% at varying temperatures.
In their study [36], Shen et al. propose an estimator based on a Transformer model followed by an immersion and invariance (I&I) adaptive observer. In the Transformer stage, two parallel encoder towers handle the current–temperature and voltage–temperature input sequences separately; the study shows that generating two separate context vectors for current and temperature is more effective than mixing them together. The I&I observer is then applied to the predicted SOC to reduce severe fluctuations. The study uses a dataset for a LiFePO4 chemistry battery from [37], and the experiments are conducted using the US06 drive cycle at different temperatures. The results show an estimation accuracy of less than 1% SOC for both the MAX error and the RMSE. Furthermore, the authors report good performance at temperatures unseen during training and better accuracy than LSTM-based models.
Duan et al. proposed a convolutional Transformer network (CTN) [38] to leverage the capabilities of convolutional neural networks (CNNs). The study used voltage, current, and temperature data from the Panasonic 18650PF dataset [39] as input data. The CNN was used first to extract the spatial features of the data, followed by the Transformer to extract the temporal features. This stage is referred to as the CTN stage. Additionally, a sigma-point Kalman filter (SPKF) was utilized to attenuate the severe fluctuations in the predicted SOC. The results of the experiments conducted on four drive cycles at 0, 10, and 25 °C showed an MAE, MAX, and RMSE of 0.81%, 5.95%, and 0.93%, respectively, for the SOC.
The CNN is also utilized in [40] together with the Informer model [41], a novel version of the Transformer model. The CNN-Informer architecture is shown to improve the extraction of spatiotemporal features. Moreover, a Laplace-based loss function is used to account for the probabilistic distribution of the error. The study tests the proposed method on the Panasonic [39] and INR [42] datasets at temperatures of 0, 10, and 25 °C. The results show that the model achieves an MAE of 0.77%, an MAXE of 3.79%, and an RMSE of 0.86%.
An architecture that is based on Bidirectional Encoder Representations from Transformers for Batteries (BERTtery) is proposed in [43]. This study presents tests on multiple SOH levels from 100% to 80% of the rated capacity and temperatures from −5 °C to 35 °C. For the cell-level experiments, 5 large-scale NMC cells were used, and in addition to this, a battery pack that had been in service for 8 months in an EV was used for testing the model as well. An RMSE of 0.5% and an MAE of 2% are achieved.
In their paper [44], Ahn et al. present a model that combines a Transformer model with the single-particle model with electrolyte dynamics (SPMe). This physics-informed DL model was developed to perform well even when there are limited data, which is often the case in practical applications. Their research evaluates the model under two scenarios, partially available cycles and varying temperatures, using experimental battery discharge cycles from the CALCE dataset [45]. The study shows that the hybrid model achieves a less than 2% error in most scenarios, thus outperforming traditional machine learning models. The accuracy of the hybrid model heavily relies on the accuracy of the SPMe.
Bao et al. conducted a recent study [46] where they proposed a Temporal Transformer-based sequence network (TTSNet) to encode time series features. The network utilizes a module that fuses features with the aid of attention to integrate information from voltage, current, and temperature data. Additionally, the method was enhanced with a moving time window strategy and a Kalman filter (KF) for post-processing. The results of the study demonstrate the high accuracy and robustness of the network. The average MAE and RMSE values achieved were 0.50% and 0.69%, respectively, on open battery datasets and vehicle operation datasets.
A comparative study is performed in [47], whereby the SOC estimation accuracy of a Transformer model is compared to that of a Long–Short-Term Memory (LSTM) network, Bi-directional LSTM (Bi-LSTM) models, and Support Vector Regression (SVR). The evaluation is performed using real-world battery data from NASA [34], BMW i3 [48], Stanford University [49], and Musoshi electric vehicles. The Transformer model outperforms the others, achieving a 0.99% RMSE value.
Table 1. Summary of SOC estimation studies with the minimum RMSE and MAE achieved based on the source manuscripts.

| Methodology | Dataset | RMSE | MAE |
|---|---|---|---|
| Transformer with SSL [35] | LG 18650HG2 [33] | 1.9% (20% of training data); 0.9% (constant temperature); 1.19% (varying temperature) | 0.44% (constant temperature); 1.7% (varying temperature) |
| 2-encoder Transformer + I&I observer [36] | LiFePO4 chemistry battery [37] | <1% | —— |
| Convolutional Transformer network (CTN) + SPKF [38] | Panasonic 18650PF [39] | 0.93% | 0.81% |
| CNN–Informer [40] | Panasonic 18650PF [39] and INR [42] | 0.86% | 0.77% |
| BERTtery [43] | 5 large-scale NMC cells and a battery pack serviced for 8 months in an EV | 0.5% | 2% |
| Transformer with SPMe [44] | CALCE [45] | <2% | —— |
| TTSNet + Kalman filter (post-processing) [46] | —— | 0.69% | 0.5% |
| Comparative study [47]: Transformer, LSTM, Bi-LSTM, and SVR | NASA [34], BMW i3 [48], Stanford University [49], and Musoshi electric vehicles | 0.99% (Transformer) | —— |

——: not mentioned in the cited manuscript.

5. State of Health Estimation

This section presents a review study of previous work on estimating the SOH and the remaining useful life (RUL) of Li-Ion batteries using Transformer-based models. A summary of all the reviewed studies is presented as a comparison in Table 2.
In their paper [50], Chen et al. proposed an approach to preprocess noisy capacity data. They introduced a Denoising Auto-Encoder (DAE) followed by a Transformer network that captures temporal information and learns features. This integrated framework, called DeTransformer, combines the denoising and prediction tasks and demonstrates superior performance compared to existing methods. On the NASA dataset [34], the model achieved an MAE of 0.07 and an RMSE of 0.08, while on the CALCE dataset [45], it achieved an MAE of 0.06 and an RMSE of 0.07.
The study presented in [51] introduces a new method for estimating SOH through a combination of data preprocessing strategies and a CNN–Transformer architecture. The study proposes three preprocessing methods: Pearson correlation coefficient (PCC), principal component analysis (PCA), and feature scaling. These methods help to select highly related features, reduce computational complexity, and normalize features for faster training. The CNN–Transformer model proposed in the study combines the strengths of both CNNs and Transformers, resulting in accurate SOH estimates with high stability. The model achieves estimation errors within 1%. The study validates the model’s performance using the NASA dataset [34]. The MAE and RMSE are within 0.55%.
In [52], the authors developed a CNN–Transformer model with a multiview information perception framework (MVIP-Trans) to perceive global and local information from voltage and current signals. The framework includes a Local Information Perceptron (LIP) that relies on parallel multiscale attention to enhance feature extraction and noise suppression. A Transformer–attention architecture encodes global dependencies between local features. The model’s effectiveness is demonstrated through experiments with real-world datasets [34,53], achieving high accuracy in SOH estimation. It outperforms other methods, with an RMSE of 0.5% on the NASA dataset and 0.3% on the Oxford dataset.
In [54], Zhao et al. utilized the PCC to establish a correlation between 28 features obtained from the NASA dataset and the SOH. They then employed an encoder-only Transformer to predict the SOH. The study reported an RMSE and an MAE of 0.029 and 0.0258, respectively.
Chen et al. introduce a neural network called Vision Transformer (ViT)-based DL Network in their paper [55]. The ViT utilizes optimal sampling points during the training process, thereby reducing the need for manual effort, and outperforming traditional methods. The experiments, conducted on two datasets [34,45], demonstrate the ViT’s enhanced accuracy and robustness. The results show that the ViT has great potential in SOH estimation, with an average MAE and RMSE of 0.36% and 0.46% for the NASA dataset, and 0.37% and 0.47% for the CALCE dataset. The study also highlights the need for neural networks with simpler structures and lower computational costs for practical applications.
In the study in [56], the authors present a Transformer-based model that determines the SOH using electrochemical impedance spectroscopy (EIS) data; the dataset is from the Cavendish Laboratory of the University of Cambridge (CLUC) [57]. The authors propose a feature extraction technique that simplifies the EIS data and reduces the RMSE and MAE to 0.64% and 0.51%, respectively, throughout the battery's life cycle. By focusing on key electrochemical features that reflect the battery's aging state, this approach can improve the accuracy of SOH prediction. The Transformer model performs better than traditional recurrent neural network models by effectively capturing long-term inter-relations in the data.
Chen et al. [58] propose a combined Transformer–gated recurrent unit (GRU) model based on modal decomposition to enhance the process of SOH estimation. They introduce variational mode decomposition (VMD) and particle swarm optimization (PSO) to optimize the neural network hyperparameters. The Transformer model is adapted to handle time-series data, capturing global features and internal correlations, while the GRU predicts high-frequency subsequences. The model’s effectiveness is validated using the NASA dataset [34], showing superior prediction accuracy and fitting degree, with an MAE and an RMSE of less than 0.62% and 1.19%, respectively, and a determination coefficient over 87.08%.
A combination of Transformers and LSTMs is presented in [59]. Initially, a 1D-CNN is utilized for feature extraction; then, the features are passed to a Transformer–LSTM (T-LSTM) architecture for sequence processing to infer the SOH. The proposed estimator is validated on two aging datasets; the first is generated from an experimental aging process performed on seven Prospower ICR18650P batteries, and the second is the CLUC dataset [57]. The model achieves an RMSE and an MAE of 0.66% and 0.53%. Noisy environments are used to test the model’s robustness, and a comparison is presented with the GRU, LSTM, and Transformer models.
Gomez et al. introduce an improved temporal fusion Transformer (ITFT) in [60] for the estimation of the remaining capacity and RUL. This model replaces the long–short-term memory with a Bi-LSTM encoder–decoder layer. The study also presents a new Bayesian optimization method for hyperparameter identification; this method utilizes a tree-structure Parzen estimator (TPE). The MIT aging dataset [61] is used for model validation. The minimum RMSE for SOH estimation is 0.13% from four batteries, while the online RUL prediction gave a minimum RMSE of 0.67% from nine batteries.
In [62], the multiple correlation coefficient (MCC) is utilized to correlate discharge cycles before deploying a Transformer model; the model is based on the attention of the Informer architecture [41]. Model validation shows an RMSE and an MAE for the MIT dataset [61] of 0.2% for both, while for the CALCE dataset [45], the errors are 1.8% and 1.0%, respectively.
Fauzi et al., in their study [63], added seasonal and growth embedding to the Exponential Smoothing Transformer (ETSformer) [64] to introduce the SGEformer architecture for SOH estimation. The proposed estimator was compared to the ETSformer, Informer [40], Reformer [65], Transformer, and LSTM. The NASA and CALCE datasets were used for validation. An RMSE of 0.96% and an MAE of 0.01% were stated.
The estimation of the SOH based on real EV data is addressed in [66]. A basic encoder-only architecture is utilized, and the SOH inference is based on input features such as current, voltage, SOC, and vehicle speed. These data come from direct measurements of three real operating vehicles collected over three years. A mixture of fixed routes, including highway and city driving schedules in Japan, is assigned to the vehicles. A minimum RMSE of 1.31% is obtained. The study also highlights that the Transformer relies on the stationary periods of the vehicle to create the attention maps used for inference.
Zhang et al. [67] applied PCA to the NASA dataset and then used a stack of successive DAEs as a preprocessing enhancement before deploying a Transformer for RUL prediction. The RMSE and MAE were 0.2% and 0.17%, respectively. Comparing these results with those obtained in [50] highlights how the PCA, together with a stack of DAEs rather than a single one, improved the RUL prediction on the same dataset.
A remarkable integration between the KF and the Transformer is proposed in [68]. After applying a DAE, an encoder-only architecture is built to include the KF between the attention block and the FNN; the aim is to create a robust SOH estimation framework by extracting nonlinear variations. The RMSE of the estimation is 2.52% on the CALCE dataset and 3.45% on the NASA one.
Table 2. Summary of the SOH estimation studies with the minimum RMSE and MAE achieved based on the source manuscripts.

| Methodology | Dataset | RMSE | MAE |
|---|---|---|---|
| DAE -> Transformer (DeTransformer) [50] | NASA [34] and CALCE [45] | 8%; 7% | 7%; 6% |
| [PCA, PCC, and feature scaling] + CNN–Transformer [51] | NASA [34] | ≈0.55% | ≈0.55% |
| CNN-MVIP-Trans [52] | NASA [34] and Oxford [53] | 0.5%; 0.3% | —— |
| PCC + encoder-only Transformer [54] | NASA [34] | 2.9% | 2.6% |
| Vision Transformer (ViT) [55] | NASA [34] and CALCE [45] | 0.46%; 0.47% | 0.36%; 0.37% |
| Transformer with EIS analysis [56] | CLUC [57] | 0.64% | 0.51% |
| Transformer–GRU [58] | NASA [34] | 1.19% | 0.62% |
| 1D-CNN + T-LSTM [59] | Aging experiment on 7 Prospower ICR18650P batteries, and CLUC [57] | 0.66% | 0.53% |
| ITFT (Bi-LSTM) [60] | MIT [61] | SOH: 0.13%; RUL: 0.67% | —— |
| MCC + Informer [62] | MIT [61] and CALCE [45] | 0.2%; 1.8% | 0.2%; 1.0% |
| SGEformer [63] | NASA [34] and CALCE [45] | 0.96% | 0.01% |
| Encoder-only Transformer [66] | Dataset generated specifically for this work from 3 real EVs over 3 years | 1.31% | —— |
| PCA -> stacked DAE -> Transformer for RUL [67] | NASA [34] | 0.2% | 0.17% |
| DAE -> KF–Transformer [68] | NASA [34] and CALCE [45] | 3.45%; 2.52% | —— |

——: not mentioned in the cited manuscript.

6. Conclusions

A comprehensive survey is conducted in this paper to highlight previous research manuscripts that used, developed, or compared various Transformer-based DL architectures to estimate the SOC and SOH of LIBs in BEVs. The survey summarizes 8 papers on SOC estimation and 14 papers on SOH estimation; these were the studies found up to the survey date. The study also includes 15 different datasets that are obtained from real batteries in different driving and charging scenarios; these datasets are suitable for training Transformer-based estimators and testing their robustness and generalization capabilities.
The following conclusions are drawn from this survey:
  • For SOC estimation, most studies need post-processing methods, such as filters and state observers, to attenuate the severe fluctuations in the output predictions of the Transformer-based model.
  • For the SOH, preprocessing feature extraction is always needed to derive features from the raw aging datasets before they are passed to the Transformer model; CNNs, DAEs, and correlation analysis methods may be used for this.
  • The reviewed manuscripts show that Transformer-based models can achieve impressive estimation accuracy, with an RMSE below 1% for both states.
  • Nevertheless, with more than a million learnable parameters, these models are computationally demanding.
  • Future work should address the real-time implementation of Transformer-based models on microcontrollers to demonstrate the feasibility of deploying this approach in the BMS of an EV.

Funding

This research received no external funding.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
BEV: Battery electric vehicle
LIB: Lithium-ion battery
SOC: State of charge
SOH: State of health
FNN: Feedforward neural network
RNN: Recurrent neural network
LSTM: Long short-term memory
NLP: Natural language processing
UDDS: Urban Dynamometer Driving Schedule
I&I: Immersion and invariance
CTN: Convolutional Transformer network
CNN: Convolutional neural network
BERT: Bidirectional encoder representations from Transformers
SPMe: Single-particle model with electrolyte dynamics
TTSNet: Temporal Transformer-based sequence network
SVR: Support vector regression
RUL: Remaining useful life
DAE: Denoising auto-encoder
PCA: Principal component analysis
PCC: Pearson correlation coefficient
MVIP: Multiview information perception framework
ViT: Vision Transformer
CLUC: Cavendish Laboratory of the University of Cambridge
EIS: Electrochemical impedance spectroscopy
GRU: Gated recurrent unit
VMD: Variational mode decomposition
PSO: Particle swarm optimization
ITFT: Improved temporal fusion Transformer
TPE: Tree-structured Parzen estimator
MCC: Multiple correlation coefficient
SGE: Seasonal and growth embedding
KF: Kalman filter
MLP: Multilayer perceptron

References

  1. Swarnkar, R.; Ramachandran, H.; Ali, S.H.M.; Jabbar, R. A Systematic Literature Review of State of Health and State of Charge Estimation Methods for Batteries Used in Electric Vehicle Applications. World Electr. Veh. J. 2023, 14, 247.
  2. Watari, T.; McLellan, B.C.; Ogata, S.; Tezuka, T. Analysis of potential for critical metal resource constraints in the international energy agency’s long-term low-carbon energy scenarios. Minerals 2018, 8, 156.
  3. Maizland, L. Global Climate Agreements: Successes and Failures. 2023. Available online: https://www.cfr.org/backgrounder/paris-global-climate-change-agreements#chapter-title-0-2 (accessed on 9 July 2024).
  4. The Government of Canada. Net-Zero Emissions by 2050; The Government of Canada: Ottawa, ON, Canada, 2024.
  5. The Government of Canada. Building a Green Economy: Government of Canada to Require 100% of Car and Passenger Truck Sales Be Zero-Emission by 2035 in Canada; The Government of Canada: Ottawa, ON, Canada, 2021.
  6. Ng, K.S.; Moo, C.S.; Chen, Y.P.; Hsieh, Y.C. Enhanced coulomb counting method for estimating state-of-charge and state-of-health of lithium-ion batteries. Appl. Energy 2009, 86, 1506–1511.
  7. Yan, J.; Xu, G.; Qian, H.; Xu, Y. Robust state of charge estimation for hybrid electric vehicles: Framework and algorithms. Energies 2010, 3, 1654–1672.
  8. He, W.; Williard, N.; Chen, C.; Pecht, M. State of charge estimation for electric vehicle batteries using unscented kalman filtering. Microelectron. Reliab. 2013, 53, 840–847.
  9. Belt, J.; Utgikar, V.; Bloom, I. Calendar and PHEV cycle life aging of high-energy, lithium-ion cells containing blended spinel and layered-oxide cathodes. J. Power Sources 2011, 196, 10213–10221.
  10. Plett, G.L. Extended Kalman filtering for battery management systems of LiPB-based HEV battery packs: Part 1. Background. J. Power Sources 2004, 134, 252–261.
  11. Hu, C.; Youn, B.D.; Chung, J. A multiscale framework with extended Kalman filter for lithium-ion battery SOC and capacity estimation. Appl. Energy 2012, 92, 694–704.
  12. Plett, G.L. Extended Kalman filtering for battery management systems of LiPB-based HEV battery packs: Part 3. State and parameter estimation. J. Power Sources 2004, 134, 277–292.
  13. Rahimifard, S.; Habibi, S.; Tjong, J. Dual estimation strategy for new and aged electric vehicles batteries. In Proceedings of the 2020 IEEE Transportation Electrification Conference & Expo (ITEC), Chicago, IL, USA, 23–26 June 2020; pp. 579–583.
  14. Rahimifard, S.; Habibi, S.; Goward, G.; Tjong, J. Adaptive smooth variable structure filter strategy for state estimation of electric vehicle batteries. Energies 2021, 14, 8560.
  15. Wang, L.; Ma, J.; Zhao, X.; Li, X.; Zhang, K. Online estimation of state-of-charge inconsistency for lithium-ion battery based on SVSF-VBL. J. Energy Storage 2023, 67, 107657.
  16. Chen, C.; Zhang, B.; Vachtsevanos, G.; Orchard, M. Machine condition prediction based on adaptive neuro–fuzzy and high-order particle filtering. IEEE Trans. Ind. Electron. 2010, 58, 4353–4364.
  17. Ahmed, R.; Sayed, M.E.; Arasaratnam, I.; Tjong, J.; Habibi, S. Reduced-Order Electrochemical Model Parameters Identification and SOC Estimation for Healthy and Aged Li-Ion Batteries Part I: Parameterization Model Development for Healthy Batteries. IEEE J. Emerg. Sel. Top. Power Electron. 2014, 2, 659–677.
  18. Zhang, M.; Liu, Y.; Li, D.; Cui, X.; Wang, L.; Li, L.; Wang, K. Electrochemical Impedance Spectroscopy: A New Chapter in the Fast and Accurate Estimation of the State of Health for Lithium-Ion Batteries. Energies 2023, 16, 1599.
  19. Tran, M.K.; Mathew, M.; Janhunen, S.; Panchal, S.; Raahemifar, K.; Fraser, R.; Fowler, M. A comprehensive equivalent circuit model for lithium-ion batteries, incorporating the effects of state of health, state of charge, and temperature on model parameters. J. Energy Storage 2021, 43, 103252.
  20. Tao, Z.; Zhao, Z.; Wang, C.; Huang, L.; Jie, H.; Li, H.; Hao, Q.; Zhou, Y.; See, K.Y. State of charge estimation of lithium Batteries: Review for equivalent circuit model methods. Measurement 2024, 236, 115148.
  21. Ahmed, D.M.; Hassan, M.M.; Mstafa, R.J. A Review on Deep Sequential Models for Forecasting Time Series Data. Appl. Comput. Intell. Soft Comput. 2022, 2022, 1–19.
  22. Li, Y.; Du, M.; He, S. Attention-Based Sequence-to-Sequence Model for Time Series Imputation. Entropy 2022, 24, 1798.
  23. Eleftheriadis, P.; Giazitzis, S.; Leva, S.; Ogliari, E. Data-Driven Methods for the State of Charge Estimation of Lithium-Ion Batteries: An Overview. Forecasting 2023, 5, 576–599.
  24. Lucaferri, V.; Quercio, M.; Laudani, A.; Riganti Fulginei, F. A Review on Battery Model-Based and Data-Driven Methods for Battery Management Systems. Energies 2023, 16, 7807.
  25. Vieira, R.N.; Kollmeyer, P.; Naguib, M.; Emadi, A. Feedforward and NARX Neural Network Battery State of Charge Estimation with Robustness to Current Sensor Error. In Proceedings of the 2023 IEEE Transportation Electrification Conference & Expo (ITEC), Chiang Mai, Thailand, 28 November–1 December 2023; pp. 1–6.
  26. Wand, M.; Koutník, J.; Schmidhuber, J. Lipreading with long short-term memory. In Proceedings of the 2016 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Shanghai, China, 20–25 March 2016; pp. 6115–6119.
  27. Vidal, C.; Malysz, P.; Naguib, M.; Emadi, A.; Kollmeyer, P.J. Estimating battery state of charge using recurrent and non-recurrent neural networks. J. Energy Storage 2022, 47, 103660.
  28. Chemali, E.; Kollmeyer, P.J.; Preindl, M.; Ahmed, R.; Emadi, A. Long short-term memory networks for accurate state-of-charge estimation of Li-ion batteries. IEEE Trans. Ind. Electron. 2017, 65, 6730–6739.
  29. Schuster, M.; Paliwal, K.K. Bidirectional recurrent neural networks. IEEE Trans. Signal Process. 1997, 45, 2673–2681.
  30. Vaswani, A.; Shazeer, N.; Parmar, N.; Uszkoreit, J.; Jones, L.; Gomez, A.N.; Kaiser, Ł.; Polosukhin, I. Attention is all you need. Adv. Neural Inf. Process. Syst. 2017, 30.
  31. Google Scholar. Search Engine. Available online: https://scholar.google.com/ (accessed on 15 March 2024).
  32. Engineering Village. Search Engine. Available online: https://www.engineeringvillage.com/home.url?redir=t (accessed on 15 March 2024).
  33. Kollmeyer, P.; Vidal, C.; Naguib, M.; Skells, M. LG 18650HG2 Li-ion Battery Data and Example Deep Neural Network xEV SOC Estimator Script. 2020. Available online: https://data.mendeley.com/datasets/cp3473x7xv/3 (accessed on 10 December 2023).
  34. Saha, B.; Goebel, K. Battery Data Set. NASA AMES Prognostics Data Repository. 2007. Available online: https://data.nasa.gov/dataset/Li-ion-Battery-Aging-Datasets/uj5r-zjdb/about_data (accessed on 15 January 2024).
  35. Hannan, M.A.; How, D.N.; Lipu, M.H.; Mansor, M.; Ker, P.J.; Dong, Z.; Sahari, K.; Tiong, S.K.; Muttaqi, K.M.; Mahlia, T.I.; et al. Deep learning approach towards accurate state of charge estimation for lithium-ion batteries using self-supervised transformer model. Sci. Rep. 2021, 11, 19541.
  36. Shen, H.; Zhou, X.; Wang, Z.; Wang, J. State of charge estimation for lithium-ion battery using Transformer with immersion and invariance adaptive observer. J. Energy Storage 2022, 45, 103768.
  37. Chin, C.; Gao, Z. A123’s Lithium Iron Phosphate (ANR26650M1-B) Battery Cell Data. 2019. Available online: https://ieee-dataport.org/documents/a123s-lithium-iron-phosphate-anr26650m1-b-battery-cell-data (accessed on 10 May 2024).
  38. Duan, Y.; Zou, R. A Novel State-of-Charge Estimation Method for Lithium-ion Batteries Using Convolutional Transformer Network and Sigma-point Kalman Filter. In Proceedings of the 2022 4th International Conference on Smart Power & Internet Energy Systems (SPIES), Beijing, China, 27–30 October 2022; pp. 1850–1855.
  39. Kollmeyer, P. Panasonic 18650PF Li-Ion Battery Data. 2018. Available online: https://data.mendeley.com/datasets/wykht8y7tg/1 (accessed on 10 May 2024).
  40. Zou, R.; Duan, Y.; Wang, Y.; Pang, J.; Liu, F.; Sheikh, S.R. A novel convolutional informer network for deterministic and probabilistic state-of-charge estimation of lithium-ion batteries. J. Energy Storage 2023, 57, 106298.
  41. Zhou, H.; Zhang, S.; Peng, J.; Zhang, S.; Li, J.; Xiong, H.; Zhang, W. Informer: Beyond efficient transformer for long sequence time-series forecasting. In Proceedings of the AAAI Conference on Artificial Intelligence, Virtual, 2–9 February 2021; Volume 35, pp. 11106–11115.
  42. Zheng, F.; Xing, Y.; Jiang, J.; Sun, B.; Kim, J.; Pecht, M. Influence of different open circuit voltage tests on state of charge online estimation for lithium-ion batteries. Appl. Energy 2016, 183, 513–525.
  43. Shi, D.; Zhao, J.; Wang, Z.; Zhao, H.; Wang, J.; Lian, Y.; Burke, A.F. Spatial-temporal self-attention transformer networks for battery state of charge estimation. Electronics 2023, 12, 2598.
  44. Ahn, H.; Shen, H.; Zhou, X.; Kung, Y.C.; Wang, J. State of Charge Estimation of Lithium-Ion Batteries Using Physics-Informed Transformer for Limited Data Scenarios. ASME Lett. Dyn. Syst. Control 2023, 3, 041002.
  45. He, W.; Williard, N.; Chen, C.; Pecht, M. State of charge estimation for Li-ion batteries using neural network modeling and unscented Kalman filter-based error cancellation. Int. J. Electr. Power Energy Syst. 2014, 62, 783–791.
  46. Bao, Z.; Nie, J.; Lin, H.; Gao, K.; He, Z.; Gao, M. TTSNet: State-of-Charge Estimation of Li-ion Battery in Electrical Vehicles with Temporal Transformer-based Sequence Network. IEEE Trans. Veh. Technol. 2024, 73, 7838–7851.
  47. Yılmaz, M.; Çinar, E.; Yazici, A. A Novel Transformer-Based Model for State of Charge Estimation of Electrical Vehicle Batteries. SSRN preprint 4736082. Available online: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4736082 (accessed on 9 April 2024).
  48. Steinstraeter, M.; Buberger, J.; Trifonov, D. Battery and heating data in real driving cycles. IEEE Dataport 2020, 10.
  49. Catenaro, E.; Onori, S. Experimental data of lithium-ion batteries under galvanostatic discharge tests at different rates and temperatures of operation. Data Brief 2021, 35, 106894.
  50. Chen, D.; Hong, W.; Zhou, X. Transformer Network for Remaining Useful Life Prediction of Lithium-Ion Batteries. IEEE Access 2022, 10, 19621–19628.
  51. Gu, X.; See, K.W.; Li, P.; Shan, K.; Wang, Y.; Zhao, L.; Lim, K.C.; Zhang, N. A novel state-of-health estimation for the lithium-ion battery using a convolutional neural network and transformer model. Energy 2023, 262, 125501.
  52. Bai, T.; Wang, H. Convolutional Transformer-Based Multiview Information Perception Framework for Lithium-Ion Battery State-of-Health Estimation. IEEE Trans. Instrum. Meas. 2023, 72, 2523312.
  53. Birkl, C. Oxford Battery Degradation Dataset 1. 2017. Available online: https://ora.ox.ac.uk/objects/uuid:03ba4b01-cfed-46d3-9b1a-7d4a7bdf6fac (accessed on 15 April 2024).
  54. Zhao, Y.; Behdad, S. State of Health Estimation of Electric Vehicle Batteries Using Transformer-Based Neural Network. In Proceedings of the International Design Engineering Technical Conferences and Computers and Information in Engineering Conference, American Society of Mechanical Engineers, Boston, MA, USA, 20–23 August 2023; Volume 87332, p. V005T05A017.
  55. Chen, L.; Xie, S.; Lopes, A.M.; Bao, X. A vision transformer-based deep neural network for state of health estimation of lithium-ion batteries. Int. J. Electr. Power Energy Syst. 2023, 152, 109233.
  56. Luo, K.; Zheng, H.; Shi, Z. A simple feature extraction method for estimating the whole life cycle state of health of lithium-ion batteries using transformer-based neural network. J. Power Sources 2023, 576, 233139.
  57. Zhang, Y.; Tang, Q.; Zhang, Y.; Wang, J.; Stimming, U.; Lee, A.A. Identifying degradation patterns of lithium ion batteries from impedance spectroscopy using machine learning. Nat. Commun. 2020, 11, 1706.
  58. Chen, C.; Li, Y.; Liang, X.; Li, F.; Zhang, Z. Battery health state estimation of combined Transformer-GRU based on modal decomposition. Energy Storage Sci. Technol. 2023, 12, 2927.
  59. Fan, Y.; Li, Y.; Zhao, J.; Wang, L.; Yan, C.; Wu, X.; Zhang, P.; Wang, J.; Gao, G.; Wei, L. Online State-of-Health Estimation for Fast-Charging Lithium-Ion Batteries Based on a Transformer–Long Short-Term Memory Neural Network. Batteries 2023, 9, 539.
  60. Gomez, W.; Wang, F.K.; Chou, J.H. Li-ion battery capacity prediction using improved temporal fusion transformer model. Energy 2024, 296, 131114.
  61. Severson, K.A.; Attia, P.M.; Jin, N.; Perkins, N.; Jiang, B.; Yang, Z.; Chen, M.H.; Aykol, M.; Herring, P.K.; Fraggedakis, D.; et al. Data-driven prediction of battery cycle life before capacity degradation. Nat. Energy 2019, 4, 383–391.
  62. Gao, M.; Shen, H.; Bao, Z.; Deng, Y.; He, Z. A Correlation-Augmented Informer-Based Method for State-of-Health Estimation of Li-Ion Batteries. IEEE Sens. J. 2024, 24, 3342–3353.
  63. Fauzi, M.R.; Yudistira, N.; Mahmudy, W.F. State-of-Health Prediction of Lithium-Ion Batteries Using Exponential Smoothing Transformer With Seasonal and Growth Embedding. IEEE Access 2024, 12, 14659–14670.
  64. Woo, G.; Liu, C.; Sahoo, D.; Kumar, A.; Hoi, S. Etsformer: Exponential smoothing transformers for time-series forecasting. arXiv 2022, arXiv:2202.01381.
  65. Kitaev, N.; Kaiser, Ł.; Levskaya, A. Reformer: The efficient transformer. arXiv 2020, arXiv:2001.04451.
  66. Nakano, K.; Tanaka, K. Transformer-Based Online Battery State of Health Estimation from Electric Vehicle Driving Data. Available online: https://www.enerarxiv.org/thesis/1704194360.pdf (accessed on 9 April 2024).
  67. Zhang, W.; Jia, J.; Pang, X.; Wen, J.; Shi, Y.; Zeng, J. An Improved Transformer Model for Remaining Useful Life Prediction of Lithium-Ion Batteries under Random Charging and Discharging. Electronics 2024, 13, 1423.
  68. Huang, Y.; Liang, H.; Xu, L. Kalman Filter Optimize Transformer Method for State of Health Prediction on Lithium-Ion Battery. Available online: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4718095 (accessed on 9 April 2024).
Figure 1. Transformer model architecture.
Figure 2. Example of the structure and flow of data for SOC estimation in Transformer-based models.
Figure 3. Example of the structure and flow of data for SOH estimation in Transformer-based models.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Guirguis, J.; Ahmed, R. Transformer-Based Deep Learning Models for State of Charge and State of Health Estimation of Li-Ion Batteries: A Survey Study. Energies 2024, 17, 3502. https://doi.org/10.3390/en17143502
