Ensemble Gradient Boosted Tree for SoH Estimation Based on Diagnostic Features
Abstract
1. Introduction
2. Proposed SoH Estimation Framework
2.1. Experimental Data
2.2. Data Preparation
2.3. Feature Engineering
2.3.1. Feature Exploration
Statistical Features
Distortion Metrics
Spectral Features
2.3.2. Prognostic Feature Ranking
2.4. Ensemble Gradient Boosted Tree
 $\left\{\left({x}_{1},{y}_{1}\right),\dots ,\left({x}_{n},{y}_{n}\right)\right\}$ is the training set;
 $L$ is the number of leaves;
 ${R}_{1},\text{}{R}_{2},\dots ,\text{}{R}_{L}$ are the disjoint regions that partition the input space;
 ${P}_{i}$ is the output of region ${R}_{i},\text{}i=1,\dots ,L$;
 $H$ is a decision tree, and its output is calculated as $H\left(x\right)={\displaystyle \sum}_{i=1}^{L}{P}_{i}{1}_{{R}_{i}}\left(x\right)$;
 $\widehat{F}\left(x\right)=\mathrm{arg}\underset{F\left(x\right)}{\mathrm{min}}\text{}{E}_{y,x}\left[L\left(y,F\left(x\right)\right)\right]$ is the function that maps $x$ to $y$ while minimizing the expected loss $L\left(y,F\left(x\right)\right)$ over the joint distribution of all $\left(x,y\right)$ values;
 The pseudoresidual is determined as ${g}_{i}\left({x}_{j}\right)=-\left[\frac{\partial L\left({y}_{j},{F}_{i-1}\left({x}_{j}\right)\right)}{\partial {F}_{i-1}\left({x}_{j}\right)}\right],\text{}j=1,\dots ,N.$
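The definitions above describe the standard gradient boosting loop: at each stage a tree is fitted to the pseudoresiduals and its scaled output is added to the ensemble. A minimal sketch, assuming squared-error loss (under which the pseudoresidual reduces to the ordinary residual $y - F(x)$) and scikit-learn's `DecisionTreeRegressor` as the base learner — an illustration of the technique, not the paper's tuned EGBT implementation:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def gradient_boost_fit(X, y, n_trees=50, max_leaf_nodes=8, learning_rate=0.1):
    """Gradient boosting with squared-error loss L(y, F) = (y - F)^2 / 2.

    The negative gradient (pseudoresidual) is then g = y - F_{i-1}(x),
    so each new tree H is simply fitted to the current residuals.
    """
    F0 = y.mean()                          # initial constant model F_0
    pred = np.full(len(y), F0)
    trees = []
    for _ in range(n_trees):
        residuals = y - pred               # pseudoresiduals g_i(x_j)
        tree = DecisionTreeRegressor(max_leaf_nodes=max_leaf_nodes)
        tree.fit(X, residuals)             # tree output: sum_i P_i * 1_{R_i}(x)
        pred += learning_rate * tree.predict(X)
        trees.append(tree)
    return F0, trees

def gradient_boost_predict(F0, trees, X, learning_rate=0.1):
    """Sum the initial constant and the scaled outputs of all trees."""
    pred = np.full(X.shape[0], F0)
    for tree in trees:
        pred += learning_rate * tree.predict(X)
    return pred
```

The learning rate shrinks each tree's contribution, which is what makes the exhaustive search over tree complexity and rate (described next) worthwhile.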
 Cross-validate a set of ensembles. Exponentially increase the tree-complexity level for subsequent ensembles, from a decision stump (one split) up to at most n − 1 splits, where n is the sample size. In addition, vary the learning rate for each ensemble between 0.05 and 0.2.
 Vary the maximum number of leaves using the values in the sequence {${2}^{0},\text{}{2}^{1},\dots ,\text{}{2}^{m}$}, where m is the largest integer such that ${2}^{m}$ is no greater than n − 1.
 For each variant, adjust the learning rate using each value in the set {0.05, 0.1, 0.15, 0.2}.
 Estimate the RMSE for each ensemble.
 Identify the number of trees (N), the maximum number of leaves (L), and the learning rate (R) that yield the lowest RMSE overall.
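The tuning steps above can be sketched with scikit-learn's `GradientBoostingRegressor` and `GridSearchCV` as a stand-in for the paper's EGBT implementation. The function name and grid bounds below are illustrative assumptions; scikit-learn requires at least 2 leaves, so the search starts at the stump level (one split, two leaves):

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import GridSearchCV

def tune_egbt(X, y, n_trees=100, cv=5):
    """Cross-validated search over leaf count and learning rate."""
    n = len(y)
    # Leaf counts 2^1 ... 2^m with 2^m <= n - 1 (a stump has 2 leaves;
    # the degenerate 2^0 = 1-leaf tree is not a valid sklearn setting).
    m = int(np.floor(np.log2(n - 1)))
    grid = {
        "max_leaf_nodes": [2 ** k for k in range(1, m + 1)],
        "learning_rate": [0.05, 0.10, 0.15, 0.20],
    }
    search = GridSearchCV(
        GradientBoostingRegressor(n_estimators=n_trees),
        grid,
        scoring="neg_root_mean_squared_error",
        cv=cv,
    )
    search.fit(X, y)
    # Return the (L, R) pair with the lowest cross-validated RMSE
    return search.best_params_, -search.best_score_
```

Exhausting the full grid is affordable here because both axes are small: the leaf counts grow exponentially, so only about log2(n) complexity levels exist, times four learning rates.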
3. Results and Discussion
3.1. Experimental Results
3.2. Discussion
4. Conclusions and Outlooks
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
Positive electrode material  Nickel Manganese Cobalt Oxide 

Negative electrode material  Graphite 
Cell weight  428 $\mathrm{g}$ 
Nominal voltage  3.65 $\mathrm{V}$ 
Nominal capacity  20 $\mathrm{Ah}$ 
Voltage range  3 to 4.2 $\mathrm{V}$ 
Power density  2300 $\mathrm{W}/\mathrm{kg}$ 
Specific energy  174 $\mathrm{Wh}/\mathrm{kg}$ 
Conditions  Cell 1  Cell 2  Cell 3 

DoD (%)  75  60  75 
Middle SoC (%)  50  50  50 
Temperature (°C)  45  45  10 
Number of full charge and discharge cycles per day  6  12  6 
Number of WLTC trips in each cycle  5  4  5 
Features  Description 

Basic features  
MAV  Mean absolute value of the signal: $MAV=\frac{1}{N}{\displaystyle \sum}_{i=1}^{N}\left|{s}_{i}\right|$ 
SD  Measures the spread of the data around the mean value: $SD=\sqrt{\frac{{{\displaystyle \sum}}_{i=1}^{N}{\left({s}_{i}-mean\left(s\right)\right)}^{2}}{N-1}}$ 
RMS  Root mean square of the input signal: $RMS=\sqrt{\frac{{{\displaystyle \sum}}_{i=1}^{N}{s}_{i}^{2}}{N}}$ 
SF  Calculated by dividing the RMS by the MAV; it depends on the signal shape: $SF=\frac{RMS}{MAV}$ 
Impulsive features  
Peak value  The maximum absolute value of the signal; the basic parameter for computing the other impulsive features. 
Impulse parameter  The height of a peak divided by the signal’s mean absolute level: $\frac{Peak\text{}value}{MAV}$ 
Crest parameter  Calculated by dividing the peak value by the RMS: $C=\frac{\left|Peak\right|}{RMS}$ 
High-order features  
Skewness  Describes the symmetry of a distribution: $Sk=\frac{1}{N-1}\left(\frac{{{\displaystyle \sum}}_{i=1}^{N}{\left({s}_{i}-mean\left(s\right)\right)}^{3}}{S{D}^{3}}\right)$ 
Kurtosis  Characterizes the difference between a distribution and a normal distribution: $Ku=\frac{1}{N-1}\left(\frac{{{\displaystyle \sum}}_{i=1}^{N}{\left({s}_{i}-mean\left(s\right)\right)}^{4}}{S{D}^{4}}\right)$ 
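The feature formulas in the table map directly onto NumPy. A minimal sketch (the function name and dictionary keys are mine, not from the paper) that computes every feature for one signal window:

```python
import numpy as np

def extract_features(s):
    """Compute the statistical, impulsive, and high-order features
    for a single signal window s."""
    s = np.asarray(s, dtype=float)
    N = len(s)
    mav = np.mean(np.abs(s))                       # mean absolute value
    sd = np.sqrt(np.sum((s - s.mean()) ** 2) / (N - 1))
    rms = np.sqrt(np.sum(s ** 2) / N)
    peak = np.max(np.abs(s))                       # basic impulsive parameter
    return {
        "MAV": mav,
        "SD": sd,
        "RMS": rms,
        "SF": rms / mav,                           # shape factor
        "Peak": peak,
        "Impulse": peak / mav,
        "Crest": peak / rms,
        "Skewness": np.sum((s - s.mean()) ** 3) / ((N - 1) * sd ** 3),
        "Kurtosis": np.sum((s - s.mean()) ** 4) / ((N - 1) * sd ** 4),
    }
```

Note that the table's skewness and kurtosis use an N − 1 normalization, matching the sample-SD definition above, rather than the population moments some libraries default to.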
EGBT Algorithm 


Metrics  Definition 

Mean absolute error (MAE)  $MAE=\frac{1}{N}{\displaystyle \sum}_{i=1}^{N}\left|{\widehat{Y}}_{i}-{Y}_{i}\right|$ 
Mean absolute percentage error (MAPE)  $MAPE=\frac{1}{N}{\displaystyle \sum}_{i=1}^{N}\left|\frac{{\widehat{Y}}_{i}-{Y}_{i}}{{Y}_{i}}\right|\times 100\%$ 
Root mean squared error (RMSE)  $RMSE=\sqrt{\frac{1}{N}{\displaystyle \sum}_{i=1}^{N}{\left({\widehat{Y}}_{i}-{Y}_{i}\right)}^{2}}$ 
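The three metrics above are one-liners in NumPy; a minimal sketch, assuming predictions and targets are passed as arrays (function names are mine):

```python
import numpy as np

def mae(y_hat, y):
    """Mean absolute error."""
    return np.mean(np.abs(y_hat - y))

def mape(y_hat, y):
    """Mean absolute percentage error (targets must be nonzero)."""
    return np.mean(np.abs((y_hat - y) / y)) * 100.0

def rmse(y_hat, y):
    """Root mean squared error."""
    return np.sqrt(np.mean((y_hat - y) ** 2))
```

RMSE penalizes large deviations more heavily than MAE, which is why it is the criterion minimized during the hyperparameter search while MAE and MAPE serve as secondary checks.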
Model  MAPE  Computational Cost (s) 

Model 1  0.45  11.8 
Model 2  0.21  8.1 
Model 3  0.20  6.9 
Error Evaluation Metric  Validation on Cell 2  Validation on Cell 3 

MAE  0.53  0.64 
RMSE  0.69  0.70 
MAPE  0.58  0.63 
Training  Validation on Cell 2  Validation on Cell 3  

Error Evaluation Metric  EGBT  Decision Tree  EGBT  Decision Tree  EGBT  Decision Tree 
RMSE  0.29  0.34  0.69  1.18  0.70  2.67 
MAE  0.083  0.081  0.53  1.05  0.64  2.34 
MAPE  0.20  0.21  0.58  1.14  0.63  2.35 
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Khaleghi, S.; Firouz, Y.; Berecibar, M.; Mierlo, J.V.; Bossche, P.V.D. Ensemble Gradient Boosted Tree for SoH Estimation Based on Diagnostic Features. Energies 2020, 13, 1262. https://doi.org/10.3390/en13051262