A Framework for an ML-Based Predictive Turbofan Engine Health Model
Abstract
1. Introduction
References | List of Predictor Variables | No. Var. | Model | GT Type | Target |
---|---|---|---|---|---|
[19] | , , , , , , , , , | 10 | ML | industrial | performance |
[20] | load, , relative humidity | 3 | ML | industrial | performance |
[4] | , , , , , , , FF, N | 9 | physics-based | aero | health state |
[5] | , , , , , , , RH, , VAR, , , | 13 | physics-based | industrial | health state |
[21] | , , , , N, FF | 6 | ML | aero | TGT |
[22] | n/a (not given) | 14 | ML | aero | TGT margin |
[23] | n/a (not given) | 10 | ML | aero | TGT margin |
[24] | damage index, starts, trips, vendor type, , , , , ACC1, ACC2, , | 12 | ML | industrial | RUL |
[25] | N1, , , , , | 6 | ML | aero | fault classes |
[26] | , , , , , Phi, | 7 | ML | aero | RUL |
[27] | , , Pitch, AOA, Roll, , TAT, Ground/Air | 8 | ML | aero | performance |
[28] | ALT, MN, FF, Power | 4 | ML | aero | performance |
[7] | , , , , , , , , , FF, | 10+ | physics-based | marine | health state |
[29] | , , , , FF, , , , , , , , , | 14 | ML | aero | performance |
[30] | SLOAT, ALT, , MN, , | 6 | ML | aero | EGT margin |
[31] | , , , , , , , , , , Fuel Feeding, Nozzle Throat, Throttle position | 13 | ML | aero | EGT |
[32] | , , , , | 5 | ML | aero | RUL, EOH |
[33] | shape parameter, EngCycRem, TCI, InspInterval, CycOneTimeInsp, InspModel, ReplaceAll | 7 | ML | aero | maintenance optimization |
NASA C-MAPSS Studies (see Table A1 in Appendix A) | NASA C-MAPSS dataset | 7–46 | ML/physics-based | aero | RUL |
2. Preprocessing: Selection of Predictor Variable and Data Segmentation
2.1. EHM Data and Turbine Gas Temperature
2.2. Selection of Predictor Variable
2.3. Segmentation of Data Using Engine Maintenance Interval
2.4. Detrending
2.5. Multi-Engine-Based Large Generalized Model
3. Machine Learning Algorithms
3.1. Long Short-Term Memory
3.2. Linear and Nonlinear Algorithms
4. Results
4.1. Prediction Accuracies of Linear Algorithms, Nonlinear Algorithms, and LSTM
4.2. Comparison of Prediction Accuracies of Training Approaches
4.3. Result of Multi-Engine-Based Large Generalized Model
4.4. Detail View of the Prediction Results
4.5. Uncertainty Quantification
5. Conclusions
- The seven predictor variables of Set 4 showed the lowest percentage of outliers while achieving prediction accuracy comparable to the other sets.
- Linear and nonlinear regression algorithms produced promising results.
- For individual engines, the proposed training approaches demonstrated their prediction capability with a mean root mean squared error (RMSE) ranging from 4 °C to 6 °C, while using up to 65% less data than the train (80%)–test (20%) split method.
- The multi-engine-based large generalized model, built from the data of each engine family, achieved similar prediction accuracy (a mean RMSE ranging from 3 °C to 5 °C) with a smaller IQR (from 0.5 °C to 1.6 °C); however, it required 45–300 times more data than the proposed approaches.
- Uncertainty quantification showed a coverage width criterion (CWC) between 29 °C and 40 °C, varying with engine family, and a prediction interval coverage probability (PICP) over 93% for all engines (a minimal sketch of these interval metrics is given after this list).
6. Future Research
- Incorporate ancillary data such as flight routes, weather, and airborne particulates.
- Implement missing value imputation.
- Apply advanced algorithms.
- Examine uncertainty quantification (UQ) methods.
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
ada | adaboost regression |
bag | bagging regression |
cart | decision tree regression |
C-MAPSS | Commercial Modular Aero-Propulsion System Simulation |
CWC | coverage width criterion |
EHM | engine health monitoring |
en | elastic net regression |
extra | extra trees regression |
gbm | gradient boosting machine |
HPC | high-pressure compressor |
huber | Huber regression |
IPC | intermediate pressure compressor |
IQR | interquartile range |
knn | k-nearest neighbors regression |
lasso | lasso linear regression |
lassol | lasso least angle regression (lars) |
lr | linear regression |
LSTM | long short-term memory |
Max. | maximum value |
Min. | minimum value |
MPIW | mean prediction interval width |
NMPIW | normalized mean prediction interval width |
outlier | ratio of outliers |
pa | passive aggressive regression |
PICP | prediction interval coverage probability |
rf | random forest regression |
ridge | ridge regression |
RMSE | root mean squared error |
RUL | remaining useful life |
sgd | stochastic gradient descent regression |
StdAE | standard deviation of absolute error |
StdPIW | standard deviation of prediction interval width |
svmr | support vector machine regression |
TGT | turbine gas temperature |
ŷ | predicted value |
y | actual (target) value |
Appendix A. Studies Utilizing NASA C-MAPSS Dataset
Reference | No. Var. | Reference | No. Var. | Reference | No. Var. | Reference | No. Var. |
---|---|---|---|---|---|---|---|
[3] | 14 | [70] | 30/18 | [71] | 26 | [72] | 14 |
[73] | 21 | [74] | 14 | [75] | 26 | [76] | 14 |
[77] | 26 | [78] | 26 | [38] | 14 | [79] | 14 |
[6] | 7 | [80] | 26 | [81] | 26 | [82] | 14–17 |
[83] | 24 | [84] | 11 | [85] | 26 | [86] | 46 |
[39] | 26 | [87] | 14 | [88] | 24 | [89] | 14 |
[90] | 21 | [91] | 14 | [92] | 24 | [93] | 14 |
[40] | 26 | [94] | 21 | [42] | 21 | [95] | 7 |
[41] | 14 | [96] | 26 | [97] | 24 | [98] | 14 |
[99] | 14 | [100] | 26 | [101] | 26 | [102] | 10–20 |
[103] | 24 | [104] | 24 |
References
- Schwabacher, M. A Survey of Data-Driven Prognostics. In Proceedings of the Infotech@Aerospace, Arlington, VA, USA, 26–29 September 2005; p. 7002. [Google Scholar] [CrossRef]
- Schwabacher, M.; Goebel, K. A Survey of Artificial Intelligence for Prognostics. In AAAI Fall Symposium: Artificial Intelligence for Prognostics; Association for the Advancement of Artificial Intelligence: Washington, DC, USA, 2007; pp. 108–115. [Google Scholar]
- Saxena, A.; Goebel, K.; Simon, D.; Eklund, N. Damage propagation modeling for aircraft engine run-to-failure simulation. In Proceedings of the 2008 International Conference on Prognostics and Health Management, Denver, CO, USA, 6–9 October 2008; IEEE: New York, NY, USA, 2008; pp. 1–9. [Google Scholar]
- Li, Y.G. Gas turbine performance and health status estimation using adaptive gas path analysis. J. Eng. Gas Turbines Power 2010, 132, 041701. [Google Scholar] [CrossRef]
- Pinelli, M.; Spina, P.R.; Venturini, M. Gas turbine health state determination: Methodology approach and field application. Int. J. Rotating Mach. 2012, 2012, 142173. [Google Scholar] [CrossRef]
- Alozie, O.; Li, Y.G.; Wu, X.; Shong, X.; Ren, W. An adaptive model-based framework for prognostics of gas path faults in aircraft gas turbine engines. Int. J. Progn. Health Manag. 2019, 10, 013. [Google Scholar] [CrossRef]
- Fang, Y.l.; Liu, D.f.; Liu, Y.b.; Yu, L.w. Comprehensive assessment of gas turbine health condition based on combination weighting of subjective and objective. Int. J. Gas Turbine Propuls. Power Syst. 2020, 11, 56–62. [Google Scholar] [CrossRef]
- LeCun, Y.; Bengio, Y.; Hinton, G. Deep Learning. Nature 2015, 521, 436–444. [Google Scholar] [CrossRef]
- Hochreiter, S.; Schmidhuber, J. Long short-term memory. Neural Comput. 1997, 9, 1735–1780. [Google Scholar] [CrossRef]
- Frederick, D.K.; DeCastro, J.A.; Litt, J.S. User’s Guide for the Commercial Modular Aero-Propulsion System Simulation (C-MAPSS); Technical Report; Glenn Research Center: Cleveland, OH, USA, 2007. [Google Scholar]
- CMAPSS Jet Engine Simulated Data, NASA Open Data Portal. 2016. Available online: https://data.nasa.gov/dataset/cmapss-jet-engine-simulated-data (accessed on 23 January 2025).
- Marinai, L. Gas-Path Diagnostics and Prognostics for Aero-Engines Using Fuzzy Logic and Time Series Analysis. Ph.D. Thesis, School of Engineering, Cranfield University, Cranfield, UK, 2004. [Google Scholar]
- Spieler, S.; Staudacher, S.; Fiola, R.; Sahm, P.; Weißschuh, M. Probabilistic Engine Performance Scatter and Deterioration Modeling. In Proceedings of the ASME Turbo Expo 2007: Power for Land, Sea, and Air, Montreal, QC, Canada, 14–17 May 2007; American Society of Mechanical Engineers Digital Collection: New York, NY, USA, 2009; pp. 1073–1082. [Google Scholar] [CrossRef]
- Martínez, A.; Sánchez, L.; Couso, I. Engine Health Monitoring for Engine Fleets Using Fuzzy Radviz. In Proceedings of the 2013 IEEE International Conference on Fuzzy Systems (FUZZ-IEEE), Hyderabad, India, 7–10 July 2013; pp. 1–8. [Google Scholar] [CrossRef]
- Martinez, A.; Sánchez, L.; Couso, I. Aeroengine Prognosis through Genetic Distal Learning Applied to Uncertain Engine Health Monitoring Data. In Proceedings of the 2014 IEEE International Conference on Fuzzy Systems (FUZZ-IEEE), Beijing, China, 6–11 July 2014; pp. 1945–1952. [Google Scholar] [CrossRef]
- Gräter, F.; Staudacher, S.; Weißschuh, M. Operator-Specific Engine Trending Using a Feature-Based Model. In Proceedings of the ASME Turbo Expo 2010: Power for Land, Sea, and Air, Glasgow, UK, 14–18 June 2010; American Society of Mechanical Engineers Digital Collection: New York, NY, USA, 2010; pp. 79–88. [Google Scholar] [CrossRef]
- Skaf, Z.; Zaidan, M.A.; Harrison, R.F.; Mills, A.R. Accommodating Repair Actions into Gas Turbine Prognostics. In Proceedings of the Annual Conference of the PHM Society, New Orleans, LA, USA, 14–17 October 2013; Volume 5. [Google Scholar] [CrossRef]
- Laslett, O.W.; Mills, A.R.; Zaidan, M.A.; Harrison, R.F. Fusing an Ensemble of Diverse Prognostic Life Predictions. In Proceedings of the 2014 IEEE Aerospace Conference, Big Sky, MT, USA, 1–8 March 2014; pp. 1–10. [Google Scholar] [CrossRef]
- Olausson, P.; Häggståhl, D.; Arriagada, J.; Dahlquist, E.; Assadi, M. Hybrid model of an evaporative gas turbine power plant utilizing physical models and artificial neural networks. In Proceedings of the Turbo Expo: Power for Land, Sea, and Air, Atlanta, GA, USA, 16–19 June 2003; Volume 36843, pp. 299–306. [Google Scholar]
- Fast, M.; Assadi, M.; De, S. Condition based maintenance of gas turbines using simulation data and artificial neural network: A demonstration of feasibility. In Proceedings of the Turbo Expo: Power for Land, Sea, and Air, Berlin, Germany, 9–13 June 2008; Volume 43123, pp. 153–161. [Google Scholar]
- Vatani, A.; Khorasani, K.; Meskin, N. Health monitoring and degradation prognostics in gas turbine engines using dynamic neural networks. In Proceedings of the Turbo Expo: Power for Land, Sea, and Air, Montreal, QC, Canada, 15–19 June 2015; American Society of Mechanical Engineers: New York, NY, USA, 2015; Volume 56758, p. V006T05A030. [Google Scholar]
- Zaidan, M.A.; Harrison, R.F.; Mills, A.R.; Fleming, P.J. Bayesian hierarchical models for aerospace gas turbine engine prognostics. Expert Syst. Appl. 2015, 42, 539–553. [Google Scholar] [CrossRef]
- Zaidan, M.A.; Relan, R.; Mills, A.R.; Harrison, R.F. Prognostics of gas turbine engine: An integrated approach. Expert Syst. Appl. 2015, 42, 8472–8483. [Google Scholar] [CrossRef]
- Pillai, P.; Kaushik, A.; Bhavikatti, S.; Roy, A.; Kumar, V. A hybrid approach for fusing physics and data for failure prediction. Int. J. Progn. Health Manag. 2016, 7, 4. [Google Scholar] [CrossRef]
- Yang, X.; Pang, S.; Shen, W.; Lin, X.; Jiang, K.; Wang, Y. Aero engine fault diagnosis using an optimized extreme learning machine. Int. J. Aerosp. Eng. 2016, 2016, 7892875. [Google Scholar] [CrossRef]
- Khan, F.; Eker, O.; Khan, A.; Orfali, W. Adaptive Degradation Prognostic Reasoning by Particle Filter with a Neural Network Degradation Model for Turbofan Jet Engine. Data 2018, 3, 49. [Google Scholar] [CrossRef]
- Yildirim, M.T.; Kurt, B. Aircraft gas turbine engine health monitoring system by real flight data. Int. J. Aerosp. Eng. 2018, 2018, 9570873. [Google Scholar] [CrossRef]
- Kim, S.; Kim, K.; Son, C. Transient system simulation for an aircraft engine using a data-driven model. Energy 2020, 196, 117046. [Google Scholar] [CrossRef]
- Fentaye, A.; Zaccaria, V.; Rahman, M.; Stenfelt, M.; Kyprianidis, K. Hybrid model-based and data-driven diagnostic algorithm for gas turbine engines. In Proceedings of the Turbo Expo: Power for Land, Sea, and Air, Online, 21–25 September 2020; American Society of Mechanical Engineers: New York, NY, USA, 2020; Volume 84140, p. V005T05A008. [Google Scholar]
- Lin, L.; Liu, J.; Guo, H.; Lv, Y.; Tong, C. Sample adaptive aero-engine gas-path performance prognostic model modeling method. Knowl.-Based Syst. 2021, 224, 107072. [Google Scholar] [CrossRef]
- Ullah, S.; Li, S.; Khan, K.; Khan, S.; Khan, I.; Eldin, S.M. An Investigation of Exhaust Gas Temperature of Aircraft Engine Using LSTM. IEEE Access 2023, 11, 5168–5177. [Google Scholar] [CrossRef]
- Xiao, W.; Chen, Y.; Zhang, H.; Shen, D. Remaining Useful Life Prediction Method for High Temperature Blades of Gas Turbines Based on 3D Reconstruction and Machine Learning Techniques. Appl. Sci. 2023, 13, 11079. [Google Scholar] [CrossRef]
- Lee, D.; Kwon, H.J.; Choi, K. Risk-Based Maintenance Optimization of Aircraft Gas Turbine Engine Component. Proc. Inst. Mech. Eng. Part O J. Risk Reliab. 2024, 238, 429–445. [Google Scholar] [CrossRef]
- Wold, S.; Esbensen, K.; Geladi, P. Principal Component Analysis. Chemom. Intell. Lab. Syst. 1987, 2, 37–52. [Google Scholar] [CrossRef]
- Abdi, H.; Williams, L.J. Principal Component Analysis. Wires Comput. Stat. 2010, 2, 433–459. [Google Scholar] [CrossRef]
- Pearson, K. Note on regression and inheritance in the case of two parents. Proc. R. Soc. Lond. 1895, 58, 240–242. [Google Scholar] [CrossRef]
- Bengio, Y.; Simard, P.; Frasconi, P. Learning long-term dependencies with gradient descent is difficult. IEEE Trans. Neural Netw. 1994, 5, 157–166. [Google Scholar] [CrossRef] [PubMed]
- Wang, J.; Wen, G.; Yang, S.; Liu, Y. Remaining Useful Life Estimation in Prognostics Using Deep Bidirectional LSTM Neural Network. In Proceedings of the 2018 Prognostics and System Health Management Conference (PHM-Chongqing), Chongqing, China, 26–28 October 2018; pp. 1037–1042. [Google Scholar] [CrossRef]
- Remadna, I.; Terrissa, S.L.; Zemouri, R.; Ayad, S.; Zerhouni, N. Leveraging the Power of the Combination of CNN and Bi-Directional LSTM Networks for Aircraft Engine RUL Estimation. In Proceedings of the 2020 Prognostics and Health Management Conference (PHM-Besançon), Besancon, France, 4–7 May 2020; pp. 116–121. [Google Scholar] [CrossRef]
- Shah, S.R.B.; Chadha, G.S.; Schwung, A.; Ding, S.X. A Sequence-to-Sequence Approach for Remaining Useful Lifetime Estimation Using Attention-augmented Bidirectional LSTM. Intell. Syst. Appl. 2021, 10–11, 200049. [Google Scholar] [CrossRef]
- Song, Y.; Gao, S.; Li, Y.; Jia, L.; Li, Q.; Pang, F. Distributed Attention-Based Temporal Convolutional Network for Remaining Useful Life Prediction. IEEE Internet Things J. 2021, 8, 9594–9602. [Google Scholar] [CrossRef]
- Song, J.W.; Park, Y.I.; Hong, J.J.; Kim, S.G.; Kang, S.J. Attention-Based Bidirectional LSTM-CNN Model for Remaining Useful Life Estimation. In Proceedings of the 2021 IEEE International Symposium on Circuits and Systems (ISCAS), Daegu, Republic of Korea, 22–28 May 2021; pp. 1–5. [Google Scholar] [CrossRef]
- Cover, T.; Hart, P. Nearest neighbor pattern classification. IEEE Trans. Inf. Theory 1967, 13, 21–27. [Google Scholar] [CrossRef]
- Tibshirani, R. Regression shrinkage and selection via the lasso. J. R. Stat. Soc. Ser. B (Methodol.) 1996, 58, 267–288. [Google Scholar] [CrossRef]
- Loh, W.Y. Classification and regression trees. Wiley Interdiscip. Rev. Data Min. Knowl. Discov. 2011, 1, 14–23. [Google Scholar] [CrossRef]
- Hoerl, A.E.; Kennard, R.W. Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 1970, 12, 55–67. [Google Scholar] [CrossRef]
- Geurts, P.; Ernst, D.; Wehenkel, L. Extremely randomized trees. Mach. Learn. 2006, 63, 3–42. [Google Scholar] [CrossRef]
- Zou, H.; Hastie, T. Regularization and variable selection via the elastic net. J. R. Stat. Soc. Ser. B (Stat. Methodol.) 2005, 67, 301–320. [Google Scholar] [CrossRef]
- Drucker, H.; Burges, C.J.; Kaufman, L.; Smola, A.; Vapnik, V. Support vector regression machines. In Proceedings of the 10th International Conference on Neural Information Processing Systems, Denver, CO, USA, 3–5 December 1996; Volume 9. [Google Scholar]
- Huber, P.J. Robust regression: Asymptotics, conjectures and Monte Carlo. Ann. Stat. 1973, 1, 799–821. [Google Scholar] [CrossRef]
- Freund, Y.; Schapire, R.; Abe, N. A short introduction to boosting. J.-Jpn. Soc. Artif. Intell. 1999, 14, 1612. [Google Scholar]
- Efron, B.; Hastie, T.; Johnstone, I.; Tibshirani, R. Least angle regression. Ann. Stat. 2004, 32, 407–499. [Google Scholar] [CrossRef]
- Breiman, L. Bagging predictors. Mach. Learn. 1996, 24, 123–140. [Google Scholar] [CrossRef]
- Crammer, K.; Dekel, O.; Keshet, J.; Shalev-Shwartz, S.; Singer, Y. Online passive aggressive algorithms. J. Mach. Learn. Res. 2006, 7, 551–585. [Google Scholar]
- Breiman, L. Random forests. Mach. Learn. 2001, 45, 5–32. [Google Scholar] [CrossRef]
- Kiefer, J.; Wolfowitz, J. Stochastic estimation of the maximum of a regression function. Ann. Math. Stat. 1952, 23, 462–466. [Google Scholar] [CrossRef]
- Friedman, J.H. Greedy function approximation: A gradient boosting machine. Ann. Stat. 2001, 29, 1189–1232. [Google Scholar] [CrossRef]
- Liashchynskyi, P.; Liashchynskyi, P. Grid Search, Random Search, Genetic Algorithm: A Big Comparison for NAS. arXiv 2019, arXiv:1912.06059. [Google Scholar] [CrossRef]
- Hwang, J.G.; Ding, A.A. Prediction intervals for artificial neural networks. J. Am. Stat. Assoc. 1997, 92, 748–757. [Google Scholar] [CrossRef]
- De Veaux, R.D.; Schumi, J.; Schweinsberg, J.; Ungar, L.H. Prediction intervals for neural networks via nonlinear regression. Technometrics 1998, 40, 273–282. [Google Scholar] [CrossRef]
- Ho, S.L.; Xie, M.; Tang, L.; Xu, K.; Goh, T. Neural network modeling with confidence bounds: A case study on the solder paste deposition process. IEEE Trans. Electron. Packag. Manuf. 2001, 24, 323–332. [Google Scholar] [CrossRef]
- Khosravi, A.; Nahavandi, S.; Creighton, D. Improving prediction interval quality: A genetic algorithm-based method applied to neural networks. In Proceedings of the Neural Information Processing: 16th International Conference, ICONIP 2009, Bangkok, Thailand, 1–5 December 2009; Proceedings, Part II 16. Springer: Berlin/Heidelberg, Germany, 2009; pp. 141–149. [Google Scholar]
- Lu, T.; Viljanen, M. Prediction of indoor temperature and relative humidity using neural network models: Model comparison. Neural Comput. Appl. 2009, 18, 345–357. [Google Scholar] [CrossRef]
- Wu, W.; Chen, K.; Qiao, Y.; Lu, Z. Probabilistic short-term wind power forecasting based on deep neural networks. In Proceedings of the 2016 International Conference on Probabilistic Methods Applied to Power Systems (PMAPS), Beijing, China, 16–20 October 2016; IEEE: New York, NY, USA, 2016; pp. 1–8. [Google Scholar]
- Van Hinsbergen, C.I.; Van Lint, J.; Van Zuylen, H. Bayesian committee of neural networks to predict travel times with confidence intervals. Transp. Res. Part C Emerg. Technol. 2009, 17, 498–509. [Google Scholar] [CrossRef]
- Zhao, J.H.; Dong, Z.Y.; Xu, Z.; Wong, K.P. A statistical approach for interval forecasting of the electricity price. IEEE Trans. Power Syst. 2008, 23, 267–276. [Google Scholar] [CrossRef]
- Pierce, S.G.; Worden, K.; Bezazi, A. Uncertainty analysis of a neural network used for fatigue lifetime prediction. Mech. Syst. Signal Process. 2008, 22, 1395–1411. [Google Scholar] [CrossRef]
- Zhang, C.; Wei, H.; Xie, L.; Shen, Y.; Zhang, K. Direct interval forecasting of wind speed using radial basis function neural networks in a multi-objective optimization framework. Neurocomputing 2016, 205, 53–63. [Google Scholar] [CrossRef]
- Khosravi, A.; Nahavandi, S.; Creighton, D.; Atiya, A.F. Comprehensive Review of Neural Network-Based Prediction Intervals and New Advances. IEEE Trans. Neural Netw. 2011, 22, 1341–1356. [Google Scholar] [CrossRef]
- Chao, M.A.; Kulkarni, C.; Goebel, K.; Fink, O. Fusing physics-based and deep learning models for prognostics. Reliab. Eng. Syst. Saf. 2022, 217, 107961. [Google Scholar] [CrossRef]
- Sateesh Babu, G.; Zhao, P.; Li, X.L. Deep convolutional neural network based regression approach for estimation of remaining useful life. In Proceedings of the International Conference on Database Systems for Advanced Applications, Dallas, TX, USA, 16–19 April 2016; Springer: Berlin/Heidelberg, Germany, 2016; pp. 214–228. [Google Scholar]
- Thakkar, U.; Chaoui, H. Remaining Useful Life Prediction of an Aircraft Turbofan Engine Using Deep Layer Recurrent Neural Networks. Actuators 2022, 11, 67. [Google Scholar] [CrossRef]
- Yuan, M.; Wu, Y.; Lin, L. Fault diagnosis and remaining useful life estimation of aero engine using LSTM neural network. In Proceedings of the 2016 IEEE International Conference on Aircraft Utility Systems (AUS), Beijing, China, 10–12 October 2016; IEEE: New York, NY, USA, 2016; pp. 135–140. [Google Scholar]
- Wang, T.; Guo, D.; Sun, X.M. Remaining Useful Life Predictions for Turbofan Engine Degradation Based on Concurrent Semi-Supervised Model. Neural Comput. Appl. 2022, 34, 5151–5160. [Google Scholar] [CrossRef]
- Zheng, S.; Ristovski, K.; Farahat, A.; Gupta, C. Long short-term memory network for remaining useful life estimation. In Proceedings of the 2017 IEEE International Conference on Prognostics and Health Management (ICPHM), Dallas, TX, USA, 19–21 June 2017; IEEE: New York, NY, USA, 2017; pp. 88–95. [Google Scholar]
- Xu, T.; Han, G.; Gou, L.; Martínez-García, M.; Shao, D.; Luo, B.; Yin, Z. SGBRT: An Edge-Intelligence Based Remaining Useful Life Prediction Model for Aero-Engine Monitoring System. IEEE Trans. Netw. Sci. Eng. 2022, 9, 3112–3122. [Google Scholar] [CrossRef]
- Hsu, C.S.; Jiang, J.R. Remaining useful life estimation using long short-term memory deep learning. In Proceedings of the 2018 IEEE International Conference on Applied System Invention (ICASI), Chiba, Japan, 13–17 April 2018; IEEE: New York, NY, USA, 2018; pp. 58–61. [Google Scholar]
- Alomari, Y.; Andó, M.; Baptista, M.L. Advancing Aircraft Engine RUL Predictions: An Interpretable Integrated Approach of Feature Engineering and Aggregated Feature Importance. Sci. Rep. 2023, 13, 13466. [Google Scholar] [CrossRef] [PubMed]
- Ensarioğlu, K.; İnkaya, T.; Emel, E. Remaining Useful Life Estimation of Turbofan Engines with Deep Learning Using Change-Point Detection Based Labeling and Feature Engineering. Appl. Sci. 2023, 13, 11893. [Google Scholar] [CrossRef]
- Keshun, Y.; Guangqi, Q.; Yingkui, G. A 3D Attention-enhanced Hybrid Neural Network for Turbofan Engine Remaining Life Prediction Using CNN and BiLSTM Models. IEEE Sens. J. 2023, 24, 21893–21905. [Google Scholar] [CrossRef]
- Miao, H.; Li, B.; Sun, C.; Liu, J. Joint Learning of Degradation Assessment and RUL Prediction for Aeroengines via Dual-Task Deep LSTM Networks. IEEE Trans. Ind. Inform. 2019, 15, 5023–5032. [Google Scholar] [CrossRef]
- Hu, Q.; Zhao, Y.; Ren, L. Novel Transformer-Based Fusion Models for Aero-Engine Remaining Useful Life Estimation. IEEE Access 2023, 11, 52668–52685. [Google Scholar] [CrossRef]
- Pasa, G.D.; Medeiros, I.; Yoneyama, T. Operating condition-invariant neural network-based prognostics methods applied on turbofan aircraft engines. In Proceedings of the Annual Conference of the PHM Society, Scottsdale, AZ, USA, 23–26 September 2019; Volume 11, pp. 1–10. [Google Scholar]
- Liu, Z.; Zhang, X.; Pan, J.; Zhang, X.; Hong, W.; Wang, Z.; Wang, Z.; Miao, Y. Similar or Unknown Fault Mode Detection of Aircraft Fuel Pump Using Transfer Learning With Subdomain Adaption. IEEE Trans. Instrum. Meas. 2023, 72, 3526411. [Google Scholar] [CrossRef]
- Wu, Z.; Yu, S.; Zhu, X.; Ji, Y.; Pecht, M. A Weighted Deep Domain Adaptation Method for Industrial Fault Prognostics According to Prior Distribution of Complex Working Conditions. IEEE Access 2019, 7, 139802–139814. [Google Scholar] [CrossRef]
- Maulana, F.; Starr, A.; Ompusunggu, A.P. Explainable Data-Driven Method Combined with Bayesian Filtering for Remaining Useful Lifetime Prediction of Aircraft Engines Using NASA CMAPSS Datasets. Machines 2023, 11, 163. [Google Scholar] [CrossRef]
- Sharanya, S.; Venkataraman, R.; Murali, G. Predicting Remaining Useful Life of Turbofan Engines Using Degradation Signal Based Echo State Network. Int. J. Turbo Jet-Engines 2023, 40, s181–s194. [Google Scholar] [CrossRef]
- Liu, Y.; Wang, X. Deep & Attention: A Self-Attention Based Neural Network for Remaining Useful Lifetime Predictions. In Proceedings of the 2021 7th International Conference on Mechatronics and Robotics Engineering (ICMRE), Budapest, Hungary, 3–5 February 2021; pp. 98–105. [Google Scholar] [CrossRef]
- Wang, H.; Zhang, Z.; Li, X.; Deng, X.; Jiang, W. Comprehensive Dynamic Structure Graph Neural Network for Aero-Engine Remaining Useful Life Prediction. IEEE Trans. Instrum. Meas. 2023, 72, 3533816. [Google Scholar] [CrossRef]
- Muneer, A.; Taib, S.M.; Naseer, S.; Ali, R.F.; Aziz, I.A. Data-Driven Deep Learning-Based Attention Mechanism for Remaining Useful Life Prediction: Case Study Application to Turbofan Engine Analysis. Electronics 2021, 10, 2453. [Google Scholar] [CrossRef]
- Wang, H.; Li, D.; Li, D.; Liu, C.; Yang, X.; Zhu, G. Remaining Useful Life Prediction of Aircraft Turbofan Engine Based on Random Forest Feature Selection and Multi-Layer Perceptron. Appl. Sci. 2023, 13, 7186. [Google Scholar] [CrossRef]
- Remadna, I.; Terrissa, L.S.; Ayad, S.; Zerhouni, N. RUL Estimation Enhancement Using Hybrid Deep Learning Methods. Int. J. Progn. Health Manag. 2021, 12. [Google Scholar] [CrossRef]
- Wang, Y.; Wang, Y. A Denoising Semi-Supervised Deep Learning Model for Remaining Useful Life Prediction of Turbofan Engine Degradation. Appl. Intell. 2023, 53, 22682–22699. [Google Scholar] [CrossRef]
- Xiang, S.; Qin, Y.; Luo, J.; Wu, F.; Gryllias, K. A Concise Self-Adapting Deep Learning Network for Machine Remaining Useful Life Prediction. Mech. Syst. Signal Process. 2023, 191, 110187. [Google Scholar] [CrossRef]
- Youness, G.; Aalah, A. An Explainable Artificial Intelligence Approach for Remaining Useful Life Prediction. Aerospace 2023, 10, 474. [Google Scholar] [CrossRef]
- Sharma, R.K. Framework Based on Machine Learning Approach for Prediction of the Remaining Useful Life: A Case Study of an Aviation Engine. J. Fail. Anal. Prev. 2024, 24, 1333–1350. [Google Scholar] [CrossRef]
- Xie, Z.; Du, S.; Lv, J.; Deng, Y.; Jia, S. A Hybrid Prognostics Deep Learning Model for Remaining Useful Life Prediction. Electronics 2021, 10, 39. [Google Scholar] [CrossRef]
- Smirnov, A.N.; Smirnov, S.N. Modeling the Remaining Useful Life of a Gas Turbine Engine Using Neural Networks. In Proceedings of the 2024 6th International Youth Conference on Radio Electronics, Electrical and Power Engineering (REEPE), Moscow, Russia, 29 February–2 March 2024; pp. 1–6. [Google Scholar] [CrossRef]
- Muneer, A.; Taib, S.M.; Fati, S.M.; Alhussian, H. Deep-Learning Based Prognosis Approach for Remaining Useful Life Prediction of Turbofan Engine. Symmetry 2021, 13, 1861. [Google Scholar] [CrossRef]
- Ture, B.A.; Akbulut, A.; Zaim, A.H.; Catal, C. Stacking-Based Ensemble Learning for Remaining Useful Life Estimation. Soft Comput. 2024, 28, 1337–1349. [Google Scholar] [CrossRef]
- Peng, C.; Chen, Y.; Chen, Q.; Tang, Z.; Li, L.; Gui, W. A Remaining Useful Life Prognosis of Turbofan Engine Using Temporal and Spatial Feature Fusion. Sensors 2021, 21, 418. [Google Scholar] [CrossRef]
- Zha, W.; Ye, Y. An Aero-Engine Remaining Useful Life Prediction Model Based on Feature Selection and the Improved TCN. Frankl. Open 2024, 6, 100083. [Google Scholar] [CrossRef]
- Asif, O.; Haider, S.A.; Naqvi, S.R.; Zaki, J.F.W.; Kwak, K.S.; Islam, S.M.R. A Deep Learning Model for Remaining Useful Life Prediction of Aircraft Turbofan Engine on C-MAPSS Dataset. IEEE Access 2022, 10, 95425–95440. [Google Scholar] [CrossRef]
- Boujamza, A.; Lissane Elhaq, S. Attention-Based LSTM for Remaining Useful Life Estimation of Aircraft Engines. IFAC-PapersOnLine 2022, 55, 450–455. [Google Scholar] [CrossRef]
| Turbofan Family Engine A | Turbofan Family Engine B | Turbofan Family Engine C |
---|---|---|---|
No. of engines | several hundred | several hundred | several hundred |
No. of parameters | <190 | <190 | <120 |
Flight envelopes | Taxi-out, Take-off, Climb, Cruise, Taxi-in |
Symbol | Description | Units | Set 1 (10 Vars) | Set 2 (9 Vars) | Set 3 (8 Vars) | Set 4 (7 Vars) | Set 5 (6 Vars) | Set 6 (5 Vars) | Set 7 (4 Vars) | Set 8 (3 Vars) |
---|---|---|---|---|---|---|---|---|---|---|
N1 | Low pressure spool speed | % | x | x | x | x | x | x | x | |
N2 | Intermediate pressure spool speed | % | x | x | x | x | x | x | x | x |
N3 | High pressure spool speed | % | x | x | x | x | x | x | x | x |
P25 | IPC delivery pressure | psi | x | x | x | x | x | |||
P3 | HPC delivery pressure | psi | x | x | x | x | x | x | ||
P50 | Turbine exit pressure | psi | x | x | x | x | ||||
T25 | IPC delivery temperature | °C | x | x | x |||||
T3 | HPC delivery temperature | °C | x | x | x | x | x | x | x | x
FF | Fuel flow rate | lb/h | x | x | ||||||
GWT | Aircraft gross weight | lbs | x |
| Units | Set 1 (10 Vars) | Set 2 (9 Vars) | Set 3 (8 Vars) | Set 4 (7 Vars) | Set 5 (6 Vars) | Set 6 (5 Vars) | Set 7 (4 Vars) | Set 8 (3 Vars) |
---|---|---|---|---|---|---|---|---|---|
mean RMSE | °C | 11.16 | 12.45 | 14.29 | 14.14 | 13.96 | 14.03 | 13.65 | 16.03
median RMSE | °C | 9.03 | 9.50 | 11.23 | 11.25 | 10.71 | 10.7 | 10.00 | 11.96
IQR | °C | 8.36 | 9.52 | 12.47 | 12.45 | 11.71 | 12.07 | 11.99 | 13.01
Q1 | °C | 5.76 | 5.89 | 6.45 | 6.17 | 6.28 | 6.03 | 5.78 | 7.45
Q3 | °C | 14.12 | 15.40 | 18.92 | 18.62 | 17.99 | 18.10 | 17.77 | 20.47
max RMSE | °C | 41.05 | 60.79 | 60.18 | 62.82 | 59.42 | 58.92 | 63.09 | 68.02
min RMSE | °C | 0.74 | 1.29 | 2.14 | 0.86 | 2.16 | 2.166 | 2.43 | 1.67
outlier | % | 6.30 | 6.80 | 4.74 | 4.12 | 5.36 | 5.57 | 5.57 | 6.19 |
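A minimal sketch, assuming the outlier percentage in the table above follows the conventional 1.5×IQR box-plot rule (the paper's exact rule is not stated), of how the per-set summary statistics (mean/median RMSE, IQR, quartiles, extremes, outlier ratio) could be computed from a collection of per-engine RMSE values; the gamma-distributed example values are synthetic.

```python
import numpy as np

def rmse_summary(rmse_values):
    """Summarize per-engine RMSE values as in the variable-set comparison table."""
    r = np.asarray(rmse_values, dtype=float)
    q1, med, q3 = np.percentile(r, [25, 50, 75])
    iqr = q3 - q1
    lo, hi = q1 - 1.5 * iqr, q3 + 1.5 * iqr          # assumed box-plot outlier fences
    outlier_pct = 100.0 * np.mean((r < lo) | (r > hi))
    return {"mean RMSE": r.mean(), "median RMSE": med, "IQR": iqr,
            "Q1": q1, "Q3": q3, "max RMSE": r.max(), "min RMSE": r.min(),
            "outlier %": outlier_pct}

# Example: summarize hypothetical per-engine RMSEs for one predictor-variable set
example = np.random.default_rng(1).gamma(shape=2.0, scale=6.0, size=300)
print(rmse_summary(example))
```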
Approach | Training Dataset | Test Dataset | Hypothesis |
---|---|---|---|
I (one-to-many) | segment 1 only | segments 2 to N | Segment 1 represents the deterioration of a new engine and can therefore serve as generic training data for the rest of the service period
II (many-to-many) | segments 1 to N−1 | segment N | The progressively accumulated data provide the ML model with the full history of deterioration characteristics, enabling more accurate predictions
III (one-to-one) | segment N−1 | segment N | Only the EHM data of the previous service period is assumed to represent the latest condition and hence the current deterioration characteristics
Train–test split | 80% | 20% | Conventional ML training method: 80% of the available data of each engine is used for training and the trained model is tested on the remaining 20%
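The following sketch illustrates one possible way to assemble training and test data for the four strategies in the table above, treating each maintenance interval as one segment; the 80/20 concatenation order, the synthetic segment sizes, and predicting a single target segment at a time (for Approach I the same segment-1 model would simply be reused for every later segment) are assumptions for illustration only.

```python
from typing import Dict, List, Tuple
import numpy as np

def build_splits(segments: List[np.ndarray], n: int) -> Dict[str, Tuple[np.ndarray, np.ndarray]]:
    """Assemble training/test data for predicting segment n (1-indexed, n >= 2)
    under the three segment-based approaches and the conventional 80/20 split."""
    test = segments[n - 1]                                    # segment N is the prediction target
    all_data = np.concatenate(segments[:n])
    cut = int(0.8 * len(all_data))
    return {
        "I (one-to-many)":   (segments[0], test),                      # train on segment 1 only
        "II (many-to-many)": (np.concatenate(segments[:n - 1]), test), # segments 1..N-1
        "III (one-to-one)":  (segments[n - 2], test),                  # previous segment N-1 only
        "train-test split":  (all_data[:cut], all_data[cut:]),         # conventional 80/20 split
    }

# Usage with synthetic per-segment EHM-like data (5 segments, 8 parameters)
rng = np.random.default_rng(2)
segs = [rng.standard_normal((int(rng.integers(200, 400)), 8)) for _ in range(5)]
for name, (tr, te) in build_splits(segs, n=5).items():
    print(f"{name}: train={tr.shape}, test={te.shape}")
```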
Linear Algorithms | | | Nonlinear Algorithms | | |
---|---|---|---|---|---|
Algorithm | Abbr. | Reference | Algorithm | Abbr. | Reference |
Linear | lr | n/a | K-nearest neighbors | knn | [43] |
Lasso | lasso | [44] | Decision tree | cart | [45] |
Ridge | ridge | [46] | Extra tree | et | [47] |
Elastic net | en | [48] | Support vector | svmr | [49] |
Huber | huber | [50] | Adaboost | ada | [51] |
Lasso lars | lassol | [52] | Bagging | bag | [53] |
Passive aggressive | pa | [54] | Random forest | rf | [55] |
Stochastic gradient descent | sgd | [56] | Gradient boosting | gbm | [57]
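One possible mapping from the abbreviations in the table above to scikit-learn estimators is sketched below, together with a small helper that reports each model's test RMSE; default hyperparameters, the interpretation of "et" as a single extra tree versus "extra" as the extra-trees ensemble, and the synthetic usage data are assumptions for illustration, not the settings tuned in the study.

```python
import numpy as np
from sklearn.linear_model import (LinearRegression, Lasso, Ridge, ElasticNet,
                                  HuberRegressor, LassoLars,
                                  PassiveAggressiveRegressor, SGDRegressor)
from sklearn.neighbors import KNeighborsRegressor
from sklearn.tree import DecisionTreeRegressor, ExtraTreeRegressor
from sklearn.svm import SVR
from sklearn.ensemble import (AdaBoostRegressor, BaggingRegressor, ExtraTreesRegressor,
                              RandomForestRegressor, GradientBoostingRegressor)
from sklearn.metrics import mean_squared_error

MODELS = {
    # linear algorithms
    "lr": LinearRegression(), "lasso": Lasso(), "ridge": Ridge(),
    "en": ElasticNet(), "huber": HuberRegressor(), "lassol": LassoLars(),
    "pa": PassiveAggressiveRegressor(), "sgd": SGDRegressor(),
    # nonlinear algorithms
    "knn": KNeighborsRegressor(), "cart": DecisionTreeRegressor(),
    "et": ExtraTreeRegressor(), "extra": ExtraTreesRegressor(),
    "svmr": SVR(), "ada": AdaBoostRegressor(), "bag": BaggingRegressor(),
    "rf": RandomForestRegressor(), "gbm": GradientBoostingRegressor(),
}

def evaluate(X_tr, y_tr, X_te, y_te):
    """Fit every candidate model and return its test RMSE."""
    scores = {}
    for name, model in MODELS.items():
        pred = model.fit(X_tr, y_tr).predict(X_te)
        scores[name] = float(np.sqrt(mean_squared_error(y_te, pred)))
    return scores

# Example usage with synthetic data standing in for the EHM predictor variables
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 7))
y = X @ rng.standard_normal(7) + 0.1 * rng.standard_normal(500)
print(evaluate(X[:400], y[:400], X[400:], y[400:]))
```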
Hyperparameter | Parameter Set |
---|---|
No. of layers | 1 |
Units | 10 |
Epochs | 100, 200, 500, 1000 |
Batch size | 32, 64, 128 |
Learning rate | 0.001 |
Optimizer | Adam |
Loss function | mse |
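A minimal Keras sketch of a single-layer LSTM regressor consistent with the hyperparameters listed above (1 layer, 10 units, Adam optimizer, learning rate 0.001, MSE loss); the input window length, the seven-feature input (as in predictor Set 4), and the final Dense output layer are assumptions, and the demo simply uses the smallest epoch and batch-size values from the search grid.

```python
import numpy as np
import tensorflow as tf

def build_lstm(n_features: int, window: int = 30, units: int = 10,
               learning_rate: float = 1e-3) -> tf.keras.Model:
    """Single-layer LSTM regressor matching the tabulated hyperparameters."""
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(window, n_features)),  # assumed sliding-window input
        tf.keras.layers.LSTM(units),                 # 1 layer, 10 units
        tf.keras.layers.Dense(1),                    # predicted TGT (deg C)
    ])
    model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=learning_rate),
                  loss="mse")                        # loss function: mse
    return model

# Example fit on synthetic windows; epochs/batch_size taken from the search grid
X = np.random.randn(256, 30, 7).astype("float32")
y = np.random.randn(256, 1).astype("float32")
model = build_lstm(n_features=7)
model.fit(X, y, epochs=100, batch_size=32, verbose=0)
```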
Linear | Nonlinear | LSTM | |||||||||||||||||
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
Approach | lr | lasso | lassol | huber | en | pa | ridge | sgd | ada | bag | cart | et | extra | gbm | knn | rf | svmr | - | |
I | Mean | 3.11 | 6.27 | 6.26 | 3.02 | 3.71 | 3.91 | 3.02 | 3.27 | 6.21 | 4.47 | 5.53 | 4.15 | 5.73 | 4.28 | 5.87 | 4.47 | 3.17 | 4.14 |
Median | 2.44 | 5.60 | 5.64 | 2.44 | 3.31 | 3.04 | 2.41 | 2.82 | 4.67 | 3.47 | 4.43 | 3.24 | 4.56 | 3.38 | 4.48 | 3.45 | 2.52 | 2.86 | |
Max. | 39.03 | 30.45 | 30.45 | 33.18 | 30.43 | 81.03 | 34.10 | 30.49 | 36.44 | 30.57 | 30.57 | 30.51 | 30.95 | 30.47 | 31.32 | 30.55 | 36.28 | 30.52 | |
Min. | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.03 | 0.00 | 0.00 | 0.94 | 0.25 | 0.07 | 0.06 | 0.13 | 0.16 | 0.29 | 0.04 | 0.04 | 0.00 | |
IQR | 1.02 | 3.68 | 3.60 | 0.92 | 1.00 | 1.64 | 0.96 | 0.99 | 2.37 | 1.76 | 2.12 | 1.48 | 2.19 | 1.51 | 3.36 | 1.82 | 1.03 | 1.66 | |
Outlier | 8.58 | 1.76 | 1.94 | 9.58 | 7.22 | 9.13 | 7.66 | 10.07 | 12.10 | 11.69 | 12.58 | 13.69 | 10.20 | 12.45 | 8.62 | 12.06 | 10.47 | 12.78 | |
II | Mean | 3.11 | 6.19 | 6.22 | 3.07 | 3.70 | 4.11 | 3.06 | 3.17 | 6.21 | 4.06 | 5.29 | 3.83 | 5.31 | 3.94 | 5.13 | 4.07 | 3.33 | 3.88 |
Median | 2.49 | 5.75 | 5.81 | 2.48 | 3.35 | 3.28 | 2.46 | 2.78 | 4.49 | 3.31 | 4.27 | 3.16 | 4.39 | 3.29 | 4.09 | 3.33 | 2.56 | 2.92 | |
Max. | 39.03 | 30.45 | 30.45 | 33.18 | 30.43 | 48.32 | 34.10 | 30.50 | 83.64 | 30.57 | 49.94 | 30.50 | 33.22 | 30.46 | 31.32 | 30.56 | 30.41 | 32.79 | |
Min. | 0.00 | 0.00 | 0.00 | 0.01 | 0.00 | 0.06 | 0.00 | 0.01 | 0.78 | 0.18 | 0.07 | 0.37 | 0.14 | 0.55 | 0.96 | 0.33 | 0.13 | 0.17 | |
IQR | 1.15 | 3.04 | 3.13 | 1.12 | 1.04 | 1.95 | 1.04 | 0.88 | 1.92 | 1.53 | 2.07 | 1.37 | 1.85 | 1.37 | 2.29 | 1.52 | 1.20 | 1.60 | |
Outlier | 8.27 | 2.40 | 2.17 | 9.35 | 7.08 | 7.42 | 8.41 | 9.63 | 15.97 | 8.75 | 10.04 | 10.09 | 10.60 | 8.67 | 8.78 | 9.17 | 12.47 | 10.52 | |
III | Mean | 3.23 | 6.32 | 6.32 | 3.13 | 3.75 | 4.25 | 3.09 | 3.27 | 6.31 | 4.33 | 5.35 | 4.03 | 5.44 | 4.16 | 5.52 | 4.31 | 3.21 | 4.05 |
Median | 2.41 | 5.82 | 5.82 | 2.41 | 3.33 | 2.96 | 2.39 | 2.77 | 4.57 | 3.37 | 4.24 | 3.17 | 4.34 | 3.30 | 4.22 | 3.32 | 2.51 | 2.83 | |
Max. | 87.64 | 30.45 | 30.45 | 83.76 | 30.43 | 178.74 | 51.99 | 30.50 | 78.93 | 40.75 | 40.75 | 40.75 | 40.75 | 40.75 | 40.75 | 40.75 | 30.41 | 58.55 | |
Min. | 0.00 | 0.00 | 0.00 | 0.02 | 0.00 | 0.01 | 0.00 | 0.00 | 0.20 | 0.14 | 0.12 | 0.01 | 0.26 | 0.29 | 0.19 | 0.09 | 0.00 | 0.05 | |
IQR | 0.96 | 3.37 | 3.35 | 0.93 | 1.04 | 1.47 | 0.94 | 0.84 | 2.07 | 1.64 | 1.81 | 1.39 | 1.83 | 1.41 | 2.56 | 1.63 | 1.03 | 1.52 | |
Outlier | 8.90 | 2.21 | 2.16 | 9.51 | 7.22 | 10.07 | 7.71 | 11.29 | 13.63 | 11.09 | 11.76 | 12.01 | 11.60 | 10.99 | 9.72 | 10.85 | 12.16 | 11.96 | |
Train–test | Mean | 3.11 | 6.59 | 6.61 | 3.13 | 3.97 | 4.42 | 3.12 | 3.39 | 6.12 | 4.09 | 5.36 | 3.92 | 5.30 | 3.95 | 5.03 | 4.11 | 3.32 | 3.39 |
Median | 2.49 | 6.11 | 6.10 | 2.48 | 3.45 | 3.30 | 2.49 | 2.80 | 4.48 | 3.39 | 4.32 | 3.22 | 4.45 | 3.33 | 4.20 | 3.40 | 2.60 | 2.72 | |
Max. | 39.64 | 39.51 | 39.51 | 39.74 | 39.78 | 39.96 | 39.77 | 39.99 | 52.27 | 40.10 | 47.57 | 40.23 | 40.80 | 40.14 | 41.67 | 40.04 | 39.83 | 40.19 | |
Min. | 0.89 | 1.99 | 1.98 | 0.90 | 1.49 | 0.97 | 0.85 | 1.19 | 1.89 | 1.60 | 2.48 | 1.53 | 2.58 | 1.52 | 2.32 | 1.60 | 0.89 | 1.00 | |
IQR | 1.28 | 3.26 | 3.28 | 1.27 | 1.07 | 2.25 | 1.28 | 0.93 | 1.96 | 1.38 | 1.56 | 1.32 | 1.47 | 1.17 | 1.66 | 1.40 | 1.31 | 1.31 | |
Outlier | 6.28 | 3.49 | 3.54 | 6.94 | 7.67 | 7.90 | 6.42 | 8.71 | 11.35 | 7.48 | 9.27 | 7.82 | 7.42 | 7.91 | 9.54 | 7.58 | 7.58 | 7.05 |
| | Approach I | Approach II | Approach III | Train–Test Split | Large Model |
---|---|---|---|---|---|---|
Turbofan family engine A | Mean | 4.14 | 3.88 | 4.05 | 3.39 | 3.44 |
Median | 2.86 | 2.92 | 2.83 | 2.72 | 2.94 | |
Max. | 30.52 | 32.78 | 58.55 | 40.19 | 30.48 | |
Min. | 0.00 | 0.17 | 0.05 | 1.00 | 0.66 | |
IQR | 1.66 | 1.60 | 1.52 | 1.31 | 0.94 | |
Outlier | 12.78 | 10.52 | 11.96 | 7.05 | 9.47 | |
Turbofan family engine B | Mean | 5.50 | 4.86 | 4.63 | 4.41 | 3.38 |
Median | 3.60 | 2.90 | 2.79 | 2.50 | 3.28 | |
Max. | 24.10 | 24.24 | 23.45 | 106.60 | 10.72 | |
Min. | 0.11 | 0.03 | 0.02 | 1.58 | 2.11 | |
IQR | 4.50 | 3.37 | 3.25 | 1.48 | 0.47 | |
Outlier | 7.26 | 9.83 | 9.83 | 10.71 | 5.56 | |
Turbofan family engine C | Mean | 4.45 | 4.90 | 4.77 | 4.55 | 5.48 |
Median | 3.54 | 3.88 | 3.19 | 4.00 | 4.88 | |
Max. | 30.06 | 51.60 | 58.57 | 19.36 | 23.92 | |
Min. | 0.11 | 0.12 | 0.43 | 1.23 | 1.72 | |
IQR | 2.39 | 2.33 | 2.26 | 2.94 | 1.64 | |
Outlier | 7.05 | 5.63 | 10.39 | 5.65 | 8.19 |
| Dataset | Approach I | Approach II | Approach III | Train–Test Split | Large Model |
---|---|---|---|---|---|---|
Turbofan family engine A | Training | 0.65 | 0.95 | 0.70 | 1.00 | 277.40 |
Test | 0.52 | 0.52 | 0.52 | 0.28 | 0.52 | |
Turbofan family engine B | Training | 0.81 | 1.08 | 0.68 | 1.00 | 116.68 |
Test | 0.44 | 0.44 | 0.44 | 0.28 | 0.44 | |
Turbofan family engine C | Training | 0.35 | 0.78 | 0.45 | 1.00 | 44.25 |
Test | 0.52 | 0.52 | 0.52 | 0.28 | 0.52 |
| CWC | MAE | NMPIW | PICP | MPIW | StdAE | StdPIW |
---|---|---|---|---|---|---|---|
Turbofan family engine A | 30.99 | 2.22 | 12.19 | 99.66 | 30.99 | 1.87 | 0.00 |
Turbofan family engine B | 39.35 | 2.54 | 5.56 | 92.52 | 12.07 | 2.07 | 0.00 |
Turbofan family engine C | 29.19 | 3.99 | 10.45 | 98.53 | 29.19 | 2.86 | 0.00 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content. |
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Jung, J.-S.; Son, C.; Rimell, A.; Clarkson, R.J. A Framework for an ML-Based Predictive Turbofan Engine Health Model. Aerospace 2025, 12, 725. https://doi.org/10.3390/aerospace12080725