Data-Driven Deep Learning-Based Attention Mechanism for Remaining Useful Life Prediction: Case Study Application to Turbofan Engine Analysis
Abstract
1. Introduction
2. Background and Related Work
3. Development of Data-Driven Model
3.1. Candidate Model Training and Optimization
3.1.1. Recurrent Neural Networks
- Generic LSTM units use no dedicated significance gate in their computations.
- Instead of a single update gate, LSTM units use two distinct gates, namely an output gate and a forget gate. The output gate controls how visible the memory cell's content is when computing the LSTM unit's activation, which is passed to the other hidden units in the network. To complement this, the forget gate controls the extent to which the memory cell is overwritten, that is, how much of the stored information must be discarded for the memory cell to function properly.
- LSTM differs from GRU architectures in that the memory cell's contents need not be equal to the unit's activation at time t.
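The last point above can be made concrete with a minimal NumPy sketch of one LSTM step and one GRU step. This is a generic textbook formulation, not the paper's implementation; the layer sizes and random weights are purely illustrative. It shows that the LSTM's memory cell c generally differs from its exposed activation h = o ⊙ tanh(c), whereas the GRU's hidden state is itself the memory.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
n_in, n_h = 4, 3  # illustrative sizes, not the paper's

# One LSTM step (standard formulation; zero initial states, random weights).
x, h_prev, c_prev = rng.normal(size=n_in), np.zeros(n_h), np.zeros(n_h)
W = {g: rng.normal(size=(n_h, n_in + n_h)) for g in "fioc"}
z = np.concatenate([x, h_prev])
f = sigmoid(W["f"] @ z)                  # forget gate: how much of c_prev to keep
i = sigmoid(W["i"] @ z)                  # input gate: how much new content to write
o = sigmoid(W["o"] @ z)                  # output gate: visibility of the cell content
c = f * c_prev + i * np.tanh(W["c"] @ z) # memory cell contents
h = o * np.tanh(c)                       # activation exposed to other hidden units

# In the LSTM, memory cell contents and the activation generally differ:
assert not np.allclose(h, c)

# One GRU step: the hidden state itself is the memory (no separate cell).
Wz, Wr, Wh = (rng.normal(size=(n_h, n_in + n_h)) for _ in range(3))
zg = sigmoid(Wz @ np.concatenate([x, h_prev]))           # update gate
r = sigmoid(Wr @ np.concatenate([x, h_prev]))            # reset gate
h_tilde = np.tanh(Wh @ np.concatenate([x, r * h_prev]))  # candidate state
h_gru = (1 - zg) * h_prev + zg * h_tilde                 # new hidden state = memory
```

Note how the GRU interpolates between the previous state and the candidate with a single update gate, while the LSTM splits that role between the forget and output gates.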
3.1.2. Convolutional Neural Networks
3.2. Prognostic Procedure
3.3. Data Pre-Processing and Normalization
3.3.1. Sensor Data Selection
3.3.2. Data Normalization
3.4. Samples Preparation Using Sliding Time Window
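The sample preparation named by this subsection can be sketched generically: a multivariate sensor sequence is cut into fixed-length overlapping windows that serve as model inputs. This is a standard sliding-time-window routine under assumed shapes, not the paper's exact preprocessing code; the window length and stride here are arbitrary.

```python
import numpy as np

def sliding_windows(series, window, stride=1):
    """Cut a multivariate time series of shape (T, n_features) into
    overlapping samples of shape (window, n_features)."""
    return np.stack([series[i:i + window]
                     for i in range(0, len(series) - window + 1, stride)])

# Toy example: 10 cycles of 3 sensor channels, window of 5 cycles.
data = np.arange(30, dtype=float).reshape(10, 3)
samples = sliding_windows(data, window=5)
assert samples.shape == (6, 5, 3)  # 10 - 5 + 1 = 6 samples
```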
4. Experimental Results and Discussion
4.1. C-MAPSS Benchmark Dataset
4.2. Experimental Results and Candidate Models Performance Analysis
4.3. Model Uncertainty Quantification
5. Comparison with Literature
6. Limitations and Future Research
7. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
- Wang, Y.; Zhao, Y.; Addepalli, S. Remaining useful life prediction using deep learning approaches: A review. Procedia Manuf. 2020, 49, 81–88. [Google Scholar] [CrossRef]
- Chen, X.; Wang, S.; Qiao, B.; Chen, Q. Basic research on machinery fault diagnostics: Past, present, and future trends. Front. Mech. Eng. 2018, 13, 264–291. [Google Scholar] [CrossRef] [Green Version]
- Wei, Y.; Li, Y.; Xu, M.; Huang, W. A review of early fault diagnosis approaches and their applications in rotating machinery. Entropy 2019, 21, 409. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Lu, B.L.; Liu, Z.H.; Wei, H.L.; Chen, L.; Zhang, H.; Li, X.H. A Deep Adversarial Learning Prognostics Model for Remaining Useful Life Prediction of Rolling Bearing. IEEE Trans. Artif. Intell. 2021, 1. [Google Scholar] [CrossRef]
- Zeng, D.; Zhou, D.; Tan, C.; Jiang, B. Research on model-based fault diagnosis for a gas turbine based on transient performance. Appl. Sci. 2018, 8, 148. [Google Scholar] [CrossRef] [Green Version]
- Chui, K.T.; Gupta, B.B.; Vasant, P. A Genetic Algorithm Optimized RNN-LSTM Model for Remaining Useful Life Prediction of Turbofan Engine. Electronics 2021, 10, 285. [Google Scholar] [CrossRef]
- Souza, R.M.; Nascimento, E.G.; Miranda, U.A.; Silva, W.J.; Lepikson, H.A. Deep learning for diagnosis and classification of faults in industrial rotating machinery. Comput. Ind. Eng. 2021, 153, 107060. [Google Scholar] [CrossRef]
- Ahmed, U.; Ali, F.; Jennions, I. A review of aircraft auxiliary power unit faults, diagnostics and acoustic measurements. Prog. Aerosp. Sci. 2021, 124, 100721. [Google Scholar] [CrossRef]
- de Jonge, B.; Teunter, R.; Tinga, T. The influence of practical factors on the benefits of condition-based maintenance over time-based maintenance. Reliab. Eng. Syst. Saf. 2017, 158, 21–30. [Google Scholar] [CrossRef]
- Seiti, H.; Tagipour, R.; Hafezalkotob, A.; Asgari, F. Maintenance strategy selection with risky evaluations using RAHP. J. Multi-Criteria Decis. Anal. 2017, 24, 257–274. [Google Scholar] [CrossRef]
- Ozcan, S.; Simsir, F. A new model based on Artificial Bee Colony algorithm for preventive maintenance with replacement scheduling in continuous production lines. Eng. Sci. Technol. Int. J. 2019, 22, 1175–1186. [Google Scholar] [CrossRef]
- Xie, Z.; Du, S.; Lv, J.; Deng, Y.; Jia, S. A Hybrid Prognostics Deep Learning Model for Remaining Useful Life Prediction. Electronics 2020, 10, 39. [Google Scholar] [CrossRef]
- Zhao, Y.; Wang, Y. Remaining Useful Life Prediction for Multi-sensor Systems Using a Novel End-to-end Deep-Learning Method. Measurement 2021, 182, 109685. [Google Scholar] [CrossRef]
- Hong, C.W.; Lee, C.; Lee, K.; Ko, M.S.; Kim, D.E.; Hur, K. Remaining Useful Life Prognosis for Turbofan Engine Using Explainable Deep Neural Networks with Dimensionality Reduction. Sensors 2020, 20, 6626. [Google Scholar] [CrossRef] [PubMed]
- Zheng, S.; Ristovski, K.; Farahat, A.; Gupta, C. Long Short-Term Memory Network for Remaining Useful Life Estimation. In Proceedings of the 2017 IEEE International Conference on Prognostics and Health Management (ICPHM), Ottawa, Canada, 19–21 June 2017; pp. 88–95. [Google Scholar]
- Ren, L.; Liu, Y.; Wang, X.; Lü, J.; Deen, M.J. Cloud-edge based lightweight temporal convolutional networks for remaining useful life prediction in iiot. IEEE Internet Things J. 2020, 8, 12578–12587. [Google Scholar] [CrossRef]
- Malhotra, P.; Vishnu, T.V.; Ramakrishnan, A.; Anand, G.; Vig, L.; Agarwal, P.; Shroff, G. Multisensor prognostics using an unsupervised health index based on LSTM encoder-decoder. arXiv 2016, arXiv:1608.06154. [Google Scholar]
- Cornelius, J.; Brockner, B.; Hong, S.H.; Wang, Y.; Pant, K.; Ball, J. Estimating and Leveraging Uncertainties in Deep Learning for Remaining Useful Life Prediction in Mechanical Systems. In Proceedings of the 2020 IEEE International Conference on Prognostics and Health Management (ICPHM), Detroit, MI, USA, 8–10 June 2020; pp. 1–8. [Google Scholar]
- Ramadhan, M.S.; Hassan, K.A. Remaining useful life prediction using an integrated Laplacian-LSTM network on machinery components. Appl. Soft Comput. 2021, 112, 107817. [Google Scholar]
- Ahn, G.; Yun, H.; Hur, S.; Lim, S. A Time-Series Data Generation Method to Predict Remaining Useful Life. Processes 2021, 9, 1115. [Google Scholar] [CrossRef]
- Zhang, C.; Lim, P.; Qin, K.; Tan, K.C. Multiobjective Deep Belief Networks Ensemble for Remaining Useful Life Estimation in Prognostics. IEEE Trans. Neural Netw. Learn. Syst. 2016, 28, 2306–2318. [Google Scholar] [CrossRef]
- Aggarwal, K.; Atan, O.; Farahat, A.K.; Zhang, C.; Ristovski, K.; Gupta, C. Two Birds with One Network: Unifying Failure Event Prediction and Time-to-Failure Modeling. In Proceedings of the 2018 IEEE International Conference on Big Data (Big Data), Seattle, WA, USA, 10–13 December 2018; pp. 1308–1317. [Google Scholar]
- Li, X.; Ding, Q.; Sun, J.Q. Remaining useful life estimation in prognostics using deep convolution neural networks. Reliab. Eng. Syst. Saf. 2018, 172, 1–11. [Google Scholar] [CrossRef] [Green Version]
- Gao, G.; Que, Z.; Xu, Z. Predicting Remaining Useful Life with Uncertainty Using Recurrent Neural Process. In Proceedings of the 2020 IEEE 20th International Conference on Software Quality, Reliability and Security Companion (QRS-C), Macau, China, 11–14 December 2020; pp. 291–296. [Google Scholar]
- Sankararaman, S.; Goebel, K. Why is the Remaining Useful Life Prediction Uncertain? In Proceedings of the Annual Conference of the PHM Society, New Orleans, LA, USA, 14–17 October 2013; Volume 5, p. 1. [Google Scholar]
- Chen, N.; Tsui, K.L. Condition monitoring and remaining useful life prediction using degradation signals: Revisited. IIE Trans. 2013, 45, 939–952. [Google Scholar] [CrossRef]
- Zhai, Q.; Ye, Z.-S. RUL Prediction of Deteriorating Products Using an Adaptive Wiener Process Model. IEEE Trans. Ind. Inform. 2017, 13, 2911–2921. [Google Scholar] [CrossRef]
- Zhu, J.; Chen, N.; Peng, W. Estimation of Bearing Remaining Useful Life Based on Multiscale Convolutional Neural Network. IEEE Trans. Ind. Electron. 2018, 66, 3208–3216. [Google Scholar] [CrossRef]
- Rui, K.; Wenjun, G.; Chen, Y. Model-driven degradation modeling approaches: Investigation and review. Chin. J. Aeronaut. 2020, 33, 1137–1153. [Google Scholar]
- Park, P.; Jung, M.; Di Marco, P. Remaining Useful Life Estimation of Bearings Using Data-Driven Ridge Regression. Appl. Sci. 2020, 10, 8977. [Google Scholar] [CrossRef]
- Lei, Y. Intelligent Fault Diagnosis and Remaining Useful Life Prediction of Rotating Machinery; Butterworth-Heinemann: Oxford, UK, 2016. [Google Scholar]
- Chen, Y.; Peng, G.; Zhu, Z.; Li, S. A novel deep learning method based on attention mechanism for bearing remaining useful life prediction. Appl. Soft Comput. 2019, 86, 105919. [Google Scholar] [CrossRef]
- Ingemarsdotter, E.; Kambanou, M.L.; Jamsin, E.; Sakao, T.; Balkenende, R. Challenges and solutions in condition-based maintenance implementation-A multiple case study. J. Clean. Prod. 2021, 296, 126420. [Google Scholar] [CrossRef]
- Kaparthi, S.; Bumblauskas, D. Designing predictive maintenance systems using decision tree-based machine learning techniques. Int. J. Qual. Reliab. Manag. 2020, 37, 659–686. [Google Scholar] [CrossRef]
- Wen, L.; Dong, Y.; Gao, L. A new ensemble residual convolutional neural network for remaining useful life estimation. Math. Biosci. Eng. 2019, 16, 862–880. [Google Scholar] [CrossRef] [PubMed]
- Babu, G.S.; Zhao, P.; Li, X.L. Deep convolutional neural network based regression approach for estimation of remaining useful life. In International Conference on Database Systems for Advanced Applications; Springer: Cham, Switzerland, 2016; pp. 214–228. [Google Scholar]
- Wen, L.; Li, X.; Gao, L.; Zhang, Y. A new convolutional neural network-based data-driven fault diagnosis method. IEEE Trans. Ind. Electron. 2018, 65, 5990–5998. [Google Scholar] [CrossRef]
- Li, J.; Li, X.; He, D. A directed acyclic graph network combined with CNN and LSTM for remaining useful life prediction. IEEE Access 2019, 7, 75464–75475. [Google Scholar] [CrossRef]
- Huang, R.; Xi, L.; Li, X.; Liu, C.R.; Qiu, H.; Lee, J. Residual life predictions for ball bearings based on self-organizing map and back propagation neural network methods. Mech. Syst. Signal Process. 2007, 21, 193–207. [Google Scholar] [CrossRef]
- Muneer, A.; Taib, S.M.; Fati, S.M.; Alhussian, H. Deep-Learning Based Prognosis Approach for Remaining Useful Life Prediction of Turbofan Engine. Symmetry 2021, 13, 1861. [Google Scholar] [CrossRef]
- Khawaja, T.; Vachtsevanos, G.; Wu, B. Reasoning about Uncertainty in Prognosis: A Confidence Prediction Neural Network Approach. In Proceedings of the Annual Meeting of the North American Fuzzy Information Processing Society, Redmond, WA, USA, 20–22 August 2020; pp. 7–12. [Google Scholar]
- Malhi, A.; Yan, R.; Gao, R.X. Prognosis of Defect Propagation Based on Recurrent Neural Networks. IEEE Trans. Instrum. Meas. 2011, 60, 703–711. [Google Scholar] [CrossRef]
- Yuan, M.; Wu, Y.; Lin, L. Fault Diagnosis and Remaining Useful Life Estimation of Aero Engine Using LSTM Neural Network. In Proceedings of the 2016 IEEE International Conference on Aircraft Utility Systems (AUS), Beijing, China, 10–12 October 2016; pp. 135–140. [Google Scholar]
- Zhao, R.; Wang, J.; Yan, R.; Mao, K. Machine Health Monitoring with LSTM Networks. In Proceedings of the 2016 10th International Conference on Sensing Technology (ICST), Nanjing, China, 11–13 November 2016; pp. 1–6. [Google Scholar]
- Ren, L.; Sun, Y.; Wang, H.; Zhang, L. Prediction of bearing remaining useful life with deep convolution neural network. IEEE Access 2018, 6, 13041–13049. [Google Scholar] [CrossRef]
- Song, Y.; Xia, T.; Zheng, Y.; Zhuo, P.; Pan, E. Remaining useful life prediction of turbofan engine based on Autoencoder-BLSTM. Comput. Integr. Manuf. Syst. 2019, 25, 1611–1619. [Google Scholar]
- Zhang, X.; Xiao, P.; Yang, Y.; Cheng, Y.; Chen, B.; Gao, D.; Liu, W.; Huang, Z. Remaining Useful Life Estimation Using CNN-XGB With Extended Time Window. IEEE Access 2019, 7, 154386–154397. [Google Scholar] [CrossRef]
- Ji, S.; Han, X.; Hou, Y.; Song, Y.; Du, Q. Remaining Useful Life Prediction of Airplane Engine Based on PCA–BLSTM. Sensors 2020, 20, 4537. [Google Scholar] [CrossRef]
- Wang, R.; Shi, R.; Hu, X.; Shen, C. Remaining Useful Life Prediction of Rolling Bearings Based on Multiscale Convolutional Neural Network with Integrated Dilated Convolution Blocks. Shock Vib. 2021, 1–11. [Google Scholar] [CrossRef]
- Bergstra, J.; Bengio, Y. Random search for hyper-parameter optimization. J. Mach. Learn. Res. 2012, 13, 281–305. [Google Scholar]
- Alqushaibi, A.; Abdulkadir, S.J.; Rais, H.M.; Al-Tashi, Q.; Ragab, M.G.; Alhussian, H. Enhanced Weight-Optimized Recurrent Neural Networks Based on Sine Cosine Algorithm for Wave Height Prediction. J. Mar. Sci. Eng. 2021, 9, 524. [Google Scholar] [CrossRef]
- Cho, K.; Van Merriënboer, B.; Bahdanau, D.; Bengio, Y. On the properties of neural machine translation: Encoder-decoder approaches. arXiv 2014, arXiv:1409.1259. [Google Scholar]
- Hochreiter, S.; Schmidhuber, J. Long short-term memory. Neural Comput. 1997, 9, 1735–1780. [Google Scholar] [CrossRef] [PubMed]
- Naseer, S.; Ali, R.F.; Muneer, A.; Fati, S.M. IAmideV-deep: Valine amidation site prediction in proteins using deep learning and pseudo amino acid compositions. Symmetry 2021, 13, 560. [Google Scholar] [CrossRef]
- Muneer, A.; Fati, S.M. Efficient and Automated Herbs Classification Approach Based on Shape and Texture Features using Deep Learning. IEEE Access 2020, 8, 196747–196764. [Google Scholar] [CrossRef]
- Durairajah, V.; Gobee, S.; Muneer, A. Automatic Vision Based Classification System Using DNN and SVM Classifiers. In Proceedings of the 2018 3rd International Conference on Control, Robotics and Cybernetics (CRC), Penang, Malaysia, 26–28 September 2018; pp. 6–14. [Google Scholar]
- Naseer, S.; Ali, R.F.; Fati, S.M.; Muneer, A. iNitroY-Deep: Computational Identification of Nitrotyrosine Sites to Supplement Carcinogenesis Studies Using Deep Learning. IEEE Access 2021, 9, 73624–73640. [Google Scholar] [CrossRef]
- Naseer, S.; Saleem, Y. Enhanced Network Intrusion Detection using Deep Convolutional Neural Networks. Trans. Internet Inf. Syst. 2018, 12, 5159–5178. [Google Scholar]
- Yang, B.; Liu, R.; Zio, E. Remaining Useful Life Prediction Based on a Double-Convolutional Neural Network Architecture. IEEE Trans. Ind. Electron. 2019, 66, 9521–9530. [Google Scholar] [CrossRef]
- Hou, G.; Xu, S.; Zhou, N.; Yang, L.; Fu, Q. Remaining useful life estimation using deep convolutional generative adversarial networks based on an autoencoder scheme. Comput. Intell. Neurosci. 2020. [Google Scholar] [CrossRef] [PubMed]
- Xiang, S.; Qin, Y.; Luo, J.; Pu, H.; Tang, B. Multicellular LSTM-based deep learning model for aero-engine remaining useful life prediction. Reliab. Eng. Syst. Saf. 2021, 216, 107927. [Google Scholar] [CrossRef]
- Saxena, A.; Goebel, K.; Simon, D.; Eklund, N. Damage Propagation Modeling for Aircraft Engine Run-to-Failure Simulation. In Proceedings of the 2008 International Conference on Prognostics and Health Management, Denver, CO, USA, 6–9 October 2008; pp. 1–9. [Google Scholar]
- Osband, I.; Blundell, C.; Pritzel, A.; Van Roy, B. Deep exploration via bootstrapped DQN. Adv. Neural Inf. Process. Syst. 2016, 29, 4026–4034. [Google Scholar]
- Ragab, M.; Chen, Z.; Wu, M.; Kwoh, C.K.; Yan, R.; Li, X. Attention Sequence to Sequence Model for Machine Remaining Useful Life Prediction. arXiv 2020, arXiv:2007.09868. [Google Scholar]
- Chen, Z.; Wu, M.; Zhao, R.; Guretno, F.; Yan, R.; Li, X. Machine Remaining Useful Life Prediction via an Attention-Based Deep Learning Approach. IEEE Trans. Ind. Electron. 2020, 68, 2521–2531. [Google Scholar] [CrossRef]
- Zeng, F.; Li, Y.; Jiang, Y.; Song, G. A deep attention residual neural network-based remaining useful life prediction of machinery. Measurement 2021, 181, 109642. [Google Scholar] [CrossRef]
Authors | Year | Technique used | Benchmark Dataset | Results achieved | Limitations/Gaps |
---|---|---|---|---|---|
Babu et al. [36] | 2016 | Deep CNN | C-MAPSS | RMSE for FD002 (30.29) RMSE for FD003 (19.81) | This study does not consider the uncertainty prediction associated with RUL. |
Zhang et al. [21] | 2016 | Multi-objective deep belief networks ensemble | C-MAPSS | RMSE for FD002 (25.05) RMSE for FD003 (12.51) | This work does not consider the uncertainty prediction inherent in their DL model, which makes it impractical in practice.
Malhotra et al. [17] | 2020 | LSTM-ED | C-MAPSS | The authors did not report the RMSE value for each data subset. | The model could only capture the fault; it was not able to estimate the RUL.
Li, Ding, and Sun [22] | 2018 | DCNN | C-MAPSS | RMSE for FD002 (22.36) RMSE for FD003 (12.64) | This work does not consider the uncertainty prediction inherent in their DCNN model, and further architecture optimization is still needed, since the current training time is longer than that of most shallow networks in the literature.
Zheng et al. [15] | 2017 | Deep LSTM | C-MAPSS | RMSE for FD002 (24.49) RMSE for FD003 (16.18) | Uncertainty of the LSTM model was not examined.
Song et al. [46] | 2019 | Autoencoder-BLSTM | C-MAPSS | The model was only tested on the simplest data subset, FD001, where the achieved RMSE was 13.63. | High computational load, and uncertainty was not quantified.
Wen et al. [37] | 2018 | ResCNN | C-MAPSS | The authors did not report the RMSE value for each data subset. | The imbalance of signal data is ignored, and the parameter-tuning process of the ensemble ResCNN is very time-consuming. Moreover, uncertainty was not predicted, which makes this method impractical.
Li, Li, and He [38] | 2019 | CNN combined with LSTM | C-MAPSS | RMSE for FD002 (20.34) RMSE for FD003 (12.46) | The proposed DAG network suffers from slow training (reported as 138.17 s), and uncertainty was not quantified.
Muneer et al. [40] | 2021 | Attention-based DCNN | C-MAPSS | RMSE for FD002 (18.34) RMSE for FD003 (13.08) | RUL prediction with the uncertainties associated with their DL model was not addressed.
Zhang et al. [47] | 2019 | CNN-XGB | C-MAPSS | RMSE for FD002 (19.61) RMSE for FD003 (13.01) | The main drawback is computational speed: prediction takes around 621.7 s. The method also does not consider the uncertainty associated with RUL prediction.
Ji et al. [48] | 2020 | PCA–BLSTM | C-MAPSS | The model was only tested on a simpler data subset, FD003, where the achieved RMSE was 11.1. | Long training time; the authors did not report the training time in seconds.
Wang et al. [49] | 2021 | MS-CNN | PRONOSTIA bearing dataset | MSE with average test loss: 35.48; R2 with average test loss: 0.64 | This work does not consider the uncertainty prediction inherent in their DL model, which makes it impractical in practice.
This study | 2021 | LSTM with attention mechanism | C-MAPSS | RMSE for FD002 (12.87) RMSE for FD003 (11.23) | Optimized network structure in which the parameters and computational cost of training are considerably decreased through dimensionality-reduction processing. Prediction takes around 117.3 s, and the uncertainty in the DNN's predictions is examined.
Layer Type | No. of Weights
---|---
Reshape layer (sequence to timeseries conversion) | No weights
Simple RNN with 30 neurons | 35 × 30 = 1050
FC1 with 20 relu units | (30 + 1) × 20 = 620
Dropout layer for regularization | No weights
FC2 with 8 relu units | (20 + 1) × 8 = 168
Output layer with one linear unit | (8 + 1) × 1 = 9
Layer Type | No. of Weights
---|---
Reshape layer (sequence to timeseries conversion) | No weights
GRU layer with 30 neurons | 78 × 30 = 2340
FC1 with 20 relu units | (30 + 1) × 20 = 620
Dropout layer for regularization | No weights
FC2 with 8 relu units | (20 + 1) × 8 = 168
Output layer with one linear unit | (8 + 1) × 1 = 9
Layer Type | No. of Weights
---|---
Reshape layer (sequence to timeseries conversion) | No weights
LSTM layer with 30 neurons | 144 × 30 = 4320
FC1 with 20 relu units | (30 + 1) × 20 = 620
Dropout layer for regularization | No weights
FC2 with 8 relu units | (20 + 1) × 8 = 168
Output layer with one linear unit | (8 + 1) × 1 = 9
Layer Type | No. of Weights |
---|---|
Reshape layer | No weights |
Conv-1D with 8 kernels of size 3 | ((3 × 2) + 1) × 8 = 56 |
Maxpool-1D | No weights |
Dropout regularization with 50% probability | No weights
Conv-1D with 16 kernels of size 3 | ((3 × 8) + 1) × 16 = 400
Maxpool-1D | No weights |
Flatten (to create array of scalars) | No weights |
FC1 with 8 relu units | (32 + 1) × 8 = 264 |
Output layer with one sigmoid unit | (8 + 1) × 1 = 9
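The fully connected and convolutional weight counts in the tables above follow the standard "weights plus one bias per unit/filter" rule. The sketch below verifies those rows; the assumption that the CNN's input has 2 channels is inferred from the 56-weight row, not stated explicitly in the tables.

```python
def dense_params(n_in, n_out):
    # Fully connected layer: one weight per input per unit, plus one bias per unit.
    return (n_in + 1) * n_out

def conv1d_params(kernel_size, in_channels, filters):
    # 1-D convolution: kernel_size * in_channels weights plus one bias, per filter.
    return (kernel_size * in_channels + 1) * filters

# Rows of the CNN table (input assumed to have 2 channels):
assert conv1d_params(3, 2, 8) == 56    # Conv-1D with 8 kernels of size 3
assert conv1d_params(3, 8, 16) == 400  # Conv-1D with 16 kernels of size 3
assert dense_params(32, 8) == 264      # FC1 with 8 relu units
assert dense_params(8, 1) == 9         # output layer with one sigmoid unit

# The FC rows of the recurrent-model tables follow the same rule:
assert dense_params(30, 20) == 620     # FC1 with 20 relu units
assert dense_params(20, 8) == 168      # FC2 with 8 relu units
```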
C-MAPSS Dataset | FD001 | FD002 | FD003 | FD004
---|---|---|---|---
Engine units for training | 100 | 260 | 100 | 249
Engine units for testing | 100 | 259 | 100 | 248
Operating conditions | 1 | 6 | 1 | 6
Fault modes | 1 | 1 | 2 | 2
Prediction Model | FD002 MAE | FD002 RMSE | FD002 R2 | FD003 MAE | FD003 RMSE | FD003 R2
---|---|---|---|---|---|---
Proposed Attention-based CNN Model | 16.23 | 22.34 | 0.336 | 8.99 | 21.89 | 0.232 |
Proposed Attention-based SRNN Model | 14.32 | 29.32 | 0.236 | 12.97 | 25.14 | 0.323 |
Proposed Attention-based GRU Model | 14.75 | 18.38 | 0.162 | 13.32 | 19.28 | 0.423 |
Proposed Attention-based LSTM Model | 1.42 | 12.87 | 0.236 | 5.47 | 11.23 | 0.168 |
MLP [21] | Not Reported | 80.03 | Not Reported | Not Reported | 37.39 | Not Reported |
Deep LSTM [43] | - | 24.49 | - | - | 16.18 | - |
RF [21] | - | 20.23 | - | - | 22.34 | - |
DBN [21] | - | 30.05 | - | - | 20.99 | - |
DW-RNN [22] | - | 25.90 | - | - | 18.75 | - |
MTL-RNN [22] | - | 25.78 | - | - | 17.98 | - |
DCNN [23] | - | 22.36 | - | - | 12.64 | - |
DNN [23] | - | 24.61 | - | - | 13.93 | -
RNN [23] | - | 24.03 | - | - | 13.36 | - |
LSTM [23] | - | 24.42 | - | - | 13.54 | - |
Attention-based Sequence to Sequence Model [64] | - | 14.90 | - | - | 11.85 | - |
Attention-based DL model [65] | - | FD001 (14.53) | - | - | FD004 (27.08) | - |
DARNN [66]; PRONOSTIA dataset | - | Bearing4 0.07 ± 0.002 | - | - | Bearing5 0.07 ± 0.002 | - |
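The metrics reported in the table above (MAE, RMSE, and R2) have standard definitions; a minimal NumPy sketch is given below. The RUL values are toy numbers for illustration only, not from the C-MAPSS experiments.

```python
import numpy as np

def mae(y_true, y_pred):
    # Mean absolute error.
    return float(np.mean(np.abs(y_true - y_pred)))

def rmse(y_true, y_pred):
    # Root mean squared error, the main score used in the comparison table.
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def r2(y_true, y_pred):
    # Coefficient of determination: 1 - residual SS / total SS.
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    return float(1.0 - ss_res / ss_tot)

# Toy RUL values (in cycles) for illustration only:
y_true = np.array([112.0, 98.0, 69.0, 82.0, 91.0])
y_pred = np.array([105.0, 103.0, 75.0, 79.0, 96.0])
print(mae(y_true, y_pred), rmse(y_true, y_pred), r2(y_true, y_pred))
```

Lower MAE/RMSE and higher R2 indicate better RUL predictions, which is how the candidate models in the table are ranked.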
Muneer, A.; Taib, S.M.; Naseer, S.; Ali, R.F.; Aziz, I.A. Data-Driven Deep Learning-Based Attention Mechanism for Remaining Useful Life Prediction: Case Study Application to Turbofan Engine Analysis. Electronics 2021, 10, 2453. https://doi.org/10.3390/electronics10202453