Improved WTCN-Informer Model Based on Frequency-Enhanced Channel Attention Mechanism and Wavelet Convolution: Prediction of Remaining Service Life of Ion Etcher Cooling System
Abstract
1. Introduction
- (1) The original TCN structure is optimized by replacing and adding wavelet convolutional layers [21], yielding the WTCN structure. The new structure enlarges the model's effective receptive field and improves its ability to model time series.
- (2) Integrating the discrete cosine transform (DCT) and a channel attention mechanism into the WTCN enables multi-dimensional feature extraction from time series data. The features produced by the DCT, the channel attention mechanism, and the WTCN are then fed into the Informer network, improving both the accuracy and the efficiency of prediction.
- (3) An optimization algorithm is used to train the network end-to-end, enabling the model to adaptively learn the optimal hyperparameters, which improves prediction accuracy while reducing bias error.
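The DCT-based channel weighting described in contribution (2) can be sketched in a few lines of NumPy. This is a minimal illustration under stated assumptions, not the paper's implementation: in FECAM the channel gate is produced by learned fully connected layers (in the squeeze-and-excitation style), whereas here a fixed sigmoid of per-channel frequency energy stands in for the learned gate, and the names `dct_1d` and `fecam_gate` are purely illustrative.

```python
import numpy as np

def dct2_basis(n):
    """DCT-II basis matrix B[t, k] = cos(pi * (t + 0.5) * k / n)."""
    t = np.arange(n)[:, None]
    k = np.arange(n)[None, :]
    return np.cos(np.pi * (t + 0.5) * k / n)

def dct_1d(x):
    """Orthonormal DCT-II along the last axis: per-channel frequency spectrum."""
    n = x.shape[-1]
    spec = x @ dct2_basis(n)
    spec[..., 0] *= np.sqrt(1.0 / n)   # DC scaling
    spec[..., 1:] *= np.sqrt(2.0 / n)  # AC scaling
    return spec

def fecam_gate(x, n_freq=4):
    """Toy frequency-enhanced channel gate: each channel is reweighted by a
    sigmoid of the energy in its leading DCT coefficients. (In FECAM this
    gating is learned; a fixed statistic is used here for illustration.)"""
    spec = dct_1d(x)                               # x: (channels, time)
    energy = np.abs(spec[:, :n_freq]).sum(axis=1)  # per-channel frequency energy
    gate = 1.0 / (1.0 + np.exp(-(energy - energy.mean())))  # sigmoid in (0, 1)
    return x * gate[:, None], gate

# Example: 8 sensor channels, 64 time steps
x = np.random.default_rng(0).normal(size=(8, 64))
y, gate = fecam_gate(x)
```

The orthonormal scaling makes the transform energy-preserving (Parseval), so the per-channel frequency energies are directly comparable across channels.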
2. Related Works
2.1. Machine Learning-Based Prediction Methods
2.2. Deep Learning-Based Prediction Methods
2.2.1. Convolutional Neural Networks and Improved Models
2.2.2. Recurrent Neural Networks and Improved Models
2.2.3. Transformer Class Model
3. Methodology
3.1. Overall RUL Forecasting Scheme
3.2. Degradation Early Warning Health Assessment Model
3.3. FECAM-WTCN-Informer Model Component
3.3.1. Modified Temporal Convolution Network (TCN)
3.3.2. Frequency-Enhanced Channel Attention Mechanism
3.3.3. Informer
3.3.4. Parameter Optimization Component
3.4. FECAM-WTCN-Informer Model
4. Predictive Results of the Model
- The ablation experiments;
- The results of the comparison experiments;
- The predictive performance of the model on the test set.
4.1. Data Description and Preprocessing
4.1.1. Flowcool Dataset
4.1.2. Data Preprocessing
4.2. Model Evaluation Indicators
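The tables in Section 4.3 report RMSE and MAE for each fault case (F1–F3). As a reminder of what is being computed, a minimal implementation of both indicators:

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root mean squared error."""
    e = np.asarray(y_true, dtype=float) - np.asarray(y_pred, dtype=float)
    return float(np.sqrt(np.mean(e ** 2)))

def mae(y_true, y_pred):
    """Mean absolute error."""
    e = np.asarray(y_true, dtype=float) - np.asarray(y_pred, dtype=float)
    return float(np.mean(np.abs(e)))

# Example with a toy RUL series: errors are 2, -1, 3
# mae  = (2 + 1 + 3) / 3 = 2.0
# rmse = sqrt((4 + 1 + 9) / 3) ≈ 2.160
print(rmse([100, 90, 80], [98, 91, 77]), mae([100, 90, 80], [98, 91, 77]))
```

RMSE penalizes large deviations more heavily than MAE, which is why the two indicators can rank models differently in the comparison tables.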
4.3. RUL Projections
4.3.1. A Comparative Analysis of the Experimental Results
4.3.2. Ablation Experiment Results
4.3.3. Calculating Costs and Model Complexity
4.3.4. Model Visualization
1. Diagonal Pattern: The prominent diagonal indicates that the model places significant emphasis on recent historical information when predicting immediate future steps, consistent with the temporal locality principle in time series forecasting.
2. Sparse Attention Distribution: Attention concentrates along the diagonal with little activity elsewhere, suggesting that the enhanced Informer learns to prioritize relevant historical time steps while filtering out noise and irrelevant information.
3. Temporal Dependency: The attention weights show a clear decay pattern as temporal distance increases, indicating that the model appropriately assigns greater weight to recent observations than to distant historical data.
4. Prediction Horizon Analysis: For prediction horizons beyond the immediate term (t − 25 to t − 30), attention spreads more evenly across multiple historical time steps, signifying the model's capacity to capture long-term dependencies where short-term patterns prove inadequate.
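The decay pattern in observation 3 can be quantified by averaging attention weights as a function of temporal lag. The sketch below uses a synthetic, exponentially decaying attention matrix in place of weights extracted from an actual Informer head; `mean_attention_by_lag` is an illustrative helper, not part of the paper's code.

```python
import numpy as np

def mean_attention_by_lag(attn):
    """Average attention weight as a function of temporal distance |i - j|.
    A monotonically decreasing profile reflects the decay pattern above."""
    n = attn.shape[0]
    lag = np.abs(np.arange(n)[:, None] - np.arange(n)[None, :])
    return np.array([attn[lag == d].mean() for d in range(n)])

# Synthetic attention matrix decaying exponentially away from the diagonal,
# standing in for weights extracted from one attention head of the model.
n = 16
lag = np.abs(np.arange(n)[:, None] - np.arange(n)[None, :])
attn = np.exp(-lag / 4.0)
attn /= attn.sum(axis=1, keepdims=True)  # rows sum to 1, like softmax output

profile = mean_attention_by_lag(attn)
```

Plotting `profile` against lag makes the diagonal dominance and decay visible at a glance, which is how the heatmap observations above could be checked numerically.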
4.3.5. Discussion of Comparisons Between Studies Using the Same Dataset
4.3.6. Model Predictions
5. Discussion
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Si, X.S.; Wang, W.; Hu, C.H.; Zhou, D.H. Remaining useful life estimation—A review on the statistical data driven approaches. Eur. J. Oper. Res. 2011, 213, 1–14. [Google Scholar] [CrossRef]
- Zonta, T.; da Costa, C.A.; da Rosa Righi, R.; de Lima, M.J.; da Trindade, E.S.; Li, G.P. Predictive maintenance in the Industry 4.0: A systematic literature review. Comput. Ind. Eng. 2020, 150, 106889. [Google Scholar] [CrossRef]
- Scheibelhofer, P.; Gleispach, D.; Hayderer, G.; Stadlober, E. A Methodology for Predictive Maintenance in Semiconductor Manufacturing. Austrian J. Stat. 2017, 41, 161–173. [Google Scholar] [CrossRef]
- Huff, M. Recent Advances in Reactive Ion Etching and Applications of High-Aspect-Ratio Microfabrication. Micromachines 2021, 12, 991. [Google Scholar] [CrossRef]
- Liu, C.; Zhang, L.; Li, J.; Zheng, J.; Wu, C. Two-Stage Transfer Learning for Fault Prognosis of Ion Mill Etching Process. IEEE Trans. Semicond. Manuf. 2021, 34, 185–193. [Google Scholar] [CrossRef]
- Wu, S.; Jiang, Y.; Luo, H.; Yin, S. Remaining useful life prediction for ion etching machine cooling system using deep recurrent neural network-based approaches. Control Eng. Pract. 2021, 109, 104748. [Google Scholar] [CrossRef]
- Siniaguine, O. Atmospheric downstream plasma etching of Si wafers. In Proceedings of the Twenty Third IEEE/CPMT International Electronics Manufacturing Technology Symposium (Cat. No.98CH36205), Austin, TX, USA, 21 October 1998. [Google Scholar] [CrossRef]
- Ferreira, C.; Gonçalves, G. Remaining Useful Life prediction and challenges: A literature review on the use of Machine Learning Methods. J. Manuf. Syst. 2022, 63, 550–562. [Google Scholar] [CrossRef]
- Mrugalska, B. Remaining Useful Life as Prognostic Approach: A Review. In Advances in Intelligent Systems and Computing, Human Systems Engineering and Design; Springer: Berlin/Heidelberg, Germany, 2019; pp. 689–695. [Google Scholar] [CrossRef]
- Cheng, Y.; Hu, K.; Wu, J.; Zhu, H.; Shao, X. A convolutional neural network based degradation indicator construction and health prognosis using bidirectional long short-term memory network for rolling bearings. Adv. Eng. Inform. 2021, 48, 101247. [Google Scholar] [CrossRef]
- Li, J.; Li, X.; He, D. A Directed Acyclic Graph Network Combined with CNN and LSTM for Remaining Useful Life Prediction. IEEE Access 2019, 7, 75464–75475. [Google Scholar] [CrossRef]
- Falcon, A.; D’Agostino, G.; Serra, G.; Brajnik, G.; Tasso, C. A Neural Turing Machine-based approach to Remaining Useful Life Estimation. In Proceedings of the 2020 IEEE International Conference on Prognostics and Health Management (ICPHM), Detroit, MI, USA, 8–10 June 2020. [Google Scholar] [CrossRef]
- Yang, Z.; Baraldi, P.; Zio, E. A comparison between extreme learning machine and artificial neural network for remaining useful life prediction. In Proceedings of the 2016 Prognostics and System Health Management Conference (PHM-Chengdu), Chengdu, China, 19–21 October 2016; pp. 1–7. [Google Scholar] [CrossRef]
- Huang, W.; Zhang, X.; Wu, C.; Cao, S.; Zhou, Q. Tool wear prediction in ultrasonic vibration-assisted drilling of CFRP: A hybrid data-driven physics model-based framework. Tribol. Int. 2022, 174, 107755. [Google Scholar] [CrossRef]
- Shen, S.; Lu, H.; Sadoughi, M.; Hu, C.; Nemani, V.; Thelen, A.; Webster, K.; Darr, M.; Sidon, J.; Kenny, S. A physics-informed deep learning approach for bearing fault detection. Eng. Appl. Artif. Intell. 2021, 103, 104295. [Google Scholar] [CrossRef]
- Shutin, D.; Bondarenko, M.; Polyakov, R.; Stebakov, I.; Savin, L. Method for On-Line Remaining Useful Life and Wear Prediction for Adjustable Journal Bearings Utilizing a Combination of Physics-Based and Data-Driven Models: A Numerical Investigation. Lubricants 2023, 11, 33. [Google Scholar] [CrossRef]
- He, G.; Zhao, Y.; Yan, C. MFLP-PINN: A physics-informed neural network for multiaxial fatigue life prediction. Eur. J. Mech.-A/Solids 2023, 98, 104889. [Google Scholar] [CrossRef]
- Singh, K.; Selvanathan, B.; Zope, K.; Nistala, S.H.; Runkana, V. Concurrent Estimation of Remaining Useful Life for Multiple Faults in an Ion Etch Mill. In Proceedings of the Annual Conference of the PHM Society, Philadelphia, PA, USA, 24–27 September 2018. [Google Scholar] [CrossRef]
- He, A.; Jin, X. Failure Detection and Remaining Life Estimation for Ion Mill Etching Process Through Deep-Learning Based Multimodal Data Fusion. In Proceedings of the Volume 1: Additive Manufacturing; Manufacturing Equipment and Systems; Bio and Sustainable Manufacturing, Erie, PA, USA, 10–14 June 2019. [Google Scholar] [CrossRef]
- Huang, W.; Khorasgani, H.; Gupta, C.; Farahat, A.; Zheng, S. Remaining Useful Life Estimation for Systems with Abrupt Failures. In Proceedings of the Annual Conference of the PHM Society, Philadelphia, PA, USA, 24–27 September 2018. [Google Scholar] [CrossRef]
- Finder, S.E.; Amoyal, R.; Treister, E.; Freifeld, O. Wavelet Convolutions for Large Receptive Fields. arXiv 2024, arXiv:2407.05848. [Google Scholar] [CrossRef]
- Sigaud, O.; Droniou, A. Towards Deep Developmental Learning. IEEE Trans. Cogn. Dev. Syst. 2016, 8, 99–114. [Google Scholar] [CrossRef]
- Berghout, T.; Benbouzid, M. A Systematic Guide for Predicting Remaining Useful Life with Machine Learning. Electronics 2022, 11, 1125. [Google Scholar] [CrossRef]
- Zhang, C.; He, Y.; Yuan, L.; Xiang, S.; Wang, J. Prognostics of Lithium-Ion Batteries Based on Wavelet Denoising and DE-RVM. Comput. Intell. Neurosci. 2015, 2015, 918305. [Google Scholar] [CrossRef]
- Ben Ali, J.; Chebel-Morello, B.; Saidi, L.; Malinowski, S.; Fnaiech, F. Accurate bearing remaining useful life prediction based on Weibull distribution and artificial neural network. Mech. Syst. Signal Process. 2015, 56, 150–172. [Google Scholar] [CrossRef]
- Zhang, W.; Yang, D.; Wang, H. Data-Driven Methods for Predictive Maintenance of Industrial Equipment: A Survey. IEEE Syst. J. 2019, 13, 2213–2227. [Google Scholar] [CrossRef]
- Wu, J.Y.; Wu, M.; Chen, Z.; Li, X.L.; Yan, R. Degradation-Aware Remaining Useful Life Prediction with LSTM Autoencoder. IEEE Trans. Instrum. Meas. 2021, 70, 1–10. [Google Scholar] [CrossRef]
- Tziolas, T.; Papageorgiou, K.; Theodosiou, T.; Papageorgiou, E.; Mastos, T.; Papadopoulos, A. Autoencoders for Anomaly Detection in an Industrial Multivariate Time Series Dataset. Eng. Proc. 2022, 18, 23. [Google Scholar] [CrossRef]
- Lecun, Y.; Bottou, L.; Bengio, Y.; Haffner, P. Gradient-based learning applied to document recognition. Proc. IEEE 1998, 86, 2278–2324. [Google Scholar] [CrossRef]
- Li, X.; Zhang, W.; Ding, Q. Deep learning-based remaining useful life estimation of bearings using multi-scale feature extraction. Reliab. Eng. Syst. Saf. 2019, 182, 208–218. [Google Scholar] [CrossRef]
- Ma, M.; Mao, Z. Deep-Convolution-Based LSTM Network for Remaining Useful Life Prediction. IEEE Trans. Ind. Inform. 2021, 17, 1658–1667. [Google Scholar] [CrossRef]
- Borovykh, A.; Bohte, S.; Oosterlee, C.W. Conditional Time Series Forecasting with Convolutional Neural Networks. arXiv 2017, arXiv:1703.04691. [Google Scholar] [CrossRef]
- Dong, X.; Qian, L.; Huang, L. Short-term load forecasting in smart grid: A combined CNN and K-means clustering approach. In Proceedings of the 2017 IEEE International Conference on Big Data and Smart Computing (BigComp), Jeju Island, Republic of Korea, 13–16 February 2017. [Google Scholar] [CrossRef]
- Bai, S.; Kolter, J.; Koltun, V. An Empirical Evaluation of Generic Convolutional and Recurrent Networks for Sequence Modeling. arXiv 2018, arXiv:1803.01271. [Google Scholar] [CrossRef]
- Jordan, M.I. Serial Order: A Parallel Distributed Processing Approach. In Neural-Network Models of Cognition—Biobehavioral Foundations, Advances in Psychology; Elsevier: Amsterdam, The Netherlands, 1997; pp. 471–495. [Google Scholar] [CrossRef]
- Catelani, M.; Ciani, L.; Fantacci, R.; Patrizi, G.; Picano, B. Remaining Useful Life Estimation for Prognostics of Lithium-Ion Batteries Based on Recurrent Neural Network. IEEE Trans. Instrum. Meas. 2021, 70, 1–11. [Google Scholar] [CrossRef]
- Nikam, H.; Satyam, S.; Sahay, S. Long Short-Term Memory Implementation Exploiting Passive RRAM Crossbar Array. IEEE Trans. Electron Devices 2022, 69, 1743–1751. [Google Scholar] [CrossRef]
- Zhang, J.; Wang, P.; Yan, R.; Gao, R.X. Long short-term memory for machine remaining life prediction. J. Manuf. Syst. 2018, 48, 78–86. [Google Scholar] [CrossRef]
- Chen, J.; Jing, H.; Chang, Y.; Liu, Q. Gated recurrent unit based recurrent neural network for remaining useful life prediction of nonlinear deterioration process. Reliab. Eng. Syst. Saf. 2019, 185, 372–382. [Google Scholar] [CrossRef]
- Cho, K.; van Merrienboer, B.; Gulcehre, C.; Bahdanau, D.; Bougares, F.; Schwenk, H.; Bengio, Y. Learning Phrase Representations using RNN Encoder–Decoder for Statistical Machine Translation. In Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP), Doha, Qatar, 25–29 October 2014. [Google Scholar] [CrossRef]
- Liao, G.P.; Gao, W.; Yang, G.J.; Guo, M.F. Hydroelectric Generating Unit Fault Diagnosis Using 1-D Convolutional Neural Network and Gated Recurrent Unit in Small Hydro. IEEE Sens. J. 2019, 19, 9352–9363. [Google Scholar] [CrossRef]
- Zhao, R.; Wang, D.; Yan, R.; Mao, K.; Shen, F.; Wang, J. Machine Health Monitoring Using Local Feature-Based Gated Recurrent Unit Networks. IEEE Trans. Ind. Electron. 2018, 65, 1539–1548. [Google Scholar] [CrossRef]
- Schuster, M.; Paliwal, K. Bidirectional recurrent neural networks. IEEE Trans. Signal Process. 1997, 45, 2673–2681. [Google Scholar] [CrossRef]
- Graves, A.; Schmidhuber, J. Framewise phoneme classification with bidirectional LSTM and other neural network architectures. Neural Netw. 2005, 18, 602–610. [Google Scholar] [CrossRef] [PubMed]
- Wang, F.K.; Amogne, Z.E.; Chou, J.H.; Tseng, C. Online Remaining Useful Life Prediction of Lithium-Ion Batteries Using Bidirectional Long Short-Term Memory with Attention Mechanism. SSRN Electron. J. 2022. [Google Scholar] [CrossRef]
- Vaswani, A.; Shazeer, N.; Parmar, N.; Uszkoreit, J.; Jones, L.; Gomez, A.; Kaiser, L.; Polosukhin, I. Attention is All you Need. Adv. Neural Inf. Process. Syst. 2017, 30, 5998–6008. [Google Scholar]
- Zhou, H.; Zhang, S.; Peng, J.; Zhang, S.; Li, J.; Xiong, H.; Zhang, W. Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting. In Proceedings of the AAAI Conference on Artificial Intelligence, Online, 2–9 February 2021; Volume 35, pp. 11106–11115. [Google Scholar] [CrossRef]
- Qi, X.; Hou, K.; Liu, T.; Yu, Z.; Hu, S.; Ou, W. From Known to Unknown: Knowledge-guided Transformer for Time-Series Sales Forecasting in Alibaba. arXiv 2021, arXiv:2109.08381. [Google Scholar]
- Wu, H.; Xu, J.; Wang, J.; Long, M. Autoformer: Decomposition Transformers with Auto-Correlation for Long-Term Series Forecasting. arXiv 2021, arXiv:2106.13008. [Google Scholar]
- Zhou, T.; Ma, Z.; Wen, Q.; Wang, X.; Sun, L.; Jin, R. FEDformer: Frequency Enhanced Decomposed Transformer for Long-term Series Forecasting. arXiv 2022, arXiv:2201.12740. [Google Scholar] [CrossRef]
- Gulati, A.; Qin, J.; Chiu, C.C.; Parmar, N.; Zhang, Y.; Yu, J.; Han, W.; Wang, S.; Zhang, Z.; Wu, Y.; et al. Conformer: Convolution-augmented Transformer for Speech Recognition. In Proceedings of the Interspeech 2020, Shanghai, China, 25–29 October 2020. [Google Scholar] [CrossRef]
- Ahmed, N.; Natarajan, T.; Rao, K. Discrete Cosine Transform. IEEE Trans. Comput. 1974, C-23, 90–93. [Google Scholar] [CrossRef]
- Hu, J.; Shen, L.; Albanie, S.; Sun, G.; Wu, E. Squeeze-and-Excitation Networks. IEEE Trans. Pattern Anal. Mach. Intell. 2020, 42, 2011–2023. [Google Scholar] [CrossRef] [PubMed]
- Wu, M.; Yang, D.; Yang, Z.; Guo, Y. Sparrow Search Algorithm for Solving Flexible Jobshop Scheduling Problem. In Lecture Notes in Computer Science, Advances in Swarm Intelligence; Springer: Berlin/Heidelberg, Germany, 2021; pp. 140–154. [Google Scholar] [CrossRef]
- Hsu, C.Y.; Lu, Y.W.; Yan, J.H. Temporal Convolution-Based Long-Short Term Memory Network with Attention Mechanism for Remaining Useful Life Prediction. IEEE Trans. Semicond. Manuf. 2022, 35, 220–228. [Google Scholar] [CrossRef]
- Darwish, A. A Data-driven Deep Learning Approach for Remaining Useful Life in the Ion Mill Etching Process. Sustain. Mach. Intell. J. 2024, 8, 14–34. [Google Scholar] [CrossRef]
Name | Description |
---|---|
Time | time |
Tool | tool id |
Stage | processing stage of the wafer |
Lot | wafer id |
Run num | number of times the tool has been run |
Recipe | tool settings used to process the wafer |
Recipe step | the process step of a recipe |
iongaugepressure | pressure reading for the main process chamber when it is under vacuum |
etchbeamvoltage | the voltage potential applied to the beam plate of the grid assembly |
etchbeamcurrent | ion current impacting the beam grid determining the amount of ions accelerated through the grid assembly to the wafer |
etchsuppressorvoltage | voltage potential applied to the suppressor plate of the grid assembly |
etchsuppressorcurrent | ion current impacting the suppressor grid plate |
flowcoolflowrate | rate of flow of helium through the flow-cool circuit, controlled by the mass flow controller |
flowcoolpressure | resulting helium pressure in the flow-cool circuit |
etchgaschannel1readback | rate of flow of argon into the source assembly in the vacuum chamber |
etchpbngasreadback | rate of flow of argon into the PBN assembly in the chamber |
fixturetiltangle | wafer tilt angle setting |
rotationspeed | wafer rotation speed setting |
actualrotationangle | measure of the wafer rotation angle |
fixtureshutterposition | open/close shutter setting for the wafer shielding |
etchsourceusage | counter of use for the grid assembly consumable |
etchauxsourcetimer | counter of use for the chamber shields consumable |
etchaux2sourcetimer | counter of use for the chamber shields consumable |
actualstepduration | measured time duration for a particular step |
NORM | F1 RMSE | F1 MAE | F2 RMSE | F2 MAE | F3 RMSE | F3 MAE
---|---|---|---|---|---|---
RNN | 1488.8 | 1009.3 | 1079.1 | 858.49 | 2850.29 | 1921.7 |
LSTM | 837.33 | 223.44 | 798.88 | 273.17 | 549.88 | 157.63 |
FNet | 896.58 | 218.11 | 2393.02 | 1144.40 | 1241.67 | 406.96 |
TCLSTM | 774.04 | 220.64 | 794.99 | 263.07 | 544.61 | 123.92 |
SCINET | 821.84 | 187.64 | 2417.05 | 479.05 | 1320.10 | 371.48 |
DLinear | 684.89 | 73.26 | 3089.50 | 1596.82 | 1052.14 | 290.40 |
GRU | 506.64 | 121.73 | 145.16 | 59.77 | 329.62 | 178.14 |
Transformer | 392.28 | 98.60 | 325.79 | 280.93 | 403.33 | 73.34 |
F-C-I | 317.69 | 34.85 | 228.04 | 80.22 | 296.43 | 179.63 |
F-R-I | 235.88 | 70.96 | 185.55 | 58.78 | 258.64 | 158.36 |
F-T-I | 41.68 | 38.89 | 176.23 | 64.36 | 211.81 | 134.05 |
F-WT-I | 12.86 | 5.97 | 137.31 | 26.53 | 175.17 | 37.42 |
NORM | F1 RMSE | F1 MAE | F2 RMSE | F2 MAE | F3 RMSE | F3 MAE
---|---|---|---|---|---|---
WTCN | 736.41 | 225.18 | 758.73 | 211.84 | 754.77 | 263.7 |
Informer | 501.79 | 300.48 | 612.85 | 468.11 | 391.51 | 332.9 |
WTCN-Informer | 581.62 | 98.14 | 266.76 | 120.42 | 314.66 | 61.56 |
F-WTCN | 270.93 | 135.59 | 211.92 | 79.15 | 318.89 | 72.69 |
F-Informer | 219.13 | 93.56 | 265.36 | 84.43 | 271.40 | 66.58 |
FWT-Informer | 35.77 | 33.24 | 256.19 | 104.45 | 234.52 | 58.39 |
Proposed model | 12.86 | 5.97 | 137.31 | 26.53 | 175.17 | 37.42 |
NORM | F1 RMSE | F1 MAE | F2 RMSE | F2 MAE | F3 RMSE | F3 MAE
---|---|---|---|---|---|---
Unoptimized | 35.77 | 33.24 | 256.19 | 104.45 | 234.52 | 58.39 |
PSO | 30.13 | 26.39 | 250.04 | 84.77 | 224.44 | 121.89 |
ACO | 23.51 | 16.17 | 227.59 | 60.07 | 220.61 | 40.74 |
GA | 14.53 | 8.68 | 245.40 | 69.11 | 215.41 | 48.14 |
SSA | 12.86 | 5.97 | 137.31 | 26.53 | 175.17 | 37.42 |
 | F1 | F2 | F3
---|---|---|---
Total parameters | 154,065 | 154,065 | 154,065 |
Training time/s | 2846.50 | 641.47 | 2590.76 |
Average inference time (1 sample)/ms | 12.8570 | 11.2660 | 11.5056 |
Max GPU memory used/MB | 8053.63 | 526.19 | 2428.53 |
Memory Usage/MB | 2121.86 | 1646.72 | 1875.83 |
Saved model size/MB | 3.06 | 3.06 | 3.06 |
 | Our Model | GRU | Transformer | LSTM
---|---|---|---|---
Training time/s | 1666.77 | 5040 | 3122.37 | 4860 |
Average inference time (1 sample)/ms | 0.0235 | 0.1148 | 0.0449 | 0.1136 |
Share and Cite
Ma, T.; Liu, J.; Xu, P.; Song, Y.; Bai, X. Improved WTCN-Informer Model Based on Frequency-Enhanced Channel Attention Mechanism and Wavelet Convolution: Prediction of Remaining Service Life of Ion Etcher Cooling System. Sensors 2025, 25, 4883. https://doi.org/10.3390/s25164883