Tool Wear State Recognition Based on One-Dimensional Convolutional Channel Attention
Abstract
1. Introduction
- (1) A 1DCCA-CNN model was proposed to recognize the tool wear state. The features of the cutting signal were extracted by a one-dimensional convolutional neural network, and a novel channel attention mechanism was proposed in which the inter-channel weight relationships are learned by a one-dimensional convolution rather than by the squeeze-excitation of fully connected layers. This improves the interaction between different channels and extracts the features strongly related to tool wear, thereby improving model performance (a code sketch of this attention block follows this list).
- (2) Validation was performed on the PHM2010 public cutting dataset. Cross-validation datasets with different groupings were designed, and the proposed model's performance on them was evaluated by confusion matrix, accuracy, precision, and recall. The superiority of the proposed model was verified by comparison with other models.
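For concreteness, the following is a minimal PyTorch sketch of how such a one-dimensional convolutional channel attention block could look, reconstructed from the description above and the layer parameters listed later (Section 3.3). The class name and implementation details are our own assumptions, not the authors' code; only the global max pooling, the kernel-size-3 convolution with padding 1, and the sigmoid weighting follow the paper's parameter table.

```python
import torch
import torch.nn as nn

class Conv1dChannelAttention(nn.Module):
    """Channel attention via a 1D convolution over channel descriptors
    (ECA-style block sketched from the paper's description; the layer
    sequence GMP -> Conv1d(k=3, padding=1) -> Sigmoid -> re-weighting
    follows the architecture table)."""

    def __init__(self, kernel_size: int = 3):
        super().__init__()
        # A single 1D convolution slides across the channel dimension,
        # replacing the fully connected squeeze-excitation layers.
        self.conv = nn.Conv1d(1, 1, kernel_size,
                              padding=kernel_size // 2, bias=False)
        self.sigmoid = nn.Sigmoid()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, length)
        # Global max pooling squeezes each channel to one descriptor.
        y = torch.amax(x, dim=2, keepdim=True)      # (B, C, 1)
        # Treat the C descriptors as a length-C sequence for the 1D conv.
        y = self.conv(y.transpose(1, 2))            # (B, 1, C)
        w = self.sigmoid(y).transpose(1, 2)         # (B, C, 1)
        # Re-weight each channel of the input feature map.
        return x * w
```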
2. Structure of the Proposed Model
2.1. Overall Framework
2.2. 1D Convolutional Neural Network Layer
2.3. 1D Channel Attention
2.4. Fully Connected Layer
3. Experiment
3.1. Experimental Setup
3.2. Data Preparation
3.3. Model Parameters and Training Parameter Settings
3.4. Analysis of Experimental Results
4. Conclusions and Future Work
- (1) A channel attention mechanism was proposed that uses a 1DCNN in place of the traditional squeeze-excitation. The strong cross-channel information acquisition ability of convolution was used to improve the information interaction between channels and thus effectively capture the dependencies between channels.
- (2) The model performance was verified on the PHM2010 public dataset. Compared with a CNN without an attention mechanism, the recognition accuracy of the proposed 1DCCA-CNN improved by 3.8%, 6.7%, and 4.7%, respectively, on the three datasets, which verified the importance of channel attention for model performance. Compared with SECA-CNN, which uses the traditional squeeze-excitation attention mechanism, the recognition accuracy of the proposed 1DCCA-CNN improved by 1.2%, 2.1%, and 1.6%, respectively, on the three datasets, which verified the superior performance of the proposed 1DCNN attention mechanism.
- (3) The proposed model achieved high accuracy, precision, and recall, which verified its recognition performance. Compared with existing research results, the proposed model performed well on the T1 and T3 datasets, improving on the best published results by 4% and 5%, respectively, which verified the model's superior properties.
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Pimenov, D.Y.; Gupta, M.K.; Da Silva, L.R.R.; Kiran, M.; Khanna, M.; Krolczyk, G.M. Application of measurement systems in tool condition monitoring of Milling: A review of measurement science approach. Measurement 2022, 199, 111503.
- Pimenov, D.Y.; Bustillo, A.; Wojciechowski, S.; Sharma, V.S.; Gupta, M.K.; Kuntoglu, M. Artificial intelligence systems for tool condition monitoring in machining: Analysis and critical review. J. Intell. Manuf. 2023, 34, 2079–2121.
- Li, X.B.; Liu, X.L.; Yue, C.X.; Liang, S.Y.; Wang, L.H. Systematic review on tool breakage monitoring techniques in machining operations. Int. J. Mach. Tools Manuf. 2022, 176, 103882.
- Banda, T.; Farid, A.A.; Li, C.; Jauw, V.L.; Lim, C.S. Application of machine vision for tool condition monitoring and tool performance optimization—A review. Int. J. Adv. Manuf. Technol. 2022, 121, 7057–7086.
- Yu, J.B.; Cheng, X.; Lu, L.; Wu, B. A machine vision method for measurement of machining tool wear. Measurement 2021, 182, 109683.
- Shang, H.B.; Sun, C.; Liu, J.X.; Chen, X.F.; Yan, R.Q. Defect-aware transformer network for intelligent visual surface defect detection. Adv. Eng. Inform. 2023, 55, 101882.
- Zhang, Y.P.; Qi, X.Z.; Wang, T.; He, Y.H. Tool Wear Condition Monitoring Method Based on Deep Learning with Force Signals. Sensors 2023, 23, 4595.
- Wang, L.Q.; Li, X.; Shi, B.; Munochiveyi, M. Analysis and selection of eigenvalues of vibration signals in cutting tool milling. Adv. Mech. Eng. 2022, 14, 1079553789.
- Liu, M.K.; Tseng, Y.H.; Tran, M.Q. Tool wear monitoring and prediction based on sound signal. Int. J. Adv. Manuf. Technol. 2019, 103, 3361–3373.
- Nasir, V.; Sassani, F. A review on deep learning in machining and tool monitoring: Methods, opportunities, and challenges. Int. J. Adv. Manuf. Technol. 2021, 115, 2683–2709.
- Qin, Y.Y.; Liu, X.L.; Yue, C.X.; Zhao, M.W.; Wei, X.D.; Wang, L.H. Tool wear identification and prediction method based on stack sparse self-coding network. J. Manuf. Syst. 2023, 68, 72–84.
- Duan, J.; Hu, C.; Zhan, X.B.; Zhou, H.D.; Liao, G.L.; Shi, T.L. MS-SSPCANet: A powerful deep learning framework for tool wear prediction. Robot. Comput.-Integr. Manuf. 2022, 78, 102391.
- Wei, X.D.; Liu, X.L.; Yue, C.X.; Wang, L.H.; Liang, S.Y.; Qin, Y.Y. Tool wear state recognition based on feature selection method with whitening variational mode decomposition. Robot. Comput.-Integr. Manuf. 2022, 77, 102344.
- Chan, Y.W.; Kang, T.C.; Yang, C.T.; Chang, C.H.; Huang, S.M.; Tsai, Y.T. Tool wear prediction using convolutional bidirectional LSTM networks. J. Supercomput. 2022, 78, 810–832.
- Zhou, Y.Q.; Zhi, G.F.; Chen, W.; Qian, Q.J.; He, D.D.; Sun, B.T.; Sun, W.F. A new tool wear condition monitoring method based on deep learning under small samples. Measurement 2022, 189, 110622.
- Hou, W.; Guo, H.; Yan, B.N.; Xu, Z.; Yuan, C.; Mao, Y. Tool wear state recognition under imbalanced data based on WGAN-GP and light-weight neural network ShuffleNet. J. Mech. Sci. Technol. 2022, 36, 4993–5009.
- Niu, Z.Y.; Zhong, G.Q.; Yu, H. A review on the attention mechanism of deep learning. Neurocomputing 2021, 452, 48–62.
- Li, R.Y.; Wei, P.N.; Liu, X.L.; Li, C.L.; Ni, J.; Zhao, W.K.; Zhao, L.B.; Hou, K.L. Cutting tool wear state recognition based on a channel-space attention mechanism. J. Manuf. Syst. 2023, 69, 135–149.
- Zeng, Y.F.; Liu, R.L.; Liu, X.F. A novel approach to tool condition monitoring based on multi-sensor data fusion imaging and an attention mechanism. Meas. Sci. Technol. 2021, 32, 55601.
- Hou, W.; Guo, H.; Luo, L.; Jin, M.J. Tool wear prediction based on domain adversarial adaptation and channel attention multiscale convolutional long short-term memory network. J. Manuf. Process. 2022, 84, 1339–1361.
- Zhou, J.Q.; Yue, C.X.; Liu, X.L.; Xia, W.; Wei, X.D.; Qu, J.X.; Liang, S.Y.; Wang, L.H. Classification of Tool Wear State based on Dual Attention Mechanism Network. Robot. Comput.-Integr. Manuf. 2023, 83, 102575.
- He, J.L.; Sun, Y.X.; Yin, C.; He, Y.; Wang, Y.L. Cross-domain adaptation network based on attention mechanism for tool wear prediction. J. Intell. Manuf. 2022, 34, 3365–3387.
- Dong, L.; Wang, C.S.; Yang, G.; Huang, Z.Y.; Zhang, Z.Y.; Li, C. An Improved ResNet-1d with Channel Attention for Tool Wear Monitor in Smart Manufacturing. Sensors 2023, 23, 1240.
- Guo, B.S.; Zhang, Q.; Peng, Q.J.; Zhuang, J.C.; Wu, F.H.; Zhang, Q. Tool health monitoring and prediction via attention-based encoder-decoder with a multi-step mechanism. Int. J. Adv. Manuf. Technol. 2022, 122, 685–695.
- Lai, X.W.; Zhang, K.; Zheng, Q.; Li, Z.X.; Ding, G.F.; Ding, K. A frequency-spatial hybrid attention mechanism improved tool wear state recognition method guided by structure and process parameters. Measurement 2023, 214, 112833.
- Feng, T.T.; Guo, L.; Gao, H.L.; Chen, T.; Yu, Y.X.; Li, C.G. A new time–space attention mechanism driven multi-feature fusion method for tool wear monitoring. Int. J. Adv. Manuf. Technol. 2022, 120, 5633–5648.
- Huang, Q.Q.; Wu, D.; Huang, H.; Zhang, Y.; Han, Y. Tool Wear Prediction Based on a Multi-Scale Convolutional Neural Network with Attention Fusion. Information 2022, 13, 504.
- Wang, Q.L.; Wu, B.G.; Zhu, P.F.; Li, P.H. ECA-Net: Efficient Channel Attention for Deep Convolutional Neural Networks. In Proceedings of the 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Seattle, WA, USA, 13–19 June 2020.
- PHM Society. PHM Society Conference Data Challenge. 2010. Available online: https://www.phmsociety.org/competition/phm/10 (accessed on 20 December 2022).
- Yin, Y.; Wang, S.X.; Zhou, J. Multisensor-based tool wear diagnosis using 1D-CNN and DGCCA. Appl. Intell. 2023, 53, 4448–4461.
- Li, G.F.; Wang, Y.B.; He, J.L.; Hao, Q.B.; Yang, H.J.; Wei, J.F. Tool wear state recognition based on gradient boosting decision tree and hybrid classification RBM. Int. J. Adv. Manuf. Technol. 2020, 110, 511–522.
Experimental Parameters | Value |
---|---|
Spindle speed (RPM) | 10,400 |
Feed rate (mm/min) | 1555 |
Cutting depth (mm) | 0.2 |
Cutting width (mm) | 0.125 |
Sampling frequency (kHz) | 50 |
Datasets | Training Datasets | Test Datasets |
---|---|---|
T1 | C1 + C4 | C6 |
T2 | C4 + C6 | C1 |
T3 | C6 + C1 | C4 |
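To illustrate how these leave-one-cutter-out splits could be assembled from the PHM2010 cutters C1, C4, and C6, a small sketch follows; `load_cutter` is a hypothetical loader returning the windowed signals and wear-state labels of one cutter, not part of the paper's code.

```python
import numpy as np

# Train/test groupings matching the table above.
SPLITS = {
    "T1": {"train": ["C1", "C4"], "test": "C6"},
    "T2": {"train": ["C4", "C6"], "test": "C1"},
    "T3": {"train": ["C6", "C1"], "test": "C4"},
}

def build_split(name, load_cutter):
    """Concatenate the windows and labels of the two training cutters
    and keep the remaining cutter as the held-out test set."""
    cfg = SPLITS[name]
    x_tr, y_tr = zip(*(load_cutter(c) for c in cfg["train"]))
    x_te, y_te = load_cutter(cfg["test"])
    return (np.concatenate(x_tr), np.concatenate(y_tr)), (x_te, y_te)
```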
Layer Name | Description | Parameter Settings |
---|---|---|
Convolutional layer 1 | 1D Convolution | filters = 42, kernel size = 3, padding = 0, stride = 1, activation = ReLU, out size = 98 × 42 |
 | 1D Max pooling | pool size = 2, padding = 0, stride = 2, out size = 49 × 42 |
 | Batch normalization | feature number = 42, out size = 49 × 42 |
Convolutional layer 2 | 1D Convolution | filters = 84, kernel size = 3, padding = 0, stride = 1, activation = ReLU, out size = 47 × 84 |
 | 1D Max pooling | pool size = 2, padding = 0, stride = 2, out size = 23 × 84 |
 | Batch normalization | feature number = 84, out size = 23 × 84 |
Channel attention | GMP | out size = 1 × 84 |
 | 1D Convolution | filters = 1, kernel size = 3, padding = 1, stride = 1, out size = 1 × 84 |
 | Sigmoid | out size = 1 × 84 |
 | Weighted multiplication | out size = 23 × 84 |
Convolutional layer 3 | 1D Convolution | filters = 168, kernel size = 3, padding = 0, stride = 1, activation = ReLU, out size = 21 × 168 |
 | 1D Max pooling | pool size = 2, padding = 0, stride = 2, out size = 10 × 168 |
 | Batch normalization | feature number = 168, out size = 10 × 168 |
Fully connected layer | Flatten | out size = 1 × 1680 |
 | Linear 1 | out size = 1 × 64 |
 | ReLU | out size = 1 × 64 |
 | Linear 2 | out size = 1 × 3 |
 | Sigmoid | out size = 1 × 3 |
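Assembling the table into code, the sketch below is one plausible PyTorch realization of the 1DCCA-CNN. The input window length of 100 samples is inferred from the listed output sizes, the number of input sensor channels (`in_channels`) is an assumption, and the attention block reuses the `Conv1dChannelAttention` sketch given after the contributions list; this is not the authors' implementation.

```python
import torch.nn as nn

class OneDCCACNN(nn.Module):
    """1DCCA-CNN sketched from the layer table; shapes in the comments
    assume an input window of 100 samples per sensor channel."""

    def __init__(self, in_channels: int = 7, num_classes: int = 3):
        super().__init__()

        def conv_block(c_in, c_out):
            # Conv (kernel 3, no padding) -> ReLU -> MaxPool(2) -> BatchNorm
            return nn.Sequential(
                nn.Conv1d(c_in, c_out, kernel_size=3),
                nn.ReLU(),
                nn.MaxPool1d(kernel_size=2, stride=2),
                nn.BatchNorm1d(c_out),
            )

        self.block1 = conv_block(in_channels, 42)   # 100 -> 98 -> 49
        self.block2 = conv_block(42, 84)            # 49 -> 47 -> 23
        self.attention = Conv1dChannelAttention()   # re-weights the 84 channels
        self.block3 = conv_block(84, 168)           # 23 -> 21 -> 10
        self.head = nn.Sequential(
            nn.Flatten(),                           # 10 x 168 = 1680
            nn.Linear(1680, 64),
            nn.ReLU(),
            nn.Linear(64, num_classes),
            nn.Sigmoid(),                           # final activation per the table
        )

    def forward(self, x):
        # x: (batch, in_channels, 100)
        x = self.block1(x)
        x = self.block2(x)
        x = self.attention(x)
        x = self.block3(x)
        return self.head(x)
```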
Hyperparameters | Value |
---|---|
Learning rate | 0.0001 |
Batch size | 256 |
Epochs | 200 |
Weight decay | 0.05 |
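A minimal training loop consistent with these hyperparameters might look as follows; the optimizer (Adam) and the cross-entropy loss are assumptions, since the table specifies only the learning rate, batch size, epochs, and weight decay.

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

def train(model, x_train, y_train, device="cpu"):
    """Train with the tabulated hyperparameters: lr = 1e-4,
    batch size = 256, 200 epochs, weight decay = 0.05."""
    loader = DataLoader(TensorDataset(x_train, y_train),
                        batch_size=256, shuffle=True)
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4, weight_decay=0.05)
    # CrossEntropyLoss is an assumption; it normally expects raw logits,
    # so the final Sigmoid could be dropped when training with it.
    criterion = nn.CrossEntropyLoss()
    model.to(device).train()
    for epoch in range(200):
        for xb, yb in loader:
            xb, yb = xb.to(device), yb.to(device)
            optimizer.zero_grad()
            loss = criterion(model(xb), yb)
            loss.backward()
            optimizer.step()
    return model
```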
Dataset | Wear State | Precision (%) | Recall (%) | Accuracy (%) |
---|---|---|---|---|
T1 | Initial wear | 90.8 | 99.2 | 93.4 |
 | Normal wear | 94.5 | 85.2 | |
 | Severe wear | 95.2 | 95.8 | |
T2 | Initial wear | 96.4 | 94.4 | 87.9 |
 | Normal wear | 79.7 | 85.5 | |
 | Severe wear | 88.4 | 83.8 | |
T3 | Initial wear | 92.5 | 99.6 | 95.0 |
 | Normal wear | 94.0 | 90.9 | |
 | Severe wear | 99.0 | 94.6 | |
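The per-class precision and recall and the overall accuracy reported above can be reproduced from model predictions with standard scikit-learn calls; `y_true` and `y_pred` below are hypothetical integer label arrays (0 = initial, 1 = normal, 2 = severe wear) obtained from a trained model on one of the test datasets.

```python
from sklearn.metrics import (accuracy_score, confusion_matrix,
                             precision_score, recall_score)

def evaluate(y_true, y_pred):
    """Confusion matrix, per-class precision/recall, and overall accuracy
    for the three wear states."""
    labels = [0, 1, 2]
    cm = confusion_matrix(y_true, y_pred, labels=labels)
    precision = precision_score(y_true, y_pred, average=None, labels=labels)
    recall = recall_score(y_true, y_pred, average=None, labels=labels)
    accuracy = accuracy_score(y_true, y_pred)
    return cm, precision, recall, accuracy
```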