Research on Yield Prediction Technology for Aerospace Engine Production Lines Based on Convolutional Neural Networks-Improved Support Vector Regression
Abstract
1. Introduction
2. Analysis of Yield Prediction Problems
3. CNN-ISVR Yield Prediction Model
3.1. Convolutional Neural Networks
3.2. Support Vector Regression
1. Given the set of training samples.
2. Construct the regression model.
3. Solve for the model parameters.
4. Select the kernel function; there are generally five major categories of kernel functions (an illustrative SVR fit with the RBF kernel is sketched after this list):
   - Linear kernel function
   - Polynomial kernel function
   - Laplacian kernel function
   - Sigmoid kernel function
   - Radial basis kernel function (RBF)
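To make the regression step concrete, the following sketch fits a scikit-learn epsilon-SVR with an RBF kernel. It is an illustrative stand-in on placeholder data, not the authors' implementation; the array shapes and values are assumptions.

```python
import numpy as np
from sklearn.svm import SVR

# Placeholder data standing in for the production-line samples.
rng = np.random.default_rng(0)
X_train = rng.random((500, 9))       # 9 influencing factors per sample (assumed shape)
y_train = rng.random(500) * 10       # engine yield per shift (illustrative values)

# epsilon-SVR with the radial basis kernel; C and gamma are the hyperparameters
# later tuned by grid search and the genetic algorithm.
model = SVR(kernel="rbf", C=4.0, gamma="scale", epsilon=0.1)
model.fit(X_train, y_train)
y_pred = model.predict(X_train[:5])
```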
3.3. CNN-ISVR Model
4. Improved GA-SVR Algorithm
4.1. Genetic Algorithm
4.2. Elite Strategy Genetic Algorithm
4.3. Optimization of Hyperparameters for SVR Model
- Step 1: Determine the range of values for the hyperparameters (C, γ). The commonly used range is [2⁻¹⁰, 2¹⁰].
- Step 2: Perform GS on the hyperparameters. Since the parameter range spans several orders of magnitude, the base-2 logarithm of the hyperparameter values is taken, giving the interval [−10, 10]. Using a step of 0.25, a total of 81 × 81 = 6561 hyperparameter combinations is obtained.
- Step 3: k-fold CV. Choose 500 of the 600 samples as training data and 100 samples as testing data. Let k = 5, then divide the training data into 5 equal parts and use each part in turn as the validation set to evaluate the model trained on the remaining 4 parts. Finally, select the hyperparameters with the minimum root-mean-square error as the optimal hyperparameters of the CV process.
- Step 4: Hyperparameter combination grid search. Repeat Step 3 for the next group of hyperparameter combinations until the combination with the smallest mean squared error is found, which becomes the optimal SVR hyperparameter setting for the current sample set. When different parameter combinations (C, γ) yield the same cross-validation accuracy during training, it is generally preferable to choose the combination with the smaller C to improve the model's generalization ability. (A minimal scikit-learn sketch of this GS-CV procedure follows this list.)
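The sketch below reproduces the GS-CV steps with scikit-learn's GridSearchCV: an 81 × 81 grid over C and γ in log2 steps of 0.25 and 5-fold cross-validation scored by MSE. The data arrays are placeholders for the production samples, so this is an assumed setup rather than the authors' code.

```python
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVR

# Hyperparameter grid: exponents from -10 to 10 in steps of 0.25 -> 81 values each.
exponents = np.arange(-10, 10.25, 0.25)
param_grid = {"C": 2.0 ** exponents, "gamma": 2.0 ** exponents}

search = GridSearchCV(
    SVR(kernel="rbf"),
    param_grid,
    cv=5,                              # 5-fold cross-validation
    scoring="neg_mean_squared_error",  # select the combination with minimum MSE
)

# X_train, y_train: 500 training samples (placeholders for the real production data).
rng = np.random.default_rng(0)
X_train, y_train = rng.random((500, 9)), rng.random(500) * 10
search.fit(X_train, y_train)
print(search.best_params_)             # optimal (C, gamma) for the current sample set
```

Exhaustively evaluating all 6561 combinations is expensive, which is what motivates the GA-based search described next.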
4.4. Optimization Process of the SVR Model
1. Determine the parameters to be optimized and encode them, setting the initial range for the optimization.
2. Generate an initial population by randomly producing individuals, and set the relevant parameters of the improved GA.
3. Calculate the fitness of each individual, using the MSE obtained by the GS-CV algorithm as the evaluation criterion to verify the prediction accuracy.
4. Apply adaptive crossover and mutation operators, and perform elite selection and backup.
5. Check the termination condition: if the maximum number of iterations is reached or the solution converges, decode and output the optimal SVR parameter solution; otherwise, return to step 2 and continue the process. (A simplified sketch of this loop is given below.)
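The loop below is only a simplified sketch of an elite-strategy GA for the SVR hyperparameters (C, γ): real-valued log2 encoding, arithmetic crossover, Gaussian mutation, fixed rates, and a single preserved elite are assumptions of this sketch, not the authors' exact adaptive operators. The data arrays are placeholders.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X, y = rng.random((500, 9)), rng.random(500) * 10   # placeholder production data

POP, GENS, P_CROSS, P_MUT, ELITE = 20, 50, 0.6, 0.03, 1

def fitness(ind):
    """Negative 5-fold-CV MSE of an SVR with C = 2**ind[0], gamma = 2**ind[1]."""
    model = SVR(kernel="rbf", C=2.0 ** ind[0], gamma=2.0 ** ind[1])
    return cross_val_score(model, X, y, cv=5, scoring="neg_mean_squared_error").mean()

# Each individual encodes (log2 C, log2 gamma) in [-10, 10].
pop = rng.uniform(-10, 10, size=(POP, 2))
for _ in range(GENS):
    scores = np.array([fitness(ind) for ind in pop])
    elite = pop[np.argsort(scores)[-ELITE:]].copy()       # back up the best individual
    # Roulette-style selection on shifted (non-negative) scores.
    probs = scores - scores.min() + 1e-9
    probs /= probs.sum()
    parents = pop[rng.choice(POP, size=POP, p=probs)]
    # Arithmetic crossover between adjacent parents.
    children = parents.copy()
    for i in range(0, POP - 1, 2):
        if rng.random() < P_CROSS:
            a = rng.random()
            children[i], children[i + 1] = (a * parents[i] + (1 - a) * parents[i + 1],
                                            a * parents[i + 1] + (1 - a) * parents[i])
    # Gaussian mutation, clipped to the encoding range.
    mut_mask = rng.random(children.shape) < P_MUT
    children = np.clip(children + mut_mask * rng.normal(0, 1, children.shape), -10, 10)
    children[:ELITE] = elite                              # elite strategy: keep the best unchanged
    pop = children

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("C =", 2.0 ** best[0], "gamma =", 2.0 ** best[1])
```

As written, the loop re-runs full 5-fold CV for every individual in every generation, so it is slow on real data; it is meant only to show the structure of selection, crossover, mutation, and elite preservation.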
4.5. Optimization Results of SVR Model Parameters
5. The Production Prediction for Aerospace Engine Production Lines
5.1. The Acquisition of Production Line Data
Communication between Devices via Modbus TCP
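As an illustration only, production counters could be polled from a line controller over Modbus TCP as sketched below. The device IP address, register addresses, count, and slave id are hypothetical, and the import path assumes pymodbus 3.x (in 2.x the import path and the slave/unit keyword differ).

```python
from pymodbus.client import ModbusTcpClient  # assumes pymodbus 3.x

# Hypothetical PLC address and register map for the production-line counters.
client = ModbusTcpClient("192.168.0.10", port=502)
client.connect()

# Read 10 holding registers starting at address 0 from slave id 1.
result = client.read_holding_registers(address=0, count=10, slave=1)
if not result.isError():
    print("raw register values:", result.registers)

client.close()
```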
5.2. Preprocessing of Production Data
5.3. Display of Production Data
5.4. Design of Evaluation Metrics
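For reference, the evaluation metrics reported in the result tables are assumed to follow their standard definitions (the paper's own formula numbering is not reproduced here):

```latex
\mathrm{RMSE} = \sqrt{\tfrac{1}{n}\sum_{i=1}^{n}\left(y_i - \hat{y}_i\right)^2}, \qquad
\mathrm{MAPE} = \tfrac{100\%}{n}\sum_{i=1}^{n}\left|\tfrac{y_i - \hat{y}_i}{y_i}\right|, \qquad
R^2 = 1 - \frac{\sum_{i=1}^{n}\left(y_i - \hat{y}_i\right)^2}{\sum_{i=1}^{n}\left(y_i - \bar{y}\right)^2}
```

where y_i is the measured yield, ŷ_i the predicted yield, ȳ the mean measured yield, and n the number of test samples.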
5.5. Normalization Process
5.6. Experimental Design
5.6.1. The Pre-Processing and Matrixing of Input Data
- A column-padding operation is performed on the input production data so that each sample vector can be reshaped into a matrix. Five zero columns are appended to the original 220 columns of data, and the size of the data matrix becomes 500 × 225.
- Normalize the data. Considering the requirements of the CNN on its input, the data are normalized to [0, 1] according to Formula (20), which removes the interference of differing dimensions while keeping the data distribution unchanged.
- Matrixize each row sample vector. Each row sample changes from a 1 × 225 vector to a 15 × 15 matrix.
- Up-sample the data as the input for the CNN model. The 500 × (15 × 15) sample tensor is expanded to a four-dimensional tensor of size 500 × 15 × 15 × 1. (A NumPy sketch of these four steps is given after this list.)
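As a rough illustration of these four steps, the NumPy sketch below pads, normalizes, and reshapes a raw sample matrix. It is not the authors' code; the function name and the assumed (500, 220) input shape are placeholders, and column-wise min-max scaling is assumed for Formula (20).

```python
import numpy as np

def preprocess(samples: np.ndarray) -> np.ndarray:
    """Pad, normalize, and reshape raw production samples for the CNN.

    `samples` is assumed to be a (500, 220) array of raw production-line features.
    """
    n = samples.shape[0]
    # Step 1: append five zero columns so each row has 225 = 15 * 15 entries.
    padded = np.hstack([samples, np.zeros((n, 5))])
    # Step 2: column-wise min-max normalization to [0, 1] (Formula (20)).
    col_min = padded.min(axis=0)
    col_max = padded.max(axis=0)
    scale = np.where(col_max > col_min, col_max - col_min, 1.0)  # avoid /0 on constant (zero-padded) columns
    normalized = (padded - col_min) / scale
    # Steps 3-4: reshape each 1 x 225 row into a 15 x 15 matrix and add a channel axis.
    return normalized.reshape(n, 15, 15, 1)

# Example: x = preprocess(raw_data)  # -> shape (500, 15, 15, 1)
```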
5.6.2. Parameters of CNN-ISVR Model
5.7. Analysis of Production Line Yield Model Prediction Results
6. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
Nomenclature
Full Name | Abbreviations |
---|---|
Support Vector Regression | SVR |
Improved Support Vector Regression | ISVR |
Radial Basis Function | RBF |
Convolutional Neural Networks | CNN |
Genetic Algorithm | GA |
Grid-Search | GS |
Cross-Validation | CV |
Radial Basis Kernel Function | RBF |
Mean Squared Error | MSE |
Root Mean Square Error | RMSE |
Mean Absolute Percentage Error | MAPE |
Linear Regression | LR |
Ridge Regression | RG |
Random Forest | RF |
Industrial Communication Protocols | ICPs |
Influencing Factors | Symbolic Representation | Descriptions |
---|---|---|
Production cycle | p_i | The time required to complete the i-th process |
Equipment load | q_ij | The number of the j-th workpiece to be processed on the i-th machine at time t |
Equipment condition | r_i, z | r_i represents the average utilization of the i-th device at time t, and z represents the failure rate of the device at time t |
Production line work system | – | The operating time of the production line for the day |
Relevant Parameters | Value (Range) |
---|---|
Maximum number of evolution generations | 50 |
Population size | 20 |
Range of values for C | [2⁻¹⁰, 2¹⁰] |
Number of folds for CV (k) | 5 |
Range of values for γ | [2⁻¹⁰, 2¹⁰] |
Probability of crossover | [0.5, 0.7] |
Probability of mutation | [0.01, 0.05] |
Elite selection operator | 0.025 |
Hyperparameters | GA-SVR | E-GA-SVR |
---|---|---|
Penalty parameter C | 97.934 | 122.418 |
Kernel parameter γ | 0.6953 | 0.7884 |
MSE | 9.35 × 10⁻⁵ | 9.12 × 10⁻⁵ |
Serial Number | Engine Yield /Unit | Production Cycle p1/Hour | Production Cycle p2/Hour | Facility Load q11/Unit | Facility Load q12/Unit | Facility Utilization r1/% | Facility Utilization r2/% | Facility Failure Rate z/% | Work System/Hour |
---|---|---|---|---|---|---|---|---|---|
1. | 8 | 6.97 | 6.00 | 0 | 1 | 81.50 | 85.42 | 11.1 | 8 |
2. | 7 | 6.99 | 5.9 | 1 | 0 | 81.13 | 84.58 | 10.5 | 8 |
⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ |
599 | 5 | 7.15 | 6.08 | 0 | 1 | 70.41 | 60.49 | 11.8 | 8 |
600 | 4 | 7.16 | 6.23 | 1 | 0 | 59.13 | 58.12 | 12.86 | 8 |
Layer Structure | Feature Size | Feature Channel | Convolutional Kernel |
---|---|---|---|
Input layer | 8 × 8 | 1 | - |
Convolutional layer | 8 × 8 | 32 | 2 × 2 |
Convolutional layer | 8 × 8 | 64 | 2 × 2 |
Fully connected layer | 1 × 512 | 1 | 4096 × 1 |
Fully connected layer | 1 × 512 | 1 | 4096 × 1 |
Output layer | 1 | - | - |
C | γ | R² | MAPE | RMSE |
---|---|---|---|---|
5 | 0.6 | 0.9158 | 6.8716% | 4.3879 |
8 | 0.6 | 0.9188 | 6.9654% | 4.4247 |
4 | 0.8 | 0.9167 | 6.8921% | 4.4549 |
4 | scale | 0.9265 | 6.6756% | 4.1651 |
Model | Parameter Configuration |
---|---|
SVR | C = 4, γ = ‘scale’, kernel = ‘RBF’ |
LR | fit_intercept = True, normalize = False, copy_X = True |
RG | alpha = 2.0, fit_intercept = True, solver = ‘sag’ |
RF | n_estimators = 100, criterion = ‘MSE’, max_features = ‘auto’ |
Model | R² | RMSE | MAPE (%) |
---|---|---|---|
LR | 0.9088 | 4.0517 | 6.4354 |
RG | 0.9122 | 4.1235 | 6.3290 |
RF | 0.9075 | 4.5775 | 6.8126 |
CNN | 0.9183 | 5.0982 | 8.2477 |
SVR | 0.9116 | 4.1651 | 6.6756 |
CNN-SVR | 0.9241 | 3.5759 | 5.5916 |
CNN-ISVR | 0.9265 | 2.1428 | 4.4758 |