Article

Enhancing the Product Quality of the Injection Process Using eXplainable Artificial Intelligence

by Jisoo Hong 1, Yongmin Hong 1, Jung-Woo Baek 2 and Sung-Woo Kang 1,*
1 Department of Industrial Engineering, Inha University, Incheon 22212, Republic of Korea
2 Department of Industrial & Systems Engineering, Dongguk University, Seoul 04620, Republic of Korea
* Author to whom correspondence should be addressed.
Processes 2025, 13(3), 912; https://doi.org/10.3390/pr13030912
Submission received: 13 February 2025 / Revised: 12 March 2025 / Accepted: 17 March 2025 / Published: 19 March 2025
(This article belongs to the Section Materials Processes)

Abstract:
The injection molding process is a traditional technique for manufacturing products in industries such as electronics and automobiles by solidifying liquid resin in molds. Research on reducing the defect rate of the injection molding process is ongoing. This study proposes an optimal injection molding process control system that reduces the defect rate of injection-molded products with eXplainable Artificial Intelligence (XAI) approaches. Boosting algorithms (XGBoost version 2.1.3 and LightGBM version 4.1.0) are used as tree-based classifiers to predict whether each product is normal or defective. The main features for controlling the process are extracted by Shapley Additive exPlanations (SHAP), while Individual Conditional Expectation (ICE) analysis determines the optimal control range of these features. To validate the methodology presented in this work, the actual injection molding AI manufacturing dataset provided by the Korea AI Manufacturing Platform (KAMP) is employed as a case study. The results reveal that the defect rate decreases from the original 1.00% to 0.21% with XGBoost and to 0.13% with LightGBM.

1. Introduction

During the injection molding process, liquid raw materials are injected into a mold and hardened to produce a product. It is widely used as an effective technique for mass-producing both large core components and small parts in industries such as automobiles, displays, and semiconductors. The injection molding process maintains relatively high quality and has been improved over time [1].
Injection molding manufacturers have recently applied machine learning, deep learning, and other artificial intelligence techniques to the injection molding process. To determine the optimal process conditions, researchers have integrated neural network models and genetic algorithms [2]. An artificial intelligence-based approach has been employed to increase the yield rate by reducing process downtime and defective parts [3]. Machine learning algorithms have been used to optimize the injection molding process by analyzing in-mold conditions [4]. The effects of cavity pressure and mold surface temperature on quality factors have been analyzed with XAI-based approaches [5]. However, machine learning and deep learning models often lack transparency and interpretability, making them unfamiliar to field operators.
The injection molding process has been continuously improved, thereby reaching a high yield rate [6]. However, pushing the process yield toward 100% from an already high-yield state requires fine-tuning of process variables. This paper aims to reduce the defect rate of injection-molded products by employing eXplainable Artificial Intelligence (XAI) algorithms to fine-tune the process variables.
Traditional machine learning techniques that exhibit black-box characteristics lack the ability to provide explanations for their predictions, thereby demonstrating limited reliability. This shortcoming poses significant challenges to their practical implementation in real-world processes. However, XAI methods provide clear reasons and justifications for the model’s outcomes. This feature makes XAI a suitable approach for fine-tuning process variables to improve the defect rate in injection molding processes. This paper aims to enhance the reliability of the process and achieve even higher yield rates by employing XAI. Also, XAI enables field experts to more easily understand AI predictions by providing evidence for model learning.
Shapley Additive exPlanations (SHAP) identifies the main features contributing to product defects. Tree-based algorithms, XGBoost and LightGBM, serve as the training models from which these features are extracted. The optimal control range of the features identified through SHAP is then determined using the Individual Conditional Expectation (ICE) algorithm.
The remainder of this paper is organized as follows. Section 1 introduces the motivation and purpose of this study. Section 2 describes previous studies. Section 3 presents a methodology that explains the process management method presented in this paper. Section 4 presents the experimental results using actual injection molding process data. Section 5 discusses the conclusions and future work.

2. Related Studies

2.1. Injection Process

The injection molding process is a plastic molding technique in which a melted thermoplastic resin is injected into a mold and cooled [7]. The structure of the injection molding process is shown in Figure 1.
The injection molding process has six stages, as shown in Figure 2: plasticization, clamping, filling, packing, cooling, and demolding/ejection [8].
  • Plasticization stage: The screw moves forward, and the plastic resin is melted by the heated barrel.
  • Clamping stage: The hydraulic system presses the fixed and movable halves of the mold tightly together.
  • Filling stage: The mold is filled with molten plastic resin from the nozzle.
  • Packing stage: To prevent volume shrinkage, pressure is applied before the plastic resin hardens completely.
  • Cooling stage: The molten plastic resin is cooled and hardened.
  • Demolding and ejection stage: When the mold is opened, the resin shrinks, and the product is ejected.
Figure 2. Injection process.
Injection-molded products are produced by repeating the clamping through ejection stages. Because the injection molding process produces finished products, high quality must be maintained. Therefore, optimal management of variables such as temperature and pressure, the major determinants of product quality, is essential for improving the product yield.
In the injection process, pressure and temperature are the main factors that affect product quality because they change the crystallization rate and the degree of curing. Controlling these parameters of the injection molding process is important for optimization in various fields. In injection process control for internal combustion engines, numerical analysis is performed by modeling and computer simulation based on multiple fuel injections [9]. The AVL Boost simulation application is used to monitor engine functionality; however, the simulation used only three monitoring conditions, whereas this study uses continuous feature conditions to propose the control range of the main features. In the medical field, research on injection molding process optimization is also being conducted. A polycaprolactone part development system has been proposed for future implants through improvements to several injection molding parameters, including the melting temperature, injection time, and injection pressure [10]. The results of this system demonstrate the potential of simulations as tools to optimize the injection molding process. However, the data used in that study are artificial data generated from the literature, so its application in actual processes still needs to be considered.
The injection molding process has a low defect rate; consequently, far fewer failure data are produced than normal product data. When artificial intelligence is applied to injection molding data, an imbalance between normal and defective samples is therefore inherent. In various studies, the SMOTE algorithm has been used to address this imbalance [11,12,13]. The Synthetic Minority Over-sampling TechniquE (SMOTE) is appropriate for addressing data imbalance in manufacturing processes because it generates new data points between existing variable values. This study employs SMOTE to augment the defective data and thereby resolve the imbalance problem.

2.2. eXplainable Artificial Intelligence (XAI)

Unlike conventional AI, XAI is an approach that increases reliability by presenting the validity of and grounds for machine learning predictions [14]. Conventional AI has a "black box" characteristic: it does not provide grounds for its prediction results. In 2017, the Defense Advanced Research Projects Agency proposed XAI to address this limitation of AI, as shown in Figure 3 [15]. Because of these characteristics of XAI, field experts can easily understand its prediction results.
Recently, research into XAI-based yield improvement has progressed. Zhang proposed a fault-diagnosis system for oil-immersed transformers [16]. The main fault features of power transformer failures were extracted through SHAP-based analysis; the system used SHAP for feature selection and achieved a recall of 0.96 on the fault samples [17]. However, no additional measures were taken for the selected features. The present study employs the ICE algorithm to provide field experts with the optimal control range of each selected feature.
To improve manufacturing quality, rule-based explanations have been performed based on ensemble machine learning [18]. Feature importance is used to obtain the most significant process conditions, and Partial Dependence Plot (PDP) and ICE plots provide a visual overview. However, feature importance does not consider the correlations between features; the SHAP algorithm extracts the main features by evaluating subsets of features and thus accounts for these correlations. In addition, this study uses PDP and ICE plots to determine the optimal control range of the main features.
In recent years, the application of machine learning models, particularly black-box models like deep neural networks (DNNs) and random forests, has gained significant traction in manufacturing optimization tasks, including the injection molding process. However, these models, while effective in capturing complex patterns within data, are often criticized for their lack of transparency, making it difficult for domain experts to interpret how decisions are made. For instance, a study by Liou applied deep learning techniques to optimize injection molding parameters, but the model lacked the interpretability needed to understand the underlying decision-making process, which limited its practical applicability [19]. In contrast, XAI-based algorithms offer a significant advantage by providing interpretability alongside predictive power. By explaining the contribution of each input feature to the model’s predictions, XAI allows experts to not only trust the results but also understand the underlying factors influencing the process. A recent study by Ribeiro introduced Local Interpretable Model-agnostic Explanations (LIME), which has been applied in various industrial contexts, showing how it can provide interpretable explanations for complex models [20]. This increased transparency facilitates better decision-making and enhances the practical value of AI models in industrial applications, positioning XAI as a more reliable and user-friendly alternative to traditional black-box models.

3. Methodology

The injection molding process is a traditional manufacturing method with a high production yield. This process is the final step in creating the surface of a product. Therefore, it is directly related to product defects, and strict yield management is required. Recently, XAI has become a state-of-the-art methodology for improving manufacturing processes. This paper presents a pilot study for implementing XAI to increase the injection molding process yield. This study aims to improve the injection molding process based on artificial intelligence, and the methodology of the study is shown in Figure 4.
The injection process shows a data imbalance between normal and defect data owing to its inherently high yield. To resolve this imbalance, the SMOTE technique is employed in the data preprocessing stage (Section 3.1). Then, a tree-based classifier (Section 3.2) trains a model to predict product defects. The SHAP algorithm (Section 3.3) extracts the major features that critically affect defect prediction. Finally, the control range of the major features is determined using the ICE algorithm (Section 3.4).

3.1. Data Preprocessing for Injection Process

This study uses the injection molding process data collected by sensors from a mold and machine [21]. The DataFrame is constructed by selecting controllable features such as temperature and pressure. The injection molding process has a high yield; therefore, the numbers of normal data and defect data are imbalanced, which results in a biased analysis. Therefore, oversampling is performed to balance the data used in the study. To solve this problem, this study employs the SMOTE algorithm for oversampling. SMOTE is a k-nearest neighbor (KNN)-based oversampling algorithm [22]. Figure 5 shows the operating principle of SMOTE.
First, one of the minority-class data points is selected; here, the defect class is the minority class, shown as the red squares (x_i) in Figure 5. The squares represent defect data for the injection molding process. One of the K nearest neighbors of that point is randomly selected, and the difference between the two points is multiplied by a weight to generate a new point, shown as the green squares (x_new) in Figure 5. The weight is drawn randomly between zero and one. The imbalance in the data is resolved by repeating this process until a sufficient amount of data has been generated. In this study, the defective data are oversampled until they equal the amount of normal data. Because injection molding process data are distributed within a similar range owing to the characteristics of the process, the SMOTE algorithm generates virtual defect data close to the original data.
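The interpolation rule described above can be sketched in a few lines of NumPy. This is an illustrative re-implementation, not the exact library routine used in the study, and the toy defect points are invented for the example:

```python
import numpy as np

def smote_oversample(minority, n_new, k=5, seed=0):
    """Generate n_new synthetic minority samples by interpolating between
    a random minority point and one of its k nearest neighbours:
    x_new = x_i + w * (x_k - x_i), with weight w drawn from U(0, 1)."""
    rng = np.random.default_rng(seed)
    synthetic = []
    for _ in range(n_new):
        i = rng.integers(len(minority))
        x_i = minority[i]
        # distances from x_i to every minority point
        d = np.linalg.norm(minority - x_i, axis=1)
        neighbours = np.argsort(d)[1:k + 1]   # skip x_i itself
        x_k = minority[rng.choice(neighbours)]
        w = rng.random()                      # random weight in (0, 1)
        synthetic.append(x_i + w * (x_k - x_i))
    return np.array(synthetic)

# toy defect data: six 2-D points standing in for the minority class
defects = np.array([[1.0, 1.0], [1.2, 0.9], [0.9, 1.1],
                    [1.1, 1.2], [1.3, 1.0], [1.0, 0.8]])
new_points = smote_oversample(defects, n_new=10)
```

Because each synthetic point is a convex combination of two existing defect points, all generated samples stay inside the region spanned by the original defect data, which matches the remark that SMOTE keeps virtual defects close to the originals.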

3.2. Tree-Based Classifier (XGBoost, LightGBM)

This study uses a tree-based classifier to learn and predict whether products are defective. The tree-based classifiers used in this study are XGBoost and LightGBM. XGBoost is a gradient-boosting-based algorithm that combines several weak decision trees to build a robust model [23,24]. XGBoost is widely used in many ways because of its parallel learning, fast calculation speed, and excellent performance. The learning process for XGBoost is shown in Algorithm 1.
Algorithm 1 XGBoost Algorithm
eXtreme Gradient Boosting (XGBoost)
Input: instance set I of the current node; feature dimension n
Procedure:
  score = 0
  G = Σ_{i∈I} g_i,  H = Σ_{i∈I} h_i
  for k = 1 to n do
    G_L = 0,  H_L = 0
    for j in sorted(I, by feature value x_jk) do
      G_L = G_L + g_j,  H_L = H_L + h_j
      G_R = G − G_L,  H_R = H − H_L
      score = max(score, G_L²/(H_L + λ) + G_R²/(H_R + λ) − G²/(H + λ))
    end
  end
Output: split with max score
Here, g_i and h_i denote the first- and second-order gradient statistics of the loss, and λ is the regularization term.
LightGBM is a gradient-boosting-based algorithm, like XGBoost [25,26]. Its primary technology is gradient-based one-side sampling (GOSS), which applies multiplier constants to low-weight objects. LightGBM uses memory more efficiently by growing the tree leaf-wise rather than level-wise; therefore, it exhibits good speed and performance. A level-wise tree requires additional operations to keep it balanced, whereas a leaf-wise tree is more efficient because it splits the node with the largest delta loss. The LightGBM learning process is shown in Algorithm 2.
Algorithm 2 LightGBM Algorithm
Light Gradient Boosting Machine (LightGBM)
Input:
  Training data: D = {(x_1, y_1), (x_2, y_2), …, (x_N, y_N)}, x_i ∈ X, y_i ∈ {−1, +1};
  Loss function: L(y, θ(x));
  Iterations: M;
  Big-gradient data sampling ratio: a;
  Small-gradient data sampling ratio: b
Procedure:
  1. Combine mutually exclusive features of x_i, i = 1, …, N (i.e., features that never simultaneously take nonzero values) by the Exclusive Feature Bundling (EFB) technique;
  2. Set θ_0(x) = argmin_c Σ_i L(y_i, c);
  3. for m = 1 to M do
  4.   Calculate the gradient absolute values:
         r_i = |∂L(y_i, θ(x_i)) / ∂θ(x_i)| at θ(x) = θ_{m−1}(x), i = 1, …, N;
  5.   Resample the dataset using the Gradient-based One-Side Sampling (GOSS) process:
         topN = a × len(D);  randN = b × len(D);
         sorted = GetSortedIndices(abs(r));
         A = sorted[1:topN];
         B = RandomPick(sorted[topN:len(D)], randN);
         D′ = A + B;
  6.   Calculate the information gains:
         V_j(d) = [ (Σ_{x_i∈A_l} r_i + ((1−a)/b) Σ_{x_i∈B_l} r_i)² / n_l^j(d)
                  + (Σ_{x_i∈A_r} r_i + ((1−a)/b) Σ_{x_i∈B_r} r_i)² / n_r^j(d) ] / n;
  7.   Develop a new decision tree θ_m′(x) on set D′;
  8.   Update θ_m(x) = θ_{m−1}(x) + θ_m′(x);
  9. end
Output: Return θ̃(x) = θ_M(x)
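The additive core loop shared by both boosting algorithms, θ_m(x) = θ_{m−1}(x) + new tree, can be illustrated with a minimal regression sketch that fits depth-1 stumps to residuals under squared loss. This is a conceptual toy (the paper's task is classification, and real XGBoost/LightGBM add regularized gains, histogram splits, EFB, and GOSS on top of this loop); all data below are invented:

```python
import numpy as np

def fit_stump(x, residual):
    """Find the threshold split of 1-D x that minimizes the squared
    error of the residual; return (threshold, left_mean, right_mean)."""
    best = (np.inf, None, 0.0, 0.0)
    for t in np.unique(x)[:-1]:
        left, right = residual[x <= t], residual[x > t]
        err = ((left - left.mean())**2).sum() + ((right - right.mean())**2).sum()
        if err < best[0]:
            best = (err, t, left.mean(), right.mean())
    return best[1:]

def gradient_boost(x, y, n_rounds=50, lr=0.1):
    pred = np.full_like(y, y.mean())       # theta_0: best constant model
    stumps = []
    for _ in range(n_rounds):
        residual = y - pred                # negative gradient of squared loss
        t, lv, rv = fit_stump(x, residual)
        # additive update: theta_m = theta_{m-1} + lr * new tree
        pred = pred + lr * np.where(x <= t, lv, rv)
        stumps.append((t, lv, rv))
    return pred, stumps

x = np.linspace(0, 1, 40)
y = np.sin(2 * np.pi * x)                  # toy target
pred, _ = gradient_boost(x, y)
```

After 50 rounds the ensemble of stumps approximates the target far better than the initial constant, which is the mechanism both boosting libraries exploit at scale.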

3.3. Shapley Additive exPlanations (SHAP)

The SHAP algorithm extracts the main features of the injection molding process by exploring the impact of each feature on product quality. The algorithm is based on Shapley's cooperative game theory, which examines how individuals make decisions when faced with interdependent circumstances [27]. The algorithm regards each manufacturing feature as a player in such a game. The impact of feature i is analyzed using the process described in Figure 6.
Figure 6. Procedure for obtaining the Shapley value.
$$v(S) = \int \hat{f}(x_1, \ldots, x_n)\, dP_{x \notin S} - E_X\big[\hat{f}(X)\big] \quad (1)$$

$$\phi_i(v) = \sum_{S \subseteq \{1, \ldots, n\} \setminus \{i\}} \frac{|S|!\,(n - |S| - 1)!}{n!} \big[ v(S \cup \{i\}) - v(S) \big] \quad (2)$$

$\phi_i$: Shapley value for manufacturing feature $i$.
$n$: total number of manufacturing features.
$S$: a subset that does not contain manufacturing feature $i$.
$v(S)$: contribution of subset $S$.
$v(S \cup \{i\})$: contribution of subset $S \cup \{i\}$.
The SHAP algorithm generates every possible subset of the manufacturing features. To examine the influence of manufacturing feature i, the contribution of a subset that does not contain the feature is subtracted from the contribution of the same subset with the feature added; the contribution of a subset is calculated as shown in (1). The importance of each feature, called the Shapley value, is then calculated as shown in (2). In this study, the Shapley values are used to select the main features. The mean absolute Shapley value is used to capture both negative and positive influences on the product. Figure 7 shows the Shapley value for each instance and the mean absolute Shapley value. The SHAP algorithm addresses the limitations of traditional variable importance methods (e.g., feature importance) by accounting for both negative and positive interactions between variables.
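Formula (2) can be evaluated exactly when the number of features is small. The sketch below enumerates every subset for an invented three-feature contribution function v(S) (not the injection data); in practice SHAP's tree explainers approximate this sum efficiently, since exact enumeration is exponential in n:

```python
import numpy as np
from itertools import combinations
from math import factorial

def shapley_values(contribution, n):
    """Exact Shapley values from a contribution function v(S), following
    phi_i = sum over S not containing i of |S|!(n-|S|-1)!/n! * (v(S+{i}) - v(S))."""
    phi = np.zeros(n)
    for i in range(n):
        others = [j for j in range(n) if j != i]
        for size in range(len(others) + 1):
            for S in combinations(others, size):
                weight = factorial(len(S)) * factorial(n - len(S) - 1) / factorial(n)
                phi[i] += weight * (contribution(S + (i,)) - contribution(S))
    return phi

# invented contribution: per-feature effects plus a bonus when 0 and 1 co-occur
effects = {0: 2.0, 1: 1.0, 2: 0.5}
def v(S):
    return sum(effects[j] for j in S) + (0.4 if 0 in S and 1 in S else 0.0)

phi = shapley_values(v, 3)   # the interaction bonus is split evenly between 0 and 1
```

The result illustrates two textbook properties: the values sum to v of the full set (efficiency), and the 0.4 interaction bonus is shared equally by the two interacting features, which is exactly the correlation-awareness the text contrasts with plain feature importance.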
The injection features are sorted in descending order of importance, and the main features are selected up to the point at which their cumulative importance reaches 70% of the total importance.
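The 70% cumulative-importance rule can be expressed as a short selection routine. The feature names and importance values below are illustrative placeholders, not the values reported for the actual dataset:

```python
import numpy as np

def select_main_features(names, mean_abs_shap, threshold=0.70):
    """Sort features by mean |SHAP| descending and keep those whose
    cumulative share of the total importance first reaches the threshold."""
    imp = np.asarray(mean_abs_shap, dtype=float)
    order = np.argsort(imp)[::-1]
    cum = np.cumsum(imp[order]) / imp.sum()
    n_keep = int(np.searchsorted(cum, threshold) + 1)
    return [names[i] for i in order[:n_keep]]

# hypothetical features and importances for illustration only
names = ["Max_Injection_Pressure", "Average_Back_Pressure",
         "Barrel_Temperature_5", "Max_Screw_RPM", "Mold_Temperature_4"]
importance = [0.40, 0.25, 0.15, 0.12, 0.08]
main = select_main_features(names, importance)
```

With these toy numbers the cumulative shares are 0.40, 0.65, 0.80, …, so the first three features cross the 70% line and are kept.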

3.4. ICE and PDP

To explore the conditions for improving injection quality, the ICE and PDP algorithms are employed to determine the control range of the main features. ICE predicts the target value of an instance as a single feature of the manufacturing process is varied. In the injection molding process, the target value is predicted by fixing the other features (e.g., temperature and RPM) and varying a particular feature (e.g., pressure) to propose a control range for that feature. The ICE process is presented in Algorithm 3.
Algorithm 3 Procedure Used by the ICE Algorithm to Predict the Control Range in the Injection Process
ICE algorithm to predict the control range in the injection molding process
Input:
  X_i: a specific manufacturing feature for presenting the control range
  X_{−i}: all manufacturing features except X_i
  N: number of instances
  p, q: instance indices
Procedure:
  1. Initialize the model with a constant value f̂(X_i^p, X_{−i}^q), p, q = 1, …, N
  2. for q = 1 to N:
       for p = 1 to N:
         X_i^p = the value of X_i at index p
         X_{−i}^q = the value of X_{−i} at index q
     Plot f̂(X_i^p, X_{−i}^q)
Output: ICE & PDP plot
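The loop in Algorithm 3 can be sketched directly: each ICE curve fixes one instance's other features and sweeps the feature of interest over a grid, and the PDP is the pointwise mean of those curves. The two-feature "model" and data below are invented stand-ins for the trained classifier:

```python
import numpy as np

def ice_and_pdp(model, X, feature, grid):
    """ICE: for each instance, vary one feature over `grid` while the other
    features stay fixed; PDP is the pointwise mean of the ICE curves."""
    ice = np.empty((len(X), len(grid)))
    for q, row in enumerate(X):          # one ICE curve per instance
        for p, g in enumerate(grid):
            x = row.copy()
            x[feature] = g               # sweep the feature of interest
            ice[q, p] = model(x)
    pdp = ice.mean(axis=0)               # PDP = average of ICE curves
    return ice, pdp

# toy "defect-free probability" model of pressure (col 0) and temperature (col 1)
model = lambda x: 1.0 / (1.0 + np.exp(-(0.8 * x[0] - 0.2 * x[1])))
X = np.array([[0.0, 1.0], [0.5, 2.0], [1.0, 0.5]])
grid = np.linspace(-2, 2, 9)
ice, pdp = ice_and_pdp(model, X, feature=0, grid=grid)
```

In a real run, `model` would be the trained XGBoost or LightGBM predictor and `grid` the observed range of the selected pressure feature.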

4. Experimental Results and Discussion

This paper aims to present a process yield improvement methodology using XAI-based algorithms. The main features are derived using SHAP, and their control range is determined using ICE.

4.1. Collection and Preprocessing for the Injection Process

This study uses automobile windshield side molding injection molding process data collected from 16 October 2020 to 19 November 2020. The total number of collected data points is 7990, and the number of features is 45. The total dataframe is shown in Table 1. The target value is “PassOrFail”, and it is expressed as 1 for normal products and 0 for defective products.
The preprocessing is performed in three steps. A dataframe is constructed by selecting 16 controllable features, such as temperature, pressure, and RPM, from the collected process features. Time features such as 'Filling_Time' and 'Ejection_Time', as well as position features, are excluded because they are not controllable. Products with different process indices are also excluded because they violate the control-variable assumption. Subsequently, the data are checked for missing values and outliers. An example of the selected process features is presented in Table 2.
Training and validation are performed using train–test splits. The training and test datasets are split in a 5:5 ratio, and each split dataset is listed in Table 3.
The SMOTE algorithm is used to balance the ratios of normal and defective data. The results of the oversampling are listed in Table 4.

4.2. Model Training for Injection Process

This study uses the tree-based classifiers XGBoost and LightGBM to train models that predict whether injection molding products are defective. The training dataset (normal data: 3964/defective data: 3964) is used for training, and the test dataset (normal data: 3955/defective data: 40) is used to check the accuracy of the models. Additionally, cross-validation with five subsets is performed to check model performance. For XGBoost, the reported cross-validation accuracies are 0.9947, 0.9977, and 0.9981, with a CV average accuracy of 0.9968. For LightGBM, the respective accuracies are 0.9924, 0.9955, and 0.9977, with a CV average accuracy of 0.9952. The results of XGBoost and LightGBM are presented in Table 5.

4.3. SHAP (Shapley Additive exPlanations)

To verify the importance of features in the injection molding process, the main features are extracted by using the SHAP algorithm. Figure 8 shows the mean absolute Shapley value of each manufacturing feature for XGBoost and LightGBM.
Each graph shows the importance of the manufacturing features in descending order. Features whose cumulative importance corresponds to 70% of the total are selected as the main features. The orange line shows the cumulative Shapley value, and the red box marks the main features. In the case of XGBoost, the main features are "Max Injection Pressure", "Average Back Pressure", "Max Switch Over Pressure", "Barrel Temperature 5", "Max Screw RPM", "Average Screw RPM", and "Barrel Temperature 1".
In the case of LightGBM, the main features are “Max Injection Pressure”, “Max Switch Over Pressure”, “Barrel Temperature 5”, “Average Back Pressure”, “Barrel Temperature 3”, and “Mold Temperature 4”. The selected main features and mean absolute Shapley values are listed in Table 6.

4.4. ICE and PDP

The ICE algorithm extracts the control range of the main features to reduce the process-defect rate. The ICE plots of the main features selected in Section 4.3 by each XGBoost and LightGBM are given in Figure 9 and Figure 10, respectively.
Each control range of the main features is presented according to the algorithm described in Section 3.4. The PDP is the average of the ICE experimental results, which are represented by orange dotted lines in Figure 9 and Figure 10. The minimum and maximum PDP values of each main feature are indicated by red lines in Figure 9 and Figure 10.
For example, in the case of Figure 10b, the maximum PDP value is 0.73, and the minimum value is 0.26. Both values are calculated according to the change in the x value Max_Switch_Over_Pressure. Table 7 and Table 8 show the control ranges of the main features for alpha values of 0.05, 0.1, and 0.2 based on the y-axis maximum values.
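One plausible reading of the alpha rule, sketched below for illustration, keeps the feature values whose PDP stays within alpha of the curve's maximum; this interpretation, the grid, and the PDP values are all assumptions, not the study's tabulated ranges:

```python
import numpy as np

def control_range(grid, pdp, alpha=0.05):
    """Return [min, max] of the feature values whose PDP value lies
    within alpha of the curve's maximum (an assumed reading of the
    'alpha below the y-axis maximum' rule)."""
    keep = pdp >= pdp.max() - alpha
    return grid[keep].min(), grid[keep].max()

# invented PDP over an invented pressure grid
grid = np.linspace(0, 10, 11)
pdp = np.array([0.2, 0.4, 0.6, 0.9, 0.95, 1.0, 0.97, 0.7, 0.5, 0.3, 0.1])
lo, hi = control_range(grid, pdp, alpha=0.05)
wide_lo, wide_hi = control_range(grid, pdp, alpha=0.2)
```

Consistent with the discussion in Section 4.5, a smaller alpha yields a tighter (more costly) control window, while a larger alpha widens it.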

4.5. Discussion

To validate the methodology, the test dataset presented in Table 9 is utilized. The test dataset is not oversampled to reflect the low defect rate of the actual process. Subsequently, the optimal control range specified in Table 7 and Table 8 is applied, and only the products produced within this range are selected. The defect rate from the test data set is compared with the original defect rate to determine whether the process has improved. The validation results are presented in Table 9.
As the alpha value decreases, the defect rate also decreases because the control range of the process features becomes more limited. However, tighter control of the process involves greater cost. Conversely, as the alpha value increases, the defect rate decreases less than in the previous case, but the cost required for process control also decreases. Therefore, the user should select the alpha value according to the manufacturing environment. In the case of LightGBM, for alpha values of 0.05 and 0.1, the defect rate cannot be calculated because no data exist in this range; this also indicates that no defective products are produced there. With XGBoost and an alpha value of 0.05, the defect rate decreases from the initial 1% to 0.21%. In all six experiments, the defect rate was lower than the original rate of 1.00%. On the basis of the validation in Table 9, LightGBM controls the injection molding process better than XGBoost. In addition, both algorithms require less than a minute to process the data.

5. Conclusions

This paper proposes an optimal injection molding process control model to minimize the defect rate during the injection molding process. The methodology proposed in this study selects the main features of the injection molding process and presents the control range of the main features by using XAI. To predict whether the products are defective, tree-based classifier models (XGBoost and LightGBM) are used. The main features affecting the product defectivity are selected using the SHAP algorithm. The control range of the selected main features is presented by using the ICE algorithm.
A test dataset was used to validate the defect rate reduction. The original test dataset consisted of 3955 normal data values and 40 defect data values, for a defect rate of 1.00%. Using XGBoost, the improved dataset comprised 969 normal data values and 2 defect data values, for a defect rate of 0.21%. Using LightGBM, the improved dataset consisted of 2314 normal data values and 3 defect data values, for a defect rate of 0.13%. The defect rate reductions were thus 0.79 and 0.87 percentage points, respectively.
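The headline figures follow directly from the normal/defect counts of the test dataset (3955 normal/40 defective; see Section 4.2) and of the products falling inside each control range:

```python
def defect_rate(n_normal, n_defect):
    """Defect rate in percent, rounded to two decimals."""
    return round(100.0 * n_defect / (n_normal + n_defect), 2)

original = defect_rate(3955, 40)   # full test set
xgb      = defect_rate(969, 2)     # products inside the XGBoost control range
lgbm     = defect_rate(2314, 3)    # products inside the LightGBM control range
```

This reproduces 1.00%, 0.21%, and 0.13%, i.e., reductions of 0.79 and 0.87 percentage points for XGBoost and LightGBM, respectively.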
This study proposes an optimal model for improving product yield using injection molding process data. Compared with traditional AI approaches, XAI allows injection domain experts who may lack expertise in AI to understand the results of the methodology. As the injection molding process is not performed automatically in this study, it could help support injection engineers in improving the yield rate by providing the main features with control ranges. The study authors collaborated with company L to decrease the defect rate in the injection molding process.
This study focuses on the controllable variables in the injection molding process. The field experts from company L identified 16 controllable features and excluded 29 features, including time and position features. The significance of this study therefore lies in its ability to improve process yield by adjusting the values of the main features identified by the methodology. It also enables field experts to more easily understand AI predictions by providing evidence for model learning through XAI.
However, the methodology presented in this paper has several limitations. The quality of the resin used in the injection molding process is not considered; this material is hard to analyze because the resin is purchased from vendors. Analyzing the correlation between material quality and product quality will be helpful in future research.
The methodology introduced in this paper also relies on sensor datasets from a manufacturing execution system (MES). Without such sensors and systems, the methodology cannot be implemented.
In order to enhance the product quality and reduce the knowledge gap between AI approaches and manufacturing experts, various MES datasets from different domains should be used. In addition, the application of neural-network-based classification models or reinforcement learning techniques should be analyzed for automated manufacturing processes in order to increase the product yield rates.

Author Contributions

Conceptualization, S.-W.K. and J.-W.B.; methodology, J.H. and Y.H.; software, Y.H.; validation, J.H. and Y.H.; formal analysis, J.H. and Y.H.; investigation, Y.H.; resources, J.H., Y.H. and S.-W.K.; data curation, Y.H.; writing-original draft preparation, J.H. and Y.H.; writing-review and editing, S.-W.K. and J.-W.B.; visualization, Y.H.; supervision, S.-W.K.; project administration, S.-W.K.; funding acquisition, S.-W.K. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the Technology Innovation Program (20023907, Development of Automation Equipment for Elevator Hoistway Installation) funded by the Ministry of Trade, Industry & Energy (MOTIE, Republic of Korea).

Data Availability Statement

Data are contained within the article.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
SHAP	Shapley Additive exPlanations
ICE	Individual Conditional Expectation
PDP	Partial Dependence Plot
XAI	eXplainable Artificial Intelligence
SMOTE	Synthetic Minority Over-sampling TechniquE
KAMP	Korea AI Manufacturing Platform
XGBoost	eXtreme Gradient Boosting
LightGBM	Light Gradient Boosting Machine

Figure 1. Structure of injection molding process.
Figure 3. eXplainable Artificial Intelligence (XAI).
Figure 4. Flowchart of the methodology.
Figure 5. Operating principle of the SMOTE algorithm.
Figure 7. Representative plots of the SHAP value.
Figure 8. Shapley value of manufacturing features ((left): XGBoost; (right): LightGBM).
Figure 9. ICE plots of XGBoost.
Figure 10. ICE plots of LightGBM.
Table 1. Example of injection process dataset.
Pass/Fail   Average_Screw_RPM   Max_Screw_RPM   Barrel_Temperature_1   Max_Injection_Pressure
1           292.5               30.7            276.5                  141.8
1           292.4               30.8            276.2                  141.7
1           292.5               30.8            276.2                  141.7
1           292.6               31.0            276.5                  141.5
1           292.6               30.8            276.8                  142.5
0           292.5               30.9            276.3                  142.6
1           292.5               31.0            275.5                  142.5
0           290.5               30.9            286.1                  142.6
Table 2. Independent variables of the injection molding process data.
Independent Variable (Unit)        Description
Max_Screw_RPM (mm/s)               Maximum speed of the screw for injection
Average_Screw_RPM (mm/s)           Average speed of the screw for injection
Max_Injection_Pressure (MPa)       Maximum pressure applied to the molten resin flowing into the mold
Max_Switch_Over_Pressure (MPa)     Pressure at the switch-over point from injection to packing pressure
Average_Back_Pressure (MPa)        Average pressure applied to prevent the screw from being pushed back
Barrel_Temperature_1~7 (°C)        Temperature of the barrel
Hopper_Temperature (°C)            Temperature of the hopper
Mold_Temperature_3, 4 (°C)         Temperature of the mold
Table 3. Result of the train–test split.
                Normal   Defective
Train Dataset   3964     31
Test Dataset    3955     40
Table 4. Oversampling results.
                Normal   Defective
Train Dataset   3964     3964
Test Dataset    3955     40
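The class balance in Table 4 is produced by SMOTE, which synthesizes new minority (defective) samples by interpolating between a real minority sample and one of its k nearest minority neighbors. The following is a minimal pure-Python sketch of that interpolation step only; `smote_sample` and its arguments are illustrative, and an actual study would use a library implementation such as imbalanced-learn's `SMOTE`.

```python
import random

def smote_sample(minority, k=5, n_new=1, seed=0):
    """Generate synthetic minority samples by interpolating between a
    sample and one of its k nearest neighbours (SMOTE's core idea)."""
    rng = random.Random(seed)
    synthetic = []
    for _ in range(n_new):
        base = rng.choice(minority)
        # squared Euclidean distances to the other minority samples
        dists = sorted(
            (sum((a - b) ** 2 for a, b in zip(base, other)), other)
            for other in minority if other is not base
        )
        neighbour = rng.choice(dists[:k])[1]
        gap = rng.random()  # interpolation factor in [0, 1)
        synthetic.append(tuple(a + gap * (b - a)
                               for a, b in zip(base, neighbour)))
    return synthetic

# Toy example: with two minority points, every synthetic point must lie
# on the line segment between them.
new = smote_sample([(0.0, 0.0), (1.0, 1.0)], k=1, n_new=3)
```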
Table 5. Model training results.
                                     Actual        Actual          Accuracy  Precision  Recall  CV Average
                                     Normal Data   Defective Data  (%)       (%)        (%)     Accuracy
XGBoost    Predicted Normal Data     3941          25              99.02     99.65      99.37   0.9964
           Predicted Defective Data  14            15
LightGBM   Predicted Normal Data     3941          25              99.02     99.65      99.37   0.9948
           Predicted Defective Data  14            15
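The metrics in Table 5 follow directly from the 2×2 confusion matrix. The small helper below reproduces them; the two-decimal rounding convention is an assumption, and the two class-wise ratios reproduce the table's 99.65 and 99.37 (the excerpt does not state which class the paper's precision and recall refer to).

```python
def pct(num, den):
    """Ratio expressed as a percentage, rounded to two decimals."""
    return round(100 * num / den, 2)

# Confusion matrix from Table 5 (identical for both models):
tn, fn = 3941, 25   # predicted normal:    actual normal / actual defective
fp, tp = 14, 15     # predicted defective: actual normal / actual defective

accuracy = pct(tn + tp, tn + fn + fp + tp)   # -> 99.02
ratio_a  = pct(tn, tn + fp)                  # 3941 / 3955 -> 99.65
ratio_b  = pct(tn, tn + fn)                  # 3941 / 3966 -> 99.37
```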
Table 6. Selected main features and mean of the absolute Shapley value.
XGBoost
Rank   Feature Name               Value   Cumulative Ratio
1      Max_Injection_Pressure     1.74    0.15
2      Average_Back_Pressure      1.52    0.28
3      Max_Switch_Over_Pressure   1.21    0.38
4      Barrel_Temperature_5       0.93    0.46
5      Max_Screw_RPM              0.80    0.53
6      Average_Screw_RPM          0.77    0.59
7      Barrel_Temperature_1       0.75    0.66

LightGBM
Rank   Feature Name               Value   Cumulative Ratio
1      Max_Injection_Pressure     2.05    0.17
2      Max_Switch_Over_Pressure   1.92    0.34
3      Barrel_Temperature_5       1.06    0.43
4      Average_Back_Pressure      1.04    0.51
5      Barrel_Temperature_3       0.94    0.59
6      Mold_Temperature_4         0.87    0.67
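Table 6 ranks features by the mean absolute SHAP value and tracks each feature's cumulative share of the total importance. A sketch of that selection logic follows; the feature names, values, and cutoff in the demo are illustrative only, not the paper's data.

```python
def rank_by_mean_abs_shap(mean_abs_shap, cutoff=0.6):
    """Rank features by mean |SHAP| value and keep the smallest prefix
    whose cumulative share of the total importance reaches `cutoff`."""
    ranked = sorted(mean_abs_shap.items(), key=lambda kv: kv[1], reverse=True)
    total = sum(v for _, v in ranked)
    selected, cum = [], 0.0
    for name, value in ranked:
        cum += value / total
        selected.append((name, value, round(cum, 2)))
        if cum >= cutoff:
            break
    return selected

# Hypothetical mean-|SHAP| values for illustration only
demo = rank_by_mean_abs_shap({"A": 4.0, "B": 3.0, "C": 2.0, "D": 1.0},
                             cutoff=0.6)
# -> [("A", 4.0, 0.4), ("B", 3.0, 0.7)]
```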
Table 7. Control range of the main features for three alpha values (XGBoost results).
Table 7. Control range of the main features for three alpha values (XGBoost results).
α0.050.10.2
Variable
Max_Injection_Pressure[141.60, 142.40][141.20, 183.20][141.20, 183.20]
Average_Back_Pressure[13.30, 90.80][13.30, 90.80][13.30, 90.80]
Max_Switch_Over_Pressure[115.60, 136.50][115.60, 136.52][115.60, 136.52]
Barrel_Temperature_5[236.30, 255.00][236.30, 266.40][236.30, 266.40]
Max_Screw_RPM[30.30, 31.20][30.30, 31.20][30.30, 31.20]
Average_Screw_RPM[29.00, 293.40][29.00, 293.40][29.00, 293.40]
Barrel_Temperature_1[244.70, 287.10][244.70, 287.10][244.70, 287.10]
Table 8. Control range of the main features for three alpha values (LightGBM results).
Table 8. Control range of the main features for three alpha values (LightGBM results).
Variable                   α = 0.05            α = 0.1             α = 0.2
Max_Injection_Pressure     [141.50, 142.20]    [141.20, 183.20]    [141.20, 183.20]
Max_Switch_Over_Pressure   [115.60, 119.00]    [115.60, 119.55]    [115.60, 136.80]
Barrel_Temperature_5       [236.30, 254.90]    [236.30, 255.00]    [236.30, 266.40]
Average_Back_Pressure      [13.30, 60.00]      [13.30, 60.00]      [13.30, 60.00]
Barrel_Temperature_3       [285.50, 285.80]    [245.00, 285.40]    [245.00, 285.40]
Mold_Temperature_4         [20.60, 22.60]      [20.60, 22.69]      [20.60, 27.70]
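Tables 7 and 8 report, for each main feature, an interval of values judged acceptable at significance level α. One way to derive such an interval from ICE curves is to sweep a single feature over a grid while holding every other feature of each instance fixed, and keep the grid values whose average predicted defect probability stays at or below α. The sketch below illustrates this idea under stated assumptions: the stand-in model, the grid, and the averaging criterion are all hypothetical, and the paper's exact rule may differ.

```python
def control_range(model, rows, feature_idx, grid, alpha):
    """Sweep one feature over `grid`, hold the rest of each row fixed
    (the ICE construction), and keep grid values whose average predicted
    defect probability is <= alpha. Returns the (min, max) of the
    admissible grid values, or None if none qualify."""
    ok = []
    for g in grid:
        probs = []
        for row in rows:
            x = list(row)
            x[feature_idx] = g   # perturb only the feature of interest
            probs.append(model(x))
        if sum(probs) / len(probs) <= alpha:
            ok.append(g)
    return (min(ok), max(ok)) if ok else None

# Toy stand-in model: defect probability jumps once pressure leaves [140, 143]
toy = lambda x: 0.0 if 140 <= x[0] <= 143 else 1.0
rng_ = control_range(toy, [[141.8], [142.5]], 0,
                     [139, 140, 141, 142, 143, 144], alpha=0.05)
# -> (140, 143)
```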
Table 9. Validation results.
XGBoost         Normal   Defect   Defect Rate (%)
α = 0.05        969      2        0.21
α = 0.1         2284     20       0.88
α = 0.2         2284     20       0.88
Original Data   3995     40       1.00

LightGBM        Normal   Defect   Defect Rate (%)
α = 0.05        N/A      N/A      N/A
α = 0.1         N/A      N/A      N/A
α = 0.2         2314     3        0.13
Original Data   3995     40       1.00
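The defect rates in Table 9 are simple ratios over the products whose main features fall inside the recommended control ranges. A minimal check of the two headline figures (the two-decimal rounding is an assumption):

```python
def defect_rate_pct(normal, defect):
    """Defect rate (%) among products inside the recommended control ranges."""
    return round(100 * defect / (normal + defect), 2)

# Table 9: XGBoost at alpha = 0.05 and LightGBM at alpha = 0.2
xgb_rate  = defect_rate_pct(969, 2)    # -> 0.21
lgbm_rate = defect_rate_pct(2314, 3)   # -> 0.13
```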
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
