Article

Decision Tree Methods for Predicting Surface Roughness in Fused Deposition Modeling Parts

Juan M. Barrios and Pablo E. Romero
Department of Mechanical Engineering, University of Cordoba, Medina Azahara Avenue 5, 14071 Cordoba, Spain
* Author to whom correspondence should be addressed.
Materials 2019, 12(16), 2574; https://doi.org/10.3390/ma12162574
Submission received: 20 July 2019 / Revised: 8 August 2019 / Accepted: 12 August 2019 / Published: 12 August 2019

Abstract

3D printing using fused deposition modeling (FDM) involves a multitude of control parameters. It is difficult to predict a priori what surface finish will be achieved when certain values are set for these parameters. The objective of this work is to compare the models generated by decision tree algorithms (C4.5, random forest, and random tree) and to analyze which one best predicts the surface roughness of polyethylene terephthalate glycol (PETG) parts printed using the FDM technique. The models have been created using a dataset of 27 instances with the following attributes: layer height, extrusion temperature, print speed, print acceleration, and flow rate. In addition, a dataset of 15 additional instances has been created to evaluate the models. The models generated by the random tree algorithm achieve the best results for predicting the surface roughness of FDM parts.

1. Introduction

Additive manufacturing or 3D printing techniques allow small batches of parts to be produced directly, economically, and flexibly [1]. There are different additive manufacturing techniques; however, fused deposition modeling (FDM) printers are the most widespread due to their low cost and the wide variety of materials that can be used [2,3,4].
FDM printers offer a large number of print parameters: print temperature, layer height, print speed, print acceleration, and flow rate, among others. When a part is printed, it is difficult to predict a priori whether it will have an adequate surface finish [5,6].
Data mining techniques are used to improve the quality of processes and products based on data gathered from previous experience [7,8,9]. For example, they have been used to identify the parameters that most influence surface finish in electrical discharge machining (EDM) processes [10], to predict tool wear in milling processes [11], and to increase the accuracy of high-speed machining of titanium alloys [12].
Data mining techniques can be classified into supervised (classification, regression) and unsupervised (clustering, association rules, correlations) methods. Classification techniques are widely utilized [9]. To use them, a set of classes must first be defined, and each instance in the database must be assigned to one of these classes; the remaining attributes of the instance are used to predict that class. The objective of these algorithms is to maximize the accuracy with which new instances are classified [13].
Decision trees are one of the most widely used classification techniques in data mining [13]. Tree nodes represent a condition on a given attribute, and the leaves indicate the number of instances that belong to a class and satisfy the conditions imposed in the preceding nodes. With this type of algorithm, it is possible to create models that predict whether a record with certain attributes will belong to one class or another. Several such algorithms exist, including C4.5 [14], random forest [15], and random tree [16].
Previous work has focused on the use of classification techniques in additive manufacturing processes. Wu et al. [17,18] applied random forest, k-nearest neighbor, and anomaly detection techniques to detect defects caused by a cyberattack on an FDM printer during part fabrication. Amini and Chang [19] used classification techniques to reduce defects in metal part manufacturing with selective laser melting (SLM) printers. Recently, Li et al. [20] generated and tested models based on different machine learning algorithms to predict the surface roughness of FDM-printed parts; in that case, the authors used a design of experiments with three variables (layer height, print temperature, and print speed/flow rate) and measured the roughness in a single direction.
The objective of this work is to analyze which decision tree algorithm (C4.5, random forest, or random tree) best predicts the surface finish (Ra,0 and Ra,90) of an FDM-printed part. Data mining models have been developed from a training dataset of 27 instances with the attributes layer height, print temperature, flow rate, print speed, print acceleration, Ra,0 class, and Ra,90 class. These models have then been tested using a dataset of 15 additional instances.

2. Materials and Methods

Figure 1 shows the stages that constitute the methodology followed in this work: 3D printing, surface roughness measurement, data mining, model generation, model testing, and comparison of the algorithms.

2.1. 3D Printing and Surface Roughness Measurements

To carry out the present study, 27 specimens of dimensions 25.0 mm × 25.0 mm × 2.4 mm were printed, following a fractional orthogonal design of experiments with five factors and three levels. The parameters used and the values assigned to each level are shown in Table 1.
The specimens were designed using SolidWorks software. The print parameter values were selected and the numerical control (NC) code was generated using CURA software. The specimens were manufactured on an Ender 3 printer with a nozzle diameter of 0.4 mm.
The polyethylene terephthalate glycol (PETG) filament was supplied by Smart Materials 3D (Smart Materials 3D, Alcalá la Real, Spain). Although few works on PETG have been published, it is a filament increasingly used in industry, as it combines mechanical characteristics similar to those of ABS with the printing ease of PLA [21].
Once printed, the roughness of the specimens was measured using a Mitutoyo SJ-201 surface roughness tester (Mitutoyo, Kawasaki, Japan). Five measurements were made on each specimen in each direction, 0° and 90°, as seen in Figure 2. The representative value was calculated as the arithmetic mean of the five measurements. The resulting values are shown in Table 2.
The models were generated using the free software WEKA (Waikato Environment for Knowledge Analysis). The roughness values were divided into two classes (class 1 and class 2), using the mean of the Ra,0 values and the mean of the Ra,90 values as thresholds; a minimal sketch of this class assignment is shown below. For Ra,0, class 1 covers 0–4.43 μm and class 2 covers 4.43–10.64 μm. For Ra,90, class 1 covers 0–11.63 μm and class 2 covers 11.63–32.99 μm. To validate the models obtained, 15 additional specimens were printed using random values for the parameters under study, as seen in Table 3. Likewise, five roughness measurements were taken on each specimen in each direction, and the representative values were obtained as the arithmetic mean.
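As an illustration, the class assignment described above can be reproduced with a simple thresholding rule. The sketch below is a minimal example only; the helper method and class names are hypothetical and not part of the original workflow, while the threshold values are those reported in the text.

```java
// Minimal sketch of the class assignment described above (hypothetical helper,
// not part of the original workflow). Thresholds are the mean values reported
// in the text: 4.43 um for Ra,0 and 11.63 um for Ra,90.
public class RoughnessClassifier {

    /** Returns "Class1" if the measured Ra is below or equal to the threshold, "Class2" otherwise. */
    static String roughnessClass(double ra, double threshold) {
        return (ra <= threshold) ? "Class1" : "Class2";
    }

    public static void main(String[] args) {
        double ra0 = 10.648;   // example Ra,0 value (specimen 1 of the training set)
        double ra90 = 12.240;  // example Ra,90 value (specimen 1 of the training set)
        System.out.println("Ra,0  -> " + roughnessClass(ra0, 4.43));   // Class2
        System.out.println("Ra,90 -> " + roughnessClass(ra90, 11.63)); // Class2
    }
}
```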

2.2. Data Mining and Decision Trees

The data mining process consists of several steps: (1) data integration and collection, which corresponds to the work described in the previous section; (2) selection, cleaning, and transformation, in which the data were prepared in the .arff format used by WEKA (a sketch of such a file is shown below); (3) data mining proper, in which the algorithms are applied to the data to generate patterns and models; (4) evaluation and interpretation of the information generated; and (5) knowledge generation and decision making.
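As an illustration of step (2), a training file for the Ra,0 model could be laid out as follows. This is only a minimal sketch of the .arff format used by WEKA: the relation and attribute names are chosen here for illustration (the authors' actual file and attribute labels are not specified), and the rows shown correspond to specimens 1–3 of Tables 2 and 4.

```
% Minimal sketch of a WEKA .arff training file for the Ra,0 model.
% Attribute names are illustrative; the rows shown are specimens 1-3
% from Tables 2 and 4 (LH, T, PS, PA, F, Ra,0 class).
@relation roughness_Ra0

@attribute LH numeric
@attribute T numeric
@attribute PS numeric
@attribute PA numeric
@attribute F numeric
@attribute Ra0_class {Class1,Class2}

@data
0.16,240,40,500,90,Class2
0.16,240,40,500,100,Class1
0.16,240,40,500,110,Class1
```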
In this work, three decision tree algorithms are compared: J48 (WEKA's implementation of C4.5), random forest, and random tree. The algorithms are briefly presented below.

2.2.1. J48 (C4.5)

J48 is WEKA's version of the C4.5 algorithm created by Quinlan [14], which generates decision trees. At each stage, the algorithm splits the data into groups according to the attribute that provides the best entropy-based split. This partitioning process is recursive and stops when all the records in a leaf belong to the same class or category.
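For reference, the entropy-based split criterion mentioned above can be written as follows. These are the standard textbook definitions (C4.5 actually normalizes the information gain by the split information to obtain the gain ratio), not expressions taken from the original paper. For a set of instances S, class proportions p_c, and a candidate attribute A:

```latex
% Entropy of a set S and information gain of a split on attribute A (standard definitions)
H(S) = -\sum_{c} p_c \log_2 p_c
\qquad
\mathrm{Gain}(S, A) = H(S) - \sum_{v \in \mathrm{values}(A)} \frac{|S_v|}{|S|}\, H(S_v)
```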

2.2.2. Random Forest

The random forest method was developed by Breiman [15]. It constructs multiple decision trees, each built from a bootstrap sample of the data (bootstrap aggregating, also known as bagging) and using random subsets of the variables; the predictions of the individual trees are then combined. This highly accurate algorithm is capable of handling hundreds of variables without excluding any [16].

2.2.3. Random Tree

The random tree algorithm lies halfway between a simple decision tree and a random forest. Random trees can be grouped into a set of predictor trees called a forest. The classification mechanism is as follows: the random tree classifier takes the input feature vector, classifies it with each tree in the forest, and outputs the class label that receives the most “votes” [13].
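As a complement to the descriptions above, the following sketch shows how the three classifiers can be built and evaluated programmatically with the WEKA Java API, in addition to WEKA's graphical interface. The file names and the use of default classifier parameters are assumptions; the authors' exact WEKA configuration is not specified in the paper.

```java
import weka.classifiers.Classifier;
import weka.classifiers.Evaluation;
import weka.classifiers.trees.J48;
import weka.classifiers.trees.RandomForest;
import weka.classifiers.trees.RandomTree;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class RoughnessModels {
    public static void main(String[] args) throws Exception {
        // Load the training (27 instances) and test (15 instances) sets; file names are illustrative.
        Instances train = DataSource.read("train.arff");
        Instances test = DataSource.read("test.arff");
        // The class attribute (Class1/Class2) is assumed to be the last column.
        train.setClassIndex(train.numAttributes() - 1);
        test.setClassIndex(test.numAttributes() - 1);

        Classifier[] classifiers = { new J48(), new RandomForest(), new RandomTree() };
        for (Classifier clf : classifiers) {
            clf.buildClassifier(train);            // build the model on the training set
            Evaluation eval = new Evaluation(train);
            eval.evaluateModel(clf, test);         // validate on the independent test set
            System.out.printf("%s: %.2f%% correct, kappa = %.4f, ROC area = %.3f%n",
                    clf.getClass().getSimpleName(),
                    eval.pctCorrect(), eval.kappa(), eval.weightedAreaUnderROC());
        }
    }
}
```

For J48, the textual form of the tree (the structure plotted in Figure 3 and Figure 4) can be obtained by simply printing the trained classifier, since its toString() method returns the tree.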

3. Results

This work proposes the use of decision trees and data mining techniques to predict the surface roughness class obtained for given values of the print parameters of flat PETG specimens manufactured by FDM. The dataset used as the training dataset to develop the models is shown in Table 4.
Algorithm J48 (C4.5) generates decision trees that can be represented graphically. Figure 3 shows the decision tree generated for Ra,0 from the corresponding training dataset. Figure 4 shows the decision tree generated for Ra,90.
Once the models had been generated using the algorithms studied, they were tested using the dataset shown in Table 5. The results of each algorithm for the prediction of Ra,0 are shown in Table 6 and Table 7. Likewise, the results for the prediction of Ra,90 are presented in Table 8 and Table 9. Table 10 shows the time used by each algorithm to construct and validate the models.
The J48 algorithm generates simple trees that can be easily understood and interpreted, as seen in Figure 3 and Figure 4. In these trees, it is clear which parameters are most important to reduce Ra,0 (PA, LH, F) and Ra,90 (F, PS, LH). However, the models created with the J48 algorithm are those that fit the test data worst, as seen in Table 6 and Table 8: a 60.00% success rate for Ra,0, a 73.33% success rate for Ra,90, and a negative kappa statistic in both cases. Table 7 and Table 9 show that the precision of the models created with the J48 algorithm is lower than that achieved by the other models, and that the area under the ROC curve is very low.
The random forest algorithm slightly improves on the results of J48, as seen in Table 6 and Table 8; the models created using this algorithm provide a 66.67% success rate for Ra,0 and an 80.00% success rate for Ra,90. In this case, the kappa statistic is close to zero for Ra,0 and negative for Ra,90. Table 7 and Table 9 show that the precision of both models increases, as does the area under the ROC curve. This algorithm does not generate trees that can be plotted.
The models generated by the random tree algorithm obtain the best results in this case, as seen in Table 6 and Table 8, with an 80.00% success rate for Ra,0 and an 86.67% success rate for Ra,90. It obtains a positive kappa statistic in both cases: 0.2857 for Ra,0 and 0.5946 for Ra,90. These values can be classified as fair and moderate, respectively, according to Table 11. Table 7 and Table 9 show that the precision of both models is higher than in the cases analyzed above; the area under the ROC curve is 0.673 for the Ra,0 model and 0.923 for the Ra,90 model. In addition, as can be seen in Table 10, this algorithm took the least time to build the models.
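For reference, the kappa statistic reported in Table 6 and Table 8 is Cohen's kappa, which compares the observed agreement p_o between predicted and actual classes with the agreement p_e expected by chance (standard definition, not taken from the original text):

```latex
% Cohen's kappa: p_o = observed agreement, p_e = agreement expected by chance
\kappa = \frac{p_o - p_e}{1 - p_e}
```

A value of 1 indicates perfect agreement, 0 indicates chance-level agreement, and negative values indicate worse-than-chance agreement, which is why the negative values obtained for J48 and random forest point to poor generalization despite their moderate accuracy.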

4. Discussion

In the present work, three classification tree algorithms were used to predict the surface finish of FDM-printed PETG flat specimens. The algorithms used were J48, random forest, and random tree. The software used to generate the models was WEKA [13].
The J48 algorithm has been widely used to study problems in the manufacturing area related to quality improvement in production processes [10,22]. In this work, J48 generated models that classified the training data very well. Additionally, this algorithm made it possible to create trees that can be plotted and easily understood [10,16,23,24]. It also allowed the parameters with the greatest influence on roughness to be identified: PA, LH, and F for Ra,0; F, PS, and LH for Ra,90. These results coincide with those obtained by the authors in previous work [25]: PA is the parameter with the greatest influence on Ra,0, while F, PS, and LH are the parameters with the greatest influence on Ra,90. However, in this case, the models created with the J48 algorithm were not able to predict the test data well. This may be related to overfitting [13,26].
In the problem addressed, the random forest algorithm obtained better results than J48, as could be expected from the literature [27,28,29]. Although it does not generate a graphical model, this algorithm is widely used by other authors in problems similar to the one studied [30].
In this work, the random tree algorithm generated the best models for Ra,0 and Ra,90. Previously, other authors have also chosen this algorithm for its properties [31]. An additional advantage of this algorithm is its calculation time, which is the fastest of those studied [32].

5. Conclusions

In the present work, three models were created and compared to predict the surface roughness of flat PETG pieces printed by FDM. For this purpose, data mining classification algorithms were used, namely J48 (C4.5), random forest, and random tree. The models were generated with the open source software WEKA.
The models generated by the random tree algorithm obtain the best results. They correctly classify 80.00% of the instances for Ra,0 and 86.67% of the instances for Ra,90. Random tree is the only algorithm of the three evaluated that achieves a positive kappa statistic, qualified as fair for Ra,0 and moderate for Ra,90. It also obtains the highest precision and area under the ROC curve and is the fastest algorithm of the three analyzed.
In future work, we intend to study whether decision trees can be used to generate models that predict the dimensional accuracy of parts manufactured by FDM. The impact of other print factors, such as nozzle diameter, on the surface properties of printed parts will also be studied.

Author Contributions

Conceptualization, P.E.R.; methodology, P.E.R.; software, P.E.R.; validation, J.M.B. and P.E.R.; formal analysis, P.E.R.; investigation, J.M.B.; resources, P.E.R.; data curation, J.M.B.; writing—original draft preparation, J.M.B.; writing—review and editing, P.E.R.; visualization, J.M.B.; supervision, P.E.R.; project administration, P.E.R.; funding acquisition, P.E.R.

Funding

This research was funded by the University of Cordoba, via its “Plan Propio de Investigacion”.

Acknowledgments

The authors would like to thank Smart Materials 3D (www.smartmaterials3d.com) for the material supplied. We must also thank the laboratory technician Ana Martin Bernal for her help during the work.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Lipson, H.; Kurman, M. Fabricated: The New World of 3D Printing; John Wiley & Sons Inc.: Indianapolis, IN, USA, 2013.
2. Carneiro, O.S.; Silva, A.F.; Gomes, R. Fused deposition modeling with polypropylene. Mater. Des. 2015, 83, 768–776.
3. Ahn, S.; Montero, M.; Odel, D.; Roundy, S.; Wright, P.K. Anisotropic material properties of fused deposition modeling ABS. Rapid Prototyp. J. 2002, 8, 248–257.
4. Messimer, S.L.; Pereira, T.R.; Patterson, A.E.; Lubna, M.; Drozda, F.O. Full-density fused deposition modeling dimensional error as a function of raster angle and build orientation: Large dataset for eleven materials. J. Manuf. Mater. Process. 2019, 3, 6.
5. Nancharaiah, T.; Raju, D.R.; Raju, V.R. An experimental investigation on surface quality and dimensional accuracy of FDM components. Int. J. Emerg. Technol. 2010, 1, 106–111.
6. Gordon, E.R.; Shokrani, A.; Flynn, J.M.; Goguelin, S.; Barclay, J.; Dhokia, V. A Surface Modification Decision Tree to Influence Design in Additive Manufacturing. In Sustainable Design and Manufacturing 2016. SDM2016. Smart Innovation, Systems and Technologies; Setchi, R., Howlett, R., Liu, Y., Theobald, P., Eds.; Springer: Cham, Switzerland, 2016; Volume 52, ISBN 9783319320984.
7. Harding, J.A.; Shahbaz, M.; Srinivas, S.; Kusiak, A. Data Mining in Manufacturing: A review. J. Manuf. Sci. Eng. 2006, 128, 969–976.
8. Choudhary, A.K.; Harding, J.A.; Tiwari, M.K. Data mining in manufacturing: A review based on the kind of knowledge. J. Intell. Manuf. 2009, 20, 501–521.
9. Köksal, G.; Batmaz, I.; Testik, M.C. A review of data mining applications for quality improvement in manufacturing industry. Expert Syst. Appl. 2011, 38, 13448–13467.
10. Kuriakose, S.; Mohan, K.; Shunmugam, M.S. Data mining applied to wire-EDM process. J. Mater. Process. Technol. 2003, 142, 182–189.
11. Wu, D.; Jennings, C.; Terpenny, J.; Gao, R.X.; Kumara, S. A comparative study on machine learning algorithms for smart manufacturing: Tool wear prediction using random forests. J. Manuf. Sci. Eng. 2017, 139, 071018.
12. Krishnakumar, P.; Rameshkumar, K.; Ramachandran, K.I. Acoustic Emission-Based Tool Condition Classification in a Precision High-Speed Machining of Titanium Alloy: A machine learning approach. Int. J. Comput. Intell. Appl. 2018, 17, 1850017.
13. Witten, I.H.; Frank, E.; Hall, M.A. Data Mining: Practical Machine Learning Tools and Techniques, 3rd ed.; Morgan Kaufmann: Burlington, MA, USA, 2011.
14. Quinlan, J.R. C4.5: Programs for Machine Learning; Morgan Kaufmann Publishers, Inc.: San Mateo, CA, USA, 2014.
15. Breiman, L. Random forests. Mach. Learn. 2001, 45, 5–32.
16. Zhao, Y.; Zhang, Y. Comparison of decision tree methods for finding active objects. Adv. Space Res. 2008, 41, 1955–1959.
17. Wu, M.; Song, Z.; Moon, Y.B. Detecting cyber-physical attacks in CyberManufacturing systems with machine learning methods. J. Intell. Manuf. 2019, 30, 1111–1123.
18. Wu, M.; Zhou, H.; Lin, L.L.; Silva, B.; Song, Z.; Cheung, J.; Moon, Y. Detecting attacks in cybermanufacturing systems: Additive manufacturing example. MATEC Web Conf. 2017, 108, 06005.
19. Amini, M.; Chang, S.I. MLCPM: A process monitoring framework for 3D metal printing in industrial scale. Comput. Ind. Eng. 2018, 124, 322–330.
20. Li, Z.; Zhang, Z.; Shi, J.; Wu, D. Prediction of surface roughness in extrusion-based additive manufacturing with machine learning. Robot. Comput. Integr. Manuf. 2019, 57, 488–495.
21. Smart Materials 3D. Available online: https://www.smartmaterials3d.com/en/ (accessed on 17 January 2019).
22. Kim, A.; Oh, K.; Jung, J.; Kim, B. Imbalanced classification of manufacturing quality conditions using cost-sensitive decision tree ensembles. Int. J. Comput. Integr. Manuf. 2017, 31, 701–717.
23. Ronowicz, J.; Thommes, M.; Kleinebudde, P.; Krysiński, J. A data mining approach to optimize pellets manufacturing process based on a decision tree algorithm. Eur. J. Pharm. Sci. 2015, 73, 44–48.
24. Rodríguez, J.; Quintana, G.; Bustillo, A.; Ciurana, J. A decision-making tool based on decision trees for roughness prediction in face milling. Int. J. Comput. Integr. Manuf. 2017, 30, 943–957.
25. Barrios, J.M.; Romero, P.E. Improvement of surface roughness and hydrophobicity in PETG parts manufactured via fused deposition modeling (FDM): An application in 3D printed self-cleaning parts. Materials 2019, 12, 2499.
26. Rokach, L. Decision forest: Twenty years of research. Inf. Fusion 2016, 27, 111–125.
27. Sathyadevan, S.; Nair, R.R. Comparative Analysis of Decision Tree Algorithms: ID3, C4.5 and Random Forest. In Computational Intelligence in Data Mining—Volume 1. Smart Innovation, Systems and Technologies; Jain, L., Behera, H., Mandal, J., Mohapatra, D., Eds.; Springer: New Delhi, India, 2015; Volume 31, pp. 549–562. ISBN 9788132222057.
28. Kwon, B.; Won, J.; Kang, D. Fast Defect Detection for Various Types of Surfaces using Random Forest with VOV Features. Int. J. Precis. Eng. Manuf. 2015, 16, 965–970.
29. Ali, J.; Khan, R.; Ahmad, N.; Maqsood, I. Random Forests and Decision Trees. Int. J. Comput. Sci. Issues 2012, 9, 272–278.
30. Amand, B.; Cordy, M.; Heymans, P.; Acher, M.; Temple, P.; Jézéquel, J. Towards learning-aided configuration in 3D printing: Feasibility study and application to defect prediction. In Proceedings of the 13th International Workshop on Variability Modelling of Software Intensive Systems (VaMoS'19), Leuven, Belgium, 6–8 February 2019; Perrouin, G., Weyns, D., Eds.; ACM: New York, NY, USA, 2019; p. 9.
31. Kevric, J.; Jukic, S.; Subasi, A. An effective combining classifier approach using tree algorithms for network intrusion detection. Neural Comput. Appl. 2017, 28, 1051–1058.
32. Ravichandran, S.; Srinivasan, V.B.; Ramasamy, C. Comparative study on decision tree techniques for mobile call detail record. J. Commun. Comput. 2012, 9, 1331–1335.
Figure 1. Stages that compose the methodology followed in the present work: 3D printing, surface roughness measurement, data mining, model generation, model testing, and comparison of the algorithms.
Figure 2. Measurement of surface roughness in the direction parallel to the direction of extrusion (a) and in the direction perpendicular to the direction of extrusion (b).
Figure 3. J48 (C4.5) decision tree for Ra,0.
Figure 4. J48 (C4.5) decision tree for Ra,90.
Table 1. Factors and levels used in design of the experiments (DOE).
Print Parameter | Level 1 | Level 2 | Level 3
Layer height (LH), mm | 0.16 | 0.20 | 0.24
Temperature (T), °C | 240 | 245 | 250
Print speed (PS), mm/s | 40 | 50 | 60
Print acceleration (PA), mm/s² | 500 | 1000 | 1500
Flow rate (F), % | 90 | 100 | 110
Table 2. Training dataset: design of experiment (L27), according to Taguchi method: layer height (LH), print temperature (T), print speed (PS), print acceleration (PA), and flow rate (F).
No. | LH (mm) | T (°C) | PS (mm/s) | PA (mm/s²) | F (%)
1 | 0.16 | 240 | 40 | 500 | 90
2 | 0.16 | 240 | 40 | 500 | 100
3 | 0.16 | 240 | 40 | 500 | 110
4 | 0.16 | 245 | 50 | 1000 | 110
5 | 0.16 | 245 | 50 | 1000 | 90
6 | 0.16 | 245 | 50 | 1000 | 100
7 | 0.16 | 250 | 60 | 1500 | 100
8 | 0.16 | 250 | 60 | 1500 | 110
9 | 0.16 | 250 | 60 | 1500 | 90
10 | 0.20 | 240 | 50 | 1500 | 100
11 | 0.20 | 240 | 50 | 1500 | 110
12 | 0.20 | 240 | 50 | 1500 | 90
13 | 0.20 | 245 | 60 | 500 | 90
14 | 0.20 | 245 | 60 | 500 | 100
15 | 0.20 | 245 | 60 | 500 | 110
16 | 0.20 | 250 | 40 | 1000 | 110
17 | 0.20 | 250 | 40 | 1000 | 90
18 | 0.20 | 250 | 40 | 1000 | 100
19 | 0.24 | 240 | 60 | 1000 | 110
20 | 0.24 | 240 | 60 | 1000 | 90
21 | 0.24 | 240 | 60 | 1000 | 100
22 | 0.24 | 245 | 40 | 1500 | 100
23 | 0.24 | 245 | 40 | 1500 | 110
24 | 0.24 | 245 | 40 | 1500 | 90
25 | 0.24 | 250 | 50 | 500 | 90
26 | 0.24 | 250 | 50 | 500 | 100
27 | 0.24 | 250 | 50 | 500 | 110
Table 3. Test dataset: print parameter values selected at random for the 15 additional specimens: layer height (LH), print temperature (T), print speed (PS), print acceleration (PA), and flow rate (F).
No. | LH (mm) | T (°C) | PS (mm/s) | PA (mm/s²) | F (%)
1 | 0.14 | 240 | 15 | 200 | 95
2 | 0.14 | 236 | 18 | 300 | 105
3 | 0.18 | 238 | 15 | 300 | 115
4 | 0.18 | 243 | 20 | 200 | 95
5 | 0.14 | 246 | 35 | 400 | 105
6 | 0.23 | 248 | 46 | 400 | 95
7 | 0.24 | 243 | 45 | 600 | 103
8 | 0.30 | 230 | 56 | 2000 | 100
9 | 0.30 | 250 | 60 | 1600 | 110
10 | 0.20 | 251 | 70 | 1500 | 102
11 | 0.20 | 249 | 85 | 1200 | 100
12 | 0.28 | 249 | 100 | 1200 | 98
13 | 0.28 | 237 | 25 | 1100 | 90
14 | 0.14 | 238 | 21 | 800 | 100
15 | 0.10 | 239 | 50 | 600 | 110
Table 4. Training dataset: results for surface roughness (Ra,0, Ra,90).
Test | Ra,0 (µm) | Ra,0 Class | Ra,90 (µm) | Ra,90 Class
1 | 10.648 | Class2 | 12.240 | Class2
2 | 0.916 | Class1 | 6.464 | Class1
3 | 1.126 | Class1 | 9.160 | Class1
4 | 2.428 | Class1 | 32.994 | Class2
5 | 1.800 | Class1 | 5.504 | Class1
6 | 8.814 | Class2 | 10.922 | Class1
7 | 4.552 | Class2 | 23.650 | Class2
8 | 1.370 | Class1 | 14.458 | Class2
9 | 0.954 | Class1 | 5.414 | Class1
10 | 1.462 | Class1 | 23.470 | Class2
11 | 1.666 | Class1 | 9.050 | Class1
12 | 1.554 | Class1 | 10.074 | Class1
13 | 6.258 | Class2 | 20.088 | Class2
14 | 7.788 | Class2 | 15.368 | Class2
15 | 10.172 | Class2 | 12.560 | Class2
16 | 9.744 | Class2 | 10.186 | Class2
17 | 4.696 | Class2 | 5.462 | Class1
18 | 5.112 | Class2 | 5.330 | Class1
19 | 4.274 | Class2 | 10.668 | Class1
20 | 6.994 | Class1 | 8.214 | Class1
21 | 5.868 | Class2 | 6.056 | Class1
22 | 3.796 | Class2 | 8.680 | Class1
23 | 3.054 | Class1 | 5.720 | Class1
24 | 3.702 | Class1 | 6.804 | Class1
25 | 4.124 | Class1 | 19.654 | Class1
26 | 4.682 | Class1 | 8.964 | Class2
27 | 2.256 | Class2 | 7.122 | Class1
Table 5. Testing dataset: results for surface roughness (Ra,0, Ra,90).
Test | Ra,0 (µm) | Ra,0 Class | Ra,90 (µm) | Ra,90 Class
1 | 1.026 | Class1 | 4.462 | Class1
2 | 1.178 | Class1 | 2.656 | Class1
3 | 2.064 | Class1 | 3.97 | Class1
4 | 1.126 | Class1 | 7.192 | Class1
5 | 1.984 | Class1 | 9.24 | Class1
6 | 1.252 | Class1 | 8.276 | Class1
7 | 1.026 | Class1 | 4.462 | Class1
8 | 1.744 | Class1 | 7.732 | Class1
9 | 5.906 | Class2 | 11.408 | Class1
10 | 3.99 | Class1 | 12.082 | Class2
11 | 1.182 | Class1 | 6.392 | Class1
12 | 1.008 | Class1 | 6.622 | Class1
13 | 6.466 | Class2 | 14.002 | Class2
14 | 0.606 | Class1 | 6.828 | Class1
15 | 1.846 | Class1 | 4.758 | Class1
Table 6. Indicators to compare the models generated by the studied algorithms to predict Ra,0.
Indicator | J48 | Random Forest | Random Tree
Correctly Classified Instances | 60.00% | 66.67% | 80.00%
Incorrectly Classified Instances | 40.00% | 33.33% | 20.00%
Kappa statistic | −0.2162 | 0.1176 | 0.2857
Mean absolute error | 0.4926 | 0.429 | 0.2
Root mean squared error | 0.595 | 0.474 | 0.4472
Relative absolute error | 106.60% | 92.84% | 43.28%
Root relative squared error | 128.39% | 102.29% | 96.50%
Table 7. Detailed precision parameters achieved by each algorithm for the Ra,0 prediction model.
Detailed Accuracy (weighted av.) | J48 | Random Forest | Random Tree
True Positive (TP) Rate | 0.600 | 0.667 | 0.800
False Positive (FP) Rate | 0.908 | 0.474 | 0.454
Precision | 0.709 | 0.807 | 0.839
Recall | 0.600 | 0.667 | 0.800
F-measure | 0.650 | 0.716 | 0.816
MCC | −0.237 | 0.139 | 0.294
ROC Area | 0.154 | 0.692 | 0.673
PRC Area | 0.697 | 0.868 | 0.819
Table 8. Indicators to compare the models generated by the studied algorithms to predict Ra,90.
Indicator | J48 | Random Forest | Random Tree
Correctly Classified Instances | 73.33% | 80.00% | 86.67%
Incorrectly Classified Instances | 26.67% | 20.00% | 13.33%
Kappa statistic | −0.1538 | −0.0976 | 0.5946
Mean absolute error | 0.2556 | 0.2888 | 0.1333
Root mean squared error | 0.4645 | 0.3854 | 0.3651
Relative absolute error | 66.17% | 74.57% | 34.52%
Root relative squared error | 116.01% | 96.26% | 91.20%
Table 9. Detailed precision parameters achieved by each algorithm for the Ra,90 prediction model.
Detailed Accuracy (weighted av.) | J48 | Random Forest | Random Tree
True Positive (TP) Rate | 0.733 | 0.800 | 0.867
False Positive (FP) Rate | 0.887 | 0.877 | 0.021
Precision | 0.733 | 0.743 | 0.933
Recall | 0.733 | 0.800 | 0.867
F-measure | 0.733 | 0.770 | 0.883
MCC | −0.154 | −0.105 | 0.650
ROC Area | 0.385 | 0.481 | 0.923
PRC Area | 0.745 | 0.797 | 0.916
Table 10. Time used by each algorithm to build and validate the model.
Algorithm | Computing Time for Ra,0 Model (s) | Computing Time for Ra,90 Model (s)
J48 | 0.11 | 0.19
Random Forest | 0.05 | 0.34
Random Tree | 0.01 | 0.01
Table 11. Strength of concordance for kappa statistic.
Kappa Statistic | Strength of Concordance
0.00 | Poor
0.01–0.20 | Slight
0.21–0.40 | Fair
0.41–0.60 | Moderate
0.61–0.80 | Substantial
0.81–1.00 | Almost perfect
