Improving Innovation from Science Using Kernel Tree Methods as a Precursor to Designed Experimentation
Abstract
1. Introduction
2. Material and Methods
2.1. Dataset Descriptions
2.2. Description of Predictor Variables
2.3. Kernel Tree Methods
3. Fractional Factorials and Aliasing
4. Results
4.1. Boosted Tree Models and Bootstrap Forest
4.2. RT Models as a Precursor for Response Surface Methods
5. Conclusions
Author Contributions
Funding
Conflicts of Interest
References
- Box, G.E.P.; Behnken, D. Some new three level designs for the study of quantitative variables. Technometrics 1960, 2, 455–475.
- Fielding, A. Binary segmentation: The automatic detector and related techniques for exploring data structure. In The Analysis of Survey Data, Exploring Data Structures; O’Muircheartaigh, C.A., Payne, C., Eds.; John Wiley and Sons, Inc.: New York, NY, USA, 1977; Volume I, pp. 221–257.
- Kass, G.V. Significance testing in automatic interaction detection (A.I.D.). Appl. Stat. 1975, 24, 178–189.
- Loh, W.Y. Regression trees with unbiased variable selection and interaction detection. Stat. Sin. 2002, 12, 361–386.
- Morgan, J.N.; Sonquist, J.A. Problems in the analysis of survey data and a proposal. J. Am. Stat. Assoc. 1963, 58, 415–434.
- Friedman, J.H. Greedy function approximation: A gradient boosting machine. Ann. Stat. 2001, 29, 1189–1232.
- Friedman, J.H. Stochastic gradient boosting. Comput. Stat. Data Anal. 2002, 38, 367–378.
- Friedman, J.H.; Meulman, J.J. Multiple additive regression trees with application in epidemiology. Stat. Med. 2003, 22, 1365–1381.
- Kim, H.; Loh, W.Y. Classification trees with unbiased multiway splits. J. Am. Stat. Assoc. 2001, 96, 589–604.
- Kim, H.; Loh, W.Y. Classification trees with bivariate linear discriminant node models. J. Comput. Graph. Stat. 2003, 12, 512–530.
- Kim, H.; Guess, F.M.; Young, T.M. Using data mining tools of decision trees in reliability applications. IIE Trans. 2011, 43, 43–54.
- Stoma, P.; Stoma, M.; Dudziak, A.; Caban, J. Bootstrap analysis of the production processes capability assessment. Appl. Sci. 2019, 9, 5360.
- Akaike, H. A new look at the statistical model identification. IEEE Trans. Autom. Control 1974, 19, 716–723.
- Schwarz, G. Estimating the dimension of a model. Ann. Stat. 1978, 6, 461–464.
- Adcock, T.; Wolcott, M.P. Wood: Structural panel processes. In Encyclopedia of Materials: Science and Technology; Buschow, K.H.J., Cahn, R.W., Flemings, M.C., Ilschner, B., Kramer, E.J., Mahajan, S., Veyssière, P., Eds.; Elsevier: Amsterdam, The Netherlands, 2001; pp. 9678–9683.
- Kamke, F.A. Wood: Nonstructural panel processes. In Encyclopedia of Materials: Science and Technology; Buschow, K.H.J., Cahn, R.W., Flemings, M.C., Ilschner, B., Kramer, E.J., Mahajan, S., Veyssière, P., Eds.; Elsevier: Amsterdam, The Netherlands, 2001; pp. 9673–9678.
- Chaudhuri, P.; Huang, M.C.; Loh, W.Y.; Yao, R. Piecewise-polynomial regression trees. Stat. Sin. 1994, 4, 143–167.
- De’ath, G.; Fabricius, K. Classification and regression trees: A powerful yet simple technique for ecological data analysis. Ecology 2000, 81, 3178–3192.
- Loh, W.Y.; Vanichsetakul, N. Tree-structured classification via generalized discriminant analysis. J. Am. Stat. Assoc. 1988, 83, 715–728.
- Hand, D.J.; Mannila, H.; Smyth, P. Principles of Data Mining (Adaptive Computation and Machine Learning), 3rd ed.; MIT Press: Cambridge, MA, USA, 2001.
- André, N.; Young, T.M. Real-time process modeling of particleboard manufacture using variable selection and regression methods ensemble. Eur. J. Wood Wood Prod. 2013, 71, 361–370.
- Carty, D.M.; Young, T.M.; Zaretzki, R.L.; Guess, F.M.; Petutschnigg, A. Predicting the strength properties of wood composites using boosted regression trees. Forest Prod. J. 2015, 65, 365–371.
- Cherkassky, V.S.; Mulier, F. Learning from Data: Concepts, Theory, and Methods; John Wiley & Sons, Inc.: New York, NY, USA, 1998; pp. 1–536.
- Fayyad, U.M.; Piatetsky-Shapiro, G.; Smyth, P. From Data Mining to Knowledge Discovery: An Overview of Advances in Knowledge Discovery and Data Mining; The MIT Press: Cambridge, MA, USA, 1996; pp. 1–34.
- Loh, W.Y. Classification and regression trees. WIREs Data Min. Knowl. 2011, 1, 14–23.
- Young, T.M.; León, R.V.; Chen, C.-H.; Chen, W.; Guess, F.M.; Edwards, D.J. Robustly estimating lower percentiles when observations are costly. Qual. Eng. 2015, 27, 361–373.
- Young, T.M.; Clapp, N.E., Jr.; Guess, F.M.; Chen, C.-H. Predicting key reliability response with limited response data. Qual. Eng. 2014, 26, 223–232.
- Zeng, Y.; Young, T.M.; Edwards, D.J.; Guess, F.M.; Chen, C.-H. Case studies: A study of missing data imputation in predictive modeling of a wood composite manufacturing process. J. Qual. Technol. 2016, 48, 284–296.
- Breiman, L.; Friedman, J.H.; Olshen, R.A.; Stone, C.J. Classification and Regression Trees; Wadsworth: Belmont, CA, USA, 1984; pp. 199–215.
- Luna, J.M.; Gennatas, E.D.; Ungar, L.H.; Valdes, G. Building more accurate decision trees with the additive tree. Proc. Natl. Acad. Sci. USA 2019, 116, 19887–19893.
- Schapire, R.E. The boosting approach to machine learning: An overview. In MSRI Workshop on Nonlinear Estimation and Classification; Denison, D.D., Hansen, M.H., Holmes, C., Mallick, B., Yu, B., Eds.; Springer: New York, NY, USA, 2003; pp. 113–141.
- Feng, J.; Yu, Y.; Zhou, Z.-H. Multi-layered gradient boosting decision trees. In Proceedings of the 32nd Conference on Neural Information Processing Systems (NeurIPS), Montréal, QC, Canada, 3–8 December 2018; pp. 3555–3565.
- Khan, Z.; Gul, A.; Perperoglou, A. Ensemble of optimal trees, random forest and random projection ensemble classification. Adv. Data Anal. Classif. 2020, 14, 97–116.
- Khuri, N. Mining environmental chemicals with boosted trees. In Proceedings of the 35th Annual ACM Symposium on Applied Computing, Brno, Czech Republic, 30 March–3 April 2020; pp. 1082–1089.
- Elith, J.; Leathwick, J.R.; Hastie, T. A working guide to boosted regression trees. J. Anim. Ecol. 2008, 77, 802–813.
- Breiman, L. Random forests. Mach. Learn. 2001, 45, 5–32.
- Fawagreh, K.; Gaber, M.M.; Elyan, E. Random forests: From early developments to recent advancements. Syst. Sci. Control Eng. 2014, 2, 602–609.
- Amit, Y.; Geman, D. Shape quantization and recognition with randomized trees. Neural Comput. 1997, 9, 1545–1588.
- Ho, T.K. The random subspace method for constructing decision forests. IEEE Trans. Pattern Anal. Mach. Intell. 1998, 20, 832–844.
- Boinee, P.; De Angelis, A.; Foresti, G.L. Meta random forests. Int. J. Comput. Int. Syst. 2005, 2, 138–147.
- Gregorutti, B.; Michel, B.; Saint-Pierre, P. Correlation and variable importance in random forests. Stat. Comput. 2017, 27, 659–678.
- Jaiswal, J.K.; Samikannu, R. Application of random forest algorithm on feature subset selection and classification and regression. In Proceedings of the World Congress on Computing and Communication Technologies (WCCCT), Tiruchirappalli, India, 2–4 February 2017; pp. 65–68.
- Liaw, A.; Wiener, M. Classification and regression by randomForest. R News 2002, 2, 18–22.
- Attewell, P.; Monaghan, D. Data Mining for the Social Sciences: An Introduction; University of California Press: Berkeley, CA, USA, 2015; pp. 1–264.
- Fisher, R.A. The Design of Experiments; Hafner Publishing Company: New York, NY, USA, 1971; pp. 23–36.
- Box, G.E.P. Science and statistics. J. Am. Stat. Assoc. 1976, 71, 791–799.
- Pattengale, N.D.; Alipour, M.; Bininda-Emonds, O.R.P.; Moret, B.M.E.; Stamatakis, A. How many bootstrap replicates are necessary? In Research in Computational Molecular Biology (RECOMB 2009); Batzoglou, S., Ed.; Lecture Notes in Computer Science 5541; Springer: Berlin/Heidelberg, Germany, 2009; pp. 184–200.
- Box, G.E.P.; Draper, N.R. Empirical Model Building and Response Surfaces; John Wiley and Sons: New York, NY, USA, 1987; pp. 1–688.
Tensile Strength: Model Comparisons

Distribution | AIC | BIC
---|---|---
Normal | 4966.5326 | 4975.3464
Generalized Gamma | 4967.6229 | 4980.8336
Log Generalized Gamma | 4967.6304 | 4980.8411
Lognormal | 4970.1264 | 4978.9402
Logistic | 4978.8342 | 4987.6480
Loglogistic | 4981.3085 | 4990.1222
Weibull | 5028.7795 | 5037.5932
LEV | 5044.7598 | 5053.5736
SEV | 5087.5301 | 5096.3439
Frechet | 5110.6015 | 5119.4153
Exponential | 7255.9066 | 7260.3168
Ultimate Static Load: Model Comparisons

Distribution | AIC | BIC
---|---|---
Normal | −524.1410 | −518.2014
Generalized Gamma | −522.2338 | −513.3663
Log Generalized Gamma | −522.0893 | −513.2218
Lognormal | −521.9522 | −516.0125
Logistic | −518.1157 | −512.1760
Loglogistic | −516.7510 | −510.8113
Weibull | −516.6517 | −510.7120
SEV | −507.4300 | −501.4904
LEV | −504.8098 | −498.8702
Frechet | −491.4876 | −485.5480
Exponential | 30.1596 | 33.1432
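The AIC and BIC rankings above penalize log-likelihood by the number of fitted parameters (AIC = 2k − 2 ln L; BIC = k ln n − 2 ln L). A minimal sketch of how such a ranking can be produced, using scipy distributions fitted to simulated data (the values and variable names here are illustrative, not the study's data):

```python
# Rank candidate distributions for a strength response by AIC and BIC.
# Simulated data; the real study used 408 tensile-strength observations.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
tensile = rng.normal(loc=950.0, scale=94.7, size=408)  # simulated response

candidates = {
    "Normal": stats.norm,
    "Lognormal": stats.lognorm,
    "Logistic": stats.logistic,
    "Weibull": stats.weibull_min,
}

results = []
n = len(tensile)
for name, dist in candidates.items():
    params = dist.fit(tensile)                 # maximum-likelihood fit
    loglik = np.sum(dist.logpdf(tensile, *params))
    k = len(params)                            # number of estimated parameters
    aic = 2 * k - 2 * loglik
    bic = k * np.log(n) - 2 * loglik
    results.append((name, aic, bic))

# Lower AIC/BIC indicates the better-supported distribution.
for name, aic, bic in sorted(results, key=lambda r: r[1]):
    print(f"{name:10s}  AIC={aic:10.2f}  BIC={bic:10.2f}")
```

With normally distributed input, the Normal fit should rank at or near the top, mirroring the tables, where Normal had the lowest AIC and BIC for both responses.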
Bootstrap Forest (k = 1000 Bootstraps) vs. Boosted Tree (k = 1000)

Bootstrap Forest: Variable | Number of Splits | Sums of Squares | Boosted Tree: Variable | Number of Splits | Sums of Squares
---|---|---|---|---|---
Face Plate Position | 25 | 2771.68 | Face Plate Position | 9 | 26,525.05
Core Dust Speed | 13 | 1052.24 | Press Position Time | 6 | 17,736.67
Press Position Time | 12 | 796.26 | Core Dust Speed | 7 | 14,952.18
Total Press Time | 11 | 594.18 | Face Steam Flow | 4 | 9565.28
Face Steam Flow | 15 | 592.02 | Total Press Time | 2 | 8360.52
Adhesive Percent | 14 | 469.34 | Adhesive Percent | 4 | 7688.29

Bootstrap Forest (k = 10,000 Bootstraps) vs. Boosted Tree (k = 10,000)

Bootstrap Forest: Variable | Number of Splits | Sums of Squares | Boosted Tree: Variable | Number of Splits | Sums of Squares
---|---|---|---|---|---
Face Steam Flow | 20 | 2379.71 | Face Plate Position | 7 | 28,640.33
Adhesive Percent | 29 | 2142.70 | Swing Plate Position | 7 | 16,677.42
Face Plate Position | 87 | 1571.86 | Face Steam Flow | 8 | 15,481.28
Press Position Time | 15 | 1470.78 | Adhesive Percent | 4 | 9565.11
Swing Plate Position | 45 | 1319.93 | Press Position Time | 6 | 9553.90
Resin Temperature | 20 | 1211.34 | Face Steam Flow | 6 | 8170.58
Bootstrap Forest (k = 1000 Bootstraps) vs. Boosted Tree (k = 1000 Bootstraps)

Bootstrap Forest: Variable | Number of Splits | Sum of Squares | Boosted Tree: Variable | Number of Splits | Sum of Squares
---|---|---|---|---|---
Wet Bin Speed | 7 | 0.117 | Wet Bin Speed | 4 | 0.050
Dryer Inlet Temperature | 2 | 0.023 | Flake Moisture Content | 4 | 0.049
Flake Moisture Content | 2 | 0.019 | Dryer Inlet Temperature | 3 | 0.035
Dryer Outlet Temperature | 1 | 0.015 | Dryer Outlet Temperature | 2 | 0.017
Mat Weight | 1 | 0.012 | Mat Weight | 2 | 0.017
Wood Weight | 1 | 0.009 | Wood Weight | 2 | 0.016

Bootstrap Forest (k = 10,000 Bootstraps) vs. Boosted Tree (k = 10,000 Bootstraps)

Bootstrap Forest: Variable | Number of Splits | Sum of Squares | Boosted Tree: Variable | Number of Splits | Sum of Squares
---|---|---|---|---|---
Dryer Outlet Temperature | 5 | 0.172 | Dryer Inlet Temperature | 8 | 0.062
Dryer Inlet Temperature | 5 | 0.093 | Wet Bin Speed | 5 | 0.045
Wood Weight | 4 | 0.076 | Wood Weight | 3 | 0.034
Flake Moisture Content | 6 | 0.035 | Press Closing Time | 6 | 0.026
Dry Bin Speed | 2 | 0.026 | Dryer Outlet Temperature | 3 | 0.020
Wet Bin Speed | 1 | 0.024 | Flake Moisture Content | 4 | 0.016
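The split-based variable rankings above come from two ensemble methods: a bagged forest of trees grown on bootstrap resamples, and a boosted sequence of trees. A minimal sketch of the same comparison using scikit-learn analogues on simulated data (variable names and coefficients here are hypothetical, chosen only so one predictor dominates):

```python
# Compare variable importance from a bootstrap-aggregated forest and a
# gradient-boosted tree ensemble fit to the same simulated data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor

rng = np.random.default_rng(0)
names = ["face_plate_position", "core_dust_speed", "press_position_time",
         "total_press_time", "face_steam_flow", "adhesive_percent"]
X = rng.normal(size=(408, len(names)))
# Simulated response dominated by the first two predictors.
y = 3.0 * X[:, 0] + 1.5 * X[:, 1] + rng.normal(scale=0.5, size=408)

forest = RandomForestRegressor(n_estimators=1000, random_state=0).fit(X, y)
boost = GradientBoostingRegressor(n_estimators=1000, random_state=0).fit(X, y)

for model, label in [(forest, "Bootstrap forest"), (boost, "Boosted tree")]:
    ranked = sorted(zip(names, model.feature_importances_),
                    key=lambda t: -t[1])
    print(label, ranked[:3])
```

As in the tables, the two ensembles need not agree on the exact ordering of weaker predictors, but the dominant variables tend to surface in both rankings.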
Tensile Strength, Face Plate Position, and Core Dust Speed: Quantiles and Summary Statistics

Quantile | Tensile Strength (kPa) | Face Plate Position (mm) | Core Dust Speed (m/min)
---|---|---|---
100.0% (maximum) | 1275.575 | 10.459 | 24.542
99.5% | 1226.483 | 10.456 | 24.195
97.5% | 1137.675 | 10.422 | 22.788
90.0% | 1067.346 | 9.720 | 19.928
75.0% (quartile) | 1020.46 | 9.140 | 16.468
50.0% (median) | 951.51 | 8.324 | 10.921
25.0% (quartile) | 882.56 | 5.492 | 8.245
10.0% | 827.4 | 4.160 | 7.094
2.5% | 767.4135 | 3.135 | 6.128
0.5% | 704.1174 | 2.271 | 3.539
0.0% (minimum) | 668.815 | 2.180 | 2.851

Summary Statistic | Tensile Strength (kPa) | Face Plate Position (mm) | Core Dust Speed (m/min)
---|---|---|---
Mean | 950.0881 | 0.321 | 41.069
Std Dev | 94.6977 | 0.068 | 16.341
Std Err Mean | 3.831056 | 0.003 | 0.772
Upper 95% Mean | 957.6118 | 0.327 | 42.587
Lower 95% Mean | 942.5644 | 0.315 | 39.552
N | 408 | 408 | 408

Face Steam Flow and Adhesive Percent: Quantiles and Summary Statistics

Quantile | Face Steam Flow (bar) | Adhesive Percent (%)
---|---|---
100.0% (maximum) | 292.618 | 14.743
99.5% | 285.132 | 14.583
97.5% | 231.635 | 14.089
90.0% | 190.579 | 13.687
75.0% (quartile) | 160.880 | 9.555
50.0% (median) | 114.143 | 9.468
25.0% (quartile) | 69.911 | 8.579
10.0% | 60.558 | 7.662
2.5% | 51.778 | 5.985
0.5% | 49.570 | 5.679
0.0% (minimum) | 48.733 | 5.671

Summary Statistic | Face Steam Flow (bar) | Adhesive Percent (%)
---|---|---
Mean | 121.76 | 9.50
Std Dev | 53.10 | 1.93
Std Err Mean | 2.36 | 0.09
Upper 95% Mean | 126.39 | 9.68
Lower 95% Mean | 117.13 | 9.31
N | 408 | 408
Runs | Pattern | Face Plate Position | Press Position Time | Adhesive Percent | Face Steam Flow | Random Run Order |
---|---|---|---|---|---|---|
1 | −−00 | 9.3 | 8.0 | 7.75 | 95.5 | 9 |
2 | −0−0 | 9.3 | 8.2 | 7.60 | 95.5 | 5 |
3 | −00− | 9.3 | 8.2 | 7.75 | 90.5 | 1 |
4 | −00+ | 9.3 | 8.2 | 7.75 | 100.5 | 18 |
5 | −0+0 | 9.3 | 8.2 | 7.90 | 95.5 | 23 |
6 | −+00 | 9.3 | 8.4 | 7.75 | 95.5 | 26 |
7 | 0−−0 | 9.5 | 8.0 | 7.60 | 95.5 | 16 |
8 | 0−0− | 9.5 | 8.0 | 7.75 | 90.5 | 17 |
9 | 0−0+ | 9.5 | 8.0 | 7.75 | 100.5 | 2 |
10 | 0−+0 | 9.5 | 8.0 | 7.90 | 95.5 | 25 |
11 | 00−− | 9.5 | 8.2 | 7.60 | 90.5 | 21 |
12 | 00−+ | 9.5 | 8.2 | 7.60 | 100.5 | 15 |
13 | 0000 | 9.5 | 8.2 | 7.75 | 95.5 | 7 |
14 | 0000 | 9.5 | 8.2 | 7.75 | 95.5 | 11 |
15 | 0000 | 9.5 | 8.2 | 7.75 | 95.5 | 3 |
16 | 00+− | 9.5 | 8.2 | 7.90 | 90.5 | 4 |
17 | 00++ | 9.5 | 8.2 | 7.90 | 100.5 | 27 |
18 | 0+−0 | 9.5 | 8.4 | 7.60 | 95.5 | 12 |
19 | 0+0− | 9.5 | 8.4 | 7.75 | 90.5 | 10 |
20 | 0+0+ | 9.5 | 8.4 | 7.75 | 100.5 | 19 |
21 | 0++0 | 9.5 | 8.4 | 7.90 | 95.5 | 8 |
22 | +−00 | 9.7 | 8.0 | 7.75 | 95.5 | 13 |
23 | +0−0 | 9.7 | 8.2 | 7.60 | 95.5 | 24 |
24 | +00− | 9.7 | 8.2 | 7.75 | 90.5 | 14 |
25 | +00+ | 9.7 | 8.2 | 7.75 | 100.5 | 6 |
26 | +0+0 | 9.7 | 8.2 | 7.90 | 95.5 | 20 |
27 | ++00 | 9.7 | 8.4 | 7.75 | 95.5 | 22 |
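The 27-run table above is a four-factor Box-Behnken design: each pair of factors is run through a 2 × 2 factorial at coded levels ±1 while the remaining factors sit at their centers, plus replicated center runs. A minimal sketch of that construction in coded units (the decoding to physical levels, e.g. Face Plate Position 9.5 ± 0.2 mm, is applied afterward):

```python
# Generate a Box-Behnken design in coded units (-1, 0, +1).
from itertools import combinations, product

def box_behnken(n_factors, n_center=3):
    runs = []
    # For each pair of factors, a full 2x2 factorial with the rest at 0.
    for i, j in combinations(range(n_factors), 2):
        for a, b in product((-1, 1), repeat=2):
            row = [0] * n_factors
            row[i], row[j] = a, b
            runs.append(row)
    # Replicated center runs (pattern 0000 in the table).
    runs.extend([[0] * n_factors] * n_center)
    return runs

design = box_behnken(4)   # C(4,2)=6 pairs x 4 runs + 3 centers
print(len(design))        # 27 runs, matching the table
```

In practice the run order is then randomized, as in the table's Random Run Order column, so that lurking time-dependent effects are not confounded with factor effects.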
Ultimate Static Load and Dryer Inlet Temperature: Quantiles and Summary Statistics

Quantile | Ultimate Static Load (kg) | Dryer Inlet Temperature (°C)
---|---|---
100.0% (maximum) | 283.04 | 145.00
99.5% | 283.04 | 145.00
97.5% | 269.20 | 133.33
90.0% | 250.84 | 127.22
75.0% (quartile) | 232.35 | 125.00
50.0% (median) | 209.11 | 120.28
25.0% (quartile) | 189.71 | 112.22
10.0% | 174.22 | 103.33
2.5% | 168.53 | 96.67
0.5% | 147.87 | 93.33
0.0% (minimum) | 147.87 | 93.33

Summary Statistic | Ultimate Static Load (kg) | Dryer Inlet Temperature (°C)
---|---|---
Mean | 211.96 | 117.92
Std Dev | 28.23 | 8.36
Std Err Mean | 2.31 | 1.38
Upper 95% Mean | 216.52 | 119.44
Lower 95% Mean | 207.41 | 116.40
N | 150 | 150
Wood Weight and Wet Bin Speed: Quantiles and Summary Statistics

Quantile | Wood Weight (kg) | Wet Bin Speed (m/min)
---|---|---
100.0% (maximum) | 19.66 | 9.75
99.5% | 19.66 | 9.75
97.5% | 19.66 | 9.45
90.0% | 5.68 | 9.14
75.0% (quartile) | 4.38 | 9.14
50.0% (median) | 3.20 | 8.53
25.0% (quartile) | 2.19 | 7.92
10.0% | 0.10 | 7.32
2.5% | 0.10 | 6.40
0.5% | 0.10 | 6.10
0.0% (minimum) | 0.10 | 6.10

Summary Statistic | Wood Weight (kg) | Wet Bin Speed (m/min)
---|---|---
Mean | 3.54 | 8.45
Std Dev | 4.19 | 0.84
Std Err Mean | 0.34 | 0.07
Upper 95% Mean | 4.21 | 8.59
Lower 95% Mean | 2.86 | 8.32
N | 150 | 150
Runs | Pattern | Dryer Inlet Temperature | Mat Weight | Wet Bin Speed | Random Run Order |
---|---|---|---|---|---|
1 | −−− | 105 | 1.4 | 8.4 | 6 |
2 | −−+ | 105 | 1.4 | 9.4 | 11 |
3 | a00 | 105 | 1.5 | 8.9 | 15 |
4 | −+− | 105 | 1.6 | 8.4 | 10 |
5 | −++ | 105 | 1.6 | 9.4 | 16 |
6 | 0a0 | 107.5 | 1.4 | 8.9 | 12 |
7 | 00a | 107.5 | 1.5 | 8.4 | 13 |
8 | 000 | 107.5 | 1.5 | 8.9 | 4 |
9 | 000 | 107.5 | 1.5 | 8.9 | 7 |
10 | 00A | 107.5 | 1.5 | 9.4 | 17 |
11 | 0A0 | 107.5 | 1.6 | 8.9 | 2 |
12 | +−− | 110 | 1.4 | 8.4 | 1 |
13 | +−+ | 110 | 1.4 | 9.4 | 8 |
14 | A00 | 110 | 1.5 | 8.9 | 3 |
15 | ++− | 110 | 1.6 | 8.4 | 14 |
16 | +++ | 110 | 1.6 | 9.4 | 5 |
17 | −−− | 105 | 1.4 | 8.4 | 9 |
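The three-factor table above appears to be a face-centered central composite design: 2³ factorial corners, six axial runs on the face centers at α = 1 (the lowercase a / uppercase A codes in the Pattern column), replicated center runs, and one replicated corner. A minimal sketch of that structure in coded units, under that interpretation of the pattern codes:

```python
# Generate a face-centered central composite design in coded units.
from itertools import product

def face_centered_ccd(n_factors=3, n_center=2):
    # Full 2^k factorial corners at coded levels -1/+1.
    corners = [list(p) for p in product((-1, 1), repeat=n_factors)]
    # Axial (star) runs at alpha = 1, i.e. on the cube faces.
    axial = []
    for i in range(n_factors):
        for level in (-1, 1):
            row = [0] * n_factors
            row[i] = level
            axial.append(row)
    centers = [[0] * n_factors] * n_center
    return corners + axial + centers

design = face_centered_ccd()
print(len(design))   # 8 corners + 6 axial + 2 centers = 16 runs
```

The table's 17th run replicates a factorial corner, which the sketch omits; replication choices like that depend on the budget for pure-error estimation.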
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Young, T.M.; Breyer, R.A.; Liles, T.; Petutschnigg, A. Improving Innovation from Science Using Kernel Tree Methods as a Precursor to Designed Experimentation. Appl. Sci. 2020, 10, 3387. https://doi.org/10.3390/app10103387