An Enhanced Discriminant Analysis Approach for Multi-Classification with Integrated Machine Learning-Based Missing Data Imputation
Abstract
1. Introduction
2. Methodology
2.1. Linear Discriminant Analysis
2.2. Regularized Discriminant Analysis
2.3. Flexible Discriminant Analysis
2.4. Mixture Discriminant Analysis
2.5. Kernel Discriminant Analysis
2.6. Shrinkage Discriminant Analysis
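The variants in Sections 2.1 through 2.6 all score a new observation against class-conditional Gaussian structure, differing mainly in how the covariance is estimated. As a minimal illustration (not the study's implementation), the NumPy sketch below fits a pooled-covariance linear discriminant and optionally shrinks the covariance toward its diagonal, in the spirit of the regularized/shrinkage variants; the toy data, the `alpha` value, and all function names are assumptions for the example:

```python
import numpy as np

rng = np.random.default_rng(99)

def fit_lda(X, y, alpha=0.0):
    """Pooled-covariance Gaussian discriminant; alpha in [0, 1] shrinks
    the covariance toward its diagonal (RDA/SDA-style regularization)."""
    classes = np.unique(y)
    means = np.array([X[y == c].mean(axis=0) for c in classes])
    # pooled within-class covariance
    S = sum(np.cov(X[y == c].T) * (np.sum(y == c) - 1)
            for c in classes) / (len(y) - len(classes))
    S = (1 - alpha) * S + alpha * np.diag(np.diag(S))
    return classes, means, np.linalg.inv(S), np.array([np.mean(y == c) for c in classes])

def predict_lda(model, X):
    classes, means, Sinv, priors = model
    # linear scores d_k(x) = x' S^-1 m_k - 0.5 m_k' S^-1 m_k + log pi_k
    scores = X @ Sinv @ means.T - 0.5 * np.sum(means @ Sinv * means, axis=1) + np.log(priors)
    return classes[np.argmax(scores, axis=1)]

# toy 4-class problem with p = 5 predictors
n, p = 400, 5
y = np.repeat(np.arange(4), n // 4)
centers = rng.normal(0, 2, size=(4, p))
X = centers[y] + rng.normal(size=(n, p))

model = fit_lda(X, y, alpha=0.1)
acc = np.mean(predict_lda(model, X) == y)
print(f"training accuracy: {acc:.3f}")
```

Setting `alpha=0` recovers plain LDA; larger values stabilize the covariance estimate when p is large relative to n.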
3. The Simulation Study and Results
3.1. Statistical Methods
3.1.1. Mean Imputation
3.1.2. Regression Imputation
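Both statistical imputations fit in a few lines. The sketch below (synthetic data; the variable names and the 10% missing rate are illustrative, not the study's data) fills MCAR gaps in one predictor first with the observed mean and then with an OLS regression on a complete covariate, and shows why regression imputation preserves cross-variable association that mean imputation attenuates:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
x1 = rng.normal(size=n)
x2 = 0.7 * x1 + rng.normal(scale=0.5, size=n)   # predictor correlated with x1
mask = rng.random(n) < 0.10                     # ~10% values missing at random
x2_obs = np.where(mask, np.nan, x2)

# mean imputation: replace every gap by the observed mean
mean_imp = np.where(mask, np.nanmean(x2_obs), x2_obs)

# regression imputation: regress x2 on x1 using complete cases, predict the gaps
beta1, beta0 = np.polyfit(x1[~mask], x2_obs[~mask], 1)
reg_imp = np.where(mask, beta0 + beta1 * x1, x2_obs)

r_mean = np.corrcoef(x1, mean_imp)[0, 1]
r_reg = np.corrcoef(x1, reg_imp)[0, 1]
print(f"corr after mean imputation: {r_mean:.3f}, after regression imputation: {r_reg:.3f}")
```

Mean imputation flattens the imputed values onto a constant, weakening the correlation with x1; regression imputation places them on the fitted line, keeping (in fact slightly overstating) that association.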
3.2. Machine Learning-Based Methods
3.2.1. K-Nearest Neighbors (KNN) Imputation
3.2.2. Random Forest Imputation
3.2.3. Bagged Trees Imputation
3.2.4. MissRanger Imputation
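The study's machine-learning imputers are R implementations (caret's KNN and bagged-tree imputation, missForest-style random forests, and the missRanger package). Rough Python analogues exist in scikit-learn: `KNNImputer` for KNN, and `IterativeImputer` with a random-forest base learner for the missForest/missRanger family. A hedged sketch on synthetic data, not the R pipeline used in the study:

```python
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import KNNImputer, IterativeImputer
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
# p = 5 correlated predictors (Toeplitz-style correlation for illustration)
Sigma = 0.5 ** np.abs(np.subtract.outer(np.arange(5), np.arange(5)))
X = rng.multivariate_normal(np.zeros(5), Sigma, size=300)

X_miss = X.copy()
X_miss[rng.random(X.shape) < 0.10] = np.nan     # ~10% MCAR missingness

# KNN imputation: average the k nearest rows by observed features
X_knn = KNNImputer(n_neighbors=5).fit_transform(X_miss)

# iterative random-forest imputation (missForest/missRanger-style round robin)
rf = RandomForestRegressor(n_estimators=30, random_state=0)
X_rf = IterativeImputer(estimator=rf, max_iter=3, random_state=0).fit_transform(X_miss)

for name, Xi in [("knn", X_knn), ("rf", X_rf)]:
    rmse = float(np.sqrt(np.mean((Xi - X) ** 2)))
    print(f"{name} imputation RMSE vs truth: {rmse:.3f}")
```

Bagged-tree imputation is the same round-robin idea with bagged regressors as the base learner; missRanger additionally speeds up the forest step and applies predictive mean matching.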
Regression imputation: accuracy % (standard deviation), 95% confidence interval, and Cohen's kappa for each DA method over 1000 replications.

| ρ | DA Method | p = 5, n = 100 | p = 5, n = 300 | p = 5, n = 500 | p = 10, n = 100 | p = 10, n = 300 | p = 10, n = 500 |
|---|---|---|---|---|---|---|---|
| 0.3 | LDA | 81.53 (0.0883) 80.99–82.08 0.7242 | 84.91 (0.0338) 84.70–85.12 0.7829 | 89.04 (0.0374) 88.81–89.28 0.8325 | 87.84 (0.0199) 87.71–87.96 0.8081 | 79.25 (0.0435) 78.98–79.53 0.6735 | 86.57 (0.0250) 86.41–86.72 0.7878 |
| | RDA | 77.93 (0.0435) 77.69–78.02 0.6651 | 86.48 (0.0257) 86.33–86.64 0.8049 | 89.55 (0.0500) 89.24–89.86 0.8408 | 82.73 (0.0075) 82.68–82.77 0.7175 | 78.54 (0.0248) 78.38–78.69 0.6563 | 85.90 (0.0251) 85.74–86.05 0.7763 |
| | FDA | 81.54 (0.0883) 80.99–82.09 0.7243 | 84.68 (0.0330) 84.48–84.89 0.7797 | 88.88 (0.0400) 88.63–89.12 0.8303 | 86.13 (0.0353) 85.91–86.35 0.7836 | 79.39 (0.0424) 79.13–79.65 0.6761 | 86.57 (0.0273) 86.40–86.74 0.7883 |
| | MDA | 75.22 (0.0440) 74.95–75.49 0.6358 | 84.68 (0.0476) 84.39–84.98 0.7775 | 84.51 (0.0446) 84.23–84.79 0.7629 | 79.28 (0.0053) 79.25–79.32 0.6776 | 74.47 (0.0243) 74.32–74.62 0.6000 | 83.66 (0.0137) 83.58–83.75 0.7439 |
| | KDA | 76.96 (0.0546) 76.62–77.33 0.6484 | 82.88 (0.0488) 82.58–83.18 0.7486 | 86.20 (0.0610) 85.82–86.58 0.7875 | 81.04 (0.0180) 80.93–81.15 0.6742 | 77.00 (0.0361) 76.78–77.23 0.6128 | 83.22 (0.0145) 83.13–83.31 0.7225 |
| | SDA | 80.62 (0.0735) 80.17–81.08 0.7115 | 85.81 (0.0232) 85.66–85.95 0.7944 | 88.38 (0.0394) 88.13–88.62 0.8219 | 84.41 (0.0195) 84.29–84.53 0.7479 | 78.55 (0.0458) 78.26–78.83 0.6557 | 85.90 (0.0190) 85.78–86.02 0.7749 |
| 0.7 | LDA | 83.79 (0.0289) 83.56–83.92 0.7799 | 87.40 (0.0061) 87.36–87.44 0.7896 | 79.23 (0.0049) 79.20–79.26 0.6850 | 84.57 (0.0750) 84.10–85.03 0.7473 | 83.22 (0.0217) 83.09–83.36 0.7182 | 84.44 (0.0298) 84.25–84.62 0.7353 |
| | RDA | 85.07 (0.0323) 84.87–85.27 0.8019 | 88.50 (0.0343) 88.29–88.71 0.8099 | 83.93 (0.0047) 83.90–83.96 0.7610 | 77.54 (0.1670) 76.51–78.58 0.5998 | 81.54 (0.0511) 81.22–81.86 0.7096 | 86.57 (0.0120) 86.50–86.65 0.7780 |
| | FDA | 85.02 (0.0347) 84.80–85.24 0.8106 | 87.40 (0.0059) 87.36–87.44 0.7896 | 79.23 (0.0048) 79.20–79.26 0.6850 | 84.57 (0.0750) 84.10–85.03 0.7481 | 83.22 (0.0218) 83.08–83.35 0.7181 | 84.44 (0.0316) 84.24–84.63 0.7356 |
| | MDA | 85.45 (0.0334) 85.25–85.66 0.8145 | 85.20 (0.0063) 85.16–85.24 0.7573 | 87.90 (0.0026) 87.89–87.92 0.8192 | 80.67 (0.0720) 80.22–81.12 0.6959 | 79.31 (0.0085) 79.26–79.37 0.6589 | 83.89 (0.0357) 83.67–84.11 0.7327 |
| | KDA | 85.05 (0.0287) 84.87–85.22 0.7808 | 89.62 (0.0229) 89.48–89.76 0.8269 | 86.60 (0.0030) 86.58–86.61 0.8004 | 87.61 (0.0539) 87.27–87.94 0.7725 | 82.11 (0.0038) 82.09–82.14 0.6927 | 83.91 (0.0340) 83.70–84.12 0.7240 |
| | SDA | 85.02 (0.0176) 84.91–85.13 0.7867 | 87.77 (0.0093) 87.71–87.83 0.7979 | 80.56 (0.0043) 80.54–80.59 0.7071 | 86.54 (0.0700) 86.10–86.97 0.7769 | 84.32 (0.0115) 84.25–84.39 0.7364 | 84.17 (0.0308) 83.98–84.36 0.7311 |
KNN imputation: accuracy % (standard deviation), 95% confidence interval, and Cohen's kappa for each DA method over 1000 replications.

| ρ | DA Method | p = 5, n = 100 | p = 5, n = 300 | p = 5, n = 500 | p = 10, n = 100 | p = 10, n = 300 | p = 10, n = 500 |
|---|---|---|---|---|---|---|---|
| 0.3 | LDA | 76.84 (0.0791) 76.35–77.34 0.6519 | 82.28 (0.0435) 82.01–82.55 0.7339 | 84.44 (0.0306) 84.25–84.63 0.7666 | 74.05 (0.0839) 73.53–74.57 0.5982 | 79.37 (0.0411) 79.11–79.62 0.6697 | 81.02 (0.0305) 80.83–81.21 0.6947 |
| | RDA | 76.21 (0.0794) 75.72–76.70 0.6382 | 83.20 (0.0457) 82.91–83.48 0.7464 | 86.08 (0.0322) 85.88–86.28 0.7910 | 74.31 (0.0834) 73.79–74.82 0.5893 | 78.97 (0.0425) 78.70–79.23 0.6589 | 80.57 (0.0318) 80.38–80.77 0.6849 |
| | FDA | 76.80 (0.0791) 76.31–77.29 0.6523 | 82.26 (0.0438) 81.99–82.53 0.7339 | 84.42 (0.0306) 84.23–84.61 0.7665 | 73.74 (0.0840) 73.22–74.26 0.5956 | 79.31 (0.0414) 79.06–79.57 0.6695 | 81.02 (0.0308) 80.83–81.21 0.6951 |
| | MDA | 72.92 (0.0819) 72.41–73.43 0.5970 | 79.79 (0.0454) 79.51–80.07 0.6977 | 82.11 (0.0336) 81.90–82.32 0.7323 | 69.06 (0.0894) 68.51–69.62 0.5291 | 75.96 (0.0449) 75.68–76.28 0.6209 | 78.55 (0.0338) 78.34–78.76 0.6592 |
| | KDA | 74.40 (0.0749) 73.93–74.86 0.5968 | 81.51 (0.0424) 81.25–81.78 0.7177 | 84.59 (0.0284) 84.41–84.77 0.7668 | 76.26 (0.0751) 75.80–76.73 0.5925 | 78.56 (0.0415) 78.30–78.82 0.6373 | 79.77 (0.0316) 79.57–79.96 0.6616 |
| | SDA | 76.25 (0.0796) 75.75–76.74 0.6410 | 82.06 (0.0429) 81.79–82.33 0.7296 | 84.27 (0.0302) 84.08–84.46 0.7634 | 74.72 (0.0835) 74.20–75.24 0.6019 | 79.45 (0.0406) 79.19–79.70 0.6672 | 80.96 (0.0308) 80.76–81.15 0.6909 |
| 0.7 | LDA | 80.28 (0.0751) 79.82–80.75 0.6972 | 84.31 (0.0388) 84.07–84.55 0.7525 | 85.52 (0.0321) 85.32–85.72 0.7711 | 80.28 (0.0771) 79.80–80.76 0.6824 | 85.28 (0.0378) 85.04–85.51 0.7468 | 86.26 (0.0282) 86.09–86.44 0.7608 |
| | RDA | 82.74 (0.0776) 82.26–83.22 0.7365 | 87.40 (0.0376) 87.17–87.63 0.8051 | 89.34 (0.0330) 89.13–89.54 0.8351 | 81.50 (0.1031) 80.86–82.14 0.6916 | 86.45 (0.0373) 86.22–86.68 0.7729 | 87.46 (0.0283) 87.28–87.63 0.7888 |
| | FDA | 80.21 (0.0756) 79.74–80.68 0.6977 | 84.34 (0.0389) 84.10–84.58 0.7535 | 85.58 (0.0320) 85.38–85.78 0.7723 | 79.95 (0.0775) 79.47–80.43 0.6796 | 85.24 (0.0379) 85.00–85.47 0.7467 | 86.27 (0.0284) 86.09–86.44 0.7611 |
| | MDA | 80.25 (0.0760) 79.78–80.73 0.7020 | 85.95 (0.0356) 85.73–86.17 0.7830 | 87.62 (0.0295) 87.44–87.80 0.8088 | 77.96 (0.0806) 77.46–78.46 0.6545 | 83.99 (0.0391) 83.74–84.23 0.7309 | 85.63 (0.0301) 85.44–85.81 0.7562 |
| | KDA | 82.38 (0.0716) 81.94–82.83 0.7187 | 88.34 (0.0341) 88.13–88.55 0.8175 | 90.64 (0.0250) 90.48–90.79 0.8544 | 85.50 (0.0613) 85.12–85.88 0.7420 | 86.49 (0.0342) 86.27–86.70 0.7620 | 87.21 (0.0270) 87.04–87.38 0.7769 |
| | SDA | 81.48 (0.0737) 81.03–81.94 0.7184 | 84.74 (0.0380) 84.51–84.98 0.7607 | 85.91 (0.0312) 85.71–86.10 0.7780 | 82.61 (0.0702) 82.17–83.05 0.7207 | 85.72 (0.0374) 85.49–85.95 0.7540 | 86.45 (0.0283) 86.28–86.63 0.7644 |
Random forest imputation: accuracy % (standard deviation), 95% confidence interval, and Cohen's kappa for each DA method over 1000 replications.

| ρ | DA Method | p = 5, n = 100 | p = 5, n = 300 | p = 5, n = 500 | p = 10, n = 100 | p = 10, n = 300 | p = 10, n = 500 |
|---|---|---|---|---|---|---|---|
| 0.3 | LDA | 77.06 (0.0810) 76.55–77.56 0.6547 | 82.31 (0.0426) 82.04–82.57 0.7354 | 84.81 (0.0323) 84.61–85.01 0.7729 | 74.33 (0.0802) 73.83–74.83 0.6009 | 79.59 (0.0412) 79.34–79.85 0.6736 | 80.91 (0.0312) 80.71–81.10 0.6938 |
| | RDA | 76.47 (0.0787) 75.98–76.96 0.6398 | 83.57 (0.0452) 83.29–83.85 0.7525 | 86.72 (0.0348) 86.51–86.94 0.8010 | 74.75 (0.0813) 74.25–75.26 0.5930 | 79.10 (0.0411) 78.84–79.35 0.6607 | 80.52 (0.0326) 80.32–80.73 0.6843 |
| | FDA | 76.81 (0.0820) 76.30–77.32 0.6522 | 82.31 (0.0429) 82.04–82.57 0.7356 | 84.79 (0.0324) 84.59–84.99 0.7728 | 74.10 (0.0812) 73.50–74.51 0.5983 | 79.57 (0.0411) 79.32–79.83 0.6739 | 80.89 (0.0314) 80.70–81.09 0.6939 |
| | MDA | 72.56 (0.0859) 72.03–73.09 0.5908 | 79.90 (0.0450) 79.63–80.18 0.7001 | 82.47 (0.0347) 82.26–82.69 0.7384 | 68.91 (0.0893) 68.36–69.46 0.5258 | 75.99 (0.0454) 75.71–76.27 0.6212 | 78.27 (0.0328) 78.07–78.48 0.6544 |
| | KDA | 74.34 (0.0779) 73.86–74.82 0.5934 | 81.40 (0.0411) 81.14–81.65 0.7172 | 84.90 (0.0297) 84.72–85.09 0.7722 | 76.45 (0.0757) 75.98–76.92 0.5920 | 78.54 (0.0405) 78.29–78.79 0.6366 | 79.56 (0.0314) 79.37–79.76 0.6594 |
| | SDA | 76.44 (0.0795) 75.95–76.94 0.6434 | 82.18 (0.0415) 81.92–82.43 0.7324 | 84.65 (0.0323) 84.45–84.85 0.7700 | 75.09 (0.0787) 74.60–75.58 0.6066 | 79.67 (0.0407) 79.42–79.93 0.6716 | 80.85 (0.0311) 80.65–81.04 0.6906 |
| 0.7 | LDA | 79.94 (0.0737) 79.48–80.39 0.6905 | 83.96 (0.0400) 83.71–84.21 0.7471 | 85.51 (0.0307) 85.32–85.70 0.7708 | 79.51 (0.0834) 78.99–80.02 0.6725 | 84.81 (0.0379) 84.57–85.05 0.7383 | 85.96 (0.0266) 85.80–86.13 0.7556 |
| | RDA | 82.42 (0.0725) 81.97–82.87 0.7330 | 86.85 (0.0388) 86.61–87.10 0.7968 | 89.30 (0.0325) 89.09–89.50 0.8344 | 79.69 (0.1201) 78.94–80.43 0.6624 | 86.01 (0.0362) 85.78–86.23 0.7650 | 87.06 (0.0270) 86.89–87.23 0.7825 |
| | FDA | 79.94 (0.0744) 79.47–80.40 0.6928 | 84.01 (0.0398) 83.76–84.26 0.7484 | 85.56 (0.0306) 85.37–85.75 0.7719 | 79.15 (0.0846) 78.62–79.67 0.6699 | 84.74 (0.0385) 84.50–84.98 0.7377 | 85.95 (0.0267) 85.78–86.11 0.7556 |
| | MDA | 79.78 (0.0738) 79.32–80.23 0.6953 | 85.31 (0.0398) 85.06–85.56 0.7732 | 87.54 (0.0299) 87.39–87.73 0.8076 | 77.39 (0.0822) 76.88–77.90 0.6486 | 83.22 (0.0404) 82.97–83.47 0.7181 | 85.11 (0.0287) 84.93–85.29 0.7478 |
| | KDA | 81.81 (0.0692) 81.38–82.24 0.7105 | 87.81 (0.0361) 87.59–88.04 0.8094 | 90.51 (0.0258) 90.35–90.67 0.8524 | 85.15 (0.0679) 84.73–85.57 0.7366 | 86.12 (0.0335) 85.91–86.33 0.7553 | 86.75 (0.0253) 86.59–86.91 0.7687 |
| | SDA | 81.28 (0.0729) 80.83–81.74 0.7148 | 84.42 (0.0392) 84.18–84.67 0.7555 | 85.88 (0.0301) 85.69–86.06 0.7773 | 81.68 (0.0784) 81.19–82.17 0.7067 | 85.30 (0.0374) 85.07–85.54 0.7471 | 86.11 (0.0263) 85.94–86.27 0.7583 |
Bagged trees imputation: accuracy % (standard deviation), 95% confidence interval, and Cohen's kappa for each DA method over 1000 replications.

| ρ | DA Method | p = 5, n = 100 | p = 5, n = 300 | p = 5, n = 500 | p = 10, n = 100 | p = 10, n = 300 | p = 10, n = 500 |
|---|---|---|---|---|---|---|---|
| 0.3 | LDA | 73.92 (0.0800) 73.42–74.42 0.6037 | 77.83 (0.0435) 77.56–78.10 0.6633 | 79.37 (0.0335) 79.16–79.58 0.6870 | 72.10 (0.0836) 71.49–72.53 0.5652 | 77.19 (0.0445) 76.92–77.47 0.6323 | 78.77 (0.0315) 78.57–78.96 0.6539 |
| | RDA | 73.28 (0.0809) 72.78–73.87 0.5872 | 78.00 (0.0450) 77.72–78.28 0.6655 | 79.88 (0.0343) 79.67–80.09 0.6958 | 72.29 (0.0883) 71.75–72.84 0.5577 | 76.85 (0.0450) 76.57–77.13 0.6218 | 78.19 (0.0321) 77.99–78.39 0.6412 |
| | FDA | 73.76 (0.0804) 73.26–74.26 0.6030 | 77.80 (0.0433) 77.53–78.07 0.6634 | 79.40 (0.0335) 79.19–79.60 0.6877 | 71.68 (0.0846) 71.16–72.21 0.5625 | 77.15 (0.0446) 76.88–77.43 0.6324 | 78.78 (0.0316) 78.58–78.97 0.6545 |
| | MDA | 69.16 (0.0847) 68.64–69.69 0.5394 | 74.73 (0.0463) 74.45–75.02 0.6188 | 76.85 (0.0338) 76.64–77.06 0.6503 | 65.47 (0.0901) 64.91–66.02 0.4772 | 73.56 (0.0460) 73.27–73.84 0.5816 | 75.93 (0.0329) 75.73–76.14 0.6146 |
| | KDA | 71.45 (0.0752) 70.98–71.91 0.5425 | 75.78 (0.0423) 75.52–76.04 0.6266 | 78.16 (0.0330) 77.96–78.37 0.6668 | 74.97 (0.0769) 74.50–75.45 0.5671 | 76.75 (0.0429) 76.48–77.01 0.6045 | 77.68 (0.0319) 77.48–77.88 0.6228 |
| | SDA | 73.58 (0.0794) 73.09–74.07 0.5963 | 77.61 (0.0435) 77.34–77.88 0.6587 | 79.16 (0.0337) 78.95–79.37 0.6829 | 73.00 (0.0831) 72.49–73.52 0.5742 | 77.31 (0.0438) 77.04–77.59 0.6312 | 78.75 (0.0315) 78.56–78.95 0.6515 |
| 0.7 | LDA | 78.44 (0.0771) 77.96–78.92 0.6646 | 82.19 (0.0411) 81.93–82.44 0.7168 | 83.32 (0.0314) 83.12–83.51 0.7346 | 79.53 (0.0763) 79.05–80.00 0.6705 | 84.42 (0.0381) 84.19–84.66 0.7307 | 85.59 (0.0291) 85.40–85.76 0.7477 |
| | RDA | 80.31 (0.0724) 79.82–80.80 0.6990 | 84.43 (0.0423) 84.16–84.69 0.7592 | 85.81 (0.0305) 85.62–86.00 0.7813 | 80.05 (0.1080) 79.37–80.72 0.6699 | 85.34 (0.0392) 85.10–85.59 0.7350 | 86.43 (0.0296) 86.25–86.62 0.7704 |
| | FDA | 78.23 (0.0769) 77.76–78.71 0.6638 | 82.24 (0.0412) 81.98–82.49 0.7183 | 83.36 (0.0313) 83.17–83.56 0.7356 | 79.15 (0.0770) 78.68–79.63 0.6669 | 84.34 (0.0383) 84.11–84.58 0.7299 | 85.57 (0.0291) 85.39–85.75 0.7477 |
| | MDA | 77.92 (0.0774) 77.44–78.40 0.6647 | 82.99 (0.0400) 82.74–83.24 0.7358 | 84.75 (0.0302) 84.56–84.93 0.7632 | 76.59 (0.0838) 76.07–77.12 0.6367 | 82.90 (0.0412) 82.65–83.16 0.7114 | 84.45 (0.0303) 84.26–84.64 0.7352 |
| | KDA | 80.19 (0.0721) 79.74–80.64 0.6822 | 84.94 (0.0391) 84.69–85.18 0.7628 | 87.04 (0.0282) 86.86–87.21 0.7977 | 84.75 (0.0652) 84.35–85.16 0.7280 | 85.68 (0.0364) 85.46–85.91 0.7464 | 86.26 (0.0281) 86.09–86.44 0.7586 |
| | SDA | 79.57 (0.0725) 79.12–80.02 0.6868 | 82.56 (0.0410) 82.31–82.82 0.7244 | 83.63 (0.0311) 83.44–83.82 0.7407 | 81.59 (0.0733) 81.13–82.05 0.7041 | 84.82 (0.0377) 84.58–85.05 0.7378 | 85.76 (0.0285) 85.58–85.93 0.7510 |
MissRanger imputation: accuracy % (standard deviation), 95% confidence interval, and Cohen's kappa for each DA method over 1000 replications.

| ρ | DA Method | p = 5, n = 100 | p = 5, n = 300 | p = 5, n = 500 | p = 10, n = 100 | p = 10, n = 300 | p = 10, n = 500 |
|---|---|---|---|---|---|---|---|
| 0.3 | LDA | 77.38 (0.0776) 76.90–77.87 0.6593 | 82.65 (0.0419) 82.39–82.91 0.7403 | 85.02 (0.0331) 84.82–85.23 0.7759 | 86.20 (0.0297) 86.01–86.38 0.7637 | 86.31 (0.0295) 86.13–86.49 0.7653 | 86.35 (0.0285) 86.17–86.52 0.7661 |
| | RDA | 76.82 (0.0785) 76.34–77.31 0.6460 | 83.80 (0.0436) 83.53–84.04 0.7550 | 87.03 (0.0336) 86.82–87.24 0.8053 | 90.06 (0.0238) 89.91–90.21 0.8376 | 90.09 (0.0242) 89.94–90.24 0.8380 | 90.03 (0.0230) 89.89–90.17 0.8371 |
| | FDA | 77.32 (0.0780) 76.83–77.80 0.6596 | 82.64 (0.0420) 82.37–82.90 0.7404 | 85.03 (0.0330) 84.83–85.24 0.7762 | 86.27 (0.0296) 86.09–86.45 0.7653 | 86.42 (0.0295) 86.23–86.60 0.7675 | 86.43 (0.0283) 86.25–86.60 0.7678 |
| | MDA | 72.83 (0.0830) 72.32–73.35 0.5966 | 79.94 (0.0446) 79.67–80.22 0.7007 | 82.75 (0.0342) 82.54–82.96 0.7423 | 89.28 (0.0235) 89.13–89.42 0.8235 | 89.30 (0.0240) 89.15–89.45 0.8239 | 89.30 (0.0242) 89.15–89.45 0.8240 |
| | KDA | 74.49 (0.0760) 74.02–74.97 0.5960 | 81.58 (0.0415) 81.32–81.83 0.7200 | 85.17 (0.0300) 84.98–85.35 0.7758 | 90.40 (0.0226) 90.26–90.54 0.8408 | 90.27 (0.0225) 90.13–90.41 0.8386 | 90.34 (0.0222) 90.21–90.49 0.8400 |
| | SDA | 76.71 (0.0783) 76.22–77.19 0.6470 | 82.49 (0.0413) 82.24–82.75 0.7370 | 84.93 (0.0326) 84.73–85.14 0.7740 | 86.81 (0.0295) 86.62–86.99 0.7752 | 86.91 (0.0293) 86.73–87.09 0.7767 | 86.98 (0.0283) 86.81–87.16 0.7779 |
| 0.7 | LDA | 80.53 (0.0741) 80.07–80.99 0.6697 | 83.98 (0.0405) 83.72–84.23 0.7071 | 85.56 (0.0320) 85.36–85.75 0.7716 | 86.28 (0.0306) 86.09–86.47 0.7650 | 86.13 (0.0303) 85.94–86.32 0.7618 | 86.19 (0.0300) 86.01–86.38 0.7632 |
| | RDA | 82.69 (0.0776) 82.21–83.17 0.7345 | 86.80 (0.0400) 86.55–87.05 0.7961 | 89.19 (0.0327) 88.99–89.40 0.8331 | 90.09 (0.0244) 89.94–90.24 0.8379 | 90.04 (0.0243) 89.89–90.19 0.8371 | 90.10 (0.0244) 89.94–90.25 0.8381 |
| | FDA | 80.33 (0.0743) 79.87–80.79 0.6982 | 84.03 (0.0406) 83.78–84.28 0.7485 | 85.60 (0.0319) 85.40–85.80 0.7726 | 86.39 (0.0304) 86.20–86.58 0.7674 | 86.23 (0.0304) 86.04–86.42 0.7640 | 86.30 (0.0296) 86.12–86.49 0.7654 |
| | MDA | 80.54 (0.0761) 80.07–81.01 0.7062 | 85.25 (0.0385) 85.01–85.49 0.7724 | 87.47 (0.0290) 87.29–87.65 0.8068 | 89.27 (0.0245) 89.12–89.42 0.8234 | 89.16 (0.0250) 89.01–89.32 0.8214 | 89.22 (0.0251) 89.07–89.38 0.8225 |
| | KDA | 82.56 (0.0677) 82.14–82.98 0.7221 | 87.72 (0.0354) 87.50–87.94 0.8079 | 90.51 (0.0273) 90.34–90.68 0.8527 | 90.33 (0.0235) 90.12–90.48 0.8398 | 90.18 (0.0234) 90.03–90.32 0.8370 | 90.26 (0.0234) 90.11–90.40 0.8383 |
| | SDA | 81.52 (0.0718) 81.08–81.97 0.7183 | 84.41 (0.0399) 84.16–84.66 0.7553 | 85.87 (0.0316) 85.67–86.07 0.7774 | 86.89 (0.0303) 86.70–87.08 0.7766 | 86.79 (0.0302) 86.60–86.98 0.7742 | 86.83 (0.0296) 86.64–87.01 0.7751 |
Best accuracy (%) and the DA method achieving it, by imputation method, for p = 5.

| Correlation Level (ρ) | Sample Size (n) | Mean | Reg | KNN | RF | BT | MissRanger |
|---|---|---|---|---|---|---|---|
| 0.3 | 100 | 73.90 (LDA) | 81.54 (FDA) | 76.84 (LDA) | 77.06 (LDA) | 73.92 (LDA) | 77.38 (LDA) |
| | 300 | 77.90 (FDA) | 86.48 (RDA) | 82.28 (LDA) | 83.57 (RDA) | 77.83 (LDA) | 83.80 (RDA) |
| | 500 | 80.34 (RDA) | 89.55 (RDA) | 86.08 (RDA) | 86.72 (RDA) | 79.88 (RDA) | 87.03 (RDA) |
| 0.7 | 100 | 80.24 (RDA) | 85.45 (MDA) | 82.38 (KDA) | 82.42 (RDA) | 80.31 (RDA) | 82.69 (RDA) |
| | 300 | 83.92 (KDA) | 89.62 (KDA) | 88.34 (KDA) | 87.81 (KDA) | 84.94 (KDA) | 87.72 (KDA) |
| | 500 | 85.86 (KDA) | 87.90 (MDA) | 90.64 (KDA) | 90.51 (KDA) | 87.04 (KDA) | 90.51 (KDA) |
Best accuracy (%) and the DA method achieving it, by imputation method, for p = 10.

| Correlation Level (ρ) | Sample Size (n) | Mean | Reg | KNN | RF | BT | MissRanger |
|---|---|---|---|---|---|---|---|
| 0.3 | 100 | 74.78 (KDA) | 87.84 (LDA) | 76.26 (KDA) | 76.45 (KDA) | 74.97 (KDA) | 90.40 (KDA) |
| | 300 | 77.52 (SDA) | 79.39 (FDA) | 79.45 (SDA) | 79.67 (SDA) | 77.31 (SDA) | 90.27 (KDA) |
| | 500 | 78.69 (LDA) | 86.57 (FDA) | 81.02 (LDA) | 80.85 (SDA) | 78.78 (FDA) | 90.34 (KDA) |
| 0.7 | 100 | 84.47 (KDA) | 87.61 (KDA) | 85.50 (KDA) | 85.15 (KDA) | 84.75 (KDA) | 90.33 (KDA) |
| | 300 | 85.56 (KDA) | 84.32 (SDA) | 86.49 (KDA) | 86.12 (KDA) | 85.68 (KDA) | 90.18 (KDA) |
| | 500 | 85.83 (RDA) | 86.57 (RDA) | 87.46 (RDA) | 87.06 (RDA) | 86.26 (KDA) | 90.26 (KDA) |
| Imputation Method | Average Runtime (s) | Peak Memory (MB) |
|---|---|---|
| Mean | 0.05 | 12 |
| Regression | 0.08 | 18 |
| KNN | 2.47 | 65 |
| Random Forest | 8.12 | 120 |
| Bagged Trees | 10.35 | 180 |
| MissRanger | 15.28 | 250 |
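Figures like those in the table above depend on the hardware and implementation used in the study. Comparable, indicative numbers can be collected in Python with `time.perf_counter` and `tracemalloc`, as in this sketch with a hypothetical mean-imputation routine (tracemalloc only sees allocations routed through Python's allocator, so peaks are approximate):

```python
import time
import tracemalloc
import numpy as np

def profile(fn, *args):
    """Return (result, runtime in seconds, peak traced memory in MB) for one call."""
    tracemalloc.start()
    t0 = time.perf_counter()
    out = fn(*args)
    dt = time.perf_counter() - t0
    _, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return out, dt, peak / 2**20

def mean_impute(M):
    # column-wise mean imputation, the cheapest baseline in the table
    return np.where(np.isnan(M), np.nanmean(M, axis=0), M)

X = np.random.default_rng(0).normal(size=(500, 10))
X[np.random.default_rng(1).random(X.shape) < 0.10] = np.nan

filled, dt, peak_mb = profile(mean_impute, X)
print(f"mean imputation: {dt:.4f} s, peak {peak_mb:.2f} MB")
```

Averaging such measurements over many replications, as the table presumably does, smooths out timer and allocator noise.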
4. Results on the Actual Data
5. Discussion
6. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
| Abbreviation | Meaning |
|---|---|
| DA | Discriminant Analysis |
| LDA | Linear Discriminant Analysis |
| RDA | Regularized Discriminant Analysis |
| FDA | Flexible Discriminant Analysis |
| MDA | Mixture Discriminant Analysis |
| KDA | Kernel Discriminant Analysis |
| SDA | Shrinkage Discriminant Analysis |
| KNN | k-Nearest Neighbors |
| RF | Random Forest |
| BT | Bagged Trees |
Appendix A
Appendix A.1
| Parameter | Symbol | Values/Description |
|---|---|---|
| Number of predictor variables | p | 5, 10 |
| Sample sizes | n | 100, 300, 500 |
| Correlation levels | ρ | 0.3, 0.7 |
| Covariance structure | Σ | Toeplitz: Σᵢⱼ = ρ^\|i−j\| |
| Missingness mechanism | – | 10% missing |
| Number of replications | – | 1000 |
| Class distribution | – | Balanced (4 classes) |
| Random seed setup | – | 99 (fixed master seed) |
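The design table above translates directly into code. Below is a sketch of one simulated replicate for p = 5 and n = 100; the class-mean shifts and the MCAR mechanism are assumptions added for the example, since the table does not specify them:

```python
import numpy as np

rng = np.random.default_rng(99)                 # fixed master seed, as in the design table
p, n_per_class, rho, miss = 5, 25, 0.3, 0.10    # n = 100 total over 4 balanced classes

# Toeplitz correlation structure: Sigma_ij = rho^|i - j|
Sigma = rho ** np.abs(np.subtract.outer(np.arange(p), np.arange(p)))

y = np.repeat(np.arange(4), n_per_class)
shifts = np.arange(4)[:, None] * np.ones(p)     # hypothetical class-mean separation
X = np.vstack([rng.multivariate_normal(shifts[c], Sigma, size=n_per_class)
               for c in range(4)])

X_miss = X.copy()
X_miss[rng.random(X.shape) < miss] = np.nan     # ~10% of cells set missing

print(X.shape, y.shape, f"missing fraction: {np.isnan(X_miss).mean():.2f}")
```

A full replication loop would repeat this 1000 times per (p, n, ρ) cell, impute `X_miss` with each method, and score each DA classifier on the result.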
Confusion-matrix layout for the four-class problem (rows: predicted class ŷ; columns: actual class y; n_{ij} denotes the count predicted as class i whose actual class is j):

| Predicted (ŷ) \ Actual (y) | Class 1 | Class 2 | Class 3 | Class 4 |
|---|---|---|---|---|
| Class 1 | n_{11} | n_{12} | n_{13} | n_{14} |
| Class 2 | n_{21} | n_{22} | n_{23} | n_{24} |
| Class 3 | n_{31} | n_{32} | n_{33} | n_{34} |
| Class 4 | n_{41} | n_{42} | n_{43} | n_{44} |
Mean imputation: accuracy % (standard deviation), 95% confidence interval, and Cohen's kappa for each DA method over 1000 replications.

| ρ | DA Method | p = 5, n = 100 | p = 5, n = 300 | p = 5, n = 500 | p = 10, n = 100 | p = 10, n = 300 | p = 10, n = 500 |
|---|---|---|---|---|---|---|---|
| 0.3 | LDA | 73.90 (0.0804) 73.40–74.40 0.6057 | 77.86 (0.0422) 77.60–78.12 0.6646 | 79.48 (0.0332) 79.28–79.69 0.6879 | 72.15 (0.0836) 71.63–72.66 0.5679 | 77.47 (0.0434) 77.20–77.74 0.6356 | 78.69 (0.0333) 78.48–78.90 0.6529 |
| | RDA | 73.26 (0.0819) 72.75–73.77 0.5899 | 78.05 (0.0446) 77.77–78.32 0.6674 | 80.34 (0.0354) 80.12–80.56 0.7030 | 72.36 (0.0841) 71.84–72.88 0.5580 | 76.99 (0.0445) 76.71–77.27 0.6217 | 78.21 (0.0334) 78.01–78.42 0.6412 |
| | FDA | 73.73 (0.0801) 73.24–74.23 0.6048 | 77.90 (0.0423) 77.64–78.16 0.6656 | 79.50 (0.0333) 79.30–79.71 0.6885 | 71.88 (0.0843) 71.36–72.40 0.5659 | 77.40 (0.0438) 77.13–77.67 0.6352 | 78.71 (0.0332) 78.50–78.91 0.6536 |
| | MDA | 68.85 (0.0846) 68.32–69.37 0.5362 | 74.98 (0.0450) 74.70–75.26 0.6230 | 77.16 (0.0346) 76.95–77.38 0.6539 | 65.50 (0.0913) 64.93–66.07 0.4778 | 73.77 (0.0882) 73.47–74.07 0.5834 | 76.05 (0.0350) 75.83–76.26 0.6165 |
| | KDA | 71.06 (0.0761) 70.59–71.53 0.5381 | 75.72 (0.0431) 75.45–75.99 0.6264 | 78.26 (0.0327) 78.06–78.47 0.6678 | 74.78 (0.0770) 74.30–75.26 0.5661 | 76.98 (0.0434) 76.71–77.25 0.6068 | 77.63 (0.0324) 77.43–77.83 0.6214 |
| | SDA | 72.90 (0.0831) 72.38–73.41 0.5876 | 77.45 (0.0428) 77.18–77.72 0.6563 | 79.18 (0.0321) 78.98–79.38 0.6819 | 72.68 (0.0813) 72.18–73.18 0.5687 | 77.52 (0.0437) 77.25–77.79 0.6318 | 78.62 (0.0326) 78.42–78.82 0.6483 |
| 0.7 | LDA | 78.47 (0.0801) 77.97–78.96 0.6642 | 81.72 (0.0410) 81.47–81.98 0.7071 | 82.40 (0.0313) 82.20–82.59 0.7175 | 78.69 (0.0760) 78.21–79.16 0.6540 | 84.42 (0.0392) 84.17–84.66 0.7282 | 85.20 (0.0287) 85.02–85.37 0.7400 |
| | RDA | 80.24 (0.0793) 79.75–80.73 0.6977 | 83.74 (0.0412) 83.48–83.99 0.7474 | 84.96 (0.0303) 84.77–85.15 0.7669 | 79.63 (0.1085) 78.96–80.30 0.6619 | 85.01 (0.0384) 84.77–85.25 0.7452 | 85.83 (0.0284) 85.65–86.00 0.7590 |
| | FDA | 78.35 (0.0807) 77.85–78.85 0.6676 | 81.81 (0.0411) 81.56–82.07 0.7093 | 82.47 (0.0313) 82.28–82.66 0.7191 | 78.40 (0.0766) 77.93–78.88 0.6521 | 84.35 (0.0392) 84.10–84.59 0.7275 | 85.19 (0.0286) 85.02–85.37 0.7403 |
| | MDA | 77.09 (0.0808) 76.59–77.59 0.6536 | 82.43 (0.0412) 82.17–82.69 0.7261 | 84.12 (0.0310) 83.93–84.31 0.7522 | 75.69 (0.0817) 75.18–76.20 0.6176 | 82.77 (0.0409) 82.52–83.03 0.7070 | 84.25 (0.0296) 84.07–84.44 0.7300 |
| | KDA | 79.86 (0.0748) 79.40–80.33 0.6748 | 83.92 (0.0400) 83.67–84.17 0.7463 | 85.86 (0.0274) 85.69–86.03 0.7790 | 84.47 (0.0630) 84.08–84.86 0.7226 | 85.56 (0.0365) 85.34–85.79 0.7432 | 85.74 (0.0272) 85.58–85.91 0.7487 |
| | SDA | 79.58 (0.0787) 79.09–80.06 0.6868 | 82.20 (0.0409) 81.94–82.45 0.7167 | 82.74 (0.0310) 82.55–82.93 0.7243 | 81.11 (0.0730) 80.66–81.57 0.6933 | 84.69 (0.0388) 84.45–84.94 0.7338 | 85.36 (0.0280) 85.19–85.54 0.7433 |
| Data | Min | Q1 | Median | Mean | SD | Q3 | Max | Missing |
|---|---|---|---|---|---|---|---|---|
| Age [years] | 26.30 | 42.27 | 49.83 | 50.05 | 10.53 | 56.75 | 78.49 | - |
| Bilirubin [mg/dL] | 0.30 | 0.80 | 1.35 | 3.256 | 4.60 | 3.425 | 12.00 | - |
| Cholesterol [mg/dL] | 120 | 249.5 | 309.5 | 369.5 | 234.78 | 400 | 1775.0 | 28 (8.97%) |
| Albumin [g/dL] | 1.96 | 3.31 | 3.55 | 3.52 | 0.40 | 3.80 | 4.64 | - |
| Copper [µg/day] | 4.00 | 41.25 | 73.00 | 97.65 | 88.26 | 123.00 | 588.00 | 2 (0.61%) |
| Alk_Phos [U/liter] | 289.0 | 871.5 | 1259.0 | 1982.7 | 2115.47 | 1980.0 | 13862.4 | - |
| SGOT [mg/dL] | 26.35 | 80.60 | 114.70 | 122.56 | 56.71 | 151.90 | 457.25 | - |
| Triglycerides [mg/dL] | 33.00 | 84.25 | 108.00 | 124.70 | 65.28 | 151.00 | 598.00 | 30 (9.61%) |
| Platelets [1000 cells/mL] | 62.0 | 199.8 | 257.0 | 261.9 | 93.12 | 322.5 | 563.0 | 4 (1.28%) |
| Prothrombin [second] | 9.00 | 10.0 | 10.60 | 10.73 | 1.00 | 11.10 | 17.10 | - |
| Methods | LDA | RDA | FDA | MDA | KDA | SDA |
|---|---|---|---|---|---|---|
| Mean | 51.08 40.44–61.65 0.2516 | 51.08 40.44–61.65 0.2516 | 51.08 40.44–61.65 0.2516 | 53.26 42.56–63.74 0.2955 | 51.08 40.44–61.65 0.2434 | 44.56 34.19–55.29 0.169 |
| Reg | 48.91 38.34–59.55 0.2029 | 48.91 38.34–59.55 0.2029 | 48.91 38.34–59.55 0.2029 | 43.47 33.16–54.21 0.1623 | 52.17 41.50–62.70 0.2695 | 46.73 36.25–57.43 0.1920 |
| KNN | 48.91 38.34–59.55 0.1760 | 40.21 38.34–59.55 0.1760 | 47.82 37.29–57.49 0.1615 | 51.08 40.44–61.65 0.2157 | 47.82 37.29–58.49 0.1697 | 45.56 35.22–56.36 0.4565 |
| RF | 57.60 46.86–67.85 0.5760 | 57.60 46.86–67.85 0.3517 | 57.60 46.86–67.85 0.3517 | 54.34 43.63–64.77 0.2987 | 53.26 42.56–63.74 0.2762 | 46.73 36.25–57.43 0.1972 |
| BT | 52.17 41.50–62.70 0.2737 | 54.34 41.50–62.70 0.2737 | 52.17 41.50–62.70 0.2737 | 52.17 41.50–62.70 0.2877 | 59.78 49.04–69.87 0.3781 | 48.91 38.34–59.55 0.2400 |
| MissRanger | 58.69 47.94–68.86 0.3388 | 52.17 47.94–68.86 0.3388 | 58.69 47.94–68.86 0.3393 | 60.86 50.13–70.88 0.4019 | 63.04 52.34–72.87 0.4190 | 54.34 43.63–64.77 0.5434 |
| DA Method | Predicted Class | Actual Class 1 | Actual Class 2 | Actual Class 3 | Actual Class 4 | Accuracy (%) |
|---|---|---|---|---|---|---|
| LDA | Class 1 | 0 | 0 | 0 | 0 | 58.69 |
| | Class 2 | 1 | 1 | 0 | 0 | |
| | Class 3 | 4 | 16 | 33 | 12 | |
| | Class 4 | 1 | 1 | 3 | 20 | |
| MDA | Class 1 | 1 | 0 | 0 | 0 | 60.86 |
| | Class 2 | 3 | 6 | 3 | 2 | |
| | Class 3 | 1 | 11 | 29 | 10 | |
| | Class 4 | 1 | 1 | 4 | 20 | |
| KDA | Class 1 | 0 | 0 | 0 | 0 | 63.04 |
| | Class 2 | 1 | 5 | 0 | 0 | |
| | Class 3 | 5 | 13 | 29 | 8 | |
| | Class 4 | 0 | 0 | 7 | 24 | |
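The reported percentage accuracies are the diagonal counts over the 92 test cases, and the same margins yield Cohen's kappa. For the KDA matrix, the computation below reproduces both the 63.04% accuracy and the 0.4190 kappa reported for KDA under MissRanger imputation:

```python
import numpy as np

# KDA confusion matrix from the table (rows: predicted class, columns: actual class)
kda = np.array([[0, 0, 0, 0],
                [1, 5, 0, 0],
                [5, 13, 29, 8],
                [0, 0, 7, 24]])

n = kda.sum()
p_o = np.trace(kda) / n                    # observed agreement = accuracy
p_e = (kda.sum(1) @ kda.sum(0)) / n**2     # chance agreement from the margins
kappa = (p_o - p_e) / (1 - p_e)
print(round(100 * p_o, 2), round(kappa, 3))  # → 63.04 0.419
```

Note that no method ever predicts Class 1 correctly (the LDA and KDA matrices have an all-zero first row for it), so overall accuracy hides a complete failure on the smallest class.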
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Araveeporn, A.; Kangtunyakarn, A. An Enhanced Discriminant Analysis Approach for Multi-Classification with Integrated Machine Learning-Based Missing Data Imputation. Mathematics 2025, 13, 3392. https://doi.org/10.3390/math13213392

