A Novel MBPSO–BDGWO Ensemble Feature Selection Method for High-Dimensional Classification Data
Abstract
1. Introduction
2. Materials and Methods
2.1. Binary PSO
Modified BPSO
2.2. Binary GWO
Binary Dynamic Grey Wolf Optimization Algorithm (BDGWO)
2.3. Proposed MBPSO-BDGWO-Based Ensemble Feature Selection Method
Objective Function
2.4. Simulation Study Design
2.4.1. Performance Evaluation Metrics
2.4.2. Ablation Experiments
3. Results
Real Data Set Applications
4. Conclusions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Abbreviations
| AUC | Area Under the Receiver Operating Characteristic Curve |
| BDGWO | Binary Dynamic Grey Wolf Optimization |
| BGWO | Binary Grey Wolf Optimization |
| BPSO | Binary Particle Swarm Optimization |
| CPI | Conditional Permutation Importance |
| FS | Feature Selection |
| FPR | False Positive Rate |
| GWO | Grey Wolf Optimization |
| | Average Pairwise Jaccard Index |
| MAD | Median Absolute Deviation |
| MBPSO | Modified Binary Particle Swarm Optimization |
| MI | Mutual Information |
| pDCM | Proposed Dynamic Coefficient Method |
| PSO | Particle Swarm Optimization |
| RF | Random Forest |
| TPR | True Positive Rate |
| XOR | Exclusive OR Logical Operator |
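Several of the abbreviations above denote quantities used later as evaluation metrics. As an illustrative sketch (not the authors' implementation; function and variable names are ours), the average pairwise Jaccard index, which scores the stability of a feature selector across independent runs, can be computed as:

```python
from itertools import combinations

def avg_pairwise_jaccard(subsets):
    """Mean Jaccard similarity |A & B| / |A | B| over all pairs of runs.

    `subsets` is a list of sets of selected feature indices, one per run;
    values near 1.0 indicate a stable selection procedure.
    """
    sets = [set(s) for s in subsets]
    pairs = list(combinations(sets, 2))
    return sum(len(a & b) / len(a | b) for a, b in pairs) / len(pairs)

# Three runs that agree on features 1 and 2 but differ elsewhere:
runs = [{1, 2, 3, 4}, {1, 2, 3, 5}, {1, 2, 4, 5}]
print(round(avg_pairwise_jaccard(runs), 3))  # → 0.6
```

Each pair of runs here shares 3 of 5 distinct features, so every pairwise similarity is 0.6 and so is their mean.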
References
- Pourahmadi, M. High-Dimensional Covariance Estimation: With High-Dimensional Data; Wiley: Hoboken, NJ, USA, 2013. [Google Scholar]
- Wang, L.; Jiang, S.; Jiang, S. A feature selection method via analysis of relevance, redundancy, and interaction. Expert Syst. Appl. 2021, 183, 115365. [Google Scholar] [CrossRef]
- Chandrashekar, G.; Sahin, F. A survey on feature selection methods. Comput. Electr. Eng. 2014, 40, 16–28. [Google Scholar] [CrossRef]
- Bellman, R. Dynamic programming. Math. Sci. Eng. 1967, 40, 101–137. [Google Scholar]
- Ayesha, S.; Hanif, M.K.; Talib, R. Overview and comparative study of dimensionality reduction techniques for high-dimensional data. Inf. Fusion. 2020, 59, 44–58. [Google Scholar] [CrossRef]
- Guyon, I.; Elisseeff, A. An introduction to variable and feature selection. J. Mach. Learn. Res. 2003, 3, 1157–1182. [Google Scholar]
- Venkatesh, B.; Anuradha, J. A review of feature selection and its methods. Cybern. Inf. Technol. 2019, 19, 3–26. [Google Scholar] [CrossRef]
- Akhy, S.A.; Mia, M.B.; Mustafa, S.; Chakraborti, N.R.; Krishnachalitha, K.C.; Rabbany, G. A comprehensive study on ensemble feature selection techniques for classification. In Proceedings of the 2024 11th International Conference on Computing for Sustainable Global Development (INDIACom), New Delhi, India, 28 February–1 March 2024; IEEE: New York, NY, USA, 2024; pp. 1319–1324. [Google Scholar]
- Gnana, D.A.A.; Balamurugan, S.A.A.; Leavline, E.J. Literature review on feature selection methods for high-dimensional data. Int. J. Comput. Appl. 2016, 136, 9–17. [Google Scholar] [CrossRef]
- Hijazi, N.M.; Faris, H.; Aljarah, I. A parallel metaheuristic approach for ensemble feature selection based on multi-core architectures. Expert Syst. Appl. 2021, 182. [Google Scholar] [CrossRef]
- Wu, T.; Hao, Y.; Yang, B.; Peng, L. ECM-EFS: An ensemble feature selection based on enhanced co-association matrix. Pattern Recognit. 2023, 139, 109449. [Google Scholar] [CrossRef]
- Saeys, Y.; Inza, I.; Larranaga, P. A review of feature selection techniques in Bioinformatics. Bioinformatics 2007, 23, 2507–2517. [Google Scholar] [CrossRef]
- Almomani, O. A feature selection model for network intrusion detection system based on PSO, GWO, FFA, and GA algorithms. Symmetry 2020, 12, 1046. [Google Scholar] [CrossRef]
- Spooner, A.; Mohammadi, G.; Sachdev, P.S.; Brodaty, H.; Sowmya, A.; Sydney Memory and Ageing Study and the Alzheimer’s Disease Neuroimaging Initiative. Ensemble feature selection with data-driven thresholding for Alzheimer’s disease biomarker discovery. BMC Bioinform. 2023, 24, 9. [Google Scholar] [CrossRef]
- Wang, J.; Xu, J.; Zhao, C.; Peng, Y.; Wang, H. An ensemble feature selection method for high-dimensional data based on sort aggregation. Syst. Sci. Control Eng. 2019, 7, 32–39. [Google Scholar] [CrossRef]
- Manikandan, G.; Abirami, S. A Survey on Feature Selection and Extraction Techniques for High-Dimensional Microarray Datasets. In Knowledge Computing and its Applications; Margret Anouncia, S., Wiil, U., Eds.; Springer: Singapore, 2018. [Google Scholar] [CrossRef]
- Zhuang, Y.; Fan, Z.; Gou, J.; Huang, Y.; Feng, W. An importance-based ensemble method using an adaptive threshold searching for feature selection. Expert Syst. Appl. 2025, 267, 126152. [Google Scholar] [CrossRef]
- Sumant, A.S.; Patil, D. Stability Investigation of Ensemble Feature Selection for High Dimensional Data Analytics. In Proceedings of the Third International Conference on Image Processing and Capsule Networks, ICIPCN 2022, Bangkok, Thailand, 20–21 May 2022; Chen, J.I.Z., Tavares, J.M.R.S., Shi, F., Eds.; Springer: Cham, Switzerland, 2022; Volume 514. [Google Scholar] [CrossRef]
- Guney, H.; Oztoprak, H. A robust ensemble feature selection technique for high-dimensional datasets based on minimum weight threshold method. Comput. Intell. 2022, 38, 1616–1658. [Google Scholar] [CrossRef]
- Tu, Q.; Chen, X.; Liu, X. Multi-strategy ensemble grey wolf optimizer and its application to feature selection. Appl. Soft Comput. 2019, 76, 16–30. [Google Scholar] [CrossRef]
- Singh, N.; Singh, P. A hybrid ensemble-filter wrapper feature selection approach for medical data classification. Chemom. Intell. Lab. Syst. 2021, 217, 104396. [Google Scholar] [CrossRef]
- Robindro, K.; Devi, S.S.; Clinton, U.B.; Takhellambam, L.; Singh, Y.R.; Hoque, N. Hybrid distributed feature selection using particle swarm optimization-mutual information. Data Sci. Manag. 2024, 7, 64–73. [Google Scholar] [CrossRef]
- Mandal, A.K.; Nadim, M.; Saha, H.; Sultana, T.; Hossain, M.D.; Huh, E.N. Feature subset selection for high-dimensional, low sampling size data classification using ensemble feature selection with a wrapper-based search. IEEE Access 2024, 12, 62341–62357. [Google Scholar] [CrossRef]
- Ab Hamid, T.M.T.; Sallehuddin, R.; Yunos, Z.M.; Ali, A. Ensemble based filter feature selection with harmonize particle swarm optimization and support vector machine for optimal cancer classification. Mach. Learn. Appl. 2021, 5, 100054. [Google Scholar] [CrossRef]
- Xu, J.; Sun, L.; Gao, Y.; Xu, T. An ensemble feature selection technique for cancer recognition. BioMed Mater. Eng. 2014, 24, 1001–1008. [Google Scholar] [CrossRef]
- Barrera-García, J.; Cisternas-Caneo, F.; Crawford, B.; Gómez Sánchez, M.; Soto, R. Feature selection problem and metaheuristics: A systematic literature review about its formulation, evaluation and applications. Biomimetics 2023, 9, 9. [Google Scholar] [CrossRef] [PubMed]
- Dokeroglu, T.; Deniz, A.; Kiziloz, H.E. A comprehensive survey on recent metaheuristics for feature selection. Neurocomputing 2022, 494, 269–296. [Google Scholar] [CrossRef]
- Piri, J.; Mohapatra, P.; Dey, R.; Acharya, B.; Gerogiannis, V.C.; Kanavos, A. Literature review on hybrid evolutionary approaches for feature selection. Algorithms 2023, 16, 167. [Google Scholar] [CrossRef]
- Nguyen, B.H.; Xue, B.; Zhang, M. A survey on swarm intelligence approaches to feature selection in data mining. Swarm Evol. Comput. 2020, 54, 100663. [Google Scholar] [CrossRef]
- Xue, B.; Zhang, M.; Browne, W.N.; Yao, X. A survey on evolutionary computation approaches to feature selection. IEEE Trans. Evol. Comput. 2016, 20, 606–626. [Google Scholar] [CrossRef]
- Gu, S.; Cheng, R.; Jin, Y. Feature selection for high-dimensional classification using a competitive swarm optimizer. Soft Comput. 2018, 22, 811–822. [Google Scholar] [CrossRef]
- Pham, T.H.; Raahemi, B. Bio-inspired feature selection algorithms with their applications: A systematic literature review. IEEE Access 2023, 11, 43733–43758. [Google Scholar] [CrossRef]
- Kennedy, J.; Eberhart, R.C. A discrete binary version of the particle swarm algorithm. In Proceedings of the 1997 IEEE International Conference on Systems, Man, and Cybernetics (SMC’97), Orlando, FL, USA, 12–15 October 1997; IEEE: New York, NY, USA, 1997; Volume 5, pp. 4104–4108. [Google Scholar] [CrossRef]
- Kennedy, J.; Eberhart, R. Particle swarm optimization. In Proceedings of the ICNN’95 International Conference on Neural Networks, Perth, Australia, 27 November–1 December 1995; Volume 4, pp. 1942–1948. [Google Scholar] [CrossRef]
- Ünler, A.; Murat, A.E.; Chinnam, R.B. mr2PSO: A maximum relevance minimum redundancy feature selection method based on swarm intelligence for support vector machine classification. Inf. Sci. 2011, 181, 4625–4641. [Google Scholar] [CrossRef]
- Abdmouleh, Z.; Gastli, A.; Ben-Brahim, L.; Haouari, M.; Al-Emadi, N.A. Review of Optimization Techniques applied for the Integration of Distributed Generation from Renewable Energy Sources. Renew. Energy 2017, 113, 266–280. [Google Scholar] [CrossRef]
- Tran, B.; Zhang, M.; Xue, B. A PSO-based hybrid feature selection algorithm for high-dimensional classification. In Proceedings of the 2016 IEEE Congress on Evolutionary Computation (CEC), Vancouver, BC, Canada, 24–29 July 2016; IEEE: New York, NY, USA, 2016; pp. 3801–3808. [Google Scholar]
- Gupta, D.K.; Reddy, K.S.; Shweta; Ekbal, A. PSO-ASENT: Feature selection using particle swarm optimization for aspect-based sentiment analysis. In Proceedings of the International Conference on Applications of Natural Language to Information Systems (NLDB 2015), Passau, Germany, 17–19 June 2015; Springer: Cham, Switzerland, 2015; pp. 220–233. [Google Scholar]
- Brezočnik, L. Feature selection for classification using particle swarm optimization. In Proceedings of the IEEE EUROCON 2017—17th International Conference on Smart Technologies, Ohrid, North Macedonia, 6–8 July 2017; IEEE: New York, NY, USA, 2017; pp. 966–971. [Google Scholar]
- Abualigah, L.M.; Khader, A.T.; Hanandeh, E.S. A new feature selection method to improve the document clustering using particle swarm optimization algorithm. J. Comput. Sci. 2018, 25, 456–466. [Google Scholar] [CrossRef]
- Chuang, L.; Yang, C.; Yang, C. Tabu Search and Binary Particle Swarm Optimization for Feature Selection Using Microarray Data. J. Comput. Biol. 2009, 16, 1689–1703. [Google Scholar] [CrossRef]
- Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey wolf optimizer. Adv. Eng. Softw. 2014, 69, 46–61. [Google Scholar] [CrossRef]
- Emary, E.; Zawbaa, H.M.; Hassanien, A.E. Binary grey wolf optimization approaches for feature selection. Neurocomputing 2016, 172, 371–381. [Google Scholar] [CrossRef]
- Shen, C.; Zhang, K. Two-stage improved Grey Wolf optimization algorithm for feature selection on high-dimensional classification. Complex. Intell. Syst. 2022, 8, 2769–2789. [Google Scholar] [CrossRef]
- Yousef, J.; Youssef, A.; Keshk, A. A hybrid swarm intelligence based feature selection algorithm for high dimensional datasets. Int. J. Comput. Info 2021, 8, 67–86. [Google Scholar] [CrossRef]
- Too, J.; Abdullah, A.R.; Mohd Saad, N.; Mohd Ali, N.; Tee, W. A New Competitive Binary Grey Wolf Optimizer to Solve the Feature Selection Problem in EMG Signals Classification. Computers 2018, 7, 58. [Google Scholar] [CrossRef]
- El-Kenawy, E.S.; Eid, M. Hybrid Gray Wolf and Particle Swarm Optimization for Feature Selection. Int. J. Innov. Comput. Inf. Control 2020, 16, 831–844. [Google Scholar] [CrossRef]
- Al-Tashi, Q.; Abdulkadir, S.J.; Rais, H.M.; Mirjalili, S.; Alhussian, H. Binary Optimization Using Hybrid Grey Wolf Optimization for Feature Selection. IEEE Access 2019, 7, 39496–39508. [Google Scholar] [CrossRef]
- El-Hasnony, I.M.; Barakat, S.I.; Elhoseny, M.; Mostafa, R.R. Improved feature selection model for big data analytics. IEEE Access 2020, 8, 66989–67004. [Google Scholar] [CrossRef]
- Abdo, M.A.; Mostafa, R.; Abdel-Hamid, L. An optimized hybrid approach for feature selection based on Chi-square and particle swarm optimization algorithms. Data 2024, 9, 20. [Google Scholar] [CrossRef]
- Qasim, O.S.; Algamal, Z.Y. Feature selection using particle swarm optimization-based logistic regression model. Chemom. Intell. Lab. Syst. 2018, 182, 41–46. [Google Scholar] [CrossRef]
- Cervante, L.; Xue, B.; Zhang, M.; Shang, L. Binary particle swarm optimisation for feature selection: A filter based approach. In Proceedings of the 2012 IEEE Congress on Evolutionary Computation, Brisbane, QLD, Australia, 10–15 June 2012; IEEE: New York, NY, USA, 2012; pp. 1–8. [Google Scholar] [CrossRef]
- Ma, Y.; Jiang, C.; Hou, Z.; Wang, C. The formulation of the optimal strategies for the electricity producers based on the particle swarm optimization algorithm. IEEE Trans. Power Syst. 2006, 21, 1663–1671. [Google Scholar] [CrossRef]
- Egrioglu, E.; Yolcu, U.; Aladag, C.H.; Kocak, C. An ARMA type fuzzy time series forecasting method based on particle swarm optimization. Math. Probl. Eng. 2013, 2013, 935815. [Google Scholar] [CrossRef]
- Erdoğan, F.; Karakoyun, M.; Gülcü, Ş. A novel binary Grey Wolf Optimizer algorithm with a new dynamic position update mechanism for feature selection problem. Soft Comput. 2024, 28, 12623–12654. [Google Scholar] [CrossRef]
- Shami, T.M.; El-Saleh, A.A.; Alswaitti, M.; Al-Tashi, Q.; Summakieh, M.A.; Mirjalili, S. Particle swarm optimization: A comprehensive survey. IEEE Access 2022, 10, 10031–10061. [Google Scholar] [CrossRef]
- Abbes, W.; Kechaou, Z.; Hussain, A.; Qahtani, A.M.; Almutiry, O.; Dhahri, H.; Alimi, A.M. An Enhanced Binary Particle Swarm Optimization (E-BPSO) algorithm for service placement in hybrid cloud platforms. Neural Comput. Appl. 2023, 35, 1343–1361. [Google Scholar] [CrossRef]
- Sancar, N.; Onakpojeruo, E.P.; Inan, D.; Ozsahin, D.U. Adaptive elastic net based on modified PSO for Variable selection in cox model with high-dimensional data: A comprehensive simulation study. IEEE Access 2023, 11, 127302–127316. [Google Scholar] [CrossRef]
- Pan, J.-S.; Hu, P.; Snášel, V.; Chu, S.-C. A survey on binary metaheuristic algorithms and their engineering applications. Artif. Intell. Rev. 2023, 56, 6101–6167. [Google Scholar] [CrossRef] [PubMed]
- Karakoyun, M.; Ozkis, A.; Kodaz, H. A new algorithm based on gray wolf optimizer and shuffled frog leaping algorithm to solve multi-objective optimization problems. Appl. Soft Comput. 2020, 96, 106560. [Google Scholar] [CrossRef]
- Bradley, A.P. The use of the area under the ROC curve in the evaluation of machine learning algorithms. Pattern Recognit. 1997, 30, 1145–1159. [Google Scholar] [CrossRef]
- Chawla, N.V. Data Mining for Imbalanced Datasets: An Overview. In Data Mining and Knowledge Discovery Handbook; Maimon, O., Rokach, L., Eds.; Springer: Boston, MA, USA, 2009; pp. 875–886. [Google Scholar] [CrossRef]
- Ling, C.X.; Huang, J.; Zhang, H. AUC: A better measure than accuracy in comparing learning algorithms. In Proceedings of the Canadian Conference on Artificial Intelligence (AI 2003), Halifax, NS, Canada, 11–13 June 2003; Springer: Berlin/Heidelberg, Germany, 2003; pp. 329–341. [Google Scholar]
- Sun, L. AVC: Selecting discriminative features on basis of AUC by feature ranking. BMC Bioinform. 2017, 18, 146. [Google Scholar] [CrossRef] [PubMed]
- Liu, H.; Motoda, H. Computational Methods of Feature Selection; Chapman and Hall/CRC: Boca Raton, FL, USA, 2007. [Google Scholar]
- Chen, X.W.; Wasikowski, M. FAST: A ROC-based feature selection metric for small samples and imbalanced data classification problems. In Proceedings of the 14th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD 2008), Las Vegas, NV, USA, 24–27 August 2008; ACM: New York, NY, USA, 2008; pp. 124–132. [Google Scholar]
- Xu, J.W.; Suzuki, K. Max-AUC feature selection in computer-aided detection of polyps in CT colonography. IEEE J. Biomed. Health Inf. 2014, 18, 585–593. [Google Scholar] [CrossRef]
- Tian, Y.; Shi, Y.; Chen, X.; Chen, W. AUC Maximizing Support Vector Machines with Feature Selection. Procedia Comput. Sci. 2011, 4, 1691–1698. [Google Scholar] [CrossRef]
- Vivek, Y.; Ravi, V.; Krishna, P.R. Feature subset selection for big data via parallel chaotic binary differential evolution and feature-level elitism. Comput. Electr. Eng. 2025, 123, 110232. [Google Scholar] [CrossRef]
- Yang, T.; Ying, Y. AUC maximization in the era of big data and AI: A survey. ACM Comput. Surv. 2022, 55, 1–37. [Google Scholar] [CrossRef]
- Strobl, C.; Boulesteix, A.L.; Zeileis, A.; Hothorn, T. Bias in random forest variable importance measures: Illustrations, sources and a solution. BMC Bioinform. 2007, 8, 25. [Google Scholar] [CrossRef]
- Debeer, D.; Strobl, C. Conditional permutation importance revisited. BMC Bioinform. 2020, 21, 307. [Google Scholar] [CrossRef]
- Breiman, L. Random forests. Mach. Learn. 2001, 45, 5–32. [Google Scholar] [CrossRef]
- Belgiu, M.; Drăguţ, L. Random forest in remote sensing: A review of applications and future directions. ISPRS J. Photogramm. Remote Sens. 2016, 114, 24–31. [Google Scholar] [CrossRef]
- Billah, M.; Islam, A.K.M.S.; Bin Mamoon, W.; Rahman, M.R. Random forest classifications for landuse mapping to assess rapid flood damage using Sentinel-1 and Sentinel-2 data. Remote Sens. Appl. Soc. Environ. 2023, 30, 100947. [Google Scholar] [CrossRef]
- Maxwell, A.E.; Warner, T.A.; Fang, F. Implementation of machine-learning classification in remote sensing: An applied review. Int. J. Rem. Sens. 2018, 39, 2784–2817. [Google Scholar] [CrossRef]
- Seijo-Pardo, B.; Porto-Díaz, I.; Bolón-Canedo, V.; Alonso-Betanzos, A. Ensemble feature selection: Homogeneous and heterogeneous approaches. Knowl. Based Syst. 2017, 118, 124–139. [Google Scholar] [CrossRef]
- Bolón-Canedo, V.; Sánchez-Maroño, N.; Alonso-Betanzos, A. A review of feature selection methods on synthetic data. Knowl. Inf. Syst. 2013, 34, 483–519. [Google Scholar] [CrossRef]
- Maldonado, S.; López, J.; Vairetti, C. An alternative approach for feature selection using support vector machines. Inf. Sci. 2014, 279, 163–175. [Google Scholar] [CrossRef]
- Leys, C.; Ley, C.; Klein, O.; Bernard, P.; Licata, L. Detecting outliers: Do not use standard deviation around the mean, use absolute deviation around the median. J. Exp. Soc. Psychol. 2013, 49, 764–766. [Google Scholar] [CrossRef]
- Díaz-Uriarte, R.; Álvarez de Andrés, S. Gene selection and classification of microarray data using random forest. BMC Bioinform. 2006, 7, 3. [Google Scholar] [CrossRef]
- Kursa, M.B.; Rudnicki, W.R. The all relevant feature selection using random forest. arXiv 2011, arXiv:1106.5112. [Google Scholar] [CrossRef]
- Kuncheva, L.I. A stability index for feature selection. In Proceedings of the Artificial Intelligence and Applications (AIAP07), Innsbruck, Austria, 12–14 February 2007; pp. 390–395. [Google Scholar]
- Nogueira, S.; Sechidis, K.; Brown, G. On the stability of feature selection algorithms. J. Mach. Learn. Res. 2018, 18, 1–54. Available online: http://jmlr.org/papers/v18/17-514.html (accessed on 10 October 2025).
- Somol, P.; Novovicová, J. Evaluating stability and comparing output of feature selectors that optimize feature subset cardinality. IEEE Trans. Pattern Anal. Mach. Intell. 2010, 32, 1921–1939. [Google Scholar] [CrossRef]
- Bushehri, S.; Dehghanizadeh, M.; Kalantar, S.; Zarchi, M. SCADI; UCI Machine Learning Repository: Irvine, CA, USA, 2018. [Google Scholar] [CrossRef]
- Gül, Ş.; Rahim, F. Toxicity; UCI Machine Learning Repository: Irvine, CA, USA, 2021. [Google Scholar] [CrossRef]
- Gordon, G.J.; Jensen, R.V.; Hsiao, L.-L.; Gullans, S.R.; Blumenstock, J.E.; Ramaswamy, S.; Richards, W.G.; Sugarbaker, D.J.; Bueno, R. Translation of microarray data into clinically relevant cancer diagnostic tests using gene expression ratios in lung cancer and mesothelioma. Cancer Res. 2002, 62, 4963–4967. [Google Scholar] [PubMed]
- Singh, D.; Febbo, P.G.; Ross, K.; Jackson, D.G.; Manola, J.; Ladd, C.; Tamayo, P.; Renshaw, A.A.; D’Amico, A.V.; Richie, J.P.; et al. Gene expression correlates of clinical prostate cancer behavior. Cancer Cell 2002, 1, 203–209. [Google Scholar] [CrossRef] [PubMed]
- Ramhiser, J. datamicroarray: Microarray Gene Expression Datasets for High-Dimensional Classification. GitHub Repository 2015. Available online: https://github.com/ramhiser/datamicroarray (accessed on 10 December 2025).

| | Selected Significant | Selected Redundant |
|---|---|---|
| True Significant | TP | FN |
| True Redundant | FP | TN |
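Reading the matrix above row by row, each reported evaluation metric reduces to a simple ratio of TP, FP, TN, and FN. A minimal sketch (the helper name is ours, not from the paper):

```python
def selection_metrics(tp, fp, tn, fn):
    """Evaluation metrics from a feature-recovery confusion matrix."""
    tpr = tp / (tp + fn)                  # truly significant features recovered
    fpr = fp / (fp + tn)                  # redundant features wrongly selected
    precision = tp / (tp + fp)
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    f1 = 2 * precision * tpr / (precision + tpr)
    return {"TPR": tpr, "FPR": fpr, "Precision": precision,
            "Accuracy": accuracy, "F1-Score": f1}

# e.g. 8 of 10 true features recovered, 3 of 50 redundant features kept:
m = selection_metrics(tp=8, fp=3, tn=47, fn=2)
print(m["TPR"], m["FPR"])  # → 0.8 0.06
```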
| Configuration | AUC | FPR | TPR | Precision | Accuracy | F1-Score | Jaccard Index |
|---|---|---|---|---|---|---|---|
| Full Ensemble (Proposed) | 0.891 | 0.110 | 0.887 | 0.851 | 0.905 | 0.883 | 0.837 |
| AUC only | 0.645 | 0.425 | 0.653 | 0.587 | 0.629 | 0.624 | 0.363 |
| AUC + parsimony penalty | 0.700 | 0.347 | 0.776 | 0.711 | 0.753 | 0.735 | 0.585 |
| No MI fusion (voting score only) | 0.603 | 0.498 | 0.614 | 0.574 | 0.620 | 0.591 | 0.312 |
| No MAD term in threshold (median only) | 0.741 | 0.314 | 0.769 | 0.722 | 0.753 | 0.748 | 0.598 |
| No adaptive alpha (fixed at 0.5) | 0.718 | 0.385 | 0.743 | 0.705 | 0.734 | 0.719 | 0.574 |
| No [kmin, kmax] restriction | 0.625 | 0.425 | 0.648 | 0.612 | 0.629 | 0.632 | 0.421 |
| Accuracy objective instead of AUC | 0.508 | 0.523 | 0.531 | 0.499 | 0.514 | 0.517 | 0.400 |
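The "median + MAD" ablation row above refers to a robust, data-driven cutoff on the aggregated feature scores. A minimal sketch under our own assumptions (the multiplier `k` and the direction of the cut are illustrative, not the paper's exact rule):

```python
import statistics

def mad_threshold(scores, k=1.0):
    """Robust cutoff: median(scores) + k * MAD(scores).

    Features whose aggregated ensemble score exceeds the returned value
    would be retained; dropping the MAD term (the "median only"
    configuration) corresponds to k = 0.
    """
    med = statistics.median(scores)
    mad = statistics.median(abs(s - med) for s in scores)
    return med + k * mad

print(mad_threshold([1, 2, 3, 9]))  # → 3.5
```

Unlike a mean-plus-standard-deviation cutoff, this threshold is not dragged upward by a few extreme scores, which is why MAD-based rules are preferred for outlier-prone score distributions.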
| Scenario | n | p | Correlation | Method | Accuracy | TPR | F1 | FPR | AUC | Precision | Jaccard Index |
|---|---|---|---|---|---|---|---|---|---|---|---|
| I-1 | 50 | 60 | ρ = 0.10 | BPSO | 0.709 (0.214) | 0.770 (0.226) | 0.695 (0.198) | 0.301 (0.107) | 0.715 (0.205) | 0.619 (0.188) | 0.456 |
| | | | | BGWO | 0.724 (0.206) | 0.757 (0.198) | 0.702 (0.157) | 0.236 (0.089) | 0.719 (0.191) | 0.640 (0.157) | 0.491 |
| | | | | MBPSO | 0.791 (0.143) | 0.815 (0.180) | 0.759 (0.141) | 0.237 (0.084) | 0.795 (0.143) | 0.700 (0.121) | 0.583 |
| | | | | BDGWO | 0.842 (0.107) | 0.833 (0.102) | 0.802 (0.093) | 0.226 (0.043) | 0.834 (0.099) | 0.733 (0.093) | 0.697 |
| | | | | Ensemble | 0.906 (0.036) | 0.962 (0.054) | 0.898 (0.032) | 0.077 (0.010) | 0.919 (0.036) | 0.896 (0.049) | 0.856 |
| I-2 | 50 | 60 | ρ = 0.40 | BPSO | 0.675 (0.232) | 0.674 (0.182) | 0.662 (0.171) | 0.340 (0.129) | 0.686 (0.234) | 0.573 (0.208) | 0.388 |
| | | | | BGWO | 0.709 (0.254) | 0.667 (0.169) | 0.673 (0.150) | 0.302 (0.064) | 0.700 (0.159) | 0.604 (0.158) | 0.446 |
| | | | | MBPSO | 0.799 (0.102) | 0.820 (0.147) | 0.744 (0.154) | 0.331 (0.086) | 0.781 (0.119) | 0.694 (0.127) | 0.504 |
| | | | | BDGWO | 0.830 (0.127) | 0.821 (0.072) | 0.785 (0.110) | 0.186 (0.027) | 0.828 (0.091) | 0.709 (0.100) | 0.663 |
| | | | | Ensemble | 0.905 (0.046) | 0.942 (0.049) | 0.895 (0.039) | 0.022 (0.008) | 0.917 (0.044) | 0.901 (0.052) | 0.893 |
| I-3 | 50 | 60 | ρ = 0.90 | BPSO | 0.610 (0.253) | 0.553 (0.257) | 0.574 (0.202) | 0.346 (0.137) | 0.617 (0.190) | 0.498 (0.194) | 0.286 |
| | | | | BGWO | 0.658 (0.247) | 0.595 (0.212) | 0.612 (0.191) | 0.364 (0.093) | 0.636 (0.214) | 0.514 (0.114) | 0.311 |
| | | | | MBPSO | 0.724 (0.120) | 0.567 (0.168) | 0.671 (0.141) | 0.271 (0.081) | 0.733 (0.108) | 0.566 (0.105) | 0.415 |
| | | | | BDGWO | 0.797 (0.116) | 0.645 (0.094) | 0.726 (0.132) | 0.221 (0.036) | 0.787 (0.123) | 0.657 (0.117) | 0.585 |
| | | | | Ensemble | 0.900 (0.041) | 0.982 (0.058) | 0.883 (0.066) | 0.012 (0.022) | 0.903 (0.089) | 0.889 (0.045) | 0.885 |
| IV | 50 | 60 | Grouped | BPSO | 0.574 (0.277) | 0.472 (0.245) | 0.515 (0.205) | 0.430 (0.119) | 0.563 (0.247) | 0.352 (0.219) | 0.237 |
| | | | | BGWO | 0.600 (0.173) | 0.390 (0.142) | 0.553 (0.140) | 0.399 (0.077) | 0.588 (0.158) | 0.328 (0.199) | 0.308 |
| | | | | MBPSO | 0.698 (0.175) | 0.553 (0.160) | 0.622 (0.154) | 0.295 (0.079) | 0.682 (0.179) | 0.514 (0.118) | 0.386 |
| | | | | BDGWO | 0.723 (0.092) | 0.570 (0.086) | 0.689 (0.096) | 0.281 (0.037) | 0.737 (0.104) | 0.608 (0.115) | 0.495 |
| | | | | Ensemble | 0.885 (0.050) | 0.964 (0.045) | 0.894 (0.042) | 0.018 (0.019) | 0.891 (0.077) | 0.905 (0.049) | 0.851 |
| II-1 | 50 | 100 | ρ = 0.10 | BPSO | 0.606 (0.164) | 0.745 (0.243) | 0.591 (0.233) | 0.396 (0.102) | 0.618 (0.151) | 0.607 (0.175) | 0.432 |
| | | | | BGWO | 0.621 (0.267) | 0.735 (0.201) | 0.601 (0.201) | 0.336 (0.108) | 0.632 (0.165) | 0.623 (0.178) | 0.471 |
| | | | | MBPSO | 0.691 (0.159) | 0.800 (0.184) | 0.656 (0.121) | 0.331 (0.091) | 0.696 (0.110) | 0.687 (0.121) | 0.560 |
| | | | | BDGWO | 0.741 (0.104) | 0.815 (0.130) | 0.701 (0.114) | 0.281 (0.057) | 0.741 (0.098) | 0.720 (0.077) | 0.675 |
| | | | | Ensemble | 0.902 (0.045) | 0.958 (0.066) | 0.893 (0.038) | 0.082 (0.021) | 0.915 (0.043) | 0.892 (0.039) | 0.862 |
| II-2 | 50 | 100 | ρ = 0.40 | BPSO | 0.552 (0.248) | 0.652 (0.233) | 0.553 (0.231) | 0.433 (0.109) | 0.584 (0.144) | 0.558 (0.168) | 0.373 |
| | | | | BGWO | 0.597 (0.147) | 0.640 (0.219) | 0.571 (0.193) | 0.391 (0.107) | 0.597 (0.169) | 0.592 (0.115) | 0.430 |
| | | | | MBPSO | 0.681 (0.175) | 0.785 (0.197) | 0.643 (0.116) | 0.366 (0.083) | 0.675 (0.116) | 0.681 (0.140) | 0.502 |
| | | | | BDGWO | 0.729 (0.124) | 0.805 (0.108) | 0.683 (0.092) | 0.271 (0.044) | 0.731 (0.085) | 0.703 (0.114) | 0.650 |
| | | | | Ensemble | 0.910 (0.034) | 0.960 (0.058) | 0.895 (0.039) | 0.079 (0.011) | 0.917 (0.070) | 0.896 (0.043) | 0.888 |
| II-3 | 50 | 100 | ρ = 0.90 | BPSO | 0.506 (0.278) | 0.540 (0.246) | 0.471 (0.191) | 0.449 (0.108) | 0.521 (0.158) | 0.492 (0.157) | 0.270 |
| | | | | BGWO | 0.523 (0.186) | 0.580 (0.208) | 0.511 (0.132) | 0.441 (0.074) | 0.539 (0.142) | 0.505 (0.169) | 0.305 |
| | | | | MBPSO | 0.619 (0.120) | 0.555 (0.224) | 0.566 (0.171) | 0.361 (0.062) | 0.631 (0.121) | 0.553 (0.133) | 0.402 |
| | | | | BDGWO | 0.696 (0.089) | 0.630 (0.100) | 0.623 (0.117) | 0.296 (0.045) | 0.686 (0.117) | 0.644 (0.119) | 0.565 |
| | | | | Ensemble | 0.898 (0.043) | 0.980 (0.044) | 0.881 (0.024) | 0.011 (0.009) | 0.902 (0.075) | 0.887 (0.068) | 0.882 |
| V | 50 | 100 | Grouped | BPSO | 0.479 (0.243) | 0.460 (0.253) | 0.416 (0.256) | 0.519 (0.081) | 0.472 (0.193) | 0.343 (0.170) | 0.223 |
| | | | | BGWO | 0.504 (0.251) | 0.380 (0.168) | 0.456 (0.128) | 0.486 (0.086) | 0.495 (0.192) | 0.320 (0.136) | 0.290 |
| | | | | MBPSO | 0.593 (0.134) | 0.540 (0.200) | 0.526 (0.136) | 0.386 (0.091) | 0.585 (0.147) | 0.508 (0.101) | 0.370 |
| | | | | BDGWO | 0.621 (0.139) | 0.560 (0.123) | 0.586 (0.128) | 0.373 (0.054) | 0.636 (0.124) | 0.595 (0.077) | 0.480 |
| | | | | Ensemble | 0.883 (0.036) | 0.962 (0.053) | 0.891 (0.037) | 0.018 (0.014) | 0.889 (0.063) | 0.903 (0.072) | 0.848 |
| III-1 | 50 | 200 | ρ = 0.10 | BPSO | 0.573 (0.255) | 0.728 (0.216) | 0.577 (0.199) | 0.411 (0.132) | 0.609 (0.198) | 0.593 (0.182) | 0.410 |
| | | | | BGWO | 0.596 (0.213) | 0.723 (0.224) | 0.591 (0.124) | 0.347 (0.118) | 0.623 (0.158) | 0.618 (0.192) | 0.455 |
| | | | | MBPSO | 0.679 (0.147) | 0.788 (0.176) | 0.645 (0.104) | 0.343 (0.094) | 0.683 (0.153) | 0.677 (0.197) | 0.545 |
| | | | | BDGWO | 0.727 (0.085) | 0.805 (0.132) | 0.691 (0.144) | 0.289 (0.045) | 0.733 (0.141) | 0.712 (0.113) | 0.661 |
| | | | | Ensemble | 0.895 (0.056) | 0.956 (0.042) | 0.889 (0.057) | 0.090 (0.012) | 0.911 (0.078) | 0.887 (0.073) | 0.857 |
| III-2 | 50 | 200 | ρ = 0.40 | BPSO | 0.561 (0.274) | 0.638 (0.200) | 0.537 (0.201) | 0.446 (0.145) | 0.571 (0.187) | 0.545 (0.205) | 0.355 |
| | | | | BGWO | 0.590 (0.204) | 0.622 (0.206) | 0.559 (0.143) | 0.401 (0.122) | 0.591 (0.170) | 0.582 (0.181) | 0.415 |
| | | | | MBPSO | 0.667 (0.106) | 0.775 (0.189) | 0.633 (0.123) | 0.379 (0.109) | 0.657 (0.162) | 0.668 (0.120) | 0.490 |
| | | | | BDGWO | 0.719 (0.092) | 0.792 (0.129) | 0.671 (0.123) | 0.283 (0.053) | 0.723 (0.153) | 0.695 (0.094) | 0.642 |
| | | | | Ensemble | 0.879 (0.060) | 0.940 (0.063) | 0.885 (0.074) | 0.094 (0.017) | 0.908 (0.070) | 0.892 (0.051) | 0.884 |
| III-3 | 50 | 200 | ρ = 0.90 | BPSO | 0.508 (0.255) | 0.522 (0.199) | 0.456 (0.217) | 0.461 (0.135) | 0.513 (0.160) | 0.484 (0.179) | 0.253 |
| | | | | BGWO | 0.517 (0.217) | 0.567 (0.204) | 0.496 (0.136) | 0.451 (0.095) | 0.531 (0.216) | 0.498 (0.129) | 0.287 |
| | | | | MBPSO | 0.606 (0.155) | 0.544 (0.155) | 0.551 (0.140) | 0.376 (0.104) | 0.619 (0.173) | 0.545 (0.091) | 0.380 |
| | | | | BDGWO | 0.683 (0.102) | 0.620 (0.088) | 0.611 (0.065) | 0.303 (0.050) | 0.673 (0.107) | 0.634 (0.080) | 0.555 |
| | | | | Ensemble | 0.884 (0.041) | 0.976 (0.056) | 0.879 (0.051) | 0.012 (0.013) | 0.903 (0.061) | 0.883 (0.061) | 0.880 |
| VI | 50 | 200 | Grouped | BPSO | 0.471 (0.223) | 0.445 (0.190) | 0.403 (0.238) | 0.533 (0.129) | 0.457 (0.152) | 0.328 (0.146) | 0.209 |
| | | | | BGWO | 0.496 (0.199) | 0.365 (0.187) | 0.439 (0.185) | 0.499 (0.108) | 0.481 (0.237) | 0.305 (0.117) | 0.274 |
| | | | | MBPSO | 0.581 (0.137) | 0.525 (0.176) | 0.511 (0.133) | 0.393 (0.115) | 0.571 (0.135) | 0.498 (0.128) | 0.355 |
| | | | | BDGWO | 0.606 (0.094) | 0.545 (0.097) | 0.571 (0.088) | 0.379 (0.055) | 0.619 (0.119) | 0.585 (0.114) | 0.460 |
| | | | | Ensemble | 0.873 (0.050) | 0.961 (0.062) | 0.867 (0.054) | 0.018 (0.019) | 0.892 (0.053) | 0.862 (0.057) | 0.845 |
| Scenario | Metric | Ensemble vs. BPSO | Ensemble vs. BGWO | Ensemble vs. MBPSO | Ensemble vs. BDGWO |
|---|---|---|---|---|---|
| I-1 | Accuracy | 0.0021 | 0.0016 | 0.0092 | 0.0120 |
| | TPR | 0.0017 | 0.0013 | 0.0103 | 0.0118 |
| | F1-Score | 0.0007 | 0.0009 | 0.0100 | 0.0107 |
| | FPR | 0.0000 | 0.0000 | 0.0089 | 0.0090 |
| | AUC | 0.0012 | 0.0015 | 0.0137 | 0.0149 |
| | Precision | 0.0001 | 0.0001 | 0.0117 | 0.0132 |
| I-2 | Accuracy | 0.0012 | 0.0012 | 0.0069 | 0.0081 |
| | TPR | 0.0009 | 0.0010 | 0.0082 | 0.0080 |
| | F1-Score | 0.0003 | 0.0005 | 0.0071 | 0.0078 |
| | FPR | 0.0000 | 0.0000 | 0.0038 | 0.0041 |
| | AUC | 0.0008 | 0.0009 | 0.0076 | 0.0085 |
| | Precision | 0.0000 | 0.0000 | 0.0070 | 0.0073 |
| I-3 | Accuracy | 0.0000 | 0.0000 | 0.0051 | 0.0057 |
| | TPR | 0.0000 | 0.0000 | 0.0045 | 0.0052 |
| | F1-Score | 0.0000 | 0.0000 | 0.0050 | 0.0062 |
| | FPR | 0.0000 | 0.0000 | 0.0016 | 0.0025 |
| | AUC | 0.0000 | 0.0000 | 0.0062 | 0.0069 |
| | Precision | 0.0000 | 0.0000 | 0.0025 | 0.0032 |
| IV | Accuracy | 0.0000 | 0.0000 | 0.0022 | 0.0033 |
| | TPR | 0.0000 | 0.0000 | 0.0010 | 0.0012 |
| | F1-Score | 0.0000 | 0.0000 | 0.0028 | 0.0035 |
| | FPR | 0.0000 | 0.0000 | 0.0019 | 0.0012 |
| | AUC | 0.0000 | 0.0000 | 0.0023 | 0.0041 |
| | Precision | 0.0000 | 0.0000 | 0.0009 | 0.0012 |
| II-1 | Accuracy | 0.0000 | 0.0000 | 0.0082 | 0.0098 |
| | TPR | 0.0000 | 0.0000 | 0.0092 | 0.0100 |
| | F1-Score | 0.0000 | 0.0000 | 0.0076 | 0.0093 |
| | FPR | 0.0000 | 0.0000 | 0.0031 | 0.0045 |
| | AUC | 0.0000 | 0.0000 | 0.0103 | 0.0119 |
| | Precision | 0.0000 | 0.0000 | 0.0079 | 0.0094 |
| II-2 | Accuracy | 0.0000 | 0.0000 | 0.0078 | 0.0092 |
| | TPR | 0.0000 | 0.0000 | 0.0074 | 0.0085 |
| | F1-Score | 0.0000 | 0.0000 | 0.0084 | 0.0087 |
| | FPR | 0.0000 | 0.0000 | 0.0042 | 0.0040 |
| | AUC | 0.0000 | 0.0000 | 0.0088 | 0.0092 |
| | Precision | 0.0000 | 0.0000 | 0.0068 | 0.0089 |
| II-3 | Accuracy | 0.0000 | 0.0000 | 0.0011 | 0.0019 |
| | TPR | 0.0000 | 0.0000 | 0.0002 | 0.0015 |
| | F1-Score | 0.0000 | 0.0000 | 0.0010 | 0.0018 |
| | FPR | 0.0000 | 0.0000 | 0.0005 | 0.0010 |
| | AUC | 0.0000 | 0.0000 | 0.0015 | 0.0020 |
| | Precision | 0.0000 | 0.0000 | 0.0001 | 0.0023 |
| V | Accuracy | 0.0000 | 0.0000 | 0.0009 | 0.0013 |
| | TPR | 0.0000 | 0.0000 | 0.0001 | 0.0001 |
| | F1-Score | 0.0000 | 0.0000 | 0.0007 | 0.0005 |
| | FPR | 0.0000 | 0.0000 | 0.0003 | 0.0003 |
| | AUC | 0.0000 | 0.0000 | 0.0011 | 0.0008 |
| | Precision | 0.0000 | 0.0000 | 0.0000 | 0.0001 |
| III-1 | Accuracy | 0.0000 | 0.0000 | 0.0080 | 0.0086 |
| | TPR | 0.0000 | 0.0000 | 0.0085 | 0.0090 |
| | F1-Score | 0.0000 | 0.0000 | 0.0070 | 0.0068 |
| | FPR | 0.0000 | 0.0000 | 0.0024 | 0.0033 |
| | AUC | 0.0000 | 0.0000 | 0.0096 | 0.0099 |
| | Precision | 0.0000 | 0.0000 | 0.0073 | 0.0088 |
| III-2 | Accuracy | 0.0000 | 0.0000 | 0.0074 | 0.0082 |
| | TPR | 0.0000 | 0.0000 | 0.0077 | 0.0078 |
| | F1-Score | 0.0000 | 0.0000 | 0.0069 | 0.0074 |
| | FPR | 0.0000 | 0.0000 | 0.0019 | 0.0033 |
| | AUC | 0.0000 | 0.0000 | 0.0081 | 0.0087 |
| | Precision | 0.0000 | 0.0000 | 0.0062 | 0.0070 |
| III-3 | Accuracy | 0.0000 | 0.0000 | 0.0009 | 0.0019 |
| | TPR | 0.0000 | 0.0000 | 0.0001 | 0.0008 |
| | F1-Score | 0.0000 | 0.0000 | 0.0006 | 0.0007 |
| | FPR | 0.0000 | 0.0000 | 0.0003 | 0.0005 |
| | AUC | 0.0000 | 0.0000 | 0.0009 | 0.0018 |
| | Precision | 0.0000 | 0.0000 | 0.0000 | 0.0010 |
| VI | Accuracy | 0.0000 | 0.0000 | 0.0005 | 0.0008 |
| | TPR | 0.0000 | 0.0000 | 0.0000 | 0.0004 |
| | F1-Score | 0.0000 | 0.0000 | 0.0003 | 0.0005 |
| | FPR | 0.0000 | 0.0000 | 0.0000 | 0.0000 |
| | AUC | 0.0000 | 0.0000 | 0.0009 | 0.0011 |
| | Precision | 0.0000 | 0.0000 | 0.0000 | 0.0007 |
| Method | p = 60 | p = 100 | p = 200 |
|---|---|---|---|
| BPSO | 20.4 | 33.8 | 56.9 |
| BGWO | 22.1 | 38.7 | 62.4 |
| MBPSO | 27.3 | 44.6 | 68.2 |
| BDGWO | 32.9 | 48.8 | 79.1 |
| MBPSO–BDGWO | 37.2 | 57.9 | 92.6 |
| Data Set | Measure | BPSO | BGWO | MBPSO | BDGWO | Ensemble |
|---|---|---|---|---|---|---|
| SCADI | Accuracy | 0.568 (0.089) | 0.618 (0.061) | 0.659 (0.058) | 0.790 (0.039) | 0.895 (0.011) |
| SCADI | TPR | 0.617 (0.082) | 0.675 (0.073) | 0.735 (0.068) | 0.805 (0.042) | 0.911 (0.014) |
| SCADI | F1-score | 0.575 (0.078) | 0.632 (0.066) | 0.673 (0.054) | 0.782 (0.038) | 0.886 (0.010) |
| SCADI | FPR | 0.412 (0.061) | 0.406 (0.052) | 0.385 (0.042) | 0.296 (0.032) | 0.094 (0.008) |
| SCADI | AUC | 0.559 (0.093) | 0.620 (0.065) | 0.686 (0.045) | 0.773 (0.040) | 0.903 (0.015) |
| SCADI | Precision | 0.564 (0.073) | 0.611 (0.067) | 0.646 (0.059) | 0.785 (0.042) | 0.876 (0.018) |
| Toxicity | Accuracy | 0.509 (0.085) | 0.553 (0.086) | 0.593 (0.055) | 0.686 (0.031) | 0.863 (0.014) |
| Toxicity | TPR | 0.533 (0.079) | 0.567 (0.070) | 0.600 (0.043) | 0.663 (0.037) | 0.875 (0.011) |
| Toxicity | F1-score | 0.506 (0.081) | 0.542 (0.089) | 0.587 (0.060) | 0.679 (0.029) | 0.857 (0.008) |
| Toxicity | FPR | 0.423 (0.080) | 0.391 (0.072) | 0.375 (0.042) | 0.331 (0.021) | 0.107 (0.0009) |
| Toxicity | AUC | 0.546 (0.085) | 0.563 (0.072) | 0.592 (0.047) | 0.668 (0.030) | 0.894 (0.016) |
| Toxicity | Precision | 0.498 (0.090) | 0.530 (0.081) | 0.577 (0.058) | 0.646 (0.033) | 0.829 (0.010) |
| Lung | Accuracy | 0.468 (0.110) | 0.470 (0.103) | 0.473 (0.060) | 0.553 (0.066) | 0.726 (0.012) |
| Lung | TPR | 0.405 (0.131) | 0.434 (0.126) | 0.446 (0.051) | 0.579 (0.054) | 0.781 (0.009) |
| Lung | F1-score | 0.399 (0.119) | 0.422 (0.106) | 0.439 (0.048) | 0.573 (0.059) | 0.750 (0.008) |
| Lung | FPR | 0.458 (0.198) | 0.441 (0.108) | 0.402 (0.044) | 0.351 (0.022) | 0.121 (0.0010) |
| Lung | AUC | 0.461 (0.122) | 0.460 (0.114) | 0.462 (0.054) | 0.585 (0.062) | 0.787 (0.015) |
| Lung | Precision | 0.384 (0.099) | 0.399 (0.109) | 0.428 (0.041) | 0.569 (0.058) | 0.742 (0.009) |
| Prostate | Accuracy | 0.482 (0.105) | 0.441 (0.115) | 0.496 (0.071) | 0.604 (0.052) | 0.796 (0.015) |
| Prostate | TPR | 0.451 (0.111) | 0.448 (0.120) | 0.469 (0.043) | 0.589 (0.042) | 0.769 (0.012) |
| Prostate | F1-score | 0.428 (0.114) | 0.421 (0.107) | 0.472 (0.053) | 0.598 (0.053) | 0.778 (0.007) |
| Prostate | FPR | 0.453 (0.157) | 0.456 (0.142) | 0.441 (0.038) | 0.342 (0.015) | 0.115 (0.0005) |
| Prostate | AUC | 0.477 (0.112) | 0.440 (0.103) | 0.481 (0.061) | 0.602 (0.049) | 0.793 (0.017) |
| Prostate | Precision | 0.410 (0.084) | 0.403 (0.096) | 0.459 (0.050) | 0.586 (0.033) | 0.762 (0.013) |
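The classification measures reported in the table above follow their standard confusion-matrix definitions (Accuracy, TPR, FPR, Precision, F1-score). As a minimal sketch of how such values are computed — using hypothetical counts, not data from this study — the metrics can be derived from the four confusion-matrix cells:

```python
def classification_metrics(tp, fp, tn, fn):
    """Standard confusion-matrix metrics for a binary classifier.

    tp, fp, tn, fn are the true-positive, false-positive,
    true-negative, and false-negative counts for the positive class.
    """
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    tpr = tp / (tp + fn)        # true positive rate (recall / sensitivity)
    fpr = fp / (fp + tn)        # false positive rate
    precision = tp / (tp + fp)
    f1 = 2 * precision * tpr / (precision + tpr)
    return {"Accuracy": accuracy, "TPR": tpr, "FPR": fpr,
            "Precision": precision, "F1-Score": f1}

# Hypothetical example: 40 positives classified correctly, 5 missed,
# 50 negatives classified correctly, 5 false alarms.
m = classification_metrics(tp=40, fp=5, tn=50, fn=5)
```

AUC, in contrast, is computed from the ranking of predicted scores rather than from a single confusion matrix, so it is not derivable from these four counts alone.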
| Data Set | Method | Median | MAD | Min | Max |
|---|---|---|---|---|---|
| SCADI (p = 205, n = 70) | BPSO | 146 | 14 | 128 | 171 |
| SCADI | BGWO | 152 | 11 | 137 | 175 |
| SCADI | MBPSO | 85 | 7 | 74 | 98 |
| SCADI | BDGWO | 74 | 5 | 66 | 88 |
| SCADI | Ensemble | 12 | 2 | 10 | 14 |
| Toxicity (p = 1203, n = 171) | BPSO | 851 | 42 | 793 | 974 |
| Toxicity | BGWO | 798 | 36 | 745 | 872 |
| Toxicity | MBPSO | 544 | 23 | 498 | 587 |
| Toxicity | BDGWO | 323 | 18 | 295 | 352 |
| Toxicity | Ensemble | 15 | 4 | 11 | 29 |
| Lung (p = 12,533, n = 181) | BPSO | 1243 | 96 | 1090 | 2047 |
| Lung | BGWO | 985 | 84 | 754 | 1224 |
| Lung | MBPSO | 587 | 55 | 498 | 675 |
| Lung | BDGWO | 208 | 21 | 170 | 249 |
| Lung | Ensemble | 29 | 5 | 17 | 42 |
| Prostate (p = 12,600, n = 102) | BPSO | 987 | 87 | 844 | 1653 |
| Prostate | BGWO | 1132 | 92 | 986 | 1758 |
| Prostate | MBPSO | 349 | 31 | 267 | 547 |
| Prostate | BDGWO | 135 | 19 | 107 | 180 |
| Prostate | Ensemble | 38 | 4 | 24 | 47 |
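The Median and MAD columns above summarize the sizes of the selected feature subsets across repeated runs, where MAD denotes the median absolute deviation (see Abbreviations). As a minimal sketch — the run counts below are hypothetical, not the study's raw data — these summary statistics can be computed as:

```python
from statistics import median

def subset_size_summary(sizes):
    """Median, median absolute deviation (MAD), minimum, and maximum
    of selected-feature-subset sizes across repeated runs."""
    med = median(sizes)
    # MAD: median of absolute deviations from the median.
    mad = median(abs(s - med) for s in sizes)
    return {"Median": med, "MAD": mad, "Min": min(sizes), "Max": max(sizes)}

# Hypothetical subset sizes selected by one method over 7 runs:
summary = subset_size_summary([12, 10, 14, 11, 13, 12, 12])
```

Because both statistics are medians, this summary is robust to the occasional run that selects an unusually large subset, which is why it is preferred over mean and standard deviation here.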
© 2026 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license.
Share and Cite
Sancar, N. A Novel MBPSO–BDGWO Ensemble Feature Selection Method for High-Dimensional Classification Data. Informatics 2026, 13, 7. https://doi.org/10.3390/informatics13010007