Editorial

Special Issue “Algorithms for Feature Selection (2nd Edition)”

by
Muhammad Adnan Khan
Department of Software, Faculty of Artificial Intelligence and Software, Gachon University, Seongnam-si 13120, Republic of Korea
Algorithms 2025, 18(1), 16; https://doi.org/10.3390/a18010016
Submission received: 4 December 2024 / Revised: 31 December 2024 / Accepted: 31 December 2024 / Published: 3 January 2025
(This article belongs to the Special Issue Algorithms for Feature Selection (2nd Edition))

1. Introduction

This Special Issue focuses on advancing research on algorithms for feature selection, aiming to bring together outstanding papers covering both theoretical and practical aspects of the field. The topics comprise evolutionary search techniques for feature selection [1,2,3,4,5], ensemble methods for feature selection [6,7,8,9,10], feature selection in high-dimensional and time-series data [11,12,13,14,15,16], applications to textual data [17,18,19,20,21,22], and deep feature selection [23]. Our goal was to gather some of the best papers from the different fields that deal with this subject matter.
The call attracted a large number of submissions, with research teams approaching different areas of feature selection [24,25,26,27,28]. Following a rigorous peer-review process, carried out by invited experts to ensure the quality of the published work [29], we are pleased to report that ten high-quality research papers were accepted for publication in revised form.
Along with established fields such as scheduling and time-series analysis [30,31,32,33,34,35], the accepted papers also demonstrate the use of feature selection in emerging fields such as bioinformatics [36,37,38,39,40,41,42], where the selection of relevant features plays a crucial role in genomic modeling. For example, genetic feature selection [43,44,45,46,47,48,49,50] has improved cancer prediction by narrowing down the most important markers [51]. Neural-network-based techniques, combined with feature-importance measures and dimensionality reduction, have proven effective at optimizing both computational cost and predictive performance [52]. Such hybrid methods have mainly been used to tackle large-scale data challenges, where they have already proved successful [53].
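As a minimal sketch of the genetic feature selection idea mentioned above (not the method of any cited study; the "relevant" feature set and the fitness score are invented stand-ins for a real model-evaluation criterion):

```python
import random

random.seed(0)

N_FEATURES = 10
RELEVANT = {1, 4, 7}  # hypothetical informative features (illustrative only)

def fitness(mask):
    # Toy score: reward selecting relevant features, lightly penalize mask size.
    hits = sum(1 for i, bit in enumerate(mask) if bit and i in RELEVANT)
    return hits - 0.1 * sum(mask)

def mutate(mask, rate=0.1):
    return [1 - b if random.random() < rate else b for b in mask]

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

# Generational GA over binary feature masks, with elitism.
pop = [[random.randint(0, 1) for _ in range(N_FEATURES)] for _ in range(20)]
for _ in range(40):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(10)]
    pop = parents + children

best = max(pop, key=fitness)
selected = sorted(i for i, b in enumerate(best) if b)
print(selected)
```

In a real pipeline, the fitness would be cross-validated model accuracy on the candidate subset, typically combined with a sparsity penalty as above.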
Additionally, ensemble methods for feature selection [54,55,56] have been extensively examined in this Special Issue. These methods are designed to enhance robustness and reduce the risk of overfitting, which is particularly relevant in the context of big data analytics [57]. One study shows how ensemble techniques, as opposed to traditional methods, can accommodate a wider range of data sources [9]. This Special Issue also addresses the challenges of high-dimensional data [58,59]; in particular, combinations of statistical and machine learning methods through which dimensionality can be reduced have been stressed.
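A common ensemble strategy is rank aggregation: several selectors each rank the features, and a consensus ranking is formed. A minimal Borda-count sketch (the three rankings below are invented for illustration and do not come from any of the cited studies):

```python
from collections import defaultdict

# Hypothetical rankings of five features from three different selectors
# (e.g., a filter score, a wrapper, and a tree-based importance).
rankings = [
    ["f2", "f0", "f4", "f1", "f3"],
    ["f0", "f2", "f1", "f4", "f3"],
    ["f2", "f1", "f0", "f3", "f4"],
]

def borda(rankings):
    # Each selector awards (n - position) points; higher totals rank first.
    scores = defaultdict(int)
    for ranking in rankings:
        n = len(ranking)
        for pos, feat in enumerate(ranking):
            scores[feat] += n - pos
    return sorted(scores, key=scores.get, reverse=True)

consensus = borda(rankings)
print(consensus)  # → ['f2', 'f0', 'f1', 'f4', 'f3']
```

Aggregating over heterogeneous selectors is what gives ensemble feature selection its robustness: a feature must rank well under several criteria to head the consensus list.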

2. Special Issue

In the first paper accepted (Neirz et al. 2024), the authors propose the Attribute Relevance Score, a novel measure for identifying attribute importance. The measure offers researchers a computationally simple way of ranking attributes by their relevance to the learning task, providing an alternative to more expensive importance-evaluation schemes.
The second paper accepted (Fávero et al. 2024) proposes a new Python library implementing the stepwise procedure, giving analysts an open-source tool for iteratively selecting the variables that enter a model.
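The stepwise procedure behind such tools (see the second contribution in the List of Contributions) can be sketched generically; nothing below reproduces that library's actual API, and the subset scores are a hand-made stand-in for a real fitted-model criterion such as AIC or adjusted R²:

```python
# Hand-made quality scores over feature subsets (illustrative stand-in for a
# fitted-model criterion such as AIC or adjusted R-squared).
SCORES = {
    frozenset(): 0.00,
    frozenset({"x1"}): 0.40,
    frozenset({"x2"}): 0.35,
    frozenset({"x3"}): 0.10,
    frozenset({"x1", "x2"}): 0.55,
    frozenset({"x1", "x3"}): 0.42,
    frozenset({"x2", "x3"}): 0.37,
    frozenset({"x1", "x2", "x3"}): 0.56,
}

def forward_stepwise(candidates, min_gain=0.05):
    """Greedily add the feature with the largest score gain until no
    addition improves the score by at least min_gain."""
    selected = frozenset()
    while candidates - selected:
        gains = {f: SCORES[selected | {f}] - SCORES[selected]
                 for f in candidates - selected}
        best = max(gains, key=gains.get)
        if gains[best] < min_gain:
            break
        selected = selected | {best}
    return selected

print(sorted(forward_stepwise({"x1", "x2", "x3"})))  # → ['x1', 'x2']
```

Here x3 is never added: its marginal gain over {x1, x2} falls below the threshold, which is exactly the parsimony behavior stepwise procedures are designed to deliver.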
In the third paper accepted (Zulfqar et al. 2024), the authors present ACT-FRCNN, marking progress towards transformer-based object detection. The experimental results show the effectiveness of the new method and encourage its integration with existing detection pipelines.
In the fourth accepted paper (Mahmood et al. 2024), the authors measure student engagement through behavioral and emotional features using deep-learning models. They argue that the effective use of such AI-enabled applications can accelerate student engagement in science when paired with proper scientific tools and methodologies; for instance, AI technologies can promote a culture of continued learning by letting students pursue projects beyond the classroom and inquire into their interests in real time. Presenting AI applications to students, and involving them as co-creators in their areas of study, may well become basic pillars of the future of AI in education.
The fifth accepted paper (Chen et al. 2024) presents a customized detection system for fatigued drivers that integrates data from eye-movement, finger-pressure, and plantar-pressure sensors. The system is self-learning: it acquires the individual driver's behavioral patterns and identifies fatigue from deviations in those patterns, enabling a clear-cut, well-tailored intervention for road safety enhancement.
The sixth accepted paper (Gafar et al. 2024) describes the RBAVO-DE algorithm, an improved African Vultures Optimization algorithm hybridized with Differential Evolution and specifically designed for gene selection in high-dimensional RNA-Seq datasets. It demonstrates its efficacy through better classification accuracy and stronger feature reduction across 22 cancer datasets, making it valuable for cancer research and genetic-marker discovery.
The seventh accepted paper (Ferdous et al. 2024) introduces AFP-MVFL, a multi-view feature learning model dedicated to the accurate identification of antifungal peptides. By combining sequential and physicochemical properties of amino acids, the model achieves the highest performance values, offering a promising tool for the computational screening of candidates against fungal infections.
The eighth accepted paper (Azimov et al. 2024) discusses the application of machine learning techniques such as CNNs, random forests, and SVMs to author identification in Azerbaijani texts. The study evaluated a variety of text features and reported the best-performing methods for this classification task, representing another contribution to the advancement of computational linguistics.
The ninth accepted paper (Samkunta et al. 2024) proposes a novel method for extracting features of human hand kinematics based on sparse coding, showing a notable improvement in classification precision over PCA and random-dictionary techniques. This work provides actionable solutions for time-series data analysis in human movement studies.
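To illustrate the sparse-coding idea at a toy scale (this uses plain matching pursuit, not the paper's actual algorithm; the dictionary and signal are invented), a signal is approximated as a sparse combination of dictionary atoms, and the resulting coefficient vector serves as the extracted feature:

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def matching_pursuit(signal, atoms, n_nonzero=2):
    # Greedily pick the unit-norm atom most correlated with the residual,
    # record its coefficient, and subtract its contribution.
    residual = list(signal)
    coeffs = [0.0] * len(atoms)
    for _ in range(n_nonzero):
        k = max(range(len(atoms)), key=lambda i: abs(dot(residual, atoms[i])))
        c = dot(residual, atoms[k])
        coeffs[k] += c
        residual = [r - c * a for r, a in zip(residual, atoms[k])]
    return coeffs

# Toy orthonormal dictionary in R^3 and a toy signal.
atoms = [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0)]
signal = (3.0, 0.5, 0.0)
code = matching_pursuit(signal, atoms)
print(code)  # → [3.0, 0.5, 0.0]
```

The appeal over PCA is that the code is sparse by construction: most coefficients are exactly zero, which often yields more discriminative features for classification.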
The tenth accepted paper (Đurasević et al. 2024) examines feature selection in genetic programming (GP) used to construct dispatching rules for unrelated machine scheduling problems. Focusing on the role of composite terminal nodes, the article records a 7% increase in performance, a strong reminder of the importance of feature design in GP-based optimization.
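A dispatching rule is, at bottom, a priority expression over job attributes, and GP evolves such expressions from terminal nodes. A minimal hand-written sketch (the two rules and the job data are illustrative, not evolved):

```python
# Toy job set: processing times and due dates (illustrative values).
jobs = [
    {"id": "J1", "proc": 4, "due": 10},
    {"id": "J2", "proc": 2, "due": 6},
    {"id": "J3", "proc": 5, "due": 7},
]

def spt(job):
    # Simple terminal: shortest processing time first.
    return job["proc"]

def slack(job, now=0):
    # Composite terminal: remaining slack before the due date.
    return job["due"] - now - job["proc"]

print(min(jobs, key=spt)["id"])    # → J2
print(min(jobs, key=slack)["id"])  # → J3 (tightest slack)
```

Composite terminals such as slack pre-package useful attribute combinations, which is precisely the kind of feature design whose impact the paper quantifies.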

Funding

This research received no external funding.

Conflicts of Interest

The author declares no conflicts of interest.

List of Contributions

  • Neirz, P.; Allende, H.; Saavedra, C. Attribute Relevance Score: A Novel Measure for Identifying Attribute Importance. Algorithms 2024, 17, 518. https://doi.org/10.3390/a17110518.
  • Fávero, L.P.; Santos, H.P.; Belfiore, P.; Duarte, A.; Costa, I.P.d.A.; Terra, A.V.; Moreira, M.Â.L.; Tarantin Junior, W.; Santos, M.d. A Proposal for a New Python Library Implementing Stepwise Procedure. Algorithms 2024, 17, 502. https://doi.org/10.3390/a17110502.
  • Zulfqar, S.; Elgamal, Z.; Zia, M.A.; Razzaq, A.; Ullah, S.; Dawood, H. ACT-FRCNN: Progress Towards Transformer-Based Object Detection. Algorithms 2024, 17, 475. https://doi.org/10.3390/a17110475.
  • Mahmood, N.; Bhatti, S.M.; Dawood, H.; Pradhan, M.R.; Ahmad, H. Measuring Student Engagement through Behavioral and Emotional Features Using Deep-Learning Models. Algorithms 2024, 17, 458. https://doi.org/10.3390/a17100458.
  • Chen, J.-C.; Chen, Y.-Z. Integrating Eye Movement, Finger Pressure, and Foot Pressure Information to Build an Intelligent Driving Fatigue Detection System. Algorithms 2024, 17, 402. https://doi.org/10.3390/a17090402.
  • Gafar, M.G.; Abohany, A.A.; Elkhouli, A.E.; El-Mageed, A.A.A. Optimization of Gene Selection for Cancer Classification in High-Dimensional Data Using an Improved African Vultures Algorithm. Algorithms 2024, 17, 342. https://doi.org/10.3390/a17080342.
  • Ferdous, S.M.; Mugdha, S.B.S.; Dehzangi, I. New Multi-View Feature Learning Method for Accurate Antifungal Peptide Detection. Algorithms 2024, 17, 247. https://doi.org/10.3390/a17060247.
  • Azimov, R.; Providas, E. A Comparative Study of Machine Learning Methods and Text Features for Text Authorship Recognition in the Example of Azerbaijani Language Texts. Algorithms 2024, 17, 242. https://doi.org/10.3390/a17060242.
  • Samkunta, J.; Ketthong, P.; Mai, N.T.; Kamal, M.A.S.; Murakami, I.; Yamada, K. Feature Extraction Based on Sparse Coding Approach for Hand Grasp Type Classification. Algorithms 2024, 17, 240. https://doi.org/10.3390/a17060240.
  • Đurasević, M.; Jakobović, D.; Picek, S.; Mariot, L. Assessing the Ability of Genetic Programming for Feature Selection in Constructing Dispatching Rules for Unrelated Machine Environments. Algorithms 2024, 17, 67. https://doi.org/10.3390/a17020067.

References

  1. Zhang, Z.; Li, Y.; Liu, Y.; Liu, S. A Local Binary Social Spider Algorithm for Feature Selection in Credit Scoring Model. Appl. Soft Comput. 2023, 144, 110549. [Google Scholar] [CrossRef]
  2. Vasu G, T.; Fiza, S.; Kumar, A.K.; Devi, V.S.; Kumar, C.N.; Kubra, A. Improved Chimp Optimization Algorithm (ICOA) Feature Selection and Deep Neural Network Framework for Internet of Things (IoT)-Based Android Malware Detection. Meas. Sens. 2023, 28, 100785. [Google Scholar] [CrossRef]
  3. Premalatha, M.; Jayasudha, M.; Čep, R.; Priyadarshini, J.; Kalita, K.; Chatterjee, P. A Comparative Evaluation of Nature-Inspired Algorithms for Feature Selection Problems. Heliyon 2024, 10, e23571. [Google Scholar] [CrossRef] [PubMed]
  4. Cheng, W.L.; Pan, L.; Juhari, M.R.B.M.; Wong, C.H.; Sharma, A.; Lim, T.H.; Tiang, S.S.; Lim, W.H. Chaotic African Vultures Optimization Algorithm for Feature Selection. In Proceedings of the International Conference on Artificial Life and Robotics, Oita, Japan, 9–12 February 2023. [Google Scholar]
  5. Song, H.; Huang, Y.; Song, Q.; Han, T.; Xu, S. Feature Selection Algorithm Based on P Systems. Nat. Comput. 2023, 22, 149–159. [Google Scholar] [CrossRef]
  6. Hong, S.S.; Lee, E.J.; Kim, H. An Advanced Fitness Function Optimization Algorithm for Anomaly Intrusion Detection Using Feature Selection. Appl. Sci. 2023, 13, 4958. [Google Scholar] [CrossRef]
  7. Agushaka, J.O.; Akinola, O.; Ezugwu, A.E.; Oyelade, O.N. A Novel Binary Greater Cane Rat Algorithm for Feature Selection. Results Control Optim. 2023, 11, 100225. [Google Scholar] [CrossRef]
  8. Fang, L.; Liang, X. A Novel Method Based on Nonlinear Binary Grasshopper Whale Optimization Algorithm for Feature Selection. J. Bionic Eng. 2023, 20, 237–252. [Google Scholar] [CrossRef] [PubMed]
  9. Demir, M.; Canayaz, M.; Topalcengiz, Z. A Meta-Heuristic Algorithm-Based Feature Selection Approach to Improve Prediction Success for Salmonella Occurrence in Agricultural Waters. Tarim. Bilim. Derg. 2024, 30, 118–130. [Google Scholar] [CrossRef]
  10. Ayar, M.; Isazadeh, A.; Gharehchopogh, F.S.; Seyedi, M.H. NSICA: Multi-Objective Imperialist Competitive Algorithm for Feature Selection in Arrhythmia Diagnosis. Comput. Biol. Med. 2023, 161, 107025. [Google Scholar] [CrossRef]
  11. Abedi, F.; Ghanimi, H.M.A.; Algarni, A.D.; Soliman, N.F.; El-Shafai, W.; Abbas, A.H.; Kareem, Z.H.; Hariz, H.M.; Alkhayyat, A. Chimp Optimization Algorithm Based Feature Selection with Machine Learning for Medical Data Classification. Comput. Syst. Sci. Eng. 2023, 47, 2791–2814. [Google Scholar] [CrossRef]
  12. Priyadarshini, J.; Premalatha, M.; Čep, R.; Jayasudha, M.; Kalita, K. Analyzing Physics-Inspired Metaheuristic Algorithms in Feature Selection with K-Nearest-Neighbor. Appl. Sci. 2023, 13, 906. [Google Scholar] [CrossRef]
  13. Yousefi, S.; Yin, S.; Alfarizi, M.G. Intelligent Fault Diagnosis of Manufacturing Processes Using Extra Tree Classification Algorithm and Feature Selection Strategies. IEEE Open J. Ind. Electron. Soc. 2023, 4, 618–628. [Google Scholar] [CrossRef]
  14. Sureja, N.; Patel, P.; Rathod, M.; Labana, M. A Discrete Moth Flame Algorithm for Feature Selection. Int. J. Intell. Eng. Syst. 2023, 16, 235–245. [Google Scholar] [CrossRef]
  15. Mengash, H.A.; Alruwais, N.; Kouki, F.; Singla, C.; Abd Elhameed, E.S.; Mahmud, A. Archimedes Optimization Algorithm-Based Feature Selection with Hybrid Deep-Learning-Based Churn Prediction in Telecom Industries. Biomimetics 2024, 9, 1. [Google Scholar] [CrossRef]
  16. Garip, Z.; Ekinci, E.; Çimen, M.E. A Comparative Study of Optimization Algorithms for Feature Selection on ML-Based Classification of Agricultural Data. Clust. Comput. 2024, 27, 3341–3362. [Google Scholar] [CrossRef]
  17. Arunekumar, N.B.; Joseph, K.S.; Viswanath, J.; Anbarasi, A.; Padmapriya, N. Vigilant Salp Swarm Algorithm for Feature Selection. Comput. Inform. 2023, 42, 805–833. [Google Scholar] [CrossRef]
  18. Ali, W.; Saeed, F. Hybrid Filter and Genetic Algorithm-Based Feature Selection for Improving Cancer Classification in High-Dimensional Microarray Data. Processes 2023, 11, 562. [Google Scholar] [CrossRef]
  19. Hijazi, M.M.; Zeki, A.; Ismail, A. Utilizing Artificial Bee Colony Algorithm as Feature Selection Method in Arabic Text Classification. Int. Arab. J. Inf. Technol. 2023, 20, 536–547. [Google Scholar] [CrossRef]
  20. Kakoly, I.J.; Hoque, M.R.; Hasan, N. Data-Driven Diabetes Risk Factor Prediction Using Machine Learning Algorithms with Feature Selection Technique. Sustainability 2023, 15, 4930. [Google Scholar] [CrossRef]
  21. Eluri, R.K.; Devarakonda, N. Chaotic Binary Pelican Optimization Algorithm for Feature Selection. Int. J. Uncertain. Fuzziness Knowl.-Based Syst. 2023, 31, 497–530. [Google Scholar] [CrossRef]
  22. Li, Z. A Local Opposition-Learning Golden-Sine Grey Wolf Optimization Algorithm for Feature Selection in Data Classification. Appl. Soft Comput. 2023, 142, 110319. [Google Scholar] [CrossRef]
  23. Nayak, A.; Božić, B.; Longo, L. Data Quality Assessment and Recommendation of Feature Selection Algorithms: An Ontological Approach. J. Web Eng. 2023, 22, 175–196. [Google Scholar] [CrossRef]
  24. Wang, Y.; Han, J.; Zhang, T. A Relief-PGS Algorithm for Feature Selection and Data Classification. Intell. Data Anal. 2023, 27, 399–415. [Google Scholar] [CrossRef]
  25. Hashim, F.A.; Houssein, E.H.; Mostafa, R.R.; Hussien, A.G.; Helmy, F. An Efficient Adaptive-Mutated Coati Optimization Algorithm for Feature Selection and Global Optimization. Alex. Eng. J. 2023, 85, 29–48. [Google Scholar] [CrossRef]
  26. Hu, Z.; Zhu, Y. Cross-Project Defect Prediction Method Based on Genetic Algorithm Feature Selection. Eng. Rep. 2023, 5, e12670. [Google Scholar] [CrossRef]
  27. Yuan, X.; Pan, J.S.; Tian, A.Q.; Chu, S.C. Binary Sparrow Search Algorithm for Feature Selection. J. Internet Technol. 2023, 24, 217–232. [Google Scholar] [CrossRef]
  28. Alhussan, A.A.; Abdelhamid, A.A.; El-Kenawy, E.S.M.; Ibrahim, A.; Eid, M.M.; Khafaga, D.S.; Ahmed, A.E. A Binary Waterwheel Plant Optimization Algorithm for Feature Selection. IEEE Access 2023, 11, 94227–94251. [Google Scholar] [CrossRef]
  29. Liu, X.; Wang, S.; Lu, S.; Yin, Z.; Li, X.; Yin, L.; Tian, J.; Zheng, W. Adapting Feature Selection Algorithms for the Classification of Chinese Texts. Systems 2023, 11, 483. [Google Scholar] [CrossRef]
  30. Sallah, A.; Alaoui, E.A.A.; Tekouabou, S.C.K.; Agoujil, S. Machine Learning for Detecting Fake Accounts and Genetic Algorithm-Based Feature Selection. Data Policy 2024, 6, e15. [Google Scholar] [CrossRef]
  31. Yong, X.; Gao, Y.-L. Improved Firefly Algorithm for Feature Selection with the ReliefF-Based Initialization and the Weighted Voting Mechanism. Neural Comput. Appl. 2023, 35, 275–301. [Google Scholar] [CrossRef]
  32. Sathish, B.R.; Senthilkumar, R. A Hybrid Algorithm for Feature Selection and Classification. J. Internet Technol. 2023, 24, 593–602. [Google Scholar] [CrossRef]
  33. Al-Khatib, R.M.; Al-qudah, N.E.A.; Jawarneh, M.S.; Al-Khateeb, A. A Novel Improved Lemurs Optimization Algorithm for Feature Selection Problems. J. King Saud. Univ.-Comput. Inf. Sci. 2023, 35, 101704. [Google Scholar] [CrossRef]
  34. Braik, M. Enhanced Ali Baba and the Forty Thieves Algorithm for Feature Selection. Neural Comput. Appl. 2023, 35, 6153–6184. [Google Scholar] [CrossRef] [PubMed]
  35. Abdelrazek, M.; Abd Elaziz, M.; El-Baz, A.H. CDMO: Chaotic Dwarf Mongoose Optimization Algorithm for Feature Selection. Sci. Rep. 2024, 14, 701. [Google Scholar] [CrossRef] [PubMed]
  36. Azar, A.T.; Khan, Z.I.; Amin, S.U.; Fouad, K.M. Hybrid Global Optimization Algorithm for Feature Selection. Comput. Mater. Contin. 2023, 74, 2021–2037. [Google Scholar] [CrossRef]
  37. Banga, A.; Ahuja, R.; Sharma, S.C. Performance Analysis of Regression Algorithms and Feature Selection Techniques to Predict PM2.5 in Smart Cities. Int. J. Syst. Assur. Eng. Manag. 2023, 14, 732–745. [Google Scholar] [CrossRef]
  38. Wang, S.; Yuan, Q.; Tan, W.; Yang, T.; Zeng, L. SCChOA: Hybrid Sine-Cosine Chimp Optimization Algorithm for Feature Selection. Comput. Mater. Contin. 2023, 77, 3057–3075. [Google Scholar] [CrossRef]
  39. Shahsavari, M.; Mohammadi, V.; Alizadeh, B.; Alizadeh, H. Application of Machine Learning Algorithms and Feature Selection in Rapeseed (Brassica napus L.) Breeding for Seed Yield. Plant Methods 2023, 19, 57. [Google Scholar] [CrossRef] [PubMed]
  40. Chen, Y.; Ye, Z.; Gao, B.; Wu, Y.; Yan, X.; Liao, X. A Robust Adaptive Hierarchical Learning Crow Search Algorithm for Feature Selection. Electronics 2023, 12, 3123. [Google Scholar] [CrossRef]
  41. Jain, S.; Dharavath, R. Memetic Salp Swarm Optimization Algorithm Based Feature Selection Approach for Crop Disease Detection System. J. Ambient. Intell. Humaniz. Comput. 2023, 14, 1817–1835. [Google Scholar] [CrossRef]
  42. Shaheen, M.; Naheed, N.; Ahsan, A. Relevance-Diversity Algorithm for Feature Selection and Modified Bayes for Prediction. Alex. Eng. J. 2023, 66, 329–342. [Google Scholar] [CrossRef]
  43. Ramdhani, Y.; Putra, C.M.; Alamsyah, D.P. Heart Failure Prediction Based on Random Forest Algorithm Using Genetic Algorithm for Feature Selection. Int. J. Reconfig. Embed. Syst. 2023, 12, 205–214. [Google Scholar] [CrossRef]
  44. Abd Elaziz, M.; Dahou, A.; Al-Betar, M.A.; El-Sappagh, S.; Oliva, D.; Aseeri, A.O. Quantum Artificial Hummingbird Algorithm for Feature Selection of Social IoT. IEEE Access 2023, 11, 66257–66278. [Google Scholar] [CrossRef]
  45. Saibene, A.; Gasparini, F. Genetic Algorithm for Feature Selection of EEG Heterogeneous Data. Expert. Syst. Appl. 2023, 217, 119488. [Google Scholar] [CrossRef]
  46. Altarabichi, M.G.; Nowaczyk, S.; Pashami, S.; Mashhadi, P.S. Fast Genetic Algorithm for Feature Selection—A Qualitative Approximation Approach. Expert. Syst. Appl. 2023, 211, 118528. [Google Scholar] [CrossRef]
  47. Espinosa, R.; Jiménez, F.; Palma, J. Multi-Surrogate Assisted Multi-Objective Evolutionary Algorithms for Feature Selection in Regression and Classification Problems with Time Series Data. Inf. Sci. 2023, 622, 1064–1091. [Google Scholar] [CrossRef]
  48. Ay, Ş.; Ekinci, E.; Garip, Z. A Comparative Analysis of Meta-Heuristic Optimization Algorithms for Feature Selection on ML-Based Classification of Heart-Related Diseases. J. Supercomput. 2023, 79, 11797–11826. [Google Scholar] [CrossRef] [PubMed]
  49. Zaimoğlu, E.A.; Yurtay, N.; Demirci, H.; Yurtay, Y. A Binary Chaotic Horse Herd Optimization Algorithm for Feature Selection. Eng. Sci. Technol. Int. J. 2023, 44, 101453. [Google Scholar] [CrossRef]
  50. Xu, M.; Song, Q.; Xi, M.; Zhou, Z. Binary Arithmetic Optimization Algorithm for Feature Selection. Soft Comput. 2023, 27, 11395–11429. [Google Scholar] [CrossRef]
  51. Oh, S.; Ahn, C.W. Evolutionary Approach for Interpretable Feature Selection Algorithm in Manufacturing Industry. IEEE Access 2023, 11, 46604–46614. [Google Scholar] [CrossRef]
  52. Kovács, L. Feature Selection Algorithms in Generalized Additive Models under Concurvity. Comput. Stat. 2024, 39, 461–493. [Google Scholar] [CrossRef]
  53. Zhao, L.; Li, Y.; Li, S.; Ke, H. A Frequency Item Mining Based Embedded Feature Selection Algorithm and Its Application in Energy Consumption Prediction of Electric Bus. Energy 2023, 271, 126999. [Google Scholar] [CrossRef]
  54. Zhang, Z.; Song, F.; Zhang, P.; Chao, H.C.; Zhao, Y. A New Online Field Feature Selection Algorithm Based on Streaming Data. J. Ambient. Intell. Humaniz. Comput. 2024, 15, 1365–1377. [Google Scholar] [CrossRef]
  55. Yang, F.; Xu, Z.; Wang, H.; Sun, L.; Zhai, M.; Zhang, J. A Hybrid Feature Selection Algorithm Combining Information Gain and Grouping Particle Swarm Optimization for Cancer Diagnosis. PLoS ONE 2024, 19, e0290332. [Google Scholar] [CrossRef]
  56. Zhu, Z. Analysis of the Innovation Path of University Education Management Informatization in the Era of Big Data. Appl. Math. Nonlinear Sci. 2024, 9, 1–14. [Google Scholar] [CrossRef]
  57. Yilmaz, M.; Yalcin, E.; Kifah, S.; Demir, F.; Sengur, A.; Demir, R.; Mehmood, R.M. Improving the Classification Performance of Asphalt Cracks After Earthquake With a New Feature Selection Algorithm. IEEE Access 2024, 12, 6604–6614. [Google Scholar] [CrossRef]
  58. SabbaghGol, H.; Saadatfar, H.; Khazaiepoor, M. Evolution of the Random Subset Feature Selection Algorithm for Classification Problem. Knowl. Based Syst. 2024, 285, 111352. [Google Scholar] [CrossRef]
  59. Almutairi, M.S. Evolutionary Multi-Objective Feature Selection Algorithms on Multiple Smart Sustainable Community Indicator Datasets. Sustainability 2024, 16, 1511. [Google Scholar] [CrossRef]
