Article
Peer-Review Record

Investigating the Performance of a Novel Modified Binary Black Hole Optimization Algorithm for Enhancing Feature Selection

Appl. Sci. 2024, 14(12), 5207; https://doi.org/10.3390/app14125207
by Mohammad Ryiad Al-Eiadeh 1, Raneem Qaddoura 2 and Mustafa Abdallah 3,*
Reviewer 1: Anonymous
Reviewer 2: Anonymous
Submission received: 20 April 2024 / Revised: 31 May 2024 / Accepted: 12 June 2024 / Published: 14 June 2024
(This article belongs to the Special Issue Machine-Learning-Based Feature Extraction and Selection)

Round 1

Reviewer 1 Report

Comments and Suggestions for Authors

In this manuscript, the authors describe a new feature selection method based on the Black Hole Optimization Algorithm. The introduction and method sections are well written, providing thorough details. In the results section, the authors tested the algorithm's performance using 14 different datasets. Overall, the manuscript is well prepared. However, I have one minor concern. The authors only utilized accuracy (ACC) and the F1 score to evaluate performance. Given that most datasets are binary, I suggest adding AUC to the results section to enhance comprehensiveness.


With the rapid development of machine learning techniques, this study is highly relevant to the corresponding research area, focusing on finding the most meaningful features from high-dimensional data. The manuscript was well written but included too many details, making it hard to read. In the results section, the authors tested the algorithm's performance using 14 different datasets with KNN, and the performance based on ACC and F1 score looks good. However, there are some major concerns about this manuscript:

  1. The paper is too long, and many unnecessary details make it difficult to read. Remove these details and ensure that readers can easily understand the algorithm, the datasets used for performance testing, and the corresponding results.
  2. There are many machine learning packages that can be easily used for classification problems, such as SVM, random forest, logistic regression, and multilayer perceptron. Why use only KNN to test the feature selection? Other methods should also be used to provide comprehensive testing results.
  3. The authors only used ACC and the F1 score to evaluate performance. Given that most datasets are binary, I suggest adding AUC to the results section to enhance comprehensiveness.
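The AUC metric requested above can be sketched from first principles: AUC equals the probability that a randomly chosen positive example is scored above a randomly chosen negative one (the Mann–Whitney formulation). The labels and scores below are synthetic illustrations, not results from the paper; in practice one would call `sklearn.metrics.roc_auc_score` on the classifier's predicted probabilities.

```python
# Minimal AUC sketch via the rank-statistic (Mann-Whitney) formulation.
# Labels/scores are synthetic stand-ins for a classifier's outputs.

def roc_auc(labels, scores):
    """Probability that a random positive is scored above a random negative.

    Ties between a positive and a negative score count as half a win.
    """
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

labels = [1, 1, 0, 1, 0, 0]
scores = [0.9, 0.8, 0.7, 0.4, 0.3, 0.2]
# 8 of the 9 positive/negative pairs are ranked correctly -> 8/9 ~ 0.889
print(roc_auc(labels, scores))
```

Note that, unlike accuracy or F1, this value is threshold-free, which is why it is a common complement to those metrics on binary datasets.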



Author Response

Dear Reviewer 1,


Thank you for reviewing our paper. Your constructive remarks helped us identify areas for improvement. We have revised the paper according to your (and other reviewers’) recommendations.


Please find our responses to your additional comments in the attached file below.

Author Response File: Author Response.pdf

Reviewer 2 Report

Comments and Suggestions for Authors

In this work, the authors have presented a wrapper feature selection algorithm called the modified black hole optimization algorithm (MBHO). As the name suggests, MBHO is derived from a modification of the evaluation function of the BHO algorithm. The manuscript is well organized and well written; however, there are a few points, listed below, that the authors should address.

1. The authors should proofread the abstract carefully, as there are places where 'exploration' is written as 'exploitation'.

2. Instead of Spearman correlation, have the authors tried other correlation methods that do not depend on the assumption that the variables have a monotonic relationship, for example Kendall correlation? The authors should mention the cases where using the Spearman correlation affects their results and the cases where it causes the algorithm to perform poorly. How is the complexity affected if the correlation method is changed?

3. The authors should test the performance of their algorithm with multiple classification techniques, such as Naive Bayes, where prior information is inherently part of the classifier, or SVMs.

4. Have the authors looked into the outlier cases of the fitness scores? What is causing those outliers, and which cases do they specifically refer to?

5. Authors should also provide precision and recall in addition to accuracy. It would be great if they could include the ROC curves in the supplementary section.

6. How does the feature selection algorithm perform when used for unsupervised techniques?
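Point 2 above (Kendall as an alternative to Spearman correlation) can be illustrated with a minimal from-scratch sketch of both rank correlations on the same synthetic data; in practice `scipy.stats.spearmanr` and `scipy.stats.kendalltau` would be used. The Kendall variant here is the simplified tau-a (no tie correction), and the data values are illustrative, not from the paper.

```python
# Spearman vs. Kendall rank correlation, from scratch, on synthetic data.

def ranks(xs):
    """1-based average ranks, with ties sharing their mean rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1  # extend over a run of tied values
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Pearson correlation computed on the ranks of x and y."""
    rx, ry = ranks(x), ranks(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx)
           * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

def kendall(x, y):
    """Kendall tau-a: (concordant - discordant) / total pairs."""
    n = len(x)
    conc = disc = 0
    for i in range(n):
        for j in range(i + 1, n):
            s = (x[i] - x[j]) * (y[i] - y[j])
            conc += s > 0
            disc += s < 0
    return (conc - disc) / (n * (n - 1) / 2)

x = [1, 2, 3, 4, 5]
y = [2, 1, 4, 3, 5]
print(spearman(x, y), kendall(x, y))  # -> 0.8 0.6
```

This also bears on the reviewer's complexity question: the naive pairwise Kendall computation above is O(n^2) per feature, whereas Spearman needs only a sort, O(n log n) (an O(n log n) Kendall exists but is more involved).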

Comments on the Quality of English Language

The manuscript is well written; however, proofreading is required.

Author Response

Dear Reviewer 2,


Thank you for reviewing our paper. Your constructive remarks helped us identify areas for improvement. We have revised the paper according to your (and other reviewers’) recommendations.


Please find our responses to your additional comments in the attached file.

Author Response File: Author Response.pdf
