Article
Peer-Review Record

Compiler Optimization Parameter Selection Method Based on Ensemble Learning

Electronics 2022, 11(15), 2452; https://doi.org/10.3390/electronics11152452
by Hui Liu 1,2, Jinlong Xu 3,*, Sen Chen 1,2 and Te Guo 1,2
Reviewer 1:
Reviewer 2: Anonymous
Reviewer 3: Anonymous
Reviewer 4: Anonymous
Submission received: 22 June 2022 / Revised: 24 July 2022 / Accepted: 1 August 2022 / Published: 6 August 2022
(This article belongs to the Special Issue Pattern Recognition and Machine Learning Applications)

Round 1

Reviewer 1 Report

This manuscript needs major revision and should be addressed on multiple points:
1. Rewrite the abstract from the beginning, focusing on the idea and the development points of the method presented by the author(s), and compare the results of this manuscript with those of other papers as concrete values.
2. The title must become more specific; as it stands it does not appear suitable, so change it.
3. Draw a complete block diagram of the method that explains its main steps in the accepted format; this is very necessary to make this manuscript a guide for other researchers.
4. Add a new section on the hypotheses and limitations of the method developed by the author(s).
5. Multiple points in this manuscript need scientific justification and proof by the author(s); after each table or figure, give a specific description of the results it presents, in three to five lines.
6. Add a new section that discusses the results in detail and explains the main advantages and disadvantages of the method in the author(s)' opinion.
7. The main benefit of previous works is to compare your work against them on specific points (techniques, preprocessing stage, results, or evaluation measures); however, the author(s) of this manuscript do not compare their work with previous works on any of these points. Therefore, a table must be added at the end of the related work that analyzes the previous works in terms of techniques used, preprocessing techniques, type of dataset used, evaluation measures, and the advantages and disadvantages of each technique.
8. What is the difference between the proposed method and previous methods? This is not clear in the introduction.
9. At least two-thirds of the references must be from the last five years, and all references must include a DOI.
--- https://doi.org/10.1016/j.eij.2022.01.004
--- https://doi.org/10.1007/s00521-021-06067-7
--- https://doi.org/10.1007/s00500-020-04905-9
--- https://doi.org/10.1007/s00500-019-04495-1
--- https://doi.org/10.1109/ICTCK.2014.7033495
--- https://doi.org/10.1007/978-3-030-23672-4_23
--- https://doi.org/10.1007/978-3-030-23672-4_15
Finally, this work needs major revision: a block diagram must be added, and all figures must be redrawn because their quality is very low and not suitable for this journal.

Author Response

Dear reviewer:
  Thank you very much for your review; your questions and suggestions are critical to improving the quality of this paper. We have carefully revised the paper according to your review comments. Please see the attachment. Thank you very much!
   Sincerely yours,
   Hui Liu

Author Response File: Author Response.pdf

Reviewer 2 Report

The paper proposes an optimization parameter selection method based on a constrained multi-objective PSO algorithm.

The algorithm determines optimal parameters for kernel functions, guided by features extracted from the functions, and applies a learning method to predict optimization combinations for programs.

The parameters explored by the proposal are from three transformation algorithms: loop unrolling, loop tiling, and array padding.
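For context, the three transformations under discussion can be sketched as follows. This is a minimal illustration in Python; the unroll factor, tile size, and padding granularity shown here are hypothetical examples, not the parameter values the paper tunes.

```python
# Illustrative sketches of the three loop transformations whose parameters
# the paper's method selects. All numeric factors below are made-up examples.

def sum_unrolled(xs):
    """Loop unrolling: process 4 elements per iteration to cut loop overhead."""
    total, i, n = 0, 0, len(xs)
    while i + 4 <= n:
        total += xs[i] + xs[i + 1] + xs[i + 2] + xs[i + 3]
        i += 4
    while i < n:  # remainder loop for leftover elements
        total += xs[i]
        i += 1
    return total

def matmul_tiled(a, b, tile=2):
    """Loop tiling: iterate over square blocks to improve cache locality."""
    n = len(a)
    c = [[0] * n for _ in range(n)]
    for ii in range(0, n, tile):
        for jj in range(0, n, tile):
            for kk in range(0, n, tile):
                for i in range(ii, min(ii + tile, n)):
                    for j in range(jj, min(jj + tile, n)):
                        for k in range(kk, min(kk + tile, n)):
                            c[i][j] += a[i][k] * b[k][j]
    return c

def padded_row_length(n_cols, pad_to=8):
    """Array padding: round the row length up to reduce cache-set conflicts."""
    return ((n_cols + pad_to - 1) // pad_to) * pad_to
```

The selection problem the paper addresses is choosing values such as the unroll factor, tile size, and padding granularity per function, since the best values vary with the loop body and target machine.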

The paper is well structured and presents relatively good and relevant state-of-the-art references. However, some questions and clarifications may improve the work toward a better understanding and refinement of the results and conclusions.

The authors select 2500 hotspot functions from the SPEC CPU2006 benchmark set for training. How does ELOPS select these functions?

In the using phase, with the NPB benchmark, are the features extracted only from the hotspot functions? How are these functions selected in the using phase?

How does ELOPS select the functions from which to extract the features of a new program?

The problem of finding the optimization parameters for an input program has, theoretically, an infinite search space. The authors claim that their proposal obtains the best predictive performance, and the results are sometimes called the best solution. In this problem, the best solution involves finding the optimal parameters in the infinite search space, which is infeasible by exhaustive search. How do the authors determine that their solution is optimal (or as close to optimal as possible) for this search space?

The authors compare their proposal with a Genetic Algorithm (GA). Is this implementation from a particular source in the literature? How were the parameters (e.g., 40 chromosomes, 50 generations) determined?

An enhanced results section would compare the technique with a predictive model from the literature.

Please present at least a summary of the features used by the proposed tool, to keep the paper self-contained.

Some acronyms are used before their definition.

Readhat -> Redhat?

 

Author Response

Dear reviewer:
  Thank you very much for your review; your questions and suggestions are critical to improving the quality of this paper. We have carefully revised the paper according to your review comments. Please see the attachment. Thank you very much!
   Sincerely yours,
   Hui Liu

Author Response File: Author Response.pdf

Reviewer 3 Report

Dear Authors,

 

The authors have written the manuscript very well. All sections are well written and very detailed. The authors propose a new parameter optimization based on a multi-objective PSO algorithm, together with a feature extraction method based on feature-class relevance, for kernel functions. The proposed method is named ELOPS. The results are well presented for platforms 1 and 2, showing that ELOPS outperforms other existing kernel optimization techniques on the SPEC2006 and NPB benchmarks. However, I propose the following modifications and comments to improve the manuscript quality.

19, 23, 24: Abbreviations (PSO, ELOPS, SPEC2006, NPB) should be expanded at their first use. ELOPS must be expanded at its first use in the Abstract.

Abstract: The authors could briefly present the results of the ELOPS method in the abstract.

Tables 4 & 5: Predictive results are reported as precision throughout the manuscript. Is it prediction accuracy or precision? Precision means getting the same result repeatedly for a given dataset.

Figures 7 & 8: Both figures must be enlarged and made more visible. Increase the font size of the axis titles/values and labels.

596 ~ 609: The prediction accuracy of ELOPS is highest for the best parameter compared to KNN, SVM, and GCC. But the accuracy falls quickly and is lower for the 3rd, 4th, … parameters in the case of ELOPS compared to other methods. What is the overall accuracy of the proposed algorithm/method? I am afraid that the overall accuracy of ELOPS may not be better than that of other well-known methods (KNN, SVM, etc.).

For ensemble learning, why were other methods not considered? For example, Adaptive Boosting (AdaBoost) is very well known for its excellent prediction accuracy in ensemble learning-based optimization.
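For reference, the boosting idea this comment alludes to can be sketched in a few lines. This is a minimal 1-D AdaBoost with threshold stumps; the toy dataset and round count below are made-up illustrations, not the paper's setup.

```python
import math

# Minimal AdaBoost sketch: each round fits a weak learner (here, a 1-D
# threshold "stump") on reweighted data, so later stumps focus on the
# samples earlier stumps misclassified. Labels are +1 / -1.

def fit_stump(xs, ys, w):
    """Return (weighted_error, threshold, polarity) of the best stump."""
    best = None
    for t in xs:
        for pol in (1, -1):
            err = sum(wi for xi, yi, wi in zip(xs, ys, w)
                      if (pol if xi >= t else -pol) != yi)
            if best is None or err < best[0]:
                best = (err, t, pol)
    return best

def adaboost(xs, ys, rounds=5):
    n = len(xs)
    w = [1.0 / n] * n
    model = []
    for _ in range(rounds):
        err, t, pol = fit_stump(xs, ys, w)
        err = max(err, 1e-10)                      # avoid log(0)
        alpha = 0.5 * math.log((1 - err) / err)    # stump weight
        model.append((alpha, t, pol))
        # Up-weight misclassified samples, down-weight correct ones.
        w = [wi * math.exp(-alpha * yi * (pol if xi >= t else -pol))
             for xi, yi, wi in zip(xs, ys, w)]
        s = sum(w)
        w = [wi / s for wi in w]
    return model

def predict(model, x):
    """Weighted vote of all stumps."""
    score = sum(a * (p if x >= t else -p) for a, t, p in model)
    return 1 if score >= 0 else -1
```

Whether AdaBoost would actually beat the ensemble the paper uses is an empirical question; the sketch only shows why it is a standard baseline worth reporting.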

Related work: This section should be moved after the introduction. The discussion of the results can either be a separate section after the results or part of the results section, but it should not be part of the related work.

Conclusion: The authors should add the brief results of the proposed ELOPS method.

Author Response

Dear reviewer:
  Thank you very much for your review; your questions and suggestions are critical to improving the quality of this paper. We have carefully revised the paper according to your review comments. Please see the attachment. Thank you very much!
   Sincerely yours,
   Hui Liu

Author Response File: Author Response.pdf

Reviewer 4 Report

In this study, a function-level compiler optimization parameter selection model (which the authors call ELOPS) is proposed to select the optimal parameters. The proposed method is tested on different benchmark applications, and the obtained results are compared with some existing methods. It is shown that the proposed method provides an average speedup in program execution time. The organization and overall presentation are good. The only comment on the paper is:

- The authors have mentioned that there are two methods to solve optimization parameter selection: iterative compilation and machine learning-based iterative compilation. What about the proposed method; can it be placed directly in one of these groups? Either way, the authors should highlight this point.

Author Response

Dear reviewer:
  Thank you very much for your review; your questions and suggestions are critical to improving the quality of this paper. We have carefully revised the paper according to your review comments. Please see the attachment. Thank you very much!
   Sincerely yours,
   Hui Liu

Author Response File: Author Response.pdf

Reviewer 5 Report

The authors have proposed an optimization parameter selection method that searches the optimization space using the multi-objective PSO method and extracts features using the feature-class relevance method. The idea is quite exciting and presented in a comprehensive manner.

The only shortcoming is in the results section: it is advisable to add a confusion matrix so that the usefulness of the proposed algorithm can be compared with contemporary methods.
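For concreteness, such a confusion matrix can be tabulated directly from the true and predicted classes. This is a pure-Python sketch; the three parameter classes and the label vectors are hypothetical examples, not the paper's data.

```python
from collections import Counter

def confusion_matrix(y_true, y_pred, labels):
    """Rows index the true class, columns the predicted class."""
    counts = Counter(zip(y_true, y_pred))
    return [[counts[(t, p)] for p in labels] for t in labels]

# Hypothetical example: three candidate parameter classes A/B/C.
y_true = ["A", "A", "B", "B", "C", "C"]
y_pred = ["A", "B", "B", "B", "C", "A"]
cm = confusion_matrix(y_true, y_pred, ["A", "B", "C"])
# cm == [[1, 1, 0], [0, 2, 0], [1, 0, 1]]
# Per-class precision and recall fall out of the columns and rows directly;
# the diagonal sum over the total gives overall accuracy.
```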

Author Response

Dear reviewer:
  Thank you very much for your review; your questions and suggestions are critical to improving the quality of this paper. We have carefully revised the paper according to your review comments. Please see the attachment. Thank you very much!
   Sincerely yours,
   Hui Liu

Author Response File: Author Response.pdf

Round 2

Reviewer 1 Report

accept
