Article
Peer-Review Record

A Spring Search Algorithm Applied to Engineering Optimization Problems

Appl. Sci. 2020, 10(18), 6173; https://doi.org/10.3390/app10186173
by Mohammad Dehghani 1, Zeinab Montazeri 1, Gaurav Dhiman 2, O. P. Malik 3, Ruben Morales-Menendez 4, Ricardo A. Ramirez-Mendoza 4,*, Ali Dehghani 5, Josep M. Guerrero 6 and Lizeth Parra-Arroyo 4
Reviewer 1:
Reviewer 2: Anonymous
Submission received: 18 August 2020 / Revised: 31 August 2020 / Accepted: 2 September 2020 / Published: 4 September 2020

Round 1

Reviewer 1 Report

This research proposes a new heuristic optimization algorithm inspired by the principles of spring-force systems modeled under Hooke's law. The paper's topic is a brand-new method for solving single-objective constrained optimization problems. The paper is explained clearly and logically, so it is easy for the reader to understand; the topic is interesting and in line with recent research trends. In my regard, this paper represents a good piece of work. Nevertheless, some critical issues should be addressed to make the work publishable. The concerns with the current version of this manuscript are as follows.

  1. In Subsection 4.1, Setting the system, determining laws, and arranging parameters (from line 189), the spring stiffness coefficient is determined by Eq. (11); why did the authors choose this equation? Could the authors explain the reasons for this idea?
  2. In Section 7, Experimental Results and Discussion, this research is compared with several algorithms in terms of the average and standard deviation of the best optimal solutions; as seen in the results in Tables 1 to 4, the proposed algorithm shows outstanding performance overall. However, the authors did not give details of the parameter settings used for each algorithm during the experiments. Could the authors present the set of parameters for each algorithm?
  3. The authors did not report the computational time for the various algorithms. I think random-based heuristic population optimization algorithms can generate better results if run for a longer time. Therefore, running time is also a key indicator for comparison among the algorithms. In MATLAB, the functions "tic" and "toc" can be used to measure the computation time.
  4. Conclusions can be improved. This reviewer strongly suggests that the authors clearly explain what the significant findings are and why the paper is significant. Some of the essential quantitative results should be reported to better demonstrate the findings of the work carried out. Future directions in this field should also be clearly mentioned in the conclusion.
  5. This paper was checked for similarity and paraphrasing with the iThenticate software, and the similarity rate is about 32%, which is high; the paper should be revised carefully. Could the authors revise the paper and try to reduce the similarity to less than 25%? More details can be found in the attachments.

 

Comments for author File: Comments.pdf

Author Response

A Spring Search Algorithm Applied to Engineering Optimization Problems

 

 

Ricardo A. Ramirez-Mendoza
Tecnológico de Monterrey, Monterrey NL, 64,489, Mexico

28-Aug-2020

 

 

Applied Sciences Editorial Office

The authors thank the dear Editor-in-Chief, Managing Editor, MDPI Assistant Editor, and the respected reviewers for their careful consideration of and useful comments on the paper; these have surely improved its quality. The paper has been revised according to the recommendations and comments given in the decision letter. In the following, the authors' answers and the list of changes are presented comment by comment. In addition, the paper was revised to reduce the similarity: more than 50% of the paper was rewritten. It must be noted that these modifications are highlighted in the paper.

 

Best regards

 

Ricardo A. Ramirez-Mendoza

Email: [email protected]

 

 

Reviewer Recommendation:

 

Comments from Reviewer 1:

 

This research proposes a new heuristic optimization algorithm inspired by the principles of spring-force systems modeled under Hooke's law. The paper's topic is a brand-new method for solving single-objective constrained optimization problems. The paper is explained clearly and logically, so it is easy for the reader to understand; the topic is interesting and in line with recent research trends. In my regard, this paper represents a good piece of work. Nevertheless, some critical issues should be addressed to make the work publishable. The concerns with the current version of this manuscript are as follows.

The authors thank the dear reviewer for the careful consideration of and useful comments on the paper; they have surely improved its quality. Based on these valuable comments, the article has been revised. The authors hope that the revised paper will be accepted by the dear reviewer.

 

  1. In Subsection 4.1, Setting the system, determining laws, and arranging parameters (from line 189), the spring stiffness coefficient is determined by Eq. (11); why did the authors choose this equation? Could the authors explain the reasons for this idea?

Response: Thank you so much to the dear reviewer for this valuable and accurate comment.

Lines 189 to 215

[Equation (11), defining the spring stiffness coefficient, is reproduced in the revised manuscript at the lines indicated above.]

The spring stiffness coefficient is an important parameter. As explained, the search agents in the proposed SSA are weights connected to each other by springs. Weights with better objective function values lie in better regions of the search space and must move with more precise (smaller) steps. On the other hand, weights with poor objective function values should move toward better regions of the search space.

If the objective function values of two weights are close to each other, the two weights occupy similarly good positions, and the displacement between them should therefore be small. If the two weights have values with a significant difference, the weight with the worse objective function value should move toward the weight with the better one.

According to this description, the spring stiffness coefficient is a function of the difference between the objective function values.

The magnitude of the spring stiffness coefficient is also scaled by the larger of the two objective function values.
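For illustration only, the following MATLAB sketch shows one possible stiffness rule that is consistent with this description (the difference between two objective values, normalized by the larger of the two and capped by a maximum stiffness). It is a hypothetical example, not the exact form of Eq. (11) in the manuscript; the function name, the parameter Kmax, and the minimization assumption are all assumptions made here.

% Hypothetical stiffness rule (illustrative only, not Eq. (11) from the paper):
% stiffness grows with the gap between two objective values and is normalized
% by the larger (worse) of the two, so nearly equal weights barely pull on each other.
function K = spring_stiffness(Fi, Fj, Kmax)
    % Fi, Fj : objective function values of weights i and j (minimization assumed)
    % Kmax   : assumed upper bound on the stiffness coefficient
    K = Kmax * abs(Fi - Fj) / max(abs(Fi), abs(Fj));
    if ~isfinite(K)   % both objective values are zero: no pull between the weights
        K = 0;
    end
end

With such a rule, two weights with nearly equal objective values produce a stiffness close to zero (small, precise steps), while a large gap produces a stiffness close to Kmax (a strong pull toward the better weight).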

 

  2. In Section 7, Experimental Results and Discussion, this research is compared with several algorithms in terms of the average and standard deviation of the best optimal solutions; as seen in the results in Tables 1 to 4, the proposed algorithm shows outstanding performance overall. However, the authors did not give details of the parameter settings used for each algorithm during the experiments. Could the authors present the set of parameters for each algorithm?

Response: Thank you so much to the dear reviewer for this valuable and accurate comment. To address it, the parameter settings of the compared algorithms have been added.

Line 322

The parameters and settings of the compared algorithms are as follows:

 

GA: population size N = 80; crossover = 0.9; mutation = 0.05.

PSO: swarm size S = 50; inertia weight decreases linearly from 0.9 to 0.4; C1 (individual-best acceleration factor) increases linearly from 0.5 to 2.5; C2 (global-best acceleration factor) decreases linearly from 2.5 to 0.5.

GSA: number of objects N = 50; acceleration coefficient a = 20; initial gravitational constant G0 = 100.

TLBO: swarm size S = 50.

GOA: number of search agents N = 100; l = 1.5 and f = 0.5.

GWO: number of wolves = 50; the variable a decreases linearly from 2 to 0.

SHO: number of search agents N = 80; constant M; control parameter h.

EPO: number of search agents N = 80; temperature profile; constant A; function S(); parameter M = 2; parameter f; parameter l.
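For reference, the settings listed above can be gathered in a single configuration structure. The MATLAB sketch below simply restates those values; the field names are illustrative and are not taken from the original code, and parameters listed without a value above (for example, M and h of SHO) are left as comments rather than guessed.

% Illustrative configuration collecting the reported settings (field names are assumptions).
params.GA   = struct('pop_size', 80, 'crossover', 0.9, 'mutation', 0.05);
params.PSO  = struct('swarm_size', 50, 'w_range', [0.9 0.4], 'c1_range', [0.5 2.5], 'c2_range', [2.5 0.5]);
params.GSA  = struct('objects', 50, 'a', 20, 'G0', 100);
params.TLBO = struct('swarm_size', 50);
params.GOA  = struct('agents', 100, 'l', 1.5, 'f', 0.5);
params.GWO  = struct('wolves', 50, 'a_range', [2 0]);
params.SHO  = struct('agents', 80);          % constant M and control parameter h as in the original SHO paper
params.EPO  = struct('agents', 80, 'M', 2);  % temperature profile, A, S(), f and l as in the original EPO paper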

 

 

 

  3. The authors did not report the computational time for the various algorithms. I think random-based heuristic population optimization algorithms can generate better results if run for a longer time. Therefore, running time is also a key indicator for comparison among the algorithms. In MATLAB, the functions "tic" and "toc" can be used to measure the computation time.

Response: Thank you so much to the dear reviewer for this valuable and accurate comment. In many optimization articles, the average and standard deviation of the best optimal solutions are presented as the results, and the running time is indeed a useful additional output. The authors ask the dear reviewer for permission to present the results in their current form in the article.

The authors will definitely benefit from the valuable suggestion of the dear reviewer in their future articles and studies.
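As a reference for how such a measurement could be collected, a minimal MATLAB sketch using tic/toc is shown below. The call ssa_optimize(@benchmark_f1, 30, 50, 1000) is a hypothetical placeholder (objective handle, dimension, population size, iterations), not a function from the submitted code.

num_runs = 20;                       % independent runs, as is usual for stochastic algorithms
run_times = zeros(num_runs, 1);
for r = 1:num_runs
    t_start = tic;                   % start a stopwatch for this run
    ssa_optimize(@benchmark_f1, 30, 50, 1000);   % hypothetical call to the optimizer
    run_times(r) = toc(t_start);     % elapsed wall-clock time in seconds
end
fprintf('mean runtime: %.3f s, std: %.3f s\n', mean(run_times), std(run_times));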

 

  4. Conclusions can be improved. This reviewer strongly suggests that the authors clearly explain what the significant findings are and why the paper is significant. Some of the essential quantitative results should be reported to better demonstrate the findings of the work carried out. Future directions in this field should also be clearly mentioned in the conclusion.

Response: Thank you so much to the dear reviewer for this valuable and accurate comment. The authors have tried to improve the quality of the Conclusions section.

 

  5. This paper was checked for similarity and paraphrasing with the iThenticate software, and the similarity rate is about 32%, which is high; the paper should be revised carefully. Could the authors revise the paper and try to reduce the similarity to less than 25%? More details can be found in the attachments.

Response: Thank you so much to the dear reviewer for this valuable and accurate comment, and for sharing this report. In the revised version, the authors have tried to reduce the similarity percentage.

 

Author Response File: Author Response.docx

Reviewer 2 Report

The paper proposes a new optimization algorithm (SSA) inspired by spring systems. The algorithm is fully introduced and numerically evaluated. The overall idea is interesting, but I have one major concern about this work.

 

The need for accurate algorithms is always high. I believe the authors proposed this algorithm with the aim of solving general problems better than current algorithms (at least better than most of them). However, when I studied the engineering application part of the paper, I found that the proposed algorithm achieved 5880 in Table 5 while the second best achieved 5885; that means the algorithm improved by 5/5885 < 0.1% over a method proposed in 2018, and its result is also similar to that of the third best, SHO, proposed in 2017. Similar observations can be found in Tables 7, 9, 11, and 13. So I am not sure that we really need this SSA.

Different variations of PSO, EA, and BA have been proposed recently, and I am not sure whether the proposed algorithm can outperform them. It is also not fair to ask the authors to compare with all of the most recent algorithms, but without a state-of-the-art comparison I cannot be convinced of the need for this SSA. Some recent work that could be reviewed:

[1] Zhang, H. and Hui, Q., 2020, July. A Coupled Spring Forced Bat Searching Algorithm: Design, Analysis and Evaluation. In 2020 American Control Conference (ACC) (pp. 5016-5021).

[2] Wang, Zi-Jia, Zhi-Hui Zhan, Sam Kwong, Hu Jin, and Jun Zhang. "Adaptive Granularity Learning Distributed Particle Swarm Optimization for Large-Scale Optimization." IEEE Transactions on Cybernetics (2020).

[3] Deng, W., Liu, H., Xu, J., Zhao, H., & Song, Y. (2020). An improved quantum-inspired differential evolution algorithm for deep belief network. IEEE Transactions on Instrumentation and Measurement.

 

 

 

 

Author Response

A Spring Search Algorithm Applied to Engineering Optimization Problems

 

 

Ricardo A. Ramirez-Mendoza
Tecnologico de Monterrey, Monterrey NL, 64,489, Mexico

28-Aug-2020

 

 

Applied Sciences Editorial Office

The authors thank the dear Editor-in-Chief, Managing Editor, MDPI Assistant Editor, and the respected reviewers for their careful consideration of and useful comments on the paper; these have surely improved its quality. The paper has been revised according to the recommendations and comments given in the decision letter. In the following, the authors' answers and the list of changes are presented comment by comment. In addition, the paper was revised to reduce the similarity: more than 50% of the paper was rewritten. It must be noted that these modifications are highlighted in the paper.

 

Best regards

 

Ricardo A. Ramirez-Mendoza

Email: [email protected]

 

 

Reviewer Recommendation:

Comments from Reviewer 2:

 

The paper proposes a new optimization algorithm (SSA) inspired by spring systems. The algorithm is fully introduced and numerically evaluated. The overall idea is interesting, but I have one major concern about this work.

The authors thank the dear reviewer for the careful consideration of and useful comments on the paper; they have surely improved its quality. Based on these valuable comments, the article has been revised. The authors hope that the revised paper will be accepted by the dear reviewer.

 

The need for accurate algorithms is always high. I believe the authors proposed this algorithm with the aim of solving general problems better than current algorithms (at least better than most of them). However, when I studied the engineering application part of the paper, I found that the proposed algorithm achieved 5880 in Table 5 while the second best achieved 5885; that means the algorithm improved by 5/5885 < 0.1% over a method proposed in 2018, and its result is also similar to that of the third best, SHO, proposed in 2017. Similar observations can be found in Tables 7, 9, 11, and 13. So I am not sure that we really need this SSA.

Different variations of PSO, EA, and BA have been proposed recently, and I am not sure whether the proposed algorithm can outperform them. It is also not fair to ask the authors to compare with all of the most recent algorithms, but without a state-of-the-art comparison I cannot be convinced of the need for this SSA. Some recent work that could be reviewed:

[1] Zhang, H. and Hui, Q., 2020, July. A Coupled Spring Forced Bat Searching Algorithm: Design, Analysis and Evaluation. In 2020 American Control Conference (ACC) (pp. 5016-5021).

[2] Wang, Zi-Jia, Zhi-Hui Zhan, Sam Kwong, Hu Jin, and Jun Zhang. "Adaptive Granularity Learning Distributed Particle Swarm Optimization for Large-Scale Optimization." IEEE Transactions on Cybernetics (2020).

[3] Deng, W., Liu, H., Xu, J., Zhao, H., & Song, Y. (2020). An improved quantum-inspired differential evolution algorithm for deep belief network. IEEE Transactions on Instrumentation and Measurement.

Response: Thank you so much to the dear reviewer for this valuable and accurate comment.

Each optimization problem has a definite solution called the global solution. Optimization algorithms, which scan the search space randomly, provide a solution that is not necessarily the best one. For this reason, the solution provided by an algorithm that is close to the global optimum is called a quasi-optimal solution.

An important indicator in evaluating optimization algorithms is exploration power. An algorithm that scans the search space more accurately has higher exploration power and, as a result, can provide a better quasi-optimal solution.

As stated by the dear reviewer, it is not possible to examine all optimization algorithms. Therefore, the authors have compared their proposed method with widely used optimization algorithms.

The articles and methods suggested by the dear reviewer are very interesting. The authors have used them to improve the text of the article and have cited these articles.

The authors thank the dear reviewer for the careful consideration of and useful comments on the paper; they have surely improved its quality. Based on these valuable comments, the article has been revised. The authors hope that the revised paper will be accepted by the dear reviewer.

Round 2

Reviewer 2 Report

I have very similar suggestions as before. The accuracy of the algorithm is not superior. It's up to the editor to make the final decision. 
