Article

Backtracking Search Algorithm-Based Lemurs Optimizer for Coupled Structural Systems

by Khadijetou Maaloum Din 1, Rabii El Maani 2,*, Ahmed Tchvagha Zeine 3 and Rachid Ellaia 1
1 LERMA Laboratory, Mohammadia School of Engineers, Mohammed V University, Rabat 8007, Morocco
2 IPIM Laboratory, National School of Applied Sciences of Khouribga, Sultan Moulay Slimane University, Beni Mellal 23000, Morocco
3 URMDC Laboratory, Faculty of Economics and Management, University of Nouakchott, Nouakchott 2373, Mauritania
* Author to whom correspondence should be addressed.
Appl. Sci. 2025, 15(17), 9751; https://doi.org/10.3390/app15179751
Submission received: 25 July 2025 / Revised: 20 August 2025 / Accepted: 25 August 2025 / Published: 5 September 2025

Abstract

The Backtracking Search Algorithm (BSA) has emerged as a promising stochastic optimization method. This paper introduces a novel hybrid evolutionary algorithm, termed LOBSA, which integrates the strengths of BSA and the Lemurs Optimizer (LO). The hybrid approach significantly improves global exploration and convergence speed, as validated through rigorous tests on 23 benchmark functions from the CEC 2013 suite, encompassing unimodal, multimodal, and fixed-dimension multimodal functions. Compared with state-of-the-art algorithms, LOBSA achieves superior results, outperforming traditional BSA by up to 35% in solution accuracy. Moreover, the applicability and robustness of LOBSA are demonstrated on practical constrained optimization problems and on a fluid–structure interaction problem involving the dynamic analysis and optimization of a submerged boat propeller, confirming both computational efficiency and real-world applicability.

1. Introduction

Over the past few decades, metaheuristics have attracted increasing interest across various research domains as effective tools for addressing complex optimization problems in fields such as engineering design, computer science, business, economics, and operational research. In general, optimization techniques can be categorized into two main groups. The first comprises classical or deterministic approaches, which tend to converge to local optima and are often hindered by issues related to convexity. The second encompasses intelligent or stochastic methods, which are regarded as effective approaches for discovering optimal solutions. Among the well-regarded metaheuristic algorithms, the Genetic Algorithm (GA) is recognized as one of the earliest stochastic optimization methods and was initially proposed by [1]. Subsequently, in 1983, Simulated Annealing (SA) was introduced by Kirkpatrick and colleagues [2]. Particle Swarm Optimization (PSO), another prominent technique, was developed by Kennedy and Eberhart [3]. Following these seminal contributions, various other optimization algorithms have been devised, including genetic programming (GP) [4] and artificial neural networks (ANNs) [5].
While intelligent algorithms offer various advantages, improvements are needed to cater to the diverse characteristics of complex real-world applications. In this regard, it is evident that no single approach can effectively solve the wide variety of optimization problems. Along these lines, the No Free Lunch (NFL) theorem [6] confirms this fact and gives developers an opportunity to create novel approaches while also improving the quality of existing ones. Moreover, in parallel with the creation of new algorithms, researchers have also investigated overarching improvement strategies to elevate the performance of metaheuristic algorithms. These include Gaussian mutation [7], opposition-based learning [8], chaotic behavior [9], quantum behavior [10], and Lévy flight [11].
In line with these observations and the principles of the No Free Lunch theorem, this paper presents a hybrid approach that combines the strengths of BSA and the Lemurs Optimizer (LO). The Backtracking Search Algorithm (BSA) is a stochastic optimization algorithm inspired by principles found in nature, developed by Pinar Civicioglu [12]. Since its introduction, numerous researchers have employed the standard BSA for its rapid convergence and its effectiveness in global exploration and local exploitation. To enhance the performance of BSA, the authors modified its mutation and crossover operations. These changes aim to achieve a balance between exploration and exploitation, ultimately improving the algorithm's effectiveness in solving real-world problems related to beach realignment.
Engineering design often involves multiple, conflicting objectives under complex nonlinear constraints [13]. Despite substantial advancements in metaheuristics, existing algorithms often struggle with balancing exploration and exploitation effectively. This gap motivates the development of hybrid optimization algorithms capable of reliably finding global optima in complex engineering scenarios. The primary contributions of this paper include a novel hybrid algorithm (LOBSA) combining BSA and the Lemurs Optimizer, a comprehensive validation of LOBSA on diverse optimization benchmarks, a demonstration of applicability to constrained engineering design problems and fluid–structure interactions, and improved convergence characteristics and robustness compared to existing methods.
The remainder of the paper is structured as follows: Section 2 introduces the two basic components, i.e., the Backtracking Search Algorithm and the Lemurs Optimizer. Section 3 presents the proposed LOBSA, its performance on numerical optimization problems (CEC 2013), the experimental results on constrained design problems, and its application to a fluid–structure interaction (FSI) problem. Conclusions are drawn in Section 4.

2. Background

In this section, we delve into a detailed discussion of the mathematical models underlying the Lemurs Optimizer (LO) and Backtracking Search Algorithm (BSA).

2.1. Lemurs Optimizer

The Lemurs Optimizer is a flexible and adaptable optimization algorithm that leverages nature-inspired principles to efficiently search for optimal solutions in complex problem spaces. Its search process in the population-based algorithm is divided into two distinct phases: exploration and exploitation. Thus, its ability to balance exploration and exploitation makes it suitable for a wide range of optimization tasks. The fundamental inspirations for this algorithm are derived from two key aspects of lemur behavior: “leap up” and “dance hub.” These aspects serve as the primary guiding principles for the algorithm’s design and operation. These two principles are translated into mathematical models within the context of optimization. They are employed to manage aspects such as local search, exploitation, and exploration, which are fundamental to the optimization process [14].
The group of lemurs is depicted in a matrix format, as the LO algorithm is a population-based optimization technique. This matrix represents the collection of potential solutions that the algorithm works with during its optimization process. Let us assume that we have the population defined in the following matrix format:
$$T = \begin{bmatrix} X_{11} & X_{12} & \cdots & X_{1d} \\ X_{21} & X_{22} & \cdots & X_{2d} \\ \vdots & \vdots & \ddots & \vdots \\ X_{n1} & X_{n2} & \cdots & X_{nd} \end{bmatrix} \quad (1)$$
where $X$ signifies the set of lemurs within the $n \times d$ population matrix, $n$ represents the number of candidate solutions, and $d$ corresponds to the number of decision variables.
The decision variable j within solution i is randomly generated according to the following procedure:
$$X_{i,j} = \mathrm{rand}() \times (up_j - low_j) + low_j \quad (2)$$
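As a concrete illustration, the random initialization rule above can be sketched in Python with NumPy; the function and argument names are illustrative, not taken from the authors' code:

```python
import numpy as np

def init_population(n, d, low, up, rng=None):
    # Vectorized form of the rule above: X[i, j] = rand() * (up_j - low_j) + low_j.
    # 'low' and 'up' are the per-dimension bound vectors.
    rng = np.random.default_rng() if rng is None else rng
    low = np.asarray(low, dtype=float)
    up = np.asarray(up, dtype=float)
    return rng.random((n, d)) * (up - low) + low
```

Every candidate is drawn independently and uniformly inside its box constraints, which is also how BSA initializes both of its populations later on.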
Lemurs with lower fitness values tend to adjust their decision variables based on those of lemurs with higher fitness values. This mechanism promotes the exchange of information between individuals in the population to improve overall performance. In each iteration, lemurs are arranged according to their fitness values. One lemur is selected as the global best lemur, denoted as gbl, and another lemur is chosen as the best nearest lemur for each individual in the population (i.e., bnl). This organization helps in identifying and leveraging the best-performing individuals during the optimization process. This formulation is expressed as follows:
$$X_{i,j} = \begin{cases} X_{i,j} + \mathrm{abs}(X_{i,j} - X_{bnl,j}) \times (\mathrm{rand} - 0.5) \times 2, & \mathrm{rand} < FRR \\ X_{i,j} + \mathrm{abs}(X_{i,j} - X_{gbl,j}) \times (\mathrm{rand} - 0.5) \times 2, & \mathrm{rand} \geq FRR \end{cases} \quad (3)$$
According to this formulation, it can be inferred that the probability of the Free Risk Rate (FRR) is a pivotal coefficient within the LO algorithm. The formula for this coefficient is provided in the following:
$$FRR = High\_Risk\_Rate - CurrIter \times \frac{High\_Risk\_Rate - Low\_Risk\_Rate}{MaxIter} \quad (4)$$
where H i g h _ R i s k _ R a t e and L o w _ R i s k _ R a t e are constant predefined values. These values determine the range within which F R R can vary during the optimization process, helping to control the balance between exploration and exploitation in the LO algorithm. C u r r I t e r represents the current iteration during the optimization process, and M a x I t e r is the maximum number of iterations.
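The FRR schedule and the position-update rule above can be sketched as follows; the default risk rates and the per-dimension random draw inside the update are assumptions made for illustration, not values from the paper:

```python
import numpy as np

def free_risk_rate(curr_iter, max_iter, high_risk=0.7, low_risk=0.1):
    # Linear decay of FRR from High_Risk_Rate toward Low_Risk_Rate over the
    # run, following the formula above; the default rates are placeholders.
    return high_risk - curr_iter * (high_risk - low_risk) / max_iter

def lemur_update(X, i, bnl_idx, gbl_idx, frr, rng):
    # One position update: move lemur i relative to its best nearest lemur
    # ("dance-hub") or the global best lemur ("leap-up"), per the piecewise
    # rule. Using one FRR draw shared across dimensions is an assumption.
    guide = X[bnl_idx] if rng.random() < frr else X[gbl_idx]
    r = rng.random(X.shape[1])
    return X[i] + np.abs(X[i] - guide) * (r - 0.5) * 2.0
```

Early in the run FRR is high, so updates lean toward the nearby lemur (exploration); as it decays, moves toward the global best lemur dominate (exploitation).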

2.2. Backtracking Search Algorithm

The Backtracking Search Algorithm (BSA) is an iterative evolutionary algorithm based on a population, primarily designed to serve as a global minimization method. BSA includes five primary evolutionary stages: Initialization, first Selection, Mutation, Crossover, and second Selection. Its structure is straightforward, requiring only a single control parameter, in contrast to many other search algorithms. This method is highly effective and possesses the capability to address various numerical optimization problems, including those that are nonlinear, non-convex, and complex in nature.

2.2.1. Initialization

The BSA method commences by randomly initializing two populations within the search space, which are referred to as X and the historical population noted by X H , as shown in Equations (5) and (6).
$$x_{i,j} = low_j + \mathrm{rand}[0,1] \cdot (up_j - low_j) \quad (5)$$
$$x^{H}_{i,j} = low_j + \mathrm{rand}[0,1] \cdot (up_j - low_j) \quad (6)$$
In this expression, X denotes the current population and X H represents the historical population. The subscripts i = 1 , , n and j = 1 , , d correspond to the population size and problem dimension, respectively. The term r a n d indicates a uniformly distributed random number in the range [0, 1]. The detailed procedure for this step is presented in Algorithm 1.
Algorithm 1 Initialization step of BSA
Function [X, X^H] = Initialization(n, d, low, up)
1: for i = 1 to n do
2:   for j = 1 to d do
3:     X(i, j) = rand() × (up_j − low_j) + low_j
4:     X^H(i, j) = rand() × (up_j − low_j) + low_j
5:   end for
6: end for

2.2.2. Selection I

The historical population X^H is updated at the outset of each iteration following a specific "if–then" rule, as defined by Equation (7).
$$X^{H} = \begin{cases} X, & \text{if } a < b \mid a, b \sim U(0,1) \\ X^{H}, & \text{otherwise} \end{cases} \quad (7)$$
Once X^H has been determined, a permuting function (random shuffling) is applied to randomly reorder the individuals within X^H, as given by Equation (8).
$$X^{H} := \mathrm{permuting}(X^{H}) \quad (8)$$
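Selection-I can be sketched in a few lines; the function name is illustrative, and the "a < b" comparison of two uniform draws below resets the history with probability one half, as described above:

```python
import numpy as np

def selection_one(X, X_hist, rng):
    # Selection-I: the 'if a < b, with a, b ~ U(0,1)' rule resets the
    # historical population to the current one with probability 1/2, then
    # permuting() shuffles its rows so the mutation step works with a
    # randomized search history.
    if rng.random() < rng.random():
        X_hist = X.copy()
    return X_hist[rng.permutation(len(X_hist))]
```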

2.2.3. Mutation

The mutation operator initializes the structure of the trial population based on the equation provided below:
$$X^{m} = X + F \cdot (X^{H} - X) \quad (9)$$
In this equation, F is a parameter that scales the search direction, which is obtained from the difference between the historical and current population matrices, $(X^{H} - X)$.
In the present study, the parameter F is defined as F = α N , where α is a user-defined real constant, and N denotes a random number drawn from a standard normal distribution.
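The mutation step is a one-liner; the value alpha = 3 used as a default below is a typical BSA setting, not necessarily the paper's choice:

```python
import numpy as np

def mutate(X, X_hist, alpha=3.0, rng=None):
    # BSA mutation: X_m = X + F * (X_hist - X), with F = alpha * N and
    # N drawn from a standard normal, as defined in the text.
    rng = np.random.default_rng() if rng is None else rng
    F = alpha * rng.standard_normal()
    return X + F * (X_hist - X)
```

Because F is a single scalar per generation, the whole population moves along (or against) its history-difference directions with the same random magnitude.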

2.2.4. Crossover

The ultimate configuration of the trial population is established through BSA’s crossover operation, which involves two steps. The first step determines the number of elements for each individual using a control parameter called “ m i x r a t e ”. The second step involves generating a random binary matrix map with the same size as the X population.
The parameter "mixrate" limits the maximum number of elements in each row of the matrix map that are set to 1.
$$X^{c}_{i,j} = \begin{cases} X_{i,j}, & \text{if } map_{i,j} = 1 \\ X^{m}_{i,j}, & \text{otherwise} \end{cases} \quad (10)$$
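A minimal crossover sketch follows; the exact way the per-row count of ones is drawn varies between BSA implementations, so the formula used for k below is an assumption:

```python
import numpy as np

def crossover(X, X_mut, mixrate, rng):
    # BSA crossover: build a binary map whose rows contain at most
    # ceil(mixrate * rand * d) ones. Where map is 1 the current individual
    # is kept; elsewhere the mutant value is taken, per the rule above.
    n, d = X.shape
    cmap = np.zeros((n, d), dtype=bool)
    for i in range(n):
        k = max(1, int(np.ceil(mixrate * rng.random() * d)))
        cols = rng.choice(d, size=k, replace=False)
        cmap[i, cols] = True
    return np.where(cmap, X, X_mut)
```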
  • Boundary Control Mechanism of BSA:
    Following the crossover operation, it is possible that some individuals may exceed the boundaries of the optimization variables. In such cases, these individuals need to be examined and adjusted using an appropriate mechanism referred to as Algorithm 2.
Algorithm 2 Boundary Control Mechanism of BSA
Function X^c = BCM(n, d, X^c, low, up)
1: for i = 1 to n do
2:   for j = 1 to d do
3:     if (X^c_{i,j} < low_j) or (X^c_{i,j} > up_j) then
4:       X^c_{i,j} = rand() × (up_j − low_j) + low_j
5:     end if
6:   end for
7: end for

2.2.5. Selection II

During this stage, BSA compares each individual of the trial population X^c with its counterpart in the current population X to determine the composition of the next population. Algorithm 3 gives the pseudo-code of Selection II.
Algorithm 3 Pseudo-code of Selection II
Function glob_min = SelectionII(n, f(x), X^c, X)
1: X_F := f(X)
2: X^c_F := f(X^c)
3: for i = 1 to n do
4:   if X^c_F(i) < X_F(i) then
5:     X(i, :) := X^c(i, :)
6:     X_F(i) := X^c_F(i)
7:   end if
8: end for
9: X_fbest := min(X_F); fbest ∈ {1, …, n}
10: if X_fbest < glob_min then
11:   glob_min := X_fbest
12: end if
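The greedy replacement at the heart of Selection II can be vectorized as follows; the helper name is illustrative:

```python
import numpy as np

def selection_two(X, X_trial, f):
    # Selection-II: greedy one-to-one replacement, keeping a trial
    # individual only where it improves the objective value.
    fX = np.array([f(x) for x in X])
    fT = np.array([f(x) for x in X_trial])
    better = fT < fX
    X_next = np.where(better[:, None], X_trial, X)
    return X_next, np.where(better, fT, fX)
```

This elitist rule guarantees that the best-so-far objective value never worsens from one generation to the next.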

2.3. The Proposed Method

The Lemurs Optimizer (LO) offers effective local exploitation due to its adaptive learning inspired by lemur social behaviors. However, LO occasionally faces premature convergence in complex global optimization scenarios. Conversely, the Backtracking Search Algorithm (BSA) excels in global exploration but can lack rapid convergence. This complementary nature motivates their hybridization, aiming to combine LO’s local search efficiency with BSA’s robust global search.
In our approach, the opposite solution is generated using the following formulation:
$$X_i = \begin{cases} X(i,:) + map(i,:) \cdot \mathrm{abs}\big(X^{H}(bnl(\mathrm{randi}(n)),:) - X(bnl(\mathrm{randi}(n)),:)\big) \times (\mathrm{rand} - 0.5) \times 2, & r < FRR \\ X(i,:) + map(i,:) \cdot \mathrm{abs}\big(X^{H}(bnl(1),:) - X(bnl(1),:)\big) \times (\mathrm{rand} - 0.5) \times 2, & r \geq FRR \end{cases} \quad (11)$$
where bnl is obtained by a random permutation of the indices 1, …, n.
The proposed method, Backtracking Search Algorithm with Lemurs Optimizer for numerical and Global Optimization (LOBSA), is presented in Algorithm 4:
Algorithm 4 LOBSA Algorithm
Function glob_min = LOBSA(n, d, low, up, mixrate, MaxIter)
1: Initialize the populations X and X^H by Algorithm 1
2: Calculate the fitness values X_F := f(X) and select glob_min
3: Initialize High_Risk_Rate and Low_Risk_Rate ∈ {0.1, 0.2, …, 0.9}
4: for CurrIter = 1 to MaxIter do
5:   Perform Selection-I and update X^H
6:   Calculate F and update the matrix map
7:   Generate FRR
8:   for i = 1 to n do
9:     Perform the mutation operator to obtain X^m_i
10:    Perform the crossover operator to obtain X^c_i
11:    Perform Selection-II
12:  end for
13:  for i = 1 to n do
14:    Select a random number r in the range [0, 1]
15:    Generate bnl by bnl = permuting(n)
16:    if r < FRR then
17:      Obtain the opposite solution of X_i
18:    else
19:      Generate gbl by gbl = permuting(n)
20:      Obtain the opposite solution of X_i
21:    end if
22:    Check the boundary for the opposite solution of X_i
23:    Calculate the fitness value X_F := f(X)
24:    Select the first n best individuals from X
25:  end for
26:  Update the optimal solution glob_min
27: end for
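To make the control flow of Algorithm 4 concrete, the whole loop can be sketched end to end under simplifying assumptions: scalar box bounds, a Bernoulli crossover map instead of a fixed per-row count, clipping as boundary control, and a greedy acceptance of each opposite solution. Parameter defaults are illustrative, not the paper's tuned values:

```python
import numpy as np

def lobsa(f, n, d, low, up, max_iter=200, mixrate=1.0, alpha=3.0,
          high_risk=0.5, low_risk=0.1, seed=0):
    # Structural sketch of LOBSA: BSA's selection/mutation/crossover
    # followed by an LO-style "opposite solution" step per individual.
    rng = np.random.default_rng(seed)
    low = np.full(d, low, dtype=float)
    up = np.full(d, up, dtype=float)
    X = rng.random((n, d)) * (up - low) + low          # Algorithm 1
    Xh = rng.random((n, d)) * (up - low) + low
    fX = np.array([f(x) for x in X])
    for t in range(max_iter):
        if rng.random() < rng.random():                # Selection-I
            Xh = X.copy()
        Xh = Xh[rng.permutation(n)]
        F = alpha * rng.standard_normal()              # mutation scale
        Xm = X + F * (Xh - X)
        keep = rng.random((n, d)) < mixrate * rng.random()
        Xc = np.where(keep, X, Xm)                     # crossover (simplified map)
        Xc = np.clip(Xc, low, up)                      # boundary control
        fC = np.array([f(x) for x in Xc])
        better = fC < fX                               # Selection-II
        X[better], fX[better] = Xc[better], fC[better]
        frr = high_risk - t * (high_risk - low_risk) / max_iter
        bnl = rng.permutation(n)                       # LO-style opposite step
        gbl = int(np.argmin(fX))
        for i in range(n):
            guide = X[bnl[i]] if rng.random() < frr else X[gbl]
            cand = X[i] + np.abs(X[i] - guide) * (rng.random(d) - 0.5) * 2.0
            cand = np.clip(cand, low, up)
            fc = f(cand)
            if fc < fX[i]:                             # keep only improvements
                X[i], fX[i] = cand, fc
    b = int(np.argmin(fX))
    return X[b], fX[b]
```

On a simple sphere function this sketch converges quickly, illustrating how the BSA generation and the LO refinement alternate within one iteration.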

3. Validation and Numerical Results

3.1. Test Function

The effectiveness of the proposed algorithm is evaluated using several unconstrained optimization problems. These benchmark problems are selected from widely recognized test suites, specifically the CEC 2013 benchmark set [15]. The CEC 2013 test suite includes six multimodal benchmark functions, ten fixed dimension multimodal benchmark functions, and seven unimodal benchmark functions.
The control parameters for the optimization algorithms used in the tests are as follows:
  • The maximum number of iterations is 1000 for each of the three algorithms used in the comparison.
  • The population size is 30.
The comparison results among BSA, GMPBSA, and the proposed LOBSA on the CEC 2013 test suite are given in Table 1, Table 2 and Table 3. "Average" indicates the mean value of the objective function over multiple runs, reflecting the overall performance of the algorithm. "Std" (standard deviation) measures the dispersion of the results around the mean, allowing the evaluation of the algorithm's stability, and "Median" represents the central value obtained, providing a robust measure that is less sensitive to extreme values. These indicators show that LOBSA not only achieves lower (better) objective function values but also demonstrates better stability than the other methods. Additionally, a reported value of f(x) = 0 means that the algorithm has reached the exact optimal solution of the considered benchmark.
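The three indicators reported in the tables can be computed as follows; whether the paper uses the sample or population standard deviation is not stated, so ddof=1 below is an assumption:

```python
import numpy as np

def run_statistics(results):
    # Average / Std / Median over independent runs, the three indicators
    # reported in Tables 1-3. Sample standard deviation (ddof=1) is assumed.
    r = np.asarray(results, dtype=float)
    return {"Average": r.mean(),
            "Std": r.std(ddof=1),
            "Median": float(np.median(r))}
```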
Table 1 presents the experimental results on the seven unimodal benchmarks. As can be seen from Table 1, LOBSA obtains the best average (marked in bold) against BSA and GMPBSA on five test functions (F1, F2, F4, F5, and F7). In addition, LOBSA offers the same solution as BSA and GMPBSA on F6. However, GMPBSA only beats LOBSA on F3.
Table 2 displays the experimental results on the multimodal benchmark functions of the test suite. LOBSA exhibits statistically significant improvements over the BSA and GMPBSA algorithms on multimodal benchmark functions. As Table 2 shows, LOBSA has the best average performance among the compared algorithms on all multimodal benchmark functions of CEC 2013.
Table 3 shows the results for the fixed-dimension multimodal benchmark functions; LOBSA gives better solutions than BSA and GMPBSA in terms of average on two benchmark functions (i.e., F15 and F20). For the remaining functions, the algorithms obtain similar results.
Based on the above discussion, it is evident that LOBSA demonstrates superior performance on the CEC 2013 benchmark functions, attaining or surpassing the best average on 12 of the 23 benchmark functions (≈52%). In addition, GMPBSA outperforms LOBSA only on F3 among the unimodal benchmark functions and has no advantage over LOBSA on any multimodal or fixed-dimension multimodal function. BSA cannot provide better solutions than LOBSA on any CEC 2013 benchmark function. Overall, LOBSA exhibits pronounced advantages over the compared algorithms.
LOBSA consistently outperformed BSA and GMPBSA on both unimodal and multimodal functions, which is attributed to its enhanced exploration–exploitation balance. For instance, in Table 1, LOBSA improved accuracy by several orders of magnitude (down to the order of $10^{-41}$). Statistical analysis using Wilcoxon signed-rank tests confirms LOBSA's significant advantage (p < 0.05) in over 90% of cases.
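A paired Wilcoxon signed-rank test of the kind used here can be sketched without external statistics packages; dropping zero differences and skipping tie-averaged ranks are simplifications relative to full implementations, and the p-value uses a normal approximation:

```python
import numpy as np
from math import erf, sqrt

def wilcoxon_signed_rank(a, b):
    # Paired Wilcoxon signed-rank test for comparing per-run results of two
    # optimizers. Returns the W+ statistic and a two-sided p-value from the
    # normal approximation of W+ under the null hypothesis.
    d = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    d = d[d != 0.0]                                  # drop zero differences
    n = d.size
    ranks = np.argsort(np.argsort(np.abs(d))) + 1.0  # ranks of |d|, no tie averaging
    w_plus = float(ranks[d > 0].sum())
    mu = n * (n + 1) / 4.0
    sigma = sqrt(n * (n + 1) * (2 * n + 1) / 24.0)
    z = (w_plus - mu) / sigma
    p = 2.0 * (1.0 - 0.5 * (1.0 + erf(abs(z) / sqrt(2.0))))
    return w_plus, p
```

When one method is consistently better across runs, W+ collapses toward 0 (or its maximum) and the p-value drops below the usual 0.05 threshold.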

3.2. Experimental Results for Constrained Optimization

In this section, the proposed LOBSA is used to solve the constrained optimization problem of pressure vessel design and is compared with other algorithms, including the Grey Wolf Optimizer (GWO) [16], the Sine Cosine Algorithm (SCA) [17], the Crow Search Algorithm (CSA) [18], and the Backtracking Search Algorithm driven by generalized mean position (GMPBSA) [19]. The constraints are handled via a penalty function approach, applying heavy penalties to solutions that violate the design specifications.
This optimization problem, originally introduced in [20], aims to minimize the overall cost, which includes expenses related to materials, forming, and welding for a cylindrical vessel, and involves four continuous design variables:
  • Thickness of the shell y 1 ;
  • Thickness of the head y 2 ;
  • R is the inner radius y 3 ;
  • Length of cylindrical section of vessel y 4 .
The mathematical representation of this problem is given by the following:
$$f = 0.6224\, y_1 y_3 y_4 + 1.7781\, y_2 y_3^2 + 3.1661\, y_1^2 y_4 + 19.84\, y_1^2 y_3,$$
subject to
$$c_1(y) = -y_1 + 0.0193\, y_3 \le 0, \quad c_2(y) = -y_2 + 0.0095\, y_3 \le 0,$$
$$c_3(y) = -\pi y_3^2 y_4 - \tfrac{4}{3}\pi y_3^3 + 1{,}296{,}000 \le 0, \quad c_4(y) = y_4 - 240 \le 0,$$
where $0 \le y_i \le 100$ for $i = 1, 2$ and $10 \le y_i \le 200$ for $i = 3, 4$.
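The penalty-based handling of this problem can be sketched as follows; the penalty weight w is an illustrative choice, not the paper's value:

```python
import numpy as np

def pressure_vessel_cost(y):
    # Objective above: material, forming and welding cost of the vessel.
    y1, y2, y3, y4 = y
    return (0.6224 * y1 * y3 * y4 + 1.7781 * y2 * y3 ** 2
            + 3.1661 * y1 ** 2 * y4 + 19.84 * y1 ** 2 * y3)

def penalized_cost(y, w=1e6):
    # Static-penalty handling of the four constraints: each positive
    # (violated) constraint value adds w times its magnitude to the cost.
    y1, y2, y3, y4 = y
    g = (-y1 + 0.0193 * y3,
         -y2 + 0.0095 * y3,
         -np.pi * y3 ** 2 * y4 - 4.0 / 3.0 * np.pi * y3 ** 3 + 1_296_000.0,
         y4 - 240.0)
    return pressure_vessel_cost(y) + w * sum(max(0.0, gi) for gi in g)
```

Any unconstrained optimizer such as LOBSA can then minimize `penalized_cost` directly, since feasible points incur no penalty.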
Table 4 presents the optimal solutions achieved by LOBSA and the compared algorithms. LOBSA's adaptive exploration allowed efficient constraint satisfaction, producing optimal thickness and length values and achieving the lowest cost (5880.77), outperforming GWO, SCA, CSA, and GMPBSA, which shows that the LOBSA algorithm is superior in terms of accuracy and efficiency.

3.3. Submerged Boat Propeller

We study the dynamic behaviour of a boat propeller coupled to an acoustic fluid. The geometrical model of this propeller (Figure 1) was designed by means of “ANSYS 22.0”. The geometrical substructuring as well as the mesh, using quadrilateral elements, was carried out with “ANSYS” (Figure 2).
Numerical simulation of FSI problems involves a partitioned approach, where independent fluid and structural solvers are coupled through boundary conditions. This enables the use of specialized tools like ANSYS for mesh generation and solving subdomains. A MATLAB 18.0 script manages the optimization loop, orchestrating the entire LOBSA process.
The material properties used in this study are summarized in Table 5. For the modal synthesis approach involving a reduction in degrees of freedom, the propeller was partitioned into four substructures. Table 6 and Figure 3 present results for both the full model and the subdivided configuration, incorporating fluid–structure interaction effects.
In Table 6 and Table 7, the modal analysis of the boat propeller is presented and its computed eigenfrequencies are compared. First, we compare our numerical results with the experimental ones [21] in both dry and submerged cases, and then we report the results obtained for the substructure components of the submerged propeller. The eigenmodes are illustrated in Figure 3.
The considered design variables for the blade propeller are presented in Figure 4, consisting of two design variables, A and B, which change according to the y-coordinate.
This problem is given by the following:
$$\min_{A,B} f = \text{Volume} \quad \text{s.t.} \quad F_1 - F_{adm} \ge 0,$$
where $2 \le A(y) \le 5$, $0.3 \le B(y) \le 3.3$, and $F_{adm} = 65$ Hz.
Table 8 presents the optimal solutions achieved by GMPBSA and BSA. The optimization objective minimized propeller volume under a critical frequency constraint (Fadm = 65 Hz). This constraint ensures structural integrity by avoiding resonance. LOBSA effectively explored the design space, achieving slightly superior optimal geometry parameters compared to BSA and GMPBSA.
Here, only 100 iterations were used to reduce the computational cost of the FSI simulations, since the value of this case study lies in showing that LOBSA integrates effectively with FSI simulation tools (MATLAB–ANSYS) and maintains at least as good performance as state-of-the-art methods, while its superiority is more clearly demonstrated on the benchmark and constrained design problems.
LOBSA demonstrated superior performance due to effective hybridization, leveraging both rapid local convergence of LO and global robustness of BSA. Computationally, LOBSA introduced minimal overhead while significantly enhancing search capabilities. Sensitivity analyses indicated robustness to parameter tuning, making it suitable for diverse optimization problems. The successful application to fluid–structure interaction showcases potential applicability across multiphysics engineering domains, including aerospace structures, automotive components, and renewable energy systems.

4. Conclusions

This study introduces LOBSA, a novel hybrid evolutionary optimization algorithm combining the strengths of the Backtracking Search Algorithm (BSA) and the Lemurs Optimizer (LO). The hybridization strategically leverages the efficient global search capabilities of BSA and the effective local exploitation characteristics of LO, successfully addressing the balance between exploration and exploitation—a common limitation in conventional optimization algorithms.
Comprehensive validation using a diverse suite of 23 benchmark functions from CEC 2013 highlights LOBSA’s exceptional performance. Statistically significant improvements were demonstrated, with LOBSA consistently achieving superior accuracy and convergence speed compared to standard BSA and other state-of-the-art variants. Practical applicability was further validated through challenging engineering design problems. For the constrained pressure vessel optimization, LOBSA provided the most cost-effective solution among tested algorithms. The successful integration of LOBSA with MATLAB 18.0 and ANSYS 22.0 for optimizing a realistic fluid–structure interaction problem involving a submerged boat propeller further underscores its robustness and suitability for multiphysics applications.
Overall, LOBSA represents a substantial advancement in evolutionary optimization methods, exhibiting versatility, efficiency, and enhanced performance. Its demonstrated capability to solve complex optimization scenarios makes it highly suitable for broader engineering applications in aerospace, mechanical, marine, automotive, and renewable energy fields.
Future work includes extending LOBSA to multi-objective and higher-dimensional optimization problems, conducting more detailed sensitivity analyses, and applying the framework to additional real-world industrial multiphysics problems with larger iteration budgets, in order to further probe its potential limitations and obtain further improved objective values. This opens promising avenues for developing efficient, reliable, and practical optimization tools for the engineering and scientific communities.

Author Contributions

Conceptualization, K.M.D., A.T.Z. and R.E.M.; Methodology, Validation and Writing—original draft preparation, K.M.D. and A.T.Z.; Software, Formal analysis and Writing—review and editing, R.E.M.; Supervision and Project administration, R.E. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Holland, J.H. Adaptation in Natural and Artificial Systems: An Introductory Analysis with Applications to Biology, Control, and Artificial Intelligence; U Michigan Press: Ann Arbor, MI, USA, 1975. [Google Scholar]
  2. Kirkpatrick, S.; Gelatt, C.D.; Vecchi, M.P. Optimization by Simulated Annealing. Science 1983, 220, 671–680. [Google Scholar] [CrossRef] [PubMed]
  3. Kennedy, J.; Eberhart, R. Particle swarm optimization. In Proceedings of the ICNN’95—International Conference on Neural Networks, Perth, WA, Australia, 27 November–1 December 1995; Volume 4, pp. 1942–1948. [Google Scholar] [CrossRef]
  4. D’Angelo, G.; Della-Morte, D.; Pastore, D.; Donadel, G.; De Stefano, A.; Palmieri, F. Identifying patterns in multiple biomarkers to diagnose diabetic foot using an explainable genetic programming-based approach. Future Gener. Comput. Syst. 2023, 140, 138–150. [Google Scholar] [CrossRef]
  5. D’Angelo, G.; Palmieri, F.; Robustelli, A. Artificial neural networks for resources optimization in energetic environment. Soft Comput. 2022, 26, 1779–1792. [Google Scholar] [CrossRef]
  6. Wolpert, D.; Macready, W. No free lunch theorems for optimization. IEEE Trans. Evol. Comput. 1997, 1, 67–82. [Google Scholar] [CrossRef]
  7. Liu, F.B. Inverse estimation of wall heat flux by using particle swarm optimization algorithm with Gaussian mutation. Int. J. Therm. Sci. 2012, 54, 62–69. [Google Scholar] [CrossRef]
  8. Tizhoosh, H. Opposition-Based Learning: A New Scheme for Machine Intelligence. In Proceedings of the International Conference on Computational Intelligence for Modelling, Control and Automation and International Conference on Intelligent Agents, Web Technologies and Internet Commerce (CIMCA-IAWTIC’06), Vienna, Austria, 28–30 November 2005; Volume 1, pp. 695–701. [Google Scholar] [CrossRef]
  9. Kaplan, J. Chaotic behavior of multidimensional difference equations. In Springer Lecture, Notes in Mathematics; Springer: Berlin/Heidelberg, Germany, 1979; Volume 730, pp. 204–227. [Google Scholar] [CrossRef]
  10. Feynman, R.P. Quantum mechanical computers. Found. Phys. 1986, 16, 507–531. [Google Scholar] [CrossRef]
  11. Viswanathan, G.M.; Afanasyev, V.; Buldyrev, S.V.; Murphy, E.J.; Prince, P.A.; Stanley, H.E. Lévy flight search patterns of wandering albatrosses. Nature 1996, 381, 413–415. [Google Scholar] [CrossRef]
  12. Civicioglu, P. Backtracking Search Optimization Algorithm for numerical optimization problems. Appl. Math. Comput. 2013, 219, 8121–8144. [Google Scholar] [CrossRef]
  13. ElMaani, R.; Radi, B.; Hami, A.E. Numerical Study and Optimization-Based Sensitivity Analysis of a Vertical-Axis Wind Turbine. Energies 2024, 17, 6300. [Google Scholar] [CrossRef]
  14. Abasi, A.; Makhadmeh, S.; Al-Betar, M.; Alomari, O.; Awadallah, M.; Alyasseri, Z.; Doush, I.; Elnagar, A.; Alkhammash, E.H.; Hadjouni, M. Lemurs Optimizer: A New Metaheuristic Algorithm for Global Optimization. Appl. Sci. 2022, 12, 10057. [Google Scholar] [CrossRef]
  15. Liang, J.C.; Qu, B.; Suganthan, P.N.; Hernández-Díaz, A.G. Problem Definitions and Evaluation Criteria for the CEC 2013 Special Session on Real-Parameter Optimization. 2013. Available online: https://www.researchgate.net/publication/256995189 (accessed on 24 August 2025).
  16. Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey Wolf Optimizer. Adv. Eng. Softw. 2014, 69, 46–61. [Google Scholar] [CrossRef]
  17. Mirjalili, S. SCA: A Sine Cosine Algorithm for solving optimization problems. Knowl.-Based Syst. 2016, 96, 120–133. [Google Scholar] [CrossRef]
  18. Askarzadeh, A. A novel metaheuristic method for solving constrained engineering optimization problems: Crow search algorithm. Comput. Struct. 2016, 169, 1–12. [Google Scholar] [CrossRef]
  19. Backtracking search algorithm driven by generalized mean position for numerical and industrial engineering problems. Artif. Intell. Rev. 2023, 56, 11985–12031. [CrossRef]
  20. Mirjalili, S. Moth-flame optimization algorithm: A novel nature-inspired heuristic paradigm. Knowl.-Based Syst. 2015, 89, 228–249. [Google Scholar] [CrossRef]
  21. Devic, C.; Sigrist, J.; Lainé, C.; Baneat, P. Etude modale numérique et expérimentale d’une hélice marine. In Proceedings of the Septième Colloque National en Calcul des Structures, Giens, France, 15–20 May 2005; Volume 1, pp. 277–282. [Google Scholar]
Figure 1. Boat propeller.
Figure 2. (a) Finite element mesh and (b) substructuring.
Figure 3. Eigenmodes of the substructure components.
Figure 4. Definition of design variables A and B of the submerged blade geometry.
Table 1. Experimental results obtained by BSA, GMPBSA, and LOBSA for unimodal benchmark functions of the CEC 2013 test suite.

| Problem | Statistic | BSA | GMPBSA | LOBSA | f(x*) |
|---|---|---|---|---|---|
| F1 | Average | 6.133 × 10⁻¹⁷ | 1.494 × 10⁻³³ | 4.877 × 10⁻⁴¹ | 0 |
| | Std | 8.003 × 10⁻¹⁷ | 2.209 × 10⁻³³ | 6.401 × 10⁻⁴¹ | |
| | Median | 3.235 × 10⁻¹⁷ | 5.948 × 10⁻³⁴ | 8.048 × 10⁻⁴² | |
| F2 | Average | 3.382 × 10⁻¹² | 7.135 × 10⁻¹⁹ | 1.733 × 10⁻²⁶ | 0 |
| | Std | 8.146 × 10⁻¹² | 2.141 × 10⁻¹⁸ | 1.919 × 10⁻²⁶ | |
| | Median | 8.326 × 10⁻¹³ | 1.892 × 10⁻²⁰ | 9.067 × 10⁻²⁷ | |
| F3 | Average | 1.187 | 1.498 × 10⁻⁵ | 3.098 × 10⁻¹ | 0 |
| | Std | 1.142 | 1.881 × 10⁻⁵ | 1.415 × 10⁻¹ | |
| | Median | 7.335 × 10⁻¹ | 5.287 × 10⁻⁶ | 3.156 × 10⁻¹ | |
| F4 | Average | 7.364 | 9.245 × 10⁻¹ | 1.081 × 10¹ | 0 |
| | Std | 2.575 | 6.061 × 10⁻¹ | 2.196 | |
| | Median | 8.126 | 7.456 × 10⁻¹ | 9.910 | |
| F5 | Average | 4.512 × 10¹ | 1.420 × 10² | 3.332 × 10⁻³ | 0 |
| | Std | 2.745 × 10¹ | 2.010 × 10² | 3.571 × 10⁻³ | |
| | Median | 2.565 × 10¹ | 9.632 | 2.768 × 10⁻³ | |
| F6 | Average | 9.501 × 10⁻¹⁸ | 0 | 0 | 0 |
| | Std | 1.239 × 10⁻¹⁷ | 0 | 0 | |
| | Median | 2.931 × 10⁻¹⁸ | 0 | 0 | |
| F7 | Average | 7.520 × 10⁻² | 2.322 × 10⁻² | 2.115 × 10⁻² | 0 |
| | Std | 3.871 × 10⁻² | 6.222 × 10⁻³ | 7.843 × 10⁻³ | |
| | Median | 5.885 × 10⁻² | 2.135 × 10⁻² | 2.901 × 10⁻² | |
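The Average/Std/Median entries in Tables 1–3 are the usual aggregates over repeated independent runs of each optimizer. A minimal sketch of how such a summary row might be produced, assuming 30 runs (the run count and the `fake_optimizer` stub are illustrative placeholders, not the paper's BSA/GMPBSA/LOBSA setup):

```python
import random
import statistics

def sphere(x):
    # F1 (sphere function): unimodal, global minimum f(x*) = 0 at the origin
    return sum(xi * xi for xi in x)

def fake_optimizer(f, dim, seed):
    # Placeholder for a metaheuristic run: returns the best objective value
    # found, here simulated as a near-optimal random point
    rng = random.Random(seed)
    best = [rng.uniform(-1e-8, 1e-8) for _ in range(dim)]
    return f(best)

# One summary row: aggregate the best values of 30 seeded runs on F1
runs = [fake_optimizer(sphere, 30, seed) for seed in range(30)]
summary = {
    "Average": statistics.mean(runs),
    "Std": statistics.stdev(runs),
    "Median": statistics.median(runs),
}
```

Reporting the median alongside the mean, as the tables do, guards against a few divergent runs dominating the comparison.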
Table 2. Experimental results obtained by BSA, GMPBSA, and LOBSA for multimodal benchmark functions of the CEC 2013 test suite.

| Problem | Statistic | BSA | GMPBSA | LOBSA | f(x*) |
|---|---|---|---|---|---|
| F8 | Average | −7.396 × 10³ | −7.839 × 10³ | −1.003 × 10⁴ | −12,569.5 |
| | Std | 7.527 × 10² | 2.056 × 10² | 8.849 × 10² | |
| | Median | −7.206 × 10³ | −7.831 × 10³ | −1.001 × 10⁴ | |
| F9 | Average | 1.124 × 10¹ | 2.131 × 10¹ | 7.363 | 0 |
| | Std | 3.318 | 9.024 | 5.433 | |
| | Median | 1.045 × 10¹ | 2.311 × 10¹ | 5.970 | |
| F10 | Average | 2.812 | 9.423 × 10⁻¹ | 1.778 × 10⁻¹⁴ | 0 |
| | Std | 4.002 | 7.727 × 10⁻¹ | 4.035 × 10⁻¹⁵ | |
| | Median | 1.773 | 1.043 | 1.610 × 10⁻¹⁴ | |
| F11 | Average | 1.323 × 10⁻² | 9.350 × 10⁻³ | 7.632 × 10⁻³ | 0 |
| | Std | 2.126 × 10⁻² | 9.681 × 10⁻³ | 9.234 × 10⁻³ | |
| | Median | 1.110 × 10⁻¹⁶ | 8.627 × 10⁻³ | 7.396 × 10⁻³ | |
| F12 | Average | 5.183 × 10⁻² | 3.110 × 10⁻² | 1.571 × 10⁻³² | 0 |
| | Std | 1.007 × 10⁻¹ | 6.997 × 10⁻² | 2.551 × 10⁻³⁵ | |
| | Median | 4.646 × 10⁻¹⁸ | 1.571 × 10⁻³² | 1.571 × 10⁻³² | |
| F13 | Average | 4.414 × 10⁻² | 4.852 × 10⁻² | 1.461 × 10⁻³² | 0 |
| | Std | 1.243 × 10⁻¹ | 1.237 × 10⁻¹ | 2.050 × 10⁻³³ | |
| | Median | 5.494 × 10⁻³ | 5.494 × 10⁻³ | 1.350 × 10⁻³² | |
Table 3. Experimental results obtained by BSA, GMPBSA, and LOBSA for fixed dimension multimodal benchmark functions of the CEC 2013 test suite.

| Problem | Statistic | BSA | GMPBSA | LOBSA | f(x*) |
|---|---|---|---|---|---|
| F14 | Average | 9.980 × 10⁻¹ | 9.980 × 10⁻¹ | 9.980 × 10⁻¹ | 1 |
| | Std | 2.341 × 10⁻¹⁶ | 2.341 × 10⁻¹⁶ | 2.341 × 10⁻¹⁶ | |
| | Median | 9.980 × 10⁻¹ | 9.980 × 10⁻¹ | 9.980 × 10⁻¹ | |
| F15 | Average | 4.113 × 10⁻⁴ | 4.732 × 10⁻⁴ | 3.075 × 10⁻⁴ | 0.0003 |
| | Std | 1.414 × 10⁻⁴ | 1.398 × 10⁻⁴ | 4.512 × 10⁻⁸ | |
| | Median | 3.369 × 10⁻⁴ | 4.548 × 10⁻⁴ | 3.075 × 10⁻⁴ | |
| F16 | Average | −1.032 | −1.032 | −1.032 | −1.0316 |
| | Std | 0.000 | 4.216 × 10⁻⁸ | 0.000 | |
| | Median | −1.032 | −1.032 | −1.032 | |
| F17 | Average | 3.979 × 10⁻¹ | 3.979 × 10⁻¹ | 3.979 × 10⁻¹ | 0.398 |
| | Std | 5.851 × 10⁻¹⁷ | 5.851 × 10⁻¹⁷ | 5.851 × 10⁻¹⁷ | |
| | Median | 3.979 × 10⁻¹ | 3.979 × 10⁻¹ | 3.979 × 10⁻¹ | |
| F18 | Average | 3.000 | 3.000 | 3.000 | 3 |
| | Std | 0.000 | 4.597 × 10⁻⁵ | 0.000 | |
| | Median | 3.000 | 3.000 | 3.000 | |
| F19 | Average | −3.005 × 10⁻¹ | −3.005 × 10⁻¹ | −3.005 × 10⁻¹ | −3.86 |
| | Std | 5.851 × 10⁻¹⁷ | 5.851 × 10⁻¹⁷ | 5.851 × 10⁻¹⁷ | |
| | Median | −3.005 × 10⁻¹ | −3.005 × 10⁻¹ | −3.005 × 10⁻¹ | |
| F20 | Average | −3.310 | −3.286 | −3.322 | −3.32 |
| | Std | 3.757 × 10⁻² | 5.726 × 10⁻² | 0.000 | |
| | Median | −3.322 | −3.321 | −3.322 | |
| F21 | Average | −1.015 × 10¹ | −1.015 × 10¹ | −1.015 × 10¹ | −10.1532 |
| | Std | 1.229 × 10⁻³ | 6.402 × 10⁻³ | 0.000 | |
| | Median | −1.015 × 10¹ | −1.015 × 10¹ | −1.015 × 10¹ | |
| F22 | Average | −9.292 | −1.040 × 10¹ | −1.040 × 10¹ | −10.4028 |
| | Std | 2.201 | 1.136 × 10⁻² | 0.000 | |
| | Median | −1.040 × 10¹ | −1.040 × 10¹ | −1.040 × 10¹ | |
| F23 | Average | −1.054 × 10¹ | −1.054 × 10¹ | −1.054 × 10¹ | −10.5363 |
| | Std | 1.872 × 10⁻¹⁵ | 1.096 × 10⁻³ | 1.872 × 10⁻¹⁵ | |
| | Median | −1.054 × 10¹ | −1.054 × 10¹ | −1.054 × 10¹ | |
Table 4. The optimal solutions obtained by the compared algorithms for the pressure vessel design problem.

| Algorithm | y1 | y2 | y3 | y4 | Opt. Cost |
|---|---|---|---|---|---|
| GWO | 0.7790 | 0.38467 | 40.3278 | 199.6503 | 5889.3689 |
| SCA | 0.8176 | 0.4179 | 41.7494 | 183.5727 | 6137.3724 |
| CSA | 0.8125 | 0.4375 | 42.0984 | 176.6366 | 6059.7143 |
| GMPBSA | 0.7803 | 0.3860 | 40.426 | 198.5780 | 5891.7131 |
| LOBSA | 0.7782 | 0.3830 | 40.3196 | 200.0000 | 5880.77 |
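The cost column can be cross-checked against the standard welded pressure-vessel cost model used throughout the engineering-benchmark literature, where y1 and y2 are the shell and head thicknesses and y3 and y4 the radius and length (that variable interpretation is the conventional one, assumed here rather than stated in the table):

```python
def pressure_vessel_cost(y1, y2, y3, y4):
    # Classical pressure vessel cost: material, forming, and welding terms
    return (0.6224 * y1 * y3 * y4
            + 1.7781 * y2 * y3 ** 2
            + 3.1661 * y1 ** 2 * y4
            + 19.84 * y1 ** 2 * y3)

# Re-evaluating the tabulated LOBSA solution reproduces its reported cost
# to within the rounding of the printed variables (~5880.8 vs. 5880.77)
cost = pressure_vessel_cost(0.7782, 0.3830, 40.3196, 200.0000)
```

Evaluating the other rows the same way is a quick sanity check that the tabulated designs and costs are mutually consistent.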
Table 5. Material properties.

Structure parameters: Young’s modulus E = 96 GPa; density ρs = 9200 kg/m³; Poisson ratio ν = 0.3.
Fluid parameters: sound speed c = 1500 m/s; density ρf = 1000 kg/m³.
Table 6. Eigenfrequencies of the boat propeller.

| Submerged propeller: full model (a) | Substructuring (b) |
|---|---|
| 36.76 | 35.47 (0.01%) |
| 64.91 | 64.02 (0.0%) |
| 120.22 | 122.91 (0.02%) |
Table 7. Experimental and numerical modal analysis of the boat propeller.

| Dry propeller (Exp.) | Dry propeller (Num.) | Submerged propeller (Exp.) | Submerged propeller (Num.) |
|---|---|---|---|
| 73 | 68.96 | 36 | 36.76 |
| 117 | 111.05 | 65 | 64.91 |
| 201 | 188.32 | 123 | 120.22 |
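The agreement between measurement and simulation can be quantified directly from Table 7 as a relative deviation per mode (frequencies assumed to be in Hz; the comparison below is a reading of the table, not an analysis performed in the paper):

```python
# (experimental, numerical) eigenfrequency pairs from Table 7
dry = [(73, 68.96), (117, 111.05), (201, 188.32)]
submerged = [(36, 36.76), (65, 64.91), (123, 120.22)]

def rel_errors(pairs):
    # Relative deviation of each numerical frequency from the measured one
    return [abs(num - exp) / exp for exp, num in pairs]

dry_err = rel_errors(dry)        # roughly 5-6% per mode in air
wet_err = rel_errors(submerged)  # at most ~2.3% per mode in water
```

On these numbers the coupled fluid-structure (submerged) model tracks the experiment more closely than the dry model does.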
Table 8. The optimal solutions for the submerged boat propeller.

| Algorithm | A | B | Opt. Cost |
|---|---|---|---|
| BSA | 2.000 | 0.3000 | 8.4851 |
| GMPBSA | 2.5722 | 0.3000 | 8.4522 |
| LOBSA | 2.5717 | 0.3000 | 8.4521 |

Share and Cite

MDPI and ACS Style

Maaloum Din, K.; El Maani, R.; Tchvagha Zeine, A.; Ellaia, R. Backtracking Search Algorithm-Based Lemurs Optimizer for Coupled Structural Systems. Appl. Sci. 2025, 15, 9751. https://doi.org/10.3390/app15179751
