Article

A Newton-Based Tuna Swarm Optimization Algorithm for Solving Nonlinear Problems with Application to Differential Equations

1 Department of Mathematics, Chandigarh University, Mohali 140413, India
2 Instituto de Matemática Multidisciplinar, Universitat Politècnica de València, Cno. de Vera s/n, 46022 Valencia, Spain
3 Mathematical Modelling and Applied Computation Research Group (MMAC), Department of Mathematics, Faculty of Science, King Abdulaziz University, P.O. Box 80203, Jeddah 21589, Saudi Arabia
* Author to whom correspondence should be addressed.
Algorithms 2026, 19(1), 40; https://doi.org/10.3390/a19010040
Submission received: 9 December 2025 / Revised: 29 December 2025 / Accepted: 30 December 2025 / Published: 4 January 2026

Abstract

This paper presents two novel hybrid iterative schemes that combine Newton’s method and one of its variants with the Tuna Swarm Optimization (TSO) algorithm, aimed at solving complex nonlinear equations with enhanced accuracy and efficiency. Newton’s method is renowned for its rapid convergence in root-finding problems, and it is integrated with TSO, a recent swarm intelligence algorithm that mimics the complex foraging behavior of tuna in order to optimize the search for superior solutions. These hybrid methods are reliable and efficient for solving challenging mathematical and applied science problems. Several numerical experiments and applications involving ordinary differential equations have been carried out to demonstrate the superiority of the proposed hybrid methods in terms of convergence rate, accuracy, and robustness compared to traditional optimization and iterative methods. The stability and efficiency of the proposed methods have also been verified. The results indicate that the hybrid approaches outperform traditional methods, making them a promising tool for solving a wide range of mathematical and engineering problems.

1. Introduction

Solving systems of nonlinear equations is a common challenge in various engineering fields, including weather forecasting, petroleum geological prospecting, computational mechanics, and control systems [1]. The mathematical expression for a system of nonlinear equations is represented as
$$\begin{cases} f_1(u_1, u_2, \ldots, u_n) = 0,\\ f_2(u_1, u_2, \ldots, u_n) = 0,\\ \quad\vdots\\ f_n(u_1, u_2, \ldots, u_n) = 0, \end{cases} \tag{1}$$
or $F(u) = 0$, where $f_i(u_1, u_2, \ldots, u_n)$, $i = 1, 2, \ldots, n$, are the coordinate functions of $F$. Generally, finding solutions to such systems is difficult due to the lack of an efficient and reliable algorithm. Despite extensive research in this area, existing schemes, such as Newton’s method [2], Halley’s procedure [3], Ostrowski’s method [4], and Jarratt’s scheme [5], are widely used, but their convergence and performance can be highly sensitive to the initial guess provided. If the initial guess is unsuitable, these algorithms may fail to converge. However, selecting an appropriate initial guess for most nonlinear systems is often a challenging task.
With the development of optimization algorithms such as Particle Swarm Optimization (PSO) [6], Differential Evolution (DE) [7], Gray Wolf Optimization (GWO) [8], and Butterfly Optimization algorithm (BOA) [9], there has been a great deal of interest in utilizing these techniques to solve systems of nonlinear equations. The advancement in evolutionary algorithms, in particular, presents novel opportunities to address the challenges typically associated with solving nonlinear systems. A significant advantage of these methods is their capacity to overcome the common issue of selecting an appropriate initial guess, a limitation frequently encountered with traditional approaches. Indeed, the system of nonlinear Equation (1) can be reinterpreted as an optimization problem in (2).
$$\min\, f(u) = \sum_{i=1}^{n} f_i^2(u), \qquad u = (u_1, u_2, \ldots, u_n). \tag{2}$$
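To make this reformulation concrete, the following is a minimal sketch (assuming NumPy; the helper name as_objective is illustrative, not from the paper) of wrapping a residual map into the scalar objective (2):

```python
import numpy as np

def as_objective(F):
    """Wrap a residual map F: R^n -> R^n into the scalar objective of Eq. (2),
    f(u) = sum_i f_i(u)^2, whose global minimum (value zero) is a root of F."""
    return lambda u: float(np.sum(np.asarray(F(u), dtype=float) ** 2))
```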
The hybridization of iterative methods with optimization algorithms becomes increasingly significant when addressing complex functions, including various engineering problems. A hybrid algorithm can mitigate the limitations of one approach while leveraging the advantages of the other; consequently, selecting an appropriate combination of algorithms to create an effective hybrid algorithm is a crucial initial step. The concept of combining evolutionary algorithms with iterative methods for solving complex equations was introduced in 1998 by Karr et al. [1], who proposed a hybrid approach that utilized the Genetic Algorithm to provide an initial estimate for Newton’s method. However, due to its inefficient results, this early hybridization did not gain significant attention at the time. A decade later, in 2008, Luo et al. [10] addressed complex nonlinear equations by integrating Quasi-Newton and Chaos Optimization algorithms; since the mathematical convergence of the Chaos Optimization algorithm had not been established at that point, the convergence analysis of their method was not addressed. More recently, in 2021, Sihwail et al. [11] developed a hybrid algorithm called NHHO, which combined Newton’s method with the Harris Hawk Optimization technique to solve arbitrary systems of nonlinear equations; their work included non-differentiable functions to demonstrate the method’s efficacy. In 2022, Sihwail et al. [12] developed a hybrid of Jarratt’s method and the Butterfly Optimization algorithm (JBOA) for solving systems of nonlinear equations. In this method, however, the Butterfly algorithm may get trapped in local optima, limiting the exploration of the search space and potentially compromising the accuracy of the solution; moreover, because Jarratt’s method requires derivatives at two points, computing the derivative of the function at each iteration can be difficult or time-consuming. Subsequently, in 2023, Solaiman et al. [13] formulated a modified hybrid algorithm that integrated Newton’s method with the Sperm Swarm Optimization algorithm, achieving enhanced accuracy with fewer iterations. Hence, this paper focuses on constructing hybrid approaches that combine the derivative-free Newton iterative scheme [5] and a fourth-order Newton variant [14] with the Tuna Swarm Optimization algorithm [15]. The objective of this work is to demonstrate the advantages of combining derivative-based and derivative-free iterative schemes with an optimization algorithm. This integration reduces the complexity of the iterative method, enhances its accuracy in solving nonlinear systems, and lowers the time complexity. Moreover, it addresses several limitations of Newton’s method, including divergence, entrapment in local optima, and initial point selection, and allows large systems to be solved with greater accuracy and efficiency. The principal contributions of this paper are detailed below.
Section 1 presents the introduction of the work. Section 2 includes the development of hybrid iterative techniques using derivative-free Newton iterative technique, Newton’s fourth-order variant, the Tuna Swarm Optimization algorithm, and its operational mechanism. Section 3 presents a comparison of the numerical results of the proposed algorithms with the original optimization algorithm TSO, derivative-free Newton method (NM), the fourth-order variant of Newton’s method (BM), Newton–Harris Hawk Optimization algorithm (NHHO), and Jarratt–Butterfly Optimization technique. The comparison is based on parameters such as accuracy, stability, fitness value, convergence speed, and computational time. Section 4 includes the convergence analysis and graphical representation of the results. Finally, Section 5 presents some conclusions.

2. Development of Hybrid Iterative Schemes

Firstly, the Tuna Swarm Optimization (TSO) algorithm and the selected classical iterative methods are discussed in detail. Subsequently, the framework of the proposed hybrid methods, namely, derivative-free Newton–Tuna Swarm Optimization algorithm (DNTSO) and Newton fourth-order Tuna Swarm Optimization algorithm (BTSO), is presented and explained.

2.1. Tuna Swarm Optimization Algorithm

In 2021, Lei Xie et al. [15] developed a novel metaheuristic algorithm called the Tuna Swarm Optimization algorithm. This algorithm is inspired by the foraging behavior of the marine fish species tuna. The algorithm incorporates two foraging strategies observed in tuna. The first strategy is spiral foraging, in which tuna drive their prey into shallow water and attack them following a spiral movement pattern. The second strategy is parabolic foraging, wherein they swim in a parabolic shape to encircle their prey before attacking. Similarly to the majority of swarm-based metaheuristic algorithms, TSO initiates the optimization process by uniformly generating initial populations at random within the defined search space:
$$U_i = \mathrm{rand}\cdot(ub - lb) + lb, \qquad i = 1, 2, 3, \ldots, N_p, \tag{3}$$
where $N_p$ is the initial tuna population size, $\mathrm{rand}$ is a uniformly distributed random vector with values in [0, 1], $U_i$ is the $i$th initial individual, and $ub$ and $lb$ are the upper and lower boundaries of the search space.
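For illustration, a minimal sketch of the initialization (3), assuming NumPy (the function and parameter names are illustrative):

```python
import numpy as np

def init_population(n_pop, lb, ub, rng=None):
    """Uniform random initialization of Eq. (3): U_i = rand * (ub - lb) + lb."""
    rng = np.random.default_rng() if rng is None else rng
    lb = np.asarray(lb, dtype=float)
    ub = np.asarray(ub, dtype=float)
    # One row per tuna; rand is drawn independently for every coordinate.
    return rng.random((n_pop, lb.size)) * (ub - lb) + lb
```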
Spiral Foraging: In spiral foraging, tuna form tight spirals that drive the prey into shallow water, where it can be attacked easily. The spiral foraging behavior in TSO is determined by the following equations:
$$U_i^{t+1} = \begin{cases} \alpha_1\left(U_{rand}^{t} + \beta\,\lvert U_{rand}^{t} - U_i^{t}\rvert\right) + \alpha_2 U_i^{t}, & i = 1,\\[2pt] \alpha_1\left(U_{rand}^{t} + \beta\,\lvert U_{rand}^{t} - U_i^{t}\rvert\right) + \alpha_2 U_{i-1}^{t}, & i = 2, 3, \ldots, N_p, \end{cases} \tag{4}$$
$$U_i^{t+1} = \begin{cases} \alpha_1\left(U_{best}^{t} + \beta\,\lvert U_{best}^{t} - U_i^{t}\rvert\right) + \alpha_2 U_i^{t}, & i = 1,\\[2pt] \alpha_1\left(U_{best}^{t} + \beta\,\lvert U_{best}^{t} - U_i^{t}\rvert\right) + \alpha_2 U_{i-1}^{t}, & i = 2, 3, \ldots, N_p, \end{cases} \tag{5}$$
$$\alpha_1 = c + (1-c)\,\frac{t}{t_{max}}, \qquad \alpha_2 = (1-c) - (1-c)\,\frac{t}{t_{max}}, \qquad \beta = e^{bl}\cos(2\pi b), \qquad l = e^{3\cos\left(\left(\frac{t_{max}+1-t}{t_{max}}-1\right)\pi\right)},$$
where $c$ is a constant used to determine the degree to which the tuna follows the optimal individual and the previous individual in the initial phase; $\alpha_1$ and $\alpha_2$ are weight coefficients that control the tendency of individuals to move towards the optimal individual; $t_{max}$ is the maximum number of iterations; $t$ denotes the current iteration; and $b$ is a random number uniformly distributed between 0 and 1. $U_i^{t+1}$ is the $i$th individual at iteration $t+1$, $U_{best}^{t}$ is the current optimal individual (food), and $U_{rand}^{t}$ is a randomly generated reference point in the search space.
Parabolic Foraging: Besides feeding in a spiral pattern, tuna also cooperate in hunting via a parabolic formation, where the whole school curves into a parabola with the prey at the focal point. At the same time, individual tuna constantly scan and search the area immediately around them for food. Both strategies are used simultaneously, each individual switching between the cooperative parabolic approach and the personal local search with probability 0.5. The parabolic behavior in TSO is modeled as:
$$U_i^{t+1} = \begin{cases} U_{best}^{t} + \mathrm{rand}\cdot\left(U_{best}^{t} - U_i^{t}\right) + TF\cdot p^2\left(U_{best}^{t} - U_i^{t}\right), & \text{if } \mathrm{rand} < 0.5,\\[2pt] TF\cdot p^2\, U_i^{t}, & \text{if } \mathrm{rand} \ge 0.5, \end{cases} \tag{6}$$
where
$$p = \left(1 - \frac{t}{t_{max}}\right)^{t/t_{max}},$$
and $TF$ is a random number in $[-1, 1]$. Tuna catch their prey using these two hunting tactics. Drawing inspiration from this behavior, the algorithm follows one of the two tuna-inspired hunting moves (mimicking the cooperative strategies), or, with a small probability controlled by a parameter $z$, jumps to a completely new random position in the search space. This process continues, with every solution constantly updating its position and improving, until the algorithm reaches its stopping criterion (a maximum number of iterations). At the end, it delivers the best solution found along with its fitness score.
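The following is a compact, simplified sketch of one TSO position update following Equations (4)–(6), assuming NumPy. The exact switching rule between $U_{rand}$ and $U_{best}$, the form of $l$, and the restart handling are assumptions based on the description above and the reconstruction of the coefficient formulas, not a verbatim port of the reference implementation of [15]:

```python
import numpy as np

def tso_step(U, U_best, t, t_max, lb, ub, c=0.7, z=0.05, rng=None):
    """One TSO iteration over the whole school: a sketch of Eqs. (4)-(6).

    U is the (N_p, d) array of positions and U_best the best individual so
    far. With a small probability z an individual is re-initialized (Eq. (3))."""
    rng = np.random.default_rng() if rng is None else rng
    n_pop, dim = U.shape
    U_new = np.empty_like(U)
    alpha1 = c + (1 - c) * t / t_max        # pull toward the guiding point
    alpha2 = (1 - c) - (1 - c) * t / t_max  # pull toward the preceding tuna
    for i in range(n_pop):
        if rng.random() < z:                # rare random restart, Eq. (3)
            U_new[i] = rng.random(dim) * (ub - lb) + lb
        elif rng.random() < 0.5:            # spiral foraging, Eqs. (4)-(5)
            b = rng.random()
            l = np.exp(3 * np.cos(((t_max + 1 - t) / t_max - 1) * np.pi))
            beta = np.exp(b * l) * np.cos(2 * np.pi * b)
            # assumed switching rule: follow a random point early, U_best later
            if rng.random() > t / t_max:
                guide = rng.random(dim) * (ub - lb) + lb   # U_rand, Eq. (4)
            else:
                guide = U_best                             # U_best, Eq. (5)
            follow = U[i] if i == 0 else U[i - 1]
            U_new[i] = alpha1 * (guide + beta * np.abs(guide - U[i])) + alpha2 * follow
        else:                               # parabolic foraging, Eq. (6)
            TF = rng.choice([-1.0, 1.0])
            p = (1 - t / t_max) ** (t / t_max)
            if rng.random() < 0.5:
                U_new[i] = U_best + rng.random(dim) * (U_best - U[i]) + TF * p**2 * (U_best - U[i])
            else:
                U_new[i] = TF * p**2 * U[i]
    return np.clip(U_new, lb, ub)           # keep positions inside the bounds
```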

2.2. Derivative-Free Newton Iterative Scheme

Newton’s method [2] employs an iterative process and exhibits a second-order convergence rate, which makes it exceptionally efficient for solving systems of nonlinear equations and substantially expedites obtaining accurate solutions. The iterative scheme of Newton’s method (7) is outlined as follows:
$$U_{i+1} = U_i - [F'(U_i)]^{-1} F(U_i), \tag{7}$$
where $F'(U_i)$ denotes the Jacobian of $F$ at $U_i$; here, the Jacobian is calculated using the finite difference approximation (8) as follows:
$$\frac{dF_i}{dU_j} \approx \frac{F_i(U + h\,e_j) - F_i(U)}{h}, \tag{8}$$
where each element $U_j$ of $U$ is perturbed by a small step size $h$ along the $j$th unit vector $e_j$, and $F_i(U + h\,e_j)$ and $F_i(U)$ are the function evaluations at the perturbed and unperturbed points.
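A minimal sketch of the derivative-free Newton step (7) with the forward-difference Jacobian (8), assuming NumPy (the function names are illustrative):

```python
import numpy as np

def fd_jacobian(F, u, h=1e-4):
    """Forward-difference Jacobian of Eq. (8): column j perturbs u_j by h."""
    u = np.asarray(u, dtype=float)
    F0 = np.asarray(F(u), dtype=float)
    J = np.empty((F0.size, u.size))
    for j in range(u.size):
        up = u.copy()
        up[j] += h                          # perturb the j-th coordinate only
        J[:, j] = (np.asarray(F(up), dtype=float) - F0) / h
    return J

def newton_step(F, u, h=1e-4):
    """One derivative-free Newton step, Eq. (7): U_{i+1} = U_i - J^{-1} F(U_i)."""
    u = np.asarray(u, dtype=float)
    return u - np.linalg.solve(fd_jacobian(F, u, h), np.asarray(F(u), dtype=float))
```

In the experiments reported below, the step size $h = 0.0001$ from Table 2 would be passed here.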

2.3. Newton’s Fourth-Order Iterative Scheme

Here, we choose the following fourth-order special case from Behl et al. [14], which is given as follows:
$$\begin{aligned} V_i &= U_i - \frac{2}{3}\,[F'(U_i)]^{-1} F(U_i),\\ U_{i+1} &= U_i - \left[a_1 I + a_2\left(F'(U_i)\,[F'(V_i)]^{-1}\right)^{2}\right] [F'(U_i)]^{-1} F(U_i), \qquad i = 0, 1, 2, \ldots, \end{aligned} \tag{9}$$
where $a_1 = \frac{5}{8}$ and $a_2 = \frac{3}{8}$.
In a similar manner to Expression (7), this fourth-order variant is hybridized with the Tuna Swarm Optimization method (BTSO). The technique is based on a Newton variant that requires the derivative, but the hybridization procedure is the same as for DNTSO.
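Under the reconstruction of (9) above, one step of the fourth-order variant for systems could be sketched as follows (assuming NumPy; Jac is any routine returning the Jacobian, e.g., an analytic Jacobian or the fd_jacobian sketch above):

```python
import numpy as np

def fourth_order_step(F, Jac, u, a1=5.0 / 8.0, a2=3.0 / 8.0):
    """One step of the fourth-order variant (9), a sketch for systems.

    The weight matrix a1*I + a2*(F'(U) F'(V)^{-1})^2 follows the
    reconstruction of Eq. (9) given in the text."""
    u = np.asarray(u, dtype=float)
    Ju = Jac(u)
    newton_dir = np.linalg.solve(Ju, np.asarray(F(u), dtype=float))  # F'(U)^{-1} F(U)
    v = u - (2.0 / 3.0) * newton_dir                                 # first sub-step
    A = Ju @ np.linalg.inv(Jac(v))                                   # F'(U) F'(V)^{-1}
    W = a1 * np.eye(u.size) + a2 * (A @ A)
    return u - W @ newton_dir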

2.4. Hybrid Derivative-Free Newton Tuna Swarm Optimization Algorithm (DNTSO) and Newton’s Fourth-Order Tuna Swarm Optimization Algorithm (BTSO)

Optimization algorithms have been successfully employed in numerous contexts and possess a wide range of applications. However, as the No Free Lunch (NFL) theorem [16] states, no algorithm is universally effective for all problems. The hybrid approach combining a derivative-free Newton method and fourth-order variant of Newton’s method with the Tuna Swarm Optimization algorithm (TSO) is utilized to solve nonlinear systems and nonlinear boundary value problems. By integrating the rapid local convergence of iterative methods with the TSO’s global search capabilities, these hybrid methods effectively navigate complex, high-dimensional search spaces while avoiding local optima traps.
Algorithm 1 of the proposed methods DNTSO and BTSO is explained as follows: The method starts by initializing the parameters of the Tuna Swarm Optimization algorithm, such as the population size ($N_P$), maximum iterations ($T$), and lower and upper bounds for the randomly generated population. It initializes a random population using Equation (3) and the objective function value $F(u)$ using Equation (2). Tuna are generated randomly as the search agents, and the position update in the exploration phase is calculated using Equations (4) and (5). Then the exploitation phase is performed using Equation (6). The process iterates over all $i$, with each tuna updating its position through the TSO. After that, the iterative technique (derivative-free Newton method or Newton’s fourth-order variant) is applied at the end of each iteration using Equation (7) or (9). The fitness value is calculated for the new position provided by the iterative method and compared to the fitness provided by the TSO; the position with the better fitness is selected. This loop continues until the stopping criterion for the proposed scheme ($1 \times 10^{-60}$) is achieved, after which the best optimal solution from the hybrid derivative-free Newton–Tuna Swarm Optimization algorithm (DNTSO) or Newton’s fourth-order Tuna Swarm Optimization algorithm (BTSO) is obtained, ending the process. The iterative methods ensure precise local refinement, while the TSO’s searching behavior, such as spiral foraging, enhances global exploration. This hybridization yields superior performance in achieving high accuracy and computational efficiency when tackling nonlinear equations and nonlinear boundary value problems.
The Generalized Pseudocode for the proposed DNTSO and BTSO algorithm is as shown in Algorithm 1.
Algorithm 1 Pseudocode for the hybrid DNTSO and BTSO algorithms.
BEGIN
1: Input all optimization parameters of the Tuna Swarm Optimization algorithm.
2: Set the number of tunas (N_P) and the total number of iterations (t_max).
3: Initialize the tuna positions using Equation (3).
4: For t = 1 : t_max
5:    Update the strongest tuna according to the objective function value.
6:    For i = 1 : N_P
7:       Perform the exploration phase of the TSO using the spiral foraging strategy.
8:       Calculate and update the new position of the i-th tuna using (4) and (5).
9:       Perform the exploitation phase using the parabolic foraging strategy.
10:      Calculate and update the i-th tuna's new position using (6).
11:   end
12:   Output the updated tuna position obtained by TSO for the given problem.
13:   Update the iterative method's position U_{i+1} in (7)/(9), starting from the updated tuna position obtained by TSO.
14:   Calculate the fitness of U_{i+1} and compare it with the fitness of TSO's updated tuna position.
15:   If fitness(U_{i+1}) < TSO's fitness:
16:      Update fitness = fitness of U_{i+1}.
17:      Update best position = position of U_{i+1}.
18:   If termination criteria not met, return to Step 4.
19: end
Return: Best individual and best fitness value.
END
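For concreteness, the following is a simplified sketch of the greedy TSO + Newton coupling of Algorithm 1, assuming NumPy and the helper functions sketched in the previous sections (init_population, tso_step, newton_step, fourth_order_step); the restart and update-order details follow the pseudocode only loosely:

```python
import numpy as np

def hybrid_solve(F, lb, ub, step, n_pop=20, t_max=50, tol=1e-60, rng=None):
    """Greedy TSO + iterative-method coupling: a sketch of Algorithm 1.

    `step` performs one local refinement, e.g. newton_step (DNTSO) or a
    wrapped fourth_order_step (BTSO) from the earlier sketches."""
    rng = np.random.default_rng() if rng is None else rng
    fitness = lambda u: float(np.linalg.norm(np.asarray(F(u), dtype=float)))
    U = init_population(n_pop, np.asarray(lb, float), np.asarray(ub, float), rng)
    best = min(U, key=fitness)                             # strongest tuna so far
    for t in range(1, t_max + 1):
        U = tso_step(U, best, t, t_max, lb, ub, rng=rng)   # Eqs. (4)-(6)
        cand = min(U, key=fitness)                         # best tuna this iteration
        try:
            refined = step(F, cand)                        # Eq. (7) or (9)
            if fitness(refined) < fitness(cand):           # keep whichever is fitter
                cand = refined
        except np.linalg.LinAlgError:
            pass                                           # singular Jacobian: keep TSO point
        if fitness(cand) < fitness(best):
            best = cand
        if fitness(best) < tol:                            # stopping criterion 1e-60
            break
    return best, fitness(best)
```

For BTSO, `step` could be, for example, `lambda F, u: fourth_order_step(F, lambda x: fd_jacobian(F, x), u)`. Note that reaching fitness values near $1 \times 10^{-60}$ additionally requires extended-precision arithmetic (the paper mentions MATLAB’s vpa), which this double-precision sketch does not model.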

3. Numerical Results

In this section, the efficacy of the proposed approaches is evaluated on various nonlinear systems of equations. Additionally, applications in differential equations are selected to demonstrate the capabilities of the proposed DNTSO and BTSO methods. It is noteworthy that these are standard benchmark problems that have been used extensively in numerous research studies. The results of the tested problems are compared with the original optimization algorithm TSO, the iterative methods NM (7) and BM (9), and two previously developed hybrid approaches, NHHO [11] and JBOA [12]. The parameters considered for the proposed methods and the other comparative methods are given in Table 1 and Table 2.
Example 1.
Consider the system of four nonlinear algebraic equations as the first test case [17]:
$$F_1(u) = \begin{cases} u_2 u_3 + u_4(u_2 + u_3) = 0,\\ u_1 u_3 + u_4(u_1 + u_3) = 0,\\ u_1 u_2 + u_4(u_1 + u_2) = 0,\\ u_1 u_2 + u_1 u_3 + u_2 u_3 - 1 = 0. \end{cases}$$
The approximate roots for the given system are shown in Table 3. From Table 4, it is clear that in terms of fitness (functional error), the proposed methods BTSO and DNTSO perform better than the TSO method in less computational time. Moreover, NHHO and JBOA provide better fitness than the traditional optimization approach but do not attain the results of the proposed DNTSO and BTSO methods, even though JBOA and NHHO are themselves hybrids of the Jarratt and Newton iterative schemes with the Butterfly Optimization algorithm and the Harris Hawk Optimization method, respectively. Due to the poorer exploration capabilities of the Butterfly Optimization algorithm and the Harris Hawk Optimization method, these methods do not provide efficient results in fewer iterations and less CPU time.
Example 2.
The second test case is an Interval Arithmetic problem [18]:
$$F_2(u) = \begin{cases} u_1 - 0.25428722 - 0.18324757\, u_4 u_3 u_9 = 0,\\ u_2 - 0.37842197 - 0.16275449\, u_1 u_{10} u_6 = 0,\\ u_3 - 0.27162577 - 0.16955071\, u_1 u_2 u_{10} = 0,\\ u_4 - 0.19807914 - 0.15585316\, u_7 u_1 u_6 = 0,\\ u_5 - 0.44166728 - 0.19950920\, u_7 u_6 u_3 = 0,\\ u_6 - 0.14654113 - 0.18922793\, u_8 u_5 u_{10} = 0,\\ u_7 - 0.42937161 - 0.21180486\, u_2 u_5 u_8 = 0,\\ u_8 - 0.07056438 - 0.17081208\, u_1 u_7 u_6 = 0,\\ u_9 - 0.34504906 - 0.19612740\, u_{10} u_6 u_8 = 0,\\ u_{10} - 0.42651102 - 0.21466544\, u_4 u_8 u_1 = 0, \end{cases}$$
where $-2 \le u_i \le 2$, $i = 1, 2, \ldots, 10$.
The approximate roots of this problem through different techniques are illustrated in Table 5. The numerical results in Table 6 indicate that the proposed methods DNTSO and BTSO achieve faster and more stable convergence results in less CPU time, while JBOA and NHHO also provide approximately similar fitness values.
Example 3
(Application to a differential equation). In this example, a nonlinear boundary value problem [19] is considered to check the efficiency of the proposed method.
$$v'' = \frac{1}{8}\left(32 + 2u^3 - v\,v'\right), \quad 1 \le u \le 3, \qquad v(1) = 17, \quad v(3) = \frac{43}{3}. \tag{10}$$
The given nonlinear boundary value problem can be converted into a nonlinear system of equations by discretizing the domain [1, 3] of the differential equation into 20 equal sub-intervals. For this, the finite difference method is used with step size $h = 0.1$ and nodes $u_i = 1 + ih$, $i = 0, 1, \ldots, 20$. After discretization, $v''$ and $v'$ are approximated through the central difference formulas stated in [20]:
$$v'' \approx \frac{1}{h^2}\left(v_{i+1} - 2v_i + v_{i-1}\right), \qquad v' \approx \frac{1}{2h}\left(v_{i+1} - v_{i-1}\right).$$
Substituting $v''$ and $v'$ into (10) for $i = 1, 2, \ldots, 19$, the following system of nonlinear equations $F_3(v) = 0$ is obtained, whose coordinate functions are:
f_1 = 2v_1 - v_2 + 0.01(4 + 0.33275 + v_1(v_2 - 17)/1.6) - 17
f_2 = -v_1 + 2v_2 - v_3 + 0.01(4 + 0.432 + v_2(v_3 - v_1)/1.6)
f_3 = -v_2 + 2v_3 - v_4 + 0.01(4 + 0.54925 + v_3(v_4 - v_2)/1.6)
f_4 = -v_3 + 2v_4 - v_5 + 0.01(4 + 0.686 + v_4(v_5 - v_3)/1.6)
f_5 = -v_4 + 2v_5 - v_6 + 0.01(4 + 0.84375 + v_5(v_6 - v_4)/1.6)
f_6 = -v_5 + 2v_6 - v_7 + 0.01(4 + 1.024 + v_6(v_7 - v_5)/1.6)
f_7 = -v_6 + 2v_7 - v_8 + 0.01(4 + 1.22825 + v_7(v_8 - v_6)/1.6)
f_8 = -v_7 + 2v_8 - v_9 + 0.01(4 + 1.458 + v_8(v_9 - v_7)/1.6)
f_9 = -v_8 + 2v_9 - v_10 + 0.01(4 + 1.71475 + v_9(v_10 - v_8)/1.6)
f_10 = -v_9 + 2v_10 - v_11 + 0.01(4 + 2 + v_10(v_11 - v_9)/1.6)
f_11 = -v_10 + 2v_11 - v_12 + 0.01(4 + 2.31525 + v_11(v_12 - v_10)/1.6)
f_12 = -v_11 + 2v_12 - v_13 + 0.01(4 + 2.662 + v_12(v_13 - v_11)/1.6)
f_13 = -v_12 + 2v_13 - v_14 + 0.01(4 + 3.04175 + v_13(v_14 - v_12)/1.6)
f_14 = -v_13 + 2v_14 - v_15 + 0.01(4 + 3.456 + v_14(v_15 - v_13)/1.6)
f_15 = -v_14 + 2v_15 - v_16 + 0.01(4 + 3.90625 + v_15(v_16 - v_14)/1.6)
f_16 = -v_15 + 2v_16 - v_17 + 0.01(4 + 4.394 + v_16(v_17 - v_15)/1.6)
f_17 = -v_16 + 2v_17 - v_18 + 0.01(4 + 4.92075 + v_17(v_18 - v_16)/1.6)
f_18 = -v_17 + 2v_18 - v_19 + 0.01(4 + 5.488 + v_18(v_19 - v_17)/1.6)
f_19 = -v_18 + 2v_19 + 0.01(4 + 6.09725 + v_19(14.333333 - v_18)/1.6) - 14.333333
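As an illustration, the nineteen coordinate functions above can be built in a few vectorized lines (a sketch assuming NumPy; the name F3 and the boundary-padding scheme are illustrative, and the constant terms arise as $u_i^3/4$):

```python
import numpy as np

def F3(v, h=0.1):
    """Residuals of the discretized BVP of Example 3: a sketch reproducing the
    19 printed coordinate functions, with boundary values v(1)=17, v(3)=43/3."""
    v = np.asarray(v, dtype=float)
    vf = np.concatenate(([17.0], v, [43.0 / 3.0]))   # attach the boundary nodes
    u = 1.0 + h * np.arange(1, v.size + 1)           # interior nodes u_1..u_19
    vm, vc, vp = vf[:-2], vf[1:-1], vf[2:]           # v_{i-1}, v_i, v_{i+1}
    # -v_{i-1} + 2 v_i - v_{i+1} + 0.01*(4 + u_i^3/4 + v_i (v_{i+1}-v_{i-1})/1.6)
    return -vm + 2 * vc - vp + 0.01 * (4 + u**3 / 4 + vc * (vp - vm) / 1.6)
```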
The experimental results in Table 7 show that DNTSO and BTSO provide a more accurate solution than NHHO, JBOA, and TSO. Moreover, in terms of functional value and CPU time, BTSO has superior performance compared to all the other methods. The approximate roots of the given problem are given in Table 8. From these results, we conclude that the proposed methods are suitable for solving complex nonlinear boundary value problems.
Example 4
(Bratu’s problem). To check the efficiency of the proposed method, another nonlinear BVP is considered, commonly known as Bratu’s problem [21]. Bratu’s problem has a variety of applications in various disciplines such as heat transfer, thermal reactions, nanotechnology, chemical reactor theory, the fuel ignition model of thermal combustion, etc. The BVP can be represented as:
$$v'' + \lambda e^{v} = 0, \qquad v(0) = 0, \quad v(1) = 0. \tag{11}$$
To convert the above BVP into a system of nonlinear equations, the finite difference approximation is used: the domain [0, 1] of the problem is discretized into 40 sub-intervals with $\lambda = 1$ and a step size $h = 1/40 = 0.025$. To compute $v''$, the central difference approximation is used, represented as:
$$v'' \approx \frac{1}{h^2}\left(v_{i+1} - 2v_i + v_{i-1}\right).$$
Substituting $v''$ into (11), the following system of nonlinear equations $F_4(v) = 0$ is obtained, with coordinate functions:
$$\begin{aligned} f_1(v) &= -2v_1 + v_2 + 0.000625\, e^{v_1} = 0,\\ f_i(v) &= -2v_i + v_{i-1} + v_{i+1} + 0.000625\, e^{v_i} = 0, \quad i = 2, 3, \ldots, 39,\\ f_{40}(v) &= -2v_{40} + v_{39} + 0.000625\, e^{v_{40}} = 0. \end{aligned}$$
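A corresponding sketch for the Bratu residuals (again assuming NumPy, with the homogeneous boundary values handled by zero padding; note $h^2 = 0.000625$):

```python
import numpy as np

def F4(v, lam=1.0, h=0.025):
    """Residuals of the discretized Bratu problem (11), matching the printed
    system: v_{i-1} - 2 v_i + v_{i+1} + h^2 * lam * exp(v_i) = 0, v_0 = v_41 = 0."""
    v = np.asarray(v, dtype=float)
    vf = np.concatenate(([0.0], v, [0.0]))           # homogeneous boundaries
    return vf[:-2] - 2 * vf[1:-1] + vf[2:] + lam * h**2 * np.exp(v)
```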
The approximate roots obtained from TSO are (0.00053498, 0.000542365, 0.00054635, 0.00053728, 0.00053937, 0.00053241, 0.00054014, 0.00053907, 0.00053229, 0.00053758, 0.00054042, 0.00053057, 0.00052878, 0.00054244, 0.00053034, 0.00053665, 0.00054511, 0.00054294, 0.00054735, 0.00054657, 0.00053227, 0.00054698, 0.00054159, 0.00054659, 0.00054178, 0.00054688, 0.00053544, 0.00054443, 0.00053833, 0.00052958, 0.00054745, 0.00053755, 0.00054189, 0.00054153, 0.00054575, 0.00054372, 0.00053763, 0.00053227, 0.00053517, 0.00053512)^T, and the approximate roots (best positions) obtained from NHHO, JBOA, DNTSO, and BTSO are (0.01384114, 0.02704858, 0.03961389, 0.05152893, 0.06278593, 0.07337743, 0.08329634, 0.09253596, 0.10108999, 0.10895253, 0.11611813, 0.12258178, 0.12833891, 0.13338546, 0.13771783, 0.14133292, 0.14422812, 0.14640136, 0.14785106, 0.14857618, 0.14857618, 0.14785106, 0.14640136, 0.14422812, 0.14133292, 0.13771783, 0.13338546, 0.12833891, 0.12258178, 0.11611813, 0.10895253, 0.10108999, 0.09253596, 0.08329634, 0.07337743, 0.06278593, 0.05152893, 0.03961389, 0.02704858, 0.01384114)^T. From Table 9, it is concluded that in terms of functional error reduction, BTSO performs better than all other methods. Moreover, the approximate roots provided by the hybrid methods agree to the displayed precision, but at higher precision the roots differ slightly between methods.
Example 5
(Application to a second-order ordinary differential equation). Second-order differential equations commonly arise in applied mathematics and engineering, particularly in the modeling of dynamical systems. To check the efficiency and robustness of the proposed method, a second-order differential equation has been considered. Generally, such equations are solved by higher-order Runge–Kutta or implicit Euler methods. The considered nonlinear second-order ordinary differential equation [22] is given as:
$$v'' - 2u\,(v')^2 = 0, \qquad v(0) = 1, \quad v'(0) = 0. \tag{12}$$
To convert the above ODE into a system of nonlinear equations, the finite difference approximation is used: the domain [0, 1] of the problem is discretized into 20 sub-intervals with step size $h = 1/20 = 0.05$ and nodes $u_i = ih$. To compute $v''$ and $v'$, the central difference approximations are used, represented as:
$$v'' \approx \frac{1}{h^2}\left(v_{i+1} - 2v_i + v_{i-1}\right), \qquad v' \approx \frac{1}{2h}\left(v_{i+1} - v_{i-1}\right).$$
Substituting $v''$ and $v'$ into (12), the following generalized system of nonlinear equations $F_5(v) = 0$ is obtained:
$$f_i(v) = v_{i+1} - 2v_i + v_{i-1} - \frac{u_i}{2}\left(v_{i+1} - v_{i-1}\right)^2 = 0, \qquad i = 1, 2, \ldots, 20.$$
The approximate roots obtained from TSO are (0.0063245675, 0.0063245675, 0.0063245675, …, 0.0063245675, 0.0063245675, 0.0063245675)^T, and the approximate roots obtained from NHHO, JBOA, DNTSO, and BTSO are (1.0000000000, 1.0000000000, 1.0000000000, …, 1.0000000000, 1.0000000000, 1.0000000000)^T. Table 10 shows that in terms of functional error reduction, BTSO and DNTSO perform better than all other methods. Moreover, the proposed methods also require less computational time for solving the given second-order ordinary differential equation, making the proposed techniques more efficient and robust.
In Table 11, the values of the initial vectors $a_1$, $a_2$, and $a_3$ are $a_1 = (0.5, 0.5, 0.5, 0.5)$, $a_2 = (0.1, 0.2, 0.4, 1)$, and $a_3 = (0.1, 0.1, 0.3, 1.5)$. Moreover, in Table 12, the values of the initial vectors $b_1$, $b_2$, and $b_3$ are $b_1 = (1, 1, 1, 1, 1, 1, 1, 1, 1, 1)$, $b_2 = (0.5, 0.8, 0.9, 0.6, 0.9, 0, 0.5, 0.8, 0.1, 0.2)$, and $b_3 = (1, 2, 1, 0, 1, 2, 1, 0.5, 1, 2)$.
Table 11, Table 12, Table 13, Table 14 and Table 15 illustrate the comparison of the derivative-free Newton method (NM) and the variant of Newton’s method (BM) with the proposed schemes DNTSO and BTSO. The iterative techniques are tested on three different initial points, and the results are compared in terms of iterations and the norm $\|F(u)\|$ at the last iteration. From this comparison, it is concluded that the proposed methods are more stable, accurate, and robust than the plain iterative methods, which, due to an inappropriate initial guess, sometimes fail to converge or do not provide an accurate solution; in the hybrid methods this problem does not arise, since the optimization method provides the initial guess to the iterative method. Moreover, the iterative techniques could reduce the functional error further if the precision were increased using the “vpa” command, but this would increase their computational cost.
The results of comparisons across all problems support the hypothesis that combining two algorithms leads to the inheritance of efficient characteristics from both optimization algorithms and iterative methods. This is demonstrated by the superior performance of hybrid algorithms. Newton’s method, as a local search mechanism, enhances the solution accuracy and prevents the optimization algorithm from getting trapped in local minima. In all the problems, the DNTSO and BTSO provide the best position and reduce the functional error in less computational time compared to all other optimization as well as hybrid approaches.

4. Convergence Analysis

Convergence speed is a crucial metric used to evaluate and compare algorithms, as it indicates how quickly an algorithm can reach an optimal solution. This characteristic is particularly important for efficiently addressing complex problems, such as systems of nonlinear equations. In the proposed DNTSO and BTSO approaches, convergence is evaluated using the fitness value at each iteration. Faster convergence signifies that the algorithm can deliver a highly accurate solution in fewer iterations or with reduced computational time, making it especially valuable for practical applications. The convergence curves shown in Figure 1 illustrate the performance analysis for functions $F_1$–$F_5$. In all tested scenarios, BTSO and DNTSO achieve the optimal fitness values within just 10–15 iterations. In contrast, the traditional optimization technique and hybrid methods like TSO, NHHO, and JBOA exhibit slower convergence, with clear signs of becoming trapped in local optima and inconsistent fitness outcomes. However, the convergence curves of BTSO and DNTSO reveal steady improvement in fitness values over fewer iterations, without falling into local optima. This is due to the effective balance between exploration and exploitation phases in these methods. The Tuna Swarm Optimization algorithm handles both exploration and initial exploitation, after which the top candidate solutions are further refined through the iterative scheme during the exploitation stage, enabling accelerated convergence. Regarding computational efficiency, DNTSO and BTSO require less computational time and fewer iterations to attain the optimal solution, performing 60–70% better than standard metaheuristic algorithms and traditional hybrid approaches. These findings collectively establish DNTSO and BTSO as reliable, consistent, and highly efficient techniques for solving nonlinear equation systems.
Remark 1.
To find the optimal solution, the stopping criterion for the proposed methods is $1 \times 10^{-60}$. The criterion for determining the fitness value (functional value) is $\|F(u)\| = \sqrt{f_1^2 + f_2^2 + \cdots + f_n^2}$.
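In code, this fitness criterion is a single line (assuming NumPy; the name fitness is illustrative):

```python
import numpy as np

def fitness(F, u):
    """Fitness/functional value ||F(u)|| = sqrt(f_1^2 + ... + f_n^2) of Remark 1."""
    return float(np.linalg.norm(np.asarray(F(u), dtype=float)))
```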
The hybrid algorithm of BTSO and DNTSO significantly enhances convergence for the given examples. The proposed algorithms outperform others by requiring fewer iterations to achieve optimal and effective solutions. This improved convergence speed demonstrates the superiority of the suggested algorithms. Consequently, it can be concluded that the proposed methods exhibit greater consistency and stability in generating precise solutions.

4.1. Stability and Consistency

To illustrate the stability and efficiency of the proposed algorithms DNTSO and BTSO, the primary metrics employed are the average (mean) fitness value and the standard deviation. These statistical measures have been derived from 30 independent runs across all compared algorithms, providing insight into their reliability and consistency. True consistency is evident when the fitness value remains identical or nearly identical across every run. Additionally, the standard deviation serves as an indicator of algorithmic stability: a lower value signifies greater robustness and stability. Table 16 and Table 17 present the mean and standard deviation results for functions F 1 F 5 across the various methods. The data clearly demonstrate that the proposed approaches yield the same or very similar fitness values in all 30 independent executions, underscoring their superior stability, efficiency, and robustness. Figure 2 provides a graphical representation of performance across multiple runs for different problems. The plots reveal that fitness values in conventional methods exhibit considerable fluctuation from one run to another, whereas DNTSO and BTSO maintain highly consistent fitness outcomes in each execution. This behavior further confirms the enhanced robustness and stability of the proposed methods.

4.2. Computational Complexity

The computational complexities of the proposed BTSO and DNTSO methods are determined by several key components: the initialization phase, the tuna position update process, and the incorporation of Newton’s scheme. The initialization step exhibits a complexity of $O(N_P)$, where $N_P$ denotes the population size (number of tuna). The position update phase, encompassing the identification of the best position, contributes a complexity of $O(t_{max} \times N_P) + O(t_{max} \times N_P \times M)$, with $t_{max}$ representing the maximum number of iterations and $M$ indicating the computational cost associated with evaluating the given function. Additionally, Newton’s scheme adds a complexity of $O(t_{max} \cdot P)$, where $P$ reflects its per-iteration computational effort. Overall, the total time complexity of DNTSO and BTSO is therefore $O(N_P \times (t_{max} + t_{max} M + 1) + t_{max} P)$. Naturally, every enhancement comes at a price. The primary goal of the developed hybrid algorithms is to improve both the fitness quality and the convergence rate relative to traditional methods; integrating one algorithm with another inevitably elevates the computational complexity and execution time compared to the standalone original algorithm, and the extent of this overhead depends on the optimization algorithm involved. Nevertheless, in the proposed methods BTSO and DNTSO the computational cost is much lower than in all other compared methods, owing to the fast exploration of the Tuna Swarm algorithm and the exploitation of Newton’s method and its fourth-order variant.

5. Conclusions

This study presents the hybrid approaches DNTSO and BTSO, which integrate the derivative-free Newton technique and its fourth-order variant with the Tuna Swarm Optimization (TSO) algorithm to address nonlinear systems of equations. The hybrid methods leverage TSO’s exploration capabilities to ensure rapid convergence. Their efficacy is demonstrated through application to nonlinear problems and further validated on differential equations, illustrating their versatility and practical relevance. Results indicate that the hybrid methods outperform traditional techniques in computational efficiency, functional error reduction, and solution accuracy. This hybridization advances the field of numerical optimization and expands its applicability to various engineering and scientific problems involving nonlinear systems. Future research could enhance the methods by dynamically adapting the TSO parameters or extending them to higher-dimensional systems and standard optimization benchmark functions, thereby improving scalability and robustness.

Author Contributions

Conceptualization, R.B. and S.B.; methodology, J.R.T.; software, A.C. (Aanchal Chandel); validation, A.C. (Alicia Cordero) and S.B.; formal analysis, R.B.; investigation, A.C. (Aanchal Chandel); writing—original draft preparation, A.C. (Aanchal Chandel); writing—review and editing, A.C. (Alicia Cordero) and J.R.T.; supervision, R.B. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The original contributions presented in this study are included in the article. Further inquiries can be directed to the corresponding author.

Acknowledgments

The authors would like to thank the anonymous reviewers for their comments and suggestions.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Karr, C.L.; Weck, B.; Freeman, L.M. Solutions to systems of nonlinear equations via a genetic algorithm. Eng. Appl. Artif. Intell. 1998, 11, 369–375. [Google Scholar] [CrossRef]
  2. Epperson, J.F. An Introduction to Numerical Methods and Analysis; John Wiley & Sons: Hoboken, NJ, USA, 2013. [Google Scholar]
  3. Chen, D.; Argyros, I.K.; Qian, Q.S. A note on the Halley method in Banach spaces. Appl. Math. Comput. 1993, 58, 215–224. [Google Scholar] [CrossRef]
  4. Ostrowski, A.M. Solution of Equations and Systems of Equations; Academic Press: New York, NY, USA, 1966. [Google Scholar]
  5. Jarratt, P. Some fourth-order multipoint iterative methods for solving equations. Math. Comput. 1966, 20, 434–437. [Google Scholar] [CrossRef]
  6. Kennedy, J.; Eberhart, R. Particle swarm optimization. In Proceedings of the ICNN’95-International Conference on Neural Networks, Perth, WA, Australia, 27 November–1 December 1995; IEEE: Piscataway, NJ, USA, 1995; Volume 4, pp. 1942–1948. [Google Scholar]
  7. Storn, R. On the usage of differential evolution for function optimization. In Proceedings of the North American Fuzzy Information Processing, Berkeley, CA, USA, 19–22 June 1996; IEEE: Piscataway, NJ, USA, 1996; pp. 519–523. [Google Scholar]
  8. Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey wolf optimizer. Adv. Eng. Softw. 2014, 69, 46–61. [Google Scholar] [CrossRef]
  9. Arora, S.; Singh, S. Butterfly optimization algorithm: A novel approach for global optimization. Soft Comput. 2019, 23, 715–734. [Google Scholar] [CrossRef]
  10. Luo, Y.-Z.; Tang, G.-J.; Zhou, L.-N. Hybrid approach for solving systems of nonlinear equations using chaos optimization and quasi-Newton method. Appl. Soft Comput. 2008, 8, 1068–1073. [Google Scholar] [CrossRef]
  11. Sihwail, R.; Solaiman, O.S.; Omar, K.; Ariffin, K.A.Z.; Alswaitti, M.; Hashim, I. A hybrid approach for solving systems of nonlinear equations using harris hawks optimization and Newton’s method. IEEE Access 2021, 9, 95791–95807. [Google Scholar] [CrossRef]
  12. Sihwail, R.; Solaiman, O.S.; Ariffin, K.A.Z. New robust hybrid Jarratt-Butterfly optimization algorithm for nonlinear models. J. King Saud Univ. Comput. Inf. Sci. 2022, 34, 8207–8220. [Google Scholar] [CrossRef]
  13. Said Solaiman, O.; Sihwail, R.; Shehadeh, H.; Hashim, I.; Alieyan, K. Hybrid Newton–Sperm Swarm Optimization Algorithm for Nonlinear Systems. Mathematics 2023, 11, 1473. [Google Scholar] [CrossRef]
  14. Behl, R.; Sarría, Í.; González, D.; Magreñán, Á.A. Highly efficient family of iterative methods for solving nonlinear models. J. Comput. Appl. Math. 2019, 346, 110–132. [Google Scholar] [CrossRef]
  15. Xie, L.; Han, T.; Zhou, H.; Zhang, Z.-R.; Han, B.; Tang, A. Tuna swarm optimization: A novel swarm-based metaheuristic algorithm for global optimization. Comput. Intell. Neurosci. 2021, 2021, 9210050. [Google Scholar] [CrossRef] [PubMed]
  16. Adam, S.P.; Alexandropoulos, S.-A.N.; Pardalos, P.M.; Vrahatis, M.N. No free lunch theorem: A review. In Approximation and Optimization: Algorithms, Complexity, and Applications; Springer: Cham, Switzerland, 2019; pp. 57–82. [Google Scholar]
  17. Behl, R.; Cordero, A.; Torregrosa, J.R.; Bhalla, S. A new high-order Jacobian-free iterative method with memory for solving nonlinear systems. Mathematics 2021, 9, 2122. [Google Scholar] [CrossRef]
  18. Van Hentenryck, P.; McAllester, D.; Kapur, D. Solving Polynomial Systems Using a Branch, Prune Approach. SIAM J. Numer. Anal. 1997, 34, 797–827. [Google Scholar] [CrossRef]
  19. Shams, M.; Rafiq, N.; Kausar, N.; Agarwal, P.; Park, C.; Mir, N.A. On iterative techniques for estimating all roots of nonlinear equation, its system with application in differential equation. Adv. Differ. Equ. 2021, 2021, 480. [Google Scholar] [CrossRef]
  20. Burden, R.L.; Faires, J.D. Numerical Analysis; Prindle, Weber & Schmidt: New York, NY, USA, 1985. [Google Scholar]
  21. Kansal, M.; Cordero, A.; Bhalla, S.; Torregrosa, J.R. New fourth- and sixth-order classes of iterative methods for solving systems of nonlinear equations and their stability analysis. Numer. Algorithms 2021, 87, 1017–1060. [Google Scholar] [CrossRef]
  22. Mengesha, L.M.; Denekew, S.A. Revised methods for solving nonlinear second order differential equations. J. Appl. Comput. Math. 2020, 9. [Google Scholar]
Figure 1. Convergence results based on 50 iterations for Examples 1–5.
Figure 2. Convergence results based on 30 runs for Examples 1–5.
Table 1. General parameter values applied uniformly across all algorithms.
Parameter | Value
Number of Search Agents (N_P) | 20
Maximum Iterations (t_max) | 50
CPU Time | seconds
Fitness (Functional Error) | ||F(u)|| = sqrt(f_1^2 + f_2^2 + ... + f_n^2)
Software | MATLAB 2025b on a Windows 11 system, Intel i5 processor, 8 GB RAM
Table 2. Parameter values for the competitor algorithms.
Algorithm | Parameter | Value
TSO | c | 0.7
TSO | rand | random number varying from 0 to 1
TSO | b | random number between 0 and 1
TSO | probability parameter (z) | 0.05
TSO | TF | random number with value -1 or 1
NHHO | β | 1.5
NHHO | E_0 | random number in [-1, 1]
JBOA | p (probability parameter switch) | 0.8
JBOA | power exponent | 0.1
JBOA | sensory modality | 0.01
DNTSO | c | 0.7
DNTSO | rand | random number varying from 0 to 1
DNTSO | b | random number between 0 and 1
DNTSO | probability parameter (z) | 0.05
DNTSO | TF | random number with value -1 or 1
DNTSO | h (step size) | 0.0001
BTSO | c | 0.7
BTSO | rand | random number varying from 0 to 1
BTSO | b | random number between 0 and 1
BTSO | probability parameter (z) | 0.05
BTSO | TF | random number with value -1 or 1
BTSO | a_1 and a_2 | 5/8, 3/8
Table 3. Best position determined by the five methods for Example 1.
Methods | TSO | NHHO | JBOA | DNTSO | BTSO
u_1 | 0.49782860 | 0.57735026 | 0.57735026 | 0.57735026 | 0.57735026
u_2 | 0.45355774 | 0.57735026 | 0.57735026 | 0.57735026 | 0.57735026
u_3 | 0.80769890 | 0.57735026 | 0.57735026 | 0.57735026 | 0.57735026
u_4 | 0.28302776 | -0.28867513 | -0.28867513 | -0.28867513 | -0.28867513
Table 4. Comparison results of the proposed and alternative methods for Example 1.
Methods | ||F_1(u^(1))|| | ||F_1(u^(2))|| | ||F_1(u^(3))|| | CPU Time (in s)
BTSO | 2.6830 × 10^{-03} | 1.1693 × 10^{-13} | 1.5102 × 10^{-48} | 0.003427
DNTSO | 2.3636 × 10^{-02} | 3.0300 × 10^{-08} | 1.2456 × 10^{-18} | 0.084532
JBOA | 0.04076542 | 2.7277 × 10^{-07} | 3.6671 × 10^{-27} | 0.943524
NHHO | 0.01084531 | 1.5797 × 10^{-04} | 2.9298 × 10^{-12} | 0.546354
TSO | 0.11647981 | 0.07786432 | 0.06530765 | 0.234536
Table 5. Best position determined by the five methods for Example 2.
Methods | TSO | NHHO | JBOA | DNTSO | BTSO
u_1 | 0.31612673 | 0.25783336 | 0.25783336 | 0.25783336 | 0.25783336
u_2 | 0.35428080 | 0.38109708 | 0.38109708 | 0.38109708 | 0.38109708
u_3 | 0.25472510 | 0.27874494 | 0.27874494 | 0.27874494 | 0.27874494
u_4 | 0.24716309 | 0.20066892 | 0.20066892 | 0.20066892 | 0.20066892
u_5 | 0.37472951 | 0.44525134 | 0.44525134 | 0.44525134 | 0.44525134
u_6 | 0.09274166 | 0.14918388 | 0.14918388 | 0.14918388 | 0.14918388
u_7 | 0.42519527 | 0.43200968 | 0.43200968 | 0.43200968 | 0.43200968
u_8 | 0.16204784 | 0.07340269 | 0.07340269 | 0.07340269 | 0.07340269
u_9 | 0.32317504 | 0.34596676 | 0.34596676 | 0.34596676 | 0.34596676
u_10 | 0.42389006 | 0.42732625 | 0.42732625 | 0.42732625 | 0.42732625
Table 6. Comparison results of the proposed and alternative methods for Example 2.
Methods | ||F_2(u^(1))|| | ||F_2(u^(2))|| | ||F_2(u^(3))|| | CPU Time (in s)
BTSO | 9.7918 × 10^{-02} | 3.7169 × 10^{-09} | 1.8527 × 10^{-38} | 0.004326
DNTSO | 4.7572 × 10^{-03} | 1.8819 × 10^{-08} | 6.4038 × 10^{-19} | 0.042043
JBOA | 0.28032142 | 3.1570 × 10^{-04} | 5.2945 × 10^{-16} | 0.342352
NHHO | 1.8265 × 10^{-02} | 1.5079 × 10^{-06} | 2.7576 × 10^{-15} | 0.653423
TSO | 1.5463 × 10^{+03} | 3.72143987 | 0.61440024 | 0.183425
Table 7. Comparison results of the proposed and alternative methods for Example 3.
Methods | ||F_3(v^(1))|| | ||F_3(v^(2))|| | ||F_3(v^(3))|| | CPU Time (in s)
BTSO | 7.8522 × 10^{-10} | 4.6208 × 10^{-44} | 1.6157 × 10^{-44} | 0.024526
DNTSO | 7.6387 × 10^{-04} | 2.9161 × 10^{-10} | 2.0461 × 10^{-22} | 0.063454
JBOA | 4.9000 × 10^{-03} | 2.4430 × 10^{-13} | 4.2721 × 10^{-29} | 0.438910
NHHO | 6.6025 × 10^{-04} | 4.5504 × 10^{-07} | 2.7576 × 10^{-16} | 0.234546
TSO | 16.89494159 | 8.26683120 | 4.67676694 | 0.339687
Table 8. Best position determined by the five methods for Example 3.
Methods | TSO | NHHO | JBOA | DNTSO | BTSO
v_1 | 16.09756557 | 16.76091594 | 16.76091594 | 16.76091594 | 16.76091594
v_2 | 15.91109183 | 16.51427713 | 16.51427713 | 16.51427713 | 16.51427713
v_3 | 15.84018188 | 16.26028620 | 16.26028620 | 16.26028620 | 16.26028620
v_4 | 15.91815889 | 15.99947232 | 15.99947232 | 15.99947232 | 15.99947232
v_5 | 15.95688901 | 15.73276840 | 15.73276840 | 15.73276840 | 15.73276840
v_6 | 15.91655001 | 15.46161453 | 15.46161453 | 15.46161453 | 15.46161453
v_7 | 15.94219606 | 15.18806300 | 15.18806300 | 15.18806300 | 15.18806300
v_8 | 15.93074968 | 14.91489655 | 14.91489655 | 14.91489655 | 14.91489655
v_9 | 15.82418195 | 14.64575740 | 14.64575740 | 14.64575740 | 14.64575740
v_10 | 15.87310816 | 14.38528746 | 14.38528746 | 14.38528746 | 14.38528746
v_11 | 15.86502284 | 14.13928125 | 14.13928125 | 14.13928125 | 14.13928125
v_12 | 15.91235339 | 13.91485519 | 13.91485519 | 13.91485519 | 13.91485519
v_13 | 15.90195018 | 13.72064088 | 13.72064088 | 13.72064088 | 13.72064088
v_14 | 15.90088751 | 13.56701542 | 13.56701542 | 13.56701542 | 13.56701542
v_15 | 15.93691257 | 13.46639115 | 13.46639115 | 13.46639115 | 13.46639115
v_16 | 15.94903378 | 13.43360051 | 13.43360051 | 13.43360051 | 13.43360051
v_17 | 16.16704139 | 13.48643256 | 13.48643256 | 13.48643256 | 13.48643256
v_18 | 16.02141321 | 13.64640984 | 13.64640984 | 13.64640984 | 13.64640984
v_19 | 15.39562490 | 13.93994744 | 13.93994744 | 13.93994744 | 13.93994744
Table 9. Comparison results of the proposed and alternative methods for Example 4.
Methods | ||F_4(v^(1))|| | ||F_4(v^(2))|| | ||F_4(v^(3))|| | CPU Time (in s)
BTSO | 2.3315 × 10^{-10} | 5.3348 × 10^{-38} | 1.3425 × 10^{-38} | 0.054435
DNTSO | 1.8304 × 10^{-06} | 2.9920 × 10^{-12} | 2.9324 × 10^{-24} | 0.068354
JBOA | 5.6874 × 10^{-04} | 8.3844 × 10^{-13} | 2.9420 × 10^{-28} | 0.675991
NHHO | 2.0879 × 10^{-02} | 2.6697 × 10^{-08} | 4.5454 × 10^{-16} | 0.754664
TSO | 10.20180296 | 5.4635 × 10^{-04} | 1.5625 × 10^{-05} | 0.129645
Table 10. Comparison results of the proposed and alternative methods for Example 5.
Methods | ||F_5(v^(1))|| | ||F_5(v^(2))|| | ||F_5(v^(3))|| | CPU Time (in s)
BTSO | 1.3354 × 10^{-09} | 2.3345 × 10^{-40} | 1.2376 × 10^{-40} | 0.043867
DNTSO | 7.6654 × 10^{-03} | 1.9100 × 10^{-08} | 4.0465 × 10^{-24} | 0.098476
JBOA | 5.6575 × 10^{-02} | 8.004 × 10^{-10} | 1.3425 × 10^{-32} | 0.675991
NHHO | 3.0054 × 10^{-04} | 5.6476 × 10^{-07} | 2.1349 × 10^{-16} | 0.675991
TSO | 33.13943720 | 20.87543671 | 1.5625 × 10^{-05} | 0.948610
Table 11. Comparison results of the proposed and alternative methods for Example 1.
Methods | Iterations | x_0 = a_1^T | x_0 = a_2^T | x_0 = a_3^T | CPU Time (in s)
NM | 5 | 1.1493 × 10^{-13} | 0.21459 | 0.231987 | 0.274563
BM | 5 | 1.3531 × 10^{-16} | 0.32324 | 1.2732 × 10^{-9} | 0.183425
DNTSO | 5 | 8.2115 × 10^{-27} | 0.002545
BTSO | 5 | 2.7733 × 10^{-33} | 0.001654
Table 12. Comparison results of the proposed and alternative methods for Example 2.
Methods | Iterations | x_0 = b_1^T | x_0 = b_2^T | x_0 = b_3^T | CPU Time (in s)
NM | 5 | 4.3583 × 10^{-11} | 4.7620 × 10^{-16} | 4.0393 × 10^{-07} | 0.342796
BM | 5 | 5.3411 × 10^{-17} | 8.9867 × 10^{-16} | 5.3411 × 10^{-17} | 0.254956
DNTSO | 5 | 3.0913 × 10^{-31} | 0.003425
BTSO | 5 | 2.8527 × 10^{-33} | 0.001350
Table 13. Comparison results of the proposed and alternative methods for Example 3.
Methods | Iterations | x_0 = (0.8, …)^T | x_0 = (0.3, …)^T | x_0 = (0.1, …)^T | CPU Time (in s)
NM | 5 | 3.7275 × 10^{-11} | 3.8293 × 10^{-11} | 7.4303 × 10^{-11} | 0.435268
BM | 5 | 6.2552 × 10^{-15} | 5.9045 × 10^{-15} | 5.8285 × 10^{-15} | 0.325467
DNTSO | 5 | 3.0103 × 10^{-24} | 0.043562
BTSO | 5 | 4.3701 × 10^{-44} | 0.016734
Table 14. Comparison results of the proposed and alternative methods for Example 4.
Methods | Iterations | x_0 = (0.3, …)^T | x_0 = (0.8, …)^T | x_0 = (0.5, …)^T | CPU Time (in s)
NM | 5 | 4.8262 × 10^{-13} | 1.0329 × 10^{-12} | 1.0138 × 10^{-12} | 0.983425
BM | 5 | 2.3845 × 10^{-16} | 3.4235 × 10^{-16} | 3.4592 × 10^{-17} | 0.534267
DNTSO | 5 | 9.5670 × 10^{-25} | 0.033425
BTSO | 5 | 4.9693 × 10^{-38} | 0.013425
Table 15. Comparison results of the proposed and alternative methods for Example 5.
Methods | Iterations | x_0 = (0.1, …)^T | x_0 = (0.9, …)^T | x_0 = (0.5, …)^T | CPU Time (in s)
NM | 5 | 2.5463 × 10^{-08} | 2.6473 × 10^{-16} | 2.5463 × 10^{-12} | 0.653425
BM | 5 | 1.6327 × 10^{-12} | 3.4235 × 10^{-18} | 3.4592 × 10^{-12} | 0.373445
DNTSO | 5 | 2.6372 × 10^{-26} | 0.063425
BTSO | 5 | 2.4536 × 10^{-40} | 0.014983
Table 16. Average (mean) of F_1–F_5 in 30 runs.
Methods | F_1 | F_2 | F_3 | F_4 | F_5
TSO | 0.00199 | 0.02492 | 1.10721 | 0.00244 | 0.00629
NHHO | 2.6919 × 10^{-20} | 6.1669 × 10^{-18} | 1.0968 × 10^{-18} | 2.4556 × 10^{-15} | 2.2036 × 10^{-15}
JBOA | 9.4074 × 10^{-27} | 3.0699 × 10^{-18} | 3.0720 × 10^{-29} | 1.7229 × 10^{-26} | 1.6627 × 10^{-27}
DNTSO | 2.6214 × 10^{-31} | 1.3597 × 10^{-31} | 1.4185 × 10^{-22} | 1.2463 × 10^{-24} | 1.6441 × 10^{-24}
BTSO | 2.6767 × 10^{-47} | 1.8343 × 10^{-37} | 1.5217 × 10^{-43} | 6.1422 × 10^{-38} | 4.1196 × 10^{-42}
Table 17. Standard deviation of F_1–F_5 in 30 runs.
Methods | F_1 | F_2 | F_3 | F_4 | F_5
TSO | 0.00176 | 9.2712 × 10^{-10} | 0.02236 | 0.00512 | 0.01158
NHHO | 4.4323 × 10^{-20} | 1.0952 × 10^{-17} | 1.6524 × 10^{-18} | 2.8375 × 10^{-15} | 2.8325 × 10^{-15}
JBOA | 9.5155 × 10^{-27} | 4.5243 × 10^{-18} | 1.0654 × 10^{-29} | 2.7519 × 10^{-26} | 2.5683 × 10^{-27}
DNTSO | 3.7345 × 10^{-31} | 2.3899 × 10^{-31} | 2.1661 × 10^{-22} | 1.5813 × 10^{-24} | 1.6447 × 10^{-24}
BTSO | 3.4577 × 10^{-47} | 3.3002 × 10^{-37} | 1.6262 × 10^{-43} | 1.1932 × 10^{-38} | 2.6860 × 10^{-42}

