Solving systems of nonlinear equations is a common challenge in various engineering fields, including weather forecasting, petroleum geological prospecting, computational mechanics, and control systems [1]. A system of nonlinear equations can be written as $F(\mathbf{x}) = \mathbf{0}$, where $F = (f_1, f_2, \ldots, f_n)^{T}: \mathbb{R}^{n} \to \mathbb{R}^{n}$, or, componentwise, as $f_i(x_1, x_2, \ldots, x_n) = 0$ for $i = 1, 2, \ldots, n$, where $f_1, f_2, \ldots, f_n$ are the coordinate functions of $F$. Finding solutions to such systems is generally difficult, as no single algorithm is both efficient and reliable for all cases. Widely used schemes, such as Newton's method [2], Halley's procedure [3], Ostrowski's method [4], and Jarratt's scheme [5], have been studied extensively, but their convergence and performance can be highly sensitive to the initial guess: if the initial guess is unsuitable, these algorithms may fail to converge. Yet selecting an appropriate initial guess for most nonlinear systems is itself a challenging task.
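To make this sensitivity concrete, the following minimal sketch implements the classical Newton iteration for a small illustrative system; the system, tolerance, and starting points are our own choices and are not taken from the paper.

```python
import numpy as np

# Illustrative 2x2 system (not taken from the paper):
#   f1(x, y) = x^2 + y^2 - 4
#   f2(x, y) = exp(x) + y - 1
def F(v):
    x, y = v
    return np.array([x**2 + y**2 - 4.0, np.exp(x) + y - 1.0])

def J(v):
    # Analytic Jacobian of F.
    x, y = v
    return np.array([[2.0 * x, 2.0 * y],
                     [np.exp(x), 1.0]])

def newton(x0, tol=1e-12, max_iter=100):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            return x
        # Newton step: solve J(x) * dx = -F(x), then update x <- x + dx.
        x = x + np.linalg.solve(J(x), -Fx)
    raise RuntimeError("no convergence from this initial guess")

print(newton([1.0, 1.0]))   # a reasonable guess converges to a root
# newton([0.0, 0.0])        # here J is singular and the very first step fails
```

As the commented-out call shows, an unsuitable starting point can make the iteration break down immediately, which is exactly the weakness the hybrid methods discussed below aim to remove.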
With the development of optimization algorithms such as Particle Swarm Optimization (PSO) [6], Differential Evolution (DE) [7], Gray Wolf Optimization (GWO) [8], and the Butterfly Optimization Algorithm (BOA) [9], there has been considerable interest in utilizing these techniques to solve systems of nonlinear equations. The advancement of evolutionary algorithms, in particular, presents novel opportunities to address the challenges typically associated with solving nonlinear systems. A significant advantage of these methods is their capacity to overcome the common issue of selecting an appropriate initial guess, a limitation frequently encountered with traditional approaches. Indeed, the system of nonlinear equations (1) can be reinterpreted as the optimization problem in (2).
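The reformulation in (2) is standard in this line of work: a vector $\mathbf{x}^{*}$ solves $F(\mathbf{x}) = \mathbf{0}$ precisely when it globally minimizes a nonnegative residual objective. Assuming the common sum-of-squares choice (some works instead use $\sum_{i} |f_i(\mathbf{x})|$ or $\lVert F(\mathbf{x}) \rVert$), the optimization problem takes roughly the form

$$\min_{\mathbf{x} \in \mathbb{R}^{n}} \Phi(\mathbf{x}), \qquad \Phi(\mathbf{x}) = \sum_{i=1}^{n} f_i(\mathbf{x})^{2},$$

where $\Phi(\mathbf{x}^{*}) = 0$ if and only if $F(\mathbf{x}^{*}) = \mathbf{0}$. This is what allows population-based metaheuristics to search for roots without derivative information or a user-supplied initial guess.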
The hybridization of iterative methods with optimization algorithms becomes increasingly important when addressing complex functions, including various engineering problems. A hybrid algorithm can mitigate the limitations of one approach while leveraging the advantages of the other, so selecting an appropriate combination of algorithms is a crucial first step. The concept of combining evolutionary algorithms with iterative methods for solving complex equations was introduced in 1988 by Karr et al. [1], who used a Genetic Algorithm to provide an initial estimate for Newton's method. However, because its results were inefficient, this early hybridization did not attract much attention at the time. Two decades later, in 2008, Luo et al. [10] addressed complex nonlinear equations by integrating Quasi-Newton and Chaos Optimization algorithms; since the mathematical convergence of the Chaos Optimization algorithm had not yet been established, the convergence analysis of their method was not addressed. More recently, in 2021, Sihwail et al. [11] developed a hybrid algorithm called NHHO, which combined Newton's method with Harris Hawks Optimization to solve arbitrary systems of nonlinear equations; their work included non-differentiable functions to demonstrate the method's efficacy. In 2022, Sihwail et al. [12] developed a hybrid method combining Jarratt's method and the Butterfly Optimization Algorithm (JBOA) for solving systems of nonlinear equations. In this method, however, the Butterfly algorithm may become trapped in local optima, limiting exploration of the search space and potentially compromising the accuracy of the solution. Moreover, because Jarratt's method requires derivatives at two points, computing the derivative of the function at each iteration can be difficult or time-consuming. Later, in 2023, Solaiman et al. [13] formulated a modified hybrid algorithm that integrated Newton's method with the Sperm Swarm Optimization algorithm, achieving enhanced accuracy with fewer iterations. Hence, this paper focuses on constructing hybrid approaches that combine a derivative-free Newton iterative scheme [5] and a fourth-order Newton variant [14] with the Tuna Swarm Optimization (TSO) algorithm [15]. The objective of this work is to demonstrate the advantages of combining both a derivative-based and a derivative-free iterative approach with an optimization algorithm. This integration aims to reduce the computational complexity of the iterative method, enhance its accuracy in solving nonlinear systems, and lower the overall computation time. Moreover, it addresses several limitations of Newton's method, including divergence, entrapment in local optima, and the difficulty of selecting an initial point, and it solves large systems with greater accuracy and efficiency.
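As a rough illustration of the hybrid idea, and not of the authors' actual TSO-based algorithms, the sketch below uses a simple random search over a bounding box as a stand-in for a swarm optimizer to supply an initial guess, which a Newton-type iteration with a finite-difference Jacobian then refines; the example system, bounds, and parameters are illustrative choices.

```python
import numpy as np

def residual_norm(F, x):
    # Objective minimized by the global stage: the Euclidean norm of F(x).
    return np.linalg.norm(F(x))

def global_search(F, bounds, n_samples=2000, seed=0):
    # Generic stand-in for a swarm optimizer such as TSO: sample the search
    # box uniformly and keep the point with the smallest residual norm.
    rng = np.random.default_rng(seed)
    lo = np.asarray(bounds[0], dtype=float)
    hi = np.asarray(bounds[1], dtype=float)
    best_x, best_val = None, np.inf
    for _ in range(n_samples):
        x = rng.uniform(lo, hi)
        val = residual_norm(F, x)
        if val < best_val:
            best_x, best_val = x, val
    return best_x

def newton_refine(F, x0, h=1e-7, tol=1e-12, max_iter=50):
    # Newton-type refinement with a forward-difference Jacobian, so no
    # analytic derivatives are needed (in the spirit of a derivative-free scheme).
    x = np.asarray(x0, dtype=float)
    n = x.size
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        Jfd = np.empty((n, n))
        for j in range(n):
            xh = x.copy()
            xh[j] += h
            Jfd[:, j] = (F(xh) - Fx) / h
        x = x + np.linalg.solve(Jfd, -Fx)
    return x

# Usage on the illustrative system from the earlier sketch:
F_example = lambda v: np.array([v[0]**2 + v[1]**2 - 4.0, np.exp(v[0]) + v[1] - 1.0])
x0 = global_search(F_example, bounds=([-3.0, -3.0], [3.0, 3.0]))
print(newton_refine(F_example, x0))
```

In the proposed methods, the global-search stage is TSO and the refinement stage is the derivative-free Newton scheme or the fourth-order Newton variant. The structure of the paper and its principal contributions are outlined below.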
Section 1 presents the introduction to the work.
Section 2 includes the development of the hybrid iterative techniques using the derivative-free Newton iterative technique, Newton's fourth-order variant, and the Tuna Swarm Optimization algorithm, and describes their operational mechanism.
Section 3 presents a comparison of the numerical results of the proposed algorithms with the original optimization algorithm TSO, the derivative-free Newton method (NM), the fourth-order variant of Newton's method (BM), the Newton–Harris Hawks Optimization algorithm (NHHO), and the Jarratt–Butterfly Optimization technique (JBOA). The comparison is based on accuracy, stability, fitness value, convergence speed, and computational time.
Section 4 includes the convergence analysis and graphical representation of the results.
Finally, Section 5 presents some conclusions.