Article

Two Modified Single-Parameter Scaling Broyden–Fletcher–Goldfarb–Shanno Algorithms for Solving Nonlinear System of Symmetric Equations

School of Mathematics and Statistics, Central South University, Changsha 410083, China
* Author to whom correspondence should be addressed.
Symmetry 2021, 13(6), 970; https://doi.org/10.3390/sym13060970
Submission received: 23 April 2021 / Revised: 25 May 2021 / Accepted: 26 May 2021 / Published: 30 May 2021

Abstract

In this paper, we develop two algorithms to solve nonlinear systems of symmetric equations. The first is based on modifications of two Broyden–Fletcher–Goldfarb–Shanno (BFGS) methods, and one of its advantages is that it is well suited to solving small-scale systems of nonlinear symmetric equations effectively. The second algorithm chooses new search directions by incorporating an approximation method for computing the gradients and their differences into the determination of the search directions of the first algorithm. In essence, the second algorithm can be viewed as an extension of the conjugate gradient method recently proposed by Lv et al. for solving unconstrained optimization problems. We prove that these search directions are sufficiently descent for the approximate squared residual of the equations, independently of the line search rule used. Global convergence of the two algorithms is established under mild assumptions. To test the algorithms, we apply them to a number of benchmark test problems. Numerical results indicate that the developed algorithms outperform similar algorithms available in the literature.

1. Introduction

In this paper, we study solution methods for the following nonlinear system of symmetric equations:
$$F(x) = 0, \qquad (1)$$
where $F : \mathbb{R}^n \to \mathbb{R}^n$ is a continuously differentiable function whose Jacobian $J(x) = \nabla F(x)$ is symmetric, i.e., $J(x) = J(x)^T$. Such a problem is closely related to many scientific problems, such as unconstrained optimization problems, equality constrained mathematical programming problems, discretized two-point boundary value problems, and discretized elliptic boundary value problems (see Chapter 1 in [1]). For example, when $F$ is the gradient mapping of an objective function $f : \mathbb{R}^n \to \mathbb{R}$, (1) is just the first-order necessary condition for a local minimizer of the following problem:
$$\min\{f(x) : x \in \mathbb{R}^n\}. \qquad (2)$$
Consider also the equality constrained mathematical programming problem
$$\min f(z) \quad \text{s.t.} \quad h(z) = 0, \qquad (3)$$
where $h : \mathbb{R}^n \to \mathbb{R}^m$ is a vector-valued function. The Karush–Kuhn–Tucker (KKT) conditions (see Chapter 8 in [2]) for Problem (3) are also of the form (1) with $x = (z, v) \in \mathbb{R}^{n+m}$ and
$$F(z, v) = \begin{pmatrix} \nabla f(z) + \nabla h(z) v \\ h(z) \end{pmatrix}. \qquad (4)$$
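The symmetry claim for this KKT mapping can be checked directly. Assuming $f$ and $h$ are twice continuously differentiable, the Jacobian of the mapping above is

```latex
J(z, v) = \nabla F(z, v) =
\begin{pmatrix}
\nabla^2 f(z) + \sum_{i=1}^{m} v_i \nabla^2 h_i(z) & \nabla h(z) \\
\nabla h(z)^{T} & 0
\end{pmatrix},
```

which is symmetric, since the Hessians in the upper-left block are symmetric and the off-diagonal blocks are transposes of each other.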
Among the various methods for solving (1), the Newton method needs to compute the Jacobian matrix $J(x)$ and requires $J(x)$ to be nonsingular at each iterate. Due to these stringent requirements, the Newton method is not applicable in general.
Li and Fukushima [3] proposed a Gauss–Newton-based method to solve the symmetric system of equations, which ensures descent of an approximate squared residual of $F(x)$. Gu et al. [4] modified the method in [3] so that the residual $\|F(x)\|$ itself decreases. As a generalization of the method in [5] for smooth unconstrained optimization, Zhou [6] presented an inexact modified BFGS method to solve the symmetric system of equations by approximately computing the gradient of the squared residual. In [7], Wang and Zhu proposed an inexact Newton–GMRES (generalized minimal residual) subspace method without a line search technique for solving symmetric nonlinear equations, in which the iterative direction is obtained by solving the Newton equation of the system of nonlinear equations with the GMRES algorithm. Yuan and Yao [8] also proposed a BFGS method for solving symmetric nonlinear equations; their method possesses the good property that the generated sequence of quasi-Newton matrices is positive definite. However, since the search direction in these methods is generated by solving a system of linear equations, none of them is applicable to large-scale problems.
For large-scale symmetric nonlinear equations, Li and Wang [9] proposed a modified Fletcher–Reeves derivative-free method, as an extension of the conjugate gradient method [10]. Similarly, as an extension of descent conjugate gradient methods in [11] for unconstrained optimization, Xiao et al. [12] presented a family of derivative-free methods for symmetric equations, and established the global convergence under some appropriate conditions, and showed their effectiveness by numerical experiments. Zhou and Shen [13] presented an efficient iterative method for solving large-scale symmetric equations, as an extension of the three-term PRP conjugate gradient method in [14] for solving unconstrained optimization problems. Liu and Feng [15] proposed a norm descent derivative-free algorithm for solving large-scale nonlinear symmetric equations, as an extension of the three-term conjugate gradient method in [16] for solving unconstrained optimization problems. More details can be seen in [17,18,19,20].
Our motivation in this paper is to develop two algorithms to solve Problem (1). Firstly, based on the single-parameter scaling memoryless BFGS method proposed by Lv et al. [21] and the modification of the BFGS method in [22], we develop an efficient algorithm (MSBFGS) that incorporates the approximation method for computing the gradients in [3] and their differences in [6], so that it can solve the system of nonlinear Equations (1) more efficiently. Secondly, since MSBFGS involves the computation and storage of matrices, it is not applicable to large-scale systems of nonlinear equations. Therefore, by giving an inverse formula for the update matrix in MSBFGS, we develop another method (MSBFGS2) that can solve large-scale systems of nonlinear Equations (1). In addition to establishing the convergence of the two algorithms, we also demonstrate their strong numerical performance on benchmark test problems from the literature.
The rest of this paper is organized as follows. In Section 2, we first state the idea to propose two methods for solving the nonlinear symmetric equations. Then, two new algorithms are developed. Global convergence of algorithms is established in Section 3. Section 4 is devoted to numerical tests. Some conclusions are drawn in Section 5.
Some words about our notation: throughout the paper, the space $\mathbb{R}^n$ is equipped with the Euclidean norm $\|\cdot\|$, the transpose of a matrix is denoted by $(\cdot)^T$, and $F(x_k)$ and $J(x_k)$ are abbreviated as $F_k$ and $J_k$, respectively.

2. Development of Algorithm

In this section, we first briefly recall the single-parameter scaling BFGS method [21] for solving the following unconstrained optimization problem:
$$\min f(x), \quad x \in \mathbb{R}^n, \qquad (5)$$
where $f : \mathbb{R}^n \to \mathbb{R}$ is continuously differentiable and its gradient is available. This method generates a sequence $\{x_k\}$ satisfying
$$x_{k+1} = x_k + \alpha_k d_k, \qquad (6)$$
where $k \ge 0$, $x_0$ is the initial point, $\alpha_k > 0$ is a step length obtained by some line search rule, and $d_k$ is a search direction defined by
$$d_k = -B_k^{-1} \nabla f_k, \qquad B_{k+1} = B_k - \frac{B_k s_k s_k^T B_k}{s_k^T B_k s_k} + \gamma_k \frac{y_k y_k^T}{y_k^T s_k}, \qquad (7)$$
where $s_k = x_{k+1} - x_k$ and $y_k = \nabla f_{k+1} - \nabla f_k$.
By minimizing the measure function introduced by Byrd and Nocedal [23],
$$\psi(B) = \mathrm{tr}(B) - \ln(\det(B)), \qquad (8)$$
Lv et al. [21] obtained
$$\gamma_k = \frac{y_k^T s_k}{\|y_k\|^2}. \qquad (9)$$
In 2006, Ref. [22] proposed a modification of the BFGS algorithm for unconstrained nonconvex optimization, in which the matrix $B_k$ is updated by the formula
$$B_{k+1} = B_k - \frac{B_k s_k s_k^T B_k}{s_k^T B_k s_k} + \frac{y_k^s (y_k^s)^T}{(y_k^s)^T s_k}, \qquad (10)$$
where $y_k^s = y_k + t_k \|\nabla f_k\|^r s_k$ with $t_k > 0$ and $r > 0$.
Due to their impressive numerical efficiency, we now attempt to modify the aforementioned methods to solve the symmetric system of nonlinear Equation (1).
Define $f$ in (5) as
$$f(x) = \frac{1}{2}\|F(x)\|^2, \quad x \in \mathbb{R}^n. \qquad (11)$$
Then, for this objective function, any global minimizer of Problem (5) at which $f$ vanishes is a solution of Problem (1). If an algorithm stops at a global minimizer $x_k$, i.e., $f(x_k) = 0$, then the algorithm has found a solution of (1).
By the symmetry of $J$, it holds that
$$\nabla f(x) = J(x)^T F(x) = J(x) F(x). \qquad (12)$$
In [3], Li and Fukushima suggested that $\nabla f(x)$ be approximated by
$$g(x, \alpha) = \frac{F(x + \alpha F(x)) - F(x)}{\alpha}, \qquad (13)$$
where $\alpha > 0$, and it can be proved that
$$\lim_{\alpha \|F(x)\| \to 0} g(x, \alpha) = J(x)^T F(x) = \nabla f(x). \qquad (14)$$
In other words, when $\alpha \|F(x)\|$ is sufficiently small, the vector $g(x, \alpha)$ defined by (13) is a good approximation to $\nabla f(x)$.
In the actual computation, [3] computed $g_k$ by
$$g_k = g(x_k, \alpha_{k-1}), \qquad (15)$$
where $\alpha_{k-1}$ is the step size at the previous iterate $x_{k-1}$. In general, the convergence of the algorithms ensures that $\liminf_{k \to \infty} \|F_k\| = 0$.
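In code, the approximation (13)/(15) costs only one extra evaluation of $F$ per iteration. A Python/NumPy sketch (the function name is ours):

```python
import numpy as np

def g_approx(F, x, alpha):
    """Approximate the gradient of f(x) = 0.5*||F(x)||^2 via (13):
    g(x, alpha) = (F(x + alpha*F(x)) - F(x)) / alpha.
    This works because J(x) is symmetric, so grad f = J(x)^T F(x) = J(x) F(x)."""
    Fx = F(x)
    return (F(x + alpha * Fx) - Fx) / alpha
```

For instance, with $F(x) = e^x - 1$ (Problem 1 in Section 4), the exact gradient of $f$ is $J(x)F(x)$ with $J(x) = \mathrm{diag}(e^{x_i})$, and `g_approx` matches it up to $O(\alpha)$.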
Based on the work of Li and Fukushima [3], Zhou [6] proposed a modified BFGS method to solve (1). The modified BFGS update formula is given by
$$B_{k+1} = B_k - \frac{B_k s_k s_k^T B_k}{s_k^T B_k s_k} + \frac{\delta_k \delta_k^T}{\delta_k^T s_k}, \qquad (16)$$
where
$$\delta_k = \bar{\delta}_k + \max\left\{0, \frac{-\bar{\delta}_k^T s_k}{\|s_k\|^2}\right\} s_k + \mu \|F_k\| s_k, \quad \mu > 0, \qquad (17)$$
and
$$\bar{\delta}_k = g(x_{k+1}, \alpha_{k-1}) - g(x_k, \alpha_{k-1}). \qquad (18)$$
Based on the ideas of [6,21,22], we now propose a modified single-parameter scaling BFGS method to solve (1). The modified BFGS update formula is given by
$$B_{k+1} = B_k - \frac{B_k s_k s_k^T B_k}{s_k^T B_k s_k} + \gamma_k \frac{\delta_k \delta_k^T}{\delta_k^T s_k}, \qquad (19)$$
where $\gamma_k > 0$ and
$$\delta_k = \begin{cases} \bar{\delta}_k + t\|F_k\|^r s_k, & \text{if } s_k^T \bar{\delta}_k > 0, \\ \bar{\delta}_k - \dfrac{\bar{\delta}_k^T s_k}{\|s_k\|^2} s_k + t\|F_k\|^r s_k, & \text{otherwise.} \end{cases} \qquad (20)$$
It is clear that $\gamma_k$ also minimizes (8), where $B_k$ is defined by (19), if we compute $\gamma_k$ by
$$\gamma_k = \frac{\delta_k^T s_k}{\|\delta_k\|^2}. \qquad (21)$$
Moreover, we obtain an approximate quasi-Newton direction
$$d_k = -B_k^{-1} g_k, \qquad (22)$$
where $g_k$ is the approximate gradient defined by (15).
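A Python/NumPy sketch of the update (19)–(21) and the direction (22) may make the construction concrete; the function names and default parameter values ($t = 1.03$, $r = 0.5$, the values chosen later in Section 4) are ours:

```python
import numpy as np

def msbfgs_update(B, s, delta_bar, Fnorm, t=1.03, r=0.5):
    """One MSBFGS matrix update, following (19)-(21) (illustrative sketch).
    delta_bar is the approximate gradient difference (18); Fnorm is ||F_k||."""
    # (20): safeguard delta so that delta^T s > 0
    if s @ delta_bar > 0:
        delta = delta_bar + t * Fnorm**r * s
    else:
        delta = delta_bar - (delta_bar @ s) / (s @ s) * s + t * Fnorm**r * s
    # (21): scaling parameter minimizing the measure function (8)
    gamma = (delta @ s) / (delta @ delta)
    Bs = B @ s
    # (19): single-parameter scaling BFGS update
    return B - np.outer(Bs, Bs) / (s @ Bs) + gamma * np.outer(delta, delta) / (delta @ s)

def direction(B, g):
    """Approximate quasi-Newton direction (22): d = -B^{-1} g."""
    return -np.linalg.solve(B, g)
```

As noted in Remark 2 below (25), the safeguard in (20) forces $\delta_k^T s_k > 0$, so the update preserves symmetry and positive definiteness of $B_k$.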
Remark 1.
The $\delta_k$ in (20) is slightly different from that in (17) with $r = 1$. Since $\gamma_k$ in (21) minimizes (8), where $B_k$ is defined by (19), the condition number of $B_k$, which is the quotient of its maximum and minimum eigenvalues, is also reduced. Clearly, a smaller condition number of the search direction matrices theoretically ensures the stability of the algorithm [21]. Numerical experiments will also show that $B_k$ in (19), with $\gamma_k$ defined by (21), is more efficient and robust than that in (16).
Since nonmonotone line search rules can play a critical role in solving complicated nonconvex optimization problems, we use the nonmonotone line search in [3,6] to determine a step size $\alpha_k$ along the direction $d_k$. Specifically, let $\sigma_1, \sigma_2, \rho, \rho_1 \in (0,1)$ and $\eta > 0$ be five given constants, and let $\{\eta_k\}$ be a given positive sequence such that
$$\sum_{k=0}^{+\infty} \eta_k \le \eta < +\infty. \qquad (23)$$
We search for a step size $\alpha_k$ satisfying
$$\alpha_k = \begin{cases} 1, & \text{if } \|F(x_k + d_k)\| \le \rho_1 \|F_k\|, \\ \max\left\{\rho^i : \|F(x_k + \rho^i d_k)\|^2 \le (1 + \eta_k)\|F_k\|^2 - \sigma_1 \|\rho^i F_k\|^2 - \sigma_2 \|\rho^i d_k\|^2, \ i = 1, 2, \ldots \right\}, & \text{otherwise.} \end{cases} \qquad (24)$$
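The rule (24) is plain backtracking with a relaxed (nonmonotone) acceptance test. A Python/NumPy sketch, with an iteration cap added as a practical safeguard (the cap and the function name are ours):

```python
import numpy as np

def nonmonotone_step(F, x, d, Fk_norm, eta_k, sigma1=0.01, sigma2=0.01,
                     rho=0.5, rho1=0.95, max_backtracks=50):
    """Backtracking rule (24) (illustrative sketch). The unit step is tried
    first; otherwise alpha = rho^i is shrunk until the relaxed decrease holds."""
    if np.linalg.norm(F(x + d)) <= rho1 * Fk_norm:
        return 1.0
    for i in range(1, max_backtracks + 1):
        a = rho**i
        lhs = np.linalg.norm(F(x + a * d))**2
        rhs = (1.0 + eta_k) * Fk_norm**2 \
              - sigma1 * np.linalg.norm(a * F(x))**2 \
              - sigma2 * np.linalg.norm(a * d)**2
        if lhs <= rhs:
            return a
    return rho**max_backtracks  # fallback; in theory the loop terminates
```

The relaxation term $(1 + \eta_k)\|F_k\|^2$ is what makes the rule nonmonotone: a mild increase of the residual is tolerated as long as the $\eta_k$ remain summable, as required by (23).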
With the above preparation, we are in a position to develop an algorithm to solve Problem (1). We now present its computer procedure as follows.
Remark 2.
In fact, for all $k > 0$, if $B_0$ is symmetric and positive definite, then $B_k$ in (19) is also symmetric and positive definite since
$$\delta_k^T s_k \ge t\|F_k\|^r \|s_k\|^2 > 0. \qquad (25)$$
Therefore, the algorithm is well defined. From the definition of $\delta_k$ in (20), we can also obtain
$$\|\delta_k\|^2 \ge t^2 \|F_k\|^{2r} \|s_k\|^2 > 0. \qquad (26)$$
Since Algorithm 1 cannot efficiently solve large-scale nonlinear symmetric equations, based on the work in [3], we will develop another algorithm that involves neither matrix operations nor matrix inversions. When we set $B_k = I$, the inverse matrix of $B_{k+1}$ in (19) can be written as
$$H_{k+1} = I - \frac{\delta_k s_k^T + s_k \delta_k^T}{\delta_k^T s_k} + \left(\frac{1}{\gamma_k} + \frac{\delta_k^T \delta_k}{\delta_k^T s_k}\right)\frac{s_k s_k^T}{\delta_k^T s_k}, \qquad (27)$$
where $\gamma_k$ is the same as in (21), and
$$\delta_k = F(x_k + \xi_k) - F_k, \qquad \xi_k = F_{k+1} - F_k. \qquad (28)$$
In fact, $\delta_k$ and $\xi_k$ are exactly the same as those in [3,15].
Moreover, in order to guarantee that our proposed method generates descent directions, and to further increase its computational efficiency and robustness, we compute the direction by
$$d_k = \begin{cases} -F(x_0) = -g_0, & \text{if } k = 0, \\ -g_k, & \text{if } \delta_{k-1}^T s_{k-1} \le 0, \\ -g_k + \beta_k s_{k-1} + \theta_k \delta_{k-1}, & \text{otherwise,} \end{cases} \qquad (29)$$
where $g_k$ is defined by (15) and
$$\beta_k = \frac{\delta_{k-1}^T g_k}{\delta_{k-1}^T s_{k-1}} - 2\,\frac{\|\delta_{k-1}\|^2}{\delta_{k-1}^T s_{k-1}}\cdot\frac{s_{k-1}^T g_k}{\delta_{k-1}^T s_{k-1}}, \qquad \theta_k = \frac{s_{k-1}^T g_k}{\delta_{k-1}^T s_{k-1}}. \qquad (30)$$
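The direction formula (29)–(30) requires only inner products and vector sums. A Python/NumPy sketch (the function name is ours); a useful sanity check, proved as (65) below, is that the resulting direction satisfies $g_k^T d_k \le -\frac{1}{2}\|g_k\|^2$ whenever $\delta_{k-1}^T s_{k-1} > 0$:

```python
import numpy as np

def direction_msbfgs2(g, s_prev, delta_prev, F0=None, k=0):
    """Matrix-free search direction (29)-(30) (illustrative sketch).
    g is the approximate gradient (15); s_prev and delta_prev come from
    the previous iteration; F0 is F(x_0), used only at k = 0."""
    if k == 0:
        return -F0
    ds = delta_prev @ s_prev
    if ds <= 0:
        return -g                       # safeguard branch of (29)
    # three-term branch of (29), with beta_k and theta_k from (30)
    beta = (delta_prev @ g) / ds - 2.0 * (delta_prev @ delta_prev) * (s_prev @ g) / ds**2
    theta = (s_prev @ g) / ds
    return -g + beta * s_prev + theta * delta_prev
```

The sufficient descent bound follows from the identity $g^T d = -\frac{1}{2}\|g\|^2 - \frac{1}{2}\|g - 2u\|^2$ with $u = \frac{s_{k-1}^T g_k}{\delta_{k-1}^T s_{k-1}}\,\delta_{k-1}$, which holds for this choice of $\beta_k$ and $\theta_k$.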
Remark 3.
Note that the nonmonotone line search (31) is a variant of (24) with σ 1 = 0 .
Remark 4.
Since the search direction of Algorithm 1 at each iteration is an approximate quasi-Newton direction, which involves the solution of a linear system of equations, Algorithm 1 can only efficiently solve small- to medium-scale instances of Problem (1). Instead, the search directions needed in Algorithm 2 are associated only with evaluating the function $F$, without computing or storing its Jacobian matrix. Thus, compared with Algorithm 1, Algorithm 2 is more applicable to solving large-scale systems of nonlinear equations. In addition, two different approximation methods are used to compute the difference of gradients (see (18) and (28)).
Algorithm 1 (Modified Single-Parameter Scaling BFGS Algorithm (MSBFGS))
Step 0. Choose constants $\sigma_1, \sigma_2, \rho, \rho_1, \varepsilon \in (0,1)$, $r > 0$, $t > 0$, and $\alpha_{-1} > 0$. Take a sequence $\{\eta_k\}$ satisfying (23). Arbitrarily choose an initial iterate $x_0 \in \mathbb{R}^n$ and a symmetric positive definite matrix $B_0 \in \mathbb{R}^{n \times n}$. Set $k := 0$.
Step 1. If $\|F_k\| \le \varepsilon$ is satisfied, then the algorithm stops.
Step 2. Compute d k by (22) and (19).
Step 3. Determine a step length α k satisfying (24).
Step 4. Set x k + 1 = x k + α k d k .
Step 5. Set k : = k + 1 , return to Step 1.
Algorithm 2 (Modified Single-Parameter Scaling BFGS Algorithm 2(MSBFGS2))
Step 0. Choose three constants σ , ρ , ε ( 0 , 1 ) . Take a sequence η k satisfying (23). Arbitrarily choose an initial iterate point x 0 R n . Set k = 0 .
Step 1. If $\|F_k\| \le \varepsilon$ is satisfied, then the algorithm stops.
Step 2. Compute d k by (29).
Step 3. Determine a step length $\alpha_k = \max\{\rho^i \mid i = 0, 1, 2, \ldots\}$ satisfying
$$f(x_k + \alpha_k d_k) - f(x_k) \le -\sigma \|\alpha_k d_k\|^2 + \eta_k f(x_k). \qquad (31)$$
Step 4. Set x k + 1 = x k + α k d k .
Step 5. Set k : = k + 1 , return to Step 1.
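For concreteness, the following Python/NumPy sketch assembles Steps 0–5 of Algorithm 2 (MSBFGS2). The function name, the parameter defaults, and the choice $\eta_k = \eta_0/(k+1)^2$ (one summable sequence satisfying (23)) are our illustrative assumptions, not the authors' code:

```python
import numpy as np

def msbfgs2(F, x0, eps=1e-6, sigma=0.01, rho=0.5, eta0=0.5, max_iter=1000):
    """A compact sketch of Algorithm 2 (MSBFGS2): matrix-free directions
    (29)-(30), gradient approximation (13)/(15), and line search (31)."""
    x = x0.copy()
    Fx = F(x)
    alpha_prev = 0.01                   # alpha_{-1} for the gradient approximation
    s_prev = delta_prev = None
    for k in range(max_iter):
        nF = np.linalg.norm(Fx)
        if nF <= eps:                   # Step 1
            break
        g = (F(x + alpha_prev * Fx) - Fx) / alpha_prev   # (13)/(15)
        if k == 0:                      # Step 2: direction (29)-(30)
            d = -Fx
        else:
            ds = delta_prev @ s_prev
            if ds <= 0:
                d = -g
            else:
                beta = (delta_prev @ g) / ds - 2 * (delta_prev @ delta_prev) * (s_prev @ g) / ds**2
                theta = (s_prev @ g) / ds
                d = -g + beta * s_prev + theta * delta_prev
        f = 0.5 * nF**2                 # Step 3: line search (31)
        eta_k = eta0 / (k + 1)**2
        a = 1.0
        while 0.5 * np.linalg.norm(F(x + a * d))**2 - f > -sigma * np.linalg.norm(a * d)**2 + eta_k * f:
            a *= rho
            if a < 1e-12:
                break
        x_new = x + a * d               # Step 4
        F_new = F(x_new)
        s_prev = x_new - x
        xi = F_new - Fx
        delta_prev = F(x + xi) - Fx     # (28)
        x, Fx, alpha_prev = x_new, F_new, a   # Step 5
    return x, np.linalg.norm(F(x))
```

Applied, for example, to Problem 1 below ($F(x) = e^x - 1$, whose Jacobian is diagonal and hence symmetric), the sketch drives the residual below the tolerance in a handful of iterations.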
Remark 5.
By combining the advantages of the three-term conjugate gradient method in [21] with those of the approximation methods for computing the difference of gradients, it is reasonable to expect that the numerical performance of Algorithm 2 is better than that of the algorithm in [21]. In the two subsequent sections, apart from establishing the convergence theory of Algorithm 2, we will also test its efficiency in solving large-scale problems.
Remark 6.
Very recently, Liu and Feng [15] developed an algorithm to solve (1), in which $\beta_k$ and $\theta_k$ in (29) are replaced by
$$\beta_k = \frac{g_k^T \delta_{k-1}}{s_{k-1}^T \delta_{k-1}} - \frac{\|\delta_{k-1}\|^2\, g_k^T s_{k-1}}{(s_{k-1}^T \delta_{k-1})^2}, \qquad \theta_k = \frac{g_k^T s_{k-1}}{s_{k-1}^T \delta_{k-1}}, \qquad (32)$$
respectively. Since (30) and (32) are two similar choices, it is interesting to compare their numerical performance in solving Problem (1).

3. Convergence of Algorithm

In this section, we establish the global convergence of Algorithms 1 and 2. For this purpose, we first define the level set
$$\Omega = \left\{x \,\middle|\, f(x) = \frac{1}{2}\|F(x)\|^2 \le e^{\eta} f(x_0)\right\}. \qquad (33)$$
Clearly, it follows from Step 3 of Algorithm 1 that
$$f(x_{k+1}) = \frac{1}{2}\|F_{k+1}\|^2 \le \frac{1}{2}(1+\eta_k)\|F_k\|^2 \le \frac{1}{2}e^{\eta_k}\|F_k\|^2 \le \cdots \le \frac{1}{2}e^{\sum_{i=0}^{k}\eta_i}\|F_0\|^2 \le \frac{1}{2}e^{\eta}\|F_0\|^2 = e^{\eta} f(x_0).$$
Thus, any sequence $\{x_k\}$ generated by Algorithm 1 belongs to $\Omega$, i.e., $x_k \in \Omega$ for all $k$. In other words, there exists a constant $\Delta > 0$ such that
$$\|F_k\| \le \Delta. \qquad (34)$$
Moreover, since $\{\eta_k\}$ satisfies (23), from Lemma 3.3 in [24] we know that the sequence $\{\|F_k\|\}$ generated by Algorithm 1 converges.
Likewise, from the line search rule of Algorithm 2, we know that the sequence of iterates $\{x_k\}$ generated by Algorithm 2 also belongs to $\Omega$, and the corresponding sequence $\{\|F_k\|\}$ also satisfies (34).
As done in the existing results [13,15,25], we also suppose that F in (1) satisfies the following conditions:
Assumption 1.
The solution set of the problem (1) is nonempty.
Assumption 2.
The level set Ω is bounded.
Assumption 3.
$F$ is continuously differentiable on an open, convex set $V \subset \mathbb{R}^n$ containing the level set $\Omega$, and its Jacobian matrix is symmetric and bounded on $V$; i.e., there exists a positive constant $M$ such that
$$\|J(x)\| \le M, \quad \forall x \in V. \qquad (35)$$
Assumption 4.
$J(x)$ is uniformly nonsingular on $V$; i.e., there exists a positive constant $m$ such that
$$m\|p\| \le \|J(x)p\|, \quad \forall x \in V, \ p \in \mathbb{R}^n. \qquad (36)$$
Clearly, Assumptions 2–4 imply that there exist positive constants $M \ge m > 0$ such that the following statements are true:
(1) For any $x \in V$, $p \in \mathbb{R}^n$,
$$m\|p\| \le \|J(x)p\| \le M\|p\|. \qquad (37)$$
(2) For any $x, y \in V$,
$$m\|x - y\| \le \|F(x) - F(y)\| = \|J(\theta x + (1 - \theta)y)(x - y)\| \le M\|x - y\|, \qquad (38)$$
where $\theta \in (0, 1)$.
(3) For any sequence $\{x_k\} \subset V$,
$$m\|F_k\| \le \|g_k\| = \|J(x_k + \alpha_{k-1} t F_k) F_k\| \le M\|F_k\|, \qquad (39)$$
where $t \in (0, 1)$.
Under Assumptions 2–4, we can prove that Algorithm 1 has the following nice properties.
Lemma 1.
Let $\{B_k\}$ be generated by the BFGS formula (19), where $B_0$ is symmetric and positive definite and $\gamma_k$ is defined by (21). If there exists a positive constant $m_0 > 0$ such that
$$\|F_k\| \ge m_0, \quad \forall k \ge 0, \qquad (40)$$
then for any $p \in (0, 1)$ and $k > 1$, there exist positive constants $\beta_i$, $i = 1, 2, 3, 4$, such that
$$\beta_1 \|s_j\| \le \|B_j s_j\| \le \beta_2 \|s_j\|, \qquad \beta_3 \|s_j\|^2 \le s_j^T B_j s_j \le \beta_4 \|s_j\|^2 \qquad (41)$$
hold for at least $\lceil pk \rceil$ values of $j \in [1, k]$, where $\lceil t \rceil$ is the smallest integer larger than or equal to $t$.
Proof. 
From (8) and (19), we have
$$\begin{aligned}
\psi(B_{k+1}) &= \mathrm{tr}(B_{k+1}) - \ln(\det(B_{k+1})) \\
&= \mathrm{tr}(B_k) - \frac{\|B_k s_k\|^2}{s_k^T B_k s_k} + \gamma_k \frac{\|\delta_k\|^2}{\delta_k^T s_k} - \ln\!\left(\gamma_k \frac{\delta_k^T s_k}{s_k^T B_k s_k}\det(B_k)\right) \\
&= \psi(B_k) - \frac{\|B_k s_k\|^2}{s_k^T B_k s_k} + \gamma_k \frac{\|\delta_k\|^2}{\delta_k^T s_k} - \ln\!\left(\gamma_k \frac{\delta_k^T s_k}{s_k^T B_k s_k}\right). \qquad (42)
\end{aligned}$$
Setting $\cos\theta_k = \dfrac{s_k^T B_k s_k}{\|B_k s_k\|\,\|s_k\|}$ and $q_k = \dfrac{s_k^T B_k s_k}{s_k^T s_k}$, (42) can be rewritten as
$$\psi(B_{k+1}) = \psi(B_k) + \left(\gamma_k \frac{\|\delta_k\|^2}{\delta_k^T s_k} - \ln\!\left(\gamma_k \frac{\delta_k^T s_k}{s_k^T s_k}\right) - 1\right) + \ln\cos^2\theta_k + \left(1 - \frac{q_k}{\cos^2\theta_k} + \ln\frac{q_k}{\cos^2\theta_k}\right). \qquad (43)$$
On the other hand, from (2.11) in [6], we know that
$$\|\delta_k\| \le C_1 \|s_k\|, \qquad (44)$$
where $C_1 > 0$ is a constant. Hence, it follows from (25), (26), (34), (40) and (44) that
$$\frac{\delta_k^T s_k}{s_k^T s_k} \ge t\|F_k\|^r \ge t m_0^r \equiv m_1 > 0, \qquad (45)$$
and
$$m_2 \equiv \frac{t^2 m_0^{2r}}{C_1} \le \frac{1}{\gamma_k} = \frac{\|\delta_k\|^2}{\delta_k^T s_k} \le \frac{C_1^2}{t m_0^r} \equiv M_2. \qquad (46)$$
From (43), (45) and (46), we have
$$\psi(B_{k+1}) \le \psi(B_1) + \left(\frac{M_2}{m_2} - 1 - \ln\frac{m_1}{M_2}\right)k + \sum_{j=1}^{k}\left(\ln\cos^2\theta_j + 1 - \frac{q_j}{\cos^2\theta_j} + \ln\frac{q_j}{\cos^2\theta_j}\right). \qquad (47)$$
Define $\eta_j \ge 0$ by
$$\eta_j = -\ln\cos^2\theta_j - \left(1 - \frac{q_j}{\cos^2\theta_j} + \ln\frac{q_j}{\cos^2\theta_j}\right). \qquad (48)$$
It is clear that $\psi(B_{k+1}) > 0$, since $B_{k+1}$ is symmetric and positive definite. Hence, from (47), we have
$$\frac{1}{k}\sum_{j=1}^{k}\eta_j < \frac{\psi(B_1)}{k} + \frac{M_2}{m_2} - 1 - \ln\frac{m_1}{M_2}. \qquad (49)$$
Let us define $J_k$ to be the set consisting of the $\lceil pk \rceil$ indices corresponding to the $\lceil pk \rceil$ smallest values of $\eta_j$ for $j \le k$, and let $\eta_{m_k}$ denote the largest of the $\eta_j$ for $j \in J_k$. Then
$$\frac{1}{k}\sum_{j=1}^{k}\eta_j \ge \frac{1}{k}\left(\eta_{m_k} + \sum_{j=1, j \notin J_k}^{k}\eta_j\right) \ge \frac{1}{k}\left(\eta_{m_k} + (k - \lceil pk \rceil)\eta_{m_k}\right) \ge \eta_{m_k}(1 - p). \qquad (50)$$
Thus, from (48)–(50) and the fact that
$$1 - \frac{q_j}{\cos^2\theta_j} + \ln\frac{q_j}{\cos^2\theta_j} \le 0,$$
we have
$$-\ln\cos^2\theta_j \le \eta_j \le \frac{1}{1 - p}\left(\psi(B_1) + \frac{M_2}{m_2} - 1 - \ln\frac{m_1}{M_2}\right) \equiv \beta_0, \quad j \in J_k. \qquad (51)$$
It follows from (51) that
$$\cos\theta_j \ge e^{-\beta_0/2}, \quad j \in J_k. \qquad (52)$$
On the other hand, since $\ln\cos^2\theta_j \le 0$, we have
$$1 - \frac{q_j}{\cos^2\theta_j} + \ln\frac{q_j}{\cos^2\theta_j} \ge -\beta_0, \quad j \in J_k. \qquad (53)$$
Let $\mu(t) = 1 - t + \ln t$; then, by simple analysis,
$$\mu(t) \le 0, \qquad \arg\max_{t > 0}\mu(t) = 1, \qquad \lim_{t \to 0^+}\mu(t) = -\infty, \qquad \lim_{t \to \infty}\mu(t) = -\infty.$$
Therefore, there exist positive constants $\bar{\beta}_3$ and $\bar{\beta}_4$ such that the following inequalities hold:
$$\bar{\beta}_3 \le \frac{q_j}{\cos^2\theta_j} \le \bar{\beta}_4, \quad j \in J_k. \qquad (54)$$
Together with (52), we obtain
$$\beta_3 \equiv \bar{\beta}_3 e^{-\beta_0} \le q_j = \frac{s_j^T B_j s_j}{s_j^T s_j} = \frac{q_j}{\cos^2\theta_j}\cos^2\theta_j \le \bar{\beta}_4 \equiv \beta_4, \quad j \in J_k. \qquad (55)$$
Moreover,
$$\beta_3 \le \frac{\|B_j s_j\|}{\|s_j\|} = \frac{q_j}{\cos\theta_j} \le \beta_4 e^{\beta_0/2} \equiv \beta_2, \quad j \in J_k. \qquad (56)$$
Taking $\beta_1 = \beta_3$, we obtain the desired result. □
Remark 7.
From the proof of Lemma 1, we know that even if $\gamma_k$ is not defined by (21), Lemma 1 remains true whenever there exist constants $m_3 > 0$ and $M_3 > 0$ such that $m_3 \le \gamma_k \le M_3$ holds.
By Lemma 1, since the definition of $\bar{\delta}_k$ and the line search rule are exactly the same as those in [6], we can obtain the same convergence result for Algorithm 1 as in [6], stated below without proof.
Theorem 1.
Suppose that Assumptions 1–4 hold, and let $\{x_k\}$ be a sequence generated by Algorithm 1. Then
$$\liminf_{k \to \infty}\|F_k\| = 0. \qquad (57)$$
To establish the global convergence of Algorithm 2, we first prove the following results.
Lemma 2.
Let $\{x_k\}$ be a sequence generated by Algorithm 2 and suppose that Assumptions 1–4 hold. Then
$$\lim_{k \to \infty}\|\alpha_k d_k\| = 0. \qquad (58)$$
Proof. 
Similar to the proof of Lemma 3.1 in [15], we can prove (58). □
Lemma 2 shows that $\lim_{k \to \infty}\|s_k\| = 0$ holds.
Lemma 3.
Let $\{x_k\}$ be a sequence generated by Algorithm 2 and suppose that Assumptions 1–4 hold. Then
$$\|\delta_k\| \le M^2 \|s_k\|. \qquad (59)$$
Additionally, if $\|s_k\| \to 0$, then there exists a constant $\bar{m} > 0$ such that, for all sufficiently large $k$,
$$s_k^T \delta_k \ge \bar{m}\|s_k\|^2. \qquad (60)$$
Proof. 
On the one hand, it follows from (28) and (37) that
$$\begin{aligned}
\|\delta_k\| &= \|F(x_k + \xi_k) - F(x_k)\| = \|J(\theta_1 x_k + (1 - \theta_1)(x_k + \xi_k))\,\xi_k\| \le M\|\xi_k\| \\
&= M\|F(x_{k+1}) - F(x_k)\| = M\|J(\theta_2 x_k + (1 - \theta_2)x_{k+1})(x_{k+1} - x_k)\| \le M^2\|s_k\|, \qquad (61)
\end{aligned}$$
where $\theta_1, \theta_2 \in (0, 1)$. On the other hand, by the mean value theorem and the symmetry of $J$, we have
$$\begin{aligned}
s_k^T \delta_k &= s_k^T\left(F(x_k + \xi_k) - F(x_k)\right) = s_k^T \int_0^1 J(x_k + t\xi_k)\,dt\,\xi_k \\
&= s_k^T \int_0^1 J(x_k + t\xi_k)\,dt \int_0^1 J(x_k + ts_k)\,dt\,s_k \\
&= s_k^T\left(\int_0^1 J(x_k + ts_k)\,dt\right)^2 s_k + s_k^T \int_0^1\left[J(x_k + t\xi_k) - J(x_k + ts_k)\right]dt \cdot \int_0^1 J(x_k + ts_k)\,dt\,s_k \\
&\ge \left\|\int_0^1 J(x_k + ts_k)\,dt\,s_k\right\|^2 - \left\|\int_0^1\left[J(x_k + ts_k) - J(x_k + t\xi_k)\right]dt\right\|\|s_k\| \cdot \left\|\int_0^1 J(x_k + ts_k)\,dt\,s_k\right\| \\
&= \|F(x_{k+1}) - F(x_k)\|^2 - \left\|\int_0^1\left[J(x_k + ts_k) - J(x_k + t\xi_k)\right]dt\right\|\|s_k\| \cdot \|F(x_{k+1}) - F(x_k)\| \\
&\ge m^2\|s_k\|^2 - M\left\|\int_0^1\left[J(x_k + ts_k) - J(x_k + t\xi_k)\right]dt\right\|\|s_k\|^2 \\
&= \left(m^2 - M\left\|\int_0^1\left[J(x_k + ts_k) - J(x_k + t\xi_k)\right]dt\right\|\right)\|s_k\|^2. \qquad (62)
\end{aligned}$$
From Lemma 2, we have $\|s_k\| \to 0$; hence $\|\xi_k\| = \|F(x_{k+1}) - F(x_k)\| \to 0$. By the continuity of $J$, we obtain (60). □
Lemma 4.
Suppose that Assumptions 1–4 hold. If there exists a constant $r > 0$ such that, for all $k \in \mathbb{N}$,
$$\|F_k\| \ge r, \qquad (63)$$
then there exist constants $\hat{M} \ge \hat{m} > 0$ such that
$$\hat{m} \le \|d_k\| \le \hat{M} \qquad (64)$$
holds, where $\hat{m} = \frac{1}{2}mr$.
Proof. 
Similar to Proposition 3 in [21], we have
$$g_k^T d_k \le -\frac{1}{2}\|g_k\|^2. \qquad (65)$$
From (39), (63) and (65), it follows that
$$\|d_k\| \ge \frac{1}{2}\|g_k\| \ge \frac{1}{2}m\|F_k\| \ge \frac{1}{2}mr. \qquad (66)$$
Therefore, the left-hand side of (64) holds.
From (59), (60) and the definition of $\beta_k$ in (29) and (30), we have
$$|\beta_k| = \left|\frac{\delta_{k-1}^T g_k}{\delta_{k-1}^T s_{k-1}} - 2\frac{\|\delta_{k-1}\|^2\, s_{k-1}^T g_k}{(\delta_{k-1}^T s_{k-1})^2}\right| \le \frac{\|\delta_{k-1}\|\|g_k\|}{\delta_{k-1}^T s_{k-1}} + \frac{2\|\delta_{k-1}\|^2\|s_{k-1}\|\|g_k\|}{(s_{k-1}^T \delta_{k-1})^2} \le \frac{M^2\|s_{k-1}\|\|g_k\|}{\bar{m}\|s_{k-1}\|^2} + \frac{2M^4\|s_{k-1}\|^3\|g_k\|}{\bar{m}^2\|s_{k-1}\|^4} = \frac{M^2\|g_k\|}{\bar{m}\|s_{k-1}\|} + \frac{2M^4\|g_k\|}{\bar{m}^2\|s_{k-1}\|}. \qquad (67)$$
On the other hand,
$$|\theta_k| = \left|\frac{s_{k-1}^T g_k}{\delta_{k-1}^T s_{k-1}}\right| \le \frac{\|s_{k-1}\|\|g_k\|}{\bar{m}\|s_{k-1}\|^2} = \frac{\|g_k\|}{\bar{m}\|s_{k-1}\|}. \qquad (68)$$
From Assumptions 2 and 3, (34) and (39), we know that the sequence $\{g_k\}$ is bounded; i.e., there exists a positive constant $\hat{\gamma}$ such that, for all $k \ge 0$,
$$\|g_k\| \le \hat{\gamma}. \qquad (69)$$
Thus, from (29), (30), (59) and (67)–(69), it is easy to obtain
$$\|d_k\| = \|-g_k + \beta_k s_{k-1} + \theta_k \delta_{k-1}\| \le \|g_k\| + |\beta_k|\|s_{k-1}\| + |\theta_k|\|\delta_{k-1}\| \le \hat{\gamma}\left(1 + \frac{M^2}{\bar{m}} + \frac{2M^4}{\bar{m}^2} + \frac{M^2}{\bar{m}}\right) \equiv \hat{M}. \qquad (70)$$
The proof is completed. □
Lemma 5.
Suppose that Assumptions 1–4 hold. Then:
$$\alpha_k \ge \min\left\{1, \frac{\rho\left(\|g_k\|^2 - 2t_k\|F_k\|\|d_k\|\right)}{(M^2 + 2\sigma)\|d_k\|^2}\right\}, \qquad (71)$$
where
$$t_k = \left\|\int_0^1\left[J(x_k + t\rho^{-1}\alpha_k d_k) - J(x_k + t\alpha_{k-1}F_k)\right]dt\right\|. \qquad (72)$$
Proof .
If $\alpha_k = 1$, then (71) holds trivially. If $\alpha_k \ne 1$, then $\alpha_k' = \rho^{-1}\alpha_k$ does not satisfy (31); that is to say,
$$f(x_k + \alpha_k' d_k) - f(x_k) > -\sigma\|\alpha_k' d_k\|^2 + \eta_k f(x_k) \ge -\sigma\|\alpha_k' d_k\|^2. \qquad (73)$$
On the other hand, from (38), it follows that
$$f(x_k + \alpha_k' d_k) - f(x_k) = \frac{1}{2}\|F(x_k + \alpha_k' d_k)\|^2 - \frac{1}{2}\|F(x_k)\|^2 = F(x_k)^T\left(F(x_k + \alpha_k' d_k) - F(x_k)\right) + \frac{1}{2}\|F(x_k + \alpha_k' d_k) - F(x_k)\|^2 \le F(x_k)^T \int_0^1 J(x_k + t\alpha_k' d_k)\,\alpha_k' d_k\,dt + \frac{1}{2}M^2\|\alpha_k' d_k\|^2. \qquad (74)$$
Combining (73) and (74), we obtain
$$\begin{aligned}
\alpha_k' &\ge \frac{-2F_k^T \int_0^1 J(x_k + t\alpha_k' d_k)\,d_k\,dt}{(M^2 + 2\sigma)\|d_k\|^2} = \frac{-2F_k^T \int_0^1 J(x_k + t\alpha_k' d_k)\,d_k\,dt + 2g_k^T d_k - 2g_k^T d_k}{(M^2 + 2\sigma)\|d_k\|^2} \\
&\ge \frac{\|g_k\|^2 - 2F_k^T \int_0^1 J(x_k + t\alpha_k' d_k)\,d_k\,dt + 2g_k^T d_k}{(M^2 + 2\sigma)\|d_k\|^2} \\
&= \frac{\|g_k\|^2 - 2F_k^T \int_0^1 J(x_k + t\alpha_k' d_k)\,d_k\,dt + 2\left(\int_0^1 J(x_k + t\alpha_{k-1}F_k)F_k\,dt\right)^T d_k}{(M^2 + 2\sigma)\|d_k\|^2} \\
&\ge \frac{\|g_k\|^2 - 2t_k\|F_k\|\|d_k\|}{(M^2 + 2\sigma)\|d_k\|^2}, \qquad (75)
\end{aligned}$$
where the first inequality follows from (73) and (74), the second inequality follows from (65), the second equality follows from (13) and the differentiability of $F$, and the last inequality follows from the Cauchy–Schwarz inequality. Since $\alpha_k = \rho\alpha_k'$, (75) yields the desired result. □
Lemma 6.
Suppose that Assumptions 1–4 hold. Let { x k } and { d k } be two sequences generated by Algorithm 2. Then, the line search rule (31) by Step 3 in Algorithm 2 is well defined.
Proof .
Our aim is to show that the line search rule (31) terminates finitely with a positive step length α k . In contrast, suppose that for some iterate indexes such as k * , the condition (31) does not hold. As a result, for all m N :
f ( x k * + ρ m d k * ) f ( x k * ) > σ ρ m d k * 2 + η k * f ( x k * ) .
which can be written as
f ( x k * + ρ m d k * ) f ( x k * ) ρ m > σ ρ m d k * 2 + η k * f ( x k * ) ρ m .
By taking the limit as m in both sides of (77), we have:
f ( x k * ) T d k * + .
However, from Assumption 3, Lemma 4 and (34) and the stop rule of Algorithm 2, we obtain:
f ( x k * ) d k *   =   J ( x k * ) F ( x k * ) d k * M M ^ Δ < + .
Clearly, (79) contradicts (78). That is to say, the line search rule terminates within a finite number of many trials to obtain a positive step length α k , i.e., Step 3 of Algorithm 2 is well defined. □
With the above preparation, we now state the convergence result of Algorithm 2.
Theorem 2.
Suppose that Assumptions 1–4 hold. Let { x k } be a sequence generated by Algorithm 2. Then:
$$\liminf_{k \to \infty}\|F_k\| = 0. \qquad (80)$$
Proof .
For the sake of contradiction, suppose that the conclusion is not true. Then, there exists a constant $\varepsilon_0 > 0$ such that $\|F_k\| \ge \varepsilon_0$ for all $k \in \mathbb{N}$; hence, (66) holds. From (58), we have
$$\lim_{k \to \infty}\alpha_k = 0. \qquad (81)$$
It follows from (81) and (72) that
$$\lim_{k \to \infty}t_k = 0. \qquad (82)$$
From (39) and Lemmas 4 and 5, the inequality
$$\alpha_k \ge \min\left\{1, \frac{\rho\left(m^2\varepsilon_0^2 - 2t_k\|F_k\|\|d_k\|\right)}{(M^2 + 2\sigma)\hat{M}^2}\right\} \qquad (83)$$
holds for all sufficiently large $k$. Therefore, taking the limit as $k \to \infty$ on both sides of (83), it holds that
$$\lim_{k \to \infty}\alpha_k \ge \min\left\{1, \frac{\rho m^2\varepsilon_0^2}{(M^2 + 2\sigma)\hat{M}^2}\right\} > 0, \qquad (84)$$
which contradicts (81). Thus, the proof of Theorem 2 is completed. □

4. Numerical Tests

In this section, by numerical tests, we study the effectiveness and robustness of Algorithm 1 when it is used to solve nonlinear systems of symmetric equations.
We first list the benchmark test problems $F(x) = (F_1(x), F_2(x), \ldots, F_n(x))^T = 0$, which include all four test problems in [6].
Problem 1. 
Strictly convex function 1 ([26], p. 29). Let $F(x)$ be the gradient of $h(x) = \sum_{i=1}^{n}(e^{x_i} - x_i)$, meaning that
$$F_i(x) = e^{x_i} - 1, \quad i = 1, 2, \ldots, n.$$
Problem 2. 
In Reference [22], the elements of $F(x)$ are given by
$$F_i(x) = 2x_i - \sin x_i, \quad i = 1, 2, \ldots, n.$$
Problem 3. 
The discretized Chandrasekhar H-equation [27]:
$$F_i(x) = x_i - \left(1 - \frac{c}{2n}\sum_{j=1}^{n}\frac{\mu_i x_j}{\mu_i + \mu_j}\right)^{-1}, \quad i = 1, 2, \ldots, n,$$
where $c = 0.9$ and $\mu_i = (i - 1/2)/n$.
Problem 4. 
Unconstrained optimization problem
$$\min f(x), \quad x \in \mathbb{R}^n,$$
with the Engval function [28] $f : \mathbb{R}^n \to \mathbb{R}$ defined by
$$f(x) = \sum_{i=2}^{n}\left[(x_{i-1}^2 + x_i^2)^2 - 4x_{i-1} + 3\right].$$
The related symmetric nonlinear equation is
$$F(x) = \frac{1}{4}\nabla f(x) = 0,$$
where $F(x) = (F_1(x), F_2(x), \ldots, F_n(x))^T$ is defined by
$$F_1(x) = x_1(x_1^2 + x_2^2) - 1, \qquad F_i(x) = x_i(x_{i-1}^2 + 2x_i^2 + x_{i+1}^2) - 1, \quad i = 2, 3, \ldots, n-1, \qquad F_n(x) = x_n(x_{n-1}^2 + x_n^2).$$
Problem 5. 
The discretized two-point boundary value problem, like the problem in [1]:
$$F(x) = Ax + \frac{1}{(n+1)^2}G(x) = 0, \qquad A = \begin{pmatrix} 8 & -1 & & \\ -1 & 8 & -1 & \\ & \ddots & \ddots & \ddots \\ & & -1 & 8 \end{pmatrix},$$
and $G(x) = (G_1(x), G_2(x), \ldots, G_n(x))^T$ with $G_i(x) = \sin x_i - 1$, $i = 1, 2, \ldots, n$.
Problem 6. 
In Reference [6], the elements of $F(x)$ are given by
$$F_i(x) = 2x_i - x_{i+1} + \sin(x_i) - 1, \quad i = 1, 2, \ldots, n-1, \qquad F_n(x) = 2x_n + \sin(x_n) - 1.$$
Problem 7. 
In Reference [6], the elements of $F(x)$ are given by
$$F_i(x) = x_i - 1, \quad i = 1, 2, \ldots, n-2, \qquad F_{n-1}(x) = x_{n-1} - \sum_{i=1}^{n-2} i(x_i - 1), \qquad F_n(x) = \sum_{i=1}^{n-2} i(x_i - 1)^2.$$
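As an illustration, a Python/NumPy sketch of the residual mappings of Problems 1 and 4 (the function names are ours); Problem 4 can be checked against a finite-difference gradient of the Engval function, since $F = \frac{1}{4}\nabla f$:

```python
import numpy as np

def F_problem1(x):
    """Problem 1: gradient of h(x) = sum(exp(x_i) - x_i)."""
    return np.exp(x) - 1.0

def F_problem4(x):
    """Problem 4: F = (1/4) * gradient of the Engval function."""
    n = len(x)
    F = np.empty(n)
    F[0] = x[0] * (x[0]**2 + x[1]**2) - 1.0
    for i in range(1, n - 1):
        F[i] = x[i] * (x[i-1]**2 + 2.0 * x[i]**2 + x[i+1]**2) - 1.0
    F[n-1] = x[n-1] * (x[n-2]**2 + x[n-1]**2)
    return F
```

Both mappings have symmetric Jacobians (diagonal for Problem 1, and a Hessian for Problem 4), so they fall within the scope of Problem (1).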
All the algorithms are coded in MATLAB R2021a and run on a desktop computer with a 3.6 GHz CPU, 16 GB of memory and the Windows 7 operating system. The relevant parameters are specified by
$$B_0 = I, \quad \sigma_1 = \sigma_2 = 0.01, \quad \rho = 0.5, \quad \rho_1 = 0.95, \quad \alpha_{-1} = 0.01,$$
and $\mu$ in the MBFGS method (Algorithm 2.1 in [6]) is the same as in [6], i.e., $\mu = 10^{-4}$. In fact, the above parameters are all the same as those in [6]. Similarly to [6], we use the matrix left division command d = -B\g to directly solve the linear subproblem (22). The termination condition of all the algorithms is $\|F_k\| \le 10^{-6}$, or the number of iterations exceeds $10^4$, or MATLAB crashes, or the CPU time exceeds 100 s.
In order to choose suitable values for the parameters $t$ and $r$ in Algorithm 1, we first take $r = 0.5$ and choose $t$ from the interval $[1, 1.1]$ with a step size of $0.01$. We present the total number of iterations (Iter) in Figure 1a as Algorithm 1 is used to solve all seven test problems with different sizes $n$ (10, 50, 100 and 500) and different initial guesses. The initial guesses are $x_1 = (0.1, 0.1, \ldots, 0.1)^T$, $x_2 = (-0.1, -0.1, \ldots, -0.1)^T$, $x_3 = (1, 1, \ldots, 1)^T$, $x_4 = (-1, -1, \ldots, -1)^T$, $x_5 = (1/n, 1/n, \ldots, 1/n)^T$, $x_6 = (-1/n, -1/n, \ldots, -1/n)^T$. From Figure 1a, we know that Iter changes little when $t \in [1.02, 1.03]$ and is smallest when $t = 1.03$.
We then take t = 1.03 , and choose r from the interval [ 0.5 , 0.6 ] with a step size of 0.01 . We present the total number of iterations (Iter) in Figure 1b as Algorithm 1 is used to solve all seven test problems with different sizes n (10, 50, 100 and 500) and different initial guesses ( x 1 x 6 ). Figure 1b shows that Iter changes little when r [ 0.52 , 0.54 ] and Algorithm 1 with r = 0.5 performs the best.
According to the above research, we take t = 1.03 and r = 0.5 , and compare Algorithm 1 (MSBFGS) with two similar algorithms proposed very recently to see which is more efficient as they are used to solve all seven test problems with different sizes n and different initial guesses. One is the Gauss–Newton-based BFGS method (GNBFGS for short) in [3] and another is MBFGS in [6] since they have been reported to be more efficient than the state-of-the-art ones.
In Table A1, we report the numerical performance of the three algorithms. For the simplification of statement, we use the following notations in Table A1.
P: the problems;
Dim: the dimension of test problems;
CPU: the CPU time in seconds;
Ni: the number of iterations;
Nf: the number of function evaluations;
Norm (F): the norm of F k at the stopping point;
F: indicates that an algorithm fails, i.e., it does not achieve the given tolerance before the number of iterations exceeds $10^4$, MATLAB crashes, or the CPU time exceeds 100 s.
The underlined data in Table A1 indicate the superiority of Algorithm 1 in comparison with the others.
To further show the efficiency of the proposed method, we calculated the number of wins for the three algorithms in terms of the elapsed CPU time (CPU wins), the number of iterations (Iter wins) and the number of function evaluations (Nf wins) and we also calculated the failures (Fails) of the three algorithms. The results are recorded in Table 1.
In addition, we adopted the performance profiles introduced by Dolan and Moré [29] to compare the required numbers of iterations and function evaluations.
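Concretely, the Dolan–Moré profile assigns each solver s on each problem p the ratio r_{p,s} = t_{p,s} / min_s t_{p,s}, where t_{p,s} is the measured cost, and plots ρ_s(τ), the fraction of problems on which r_{p,s} ≤ τ. A minimal NumPy sketch of this computation (our own illustration, with failed runs marked by inf):

```python
import numpy as np

def performance_profile(costs, taus):
    """Dolan–Moré performance profile.

    `costs` is an (n_problems, n_solvers) array of measured costs
    (iterations, function evaluations or CPU time); np.inf marks a
    failed run.  Returns rho of shape (len(taus), n_solvers), where
    rho[i, s] is the fraction of problems on which solver s is
    within a factor taus[i] of the best solver.
    """
    costs = np.asarray(costs, dtype=float)
    best = costs.min(axis=1, keepdims=True)   # best cost per problem
    ratios = costs / best                     # performance ratios r_{p,s}
    n_problems = costs.shape[0]
    return np.array([(ratios <= tau).sum(axis=0) / n_problems
                     for tau in taus])

# Three problems, two solvers; solver 0 fails on the last problem.
rho = performance_profile([[2, 4], [3, 3], [np.inf, 5]], taus=[1.0, 2.0])
```

Here rho[0] gives each solver's share of problems on which it is the fastest, and the curves flatten toward each solver's overall success rate as τ grows.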
The results in Table 1 and Figure 2 show that our algorithm (MSBFGS) performs best among the three algorithms, both in terms of the number of iterations and in terms of the elapsed CPU time.
In order to test the efficiency of Algorithm 2 (MSBFGS2), we compared its performance in solving large-scale nonlinear symmetric equations with that of Algorithm 2.1 (NDDF) in [15] and Algorithm 2.1 (DFMPRP) in [13]. For the sake of fairness, we chose seven test problems, all from [15], and the relevant parameters of Algorithm 2 are the same as those of NDDF in [15]; the parameter values in DFMPRP are taken from [13]. All three algorithms terminate when ‖F_k‖ ≤ 10^-4, the number of iterations exceeds 10^4, MATLAB R2010b crashes, or the CPU time exceeds 100 s.
The numerical performance of all the algorithms is reported in Table A2 and Table A3. Table A2 shows the performance of the three algorithms with the fixed initial points x1–x6, while Table A3 shows their performance with initial points x7 and x8 generated randomly by the MATLAB commands rand(n,1) and -rand(n,1), respectively. Furthermore, we calculated the "CPU wins", "Iter wins", "Nf wins" and "Fails" of the three algorithms; the results are recorded in Table 2. We again adopted the performance profiles introduced by Dolan and Moré [29] to compare the required numbers of iterations and function evaluations of the three algorithms.
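For concreteness, the initial guesses used in these experiments can be generated as follows. This Python/NumPy sketch is our own illustration: the paired plus/minus pattern of the fixed points x1–x6 is an assumption based on the text, x7 and x8 mimic MATLAB's rand(n,1) and -rand(n,1), and the seed exists only to make the sketch reproducible.

```python
import numpy as np

def initial_points(n, seed=0):
    """Initial guesses x1-x8 for an n-dimensional problem.

    x1-x6 are the fixed points (assuming the paired +/- pattern);
    x7 and x8 mimic MATLAB's rand(n,1) and -rand(n,1).
    """
    rng = np.random.default_rng(seed)
    return {
        "x1": np.full(n, 0.1),     "x2": np.full(n, -0.1),
        "x3": np.ones(n),          "x4": -np.ones(n),
        "x5": np.full(n, 1.0 / n), "x6": np.full(n, -1.0 / n),
        "x7": rng.random(n),       # like rand(n,1): entries in [0, 1)
        "x8": -rng.random(n),      # like -rand(n,1): entries in (-1, 0]
    }

pts = initial_points(10000)
```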
All the results in Figure 3, Table 2, Table A2 and Table A3 demonstrate that our algorithm (MSBFGS2) performs better than the other two algorithms. MSBFGS2 is also more efficient and robust, since it records the fewest failures among the three algorithms across the different initial guesses.

5. Conclusions and Future Research

In this paper, we presented two derivative-free methods for solving nonlinear symmetric equations. In the first method, the direction is an approximate quasi-Newton direction, and the method solves small-scale problems efficiently. The second method involves no matrix computation or storage, so it is applicable to large-scale systems of nonlinear equations.
Global convergence theories for the developed algorithms were established. Numerical tests demonstrated that our algorithms outperform similar algorithms, requiring fewer iterations or less CPU time to find a solution with the same tolerance.
In future research, it would be valuable to study the local convergence of the developed algorithms, in addition to the global convergence analysis conducted in this paper. Moreover, our algorithms were designed only for systems of equations that are symmetric and satisfy some relatively restrictive assumptions, so it would be interesting to study how to modify them to solve more general systems of equations.

Author Contributions

Z.W. conceived and designed the research plan and wrote the paper. J.G. performed the mathematical analysis, the development of the algorithm, experiments and wrote the paper. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by the National Natural Science Foundation of China (Grant No. 71671190).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data used to support the findings of this study are available from the corresponding author upon request. All the computer codes used in this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A. Numerical Results

Table A1. Numerical results of Problems 1–7.
P | Dim | x0 | MBFGS: CPU, Ni, Nf, Norm (F) | GNBFGS: CPU, Ni, Nf, Norm (F) | MSBFGS: CPU, Ni, Nf, Norm (F)
P110 x 1 0.0000516 1.96 × 10 10 0.0000413 1.47 × 10 7 0.0000413 1.86 × 10 12
x 2 0.0000413 8.37 × 10 7 0.0156413 8.08 × 10 8 0.0000310 9.23 × 10 7
x 3 0.00001151 1.27 × 10 8 0.0000940 4.99 × 10 9 0.0624120604 2.29 × 10 12
x 4 0.0000735 8.98 × 10 7 0.0000931 9.28 × 10 10 0.0000825 4.76 × 10 8
x 5 0.0000516 1.96 × 10 10 0.0000413 1.47 × 10 7 0.0000413 1.86 × 10 12
x 6 0.0000413 8.37 × 10 7 0.0000413 8.08 × 10 8 0.0000310 9.23 × 10 7
50 x 1 0.0936516 4.39 × 10 10 0.0000413 3.30 × 10 7 0.0000413 4.16 × 10 12
x 2 0.0000516 2.28 × 10 10 0.0000413 1.81 × 10 7 0.0000413 1.20 × 10 12
x 3 0.07801154 2.78 × 10 8 0.0000940 1.12 × 10 8 0.0780120604 5.13 × 10 12
x 4 0.0780734 4.99 × 10 8 0.0000931 2.07 × 10 9 0.0000825 1.06 × 10 7
x 5 0.0000310 1.59 × 10 7 0.0000310 5.27 × 10 8 0.0000310 7.90 × 10 12
x 6 0.0000310 1.50 × 10 7 0.0000310 4.98 × 10 8 0.0000310 6.97 × 10 12
100 x 1 0.0312516 6.21 × 10 10 0.0000413 4.66 × 10 7 0.0000413 5.89 × 10 12
x 2 0.0936516 3.23 × 10 10 0.0000413 2.56 × 10 7 0.0000413 1.70 × 10 12
x 3 0.00001039 2.49 × 10 10 0.0000940 1.58 × 10 8 0.2808120604 7.25 × 10 12
x 4 0.0000939 2.45 × 10 7 0.0000931 2.93 × 10 9 0.0000825 1.50 × 10 7
x 5 0.0000310 6.91 × 10 9 0.0000310 2.30 × 10 9 0.000027 4.60 × 10 7
x 6 0.0000310 6.72 × 10 9 0.0000310 2.23 × 10 9 0.000027 4.46 × 10 7
500 x 1 0.1092516 1.39 × 10 9 0.1248516 2.71 × 10 11 0.0780413 1.32 × 10 11
x 2 0.1248516 7.18 × 10 10 0.0780413 5.71 × 10 7 0.0624413 3.81 × 10 12
x 3 0.2028940 2.26 × 10 10 0.2496940 3.53 × 10 8 3.7128120604 1.62 × 10 11
x 4 0.2496938 6.04 × 10 7 0.2496931 6.56 × 10 9 0.2496825 3.36 × 10 7
x 5 0.062427 4.06 × 10 7 0.062427 2.70 × 10 7 0.000027 1.63 × 10 9
x 6 0.000027 4.04 × 10 7 0.031227 2.68 × 10 7 0.078027 1.62 × 10 9
P210 x 1 0.0312310 1.16 × 10 11 0.0000310 1.06 × 10 12 0.000027 1.28 × 10 9
x 2 0.0000310 1.16 × 10 11 0.0000310 1.06 × 10 12 0.000027 1.28 × 10 9
x 3 0.0000516 1.09 × 10 7 0.0000516 3.32 × 10 8 0.0000516 8.89 × 10 8
x 4 0.0000516 1.09 × 10 7 0.0000516 3.32 × 10 8 0.0000516 8.89 × 10 8
x 5 0.0000310 1.16 × 10 11 0.0000310 1.06 × 10 12 0.000027 1.28 × 10 9
x 6 0.0000310 1.16 × 10 11 0.0000310 1.06 × 10 12 0.000027 1.28 × 10 9
50 x 1 0.0000310 3.44 × 10 11 0.0000310 2.37 × 10 12 0.000027 2.86 × 10 9
x 2 0.0000310 3.44 × 10 11 0.0000310 2.37 × 10 12 0.000027 2.86 × 10 9
x 3 0.0000516 2.43 × 10 7 0.0000516 7.42 × 10 8 0.0000516 1.99 × 10 7
x 4 0.0000516 2.43 × 10 7 0.0000516 7.42 × 10 8 0.0936516 1.99 × 10 7
x 5 0.000027 1.07 × 10 8 0.000027 5.06 × 10 9 0.000027 1.46 × 10 15
x 6 0.000027 1.07 × 10 8 0.000027 5.06 × 10 9 0.000027 1.46 × 10 15
100 x 1 0.0000310 5.78 × 10 11 0.0000310 3.35 × 10 12 0.000027 4.05 × 10 9
x 2 0.0000310 5.78 × 10 11 0.0000310 3.35 × 10 12 0.000027 4.05 × 10 9
x 3 0.0000516 3.43 × 10 7 0.0000516 1.05 × 10 7 0.0000516 2.81 × 10 7
x 4 0.0780516 3.43 × 10 7 0.0000516 1.05 × 10 7 0.0000516 2.81 × 10 7
x 5 0.000027 5.18 × 10 10 0.000027 2.24 × 10 10 0.000027 4.04 × 10 18
x 6 0.000027 5.18 × 10 10 0.000027 2.24 × 10 10 0.000027 4.04 × 10 18
500 x 1 0.1248310 2.17 × 10 10 0.0624310 7.48 × 10 12 0.062427 9.05 × 10 9
x 2 0.0624310 2.17 × 10 10 0.2028310 7.48 × 10 12 0.093627 9.05 × 10 9
x 3 0.2340516 7.64 × 10 7 0.1560516 2.35 × 10 7 0.1248516 6.28 × 10 7
x 4 0.1248516 7.64 × 10 7 0.0780516 2.35 × 10 7 0.1248516 6.28 × 10 7
x 5 0.000014 1.20 × 10 7 0.062414 1.20 × 10 7 0.000014 1.20 × 10 7
x 6 0.109214 1.20 × 10 7 0.000014 1.20 × 10 7 0.000014 1.20 × 10 7
P310 x 1 0.06248045068 9.94 × 10 7 0.124811827633 9.81 × 10 7 0.046878331 9.86 × 10 7
x 2 0.03122811671 9.82 × 10 7 0.109210806918 9.84 × 10 7 0.0000104459 7.40 × 10 7
x 3 0.0468134624 9.21 × 10 7 0.0000157760 9.59 × 10 7 0.000036109 8.76 × 10 7
x 4 0.0468183879 9.38 × 10 7 0.09367394591 9.60 × 10 7 0.000065250 9.01 × 10 7
x 5 0.09368045068 9.94 × 10 7 0.187211827633 9.81 × 10 7 0.000078331 9.86 × 10 7
x 6 0.06242811671 9.82 × 10 7 0.171610806918 9.84 × 10 7 0.0000104459 7.40 × 10 7
50 x 1 1.34168735489 9.91 × 10 7 1.809613178585 9.87 × 10 7 0.2028118566 9.76 × 10 7
x 2 0.00001136 2.32 × 10 7 1.528812137782 9.97 × 10 7 0.2808160916 9.95 × 10 7
x 3 0.2028130592 9.31 × 10 7 0.2964162762 9.81 × 10 7 0.093637112 8.65 × 10 7
x 4 0.1560164779 9.76 × 10 7 1.07647624718 9.76 × 10 7 0.1872151736 9.98 × 10 7
x 5 0.00001240 5.29 × 10 8 1.482012688228 9.82 × 10 7 0.1560136695 9.69 × 10 7
x 6 0.00001239 3.25 × 10 8 1.591212388023 9.98 × 10 7 0.1872152809 1.00 × 10 6
100 x 1 3.61928905605 9.95 × 10 7 5.475613498809 9.99 × 10 7 0.280877305 9.73 × 10 7
x 2 0.00001136 2.07 × 10 7 4.804812377932 9.83 × 10 7 0.3744109478 9.75 × 10 7
x 3 0.3744134610 9.06 × 10 7 0.5304162764 9.70 × 10 7 0.062437112 8.69 × 10 7
x 4 0.5928171810 9.99 × 10 7 3.05767774809 9.80 × 10 7 0.4212113514 9.95 × 10 7
x 5 0.06241240 6.16 × 10 8 4.851613028446 9.99 × 10 7 0.187281323 9.80 × 10 7
x 6 0.06241240 4.92 × 10 8 5.132413018422 9.90 × 10 7 0.296484346 9.99 × 10 7
500 x 1 38.62589475970 9.85 × 10 7 60.138414289322 9.90 × 10 7 3.681691376 9.27 × 10 7
x 2 15.24133742253 9.76 × 10 7 54.569113138422 9.99 × 10 7 4.9764131615 9.72 × 10 7
x 3 5.4444145662 9.43 × 10 7 7.0200176827 9.90 × 10 7 1.263639118 6.18 × 10 7
x 4 7.4412179856 9.36 × 10 7 34.47628285128 9.79 × 10 7 1.279240121 5.78 × 10 7
x 5 0.45241240 1.15 × 10 7 59.296013718901 9.91 × 10 7 4.2900104454 9.45 × 10 7
x 6 0.45241240 1.11 × 10 7 57.236813738892 9.97 × 10 7 4.102898420 8.05 × 10 7
P410 x 1 0.000034157 2.29 × 10 7 0.000022108 7.80 × 10 8 0.000052361 9.99 × 10 7
x 2 0.000026113 3.21 × 10 7 0.000023113 8.78 × 10 8 0.046840274 8.03 × 10 8
x 3 0.000040168 5.72 × 10 7 0.000073479 8.91 × 10 7 0.000037264 5.66 × 10 7
x 4 0.000044178 7.37 × 10 7 0.000046195 7.51 × 10 7 0.000053377 1.94 × 10 7
x 5 0.078034157 2.29 × 10 7 0.000022108 7.80 × 10 8 0.000052361 9.99 × 10 7
x 6 0.000026113 3.21 × 10 7 0.000023113 8.78 × 10 8 0.000040274 8.03 × 10 8
50 x 1 0.093668383 8.93 × 10 7 0.062464370 3.89 × 10 7 0.000062435 9.71 × 10 7
x 2 0.000066364 8.46 × 10 7 0.078063367 8.24 × 10 7 0.000040274 9.21 × 10 7
x 3 0.109292453 3.07 × 10 7 0.1248123790 9.82 × 10 7 0.000043308 6.78 × 10 7
x 4 0.062494487 9.36 × 10 7 0.1092146698 8.53 × 10 7 0.062448347 6.51 × 10 7
x 5 0.000066378 5.27 × 10 7 0.078065383 5.35 × 10 7 0.000044306 6.88 × 10 7
x 6 0.062466378 7.88 × 10 7 0.000064389 8.35 × 10 7 0.046838264 8.79 × 10 7
100 x 1 0.062496591 9.52 × 10 7 0.140494574 8.59 × 10 7 0.062441283 8.09 × 10 7
x 2 0.062499591 9.55 × 10 7 0.140491563 9.85 × 10 7 0.109240279 6.69 × 10 7
x 3 0.1404141770 7.32 × 10 7 0.28082031419 6.97 × 10 7 0.000047332 8.20 × 10 7
x 4 0.2184176962 5.94 × 10 7 0.31202041075 6.36 × 10 7 0.109251368 7.80 × 10 7
x 5 0.1716110659 8.91 × 10 7 0.109286543 8.79 × 10 7 0.093639272 5.21 × 10 7
x 6 0.2340125710 6.71 × 10 7 0.0624100633 9.61 × 10 7 0.078041287 5.46 × 10 7
500 x 1 3.2136109667 8.71 × 10 7 4.2432138842 9.33 × 10 7 1.248042294 9.07 × 10 7
x 2 3.1044107644 8.88 × 10 7 3.1356103633 9.64 × 10 7 1.170044310 1.00 × 10 6
x 3 7.47242521549 9.08 × 10 7 12.83894562833 9.15 × 10 7 1.248050357 9.88 × 10 7
x 4 7.65962591581 9.28 × 10 7 20.28016533804 6.96 × 10 7 1.528847348 6.02 × 10 7
x 5 3.1980105633 9.94 × 10 7 4.4772151962 9.39 × 10 7 1.092040283 4.21 × 10 7
x 6 4.2432162983 9.32 × 10 7 3.1980102651 9.44 × 10 7 0.982838265 8.81 × 10 7
P510 x 1 0.0156959 5.39 × 10 7 0.0000959 5.40 × 10 7 0.000014139 4.92 × 10 7
x 2 0.0000959 5.53 × 10 7 0.0000959 5.54 × 10 7 0.000014139 4.92 × 10 7
x 3 0.00001172 5.10 × 10 7 0.00001172 8.43 × 10 7 0.000015149 4.50 × 10 7
x 4 0.00001062 3.20 × 10 7 0.00001062 9.74 × 10 7 0.000015149 4.48 × 10 7
x 5 0.0000959 5.39 × 10 7 0.0000959 5.40 × 10 7 0.000014139 4.92 × 10 7
x 6 0.0000959 5.53 × 10 7 0.0000959 5.54 × 10 7 0.000014139 4.92 × 10 7
50 x 1 0.124837280 6.78 × 10 7 0.000042317 6.90 × 10 7 0.000018177 9.53 × 10 7
x 2 0.000034267 1.20 × 10 7 0.124834267 1.21 × 10 7 0.000018177 9.55 × 10 7
x 3 0.000050380 7.21 × 10 7 0.109236281 1.65 × 10 7 0.000018177 6.66 × 10 7
x 4 0.000053403 9.50 × 10 7 0.000037285 8.69 × 10 7 0.000018177 6.66 × 10 7
x 5 0.000034267 3.06 × 10 9 0.078034267 2.90 × 10 9 0.000017167 7.25 × 10 7
x 6 0.000035268 9.86 × 10 8 0.000035268 9.01 × 10 8 0.093617167 7.31 × 10 7
100 x 1 0.062479610 6.68 × 10 7 0.062470549 8.62 × 10 7 0.062419187 7.76 × 10 7
x 2 0.109266527 5.24 × 10 7 0.109269539 6.10 × 10 7 0.000019187 7.76 × 10 7
x 3 0.171678610 8.88 × 10 7 0.156073579 9.85 × 10 7 0.000019187 4.02 × 10 7
x 4 0.171677603 8.51 × 10 7 0.062484654 8.96 × 10 7 0.062419187 4.02 × 10 7
x 5 0.093665517 4.80 × 10 7 0.109269541 1.79 × 10 7 0.062415147 7.65 × 10 7
x 6 0.062465517 9.82 × 10 7 0.062465517 5.24 × 10 7 0.000015147 7.58 × 10 7
500 x 1 2.340081662 5.23 × 10 7 2.386880655 9.55 × 10 7 0.530420196 4.53 × 10 7
x 2 2.215277630 7.75 × 10 7 2.121679648 9.81 × 10 7 0.561620196 4.53 × 10 7
x 3 2.948498800 8.37 × 10 7 2.839291745 7.14 × 10 7 0.608422216 7.17 × 10 7
x 4 2.839298800 8.21 × 10 7 2.823691745 6.67 × 10 7 0.670822216 7.17 × 10 7
x 5 1.840859482 7.82 × 10 7 2.199659482 7.91 × 10 7 0.358811107 7.59 × 10 7
x 6 1.809659483 9.54 × 10 7 1.794059483 9.87 × 10 7 0.390011107 7.60 × 10 7
P610 x 1 0.0000108484 9.16 × 10 7 0.0000107542 8.70 × 10 7 0.04682672333 9.73 × 10 7
x 2 0.0000111488 8.34 × 10 7 0.0468125640 9.45 × 10 7 0.00002421940 9.41 × 10 7
x 3 0.000090416 9.73 × 10 7 0.000052220 7.73 × 10 7 0.000050380 8.72 × 10 7
x 4 FFFF0.03122221224 9.11 × 10 7 0.00002872801 8.81 × 10 7
x 5 0.0312108484 9.16 × 10 7 0.0000107542 8.70 × 10 7 0.00002672333 9.73 × 10 7
x 6 0.0156111488 8.34 × 10 7 0.0000125640 9.45 × 10 7 0.00002421940 9.41 × 10 7
50 x 1 0.0624148789 8.70 × 10 7 0.0624115625 8.94 × 10 7 0.10921411111 9.97 × 10 7
x 2 0.0624139740 8.90 × 10 7 0.0624148823 9.32 × 10 7 0.35885694552 9.85 × 10 7
x 3 0.0624113611 6.27 × 10 7 0.0624103572 9.11 × 10 7 0.17162802468 9.59 × 10 7
x 4 FFFF0.10921951113 8.38 × 10 7 0.8268106510678 9.21 × 10 7
x 5 0.14041821028 9.64 × 10 7 0.0000102565 8.84 × 10 7 0.45246845784 9.87 × 10 7
x 6 0.062499554 9.81 × 10 7 0.0624115612 8.33 × 10 7 0.59289427985 9.95 × 10 7
100 x 1 0.12481871118 9.12 × 10 7 0.1716160999 9.25 × 10 7 0.21841611315 9.34 × 10 7
x 2 0.28081861151 9.84 × 10 7 0.28081711031 9.70 × 10 7 0.63965204163 9.80 × 10 7
x 3 0.28081781058 9.83 × 10 7 0.18721761049 9.38 × 10 7 0.40562872537 9.99 × 10 7
x 4 FFFF0.48363212078 9.09 × 10 7 4.8048304235312 8.44 × 10 7
x 5 0.1248160974 9.21 × 10 7 0.23401701022 7.19 × 10 7 1.43529408407 9.88 × 10 7
x 6 0.1716163986 8.04 × 10 7 0.17161701018 9.93 × 10 7 1.02969157757 9.89 × 10 7
500 x 1 17.05095764354 9.63 × 10 7 15.53775383805 9.92 × 10 7 19.84336625897 9.96 × 10 7
x 2 FFFF17.33175594014 9.49 × 10 7 15.19454903929 9.86 × 10 7
x 3 17.53455583934 9.70 × 10 7 18.09615483909 9.26 × 10 7 5.64721811556 9.39 × 10 7
x 4 FFFF29.59349618274 8.95 × 10 7 FFFF
x 5 FFFF16.28655343793 9.47 × 10 7 14.16494673744 9.95 × 10 7
x 6 FFFF16.52055443847 9.40 × 10 7 13.57214443564 9.95 × 10 7
P710 x 1 0.000014 3.28 × 10 12 0.000014 3.28 × 10 12 0.000014 3.28 × 10 12
x 2 0.000014 5.61 × 10 11 0.000014 5.61 × 10 11 0.000014 5.61 × 10 11
x 3 0.000001 0.00 0.000001 0.00 0.000001 0.00
x 4 0.000014 3.30 × 10 10 0.000014 3.30 × 10 10 0.000014 3.30 × 10 10
x 5 0.000014 3.28 × 10 12 0.000014 3.28 × 10 12 0.000014 3.28 × 10 12
x 6 0.000014 5.61 × 10 11 0.000014 5.61 × 10 11 0.000014 5.61 × 10 11
50 x 1 0.000014 1.18 × 10 7 0.000014 1.18 × 10 7 0.000014 1.18 × 10 7
x 2 FFFF0.0000210 1.32 × 10 7 0.000027 0.00
x 3 0.000001 0.00 0.000001 0.00 0.000001 0.00
x 4 FFFF0.0000424 0.00 0.000027 0.00
x 5 0.000014 9.80 × 10 8 0.000014 9.80 × 10 8 0.000014 9.80 × 10 8
x 6 0.000014 4.55 × 10 8 0.000014 4.55 × 10 8 0.000014 4.55 × 10 8
100 x 1 FFFF0.0000418 0.00 0.000027 0.00
x 2 FFFF0.0000417 0.00 0.000027 0.00
x 3 0.000001 0.00 0.000001 0.00 0.000001 0.00
x 4 FFFF0.0000426 0.00 0.000027 0.00
x 5 0.000027 2.51 × 10 7 0.000027 2.51 × 10 7 0.000027 0.00
x 6 0.000014 2.61 × 10 7 0.000014 2.61 × 10 7 0.000014 2.61 × 10 7
500 x 1 FFFF0.1248423 0.00 0.000027 0.00
x 2 FFFF0.0156423 0.00 0.046827 0.00
x 3 0.000001 0.00 0.000001 0.00 0.000001 0.00
x 4 FFFF0.0936430 0.00 0.062427 0.00
x 5 0.0624310 0.00 0.0780310 0.00 0.078027 0.00
x 6 0.1248310 0.00 0.0156310 0.00 0.062427 0.00
Table A2. Numerical results of the 7 problems in [15] with fixed initial points.
P | Dim | x0 | NDDF: CPU, Ni, Nf, Norm (F) | DFMPRP: CPU, Ni, Nf, Norm (F) | MSBFGS2: CPU, Ni, Nf, Norm (F)
P110000 x 1 FFFF0.17161174 3.55 × 10 5 0.0624311 5.65 × 10 7
x 2 FFFF0.0000961 6.61 × 10 5 0.1092311 4.33 × 10 7
x 3 FFFFFFFF0.0000831 4.52 × 10 8
x 4 FFFF0.06241071 3.98 × 10 5 0.0000623 8.90 × 10 7
x 5 0.000013 5.00 × 10 7 0.1092533 5.85 × 10 5 0.000013 5.00 × 10 7
x 6 0.000013 5.00 × 10 7 0.0000533 5.87 × 10 5 0.000013 5.00 × 10 7
100000 x 1 FFFF0.37441281 2.80 × 10 5 0.0000311 1.79 × 10 6
x 2 FFFF0.24961068 5.23 × 10 5 0.0000311 1.37 × 10 6
x 3 FFFFFFFF0.1404831 1.43 × 10 7
x 4 FFFF0.37441178 3.15 × 10 5 0.1248623 2.81 × 10 6
x 5 0.000013 1.58 × 10 8 0.1872426 7.41 × 10 5 0.000013 1.58 × 10 8
x 6 0.000013 1.58 × 10 8 0.1248426 7.41 × 10 5 0.031213 1.58 × 10 8
500000 x 1 FFFF1.65361281 6.27 × 10 5 0.1872311 4.00 × 10 6
x 2 FFFF1.62241175 2.92 × 10 5 0.3276311 3.06 × 10 6
x 3 FFFFFFFF0.6396831 3.20 × 10 7
x 4 FFFF1.46641178 7.04 × 10 5 0.3744623 6.29 × 10 6
x 5 0.062413 1.41 × 10 9 0.4524426 3.31 × 10 5 0.062413 1.41 × 10 9
x 6 0.046813 1.41 × 10 9 0.5304426 3.31 × 10 5 0.062413 1.41 × 10 9
1000000 x 1 FFFF3.01081281 8.87 × 10 5 0.4368311 5.65 × 10 6
x 2 FFFF3.13561175 4.13 × 10 5 0.4836311 4.33 × 10 6
x 3 FFFFFFFF1.0608831 4.52 × 10 7
x 4 FFFF3.22921178 9.95 × 10 5 0.8268623 8.90 × 10 6
x 5 0.062413 5.00 × 10 10 0.7020319 9.37 × 10 5 0.140413 5.00 × 10 10
x 6 0.124813 5.00 × 10 10 0.6864319 9.38 × 10 5 0.124813 5.00 × 10 10
P210000 x 1 FFFF0.10921068 7.81 × 10 5 0.000027 6.16 × 10 10
x 2 FFFF0.06241068 7.81 × 10 5 0.000027 6.16 × 10 10
x 3 FFFF0.18721283 5.01 × 10 5 0.0000311 2.01 × 10 5
x 4 FFFF0.17161283 5.01 × 10 5 0.0000311 2.01 × 10 5
x 5 0.000013 1.67 × 10 11 0.0000533 5.86 × 10 5 0.000013 1.67 × 10 11
x 6 0.000013 1.67 × 10 11 0.0000533 5.86 × 10 5 0.000013 1.67 × 10 11
100000 x 1 FFFF0.39001175 6.17 × 10 5 0.015627 1.95 × 10 9
x 2 FFFF0.54601175 6.17 × 10 5 0.015627 1.95 × 10 9
x 3 FFFF0.49921390 3.96 × 10 5 0.0780311 6.34 × 10 5
x 4 FFFF0.48361390 3.96 × 10 5 0.0936311 6.34 × 10 5
x 5 0.000013 5.27 × 10 14 0.1560426 7.41 × 10 5 0.000013 5.27 × 10 14
x 6 0.062413 5.27 × 10 14 0.0936426 7.41 × 10 5 0.000013 5.27 × 10 14
500000 x 1 FFFF1.87201282 3.45 × 10 5 0.093627 4.36 × 10 9
x 2 FFFF1.77841282 3.45 × 10 5 0.140427 4.36 × 10 9
x 3 FFFF2.54281390 8.86 × 10 5 0.3588415 5.98 × 10 17
x 4 FFFF2.96401390 8.86 × 10 5 0.2496415 5.98 × 10 17
x 5 0.031213 9.43 × 10 16 0.7176426 3.31 × 10 5 0.015613 9.43 × 10 16
x 6 0.031213 9.43 × 10 16 0.6396426 3.31 × 10 5 0.078013 9.43 × 10 16
1000000 x 1 FFFF3.74401282 4.88 × 10 5 0.296427 6.16 × 10 9
x 2 FFFF3.90001282 4.88 × 10 5 0.156027 6.16 × 10 9
x 3 FFFF4.69561497 3.13 × 10 5 0.6084415 3.67 × 10 17
x 4 FFFF4.49281497 3.13 × 10 5 0.7332415 3.67 × 10 17
x 5 0.046813 1.67 × 10 16 1.0296319 9.38 × 10 5 0.046813 1.67 × 10 16
x 6 0.046813 1.67 × 10 16 0.7176319 9.38 × 10 5 0.140413 1.67 × 10 16
P310000 x 1 0.327687625 9.94 × 10 5 0.156048498 5.88 × 10 5 0.343266452 6.51 × 10 5
x 2 0.296474520 9.73 × 10 5 0.390056573 6.30 × 10 5 0.171648326 4.52 × 10 5
x 3 FFFFFFFF9.0325271914327 9.93 × 10 5
x 4 FFFFFFFFFFFF
x 5 0.327679563 9.77 × 10 5 FFFF0.187245309 7.00 × 10 5
x 6 0.390067474 8.60 × 10 5 FFFF0.046835242 9.36 × 10 5
100000 x 1 2.246480570 9.38 × 10 5 2.074853553 5.57 × 10 5 1.544459410 9.27 × 10 5
x 2 2.355682580 9.00 × 10 5 1.809647485 8.74 × 10 5 1.341656381 9.20 × 10 5
x 3 FFFFFFFF56.1448277914670 9.81 × 10 5
x 4 FFFFFFFFFFFF
x 5 3.1200106761 9.30 × 10 5 2.558469692 9.27 × 10 5 1.435257392 9.72 × 10 5
x 6 2.293295677 8.81 × 10 5 1.762852536 8.26 × 10 5 1.092039267 9.77 × 10 5
500000 x 1 10.140167479 6.34 × 10 5 FFFF7.441255385 7.61 × 10 5
x 2 11.606574520 6.45 × 10 5 9.344544479 7.44 × 10 5 7.488053366 9.88 × 10 5
x 3 FFFFFFFFFFFF
x 4 FFFFFFFFFFFF
x 5 10.654964448 7.95 × 10 5 10.452152548 8.44 × 10 5 3.634825175 5.30 × 10 5
x 6 12.152579554 6.70 × 10 5 11.107356576 7.94 × 10 5 3.588025175 4.79 × 10 5
1000000 x 1 FFFF22.198959646 7.32 × 10 5 19.936973505 9.73 × 10 5
x 2 22.105380563 7.62 × 10 5 29.406276791 9.50 × 10 5 9.812539268 9.48 × 10 5
x 3 FFFFFFFFFFFF
x 4 FFFFFFFFFFFF
x 5 21.808974525 8.24 × 10 5 20.389356568 9.30 × 10 5 15.194558396 8.79 × 10 5
x 6 22.807375530 9.75 × 10 5 20.966557584 9.36 × 10 5 18.002563434 7.58 × 10 5
P410000 x 1 1.2324102716 7.59 × 10 5 0.421223249 7.98 × 10 5 0.530431210 9.43 × 10 5
x 2 1.232499716 6.37 × 10 5 FFFF0.343233225 7.26 × 10 5
x 3 0.639660423 9.11 × 10 5 FFFF0.483648337 3.20 × 10 5
x 4 1.3728109767 5.00 × 10 5 0.904856566 7.69 × 10 5 0.265223162 6.99 × 10 5
x 5 0.764467489 5.95 × 10 5 0.936056563 7.74 × 10 5 0.312033230 9.46 × 10 5
x 6 0.904869503 6.38 × 10 5 0.748845456 9.46 × 10 5 0.312029201 7.19 × 10 5
100000 x 1 6.33361451016 5.55 × 10 5 2.028029308 8.39 × 10 5 0.858025171 5.97 × 10 5
x 2 4.305681595 5.72 × 10 5 FFFF1.216826185 9.51 × 10 5
x 3 4.414894672 6.11 × 10 5 FFFFFFFF
x 4 5.7096133941 5.40 × 10 5 FFFF10.67052421693 9.95 × 10 5
x 5 10.81092361669 9.65 × 10 5 4.539676743 8.82 × 10 5 1.014026182 8.04 × 10 5
x 6 12.48012781963 9.33 × 10 5 3.837661588 9.79 × 10 5 1.388433230 6.18 × 10 5
500000 x 1 18.751380563 8.60 × 10 5 11.403736382 7.71 × 10 5 5.163622164 8.06 × 10 5
x 2 27.3314108783 4.19 × 10 5 FFFF6.848434243 6.07 × 10 5
x 3 22.2457104727 9.42 × 10 5 FFFFFFFF
x 4 25.4438108772 4.53 × 10 5 FFFF9.531741290 5.30 × 10 5
x 5 16.005765471 6.16 × 10 5 22.542183787 7.39 × 10 5 FFFF
x 6 15.147765471 6.10 × 10 5 29.62461061001 7.35 × 10 5 FFFF
1000000 x 1 26.114658396 9.70 × 10 5 18.002530318 6.99 × 10 5 13.228930214 8.53 × 10 5
x 2 43.929988641 5.95 × 10 5 FFFF35.193879559 9.75 × 10 5
x 3 50.0763107766 7.27 × 10 5 FFFFFFFF
x 4 41.168789646 6.58 × 10 5 FFFF32.089476544 8.00 × 10 5
x 5 FFFF40.981571696 5.67 × 10 5 15.927736262 7.99 × 10 5
x 6 FFFF46.784784794 8.32 × 10 5 15.319333243 6.13 × 10 5
P510000 x 1 0.187263393 7.45 × 10 5 0.280857523 9.51 × 10 5 0.187243260 9.24 × 10 5
x 2 0.156058359 6.55 × 10 5 0.327651482 9.76 × 10 5 0.218444263 9.33 × 10 5
x 3 0.218466416 7.21 × 10 5 0.327648462 8.48 × 10 5 0.109246279 8.87 × 10 5
x 4 0.218457356 8.81 × 10 5 0.171656529 7.64 × 10 5 0.062448287 8.86 × 10 5
x 5 0.327659372 8.83 × 10 5 0.218450476 9.29 × 10 5 0.171646277 6.38 × 10 5
x 6 0.218461380 5.17 × 10 5 0.280856528 9.65 × 10 5 0.218457342 6.78 × 10 5
100000 x 1 1.185655342 7.59 × 10 5 1.762854505 9.86 × 10 5 1.138849293 9.38 × 10 5
x 2 1.341664403 8.71 × 10 5 2.012470652 8.63 × 10 5 0.967249298 8.25 × 10 5
x 3 1.076455351 8.08 × 10 5 1.731658540 7.87 × 10 5 0.936040244 9.46 × 10 5
x 4 1.060858364 7.90 × 10 5 2.215272664 8.05 × 10 5 1.029650297 9.74 × 10 5
x 5 1.123254345 9.11 × 10 5 1.825260564 6.98 × 10 5 0.982844266 9.25 × 10 5
x 6 1.170054345 9.68 × 10 5 1.918858546 9.60 × 10 5 1.170053317 6.33 × 10 5
500000 x 1 9.859368436 8.54 × 10 5 10.717358544 9.43 × 10 5 6.349248291 9.11 × 10 5
x 2 7.332054343 9.05 × 10 5 9.032553509 8.04 × 10 5 6.630052314 7.39 × 10 5
x 3 9.157361390 9.16 × 10 5 9.391354524 9.64 × 10 5 5.896849297 9.96 × 10 5
x 4 9.172965409 6.99 × 10 5 11.575359554 9.60 × 10 5 6.193248291 9.49 × 10 5
x 5 9.750170446 9.58 × 10 5 10.046554515 8.61 × 10 5 5.148046274 9.18 × 10 5
x 6 10.046574471 8.81 × 10 5 9.906154510 9.37 × 10 5 5.304042254 9.39 × 10 5
1000000 x 1 12.854547300 9.63 × 10 5 19.234952495 9.05 × 10 5 12.261747285 9.79 × 10 5
x 2 17.752964404 8.89 × 10 5 19.437757534 8.71 × 10 5 12.277347284 9.13 × 10 5
x 3 16.598557367 9.40 × 10 5 18.314553514 8.15 × 10 5 9.796941252 9.05 × 10 5
x 4 17.643758365 9.47 × 10 5 21.356557547 9.38 × 10 5 13.338153321 9.97 × 10 5
x 5 16.551758374 9.98 × 10 5 23.056961574 7.28 × 10 5 10.779742254 8.93 × 10 5
x 6 21.106973470 8.30 × 10 5 17.908950484 9.87 × 10 5 9.843748290 9.45 × 10 5
P610000 x 1 0.234029356 8.90 × 10 5 0.358844726 7.57 × 10 5 0.312025302 4.15 × 10 5
x 2 0.202818249 6.26 × 10 5 0.249624394 9.27 × 10 5 0.171627331 1.57 × 10 5
x 3 FFFF0.421246771 6.08 × 10 5 0.218431389 7.49 × 10 6
x 4 0.390039496 5.60 × 10 5 0.265230499 5.28 × 10 5 1.63801782328 9.74 × 10 5
x 5 0.436861714 3.56 × 10 5 0.327631617 9.65 × 10 5 0.280828345 3.73 × 10 5
x 6 0.390032382 4.98 × 10 5 0.436833658 8.18 × 10 5 0.265230365 4.52 × 10 5
100000 x 1 2.761264730 7.66 × 10 5 2.761245745 5.99 × 10 5 1.419627325 4.56 × 10 5
x 2 1.248020271 9.77 × 10 5 1.622426428 6.80 × 10 5 1.638027331 4.95 × 10 5
x 3 FFFF3.135651859 7.87 × 10 5 1.435231389 2.37 × 10 5
x 4 1.513223355 8.61 × 10 5 2.152832531 7.49 × 10 5 10.32732002614 9.53 × 10 5
x 5 1.560035410 5.52 × 10 5 1.996826516 8.87 × 10 5 1.575635424 9.13 × 10 5
x 6 1.326023297 2.67 × 10 5 2.324429577 8.33 × 10 5 1.092023286 8.68 × 10 5
500000 x 1 19.749769785 7.69 × 10 5 17.253749810 3.66 × 10 5 6.692428337 7.63 × 10 5
x 2 6.739221282 7.50 × 10 5 8.314927444 3.39 × 10 5 7.768828343 7.28 × 10 5
x 3 FFFF16.582949823 8.70 × 10 5 7.753231389 5.31 × 10 5
x 4 13.728137510 9.35 × 10 5 10.561335581 7.86 × 10 5 62.77482152809 9.57 × 10 5
x 5 23.587474915 7.18 × 10 5 11.622131617 7.23 × 10 5 6.942023287 7.50 × 10 5
x 6 FFFF14.523737738 7.64 × 10 5 7.035624298 2.68 × 10 5
1000000 x 1 20.451737441 8.63 × 10 5 28.875849812 9.26 × 10 5 15.428529347 3.20 × 10 5
x 2 20.342537436 3.12 × 10 5 15.584527444 4.69 × 10 5 14.008929355 4.89 × 10 5
x 3 FFFF32.183046774 8.76 × 10 5 16.910531389 7.50 × 10 5
x 4 FFFF21.621737613 5.54 × 10 5 FFFF
x 5 22.713740463 5.55 × 10 5 25.943034677 6.85 × 10 5 14.336527332 7.83 × 10 5
x 6 60.40361041197 7.67 × 10 5 25.802637737 6.91 × 10 5 17.035332390 2.16 × 10 5
P710000 x 1 0.85801151106 9.91 × 10 5 1.0452941352 9.98 × 10 5 0.327663572 7.90 × 10 5
x 2 0.68641151100 8.92 × 10 5 0.7956781095 8.56 × 10 5 0.327651470 8.36 × 10 5
x 3 0.85801171106 9.28 × 10 5 1.70041972809 8.56 × 10 5 0.624082776 5.81 × 10 5
x 4 0.499283789 8.12 × 10 5 0.7332811127 8.95 × 10 5 0.280855497 9.46 × 10 5
x 5 0.327644422 8.06 × 10 5 0.390038534 9.13 × 10 5 0.202828261 9.80 × 10 5
x 6 0.280838365 9.66 × 10 5 0.312035490 8.47 × 10 5 0.202830280 8.79 × 10 5
100000 x 1 3.525691876 9.83 × 10 5 5.64721191641 7.16 × 10 5 2.496073669 8.97 × 10 5
x 2 2.714470664 9.68 × 10 5 3.9624791090 9.90 × 10 5 1.747249451 8.62 × 10 5
x 3 1.981254507 6.34 × 10 5 10.45211972921 8.97 × 10 5 2.620873694 7.84 × 10 5
x 4 4.89841291214 8.21 × 10 5 4.82041011375 9.77 × 10 5 2.043654493 8.52 × 10 5
x 5 1.716040384 8.11 × 10 5 1.918834478 6.05 × 10 5 0.982826245 6.81 × 10 5
x 6 1.669240384 8.13 × 10 5 1.747237520 6.48 × 10 5 0.920425235 7.47 × 10 5
500000 x 1 27.08181221175 8.79 × 10 5 28.64181151535 9.80 × 10 5 11.232159539 6.96 × 10 5
x 2 19.125788844 8.49 × 10 5 29.92101051502 9.54 × 10 5 11.232161557 6.76 × 10 5
x 3 17.004178732 9.30 × 10 5 67.86042453540 9.88 × 10 5 13.743771665 8.59 × 10 5
x 4 21.434593863 9.32 × 10 5 35.81781271817 8.58 × 10 5 12.760969622 6.21 × 10 5
x 5 9.906141394 8.66 × 10 5 13.400547669 7.35 × 10 5 5.725226245 8.93 × 10 5
x 6 10.202541394 8.66 × 10 5 11.606544623 8.75 × 10 5 5.491227255 4.76 × 10 5
1000000 x 1 41.933198946 9.61 × 10 5 64.83401281775 9.35 × 10 5 20.919754498 7.66 × 10 5
x 2 50.15431131081 9.85 × 10 5 38.2670791022 9.06 × 10 5 22.947764587 9.00 × 10 5
x 3 44.33551071006 8.38 × 10 5 FFFF30.123880762 9.65 × 10 5
x 4 41.8083101940 9.97 × 10 5 64.42841351839 8.57 × 10 5 19.032151465 9.58 × 10 5
x 5 17.893343413 9.12 × 10 5 19.936940566 6.89 × 10 5 10.779726245 8.30 × 10 5
x 6 18.361343413 9.13 × 10 5 17.550135504 7.07 × 10 5 9.859325235 8.16 × 10 5
Table A3. Numerical results of the 7 test problems in [15] with random initial guesses.
P | Dim | x0 | NDDF: CPU, Ni, Nf, Norm (F) | DFMPRP: CPU, Ni, Nf, Norm (F) | MSBFGS2: CPU, Ni, Nf, Norm (F)
P110000 x 7 0.10921143 5.71 × 10 6 FFFF0.0624623 4.08 × 10 5
x 8 0.0780935 6.98 × 10 8 0.10921178 3.52 × 10 5 0.0000519 9.40 × 10 7
100000 x 7 0.21841143 2.04 × 10 5 FFFF0.1560727 8.28 × 10 9
x 8 0.0936935 2.11 × 10 7 0.39001285 2.81 × 10 5 0.0624519 2.85 × 10 6
500000 x 7 0.92041143 4.53 × 10 5 FFFF0.3744727 1.87 × 10 8
x 8 0.6240935 4.80 × 10 7 1.77841285 6.20 × 10 5 0.3744519 6.47 × 10 6
1000000 x 7 1.87201143 6.28 × 10 5 FFFF1.0764727 2.74 × 10 8
x 8 1.3728935 6.73 × 10 7 3.90001285 8.82 × 10 5 0.7956519 9.12 × 10 6
P210000 x 7 0.0624519 2.41 × 10 7 0.06241176 7.03 × 10 5 0.0780415 6.89 × 10 10
x 8 0.0000519 2.33 × 10 7 0.07801176 6.97 × 10 5 0.0000415 8.04 × 10 10
100000 x 7 0.1560519 7.46 × 10 7 0.46801283 5.53 × 10 5 0.0624415 2.41 × 10 9
x 8 0.1404519 7.43 × 10 7 0.43681283 5.51 × 10 5 0.0936415 2.47 × 10 9
500000 x 7 0.3120519 1.66 × 10 6 1.88761390 3.09 × 10 5 0.4056415 5.52 × 10 9
x 8 0.2652519 1.67 × 10 6 1.91881390 3.09 × 10 5 0.2808415 5.52 × 10 9
1000000 x 7 0.7176519 2.35 × 10 6 3.36961390 4.36 × 10 5 0.4836415 7.86 × 10 9
x 8 0.7020519 2.37 × 10 6 3.26041390 4.37 × 10 5 0.4368415 7.81 × 10 9
P310000 x 7 FFFFFFFFFFFF
x 8 0.390084598 7.73 × 10 5 FFFF0.187244305 9.94 × 10 5
100000 x 7 FFFFFFFFFFFF
x 8 1.622467484 8.60 × 10 5 FFFF1.513261418 9.42 × 10 5
500000 x 7 FFFFFFFFFFFF
x 8 8.876563452 6.87 × 10 5 FFFF5.553642293 8.80 × 10 5
1000000 x 7 FFFFFFFFFFFF
x 8 25.771478562 5.31 × 10 5 FFFF13.462956387 4.05 × 10 5
P410000 x 7 0.717663470 9.28 × 10 5 0.468029319 7.79 × 10 5 0.296427193 6.50 × 10 5
x 8 0.639645325 7.25 × 10 5 1.185670715 9.12 × 10 5 0.312031210 7.03 × 10 5
100000 x 7 4.4148102753 4.81 × 10 5 2.558434374 9.94 × 10 5 4.1652102713 3.86 × 10 5
x 8 2.246457411 4.66 × 10 5 11.10731871632 9.72 × 10 5 1.606837273 7.04 × 10 5
500000 x 7 36.51981671221 9.33 × 10 5 9.282131343 7.05 × 10 5 11.138554385 9.44 × 10 5
x 8 10.966951369 9.70 × 10 5 45.95791881703 8.95 × 10 5 19.6717103720 6.99 × 10 5
1000000 x 7 FFFF22.167735386 3.90 × 10 5 23.759055391 7.98 × 10 5
x 8 38.532286616 3.87 × 10 5 FFFF90.52742151527 1.00 × 10 4
P510000 x 7 0.265281495 7.48 × 10 5 0.312066604 7.55 × 10 5 0.280870414 9.63 × 10 5
x 8 0.327683512 6.64 × 10 5 0.280865595 8.63 × 10 5 0.280882487 8.19 × 10 5
100000 x 7 2.2152113695 7.98 × 10 5 2.449286762 8.45 × 10 5 1.622478468 9.51 × 10 5
x 8 2.1216104638 6.55 × 10 5 1.918867621 8.75 × 10 5 1.887690530 9.22 × 10 5
500000 x 7 11.481794577 8.91 × 10 5 10.623769633 8.47 × 10 5 9.703388521 9.80 × 10 5
x 8 10.530187533 6.40 × 10 5 12.355383755 7.93 × 10 5 9.453788523 9.64 × 10 5
1000000 x 7 20.904192564 8.90 × 10 5 21.216175696 8.82 × 10 5 18.486190531 9.86 × 10 5
x 8 25.8806108668 8.34 × 10 5 22.900977712 9.18 × 10 5 17.986987514 8.06 × 10 5
P610000 x 7 1.31041502015 7.23 × 10 5 0.96721221897 6.02 × 10 5 0.218434415 3.80 × 10 5
x 8 0.234036459 5.85 × 10 5 0.436836599 8.73 × 10 5 0.171634430 3.98 × 10 5
100000 x 7 9.11051842547 8.10 × 10 5 11.76252133364 7.34 × 10 5 1.372835422 9.91 × 10 5
x 8 3.9624921151 8.71 × 10 5 1.856433551 5.52 × 10 5 1.310427335 4.85 × 10 5
500000 x 7 73.71052553550 7.16 × 10 5 FFFF8.938941487 4.28 × 10 5
x 8 45.02191532119 6.35 × 10 5 10.249335583 8.12 × 10 5 8.080933398 2.05 × 10 5
1000000 x 7 67.93841071445 8.83 × 10 5 FFFF16.473737449 6.45 × 10 5
x 8 46.0047821094 7.63 × 10 5 21.606142695 8.61 × 10 5 13.930934407 2.22 × 10 5
P710000 x 7 0.54601071006 9.88 × 10 5 1.84081892772 9.46 × 10 5 0.436870647 9.52 × 10 5
x 8 0.577286837 9.93 × 10 5 1.46641732473 8.88 × 10 5 0.405680749 6.61 × 10 5
100000 x 7 2.402472681 8.18 × 10 5 17.58133244944 7.03 × 10 5 1.934459543 7.70 × 10 5
x 8 3.94681091114 6.27 × 10 5 15.03852634219 8.64 × 10 5 3.338490863 9.70 × 10 5
500000 x 7 16.785787822 9.82 × 10 5 FFFF8.236948450 9.59 × 10 5
x 8 37.47141621775 8.86 × 10 5 FFFF14.102573712 9.46 × 10 5
1000000 x 7 41.80831081032 8.80 × 10 5 FFFF25.865080746 7.03 × 10 5
x 8 66.14441341487 9.60 × 10 5 FFFF29.297084827 4.95 × 10 5

References

  1. Ortega, J.M.; Rheinboldt, W.C. Iterative Solution of Nonlinear Equations in Several Variables; Siam: Philadelphia, PA, USA, 1970. [Google Scholar]
  2. Sun, W.; Yuan, Y.X. Optimization Theory and Methods; Springer: Berlin/Heidelberg, Germany, 2006. [Google Scholar]
  3. Li, D.; Fukushima, M. A globally and superlinearly convergent Gauss–Newton-based BFGS method for symmetric equations. SIAM J. Numer. Anal. 1999. [Google Scholar] [CrossRef]
  4. Gu, G.Z.; Li, D.H.; Qi, L.; Zhou, S.Z. Descent directions of quasi-Newton methods for symmetric nonlinear equations. SIAM J. Numer. Anal. 2002, 40, 1763–1774. [Google Scholar] [CrossRef] [Green Version]
  5. Li, D.H.; Fukushima, M. A modified BFGS method and its global convergence in nonconvex minimization. J. Comput. Appl. Math. 2001, 129, 15–35. [Google Scholar] [CrossRef] [Green Version]
  6. Zhou, W. A modified BFGS type quasi-Newton method with line search for symmetric nonlinear equations problems. J. Comput. Appl. Math. 2020, 367, 112454. [Google Scholar] [CrossRef]
  7. Wang, J.; Zhu, D. The inexact-Newton via GMRES subspace method without line search technique for solving symmetric nonlinear equations. Appl. Numer. Math. 2016, 110, 174–189. [Google Scholar] [CrossRef]
  8. Yuan, G.L.; Yao, S.W. A BFGS algorithm for solving symmetric nonlinear equations. Optimization 2013, 62, 85–99. [Google Scholar] [CrossRef]
  9. Wang, X.L.; Li, D.H. A modified Fletcher-Reeves-type derivative-free method for symmetric nonlinear equations. Numer. Algebra Control Optim. 2011, 1, 71–82. [Google Scholar]
  10. Zhang, L.; Zhou, W.; Li, D. Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search. Numer. Math. 2006, 104, 561–572. [Google Scholar] [CrossRef]
  11. Dai, Y.H.; Kou, C.X. A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search. SIAM J. Optim. 2013, 23, 296–320. [Google Scholar] [CrossRef] [Green Version]
  12. Xiao, Y.; Wu, C.; Wu, S.Y. Norm descent conjugate gradient methods for solving symmetric nonlinear equations. J. Glob. Optim. 2015, 62, 751–762. [Google Scholar] [CrossRef]
  13. Zhou, W.; Shen, D. Convergence properties of an iterative method for solving symmetric nonlinear equations. J. Optim. Theory Appl. 2015, 164, 277–289. [Google Scholar] [CrossRef]
  14. Zhang, L.; Zhou, W.; Li, D.H. A descent modified Polak-Ribière-Polyak conjugate gradient method and its global convergence. IMA J. Numer. Anal. 2006, 26, 629–640. [Google Scholar] [CrossRef]
  15. Liu, J.K.; Feng, Y.M. A norm descent derivative-free algorithm for solving large-scale nonlinear symmetric equations. J. Comput. Appl. Math. 2018, 344, 89–99. [Google Scholar] [CrossRef]
  16. Liu, J.K.; Li, S.J. New three-term conjugate gradient method with guaranteed global convergence. Int. J. Comput. Math. 2014, 91, 1744–1754. [Google Scholar] [CrossRef]
  17. Cheng, W.Y.; Chen, Z.X. Nonmonotone spectral method for large-scale symmetric nonlinear equations. Numer. Algorithms 2013, 62, 149–162. [Google Scholar] [CrossRef]
  18. Yusuf, W.M.; Sabiu, J. A derivative-free conjugate gradient method and its global convergence for solving symmetric nonlinear equations. Int. J. Math. Math. Sci. 2015, 2015, 1–8. [Google Scholar]
  19. Zhou, W.J.; Chen, X.L. On the convergence of a derivative-free HS type method for symmetric nonlinear equations. Adv. Model. Optim. 2012, 3, 645–654. [Google Scholar]
  20. Yakubu, U.A.; Mamat, M. A recent modification on Dai–Liao conjugate gradient method for solving symmetric nonlinear equations. Far East J. Math. Sci. 2018, 103, 1961–1974. [Google Scholar] [CrossRef]
  21. Lv, J.; Deng, S.; Wan, Z. An efficient single-parameter scaling memoryless Broyden-Fletcher-Goldfarb-Shanno algorithm for solving large scale unconstrained optimization problems. IEEE Access 2020, 8, 85664–85674. [Google Scholar] [CrossRef]
  22. Zhou, W.J.; Zhang, L. A nonlinear conjugate gradient method based on the MBFGS secant condition. Optim. Methods Softw. 2006, 21, 707–714. [Google Scholar] [CrossRef]
  23. Byrd, R.H.; Nocedal, J. A tool for the analysis of quasi-Newton methods with application to unconstrained minimization. SIAM J. Numer. Anal. 1989, 26, 727–739. [Google Scholar] [CrossRef]
  24. Dennis, J.E.; Moré, J.J. A characterization of superlinear convergence and its application to quasi-Newton methods. Math. Comput. 1974, 28, 549–560. [Google Scholar] [CrossRef]
  25. Li, Q.; Li, D.H. A class of derivative-free methods for large-scale nonlinear monotone equations. IMA J. Numer. Anal. 2011, 31, 1625–1635. [Google Scholar] [CrossRef]
  26. Raydan, M. The Barzilai and Borwein gradient method for the large scale unconstrained minimization problem. SIAM J. Optim. 1997, 7, 26–33. [Google Scholar] [CrossRef]
  27. Kelley, C.T. Iterative methods for linear and nonlinear equations. Front. Appl. Math. 1995, 16, 206–207. [Google Scholar]
  28. Yamakawa, E.; Fukushima, M. Testing parallel variable transformation. Comput. Optim. Appl. 1999, 13, 253–274. [Google Scholar] [CrossRef]
  29. Dolan, E.D.; Moré, J.J. Benchmarking optimization software with performance profiles. Math. Program. 2002, 91, 201–213. [Google Scholar] [CrossRef]
Figure 1. The total number of iterations with different values of algorithmic parameters.
Figure 2. Comparison of numerical performance among three methods.
Figure 3. Comparison of numerical performance among three methods.
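Figures 2 and 3 compare the methods by the performance-profile methodology of Dolan and Moré [29]. As an illustration of how such profiles are computed (the solver costs and τ grid below are hypothetical, not the paper's data):

```python
import numpy as np

def performance_profile(costs, taus):
    """Dolan-More performance profile.

    costs: (n_problems, n_solvers) array of a cost metric (e.g. CPU
    seconds), with np.inf marking a failed run ("F" in the tables).
    Returns rho with rho[i][s] = fraction of problems that solver s
    solves within a factor taus[i] of the best solver on each problem.
    """
    costs = np.asarray(costs, dtype=float)
    best = costs.min(axis=1, keepdims=True)   # best cost on each problem
    ratios = costs / best                      # performance ratios r_{p,s}
    # rho_s(tau) = |{p : r_{p,s} <= tau}| / n_problems
    return np.array([(ratios <= tau).mean(axis=0) for tau in taus])

# Hypothetical costs for 3 solvers on 4 problems; inf = failure.
costs = [[1.0, 2.0, 1.5],
         [3.0, np.inf, 2.0],
         [0.5, 0.6, 0.5],
         [2.0, 4.0, 1.0]]
rho = performance_profile(costs, taus=[1.0, 2.0, 4.0])
# rho[0] is each solver's share of outright-best runs (tau = 1).
```

In such a plot, the curve that sits higher at small τ corresponds to the method that is most often fastest, while the height as τ grows large gives the fraction of problems the method solves at all.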
Table 1. Total number of wins or failures of algorithms.
| Algorithm | CPU Wins | Iter Wins | Nf Wins | Fails |
|---|---|---|---|---|
| MBFGS | 88 | 56 | 60 | 15 |
| GNBFGS | 95 | 67 | 65 | 0 |
| MSBFGS | 127 | 124 | 123 | 1 |
Table 2. Total number of wins or failures of algorithms.
| Algorithm | CPU Wins | Iter Wins | Nf Wins | Fails |
|---|---|---|---|---|
| NDDF | 35 | 31 | 32 | 54 |
| DFMPRP | 14 | 25 | 44 | 6 |
| MSBFGS | 218 | 177 | 194 | 16 |
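Tables 1 and 2 tally, for each metric, the number of runs on which an algorithm attains the best value, together with its failures. A minimal sketch of such a tally (the run data are hypothetical, and the convention that ties credit every best-performing solver is an assumption, not the paper's code):

```python
import math

def tally_wins(runs):
    """Count wins and failures per solver across a list of runs.

    Each run is a list with one metric value per solver (e.g. CPU
    time); math.inf marks a failure. A solver "wins" a run when it
    attains the minimum value; ties credit every minimizer.
    """
    n = len(runs[0])
    wins, fails = [0] * n, [0] * n
    for run in runs:
        best = min(run)
        for s, v in enumerate(run):
            if v == math.inf:
                fails[s] += 1      # failed run: cannot win
            elif v == best:
                wins[s] += 1
    return wins, fails

# Hypothetical CPU times for three solvers on three runs.
runs = [[0.26, 0.31, 0.28],
        [2.21, math.inf, 1.62],
        [11.4, 10.6, 10.6]]
wins, fails = tally_wins(runs)
```

Because ties are credited to every minimizer, the win columns of a table built this way can sum to more than the number of runs.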

Guo, J.; Wan, Z. Two Modified Single-Parameter Scaling Broyden–Fletcher–Goldfarb–Shanno Algorithms for Solving Nonlinear System of Symmetric Equations. Symmetry 2021, 13, 970. https://doi.org/10.3390/sym13060970


