Article

Two Classes of Restart Algorithms for Solving Pseudomonotone Nonlinear Equations

by Jitsupa Deepho 1, Auwal Bala Abubakar 2,3,5,* and Abdulkarim Hassan Ibrahim 4,5

1 Faculty of Science, Energy and Environment, King Mongkut’s University of Technology North Bangkok, 19 Moo 11, Tambon Nonglalok, Amphur Bankhai, Rayong 21120, Thailand
2 Department of Art and Science, George Mason University, Songdomunhwa-ro 119-4, Yeonsu-gu, Incheon 21985, Republic of Korea
3 Numerical Optimization Research Group, Department of Mathematical Sciences, Faculty of Physical Sciences, Bayero University, Kano 700241, Nigeria
4 Faculty of Mathematics and Data Science, Emirates Aviation University, Dubai 53044, United Arab Emirates
5 Department of Mathematics and Applied Mathematics, Sefako Makgatho Health Sciences University, Ga-Rankuwa, Pretoria 0204, Medunsa, South Africa
* Author to whom correspondence should be addressed.
Algorithms 2025, 18(12), 743; https://doi.org/10.3390/a18120743
Submission received: 1 October 2025 / Revised: 31 October 2025 / Accepted: 9 November 2025 / Published: 26 November 2025

Abstract

In this study, we introduce two efficient derivative-free algorithms enhanced by a restart strategy to solve nonlinear pseudomonotone equations. We show that the search directions generated by the algorithms are descent directions and remain bounded, and that, under the assumptions of pseudomonotonicity and continuity, the algorithms generate sequences that converge globally to solutions. Numerical experiments on benchmark test problems highlight the computational efficiency of the proposed algorithms compared with several existing methods. In addition, we illustrate their applicability to logistic regression problems, showcasing their practical relevance.

1. Introduction

Conjugate Gradient (CG) algorithms are highly efficient for solving nonlinear problems, particularly those of large scale. Their efficiency stems from the fact that they avoid storing $n \times n$ matrices, instead relying on $n \times 1$ column vectors. These algorithms are particularly effective for solving unconstrained minimization problems of the following form:
$$\min_{x \in \mathbb{R}^n} f(x),$$
where $f : \mathbb{R}^n \to \mathbb{R}$ is continuously differentiable and bounded below. To solve this problem using the CG method, an initial point $x_0$ is chosen, and the following iterative procedure is applied:
$$x_{k+1} = x_k + \alpha_k d_k, \qquad k = 0, 1, \ldots,$$
where the sequence $\{x_k\}_{k=1}^{\infty}$ is expected to converge to the solution. The search direction $d_k$ is determined by the following:
$$d_k := \begin{cases} -F(x_k), & \text{if } k = 0, \\ -F(x_k) + \beta_k d_{k-1}, & \text{if } k \geq 1, \end{cases}$$
where $F := \nabla f$ is the gradient of $f$, and $\beta_k$ is a CG parameter that varies across different CG algorithms. The step size $\alpha_k$ is obtained via a line search along the direction $d_k$. A first-order necessary condition for optimality states that if $\tilde{x}$ is a minimizer of $f$, then
$$F(\tilde{x}) = 0. \qquad (1)$$
Problems of this form arise in numerous high-impact applications, including signal recovery [1], image restoration [2], logistic regression [3], power flow equations [4], and variational inequalities [5].
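For readers less familiar with the CG framework, the iteration above can be summarized in a few lines of MATLAB. The sketch below is purely illustrative: the routines gradf and linesearch, and the Fletcher-Reeves choice of the parameter, are stand-ins and are not part of this work.

```matlab
% Illustrative CG-type skeleton (not the method proposed in this paper).
% f, gradf and linesearch are assumed user-supplied routines.
x = x0;  g = gradf(x);  d = -g;
for k = 0:maxit-1
    if norm(g) <= tol, break; end            % stop when the gradient vanishes
    alpha = linesearch(f, gradf, x, d);      % step size along d_k
    x = x + alpha*d;                         % x_{k+1} = x_k + alpha_k d_k
    gnew = gradf(x);
    beta = (gnew'*gnew)/(g'*g);              % e.g. the Fletcher-Reeves parameter
    d = -gnew + beta*d;                      % d_{k+1} = -F(x_{k+1}) + beta_k d_k
    g = gnew;
end
```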
For a CG algorithm to be both convergent and efficient, the search direction $d_k$ must satisfy certain criteria. These include the boundedness of the search direction and the sufficient descent property:
$$F(x_k)^T d_k \leq -c\,\|F(x_k)\|^2, \qquad c > 0. \qquad (2)$$
However, some CG algorithms may fail to achieve global convergence if they do not satisfy the sufficient descent condition.
A restart strategy is often used to ensure that the search direction $d_k$ satisfies the sufficient descent condition at each iteration. Various restart algorithms have been proposed in the literature. For example, Powell [6] introduced a restart technique to improve convergence rates and ensure the positive definiteness of the search direction. Other researchers, such as Boland and Kowalik [7], Shanno [8], Knockaert and De Zutter [9], and Dai [10], have also developed restart procedures to guarantee global convergence in their algorithms.
Despite these advancements, many restart strategies based on the Powell criterion can suffer from slow convergence because they discard information from previous search directions. To address this, Jiang et al. [11] proposed efficient restart algorithms that exploit previous search directions and ensure global convergence under Wolfe line search conditions. Their work demonstrated improved performance compared with traditional restart methods.
In recent years, derivative-free methods have gained attention for solving problems where the gradient $\nabla f$ is not available. These methods rely solely on function evaluations, making them suitable for problems where $f$ is not differentiable. Cheng [12] proposed a PRP restart algorithm for solving such problems, demonstrating its efficiency over non-restart algorithms. However, the effectiveness of the restart strategy was not fully validated due to limited numerical experiments. In [13], Li et al. proposed two classes of derivative-free methods with self-adaptive and restart features. The proposed methods, which extend MCG1 and MCG2 [14], generate descent directions and achieve global convergence. In addition, the linear convergence rate of these methods was established under a local error bound condition, and their effectiveness in solving constrained nonlinear equations, together with an application, was demonstrated. However, the methods in [12,13] require F to be monotone and Lipschitz continuous. For more on derivative-free methods, interested readers are referred to [1,2,15,16,17,18].
Pseudomonotonicity, a weaker condition than monotonicity, has also been explored in the context of CG algorithms. Recent work by Liu et al. [19], Awwal et al. [20], and Liu et al. [21] has focused on solving problems involving pseudomonotone and continuous operators, though their algorithms do not incorporate restart strategies. More details about pseudomonotone operators can also be found in [22].
In this work, we propose two efficient restart algorithms for solving problems involving pseudomonotone and continuous operators. The algorithms are designed to ensure global convergence while maintaining computational efficiency. The manuscript is organized as follows: Section 2 introduces the proposed algorithms and their convergence properties, Section 3 presents numerical experiments demonstrating their performance, Section 4 applies the methods to logistic regression models, and Section 5 concludes the work.

2. Derivative-Free Projection Method with Restart Technique

In this section, we propose two classes of restart derivative-free algorithms to solve (1) with $F : \mathbb{R}^n \to \mathbb{R}^n$ being
1. pseudomonotone, i.e., for all $x_1, x_2 \in \mathbb{R}^n$,
$$F(x_1)^T (x_2 - x_1) \geq 0 \quad \text{implies} \quad F(x_2)^T (x_2 - x_1) \geq 0,$$
2. and continuous.
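As a simple one-dimensional illustration (ours, not one of the test problems considered later), take $F(x) = x e^{x}$. Since $F(x)$ has the same sign as $x$, the inequality $F(x_1)(x_2 - x_1) \geq 0$ forces $x_2 - x_1$ to carry the sign of $x_1$ (or $x_1 = 0$), and then $F(x_2)(x_2 - x_1) \geq 0$ as well, so $F$ is pseudomonotone. It is not monotone, however, since $\big(F(-2) - F(-3)\big)\big((-2) - (-3)\big) = -2e^{-2} + 3e^{-3} < 0$. This shows that the standing assumption is strictly weaker than monotonicity.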
To begin with, we recall the SPCG2+ method for solving the unconstrained optimization problem proposed by Liu et al. [23]. The search direction is defined as follows:
$$d_k = \begin{cases} -F_k, & k = 0, \\ -\left(1 + \beta_k \dfrac{F_k^T d_{k-1}}{\|F_k\|^2}\right) F_k + \beta_k d_{k-1}, & k \geq 1, \end{cases} \qquad (3)$$
where $\beta_k$ is defined as
$$\beta_k = \frac{F_k^T y_{k-1}}{d_{k-1}^T y_{k-1}} - \frac{F_k^T d_{k-1}}{\|d_{k-1}\|^2}, \qquad y_{k-1} := F(x_k) - F(x_{k-1}). \qquad (4)$$
In order to obtain global convergence, the search direction generated by the algorithm is required to satisfy the sufficient descent property (2). As such, we propose two classes of restart methods with the following search directions:
$$d_k^{I} := \begin{cases} -F(x_k), & \text{if } k = 0, \\ -\left(1 + \beta_k \dfrac{F(x_k)^T d_{k-1}}{\|F(x_k)\|^2}\right) F(x_k) + \beta_k d_{k-1}, & (k > 0) \text{ if } \|F(x_k)\|\,\|y_{k-1}\|\,\|d_{k-1}\| \leq d_{k-1}^T y_{k-1}, \\ -F(x_k) + \mu \dfrac{F(x_k)^T y_{k-1}}{\|y_{k-1}\|^2}\, y_{k-1}, & (k > 0) \text{ otherwise}, \quad |\mu| < 1, \end{cases} \qquad (5)$$
and
$$d_k^{II} := \begin{cases} -F(x_k), & \text{if } k = 0, \\ -\left(1 + \beta_k \dfrac{F(x_k)^T d_{k-1}}{\|F(x_k)\|^2}\right) F(x_k) + \beta_k d_{k-1}, & (k > 0) \text{ if } \|y_{k-1}\|\,\|d_{k-1}\| \leq d_{k-1}^T y_{k-1}, \\ -F(x_k) + \mu \dfrac{F(x_k)^T d_{k-1}}{\|d_{k-1}\|^2}\, d_{k-1}, & (k > 0) \text{ otherwise}, \quad |\mu| < 1. \end{cases} \qquad (6)$$
Note that the restart conditions in $d_k^{I}$ and $d_k^{II}$ are necessary in order for the search direction to satisfy inequality (2), which in turn is needed for the algorithm to converge. However, these restart conditions may not always hold. As such, in that case we define $d_k^{I}$ and $d_k^{II}$ through the fallback directions $-F(x_k) + \mu \frac{F(x_k)^T y_{k-1}}{\|y_{k-1}\|^2} y_{k-1}$ and $-F(x_k) + \mu \frac{F(x_k)^T d_{k-1}}{\|d_{k-1}\|^2} d_{k-1}$, respectively, with $|\mu| < 1$; both choices always satisfy (2). In addition, such directions have been shown to be efficient in practice.
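For concreteness, the second class of directions can be computed as in the following MATLAB sketch (a simplified illustration in our own notation, not the authors' implementation); the first class differs only in the restart test and the fallback direction of (5).

```matlab
% Minimal sketch of the restart direction d_k^II in (6); illustrative only.
% Fk, Fkm1 : current and previous residuals F(x_k), F(x_{k-1}); dkm1 : d_{k-1};
% mu : restart parameter with |mu| < 1; k : iteration counter (k = 0 first).
function d = direction_II(Fk, Fkm1, dkm1, mu, k)
    if k == 0
        d = -Fk;                                   % first direction: -F(x_0)
        return;
    end
    y = Fk - Fkm1;                                 % y_{k-1} = F(x_k) - F(x_{k-1})
    if norm(y)*norm(dkm1) <= dkm1'*y               % restart test of d_k^II
        beta = (Fk'*y)/(dkm1'*y) - (Fk'*dkm1)/norm(dkm1)^2;
        d = -(1 + beta*(Fk'*dkm1)/norm(Fk)^2)*Fk + beta*dkm1;
    else                                           % fallback: always satisfies (2)
        d = -Fk + mu*((Fk'*dkm1)/norm(dkm1)^2)*dkm1;
    end
end
```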
Next, we present two restart algorithms that generate approximate solutions of the pseudomonotone Equation (1), using the search directions $d_k$ in (5) and (6), together with their convergence analysis.
Lemma 1.
Let $F(x_0), \ldots, F(x_k)$ be nonzero for some $k \geq 0$, where $x_0, \ldots, x_k$ are the first $k+1$ iterates obtained via Algorithm 1. If $d_k$ is defined by (5) or (6), then (2) holds with $c = 1 - \mu$; that is,
$$F(x_k)^T d_k \leq -(1 - \mu)\,\|F(x_k)\|^2. \qquad (7)$$
Algorithm 1 Restart Derivative-Free Projection Method (RDFPM)
Initialization: Select an initial vector $x_0 \in \mathbb{R}^n$, $r \in (0, 1)$, $\gamma \in (0, 2)$, $\sigma \in (0, 1)$, and a tolerance $tol > 0$. Set $k = 0$.
Step 1. Evaluate $F(x_k)$. If $\|F(x_k)\| \leq tol$, stop; else go to Step 2.
Step 2. Compute the search direction $d_k$ by (5) or (6).
Step 3. Set $z_k = x_k + \alpha_k d_k$ with $\alpha_k := r^{\iota_k}$ and
$$\iota_k := \min\left\{ i \in \mathbb{N} \cup \{0\} : -F(x_k + r^i d_k)^T d_k \geq \sigma r^i \|F(x_k + r^i d_k)\|\,\|d_k\|^2 \right\}.$$
Step 4. If $F(z_k) = 0$, stop; else go to Step 5.
Step 5. Set
$$x_{k+1} = x_k - \gamma \frac{F(z_k)^T (x_k - z_k)}{\|F(z_k)\|^2}\, F(z_k).$$
Step 6. Let $k = k + 1$ and repeat the process from Step 1.
Proof. 
Consider $d_k$ defined by (5).
Case 1: When $k = 0$, $F(x_0)^T d_0 = -\|F(x_0)\|^2 \leq -(1 - \mu)\,\|F(x_0)\|^2$.
Case 2: When $k > 0$ and $\|F(x_k)\|\,\|y_{k-1}\|\,\|d_{k-1}\| \leq d_{k-1}^T y_{k-1}$, we get
$$F(x_k)^T d_k = -\left(1 + \beta_k \frac{F(x_k)^T d_{k-1}}{\|F(x_k)\|^2}\right) F(x_k)^T F(x_k) + \beta_k F(x_k)^T d_{k-1} = -\|F(x_k)\|^2 - \beta_k F(x_k)^T d_{k-1} + \beta_k F(x_k)^T d_{k-1} = -\|F(x_k)\|^2 \leq -(1 - \mu)\,\|F(x_k)\|^2.$$
Case 3: If $k > 0$ and $\|F(x_k)\|\,\|y_{k-1}\|\,\|d_{k-1}\| > d_{k-1}^T y_{k-1}$, we get
$$F(x_k)^T d_k = -\|F(x_k)\|^2 + \mu \frac{\left(F(x_k)^T y_{k-1}\right)^2}{\|y_{k-1}\|^2} \leq -\|F(x_k)\|^2 + \mu \frac{\|F(x_k)\|^2 \|y_{k-1}\|^2}{\|y_{k-1}\|^2} = -\|F(x_k)\|^2 + \mu \|F(x_k)\|^2 = -(1 - \mu)\,\|F(x_k)\|^2.$$
Therefore, for all three cases above, we end up with (7).
The proof for the case when $d_k$ is defined by (6) is similar because it does not depend on the particular restart condition, so we omit it.    □
Remark 1.
As Algorithm 1 generates $k+1$ iterates $x_0, \ldots, x_k$ for $k \geq 0$ with $F(x_j) \neq 0$ for $j = 0, 1, \ldots, k$, we can deduce from Lemma 1 that, for $j = 0, 1, \ldots, k$,
$$-\|F(x_j)\|\,\|d_j\| \leq F(x_j)^T d_j \leq -(1 - \mu)\,\|F(x_j)\|^2 < 0,$$
which implies
$$0 < (1 - \mu)\,\|F(x_j)\| \leq \|d_j\|.$$
This means that Algorithm 1 runs as long as $x_k$ is not a solution, i.e., $F(x_k) \neq 0$.
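To make the interplay of Steps 1 through 6 concrete, the following MATLAB sketch renders the main loop of Algorithm 1 with direction (6). It is an illustrative rendering under the notation above, not the authors' code; direction_II refers to the helper sketched after the definition of (5) and (6).

```matlab
% Illustrative MATLAB sketch of Algorithm 1 (RDFPM) with search direction (6).
% F is a function handle for the operator; x0 the starting point;
% r in (0,1), sigma in (0,1), gamma in (0,2), |mu| < 1.
function x = rdfpm(F, x0, r, sigma, gamma, mu, tol, maxit)
    x = x0;  Fx = F(x);  Fprev = Fx;  dprev = -Fx;
    for k = 0:maxit-1
        if norm(Fx) <= tol, return; end            % Step 1: stopping test
        d = direction_II(Fx, Fprev, dprev, mu, k); % Step 2: search direction
        alpha = 1;  Fz = F(x + alpha*d);           % Step 3: line search, alpha_k = r^iota_k
        while -Fz'*d < sigma*alpha*norm(Fz)*norm(d)^2
            alpha = r*alpha;  Fz = F(x + alpha*d);
        end
        z = x + alpha*d;
        if norm(Fz) == 0, x = z; return; end       % Step 4
        Fprev = Fx;  dprev = d;                    % keep F(x_k) and d_k for the next direction
        x  = x - gamma*((Fz'*(x - z))/norm(Fz)^2)*Fz;   % Step 5: projection step
        Fx = F(x);                                 % Step 6: next iteration
    end
end
```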
Before stating the next Lemma, we would like to assume that the solution set to problem (1) is nonempty.
Lemma 2.
Let $\tilde{x}$ be a solution of problem (1) for a pseudomonotone and continuous operator F. If a sequence $\{x_k\}_{k=1}^{\infty}$ is generated by Algorithm 1, then
$$\|x_{k+1} - \tilde{x}\|^2 \leq \|x_k - \tilde{x}\|^2 - \gamma(2 - \gamma)\,\sigma^2\, \|x_k - z_k\|^4.$$
In particular, $\lim_{k \to \infty} \|x_k - \tilde{x}\|$ exists.
Proof. 
Since $\tilde{x}$ is a solution of (1) and F is pseudomonotone, we have
$$F(\tilde{x})^T (z_k - \tilde{x}) \geq 0 \quad \Longrightarrow \quad F(z_k)^T (z_k - \tilde{x}) \geq 0.$$
In addition, by the definition of $z_k$ in Algorithm 1, we have
$$-F(z_k)^T d_k \geq \sigma \alpha_k \|F(z_k)\|\,\|d_k\|^2,$$
which implies
$$F(z_k)^T (x_k - z_k) = F(z_k)^T (-\alpha_k d_k) = -\alpha_k F(z_k)^T d_k \geq \sigma \alpha_k^2 \|F(z_k)\|\,\|d_k\|^2 = \sigma \|F(z_k)\|\,\|x_k - z_k\|^2 \geq 0.$$
Next, we estimate $\|x_{k+1} - \tilde{x}\|^2$ using (8), (10), (11) and $\gamma \in (0, 2)$:
$$\begin{aligned}
\|x_{k+1} - \tilde{x}\|^2 &= \left\| x_k - \gamma \frac{F(z_k)^T (x_k - z_k)}{\|F(z_k)\|^2} F(z_k) - \tilde{x} \right\|^2 \\
&= \|x_k - \tilde{x}\|^2 - 2\gamma \frac{F(z_k)^T (x_k - z_k)}{\|F(z_k)\|^2}\, F(z_k)^T (x_k - \tilde{x}) + \gamma^2 \frac{\left(F(z_k)^T (x_k - z_k)\right)^2}{\|F(z_k)\|^2} \\
&= \|x_k - \tilde{x}\|^2 - 2\gamma \frac{\left(F(z_k)^T (x_k - z_k)\right)^2}{\|F(z_k)\|^2} - 2\gamma \frac{F(z_k)^T (x_k - z_k)\, F(z_k)^T (z_k - \tilde{x})}{\|F(z_k)\|^2} + \gamma^2 \frac{\left(F(z_k)^T (x_k - z_k)\right)^2}{\|F(z_k)\|^2} \\
&\leq \|x_k - \tilde{x}\|^2 - \gamma(2 - \gamma) \frac{\left(F(z_k)^T (x_k - z_k)\right)^2}{\|F(z_k)\|^2} \\
&\leq \|x_k - \tilde{x}\|^2 - \gamma(2 - \gamma)\,\sigma^2\, \|x_k - z_k\|^4.
\end{aligned}$$
From the last inequality, $\{\|x_k - \tilde{x}\|\}$ is a decreasing sequence, and, hence, the limit exists.    □
Remark 2.
It is interesting to note that Lemma 2 confirms that for any zero $\tilde{x}$ of F, $\{\|x_k - \tilde{x}\|\}$ is a decreasing sequence.
Lemma 3.
Suppose $\{x_k\}_{k=1}^{\infty}$ is generated by Algorithm 1 and F is continuous. Then there exists $M > 0$ such that, for all k,
$$\|F(x_k)\| \leq M.$$
In other words, the sequence $\{\|F(x_k)\|\}$ is bounded.
Proof. 
As the sequence $\{\|x_k - \tilde{x}\|\}$ converges by Lemma 2, there exists $M_1 > 0$ such that, for all k,
$$\|x_k - \tilde{x}\| \leq M_1.$$
By the triangle inequality, we end up with
$$\|x_k\| \leq M_1 + \|\tilde{x}\| := M_2.$$
Now, since $\{x_k\}_{k=1}^{\infty}$ is bounded and F is continuous, $\{\|F(x_k)\|\}$ is also bounded.    □
Lemma 4.
Let $\alpha_k$ and $d_k$ be generated via Algorithm 1. Then
$$\lim_{k \to \infty} \alpha_k \|d_k\| = 0.$$
Proof. 
By Lemma 2, we have $\gamma(2 - \gamma)\,\sigma^2\, \|x_k - z_k\|^4 \leq \|x_k - \tilde{x}\|^2 - \|x_{k+1} - \tilde{x}\|^2$; hence
$$\sum_{k=0}^{\infty} \left(\alpha_k \|d_k\|\right)^4 = \sum_{k=0}^{\infty} \|x_k - z_k\|^4 \leq \frac{1}{\gamma(2 - \gamma)\,\sigma^2} \sum_{k=0}^{\infty} \left( \|x_k - \tilde{x}\|^2 - \|x_{k+1} - \tilde{x}\|^2 \right) \leq \frac{\|x_0 - \tilde{x}\|^2 - \lim_{k \to \infty} \|x_k - \tilde{x}\|^2}{\gamma(2 - \gamma)\,\sigma^2} < \infty.$$
Hence,
$$\lim_{k \to \infty} \alpha_k \|d_k\| = 0.$$
   □
Lemma 5.
For all k, the direction $d_k$ defined by Equation (5) or (6) is bounded; that is, there exists $C > 0$ such that
$$\|d_k\| \leq C.$$
Proof. 
Let $d_k$ be defined by (5). Then:
  • For $k = 0$, $\|d_k\| = \|-F(x_k)\| = \|F(x_k)\| \leq M$.
  • For $k > 0$ and $\|F(x_k)\|\,\|y_{k-1}\|\,\|d_{k-1}\| \leq d_{k-1}^T y_{k-1}$, we have
$$\begin{aligned}
\|d_k^{I}\| &= \left\| -\left(1 + \beta_k \frac{F(x_k)^T d_{k-1}}{\|F(x_k)\|^2}\right) F(x_k) + \beta_k d_{k-1} \right\| \leq \|F(x_k)\| + |\beta_k| \frac{\|F(x_k)\|^2\,\|d_{k-1}\|}{\|F(x_k)\|^2} + |\beta_k|\,\|d_{k-1}\| \\
&= \|F(x_k)\| + 2\left| \frac{F(x_k)^T y_{k-1}}{d_{k-1}^T y_{k-1}} - \frac{F(x_k)^T d_{k-1}}{\|d_{k-1}\|^2} \right| \|d_{k-1}\| \leq \|F(x_k)\| + 2\frac{|F(x_k)^T y_{k-1}|}{|d_{k-1}^T y_{k-1}|}\,\|d_{k-1}\| + 2\frac{|F(x_k)^T d_{k-1}|}{\|d_{k-1}\|^2}\,\|d_{k-1}\| \\
&\leq \|F(x_k)\| + 2\frac{\|F(x_k)\|\,\|y_{k-1}\|\,\|d_{k-1}\|}{|d_{k-1}^T y_{k-1}|} + 2\frac{\|F(x_k)\|\,\|d_{k-1}\|^2}{\|d_{k-1}\|^2} \leq \|F(x_k)\| + 2 + 2\|F(x_k)\| = 3\|F(x_k)\| + 2 \leq 3M + 2.
\end{aligned}$$
  • For $k > 0$ and $\|F(x_k)\|\,\|y_{k-1}\|\,\|d_{k-1}\| > d_{k-1}^T y_{k-1}$, we have
$$\|d_k^{I}\| = \left\| -F(x_k) + \mu \frac{F(x_k)^T y_{k-1}}{\|y_{k-1}\|^2}\, y_{k-1} \right\| \leq \|F(x_k)\| + \mu \frac{\|F(x_k)\|\,\|y_{k-1}\|^2}{\|y_{k-1}\|^2} = (1 + \mu)\,\|F(x_k)\| \leq (1 + \mu) M.$$
On the other hand, if $d_k$ is defined by (6), then:
  • For $k = 0$, $\|d_k\| = \|-F(x_k)\| = \|F(x_k)\| \leq M$.
  • For $k > 0$ and $\|y_{k-1}\|\,\|d_{k-1}\| \leq d_{k-1}^T y_{k-1}$, we have
$$\begin{aligned}
\|d_k^{II}\| &\leq \|F(x_k)\| + 2\frac{|F(x_k)^T y_{k-1}|}{|d_{k-1}^T y_{k-1}|}\,\|d_{k-1}\| + 2\frac{|F(x_k)^T d_{k-1}|}{\|d_{k-1}\|^2}\,\|d_{k-1}\| \leq \|F(x_k)\| + 2\frac{\|F(x_k)\|\,\|y_{k-1}\|\,\|d_{k-1}\|}{|d_{k-1}^T y_{k-1}|} + 2\|F(x_k)\| \\
&\leq \|F(x_k)\| + 2\|F(x_k)\| + 2\|F(x_k)\| = 5\|F(x_k)\| \leq 5M.
\end{aligned}$$
  • For $k > 0$ and $\|y_{k-1}\|\,\|d_{k-1}\| > d_{k-1}^T y_{k-1}$, we have
$$\|d_k^{II}\| = \left\| -F(x_k) + \mu \frac{F(x_k)^T d_{k-1}}{\|d_{k-1}\|^2}\, d_{k-1} \right\| \leq \|F(x_k)\| + \mu \frac{\|F(x_k)\|\,\|d_{k-1}\|^2}{\|d_{k-1}\|^2} = (1 + \mu)\,\|F(x_k)\| \leq (1 + \mu) M.$$
That is, with $C = \max\{5M,\; 3M + 2,\; (1 + \mu)M\}$, we have
$$\|d_k\| \leq C \quad \text{for all } k. \qquad \square$$
Proposition 1.
If $\{x_k\}_{k=1}^{\infty}$ is generated by Algorithm 1 with a pseudomonotone and continuous operator F, then
$$\liminf_{k \to \infty} \|F(x_k)\| = 0.$$
Proof. 
Suppose by contradiction that $\liminf_{k \to \infty} \|F(x_k)\| > 0$. This implies that there exists $\kappa > 0$ with
$$\inf_{k} \|F(x_k)\| \geq \kappa.$$
Using (15) together with (9) and (13), we have
$$(1 - \mu)\,\kappa \leq (1 - \mu)\,\|F(x_k)\| \leq \|d_k\| \leq C, \qquad \forall\, k > 0.$$
So $\lim_{k \to \infty} \alpha_k \|d_k\| = 0$, obtained in Lemma 4, implies
$$\lim_{k \to \infty} \alpha_k = 0.$$
In addition, as $\{x_k\}$ and $\{d_k\}$ are bounded, there exist convergent subsequences $\{x_{k_j}\}$ and $\{d_{k_j}\}$ such that
$$\lim_{j \to \infty} x_{k_j} = \tilde{x} \quad \text{and} \quad \lim_{j \to \infty} d_{k_j} = \tilde{d}.$$
Also, from (7),
$$F(x_{k_j})^T d_{k_j} \leq -(1 - \mu)\,\|F(x_{k_j})\|^2.$$
By the continuity of F, (15), and letting $j \to \infty$, we have
$$F(\tilde{x})^T \tilde{d} \leq -(1 - \mu)\,\|F(\tilde{x})\|^2 \leq -(1 - \mu)\,\kappa^2 < 0.$$
On the other hand, from the line search in Step 3 of Algorithm 1, the trial step size $r^{-1}\alpha_{k_j}$ does not satisfy the line search condition, i.e.,
$$-F(x_{k_j} + r^{-1}\alpha_{k_j} d_{k_j})^T d_{k_j} < \sigma r^{-1} \alpha_{k_j}\, \|F(x_{k_j} + r^{-1}\alpha_{k_j} d_{k_j})\|\,\|d_{k_j}\|^2.$$
Combining the above argument together with (16), we have
$$F(\tilde{x})^T \tilde{d} \geq 0,$$
which contradicts (17). Hence, (14) is true.   □
The next theorem confirms the convergence of a sequence generated by Algorithm 1 to a zero of the pseudomonotone and continuous operator F.
Theorem 1.
Suppose F is pseudomonotone and continuous. Then a sequence $\{x_k\}_{k=1}^{\infty}$ generated via Algorithm 1 converges to a zero $\bar{x}$ of F, that is, $F(\bar{x}) = 0$.
Proof. 
From Proposition 1, one can choose a subsequence $\{x_{k_j}\}$ such that $\{\|F(x_{k_j})\|\}$ converges to 0. Because $\{x_k\}$ is bounded, we may further extract a convergent subsequence from $\{x_{k_j}\}$, which we also denote by $\{x_{k_j}\}$ for simplicity. Let $\bar{x} = \lim_{j \to \infty} x_{k_j}$. Since F is continuous, we have
$$F(\bar{x}) = \lim_{j \to \infty} F(x_{k_j}) = 0.$$
Furthermore, since $\lim_{k \to \infty} \|x_k - \bar{x}\|$ exists by Lemma 2 and $\lim_{j \to \infty} \|x_{k_j} - \bar{x}\| = 0$, the sequence $\{x_k\}$ must converge to $\bar{x}$.   □

3. Numerical Experiments

In this section, we evaluate the performance of the proposed methods, that is, Algorithm 1 implemented with search direction (5) and with search direction (6). We refer to these methods as RDFPM I and RDFPM II, respectively. Their performance is compared against recently proposed methods in the literature: RDFIA [24], a relaxed-inertial derivative-free algorithm for pseudomonotone systems, and HDFPM [25], a memoryless BFGS hyperplane projection method for monotone equations. All experiments were executed in MATLAB R2025a Prerelease Update 2 (25.1.0.2833191) on an Apple macOS system with an M3 chip and 8 GB of unified memory, using the same parameters for both proposed methods: $r = 0.8$, $\sigma = 0.001$, $\gamma = 1.2$, and $\mu = 0.6$.
Seven benchmark nonlinear equations from the literature were selected as test problems: Problem 1 (Problem 2 in [26]), Problem 2 (Logarithmic function in [27]), Problem 3 (Example 4.1 in [28]), Problem 4 (Modified Problem 10 in [29]), Problem 5 (Strictly convex function II in [27]), Problem 6 (Nonsmooth function, Problem 2 in [30]), and Problem 7 (Problem 4.6 in [31]). To assess the performance of each method, we evaluated four problem dimensions: $n = 2000$, $15{,}000$, $75{,}000$, and $150{,}000$. For each dimension, six initial points were tested: five deterministic vectors $x_0 = \alpha \cdot \mathrm{ones}(n, 1)$ with $\alpha \in \{0.2, 0.5, 1.0, 1.2, 1.5\}$, and one randomized initial point $x_0 = \mathrm{rand}(n, 1)$. Each algorithm was run under identical experimental conditions (that is, all methods were tested on the same problem instances, dimensions, initial points, and stopping criteria) to ensure fairness. The algorithms terminated when the residual satisfied $\|F(z_k)\| \leq 10^{-6}$ or $\|F(x_k)\| \leq 10^{-6}$, or after a maximum of 2000 iterations.
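For illustration, a driver along the following lines reproduces the setup just described. This is a sketch under our own naming: testProblem is a hypothetical generator returning a function handle for benchmark problem p, and rdfpm is the solver sketch from Section 2.

```matlab
% Sketch of the experimental driver (testProblem is hypothetical; rdfpm as sketched earlier).
dims   = [2000, 15000, 75000, 150000];
alphas = [0.2, 0.5, 1.0, 1.2, 1.5];
tol = 1e-6;  maxit = 2000;
for p = 1:7
    for n = dims
        F  = testProblem(p, n);                              % handle for F of problem p in R^n
        X0 = [alphas .* ones(n, numel(alphas)), rand(n, 1)]; % five deterministic + one random start
        for j = 1:size(X0, 2)
            x = rdfpm(F, X0(:, j), 0.8, 0.001, 1.2, 0.6, tol, maxit);  % RDFPM II run
        end
    end
end
```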
Detailed results of the numerical experiments are presented in Table A1, Table A2, Table A3, Table A4, Table A5, Table A6 and Table A7, available at the following link: https://github.com/ibrahimkarym/Two-classes-of-restart.git (accessed on 1 November 2025) or in Appendix A. To assess the performance of the methods, we used the Dolan and Moré [32] performance profile, which summarizes the efficiency of each algorithm in terms of its ability to minimize iterations, function evaluations, and computation time; the best-performing solver corresponds to the curve that stays highest in the plot. Across all problems and initializations, RDFPM I and II consistently outperformed RDFIA and HDFPM. The results in Figure 1, Figure 2 and Figure 3 confirm that RDFPM I and II are robust and efficient for solving nonlinear equations.
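The Dolan and Moré profile itself is easy to compute; the sketch below (our own, simplified) takes a matrix T whose entry T(i,s) is the performance measure of solver s on problem instance i (with Inf marking failures) and plots the fraction of instances each solver solves within a factor tau of the best.

```matlab
% Minimal sketch of a Dolan-More performance profile (illustrative only).
function perf_profile(T, names)
    [np, ns] = size(T);
    R = T ./ min(T, [], 2);                       % ratios relative to the best solver per instance
    taus = sort(unique(R(isfinite(R))));          % evaluation points tau
    rho = zeros(numel(taus), ns);
    for s = 1:ns
        rho(:, s) = sum(R(:, s) <= taus', 1)' / np;   % fraction of instances solved within tau
    end
    semilogx(taus, rho, 'LineWidth', 1.5);        % the highest curve indicates the best solver
    xlabel('\tau');  ylabel('\rho_s(\tau)');  legend(names, 'Location', 'southeast');
end
```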

4. Application to Logistic Regression

In this section, we consider applying the two proposed methods (RDFPM I and RDFPM II) to solve the regularized centralized logistic regression problem. This problem is formulated as follows:
$$\min_{x \in \mathbb{R}^n} f(x) = \frac{1}{N} \sum_{i=1}^{N} \log\left(1 + \exp(-b_i a_i^T x)\right) + \frac{\varrho}{2} \|x\|^2, \qquad (18)$$
where $\varrho > 0$ is a regularization parameter that promotes stability and prevents overfitting, and the logistic loss $\frac{1}{N} \sum_{i=1}^{N} \log\left(1 + \exp(-b_i a_i^T x)\right)$ models the classification error on the dataset consisting of pairs $(a_i, b_i) \in \mathbb{R}^n \times \{-1, 1\}$. Given the strong convexity of f, the optimal solution $x^* \in \mathbb{R}^n$ is uniquely characterized as the root of the corresponding system of nonlinear monotone equations [33]:
$$F(x) := \nabla f(x) = -\frac{1}{N} \sum_{i=1}^{N} \frac{b_i \exp(-b_i a_i^T x)}{1 + \exp(-b_i a_i^T x)}\, a_i + \varrho x = 0. \qquad (19)$$
To solve (19), we employ our newly developed algorithms, leveraging their efficiency and robustness in handling large-scale nonlinear systems, and compare their performance with the HDFPM method [25]. For fairness, the parameter settings for RDFPM I, RDFPM II, and HDFPM follow those used in the previous experiments. For benchmarking, we utilized real-world datasets from the LIBSVM repository [34], available at https://www.csie.ntu.edu.tw/~cjlin/libsvmtools/datasets/ (accessed on 1 November 2025). Specifically, the benchmark instances a1a–a9a and colon-cancer were loaded using the libsvmread function. No additional preprocessing was performed: there was no imputation, categorical encoding, or feature scaling, and the logistic objective was evaluated directly on the full dataset without a train/validation/test split, to ensure a consistent optimization setting across all solvers. Details on the selected datasets are provided in Table 1.
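In matrix form, the residual (19) is straightforward to supply to the solvers; the sketch below is one possible way (our own, assuming the dataset file has been downloaded locally), with A holding the samples $a_i^T$ as rows and b the labels.

```matlab
% Sketch (ours): the residual F(x) in (19) for l2-regularized logistic regression.
[b, A] = libsvmread('a9a.t');               % LIBSVM loader, as mentioned in the text
rho = 0.1;  N = size(A, 1);
sigm = @(t) 1 ./ (1 + exp(t));              % note: exp(-t)./(1+exp(-t)) = 1./(1+exp(t))
F = @(x) -(A' * (b .* sigm(b .* full(A*x)))) / N + rho*x;
x0 = 4 * (rand(size(A, 2), 1) - 0.5);       % initialization used in these experiments
```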
Experiments were conducted in MATLAB (details mentioned earlier). The global random number generator was fixed to mt19937ar with seed 97006855 using RandStream.setGlobalStream. To ensure deterministic behavior and comparable wall-clock timing, we restricted BLAS operations to a single thread by setting maxNumCompThreads(1) and the environment variables OMP_NUM_THREADS=1 and MKL_NUM_THREADS=1; no parallel pools were used.
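In MATLAB, the settings just described amount to the following commands (note that the environment variables typically need to be set before the numerical libraries are first used).

```matlab
% Reproducibility settings used in the logistic regression experiments.
RandStream.setGlobalStream(RandStream('mt19937ar', 'Seed', 97006855));
maxNumCompThreads(1);                 % restrict MATLAB to a single computational thread
setenv('OMP_NUM_THREADS', '1');       % single-threaded BLAS/OpenMP
setenv('MKL_NUM_THREADS', '1');
```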
The numerical experiments are conducted by initializing x using the MATLAB expression 4*(rand(n,1) - 0.5) and setting the regularization parameter to $\varrho = 0.1$. Each test instance is evaluated across five independent runs, and we report the average performance metrics. As demonstrated in Table 1, our proposed algorithms achieve superior efficiency, outperforming HDFPM in terms of computational time, iteration count, and function evaluations. This highlights the effectiveness of our algorithms in addressing regularized centralized logistic regression problems.

5. Conclusions

In this paper, we proposed two classes of conjugate gradient algorithms that integrate restart conditions to solve nonlinear pseudomonotone equations. We analyzed their global convergence in the case where the pseudomonotone operator is continuous. To evaluate the efficiency of the algorithms, we conducted numerical experiments on benchmark test problems. In addition, we assessed their performance by applying them to a logistic regression model, demonstrating their competitiveness against existing methods. The results highlight the superior efficiency of our algorithms: they successfully solved a larger number of problems while requiring fewer iterations, fewer function evaluations, and less CPU time than the other approaches. In future work, we aim to extend our algorithms to multivalued pseudomonotone operators, enhancing their applicability and efficiency for a broader range of problems.

Author Contributions

Conceptualization, J.D., A.B.A. and A.H.I.; methodology, A.B.A.; software, J.D.; validation, A.H.I. and A.B.A.; formal analysis, A.B.A.; investigation, J.D.; resources, A.H.I.; data curation, A.B.A.; writing—original draft preparation, J.D., A.B.A. and A.H.I.; writing—review and editing, J.D., A.B.A. and A.H.I.; visualization, J.D.; supervision, A.B.A.; project administration, A.H.I.; funding acquisition, A.B.A. and J.D. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

The data used in this study are available at https://www.csie.ntu.edu.tw/~cjlin/libsvmtools/datasets/ (accessed on 1 November 2025).

Acknowledgments

This research was funded by King Mongkut’s University of Technology North Bangkok, Contract no. KMUTNB-69-KNOW-20. The second and third authors acknowledge with thanks the Department of Mathematics and Applied Mathematics at the Sefako Makgatho Health Sciences University.

Conflicts of Interest

On behalf of all authors, the corresponding author states that there are no conflicts of interest.

Appendix A

Table A1. Numerical results for test problem 1. For each method (RDFPM I, RDFPM II, RDFIA, HDFPM), the columns report the number of iterations (ITER), function evaluations (FVAL), CPU time (TIME), and final residual norm (NORMF); DIM denotes the problem dimension and INP the initial point.
2000 x 0 ( 1 ) 8310.0050643.45 × 10−78300.0036048.97 × 10−7170.0010040.5045441054210.0289698.71 × 10−7
x 0 ( 2 ) 5220.0021233.85 × 10−76240.0010739.45 × 10−7001.05 × 10−54.501891953810.0069339.32 × 10−7
x 0 ( 3 ) 4180.0009093.56 × 10−77260.0011513.85 × 10−72110.000413.3612571014050.0043588.68 × 10−7
x 0 ( 4 ) 9340.0023433.33 × 10−710370.0024455.22 × 10−74200.0007914.0597821064250.0047828.69 × 10−7
x 0 ( 5 ) 6260.0026344.32 × 10−79350.0014832.3 × 10−74200.0008154.0230091074290.0040428.88 × 10−7
x 0 ( 6 ) 15580.0024574.84 × 10−718590.0040569.85 × 10−72110.0006093.4366231014050.0037619.7 × 10−7
15,000 x 0 ( 1 ) 11400.063662.67 × 10−712410.0201077.67 × 10−712660.090511.5953531154610.1035269.4 × 10−7
x 0 ( 2 ) 6230.00987207260.00766803160.0226874.4374181064250.0926548.54 × 10−7
x 0 ( 3 ) 5190.00571308300.0077432.19 × 10−77360.0583211.3322071114450.0981379.36 × 10−7
x 0 ( 4 ) 11380.049307012420.0097679.79 × 10−715830.0706971.2459731164650.1025279.37 × 10−7
x 0 ( 5 ) 9330.0092409340.0090079.31 × 10−7181010.0377921.2782171174690.1011799.58 × 10−7
x 0 ( 6 ) 18710.0165844.03 × 10−724780.0198137.47 × 10−77360.0087571.3294261124490.0909169.04 × 10−7
75,000 x 0 ( 1 ) 12430.0463563.1 × 10−78300.0256418.45 × 10−7251460.1475811.5101131204810.3654559.32 × 10−7
x 0 ( 2 ) 7280.0289394.9 × 10−79340.0269726.78 × 10−76310.0280521.7497961104410.3331229.97 × 10−7
x 0 ( 3 ) 5190.0192549.06 × 10−2211360.028787014770.0713951.5174761164650.3466829.29 × 10−7
x 0 ( 4 ) 13440.046825011380.0321740321950.227041.5041391214850.4791469.3 × 10−7
x 0 ( 5 ) 10360.050407011390.0330768.69 × 10−7392440.2734561.5255711224890.3674179.5 × 10−7
x 0 ( 6 ) 16570.06158.16 × 10−723750.0728797.67 × 10−716890.0939471.4508681174690.3560088.95 × 10−7
150,000 x 0 ( 1 ) 12410.068004012430.0676726.73 × 10−7382360.3637111.5582441224890.7100289.53 × 10−7
x 0 ( 2 ) 8290.056762015510.0841622.27 × 10−78410.0632871.3908391134530.739018.66 × 10−7
x 0 ( 3 ) 5190.05588909300.0456660191070.1680221.5351261184730.6906399.49 × 10−7
x 0 ( 4 ) 13440.096893010380.056757.69 × 10−7452840.4800741.5728631234930.7173049.5 × 10−7
x 0 ( 5 ) 11410.074562.86 × 10−712420.063630533400.5714781.5628051244970.8274229.71 × 10−7
x 0 ( 6 ) 16580.1101972.32 × 10−726840.1373038.36 × 10−7211190.1826221.4754531194770.701169.14 × 10−7
Table A2. Numerical results for test problem 2. For each method (RDFPM I, RDFPM II, RDFIA, HDFPM), the columns report the number of iterations (ITER), function evaluations (FVAL), CPU time (TIME), and final residual norm (NORMF); DIM denotes the problem dimension and INP the initial point.
2000 x 0 ( 1 ) 9280.0012133.43 × 10−77220.0004685.16 × 10−716650.0010619.74 × 10−7993960.0046699.57 × 10−7
x 0 ( 2 ) 6190.0005932.09 × 10−716490.0013261.25 × 10−8 963850.0040288.8 × 10−7
x 0 ( 3 ) 6180.00089.64 × 10−79270.000459.34 × 10−715620.0008786.66 × 10−7923680.0039598.53 × 10−7
x 0 ( 4 ) 9280.0008085.03 × 10−76190.0003259.87 × 10−715600.0008716.64 × 10−71014040.0042339.63 × 10−7
x 0 ( 5 ) 10310.0007984.76 × 10−714430.0006331.61 × 10−716640.0009277 × 10−71044160.0042678.92 × 10−7
x 0 ( 6 ) 17630.0015242.96 × 10−718570.0008287.93 × 10−715640.0008989.36 × 10−7963840.0039358.52 × 10−7
15,000 x 0 ( 1 ) 9280.0115662.67 × 10−77220.0060986.03 × 10−8251100.0256456.86 × 10−71094360.0898829.98 × 10−7
x 0 ( 2 ) 6200.0060655.81 × 10−85170.0041751.11 × 10−716670.0160278.12 × 10−71064250.0875339.25 × 10−7
x 0 ( 3 ) 5160.0057932.63 × 10−75160.0040012.72 × 10−720850.0182857.85 × 10−71024080.085848.89 × 10−7
x 0 ( 4 ) 10310.0097022.52 × 10−77220.0060728.69 × 10−8261150.0265276.68 × 10−71124480.0932588.54 × 10−7
x 0 ( 5 ) 11340.0142952.44 × 10−76190.0056772.56 × 10−7301420.0401177.32 × 10−71144560.0992259.3 × 10−7
x 0 ( 6 ) 181090.0261612.29 × 10−726800.0223968.06 × 10−721900.0218537.36 × 10−71064240.0839259.06 × 10−7
75,000 x 0 ( 1 ) 11340.0544752.34 × 10−810320.0291617.66 × 10−9402040.213478.38 × 10−71144560.4824939.9 × 10−7
x 0 ( 2 ) 7230.025971.25 × 10−86200.0191041.98 × 10−817740.083958.67 × 10−71114450.3574879.17 × 10−7
x 0 ( 3 ) 7220.0246832.39 × 10−87220.0223111.18 × 10−8271250.1455366.43 × 10−71074280.342688.81 × 10−7
x 0 ( 4 ) 11340.0405193.18 × 10−97230.0213242.43 × 10−8442280.2590856.55 × 10−71164640.3812929.96 × 10−7
x 0 ( 5 ) 12370.056078.89 × 10−913400.0361492.17 × 10−8502630.4583397.44 × 10−71194760.3725869.22 × 10−7
x 0 ( 6 ) 421410.1677039.8 × 10−727830.0794948.95 × 10−7291350.1458756.27 × 10−71114440.3812658.97 × 10−7
150,000 x 0 ( 1 ) 12380.0780311.57 × 10−98260.040938.57 × 10−9462410.4258179.67 × 10−71174680.7526828.59 × 10−7
x 0 ( 2 ) 8260.0440944.61 × 10−96210.0327861.13 × 10−921920.1787227.76 × 10−71134530.6811569.37 × 10−7
x 0 ( 3 ) 8260.0503381.49 × 10−913400.0679464.23 × 10−9321570.2770459.72 × 10−71094360.7424089 × 10−7
x 0 ( 4 ) 13410.0702491.34 × 10−911350.0607973.12 × 10−9512720.4488596.25 × 10−71194760.7350228.65 × 10−7
x 0 ( 5 ) 13400.0785451.86 × 10−98260.0432021.79 × 10−9683910.7329327.44 × 10−71214840.7170679.42 × 10−7
x 0 ( 6 ) 341080.195028.06 × 10−728860.1551247.3 × 10−7351740.3405247.83 × 10−71134520.8321279.16 × 10−7
Table A3. Numerical results for test problem 3. For each method (RDFPM I, RDFPM II, RDFIA, HDFPM), the columns report the number of iterations (ITER), function evaluations (FVAL), CPU time (TIME), and final residual norm (NORMF); DIM denotes the problem dimension and INP the initial point.
2000 x 0 ( 1 ) 7300.0006562.89 × 10−713470.0004322.39 × 10−715620.000596.3 × 10−71034130.0032488.63 × 10−7
x 0 ( 2 ) 6240.0302546.7 × 10−77250.0004539.51 × 10−714570.0006386.9 × 10−7953810.0032988.63 × 10−7
x 0 ( 3 ) 5220.0005254.99 × 10−710360.0013524.61 × 10−7 1004010.0040978.52 × 10−7
x 0 ( 4 ) 9360.0015643.53 × 10−714480.004835.92 × 10−715630.0006877.49 × 10−71034130.0033019.59 × 10−7
x 0 ( 5 ) 9370.0007523.32 × 10−711420.0005442.07 × 10−716660.000628.24 × 10−71044170.003339.08 × 10−7
x 0 ( 6 ) 261030.001973.51 × 10−7241300.0012249.34 × 10−714570.0006359.18 × 10−71004010.0031398.71 × 10−7
15,000 x 0 ( 1 ) 10390.0089672.62 × 10−79360.0087825.39 × 10−7251170.0212466.57 × 10−71134530.0864719.31 × 10−7
x 0 ( 2 ) 7280.0066623.97 × 10−711370.0088998.18 × 10−713530.0097378.66 × 10−71054210.0747449.31 × 10−7
x 0 ( 3 ) 5230.005175.46 × 10−78310.0068393.45 × 10−719840.0140436.44 × 10−71104410.083339.19 × 10−7
x 0 ( 4 ) 12450.0101173.35 × 10−714500.0136854.39 × 10−7251200.0215459.18 × 10−71144570.0815738.79 × 10−7
x 0 ( 5 ) 12460.0099033.1 × 10−711430.0101412.26 × 10−7261360.0237937.94 × 10−71144570.0809799.8 × 10−7
x 0 ( 6 ) 291340.0278615.17 × 10−7495270.0975599.95 × 10−720890.0149546.35 × 10−71104410.0833799.65 × 10−7
75,000 x 0 ( 1 ) 11420.0330613.09 × 10−79360.0269433.6 × 10−7392100.133688.37 × 10−71184730.3067089.24 × 10−7
x 0 ( 2 ) 8310.0213814.71 × 10−710340.023658017730.0487258.22 × 10−71104410.2901779.23 × 10−7
x 0 ( 3 ) 6240.01706806260.015726.01 × 10−7231110.0667876.81 × 10−71154610.3043639.12 × 10−7
x 0 ( 4 ) 13480.0362363.96 × 10−717570.0412130442440.2273447.94 × 10−71194770.3852288.73 × 10−7
x 0 ( 5 ) 13490.0372763.66 × 10−711430.0296282.11 × 10−7512960.2146637.71 × 10−71194770.3147219.72 × 10−7
x 0 ( 6 ) 21870.0658731.09 × 10−7658440.5059297.22 × 10−7271300.0917027.43 × 10−71154610.2982489.59 × 10−7
150,000 x 0 ( 1 ) 11420.0537484.37 × 10−713480.0603362.5 × 10−7492780.3435367.08 × 10−71204810.6166429.44 × 10−7
x 0 ( 2 ) 9340.0440233.5 × 10−78290.0358538.21 × 10−720860.1043536.25 × 10−71124490.5721239.44 × 10−7
x 0 ( 3 ) 6240.03219206260.0289498.49 × 10−7281400.1688554.12 × 10−71174690.8555979.32 × 10−7
x 0 ( 4 ) 14510.0675432.94 × 10−79370.044982.59 × 10−7543130.3735228.62 × 10−71214850.680298.92 × 10−7
x 0 ( 5 ) 14520.0699642.72 × 10−712460.0565655.71 × 10−7684300.5293976.32 × 10−71214850.6124169.93 × 10−7
x 0 ( 6 ) 22870.1155031.57 × 10−7617871.022971.63 × 10−7351780.3564579.82 × 10−71174690.7854699.79 × 10−7
Table A4. Numerical results for test problem 4. For each method (RDFPM I, RDFPM II, RDFIA, HDFPM), the columns report the number of iterations (ITER), function evaluations (FVAL), CPU time (TIME), and final residual norm (NORMF); DIM denotes the problem dimension and INP the initial point.
2000 x 0 ( 1 ) 7280.0035464.04 × 10−711400.0005332.46 × 10−717710.000887.38 × 10−71084330.0049529.68 × 10−7
x 0 ( 2 ) 8310.0018143.08 × 10−710370.0004554.02 × 10−719810.0010356.66 × 10−71114450.0048758.71 × 10−7
x 0 ( 3 ) 8310.000742.71 × 10−79340.0008153.95 × 10−717730.0008297.86 × 10−71104410.0046129.03 × 10−7
x 0 ( 4 ) 7280.0005083.57 × 10−711390.0004875.97 × 10−714590.0006722.33 × 10−71084330.0045228.55 × 10−7
x 0 ( 5 ) 7280.0004462.86 × 10−76250.0003465.5 × 10−7 1064250.0044489.5 × 10−7
x 0 ( 6 ) 8310.0004362.71 × 10−732970.0013897.27 × 10−717720.0007998.95 × 10−71104410.0046669 × 10−7
15,000 x 0 ( 1 ) 10370.0107673.15 × 10−79340.0103499.33 × 10−7331640.0418959.47 × 10−71194770.1451528.89 × 10−7
x 0 ( 2 ) 10370.0137854.62 × 10−79340.0104217.49 × 10−7392010.0540338.51 × 10−71214850.1487129.41 × 10−7
x 0 ( 3 ) 10370.0100734.07 × 10−77290.0086495.47 × 10−7371880.0475539.15 × 10−71204810.1268239.75 × 10−7
x 0 ( 4 ) 10370.0095522.78 × 10−77290.009243.74 × 10−7311520.0397069.48 × 10−71184730.1262339.24 × 10−7
x 0 ( 5 ) 9340.0112844.3 × 10−712430.0122223.52 × 10−7291390.0369336.9 × 10−71174690.1218668.72 × 10−7
x 0 ( 6 ) 10370.0100424.11 × 10−7371120.0313187.61 × 10−7371880.0474329.14 × 10−71204810.1352639.83 × 10−7
75,000 x 0 ( 1 ) 11380.05362809320.0435590603470.4810936.15 × 10−71244970.7679228.82 × 10−7
x 0 ( 2 ) 12410.065807013440.0607738.67 × 10−7764600.5900127.29 × 10−71265050.6667659.34 × 10−7
x 0 ( 3 ) 11380.051439012410.0552270704190.6159278.24 × 10−71255010.6669269.68 × 10−7
x 0 ( 4 ) 11380.051954015490.065450543030.399856.86 × 10−71234930.7270999.17 × 10−7
x 0 ( 5 ) 10350.048287012410.0553378.53 × 10−7432290.3022957.44 × 10−71224890.6617248.65 × 10−7
x 0 ( 6 ) 11380.0520940391180.1620779.56 × 10−7694150.5620687.49 × 10−71255010.6586199.75 × 10−7
150,000 x 0 ( 1 ) 11370.096002014460.1159590754531.2296367.8 × 10−71265051.423719.01 × 10−7
x 0 ( 2 ) 11370.094155014460.118101026551.8060756.03 × 10−71285131.3722989.54 × 10−7
x 0 ( 3 ) 11370.09408209320.077630875361.3981489.78 × 10−71275091.353459.89 × 10−7
x 0 ( 4 ) 10340.08669011370.0951430684051.1020829.01 × 10−71255011.4499629.37 × 10−7
x 0 ( 5 ) 10340.08671013440.1128158.01 × 10−7593440.8697196.51 × 10−71244971.2805638.84 × 10−7
x 0 ( 6 ) 11370.171044028870.2282248.72 × 10−7875361.4460039.5 × 10−71275091.3778659.97 × 10−7
Table A5. Numerical results for test problem 5. For each method (RDFPM I, RDFPM II, RDFIA, HDFPM), the columns report the number of iterations (ITER), function evaluations (FVAL), CPU time (TIME), and final residual norm (NORMF); DIM denotes the problem dimension and INP the initial point.
2000 x 0 ( 1 ) 311280.0025238.82 × 10−7321620.0010927.58 × 10−718740.0005326.8 × 10−71164650.0029398.75 × 10−7
x 0 ( 2 ) 21820.0011264.62 × 10−732970.0006469.34 × 10−719770.0006157.59 × 10−71134520.0028429.7 × 10−7
x 0 ( 3 ) 301190.0016048.16 × 10−7371860.0012327.38 × 10−718740.0005099.54 × 10−71124480.0028359.47 × 10−7
x 0 ( 4 ) 231040.0009564.4 × 10−7362100.0013779.35 × 10−718750.000529.6 × 10−71154610.00298.76 × 10−7
x 0 ( 5 ) 331710.001593.05 × 10−7393150.0019632.71 × 10−718740.0006059.39 × 10−71134530.0028689.42 × 10−7
x 0 ( 6 ) 261020.0007863.25 × 10−7423240.002039.52 × 10−718740.0005158.23 × 10−71154600.0029618.82 × 10−7
15,000 x 0 ( 1 ) 371890.0315723.07 × 10−7576790.0907331.79 × 10−7281220.0186897.7 × 10−71345370.0805639.52 × 10−7
x 0 ( 2 ) 341550.0267472.64 × 10−7361080.0169247.41 × 10−7321410.0203346.95 × 10−71325280.0844418.85 × 10−7
x 0 ( 3 ) 412530.0366413.91 × 10−7545080.0699338.82 × 10−7301310.0177449.5 × 10−71315240.0846598.63 × 10−7
x 0 ( 4 ) 472320.0335853.13 × 10−8626840.2982946.72 × 10−8291310.0182659.33 × 10−71335330.0812599.56 × 10−7
x 0 ( 5 ) 371900.0267649.82 × 10−7597260.1041736.2 × 10−7301380.0204446.01 × 10−71325290.0799438.78 × 10−7
x 0 ( 6 ) 331630.0242049.34 × 10−7 301320.0178439.81 × 10−71315240.0777939.13 × 10−7
75,000 x 0 ( 1 ) 442340.1053828.71 × 10−77110330.5644829.78 × 10−7442150.1029329.55 × 10−71435730.3073839.08 × 10−7
x 0 ( 2 ) 271060.0483163.86 × 10−7371110.0574568.28 × 10−7542730.1280259.62 × 10−71405600.3282229.92 × 10−7
x 0 ( 3 ) 503370.1437495.03 × 10−7648690.3449895.66 × 10−7492400.118956.3 × 10−71395560.2778319.68 × 10−7
x 0 ( 4 ) 341750.0763542.54 × 10−7 452220.099427.74 × 10−71425690.2722769.12 × 10−7
x 0 ( 5 ) 472610.1144727.81 × 10−7638570.3413735.62 × 10−7482500.1207558.55 × 10−71405610.2712229.86 × 10−7
x 0 ( 6 ) 401840.0839694.03 × 10−79515610.6571426.18 × 10−7502460.1120486.99 × 10−71415640.2714579.35 × 10−7
150,000 x 0 ( 1 ) 474410.3140313.76 × 10−78913341.0973686.86 × 10−7562890.2296379.49 × 10−71475890.4881398.72 × 10−7
x 0 ( 2 ) 331380.1132728.53 × 10−7381140.1040077.22 × 10−7673520.4069369.65 × 10−71445760.6035229.52 × 10−7
x 0 ( 3 ) 504160.3075957.05 × 10−87811600.7987126.85 × 10−7633250.2641225.98 × 10−71435720.4616529.29 × 10−7
x 0 ( 4 ) 462660.20478.02 × 10−7 593110.2537876.19 × 10−71465850.504758.76 × 10−7
x 0 ( 5 ) 442380.3159177 × 10−7709270.7625536.71 × 10−7623380.2937516.14 × 10−71445770.4737439.47 × 10−7
x 0 ( 6 ) 321390.1097173.37 × 10−7 633250.294256.64 × 10−71465840.6215619.16 × 10−7
Table A6. Numerical results for test problem 6. For each method (RDFPM I, RDFPM II, RDFIA, HDFPM), the columns report the number of iterations (ITER), function evaluations (FVAL), CPU time (TIME), and final residual norm (NORMF); DIM denotes the problem dimension and INP the initial point.
2000 x 0 ( 1 ) 8520.0006023.95 × 10−7 592370.0017367.78 × 10−7
x 0 ( 2 ) 8480.0006893.77 × 10−7 572290.0011498.59 × 10−7
x 0 ( 3 ) 7450.0002772.22 × 10−7 8330.0001984.95 × 10−7461850.000927.7 × 10−7
x 0 ( 4 ) 8470.0003078.32 × 10−7 592370.0011729.98 × 10−7
x 0 ( 5 ) 8460.0002934.07 × 10−78430.0002181.67 × 10−7 612450.0012298.17 × 10−7
x 0 ( 6 ) 604630.0024589.08 × 10−7181090.0005541.46 × 10−7 572290.0011448.23 × 10−7
15,000 x 0 ( 1 ) 9550.0068642.11 × 10−7 642570.0366279.94 × 10−7
x 0 ( 2 ) 9540.0077343.1 × 10−7 10460.0072836.22 × 10−7632530.0328128.21 × 10−7
x 0 ( 3 ) 8510.0057961.82 × 10−7 512050.0268349.85 × 10−7
x 0 ( 4 ) 9500.0058694.67 × 10−710560.0075975.56 × 10−7 652610.0354199.54 × 10−7
x 0 ( 5 ) 9520.0074873.35 × 10−7 672690.0381587.81 × 10−7
x 0 ( 6 ) 714390.0555081.95 × 10−7191150.0161763.27 × 10−7 632530.0349317.92 × 10−7
75,000 x 0 ( 1 ) 9550.0180734.73 × 10−7 672690.0973929.29 × 10−7
x 0 ( 2 ) 9510.0180334.65 × 10−7 17880.0375224.48 × 10−7662650.100257.67 × 10−7
x 0 ( 3 ) 8510.0178524.08 × 10−7 542170.0781259.2 × 10−7
x 0 ( 4 ) 10560.020771.57 × 10−7 682730.1032598.91 × 10−7
x 0 ( 5 ) 9490.0146035.02 × 10−7 692770.1072079.76 × 10−7
x 0 ( 6 ) 867760.2353251.44 × 10−7 652610.0979579.9 × 10−7
150,000 x 0 ( 1 ) 9520.0275444.49 × 10−7 682730.1781479.82 × 10−7
x 0 ( 2 ) 9510.0267836.58 × 10−710600.0345964.67 × 10−7 672690.1644168.11 × 10−7
x 0 ( 3 ) 8480.0254873.9 × 10−7 552210.1382419.73 × 10−7
x 0 ( 4 ) 10560.0309372.22 × 10−7 692770.2548149.42 × 10−7
x 0 ( 5 ) 9490.0254637.1 × 10−7 712850.1943467.71 × 10−7
x 0 ( 6 ) 937220.3706964.72 × 10−7221270.0749472.08 × 10−7 672690.1627267.82 × 10−7
Table A7. Numerical results for test problem 7. For each method (RDFPM I, RDFPM II, RDFIA, HDFPM), the columns report the number of iterations (ITER), function evaluations (FVAL), CPU time (TIME), and final residual norm (NORMF); DIM denotes the problem dimension and INP the initial point.
2000 x 0 ( 1 ) 8650.0010031.93 × 10−78590.0003217.63 × 10−718920.0005397.98 × 10−7411650.0008448.83 × 10−7
x 0 ( 2 ) 7610.0008553.17 × 10−7 17860.0004827.65 × 10−7371490.0007529.18 × 10−7
x 0 ( 3 ) 7570.0003465.08 × 10−7 17870.000476.48 × 10−7381530.0007889.12 × 10−7
x 0 ( 4 ) 7570.0003414.86 × 10−7 211070.00069.59 × 10−7421690.0008657.02 × 10−7
x 0 ( 5 ) 8640.0003681.15 × 10−7 211080.0005669.03 × 10−7431730.0008896.55 × 10−7
x 0 ( 6 ) 201690.0009575.73 × 10−7161270.0006152.71 × 10−718910.0012167.09 × 10−7401610.0008487.06 × 10−7
15,000 x 0 ( 1 ) 8610.00797.61 × 10−79670.0097711.16 × 10−7261380.0354568.48 × 10−7451810.0256488.58 × 10−7
x 0 ( 2 ) 8650.0078581.14 × 10−7 18920.0486259.54 × 10−7411650.0200438.92 × 10−7
x 0 ( 3 ) 8650.007652.54 × 10−7 19960.0204049.29 × 10−7421690.0211428.86 × 10−7
x 0 ( 4 ) 8650.0078822.43 × 10−78600.0080046.17 × 10−7271440.0363469.54 × 10−7461850.026766.83 × 10−7
x 0 ( 5 ) 8600.0069154.54 × 10−7 301630.0200728.8 × 10−7461850.0286219.81 × 10−7
x 0 ( 6 ) 293360.040099.45 × 10−7151200.0163929.01 × 10−7221140.0134349.85 × 10−7441770.0245246.75 × 10−7
75,000 x 0 ( 1 ) 9690.020281.55 × 10−7 362070.0750659.66 × 10−7471890.0786678.08 × 10−7
x 0 ( 2 ) 8650.0181472.55 × 10−7 221120.0400937.67 × 10−7431730.0687748.4 × 10−7
x 0 ( 3 ) 8650.0186235.67 × 10−79620.0232633.32 × 10−7231200.0436879.76 × 10−7441770.0646358.34 × 10−7
x 0 ( 4 ) 8650.018735.43 × 10−78500.0159546.37 × 10−7412420.0862129.37 × 10−7471890.0701229.9 × 10−7
x 0 ( 5 ) 9680.0197829.25 × 10−8 472840.1036399.41 × 10−7481930.076249.24 × 10−7
x 0 ( 6 ) 262140.0693981.31 × 10−7151200.0397998.16 × 10−7271440.0514878.02 × 10−7451810.068989.81 × 10−7
150,000 x 0 ( 1 ) 9690.0363142.2 × 10−7 422490.1545769.61 × 10−7481930.1185017.41 × 10−7
x 0 ( 2 ) 8650.0323123.61 × 10−7 231180.0758387.66 × 10−7441770.1103457.71 × 10−7
x 0 ( 3 ) 8610.0310785.78 × 10−7 251320.0850519.52 × 10−7451810.1134937.66 × 10−7
x 0 ( 4 ) 8610.1436955.52 × 10−78540.0344418.02 × 10−7462770.1759779.96 × 10−7481930.1189359.09 × 10−7
x 0 ( 5 ) 9680.0567251.31 × 10−7 613950.2472859.76 × 10−7491970.1214698.48 × 10−7
x 0 ( 6 ) 201640.0838391.82 × 10−7151240.0675889.13 × 10−7311720.1159739.31 × 10−7461850.1256259.01 × 10−7

References

  1. Abubakar, A.B.; Kumam, P.; Mohammad, H. A note on the spectral gradient projection method for nonlinear monotone equations with applications. Comput. Appl. Math. 2020, 39, 129. [Google Scholar] [CrossRef]
  2. Abubakar, A.B.; Kumam, P.; Mohammad, H.; Awwal, A.M. A Barzilai-Borwein gradient projection method for sparse signal and blurred image restoration. J. Frankl. Inst. 2020, 357, 7266–7285. [Google Scholar] [CrossRef]
  3. Yu, H.F.; Huang, F.L.; Lin, C.J. Dual coordinate descent methods for logistic regression and maximum entropy models. Mach. Learn. 2011, 85, 41–75. [Google Scholar] [CrossRef]
  4. Wood, A.J.; Wollenberg, B.F.; Sheblé, G.B. Power Generation, Operation, and Control; John Wiley & Sons: Hoboken, NJ, USA, 2013. [Google Scholar]
  5. Zhao, Y.; Li, D. Monotonicity of fixed point and normal mappings associated with variational inequality and its application. SIAM J. Optim. 2001, 11, 962–973. [Google Scholar] [CrossRef]
  6. Powell, M.J.D. Restart procedures for the conjugate gradient method. Math. Program. 1977, 12, 241–254. [Google Scholar] [CrossRef]
  7. Boland, W.; Kowalik, J. Extended conjugate-gradient methods with restarts. J. Optim. Theory Appl. 1979, 28, 1–9. [Google Scholar] [CrossRef]
  8. Shanno, D.F. Conjugate gradient methods with inexact searches. Math. Oper. Res. 1978, 3, 244–256. [Google Scholar] [CrossRef]
  9. Knockaert, L.; De Zutter, D. Regularization of the moment matrix solution by a nonquadratic conjugate gradient method. IEEE Trans. Antennas Propag. 2000, 48, 812–816. [Google Scholar] [CrossRef]
  10. Dai, Y.H. New properties of a nonlinear conjugate gradient method. Numer. Math. 2001, 89, 83–98. [Google Scholar] [CrossRef]
  11. Jiang, X.Z.; Zhu, Y.H.; Jian, J.B. Two efficient nonlinear conjugate gradient methods with restart procedures and their applications in image restoration. Nonlinear Dyn. 2023, 111, 5469–5498. [Google Scholar] [CrossRef]
  12. Cheng, W. A PRP type method for systems of monotone equations. Math. Comput. Model. 2009, 50, 15–20. [Google Scholar] [CrossRef]
  13. Li, S.; Pang, L.; Xue, M.; Wang, X. Two self-adaptive derivative-free methods with restart procedure for constrained nonlinear equations with applications. J. Appl. Math. Comput. 2024, 70, 6219–6243. [Google Scholar] [CrossRef]
  14. Bojari, S.; Eslahchi, M. Two families of scaled three-term conjugate gradient methods with sufficient descent property for nonconvex optimization. Numer. Algorithms 2020, 83, 901–933. [Google Scholar] [CrossRef]
  15. Abdullahi, M.; Abubakar, A.B.; Feng, Y.; Liu, J. Comment on: “A derivative-free iterative method for nonlinear monotone equations with convex constraints”. Numer. Algorithms 2023, 94, 1551–1560. [Google Scholar] [CrossRef]
  16. Abubakar, A.B.; Kumam, P.; Mohammad, H.; Ibrahim, A.H.; Kiri, A.I. A hybrid approach for finding approximate solutions to constrained nonlinear monotone operator equations with applications. Appl. Numer. Math. 2022, 177, 79–92. [Google Scholar] [CrossRef]
  17. Ibrahim, A.H.; Kumam, P.; Abubakar, A.B.; Jirakitpuwapat, W.; Abubakar, J. A hybrid conjugate gradient algorithm for constrained monotone equations with application in compressive sensing. Heliyon 2020, 6, e03466. [Google Scholar] [CrossRef] [PubMed]
  18. Abubakar, J.; Kumam, P.; Rehman, H.U.; Hassan Ibrahim, A. Inertial iterative schemes with variable step sizes for variational inequality problem involving pseudomonotone operator. Mathematics 2020, 8, 609. [Google Scholar] [CrossRef]
  19. Liu, W.; Jian, J.; Yin, J. An inertial spectral conjugate gradient projection method for constrained nonlinear pseudo-monotone equations. Numer. Algorithms 2024, 97, 985–1015. [Google Scholar] [CrossRef]
  20. Awwal, A.M.; Botmart, T. A new sufficiently descent algorithm for pseudomonotone nonlinear operator equations and signal reconstruction. Numer. Algorithms 2023, 94, 1125–1158. [Google Scholar] [CrossRef]
  21. Liu, J.; Lu, Z.; Xu, J.; Wu, S.; Tu, Z. An efficient projection-based algorithm without Lipschitz continuity for large-scale nonlinear pseudo-monotone equations. J. Comput. Appl. Math. 2022, 403, 113822. [Google Scholar] [CrossRef]
  22. Hadjisavvas, N.; Schaible, S.; Wong, N.C. Pseudomonotone operators: A survey of the theory and its applications. J. Optim. Theory Appl. 2012, 152, 1–20. [Google Scholar] [CrossRef]
  23. Liu, H.; Yao, Y.; Qian, X.; Wang, H. Some nonlinear conjugate gradient methods based on spectral scaling secant equations. Comput. Appl. Math. 2016, 35, 639–651. [Google Scholar] [CrossRef]
  24. Ibrahim, A.H.; Rapajić, S.; Kamandi, A.; Kumam, P.; Papp, Z. Relaxed-inertial derivative-free algorithm for systems of nonlinear pseudo-monotone equations. Comput. Appl. Math. 2024, 43, 239. [Google Scholar] [CrossRef]
  25. Huang, F.; Deng, S.; Tang, J. A derivative-free memoryless BFGS hyperplane projection method for solving large-scale nonlinear monotone equations. Soft Comput. 2023, 27, 3805–3815. [Google Scholar] [CrossRef]
  26. Zhou, W.; Li, D. Limited memory BFGS method for nonlinear monotone equations. J. Comput. Math. 2007, 25, 89–96. [Google Scholar]
  27. Cruz, W.L.; Raydan, M. Nonmonotone spectral methods for large-scale nonlinear systems. Optim. Methods Softw. 2003, 18, 583–599. [Google Scholar] [CrossRef]
  28. Wang, C.; Wang, Y.; Xu, C. A projection method for a system of nonlinear monotone equations with convex constraints. Math. Methods Oper. Res. 2007, 66, 33–46. [Google Scholar] [CrossRef]
  29. Bing, Y.; Lin, G. An efficient implementation of Merrill’s method for sparse or partially separable systems of nonlinear equations. SIAM J. Optim. 1991, 1, 206–221. [Google Scholar] [CrossRef]
  30. Sun, M.; Liu, J. A modified Hestenes–Stiefel projection method for constrained nonlinear equations and its linear convergence rate. J. Appl. Math. Comput. 2015, 49, 145–156. [Google Scholar] [CrossRef]
  31. Liu, J.; Feng, Y. A derivative-free iterative method for nonlinear monotone equations with convex constraints. Numer. Algorithms 2018, 82, 245–262. [Google Scholar] [CrossRef]
  32. Dolan, E.D.; Moré, J.J. Benchmarking optimization software with performance profiles. Math. Program. 2002, 91, 201–213. [Google Scholar] [CrossRef]
  33. Jian, J.; Yin, J.; Tang, C.; Han, D. A family of inertial derivative-free projection methods for constrained nonlinear pseudo-monotone equations with applications. Comput. Appl. Math. 2022, 41, 309. [Google Scholar] [CrossRef]
  34. Chang, C.C.; Lin, C.J. LIBSVM: A library for support vector machines. ACM Trans. Intell. Syst. Technol. (TIST) 2011, 2, 1–27. [Google Scholar] [CrossRef]
Figure 1. Performance profile for iterations.
Figure 2. Performance profile for function evaluations (FVAL).
Figure 3. Performance profile for CPU time.
Table 1. LIBSVM instance statistics and performance metrics (averages over five runs). For each method, the columns report NITER (iterations), NFVAL (function evaluations), CPUT (CPU time), and NormF (final residual norm).

Dataset | Data points N | Variables n | RDFPM I: NITER / NFVAL / CPUT / NormF | RDFPM II: NITER / NFVAL / CPUT / NormF | HDFPM: NITER / NFVAL / CPUT / NormF
a1a.t | 30,956 | 123 | 121.6 / 364.8 / 0.7438 / 9.0798 × 10−7 | 134.2 / 458.6 / 0.9197 / 9.6339 × 10−7 | 213.8 / 899.4 / 1.7939 / 9.7467 × 10−7
a2a.t | 30,296 | 123 | 121.8 / 365.8 / 0.7349 / 9.1922 × 10−7 | 143.6 / 502.6 / 1.0157 / 9.4277 × 10−7 | 220.2 / 929.4 / 1.8716 / 9.1506 × 10−7
a3a.t | 29,376 | 123 | 120.8 / 364.0 / 0.7181 / 9.2259 × 10−7 | 156.0 / 551.0 / 1.0830 / 9.1583 × 10−7 | 216.8 / 913.6 / 1.7915 / 9.3153 × 10−7
a4a.t | 27,780 | 123 | 122.0 / 366.8 / 0.6880 / 9.2408 × 10−7 | 128.2 / 433.6 / 0.8135 / 9.0834 × 10−7 | 216.4 / 912.0 / 1.7091 / 9.4888 × 10−7
a5a.t | 26,147 | 123 | 121.2 / 363.6 / 0.6546 / 9.0098 × 10−7 | 167.2 / 596.6 / 1.0580 / 7.8891 × 10−7 | 218.0 / 919.6 / 1.6275 / 9.3824 × 10−7
a6a.t | 21,341 | 123 | 120.4 / 361.2 / 0.5008 / 9.3515 × 10−7 | 139.8 / 486.6 / 0.6736 / 8.7210 × 10−7 | 216.8 / 914.0 / 1.2638 / 9.6296 × 10−7
a7a.t | 16,461 | 123 | 121.8 / 365.8 / 0.3813 / 9.3367 × 10−7 | 139.8 / 474.0 / 0.4980 / 8.7366 × 10−7 | 212.0 / 891.6 / 0.9357 / 9.7320 × 10−7
a8a.t | 9865 | 122 | 110.8 / 339.6 / 0.1987 / 9.3058 × 10−7 | 146.8 / 501.4 / 0.2925 / 9.0627 × 10−7 | 208.2 / 875.2 / 0.5106 / 9.7813 × 10−7
a9a.t | 16,281 | 122 | 120.2 / 361.2 / 0.3714 / 9.1402 × 10−7 | 148.8 / 517.8 / 0.5377 / 8.8241 × 10−7 | 210.6 / 886.4 / 0.9208 / 9.2898 × 10−7