Modified Inertial Subgradient Extragradient Method with Regularization for Variational Inequality and Null Point Problems
Abstract
1. Introduction
Recall that a mapping $A:H\to H$ on a real Hilbert space $H$ is said to be:
- (i) monotone, if $\langle Ax-Ay,\,x-y\rangle \geq 0$ for all $x,y\in H$;
- (ii) $\eta$-strongly monotone, if there exists a number $\eta>0$ such that $\langle Ax-Ay,\,x-y\rangle \geq \eta\|x-y\|^{2}$ for all $x,y\in H$;
- (iii) $\alpha$-inverse strongly monotone, if there exists a positive number $\alpha$ such that $\langle Ax-Ay,\,x-y\rangle \geq \alpha\|Ax-Ay\|^{2}$ for all $x,y\in H$;
- (iv) $k$-Lipschitz continuous, if there exists $k>0$ such that $\|Ax-Ay\| \leq k\|x-y\|$ for all $x,y\in H$;
- (v) nonexpansive, if $\|Ax-Ay\| \leq \|x-y\|$ for all $x,y\in H$.
- The projection of $p$ onto a closed ball $B(a,r)=\{x\in H:\|x-a\|\leq r\}$ is computed by $P_{B(a,r)}(p)=a+\dfrac{r}{\max\{\|p-a\|,\,r\}}(p-a)$.
- The projection of $p$ onto a half-space $H_{u,v}=\{x\in H:\langle u,x\rangle\leq v\}$ is computed by $P_{H_{u,v}}(p)=p-\max\Big\{\dfrac{\langle u,p\rangle-v}{\|u\|^{2}},\,0\Big\}u$.
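As a quick numerical check of these two closed-form projections, here is a short Python sketch; the function names `proj_ball` and `proj_halfspace` are ours, not from the paper.

```python
import numpy as np

def proj_ball(p, a, r):
    """Projection of p onto the closed ball B(a, r) = {x : ||x - a|| <= r}."""
    d = p - a
    return a + (r / max(np.linalg.norm(d), r)) * d

def proj_halfspace(p, u, v):
    """Projection of p onto the half-space {x : <u, x> <= v}."""
    excess = np.dot(u, p) - v
    if excess <= 0:                 # p already lies in the half-space
        return p
    return p - (excess / np.dot(u, u)) * u

# Quick check: project a point outside the unit ball and outside a half-space.
p = np.array([3.0, 4.0])
print(proj_ball(p, np.zeros(2), 1.0))                  # -> [0.6, 0.8]
print(proj_halfspace(p, np.array([1.0, 0.0]), 1.0))    # -> [1.0, 4.0]
```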
Algorithm 1: The subgradient extragradient algorithm (SEGM).
Initialization: Set $\tau\in(0,1/L)$, where $L$ is the Lipschitz constant of $A$, and let $x_0\in H$ be arbitrary.
Step 1. Given $x_n$, compute
$$y_n = P_C(x_n - \tau A x_n),$$
and construct the half-space $T_n$ the bounding hyperplane of which supports $C$ at $y_n$,
$$T_n = \{x\in H : \langle x_n - \tau A x_n - y_n,\, x - y_n\rangle \leq 0\}.$$
Step 2. Calculate the next iterate
$$x_{n+1} = P_{T_n}(x_n - \tau A y_n).$$
Step 3. If $x_n = y_n$, then stop. Otherwise, set $n:=n+1$ and return to Step 1.
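For readers who prefer the iteration in executable form, the following is a minimal Python sketch of SEGM; the affine test operator, the box constraint, the stopping tolerance, and the names `segm` and `proj_C` are illustrative assumptions of ours rather than part of the original algorithm statement.

```python
import numpy as np

def segm(A, proj_C, x0, tau, max_iter=1000, tol=1e-8):
    """Subgradient extragradient method (Algorithm 1) for VI(C, A).

    A      : callable, the (monotone, Lipschitz continuous) operator
    proj_C : callable, Euclidean projection onto the feasible set C
    tau    : stepsize, assumed to satisfy tau < 1/L for the Lipschitz constant L
    """
    x = x0
    for _ in range(max_iter):
        Ax = A(x)
        y = proj_C(x - tau * Ax)
        if np.linalg.norm(x - y) < tol:       # Step 3: x_n = y_n (approximately)
            break
        # Step 2: project onto the half-space T_n instead of C itself.
        u = x - tau * Ax - y                  # normal of the bounding hyperplane at y_n
        z = x - tau * A(y)
        excess = np.dot(u, z - y)
        x = z - (excess / np.dot(u, u)) * u if excess > 0 else z
    return x

# Illustrative problem (our assumption): A(x) = M x + q on the box [0, 1]^2.
M = np.array([[2.0, 1.0], [-1.0, 2.0]])       # monotone: symmetric part is positive definite
q = np.array([-1.0, -1.0])
A = lambda x: M @ x + q
proj_C = lambda x: np.clip(x, 0.0, 1.0)
L = np.linalg.norm(M, 2)                      # Lipschitz constant of A
print(segm(A, proj_C, np.zeros(2), tau=0.9 / L))
```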
Algorithm 2: Self-adaptive subgradient extragradient method (SSEGM).
Initialization: Set $\lambda_0>0$, $\mu\in(0,1)$, and $n=0$, and let $x_0\in H$ be arbitrary.
Step 1. Given $x_n$, compute
$$y_n = P_C(x_n - \lambda_n A x_n).$$
If $x_n = y_n$, then stop: $y_n$ is a solution. Otherwise, go to Step 2.
Step 2. Construct the half-space $T_n$ the bounding hyperplane of which supports $C$ at $y_n$,
$$T_n = \{x\in H : \langle x_n - \lambda_n A x_n - y_n,\, x - y_n\rangle \leq 0\},$$
and calculate
$$x_{n+1} = P_{T_n}(x_n - \lambda_n A y_n).$$
Step 3. Compute the next stepsize, where $\lambda_{n+1}$ is updated by
$$\lambda_{n+1} = \begin{cases}\min\Big\{\dfrac{\mu\|x_n-y_n\|}{\|Ax_n-Ay_n\|},\ \lambda_n\Big\}, & \text{if } Ax_n\neq Ay_n,\\[2mm] \lambda_n, & \text{otherwise}.\end{cases}$$
Step 4. Set $n:=n+1$ and return to Step 1.
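The key ingredient of the SSEGM is the self-adaptive stepsize rule, which removes the need to know the Lipschitz constant of $A$ in advance. A minimal sketch of that rule, with variable names of our choosing, could look as follows:

```python
import numpy as np

def update_stepsize(lam, mu, x, y, Ax, Ay):
    """Self-adaptive stepsize rule of SSEGM (no Lipschitz constant needed).

    Returns min(mu * ||x - y|| / ||Ax - Ay||, lam) when Ax != Ay,
    and leaves the stepsize unchanged otherwise.
    """
    denom = np.linalg.norm(Ax - Ay)
    if denom > 0.0:
        return min(mu * np.linalg.norm(x - y) / denom, lam)
    return lam
```

Because the rule only ever takes a minimum with the previous value, the generated stepsize sequence is nonincreasing; this is precisely the behavior that the nonmonotonic criteria of Algorithms 3 and 4 are designed to avoid.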
Algorithm 3: Self-adaptive viscosity-type inertial subgradient extragradient method with nonmonotonic stepsizes (SSEGMN).
Initialization: Set $\lambda_1>0$, $\mu\in(0,1)$, $\theta>0$, and a contraction $f$ on $H$; choose sequences $\{\beta_n\}\subset(0,1)$, $\{\varepsilon_n\}$, and $\{p_n\}$ with $\lim_{n\to\infty}\beta_n=0$, $\sum_{n}\beta_n=\infty$, $\varepsilon_n=o(\beta_n)$, $p_n\geq 0$, and $\sum_{n}p_n<\infty$; and let $x_0,x_1\in H$ be arbitrary.
Step 1. Given $x_{n-1}$ and $x_n$, compute
$$w_n = x_n + \theta_n(x_n - x_{n-1}),$$
where
$$\theta_n = \begin{cases}\min\Big\{\theta,\ \dfrac{\varepsilon_n}{\|x_n-x_{n-1}\|}\Big\}, & \text{if } x_n\neq x_{n-1},\\[2mm] \theta, & \text{otherwise}.\end{cases}$$
Step 2. Compute
$$y_n = P_C(w_n - \lambda_n A w_n),$$
and construct the half-space $T_n$ the bounding hyperplane of which supports $C$ at $y_n$,
$$T_n = \{x\in H : \langle w_n - \lambda_n A w_n - y_n,\, x - y_n\rangle \leq 0\}.$$
Step 3. Compute
$$z_n = P_{T_n}(w_n - \lambda_n A y_n).$$
Step 4. Compute
$$x_{n+1} = \beta_n f(x_n) + (1-\beta_n)z_n,$$
where $\lambda_{n+1}$ is updated by
$$\lambda_{n+1} = \begin{cases}\min\Big\{\dfrac{\mu\|w_n-y_n\|}{\|Aw_n-Ay_n\|},\ \lambda_n+p_n\Big\}, & \text{if } Aw_n\neq Ay_n,\\[2mm] \lambda_n+p_n, & \text{otherwise}.\end{cases}$$
Step 5. Set $n:=n+1$ and return to Step 1.
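Putting the inertial extrapolation, the viscosity step, and the nonmonotonic stepsize together, a loop of this type can be sketched in Python as follows; the contraction $f$, the affine test problem, and all parameter sequences below are illustrative assumptions on our part and are not taken from the cited paper.

```python
import numpy as np

def ssegmn(A, proj_C, f, x0, lam0=1.0, mu=0.5, theta=0.3, max_iter=500):
    """Sketch of a viscosity-type inertial subgradient extragradient loop
    with a nonmonotonic self-adaptive stepsize (all parameters illustrative)."""
    x_prev, x, lam = x0.copy(), x0.copy(), lam0
    for n in range(1, max_iter + 1):
        beta = 1.0 / (n + 1)                # beta_n -> 0 and sum beta_n = infinity
        eps, p = beta / n, 1.0 / n ** 2     # eps_n = o(beta_n), sum p_n < infinity
        # Step 1: inertial extrapolation w_n = x_n + theta_n (x_n - x_{n-1}).
        diff = np.linalg.norm(x - x_prev)
        theta_n = min(theta, eps / diff) if diff > 0 else theta
        w = x + theta_n * (x - x_prev)
        # Step 2: forward step and projection onto C.
        Aw = A(w)
        y = proj_C(w - lam * Aw)
        Ay = A(y)
        # Step 3: projection onto the half-space T_n supporting C at y_n.
        u = w - lam * Aw - y
        z = w - lam * Ay
        excess = np.dot(u, z - y)
        z = z - (excess / np.dot(u, u)) * u if excess > 0 else z
        # Step 4: viscosity step and nonmonotonic stepsize update.
        x_prev, x = x, beta * f(x) + (1.0 - beta) * z
        denom = np.linalg.norm(Aw - Ay)
        lam = min(mu * np.linalg.norm(w - y) / denom, lam + p) if denom > 0 else lam + p
    return x

# Illustrative data (our assumption): affine monotone operator on the box [0, 1]^2.
M = np.array([[2.0, 1.0], [-1.0, 2.0]])
q = np.array([-1.0, -1.0])
A = lambda x: M @ x + q
proj_C = lambda x: np.clip(x, 0.0, 1.0)
f = lambda x: 0.5 * x                       # a simple contraction on H
print(ssegmn(A, proj_C, f, np.zeros(2)))
```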
2. Preliminaries
3. Main Results
- (C1) , where $\tau$ is that in Lemma 9;
- (C2) , ;
- (C3) ;
- (C4) ;
- (C5) .
Algorithm 4: Modified inertial subgradient extragradient method with regularization (MSEMR).
Initialization: Set , , , , , and . Choose a nonnegative real sequence such that . Let be arbitrary.
Step 1. Compute , where
Step 2. Compute and construct the half-space the bounding hyperplane of which supports C at ,
Step 3. Calculate , where is updated by
Step 4. Set and return to Step 1.
- Note that the stepsizes adopted in several of the methods mentioned above are monotonically decreasing, which may affect the execution efficiency of those methods. The MSEMR, by contrast, adopts a new nonmonotonic stepsize criterion that overcomes the drawback of the monotonically decreasing stepsize sequences generated by the other methods (see the stepsize sketch after this list).
- One of the advantages of the MSEMR is its strong convergence, which is preferable to the weak convergence exhibited by the earlier methods in infinite-dimensional spaces.
- The strong convergence of the MSEMR comes from the regularization technique, which differs from the Halpern iteration and the viscosity method used in the related methods mentioned above.
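To make the stepsize remark concrete, the following sketch (ours, not code from the paper) contrasts the monotone self-adaptive rule used by the SSEGM with a nonmonotonic rule in which a summable perturbation $p_n$ lets the stepsize grow again after an overly conservative cut:

```python
import numpy as np

def monotone_step(lam, mu, x, y, Ax, Ay):
    """Classical self-adaptive rule: the stepsize can only shrink."""
    d = np.linalg.norm(Ax - Ay)
    return min(mu * np.linalg.norm(x - y) / d, lam) if d > 0 else lam

def nonmonotone_step(lam, mu, x, y, Ax, Ay, p_n):
    """Nonmonotonic rule: a summable perturbation p_n lets the stepsize recover."""
    d = np.linalg.norm(Ax - Ay)
    bound = mu * np.linalg.norm(x - y) / d if d > 0 else np.inf
    return min(bound, lam + p_n)
```

Because $\sum_n p_n < \infty$, the perturbations cannot drive the stepsize to infinity, yet they prevent it from being trapped at an unnecessarily small value, which is the efficiency issue noted above.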
4. Application to Split Minimization Problems
Algorithm 5: Modified inertial subgradient extragradient method for split minimization problems (MSESM).
Initialization: Set , , , , , and . Choose a nonnegative real sequence such that . Let be arbitrary.
Step 1. Compute , where
Step 2. Compute and construct the half-space the bounding hyperplane of which supports C at ,
Step 3. Calculate , where is updated by
Step 4. Set and return to Step 1.
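Split minimization problems are typically handled through proximity operators. As a generic illustration of that building block (not the MSESM itself), the soft-thresholding map below is the proximity operator of $\gamma\|\cdot\|_1$, one of the most common ingredients in this setting:

```python
import numpy as np

def prox_l1(x, gamma):
    """Proximity operator of gamma * ||.||_1, i.e., componentwise soft thresholding."""
    return np.sign(x) * np.maximum(np.abs(x) - gamma, 0.0)

print(prox_l1(np.array([1.5, -0.2, 0.7]), 0.5))   # -> [ 1.  -0.   0.2]
```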
5. Numerical Illustrations
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Fichera, G. Sul problema elastostatico di Signorini con ambigue condizioni al contorno. Atti Accad. Naz. Lincei Rend. Cl. Sci. Fis. Mat. Nat. 1963, 34, 138–142.
- Hartman, P.; Stampacchia, G. On some nonlinear elliptic differential functional equations. Acta Math. 1966, 115, 271–310.
- Song, Y.; Chen, X. Analysis of Subgradient Extragradient Method for Variational Inequality Problems and Null Point Problems. Symmetry 2022, 14, 636.
- Yao, Y.; Shehu, Y.; Li, X.H.; Dong, Q.L. A method with inertial extrapolation step for split monotone inclusion problems. Optimization 2021, 70, 741–761.
- Ogwo, G.N.; Izuchukwu, C.; Mewomo, O.T. Inertial methods for finding minimum-norm solutions of the split variational inequality problem beyond monotonicity. Numer. Algor. 2021, 88, 1419–1456.
- Kazmi, K.R.; Rizvi, S.H. Iterative approximation of a common solution of a split equilibrium problem, a variational inequality problem and a fixed point problem. J. Egypt. Math. Soc. 2013, 21, 44–51.
- Song, Y.L.; Ceng, L.C. Convergence theorems for accretive operators with nonlinear mappings in Banach spaces. Abstr. Appl. Anal. 2014, 12, 1–12.
- Jolaoso, L.O.; Karahan, I. A general alternative regularization method with line search technique for solving split equilibrium and fixed point problems in Hilbert spaces. Comput. Appl. Math. 2020, 39, 1–22.
- Korpelevich, G.M. An extragradient method for finding saddle points and for other problems. Matecon 1976, 12, 747–756.
- Akashi, S.; Takahashi, W. Weak convergence theorem for an infinite family of demimetric mappings in a Hilbert space. J. Nonlinear Convex Anal. 2016, 10, 2159–2169.
- Yao, Y.; Cho, Y.J.; Liou, Y.C. Algorithms of common solutions for variational inclusions, mixed equilibrium problems and fixed point problems. Eur. J. Oper. Res. 2011, 212, 242–250.
- Alvarez, F.; Attouch, H. An inertial proximal method for monotone operators via discretization of a nonlinear oscillator with damping. Set-Valued Anal. 2001, 9, 3–11.
- Song, Y.L.; Ceng, L.C. Strong convergence of a general iterative algorithm for a finite family of accretive operators in Banach spaces. Fixed Point Theory Appl. 2015, 2015, 1–24.
- Tan, B.; Qin, X.; Yao, J.C. Strong convergence of self-adaptive inertial algorithms for solving split variational inclusion problems with applications. J. Sci. Comput. 2021, 87, 1–34.
- Censor, Y.; Gibali, A.; Reich, S. The subgradient extragradient method for solving variational inequalities in Hilbert space. J. Optim. Theory Appl. 2011, 148, 318–335.
- Fulga, A.; Afshari, H.; Shojaat, H. Common fixed point theorems on quasi-cone metric space over a divisible Banach algebra. Adv. Differ. Equ. 2021, 2021, 1–15.
- Yang, J.; Liu, H.; Liu, Z. Modified subgradient extragradient algorithms for solving monotone variational inequalities. Optimization 2018, 67, 2247–2258.
- Tan, B.; Qin, X. Self adaptive viscosity-type inertial extragradient algorithms for solving variational inequalities with applications. Math. Model. Anal. 2022, 27, 41–58.
- Goebel, K.; Kirk, W.A. Topics in Metric Fixed Point Theory; Cambridge University Press: Cambridge, UK, 1990.
- Goebel, K.; Reich, S. Uniform Convexity, Hyperbolic Geometry, and Nonexpansive Mappings; Dekker: New York, NY, USA, 1984.
- Cottle, R.W.; Yao, J.C. Pseudo-monotone complementarity problems in Hilbert space. J. Optim. Theory Appl. 1992, 75, 281–295.
- Xu, H.K. Iterative algorithm for nonlinear operators. J. Lond. Math. Soc. 2002, 2, 1–17.
- Xu, H.K. Viscosity approximation methods for nonexpansive mappings. J. Math. Anal. Appl. 2004, 298, 279–291.
- Iiduka, H.; Takahashi, W. Strong convergence theorems for nonexpansive mappings and inverse-strongly monotone mappings. Nonlinear Anal. Theory Methods Appl. 2005, 61, 341–350.
- Zhou, H.; Zhou, Y.; Feng, G. Iterative methods for solving a class of monotone variational inequality problems with applications. J. Inequal. Appl. 2015, 2015, 1–17.
- Meng, S.; Xu, H.K. Remarks on the gradient-projection algorithm. J. Nonlinear Anal. Optim. 2010, 1, 35–43.