Abstract
An extragradient-type method for finding the common solutions of two variational inequalities is proposed. A convergence result for the algorithm is given under mild conditions on the algorithm parameters.
Keywords:
variational inequality; extragradient-type method; inverse-strongly-monotone; projection; strong convergence
MSC:
47H05; 47J05; 47J25
1. Introduction
Let $H$ be a real Hilbert space equipped with inner product $\langle\cdot,\cdot\rangle$ and norm $\|\cdot\|$. Let $C\subset H$ be a closed and convex set. Let $A:C\to H$ be a mapping. Recall that the variational inequality (VI) seeks an element $x^*\in C$ such that
$$\langle Ax^*, x - x^*\rangle \ge 0,\quad \forall x\in C. \qquad (1)$$
The solution set of (1) is denoted by $VI(C,A)$.
Problem (1), introduced and studied by Stampacchia [1], has been applied as a useful tool and model for a multitude of problems. A large number of methods for solving the VI (1) are projection methods, which implement projections onto the feasible set $C$ of the VI (1), or onto another set, in order to reach a solution. Several iterative methods for solving the VI (1) have been proposed; see, e.g., [2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30]. A basic one is the natural extension of the gradient projection algorithm for solving the optimization problem $\min_{x\in C}f(x)$ with $A=\nabla f$: for $x_0\in C$, calculate iteratively the sequence $\{x_n\}$ through
$$x_{n+1} = P_C(x_n - \lambda Ax_n),\quad n\ge 0, \qquad (2)$$
where $P_C$ is the metric projection from $H$ onto $C$ and $\lambda>0$ is the step-size.
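To make the iteration concrete, the following minimal Python sketch runs the gradient projection step $x_{n+1}=P_C(x_n-\lambda Ax_n)$ on a hypothetical example: a box feasible set and an affine, strongly monotone operator chosen purely for illustration (neither appears in the paper).

```python
import numpy as np

def project_box(x, lo=0.0, hi=1.0):
    """Metric projection P_C onto the box C = [lo, hi]^n (coordinatewise clip)."""
    return np.clip(x, lo, hi)

def A(x):
    """An illustrative strongly monotone affine operator A(x) = Mx + q."""
    M = np.array([[4.0, 1.0], [1.0, 3.0]])  # symmetric positive definite
    q = np.array([-1.0, -2.0])
    return M @ x + q

x = np.array([1.0, 0.0])   # starting point x_0 in C
lam = 0.1                  # step-size lambda
for _ in range(500):
    x = project_box(x - lam * A(x))

# At a solution x*, the fixed-point residual ||x* - P_C(x* - lam A x*)|| vanishes.
residual = np.linalg.norm(x - project_box(x - lam * A(x)))
print(residual)  # close to 0
```

For a strongly monotone operator this simple one-projection scheme already converges; the extragradient method below is needed when the operator is merely monotone.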
Korpelevich [31] introduced an iterative method for solving the VI (1), known as the extragradient method ([7]). In Korpelevich’s method, two projections are used to compute the next iterate: for the current iterate $x_n$, compute
$$y_n = P_C(x_n - \lambda Ax_n),\qquad x_{n+1} = P_C(x_n - \lambda Ay_n), \qquad (3)$$
where $\lambda\in(0,1/L)$ is a fixed number and $L$ is the Lipschitz constant of $A$.
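The two-projection structure of (3) can be sketched numerically. The rotation operator below is a standard illustrative example of a monotone, 1-Lipschitz operator that is not strongly monotone; the box feasible set and step-size are our own choices for this sketch, not taken from the paper. Its unique solution over the box is the origin.

```python
import numpy as np

def P_C(x):
    return np.clip(x, -1.0, 1.0)    # projection onto the box [-1, 1]^2

def A(x):
    return np.array([x[1], -x[0]])  # skew rotation: monotone, L = 1

lam = 0.5                           # fixed step, lam in (0, 1/L)
x = np.array([1.0, 1.0])
for _ in range(200):
    y = P_C(x - lam * A(x))         # first (predictor) projection
    x = P_C(x - lam * A(y))         # second (corrector) projection

print(np.linalg.norm(x))            # tends to 0, the unique solution
```

On this operator the plain gradient projection step $x - \lambda A x$ increases the norm at every iteration, which is exactly the situation the extra predictor step was designed to handle.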
Korpelevich’s method has received much attention from a range of scholars, who have improved it in several ways; see, e.g., [32,33,34,35,36]. It is now known that Korpelevich’s method (3) achieves only weak convergence in infinite-dimensional spaces ([37,38]). In order to reach strong convergence, Korpelevich’s method has been adapted by many mathematicians. For example, in [32], it is shown that several extragradient-type methods converge strongly to an element of $VI(C,A)$.
Very recently, Censor, Gibali and Reich [39] presented an alternating method for finding common solutions of two variational inequalities. In [40], Zaslavski studied an extragradient method for finding common solutions of a finite family of variational inequalities.
Inspired by the work cited above, in this article we present an extragradient-type method for finding the common solutions of two variational inequalities. We prove the strong convergence of the proposed method under mild assumptions on the parameters.
2. Preliminaries
Let $H$ be a real Hilbert space. Let $C\subset H$ be a nonempty, closed, and convex set.
Definition 1.
An operator $A:C\to H$ is called $L$-Lipschitz if
$$\|Ax - Ay\| \le L\|x - y\|,\quad \forall x, y\in C,$$
where $L>0$ is a constant.
If $L = 1$, we say that $A$ is nonexpansive.
Definition 2.
An operator $A:C\to H$ is called inverse strongly monotone if
$$\langle Ax - Ay, x - y\rangle \ge \alpha\|Ax - Ay\|^2,\quad \forall x, y\in C,$$
where $\alpha>0$ is a constant.
In this case, we say that $A$ is $\alpha$-inverse-strongly-monotone.
Proposition 1
([41]). If $C$ is a bounded, closed, and convex subset of a real Hilbert space $H$ and $A:C\to H$ is an inverse strongly monotone operator, then $VI(C,A)\ne\emptyset$.
For fixed $x\in H$, there exists a unique $\tilde{x}\in C$ satisfying
$$\|x - \tilde{x}\| = \min_{y\in C}\|x - y\|.$$
We denote $\tilde{x}$ by $P_C x$. The following inequality is an important property of the projection $P_C$: for given $x\in H$ and $z\in C$,
$$z = P_C x \iff \langle x - z, y - z\rangle \le 0,\quad \forall y\in C,$$
which is equivalent to
$$\|x - y\|^2 \ge \|x - P_C x\|^2 + \|y - P_C x\|^2,\quad \forall y\in C.$$
It follows that $P_C$ is nonexpansive. We also know that $I - P_C$ is nonexpansive.
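The variational characterization of the projection, $z = P_Cx$ iff $\langle x - z, y - z\rangle \le 0$ for all $y\in C$, can be checked numerically. The closed unit ball used below is an illustrative choice of $C$ with a closed-form projection $x/\max(1,\|x\|)$; it is not a set used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def P_ball(x):
    """Metric projection onto the closed unit ball."""
    return x / max(1.0, np.linalg.norm(x))

x = rng.normal(size=3) * 5.0   # a point (almost surely) outside the ball
z = P_ball(x)                  # its projection onto C

# sample many y in C and verify the inequality <x - z, y - z> <= 0
worst = max(
    float(np.dot(x - z, y - z))
    for y in (P_ball(rng.normal(size=3) * 2.0) for _ in range(1000))
)
print(worst <= 1e-10)          # True: the inequality holds for all samples
```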
Lemma 1
([41]). If $C$ is a closed convex subset of a real Hilbert space $H$ and $A:C\to H$ is an $\alpha$-inverse strongly monotone operator, then, for $\lambda>0$,
$$\|(I - \lambda A)x - (I - \lambda A)y\|^2 \le \|x - y\|^2 + \lambda(\lambda - 2\alpha)\|Ax - Ay\|^2,\quad \forall x, y\in C.$$
Especially, $I - \lambda A$ is nonexpansive provided $\lambda \le 2\alpha$.
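Lemma 1 can be sanity-checked numerically. The linear operator $x\mapsto Mx$ with $M$ symmetric positive semidefinite is $\alpha$-inverse strongly monotone with $\alpha = 1/\lambda_{\max}(M)$ (a standard fact); the concrete matrix $M$ below is an illustrative choice, and we test the boundary case $\lambda = 2\alpha$.

```python
import numpy as np

rng = np.random.default_rng(1)
M = np.array([[2.0, 0.5], [0.5, 1.0]])     # symmetric positive definite
alpha = 1.0 / np.linalg.eigvalsh(M).max()  # ISM constant of x -> Mx
lam = 2.0 * alpha                          # boundary case lam = 2*alpha

ok = True
for _ in range(1000):
    x, y = rng.normal(size=2), rng.normal(size=2)
    lhs = np.linalg.norm((x - lam * M @ x) - (y - lam * M @ y))
    # Lemma 1 with lam <= 2*alpha: I - lam*A is nonexpansive
    ok &= lhs <= np.linalg.norm(x - y) + 1e-12
print(ok)  # True
```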
Lemma 2
([42]). Suppose that $\{x_n\}$ and $\{z_n\}$ are two bounded sequences in a Banach space. Let $\{\beta_n\}\subset(0,1)$ be a sequence satisfying $0 < \liminf_{n\to\infty}\beta_n \le \limsup_{n\to\infty}\beta_n < 1$. Suppose $x_{n+1} = (1-\beta_n)x_n + \beta_n z_n$ and $\limsup_{n\to\infty}(\|z_{n+1} - z_n\| - \|x_{n+1} - x_n\|) \le 0$. Then $\lim_{n\to\infty}\|z_n - x_n\| = 0$.
Lemma 3
([43]). Let $\{a_n\}\subset[0,\infty)$, $\{\gamma_n\}\subset(0,1)$, and $\{\delta_n\}$ be three real number sequences. If $a_{n+1} \le (1-\gamma_n)a_n + \gamma_n\delta_n$ for all $n\ge 0$, with $\sum_{n=0}^{\infty}\gamma_n = \infty$ and $\limsup_{n\to\infty}\delta_n \le 0$ or $\sum_{n=0}^{\infty}|\gamma_n\delta_n| < \infty$, then $\lim_{n\to\infty}a_n = 0$.
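A quick numerical illustration of Lemma 3, with the illustrative choices $\gamma_n = \delta_n = 1/(n+1)$ (so that $\sum_n\gamma_n = \infty$ and $\delta_n\to 0$), showing the recursion driving $a_n$ to zero:

```python
# Simulate a_{n+1} = (1 - gamma_n) a_n + gamma_n delta_n from a_1 = 1.
a = 1.0
for n in range(1, 100001):
    gamma, delta = 1.0 / (n + 1), 1.0 / (n + 1)
    a = (1.0 - gamma) * a + gamma * delta

print(a)  # small: a_n -> 0 as Lemma 3 predicts
```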
3. Main Results
Let $C$ be a convex and closed subset of a real Hilbert space $H$. Let the operators $A:C\to H$ and $B:C\to H$ be inverse strongly monotone, with constants $\alpha$ and $\beta$, respectively. Let four real number sequences of control parameters be given. In the sequel, assume $\Omega := VI(C,A)\cap VI(C,B) \ne \emptyset$.
Motivated by the algorithms presented in [31,39,40], we present the following iterative Algorithm 1 for finding the common solution of two variational inequalities.
| Algorithm 1: |
| For an initial value $x_0$. Assume that the sequence $\{x_n\}$ has been constructed. Compute the next iterate $x_{n+1}$ in the following manner |
Suppose that the control parameters satisfy assumptions (C1)–(C4).
We will divide our main result into several propositions.
Proposition 2.
The sequence $\{x_n\}$ generated by (5) is bounded.
Proof.
Choose any . Note that for any . Hence,
From Lemma 1, we know that and are nonexpansive. Thus, from (7), we get
So,
It follows that
Then $\{x_n\}$ is bounded, and hence the sequences , and are all bounded. □
Proposition 3.
The following two conclusions hold
Proof.
Let . It is clear that S is nonexpansive. Set for all . Then, we can rewrite in (5) as
where for all .
Hence,
Hence,
Using the nonexpansivity of and $S$, we deduce
Next, we estimate . By (5), we get
Since and , we derive that and . At the same time, note that , , and are bounded. Therefore,
Applying Lemma 2, we derive
Hence,
It follows that
This implies that
Thus
and it follows that
Therefore,
Since and , we derive
This concludes the proof. □
Proposition 4.
, where .
Proof.
Let be a subsequence of satisfying
By the boundedness of , we can choose a subsequence of such that .
Next, we demonstrate that the weak limit belongs to $\Omega$. First, we prove that it belongs to $VI(C,A)$. Let $N_C(v)$ be the normal cone of $C$ at $v\in C$; i.e., $N_C(v) = \{w\in H : \langle w, u - v\rangle \le 0,\ \forall u\in C\}$. Define a mapping $T$ by the formula
$$Tv = \begin{cases} Av + N_C(v), & v\in C,\\ \emptyset, & v\notin C.\end{cases}$$
Then $T$ is maximal monotone.
Let . Since and , we deduce . According to , we obtain
that is,
Thus,
Noting that and , we deduce . Hence, and thus the weak limit belongs to $VI(C,A)$.
Next, we show that the weak limit also belongs to $VI(C,B)$. Define a mapping $\tilde{T}$ as follows:
$$\tilde{T}v = \begin{cases} Bv + N_C(v), & v\in C,\\ \emptyset, & v\notin C.\end{cases}$$
Let . Since and , we obtain . By virtue of , we obtain
that is,
Therefore, we have
Noting that and , we get . Hence, and VI. Thus, and we have
□
Finally, we prove our main result.
Theorem 1.
Suppose that $\Omega := VI(C,A)\cap VI(C,B) \ne \emptyset$. Assume that the control parameters satisfy restrictions (C1)–(C4). Then the sequence $\{x_n\}$ defined by (5) converges strongly to a point of $\Omega$.
Proof.
First, we have Propositions 2–4 in hand. In terms of (4), we have
It follows that
Therefore,
By Lemma 3 and the above inequality, we deduce . This completes the proof. □
If we take , then we have the following Algorithm 2.
| Algorithm 2: |
| For an initial value $x_0$. Assume that the sequence $\{x_n\}$ has been constructed. Compute the next iterate $x_{n+1}$ in the following manner |
Corollary 1.
Suppose that $\Omega := VI(C,A)\cap VI(C,B) \ne \emptyset$. Assume that the control parameters satisfy restrictions (C1)–(C4). Then the sequence $\{x_n\}$ defined by (11) converges strongly to the minimum-norm element of $\Omega$.
4. Conclusions
In this paper, we investigated the variational inequality problem. We suggested an extragradient-type method for finding the common solutions of two variational inequalities and proved its strong convergence under mild conditions. Note that in our suggested iterative sequence (Equation (5)), the involved operators $A$ and $B$ are required to be inverse strongly monotone. A natural question arises: can these assumptions be weakened?
Author Contributions
All the authors have contributed equally to this paper. All the authors have read and approved the final manuscript.
Funding
This research was partially supported by the grants NSFC61362033 and NZ17015.
Acknowledgments
The authors are thankful to the anonymous referees for their careful corrections and valuable comments on the original version of this paper.
Conflicts of Interest
The authors declare no conflict of interest.
References
- Stampacchia, G. Formes bilinéaires coercitives sur les ensembles convexes. C. R. Acad. Sci. 1964, 258, 4413–4416. [Google Scholar]
- Alber, Y.-I.; Iusem, A.-N. Extension of subgradient techniques for nonsmooth optimization in Banach spaces. Set Valued Anal. 2001, 9, 315–335. [Google Scholar] [CrossRef]
- Bello Cruz, J.-Y.; Iusem, A.-N. A strongly convergent direct method for monotone variational inequalities in Hilbert space. Numer. Funct. Anal. Optim. 2009, 30, 23–36. [Google Scholar] [CrossRef]
- Cho, S.-Y.; Qin, X.; Yao, J.-C.; Yao, Y. Viscosity approximation splitting methods for monotone and nonexpansive operators in Hilbert spaces. J. Nonlinear Convex Anal. 2018, 19, 251–264. [Google Scholar]
- Dong, Q.-L.; Cho, Y.-J.; Zhong, L.-L.; Rassias, T.-M. Inertial projection and contraction algorithms for variational inequalities. J. Glob. Optim. 2018, 7, 687–704. [Google Scholar] [CrossRef]
- Dong, Q.-L.; Cho, Y.-J.; Rassias, T.-M. The projection and contraction methods for finding common solutions to variational inequality problems. Optim. Lett. 2018, 12, 1871–1896. [Google Scholar] [CrossRef]
- Facchinei, F.; Pang, J.-S. Finite-Dimensional Variational Inequalities and Complementarity Problems; Springer: New York, NY, USA, 2003; Volumes 1 and 2. [Google Scholar]
- He, B.-S.; Yang, Z.-H.; Yuan, X.-M. An approximate proximal-extragradient type method for monotone variational inequalities. J. Math. Anal. Appl. 2004, 300, 362–374. [Google Scholar] [CrossRef]
- Bello Cruz, J.-Y.; Iusem, A.-N. Convergence of direct methods for paramonotone variational inequalities. Comput. Optim. Appl. 2010, 46, 247–263. [Google Scholar] [CrossRef]
- Iiduka, H.; Takahashi, W. Weak convergence of a projection algorithm for variational inequalities in a Banach space. J. Math. Anal. Appl. 2008, 339, 668–679. [Google Scholar] [CrossRef]
- Li, C.-L.; Jia, Z.-F.; Postolache, M. New convergence methods for nonlinear uncertain variational inequality problems. J. Nonlinear Convex Anal. 2018, 19, 2153–2164. [Google Scholar]
- Lions, J.-L.; Stampacchia, G. Variational inequalities. Comm. Pure Appl. Math. 1967, 20, 493–517. [Google Scholar] [CrossRef]
- Lu, X.; Xu, H.-K.; Yin, X. Hybrid methods for a class of monotone variational inequalities. Nonlinear Anal. 2009, 71, 1032–1041. [Google Scholar] [CrossRef]
- Solodov, M.-V.; Svaiter, B.-F. A new projection method for variational inequality problems. SIAM J. Control Optim. 1999, 37, 765–776. [Google Scholar] [CrossRef]
- Solodov, M.-V.; Tseng, P. Modified projection-type methods for monotone variational inequalities. SIAM J. Control Optim. 1996, 34, 1814–1830. [Google Scholar] [CrossRef]
- Thakur, B.-S.; Postolache, M. Existence and approximation of solutions for generalized extended nonlinear variational inequalities. J. Inequal. Appl. 2013, 2013, 590. [Google Scholar] [CrossRef]
- Wang, S.-H.; Zhao, M.-L.; Kumam, P.; Cho, Y.-J. A viscosity extragradient method for an equilibrium problem and fixed point problem in Hilbert space. J. Fixed Point Theory Appl. 2018, 20, 19. [Google Scholar] [CrossRef]
- Xu, H.-K.; Kim, T.-H. Convergence of hybrid steepest-descent methods for variational inequalities. J. Optim. Theory Appl. 2003, 119, 185–201. [Google Scholar] [CrossRef]
- Yao, Y.; Chen, R.; Xu, H.-K. Schemes for finding minimum-norm solutions of variational inequalities. Nonlinear Anal. 2010, 72, 3447–3456. [Google Scholar] [CrossRef]
- Yao, Y.; Liou, Y.-C.; Kang, S.-M. Approach to common elements of variational inequality problems and fixed point problems via a relaxed extragradient method. Comput. Math. Appl. 2010, 59, 3472–3480. [Google Scholar] [CrossRef]
- Yao, Y.; Liou, Y.-C.; Postolache, M. Self-adaptive algorithms for the split problem of the demicontractive operators. Optimization 2018, 67, 1309–1319. [Google Scholar] [CrossRef]
- Yao, Y.; Postolache, M.; Liou, Y.-C. Variant extragradient-type method for monotone variational inequalities. Fixed Point Theory Appl. 2013, 2013, 185. [Google Scholar] [CrossRef]
- Yao, Y.; Postolache, M.; Liou, Y.-C.; Yao, Z.-S. Construction algorithms for a class of monotone variational inequalities. Optim. Lett. 2016, 10, 1519–1528. [Google Scholar] [CrossRef]
- Yao, Y.; Qin, X.; Yao, J.-C. Projection methods for firmly type nonexpansive operators. J. Nonlinear Convex Anal. 2018, 19, 407–415. [Google Scholar]
- Yao, Y.; Shahzad, N. Strong convergence of a proximal point algorithm with general errors. Optim. Lett. 2012, 6, 621–628. [Google Scholar] [CrossRef]
- Yao, Y.; Yao, J.-C.; Liou, Y.-C.; Postolache, M. Iterative algorithms for split common fixed points of demicontractive operators without priori knowledge of operator norms. Carpath. J. Math. 2018, 34, 459–466. [Google Scholar]
- Zegeye, H.; Yao, Y. Minimum-norm solution of variational inequality and fixed point problem in Banach spaces. Optimization 2015, 64, 453–471. [Google Scholar] [CrossRef]
- Zhao, J.; Liang, Y.-S.; Liu, Y.-L.; Cho, Y.-J. Split equilibrium, variational inequality and fixed point problems for multi-valued mappings in Hilbert spaces. Appl. Comput. Math. 2018, 17, 271–283. [Google Scholar]
- Yao, Y.; Postolache, M.; Yao, J.-C. An iterative algorithm for solving the generalized variational inequalities and fixed points problems. Mathematics 2019, 7, 61. [Google Scholar] [CrossRef]
- Yao, Y.; Postolache, M.; Yao, J.-C. Iterative algorithms for generalized variational inequalities. Univ. Politeh. Buch. Ser. A 2019, in press. [Google Scholar]
- Korpelevich, G.-M. An extragradient method for finding saddle points and for other problems. Ekon. Mat. Metody 1976, 12, 747–756. [Google Scholar]
- Bnouhachem, A.; Noor, M.-A.; Hao, Z. Some new extragradient iterative methods for variational inequalities. Nonlinear Anal. 2009, 70, 1321–1329. [Google Scholar] [CrossRef]
- Censor, Y.; Gibali, A.; Reich, S. Strong convergence of subgradient extragradient methods for the variational inequality problem in Hilbert space. Optim. Methods Softw. 2011, 26, 827–845. [Google Scholar] [CrossRef]
- Iusem, A.-N.; Svaiter, B.-F. A variant of Korpelevich’s method for variational inequalities with a new search strategy. Optimization 1997, 42, 309–321. [Google Scholar] [CrossRef]
- Iusem, A.-N.; Lucambio Pérez, L.-R. An extragradient-type algorithm for non-smooth variational inequalities. Optimization 2000, 48, 309–332. [Google Scholar] [CrossRef]
- Khobotov, E.-N. Modification of the extra-gradient method for solving variational inequalities and certain optimization problems. USSR Comput. Math. Math. Phys. 1989, 27, 120–127. [Google Scholar] [CrossRef]
- Censor, Y.; Gibali, A.; Reich, S. Extensions of Korpelevich’s extragradient method for the variational inequality problem in Euclidean space. Optimization 2012, 61, 1119–1132. [Google Scholar] [CrossRef]
- Censor, Y.; Gibali, A.; Reich, S. The subgradient extragradient method for solving variational inequalities in Hilbert space. J. Optim. Theory Appl. 2011, 148, 318–335. [Google Scholar] [CrossRef]
- Censor, Y.; Gibali, A.; Reich, S. A von Neumann alternating method for finding common solutions to variational inequalities. Nonlinear Anal. 2012, 75, 4596–4603. [Google Scholar] [CrossRef]
- Zaslavski, A.-J. The extragradient method for finding a common solution of a finite family of variational inequalities and a finite family of fixed point problems in the presence of computational errors. J. Math. Anal. Appl. 2013, 400, 651–663. [Google Scholar] [CrossRef]
- Takahashi, W.; Toyoda, M. Weak convergence theorems for nonexpansive mappings and monotone mappings. J. Optim. Theory Appl. 2003, 118, 417–428. [Google Scholar] [CrossRef]
- Suzuki, T. Strong convergence theorems for infinite families of nonexpansive mappings in general Banach spaces. Fixed Point Theory Appl. 2005, 2005, 103–123. [Google Scholar] [CrossRef]
- Xu, H.-K. Iterative algorithms for nonlinear operators. J. Lond. Math. Soc. 2002, 66, 240–256. [Google Scholar] [CrossRef]
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).