Abstract
The forward–backward–forward (FBF) splitting method is a popular iterative procedure for finding zeros of the sum of a maximal monotone operator and a Lipschitz continuous monotone operator. In this paper, we introduce a forward–backward–forward splitting method with reflection steps (symmetric) in real Hilbert spaces. Weak and strong convergence analyses of the proposed method are established under suitable assumptions. Moreover, a linear convergence rate of an inertial modified forward–backward–forward splitting method is also presented.
1. Introduction
In this paper we are concerned with solving monotone inclusion problems, for which we introduce a new self-adaptive, reflected forward–backward–forward method. Monotone inclusions appear naturally and play an important role in many applied fields, such as fixed point problems, equilibrium problems and many more; see, for example, [1,2,3,4,5,6,7,8]. More precisely, various problems in signal processing, computer vision and machine learning can be modelled mathematically using this formulation; see, for example, [9] and the references therein.
Let us recall the definition of the monotone inclusion problem. Given a maximal monotone operator B and a Lipschitz continuous monotone operator A defined on a real Hilbert space H, the monotone inclusion problem is formulated as finding a point x ∈ H such that

0 ∈ Ax + Bx. (1)
One of the simplest and most popular methods for solving (1) is the well-known forward–backward splitting method, introduced by Passty [7] and Lions and Mercier [5]. The iterative step of the method is phrased as follows:

x_{n+1} = J_{λB}(x_n − λA x_n), λ > 0, (2)

where J_{λB} := (I + λB)^{−1} denotes the resolvent of the maximal monotone operator B. Tseng, in [10], introduced a modification of (2) that includes an extra step that enables us to obtain convergence under weaker assumptions than those mentioned above.
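As a concrete illustration (ours, not taken from the paper), scheme (2) can be sketched in Python for the toy splitting A = ∇f with f(x) = ½‖Mx − b‖² and B = ∂‖·‖₁, whose resolvent J_{λB} is the soft-thresholding operator; the data M, b and the step size are illustrative choices:

```python
import numpy as np

def soft_threshold(x, t):
    # Resolvent (proximal map) of t * ||.||_1
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

# Toy problem (illustrative): A = grad f with f(x) = 0.5*||M x - b||^2,
# B = subdifferential of ||.||_1
rng = np.random.default_rng(0)
M = rng.standard_normal((20, 5))
b = rng.standard_normal(20)

A = lambda x: M.T @ (M @ x - b)      # forward operator (co-coercive here)
L = np.linalg.norm(M.T @ M, 2)       # Lipschitz constant of A
lam = 1.0 / L                        # step size in (0, 2/L)

x = np.zeros(5)
for _ in range(500):
    # x_{n+1} = J_{lam B}(x_n - lam * A(x_n))
    x = soft_threshold(x - lam * A(x), lam)
```

Here λ = 1/L lies within the range λ ∈ (0, 2/L) for which forward–backward iterations are known to converge when A is co-coercive.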
The Tseng [10] iterative step is formulated as follows:

y_n = J_{λ_n B}(x_n − λ_n A x_n),
x_{n+1} = y_n − λ_n (A y_n − A x_n), (3)

where λ_n ∈ (0, 1/L) (L is the Lipschitz constant of A) is obtained using an Armijo line search rule, as seen in ([10] (2.4)). The forward–backward–forward algorithm (3) has been studied extensively in the literature due to its applicability—e.g., in [11,12,13,14,15,16,17,18,19,20,21].
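For comparison, here is a minimal sketch (ours) of Tseng's step (3), with a fixed step size λ ∈ (0, 1/L) in place of the Armijo rule, on the same kind of toy splitting (A the gradient of a smooth quadratic, B = ∂‖·‖₁):

```python
import numpy as np

def soft_threshold(x, t):
    # Resolvent J_{tB} of B = subdifferential of ||.||_1
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

rng = np.random.default_rng(1)
M = rng.standard_normal((20, 5))
b = rng.standard_normal(20)

A = lambda x: M.T @ (M @ x - b)      # monotone, Lipschitz forward operator
L = np.linalg.norm(M.T @ M, 2)
lam = 0.9 / L                        # fixed step size in (0, 1/L)

x = np.zeros(5)
for _ in range(500):
    y = soft_threshold(x - lam * A(x), lam)   # backward step on the forward point
    x = y - lam * (A(y) - A(x))               # extra forward correction (Tseng)
```

The extra forward correction x_{n+1} = y_n − λ(A y_n − A x_n) is what allows convergence under mere monotonicity of A, rather than co-coercivity.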
Recently, Malitsky and Tam [22] introduced the following forward–reflected–backward splitting method for solving (1).
where and is defined via a line search procedure, as seen in ([22] Algorithm 1).
Observe that the iterative methods (4) and (5) coincide for a particular choice of the step sizes, but (4) is more general due to the freedom in choosing them. In any case, a major drawback of these two methods is that they require prior knowledge of the Lipschitz constant of A, which, in most applications, is either unknown or difficult to approximate. Furthermore, when a line search procedure is used as an inner loop, it might involve extra computations, resulting in slow convergence and rendering the method inefficient.
Very recently, Hieu et al., in [24], proposed a self-adaptive forward–backward splitting variant that does not depend on the Lipschitz constant of A and requires no line search procedure.
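A common self-adaptive rule of this type (our sketch; the exact rule of [24] may differ in its details) updates the step from the two most recent operator evaluations, so that no global Lipschitz constant is ever needed:

```python
import numpy as np

def adaptive_step(lam, mu, x, y, Ax, Ay):
    # The new step never exceeds mu * ||x - y|| / ||A x - A y||,
    # a local estimate of mu / L, so no global Lipschitz constant is needed.
    d = np.linalg.norm(Ax - Ay)
    if d > 0:
        return min(lam, mu * np.linalg.norm(x - y) / d)
    return lam
```

Since ‖Ax − Ay‖ ≤ L‖x − y‖, the update keeps the generated steps bounded below by min{λ₀, μ/L} > 0 while producing a non-increasing step sequence, which is the key to dispensing with line searches.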
Motivated by the above recent developments in the field of algorithms for solving inclusion problems (1), our contributions in this paper are:
- We propose a new reflected forward–backward–forward iterative method for solving inclusion problems that has a different structure from the methods proposed in [10,22,23,24].
- Our scheme is a self-adaptive procedure that requires neither prior knowledge of the Lipschitz constant of A nor a line search procedure.
- We also propose a modification of the forward–backward–forward method with an inertial extrapolation step and obtain a linear convergence result under some standard assumptions.
To provide deeper understanding and motivation, we next present the relations between dynamical systems and monotone inclusions.
Dynamical Systems and Monotone Inclusions
The forward–backward splitting method (2) can be interpreted as a discretization of the dynamical system (see [25,26])
which consequently takes the form
as a monotone inclusion (1). Therefore, (7) can be considered as the dynamical system for the monotone inclusion (1), where A is co-coercive.
Now, let us consider the monotone inclusion (1) for which A is monotone and Lipschitz continuous on H. To solve (1), let us consider the dynamical system, in the spirit of (8):
where is a Lebesgue measurable function. Observe that the dynamical system (9) is not explicit because appears on both sides of (9). The dynamical system (9) is different from the second order dynamical system considered in ([27] Section 2).
Using the forward discretization on the left-hand side and the backward discretization on the right-hand side of (9), we have
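To make the two discretizations concrete, here is a generic sketch (ours, for an abstract system ẋ(t) = F(x(t)) with step size h > 0; the specific right-hand side of (9) is not reproduced here):

```latex
% Forward (explicit) discretization evaluates F at the current iterate:
\frac{x_{n+1} - x_n}{h} = F(x_n)
\quad\Longrightarrow\quad
x_{n+1} = x_n + h\,F(x_n).

% Backward (implicit) discretization evaluates F at the new iterate:
\frac{x_{n+1} - x_n}{h} = F(x_{n+1})
\quad\Longrightarrow\quad
x_{n+1} = (\operatorname{Id} - hF)^{-1}(x_n),
% i.e., a resolvent step when -F is a maximal monotone operator.
```

Mixing the two, as above, produces forward evaluations of the single-valued operator and a resolvent (backward) step for the multi-valued one.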
2. Preliminaries
We give some lemmas for our analysis.
Lemma 1
(See [2]). The following statements hold in H:
- (i)
- for all ;
- (ii)
- for all
- (iii)
- .
Lemma 2
(Maingé [28]). Let and be sequences in such that
and there exists a real number θ with for all Then the following hold:
- (i)
- where
- (ii)
- there exists such that
Lemma 3
(Opial [29]). Let C be a nonempty subset of H and {x_n} be a sequence in H such that the following two conditions hold:
- (i) for any z ∈ C, lim_{n→∞} ‖x_n − z‖ exists;
- (ii) every sequential weak cluster point of {x_n} is in C.
Then {x_n} converges weakly to a point in C.
Lemma 4
(Xu [30]). Let be a sequence of nonnegative real numbers satisfying the following relation:
where
- (a)
- (b)
- ;
- (c)
Then, as .
Definition 1.
A mapping A : H → H is called
- (a) strongly monotone with modulus γ > 0 on H if ⟨Ax − Ay, x − y⟩ ≥ γ‖x − y‖² for all x, y ∈ H. In this case, we say that A is γ-strongly monotone;
- (b) monotone on H if ⟨Ax − Ay, x − y⟩ ≥ 0 for all x, y ∈ H;
- (c) co-coercive with modulus β > 0 on H if ⟨Ax − Ay, x − y⟩ ≥ β‖Ax − Ay‖² for all x, y ∈ H;
- (d) Lipschitz continuous on H if there exists a constant L > 0 such that ‖Ax − Ay‖ ≤ L‖x − y‖ for all x, y ∈ H.
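For intuition, these properties are easy to verify numerically for the linear operator A(x) = Mx with M symmetric positive semidefinite (our illustration): A is monotone, ‖M‖-Lipschitz continuous and, in this symmetric case, (1/‖M‖)-co-coercive:

```python
import numpy as np

rng = np.random.default_rng(2)
Q = rng.standard_normal((5, 5))
M = Q.T @ Q                        # symmetric positive semidefinite
L = np.linalg.norm(M, 2)           # operator norm = Lipschitz constant of A

A = lambda v: M @ v

x, y = rng.standard_normal(5), rng.standard_normal(5)
d, Ad = x - y, A(x) - A(y)

monotone   = d @ Ad >= 0                                  # <Ax - Ay, x - y> >= 0
lipschitz  = np.linalg.norm(Ad) <= L * np.linalg.norm(d) + 1e-10
cocoercive = d @ Ad >= (Ad @ Ad) / L - 1e-10              # modulus 1/L (symmetric case)
```

Note that co-coercivity is the strongest of the three: it implies both monotonicity and Lipschitz continuity, while a merely monotone Lipschitz operator (e.g., a rotation) need not be co-coercive.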
Definition 2.
A multi-valued operator B : H → 2^H with graph Gr(B) = {(x, u) ∈ H × H : u ∈ Bx} is said to be monotone if ⟨x − y, u − v⟩ ≥ 0 for any (x, u), (y, v) ∈ Gr(B).
A monotone operator B is said to be maximal if whenever B′ : H → 2^H is monotone with Gr(B) ⊆ Gr(B′), then B = B′. For more details, see, for instance, [31].
Lemma 5
(See [2]). Let B : H → 2^H be a maximal monotone mapping and A : H → H be a mapping. Define a mapping T := J_{λB}(I − λA), λ > 0.
Then Fix(T) = (A + B)^{−1}(0), where Fix(T) denotes the set of all fixed points of T.
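As a quick sanity check of this correspondence (our one-dimensional illustration), take A(x) = x − 1 and B = ∂|·|: then x* = 0 solves 0 ∈ Ax + Bx, since 0 ∈ −1 + [−1, 1], and x* is a fixed point of T = J_{λB}(I − λA):

```python
def soft_threshold(x, t):
    # Resolvent J_{tB} of B = subdifferential of |.| (scalar case)
    return (abs(x) - t) * (1 if x > 0 else -1) if abs(x) > t else 0.0

A = lambda x: x - 1.0              # monotone and 1-Lipschitz
lam = 0.5                          # illustrative step size in (0, 1/L)

T = lambda x: soft_threshold(x - lam * A(x), lam)   # T = J_{lam B}(I - lam A)

residual = abs(T(0.0) - 0.0)       # x* = 0 should be a fixed point of T
```

Iterating T from any starting point drives the iterates to the zero x* = 0, in line with the lemma.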
3. Our Results
We present two new forward–backward–forward algorithms and analyze them under suitable conditions. We assume that the following conditions hold for the rest of this paper.
Assumption 1
- (a)
- Let B : H → 2^H be a maximal monotone operator and let A : H → H be monotone and L-Lipschitz continuous.
- (b)
- The solution set of the inclusion problem (1) is nonempty.
- (c)
Remark 1.
Lemma 6.
The sequence generated by Algorithm 1 is bounded.
Algorithm 1 Forward–Backward–Forward Algorithm 1
Proof.
Let us define . Then
We now show that
From , we obtain . Noting that B is maximal monotone, we obtain , such that
Therefore,
Furthermore, and . Since is maximal monotone, one has
Thus,
Using Algorithm 1, we get
which in turn implies that
Note that
and this implies
Additionally, by Lemma 1 (iii),
Define
Since , we have
By condition (c) of Assumption 1, one gets
Therefore, is non-increasing. Similarly,
Note that
From (32), we have
This implies
Therefore, Additionally, from (16), we get
From (27), we get
Hence, is bounded. Therefore is bounded. □
Theorem 1.
The sequence generated by Algorithm 1 converges weakly to a point in the solution set of (1).
Proof.
By (24), we have
Therefore,
Now, and are bounded by the boundedness of . Hence, there exists such that (noting (21))
Therefore,
Additionally, by the boundedness of the sequence, there exists a weakly convergent subsequence.
Let . Thus, and so . Hence, , which turns to
B is maximal monotone, giving
Thus,
This implies by the maximal monotonicity of (see ([2] Corollary 24.4(i))) that . Thus .
Since, by (39), the limit exists, Lemma 3 (Opial) shows that the sequence converges weakly to a point in the solution set. This completes the proof. □
If A is strongly monotone and Lipschitz continuous on H, we show that converges strongly in Algorithm 1. Note that in this case the splitting operator in Lemma 5 is a contraction mapping and hence is a singleton.
Theorem 2.
Suppose A is strongly monotone and Lipschitz continuous on H. Then the sequence generated by Algorithm 1 converges strongly to the unique solution of (1).
Proof.
Take unique point . From the definition of , there exists , such that
From , there exists , such that . Given that , , we have
Since is strongly monotone, there exists such that
Since and the sequence is monotonically decreasing, we have . Let be a fixed number in the interval . Additionally, since , there exists such that . So, , we have
Consequently,
Therefore
So,
Since is bounded by Lemma 6 and by (36), we obtain . Hence . Consequently, we get
as . This concludes the proof. □
Remark 2.
- 1.
- Observe that the convergence Theorems 1 and 2 assume that the mapping A is monotone and Lipschitz continuous. In case the Lipschitz constant L of A is known or can be easily evaluated, one can choose constant step sizes in Algorithm 1 (depending on L) and the convergence theorems remain valid. This and our adaptive step-size rule are quite flexible and general, and thus extend several related results in the literature—e.g., in [22,23,24,32,33].
- 2.
- In case we incorporate a general inertial term whose coefficient is not necessarily 1, then Lemma 6 and Theorems 1 and 2 still hold. Moreover, with this general term and under η-strong monotonicity and L-Lipschitz continuity, we are able to present linear convergence of the next algorithm.
Theorem 3.
Suppose that A is η-strongly monotone and L-Lipschitz continuous on H. Then the sequence generated by Algorithm 2 converges linearly to the unique solution of (1).
Algorithm 2 Forward–Backward–Forward Algorithm 2
Proof.
Since , we therefore have
Observe that since . We obtain from (55) that
where since .
Denote . Then (56) implies
Therefore,
This concludes the proof. □
We give some remarks about the contributions of our proposed methods and the consequent improvements over some related methods in the literature.
Remark 3.
(a) The proposed methods in [22,23] use a fixed constant step size which depends on the Lipschitz constant of the forward operator A. This approach is quite restrictive and has limited applications, since the Lipschitz constant or an estimate of it must be known before the methods in [22,23] can be applied. Our proposed method in Algorithm 1 uses a self-adaptive step size in (15), which is more widely applicable and avoids any recourse to the Lipschitz constant or an estimate of it.
(b) In our proposed methods in this paper, the forward–backward step acts on the reflection 2x_n − x_{n−1} of the iterates, which accelerates the proposed methods. This is not the case in the methods proposed in [22,23,24]; in those methods, the forward–backward step does not act on the reflection.
(c) We give a linear convergence rate in Theorem 3. No linear convergence for the proposed methods in [22,23] is given.
4. Conclusions
In this work, we study a Tseng-type algorithm with a reflection step for solving monotone inclusions in real Hilbert spaces. We propose two variants and establish their weak and strong convergence under suitable conditions, as well as a linear convergence rate under stronger assumptions. Our contributions in this paper show that the Tseng algorithm can be modified with an extrapolation (reflection) term while retaining a full convergence analysis. This approach has not been considered before in the literature. Our work generalizes and extends some related results in the literature, such as those in [10,22,23,24]. Among the continuing projects that can be studied further are splitting algorithms for finding a zero of the sum of three monotone operators, two of which are maximal monotone and the third Lipschitz continuous—e.g., in [16,33].
Author Contributions
Analysis, Y.S. and A.G.; Investigation, Y.S.; Methodology, Y.S.; Visualization, A.G.; Writing—original draft, Y.S.; Writing—review and editing, A.G. All authors have read and agreed to the published version of the manuscript.
Funding
This research received no external funding.
Conflicts of Interest
The authors declare no conflict of interest.
References
- Aoyama, K.; Kimura, Y.; Takahashi, W. Maximal monotone operators and maximal monotone functions for equilibrium problems. J. Convex Anal. 2008, 15, 395–409. [Google Scholar]
- Bauschke, H.H.; Combettes, P.L. Convex Analysis and Monotone Operator Theory in Hilbert Spaces, CMS Books in Mathematics; Springer: New York, NY, USA, 2011. [Google Scholar]
- Chen, G.H.-G.; Rockafellar, R.T. Convergence rates in forward-backward splitting. SIAM J. Optim. 1997, 7, 421–444. [Google Scholar] [CrossRef]
- Combettes, P.; Wajs, V.R. Signal recovery by proximal forward-backward splitting. Multiscale Model. Simul. 2005, 4, 1168–1200. [Google Scholar] [CrossRef]
- Lions, P.L.; Mercier, B. Splitting algorithms for the sum of two nonlinear operators. SIAM J. Numer. Anal. 1979, 16, 964–979. [Google Scholar] [CrossRef]
- Moudafi, A.; Thera, M. Finding a zero of the sum of two maximal monotone operators. J. Optim. Theory Appl. 1997, 94, 425–448. [Google Scholar] [CrossRef]
- Passty, G.B. Ergodic convergence to a zero of the sum of monotone operators in Hilbert spaces. J. Math. Anal. Appl. 1979, 72, 383–390. [Google Scholar] [CrossRef]
- Peaceman, D.H.; Rachford, H.H. The numerical solutions of parabolic and elliptic differential equations. J. Soc. Indust. Appl. Math. 1955, 3, 28–41. [Google Scholar] [CrossRef]
- Beck, A.; Teboulle, M. Gradient-Based Algorithms with Applications to Signal Recovery Problems. In Convex Optimization in Signal Processing and Communications; Yonina, E., Daniel, P., Eds.; Cambridge University Press: Cambridge, UK, 2009. [Google Scholar]
- Tseng, P. A modified forward-backward splitting method for maximal monotone mappings. SIAM J. Control Optim. 2000, 38, 431–446. [Google Scholar] [CrossRef]
- Alves, M.M.; Geremia, M. Iteration complexity of an inexact Douglas-Rachford method and of a Douglas-Rachford-Tseng’s F-B four-operator splitting method for solving monotone inclusions. Numer. Algorithms 2019, 82, 263–295. [Google Scholar] [CrossRef]
- Boţ, R.I.; Csetnek, E.R. An inertial Tseng’s type proximal algorithm for nonsmooth and nonconvex optimization problems. J. Optim. Theory Appl. 2016, 171, 600–616. [Google Scholar] [CrossRef]
- Cholamjiak, W.; Cholamjiak, P.; Suantai, S. An inertial forward-backward splitting method for solving inclusion problems in Hilbert spaces. J. Fixed Point Theory Appl. 2018, 20, 42. [Google Scholar] [CrossRef]
- Gibali, A.; Thong, D.V. Tseng type methods for solving inclusion problems and its applications. Calcolo 2018, 55, 49. [Google Scholar] [CrossRef]
- Khatibzadeh, H.; Moroşanu, G.; Ranjbar, S. A splitting method for approximating zeros of the sum of two monotone operators. J. Nonlinear Convex Anal. 2017, 18, 763–776. [Google Scholar]
- Latafat, P.; Patrinos, P. Asymmetric forward-backward-adjoint splitting for solving monotone inclusions involving three operators. Comput. Optim. Appl. 2017, 68, 57–93. [Google Scholar] [CrossRef]
- Shehu, Y. Convergence results of forward-backward algorithms for sum of monotone operators in Banach spaces. Results Math. 2019, 74, 138. [Google Scholar] [CrossRef]
- Shehu, Y.; Cai, G. Strong convergence result of forward-backward splitting methods for accretive operators in Banach spaces with applications. Rev. R. Acad. Cienc. Exactas Fís. Nat. Ser. A Math. RACSAM 2018, 112, 71–87. [Google Scholar] [CrossRef]
- Thong, D.V.; Vuong, P.T. Modified Tseng’s extragradient methods for solving pseudo-monotone variational inequalities. Optimization 2019, 68, 2203–2222. [Google Scholar] [CrossRef]
- Thong, D.V.; Vinh, N. The Inertial methods for fixed point problems and zero point problems of the sum of two monotone mappings. Optimization 2019, 68, 1037–1072. [Google Scholar] [CrossRef]
- Wang, Y.; Wang, F. Strong convergence of the forward-backward splitting method with multiple parameters in Hilbert spaces. Optimization 2018, 67, 493–505. [Google Scholar] [CrossRef]
- Malitsky, Y.; Tam, M.K. A Forward-Backward splitting method for monotone inclusions without cocoercivity. arXiv 2020, arXiv:1808.04162. [Google Scholar] [CrossRef]
- Csetnek, E.R.; Malitsky, Y.; Tam, M.K. Shadow Douglas–Rachford Splitting for Monotone Inclusions. Appl. Math Optim. 2019, 80, 665–678. [Google Scholar] [CrossRef]
- Van Hieu, D.; Anh, P.K.; Muu, L.D. Modified forward–backward splitting method for variational inclusions. 4OR-Q J. Oper. Res. 2020. [Google Scholar] [CrossRef]
- Abbas, B.; Attouch, H.; Svaiter, B.F. Newton-Like Dynamics and Forward-Backward Methods for Structured Monotone Inclusions in Hilbert Spaces. J. Optim. Theory Appl. 2014, 161, 331–360. [Google Scholar] [CrossRef]
- Attouch, H.; Cabot, A. Convergence of a Relaxed Inertial Forward–Backward Algorithm for Structured Monotone Inclusions. Appl. Math. Optim. 2019. [Google Scholar] [CrossRef]
- Boţ, R.I.; Sedlmayer, M.; Vuong, P.T. A relaxed inertial Forward-Backward-Forward algorithm for solving monotone inclusions with application to GANs. arXiv 2020, arXiv:2003.07886. [Google Scholar]
- Maingé, P.-E. Convergence theorems for inertial KM-type algorithms. J. Comput. Appl. Math. 2008, 219, 223–236. [Google Scholar] [CrossRef]
- Opial, Z. Weak convergence of the sequence of successive approximations for nonexpansive mappings. Bull. Am. Math. Soc. 1967, 73, 591–597. [Google Scholar] [CrossRef]
- Xu, H.K. Iterative algorithms for nonlinear operators. J. London. Math. Soc. 2002, 66, 240–256. [Google Scholar] [CrossRef]
- Barbu, V. Nonlinear Semigroups and Differential Equations in Banach Spaces; Editura Academiei RS Romania: Bucharest, Romania, 1976. [Google Scholar]
- Cevher, V.; Vũ, B.C. A reflected Forward-Backward splitting method for monotone inclusions involving Lipschitzian operators. Set-Valued Var. Anal. 2020. [Google Scholar] [CrossRef]
- Rieger, J.; Tam, M.K. Backward-Forward-Reflected-Backward splitting for three operator monotone inclusions. Appl. Math. Comput. 2020, 381, 125248. [Google Scholar] [CrossRef]
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).