Unified Semi-Local Convergence for k-Step Iterative Methods with Flexible and Frozen Linear Operator

Abstract: The aim of this article is to present a unified semi-local convergence analysis for a k-step iterative method containing the inverse of a flexible and frozen linear operator for Banach space valued operators. Special choices of the linear operator reduce the method to Newton-type, Newton's, Stirling's, Steffensen's, or other methods. The analysis is based on center- as well as Lipschitz conditions and on our idea of the restricted convergence region. This idea yields a region containing the iterates that is at least as small as before, and consequently a tighter convergence analysis.


Introduction
Let X, Y be Banach spaces and D ⊂ X be a nonempty open set. By L(X, Y), we denote the space of bounded linear operators from X into Y. Let also U(w, d) stand for the open ball centered at w ∈ X with radius d > 0 and Ū(w, d) stand for its closure.
That is why researchers and practitioners have developed iterative methods that, on the one hand, avoid the computation of derivatives and, on the other hand, achieve a high order of convergence. In particular, we unify the study of such methods by considering k-step iterative methods with a frozen linear operator, defined for each n = 0, 1, 2, . . . by

$$x_n^{(0)} = x_n, \qquad x_n^{(i)} = x_n^{(i-1)} - A(x_n)^{-1} F\big(x_n^{(i-1)}\big), \quad i = 1, 2, \dots, k, \qquad x_{n+1} = x_n^{(k)}, \tag{4}$$

where the linear operator A(x_n) is kept frozen throughout the k inner steps. Special choices of the operator A lead to well-known methods. If k = 1 and A(x) = F′(x) for each x ∈ D, we obtain Newton's method (2), whereas, if k = 1, 2, . . . and A(x) = F′(x) for each x ∈ D, we obtain a method whose semi-local convergence was given in [12].
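For intuition only, the k-step scheme with a frozen operator can be sketched as below. The test system F, the choice A = F′ (the Newton-type special case), and the starting point are hypothetical choices for demonstration and are not taken from the paper.

```python
import numpy as np

def k_step_frozen(F, A, x0, k=3, tol=1e-12, max_outer=100):
    """k-step iteration with a frozen linear operator: within each outer
    step n, A(x_n) is computed once and reused for all k inner corrections:
        y^(0) = x_n,
        y^(i) = y^(i-1) - A(x_n)^{-1} F(y^(i-1)),  i = 1, ..., k,
        x_{n+1} = y^(k).
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_outer):
        An = A(x)                      # frozen operator for this outer step
        y = x.copy()
        for _ in range(k):             # the k inner steps reuse A(x_n)
            y = y - np.linalg.solve(An, F(y))
        x = y
        if np.linalg.norm(F(x)) < tol:
            break
    return x

# Hypothetical test problem: F(x, y) = (x^2 + y - 2, x + y^2 - 2),
# with A chosen as the exact Jacobian (the Newton-type special case).
F = lambda v: np.array([v[0]**2 + v[1] - 2.0, v[0] + v[1]**2 - 2.0])
A = lambda v: np.array([[2.0 * v[0], 1.0], [1.0, 2.0 * v[1]]])

root = k_step_frozen(F, A, x0=[1.5, 0.5], k=3)
```

Freezing A(x_n) means a single factorization (here, a single matrix) serves all k inner solves, which is the computational appeal of such schemes.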
If A(x) = [g_1(x), g_2(x); F] for each x ∈ D, where [·, ·; F] denotes a divided difference of order one and g_1 : X −→ X, g_2 : X −→ X, we obtain Steffensen-type methods. Stirling's and other one-point methods are also special cases of method (4). Based on the above, it is important to study the semi-local convergence analysis of method (4). It is well known that, as the convergence order increases, the convergence region in general decreases. To avoid this problem as well, we introduce a center-Lipschitz-type condition that helps us determine a region containing the iterates {x_n} that is at least as small as before. This way, the resulting Lipschitz constants are at least as small, and a tighter convergence analysis is obtained.
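As a scalar illustration of the Steffensen-type choice, the linear operator can be taken to be the first-order divided difference A(x) = (F(x + F(x)) − F(x)) / F(x), which requires no derivative. The test equation and starting point below are hypothetical.

```python
def steffensen(F, x0, tol=1e-12, max_iter=50):
    """Derivative-free Steffensen iteration: the derivative F'(x) is
    replaced by the divided difference (F(x + F(x)) - F(x)) / F(x)."""
    x = float(x0)
    for _ in range(max_iter):
        fx = F(x)
        if abs(fx) < tol:
            break
        # Divided-difference approximation of the derivative.
        a = (F(x + fx) - fx) / fx
        x = x - fx / a
    return x

# Hypothetical example: solve x^2 - 2 = 0 starting from x0 = 1.5.
root = steffensen(lambda x: x * x - 2.0, 1.5)
```

Like Newton's method, this iteration converges quadratically near a simple root, but it evaluates only F itself.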
The rest of the article is organized as follows: Section 2 contains the conditions to be used in the semi-local convergence that follows in Section 3. Final remarks are given in the concluding Section 4.

Semi-Local Convergence
We need some auxiliary results to show the semi-local convergence of method (4).

Lemma 1. Suppose that there exists r > 1 such that x Then, method (4) is well defined.

It follows that x_0^{(i)} and x_0^{(k)} = x_1 belong in U(x_0, rη). Define:

Next, we study method (4) for n = 1 in a way analogous to n = 0. It follows from Lemma 1 that A(x_1)^{-1} ∈ L(Y, X) and: Hence, x where: Define, as previously, Then, we have again that:

Next, we continue for n = 2. By Lemma 1, A(x_2)^{-1} ∈ L(Y, X) and: Notice that β_2 = β_1. Then, for i = 1, and since 2, we get as in (14): x where. Then, as before, we can write:

We are motivated by the preceding items to define the recurrent relations: Hence, we arrive at:

Lemma 3. Suppose that the hypotheses of Lemma 1 hold. Then, x

Proof. As in the cases n = 1, 2, we get for each n = 1, 2, 3, . . .: That is, we obtain: F(x

Define the function ϕ on the interval [0, 1] by: We have that ϕ(0) = −1 and ϕ(1) = 1 > 0. It then follows from the intermediate value theorem that the equation ϕ(t) = 0 has at least one solution in (0, 1). Denote by s the smallest such solution. Notice that, for: a simple inductive argument shows that: and: Hence, we arrive at:

Lemma 4. Suppose that (20) holds. Then, the sequences {h_n} and {T_n(r)} are decreasing.
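The intermediate value theorem argument above is constructive: since ϕ(0) = −1 < 0 and ϕ(1) > 0, the smallest zero s ∈ (0, 1) can be located numerically, e.g., by bisection. The polynomial below is a hypothetical stand-in for ϕ, whose exact form depends on the constants of the analysis; it merely reproduces the sign pattern ϕ(0) = −1, ϕ(1) = 1.

```python
def bisect(phi, a=0.0, b=1.0, tol=1e-12):
    """Locate a zero of phi in (a, b) by bisection, assuming
    phi(a) < 0 < phi(b), as the intermediate value theorem requires."""
    fa = phi(a)
    while b - a > tol:
        m = 0.5 * (a + b)
        if fa * phi(m) <= 0:
            b = m              # the sign change lies in [a, m]
        else:
            a, fa = m, phi(m)  # the sign change lies in [m, b]
    return 0.5 * (a + b)

# Hypothetical stand-in with phi(0) = -1 and phi(1) = 1, as in the text.
phi = lambda t: t**3 + t - 1.0
s = bisect(phi)
```

For an increasing ϕ, as here, the zero found is automatically the smallest one; in general, one would scan (0, 1) for the first sign change before bisecting.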

Proof. It follows immediately from (19).
Taking into account x_n^{(i)} ∈ U(x_0, rη_n) and (20)–(22), we can obtain in turn the estimate: where we also used that: Then, we can show:

Theorem 1. Suppose that Condition (A) is satisfied and, for each fixed number of steps k, the equation: has at least one positive solution. Denote by r the smallest such solution. Moreover, suppose that (20) is satisfied and U(x_0, rη) ⊆ D. Then, the sequence {x_n} generated by method (4) is well defined, remains in Ū(x_0, rη) for each n = 0, 1, 2, . . ., i = 1, 2, . . ., k, and converges to a solution x* ∈ Ū(x_0, rη) of the equation F(x) = 0. The solution x* is unique in D_1.
Proof. It follows from the previous results that x_n^{(i)} and x_n^{(k)} = x_{n+1} belong in U(x_0, rη). We must show that the sequence {x_n} is complete: so {x_n} is complete in the Banach space X and, as such, converges to some x* ∈ Ū(x_0, rη), since Ū(x_0, rη) is a closed set. Moreover, we have: By (a4) and (a6), we get in turn that:

Remark 1. As noted in the Introduction, even if specialized to A(x) = F′(x), Theorem 1 can give better results, since K_0 ≤ K. As an example, consider the uniqueness result in [12], where r < 2/(βKη) = r_1, but r_0 < r_1 for K_0 < K.
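To make the remark concrete, here is a small arithmetic sketch with hypothetical values of β, η, and Lipschitz constants K_0 ≤ K (none of which are taken from the paper). It illustrates the general mechanism: a bound of the form 2/(β·C·η) built from the smaller center constant C = K_0 is never more restrictive than the same bound built from the full constant C = K.

```python
# Hypothetical constants for illustration only; in the analysis K_0 <= K
# always holds, since the center-Lipschitz condition is implied by the
# full Lipschitz condition.
beta, eta = 1.0, 0.4
K0, K = 0.8, 1.2                # center and full Lipschitz constants

bound_K = 2.0 / (beta * K * eta)    # bound built from K, as in [12]
bound_K0 = 2.0 / (beta * K0 * eta)  # the analogous bound built from K0

# With K0 <= K, the K0-based bound is at least as large, i.e., the
# resulting condition admits at least as many starting points.
assert bound_K0 >= bound_K
```

The strict inequality K_0 < K is what produces the strictly better comparison discussed in the remark.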

Conclusions
We presented a semi-local convergence analysis for a k-step iterative method with a flexible and frozen linear operator. The results obtained in this article reduce to the ones given in [1,2,12] if we choose A(x) = F′(x) for each x ∈ D. Moreover, in this special case, our results have the following advantages over these works: (1) a larger convergence region, admitting more initial points; (2) tighter upper bound estimates on ‖x_{n+1} − x_n‖, as well as ‖x_n − x*‖, which means that fewer iterations are needed to reach a desired error tolerance; (3) information on the location of the solution that is at least as precise.
These advantages are obtained because we locate a ball inside the old ball that contains the iterates. The Lipschitz constants then depend on the smaller ball, which is why they are at least as small as the old ones. It is also worth noticing that these advantages come at no extra cost, because the new constants are special cases of the old ones; that is, no additional effort is required to compute them. A plethora of numerical examples where the new constants are strictly smaller than the old ones can be found in [3–8]. Finally, other choices of the operator A lead to methods not studied before.