In view of the above, in order to obtain global convergence results, it is natural to turn to fixed point results. To do this, we first transform the quadratic matrix Equation (
1) into a fixed point matrix equation. There are different ways to carry out this transformation. So, if
with
, then a fixed matrix of the operator
T must be a solution of Equation (
1). The Successive Approximations and Picard methods are iterative schemes that we can use. The Successive Approximations Method for operator
T is given by
and the well-known Picard method [
11]:
with
given by
. As indicated above, it is easy to verify that both the Successive Approximations and the Picard methods provide the same iterations, since
. A fixed matrix of
P is a fixed matrix of
T. However, the different algorithmic expressions of both methods allow us to obtain different convergence results for them.
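The fixed point iteration behind the Successive Approximations Method is straightforward to implement numerically. The sketch below is illustrative only: it assumes Equation (1) has the standard quadratic form A X² + B X + C = 0 and takes T(X) = −B⁻¹(A X² + C) as the fixed point operator; both the equation form and this particular choice of T are assumptions, not the paper's exact definitions.

```python
import numpy as np

def successive_approximations(A, B, C, X0, tol=1e-12, max_iter=200):
    """Fixed point iteration X_{k+1} = T(X_k) with T(X) = -B^{-1}(A X^2 + C),
    one common reformulation of the quadratic matrix equation
    A X^2 + B X + C = 0 (assumed form, for illustration)."""
    Binv = np.linalg.inv(B)  # B is assumed nonsingular
    X = X0.copy()
    for _ in range(max_iter):
        X_new = -Binv @ (A @ X @ X + C)
        # stop when successive iterates agree to within tol (Frobenius norm)
        if np.linalg.norm(X_new - X) < tol:
            return X_new
        X = X_new
    return X
```

For instance, with A = I, B = −3I, C = 2I (so the scalar analogue x² − 3x + 2 = 0 holds entrywise), starting from the null matrix, the iteration contracts towards the solution X = I.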
Now, to define the operator
T, it is important to do so in such a way that the Successive Approximations Method is stable [
12]. Thus, as can be seen in [
13], if we consider
, then the Successive Approximations Method given in (
4) is stable. So, we consider the following algorithm:
Obviously, in this situation, a fixed matrix of
T is a solution of (
1).
2.1. The Successive Approximations Method
To start our study, we are going to consider local convergence for the Successive Approximations Method given in (
6). In this case, we need to require that there exists
a fixed matrix of
T in the domain
. Then, by imposing conditions on
R and applying Theorem 1, we obtain conditions for method (
6) to be convergent for any starting matrix
in
.
Taking into account that , we can deduce the following result.
Lemma 1. Suppose that is a fixed matrix for the operator T and there exists such that . Then, for each with , the following items are satisfied:
- (i)
for each there exists and , with
- (ii)
,
- (iii)
.
Proof. Firstly, we consider
for
. Therefore, by the Banach lemma, there exists
, and
since
. Taking into account item (i), we obtain:
Finally, to prove item (iii), notice that
Thus,
and item (iii) is proved. □
Next, to apply Theorem 1 to
T with
,
T must be a contraction map of
into itself. To prove that
T is a map of
into itself, we consider
and, from Lemma 1, there exists
if
. In this situation,
T is well defined, and it follows
where
Therefore,
So, if
; then,
T maps
into itself.
On the other hand, if
for
then
T is a contraction map. By Lemma 1, it follows
Thus, the operator
T is a contraction map of
into itself if
Thus, both conditions are verified if , since where . Then, we obtain the following result.
Theorem 2. Suppose that is a fixed matrix for the operator T and there exists such that . If , where then, from any starting matrix with the Successive Approximations Method is convergent to . Moreover, is the unique fixed matrix of T in .
In order to obtain a new local result, under the conditions of Theorem 2 for
, we have that
Then,
, where 0 is the null matrix in
. Let us see if we can ensure the convergence of the Successive Approximations Method being
To do this, we give the following result.
Theorem 3. Suppose that is a fixed matrix for T and there exists such that . If , and then, the Successive Approximations Method is convergent to from any starting matrix . Moreover, is the unique fixed matrix of T in .
Proof. Since there exists
and
from any matrix
it follows
Now, as
, then there exists
, and
Therefore,
This condition is satisfied for
if
Observe that in any situation,
. Thus,
In addition, if
for any
, then
T is a contraction map in
. So, from
, if
then,
T is a contraction map in
. Observe that, in this case, we also have that
.
Finally, notice that the size of the existence domains can be related to the amount . So, we always have to verify that . However, only if . □
Note that it is always verified that and, therefore, .
From the previous result, it is natural to seek another restricted global convergence result in
. So, if there exists
, then
for
and
Therefore, taking
, by the Perturbation Lemma in matrix analysis, there exists
and
. Now, we can obtain the following result of semilocal convergence. Note that in the following result, we do not require the existence of a fixed matrix for
T.
Theorem 4. Let the matrix M be such that there exists with and . Then, the Successive Approximations Method is convergent to the fixed matrix of T from any starting matrix with . Moreover, is the unique fixed matrix of T in .
Proof. We apply Theorem 1 restricted to
. Firstly, under the hypotheses, we have
for
. Thus, if
and
, then,
. Therefore, if
and
, then
. Notice that
.
Moreover, if
for any
, then
T is a contraction map in
or equivalently, if
. Moreover, as
taking
, applying Theorem 1 restricted to
, the result follows. □
To illustrate the previous results, we consider a simple academic example: the quadratic matrix Equation (
1) with
It is easy to check that
is a solution of (
1).
The applicability of the results obtained depends on the values taken by the parameter. Thus, for , it follows that and then, from Theorem 2, we obtain that there exists a unique solution in , where . Moreover, the Successive Approximations Method is globally convergent in this ball.
Now, considering , we can apply the results of Theorems 3 and 4. In this case, and by Theorem 3, there is a unique solution in , with Furthermore, the Successive Approximations Method is globally convergent on that ball. In addition, taking into account Theorem 4 and then there is a unique solution in with . Moreover, we obtain global convergence for the Successive Approximations Method on the same ball.
In view of the results obtained, we observe that the best separation of solutions is provided by Theorem 2. On the other hand, the best location of the solution is provided by Theorem 4 with .
2.2. Picard Method
As is known, fixed point conditions are quite restrictive. Next, we relax the above results. For this, we consider the Picard method (
5), and we study its convergence by using an auxiliary point. This technique allows us to weaken the previously obtained conditions. In addition, we obtain results of both local and semilocal convergence.
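The Picard iteration can also be sketched numerically. Since the concrete operator of method (5) is not reproduced here, the sketch assumes the simple Picard form X_{k+1} = X_k − F(X_k) with F(X) = A X² + B X + C; both choices are assumptions for illustration only.

```python
import numpy as np

def picard(A, B, C, X0, tol=1e-12, max_iter=500):
    """Picard iteration X_{k+1} = X_k - F(X_k) for the residual operator
    F(X) = A X^2 + B X + C (assumed form, for illustration)."""
    X = X0.copy()
    for _ in range(max_iter):
        FX = A @ X @ X + B @ X + C  # residual F(X_k)
        # stop when the residual is small enough (Frobenius norm)
        if np.linalg.norm(FX) < tol:
            break
        X = X - FX  # Picard step
    return X
```

With A = I, B = −3I, C = 2I and a starting matrix close enough to 2I, the iteration converges to the solution X = 2I; how large this attraction basin is depends on the contraction-type conditions established in the results below.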
Theorem 5. Let be such that there exists with . We suppose that , with and . Then, from any starting matrix the Picard method (5) converges to a solution of Equation (1) and for , with where
Proof. Firstly, we have that
for
. Therefore, if
, then there exists
, and
Moreover, we take
then
Therefore,
if
Since
then
If condition (
8) is satisfied, then (
9) is true. Therefore, we obtain that
On the other hand, we observe that
for
. Therefore, as
then there exists
, and
Thus, we have
Therefore,
if
The last condition is satisfied since
, with
.
Now, by mathematical induction, we have
On the other hand, notice that
is a strictly decreasing sequence of positive real numbers. Consequently, there exists
. Now, applying the continuity of the operator
F, we obtain that
and, therefore,
is a solution of Equation (
1). □
Now, we obtain a uniqueness result for the solution of the quadratic matrix equation given in (
1).
Theorem 6. Under the conditions of Theorem 5, is the unique solution of Equation (1) in
Proof. We suppose that
is another solution of Equation (
1) in
. Then, we can write
where
is given by
So, if there exists
then it follows that
. To prove this, we first establish the following condition
for
Therefore, there exists
and
Therefore, for all
we have
Thus,
and then, there exists
. Obviously,
, and the result is proved. □
Notice that, from Theorems 5 and 6, we obtain domains of existence and uniqueness of solution. Now, we obtain from Theorem 5 both local and semilocal convergence results for the Picard method given in (
5).
To obtain a local convergence result, we consider the existence of
a solution of (
1). Thus,
and
In addition, notice that
Corollary 1. Let be a solution of Equation (1) such that there exists with and . Then, method (5) converges to from any starting at where Moreover, is unique in .
Now, a semilocal convergence result for method (
5) is obtained. For that, we take
in Theorem 5.
Corollary 2. Let be such that there exists with . Suppose that , with and . Then, the Picard method (5) converges to a solution of Equation (1) and for , with where Moreover, the solution is the unique solution of the equation in .
Now, by using recurrence relations [
16] relative to Picard iterations, we obtain another semilocal convergence result for the Picard method.
Theorem 7. Let be such that there exists with and . We suppose that there exists R, the smallest positive real root of the auxiliary scalar equation If and then, method (5) converges to a solution of Equation (1) starting at . Moreover, for all and is unique in .
Proof. Obviously, from (
11), we have
and then
.
On the one hand, it follows that
for
. Therefore, if
then there exists
, and
On the other hand, we have
Thus, from
, it follows that
and
Therefore,
.
Proceeding in a similar way, we obtain
for
. Then, there exists
with
Moreover, we have
Since
, we obtain
On the other hand,
so,
.
By mathematical induction, we obtain for
:
Method (
5) converges to a solution
and by continuity of
F, it follows that
is a solution of Equation (
1).
Finally, proceeding as in Theorem 6, the uniqueness follows. □
Now, we apply the Picard method to the example given in (
7).
Thus, for
, and
it follows that there exists
with
and
and
are satisfied. Thus, method (
5) converges to a solution
of Equation (
7) from any starting matrix
being
. Furthermore,
is the unique solution of Equation (
1) in
Now, we compare the results obtained in Corollary 2 and in Theorem 7. For that, we choose the null matrix as the starting matrix
So, the hypotheses of Corollary 2 with , , and are satisfied. Thus, the Picard method converges to a solution , which is unique in and with .
Furthermore,
is the smallest positive root of Equation (
11) and
. Thus, starting at
, the Picard method converges to
a solution of (
7). Moreover,
Thus, Corollary 2 and Theorem 7 improve the results of Theorems 3 and 4. Therefore, the location and the separation of solutions are improved by applying the Picard method.