Eventually DSDD Matrices and Eigenvalue Localization

First, the relationships among strictly diagonally dominant (SDD) matrices, doubly strictly diagonally dominant (DSDD) matrices, eventually SDD matrices and eventually DSDD matrices are considered. Second, by excluding from an existing eigenvalue inclusion set certain proper subsets that contain no eigenvalues, a tighter eigenvalue inclusion set is derived; as an application, a sufficient condition for the non-singularity of matrices is obtained. Finally, an infinity norm bound for the inverse of eventually DSDD matrices is derived.


Introduction
Let n ≥ 2 be a positive integer, J = {1, 2, ..., n}, N the set of all positive integers, C the set of all complex numbers, C^{n×n} the set of all n × n complex matrices and I the identity matrix. Let A = [a_ij] ∈ C^{n×n} and let σ(A) be the set of all eigenvalues of A. For i, j ∈ J, j ≠ i, denote r_i(A) := Σ_{t∈J, t≠i} |a_it| and r_i^j(A) := r_i(A) − |a_ij|. A matrix A is called strictly diagonally dominant (SDD) if, for each i ∈ J, |a_ii| > r_i(A).
In addition, A is doubly strictly diagonally dominant (DSDD) if, for any i, j ∈ J, i ≠ j, |a_ii||a_jj| > r_i(A)r_j(A).
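Both dominance conditions above can be checked directly from the entries of A. The following NumPy sketch (the helper names are ours, not from the paper) also illustrates that SDD implies DSDD, while the converse fails:

```python
import numpy as np

def offdiag_row_sums(A):
    """r_i(A): sum of |a_it| over t != i."""
    return np.sum(np.abs(A), axis=1) - np.abs(np.diag(A))

def is_sdd(A):
    """SDD: |a_ii| > r_i(A) for every i."""
    return bool(np.all(np.abs(np.diag(A)) > offdiag_row_sums(A)))

def is_dsdd(A):
    """DSDD: |a_ii| |a_jj| > r_i(A) r_j(A) for every pair i != j."""
    d, r = np.abs(np.diag(A)), offdiag_row_sums(A)
    n = A.shape[0]
    return all(d[i] * d[j] > r[i] * r[j]
               for i in range(n) for j in range(n) if i != j)

# SDD implies DSDD, but not conversely: row 0 below violates
# |a_11| > r_1(A), yet every pairwise product condition holds.
A = np.array([[1.0, 1.2],
              [0.1, 5.0]])
```

Here `is_dsdd(A)` is true although `is_sdd(A)` is false, so the DSDD class is strictly larger.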
Locating eigenvalues and bounding the infinity norm of the inverse of nonsingular matrices are two major problems in applied linear algebra [1–7]. The first problem is to find a set in the complex plane that includes all eigenvalues of a matrix: if the obtained set lies in the right half of the complex plane, then one can conclude the positive definiteness of the corresponding matrix. Moreover, eigenvalue localization is very important for the convergence speed of the algorithms on which web search engines are based; for the case of PageRank, see [4] and the references therein. The second problem is to bound the infinity norm of the inverse of a nonsingular matrix, which can be used to estimate the condition number of linear systems of equations [5], as well as of linear complementarity problems [6]. In general, it is not easy to treat these two problems for an arbitrary given nonsingular matrix. One traditional approach is to locate all eigenvalues, or to bound the infinity norm of the inverse, for particular subclasses of nonsingular matrices.
In 2015, Cvetković et al. [7] extended the class of SDD matrices to the class of eventually SDD (SDD∃) matrices, and proved that SDD∃ matrices are nonsingular.
Theorem 1 ([7], Theorem 2). Let A ∈ SDD∃. Then, for certain numbers s and k, ||A^{-1}||_∞ ≤ Ψ_k^s(A).

It is worth noting in Theorem 1 that, from Examples 3 and 4 in [7] and the structure of Ψ_k^s(A), which involves the two parameters s and k, one can conclude that, if there exist two different pairs of numbers s and k for which A is an SDD∃ matrix, then the different selections of s and k will, in general, yield different infinity norm bounds for the inverse of A.
According to the non-singularity of SDD ∃ matrices, Liu et al. [8] obtained the following eigenvalue inclusion set of matrices, which corrects Theorem 1 of [7].
In 2016, Liu [9] introduced the class of eventually DSDD (DSDD∃) matrices and located all of their eigenvalues (Theorem 3). Note here that, when k = 1, the sets Ω_{i,j}^s(B^k) in Theorem 3 are ovals, while the sets Ω_{i,j}^s(B^k) with k ≥ 2 are lemniscates; see [10] (pp. 35–52) for details.
The outline of the rest of this paper is as follows. In Section 2, using some existing criteria for the non-singularity of matrices, a new eigenvalue localization set is derived; that is, a tighter eigenvalue localization set is obtained by excluding from Ω_{i,j}^s(B^k) in Theorem 3 some proper subsets that are proved to contain no eigenvalues. In Section 3, a bound on the infinity norm of the inverse of DSDD∃ matrices is given. Finally, some concluding remarks are given in Section 4 to summarize this paper.

Eigenvalue Localization of Matrices
First, a lemma from [12] is listed, which is very useful for deriving the new eigenvalue inclusion set.
For any given positive integer k, suppose on the contrary that λ ∉ ∆_k^s(A). Then, by Lemma 1, (s − λ)^k I − B^k is nonsingular, i.e., (s − λ)^k is not an eigenvalue of B^k. On the other hand, because λ ∈ σ(A) and A = sI − B, there is a vector x ≠ 0 such that Bx = (s − λ)x. Furthermore, B^k x = (s − λ)^k x, which implies that (s − λ)^k is an eigenvalue of B^k. This is a contradiction. Hence, λ ∈ ∆_k^s(A), and the conclusion holds.
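The key step in the proof above is the spectral mapping B^k x = (s − λ)^k x: every eigenvalue λ of A = sI − B yields an eigenvalue (s − λ)^k of B^k. This is easy to verify numerically (the matrix and the values of s and k below are ours, chosen for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((5, 5))
s, k = 2.0, 3

# A = sI - B, so lambda in sigma(A) iff s - lambda in sigma(B).
A = s * np.eye(5) - B
lam = np.linalg.eigvals(A)

# Spectral mapping: (s - lambda)^k must be an eigenvalue of B^k.
mu = np.linalg.eigvals(np.linalg.matrix_power(B, k))
for l in lam:
    assert np.min(np.abs(mu - (s - l) ** k)) < 1e-8
```

Each value (s − λ)^k matches one of the computed eigenvalues of B^k up to rounding, exactly as the proof requires.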
By the arbitrariness of s ∈ C and k ∈ N in Lemma 2, the following eigenvalue localization theorem is obtained easily.
For all i, j ∈ J, i ≠ j, the relationship holds; hence, the following comparison theorem for Theorem 3, Lemma 2 and Theorem 4 is easily obtained.
Theorem 5. Let A = sI − B ∈ C^{n×n} and let k be an arbitrary positive integer. Then

Note here that, from the proof of Lemma 2, it can be seen that the sets ∆_{i,t}^s(B^k) contain no eigenvalues of A, and that ∆_k^s(A) is derived by excluding the proper subsets ∆_{i,t}^s(B^k) from Ω_{i,j}^s(B^k); hence, ∆_{i,t}^s(B^k) is called an exclusion set of ∆_k^s(A). By Theorem 5, one can exclude regions that contain no eigenvalues and thereby locate the eigenvalues more precisely.
Taking s = 0 and k = 1 in Theorem 4, the following result, which is also Theorem 4 in [12], is deduced immediately. Next, based on Lemma 2 and the fact that det(A) = 0 if and only if 0 ∈ σ(A), the following sufficient condition for the non-singularity of A is easily obtained.
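For orientation in the special case s = 0 and k = 1, the underlying inclusion sets Ω_{i,j}^0(B) are the classical ovals of Cassini, and Brauer's theorem guarantees that every eigenvalue of A lies in their union. A small NumPy check of this fact (the example matrix is ours):

```python
import numpy as np

def in_brauer_set(z, A):
    """True if z lies in some Cassini oval |z - a_ii| |z - a_jj| <= r_i r_j."""
    d = np.diag(A)
    r = np.sum(np.abs(A), axis=1) - np.abs(d)
    n = A.shape[0]
    return any(abs(z - d[i]) * abs(z - d[j]) <= r[i] * r[j] + 1e-12
               for i in range(n) for j in range(n) if i != j)

A = np.array([[5.0, 1.0, 0.0],
              [1.0, 0.0, 2.0],
              [0.0, 1.0, -4.0]])
# Brauer's theorem: every eigenvalue lies in the union of the ovals.
assert all(in_brauer_set(z, A) for z in np.linalg.eigvals(A))
```

The exclusion sets of Theorem 5 then carve regions such as these ovals down further, without losing any eigenvalue.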

Corollary 2. Let
If there is a positive integer k such that, for any i, j ∈ J, i ≠ j, either or for some t ∈ J with t ≠ i, then A is nonsingular.
Note here that, taking s = 0 and k = 1, Corollary 2 reduces to Lemma 1. That is to say, Corollary 2 is a generalization of Lemma 1.
Finally, an example is given to validate Theorem 5 and to show that different selections of s and k affect both the eigenvalue localization and the determination of the non-singularity of A = sI − B.
The eigenvalue inclusion sets Γ_k^s(A), Ω_k^s(A) and ∆_k^s(A) with different s and k, together with all eigenvalues, are drawn in Figure 1, where Γ_k^s(A), Ω_k^s(A) and ∆_k^s(A) are shown, respectively, by the black boundary, the yellow zone together with its interior, and the yellow zone; all eigenvalues are plotted by '+'. From Figure 1, it can be seen that: (1) Whether s = 10, k = 1, 2, 3 or s = 7, k = 2, the set ∆_k^s(A) is contained in the corresponding sets Γ_k^s(A) and Ω_k^s(A) obtained for the same s and k.
(2) When s = 10, taking k = 1, k = 2 and k = 3 yields three clearly different eigenvalue inclusion sets ∆_1^10(A), ∆_2^10(A) and ∆_3^10(A). This implies that, if s is the same but k is different, the eigenvalue inclusion sets ∆_k^s(A) are different in general. (3) When k = 2, taking s = 7 and s = 10 yields two different eigenvalue inclusion sets ∆_2^7(A) and ∆_2^10(A), which implies that, if k is the same but s is different, the eigenvalue inclusion sets ∆_k^s(A) are also different in general. (4) When s = 7 and k = 2, we see that 0 ∈ ∆_2^7(A); when s = 10 and k = 1, 2, 3, we see that 0 ∉ ∆_1^10(A), 0 ∉ ∆_2^10(A) and 0 ∉ ∆_3^10(A). That is to say, we cannot determine the non-singularity of A when s = 7 and k = 2, but we can when s = 10 and k = 1, 2, 3. When k is the same, the upper bounds Θ_k^5(A) and Ψ_k^5(A) are less than or equal to the upper bounds Θ_k^4(A) and Θ_k^10(A), and Ψ_k^4(A) and Ψ_k^10(A), respectively. In addition, taking s = 5 and k = 8 gives the upper bound Θ_8^5(A) = Ψ_8^5(A) = 0.4657, which equals the true value of ||A^{-1}||_∞.
A natural question then arises: how should s and k be chosen so as to minimize the eigenvalue inclusion set ∆_k^s(A) while still determining the non-singularity of A = sI − B? This question is worth studying in the future.

Infinity Norm Bounds for the Inverse of DSDD ∃ Matrices
First, a lemma is listed which is used to obtain the infinity norm bound for the inverse of DSDD∃ matrices.

Lemma 3 ([13]).
Theorem 6. Let A ∈ DSDD∃. Then, for certain numbers s and k, (2) holds.

Proof. Since A ∈ DSDD∃, there exists k ∈ N such that s^k I − B^k ∈ DSDD. Obviously, A and s^k I − B^k are both nonsingular. By Lemma 3, it follows that (2) holds. The proof is completed.
Proof. Since A is SDD, i.e., |a_ii| > r_i(A) for all i ∈ J, we have |a_ii||a_jj| > r_i(A)r_j(A) for all i, j ∈ J, i ≠ j. In order to obtain (3), it suffices to show that the following inequality holds for all i, j ∈ J, i ≠ j: multiplying this inequality by r_j(A) gives an inequality which can be rewritten; multiplying it in turn by |a_ii|, we get an inequality which can be rewritten to yield (3). The proof is completed.
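For comparison with the bounds discussed here, the classical Varah bound for SDD matrices, ||A^{-1}||_∞ ≤ 1 / min_i (|a_ii| − r_i(A)), is easy to evaluate directly. The sketch below (our example matrix; this is the standard SDD bound, not the Θ or Ψ bounds of Theorems 1 and 6) checks it against the exact norm:

```python
import numpy as np

def varah_bound(A):
    """Varah's bound for an SDD matrix A:
    ||A^{-1}||_inf <= 1 / min_i (|a_ii| - r_i(A))."""
    d = np.abs(np.diag(A))
    r = np.sum(np.abs(A), axis=1) - d
    gap = d - r
    assert np.all(gap > 0), "A must be SDD"
    return 1.0 / gap.min()

A = np.array([[4.0, 1.0, 1.0],
              [1.0, 5.0, 2.0],
              [0.0, 1.0, 3.0]])
exact = np.linalg.norm(np.linalg.inv(A), np.inf)
assert exact <= varah_bound(A)   # bound = 0.5, exact norm = 0.48
```

Tighter subclasses such as DSDD and DSDD∃ admit bounds of the same flavor that can improve on this simple estimate.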
Next, a comparison for the bounds in Theorems 1 and 6 is given.

Theorem 7.
If A ∈ SDD∃, then, for certain numbers s and k,

Conclusions
In this paper, by excluding from Ω_k^s(A) some proper subsets that contain no eigenvalues of A, we obtain a tighter eigenvalue localization set ∆_k^s(A) than those in Theorems 2 and 3. Then, the bound Θ_k^s(A) for the infinity norm of the inverse of DSDD∃ matrices is given. Numerical examples show the effectiveness of the obtained results. However, one problem remains unsolved: how can s and k be chosen to minimize the eigenvalue localization set ∆_k^s(A) and the infinity norm bound Θ_k^s(A)? This question is worthy of further study. Finally, the relationship between DSDD matrices and SDD∃ matrices is discussed. Consider again the matrix A in Example 2. It is not difficult to verify that A is an SDD∃ matrix but not a DSDD matrix, which implies that an SDD∃ matrix is not necessarily a DSDD matrix; that is, {SDD∃} ⊄ {DSDD}.
Then, another meaningful question arises: whether DSDD matrices are SDD∃ matrices; that is, whether the inclusion {DSDD} ⊆ {SDD∃} holds. This question is also interesting and worthy of further study.

Definition 2 ([9], Definition 3.2.1). Let A = sI − B, where s ∈ C. A is called a DSDD∃ matrix if s^k I − B^k is DSDD for some positive integer k.

Remark 1. From Definition 2, a meaningful question arises: is s^K I − B^K DSDD whenever s^k I − B^k with k < K has this property?
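Membership in the DSDD∃ class, and the question of whether the DSDD property propagates to larger exponents, can be probed numerically. A hedged sketch (the function names and the example are ours): given A and s, it lists all exponents k up to a cutoff for which s^k I − B^k is DSDD, where B = sI − A.

```python
import numpy as np

def is_dsdd(M):
    """DSDD: |m_ii| |m_jj| > r_i(M) r_j(M) for every pair i != j."""
    d = np.abs(np.diag(M))
    r = np.sum(np.abs(M), axis=1) - d
    n = M.shape[0]
    return all(d[i] * d[j] > r[i] * r[j]
               for i in range(n) for j in range(n) if i != j)

def dsdd_exponents(A, s, kmax=10):
    """Exponents k <= kmax for which s^k I - B^k is DSDD, with B = sI - A."""
    n = A.shape[0]
    B = s * np.eye(n) - A
    Bk, ks = np.eye(n), []
    for k in range(1, kmax + 1):
        Bk = Bk @ B                    # Bk = B^k
        if is_dsdd(s ** k * np.eye(n) - Bk):
            ks.append(k)
    return ks

A = np.array([[4.0, 1.0, 1.0],
              [1.0, 3.0, 1.0],
              [0.5, 0.5, 2.0]])        # DSDD, so k = 1 always qualifies
```

Since s^1 I − B^1 = A, any DSDD matrix A is trivially DSDD∃ with k = 1 for every s; per Remark 1, whether the property then holds for every larger K is exactly what a search like this makes easy to explore on examples.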

Figure 1. Comparisons of Γ_k^s(A), Ω_k^s(A) and ∆_k^s(A) with different s and k.

Author Contributions:
Conceptualization, Methodology, Software, Validation, Formal Analysis, Investigation, Resources, Data Curation, Writing-Original Draft Preparation, Writing-Review and Editing and Visualization are all made by C.S.; Supervision, Project Administration and Funding Acquisition are all made by J.Z.

Funding: This work was funded by the National Natural Science Foundation of China (Grant No. 11501141) and the Science and Technology Top-notch Talents Support Project of Education Department of Guizhou Province (Grant No. QJHKYZ [2016]066).