Abstract
In this paper, we present a new concept of generalized core orthogonality (called the C-S orthogonality) for two generalized core invertible matrices A and B: A is said to be C-S orthogonal to B if the C-S inverse of A annihilates B on both sides. Characterizations of C-S orthogonal matrices and of the C-S additivity are provided, and the connection between the C-S orthogonality and the C-S partial order is established using their canonical forms. Moreover, the concept of the strongly C-S orthogonality is defined and characterized.
Keywords:
C-S inverse; C-S orthogonality; strongly C-S orthogonality; C-S additivity; C-S partial order
MSC:
15A09; 06A06
1. Introduction
As is well known, orthogonality comes in two forms: one-sided and two-sided. We use $\mathcal{R}(A)$ and $\mathcal{R}(B)$ to denote the ranges of $A$ and $B$, and $\mathcal{N}(A)$ and $\mathcal{N}(B)$ their null spaces. The subspaces $\mathcal{R}(A)$ and $\mathcal{R}(B)$ are orthogonal precisely when $A^{*}B = 0$. One-sided orthogonality of the matrices themselves means $AB = 0$ (equivalently, $\mathcal{R}(B) \subseteq \mathcal{N}(A)$) or $BA = 0$; if $AB = 0$ and $BA = 0$, then $A$ and $B$ are orthogonal, denoted by $A \perp B$. Notice that, when $A^{\#}$ exists and $AB = 0$, where $A^{\#}$ is the group inverse of $A$, we have $A^{\#}B = 0$; and, obviously, $A^{\#}B = 0$ implies $AB = 0$. Thus, when $A^{\#}$ exists, $A \perp B$ if and only if $A^{\#}B = 0$ and $BA^{\#} = 0$ (i.e., $A$ and $B$ are #-orthogonal, denoted by $A \perp^{\#} B$). Hestenes [1] gave the concept of ∗-orthogonality: let $A, B \in \mathbb{C}^{m\times n}$; if $A^{*}B = 0$ and $BA^{*} = 0$, then $A$ is ∗-orthogonal to $B$, denoted by $A \perp^{*} B$. For matrices, Hartwig and Styan [2] showed that the dagger additivity $(A+B)^{\dagger} = A^{\dagger} + B^{\dagger}$, where $A^{\dagger}$ is the Moore–Penrose inverse of $A$, together with the rank additivity $\mathrm{rk}(A+B) = \mathrm{rk}(A) + \mathrm{rk}(B)$, implies that $A$ is ∗-orthogonal to $B$.
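As a quick numerical illustration of these classical notions, the following NumPy sketch checks the ∗-orthogonality conditions together with the dagger and rank additivity mentioned in Hartwig and Styan's result; the matrices are illustrative and not taken from the paper.

```python
import numpy as np

# Illustrative matrices (not from the paper): the column and row spaces of A
# are orthogonal to those of B, so A*B = 0 and BA* = 0, i.e., A is *-orthogonal to B.
A = np.array([[1.0, 2.0, 0.0],
              [3.0, 4.0, 0.0],
              [0.0, 0.0, 0.0]])
B = np.array([[0.0, 0.0, 0.0],
              [0.0, 0.0, 0.0],
              [0.0, 0.0, 5.0]])

print(np.allclose(A.conj().T @ B, 0))     # A*B = 0
print(np.allclose(B @ A.conj().T, 0))     # BA* = 0

# Dagger additivity: (A + B)^dagger = A^dagger + B^dagger.
print(np.allclose(np.linalg.pinv(A + B), np.linalg.pinv(A) + np.linalg.pinv(B)))

# Rank additivity: rk(A + B) = rk(A) + rk(B).
rk = np.linalg.matrix_rank
print(rk(A + B) == rk(A) + rk(B))
```

All four checks print `True` for this pair.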
Ferreyra and Malik [3] introduced core and strongly core orthogonal matrices by using the core inverse. Let $A, B \in \mathbb{C}^{n\times n}$ with $\mathrm{Ind}(A) \le 1$, where $\mathrm{Ind}(A)$ is the index of $A$; if $A^{\tiny\textcircled{\#}}B = 0$ and $BA^{\tiny\textcircled{\#}} = 0$, where $A^{\tiny\textcircled{\#}}$ is the core inverse of $A$, then $A$ is core orthogonal to $B$, denoted by $A \perp^{\tiny\textcircled{\#}} B$. Matrices $A, B \in \mathbb{C}^{n\times n}$ with $\mathrm{Ind}(A) \le 1$ and $\mathrm{Ind}(B) \le 1$ are strongly core orthogonal if $A \perp^{\tiny\textcircled{\#}} B$ and $B \perp^{\tiny\textcircled{\#}} A$. In [3], we can see that strong core orthogonality implies the core additivity $(A+B)^{\tiny\textcircled{\#}} = A^{\tiny\textcircled{\#}} + B^{\tiny\textcircled{\#}}$.
In [4], Liu, Wang, and Wang proved that $A, B \in \mathbb{C}^{n\times n}$ with $\mathrm{Ind}(A) \le 1$ and $\mathrm{Ind}(B) \le 1$ are strongly core orthogonal if and only if a reduced set of these conditions holds, which is more concise than the corresponding theorem in [3]. Ferreyra and Malik [3] also proved that if $A$ is strongly core orthogonal to $B$, then $\mathrm{rk}(A+B) = \mathrm{rk}(A) + \mathrm{rk}(B)$ and the core additivity holds, but whether the converse is true was left as an open question. In [4], Liu, Wang, and Wang solved this problem completely. Furthermore, they also gave some new equivalent conditions for the strongly core orthogonality, which are related to the minus partial order and to certain Hermitian matrices.
On the basis of core orthogonal matrices, Mosić, Dolinar, Kuzma, and Marovt [5] extended the concept of the core orthogonality and presented the new concept of the core-EP orthogonality: $A$ is said to be core-EP orthogonal to $B$ if $A^{\tiny\textcircled{\dagger}}B = 0$ and $BA^{\tiny\textcircled{\dagger}} = 0$, where $A^{\tiny\textcircled{\dagger}}$ is the core-EP inverse of $A$. A number of characterizations of the core-EP orthogonality were proven in [5]. Applying the core-EP orthogonality, the concept and characterizations of the strongly core-EP orthogonality were also introduced in [5].
In [6], Wang and Liu introduced the generalized core inverse (called the C-S inverse) and gave some properties and characterizations of this inverse. By means of the C-S inverse, a binary relation and a partial order (called the C-S partial order) were introduced.
Motivated by these ideas, in this paper we give the concepts of the C-S orthogonality and the strongly C-S orthogonality and discuss their characterizations. The connection between the C-S partial order and the C-S orthogonality is given. Moreover, we obtain some characterizing properties of C-S orthogonal matrices when A is EP.
2. Preliminaries
For $A \in \mathbb{C}^{n\times n}$ with $\mathrm{Ind}(A) = k$, where $\mathrm{Ind}(A)$ is the index of $A$ (the smallest non-negative integer $k$ such that $\mathrm{rk}(A^{k}) = \mathrm{rk}(A^{k+1})$), and $X \in \mathbb{C}^{n\times n}$, we consider, in particular, the following equations:
1. $AXA = A$;
2. $XAX = X$;
3. $(AX)^{*} = AX$;
4. $(XA)^{*} = XA$;
5. $AX = XA$.
The set of all matrices $X$ satisfying the equations numbered $i, j, \ldots$ of such a system is denoted by $A\{i, j, \ldots\}$. If there exists $X \in A\{1, 2, 3, 4\}$,
then it is called the Moore–Penrose inverse of $A$, denoted by $A^{\dagger}$, and it is unique. It was introduced by Moore [7] and improved by Bjerhammar [8] and Penrose [9]. Furthermore, based on the Moore–Penrose inverse, it is known that $A$ is EP if and only if $AA^{\dagger} = A^{\dagger}A$. If there exists $X \in A\{1, 2, 5\}$,
then it is called the group inverse of $A$, denoted by $A^{\#}$, and it is unique [10]. If there exists $X$ satisfying
$AX = AA^{\dagger}$ and $\mathcal{R}(X) \subseteq \mathcal{R}(A)$, then $X$ is called the core inverse of $A$, denoted by $A^{\tiny\textcircled{\#}}$ [11]. And, if there exists $X$ satisfying
$XAX = X$ and $\mathcal{R}(X) = \mathcal{R}(X^{*}) = \mathcal{R}(A^{k})$, then $X$ is called the core-EP inverse of $A$, denoted by $A^{\tiny\textcircled{\dagger}}$ [12]. Moreover, in what follows we write $\mathbb{C}^{\tiny\textcircled{\dagger}}_{n}$ for the set of all core-EP invertible matrices of $\mathbb{C}^{n\times n}$, and $\mathbb{C}^{\mathrm{GM}}_{n}$ and $\mathbb{C}^{\mathrm{EP}}_{n}$ for the subsets of $\mathbb{C}^{n\times n}$ consisting of group invertible and EP matrices, respectively.
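For readers who wish to experiment, these inverses can be computed numerically from standard representations found in the generalized-inverse literature (they are not formulas taken from the present paper): $A^{\#} = A(A^{3})^{\dagger}A$ when $\mathrm{Ind}(A) \le 1$, $A^{\tiny\textcircled{\#}} = A^{\#}AA^{\dagger}$, the Drazin inverse $A^{D} = A^{k}(A^{2k+1})^{\dagger}A^{k}$, and the core-EP inverse $A^{D}A^{k}(A^{k})^{\dagger}$. A minimal NumPy sketch for an illustrative index-one matrix:

```python
import numpy as np
from numpy.linalg import pinv, matrix_power

# Illustrative matrix of index 1 (not taken from the paper).
A = np.array([[2.0, 1.0],
              [0.0, 0.0]])
k = 1  # index of A

A_mp   = pinv(A)                                       # Moore-Penrose inverse
A_gr   = A @ pinv(matrix_power(A, 3)) @ A              # group inverse (valid when Ind(A) <= 1)
A_core = A_gr @ A @ A_mp                               # core inverse A^# A A^dagger
Ak     = matrix_power(A, k)
A_dz   = Ak @ pinv(matrix_power(A, 2 * k + 1)) @ Ak    # Drazin inverse
A_cep  = A_dz @ Ak @ pinv(Ak)                          # core-EP inverse A^D A^k (A^k)^dagger

# Defining properties (all should print True).
print(np.allclose(A @ A_mp @ A, A),
      np.allclose((A @ A_mp).conj().T, A @ A_mp))      # Penrose equations (1) and (3)
print(np.allclose(A @ A_gr, A_gr @ A),
      np.allclose(A_gr @ A @ A_gr, A_gr))              # group inverse commutes and is reflexive
print(np.allclose(A @ A_core, A @ A_mp))               # A A^core is the orthogonal projector onto R(A)
print(np.allclose(A_core, A_cep))                      # for Ind(A) <= 1 the core and core-EP inverses agree
```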
Drazin [13] introduced the star partial order on the set of all regular elements of semigroups with involution and applied this definition to complex matrices; it is defined as
$$A \le^{*} B \Longleftrightarrow A^{*}A = A^{*}B \ \text{and} \ AA^{*} = BA^{*}.$$
By using $\{1\}$-inverses, Hartwig and Styan [2,14] gave the definition of the minus partial order,
$$A \le^{-} B \Longleftrightarrow A^{-}A = A^{-}B \ \text{and} \ AA^{=} = BA^{=} \ \text{for some} \ A^{-}, A^{=} \in A\{1\},$$
which is equivalent to the rank subtractivity $\mathrm{rk}(B-A) = \mathrm{rk}(B) - \mathrm{rk}(A)$. And, Mitra [15] defined the sharp partial order, for group invertible $A$, as
$$A \le^{\#} B \Longleftrightarrow A^{\#}A = A^{\#}B \ \text{and} \ AA^{\#} = BA^{\#}.$$
According to the core inverse and the sharp partial order, Baksalary and Trenkler [11] proposed the definition of the core partial order: for $\mathrm{Ind}(A) \le 1$,
$$A \le^{\tiny\textcircled{\#}} B \Longleftrightarrow A^{\tiny\textcircled{\#}}A = A^{\tiny\textcircled{\#}}B \ \text{and} \ AA^{\tiny\textcircled{\#}} = BA^{\tiny\textcircled{\#}}.$$
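As a sanity check, the conditions defining the star, sharp, and core partial orders, together with the rank characterization of the minus order, can be verified numerically; the diagonal pair below is illustrative (not from the paper) and is below in all of these orders.

```python
import numpy as np

rk = np.linalg.matrix_rank

# Illustrative pair (not from the paper).
A = np.diag([1.0, 0.0, 0.0])
B = np.diag([1.0, 2.0, 0.0])

# For this A (a Hermitian idempotent) the Moore-Penrose, group, and core
# inverses all coincide with A itself, which keeps the checks simple.
A_inv = A  # plays the role of A^dagger, A^#, and the core inverse of A

print(np.allclose(A.conj().T @ A, A.conj().T @ B),
      np.allclose(A @ A.conj().T, B @ A.conj().T))   # star order:  A*A = A*B,  AA* = BA*
print(np.allclose(A_inv @ A, A_inv @ B),
      np.allclose(A @ A_inv, B @ A_inv))             # sharp/core order conditions
print(rk(B - A) == rk(B) - rk(A))                    # minus order: rank subtractivity
```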
Definition 1
([6]). Let , and . Then, the C-S inverse of A is defined as the solution of
and X is denoted as .
Lemma 1
([16]). Let $A \in \mathbb{C}^{n\times n}$ with $\mathrm{Ind}(A) = k$, and let $A = A_{1} + A_{2}$ be the core-EP decomposition of $A$, where $\mathrm{Ind}(A_{1}) \le 1$, $A_{2}^{k} = 0$ and $A_{1}^{*}A_{2} = A_{2}A_{1} = 0$. Then, there exists a unitary matrix $U$ such that
$$A_{1} = U\begin{bmatrix} T & S \\ 0 & 0 \end{bmatrix}U^{*}, \qquad A_{2} = U\begin{bmatrix} 0 & 0 \\ 0 & N \end{bmatrix}U^{*},$$
where $T$ is non-singular and $N$ is nilpotent.
Then, the core-EP decomposition of $A$ can be written as
$$A = U\begin{bmatrix} T & S \\ 0 & N \end{bmatrix}U^{*}.$$
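Numerically, a decomposition of the type in Lemma 1 can be obtained from an ordered Schur form, placing the nonzero eigenvalues first so that the leading block T is nonsingular and the trailing block N is nilpotent. The sketch below is illustrative (the matrix is not from the paper, and SciPy's schur routine is used here as one convenient way to realize the idea).

```python
import numpy as np
from scipy.linalg import schur

# Illustrative matrix with a nontrivial nilpotent part (not from the paper).
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [0.0, 0.0, 0.0]])

# Complex Schur form with the nonzero eigenvalues ordered first.
T_full, U, sdim = schur(A.astype(complex), output='complex',
                        sort=lambda z: abs(z) > 1e-12)

r = sdim                      # number of nonzero eigenvalues
T = T_full[:r, :r]            # nonsingular block
S = T_full[:r, r:]
N = T_full[r:, r:]            # block with only zero eigenvalues, hence nilpotent

print(np.allclose(U @ T_full @ U.conj().T, A))                 # A = U [[T, S], [0, N]] U*
print(abs(np.linalg.det(T)) > 1e-12)                           # T is nonsingular
print(np.allclose(np.linalg.matrix_power(N, N.shape[0]), 0))   # N is nilpotent
```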
And, by applying Lemma 1, Wang and Liu in [6] obtained the following canonical form for the C-S inverse of A:
3. C-S Orthogonality and Its Consequences
Firstly, we give the concept of the C-S orthogonality.
Definition 2.
Let and Ind. If
then A is said to be generalized core orthogonal to B; that is, A is C-S orthogonal to B.
If , then
Remark 1.
Let and Ind. Notice that can be proven if . Then, we have . And, if , we have , which implies . It is obvious that
Applying Definition 2, we can also state that A is generalized core orthogonal to B, if
Next, we study the range and null space of the matrices which are C-S orthogonal. Firstly, we give some characterizations of the C-S inverse as follows.
Lemma 2.
Let , and Ind, then .
Proof.
Lemma 3.
Let , and Ind, then is core invertible. In this case, .
Remark 2.
The core inverse of a square matrix with index at most 1 satisfies the following properties [3]:
where A is a square matrix with . It has been proven that is core invertible in Lemma 3, so we have
Theorem 1.
Let , and ; then, the following are equivalent:
- (1)
- ;
- (2)
- , ;
- (3)
- , ;
- (4)
- , ;
- (5)
- , ;
- (6)
- , ;
- (7)
- , .
Proof.
. From , we have
By Lemma 3, is core invertible, which implies . As a consequence, we have . By using , we obtain
: this is evident.
: according to Remark 1, we obtain , .
: this is evident.
Applying properties of Transposition of , we verify that , , and are equivalent. □
In view of and in Theorem 1, we obtain from . Using Lemma in [3], we have that – in Theorem 1 and are equivalent, i.e., and are equivalent. And, from Lemma in [4], it can be seen that is equivalent to and . As a consequence of the theorem, we have the following.
Corollary 1.
Let , and , then the following are equivalent:
- (1)
- ;
- (2)
- ;
- (3)
- ;
- (4)
- ;
- (5)
- .
Lemma 4.
Let , and , . If , then
- (1)
- ;
- (2)
- ;
- (3)
- ;
- (4)
Proof.
(1) By applying (3), we have . Then, by using the fact that has an index of 1 at most, we obtain
Moreover, it is obvious that . Then, .
(2) Let , we have . Since has an index of 1 at most, then we can prove by .
(3) Let , then , i.e., . Since
and , we obtain , which implies .
On the other hand, it is obvious that . Then, .
(4) Let , and we have . By , it is easy to check that is true. □
Theorem 2.
Let , and , . If , then
- (1)
- ;
- (2)
- ;
- (3)
- ;
- (4)
- ;
- (5)
- ;
- (6)
- ;
- (7)
- ;
- (8)
- .
Proof.
By applying , i.e., and , we obtain
and
It is obvious that and . As a consequence, it is reasonable to obtain that the statements (1)–(8) are true by Lemma 4. □
Using the core-EP decomposition, we obtain the following characterization of C-S orthogonal matrices.
Theorem 3.
Let , and , then the following are equivalent:
- (1)
- ;
- (2)
- There exist nonsingular matrices , , nilpotent matrices , , and a unitary matrix U, such thatwhere and .
Proof.
Let the core-EP decomposition of A be
where T is nonsingular and N is nilpotent. Then, the decomposition of is (2). And, write
Since
it implies that and ; that is, .
Since
it implies that , and we have . Therefore,
where , i.e., .
Now, let
be the core EP decomposition of and . Partition N according to the partition of ; then,
Applying , we obtain
which leads to . Thus, and . And,
which implies that and . Then,
where and .
. Let
Using and , we can obtain
and
Thus, . □
Example 1.
Consider the matrices
Then,
By calculating the matrices, it can be obtained that . Thus, .
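Such verifications can be automated. Since a closed form of the C-S inverse is given in [6], the helper below is only a hypothetical sketch: it tests the two-sided annihilation pattern $XB = 0$ and $BX = 0$ for a user-supplied generalized inverse $X$ of $A$ (for instance, the Moore–Penrose, core, or core-EP inverse computed from the standard representations sketched in the Preliminaries); by analogy with the core and core-EP cases in [3,5], conditions of this shape are the kind checked in Example 1.

```python
import numpy as np
from typing import Callable

def is_orthogonal_via(ginv: Callable[[np.ndarray], np.ndarray],
                      A: np.ndarray, B: np.ndarray, tol: float = 1e-10) -> bool:
    """Check the two-sided annihilation pattern g(A) B = 0 and B g(A) = 0,
    where ginv computes a chosen generalized inverse of A.

    Hypothetical helper for experimentation; the C-S inverse itself is defined
    in [6] and would have to be supplied by the reader as `ginv`.
    """
    X = ginv(A)
    return (np.linalg.norm(X @ B) < tol) and (np.linalg.norm(B @ X) < tol)

# Example with the Moore-Penrose inverse and two block-orthogonal matrices.
A = np.diag([1.0, 0.0, 0.0])
B = np.diag([0.0, 3.0, 0.0])
print(is_orthogonal_via(np.linalg.pinv, A, B))   # True
```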
Next, based on the C-S partial order, we obtain some relation between the C-S orthogonality and the C-S partial order.
Lemma 5
([6]). Let . There is a binary relation such that:
In this case, there exists a unitary matrix U, such that
where T is invertible, N is nilpotent, and .
Lemma 6
([6]). Let . The partial order on is defined as
We call it C-S partial order.
Theorem 4.
Let , and ; then, the following are equivalent:
- (1)
- , ;
- (2)
- .
Proof.
. Let , i.e., and . Then, and . Since
and
we have and , which implies .
By applying , we have .
Then, is established.
. Let , i.e., and . It is clear that and . It follows that . □
When A is an EP matrix, we have a more refined result, which reduces to the well-known characterizations of orthogonality in the usual sense.
Theorem 5.
Let ; then, the following are equivalent:
- (1)
- ;
- (2)
- ;
- (3)
- ;
- (4)
- ;
- (5)
- There exist nonsingular matrices , , a nilpotent matrix N and a unitary matrix U, such that
Proof.
Since , the decompositions of A and are
where is nonsingular and U is unitary. Then, . It is clear that is equivalent to . It follows from Corollary in [3] that – are equivalent. □
4. Strongly C-S Orthogonality and Its Consequences
The concept of the strongly C-S orthogonality is considered in this section; unlike the C-S orthogonality, it is a symmetric relation.
Definition 3.
Let , and . If
then A and B are said to be strongly C-S orthogonal, denoted as
Remark 3.
Applying Remark 1, we have that is equivalent to . Since and are equivalent, it is interesting to observe that . Then, is equivalent to , . Therefore, the concept of strongly C-S orthogonality can be defined by another condition; that is,
Theorem 6.
Let , and . Then, the following statements are equivalent.
- (1)
- ;
- (2)
- There exist nonsingular matrices , , nilpotent matrices , , and a unitary matrix U, such thatwhere and .
Proof.
. Let , i.e., and . From Theorem 3, the core-EP decompositions of A and B are (7), respectively. And,
Since
it implies ; that is, . On the other hand,
which yields ; that is, . According to the above results, we have
where and .
. Let
It follows from and that
and
Thus, . □
Example 2.
Consider the matrices
Then,
By calculating the matrices, it can be seen that and . Thus, .
Lemma 7.
Let , , and the forms of B and be
respectively. Then,
Proof.
Applying
and
we see that , and , which lead to . And, , . □
Theorem 7.
Let , and , then , if and only if and .
Proof.
Only if: From Theorem 6, we have the forms of A and B from (9). Since , are nilpotent matrices with Ind, we can see that .
It follows that
and
where and . And, it is clear that and .
If: Let the core-EP decomposition of A be as in (1), and the form of be as in (6). Partition B according to the partition of A, then the form of B is (8). Then, write
Applying and , we have
and
Then, the form of B is
where , and .
Let , then
Applying , it is clear that . Thus,
where . Then,
where and . Then, we obtain and , which imply that . It follows from Lemma 7 that
and
Therefore, we obtain
where and . According to , we have that
In addition,
which implies that
and . Then, we have
which implies .
By and , it is clear that . Then, it is obvious that , i.e., . Using , we have . Thus, there is . It follows from and that , that is . And, it implies that . It is clear that . Therefore, it follows that , which leads to .
Using , we have
where . It follows that and . Therefore, we obtain
where and . By Theorem 6, . □
Example 3.
Consider the matrices
It is obvious that .
By calculating the matrices, it can be seen that
and
that is, and . Then, we have , i.e., .
But, we consider the matrices
It is obvious that and . However,
Thus, we cannot see that .
Corollary 2.
Let , and . Then, the following are equivalent:
- (1)
- ;
- (2)
- , and ;
- (3)
- , .
Proof.
. This follows from Theorem 7.
. Applying Remark 1, we have that is equivalent to and . □
Theorem 8.
Let , and . Then, the following are equivalent:
- (1)
- ;
- (2)
- , .
Proof.
. Let , i.e., and . By Definition 1 and , we have
which implies . It follows that . According to Theorem 4, we obtain . In the same way, we see that .
. This is clear by Theorem 4. □
Author Contributions
Conceptualization, X.L., Y.L. and H.J.; methodology, X.L., Y.L. and H.J.; writing original draft preparation, X.L., Y.L. and H.J.; writing review and editing, X.L., Y.L. and H.J.; funding acquisition, X.L. and H.J. All authors have read and agreed to the published version of the manuscript.
Funding
This work was supported by the National Natural Science Foundation of China (No. 12061015); Guangxi Science and Technology Base and Talents Special Project (No. GUIKE21220024) and Guangxi Natural Science Foundation (No. 2018GXNSFDA281023).
Data Availability Statement
Data will be made available on reasonable request.
Conflicts of Interest
No potential conflicts of interest were reported by the authors.
References
- Hestenes, M.R. Relative hermitian matrices. Pac. J. Math. 1961, 11, 225–245.
- Hartwig, R.E.; Styan, G.P.H. On some characterizations of the “star” partial ordering for matrices and rank subtractivity. Linear Algebra Appl. 1986, 82, 145–161.
- Ferreyra, D.E.; Malik, S.B. Core and strongly core orthogonal matrices. Linear Multilinear Algebra 2021, 70, 5052–5067.
- Liu, X.; Wang, C.; Wang, H. Further results on strongly core orthogonal matrix. Linear Multilinear Algebra 2023, 71, 2543–2564.
- Mosić, D.; Dolinar, G.; Kuzma, B.; Marovt, J. Core-EP orthogonal operators. Linear Multilinear Algebra 2022, 1–15.
- Wang, H.; Liu, N. The C-S inverse and its applications. Bull. Malays. Math. Sci. Soc. 2023, 46, 90.
- Moore, E.H. On the reciprocal of the general algebraic matrix. Bull. Am. Math. Soc. 1920, 26, 394–395.
- Bjerhammar, A. Application of calculus of matrices to method of least squares: With special reference to geodetic calculations. Trans. R. Inst. Technol. Stock. Sweden 1951, 49, 82–84.
- Penrose, R. A generalized inverse for matrices. Math. Proc. Camb. Philos. Soc. 1955, 51, 406–413.
- Ben-Israel, A.; Greville, T.N.E. Generalized Inverses: Theory and Applications, 2nd ed.; Springer: New York, NY, USA, 2003.
- Baksalary, O.M.; Trenkler, G. Core inverse of matrices. Linear Multilinear Algebra 2010, 58, 681–697.
- Manjunatha, P.K.; Mohana, K.S. Core-EP inverse. Linear Multilinear Algebra 2014, 62, 792–802.
- Drazin, M.P. Natural structures on semigroups with involution. Bull. Am. Math. Soc. 1978, 84, 139–141.
- Hartwig, R.E. How to partially order regular elements. Math. Jpn. 1980, 25, 1–13.
- Mitra, S.K. On group inverses and the sharp order. Linear Algebra Appl. 1987, 92, 17–37.
- Wang, H. Core-EP decomposition and its applications. Linear Algebra Appl. 2016, 508, 289–300.