Article

The Reverse Order Law for the {1,3M,4N}—The Inverse of Two Matrix Products

School of Mathematics and Computational Science, Wuyi University, Jiangmen 529020, China
*
Author to whom correspondence should be addressed.
Axioms 2025, 14(5), 344; https://doi.org/10.3390/axioms14050344
Submission received: 9 March 2025 / Revised: 17 April 2025 / Accepted: 24 April 2025 / Published: 30 April 2025
(This article belongs to the Special Issue Advances in Linear Algebra with Applications, 2nd Edition)

Abstract
By using the maximal and minimal ranks of some generalized Schur complements, equivalent conditions for the reverse order law $(AB)\{1,3M,4K\} = B\{1,3N,4K\}\,A\{1,3M,4N\}$ are presented.

1. Introduction

First, we provide some definitions for the convenience of the reader.
Definition 1 
([1,2]).
  • $\mathbb{C}^{m \times n}$: the set of $m \times n$ complex matrices;
  • $R(A)$, $N(A)$: the range space and the null space of $A$, respectively;
  • $r(A)$, $A^{*}$: the rank and the conjugate transpose of $A$, respectively;
  • $A^{(M,N)} = A^{\#} = N^{-1}A^{*}M$: the weighted conjugate transpose of $A$ with respect to the positive-definite Hermitian matrices $M$ and $N$.
Definition 2 
([3]). Let $M$ be a positive-definite Hermitian matrix, and let $A \in \mathbb{C}^{m \times n}$, $b \in \mathbb{C}^{m}$. The so-called weighted least squares problem (WLS) is to find $x \in \mathbb{C}^{n}$ that satisfies the following:
\[
\|Ax-b\|_{M} = \left\|M^{\frac{1}{2}}(Ax-b)\right\|_{2} = \min_{x \in \mathbb{C}^{n}} \|Ax-b\|_{M},
\]
where $M = M^{\frac{1}{2}}M^{\frac{1}{2}}$, $\|\cdot\|_{2}$ is the spectral norm, and $\|\cdot\|_{M}$ is the weighted $M$-norm.
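As a numerical illustration (a NumPy sketch, not part of the paper; the random data and the construction of $M$ are illustrative assumptions), the weighted problem reduces to an ordinary least squares problem after scaling both $A$ and $b$ by $M^{1/2}$:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 6, 3
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

# An illustrative positive-definite Hermitian weight M = R R* + I.
R = rng.standard_normal((m, m))
M = R @ R.T + np.eye(m)

# M^{1/2} via eigendecomposition (M is symmetric positive definite).
w, V = np.linalg.eigh(M)
M_half = V @ np.diag(np.sqrt(w)) @ V.T

# Minimising ||Ax - b||_M is the ordinary least squares problem
# for the scaled data (M^{1/2} A, M^{1/2} b).
x = np.linalg.lstsq(M_half @ A, M_half @ b, rcond=None)[0]

# Optimality check: the weighted normal equations A* M (A x - b) = 0.
residual = A.T @ M @ (A @ x - b)
print(np.allclose(residual, 0))  # True
```

The vanishing of the weighted normal equations confirms that the computed $x$ minimizes $\|Ax-b\|_{M}$.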
Definition 3 
([2]). Let $A \in \mathbb{C}^{m \times n}$, and let $M$ and $N$ be two positive-definite Hermitian matrices. $X$ is the weighted Moore–Penrose inverse of $A$ when it satisfies
\[
(1)\ AXA = A, \quad (2)\ XAX = X, \quad (3M)\ (MAX)^{*} = MAX, \quad (4N)\ (NXA)^{*} = NXA,
\]
in which case $X$ is denoted by $X = A^{(1,2,3M,4N)} = A^{\dagger}_{M,N}$. The set of $\{1,3M,4N\}$-inverses of $A$ is
\[
A\{1,3M,4N\} = \left\{ X \in \mathbb{C}^{n \times m} \mid AXA = A,\ (MAX)^{*} = MAX,\ (NXA)^{*} = NXA \right\}.
\]
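To make Definition 3 concrete, a short NumPy sketch (illustrative, not part of the paper; the random data are assumptions) computes $A^{\dagger}_{M,N}$ via the classical reduction $A^{\dagger}_{M,N} = N^{-1/2}\big(M^{1/2}AN^{-1/2}\big)^{\dagger}M^{1/2}$ and checks the four defining conditions:

```python
import numpy as np

def sqrtm_spd(S):
    """Principal square root of a symmetric positive-definite matrix."""
    w, V = np.linalg.eigh(S)
    return V @ np.diag(np.sqrt(w)) @ V.T

rng = np.random.default_rng(1)
m, n = 5, 3
A = rng.standard_normal((m, n))
Rm = rng.standard_normal((m, m))
Rn = rng.standard_normal((n, n))
M = Rm @ Rm.T + np.eye(m)   # positive-definite Hermitian weights
N = Rn @ Rn.T + np.eye(n)

Mh, Nh = sqrtm_spd(M), sqrtm_spd(N)
# Classical reduction to the unweighted pseudoinverse:
# A†_{M,N} = N^{-1/2} (M^{1/2} A N^{-1/2})† M^{1/2}.
X = np.linalg.solve(Nh, np.linalg.pinv(Mh @ A @ np.linalg.inv(Nh))) @ Mh

print(np.allclose(A @ X @ A, A))              # condition (1)
print(np.allclose(X @ A @ X, X))              # condition (2)
print(np.allclose((M @ A @ X).T, M @ A @ X))  # condition (3M)
print(np.allclose((N @ X @ A).T, N @ X @ A))  # condition (4N)
```

(Real matrices are used, so the conjugate transpose reduces to the ordinary transpose.)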
The reverse order law for the weighted generalized inverse has been widely applied in the study of the WLS (see [4,5,6]), the weighted perturbation theory (see [7,8,9,10,11,12]), the optimization problems and other related topics (see [13,14,15,16,17,18]).
The reverse order laws for the weighted generalized inverses of matrix products have attracted considerable attention (see [19,20,21,22,23,24,25,26,27,28]). The core problem is to determine under what conditions the equation
\[
A_{n}^{(i,j,\dots,k)} A_{n-1}^{(i,j,\dots,k)} \cdots A_{1}^{(i,j,\dots,k)} = (A_{1}A_{2}\cdots A_{n})^{(i,j,\dots,k)} \tag{1}
\]
holds, where $i,j,\dots,k \in \{1,2,3M,4N\}$.
The purpose of this paper is to show some equivalent conditions for the following reverse order law:
\[
B\{1,3N,4K\}\,A\{1,3M,4N\} = (AB)\{1,3M,4K\}. \tag{2}
\]
Furthermore, the equivalent conditions for the inclusions
\[
B\{1,3N,4K\}\,A\{1,3M,4N\} \supseteq (AB)\{1,3M,4K\} \tag{3}
\]
and
\[
B\{1,3N,4K\}\,A\{1,3M,4N\} \subseteq (AB)\{1,3M,4K\}, \tag{4}
\]
are presented.
The following lemmas are essential in the rest of this paper.
Lemma 1 
([2,4]). Let M and N be two positive-definite Hermitian matrices, and let $A \in \mathbb{C}^{m \times n}$. Then,
\[
X \in A\{1,3M\} \iff A^{*}MAX = A^{*}M, \tag{5}
\]
\[
X \in A\{1,4N\} \iff XAN^{-1}A^{*} = N^{-1}A^{*}, \tag{6}
\]
\[
X \in A\{1,4N\} \iff X^{*} \in A^{*}\{1,3N^{-1}\}. \tag{7}
\]
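The characterizations in Lemma 1 can be checked numerically with the same construction of $A^{\dagger}_{M,N}$ as above (a NumPy sketch with illustrative random data; since $A^{\dagger}_{M,N}$ belongs to both $A\{1,3M\}$ and $A\{1,4N\}$, it must satisfy all three criteria):

```python
import numpy as np

def sqrtm_spd(S):
    """Principal square root of a symmetric positive-definite matrix."""
    w, V = np.linalg.eigh(S)
    return V @ np.diag(np.sqrt(w)) @ V.T

rng = np.random.default_rng(2)
m, n = 5, 3
A = rng.standard_normal((m, n))
Rm = rng.standard_normal((m, m))
Rn = rng.standard_normal((n, n))
M = Rm @ Rm.T + np.eye(m)
N = Rn @ Rn.T + np.eye(n)

Mh, Nh = sqrtm_spd(M), sqrtm_spd(N)
# A†_{M,N} via the reduction N^{-1/2} (M^{1/2} A N^{-1/2})† M^{1/2}.
X = np.linalg.solve(Nh, np.linalg.pinv(Mh @ A @ np.linalg.inv(Nh))) @ Mh
Ninv = np.linalg.inv(N)

print(np.allclose(A.T @ M @ A @ X, A.T @ M))        # (5): X in A{1,3M}
print(np.allclose(X @ A @ Ninv @ A.T, Ninv @ A.T))  # (6): X in A{1,4N}
# (7): X* is a {1,3N^{-1}}-inverse of A*, checked via criterion (5) for A*.
print(np.allclose(A @ Ninv @ A.T @ X.T, A @ Ninv))
```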
Lemma 2 
([17]). Suppose that A, B, C and D are four complex matrices, and suppose that M and N are two positive-definite Hermitian matrices. Then,
\[
r\big(D - CA^{\dagger}_{M,N}B\big) = r\begin{pmatrix} A^{*}MAN^{-1}A^{*} & A^{*}MB \\ CN^{-1}A^{*} & D \end{pmatrix} - r(A); \tag{8}
\]
\[
\max_{A^{(1,3M,4N)}} r\big(D - CA^{(1,3M,4N)}B\big) = \min\left\{ r\begin{pmatrix} A^{*}MA & A^{*}MB \\ C & D \end{pmatrix} - r(A),\ r\begin{pmatrix} AN^{-1}A^{*} & B \\ CN^{-1}A^{*} & D \end{pmatrix} - r(A) \right\}; \tag{9}
\]
\[
\min_{A^{(1,3M,4N)}} r\big(D - CA^{(1,3M,4N)}B\big) = r\begin{pmatrix} A^{*}MA & A^{*}MB \\ C & D \end{pmatrix} + r\begin{pmatrix} AN^{-1}A^{*} & B \\ CN^{-1}A^{*} & D \end{pmatrix} - r\begin{pmatrix} A & B \\ C & D \end{pmatrix} - r(A). \tag{10}
\]
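Formula (8) can be verified numerically for generic data (a NumPy sketch; the shapes and random matrices are illustrative assumptions, not part of the paper):

```python
import numpy as np

def sqrtm_spd(S):
    """Principal square root of a symmetric positive-definite matrix."""
    w, V = np.linalg.eigh(S)
    return V @ np.diag(np.sqrt(w)) @ V.T

rng = np.random.default_rng(4)
m, n = 5, 3
A = rng.standard_normal((m, n))   # A : m x n
B = rng.standard_normal((m, 2))   # B : m x q
C = rng.standard_normal((4, n))   # C : p x n
D = rng.standard_normal((4, 2))   # D : p x q
Rm = rng.standard_normal((m, m))
Rn = rng.standard_normal((n, n))
M = Rm @ Rm.T + np.eye(m)
N = Rn @ Rn.T + np.eye(n)

Mh, Nh = sqrtm_spd(M), sqrtm_spd(N)
A_mn = np.linalg.solve(Nh, np.linalg.pinv(Mh @ A @ np.linalg.inv(Nh))) @ Mh  # A†_{M,N}
Ninv = np.linalg.inv(N)

# Left-hand side of (8): rank of the generalized Schur complement.
lhs = np.linalg.matrix_rank(D - C @ A_mn @ B)
# Right-hand side of (8): rank of the bordered block matrix minus r(A).
block = np.block([[A.T @ M @ A @ Ninv @ A.T, A.T @ M @ B],
                  [C @ Ninv @ A.T,           D          ]])
rhs = np.linalg.matrix_rank(block) - np.linalg.matrix_rank(A)
print(lhs == rhs)  # True
```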
Lemma 3 
([29]). Let X and Y be two arbitrary matrices, and let $A_{ij} \in \mathbb{C}^{m_i \times n_j}$ $(1 \le i, j \le 3)$. Then,
\[
\min_{X,Y} r\begin{pmatrix} A_{11} & A_{12} & X \\ A_{21} & A_{22} & A_{23} \\ Y & A_{32} & A_{33} \end{pmatrix}
= r\begin{pmatrix} A_{21} & A_{22} & A_{23} \end{pmatrix} + r\begin{pmatrix} A_{12} \\ A_{22} \\ A_{32} \end{pmatrix}
+ \max\left\{ r\begin{pmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{pmatrix} - r\begin{pmatrix} A_{12} \\ A_{22} \end{pmatrix} - r\begin{pmatrix} A_{21} & A_{22} \end{pmatrix},\ r\begin{pmatrix} A_{22} & A_{23} \\ A_{32} & A_{33} \end{pmatrix} - r\begin{pmatrix} A_{22} \\ A_{32} \end{pmatrix} - r\begin{pmatrix} A_{22} & A_{23} \end{pmatrix} \right\}. \tag{11}
\]
The contents of this paper are organized as follows: In Section 2, we first present equivalent conditions for Inclusions (3) and (4), using the maximal and minimal ranks of generalized Schur complements and completions of partial matrices. Applying these results, we then study the reverse order law (2). Finally, in Theorem 3, we obtain necessary and sufficient conditions for the reverse order law (2) in terms of rank conditions on known matrices.

2. Main Results

To obtain the main result (Theorem 3), we need to present the minimal rank of the generalized Schur complement $D - B^{(1,3N,4K)}A^{(1,3M,4N)}$.
Lemma 4. 
Let M, N, and K be three positive-definite Hermitian matrices, and let A C m × n , B C n × k , D C k × m . Then,
\[
\begin{aligned}
\min_{A^{(1,3M,4N)},\,B^{(1,3N,4K)}} r\big(D - B^{(1,3N,4K)}A^{(1,3M,4N)}\big)
={}& r\begin{pmatrix} B^{*}NBD & B^{*}N \\ A^{*}M & A^{*}MA \end{pmatrix}
+ r\begin{pmatrix} N^{-1}A^{*} & BK^{-1}B^{*} \\ DAN^{-1}A^{*} & K^{-1}B^{*} \end{pmatrix}
- r(A) - r(B) \\
&+ \max\Big\{ r\big(B^{*}A^{*} - B^{*}NBDAN^{-1}A^{*}\big) - r\big(BDAN^{-1}A^{*} - N^{-1}A^{*}\big) - r\big(B^{*}NBDA - B^{*}N\big), \\
&\qquad\qquad r\begin{pmatrix} K^{-1}B^{*} & D \\ A^{*}MABK^{-1}B^{*} & A^{*}M \end{pmatrix} - r\big(DABK^{-1}B^{*} - K^{-1}B^{*}\big) - r\big(A^{*}MABD - A^{*}M\big) - n \Big\}. 
\end{aligned}
\tag{12}
\]
Proof. 
It is well known that A ( 1 , 3 M , 4 N ) = A M , N + F A ˜ V E A ˜ , where F A ˜ = I n A M , N A , E A ˜ = I m A A M , N and V is an arbitrary matrix (see [17]). Combining this fact, we have
r ( D B ( 1 , 3 N , 4 K ) A ( 1 , 3 M , 4 N ) ) = r [ ( B N , K + F B ˜ W E B ˜ ) ( A M , N + F A ˜ V E A ˜ ) D ] = r ( B N , K A M , N + B N , K F A ˜ V E A ˜ + F B ˜ W E B ˜ A M , N + F B ˜ W E B ˜ F A ˜ V E A ˜ D ) = r O O O O I n V O O E A ˜ O O I m O O O I n F A ˜ O O F B ˜ B N , K A M , N D B N , K O O I n O E B ˜ A M , N E B ˜ O O W I k O O O O k m 3 n .
where F A ˜ = I n A M , N A , E A ˜ = I m A A M , N , F B ˜ = I k B N , K B , E B ˜ = I n B B N , K and the matrices V and W are arbitrary. Applying Lemma 3, we have
min A ( 1 , 3 M , 4 N ) , B ( 1 , 3 N , 4 K ) r ( D B ( 1 , 3 N , 4 K ) A ( 1 , 3 M , 4 N ) ) = r F B ˜ , B N , K A M , N D , B N , K F A ˜ + r E A ˜ B N , K A M , N D E B ˜ A M , N + max { r O E A ˜ F B ˜ B N , K A M , N D r O E A ˜ F B ˜ B N , K A M , N D O E B ˜ A M , N r O E A ˜ O F B ˜ B N , K A M , N D B N , K F A ˜ , r B N , K A M , N D B N , K F A ˜ E B ˜ A M , N E B ˜ F A ˜ r E A ˜ O B N , K A M , N D B N , K F A ˜ E B ˜ A M , N E B ˜ F A ˜ r F B ˜ B N , K A M , N D B N , K F A ˜ O E B ˜ A M , N E B ˜ F A ˜ }
The next step is to use block matrix operations to simplify the rank of the right-hand side of (14). For the first one, combining F A ˜ = I n A M , N A and F B ˜ = I k B N , K B , we have
r F B ˜ , B N , K A M , N D , B N , K F A ˜ = r I k B N , K B , B N , K A M , N D , B N , K B N , K A M , N A = r I k B N , K B , B N , K A M , N D , B N , K D A = r B N , K B N , K B B N , K A M , N B N , K O O I k B N , K B , B N , K A M , N D , B N , K D A O O O A M , N O A M , N r ( A ) r ( B ) = r B N , K B N , K B O B N , K B N , K A M , N B N , K I k , D , D A O O O A M , N O A M , N r ( A ) r ( B ) = r B N , K B D B N , K B D A B N , K B N , K A M , N A M , N O A M , N + k r ( A ) r ( B ) = r B N , K B D B N , K A M , N A M , N A + k r ( A ) r ( B ) .
From Definition 1, we have A # = N 1 A * M , B # = K 1 B * N and
B # B O O A # A B N , K B D B N , K A M , N A M , N A = B # B D B # A # A # A
and
B N , K ( B N , K ) # O O A M , N ( A M , N ) # B # B D B # A # A # A = B N , K B D B N , K A M , N A M , N A .
That is,
r B N , K B D B N , K A M , N A M , N A = r B # B D B # A # A # A = r B * N B D B * N A * M A * M A .
Substituting Formula (16) into (15), we have
r F B ˜ , B N , K A M , N D , B N , K F A ˜ = r B * N B D B * N A * M A * M A + k r ( A ) r ( B ) .
For the second partitioned matrix on the right-hand side of (14), we have
r E A ˜ B N , K A M , N D E B ˜ A M , N = r I n A M , N O E A ˜ B N , K D E B ˜ O n = r A M , N O A M , N O O I n A M , N O O O E A ˜ O O B N , K D O O E B ˜ O O O B N , K O B N , K n r ( A ) r ( B ) = r A M , N O A M , N O A M , N I n O O A A M , N O I m O O O D O O I n O B B N , K O B N , K O B N , K n r ( A ) r ( B ) = r A M , N B B N , K D A A M , N B N , K + m r ( A ) r ( B ) .
Similarly to (16), we have
A M , N B B N , K D A A M , N B N , K A A # O O B B # = A # B B # D A A # B #
and
A # B B # D A A # B # ( A M , N ) # A M , N O O ( B N , K ) # B N , K = A M , N B B N , K D A A M , N B N , K .
That is,
r A M , N B B N , K D A A M , N B N , K = r A # B B # D A A # B # = r N 1 A * B K 1 B * D A N 1 A * K 1 B * .
Substituting Formula (19) into (18), we have
r E A ˜ B N , K A M , N D E B ˜ A M , N = r N 1 A * B K 1 B * D A N 1 A * K 1 B * + m r ( A ) r ( B ) .
Substituting E A ˜ = I m A A M , N and F B ˜ = I k B N , K B into the third block matrix in (14), we have
r O E A ˜ F B ˜ B N , K A M , N D = r A M , N O A M , N O O O E A ˜ O O F B ˜ B N , K A M , N D B N , K O O O B N , K r ( A ) r ( B ) = r A M , N O A M , N O A A M , N O I m O O I k D B N , K B N , K A M , N B N , K B O B N , K r ( A ) r ( B ) = r B N , K A M , N B N , K B D A A M , N + k + m r ( A ) r ( B ) .
This is because
r B # A # B # B D A A # = r [ B # B B N , K A M , N B N , K B D A A M , N A A # ] r B N , K A M , N B N , K B D A A M , N
and
r B N , K A M , N B N , K B D A A M , N = r [ B N , K ( B N , K ) # B # A # B # B D A A # ( A M , N ) # A M , N ] r B # A # B # B D A A # .
Then,
r B N , K A M , N B N , K B D A A M , N = r B # A # B # B D A A # = r B * A * B * N B D A N 1 A *
That is,
r O E A ˜ F B ˜ B N , K A M , N D = r B * A * B * N B D A N 1 A * + k + m r ( A ) r ( B ) .
For the fourth partitioned matrix on the right-hand side of (14), we have
r O E A ˜ F B ˜ B N , K A M , N D O E B ˜ A M , N = r O E A ˜ F B ˜ B N , K A M , N D O A M , N B D = r O E A ˜ F B ˜ B N , K B D D O A M , N B D = r O E A ˜ F B ˜ O O A M , N B D = r ( F B ˜ ) + r A M , N A M , N O E A ˜ O A M , N B D r ( A ) = k r ( B ) + r A M , N A M , N A A M , N I m A M , N B D r ( A ) = r ( B D A A M , N A M , N ) + k + m r ( B ) r ( A ) .
Using Definition 1, we obtain
r ( B D A A M , N A M , N ) = r [ ( B D A A # A # ) ( A M , N ) # A M , N ] r ( B D A A # A # )
and
r ( B D A A # A # ) = r [ ( B D A A M , N A M , N ) A A # ] r ( B D A A M , N A M , N ) .
That is,
r ( B D A A M , N A M , N ) = r ( B D A A # A # ) = r ( B D A N 1 A * N 1 A * ) .
Substituting Formula (25) into (24), we have
r O E A ˜ F B ˜ B N , K A M , N D O E B ˜ A M , N = r ( B D A N 1 A * N 1 A * ) + k + m r ( B ) r ( A ) .
For the fifth block matrix in (14), via the row or column elementary block matrix operations and E A ˜ A = O , E A ˜ = I m A A M , N , F A ˜ = I n A M , N A and F B ˜ = I k B N , K B , we have
r O E A ˜ O F B ˜ B N , K A M , N D B N , K F A ˜ = r O E A ˜ O F B ˜ B N , K A M , N D B N , K D A = r A M , N O A M , N O O O O E A ˜ O O O F B ˜ B N , K A M , N D B N , K D A B N , K O O O O B N , K r ( A ) r ( B ) = r A M , N O A M , N O O A A M , N O I m O O O I k D D A B N , K B N , K A M , N B N , K B O B N , K B N , K r ( A ) r ( B ) = r B N , K B D A B N , K + m + k r ( A ) r ( B ) .
From Definition 1, we have
r B N , K B D A B N , K = r [ B N , K ( B N , K ) # B # B D A B # ] r B # B D A B # = r [ B # B B N , K B D A B N , K ] r B N , K B D A B N , K .
That is,
r B N , K B D A B N , K = r B # B D A B # = r B * N B D A B * N .
Substituting Formula (28) into (27), we have
r O E A ˜ O F B ˜ B N , K A M , N D B N , K F A ˜ = r B * N B D A B * N + m + k r ( A ) r ( B ) .
For the sixth block matrix in (14), using the same method as above, we have
r B N , K A M , N D B N , K F A ˜ E B ˜ A M , N E B ˜ F A ˜ = r B N , K A M , N D B N , K F A ˜ A M , N B D F A ˜ = r B N , K B D D O A M , N B D F A ˜ = r B N , K O O O B N , K B N , K B D D O O O A M , N B D F A ˜ A M , N O O O A M , N r ( A ) r ( B ) = r B N , K B N , K B D O O B N , K D O O O B D I n A M , N O A M , N A M , N A A M , N r ( A ) r ( B ) = r B N , K D A M , N A B B N , K A M , N + n r ( A ) r ( B ) .
From Definition 1, we have
r B N , K D A M , N A B B N , K A M , N = r I O O A M , N ( A M , N ) # B # D A # A B B # A # ( B N , K ) # B N , K O O I r B # D A # A B B # A #
and
r B # D A # A B B # A # = r I O O A # A B N , K D A M , N A B B N , K A M , N B B # O O I r B N , K D A M , N A B B N , K A M , N .
That is,
r B N , K D A M , N A B B N , K A M , N = r B # D A # A B B # A # = r K 1 B * D A * M A B K 1 B * A * M
Substituting Formula (31) into (30), we have
r B N , K A M , N D B N , K F A ˜ E B ˜ A M , N E B ˜ F A ˜ = r K 1 B * D A * M A B K 1 B * A * M + n r ( A ) r ( B ) .
For the seventh partitioned matrix in (14), using the same method as above with E A ˜ A = O , E A ˜ = I m A A M , N , F A ˜ = I n A M , N A and E B ˜ = I n B B N , K , we have
r E A ˜ O B N , K A M , N D B N , K F A ˜ E B ˜ A M , N E B ˜ F A ˜ = r E A ˜ O B N , K A M , N D B N , K D A E B ˜ A M , N E B ˜ = r E A ˜ O D A A M , N D B N , K D A O E B ˜ = r A M , N A M , N O O O E A ˜ O O O D A A M , N D B N , K D A O O O E B ˜ O O O B N , K B N , K r ( A ) r ( B ) = r A M , N A M , N O O A A M , N I m O O D A A M , N D D A B N , K O O I n B B N , K O O B N , K B N , K r ( A ) r ( B ) = r D A B B N , K B N , K + m + n r ( A ) r ( B ) .
From Definition 1, we have
r D A B B N , K B N , K = r [ D A B B # B # ( B N , K ) # B N , K ] r D A B B # B #
and
r D A B B # B # = r [ D A B B N , K B N , K B B # ] r D A B B N , K B N , K .
That is,
r D A B B N , K B N , K = r D A B B # B # = r D A B K 1 B * K 1 B * .
Substituting Formula (34) into (33), we have
r E A ˜ O B N , K A M , N D B N , K F A ˜ E B ˜ A M , N E B ˜ F A ˜ = r D A B K 1 B * K 1 B * + n + m r ( A ) r ( B ) .
For the last partitioned matrix in (14), using the basic block matrix operations for rows or columns and B F B ˜ = O , F A ˜ = I n A M , N A , F B ˜ = I k B N , K B and E B ˜ = I n B B N , K , we have
r F B ˜ B N , K A M , N D B N , K F A ˜ O E B ˜ A M , N E B ˜ F A ˜ = r F B ˜ B N , K A M , N D B N , K F A ˜ O A M , N B D F A ˜ = r F B ˜ B N , K B D D O O A M , N B D F A ˜ = r B N , K O O O O B N , K F B ˜ B N , K B D D O O O O A M , N B D F A ˜ A M , N O O O O A M , N r ( A ) r ( B ) = r B N , K B N , K B B N , K B D O O B N , K I k D O O O O B D I n A M , N O O A M , N A M , N A A M , N r ( A ) r ( B ) = r A M , N A B D A M , N + n + k r ( A ) r ( B ) .
From Definition 1, we have
r A M , N A B D A M , N = r [ A M , N ( A M , N ) # A # A B D A # ] r A # A B D A #
and
r A # A B D A # = r [ A # A A M , N A B D A M , N ] r A M , N A B D A M , N .
That is,
r A M , N A B D A M , N = r A # A B D A # = r A * M A B D A * M .
Substituting Formula (37) into (36), we have
r F B ˜ B N , K A M , N D B N , K F A ˜ O E B ˜ A M , N E B ˜ F A ˜ = r A * M A B D A * M + n + k r ( A ) r ( B ) .
According to the above proof process, by substituting Formulas (17), (20), (23), (26), (29), (32), (35) and (38) into (14), we have Formula (12). Thus, Lemma 4 is proved. □
Theorem 1. 
Let M, N, and K be three positive-definite Hermitian matrices, and let A C m × n , B C n × k . Then,
\[
(AB)\{1,3M,4K\} \subseteq B\{1,3N,4K\}\,A\{1,3M,4N\} \iff r(AB) = \max\left\{ \min\{a_1, a_2\},\ \min\{k - r(B),\, m - r(A)\} + a_3 + a_4 - n \right\}, \tag{39}
\]
where
\[
a_1 = r\begin{pmatrix} B^{*}NB & B^{*}A^{*} \\ B^{*}A^{*}MAB & B^{*}A^{*}MAN^{-1}A^{*} \end{pmatrix}, \qquad a_3 = r\begin{pmatrix} B^{*}N \\ B^{*}A^{*}MA \end{pmatrix}
\]
and
\[
a_2 = r\begin{pmatrix} AN^{-1}A^{*} & ABK^{-1}B^{*}A^{*} \\ B^{*}A^{*} & B^{*}NBK^{-1}B^{*}A^{*} \end{pmatrix}, \qquad a_4 = r\begin{pmatrix} AN^{-1} & ABK^{-1}B^{*} \end{pmatrix}.
\]
Proof. 
Observe that the inclusion $(AB)\{1,3M,4K\} \subseteq B\{1,3N,4K\}\,A\{1,3M,4N\}$ holds if and only if, for each $(AB)^{(1,3M,4K)} \in (AB)\{1,3M,4K\}$, there exist $A^{(1,3M,4N)} \in A\{1,3M,4N\}$ and $B^{(1,3N,4K)} \in B\{1,3N,4K\}$ such that $B^{(1,3N,4K)}A^{(1,3M,4N)} = (AB)^{(1,3M,4K)}$. Thus, the following three statements are equivalent:
\[
(AB)\{1,3M,4K\} \subseteq B\{1,3N,4K\}\,A\{1,3M,4N\},
\]
\[
B^{(1,3N,4K)}A^{(1,3M,4N)} = (AB)^{(1,3M,4K)}
\]
and
\[
\max_{(AB)^{(1,3M,4K)}} \min_{A^{(1,3M,4N)},\,B^{(1,3N,4K)}} r\big((AB)^{(1,3M,4K)} - B^{(1,3N,4K)}A^{(1,3M,4N)}\big) = 0.
\]
It is well known that A B ( A B ) M , K = A B ( A B ) ( 1 , 3 M , 4 K ) = P R ( A B ) , N ( B # A # ) and ( A B ) M , K A B = ( A B ) ( 1 , 3 M , 4 K ) A B = P R ( B # A # ) , N ( A B ) (see [4]). Combining this fact with Formula (12) in Lemma 4, we have
min A ( 1 , 3 M , 4 N ) , B ( 1 , 3 N , 4 K ) r ( ( A B ) ( 1 , 3 M , 4 K ) B ( 1 , 3 N , 4 K ) A ( 1 , 3 M , 4 N ) ) = r O B * N A * M A * M A B ( A B ) M , K A * M A + r N 1 A * B K 1 B * O K 1 B * ( A B ) M , K A B K 1 B * r ( A ) r ( B ) + max { r ( B * A * B * N B ( A B ) ( 1 , 3 M , 4 K ) A N 1 A * ) r ( N 1 A * B ( A B ) ( 1 , 3 M , 4 K ) A N 1 A * ) r ( B * N B ( A B ) ( 1 , 3 M , 4 K ) A B * N ) , r A * M A * M A B K 1 B * ( A B ) ( 1 , 3 M , 4 K ) K 1 B * r ( ( A B ) M , K A B K 1 B * K 1 B * ) r ( A * M A B ( A B ) M , K A * M ) n } .
Applying (8) of Lemma 2, we have
r O B * N A * M A * M A B ( A B ) M , K A * M A = r O B * N A * M A * M A O A * M A B ( A B ) M , K I m O = r ( A B ) * M A B K 1 ( A B ) * ( A B ) * M O O O B * N A * M A B K 1 ( A B ) * A * M A * M A r ( A B ) = a 3 + r ( A ) r ( A B )
and
r N 1 A * B K 1 B * O K 1 B * ( A B ) M , K A B K 1 B * = r N 1 A * B K 1 B * O K 1 B * O I k ( A B ) M , K O A B K 1 B * = r ( A B ) * M A B K 1 ( A B ) * O ( A B ) * M A B K 1 B * O N 1 A * B K 1 B * K 1 ( A B ) * O K 1 B * r ( A B ) = a 4 + r ( B ) r ( A B )
and
r ( ( A B ) M , K A B K 1 B * K 1 B * ) = r ( K 1 B * ( A B ) M , K A B K 1 B * ) = r ( A B ) * M A B K 1 ( A B ) * ( A B ) * M A B K 1 B * K 1 ( A B ) * K 1 B * r ( A B ) = r ( B ) r ( A B )
and
r ( A * M A B ( A B ) M , K A * M ) = r ( A * M A * M A B ( A B ) M , K ) = r ( A B ) * M A B K 1 ( A B ) * ( A B ) * M A * M A B K 1 ( A B ) * A * M r ( A B ) = r ( A ) r ( A B ) .
Applying (9) and (10) of Lemma 2, we have
min ( A B ) ( 1 , 3 M , 4 K ) r ( N 1 A * B ( A B ) ( 1 , 3 M , 4 K ) A N 1 A * ) = r ( A B ) * M A B ( A B ) * M A N 1 A * B N 1 A * + r A B K 1 ( A B ) * A N 1 A * B K 1 ( A B ) * N 1 A * r A B A N 1 A * B N 1 A * r ( A B ) = a 4 r ( A B ) = max ( A B ) ( 1 , 3 M , 4 K ) r ( N 1 A * B ( A B ) ( 1 , 3 M , 4 K ) A N 1 A * )
and
min ( A B ) ( 1 , 3 M , 4 K ) r ( B * N B ( A B ) ( 1 , 3 M , 4 K ) A B * N ) = min ( A B ) ( 1 , 3 M , 4 K ) r ( B * N B * N B ( A B ) ( 1 , 3 M , 4 K ) A ) = r ( A B ) * M A B ( A B ) * M A B * N B B * N + r A B K 1 ( A B ) * A B * N B K 1 ( A B ) * B * N r A B A B * N B B * N r ( A B ) = a 3 r ( A B ) = max ( A B ) ( 1 , 3 M , 4 K ) r ( B * N B ( A B ) ( 1 , 3 M , 4 K ) A B * N ) .
For any ( A B ) ( 1 , 3 M , 4 K ) ( A B ) { 1 , 3 M , 4 K } , we have
r ( N 1 A * B ( A B ) ( 1 , 3 M , 4 K ) A N 1 A * ) = a 4 r ( A B )
and
r ( B * N B ( A B ) ( 1 , 3 M , 4 K ) A B * N ) = a 3 r ( A B ) .
Combining Formulas (45)–(51), we have
max ( A B ) ( 1 , 3 M , 4 K ) min A ( 1 , 3 M , 4 N ) , B ( 1 , 3 N , 4 K ) r ( ( A B ) ( 1 , 3 M , 4 K ) B ( 1 , 3 N , 4 K ) A ( 1 , 3 M , 4 N ) ) = max { max ( A B ) ( 1 , 3 M , 4 K ) r ( B * A * B * N B ( A B ) ( 1 , 3 M , 4 K ) A N 1 A * ) ,
max ( A B ) ( 1 , 3 M , 4 K ) r A * M A * M A B K 1 B * ( A B ) ( 1 , 3 M , 4 K ) K 1 B * + a 3 + a 4 r ( A ) r ( B ) n } .
Using (9) of Lemma 2 with A = A B , B = A N 1 A * , C = B * N B and D = B * A * , we obtain
max ( A B ) ( 1 , 3 M , 4 K ) r ( B * A * B * N B ( A B ) ( 1 , 3 M , 4 K ) A N 1 A * ) = min { r ( A B ) * M A B ( A B ) * M A N 1 A * B * N B B * A * r ( A B ) , r A B K 1 ( A B ) * A N 1 A * B * N B K 1 ( A B ) * B * A * r ( A B ) } = min a 1 r ( A B ) , a 2 r ( A B ) .
Using (9) of Lemma 2 with A ˜ = A B , X ˜ = ( A B ) ( 1 , 3 M , 4 K ) , B ˜ = I m , O , C ˜ = O I k and D ˜ = A * M A * M A B K 1 B * O K 1 B * , we obtain
max ( A B ) ( 1 , 3 M , 4 K ) r A * M A * M A B K 1 B * ( A B ) ( 1 , 3 M , 4 K ) K 1 B * = max ( A B ) ( 1 , 3 M , 4 K ) r ( D ˜ C ˜ X ˜ B ˜ ) = min { r ( A B ) * M A B ( A B ) * M O O A * M A * M A B K 1 B * I k O K 1 B * , r A B K 1 ( A B ) * I m O O A * M A * M A B K 1 B * K 1 ( A B ) * O K 1 B * } r ( A B ) = min r ( A ) r ( A B ) + k , r ( B ) r ( A B ) + m .
Combining Formulas (52), (53) and (54), we have
max ( A B ) ( 1 , 3 M , 4 K ) min A ( 1 , 3 M , 4 N ) , B ( 1 , 3 N , 4 K ) r ( ( A B ) ( 1 , 3 M , 4 K ) B ( 1 , 3 N , 4 K ) A ( 1 , 3 M , 4 N ) ) = max min { a 1 , a 2 } , min { k r ( B ) , m r ( A ) } + a 3 + a 4 n r ( A B ) .
According to Formulas (42), (43) and (44) (i.e., (55)), we obtain Formula (39). □
Theorem 2. 
Let M, N, and K be three positive-definite Hermitian matrices, and let A C m × n , B C n × k . Then,
\[
B\{1,3N,4K\}\,A\{1,3M,4N\} \subseteq (AB)\{1,3M,4K\} \iff \min\{a_1 + m - r(A),\, a_3\} = r(B) \ \text{and} \ \min\{a_2 + k - r(B),\, a_4\} = r(A), \tag{56}
\]
where a 1 , a 2 , a 3 and a 4 are the same as in Theorem 1.
Proof. 
For brevity, and for the convenience of future research, we only outline the proof. For any $A^{(1,3M,4N)} \in A\{1,3M,4N\}$ and $B^{(1,3N,4K)} \in B\{1,3N,4K\}$, according to Lemma 1, Formulas (57)–(59) are equivalent, as follows:
\[
B\{1,3N,4K\}\,A\{1,3M,4N\} \subseteq (AB)\{1,3M,4K\}.
\]
\[
(AB)^{*}MAB\,B^{(1,3N,4K)}A^{(1,3M,4N)} = (AB)^{*}M \quad\text{and}\quad B^{(1,3N,4K)}A^{(1,3M,4N)}AB\,K^{-1}(AB)^{*} = K^{-1}(AB)^{*}.
\]
and
\[
\max_{A^{(1,3M,4N)},\,B^{(1,3N,4K)}} r\big(B^{*}A^{*}M - B^{*}A^{*}MAB\,B^{(1,3N,4K)}A^{(1,3M,4N)}\big) = 0 \quad\text{and}\quad \max_{B^{(1,3N,4K)},\,A^{(1,3M,4N)}} r\big(K^{-1}B^{*}A^{*} - B^{(1,3N,4K)}A^{(1,3M,4N)}AB\,K^{-1}B^{*}A^{*}\big) = 0.
\]
From (9) of Lemma 2 with A = A , B = I m , C = B * A * M A B B ( 1 , 3 N , 4 K ) and D = B * A * M , we have
max A ( 1 , 3 M , 4 N ) r ( B * A * M B * A * M A B B ( 1 , 3 N , 4 K ) A ( 1 , 3 M , 4 N ) ) = min { r A * M A A * M B * A * M A B B ( 1 , 3 N , 4 K ) B * A * M r ( A ) , r A N 1 A * I m B * A * M A B B ( 1 , 3 N , 4 K ) N 1 A * B * A * M r ( A ) } = min { r ( B * A * M A B * A * M A B B ( 1 , 3 N , 4 K ) ) , r ( B * A * M A N 1 A * B * A * M A B B ( 1 , 3 N , 4 K ) N 1 A * ) + m r ( A ) } .
Using (10) of Lemma 2 with A = B , B = I n , C = B * A * M A B and D = B * A * M A , we have
min B ( 1 , 3 N , 4 K ) r ( B * A * M A B * A * M A B B ( 1 , 3 N , 4 K ) ) = r B * N B B * N B * A * M A B B * A * M A + r B K 1 B * I n B * A * M A B K 1 B * B * A * M A r B I n B * A * M A B B * A * M A r ( B ) = a 3 r ( B ) .
Using (10) of Lemma 2 with A = B , B = N 1 A * , C = B * A * M A B and D = B * A * M A N 1 A * , we have
min B ( 1 , 3 N , 4 K ) r ( B * A * M A N 1 A * B * A * M A B B ( 1 , 3 N , 4 K ) N 1 A * ) = r B * N B B * A * B * A * M A B B * A * M A N 1 A * + r B K 1 B * N 1 A * B * A * M A B K 1 B * B * A * M A N 1 A * r B N 1 A * B * A * M A B B * A * M A N 1 A * r ( B ) .
This is because
r B K 1 B * N 1 A * B * A * M A B K 1 B * B * A * M A N 1 A * = r B K 1 / 2 N 1 A * B * A * M A B K 1 / 2 B * A * M A N 1 A * ( B K 1 / 2 ) * O O I r B K 1 / 2 N 1 A * B * A * M A B K 1 / 2 B * A * M A N 1 A *
and
r B K 1 / 2 N 1 A * B * A * M A B K 1 / 2 B * A * M A N 1 A * = r B K 1 B * N 1 A * B * A * M A B K 1 B * B * A * M A N 1 A * ( B K 1 B * ) B K 1 / 2 O O I r B K 1 B * N 1 A * B * A * M A B K 1 B * B * A * M A N 1 A * .
Then, we have
r B K 1 B * N 1 A * B * A * M A B K 1 B * B * A * M A N 1 A * = r B K 1 / 2 N 1 A * B * A * M A B K 1 / 2 B * A * M A N 1 A * = r B N 1 A * B * A * M A B B * A * M A N 1 A * .
Combining Formulas (62)–(64), we have
min B ( 1 , 3 N , 4 K ) r ( B * A * M A N 1 A * B * A * M A B B ( 1 , 3 N , 4 K ) N 1 A * ) = r B * N B B * A * B * A * M A B B * A * M A N 1 A * r ( B ) = a 1 r ( B ) .
From Formulas (60), (61) and (65), we have
max A ( 1 , 3 M , 4 N ) , B ( 1 , 3 N , 4 K ) r ( B * A * M B * A * M A B B ( 1 , 3 N , 4 K ) A ( 1 , 3 M , 4 N ) ) = min { min B ( 1 , 3 N , 4 K ) r ( B * A * M A B * A * M A B B ( 1 , 3 N , 4 K ) ) , [ min B ( 1 , 3 N , 4 K ) r ( B * A * M A N 1 A * B * A * M A B B ( 1 , 3 N , 4 K ) N 1 A * ) ] + m r ( A ) } = min a 1 r ( A ) r ( B ) + m , a 3 r ( B ) .
On the other hand, using Lemma 2 again and arguing as in (66), we have
max B ( 1 , 3 N , 4 K ) , A ( 1 , 3 M , 4 N ) r ( K 1 B * A * B ( 1 , 3 N , 4 K ) A ( 1 , 3 M , 4 N ) A B K 1 B * A * ) = min a 2 + k r ( A ) r ( B ) , a 4 r ( A ) .
Finally, by combining Formulas (57), (58), (59), (66) and (67), we have Formula (56). □
The reverse order law (2) holds if and only if the two inclusions (3) and (4) hold. Thus, combining Theorems 1 and 2, we immediately obtain the main result of this paper.
Theorem 3. 
Let M, N, and K be three positive-definite Hermitian matrices. Let A C m × n and B C n × k . Let a 1 , a 2 , a 3 and a 4 be the same as in Theorem 1. Then, the following statements are equivalent:
(1) $B\{1,3N,4K\}\,A\{1,3M,4N\} = (AB)\{1,3M,4K\}$;
(2) $r(AB) = \max\left\{ \min\{a_1, a_2\},\ \min\{k - r(B),\, m - r(A)\} + a_3 + a_4 - n \right\}$, $r(B) = \min\{a_1 + m - r(A),\, a_3\}$ and $r(A) = \min\{a_2 + k - r(B),\, a_4\}$.
Corollary 1. 
Let A C m × n and B C n × k . Then, the following statements are equivalent:
(1) $B\{1,3,4\}\,A\{1,3,4\} = (AB)\{1,3,4\}$;
(2) $r(AB) = \max\left\{ \min\{b_1, b_2\},\ \min\{k - r(B),\, m - r(A)\} + b_3 + b_4 - n \right\}$, $r(B) = \min\{b_1 + m - r(A),\, b_3\}$ and $r(A) = \min\{b_2 + k - r(B),\, b_4\}$,
where
\[
b_1 = r\begin{pmatrix} B^{*}B & B^{*}A^{*} \\ B^{*}A^{*}AB & B^{*}A^{*}AA^{*} \end{pmatrix}, \qquad b_3 = r\begin{pmatrix} B^{*} \\ B^{*}A^{*}A \end{pmatrix}
\]
and
\[
b_2 = r\begin{pmatrix} AA^{*} & ABB^{*}A^{*} \\ B^{*}A^{*} & B^{*}BB^{*}A^{*} \end{pmatrix}, \qquad b_4 = r\begin{pmatrix} A & ABB^{*} \end{pmatrix}.
\]
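As a sanity check of the unweighted case in Corollary 1, take the classical example $B = A^{*}$, for which the reverse order law for the Moore–Penrose inverse is known to hold. The NumPy sketch below (illustrative, not part of the paper) confirms that $B^{\dagger}A^{\dagger}$ is indeed a $\{1,3,4\}$-inverse of $AB$ in this case:

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((5, 3))
B = A.T            # classical case B = A*, where the reverse order law holds
AB = A @ B

# Unweighted case M = N = K = I: is B† A† a {1,3,4}-inverse of AB?
X = np.linalg.pinv(B) @ np.linalg.pinv(A)
print(np.allclose(AB @ X @ AB, AB))     # condition (1)
print(np.allclose((AB @ X).T, AB @ X))  # condition (3)
print(np.allclose((X @ AB).T, X @ AB))  # condition (4)
```

For generic rectangular $A$ and unrelated $B$, one or more of these conditions typically fails, in line with the rank conditions of Corollary 1.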

3. Conclusions

In this article, the reverse order law for the weighted generalized inverses of a matrix product was studied using the maximal and minimal ranks of generalized Schur complements. The resulting rank conditions can serve as a useful tool in algorithms for solving weighted least squares problems involving matrix equations.

Author Contributions

B.Q. and Z.X.: resources; Y.Q.: conceptualization and writing—review and editing. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the project for characteristic innovation of 2018 Guangdong University (No. 2018KTSCX234) and the Guangdong Basic and Applied Basic Research Foundation (No. 2025A1515012526).

Institutional Review Board Statement

This study did not involve human or animal subjects.

Informed Consent Statement

Not applicable.

Data Availability Statement

No new data were created or analyzed in this study. Data sharing is not applicable to this article.

Acknowledgments

The authors wish to thank the anonymous referees of this paper.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Hernandez, A.; Lattanz, M.; Thome, N. On a partial order defined by the weighted Moore–Penrose inverse. Appl. Math. Comput. 2013, 219, 7310–7318. [Google Scholar] [CrossRef]
  2. Wang, G.R.; Wei, Y.M.; Qiao, S.Z. Generalized Inverse: Theory and Computations; Science Press: Beijing, China, 2004. [Google Scholar]
  3. Wei, M. Theory and Calculation of Generalized Least Squares Problem; Science Press: Beijing, China, 2006. [Google Scholar]
  4. Ben-Israel, A.; Greville, T.N.E. Generalized Inverses: Theory and Applications, 2nd ed.; Springer: New York, NY, USA, 2002. [Google Scholar]
  5. Campbell, S.L.; Meyer, C.D. Generalized Inverses of Linear Transformations; Dover: New York, NY, USA, 1979. [Google Scholar]
  6. Cvetković-Ilić, D.S.; Milosevic, J. Reverse order laws for {1, 3}-generalized inverses. Linear Multilinear Algebra 2018, 234, 114–117. [Google Scholar]
  7. Pierro, A.R.D.; Wei, M. Reverse order law for reflexive generalized inverses of products of matrices. Linear Algebra Appl. 1998, 277, 299–311. [Google Scholar] [CrossRef]
  8. Djordjević, D.S. Further results on the reverse order law for generalized inverses. SIAM J. Matrix Anal. Appl. 2007, 29, 1242–1246. [Google Scholar]
  9. Greville, T.N.E. Note on the generalized inverse of a matrix product. SIAM Rev. 1966, 8, 518–521. [Google Scholar] [CrossRef]
  10. Hartwig, R.E. The reverse order law revisited. Linear Algebra Appl. 1986, 76, 241–246. [Google Scholar] [CrossRef]
  11. Liu, Y.; Tian, Y. A mixed-type reverse order law for generalized inverse of a triple matrix product. Acta Math. Sin. Chin. Ser. 2009, 52, 197–204. [Google Scholar]
  12. Liu, Q.; Wei, M. Reverse order law for least squares g-inverses of multiple matrix products. Linear Mult. Algebra 2008, 56, 491–506. [Google Scholar] [CrossRef]
  13. Penrose, R. A generalized inverse for matrices. Proc. Camb. Philos. Soc. 1955, 51, 406–413. [Google Scholar] [CrossRef]
  14. Sampazo, R.J.B.; Sun, W.; Yuan, J. On trust region algorithm for nonsmooth optimization. Appl. Math. Comput. 1997, 85, 109–116. [Google Scholar]
  15. Rao, C.R.; Mitra, S.K. Generalized Inverse of Matrices and Its Applications; Wiley: New York, NY, USA, 1971. [Google Scholar]
  16. Tian, Y. Reverse order laws for the generalized inverse of multiple matrix products. Linear Algebra Appl. 1994, 211, 85–100. [Google Scholar] [CrossRef]
  17. Tian, Y. More on maximal and minimal ranks of Schur complements with applications. Appl. Math. Comput. 2004, 152, 675–692. [Google Scholar] [CrossRef]
  18. Wei, M. Reverse order laws for generalized inverse of multiple matrix products. Linear Algebra Appl. 1999, 293, 273–288. [Google Scholar] [CrossRef]
  19. Sitha, B.; Behera, R.; Sahoo, J.K.; Mohapatra, R.N.; Stanimirović, P.; Stupina, A. Characterizations of weighted generalized inverses. Aequat. Math. 2025, 1–36. [Google Scholar] [CrossRef]
  20. Kyrchei, I. Weighted singular value decomposition and determinantal representations of the quaternion weighted Moore-Penrose inverse. Appl. Math. Comput. 2017, 309, 1–16. [Google Scholar] [CrossRef]
  21. Nikolov, J.; Cvetković-Ilić, D.S. Reverse order laws for the weighted generalized inverses. Appl. Math. Lett. 2011, 24, 2140–2145. [Google Scholar] [CrossRef]
  22. Sun, W.; Wei, Y. Inverse order rule for weighted generalized inverse. SIAM J. Matrix Anal. Appl. 1998, 19, 772–775. [Google Scholar] [CrossRef]
  23. Sun, W.; Wei, Y. Triple reverse-order rule for weighted generalized inverses. Appl. Math. Comput. 2002, 125, 221–229. [Google Scholar]
  24. Tian, Y. A family of 512 reverse order laws for generalized inverse of a matrix product: A review. Heliyon 2020, 6, e04924. [Google Scholar] [CrossRef]
  25. Tian, Y. Characterizations of matrix equalities involving the sums and products of multiple matrices and their generalized inverse. Electron. Res. Arch. 2023, 31, 5866–5893. [Google Scholar] [CrossRef]
  26. Xiong, Z.P.; Liu, Z.S. Applications of completions of operator matrices to some properties of operator products on Hilbert spaces. Complex Anal. Oper. Theory 2018, 12, 123–140. [Google Scholar] [CrossRef]
  27. Xiong, Z.P.; Zheng, B. The reverse order laws for {1, 2, 3}- and {1, 2, 4}-inverses of two matrix product. Appl. Math. Lett. 2008, 21, 649–655. [Google Scholar] [CrossRef]
  28. Zheng, B.; Xiong, Z.P. On reverse order laws for the weighted generalized inverse. Arab. J. Sci. Eng. 2008, 34, 195–203. [Google Scholar]
  29. Cohen, N.; Johnson, C.R.; Rodman, L.; Woerdeman, H.J. Ranks of completions of partial matrices. Oper. Theory Adv. Appl. 1989, 40, 165–185. [Google Scholar]

Share and Cite

Qin, Y.; Qiu, B.; Xiong, Z. The Reverse Order Law for the {1,3M,4N}—The Inverse of Two Matrix Products. Axioms 2025, 14, 344. https://doi.org/10.3390/axioms14050344

