The Forward Order Law for Least Squares g-Inverses of Multiple Matrix Products

Abstract: The generalized inverse has many important applications in matrix theory and statistics. One of the core problems concerning generalized inverses is finding necessary and sufficient conditions for the forward order laws for the generalized inverse of a matrix product. In this paper, by using the extremal ranks of the generalized Schur complement, we obtain some necessary and sufficient conditions for the forward order laws A_1{1, 3}A_2{1, 3} · · · A_n{1, 3} ⊆ (A_1A_2 · · · A_n){1, 3} and A_1{1, 4}A_2{1, 4} · · · A_n{1, 4} ⊆ (A_1A_2 · · · A_n){1, 4}.


Introduction
Throughout this paper, C^{m×n} denotes the set of m × n matrices with complex entries, and C^m denotes the set of m-dimensional complex vectors. I_m denotes the identity matrix of order m, and O_{m×n} is the m × n matrix of all zero entries (if no confusion occurs, we will drop the subscripts). For a matrix A ∈ C^{m×n}, A*, r(A), R(A), and N(A) denote the conjugate transpose, the rank, the range space, and the null space of A, respectively. Furthermore, for the sake of simplicity in the later discussion, we adopt the following notation for the matrix products with A_i ∈ C^{m×m}, i = 0, 1, 2, · · · , n − 1.

Let A ∈ C^{m×n}. A generalized inverse X ∈ C^{n×m} of A is a matrix that satisfies some of the following four Penrose equations [1]:

(1) AXA = A,  (2) XAX = X,  (3) (AX)* = AX,  (4) (XA)* = XA.

For a subset η ⊆ {1, 2, 3, 4}, the set of n × m matrices satisfying the equations indexed by η is denoted by A{η}. A matrix from A{η} is called an {η}-inverse of A and is denoted by A^(η). For example, an n × m matrix X from the set A{1} is called a {1}-inverse of A and is denoted by X = A^(1). Any {1, 3}-inverse of A is usually denoted by A^(1,3) and is also called a least squares g-inverse of A. Any {1, 4}-inverse from the set A{1, 4} is denoted by A^(1,4) and is also called a minimum norm g-inverse of A. The unique {1, 2, 3, 4}-inverse of A is denoted by A†, which is also called the Moore–Penrose inverse of A. We refer the reader to [2,3] for basic results on generalized inverses.

Theory and computations of the reverse (forward) order laws for generalized inverses are important in many branches of applied science, such as non-linear control theory [4], matrix analysis [2,5], statistics [4,6], and numerical linear algebra; see [3,6].
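As a numerical illustration (ours, not part of the original development), the four Penrose equations can be checked directly with NumPy; the parametrization A† + (I − A†A)W of the set A{1,3}, for arbitrary W, is the standard one:

```python
import numpy as np

rng = np.random.default_rng(0)

# A rank-deficient matrix, so A has many {1,3}-inverses besides A†.
A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])          # rank 1

X = np.linalg.pinv(A)               # the Moore-Penrose inverse A† satisfies all four equations
assert np.allclose(A @ X @ A, A)                      # (1) AXA = A
assert np.allclose(X @ A @ X, X)                      # (2) XAX = X
assert np.allclose((A @ X).conj().T, A @ X)           # (3) (AX)* = AX
assert np.allclose((X @ A).conj().T, X @ A)           # (4) (XA)* = XA

# Every matrix A† + (I - A†A)W is a {1,3}-inverse of A
# (standard parametrization; W is an arbitrary 2x3 matrix here).
W = rng.standard_normal((2, 3))
X13 = X + (np.eye(2) - X @ A) @ W
assert np.allclose(A @ X13 @ A, A)                    # equation (1) still holds
assert np.allclose((A @ X13).conj().T, A @ X13)       # equation (3) still holds
```

Note that X13 need not satisfy equations (2) or (4), which is exactly why A{1, 3} is in general a large set rather than a single matrix.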
The least squares (LS) problem min_x ‖Ax − b‖_2, with A ∈ C^{m×n} and b ∈ C^m, is used in many practical scientific problems [2,4,6,7]. Any vector of the form x = A^(1,3)b, with A^(1,3) ∈ A{1, 3}, is a solution of the above LS problem. The unique minimal norm least squares solution of the LS problem is x = A†b. One problem concerning the above LS problem is: under what conditions do the following reverse order laws hold?
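As a quick sketch (ours, not from the paper), NumPy confirms that a sampled {1,3}-inverse yields a least squares solution, while A†b is the one of minimal norm:

```python
import numpy as np

rng = np.random.default_rng(1)
A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])   # rank 1, so the LS problem has infinitely many solutions
b = np.array([1.0, 0.0, 1.0])

Apinv = np.linalg.pinv(A)
x_min = Apinv @ b                      # the minimal-norm least squares solution A†b

# A sampled {1,3}-inverse also produces a least squares solution x = A^{(1,3)} b.
W = rng.standard_normal((2, 3))
X13 = Apinv + (np.eye(2) - Apinv @ A) @ W
x_ls = X13 @ b

# Both achieve the same (minimal) residual norm ...
r_min = np.linalg.norm(A @ x_min - b)
r_ls = np.linalg.norm(A @ x_ls - b)
assert np.isclose(r_min, r_ls)

# ... but A†b has the smallest Euclidean norm among all LS solutions.
assert np.linalg.norm(x_min) <= np.linalg.norm(x_ls) + 1e-12
```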
Another problem is: under what conditions do the following forward order laws hold? The reverse order law for the generalized inverse of multiple matrix products yields a class of interesting problems that are fundamental in the theory of the generalized inverse of matrices; see [2,3,5,8,9]. As one of the core problems concerning reverse order laws, determining necessary and sufficient conditions under which the reverse order laws for the generalized inverse of a matrix product hold is useful in both theoretical studies and practical scientific computing. This topic has attracted considerable attention, and many interesting results have been obtained; see [8,10-18].
The forward order law for the generalized inverse of multiple matrix products (4) originally arose in studying the inverse of multiple matrix Kronecker products. Let A_i, i = 1, 2, · · · , n, be n nonsingular matrices; then, the Kronecker product A_1 ⊗ A_2 ⊗ · · · ⊗ A_n is nonsingular as well, and its inverse satisfies the forward order law

(A_1 ⊗ A_2 ⊗ · · · ⊗ A_n)^{−1} = A_1^{−1} ⊗ A_2^{−1} ⊗ · · · ⊗ A_n^{−1}.

However, this so-called forward order law is not necessarily true for the ordinary matrix product, that is, in general,

(A_1A_2 · · · A_n)^{−1} ≠ A_1^{−1}A_2^{−1} · · · A_n^{−1}.

Recently, Z. Liu and Z. Xiong [19,20] studied the forward order law for {1, 3}-inverses of the product of three matrices by using the maximal rank of the generalized Schur complement [21], and obtained some necessary and sufficient conditions. In this paper, we further study this subject, and some necessary and sufficient conditions, in terms of the ranks, the ranges, or the null spaces of the known matrices, are provided for the following forward order laws:

A_1{1, 3}A_2{1, 3} · · · A_n{1, 3} ⊆ (A_1A_2 · · · A_n){1, 3}, (5)

and:

A_1{1, 4}A_2{1, 4} · · · A_n{1, 4} ⊆ (A_1A_2 · · · A_n){1, 4}. (6)

The main tools of the later discussion are the following lemmas.
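A quick numerical check (ours, not the paper's) of both facts: the Kronecker product obeys the forward order law for inverses, while the ordinary product obeys the reverse order law instead.

```python
import numpy as np

rng = np.random.default_rng(2)
A1 = rng.standard_normal((3, 3))
A2 = rng.standard_normal((3, 3))   # generic random matrices, hence nonsingular

# Forward order law for Kronecker products: (A1 ⊗ A2)^{-1} = A1^{-1} ⊗ A2^{-1}.
K = np.kron(A1, A2)
assert np.allclose(np.linalg.inv(K),
                   np.kron(np.linalg.inv(A1), np.linalg.inv(A2)))

# For the ordinary product the forward order fails in general:
# (A1 A2)^{-1} equals A2^{-1} A1^{-1} (reverse order), not A1^{-1} A2^{-1}.
lhs = np.linalg.inv(A1 @ A2)
assert np.allclose(lhs, np.linalg.inv(A2) @ np.linalg.inv(A1))
assert not np.allclose(lhs, np.linalg.inv(A1) @ np.linalg.inv(A2))
```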

Lemma 2 ([5]). Let A ∈ C^{m×n}, B ∈ C^{m×k}, and C ∈ C^{p×n}. Then:

(1) r([A, B]) = r(A) + r(E_A B) = r(B) + r(E_B A),
(2) r([A; C]) = r(A) + r(C F_A) = r(C) + r(A F_C),

where the projectors E_A = I − AA† and F_A = I − A†A, [A, B] denotes the block matrix with A and B side by side, and [A; C] denotes A stacked above C.

Lemma 3. Let L, M be complementary subspaces of C^n, and let P_{L,M} be the projector on L along M. Then: (1) P_{L,M}A = A if and only if R(A) ⊆ L.
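Block-matrix rank equalities of this type, in their standard form r([A, B]) = r(A) + r(E_A B) and r([A; C]) = r(A) + r(C F_A) with E_A = I − AA† and F_A = I − A†A (the form we assume the cited lemma takes), can be verified numerically:

```python
import numpy as np

def rk(M, tol=1e-9):
    """Numerical rank via SVD with an absolute singular-value tolerance."""
    return np.linalg.matrix_rank(M, tol=tol)

rng = np.random.default_rng(3)
A = rng.standard_normal((4, 2)) @ rng.standard_normal((2, 3))  # 4x3, rank 2
B = rng.standard_normal((4, 2))                                # 4x2
C = rng.standard_normal((2, 3))                                # 2x3

EA = np.eye(4) - A @ np.linalg.pinv(A)   # E_A = I - AA†, projector onto N(A*)
FA = np.eye(3) - np.linalg.pinv(A) @ A   # F_A = I - A†A, projector onto N(A)

# r([A, B]) = r(A) + r(E_A B)
assert rk(np.hstack([A, B])) == rk(A) + rk(EA @ B)
# r([A; C]) = r(A) + r(C F_A)
assert rk(np.vstack([A, C])) == rk(A) + rk(C @ FA)
```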

Main Results
In this section, we present some necessary and sufficient conditions for the forward order law (5) by using the maximal ranks of some generalized Schur complement forms. Let S_{μ*} denote the generalized Schur complement defined in (7). For the convenience of the reader, we first give a brief outline of the basic idea. From formula (1) in Lemma 4, we know that the forward order law (5) holds if and only if the rank identity (8) holds. Hence, we can present equivalent conditions for the forward order law (5) once the concrete expression of the maximal rank involved in the identity (8) is derived. The relevant results are included in the following lemma.
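As a rough numerical sketch (ours, not from the paper), membership of a product of sampled {1,3}-inverses in (A_1A_2){1, 3} can be tested directly from the defining equations; the helper names below are ours, and the chosen pair (with A_2 a scalar matrix) is a simple case in which the inclusion (5) does hold.

```python
import numpy as np

rng = np.random.default_rng(4)

def is_13_inverse(A, X, tol=1e-9):
    """Check the defining equations of a {1,3}-inverse: AXA = A and (AX)* = AX."""
    return (np.allclose(A @ X @ A, A, atol=tol)
            and np.allclose((A @ X).conj().T, A @ X, atol=tol))

def random_13_inverse(A, rng):
    """Sample A† + (I - A†A)W, the standard parametrization of A{1,3}."""
    Ap = np.linalg.pinv(A)
    n, m = Ap.shape
    return Ap + (np.eye(n) - Ap @ A) @ rng.standard_normal((n, m))

A1 = np.array([[1.0, 2.0, 0.0],
               [2.0, 4.0, 0.0],
               [0.0, 0.0, 1.0]])   # rank 2, so A1{1,3} contains many matrices
A2 = 2.0 * np.eye(3)              # nonsingular, so A2{1,3} = {A2^{-1}}

# For this pair the inclusion A1{1,3} A2{1,3} ⊆ (A1 A2){1,3} holds:
A = A1 @ A2
for _ in range(20):
    X1 = random_13_inverse(A1, rng)
    X2 = np.linalg.inv(A2)        # the unique {1,3}-inverse of a nonsingular matrix
    assert is_13_inverse(A, X1 @ X2)
```

For generic pairs (A_1, A_2) such a test fails, which is precisely why rank conditions of the kind developed below are needed to characterize when (5) holds.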
Let A_i be as in (1) and S_{μ*} be as in (7).

Proof. By Lemma 1 and formula (2) of Lemma 2, we have:

According to Lemma 1, formula (2) in Lemma 2, and Equations (1) and (9), we have:

Again, by Lemma 1, formula (2) in Lemma 2, and the results in (1) and (10), we have:

Suppose X_0 = I_m. We contend that (12) holds for 2 ≤ i ≤ n − 1. We proceed by induction on i. For i = 2, the equality (12) follows from (11). Assume that (12) is true for i − 1 (i ≥ 3), that is:

Next, we prove that (12) is also true for i. In fact, by Lemma 1, formula (2) in Lemma 2, and the results in (13) and (1), we have:

That is, the equality (12) has been proven. In particular, when i = n − 1, we have:

Combining (15) with Lemma 2, we finally have:

From Lemma 5, Lemma 2, and Lemma 3, we immediately obtain the following theorem by Equation (8).
Then, the following statements are equivalent:

Proof.
By Lemma 4, we know that X ∈ A{1, 4} if and only if X* ∈ A*{1, 3}. Therefore, from the results obtained in Theorem 1, we can derive necessary and sufficient conditions for the forward order law (6), and hence we state the following theorem without proof.
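This duality can be checked numerically (our sketch); the parametrization A† + W(I − AA†) of A{1, 4}, for arbitrary W, is the standard one:

```python
import numpy as np

rng = np.random.default_rng(5)
A = rng.standard_normal((4, 2)) @ rng.standard_normal((2, 3))   # 4x3, rank 2

# A {1,4}-inverse of A via the standard parametrization A† + W(I - AA†).
Ap = np.linalg.pinv(A)
W = rng.standard_normal((3, 4))
X = Ap + W @ (np.eye(4) - A @ Ap)

assert np.allclose(A @ X @ A, A)                 # (1) AXA = A
assert np.allclose((X @ A).conj().T, X @ A)      # (4) (XA)* = XA

# Duality: X ∈ A{1,4} exactly when X* ∈ A*{1,3}.
B, Y = A.conj().T, X.conj().T
assert np.allclose(B @ Y @ B, B)                 # (1) for B = A*
assert np.allclose((B @ Y).conj().T, B @ Y)      # (3) for B = A*
```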