Norm retrieval and phase retrieval by projections

We make a detailed study of norm retrieval. We give several classification theorems for norm retrieval and give a large number of examples to go with the theory. One consequence is a new result about Parseval frames: If a Parseval frame is divided into two subsets with spans $W_1,W_2$ and $W_1 \cap W_2=\{0\}$, then $W_1 \perp W_2$.


Introduction
Signal reconstruction is an important problem in engineering and has a wide variety of applications. Recovering signals when there is partial loss of information is a significant challenge. Partial loss of phase information occurs in application areas such as speech recognition [4,17,18], and optics applications such as X-ray crystallography [3,13,14], and there is a need to do phase retrieval efficiently. The concept of phase retrieval for Hilbert space frames was introduced in 2006 by Balan, Casazza, and Edidin [2], and since then it has become an active area of research in signal processing and harmonic analysis.
Phase retrieval has been defined for vectors as well as for projections and in general deals with recovering the phase of a signal given its intensity measurements from a redundant linear system. Phase retrieval by projections, where the signal is projected onto some higher dimensional subspaces and has to be recovered from the norms of the projections of the vectors onto the subspaces, appears in real life problems such as crystal twinning [12]. We refer the reader to [8] for a detailed study of phase retrieval by projections.
Another related problem is that of phaseless reconstruction, where the unknown signal is reconstructed from the intensity measurements. Recently, the terms phase retrieval and phaseless reconstruction have been used interchangeably, although it is not clear from their respective definitions that the two notions are equivalent. In [5], the authors proved the equivalence of phase retrieval and phaseless reconstruction in both the real and the complex case. Due to this equivalence, in this paper we restrict ourselves to proving results regarding phase retrieval. Further, a weaker notion of phase retrieval and phaseless reconstruction was introduced in [6]. (The first, second and fourth authors were supported by NSF DMS 1609760; NSF ATD 1321779; and ARO W911NF-16-1-0008. Part of this research was carried out while the first and fourth authors were visiting the Hong Kong University of Science and Technology on a grant from ICERM, the Institute for Computational and Experimental Research in Mathematics.)
In this work, we consider the notion of norm retrieval, recently introduced by Bahmanpour et al. in [1], which is the problem of retrieving the norm of a vector given the absolute values of its intensity measurements. Norm retrieval arises naturally from phase retrieval when one utilizes both a collection of subspaces and their orthogonal complements. Here we study norm retrieval and certain classifications of it. We use projections to do norm retrieval and to extend certain results from [16] for frames. We provide a complete classification of the subspaces of $\mathbb{R}^N$ which do norm retrieval. Various examples for phase and norm retrieval by projections are given. Further, a classification of norm retrieval using Naimark's theorem is also obtained.
We organize the rest of the paper as follows. In Section 2, we include basic definitions and results on phase retrieval. Section 3 introduces norm retrieval and its properties. Section 4 gives the relationship between phase retrieval and norm retrieval, along with related results. Detailed classifications of vectors and subspaces which do norm retrieval are provided in Section 5.

Preliminaries
We denote by $\mathcal{H}^N$ an $N$-dimensional real or complex Hilbert space, and we write $\mathbb{R}^N$ or $\mathbb{C}^N$ when it is necessary to differentiate between the two explicitly. Below, we give the definition of a frame in $\mathcal{H}^N$.

Definition. A family of vectors $\Phi = \{\varphi_i\}_{i=1}^M$ in $\mathcal{H}^N$ is a frame if there are constants $0 < A \leq B < \infty$ so that for all $x \in \mathcal{H}^N$,
$$A\|x\|^2 \leq \sum_{i=1}^M |\langle x, \varphi_i\rangle|^2 \leq B\|x\|^2. \tag{1}$$
The following definitions and terms are useful in the sequel.
• The constants $A$ and $B$ are called the lower and upper frame bounds of the frame, respectively.
• If $A = B$, the frame is called an $A$-tight frame (or a tight frame). In particular, if $A = B = 1$, the frame is called a Parseval frame.
• $\Phi$ is an equal norm frame if $\|\varphi_i\| = \|\varphi_j\|$ for all $i, j$, and is called a unit norm frame if $\|\varphi_i\| = 1$ for all $i = 1, 2, \cdots, M$.
• If only the right hand side inequality holds in (1), the frame is called a $B$-Bessel family with Bessel bound $B$.
Note that in a finite dimensional setting, a frame is simply a spanning set of vectors in the Hilbert space. We refer to [10] for an introduction to Hilbert space frame theory and applications.
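Since the optimal frame bounds are the extreme eigenvalues of the frame operator, tightness is easy to check numerically. A minimal sketch, assuming numpy; the `frame_bounds` helper and the Mercedes-Benz frame below are illustrative choices, not taken from the paper:

```python
import numpy as np

def frame_bounds(Phi):
    # Rows of Phi are the frame vectors; the frame operator is S = Phi^T Phi,
    # and the optimal frame bounds A, B are its smallest/largest eigenvalues.
    eigs = np.linalg.eigvalsh(Phi.T @ Phi)
    return eigs[0], eigs[-1]

# Mercedes-Benz frame: three equiangular unit vectors in R^2.
angles = 2 * np.pi * np.arange(3) / 3
mb = np.column_stack([np.cos(angles), np.sin(angles)])
A, B = frame_bounds(mb)
print(round(A, 6), round(B, 6))  # 1.5 1.5
```

Since $A = B = 3/2$, this frame is $3/2$-tight, and dividing each vector by $\sqrt{3/2}$ yields a Parseval frame.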
The analysis operator associated with $\Phi$ is the operator $T : \mathcal{H}^N \to \ell_2^M$ defined by $Tx = \{\langle x, \varphi_i\rangle\}_{i=1}^M = \sum_{i=1}^M \langle x, \varphi_i\rangle e_i$. Here, $\{e_i\}_{i=1}^M$ is understood to be the natural orthonormal basis for $\ell_2^M$. The adjoint $T^*$ of the analysis operator $T$ is called the synthesis operator of the frame $\Phi$. It can be shown that $T^*(e_i) = \varphi_i$.
The frame operator for the frame $\Phi$ is defined as $S := T^*T : \mathcal{H}^N \to \mathcal{H}^N$, so that $Sx = \sum_{i=1}^M \langle x, \varphi_i\rangle \varphi_i$. Note that the frame operator $S$ is a positive, self-adjoint and invertible operator satisfying the operator inequality $AI \leq S \leq BI$, where $A$ and $B$ are the frame bounds and $I$ denotes the identity on $\mathcal{H}^N$. Frame operators play an important role since they are used to reconstruct the vectors in the space. To be precise, any $x \in \mathcal{H}^N$ can be written as
$$x = SS^{-1}x = \sum_{i=1}^M \langle S^{-1}x, \varphi_i\rangle \varphi_i = \sum_{i=1}^M \langle x, S^{-1}\varphi_i\rangle \varphi_i.$$
The frame operator of a Parseval frame is the identity operator. Thus, if $\Phi$ is Parseval, then $x = \sum_{i=1}^M \langle x, \varphi_i\rangle \varphi_i$ for every $x \in \mathcal{H}^N$. We concentrate on norm retrieval and its classifications in this paper. We now state the basic definitions of phase retrieval formally, starting with phase retrieval by projections. Throughout the paper, the term projection is used to describe the orthogonal projection (orthogonal idempotent operator) onto a subspace.
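These operators translate directly into matrix computations. A minimal numerical sketch of the analysis operator, the frame operator, and the reconstruction formula, assuming numpy; the random frame and seed are illustrative, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
Phi = rng.standard_normal((5, 3))        # five frame vectors (rows) spanning R^3
x = rng.standard_normal(3)

Tx = Phi @ x                             # analysis operator: Tx = (<x, phi_i>)_i
S = Phi.T @ Phi                          # frame operator S = T*T
x_rec = np.linalg.solve(S, Phi.T @ Tx)   # x = S^{-1} sum_i <x, phi_i> phi_i
print(np.allclose(x, x_rec))             # True
```

Here `Phi.T @ Tx` computes $\sum_i \langle x, \varphi_i\rangle \varphi_i = Sx$, so applying $S^{-1}$ recovers $x$, exactly as in the reconstruction formula above.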
Let $\{W_i\}_{i=1}^M$ be a collection of subspaces of $\mathcal{H}^N$ and let $\{P_i\}_{i=1}^M$ be the projections onto each of these subspaces. We say that $\{W_i\}_{i=1}^M$ (or $\{P_i\}_{i=1}^M$) yields phase retrieval if for all $x, y \in \mathcal{H}^N$ satisfying $\|P_i x\| = \|P_i y\|$ for all $i = 1, 2, \cdots, M$, we have $x = cy$ for some scalar $c$ with $|c| = 1$. Phase retrieval by vectors is a particular case of the above.
A family of vectors $\Phi = \{\varphi_i\}_{i=1}^M$ yields phase retrieval if the one-dimensional subspaces $\{\operatorname{span}\,\varphi_i\}_{i=1}^M$ do, that is, if $|\langle x, \varphi_i\rangle| = |\langle y, \varphi_i\rangle|$ for all $i$ implies $x = cy$ with $|c| = 1$. Orthonormal bases fail to do phase retrieval: for an orthonormal basis $\{e_i\}_{i=1}^N$, the vectors $x = e_1 + e_2$ and $y = e_1 - e_2$ produce identical measurements $|\langle x, e_i\rangle| = |\langle y, e_i\rangle|$, yet $x \neq cy$ for any unimodular scalar $c$. One of the fundamental properties used to identify the minimum number of vectors required to do phase retrieval is the complement property: $\Phi = \{\varphi_i\}_{i=1}^M$ has the complement property if for every $I \subseteq [M]$, either $\{\varphi_i\}_{i\in I}$ or $\{\varphi_i\}_{i\in I^c}$ spans $\mathcal{H}^N$.
It is proved in [2] that phase retrieval is equivalent to the complement property in $\mathbb{R}^N$. Further, it is proved that a generic family of $(2N-1)$ vectors in $\mathbb{R}^N$ does phase retrieval, while no family of $(2N-2)$ vectors can. Here, generic refers to an open dense subset of the set of $(2N-1)$-element frames in $\mathbb{R}^N$. Full spark is another important notion for vectors in frame theory. A formal definition is given below: for a family $\Phi = \{\varphi_i\}_{i=1}^M$ in $\mathcal{H}^N$, the spark of $\Phi$ is defined as the cardinality of the smallest linearly dependent subset of $\Phi$. When $\operatorname{spark}(\Phi) = N + 1$, every subset of size $N$ is linearly independent, and in that case $\Phi$ is said to be full spark.
Note from the definitions that full spark frames with M ≥ 2N − 1 have the complement property and hence do phase retrieval. Moreover, if M = 2N − 1 then the complement property clearly implies full spark.
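For small examples, full spark can be checked by brute force over all $N$-element subsets. A sketch, assuming numpy; the `is_full_spark` helper and the random frame are illustrative, not from the paper:

```python
import numpy as np
from itertools import combinations

def is_full_spark(Phi):
    # Phi is M x N with M >= N; full spark means every N of the M rows
    # are linearly independent, i.e. spark(Phi) = N + 1.
    M, N = Phi.shape
    return all(np.linalg.matrix_rank(Phi[list(idx)]) == N
               for idx in combinations(range(M), N))

# A generic frame of 2N - 1 = 5 vectors in R^3 is full spark, hence has the
# complement property and does phase retrieval.
rng = np.random.default_rng(1)
Phi = rng.standard_normal((5, 3))
print(is_full_spark(Phi))
```

A Gaussian random frame is full spark with probability one, so the check above should print `True`; replacing a row by a copy of another row makes it fail.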
The next result, known as Naimark's theorem, characterizes Parseval frames in a finite dimensional Hilbert space. This theorem provides a way to construct Parseval frames, and crucially it is the only way to obtain Parseval frames. Later, we use it to obtain a classification of frames which do norm retrieval. The notation $[M] = \{1, 2, \cdots, M\}$ is used throughout the paper.

Theorem 2.6 (Naimark's Theorem). A frame $\{\varphi_i\}_{i=1}^M$ for $\mathcal{H}^N$ is a Parseval frame if and only if there is a larger Hilbert space $\mathcal{H}^M \supseteq \mathcal{H}^N$ with an orthonormal basis $\{e_i\}_{i=1}^M$ so that the orthogonal projection $P$ of $\mathcal{H}^M$ onto $\mathcal{H}^N$ satisfies $Pe_i = \varphi_i$ for all $i \in [M]$.

Beginnings of Norm Retrieval
In this section, we provide the definition of norm retrieval along with certain related results, and pertinent examples.
Definition 3.1. Let $\{W_i\}_{i=1}^M$ be a collection of subspaces of $\mathcal{H}^N$ and let $\{P_i\}_{i=1}^M$ be the orthogonal projections onto each of these subspaces. We say that $\{W_i\}_{i=1}^M$ (or $\{P_i\}_{i=1}^M$) yields norm retrieval if for all $x, y \in \mathcal{H}^N$ satisfying $\|P_i x\| = \|P_i y\|$ for all $i = 1, 2, \cdots, M$, we have $\|x\| = \|y\|$.

In particular, a set of vectors $\{\varphi_i\}_{i=1}^M$ in $\mathcal{H}^N$ does norm retrieval if $|\langle x, \varphi_i\rangle| = |\langle y, \varphi_i\rangle|$ for all $i = 1, 2, \cdots, M$ implies $\|x\| = \|y\|$.
Remark 3.2. It is immediate that a family of vectors doing phase retrieval does norm retrieval.
An obvious choice of vectors which do norm retrieval are orthonormal bases.
The following theorem provides a sufficient condition under which the subspaces spanned by the canonical basis vectors do norm retrieval.
It is easy to see that tight frames do norm retrieval.
Theorem 3.4. Tight frames do norm retrieval.
Proof. If $\Phi = \{\varphi_i\}_{i=1}^M$ is an $A$-tight frame, then $A\|\psi\|^2 = \sum_{i=1}^M |\langle \psi, \varphi_i\rangle|^2$ for any $\psi \in \mathcal{H}^N$, so the measurements determine the norm. This is generalized in the following proposition.
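The tight-frame case is easy to verify numerically. A minimal sketch, assuming numpy, with the standard Mercedes-Benz frame as the tight frame (an illustrative choice, not from the paper):

```python
import numpy as np

# Mercedes-Benz frame in R^2: an A-tight frame with A = 3/2.
angles = 2 * np.pi * np.arange(3) / 3
Phi = np.column_stack([np.cos(angles), np.sin(angles)])

rng = np.random.default_rng(2)
x = rng.standard_normal(2)
meas = np.abs(Phi @ x)                       # only |<x, phi_i>| is observed
norm_est = np.sqrt((meas ** 2).sum() / 1.5)  # A ||x||^2 = sum_i |<x, phi_i>|^2
print(np.isclose(norm_est, np.linalg.norm(x)))  # True
```

The phases of the measurements are discarded, yet the norm is recovered exactly from the tightness identity.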
Proposition. If a family of vectors in $\mathcal{H}^N$ contains an orthonormal basis, then it does norm retrieval.

Proposition. Let $\{e_i\}_{i=1}^N$ be an orthonormal basis for $\mathcal{H}^N$ and let $P_i$ be the projections onto the hyperplanes $W_i = \operatorname{span}\{e_j\}_{j\neq i}$. Then $\{P_i\}_{i=1}^N$ does norm retrieval, since $\sum_{i=1}^N \|P_i x\|^2 = (N-1)\|x\|^2$ for every $x \in \mathcal{H}^N$.

The above proposition does not hold if the number of hyperplanes is strictly less than $N$. This is proved in the next theorem.
Now, we strengthen the above result by not requiring the vectors to be orthogonal. To prove this, we need the following lemma.
Proof. We proceed by induction on $N$, the case $N = 2$ being obvious. So assume the result holds for $N - 1$. As $\lambda$ varies from $-\infty$ to $+\infty$, the right hand side varies from $-\infty$ to $+\infty$, and so equality holds for some $\lambda$.

Proof. Let $P_i$ be the projection onto $W_i$ and choose $x$ and $y$ as in the lemma. But $\|x\| = 1$ while $\|y\| \neq 1$, and so norm retrieval fails.
However, in the following theorem, we show that three proper subspaces of codimension one can do norm retrieval in $\mathbb{R}^N$.

Theorem 3.9. In $\mathbb{R}^N$, three proper subspaces of codimension one can do norm retrieval.
It follows that in $\mathbb{R}^3$, two 2-dimensional subspaces cannot do norm retrieval, but three 2-dimensional subspaces can.
The following proposition shows a relationship between subspaces doing norm retrieval and the sum of the dimensions of the subspaces. The importance of this proposition is that we are looking for conditions on subspaces to do norm retrieval, and the dimensions of the subspaces are one of the tools we have.
Proof. If $\sum_{i=1}^M \dim W_i < N$, then we may pick a non-zero $x \perp W_i$ for each $i$, so that $P_i x = 0$ for all $i$ and therefore $\{W_i\}_{i=1}^M$ fails norm retrieval. For the moreover part, let $\{g_i\}_{i=1}^N$ be an orthonormal basis. We represent this basis $L$ times as a multiset $\{g_1, \cdots, g_N, g_1, \cdots, g_N, \cdots, g_1, \cdots, g_N\}$ and index it by $[LN]$. We may pick a partition $\{I_i\}_{i=1}^M$ of $[LN]$ with $|I_i| = k_i$ and let $W_i$ be spanned by the basis vectors indexed by $I_i$; then $\sum_{i=1}^M \|P_i x\|^2 = L\|x\|^2$ for every $x$. Hence the result.
As we have seen, the above proposition may fail if $\sum_{i=1}^M k_i \neq LN$.

Phase Retrieval and Norm Retrieval
In this section, we provide results relating phase retrieval and norm retrieval. The following theorem of Edidin [11] is significant in phase retrieval as it gives a necessary and sufficient condition for subspaces to do phase retrieval.

Theorem 4.1 ([11]). Let $\{W_i\}_{i=1}^M$ be subspaces of $\mathbb{R}^N$ with corresponding projections $\{P_i\}_{i=1}^M$. Then $\{W_i\}_{i=1}^M$ does phase retrieval if and only if for every $0 \neq x \in \mathbb{R}^N$, the collection $\{P_i x\}_{i=1}^M$ spans $\mathbb{R}^N$.
Proof. If not, pick a non-zero $x \perp W_i^\perp$ for all $i \in I^c$. This implies $x \in \cap_{i\in I^c} W_i$, and therefore $\{P_i x\}_{i=1}^M$ contains at most $N - 1$ distinct vectors and cannot span $\mathbb{R}^N$. This contradicts Theorem 4.1.
Proof. If $\{W_i^\perp\}_{i=1}^M$ does not span, then there exists $0 \neq x \in \cap_{i=1}^M W_i$. So $P_i x = x$ for all $i = 1, 2, \cdots, M$, and hence $\{P_i x\}_{i=1}^M$ does not span. Thus, by Theorem 4.1, $\{W_i\}_{i=1}^M$ does not do phase retrieval.
The following example shows that it is possible for subspaces to do norm retrieval even if $\{W_i^\perp\}_{i=1}^M$ does not span the space, which we see as one of the main differences between phase retrieval and norm retrieval.
Let $\{e_i\}_{i=1}^3$ be an orthonormal basis for $\mathbb{R}^3$.

Any collection of subspaces which does phase retrieval yields norm retrieval, which follows immediately from the definitions. However, the converse need not hold. For instance, any orthonormal basis does norm retrieval in $\mathbb{R}^N$, but it has too few vectors to do phase retrieval, since at least $2N - 1$ vectors are required to do phase retrieval in $\mathbb{R}^N$.
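This gap between norm retrieval and phase retrieval for an orthonormal basis can be seen concretely. A small numerical sketch, assuming numpy; the vectors are illustrative choices, not from the paper:

```python
import numpy as np

# In the standard orthonormal basis of R^3, x and y give identical measurements
# |<., e_i>|, so the basis retrieves the norm; but y is not a unimodular scalar
# multiple of x, so the basis fails phase retrieval.
x = np.array([1.0, 1.0, 0.0])
y = np.array([1.0, -1.0, 0.0])
same_meas = np.allclose(np.abs(x), np.abs(y))
same_norm = np.isclose(np.linalg.norm(x), np.linalg.norm(y))
up_to_sign = np.allclose(x, y) or np.allclose(x, -y)
print(same_meas, same_norm, up_to_sign)  # True True False
```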
Given subspaces $\{W_i\}_{i=1}^M$ of $\mathcal{H}^N$ which yield phase retrieval, it is not necessarily true that $\{W_i^\perp\}_{i=1}^M$ do phase retrieval. The following result shows that norm retrieval is exactly the condition needed to pass phase retrieval to orthogonal complements. Though the result is already proved in [1], we include it here for completeness.
does norm retrieval.
Proof. Assume that $\|(I - P_i)x\| = \|(I - P_i)y\|$ for all $i = 1, 2, \cdots, M$ and that $\{I - P_i\}_{i=1}^M$ does norm retrieval, i.e. $\|x\| = \|y\|$. Then
$$\|P_i x\|^2 = \|x\|^2 - \|(I - P_i)x\|^2 = \|y\|^2 - \|(I - P_i)y\|^2 = \|P_i y\|^2.$$
Since $\|x\| = \|y\|$, we have $\|P_i x\| = \|P_i y\|$ for all $i = 1, 2, \cdots, M$.
Since {P i } M i=1 does phase retrieval, it follows that x = cy for some |c| = 1. The other direction of the theorem is clear.
Next is an example of a family of subspaces $\{W_i\}_{i=1}^M$ which does phase retrieval but whose complements fail phase retrieval, and hence, by the previous theorem, fail norm retrieval [8].
Had the complements done norm retrieval, we could conclude that they do phase retrieval as well, which follows from Lemma 4.5.
The next result gives a sufficient condition for subspaces to do norm retrieval: it is enough to check that the identity operator is in the linear span of the projections. A similar result in the case of phase retrieval is proved in [7].
Proposition. Let $\{W_i\}_{i=1}^M$ be subspaces of $\mathbb{R}^N$ with projections $\{P_i\}_{i=1}^M$. If there are scalars $\{a_i\}_{i=1}^M$ so that $I = \sum_{i=1}^M a_i P_i$, then $\{W_i\}_{i=1}^M$ does norm retrieval.
Proof. Given $x \in \mathbb{R}^N$, we have
$$\|x\|^2 = \langle x, x\rangle = \Big\langle \sum_{i=1}^M a_i P_i x, x\Big\rangle = \sum_{i=1}^M a_i \|P_i x\|^2.$$
Since for each $i$ the coefficients $a_i$ and the norms $\|P_i x\|$ are known, the collection does norm retrieval.
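The identity in this proof can be checked numerically. A sketch, assuming numpy, using the coordinate-plane projections in $\mathbb{R}^3$ as an illustrative choice (not from the paper), for which $I = \frac{1}{2}\sum_i P_i$:

```python
import numpy as np

# Projections onto the three coordinate planes of R^3 satisfy
# P_1 + P_2 + P_3 = 2I, so I = sum_i a_i P_i with a_i = 1/2.
P = [np.diag(d).astype(float) for d in ([0, 1, 1], [1, 0, 1], [1, 1, 0])]
assert np.allclose(sum(P), 2 * np.eye(3))

rng = np.random.default_rng(3)
x = rng.standard_normal(3)
recovered = 0.5 * sum(np.linalg.norm(Pi @ x) ** 2 for Pi in P)
print(np.isclose(recovered, np.linalg.norm(x) ** 2))  # True
```

Only the norms $\|P_i x\|$ are used on the measurement side, exactly as in the proposition.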
A counterexample for the converse of the above proposition is given in [1], where the authors construct a collection of projections $P_i$ which do phase retrieval but for which $I \notin \operatorname{span}\{P_i\}$. Here, we provide another example of the same kind: a set of five vectors in $\mathbb{R}^3$ which does phase retrieval although the identity operator is not in the span of the corresponding rank-one projections. We need the following theorem, which provides a necessary and sufficient condition for a frame not to be scalable in $\mathbb{R}^3$. Recall that a frame is scalable if its vectors can be rescaled so that the resulting family is a Parseval frame [15]. Later, in the next section, we prove that scalable frames always do norm retrieval. Choose five full spark vectors in the cone referred to in Theorem 4.9. These vectors do phase retrieval and hence norm retrieval in $\mathbb{R}^3$. The next proposition gives a sufficient condition for the complements to do norm retrieval when the subspaces do.
does norm retrieval.
Proof. Observe that
$$\sum_{i=1}^L a_i (I - P_i) = \Big(\sum_{i=1}^L a_i\Big) I - \sum_{i=1}^L a_i P_i = \Big(\sum_{i=1}^L a_i - 1\Big) I,$$
so the identity is in the span of $\{I - P_i\}_{i=1}^L$ whenever $\sum_{i=1}^L a_i \neq 1$. By the previous proposition, this shows $\{I - P_i\}_{i=1}^L$ does norm retrieval.
It is possible that $\sum_i a_i P_i = I = \sum_i b_i P_i$ with $\sum_i a_i = 1$ but $\sum_i b_i \neq 1$, as we will see in the following example.
Let $\{e_i\}_{i=1}^3$ be an orthonormal basis for $\mathbb{R}^3$.

Classification of Norm Retrieval
In this section, we give classifications of norm retrieval by projections. The following theorem from [16] uses the spans of the frame elements to classify norm retrievable frames in $\mathbb{R}^N$.
Next, we prove one of the main results of this paper. It extends Theorem 5.1 and fully classifies the subspaces of $\mathbb{R}^N$ which do norm retrieval.
Then the following are equivalent: given any orthonormal bases $\{\varphi_{i,j}\}_{j=1}^{I_i}$ of $W_i$ and any subcollection, and so $\langle x, y\rangle = 0$.
By (2), we must have $\langle x + y, x - y\rangle = 0$, which implies that $x$ and $y$ have the same norm. The third equivalence is immediate from Theorem 5.1.
Corollary. If $\{\varphi_i\}_{i=1}^M$ does norm retrieval, then so does $\{c_i\varphi_i\}_{i=1}^M$ for any scalars $c_i \neq 0$. Hence all scalable frames do norm retrieval.
Proof. This is an immediate consequence of Theorem 5.2. Observe that the conditions in Theorem 5.2 do not depend on the norm of each vector $\varphi_i$.
For the complex case we have:

Proof. Given $x, y$ as above, $|\langle x + y, \varphi_{ij}\rangle| = |\langle x - y, \varphi_{ij}\rangle|$ for all $(i, j)$.
We use Theorem 5.2 to give a simple proof of a result in [7] which has a very complicated proof in that paper.
Corollary 5.5. If $N$ linearly independent vectors $\{\varphi_i\}_{i=1}^N$ do norm retrieval in $\mathbb{R}^N$, then the vectors are orthogonal.
Proof. Assume $\|\varphi_i\| = 1$ for all $i$, and that some $\varphi_j$ is not orthogonal to $\operatorname{span}\{\varphi_i\}_{i\neq j}$. Choose a unit vector $x \perp \varphi_i$ for all $i \neq j$. Let $y = x - \langle x, \varphi_j\rangle\varphi_j$. Now, $\langle \varphi_j, y\rangle = \langle \varphi_j, x\rangle - \langle x, \varphi_j\rangle\langle \varphi_j, \varphi_j\rangle = 0$.
Let $I = \{i : i \neq j\}$. Then $x \perp \operatorname{span}\{\varphi_i\}_{i\in I}$ and $y \perp \varphi_j$, but $\langle x, y\rangle = \langle x, x\rangle - \langle x, \varphi_j\rangle\langle x, \varphi_j\rangle = 1 - |\langle x, \varphi_j\rangle|^2 \neq 0$, contradicting the theorem.
(2) For $i = 1, 2, \cdots, M$, if $W_1 = \operatorname{span}\{\varphi_i\}_{i\in I}$ and $W_2 = \operatorname{span}\{\varphi_i\}_{i\in I^c}$, then:

Both phase retrieval and norm retrieval are preserved when applying projections to the vectors. Also, phase retrieval is preserved under the application of any invertible operator (refer to [1] for details). This is not the case with norm retrieval in general. We prove this in the next corollary.
Corollary 5.7. Norm retrieval is not preserved under the application of an invertible operator, in general.
Proof. Let $\Phi = \{\varphi_i\}_{i=1}^N$ be linearly independent vectors in $\mathbb{R}^N$ which are not orthogonal. Then by Corollary 5.5, $\Phi$ cannot do norm retrieval. But there exists an invertible operator $T$ on $\mathbb{R}^N$ so that $\{T\varphi_i\}_{i=1}^N$ is an orthonormal basis, and hence does norm retrieval.
However, we note that unitary operators, which are invertible, do preserve norm retrieval.
The following corollary about Parseval frames also holds in the infinite dimensional case with the same proof.
Corollary 5.8. If Φ is a Parseval frame, it does norm retrieval. Hence, if we partition Φ into two disjoint sets, and choose a vector orthogonal to each set, then these vectors are orthogonal.
Proof. Let $\Phi = \{\varphi_i\}_{i\in I}$ be a Parseval frame with analysis operator $T$, and let $J \subseteq I$. If $x \perp \{\varphi_i\}_{i\in J}$ and $y \perp \{\varphi_i\}_{i\in J^c}$, then $Tx = (\langle x, \varphi_i\rangle)_{i\in I}$ and $Ty = (\langle y, \varphi_i\rangle)_{i\in I}$ have no nonzero coordinates in common, so $Tx \perp Ty$. Since the analysis operator of a Parseval frame is an isometry, it preserves inner products, and hence $x \perp y$.
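This orthogonality phenomenon can be observed numerically. A sketch, assuming numpy, with a Naimark-style Parseval frame built by projecting the standard basis of $\mathbb{R}^4$ onto a 3-dimensional subspace; the construction and the `null_vec` helper are illustrative choices, not from the paper:

```python
import numpy as np

def null_vec(A):
    # unit vector spanning the null space of A (assumed one-dimensional), via SVD
    return np.linalg.svd(A)[2][-1]

# Naimark-style Parseval frame: project the standard basis of R^4 onto the
# 3-dimensional subspace W = (1,1,1,1)^perp; the columns of P then form a
# Parseval frame for W.
u = np.ones(4) / 2.0                 # unit normal to W
P = np.eye(4) - np.outer(u, u)       # orthogonal projection onto W
phi = [P[:, i] for i in range(4)]

# Partition the frame into {phi_0, phi_1} and {phi_2, phi_3}; pick x, y in W
# orthogonal to the respective parts. The corollary predicts <x, y> = 0.
x = null_vec(np.vstack([phi[0], phi[1], u]))
y = null_vec(np.vstack([phi[2], phi[3], u]))
print(np.isclose(x @ y, 0.0))  # True
```

Here $x$ is forced to lie in $W$ (by including the normal $u$ in the null-space computation) and to be orthogonal to the first half of the frame; $y$ is built analogously for the second half, and the two come out orthogonal as the corollary predicts.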
A classic result in frame theory is that a Parseval frame $\{\varphi_i\}_{i\in I}$ has the property that if $\varphi_j \notin W = \operatorname{span}\{\varphi_i\}_{i\neq j}$, then $\varphi_j \perp W$. It turns out that a much more general result holds.

Corollary 5.10. If $\Phi = \{\varphi_i\}_{i=1}^M$ is a frame for $\mathbb{R}^N$ with frame operator $S$ which does norm retrieval, then for every $I \subset [M]$, if $x \perp \operatorname{span}\{\varphi_i\}_{i\in I}$ then $x \in \operatorname{span}\{S^{-1}\varphi_i\}_{i\in I^c}$. In particular, if $\Phi$ is a Parseval frame, then $x \in \operatorname{span}\{\varphi_i\}_{i\in I^c}$.
Proof. Given $x$ as in the corollary, the claim follows.

We next provide a classification of norm retrieval using Naimark's theorem. It turns out that every frame can be scaled so as to fit into the setting of Naimark's theorem.

Proof. Let $\{g_i\}_{i=1}^N$ be the eigenbasis of the frame operator with respective eigenvalues $1 = \lambda_1 \geq \lambda_2 \geq \cdots \geq \lambda_N$. For $M + 1 \leq M + i \leq 2M - 1$, define $\varphi_{M+i}$ accordingly. Then $\{\varphi_i\}_{i=1}^{2M-1}$ is a Parseval frame. So $\mathbb{R}^N \subset \ell_2^{2M-1}$ with orthonormal basis $\{e_i\}_{i=1}^{2M-1}$, and the projection down to $\mathbb{R}^N$ satisfies $Pe_i = \varphi_i$ for all $i \in [2M-1]$.
Theorem 5.12. Let $\Phi = \{\varphi_i\}_{i=1}^M$ be a frame for $\mathbb{R}^N$. The following are equivalent:
(1) $\Phi$ does norm retrieval.
(2) Letting $\{e_i\}_{i=1}^{2M-1}$ be an orthonormal basis for $\ell_2^{2M-1}$ with the projection $P$ onto $\mathbb{R}^N$ satisfying $Pe_i = \varphi_i$ for $i = 1, 2, \cdots, M$, knowing $|\langle x, e_i\rangle|$ for $i = 1, 2, \cdots, M$ determines $\|x\|$.
Now, $\Phi$ does norm retrieval if and only if for any $x \in \mathbb{R}^N$, knowing $|\langle x, \varphi_i\rangle|$ gives us $\|x\|$. But $\langle x, \varphi_i\rangle = \langle x, Pe_i\rangle = \langle Px, e_i\rangle = \langle x, e_i\rangle$, since $Px = x$. Thus knowing $|\langle x, e_i\rangle|$ for $i = 1, 2, \cdots, M$ means knowing $\|x\|$. But $\|x\|^2 = \sum_{i=1}^{2M-1} |\langle x, e_i\rangle|^2$.