Distributed Two-Dimensional MUSIC for Joint Range and Angle Estimation with Distributed FMCW MIMO Radars

To estimate the range and angle information of multiple targets, FMCW MIMO radars have been combined with 2D MUSIC algorithms. To improve estimation accuracy, the received signals from multiple FMCW MIMO radars can be collected at a data fusion center and processed coherently, which increases the data communication overhead and implementation complexity. To resolve these issues, we propose a distributed 2D MUSIC algorithm with coordinate transformation, in which the 2D MUSIC algorithm is operated with respect to the reference radar's coordinates at each radar in a distributed way. Rather than forwarding the raw received signal to the fusion center, each radar performs 2D MUSIC on its own received signal in the transformed coordinates. Accordingly, the distributed radars do not need to report all of their measured signals to the data fusion center; instead, they forward their local 2D MUSIC cost function values for the radar image region of interest. The data fusion center can then jointly estimate the range and angle information of the targets from the aggregated cost function. By applying the proposed scheme to experimentally measured data, its performance is also verified in a real-environment test.


Introduction
Recently, frequency-modulated continuous waveform (FMCW) radar has been widely exploited in military and automotive radar sensing/imaging systems [1][2][3][4] because of its advantages in fast processing time and robustness to harsh environmental conditions. Accordingly, to estimate the angle with adequate spatial resolution, FMCW multiple-input multiple-output (MIMO) radar has been studied in [5][6][7][8] (and references therein), in which M_t transmitters and M_r receivers are co-located and orthogonal waveforms are transmitted from the multiple transmit antennas. The reflected signals at the receiver can then be equivalently modeled as the received signal of a virtual array with M_t M_r antenna elements.
To estimate the range and angle information of multiple targets with the FMCW MIMO radar, two-dimensional multiple signal classification (2D MUSIC) algorithms can be exploited [9][10][11][12], in which the covariance matrix of the received signal is estimated for subspace-based signal processing. We note that the 2D MUSIC algorithm is widely exploited to estimate 2D directions of arrival (DOA) [13][14][15][16], where the elevation and azimuth angles are jointly estimated by exploiting a planar array antenna. The 2D MUSIC algorithm can also be exploited to estimate range and angle parameters. In [9], the 2D MUSIC algorithm is combined with an FFT-based parameter estimator to reduce the computational complexity of the high-resolution 2D MUSIC algorithm. In [10], a weighted 2D root-MUSIC algorithm is developed for joint angle-Doppler estimation. In [11], a reduced-dimension MUSIC algorithm for near-field source localization is proposed, and in [12], a cascade angle estimation algorithm exploiting Capon and beamspace MUSIC is developed. In addition, the 2D MUSIC algorithm is exploited to estimate the range and velocity of multiple targets with OFDM radar or to estimate the azimuth and elevation angles of targets [17,18]. Specifically, in [17], the 2D MUSIC algorithm is exploited for joint range-velocity estimation, and in [18], the 2D MUSIC algorithm is combined with compressive sensing to reduce the system complexity in the joint estimation of azimuth and elevation angles. Furthermore, to improve the estimation accuracy, the received signals from multiple FMCW MIMO radars can be collected at a data fusion center and processed coherently [19][20][21], which may increase the data communication overhead and implementation complexity. In [22], by implementing a low-phase-noise FMCW radar, the applicability of highly integrated sensors for cooperative bistatic radar networks is demonstrated.
In this paper, to estimate the range and angle information of multiple targets with distributed FMCW MIMO radars, we propose a distributed 2D MUSIC algorithm with coordinate transformation. Specifically, in the proposed scheme, the coordinates of each FMCW MIMO radar are first transformed into the reference radar's coordinates, and then the 2D MUSIC algorithm is separately applied to the received FMCW signal at each distributed radar with respect to the reference radar's coordinates. Accordingly, rather than reporting the raw received signal to the data fusion center [19][20][21], the distributed radars forward their local 2D MUSIC cost function values for the radar image region of interest, which reduces the data communication burden of the cooperation. Furthermore, because the local cost function values are computed with respect to the same coordinates, their weighted sum can be computed at the data fusion center, and the range and angle information of multiple targets is jointly estimated from the aggregated cost function. Through computer simulations, the estimation performance of the proposed distributed 2D MUSIC is verified. That is, when the proposed distributed 2D MUSIC algorithm is exploited, high-resolution radar images with narrow-width peaks associated with the targets can be achieved. Accordingly, the proposed algorithm shows a lower root mean square error (RMSE) than the conventional method, especially at low signal-to-noise ratio (SNR). In addition, by applying the proposed scheme to experimentally measured data, its performance is also verified in a real-environment experiment.
The rest of this paper is organized as follows. In Section 2, the system model for the distributed FMCW MIMO radar is introduced and the received signal is reformulated into the form used for joint range and angle estimation. In Section 3, the conventional 2D MUSIC algorithm is briefly reviewed and the distributed 2D MUSIC algorithm with coordinate transformation is proposed to estimate the ranges and angles of multiple targets without sharing all the received signals of the distributed FMCW MIMO radars. In Section 4, we provide several simulation results, and in Section 5, we experimentally demonstrate the proposed scheme with real data measured via two W-band FMCW radars. In Section 6, we give our conclusion.

Transmitted/Received Signal Model at Distributed FMCW MIMO Radar
We consider that the m_t-th Tx antenna element of the i-th radar transmits an FMCW signal within a pulse duration T_PR. The associated FMCW signal can then be expressed as in (1), where f_c is the carrier frequency and α is the chirp rate. In the distributed FMCW MIMO radar system, a frequency offset ∆f_c (resp., ∆f_d) is introduced in (1) to avoid the intra-radar interference caused by the Tx antennas within each FMCW MIMO radar (resp., the inter-radar interference caused by the Tx antennas of different FMCW MIMO radars). When the transmitted signal s_mt^(i)(t) is reflected by the k-th target, the reflected signal received at the m_r-th Rx antenna of the i-th radar can be expressed as in (2), where n_mr^(i)(t) denotes the additive white Gaussian noise, γ_k^(i) is the target reflection coefficient, and τ_mrmtk^(i) is the propagation time delay, i.e., the time for the signal from the m_t-th transmit antenna of the i-th radar to reach the k-th target and return to the m_r-th receive antenna. Here, it is assumed that the antenna gain and the path loss are also absorbed into γ_k^(i). Then, assuming that the relative velocity of the k-th target with respect to the i-th radar is v, we obtain (3) and (4), where G and γ_k are the antenna gain and the reflection gain of the k-th target, respectively.
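The de-ramping idea behind (1)-(6) can be illustrated with a short numerical sketch. All parameter values below are toy assumptions chosen for readability (not the paper's 77 GHz configuration), and the mixing convention s(t)·conj(echo(t)) is ours:

```python
import numpy as np

# Toy sketch of the FMCW chirp in (1) and the de-ramping step of (6).
fc = 1.0e6          # carrier frequency (Hz), toy value
B = 0.5e6           # sweep bandwidth (Hz), toy value
T_pr = 1e-3         # pulse duration T_PR (s)
alpha = B / T_pr    # chirp rate (Hz/s)
fs = 4.0e6          # sampling rate (Hz)

t = np.arange(0.0, T_pr, 1.0 / fs)
s = np.exp(1j * 2 * np.pi * (fc * t + 0.5 * alpha * t**2))   # transmit chirp

tau = 2.0e-5        # round-trip delay of a single toy target (s)
echo = np.exp(1j * 2 * np.pi * (fc * (t - tau) + 0.5 * alpha * (t - tau)**2))

# De-ramping: mixing leaves a beat tone whose frequency alpha * tau
# encodes the target delay (the low-pass step of (6) is implicit here).
beat = s * np.conj(echo)
spec = np.abs(np.fft.fft(beat))
f_hat = np.fft.fftfreq(len(t), 1.0 / fs)[np.argmax(spec)]    # near alpha * tau
```

With these toy values the recovered beat frequency f_hat sits at α·τ = 10 kHz, so the target delay (and hence range) is read off directly from the beat tone.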
In this paper, the product of the Tx and Rx path distances in the path-loss term is approximated as (R_0k^(i))^4 because we assume a far-field distance between the k-th target and the i-th radar. Without loss of generality, γ_k is modeled as a complex Gaussian random variable (i.e., γ_k ∼ CN(0, 1)). In (3) and (4), R_mtk^(i) (resp., R_mrk^(i)) is the distance between the m_t-th Tx (resp., the m_r-th Rx) antenna of the i-th radar and the k-th target. Here, d is the inter-antenna spacing when the virtual antenna array is formed as a linear array by properly locating the Tx/Rx antennas [23]. Throughout the paper, it is assumed that d = λ/2, where λ is the wavelength of the FMCW radar waveform. In the last approximation in (4), R_0k^(i) is the distance between the position of the reference element (i.e., the first element of the virtual array) of the i-th radar and the k-th target. Here, we use the virtual element index m = 1, · · · , M_t M_r instead of the Tx/Rx indices. Then, (4) can be rewritten as in (5), where the delay τ_0k^(i) is determined by R_0k^(i)/c for m = 1, . . . , M_t M_r. The received baseband signal at the i-th radar is then given as in (6), where n_m^(i)(t) ∼ CN(0, σ_n²) and LP{·} denotes the low-pass filter output, which passes the beat-frequency component of the received signal. Here, mixing the Tx waveform with the Rx signal constitutes the de-ramping process, and under the far-field approximation, x_m^(i)(t) reduces to the second term of (6).

Reformulation of the Received Signals into a Discrete-Time Signal Matrix
To express (6) as a discrete-time signal, x_m^(i)(t) is sampled with the sampling frequency f_s = 1/T_s, which yields (7) when (5) is substituted into (6) and the second-order term πα(τ_0k^(i))² is neglected. Furthermore, for a far-field target range, (7) can be approximated as in (8), where γ̄_k^(i) absorbs the constant phase terms. In this paper, a total of S FMCW pulses are collected at the receiver, and by introducing the pulse index s, the discrete signal x_m^(i)[n, s], i.e., the n-th sample of the s-th FMCW pulse, can be represented as in (9). By introducing the array response vector a(θ_k^(i)), (9) can be rewritten in vector form. Specifically, for a linear antenna array, the array steering vector is given as in (10), and x_m^(i)[n, s] can then be vectorized as in (11). As shown in Figure 2, by stacking x^(i)[n, s] for n = 1, . . . , N, we further obtain (12), where ⊗ denotes the Kronecker product operator and the constant C collects the remaining phase term exp{j2π(ατ···)}. Figure 2 gives a pictorial description of x̄^(i)[s] and x^(i)[n, s] in (10) and (11). Finally, by concatenating x̄^(i)[s] for s = 1, · · · , S, we obtain the discrete-time de-ramped signal matrix as in (13).
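The stacking in (10)-(12) can be sketched in a few lines. The dimensions, angle, and normalized beat frequency below are toy assumptions; the point is only that a single target's noiseless snapshot has the Kronecker structure b(τ) ⊗ a(θ):

```python
import numpy as np

# Toy sketch of the vectorized model in (10)-(12).
M = 8             # virtual array elements (M_t * M_r), toy value
N = 16            # fast-time samples per pulse, toy value
d_over_lam = 0.5  # inter-antenna spacing d = lambda/2

def steering(theta_rad):
    # array steering vector a(theta) of (10) for a uniform linear array
    m = np.arange(M)
    return np.exp(1j * 2 * np.pi * d_over_lam * m * np.sin(theta_rad))

def fast_time(f_beat_norm):
    # fast-time phase progression over n = 0, ..., N-1 at the beat frequency
    n = np.arange(N)
    return np.exp(1j * 2 * np.pi * f_beat_norm * n)

theta, fb = np.deg2rad(10.0), 0.1
# stacked noiseless snapshot of one target: Kronecker structure of (12)
x_bar = np.kron(fast_time(fb), steering(theta))
```

Reshaping `x_bar` into an N × M matrix recovers the outer product of the fast-time and spatial responses, which is the structure Figure 2 depicts.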

2D MUSIC Algorithm with a Single FMCW MIMO Radar
In the FMCW MIMO radar system, the ranges and azimuth angles of multiple targets can be jointly estimated through a 2D FFT operation, but to obtain high-resolution estimates of the ranges and azimuth angles of multiple targets, we develop the subspace-based 2D MUSIC algorithm. We also note that, in [24], the 2D MUSIC algorithm is exploited to estimate the angles and Doppler frequencies of targets. First, the sample covariance matrix R̂^(i) of X^(i) is computed from (13) and its eigenvalue decomposition (EVD) is given as in (14), where E_s^(i) ∈ C^(MN×K) is the matrix whose columns are the eigenvectors spanning the signal subspace of R̂^(i), and the columns of E_n^(i) span the noise subspace (cf. (13)). Accordingly, the orthogonality in (15) follows from (11) and (12). Then, by defining the cost function J^(i)(τ, θ) as in (16), the ranges (more specifically, the time delays associated with the ranges of the targets) and angles can be jointly estimated at the i-th radar as in (17), where K is the number of targets. When K is not known at the radar, the minimum description length (MDL) criterion [25,26] can be exploited to estimate the number of signal sources as in (18), where λ_p^(i) is the p-th eigenvalue of R̂^(i) in (14).
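The covariance/EVD steps in (14)-(17) can be illustrated with a compact numerical sketch for a single target (K = 1). The dimensions, noise level, and search grid below are our toy assumptions, and the grid search plays the role of the arg max in (17):

```python
import numpy as np

# Toy numerical sketch of the 2D MUSIC steps (14)-(17), single target K = 1.
rng = np.random.default_rng(0)
M, N, S = 4, 8, 200                      # array size, fast-time samples, pulses
theta0, fb0 = np.deg2rad(12.0), 0.15     # true angle / normalized beat frequency

def steering(theta):
    return np.exp(1j * np.pi * np.arange(M) * np.sin(theta))   # d = lambda/2

def fast_time(fb):
    return np.exp(1j * 2 * np.pi * fb * np.arange(N))

g0 = np.kron(fast_time(fb0), steering(theta0))   # MN-dim target signature

# S snapshots: random complex gain per pulse plus white noise (cf. (13))
X = np.stack([(rng.standard_normal() + 1j * rng.standard_normal()) * g0
              + 0.1 * (rng.standard_normal(M * N) + 1j * rng.standard_normal(M * N))
              for _ in range(S)], axis=1)        # MN x S

R = X @ X.conj().T / S                   # sample covariance, as in (14)
_, V = np.linalg.eigh(R)                 # eigenvalues in ascending order
En = V[:, :M * N - 1]                    # noise-subspace eigenvectors (K = 1)

def music_cost(theta, fb):               # cost function, as in (16)
    g = np.kron(fast_time(fb), steering(theta))
    return 1.0 / np.linalg.norm(En.conj().T @ g) ** 2

# Grid search playing the role of the arg max in (17)
thetas = np.deg2rad(np.linspace(-30.0, 30.0, 61))
fbs = np.linspace(0.05, 0.25, 41)
J = np.array([[music_cost(th, fb) for fb in fbs] for th in thetas])
ti, fi = np.unravel_index(np.argmax(J), J.shape)
```

The pseudospectrum J peaks at the grid point closest to the true pair (θ0, fb0), which is the sharp-peak behavior the MUSIC cost function is designed to produce.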

Distributed 2D MUSIC Algorithm with Coordinate Transformation
To improve the estimation accuracy, the received signals from the distributed FMCW MIMO radars can be collected at the data fusion center and processed coherently, which requires a large data communication overhead and a heavy computational burden at the data fusion center. Instead, we propose a distributed 2D MUSIC algorithm using coordinate transformation. Note that the range and angle pairs (R_k^(i), θ_k^(i)), k = 1, . . . , K, are expressed in the coordinates of the i-th FMCW MIMO radar. Accordingly, in the proposed scheme, the coordinates are transformed with respect to the reference radar before the 2D MUSIC algorithm is processed. That is, rather than forwarding the raw received signal to the data fusion center, each radar performs the 2D MUSIC algorithm on its own received signal using the transformed coordinates. We consider that the distributed radars lie on a straight line (see Figure 3), but the scheme can be easily extended to arbitrary radar locations. Specifically, the i-th radar is located at (x_i, 0). Without loss of generality, the radar located at (x_1, 0) is denoted as the reference radar. From Figure 3, we have two equations, (22) and (23), representing the relationship between (R_k^(1), θ_k^(1)) and (R_k^(i), θ_k^(i)), where D_i1 is the distance between the reference radar and the i-th radar. Because τ is proportional to the range, from (22) and (23) we can express (τ, θ) in the i-th radar's coordinates as in (24) and (25), where (τ̃, θ̃) is based on the reference radar's coordinates. Then, by substituting (τ, θ) from (24) and (25) into (15) and (16), the cost function in (16) is reformulated at the i-th FMCW MIMO radar with respect to the reference radar's coordinates (τ̃, θ̃) as in (26). The distributed radars then report J^(i)(τ̃, θ̃) for the radar image region of interest to the data fusion center.
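The transformation in (22)-(25) amounts to re-expressing a (range, angle) pair in the shifted radar's coordinates. The sketch below assumes the geometry of Figure 3 (radars on the x-axis, angles measured from broadside, i.e., the y-axis); the helper name and the angle convention are our assumptions:

```python
import numpy as np

# Sketch of the coordinate transformation behind (22)-(25), Figure 3 geometry.
def ref_to_local(R_ref, theta_ref, D_i1):
    """Map a (range, angle) pair w.r.t. the reference radar at (0, 0)
    to the coordinates of the i-th radar at (D_i1, 0)."""
    x = R_ref * np.sin(theta_ref) - D_i1   # target x relative to the i-th radar
    y = R_ref * np.cos(theta_ref)          # target y, shared by all radars
    R_i = np.hypot(x, y)
    theta_i = np.arcsin(x / R_i)
    return R_i, theta_i

# A target at (80 m, 0 rad) w.r.t. the reference radar, seen from a radar
# 5 m away, appears slightly farther and off broadside.
R_i, th_i = ref_to_local(80.0, 0.0, 5.0)
```

Since τ = 2R/c, the same mapping converts each (τ̃, θ̃) grid point of the reference coordinates into the (τ, θ) at which the i-th radar evaluates its local cost, as in (26).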
At the data fusion center, the aggregated cost function can be formulated as in (27), where w^(i) is the weight for the i-th local cost function value, which can be chosen proportional to the received SNR at the i-th radar as in (28). The delays and angles can then be jointly estimated at the data fusion center as (τ̂_k, θ̂_k) = arg max_(τ̃,θ̃) J̄(τ̃, θ̃), for k = 1, . . . , K.
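The fusion step (27)-(29) can be sketched as a weighted sum of local cost maps evaluated on a shared (τ̃, θ̃) grid; the SNR values and the 3 × 3 cost maps below are toy placeholders:

```python
import numpy as np

# Sketch of the fusion step in (27)-(29): each radar reports a local cost map
# J_i over the shared grid, and the fusion center takes an SNR-weighted sum
# before the joint arg max. All values are toy placeholders.
snr_lin = np.array([2.0, 10.0])           # per-radar linear SNRs (assumed)
w = snr_lin / snr_lin.sum()               # weights proportional to SNR, cf. (28)

# Two toy 3x3 local cost maps on the same grid (rows: delay, cols: angle)
J1 = np.array([[0.0, 1.0, 0.0],
               [1.0, 5.0, 1.0],
               [0.0, 1.0, 0.0]])
J2 = np.array([[0.0, 0.0, 0.0],
               [0.0, 8.0, 2.0],
               [0.0, 2.0, 0.0]])

J_bar = w[0] * J1 + w[1] * J2             # aggregated cost function, cf. (27)
k = np.unravel_index(np.argmax(J_bar), J_bar.shape)   # joint estimate
```

Only the cost maps (one real value per grid point) cross the network, rather than the MN × S raw signal matrices, which is the communication saving the scheme targets.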
Note that the distributed radars do not need to report all of their measured signals to the data fusion center; instead, they forward their local cost function values in (26) for the radar image region of interest. Based on the above description, the proposed distributed 2D MUSIC algorithm with coordinate transformation is summarized in Algorithm 1.

Discussion
To see the effectiveness of the proposed 2D MUSIC algorithm with coordinate transformation, we examine the 2D MUSIC algorithm with received signal sharing. That is, when the received signals from the distributed radars are transferred perfectly to the data fusion center and are uncorrelated, the sample covariance matrix of X = [(X^(1))^H, . . . , (X^(L))^H]^H is given as in (30), and therefore the eigenvectors spanning the noise subspace of R̂ are given as in (31). Let us define the time and spatial array response vector as in (32), where τ̃ and θ̃ are, respectively, the delay and the azimuth angle based on the reference radar's coordinates, and f_i(τ̃, θ̃) is the i-th relative time and spatial array response vector, in which the relative position of the i-th radar with respect to the reference radar is reflected.
We note that f_i(τ̃, θ̃) = K_i b(τ^(i)) ⊗ a(θ^(i)) with a complex-valued constant K_i such that |K_i| = 1. Here, τ^(i) and θ^(i) are, respectively, the delay and the azimuth angle based on the i-th radar's coordinates. The cost function for the 2D MUSIC with received signal sharing can then be given as in (33), and accordingly, the delays and angles can be estimated at the data fusion center as (τ̂_k, θ̂_k) = arg max_(τ̃,θ̃) J_T(τ̃, θ̃), for k = 1, . . . , K. (34) We note that (33) is comparable with (27). That is, both (27) and (33) are maximized when the projections of the response vectors onto the noise subspaces simultaneously become close to zero for i = 1, . . . , L. From this observation, our proposed 2D MUSIC, which does not require transferring the raw received signals to the data fusion center, gives an estimation performance similar to that of the 2D MUSIC with received signal sharing.

Simulation Results
To verify the proposed distributed 2D MUSIC algorithm, computer simulations are performed. Throughout the simulations, we use 77 GHz as the center frequency, 300 MHz as the operating bandwidth, 3.33 µs as the pulse duration, and 600 MHz as the sampling frequency. The number of pulses is set to 100. In addition, the number of time samples is set to 2000 samples per pulse, and the FMCW MIMO radar is exploited with M_t = 2 and M_r = 4, where the transceiver antennas are placed such that a virtual uniform linear antenna array is formed with inter-antenna spacing λ/2, where λ is the wavelength of the FMCW waveform. Throughout the simulations, we assume that a line-of-sight (LoS) component exists for each target and there is no multi-path. For the LoS channel, a typical path-loss exponent of 2 is used, but the setup can be extended to general channel models.

The 2D MUSIC-Based Radar Image Comparison
We consider that three targets are located at (R_k, θ_k) = {(75 m, −15°), (80 m, 0°), (85 m, 15°)} and two distributed FMCW MIMO radars are exploited, where the reference radar is located at the origin (0, 0) m and the other radar is at (D, 0) m. Figure 4a shows the radar image at the reference radar based on the conventional 2D MUSIC algorithm in Section 3.1 when the received SNR is 10 dB. Note that three peaks are observed at {(75 m, −15°), (80 m, 0°), (85 m, 15°)}. Interestingly, all peaks are relatively wide along the azimuth angle axis compared to the range axis. This is because the number of elements in the virtual array is much smaller than the number of fast-time samples, resulting in a relatively low angle resolution. In Figure 4b, the radar image using the proposed distributed 2D MUSIC with coordinate transformation is shown with D = 5 m, when the received SNR is 10 dB at both FMCW MIMO radars. That is, the image is plotted based on (27). We can again find three peaks at {(75 m, −15°), (80 m, 0°), (85 m, 15°)}. Note that the peaks are sharper than those in Figure 4a, which implies that the proposed distributed 2D MUSIC can achieve a higher angle resolution and lower estimation errors than the conventional 2D MUSIC without received signal sharing.
In Figure 5, the radar images are shown when (a) the images obtained from the two distributed radars are simply averaged and (b) the proposed 2D MUSIC with coordinate transformation is applied using (27). Here, we again consider three targets located at (R_k, θ_k) = {(75 m, −15°), (80 m, 0°), (85 m, 15°)}. In Figure 5a, some ghost targets are observed because the coordinates of the distributed radars are not properly aligned. However, in Figure 5b, three peaks associated with the original target positions can be found with a high resolution.
In Figure 6a, the radar image using the proposed 2D MUSIC algorithm with coordinate transformation is shown with D = 1 m, when the received SNRs differ between the distributed radars. Specifically, the received SNR at the reference radar is 6.6 dB, while it is 10 dB at the other radar. We can again find three peaks at the same positions. For comparison purposes, in Figure 6b, the radar image at the reference radar using the conventional 2D MUSIC algorithm is shown. We note that the peaks associated with the targets are less apparent than those in Figure 6a.
In Figure 7, the radar images using the proposed distributed 2D MUSIC with coordinate transformation are shown for different inter-radar spacings (i.e., D = {5, 10} m). We can find that, as the inter-radar spacing increases, the radar image resolution improves.

Mean Square Error Comparison
In this section, Monte Carlo simulations are carried out to evaluate the performance of the proposed algorithm. As performance measures, we evaluate the RMSEs of range and angle estimation. Figure 8a shows the RMSEs of range estimation for various SNRs when two distributed FMCW MIMO radars with D = 10 m are exploited with the proposed 2D MUSIC algorithm. For comparison purposes, we also evaluate the RMSEs when the conventional 2D MUSIC with/without received signal sharing is exploited. From Figure 8a, as the SNR increases, the RMSEs of the proposed algorithm and the conventional algorithms with/without received signal sharing decrease, but at low SNR values, the proposed algorithm shows lower RMSEs than the conventional algorithm without received signal sharing. Interestingly, the proposed 2D MUSIC algorithm exhibits an RMSE performance similar to that of the 2D MUSIC with received signal sharing, which coincides with the discussion in Section 3.3. From Figure 8b, the RMSEs of angle estimation for the proposed algorithm are lower than those of the conventional algorithm without data sharing, similar to the observation in Figure 8a. Figure 9 shows the RMSEs of angle and range estimation for various numbers of FMCW MIMO radars and Rx antennas. Specifically, in Figure 9a, the RMSEs for various numbers of FMCW MIMO radars are evaluated when the proposed algorithm is exploited with SNR = 4 dB, M_t = 2, and M_r = 4. Here, the i-th radar is located at ((i − 1)D, 0) m with D = 1 m. From Figure 9a, as the number of radars increases, the RMSEs of both range and angle decrease. Accordingly, as the number of radars increases, high-resolution images with narrow-width peaks associated with the targets can be achieved, which coincides with the observation in Section 4.1. In Figure 9b, the RMSEs are similarly evaluated for various numbers of Rx antennas.
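For concreteness, the RMSE used in Figures 8 and 9 is the square root of the mean squared estimation error over the Monte Carlo trials; the estimates below are toy numbers:

```python
import numpy as np

# RMSE over Monte Carlo trials, as reported in Figures 8 and 9.
def rmse(estimates, truth):
    e = np.asarray(estimates, dtype=float) - truth
    return np.sqrt(np.mean(e ** 2))

range_est = [79.8, 80.1, 80.3, 79.9]   # toy range estimates (m), true range 80 m
r = rmse(range_est, 80.0)
```

The same function applies to the angle RMSE by passing the per-trial angle estimates and the true angle.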

Experiment Results
In order to experimentally demonstrate the proposed 2D MUSIC algorithm, a distributed FMCW MIMO radar system is set up as in Figure 10 using two W-band FMCW radars (TI AWR-1642) with two Tx and four Rx antennas. Here, the two radars are located at (0, 0) m and (2, 0) m. We consider that two targets are located at (7.76 m, −14.93°) and (5.39 m, 21.80°) with respect to the reference radar at (0, 0) m. The FMCW chirp configuration used in our experiment is shown in Table 1. Figure 11a shows the radar image when the measured data obtained from the reference radar located at (0, 0) m are exploited with the conventional 2D MUSIC algorithm. We can find that two targets are detected, and the estimated targets are given as (R_k, θ_k) = {(8.0 m, −15.30°), (5.64 m, 21.60°)}. Figure 11b shows the radar image when the measured data acquired at (2, 0) m are exploited with the conventional 2D MUSIC algorithm using the coordinate transformation. Accordingly, the targets appear slightly slanted. In Figure 12, the radar image is obtained by using the proposed 2D MUSIC algorithm with coordinate transformation. The radar image in Figure 12 shows that two peaks are located at {(8.16 m, −15.30°), (5.64 m, 19.0°)} with a higher resolution. That is, the target images appear more focused. Accordingly, we can estimate the targets with a high resolution without sharing the raw received signals of the distributed radars.

Conclusions
In this paper, we proposed the distributed 2D MUSIC algorithm with coordinate transformation to estimate the range and angle information of multiple targets with distributed FMCW MIMO radars. In the proposed scheme, through the coordinate transformation at each FMCW MIMO radar, the 2D MUSIC algorithm can be performed separately with respect to the reference radar's coordinates. Accordingly, rather than reporting the raw received signal to the data fusion center, the distributed radars forward their local 2D MUSIC cost function values for the radar image region of interest. Because the local cost function values are computed with respect to the same coordinates, their weighted sum can be computed at the data fusion center, and the range and angle information of multiple targets is jointly estimated from the aggregated cost function. In the computer simulations, when the proposed distributed 2D MUSIC algorithm is exploited, high-resolution radar images with narrow-width peaks associated with the targets can be achieved. Accordingly, it is confirmed through Monte Carlo simulations that the proposed algorithm shows lower RMSEs than the conventional method, especially at low SNR (below 10 dB), which implies that the proposed algorithm has higher immunity to additive noise. Finally, by applying the proposed scheme to experimentally measured data, it is verified that the range and angle parameters of multiple targets can be estimated with a high resolution without sharing the raw received signals of the distributed radars.