Article

Distributed Two-Dimensional MUSIC for Joint Range and Angle Estimation with Distributed FMCW MIMO Radars

1
Division of Smart Robot Convergence and Application Engineering, Department of Electronic Engineering, Pukyong National University, Busan 48513, Korea
2
Communication & Media Research Laboratory, Radio & Satellite Research Division, Electronics and Telecommunications Research Institute, Daejeon 34129, Korea
*
Author to whom correspondence should be addressed.
Sensors 2021, 21(22), 7618; https://doi.org/10.3390/s21227618
Submission received: 20 August 2021 / Revised: 10 November 2021 / Accepted: 12 November 2021 / Published: 16 November 2021
(This article belongs to the Section Radar Sensors)

Abstract

To estimate the range and angle information of multiple targets, FMCW MIMO radars have been exploited together with 2D MUSIC algorithms. To improve the estimation accuracy, the received signals from multiple FMCW MIMO radars can be collected at a data fusion center and processed coherently, which increases the data communication overhead and implementation complexity. To resolve these issues, we propose the distributed 2D MUSIC algorithm with coordinate transformation, in which the 2D MUSIC algorithm is operated with respect to the reference radar’s coordinate at each radar in a distributed way. Rather than forwarding the raw data of the received signal to the fusion center, each radar performs 2D MUSIC with its own received signal in the transformed coordinates. Accordingly, the distributed radars do not need to report all their measured signals to the data fusion center; instead, they forward their local cost function values of 2D MUSIC for the radar image region of interest. The data fusion center can then estimate the range and angle information of the targets jointly from the aggregated cost function. By applying the proposed scheme to experimentally measured data, its performance is also verified in a real-environment test.

1. Introduction

Recently, frequency-modulated continuous waveform (FMCW) radar has been widely exploited in military and automotive radar sensing/imaging systems [1,2,3,4] because of its fast processing time and its robustness to harsh environmental conditions. Accordingly, to estimate the angle with adequate spatial resolution, FMCW multiple-input multiple-output (MIMO) radar has been studied in [5,6,7,8] (and references therein), in which $M_t$ transmitters and $M_r$ receivers are co-located and orthogonal waveforms are transmitted from the multiple transmit antennas. The reflected signals at the receiver can then be equivalently modeled as the received signal from a virtual array with $M_t M_r$ antenna elements.
To estimate the range and angle information of multiple targets with the FMCW MIMO radar, two-dimensional multiple signal classification (2D MUSIC) algorithms can be exploited [9,10,11,12], in which the covariance matrix of the received signal is estimated for subspace-based signal processing. We note that the 2D MUSIC algorithm is widely exploited to estimate 2D directions of arrival (DOA) [13,14,15,16], where the elevation and azimuth angles are jointly estimated by exploiting a planar array antenna. The 2D MUSIC algorithm can also be exploited to estimate the range and angle parameters. In [9], the 2D MUSIC algorithm is combined with an FFT-based parameter estimator to reduce the computational complexity of the high-resolution 2D MUSIC algorithm. In [10], the weighted 2D root MUSIC algorithm is developed for joint angle-Doppler estimation. In [11], a reduced-dimension MUSIC algorithm for near-field source localization is proposed, and in [12], a cascade angle estimation algorithm exploiting Capon and beamspace MUSIC is developed. In addition, the 2D MUSIC algorithm is exploited to estimate the range and velocity of multiple targets with OFDM radar or to estimate the azimuth and elevation angles of targets [17,18]. Specifically, in [17], the 2D MUSIC algorithm is exploited for joint range-velocity estimation, and in [18], the 2D MUSIC algorithm is combined with compressive sensing to reduce the system complexity in the joint estimation of azimuth and elevation angles. Furthermore, to improve the estimation accuracy, the received signals from multiple FMCW MIMO radars can be collected at the data fusion center and processed coherently [19,20,21], which may increase the data communication overhead and the implementation complexity. In [22], by implementing a low-phase-noise FMCW radar, the applicability of highly integrated sensors for cooperative bistatic radar networks is demonstrated.
In this paper, to estimate the range and angle information of multiple targets with distributed FMCW MIMO radars, we propose the distributed 2D MUSIC algorithm with coordinate transformation. Specifically, in the proposed scheme, the coordinate of each FMCW MIMO radar is first transformed into the reference radar’s coordinate, and the 2D MUSIC algorithm is then applied separately to the received FMCW signal at each distributed radar with respect to the reference radar’s coordinate. Accordingly, rather than reporting the raw data of the received signal to the data fusion center [19,20,21], the distributed radars forward their local cost function values of the 2D MUSIC for the radar image region of interest, which reduces the data communication burden for the cooperation. Furthermore, because the local cost function values are computed with respect to the same coordinate, their weighted sum can be computed at the data fusion center, and the range and angle information of multiple targets is jointly estimated from the aggregated cost function. Through computer simulations, the estimation performance of the proposed distributed 2D MUSIC is verified. That is, when the proposed distributed 2D MUSIC algorithm is exploited, high-resolution radar images with narrow peaks associated with the targets can be achieved. Accordingly, the proposed algorithm shows a lower root mean square error (RMSE) than the conventional method, especially at low signal-to-noise ratio (SNR). In addition, by applying the proposed scheme to experimentally measured data, its performance is also verified in an experimental test.
The rest of this paper is organized as follows. In Section 2, the system model for the distributed FMCW MIMO radar is introduced and the received signal is reformulated into the form used for joint range and angle estimation. In Section 3, the conventional 2D MUSIC algorithm is briefly introduced and the distributed 2D MUSIC algorithm with coordinate transformation is proposed to estimate the range and angle of multiple targets without sharing all the received signals of the distributed FMCW MIMO radars. In Section 4, we provide several simulation results, and in Section 5, we experimentally demonstrate the proposed scheme with real data measured with two W-band FMCW radars. In Section 6, we give our conclusion.

2. System Model for Distributed FMCW MIMO Radar System

Figure 1 shows the distributed FMCW MIMO radar system with multiple targets in two-dimensional space. Specifically, each FMCW MIMO radar consists of $M_t$ transmit (Tx) antennas and $M_r$ receive (Rx) antennas, and $K$ targets, the $k$th of which has angle $\theta_k^{(i)}$ and range $R_k^{(i)}$ with respect to the $i$th FMCW MIMO radar, are randomly located under the far-field assumption.

2.1. Transmitted/Received Signal Model at Distributed FMCW MIMO Radar

We consider that the $m_t$th Tx antenna element of the $i$th radar transmits an FMCW signal within a pulse duration $T_{PR}$. Then, the associated FMCW signal can be expressed as
$$s_{m_t}^{(i)}(t)=\exp\Big\{j\Big(2\pi\big(f_c+\Delta f_c(m_t-1)+\Delta f_d(i-1)\big)t+\pi\alpha t^2\Big)\Big\},\quad 0\le t\le T_{PR},\tag{1}$$
where $f_c$ is the carrier frequency and $\alpha$ is the chirp rate. In the distributed FMCW MIMO radar system, the frequency offset $\Delta f_c$ (resp., $\Delta f_d$) is introduced in (1) to avoid the intra-radar interference caused by the Tx antennas within each FMCW MIMO radar (resp., the inter-radar interference caused by the Tx antennas of different FMCW MIMO radars). When the transmit signal $s_{m_t}^{(i)}(t)$ is reflected from the $k$th target, the reflected signal is received at the $m_r$th Rx antenna of the $i$th radar and is expressed as
$$r_{m_r}^{(i)}(t)=\sum_{k=1}^{K}\sum_{m_t=1}^{M_t}\gamma_k^{(i)}\,s_{m_t}^{(i)}\big(t-\tau_{m_r m_t k}^{(i)}\big)+n_{m_r}^{(i)}(t),\tag{2}$$
where $n_{m_r}^{(i)}(t)$ denotes the additive white Gaussian noise, $\gamma_k^{(i)}$ is the target reflection coefficient, and $\tau_{m_r m_t k}^{(i)}$ is the propagation time delay, i.e., the time for the signal transmitted from the $m_t$th Tx antenna of the $i$th radar to be reflected by the $k$th target and received at the $m_r$th Rx antenna. It is assumed that the antenna gain and the path loss are also absorbed into $\gamma_k^{(i)}$. Then, assuming that the relative velocity of the $k$th target with respect to the $i$th radar is $v_k^{(i)}$, we have
$$\gamma_k^{(i)}=\frac{G}{\big(R_{m_t k}^{(i)}\big)^2\big(R_{m_r k}^{(i)}\big)^2}\,\gamma_k\approx\frac{G}{\big(R_{0k}^{(i)}\big)^4}\,\gamma_k,\tag{3}$$
and
$$\tau_{m_r m_t k}^{(i)}(t)=\frac{2}{c}\left(\frac{R_{m_t k}^{(i)}+R_{m_r k}^{(i)}}{2}+\frac{v_k^{(i)}}{2}t\right)\approx\frac{2}{c}\left(R_{0k}^{(i)}+d(m-1)\sin\theta_k^{(i)}+\frac{v_k^{(i)}}{2}t\right),\tag{4}$$
where $G$ and $\gamma_k$ are the antenna gain and the reflection gain of the $k$th target, respectively. In this paper, $(R_{m_t k}^{(i)})^2(R_{m_r k}^{(i)})^2$ is approximated as $(R_{0k}^{(i)})^4$ because we assume a far-field distance between the $k$th target and the $i$th radar. Without loss of generality, $\gamma_k$ is modeled as a complex Gaussian random variable (i.e., $\gamma_k\sim\mathcal{CN}(0,1)$). In (3) and (4), $R_{m_t k}^{(i)}$ (resp., $R_{m_r k}^{(i)}$) is the distance between the $m_t$th Tx (resp., $m_r$th Rx) antenna of the $i$th radar and the $k$th target. Here, $d$ is the inter-antenna spacing when the virtual antenna array is formed as a linear array by properly locating the Tx/Rx antennas [23]. Throughout the paper, it is assumed that $d=\lambda/2$, where $\lambda$ is the wavelength of the FMCW radar waveform. In the last approximation in (4), $R_{0k}^{(i)}$ is the distance between the reference element (i.e., the first element of the virtual array) of the $i$th radar and the $k$th target. Here, we use the virtual element index $m=1,\ldots,M_tM_r$ instead of the Tx/Rx indices. Then, (4) can be rewritten as
$$\tau_{mk}^{(i)}=\tau_{0k}^{(i)}+\frac{2}{c}d(m-1)\sin\theta_k^{(i)}+\frac{v_k^{(i)}}{c}t,\tag{5}$$
where $\tau_{0k}^{(i)}=2R_{0k}^{(i)}/c$ and $m=1,\ldots,M_tM_r$.
The received baseband signal at the $i$th radar is then given as
$$\begin{aligned}x_m^{(i)}(t)&=\mathrm{LP}\big\{r_{m_r}^{(i)*}(t)\,s_{m_t}^{(i)}(t)\big\}\\&=\sum_{k=1}^{K}\sum_{m_t=1}^{M_t}\gamma_k^{(i)}\exp\Big\{j\Big(2\pi\big(f_c+\Delta f_c(m_t-1)+\Delta f_d(i-1)\big)\tau_{mk}^{(i)}+2\pi\alpha\tau_{mk}^{(i)}t-\pi\alpha\big(\tau_{mk}^{(i)}\big)^2\Big)\Big\}+n_m^{(i)}(t)\\&\approx\sum_{k=1}^{K}\gamma_k^{(i)}\exp\Big\{j\Big(2\pi f_c\tau_{mk}^{(i)}+2\pi\alpha\tau_{mk}^{(i)}t-\pi\alpha\big(\tau_{mk}^{(i)}\big)^2\Big)\Big\}+n_m^{(i)}(t),\tag{6}\end{aligned}$$
where $n_m^{(i)}(t)\sim\mathcal{CN}(0,\sigma_n^2)$ and $\mathrm{LP}\{\cdot\}$ denotes the low-pass filter output, which passes the beat-frequency component of the received signal. Here, the multiplication of the Rx signal with the conjugated Tx waveform is the de-ramping process. In addition, because $f_c\gg\Delta f_c$ (resp., $\Delta f_d$), we can approximate $x_m^{(i)}(t)$ by the last expression in (6).

2.2. Reformulation of the Received Signals into a Discrete-Time Signal Matrix

To express (6) as a discrete-time signal, $x_m^{(i)}(t)$ is sampled with the sampling frequency $f_s=1/T_s$, which gives
$$x_m^{(i)}[n]=x_m^{(i)}(nT_s)\approx\sum_{k=1}^{K}\gamma_k^{(i)}\exp\Big\{j2\pi f_c\Big(\tau_{0k}^{(i)}+\frac{2}{c}d(m-1)\sin\theta_k^{(i)}+\frac{v_k^{(i)}}{c}nT_s\Big)+j2\pi\alpha\Big(\tau_{0k}^{(i)}+\frac{2}{c}d(m-1)\sin\theta_k^{(i)}+\frac{v_k^{(i)}}{c}nT_s\Big)nT_s\Big\}+n_m^{(i)}[n],\tag{7}$$
where $n_m^{(i)}[n]\triangleq n_m^{(i)}(nT_s)$. Here, $\tau_{mk}^{(i)}$ in (5) is substituted into (6) and the second-order term $\pi\alpha(\tau_{mk}^{(i)})^2$ in (6) is ignored. Furthermore, for a far-field target range, $\tau_{0k}^{(i)}\gg\frac{2}{c}d(m-1)\sin\theta_k^{(i)}+\frac{v_k^{(i)}}{c}nT_s$, and therefore $x_m^{(i)}[n]$ in (7) can be approximated as
$$x_m^{(i)}[n]\approx\sum_{k=1}^{K}\bar{\gamma}_k^{(i)}\exp\Big\{j2\pi f_c\Big(\frac{2}{c}d(m-1)\sin\theta_k^{(i)}+\frac{v_k^{(i)}}{c}nT_s\Big)+j2\pi\alpha\tau_{0k}^{(i)}nT_s\Big\},\tag{8}$$
where $\bar{\gamma}_k^{(i)}=\gamma_k^{(i)}\exp\{j2\pi f_c\tau_{0k}^{(i)}\}$. In this paper, a total of $S$ FMCW pulses are collected at the receiver; by introducing the pulse index $s$, the discrete signal $x_m^{(i)}[n,s]$, i.e., the $n$th sample of the $s$th FMCW pulse, can be represented as
$$\begin{aligned}x_m^{(i)}[n,s]&\triangleq x_m^{(i)}(nT_s)\big|_{s\text{th pulse}}\\&\approx\sum_{k=1}^{K}\bar{\gamma}_k^{(i)}\exp\Big\{j2\pi f_c\Big(\frac{2}{c}d(m-1)\sin\theta_k^{(i)}+\frac{v_k^{(i)}}{c}\big(sT_{PR}+nT_s\big)\Big)+j2\pi\alpha\tau_{0k}^{(i)}nT_s\Big\}\\&=\sum_{k=1}^{K}\bar{\gamma}_k^{(i)}\exp\Big\{j2\pi f_c\Big(\frac{2}{c}d(m-1)\sin\theta_k^{(i)}+\frac{v_k^{(i)}}{c}sT_{PR}\Big)+j2\pi\Big(\alpha\tau_{0k}^{(i)}+\frac{v_k^{(i)}}{\lambda}\Big)nT_s\Big\}.\tag{9}\end{aligned}$$
Then, from the term $\exp\{j2\pi f_c\frac{2}{c}d(m-1)\sin\theta_k^{(i)}\}$ in (9), by introducing the array response vector $\mathbf{a}(\theta_k^{(i)})$, $x_m^{(i)}[n,s]$ can be rewritten in vector form. Specifically, for a linear array antenna, the array steering vector is given as
$$\mathbf{a}(\theta_k^{(i)})=\Big[1,\ \exp\Big\{j2\pi\frac{2d}{\lambda}\sin\theta_k^{(i)}\Big\},\ \ldots,\ \exp\Big\{j2\pi\frac{2d(M-1)}{\lambda}\sin\theta_k^{(i)}\Big\}\Big]^T,\tag{10}$$
where $M=M_tM_r$, and $x_m^{(i)}[n,s]$ can then be vectorized as
$$\mathbf{x}^{(i)}[n,s]=\begin{bmatrix}x_1^{(i)}[n,s]\\\vdots\\x_M^{(i)}[n,s]\end{bmatrix}=\sum_{k=1}^{K}\bar{\gamma}_k^{(i)}\,\mathbf{a}(\theta_k^{(i)})\exp\Big\{j2\pi\frac{v_k^{(i)}}{\lambda}sT_{PR}+j2\pi\Big(\alpha\tau_{0k}^{(i)}+\frac{v_k^{(i)}}{\lambda}\Big)nT_s\Big\}.\tag{11}$$
As shown in Figure 2, by stacking $\mathbf{x}^{(i)}[n,s]$ for $n=1,\ldots,N$, we further have
$$\bar{\mathbf{x}}^{(i)}[s]=\begin{bmatrix}\mathbf{x}^{(i)}[1,s]\\\vdots\\\mathbf{x}^{(i)}[N,s]\end{bmatrix}=\sum_{k=1}^{K}\bar{\gamma}_k^{(i)}\,\big(\mathbf{b}(\tau_{0k}^{(i)},v_k^{(i)})\otimes\mathbf{a}(\theta_k^{(i)})\big)\exp\Big\{j2\pi\frac{v_k^{(i)}}{\lambda}sT_{PR}\Big\},\tag{12}$$
where $\otimes$ denotes the Kronecker product operator and, for $\alpha\tau_{0k}^{(i)}\gg v_k^{(i)}/\lambda$, $\mathbf{b}(\tau_{0k}^{(i)},v_k^{(i)})$ can be approximated as
$$\mathbf{b}(\tau_{0k}^{(i)},v_k^{(i)})=\Big[1,\ C,\ \ldots,\ \exp\Big\{j2\pi\Big(\alpha\tau_{0k}^{(i)}+\frac{v_k^{(i)}}{\lambda}\Big)(N-1)T_s\Big\}\Big]^T\approx\Big[1,\ \exp\big\{j2\pi\alpha\tau_{0k}^{(i)}T_s\big\},\ \ldots,\ \exp\big\{j2\pi\alpha\tau_{0k}^{(i)}(N-1)T_s\big\}\Big]^T\ \big(\triangleq\mathbf{b}(\tau_{0k}^{(i)})\big),$$
where $C\triangleq\exp\{j2\pi(\alpha\tau_{0k}^{(i)}+v_k^{(i)}/\lambda)T_s\}\approx\exp\{j2\pi\alpha\tau_{0k}^{(i)}T_s\}$. We note that $\mathbf{b}(\tau_{0k}^{(i)})$ is the fast-time array response vector associated with $\tau_{0k}^{(i)}$, while $\mathbf{a}(\theta_k^{(i)})$ is the spatial response vector associated with $\theta_k^{(i)}$. Figure 2 gives a pictorial description of $\mathbf{x}^{(i)}[n,s]$ and $\bar{\mathbf{x}}^{(i)}[s]$ in (11) and (12). Finally, by concatenating $\bar{\mathbf{x}}^{(i)}[s]$ for $s=1,\ldots,S$, we obtain the discrete-time de-ramped signal matrix
$$\mathbf{X}^{(i)}=\big[\bar{\mathbf{x}}^{(i)}[1],\ \ldots,\ \bar{\mathbf{x}}^{(i)}[S]\big].\tag{13}$$
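To make the reformulation above concrete, the following NumPy sketch (an illustration rather than the authors' code; the dimensions, SNR, and target parameters are placeholder assumptions, with $N$ reduced well below the 2000 samples used later so that the resulting covariance matrices stay small) synthesizes the de-ramped data matrix $\mathbf{X}^{(i)}$ of (13) from the approximations (8)-(12).

```python
import numpy as np

c = 3e8                                   # speed of light (m/s)
fc = 77e9                                 # carrier frequency (Hz)
B, Tp = 300e6, 3.33e-6                    # bandwidth, pulse duration
alpha = B / Tp                            # chirp rate (Hz/s)
Ts, Tpr = 1 / 600e6, Tp                   # sampling period, pulse repetition interval
M, N, S = 8, 128, 100                     # virtual array size (Mt*Mr), fast-time samples, pulses
lam = c / fc                              # wavelength
d = lam / 2                               # half-wavelength virtual element spacing

def a_vec(theta):
    """Spatial response vector a(theta) of (10)."""
    m = np.arange(M)
    return np.exp(1j * 2 * np.pi * (2 * d / lam) * m * np.sin(theta))

def b_vec(tau):
    """Fast-time response vector b(tau) (the approximation below (12))."""
    n = np.arange(N)
    return np.exp(1j * 2 * np.pi * alpha * tau * n * Ts)

def synthesize_X(ranges, thetas, vels, snr_db=10.0, rng=np.random.default_rng(0)):
    """Build X^(i) of (13): an MN x S data matrix for K point targets, following (9)-(12)."""
    s = np.arange(S)
    X = np.zeros((M * N, S), dtype=complex)
    for R, th, v in zip(ranges, thetas, vels):
        tau = 2 * R / c
        gamma = (rng.standard_normal() + 1j * rng.standard_normal()) / np.sqrt(2)
        doppler = np.exp(1j * 2 * np.pi * (v / lam) * s * Tpr)   # slow-time phase of (12)
        X += gamma * np.outer(np.kron(b_vec(tau), a_vec(th)), doppler)
    noise = (rng.standard_normal(X.shape) + 1j * rng.standard_normal(X.shape)) / np.sqrt(2)
    return X + 10 ** (-snr_db / 20) * noise

# three targets roughly matching Section 4.1; small, distinct velocities keep the snapshots diverse
X1 = synthesize_X(ranges=[75.0, 80.0, 85.0],
                  thetas=np.deg2rad([15.0, 0.0, 15.0]),
                  vels=[0.5, 1.0, 1.5])
```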

3. Distributed 2D MUSIC Algorithm for Joint Range and Angle Estimation

3.1. 2D MUSIC Algorithm with a Single FMCW MIMO Radar

In the FMCW MIMO radar system, the ranges and azimuth angles of multiple targets can be jointly estimated through a 2D FFT operation; however, to obtain high-resolution estimates of the ranges and azimuth angles of multiple targets, we develop the subspace-based 2D MUSIC algorithm. We also note that, in [24], the 2D MUSIC algorithm is exploited to estimate the angles and Doppler frequencies of the targets. First, the sample covariance matrix of $\mathbf{X}^{(i)}$ in (13) is computed and its eigenvalue decomposition (EVD) is given as
$$\bar{\mathbf{R}}^{(i)}=\frac{1}{S}\mathbf{X}^{(i)}\mathbf{X}^{(i)H}=\begin{bmatrix}\mathbf{E}_s^{(i)}&\mathbf{E}_n^{(i)}\end{bmatrix}\begin{bmatrix}\boldsymbol{\Lambda}_s^{(i)}&\mathbf{0}\\\mathbf{0}&\boldsymbol{\Lambda}_n^{(i)}\end{bmatrix}\begin{bmatrix}\mathbf{E}_s^{(i)}&\mathbf{E}_n^{(i)}\end{bmatrix}^H,\tag{14}$$
where $\mathbf{E}_s^{(i)}\in\mathbb{C}^{MN\times K}$ is the matrix whose columns are the eigenvectors spanning the signal subspace of $\bar{\mathbf{R}}^{(i)}$, and the columns of $\mathbf{E}_n^{(i)}\in\mathbb{C}^{MN\times(MN-K)}$ are the eigenvectors spanning the noise subspace of $\bar{\mathbf{R}}^{(i)}$. Here, $\boldsymbol{\Lambda}_s^{(i)}\in\mathbb{C}^{K\times K}$ and $\boldsymbol{\Lambda}_n^{(i)}\in\mathbb{C}^{(MN-K)\times(MN-K)}$ are diagonal matrices whose diagonal elements are, respectively, the $K$ largest and the $MN-K$ remaining eigenvalues of $\bar{\mathbf{R}}^{(i)}$. Note that the columns of $\mathbf{E}_n^{(i)}$ are orthogonal to those of $\mathbf{E}_s^{(i)}$ (equivalently, to the signal subspace spanned by the columns of $\mathbf{X}^{(i)}$ in (13)). Accordingly, from (11) and (12), we define $\mathbf{f}(\tau,\theta)$ as
$$\mathbf{f}(\tau,\theta)\triangleq\mathbf{b}(\tau)\otimes\mathbf{a}(\theta).\tag{15}$$
Then, by defining the cost function $J^{(i)}(\tau,\theta)$ as
$$J^{(i)}(\tau,\theta)\triangleq\frac{1}{\mathbf{f}^H(\tau,\theta)\,\mathbf{E}_n^{(i)}\mathbf{E}_n^{(i)H}\,\mathbf{f}(\tau,\theta)},\tag{16}$$
the ranges (more specifically, the time delays associated with the ranges of targets) and angles can be jointly estimated at the ith radar as
$$\big(\hat{\tau}_k^{(i)},\hat{\theta}_k^{(i)}\big)=\arg\max_{\tau,\theta}\,J^{(i)}(\tau,\theta),\quad\text{for }k=1,\ldots,K,\tag{17}$$
where $K$ is the number of targets. When $K$ is not known at the radar, the minimum description length (MDL) criterion [25,26] can be exploited to estimate the number of signal sources as
$$\hat{K}^{(i)}=\arg\min_{k\in\{1,\ldots,MN\}}\mathrm{MDL}^{(i)}(k),\tag{18}$$
where
$$\mathrm{MDL}^{(i)}(k)=-\log\left(\frac{\prod_{p=k+1}^{MN}\big(\lambda_p^{(i)}\big)^{1/(MN-k)}}{\frac{1}{MN-k}\sum_{p=k+1}^{MN}\lambda_p^{(i)}}\right)^{(MN-k)S}+\frac{1}{2}k(2MN-k)\log S.\tag{19}$$
Here, $\lambda_p^{(i)}$ is the $p$th eigenvalue (in descending order) of $\bar{\mathbf{R}}^{(i)}$ in (14).
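A minimal sketch of the single-radar 2D MUSIC of (14)-(17), together with the MDL order estimate of (18)-(19), is given below. It reuses the helper functions and the synthetic matrix X1 from the previous sketch; the search-grid limits are illustrative assumptions, and since the MDL criterion is only meaningful when the number of pulses exceeds the covariance dimension, the example call assumes that K is known.

```python
def music_cost(X, K, tau_grid, theta_grid):
    """Single-radar 2D MUSIC cost J(tau, theta) of (16) from the data matrix X of (13)."""
    MN, S = X.shape
    R = X @ X.conj().T / S                          # sample covariance, (14)
    _, eigvec = np.linalg.eigh(R)                   # eigenvalues in ascending order
    Es = eigvec[:, MN - K:]                         # signal subspace (K largest eigenvalues)
    J = np.empty((tau_grid.size, theta_grid.size))
    for it, tau in enumerate(tau_grid):
        b = b_vec(tau)
        for ith, th in enumerate(theta_grid):
            f = np.kron(b, a_vec(th))               # f(tau, theta) = b(tau) kron a(theta), (15)
            # f^H En En^H f = ||f||^2 - ||Es^H f||^2, since [Es En] is unitary
            denom = np.real(f.conj() @ f) - np.linalg.norm(Es.conj().T @ f) ** 2
            J[it, ith] = 1.0 / max(denom, 1e-12)
    return J

def mdl_order(X):
    """MDL criterion of (18)-(19); meaningful when S >= M*N so the covariance is full rank."""
    MN, S = X.shape
    lam_desc = np.sort(np.linalg.eigvalsh(X @ X.conj().T / S))[::-1]
    mdl = np.empty(MN - 1)
    for k in range(MN - 1):
        tail = lam_desc[k:]
        # -log((geometric mean / arithmetic mean)^((MN-k)S)) + 0.5*k*(2MN-k)*log(S)
        mdl[k] = -(MN - k) * S * (np.mean(np.log(tail)) - np.log(np.mean(tail))) \
                 + 0.5 * k * (2 * MN - k) * np.log(S)
    return int(np.argmin(mdl))

tau_grid = 2 * np.linspace(70.0, 90.0, 201) / c       # delays covering ranges of 70-90 m
theta_grid = np.deg2rad(np.linspace(-30.0, 30.0, 121))
J1 = music_cost(X1, K=3, tau_grid=tau_grid, theta_grid=theta_grid)  # K assumed known here
```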

3.2. Distributed 2D MUSIC Algorithm with Coordinate Transformation

To improve the estimation accuracy, the received signals from the distributed FMCW MIMO radars can be collected at the data fusion center and processed coherently, which requires a large data communication overhead and a computational complexity burden at the data fusion center. Instead, we propose a distributed 2D MUSIC using coordinate transformation. Note that the range and angle pairs $(R_k^{(i)},\theta_k^{(i)})$, $k=1,\ldots,K$, are expressed with respect to the $i$th FMCW MIMO radar's coordinate. Accordingly, in the proposed scheme, the coordinates are transformed with respect to the reference radar before the 2D MUSIC algorithm is applied. That is, rather than forwarding the raw data of the received signal to the data fusion center, each radar performs the 2D MUSIC algorithm with its own received signal in the transformed coordinates.
We consider that the distributed radars are placed on a straight line (see Figure 3), but the scheme can be easily extended to arbitrary radar locations. Specifically, the $i$th radar is located at $(x_i,0)$. Without loss of generality, the radar located at $(x_1,0)$ is denoted as the reference radar. From Figure 3, we have two equations representing the relationship between $(R_k^{(1)},\theta_k^{(1)})$ and $(R_k^{(i)},\theta_k^{(i)})$:
$$R_k^{(i)}\cos\theta_k^{(i)}=R_k^{(1)}\cos\theta_k^{(1)},\tag{20}$$
$$R_k^{(i)}\sin\theta_k^{(i)}-R_k^{(1)}\sin\theta_k^{(1)}=D_{i1},\tag{21}$$
where $D_{i1}$ is the distance between the reference radar and the $i$th radar (i.e., $D_{i1}=|x_i-x_1|$). Accordingly, $(R_k^{(i)},\theta_k^{(i)})$ can be expressed in terms of $(R_k^{(1)},\theta_k^{(1)})$ as
$$R_k^{(i)}=\sqrt{\big(R_k^{(1)}\big)^2+D_{i1}^2+2D_{i1}R_k^{(1)}\sin\theta_k^{(1)}},\tag{22}$$
$$\theta_k^{(i)}=\sin^{-1}\left(\frac{(x_1-x_i)+R_k^{(1)}\sin\theta_k^{(1)}}{\sqrt{\big(R_k^{(1)}\big)^2+D_{i1}^2+2D_{i1}R_k^{(1)}\sin\theta_k^{(1)}}}\right).\tag{23}$$
Because $\tau_k^{(i)}=2R_k^{(i)}/c$, from (22) and (23), we can express $(\tau,\theta)$ in the $i$th radar's coordinate as
$$\tau=\frac{2}{c}\sqrt{\big(\bar{\tau}c/2\big)^2+D_{i1}^2+2D_{i1}\big(\bar{\tau}c/2\big)\sin\bar{\theta}}\ \triangleq f_{tr}(\bar{\tau},\bar{\theta}),\tag{24}$$
$$\theta=\sin^{-1}\left(\frac{(x_1-x_i)+\big(\bar{\tau}c/2\big)\sin\bar{\theta}}{\sqrt{\big(\bar{\tau}c/2\big)^2+D_{i1}^2+2D_{i1}\big(\bar{\tau}c/2\big)\sin\bar{\theta}}}\right)\triangleq g_{tr}(\bar{\tau},\bar{\theta}),\tag{25}$$
where $(\bar{\tau},\bar{\theta})$ are expressed in the reference radar's coordinate. Then, by substituting $(\tau,\theta)$ in (24) and (25) into (15) and (16), the cost function (16) at the $i$th FMCW MIMO radar is reformulated with respect to the reference radar's coordinate $(\bar{\tau},\bar{\theta})$ as
$$\bar{J}^{(i)}(\bar{\tau},\bar{\theta})=J^{(i)}\big(f_{tr}(\bar{\tau},\bar{\theta}),g_{tr}(\bar{\tau},\bar{\theta})\big)=\frac{1}{\mathbf{f}^H\big(f_{tr}(\bar{\tau},\bar{\theta}),g_{tr}(\bar{\tau},\bar{\theta})\big)\,\mathbf{E}_n^{(i)}\mathbf{E}_n^{(i)H}\,\mathbf{f}\big(f_{tr}(\bar{\tau},\bar{\theta}),g_{tr}(\bar{\tau},\bar{\theta})\big)}.\tag{26}$$
Then, the distributed radars report $\bar{J}^{(i)}(\bar{\tau},\bar{\theta})$ for the radar image region of interest to the data fusion center. At the data fusion center, the aggregated cost function can be formulated as
$$\bar{J}(\bar{\tau},\bar{\theta})=\sum_{i=1}^{L}w^{(i)}\bar{J}^{(i)}(\bar{\tau},\bar{\theta}),\tag{27}$$
where $w^{(i)}$ is the weight for the $i$th local cost function value, which can be chosen to be proportional to the received SNR at the $i$th radar,
$$w^{(i)}=\frac{\mathrm{SNR}^{(i)}}{\sum_{i'=1}^{L}\mathrm{SNR}^{(i')}}.\tag{28}$$
The delays and angles can then be jointly estimated at the data fusion center as
$$\big(\hat{\bar{\tau}}_k,\hat{\bar{\theta}}_k\big)=\arg\max_{\bar{\tau},\bar{\theta}}\,\bar{J}(\bar{\tau},\bar{\theta}),\quad\text{for }k=1,\ldots,K.\tag{29}$$
Note that the distributed radars do not need to report all their measured signals to the data fusion center, but they forward their local cost function (26) for the radar image region of interest. Based on the above description, the proposed distributed 2D MUSIC algorithm with coordinate transformation is summarized in Algorithm 1.
Algorithm 1 Distributed 2D MUSIC algorithm with coordinate transformation
1: For the $i$th radar, $i=1,\ldots,L$:
2:   Compute $\bar{\mathbf{R}}^{(i)}$.
3:   Estimate the number of targets $\hat{K}^{(i)}$ using the MDL criterion in (18).
4:   Compute the EVD of $\bar{\mathbf{R}}^{(i)}$ as $\bar{\mathbf{R}}^{(i)}=\mathbf{E}^{(i)}\boldsymbol{\Sigma}^{(i)}\mathbf{E}^{(i)H}$.
5:   Formulate $f_{tr}(\bar{\tau},\bar{\theta})$ and $g_{tr}(\bar{\tau},\bar{\theta})$ through the coordinate transformation in (24) and (25).
6:   Compute the cost function $\bar{J}^{(i)}(\bar{\tau},\bar{\theta})$ using $f_{tr}(\bar{\tau},\bar{\theta})$ and $g_{tr}(\bar{\tau},\bar{\theta})$ as in (26).
7:   Forward the local cost $\bar{J}^{(i)}(\bar{\tau},\bar{\theta})$ for the radar image region of interest to the data fusion center.
8: At the data fusion center:
9:   Compute the aggregated cost function as in (27).
10:  Estimate the parameters $(\hat{\bar{\tau}}_k,\hat{\bar{\theta}}_k)$ associated with the largest $\hat{K}$ peaks of the aggregated cost function as in (29), where $\hat{K}=\max_{i=1,\ldots,L}\hat{K}^{(i)}$.
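A sketch of Algorithm 1 in the same NumPy setting is given below; it reuses the helpers and the synthetic data from the previous sketches. The radar positions and the equal fusion weights are placeholder assumptions (in place of the SNR weights (28)), and a signed offset is used in the coordinate transformation so that (22)-(25) stay consistent for radars on either side of the reference.

```python
def signal_subspace(X, K):
    """Signal-subspace eigenvectors Es of the sample covariance (14)."""
    MN, S = X.shape
    _, eigvec = np.linalg.eigh(X @ X.conj().T / S)
    return eigvec[:, MN - K:]

def to_local(R_ref, th_ref, xi, x1=0.0):
    """Re-express a reference-coordinate (R, theta) for the radar at (xi, 0), cf. (22)-(23).
    The signed offset x1 - xi keeps the expressions consistent on either side of the reference."""
    dx = x1 - xi
    R_i = np.sqrt(R_ref ** 2 + dx ** 2 + 2 * dx * R_ref * np.sin(th_ref))
    th_i = np.arcsin((dx + R_ref * np.sin(th_ref)) / R_i)
    return R_i, th_i

def fused_cost(X_list, radar_x, K, tau_bar, theta_bar, weights=None):
    """Per-radar EVD, coordinate-transformed local costs (26), and the aggregated cost (27)."""
    x1 = radar_x[0]                                       # the first radar is the reference
    if weights is None:
        weights = np.full(len(X_list), 1.0 / len(X_list)) # equal weights instead of (28)
    J = np.zeros((tau_bar.size, theta_bar.size))
    for X, xi, w in zip(X_list, radar_x, weights):
        Es = signal_subspace(X, K)
        for it, tb in enumerate(tau_bar):
            for ith, thb in enumerate(theta_bar):
                R_i, th_i = to_local(tb * c / 2, thb, xi, x1)   # f_tr, g_tr of (24)-(25)
                f = np.kron(b_vec(2 * R_i / c), a_vec(th_i))
                denom = np.real(f.conj() @ f) - np.linalg.norm(Es.conj().T @ f) ** 2
                J[it, ith] += w / max(denom, 1e-12)             # weighted local cost, (26)-(27)
    return J

# second radar at (5, 0) m: synthesize its data with its own local target coordinates, then fuse
ranges = np.array([75.0, 80.0, 85.0])
thetas = np.deg2rad([15.0, 0.0, 15.0])
R2, th2 = to_local(ranges, thetas, xi=5.0)
X2 = synthesize_X(R2, th2, vels=[0.5, 1.0, 1.5])
J_bar = fused_cost([X1, X2], radar_x=[0.0, 5.0], K=3, tau_bar=tau_grid, theta_bar=theta_grid)
peak = np.unravel_index(np.argmax(J_bar), J_bar.shape)          # strongest peak, cf. (29)
```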

3.3. Discussion

To see the effectiveness of the proposed 2D MUSIC algorithm with coordinate transformation, we look into the application of the 2D MUSIC algorithm with received signal sharing. That is, when the received signals from the distributed radars are transferred perfectly to the data fusion center and the received signals are uncorrelated, the sample covariance matrix of $\mathbf{X}=[(\mathbf{X}^{(1)})^H,\ldots,(\mathbf{X}^{(L)})^H]^H$ is given as
$$\bar{\mathbf{R}}=\frac{1}{S}\mathbf{X}\mathbf{X}^H\approx\mathrm{diag}\big\{\bar{\mathbf{R}}^{(1)},\ldots,\bar{\mathbf{R}}^{(L)}\big\},\tag{30}$$
and therefore the eigenvectors that span the noise subspace of $\bar{\mathbf{R}}$ can be given as
$$\mathbf{E}_n=\begin{bmatrix}\mathbf{E}_n^{(1)}&\mathbf{0}&\cdots&\mathbf{0}\\\mathbf{0}&\mathbf{E}_n^{(2)}&\cdots&\mathbf{0}\\\vdots&\vdots&\ddots&\vdots\\\mathbf{0}&\mathbf{0}&\cdots&\mathbf{E}_n^{(L)}\end{bmatrix}.\tag{31}$$
Let us define the time and spatial array response vector as
$$\mathbf{f}_T(\bar{\tau},\bar{\theta})=\big[\mathbf{f}_1^T(\bar{\tau},\bar{\theta}),\ldots,\mathbf{f}_L^T(\bar{\tau},\bar{\theta})\big]^T,\tag{32}$$
where $\bar{\tau}$ and $\bar{\theta}$ are, respectively, the delay and the azimuth angle in the reference radar's coordinate, and $\mathbf{f}_i(\bar{\tau},\bar{\theta})$ is the $i$th relative time and spatial array response vector, in which the relative position of the $i$th radar with respect to the reference radar is reflected. We note that $\mathbf{f}_i(\bar{\tau},\bar{\theta})=K_i\,\mathbf{b}(\tau^{(i)})\otimes\mathbf{a}(\theta^{(i)})$ with a complex-valued constant $K_i$ such that $|K_i|=1$. Here, $\tau^{(i)}$ and $\theta^{(i)}$ are, respectively, the delay and the azimuth angle in the $i$th radar's coordinate. The cost function for the 2D MUSIC with received signal sharing can then be given as
$$\bar{J}_T(\bar{\tau},\bar{\theta})=\frac{1}{\mathbf{f}_T^H(\bar{\tau},\bar{\theta})\,\mathbf{E}_n\mathbf{E}_n^H\,\mathbf{f}_T(\bar{\tau},\bar{\theta})}=\frac{1}{\sum_{i=1}^{L}\mathbf{f}^H(\tau^{(i)},\theta^{(i)})\,\mathbf{E}_n^{(i)}\mathbf{E}_n^{(i)H}\,\mathbf{f}(\tau^{(i)},\theta^{(i)})}=\frac{1}{\sum_{i=1}^{L}\mathbf{f}^H\big(f_{tr}(\bar{\tau},\bar{\theta}),g_{tr}(\bar{\tau},\bar{\theta})\big)\,\mathbf{E}_n^{(i)}\mathbf{E}_n^{(i)H}\,\mathbf{f}\big(f_{tr}(\bar{\tau},\bar{\theta}),g_{tr}(\bar{\tau},\bar{\theta})\big)},\tag{33}$$
and accordingly, the delays and angles can be estimated at the data fusion center as
$$\big(\hat{\bar{\tau}}_k,\hat{\bar{\theta}}_k\big)=\arg\max_{\bar{\tau},\bar{\theta}}\,\bar{J}_T(\bar{\tau},\bar{\theta}),\quad\text{for }k=1,\ldots,K.\tag{34}$$
We note that (33) is comparable with (27). That is, both (27) and (33) are maximized when $\mathbf{f}^H\big(f_{tr}(\bar{\tau},\bar{\theta}),g_{tr}(\bar{\tau},\bar{\theta})\big)\mathbf{E}_n^{(i)}\mathbf{E}_n^{(i)H}\mathbf{f}\big(f_{tr}(\bar{\tau},\bar{\theta}),g_{tr}(\bar{\tau},\bar{\theta})\big)$ for $i=1,\ldots,L$ simultaneously become close to zero. From this observation, the proposed 2D MUSIC, which does not require transferring the raw data of the received signals to the data fusion center, gives estimation performance similar to that of the 2D MUSIC with received signal sharing.
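The second and third equalities in (33) rest on $\mathbf{E}_n$ being block diagonal; a few lines of NumPy (with random matrices standing in for the per-radar quantities, so the dimensions and seed are arbitrary assumptions) make the underlying identity explicit.

```python
import numpy as np
from scipy.linalg import block_diag

rng = np.random.default_rng(1)
dims, K = [6, 6, 6], 2                        # toy per-radar dimensions and number of targets
En_blocks = [np.linalg.qr(rng.standard_normal((n, n - K))
                          + 1j * rng.standard_normal((n, n - K)))[0] for n in dims]
f_blocks = [rng.standard_normal(n) + 1j * rng.standard_normal(n) for n in dims]

En = block_diag(*En_blocks)                   # block-diagonal noise subspace, as in (31)
fT = np.concatenate(f_blocks)                 # stacked response vector, as in (32)
lhs = np.real(fT.conj() @ En @ En.conj().T @ fT)        # denominator of (33), first form
rhs = sum(np.real(f.conj() @ E @ E.conj().T @ f)        # sum of the per-radar denominators
          for f, E in zip(f_blocks, En_blocks))
assert np.isclose(lhs, rhs)
```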

4. Simulation Results

To verify the proposed distributed 2D MUSIC algorithm, computer simulations are performed. Throughout the simulations, we use a center frequency of 77 GHz, an operating bandwidth of 300 MHz, a pulse duration of 3.33 μs, and a sampling frequency of 600 MHz. The number of pulses is set to 100, and the number of time samples is set to 2000 per pulse. The FMCW MIMO radar is configured with $M_t=2$ and $M_r=4$, where the transceiver antennas are placed such that a virtual uniform linear array is formed with inter-antenna spacing $\lambda/2$, where $\lambda$ is the wavelength of the FMCW waveform. Throughout the simulations, we assume that the line-of-sight (LoS) component is present for each target and that there is no multipath. For the LoS path, a typical path-loss exponent of 2 is used, but the setup can be extended to general channel models.
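For reference, a small arithmetic sketch of the quantities implied by these simulation parameters (only the values stated above are used; nothing beyond them is assumed):

```python
# derived quantities from the simulation parameters in Section 4
c = 3e8                         # speed of light (m/s)
B, Tp, fs = 300e6, 3.33e-6, 600e6
alpha = B / Tp                  # chirp rate, about 9.0e13 Hz/s
N = fs * Tp                     # about 2000 fast-time samples per pulse, as stated
dR = c / (2 * B)                # FFT-based range resolution = 0.5 m (subspace methods resolve finer)
f_beat = alpha * 2 * 80.0 / c   # beat frequency of a target at 80 m, about 48 MHz (< fs/2)
```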

4.1. The 2D MUSIC-Based Radar Image Comparison

We consider three targets located at $(R_k,\theta_k)=\{(75\,\mathrm{m},15^\circ),(80\,\mathrm{m},0^\circ),(85\,\mathrm{m},15^\circ)\}$ and two distributed FMCW MIMO radars, where the reference radar is located at the origin $(0,0)$ m and the other radar at $(D,0)$ m. Figure 4a shows the radar image at the reference radar based on the conventional 2D MUSIC algorithm in Section 3.1 when the received SNR is 10 dB. Note that three peaks are observed at $\{(75\,\mathrm{m},15^\circ),(80\,\mathrm{m},0^\circ),(85\,\mathrm{m},15^\circ)\}$. Interestingly, all peaks are relatively wide along the azimuth-angle axis compared to the range axis. This is because the number of elements in the virtual array is much smaller than the number of fast-time samples, resulting in relatively low angle resolution. In Figure 4b, the radar image using the proposed distributed 2D MUSIC with coordinate transformation is shown with $D=5$ m, when the received SNR is 10 dB at both FMCW MIMO radars. That is, the image is plotted based on (27). We can again find three peaks at $\{(75\,\mathrm{m},15^\circ),(80\,\mathrm{m},0^\circ),(85\,\mathrm{m},15^\circ)\}$. Note that the peaks are sharper than those in Figure 4a, which implies that the proposed distributed 2D MUSIC achieves higher angle resolution and lower estimation errors than the conventional 2D MUSIC without received signal sharing.
In Figure 5, the radar images are shown when (a) the images obtained from the two distributed radars are simply averaged and (b) the proposed 2D MUSIC with coordinate transformation is applied using (27). Here, we again consider three targets located at $(R_k,\theta_k)=\{(75\,\mathrm{m},15^\circ),(80\,\mathrm{m},0^\circ),(85\,\mathrm{m},15^\circ)\}$. In Figure 5a, some ghost targets are observed because the coordinates of the distributed radars are not properly aligned. In Figure 5b, however, three peaks associated with the true target positions are found with high resolution.
In Figure 6a, the radar image using the proposed 2D MUSIC algorithm with coordinate transformation is shown with $D=1$ m, when the received SNRs at the distributed radars are different. Specifically, the received SNR at the reference radar is 6.6 dB, while that at the other radar is 10 dB. We can again find three peaks at the same positions. For comparison, Figure 6b shows the radar image at the reference radar using the conventional 2D MUSIC algorithm. We note that the peaks associated with the targets are less apparent than those in Figure 6a.
In Figure 7, the radar images using the proposed distributed 2D MUSIC with coordinate transformation are shown for different inter-radar spacings (i.e., $D=\{5,10\}$ m). As the inter-radar spacing increases, the radar image resolution improves.

4.2. Mean Square Error Comparison

In this section, Monte Carlo simulations are carried out to present the performance of the proposed algorithm. As performance measures, we evaluate the RMSEs of the range and angle estimates, given as follows:
$$\mathrm{RMSE}_r=\sqrt{\frac{1}{K}\sum_{k=1}^{K}\big(\hat{\bar{r}}_k-r_k\big)^2},\qquad \mathrm{RMSE}_\theta=\sqrt{\frac{1}{K}\sum_{k=1}^{K}\big(\hat{\bar{\theta}}_k-\theta_k\big)^2}.$$
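In code, these error measures amount to the following sketch, where the estimate/truth pairings are assumed to have been matched beforehand:

```python
import numpy as np

def rmse(estimates, truths):
    """RMSE over the K targets for one trial, matching the expressions above."""
    e = np.asarray(estimates, dtype=float) - np.asarray(truths, dtype=float)
    return np.sqrt(np.mean(e ** 2))

# over Monte Carlo trials, the squared errors are pooled before taking the square root
```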
In Figure 8, we set $M_t=2$, $M_r=4$, $L=2$, and $K=2$ with $(R_k,\theta_k)=\{(70\,\mathrm{m},15^\circ),(75\,\mathrm{m},15^\circ)\}$. Figure 8a shows the RMSEs of range estimation for various SNRs when two distributed FMCW MIMO radars with $D=10$ m are exploited with the proposed 2D MUSIC algorithm. For comparison, we also evaluate the RMSEs when the conventional 2D MUSIC with and without received signal sharing is exploited. From Figure 8a, as the SNR increases, the RMSEs of the proposed algorithm and of the conventional algorithms with/without received signal sharing decrease; at low SNR values, the proposed algorithm shows lower RMSEs than the conventional algorithm without received signal sharing. Interestingly, the proposed 2D MUSIC algorithm exhibits RMSE performance similar to that of the 2D MUSIC with received signal sharing, which coincides with the discussion in Section 3.3. From Figure 8b, the RMSEs of angle estimation for the proposed algorithm are also lower than those of the conventional algorithm without data sharing, similar to the observation in Figure 8a.
Figure 9 shows the RMSEs of angle and range estimation for various numbers of FMCW MIMO radars and Rx antennas. Specifically, in Figure 9a, the RMSEs for various numbers of FMCW MIMO radars are evaluated when the proposed algorithm is exploited with $\mathrm{SNR}=4$ dB, $M_t=2$, and $M_r=4$, where the $i$th radar is located at $((i-1)D,0)$ m with $D=1$ m. From Figure 9a, as the number of radars increases, the RMSEs of both range and angle decrease. Accordingly, as the number of radars increases, high-resolution images with narrower peaks associated with the targets can be achieved, which coincides with the observation in Section 4.1. In Figure 9b, the RMSEs for various numbers of Rx antennas with two FMCW MIMO radars are evaluated. Here, $\mathrm{SNR}=4$ dB, $M_t=2$, and $D=10$ m. From Figure 9b, as the number of Rx antennas increases, the RMSEs of both range and angle also decrease.

5. Experiment Results

To experimentally demonstrate the proposed 2D MUSIC algorithm, a distributed FMCW MIMO radar system is set up as in Figure 10 using two W-band FMCW radars (TI AWR1642), each with two Tx and four Rx antennas. Here, the two radars are located at $(0,0)$ m and $(2,0)$ m. Two targets are located at $(7.76\,\mathrm{m},14.93^\circ)$ and $(5.39\,\mathrm{m},21.80^\circ)$ with respect to the reference radar at $(0,0)$ m. The FMCW chirp configuration used in the experiment is shown in Table 1.
Figure 11a shows the radar image when the measured data obtained from the reference radar located at $(0,0)$ m are exploited with the conventional 2D MUSIC algorithm. Two targets are detected, and the estimates are $(R_k,\theta_k)=\{(8.0\,\mathrm{m},15.30^\circ),(5.64\,\mathrm{m},21.60^\circ)\}$. Figure 11b shows the radar image when the measured data acquired at $(2,0)$ m are exploited with the conventional 2D MUSIC algorithm using the coordinate transformation; accordingly, the target peaks appear slightly slanted. In Figure 12, the radar image is obtained by using the proposed 2D MUSIC algorithm with coordinate transformation. The image in Figure 12 shows two peaks located at $\{(8.16\,\mathrm{m},15.30^\circ),(5.64\,\mathrm{m},19.0^\circ)\}$ with a higher resolution; that is, the target images appear more focused. Accordingly, we can estimate the targets with high resolution without sharing the raw data of the received signals of the distributed radars.

6. Conclusions

In this paper, we proposed the distributed 2D MUSIC algorithm with coordinate transformation to estimate the range and angle information of multiple targets with distributed FMCW MIMO radars. In the proposed scheme, through the coordinate transformation at each FMCW MIMO radar, the 2D MUSIC algorithm can be performed separately with respect to the reference radar's coordinate. Accordingly, rather than reporting the raw data of the received signal to the data fusion center, the distributed radars forward their local cost function values of the 2D MUSIC for the radar image region of interest. Because the local cost function values are computed with respect to the same coordinate, their weighted sum can be computed at the data fusion center, and the range and angle information of multiple targets is jointly estimated from the aggregated cost function. In the computer simulations, the proposed distributed 2D MUSIC algorithm achieves high-resolution radar images with narrow peaks associated with the targets. It is also confirmed through the Monte Carlo simulations that the proposed algorithm shows lower RMSEs than the conventional method, especially at low SNR (below 10 dB), which implies that the proposed algorithm has higher immunity to the additive noise. Finally, by applying the proposed scheme to the experimentally measured data, it is verified that the range and angle parameters of multiple targets can be estimated with high resolution without sharing the raw data of the received signals at the distributed radars.

Author Contributions

Conceptualization, J.S. and J.P.; methodology, J.S. and J.P.; software, J.S., J.L. and J.P.; validation, J.P., H.K. and S.Y.; formal analysis, J.P.; investigation, J.S. and J.L.; writing—original draft preparation, J.S.; writing—review and editing, J.P.; supervision, J.P.; project administration, J.P., H.K. and S.Y.; funding acquisition, H.K. and S.Y. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported in part by an Electronics and Telecommunications Research Institute (ETRI) grant funded by the Korean government (21ZH1100; Study on 3D Communication Technology for Hyperconnectivity) and in part by the Basic Science Research Program through the National Research Foundation of Korea funded by the Ministry of Education (2018R1D1A1B07043786).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Hasch, J.; Topak, E.; Schnabel, R.; Zwick, T.; Weigel, R.; Waldschmidt, C. Millimeter-Wave Technology for Automotive Radar Sensors in the 77 GHz Frequency Band. IEEE Trans. Microw. Theory Tech. 2012, 60, 845–860.
  2. Jankiraman, M. Design of Multi-Frequency CW Radars; SciTech Publishing: Boston, MA, USA, 2007; Volume 2.
  3. Stove, A.G. Linear FMCW Radar Techniques. IEE Proc. F Radar Signal Process. 1992, 139, 343–350.
  4. Hakobyan, G.; Yang, B. High-Performance Automotive Radar: A Review of Signal Processing Algorithms and Modulation Schemes. IEEE Signal Process. Mag. 2019, 36, 32–44.
  5. Belfiori, F.; van Rossum, W.; Hoogeboom, P. 2D-MUSIC Technique Applied to a Coherent FMCW MIMO Radar. In Proceedings of the IET International Conference on Radar Systems (Radar 2012), Glasgow, UK, 22–25 October 2012; pp. 1–6.
  6. Feger, R.; Wagner, C.; Schuster, S.; Scheiblhofer, S.; Jager, H.; Stelzer, A. A 77-GHz FMCW MIMO Radar Based on an SiGe Single-Chip Transceiver. IEEE Trans. Microw. Theory Tech. 2009, 57, 1020–1035.
  7. De Wit, J.J.M.; van Rossum, W.L.; de Jong, A.J. Orthogonal Waveforms for FMCW MIMO Radar. In Proceedings of the 2011 IEEE RadarCon (RADAR), Kansas City, MO, USA, 23–27 May 2011; pp. 686–691.
  8. Kim, B.S.; Jin, Y.; Lee, J.; Kim, S. High-Efficiency Super-Resolution FMCW Radar Algorithm Based on FFT Estimation. Sensors 2021, 21, 4018.
  9. Kim, S.; Lee, K.K. Low-Complexity Joint Extrapolation-MUSIC-Based 2-D Parameter Estimator for Vital FMCW Radar. IEEE Sens. J. 2019, 19, 2205–2216.
  10. Lee, J.; Park, J.; Chun, J. Weighted Two-Dimensional Root MUSIC for Joint Angle-Doppler Estimation with MIMO Radar. IEEE Trans. Aerosp. Electron. Syst. 2019, 55, 1474–1482.
  11. Zhang, X.; Chen, W.; Zheng, W.; Xia, Z.; Wang, Y. Localization of Near-Field Sources: A Reduced-Dimension MUSIC Algorithm. IEEE Commun. Lett. 2018, 22, 1422–1425.
  12. Kim, T.Y.; Hwang, S.S. Cascade AOA Estimation Algorithm Based on Flexible Massive Antenna Array. Sensors 2020, 20, 6797.
  13. Nie, W.; Xu, K.; Feng, D.; Wu, C.Q.; Hou, A.; Yin, X. A Fast Algorithm for 2D DOA Estimation Using an Omnidirectional Sensor Array. Sensors 2017, 17, 515.
  14. Amine, I.M.; Seddik, B. 2-D DOA Estimation Using MUSIC Algorithm with Uniform Circular Array. In Proceedings of the 2016 4th IEEE International Colloquium on Information Science and Technology (CiSt), Tangier, Morocco, 24–26 October 2016; pp. 850–853.
  15. Goossens, R.; Rogier, H. A Hybrid UCA-RARE/Root-MUSIC Approach for 2-D Direction of Arrival Estimation in Uniform Circular Arrays in the Presence of Mutual Coupling. IEEE Trans. Antennas Propag. 2007, 55, 841–849.
  16. Zhao, H.; Cai, M.; Liu, H. Two-Dimensional DOA Estimation with Reduced-Dimension MUSIC Algorithm. In Proceedings of the 2017 International Applied Computational Electromagnetics Society Symposium (ACES), Firenze, Italy, 26–30 March 2017; pp. 1–2.
  17. Xie, R.; Hu, D.; Luo, K.; Jiang, T. Performance Analysis of Joint Range-Velocity Estimator with 2D-MUSIC in OFDM Radar. IEEE Trans. Signal Process. 2021, 69, 4787–4800.
  18. Shi, F. Two Dimensional Direction-of-Arrival Estimation Using Compressive Measurements. IEEE Access 2019, 7, 20863–20868.
  19. Shin, D.H.; Jung, D.H.; Kim, D.C.; Ham, J.W.; Park, S.O. A Distributed FMCW Radar System Based on Fiber-Optic Links for Small Drone Detection. IEEE Trans. Instrum. Meas. 2017, 66, 340–347.
  20. Lin, C.; Huang, C.; Su, Y. A Coherent Signal Processing Method for Distributed Radar System. In Proceedings of the 2016 Progress in Electromagnetic Research Symposium (PIERS), Shanghai, China, 8 August–11 September 2016; pp. 2226–2230.
  21. Yin, P.; Yang, X.; Liu, Q.; Long, T. Wideband Distributed Coherent Aperture Radar. In Proceedings of the 2014 IEEE Radar Conference, Cincinnati, OH, USA, 19–23 May 2014; pp. 1114–1117.
  22. Frischen, A.; Hasch, J.; Jetty, D.; Girma, M.; Gonser, M.; Waldschmidt, C. A Low-Phase-Noise 122-GHz FMCW Radar Sensor for Distributed Networks. In Proceedings of the 2016 European Radar Conference (EuRAD), London, UK, 5–7 October 2016; pp. 49–52.
  23. Commin, H.; Manikas, A. Virtual SIMO Radar Modelling in Arrayed MIMO Radar. In Proceedings of the Sensor Signal Processing for Defence (SSPD 2012), London, UK, 25–27 September 2012; pp. 1–6.
  24. Lee, J.; Hwang, S.; You, S.; Byun, W.; Park, J. Joint Angle, Velocity, and Range Estimation Using 2D MUSIC and Successive Interference Cancellation in FMCW MIMO Radar System. IEICE Trans. Commun. 2019.
  25. Wax, M.; Kailath, T. Detection of Signals by Information Theoretic Criteria. IEEE Trans. Acoust. Speech Signal Process. 1985, 33, 387–392.
  26. Grünwald, P.D.; Myung, I.J.; Pitt, M.A. Advances in Minimum Description Length: Theory and Applications; MIT Press: Cambridge, MA, USA, 2005.
Figure 1. Distributed FMCW MIMO radar and the multiple target environment.
Figure 2. Pictorial description of $\mathbf{x}^{(i)}[n,s]$ and $\bar{\mathbf{x}}^{(i)}[s]$.
Figure 3. Coordinate transformation in distributed FMCW MIMO radar system.
Figure 4. Radar images obtained by using (a) the conventional 2D MUSIC with $\mathrm{SNR}=10$ dB and (b) the proposed distributed 2D MUSIC with $D=5$ m and $\mathrm{SNR}=10$ dB.
Figure 5. Radar images obtained by using (a) the conventional 2D MUSIC and (b) the proposed distributed 2D MUSIC in a top view.
Figure 6. Radar images obtained by using (a) the proposed 2D MUSIC and (b) the conventional 2D MUSIC when $\mathrm{SNR}=\{10,6.6\}$ dB at two different radars.
Figure 7. Radar images obtained by using the proposed distributed 2D MUSIC algorithm with (a) $D=5$ m and (b) $D=10$ m.
Figure 8. RMSE values of (a) range estimation and (b) angle estimation over various SNRs.
Figure 9. RMSE values of range and angle estimation over various numbers of (a) FMCW MIMO radars and (b) Rx antennas.
Figure 10. Experiment setup to measure the distributed FMCW radar signals.
Figure 11. Radar images (a) at the reference radar by using the conventional 2D MUSIC and (b) at the distributed radar by using the conventional 2D MUSIC with coordinate transformation.
Figure 12. Radar image by using the proposed 2D MUSIC algorithm with coordinate transformation.
Table 1. FMCW experiment parameter setting.

| Parameter | Value |
| Type of signal waveform | Linear chirped waveform |
| Chirp BW | 1798.82 MHz |
| Number of chirps per frame | 256 |
| Number of chirp loops | 120 |
| Range resolution | 0.1953 m |
| Velocity resolution | 0.0472 m/s |
| Tx power | 12 dBm |
| Sampling rate | 10,000 ksps |