Article

ESCI: An End-to-End Spatiotemporal Correlation Integration Framework for Low-Observable Extended UAV Tracking with Cascade MIMO Radar Subject to Mixed Interferences

1 School of Mechanical and Electrical Engineering, Southwest Petroleum University, Chengdu 610500, China
2 School of Artificial Intelligence, Anhui University, Hefei 230001, China
3 School of Transportation and Transportation Engineering, Chongqing Jiaotong University, Chongqing 400074, China
* Author to whom correspondence should be addressed.
Electronics 2025, 14(11), 2181; https://doi.org/10.3390/electronics14112181
Submission received: 31 March 2025 / Revised: 17 May 2025 / Accepted: 19 May 2025 / Published: 27 May 2025

Abstract

Continuous and robust trajectory tracking of unmanned aerial vehicles (UAVs) plays a crucial role in urban air transportation systems. Accordingly, this article presents an end-to-end spatiotemporal correlation integration (ESCI)-based UAV tracking framework that leverages a high-resolution cascade multiple input multiple output (MIMO) radar. Rather than treating detection and tracking as separate stages, the proposed joint anti-interference detection and tracking system for weak extended targets handles them jointly by integrating a continuous detection process into tracking. It not only eliminates the threshold decision-making process to avoid the loss of weak target information, but also significantly reduces the interference from other co-channel radars and strong clutters by exploiting the spatiotemporal correlations within a sequence of radar frames, thereby improving the detectability of weak targets. In addition, to accommodate the time-varying number and extended size of radar reflections, the extended UAV with multiple scattering sources is treated as a single entity to track by means of an ellipse spatial probability distribution model, so that the complex measurement-to-object association procedure can be avoided. Finally, a cascade frequency-modulated continuous wave–multiple input multiple output (FMCW-MIMO) radar platform is built with Texas Instruments AWR2243 (TI AWR2243) devices to evaluate the framework. The results show that the proposed method achieves outstanding anti-interference performance for extended UAV tracking compared with state-of-the-art methods.

1. Introduction

Thanks to the enormous progress of UAV technology, urban air mobility (UAM) is predicted to be an innovative transportation system that will provide a variety of services, such as the safe and efficient delivery of shipments and passengers, search and rescue missions and medical assistance in low-altitude airspace, which is as important as ground-based transportation systems [1,2]. As a critical component of the bottom-most perception layer in UAM, UAV tracking plays a key role in accurate control decisions, intelligent and safe navigation and malicious vehicle prevention [3,4,5,6]. In addition, in future zero-trust management architectures [7], reliable trajectory tracking is the prerequisite for trust evaluation and safe flight in UAM. However, UAV tracking remains a popular and challenging problem because of the small radar signature of UAVs and the complex interferences at low altitude.
In the past decades, frequency modulated continuous wave (FMCW) radar, which can provide both long detection range and high range resolution, has been widely utilized in moving target tracking [7,8,9,10]. Recently, the advances in cascade multiple input multiple output (MIMO) techniques [11,12] provide increasing resolutions for angle of arrival (AOA). The high range and angle resolutions of cascade FMCW-MIMO radar can not only achieve good resistance to interferences from multipath and clutter but also obtain rich extension parameters in addition to conventional locations and kinematic states, such as shape, size and, possibly, directions of motion, which can provide supplemental information for target tracking, as well as subsequent planning and control [13,14]. On this account, we explore the feasibility of implementing low-observable aerial vehicle tracking with high-resolution cascade FMCW-MIMO radar in this article.
In high-resolution cascaded FMCW-MIMO radar systems, there may be multiple spatially distributed measurement sources on the unmanned aerial vehicle (UAV). In each time step, the backscattered signals from different measurement sources will occupy multiple resolution cells of the measurement space, as shown in Figure 1. On this account, the point-target assumption in classical tracking methods, such as particle filter (PF) [15], Kalman filter (KF) and its variants [16,17], will become invalid. Although the spatially distributed measurements can be treated as multiple targets to track by means of data association-based algorithms, such as nearest neighbor data association (NNDA) [18], joint probabilistic data association (JPDA) [19] and multiple hypothesis tracker (MHT) [20], it is difficult to complete accurate measurement-to-source association because the extended measurements highly depend on the properties of the target and the radar-to-object geometry, which often quickly varies with moving targets. Also, the contour tracking algorithm [21] is not appropriate for tracking extended UAV either because it is nearly impossible to correctly extract the contour with the sparse and time-varying radar point cloud. In [22], to achieve extended target tracking, the center point of the multiple spatially distributed measurements was first extracted by density-based spatial clustering of applications with noise (DBSCAN) and then the classic point tracking method was employed to track the cluster center. Unfortunately, the extended features (e.g., the shapes and sizes) hidden in measurements will be lost in the clustering process. By introducing shape parameters in the target state, the random matrix method (RMM) [23,24] and random finite set (RFS) approaches [25,26] can simultaneously track the positions and extensions of a target without considering measurement-to-source association problem. 
Additionally, the second-order extended Kalman filter (SOEKF) [27] and random hypersurface modeling (RHM) [28] are two other effective extended target tracking methods. Nevertheless, the tracking performance of these approaches highly depends on the accuracy of detections, which is easily degraded by interferences in urban air environments, such as mutual intrusion from other co-channel radars and strong clutter interruption from infrastructures.
Aiming at mitigating the impact of interferences in complex electromagnetic environments, several anti-interference techniques have been presented recently. Among these, the moving target indication (MTI) algorithm [29] is a straightforward way to suppress clutter interferences depending on Doppler frequency diversity of clutters and moving targets. In [30], for further mitigating the residual clutters, a subspace tracking algorithm with an H-infinity filter was proposed for robust target tracking. Other algorithms like clutter model [31,32] and multiply-domain joint adaptive processing [33,34] were also utilized for clutter cancellation. Nevertheless, they can only eliminate the clutter interferences.
To address mutual interferences from other radars, distributed radar networks were presented in [35,36]. By identifying the consistency of the real target position and velocity observed by different radars, data correlation testing algorithms were developed to suppress deception jamming. In [37], Aydogdu et al. indicated that mutually orthogonal radar waveforms can be employed for high interference suppression. However, such waveforms are currently not available for prevalent low-cost hardware architectures. Apart from radar networks and orthogonal waveforms, advanced signal processing is a promising alternative for mutual interference suppression. In [38], by employing the low-rank property of target echoes across multiple channels, Wang et al. proposed a tensor decomposition-based interference mitigation method. In [39], beat-frequency interpolation, combined with phase matching and reconfigurable linear prediction coefficient estimation, was presented for mutual interference suppression in the short-time Fourier transform (STFT) domain. In [40], the authors exploited the one-way propagation characteristic of mutual interferences to achieve interference detection and suppression. In [41], a convolutional autoencoder deep learning network was given for interference mitigation and multi-person activity sensing. However, the UAV with a low radar cross section (RCS) has weak backscattered signals, which cannot meet the high signal-to-interference-plus-noise ratio (SINR) assumption of the above algorithms [42,43]. Although long-time coherent integration algorithms [44,45,46,47] can improve the SINR by coherently integrating multiple backscattered signals, the integration performance overly relies on the motion model, which is extremely difficult to build correctly over a long duration since the movement of a UAV is complex and flexible.
In the field of low-observable UAV tracking, the multi-scattering-source characteristics of UAVs lead to spatially distributed backscattered signals spanning multiple resolution cells, rendering classical "point-target" tracking methods ineffective. Data association-based multi-target tracking algorithms face challenges in resolving measurement-to-source association ambiguity, while contour tracking and cluster-center tracking approaches suffer from feature loss or limited applicability and are also susceptible to mutual interference from co-band radars and strong clutter. Furthermore, current methods encounter bottlenecks in robustness under complex environments, interference suppression capability and adaptability to dynamic motion models. This necessitates further exploration of adaptive tracking techniques and integrated anti-interference technologies tailored for low SINR scenarios.
In this article, we aim to provide a continuous and robust tracking framework for a low-observable and extended UAV by employing a high-resolution cascade FMCW-MIMO radar subject to interferences from strong clutters and other radars. In this tracking framework, we first convert the uncertainty of the scattering sources into the randomness of ellipse sizes and intensities in spatially extended resolution cells. The benefits are that the complex measurement-to-source association process is removed and the extended UAV can be treated as an entity to track by means of an ellipse-based spatial probability distribution model. On this basis, to deal with the low-observable UAV tracking problem in the presence of ghost targets generated by strong clutters or mutually interfering radars in urban low-altitude airspace, we formulate and derive an end-to-end spatiotemporal correlation integration (ESCI)-based tracking framework that can simultaneously capture both the uncertainty regarding the object's existence and the uncertainty of the target's state. It employs a coherent integration algorithm to sufficiently improve the SINR in one radar frame and then applies a recursive Bayesian estimation-based track-before-detect strategy to complete UAV tracking and interference suppression by directly utilizing the unthresholded radar spectrum, which avoids the weak target information loss caused by the conventional threshold-decision process prior to tracking. The main contributions can be summarized as follows.
(1) We present a two-dimensional Keystone transform (2D-KT) to correct the range cell migration (RCM) in the range–slow-time domain and range–channel plane for coherently integrating the multi-pulse power of the UAV in one radar frame, which can sufficiently improve the SINR in the radar's raw spectrum.
(2) We regard the occupied resolution cells from time-varying scattering sources spatially structured on the UAV as an entity and further develop an ellipse spatial probability distribution to model the extension of the UAV in the radar spectrum, removing the complex measurement-to-source association process in conventional tracking methods.
(3) By exploring the spatiotemporal correlation characteristics of the backscattered signals from the UAV in a sequence of radar spectra, we propose an ESCI-based UAV tracking framework to recursively estimate the shape parameters and kinematic variables (position, velocity) of the target in the presence of mixed interferences. By means of the ellipse spatial probability distribution model and the point diffusion function, a pseudo-range–Doppler–azimuth spectrum is generated to derive the measurement likelihood ratio, avoiding the weak target information loss induced by the threshold-decision process.
(4) Theoretical analysis and experimental results show that the proposed method can obtain outstanding anti-interference performance of UAV tracking in contrast to the state-of-the-art (SOTA) methods, especially under low SINR conditions.

2. System Design

As shown in Figure 2, this section presents the proposed spatiotemporal correlation integration-based trajectory tracking framework for a low-observable and extended UAV in an urban air transportation system. Firstly, the high-resolution cascade radar platform and the corresponding signal processing algorithm are given in the following subsection, where the power of the received signals from the UAV is maximally accumulated in the range–Doppler–azimuth spectrum to improve the input SINR. Then, with a sequence of range–Doppler–azimuth spectra, an end-to-end spatiotemporal correlation integration-based tracking framework is proposed in the second subsection to recursively track the low-observable and extended UAV.

2.1. Cascade FMCW-MIMO Radar Signal Processing for Intra-Channel and Inter-Channel Coherent Integration

2.1.1. Signal Model of Cascade FMCW-MIMO Radar for Extended UAV

At the transmitter, the FMCW signals are first generated and transmitted to the area of interest; the transmitted signal can be expressed as
$$ s(\hat{t}, t_m) = \exp\left[ j 2\pi f_0 (\hat{t} + t_m) + j \pi \mu \hat{t}^2 \right] \quad (1) $$
where $\hat{t} \in [0, T_p)$ is the fast time variable, $T_p$ is the pulse duration, $t_m = m T_p$, $m = 1, 2, \ldots, M$, is the slow time variable, $M$ is the number of pulses in one radar frame, $\mu = B / T_p$ is the chirp rate, $B$ is the bandwidth and $f_0$ is the carrier frequency.
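As a concrete illustration, the chirp model above can be sampled numerically. The sketch below uses hypothetical 77 GHz parameters ($f_0$, $B$, $T_p$ and the sampling rate are assumptions, not the paper's configuration).

```python
import numpy as np

# Numeric sketch of the transmitted FMCW chirp defined above.
# All parameter values here are illustrative assumptions.
f0 = 77e9              # carrier frequency f_0 [Hz]
B = 2.4e9              # sweep bandwidth B [Hz]
Tp = 50e-6             # pulse duration T_p [s]
mu = B / Tp            # chirp rate mu = B / T_p [Hz/s]
fs = 20e6              # fast-time sampling rate (assumption)

n_samples = round(Tp * fs)          # samples per chirp
t_hat = np.arange(n_samples) / fs   # fast time within one chirp
t_m = 3 * Tp                        # slow time of the 4th pulse, m * T_p

# s(t_hat, t_m) = exp[j 2 pi f0 (t_hat + t_m) + j pi mu t_hat^2]
s = np.exp(1j * (2 * np.pi * f0 * (t_hat + t_m) + np.pi * mu * t_hat**2))

# The chirp is a unit-amplitude complex exponential whose instantaneous
# frequency sweeps from f0 to f0 + B over the pulse duration.
assert np.allclose(np.abs(s), 1.0)
assert n_samples == 1000
```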
Then, based on the relationship shown in Figure 3, the backscattered signals of the extended UAV from the n T -th transmitting antenna at the n R -th receiving element can be described as
$$ r_u(\hat{t}, t_m, n_T, n_R) = \sum_{i=1}^{I_k} \sigma_i \exp\Big\{ j 2\pi f_0 \big[ \hat{t} + t_m - \tau_i^k(t_m) - \tau_i^k(n_T) - \tau_i^k(n_R) \big] + j \pi \mu \big[ \hat{t} - \tau_i^k(t_m) - \tau_i^k(n_T) - \tau_i^k(n_R) \big]^2 \Big\} \quad (2) $$
where $I_k$ is the number of reflection sources at the $k$-th radar frame, $\sigma_i$ is the backscattered power from the $i$-th reflection source of the extended UAV, $\tau_i^k(t_m) = 2(R_i + v_0 t_m)/c$ is the round-trip time of backscattered signals from the first transmitting antenna at the first receiver, $R_i$ is the initial slant distance between the radar and the $i$-th reflection source, $v_0$ is the initial velocity of the UAV, which is assumed to be constant during one radar frame, $c = 3 \times 10^8$ m/s is the light speed, $\tau_i^k(n_T) = (n_T - 1) d_T \sin\theta_i / c$ and $\tau_i^k(n_R) = (n_R - 1) d_R \sin\theta_i / c$ are, respectively, the additional propagation time delays induced by spacing differences of the transmitting and receiving antennas, $\theta_i$ is the angle of arrival (AOA), which is approximately equal to the angle of departure (AOD), $d_T$ and $d_R$ are, respectively, the spacing distances of the transmitting and receiving arrays, $n_T = 1, 2, \ldots, N_T$ and $n_R = 1, 2, \ldots, N_R$ are, respectively, the indices of the transmitting and receiving antennas, $N_T$ is the number of transmitting antennas and $N_R$ is the number of receiving elements.
After demodulation and dechirp processes [48], the beat signal collected by the data capture board can be described as
$$ r_{beat}(\hat{t}, t_m, n_T, n_R) = r_u(\hat{t}, t_m, n_T, n_R) \times s^*(\hat{t}, t_m) \approx \sum_{i=1}^{I_k} A_i \exp\left( -j \frac{4\pi \mu \hat{t} R_i}{c} \right) \exp\left( -j \frac{4\pi (f_0 + \mu \hat{t}) v_0 t_m}{c} \right) \times \exp\left( -j \frac{2\pi (f_0 + \mu \hat{t}) \left[ (n_T - 1) d_T \sin\theta_i + (n_R - 1) d_R \sin\theta_i \right]}{c} \right) \quad (3) $$
In this article, as shown in Figure 4, with the time division multiple access-MIMO (TDMA-MIMO) strategy [49], the signals transmitted from different transmitting elements can be separated, and the presented N T × N R MIMO array can be synthesized as a 1 × N a virtual MIMO array, where N a = N T × N R is the total number of virtual antennas. More specifically, in a single time block, the chirp signals from different transmitting antennas are transmitted to the area of interest at different time slots sequentially. At the receiver, the backscattered signal from the n T -th transmitting element is received at the n T -th time slot. For the signal transmitted by the n T -th antenna and received by the n R -th antenna, the additional propagation path induced by the spacing difference of transmitting and receiving array elements is ( n T 1 ) d T sin θ i + ( n R 1 ) d R sin θ i . Furthermore, by reasonably designing the spacing distances of transmitting arrays (e.g., d T = N R d R ), the additional propagation path can be rewritten as [ ( n T 1 ) N R + ( n R 1 ) ] d R sin θ i , which is equal to the space difference between the first and [ ( n T 1 ) N R + n R ] -th receiving antennas in a 1 × N a array. Hence, the N T × N R dimensional MIMO array can be restructured as a 1 × N a virtual array. On this condition, the azimuth angle resolution can be increased to 1 / ( N T × N R ) , which only needs N T + N R antennas, far less than N T × N R .
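The index mapping behind this virtual-array synthesis can be verified in a few lines. The sketch below uses illustrative array sizes (8 TX × 16 RX, matching the 128 virtual channels of the later simulation, but not necessarily the hardware).

```python
# Sketch of the TDMA-MIMO virtual-array synthesis described above:
# with d_T = N_R * d_R, the TX/RX pair (n_T, n_R) maps to virtual
# element index (n_T - 1) * N_R + (n_R - 1). Array sizes are
# illustrative assumptions.
N_T, N_R = 8, 16

# Path difference (n_T - 1) d_T sin(theta) + (n_R - 1) d_R sin(theta),
# expressed in units of d_R * sin(theta) after substituting d_T = N_R d_R:
positions = sorted((nT - 1) * N_R + (nR - 1)
                   for nT in range(1, N_T + 1)
                   for nR in range(1, N_R + 1))

# Every index of a uniform 1 x (N_T * N_R) array is hit exactly once,
# i.e. the pairs synthesize a filled 128-element virtual array.
assert positions == list(range(N_T * N_R))
```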
Then, the beat signal of the n a -th virtual antenna can be written as
$$ r_{beat}(\hat{t}, t_m, n_a) = \sum_{i=1}^{I_k} A_i \exp\left( -j \frac{4\pi \mu \hat{t} R_i}{c} \right) \times \exp\left( -j \frac{4\pi (f_0 + \mu \hat{t}) v_0 t_m}{c} \right) \exp\left( -j \frac{2\pi (f_0 + \mu \hat{t}) n_a d_R \sin\theta_i}{c} \right) \quad (4) $$
where $n_a = 0, 1, \ldots, N_a$ is the virtual antenna number.
Applying 3D Fourier transform (FT) to (4) with respect to t ^ , t m and n a yields
$$ r_{beat}(f_r, f_d, f_a) = \sum_{i=1}^{I_k} A_i' \,\mathrm{sinc}\left[ f_r - \frac{2\mu}{c}\left( R_i + v_0 t_m + \frac{1}{2} n_a d_R \sin\theta_i \right) \right] \mathrm{sinc}\left( f_d - \frac{2 v_0}{\lambda} \right) \mathrm{sinc}\left( f_a - \frac{d_R}{\lambda} \sin\theta_i \right) \quad (5) $$
where $f_r$, $f_d$ and $f_a$ are the frequency variables in terms of range, Doppler and azimuth angle, respectively, $A_i' = A_i N_a M T_c^2$ is a constant and $\mathrm{sinc}(x) = \frac{\sin x}{x}$ is the Sinc function.
From (5), we can observe that for different reflection sources spatially structured on the UAV, the differences of $R_i$ and $\theta_i$ will cause the extension of the target's signals, which may occupy several resolution cells in the range–azimuth spectrum, as shown in Figure 3. The rich information, including the extension of the target's signals, such as shape, size and orientation, can enable the cascade FMCW-MIMO radar to be utilized for UAV control and planning decisions like cameras or lidar. Additionally, due to the coupling of $\hat{t}$, $t_m$ and $n_a$, we can also see that the peak location of the Sinc function will vary with the target's motion when $v_0 t_m$ or $\frac{1}{2} n_a d_R \sin\theta_i$ is larger than the range resolution, which is well known as range cell migration (RCM). The RCM in both the range–slow-time domain and the range–channel plane spreads the signal power into different range cells, which may bring about serious integration loss after 3D-FT. Accordingly, inspired by the successful application of the keystone transform (KT) in the field of radar imaging, we present a 2D-KT-based coherent integration method to sufficiently accumulate the intra-channel and inter-channel signal power for improving the SINR of the raw radar data cube in the following part.
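To see why RCM correction matters, a back-of-the-envelope check (with an assumed UAV speed, frame length and bandwidth) shows the target walking across more than one range cell within a single frame:

```python
# Quick check of when RCM matters: the target migrates across range
# cells once its motion over one frame, v0 * M * Tp, exceeds the range
# resolution c / (2B). All numbers are illustrative assumptions.
c = 3e8                    # speed of light [m/s]
B = 2.4e9                  # sweep bandwidth [Hz]
range_res = c / (2 * B)    # range resolution, 0.0625 m here

M, Tp = 128, 50e-6         # pulses per frame, pulse duration
v0 = 15.0                  # assumed UAV radial speed [m/s]
migration = v0 * M * Tp    # range walk over one frame, 0.096 m here

# The walk exceeds one range cell, so an uncorrected 3D-FT smears the
# target power across cells and a KT-style correction is required.
assert migration > range_res
```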

2.1.2. Intra-Channel and Inter-Channel Coherent Integration in the Presence of Interferences

In urban air transportation systems, our cascade MIMO radar system will inevitably be subjected to noise interferences, mutual intrusion from other co-channel radars and strong clutter interruption from infrastructures and high buildings. Consequently, at the receiver, the received signals in (3) can be rewritten as
$$ z_{beat}(\hat{t}, t_m, n_a) = r_{beat}(\hat{t}, t_m, n_a) + r_{beat}^{int}(\hat{t}, t_m, n_a) + c_{beat}(\hat{t}, t_m, n_a) + w(\hat{t}, t_m, n_a) \quad (6) $$
where
$$ r_{beat}^{int}(\hat{t}, t_m, n_a) = r_{int}(\hat{t}, t_m) \times s^*(\hat{t}) \approx \gamma_{int} A_{int} \exp\left[ -j 2\pi \left( f_0^{int} - f_0 + \frac{\mu_{int} R_{int}}{c} \right) \hat{t} \right] \times \exp\left( -j \frac{2\pi (f_0^{int} + \mu_{int} \hat{t}) v_{int} t_m}{c} \right) \times \exp\left( -j \frac{2\pi (f_0^{int} + \mu_{int} \hat{t}) n_a d_R \sin\theta_{int}}{c} \right) \quad (7) $$
denotes the received signals from the interfering radar, $\gamma_{int} \in \{0, 1\}$ denotes the co-channel indicator, which is equal to 1 when the interference signal is in the bandwidth of interest of our radar, $\mu_{int}$ denotes the chirp slope of the interfering radar, $A_{int}$ denotes the amplitude of the interference signal, $R_{int}$, $\theta_{int}$ and $v_{int}$ are, respectively, the relative range, azimuth angle and velocity between the victim and interfering radars, $c_{beat}(\hat{t}, t_m, n_a)$ is the backscattered clutter from infrastructures and high buildings and $w(\hat{t}, t_m, n_a)$ is the noise.
In general, according to the Friis free-space propagation equation and the radar equation, the mutual interference signal is stronger than the desired backscattered signal. As a result, after performing 3D-FT on (6), the ghost target may occur if the carrier frequency of interfering radar is in the bandwidth of interest of our radar. Additionally, the existence of clutters will not only increase the noise floor but also perhaps generate pseudo-targets in radar range–Doppler–azimuth spectrum.
The mutual interference and clutters will sharply decrease the SINR of the raw 3D radar data cube. Accordingly, in this part, we first utilize the 2D-KT to correct the RCM induced by the target motion and the spacing differences of the virtual array, so as to sufficiently integrate the intra-channel and inter-channel signal power of the target in a single radar frame, as shown in Figure 5. From (3), we can see that, for a single channel, the appearance of RCM is essentially caused by the coupling of $\hat{t}$ and $t_m$ in the second exponential term. Additionally, the range drift across different channels is induced by the coupling of $\hat{t}$ and $n_a$ in the third exponential term. Fortunately, these two couplings are of the same type, and they can be removed through a simple scaling operation realized by interpolation. Consequently, to remove the couplings of $\hat{t}$, $t_m$ and $n_a$, we respectively apply the scalings $(f_0 + \mu \hat{t}) t_m = f_0 t_m'$ and $(f_0 + \mu \hat{t}) n_a = f_0 n_a'$ to (5) with respect to the slow time $t_m$ and the channel number $n_a$ by means of a Sinc interpolation function, and we can obtain
$$ z_{beat}'(\hat{t}, t_m', n_a') = \sum_{m=0}^{M-1} \sum_{n_a=0}^{N_a-1} z_{beat}(\hat{t}, t_m, n_a) \times \mathrm{sinc}\left( \frac{f_0}{f_0 + \mu \hat{t}} m' - m \right) \mathrm{sinc}\left( \frac{f_0}{f_0 + \mu \hat{t}} n_a' - n_a \right) = r_{beat}'(\hat{t}, t_m', n_a') + r_{beat}'^{int}(\hat{t}, t_m', n_a') + c_{beat}'(\hat{t}, t_m', n_a') + w'(\hat{t}, t_m', n_a') \quad (8) $$
where
$$ r_{beat}'(\hat{t}, t_m', n_a') = \sum_{i=1}^{I_k} A_i \exp\left( -j \frac{4\pi \mu \hat{t} R_i}{c} \right) \times \exp\left( -j \frac{4\pi f_0 v_0 t_m'}{c} \right) \exp\left( -j \frac{2\pi f_0 n_a' d_R \sin\theta_i}{c} \right) \quad (9) $$
and $t_m' = m' T_p$, $m' = 0, 1, \ldots, M$, and $n_a' = 1, 2, \ldots, N_a$ are, respectively, the scaled slow time variable and the scaled channel number.
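The scaling step can be illustrated with a minimal 1-D sinc-interpolation sketch along slow time; $f_0$, $\mu$ and $M$ below are illustrative values, and `np.sinc` is the normalized sinc kernel conventionally used for band-limited interpolation.

```python
import numpy as np

# Minimal 1-D sketch of the sinc-interpolation scaling used by the
# keystone transform above. f0, mu and M are illustrative assumptions.
f0, mu = 77e9, 4.8e13
M = 64
m = np.arange(M)

def keystone_slow_time(x, t_hat):
    """Rescale the slow-time axis of x[m] by f0 / (f0 + mu * t_hat)."""
    scale = f0 / (f0 + mu * t_hat)
    # Row m' of W gathers sinc(scale * m' - m) over all original m.
    # np.sinc is the normalized sinc, sin(pi x) / (pi x).
    W = np.sinc(scale * m[:, None] - m[None, :])
    return W @ x

# Sanity check: at t_hat = 0 the scale factor is exactly 1, so the
# interpolation reduces to the identity mapping.
x = np.exp(1j * 2 * np.pi * 0.1 * m)
assert np.allclose(keystone_slow_time(x, 0.0), x, atol=1e-6)
```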
From (8) and (9), we can observe that the couplings of $\hat{t}$, $t_m'$ and $n_a'$ are eliminated in the second and third exponential terms of the target's beat signals, but they still exist in the mutual interference signals. On this account, after performing 3D-FT on (8) along $\hat{t}$, $t_m'$ and $n_a'$, we have
$$ z'(f_r, f_d, f_a) = r_{beat}'(f_r, f_d, f_a) + r_{beat}'^{int}(f_r, f_d, f_a) + c_{beat}'(f_r, f_d, f_a) + w'(f_r, f_d, f_a) \quad (10) $$
where
$$ r_{beat}'(f_r, f_d, f_a) = \sum_{i=1}^{I_k} A_i' \,\mathrm{sinc}\left( f_r - \frac{2\mu R_i}{c} \right) \times \mathrm{sinc}\left( f_d - \frac{2 v_0}{\lambda} \right) \mathrm{sinc}\left( f_a - \frac{d_R}{\lambda} \sin\theta_i \right) \quad (11) $$
is the range–Doppler–azimuth spectrum of the target, and $r_{beat}'^{int}(f_r, f_d, f_a)$, $c_{beat}'(f_r, f_d, f_a)$ and $w'(f_r, f_d, f_a)$ are the spectra of the interference signals, clutters and noises, respectively.
In (11), the signal power of the UAV's $i$-th reflection source is coherently integrated from multiple diffuse resolution cells into a single resolution unit in the range–Doppler–azimuth spectrum, and the theoretical processing gain is $10 \lg(M N_a)$ dB [44]. However, due to the residual couplings of $\hat{t}$, $t_m'$ and $n_a'$, the signal power of the interfering radar is still distributed into different range cells. This accumulation gain can sufficiently improve the SINR of the raw 3D radar data cube and thus improve the trajectory tracking performance for the low-observable UAV.
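The quoted gain follows from coherent summation: adding $M N_a$ in-phase samples raises the signal power by $(M N_a)^2$ while the noise power grows only by $M N_a$. With assumed values of $M$ and $N_a$:

```python
import math

# Coherent-integration gain 10 lg(M * N_a) for illustrative M and N_a.
M, N_a = 128, 128              # pulses per frame, virtual channels
gain_db = 10 * math.log10(M * N_a)

assert 42.1 < gain_db < 42.2   # about 42.1 dB for this configuration
```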

2.1.3. Simulation Results

Figure 6 shows the coherent integration results of the 2D-KT, verified using MATLAB R2023a. In this figure, the mutual interference of another radar was added to the returned signals of the target, and another static object was also included to simulate strong clutter. The parameters of the radar system were set as follows: the carrier frequency $f_0 = 77$ GHz, the bandwidth $B = 2.4$ GHz, the number of virtual antennas $N_a = 128$, the pulse duration $T_p = 50$ μs and the chirp rate $\mu = 85.021$ MHz/μs. In addition, the chirp parameters of the jamming radar are $f_0^{int} = 77$ GHz, $T_p^{int} = 45$ μs and $\mu_{int} = 85.106$ MHz/μs. The range–slow-time planes of the first RX-TX channel before and after KT are given in Figure 6a,b. In Figure 6a, we can clearly see the RCM: the target power of different pulses is distributed over different range cells. After KT, the RCM is corrected and the signal power of different pulses from the same reflection source is located in the same range cell. Then, after performing a Fourier transform along the pulse number, a peak that integrates the power of all pulses from the same reflection source of the target occurs in the range–Doppler spectrum. Figure 6c,d presents the range–channel planes of the first chirp before and after KT. Due to the large number of virtual channels, it can be seen from Figure 6c that the spacing differences of the virtual antennas also bring about RCM, which degrades the target detection and tracking performance. Fortunately, this type of RCM can also be removed by a second KT process, as shown in Figure 6d. In addition, due to the difference in chirp rates between the interfering and victim radars, from Figure 6a–c, we can observe that the signal power of the radar mutual interference is distributed over different range cells after KT, which decreases the final integrated power of the interference and is thus beneficial to target detection and tracking.
The final range–azimuth spectra before and after 2D-KT are depicted in Figure 6e,f. We can observe that, after 2D-KT, the SINR is coherently improved from 12.84 dB to 31.3 dB, which can effectively improve the low-observable UAV detection and tracking performance.

2.2. End-to-End Spatiotemporal Correlation Integration-Based Tracking Framework with Recursive Bayesian Estimation

2.2.1. State and Existence Indicator Transition Models

In this section, we regard the extended UAV as an entity to track, avoiding the complex measurement-to-source association procedure. In addition, the shape of the extended UAV can be approximately described by a rectangle, which is often modeled as an ellipse in the radar spectrum. On this account, we first model the state of the extended UAV as $X_k = [x_k^c, s_k]^T$, where $x_k^c = [x_k^c, y_k^c, v_k^x, v_k^y]$ is the kinematic state vector of the scattering center at the $k$-th time step, $s_k = [\alpha_k, l_k^a, l_k^b]$ is the corresponding elliptic extension state, and $\alpha_k$, $l_k^a$ and $l_k^b$ are, respectively, the elliptic heading angle, major axis and minor axis. Generally, due to the short time interval between two adjacent radar frames, the speed of the UAV can be treated as nearly constant. Hence, the state transition model can be formulated as
$$ X_{k+1} = F X_k + G w_k \quad (12) $$
where $F = \begin{bmatrix} F_x & 0_{4\times3} \\ 0_{3\times4} & F_s \end{bmatrix}$ denotes the state transition matrix, $F_x = \begin{bmatrix} 1 & T & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & T \\ 0 & 0 & 0 & 1 \end{bmatrix}$ denotes the kinematic state transition matrix, $F_s = I_3$ denotes the extension state transition matrix, $T = M T_p$ is the time interval between two adjacent radar frames, $G = \begin{bmatrix} G_x & 0_{4\times3} \\ 0_{3\times4} & G_s \end{bmatrix}$ denotes the noise transition matrix, $G_x = \mathrm{diag}(T^2/2,\ T,\ T^2/2,\ T)$ and $G_s = \mathrm{diag}(\varepsilon_\alpha, \varepsilon_{l^a}, \varepsilon_{l^b})$, where $\varepsilon_\alpha$, $\varepsilon_{l^a}$ and $\varepsilon_{l^b}$ are the corresponding noise intensities, and $w_k = [w_x, w_y, w_{v^x}, w_{v^y}, w_\alpha, w_{l^a}, w_{l^b}]^T$ denotes the Gaussian white noise vector.
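The constant-velocity-plus-random-walk transition can be sketched as follows; the state here is ordered (x, vx, y, vy, α, la, lb) so the kinematic block is block-diagonal, and $T$ and the test state are illustrative assumptions.

```python
import numpy as np

# Noise-free prediction step of the transition model above, with the
# state ordered (x, vx, y, vy, alpha, la, lb). Values are illustrative.
T = 128 * 50e-6            # frame interval T = M * Tp (assumed values)

F_x = np.array([[1, T, 0, 0],
                [0, 1, 0, 0],
                [0, 0, 1, T],
                [0, 0, 0, 1.0]])   # constant-velocity kinematics
F_s = np.eye(3)                    # extension state is a random walk
F = np.block([[F_x, np.zeros((4, 3))],
              [np.zeros((3, 4)), F_s]])

X = np.array([10.0, 1.0, 5.0, -2.0, 0.1, 0.6, 0.3])
X_next = F @ X                     # one prediction step

# Positions advance by velocity * T; the ellipse parameters persist.
assert np.isclose(X_next[0], 10.0 + 1.0 * T)
assert np.isclose(X_next[2], 5.0 - 2.0 * T)
assert np.allclose(X_next[4:], X[4:])
```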
Then, to improve the anti-interference ability, we introduce an existence indicator $E_k \in \{0, 1\}$ to describe whether there exists a target in the radar field of view (FOV) at the $k$-th timestamp, instead of the threshold-decision detection process. If there exists a target, $E_k = 1$; otherwise, $E_k = 0$. By introducing the target's birth and death probabilities, the evolution of the existence indicator can be modeled as a Markov process, which can be described as
$$ \begin{bmatrix} P(E_{k+1}=1) \\ P(E_{k+1}=0) \end{bmatrix} = \begin{bmatrix} 1 - P_d & P_b \\ P_d & 1 - P_b \end{bmatrix} \begin{bmatrix} P(E_k=1) \\ P(E_k=0) \end{bmatrix} \quad (13) $$
where $P_b = P(E_{k+1}=1 \mid E_k=0)$ represents the birth probability and $P_d = P(E_{k+1}=0 \mid E_k=1)$ represents the death probability.
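A quick numeric check of this two-state existence chain, with assumed birth and death probabilities:

```python
import numpy as np

# Two-state Markov chain for the existence indicator E_k, with
# illustrative birth/death probabilities. State vector: [P(E=1), P(E=0)].
P_b, P_d = 0.05, 0.05
Pi = np.array([[1 - P_d, P_b],
               [P_d, 1 - P_b]])

p = np.array([0.0, 1.0])       # start with no target present
for _ in range(1000):          # iterate the chain toward steady state
    p = Pi @ p

# Columns sum to 1 (probability is conserved), and with P_b = P_d the
# stationary distribution puts equal mass on both states.
assert np.allclose(Pi.sum(axis=0), 1.0)
assert np.allclose(p, [0.5, 0.5], atol=1e-6)
```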

2.2.2. Measurement Model for Cascade MIMO Radar

According to (10), we can obtain a sequence of range–Doppler–azimuth spectra, i.e., $Z_{1:k+1} = \{z_1, z_2, \ldots, z_{k+1}\}$, where $z_{k+1} = |z'_{k+1}(f_r, f_d, f_a)|^2$ denotes the range–Doppler–azimuth power spectrum, which contains $N_r \times N_v \times N_\theta$ resolution cells at the $(k+1)$-th radar frame. In this article, we regard the measurements originating from the extended UAV as two separate processes: a spatially scattering process and a measurement error process. The former employs an elliptic shape and a spatial probability distribution model to describe how the extended measurement sources are distributed over the UAV. The random scattering sources, following a given spatial probability distribution, are distributed within the ellipse and must satisfy
$$ \begin{bmatrix} x_{k+1}^i - x_{k+1}^c & y_{k+1}^i - y_{k+1}^c \end{bmatrix} D_{k+1}^{-1} \begin{bmatrix} x_{k+1}^i - x_{k+1}^c & y_{k+1}^i - y_{k+1}^c \end{bmatrix}^T \le 1 \quad (14) $$
where $(x_{k+1}^c, y_{k+1}^c)$ and $(x_{k+1}^i, y_{k+1}^i)$ are, respectively, the position coordinates of the center and the $i$-th scattering source, $D_{k+1} = R(\alpha_{k+1}) \begin{bmatrix} (l_{k+1}^a)^2 & 0 \\ 0 & (l_{k+1}^b)^2 \end{bmatrix} R(\alpha_{k+1})^T$ represents the outline feature information of the ellipse and $R(\alpha_{k+1}) = \begin{bmatrix} \cos\alpha_{k+1} & -\sin\alpha_{k+1} \\ \sin\alpha_{k+1} & \cos\alpha_{k+1} \end{bmatrix}$ represents the rotation matrix.
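The ellipse constraint can be wrapped in a small helper; the quadratic form uses the inverse of the extent matrix $D$, and the helper name and all test values are illustrative.

```python
import numpy as np

# Membership test for a scattering source against the elliptic extent
# D = R(a) diag(la^2, lb^2) R(a)^T: a point lies inside the ellipse
# when the quadratic form with D^{-1} is at most 1.
def inside_ellipse(p, center, alpha, la, lb):
    R = np.array([[np.cos(alpha), -np.sin(alpha)],
                  [np.sin(alpha),  np.cos(alpha)]])
    D = R @ np.diag([la**2, lb**2]) @ R.T
    d = np.asarray(p, float) - np.asarray(center, float)
    return float(d @ np.linalg.solve(D, d)) <= 1.0

center = (10.0, 5.0)
# A point at distance la along the (unrotated) major axis sits exactly
# on the boundary; a slightly farther point falls outside.
assert inside_ellipse((10.6, 5.0), center, 0.0, 0.6, 0.3)
assert not inside_ellipse((10.7, 5.0), center, 0.0, 0.6, 0.3)
```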
The measurement error process describes the sensor noise process, which is generally assumed to be a Gaussian white noise vector. In addition, instead of the traditional radar detections, for avoiding the information loss induced by the threshold-decision process, the point diffusion function-based pseudo range–Doppler–azimuth power spectrum is employed to derive the measurement model. On this account, the measurement model for cascade MIMO radar can be derived as
$$ z_{k+1} = \begin{cases} h(X_{k+1}) + v_{k+1}, & E_{k+1} = 1 \\ v_{k+1}, & E_{k+1} = 0 \end{cases} \quad (15) $$
where
$$ h(X_{k+1}) = \sum_{i=1}^{I_{k+1}} A_{k+1}^i h(x_{k+1}^i) = \sum_{i=1}^{I_{k+1}} A_{k+1}^i \exp\left[ -\left( \frac{(r - r_{k+1}^i)^2}{L_r^2 N_r} + \frac{(v - v_{k+1}^i)^2}{L_v^2 N_v} + \frac{(\theta - \theta_{k+1}^i)^2}{L_\theta^2 N_\theta} \right) \right] \quad (16) $$
is the pseudo-range–Doppler–azimuth power spectrum, $h(x_{k+1}^i)$ represents the point diffusion function of the $i$-th scattering source, $x_{k+1}^i = [x_{k+1}^i, y_{k+1}^i, v_{k+1}^x, v_{k+1}^y]$ represents the corresponding kinematic state vector, $v_{k+1}$ is the noise matrix, $I_{k+1}$ is the total number of scattering sources of the target at the $(k+1)$-th radar frame, $A_{k+1}^i$ is the amplitude of the $i$-th scattering point, $L_r$, $L_v$ and $L_\theta$ represent loss constants, $r$, $v$ and $\theta$ are, respectively, the range, velocity and azimuth variables, and $r_{k+1}^i$, $v_{k+1}^i$ and $\theta_{k+1}^i$ are the range, velocity and azimuth of the $i$-th scattering source, which can be calculated by
$$r_{k+1}^{i} = \sqrt{(x_{k+1}^{i})^{2} + (y_{k+1}^{i})^{2}}$$
$$v_{k+1}^{i} = \frac{x_{k+1}^{i} v_{k+1}^{x} + y_{k+1}^{i} v_{k+1}^{y}}{\sqrt{(x_{k+1}^{i})^{2} + (y_{k+1}^{i})^{2}}}$$
$$\theta_{k+1}^{i} = \arctan\left( \frac{y_{k+1}^{i}}{x_{k+1}^{i}} \right)$$
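To make the mapping from Cartesian states to the spectrum concrete, the following sketch evaluates a pseudo range–Doppler–azimuth power spectrum as a sum of Gaussian point-diffusion functions (an illustrative Python sketch; the grid sizes and the loss constants `Lr`, `Lv`, `Lth` are placeholder values rather than parameters of the real radar):

```python
import numpy as np

def source_rva(x, y, vx, vy):
    """Range, radial velocity and azimuth of one scattering source:
    r = sqrt(x^2 + y^2), v = (x vx + y vy)/r, theta = arctan(y/x)."""
    r = np.hypot(x, y)
    return r, (x * vx + y * vy) / r, np.arctan2(y, x)

def pseudo_spectrum(sources, amps, r_ax, v_ax, th_ax,
                    Lr=1.0, Lv=1.0, Lth=1.0):
    """Superimpose Gaussian point-diffusion functions over the
    range/velocity/azimuth grid, a sketch of h(X_{k+1})."""
    Nr, Nv, Nth = len(r_ax), len(v_ax), len(th_ax)
    R, V, TH = np.meshgrid(r_ax, v_ax, th_ax, indexing="ij")
    h = np.zeros((Nr, Nv, Nth))
    for (x, y, vx, vy), A in zip(sources, amps):
        r, v, th = source_rva(x, y, vx, vy)
        h += A * np.exp(-((R - r) ** 2 / (Lr**2 * Nr)
                          + (V - v) ** 2 / (Lv**2 * Nv)
                          + (TH - th) ** 2 / (Lth**2 * Nth)))
    return h
```

The resulting spectrum peaks at the cell nearest to each source's $(r, v, \theta)$, which is what the tracker later matches against the observed spectrum.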

2.2.3. End-to-End Spatiotemporal Correlation Integration-Based State Estimation with Recursive Bayesian Estimation

Since the target information is fully contained in the raw radar spectrum, in this article we concentrate on estimating the posterior probability $P(X_{k+1}, E_{k+1}=1 \mid Z_{1:k+1})$ to achieve UAV tracking by directly utilizing the observed range–Doppler–azimuth spectra, i.e., $Z_{1:k+1} = \{z_1, z_2, \ldots, z_{k+1}\}$, rather than relying on threshold-decision detections. On this basis, by exploiting the spatiotemporal correlation of the target trajectory within a sequence of radar frames, the extended low-observable UAV can be detected and tracked simultaneously once the posterior probability $P(X_{k+1}, E_{k+1}=1 \mid Z_{1:k+1})$ exceeds a given value.
Firstly, in the prediction phase, depending on the state and existence indicator transition models, the prior probability can be described as
$$\begin{aligned} P(X_{k+1}, E_{k+1}=1 \mid Z_{1:k}) &= \int P(X_{k+1}, E_{k+1}=1, X_{k}, E_{k}=1 \mid Z_{1:k}) \, dX_{k} + \int P(X_{k+1}, E_{k+1}=1, X_{k}, E_{k}=0 \mid Z_{1:k}) \, dX_{k} \\ &= (1 - P_{d}) \int P(X_{k+1} \mid X_{k}, E_{k+1}=1, E_{k}=1) \, P(X_{k}, E_{k}=1 \mid Z_{1:k}) \, dX_{k} + P_{b} P(X_{k+1}) \end{aligned}$$
where $P(X_{k+1} \mid X_{k}, E_{k+1}=1, E_{k}=1)$ is the state transition probability, which can be calculated by (12), $P(X_{k}, E_{k}=1 \mid Z_{1:k})$ is the posterior probability at the k-th timestamp, $P_{b}$ and $P_{d}$ are, respectively, the birth and death probabilities of the target, and $P(X_{k+1})$ is the initial (birth) state probability.
Subsequently, after receiving the latest range–Doppler–azimuth spectrum z k + 1 , the posterior probability at the k + 1 -th timestamp can be derived as
$$P(X_{k+1}, E_{k+1}=1 \mid Z_{1:k+1}) \propto P(X_{k+1}, E_{k+1}=1 \mid Z_{1:k}) \, P(z_{k+1} \mid X_{k+1}, E_{k+1}=1)$$
where P ( z k + 1 | X k + 1 , E k + 1 = 1 ) is the likelihood probability.
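The prediction–update recursion can be illustrated with a deliberately simplified scalar version in which the kinematic state is marginalized out and only the existence probability is propagated (a toy Python sketch under assumed birth/death probabilities and a constant likelihood ratio; the actual method propagates the full joint density with Monte Carlo samples):

```python
def predict_existence(p, Pb, Pd):
    """Prior existence probability: surviving targets weighted by
    (1 - Pd) plus newborn targets weighted by Pb, a scalar
    simplification of the prediction equation above."""
    return (1 - Pd) * p + Pb * (1 - p)

def update_existence(p, L):
    """Bayes update of the existence probability with the measurement
    likelihood ratio L = p(z | target) / p(z | no target)."""
    return L * p / (L * p + (1 - p))

p = 0.01                      # weak prior: target probably absent
for _ in range(6):            # six consecutive frames, each with L = 5
    p = update_existence(predict_existence(p, Pb=0.05, Pd=0.05), L=5.0)
# evidence accumulates across frames: p now exceeds 0.9
```

Even though a single frame is far from conclusive, repeated moderately informative frames drive the existence probability toward 1, which is the essence of integrating detection into tracking.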
Since the measurements originating from the extended UAV can be separated into a spatial scattering process and a measurement error process, the probability density function (PDF) $p(z_{k+1} \mid X_{k+1}, E_{k+1}=1)$ can be expressed as
$$p(z_{k+1} \mid X_{k+1}, E_{k+1}=1) = \int p(z_{k+1}, x_{k+1}^{i} \mid X_{k+1}, E_{k+1}=1) \, dx_{k+1}^{i} = \int p(z_{k+1} \mid x_{k+1}^{i}, E_{k+1}=1) \, p(x_{k+1}^{i} \mid X_{k+1}, E_{k+1}=1) \, dx_{k+1}^{i}$$
where $p(x_{k+1}^{i} \mid X_{k+1}, E_{k+1}=1)$ is the spatial probability distribution model, which is assumed to be a 2D uniform distribution over the ellipse in this article, and $p(z_{k+1} \mid x_{k+1}^{i}, E_{k+1}=1)$ describes the measurement error process.
Assuming that the resolution cells of the radar spectrum are statistically independent, $p(z_{k+1} \mid x_{k+1}^{i}, E_{k+1}=1)$ can be expressed as the product of the marginal PDFs of the individual resolution cells, i.e.,
$$p(z_{k+1} \mid x_{k+1}^{i}, E_{k+1}=1) = \prod_{n_{r}=1}^{N_{r}} \prod_{n_{v}=1}^{N_{v}} \prod_{n_{\theta}=1}^{N_{\theta}} p(z_{k+1}^{n_{r}, n_{v}, n_{\theta}} \mid x_{k+1}^{i}, E_{k+1}=1)$$
where $p(z_{k+1}^{n_{r}, n_{v}, n_{\theta}} \mid x_{k+1}^{i}, E_{k+1}=1) \sim \mathcal{N}\big(z_{k+1}^{n_{r}, n_{v}, n_{\theta}};\, A_{k+1}^{i} h^{n_{r}, n_{v}, n_{\theta}}(x_{k+1}^{i}),\, \sigma^{2}\big)$ is the marginal PDF at the resolution cell $(n_{r}, n_{v}, n_{\theta})$, and $z_{k+1}^{n_{r}, n_{v}, n_{\theta}}$ and $\sigma^{2}$ are, respectively, the corresponding signal power and noise variance.
Then, substituting (23) into (22), we have
$$p(z_{k+1} \mid X_{k+1}, E_{k+1}=1) = \prod_{i=1}^{I_{k+1}} \prod_{n_{r}=1}^{N_{r}} \prod_{n_{v}=1}^{N_{v}} \prod_{n_{\theta}=1}^{N_{\theta}} p(z_{k+1}^{n_{r}, n_{v}, n_{\theta}} \mid x_{k+1}^{i}, E_{k+1}=1) = \prod_{i=1}^{I_{k+1}} \prod_{n_{r}=1}^{N_{r}} \prod_{n_{v}=1}^{N_{v}} \prod_{n_{\theta}=1}^{N_{\theta}} \frac{1}{\sqrt{2\pi}\,\sigma} \exp\left\{ -\frac{\big[A_{k+1}^{i} h^{n_{r}, n_{v}, n_{\theta}}(x_{k+1}^{i}) - z_{k+1}^{n_{r}, n_{v}, n_{\theta}}\big]^{2}}{2\sigma^{2}} \right\}$$
In (21), with a sequence of radar frames, the UAV can be detected and tracked simultaneously once the posterior probability $P(X_{k+1}, E_{k+1}=1 \mid Z_{1:k+1})$ exceeds a given value, so that the target detection process is integrated into the tracking procedure. To address the complicated integral problem in (21), in this article we employ sequential Monte Carlo sampling to approximate the posterior PDF. More specifically, we first uniformly generate a set of independent "samples" with random state vectors $X_{0}^{q}$, where each sample carries its own importance weight $w_{0}^{q}$. Then, after obtaining the radar observations $Z_{1:k+1}$ at the (k+1)-th timestamp, the state vector $X_{0}^{q}$ and the corresponding importance weight $w_{0}^{q}$ are recursively updated as $X_{k+1}^{q}$ and $w_{k+1}^{q}$, respectively, which can be divided into prediction and update processes.
In the prediction procedure, with $P_{b}$ and $P_{d}$, the samples are randomly classified into three categories: newborn, dead and continuing samples. For the dead samples, the existence indicator is set to 0. For the continuing samples, the predicted state $X_{k+1}^{q}$ is calculated by (12). For the newborn samples, the predicted state $X_{k+1}^{q}$ is randomly generated.
In the update process, to further enlarge the weight differences between the presence and absence of the UAV, we employ the likelihood ratio $L_{k+1}^{q}(z_{k+1} \mid X_{k+1}^{q})$ rather than the likelihood probability to update the importance weight. With (15) and (24), the likelihood ratio can be derived as
$$L_{k+1}^{q}(z_{k+1} \mid X_{k+1}^{q}) = \begin{cases} \displaystyle\prod_{i=1}^{I_{k+1}} \prod_{n_{r}=1}^{N_{r}} \prod_{n_{v}=1}^{N_{v}} \prod_{n_{\theta}=1}^{N_{\theta}} \exp\left\{ -\frac{H(x_{k+1}^{i}) \big[H(x_{k+1}^{i}) - 2 z_{k+1}^{n_{r}, n_{v}, n_{\theta}}\big]}{2\sigma^{2}} \right\}, & E_{k+1}^{q} = 1 \\ 1, & E_{k+1}^{q} = 0 \end{cases}$$
where H ( x k + 1 i ) = A k + 1 i h n r , n v , n θ ( x k + 1 i ) , E k + 1 q denotes the existence indicator of the q-th sample.
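In practice, the large product over resolution cells is conveniently evaluated in the log domain. The helper below is our own sketch of that computation for a single scattering source, assuming unit noise variance in the example (not the authors' implementation):

```python
import numpy as np

def log_likelihood_ratio(z, H, sigma2=1.0):
    """Log of the per-sample likelihood ratio: for each resolution
    cell, log p(z|target)/p(z|noise) = H (2z - H) / (2 sigma^2);
    summing over cells replaces the large product and avoids
    numerical overflow."""
    z = np.asarray(z, dtype=float)
    H = np.asarray(H, dtype=float)
    return float(np.sum(H * (2.0 * z - H)) / (2.0 * sigma2))

# a sample whose predicted spectrum overlaps the observed energy is
# rewarded; a mismatched sample scores below the noise-only hypothesis
z = np.zeros((4, 4)); z[1, 1] = 2.0          # observed power spectrum
H_match = np.zeros((4, 4)); H_match[1, 1] = 2.0
H_wrong = np.zeros((4, 4)); H_wrong[3, 3] = 2.0
```

Exponentiating `log_likelihood_ratio(...)` recovers the product form of the likelihood ratio for one sample.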
With (25), the importance weight $w_{k+1}^{q}$ can be updated as $w_{k+1}^{q} \propto L_{k+1}^{q}(z_{k+1} \mid X_{k+1}^{q}) \, w_{k}^{q}$ after receiving the range–Doppler–azimuth power spectrum $z_{k+1}$. In (25), the closer the state $X_{k+1}^{q}$ of the q-th sample is to the true value, the greater the likelihood ratio $L_{k+1}^{q}(z_{k+1} \mid X_{k+1}^{q})$, and hence the larger the importance weight $w_{k+1}^{q}$. After normalization and resampling, the updated states $X_{k+1}^{q+}$ of most samples will be replaced by the predicted states of the samples with large importance weights. Consequently, the optimal state of the extended UAV can be estimated as
$$\hat{X}_{k+1} = \frac{\sum_{q=1}^{N_{q}} E_{k+1}^{q} X_{k+1}^{q+}}{\sum_{q=1}^{N_{q}} E_{k+1}^{q}}$$
where $N_{q}$ is the total number of samples. Finally, after a sequence of consecutive radar frames, the optimal states obtained in (26) can be confirmed as the target's tracking trajectory when the cumulative existence probability $P_{c} = \sum_{q=1}^{N_{q}} E_{k+1}^{q} / N_{q}$ is larger than a preset threshold.
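The resampling and state-extraction steps can be sketched as follows (a toy Python example with a hypothetical sample population; multinomial resampling is used here as a stand-in for the resampling scheme):

```python
import numpy as np

def resample(states, weights, exist, rng):
    """Multinomial resampling: samples with large importance weights
    are duplicated, low-weight samples die out."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    idx = rng.choice(len(w), size=len(w), p=w)
    return states[idx], exist[idx]

def estimate(states, exist):
    """State estimate (26): average of the resampled states whose
    existence indicator is 1; also returns the cumulative existence
    probability P_c = sum(E)/N_q used for track confirmation."""
    E = np.asarray(exist, dtype=float)
    Pc = E.mean()
    if E.sum() == 0:
        return None, Pc
    return (E[:, None] * states).sum(axis=0) / E.sum(), Pc

rng = np.random.default_rng(1)
# toy population: 100 samples in 2D position, half clustered near a
# hypothetical true target at (2, 5) with large weights, half
# scattered with tiny weights and existence indicator 0
states = np.vstack([rng.normal([2.0, 5.0], 0.1, (50, 2)),
                    rng.uniform(-10, 10, (50, 2))])
weights = np.r_[np.full(50, 10.0), np.full(50, 0.01)]
exist = np.r_[np.ones(50, int), np.zeros(50, int)]
states, exist = resample(states, weights, exist, rng)
x_hat, Pc = estimate(states, exist)
```

After resampling, nearly all surviving samples come from the high-weight cluster, so the estimate lands close to the cluster center and the cumulative existence probability approaches 1.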

2.2.4. Simulation Results

Based on the coherent integration results of 2D-KT in Figure 6, the tracking results of the MATLAB R2023a-implemented ESCI-based framework are presented in Figure 7. In this figure, the flying trajectory of the UAV was set as a closed circular loop, as given in Figure 7a. It can be seen that most of the target measurements can be obtained after a threshold-decision process. However, the ghost measurements induced by co-channel radar interferences and static clutters can also be clearly seen in Figure 7a, which cause wrong target-to-measurement associations. On this account, the tracking results of RMM, SOEKF and DBSCAN deteriorate seriously. As shown in Figure 7b, although particle filter track-before-detect (PFTBD) also utilizes the non-thresholded spectra, the mismatch of its point-source model decreases the tracking precision for the extended UAV. By maximizing the posterior probability $P(X_{k+1}, E_{k+1}=1 \mid Z_{1:k+1})$ over a sequence of consecutive radar range–Doppler–azimuth spectra, the proposed method can track an object that satisfies the Newtonian dynamic equations and the elliptic energy distribution in the presence of radar mutual interferences and strong clutters. As a result, the proposed method can accurately track the low-observable extended UAV. Figure 7c,d present the intersection over union (IoU) between the tracked shape and the true extension and the error cumulative distribution functions (CDFs) of the estimated trajectory. We can observe that all estimated IoUs are larger than 0.5 and the errors of the estimated trajectory are within 0.5 m, which means that the proposed method can simultaneously and precisely track the shape parameters and kinematic variables of the UAV in the presence of radar mutual interferences and strong clutter.

3. Evaluation Metrics and Experimental Results

3.1. Evaluation Metrics

In this article, we employ the error CDF and the IoU to jointly evaluate the tracking performance for the extended UAV. The former shows the cumulative distribution of the errors of the tracked trajectory, while the latter quantifies the coincidence of the tracked shape with the true extension.

3.2. Comparisons

Comparison methods: four target tracking methods are adopted as baselines for evaluating the detection and tracking performance for micro unmanned aerial vehicles with the FMCW-MIMO radar.
DBSCAN: This method leverages DBSCAN-based density clustering to dynamically configure neighborhood parameters, isolate valid target clusters and compute cluster centroids. Following this, it initializes a Kalman filter to predict target kinematics via state equations and refines the target state using observational updates, enabling robust trajectory tracking.
RMM: By modeling extended targets as a hybrid motion vector and spatial extension matrix, this approach employs random matrix theory to extract dominant eigenvalues for precise signal–noise separation. The integration of Kalman filtering ensures stable tracking of spatially distributed targets.
SOEKF: Utilizing a second-order extended Kalman filter (SOEKF), this framework simultaneously estimates target dynamics and shape parameters. It embeds measurement covariance into likelihood function optimization, dynamically recalibrates model weights through maximum likelihood principles and achieves extended target tracking.
PFTBD: This method improves signal-to-noise ratio (SNR) through multi-frame radar data accumulation. It models target motion via particle filtering, iteratively updates particle weights to approximate posterior distributions and uses the probability ratio test to verify the continuous trajectories, ensuring reliable tracking of extended targets.

3.2.1. Error CDF of Tracked Trajectory

This metric shows the cumulative distribution of the errors of the tracked trajectory, which can be estimated as $F(\chi) = \mathrm{Prob}(d_{error}^{k} \leq \chi)$, where $F(\chi)$ is the probability that $d_{error}^{k} \leq \chi$ and $d_{error}^{k} = \lVert x_{k}^{c} - \hat{x}_{k}^{c} \rVert$ is the tracking error at the k-th frame.
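As a concrete example, the empirical error CDF can be computed from a list of per-frame tracking errors (a generic sketch, not tied to the measured data):

```python
import numpy as np

def error_cdf(errors, chi):
    """Empirical CDF F(chi) = Prob(d_error <= chi) of the
    per-frame tracking errors, evaluated at each threshold in chi."""
    errors = np.asarray(errors, dtype=float)
    return np.mean(errors[:, None] <= np.asarray(chi, dtype=float), axis=0)

errs = np.array([0.1, 0.2, 0.2, 0.4, 0.7])
F = error_cdf(errs, [0.0, 0.25, 0.5, 1.0])
# F -> [0.0, 0.6, 0.8, 1.0] for this error sample
```

Reading off such a curve, a statement like "80% of tracking errors are within 0.5 m" corresponds to $F(0.5) = 0.8$.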

3.2.2. Intersection over Union

This coefficient evaluates the similarity between the tracked shape and the true extension, and is defined as $\mathrm{IoU} = S_{\cap} / S_{\cup}$, where $S_{\cap}$ and $S_{\cup}$ are, respectively, the areas of the intersection and union of the tracked shape and the true extension.
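Since both the tracked shape and the true extension are ellipses here, the IoU can be approximated on a dense grid (an illustrative sketch; the grid resolution and limits are arbitrary choices):

```python
import numpy as np

def ellipse_mask(X, Y, center, a, b, alpha):
    """Boolean mask of grid points inside a rotated ellipse."""
    c, s = np.cos(alpha), np.sin(alpha)
    dx, dy = X - center[0], Y - center[1]
    u = c * dx + s * dy          # coordinates in the ellipse frame
    w = -s * dx + c * dy
    return (u / a) ** 2 + (w / b) ** 2 <= 1.0

def ellipse_iou(e1, e2, lim=3.0, n=600):
    """Grid-based IoU = area(intersection) / area(union) of two
    ellipses, each given as (center, a, b, alpha)."""
    x = np.linspace(-lim, lim, n)
    X, Y = np.meshgrid(x, x)
    m1 = ellipse_mask(X, Y, *e1)
    m2 = ellipse_mask(X, Y, *e2)
    return (m1 & m2).sum() / (m1 | m2).sum()

iou_same = ellipse_iou(((0, 0), 1.0, 0.5, 0.0), ((0, 0), 1.0, 0.5, 0.0))
iou_shift = ellipse_iou(((0, 0), 1.0, 0.5, 0.0), ((1.0, 0), 1.0, 0.5, 0.0))
```

Identical ellipses yield an IoU of 1, while any displacement or rotation mismatch between the tracked and true extents pushes the value toward 0.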

3.3. Experimental Results

In this section, we verify the tracking effectiveness of the proposed method for the low-observable and extended UAV in four different scenarios: a mutual interference scenario, a strong clutter scenario, a mixed interference scenario and a rainy scenario. In these experiments, as shown in Figure 8a, a low-cost, high-resolution cascade MIMO radar with four AWR2243 chips was employed to track the flying UAV. In this cascaded radar system, with a local-oscillator (LO) signal distributed from a single master device to all four devices, the transmitting and receiving microstrip patch array antennas of the different devices can be regarded as an integrated transceiver supporting up to 9 TX and 16 RX antenna elements in azimuth. Meanwhile, the interfering radar used in our experiments is a TI AWR1642, as shown in Figure 8b. Since incoherent radar interference, in which the interfering and victim radars have different chirp parameters, occurs in most cases and always results in severe performance loss for extended UAV tracking [37,50], we mainly concentrate on incoherent radar interference in this article; the configured parameters of the interfering and victim radars are presented in Table 1. It should be pointed out that the dechirp technique [48] is employed in our radar, and thus the sample rate is significantly smaller than the bandwidth. Figure 8c shows the developed UAV, in which a high-precision ultra-wideband (UWB) positioning module was embedded to obtain the flying trajectories as the ground truth for comparison. In addition, as given in Figure 8d, the DJI Tello micro-UAV was also employed as the target in the rainy scenario. Three classical extended target tracking methods, i.e., DBSCAN [22], RMM [23,24] and SOEKF [27], and the point-target tracking method PFTBD [7] are also used for comparison.

3.3.1. Mutual Interference Scenario

In this part, the tracking results for the extended UAV in the presence of mutual radar interference (multiple radars in the same area have overlapping working frequency bands, so the receiver also picks up signals from other radars) are presented in Figure 9. In this experiment, the cascade MIMO radar was located at the coordinate origin and the UAV was controlled to fly a U-shaped trajectory in front of the cascade MIMO radar, as shown in Figure 9a. The start and end position coordinates of the UAV are (−4, 6) and (4, 6), respectively. The radar raw range–azimuth spectra of the 31st, 44th and 52nd radar frames are presented in Figure 9b–d. We can clearly see the random interfering speckles in the range–azimuth spectrum. In addition, one can also see that the returned signals from the UAV appear in the radar spectrum regularly. On this basis, the proposed method employs this spatiotemporal information to achieve weak UAV tracking and mutual interference suppression. The tracking results of the UAV extension and flying trajectory are respectively presented in Figure 9e,f. We can see that both the ghost measurements and the missed detections induced by the interfering speckles lead to poor tracking performance of RMM, SOEKF and DBSCAN. The estimated trajectories of RMM, SOEKF and DBSCAN gradually drift away from the ground truth, especially at the turning points. Meanwhile, the IoU between the tracked shape and the true extension is also given in Figure 9g. The IoU of the proposed method is larger than 0.3, with a maximum value approaching 0.6, which means that the proposed method can well track the shape and size of the extended UAV. Figure 9h also presents the error CDF of the tracked trajectory, where we can observe that the proposed method obtains superior UAV tracking performance compared with RMM, SOEKF, DBSCAN and PFTBD since it can well mitigate the mutual interferences by jointly addressing a sequence of radar spectra.

3.3.2. Strong Clutter Scenario

In this section, we evaluate the UAV tracking performance of the proposed method in the presence of strong clutter (objects in the environment with a high radar cross-section (RCS), such as buildings and large vehicles, interfere with radar systems). As shown in Figure 10a, two corner reflectors were placed in the radar FOV in advance to generate strong clutter interference. Then, the UAV flew a rectangular trajectory in front of our radar. The radar was again located at the coordinate origin, and the start/end position coordinates of the UAV were (4, 4). The radar raw range–azimuth spectra of the 54th, 69th and 79th radar frames are shown in Figure 10b–d. We can clearly see the ground clutter and the strong echoes of the two corner reflectors. Figure 10e shows the corresponding tracking results. We can observe in this figure that serious ghost measurements induced by the two corner reflectors and the ground clutter degrade the tracking results of RMM and SOEKF. The estimated trajectories are also presented in Figure 10f. Lacking measures to suppress the ghost measurements in a single radar frame, SOEKF and DBSCAN cannot complete weak UAV tracking in the presence of strong clutter. Although PFTBD employs the non-thresholded radar spectra to track the UAV, its point-target assumption is not appropriate for tracking an extended object. Instead of the explicit and fixed point-source model of PFTBD, the proposed method combines an elliptic shape with a spatial probability distribution as the extended-target spectrum model to derive the measurement likelihood ratio, achieving robust tracking of the extended UAV, as illustrated in Figure 10g,h. Additionally, we can observe from these figures that 100% of the tracking errors of the proposed method are within 0.5 m and most IoUs are larger than 0.4.
Accordingly, the proposed method can track the extended and low-observable UAV in the presence of strong clutter.

3.3.3. Mixed Interference Scenario

To further verify the effectiveness of the proposed method, we also conducted an experiment in the presence of mutual interference and strong clutter simultaneously. In this scenario, as given in Figure 11a, the UAV was controlled to fly a rectangular trajectory whose start/end position coordinate was (−4, 4). The experimental scene and the radar raw spectra are presented in Figure 11b–d. The random speckles from the interfering radar and the fixed strong clutter can be clearly seen in the radar raw spectra of these images. The tracking results of the different methods are presented in Figure 11e–h. It can be seen from Figure 11e,f that RMM, SOEKF, DBSCAN and PFTBD obtain completely wrong trajectories under the mixed interference scenario. Figure 11g depicts the estimated IoUs of RMM, SOEKF and the proposed method, and Figure 11h presents the error CDFs of the trajectories estimated by the different methods. It can be seen that the IoUs of RMM and SOEKF are near zero during the whole tracking process, which means that the shapes tracked by these two methods have few overlaps with the true extension. We can also notice in Figure 11g that, because the target signals were nearly submerged by interferences around the 28th and 56th frames, the tracking performance of the proposed method decreases quickly around these frames. Fortunately, by deriving an ESCI-based tracking framework, which employs the recursive Bayesian estimation-based track-before-detect strategy to complete UAV tracking and interference suppression by directly utilizing the non-thresholded radar spectra, the positioning errors of the proposed method are within 0.5 m and most IoUs are larger than 0.4. Hence, the proposed method obtains better tracking performance than RMM, SOEKF, DBSCAN and PFTBD in the presence of ghost targets generated by strong clutter or mutually interfering radars in urban low-altitude airspace.

3.3.4. Rainy Scenario

Robust tracking of UAVs under adverse weather conditions is a fundamental capability for modern radar systems. We therefore conducted a rainy-day experiment, in which the DJI Tello micro-UAV, which has an extremely low RCS, was controlled to fly a rectangular trajectory in front of our cascade MIMO radar under rainy conditions (dense precipitation particles generate dynamic distributed clutter in the radar echoes, attenuating the target signal), as illustrated in Figure 12a. The start/end position of the DJI Tello micro-UAV was (−2, 6). The range–azimuth spectra of the radar echoes at different radar frames are presented in Figure 12b–d. We can observe that the signal power of the DJI Tello micro-UAV is quite weak, with the UAV nearly submerged by noise and rain clutter. Additionally, we can also notice that the signal power of the raindrops around the radar is much stronger than that of the DJI Tello micro-UAV, and the SINRs in these three frames are all lower than 0 dB, which causes serious missed detections and false alarms after a threshold-decision process, as shown in Figure 12e. Since RMM and SOEKF highly depend on the radar measurements, they wrongly tracked the rain clutter, as illustrated in this figure. Fortunately, by jointly employing the spatiotemporal correlation information of the target across a sequence of radar raw spectra, the proposed method can accurately track the DJI Tello micro-UAV. As shown in Figure 12g, the IoUs of RMM and SOEKF approach zero, while that of our method is about 0.4. The trajectory tracking results and the corresponding error CDF are given in Figure 12f,h. We can see that more than 80% of the tracking errors of the proposed method are within 0.6 m, which is much better than the other SOTA algorithms. Accordingly, compared with the SOTA algorithms, the proposed method can robustly track the low-observable and extended UAV in the presence of rain clutter.

4. Performance Analysis and Discussion

4.1. Impact of Occlusion

In real applications, the flying UAV may be partially or completely occluded by other objects, which may lead to tracking failure. To evaluate the tracking performance of the proposed method in the occlusion scene (during radar detection, obstructions partially or completely block the echo signals of the target), we conducted an occlusion experiment in which the UAV flew a specific trajectory and a whiteboard was placed in front of the radar as an occluding object, as shown in Figure 13a. Meanwhile, while the flying UAV was being tracked, a volunteer also stood to the right front of the radar, causing a second occlusion. Figure 13b presents the corresponding radar spectrum, where we can clearly observe the occluding objects and the ground clutter. Figure 13c,d show the UAV tracking results of PFTBD, DBSCAN, SOEKF, RMM and the proposed method. We can observe that, although the proposed method failed to continuously track the UAV during the two occlusions, it re-tracked the UAV quickly after the target reappeared in the radar FOV. However, due to the interference of clutter, PFTBD, DBSCAN, SOEKF and RMM cannot complete UAV tracking. The estimated IoUs at different radar frames are presented in Figure 13e. We can see that the estimated IoUs of the proposed method decrease during the two occlusions and quickly increase to more than 0.4 after the UAV leaves the occlusion area, which implies that the proposed method can robustly track the UAV as long as it appears in the radar FOV. Finally, the error CDF of the estimated trajectory is presented in Figure 13f. We can see that 80% of the tracking errors of the proposed method are within 0.5 m, which is superior to the other SOTA algorithms.

4.2. Impact of SINR

Since the weak echoes from the extended UAV are easily submerged by noise and interference, Figure 14 presents, by means of Monte Carlo simulations, the tracking performance of the different methods under different SINRs. In this figure, Gaussian noise was added to the target echoes, and the input SINR varied from −9 dB to 7 dB with a step size of 2 dB. The tracking RMSEs versus SINR are first given in Figure 14a. We can see that the proposed method has tracking performance close to that of RMM, SOEKF, DBSCAN and PFTBD when the SINR is larger than 3 dB. Nevertheless, the tracking RMSEs of RMM, SOEKF and DBSCAN increase quickly when the SINR is less than 3 dB. Thanks to directly utilizing the non-thresholded radar spectrum, the RMSE of PFTBD does not increase until the SINR is less than −3 dB. Figure 14b presents the estimated IoUs versus SINR. One can also see that RMM, SOEKF and the proposed method estimate similar IoUs when the SINR is larger than 3 dB, while the IoUs of RMM and SOEKF decrease sharply when the SINR is less than 5 dB. Fortunately, with 2D-KT and the pseudo-spectrum radar measurement model, the RMSE of the proposed method is within 1 m and the corresponding IoU is about 0.4 when the SINR is reduced to −9 dB, corresponding to more than a 10 dB improvement of the anti-interference ability for extended UAV tracking.

4.3. Computational Complexity Analysis

The main procedure of the proposed method includes the range fast Fourier transform (FFT), 2D-KT and the ESCI-based tracking algorithm. Assume that $N_r$, $N_v$ and $N_\theta$ are, respectively, the number of range samples, the number of integrated pulses in one radar frame and the number of channels. Then, the computational cost of the range FFT is $O(N_\theta N_v N_r \log_2 N_r)$. The KT can be implemented by means of the chirp-z transform (CZT) instead of interpolation, with a computational complexity close to that of the FFT [44]. Consequently, the computational complexity of 2D-KT is about $O(N_r N_\theta N_v \log_2 N_v + N_v N_r N_\theta \log_2 N_\theta)$. Supposing that $N_q$ is the number of Monte Carlo samples, the computational complexity of the ESCI-based tracking algorithm is about $O(N_r N_\theta N_v N_q \log_2 N_q)$. Fortunately, considering that the amplitudes of the target echoes attenuate rapidly as the resolution cell moves away from the target, we employ an impact area of size $M_r \times M_\theta \times M_v$ to estimate the likelihood ratio in (25), avoiding access to all the resolution cells of the radar spectrum. On this basis, the computational complexity of the ESCI-based tracking algorithm is reduced to $O(M_r M_\theta M_v N_q \log_2 N_q)$. Assume $N_r = N_\theta = N_v = N_q = N$ and $M_r = M_\theta = M_v = M$. Then, the total computational burden of the proposed method can be approximately estimated as $O(3 N^3 \log_2 N + M^3 N \log_2 N)$. For PFTBD, the 3D-FFT operation and the sequential Monte Carlo tracking algorithm are the two main processing parts, with a computational complexity of $O(3 N^3 \log_2 N + M^3 N \log_2 N)$. For RMM, SOEKF and DBSCAN, the computational cost is mainly consumed by the 3D-FFT operation and the threshold-decision process. Hence, the computational costs of RMM, SOEKF and DBSCAN are $O(3 N^3 \log_2 N + L_w N^3)$, where $L_w$ is the length of the sliding window of the threshold-decision algorithm [45].
Since M is generally far less than N, the proposed method has a computational complexity close to that of PFTBD, RMM, SOEKF and DBSCAN. Additionally, the main computational consumption of the proposed method is concentrated in the radar signal processing, which can be implemented on an FPGA to meet real-time requirements in applications.
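The claim that M ≪ N keeps the two costs comparable can be checked with back-of-the-envelope operation counts (the values of N, M and L_w below are illustrative, and the counts are the symbolic ones from the analysis, not measured FLOPs):

```python
from math import log2

def ops_proposed(N, M):
    """O(3 N^3 log2 N + M^3 N log2 N): range FFT + 2D-KT plus the
    ESCI tracker restricted to an M x M x M impact area."""
    return 3 * N**3 * log2(N) + M**3 * N * log2(N)

def ops_threshold_based(N, Lw):
    """O(3 N^3 log2 N + Lw N^3): 3D-FFT plus a sliding-window
    threshold-decision detector (RMM/SOEKF/DBSCAN)."""
    return 3 * N**3 * log2(N) + Lw * N**3

N, M, Lw = 256, 16, 32      # illustrative sizes with M << N
ratio = ops_proposed(N, M) / ops_threshold_based(N, Lw)
# for M << N the two counts stay within the same order of magnitude
```

For these illustrative sizes the shared $3 N^3 \log_2 N$ transform term dominates both counts, so restricting the likelihood evaluation to the impact area keeps the ESCI tracker's overhead small.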

5. Conclusions

In this article, with a high-resolution cascade MIMO radar, we proposed an ESCI-based UAV tracking framework in the presence of strong clutters and radar mutual interference. The major properties of the proposed method can be concluded as follows.
(1) In a single radar frame, we presented a 2D-KT-based coherent integration algorithm to maximally accumulate the target power across different pulses and channels, coherently improving the SINR of the radar range–Doppler–azimuth spectrum from 12.84 dB to 31.3 dB.
(2) With a sequence of radar range–Doppler–azimuth spectra, we proposed an ESCI-based UAV tracking framework in the presence of radar mutual interferences and strong clutters by maximizing the posterior probability of the target, which satisfies Newtonian dynamic equations and elliptic energy distribution.
(3) Extensive experimental results show that the proposed method can achieve joint position and spatial extent estimation with high anti-interference ability, performing exceptionally well under complex occlusion, mixed interferences and adverse weather conditions.
However, in this work, we did not directly remove the interference from the radar raw spectrum before tracking, which results in a performance loss for UAV tracking when the echoes of the target and the interferences are close to each other in the radar range–Doppler–azimuth spectrum. In future work, we will develop interference identification, localization and elimination algorithms to further improve the UAV tracking performance in harsh environments.

Author Contributions

Conceptualization, G.H. and X.F.; methodology, G.H. and X.F.; software, G.H.; investigation, X.F. and Z.Z.; resources, X.F. and Z.Z.; data curation, G.H.; writing—original draft preparation, G.H.; writing—review and editing, X.F. and Z.Z.; supervision, D.H.; funding acquisition, X.F. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported in part by the National Natural Science Foundation of China (No. 62303386) and in part by the Natural Science Foundation of Sichuan Province (No. 24NSFSC0525).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data that support the findings of this study are available from the corresponding author upon reasonable request.

Acknowledgments

The authors thank the editor and anonymous reviewers for their helpful comments and valuable suggestions.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Biswas, K.; Ghazzai, H.; Sboui, L.; Massoud, Y. Urban air mobility: An IoT perspective. IEEE Internet Things Mag. 2023, 6, 122–128. [Google Scholar] [CrossRef]
  2. Cummings, C.; Mahmassani, H. Emergence of 4-D system fundamental diagram in urban air mobility traffic flow. Transp. Res. Rec. 2021, 2675, 841–850. [Google Scholar] [CrossRef]
  3. Li, G.; Liu, X.; Loianno, G. RotorTM: A flexible simulator for aerial transportation and manipulation. IEEE Trans. Robot. 2023, 40, 831–850. [Google Scholar] [CrossRef]
  4. Guo, J.; Chen, L.; Li, L.; Na, X.; Vlacic, L.; Wang, F.-Y. Advanced air mobility: An innovation for future diversified transportation and society. IEEE Trans. Intell. Veh. 2024, 9, 3106–3110. [Google Scholar] [CrossRef]
  5. Li, J.; Ye, D.H.; Kolsch, M.; Wachs, J.P.; Bouman, C.A. Fast and robust UAV to UAV detection and tracking from video. IEEE Trans. Emerg. Top. Comput. 2021, 10, 1519–1531. [Google Scholar] [CrossRef]
  6. Zhang, Y.; Deng, J.; Liu, P.; Li, W.; Zhao, S. Domain adaptive detection of MAVs: A benchmark and noise suppression network. IEEE Trans. Autom. Sci. Eng. 2024, 22, 1764–1779. [Google Scholar] [CrossRef]
  7. Huang, D.; Zhang, Z.; Fang, X.; He, M.; Lai, H.; Mi, B. STIF: A spatial–temporal integrated framework for end-to-end micro-UAV trajectory tracking and prediction with 4-D MIMO radar. IEEE Internet Things J. 2023, 10, 18821–18836. [Google Scholar] [CrossRef]
  8. Rai, P.K.; Idsøe, H.; Yakkati, R.R.; Kumar, A.; Khan, M.Z.A.; Yalavarthy, P.K.; Cenkeramaddi, L.R. Localization and activity classification of unmanned aerial vehicle using mmWave FMCW radars. IEEE Sens. J. 2021, 21, 16043–16053. [Google Scholar] [CrossRef]
  9. Wang, B.; Lai, T.; Wang, Q.; Huang, H. A nonlinear compensation method for enhancing the detection accuracy of weak targets in FMCW radar. Remote Sens. 2025, 17, 829. [Google Scholar] [CrossRef]
  10. Takamatsu, H.; Hinohara, N.; Suzuki, K.; Sakai, F. Experimental analysis of accuracy and precision in displacement measurement using millimeter-wave FMCW radar. Appl. Sci. 2025, 15, 3316. [Google Scholar] [CrossRef]
  11. Huang, Z.; Jiang, P.; Fu, M.; Deng, Z. Angle estimation for range-spread targets based on scatterer energy focusing. Sensors 2025, 25, 1723. [Google Scholar] [CrossRef] [PubMed]
  12. Jansen, F.; Laghezza, F.; Alhasson, S.; Lok, P.; van Meurs, L.; Geraets, R.; Paker, Ö.; Overdevest, J. Simultaneous multi-mode automotive imaging radar using cascaded transceivers. In Proceedings of the 2021 18th European Radar Conference (EuRAD), London, UK, 5–7 April 2022; pp. 441–444. [Google Scholar]
  13. Sun, S.; Petropulu, A.P.; Poor, H.V. MIMO radar for advanced driver-assistance systems and autonomous driving: Advantages and challenges. IEEE Signal Process. Mag. 2020, 37, 98–117. [Google Scholar] [CrossRef]
  14. Fan, L.; Wang, J.; Chang, Y.; Li, Y.; Wang, Y.; Cao, D. 4D mmWave radar for autonomous driving perception: A comprehensive survey. IEEE Trans. Intell. Veh. 2024, 9, 4606–4620. [Google Scholar] [CrossRef]
  15. Ahmed, N.; Rutten, M.; Bessell, T.; Kanhere, S.S.; Gordon, N.; Jha, S. Detection and tracking using particle-filter-based wireless sensor networks. IEEE Trans. Mob. Comput. 2010, 9, 1332–1345. [Google Scholar] [CrossRef]
  16. Liu, H.; Huang, C.; Gan, L.; Zhou, Y.; Truong, T.-K. Clutter reduction and target tracking in through-the-wall radar. IEEE Trans. Geosci. Remote Sens. 2019, 58, 486–499. [Google Scholar] [CrossRef]
  17. Kulikov, G.Y.; Kulikova, M.V. The accurate continuous-discrete extended Kalman filter for radar tracking. IEEE Trans. Signal Process. 2015, 64, 948–958. [Google Scholar] [CrossRef]
  18. Rakai, L.; Song, H.; Sun, S.; Zhang, W.; Yang, Y. Data association in multiple object tracking: A survey of recent techniques. Expert. Syst. Appl. 2022, 192, 116300. [Google Scholar] [CrossRef]
  19. Angle, R.B.; Streit, R.L.; Efe, M. A low computational complexity JPDA filter with superposition. IEEE Signal Process. Lett. 2021, 28, 1031–1035. [Google Scholar] [CrossRef]
  20. Sheng, H.; Chen, J.; Zhang, Y.; Ke, W.; Xiong, Z.; Yu, J. Iterative multiple hypothesis tracking with tracklet-level association. IEEE Trans. Circuits Syst. Video Technol. 2018, 29, 3660–3672. [Google Scholar] [CrossRef]
  21. Li, C.; Tsay, T.J. Robust visual tracking in cluttered environment using an active contour method. In Proceedings of the 2018 57th Annual Conference of the Society of Instrument and Control Engineers of Japan (SICE), Nara, Japan, 11–14 September 2018; pp. 53–58. [Google Scholar]
22. Wagner, T.; Feger, R.; Stelzer, A. Modification of DBSCAN and application to range/Doppler/DoA measurements for pedestrian recognition with an automotive radar system. In Proceedings of the 2015 European Radar Conference (EuRAD), Paris, France, 9–11 September 2015; pp. 269–272. [Google Scholar]
  23. Tuncer, B.; Özkan, E. Random matrix based extended target tracking with orientation: A new model and inference. IEEE Trans. Signal Process. 2021, 69, 1910–1923. [Google Scholar] [CrossRef]
  24. Larrat, M.; Sales, C. Classification of flying drones using millimeter-wave radar: Comparative analysis of algorithms under noisy conditions. Sensors 2025, 25, 721. [Google Scholar] [CrossRef]
  25. Granstrom, K.; Lundquist, C.; Gustafsson, F.; Orguner, U. Random set methods: Estimation of multiple extended objects. IEEE Robot. Autom. Mag. 2014, 21, 73–82. [Google Scholar] [CrossRef]
  26. Chen, Z.; Ristic, B.; Kim, D.Y. Bernoulli filter for extended target tracking in the framework of possibility theory. IEEE Trans. Aerosp. Electron. Syst. 2023, 59, 9733–9739. [Google Scholar] [CrossRef]
  27. Wang, S.; Men, C.; Li, R.; Yeo, T.-S. A maneuvering extended target tracking IMM algorithm based on second order EKF. IEEE Trans. Instrum. Meas. 2024, 73. [Google Scholar] [CrossRef]
  28. Wang, L.; Zhan, R. Joint detection, tracking, and classification of multiple maneuvering star-convex extended targets. IEEE Sens. J. 2024, 24, 5004–5024. [Google Scholar] [CrossRef]
  29. Aubry, A.; Maio, A.D.; Carotenuto, V.; Farina, A. Radar phase noise modeling and effects-part I: MTI filters. IEEE Trans. Aerosp. Electron. Syst. 2016, 52, 698–711. [Google Scholar] [CrossRef]
  30. Liu, Y.; Manikas, A. MIMO radar: An h-infinity approach for robust multitarget tracking in unknown cluttered environment. IEEE Trans. Aerosp. Electron. Syst. 2024, 60, 1740–1752. [Google Scholar] [CrossRef]
  31. Xue, J.; Fan, Z.; Xu, S. Adaptive coherent detection for maritime radar range-spread targets in correlated heavy-tailed sea clutter with lognormal texture. IEEE Geosci. Remote Sens. Lett. 2024, 21. [Google Scholar] [CrossRef]
  32. Huang, P.; Zou, Z.; Xia, X.-G.; Liu, X.; Liao, G.; Xin, Z. Multichannel sea clutter modeling for spaceborne early warning radar and clutter suppression performance analysis. IEEE Trans. Geosci. Remote Sens. 2020, 59, 8349–8366. [Google Scholar] [CrossRef]
  33. Liu, Z.; Zhu, S.; Xu, J.; He, X.; Duan, K.; Lan, L. Range-ambiguous clutter suppression for STAP-based radar with vertical coherent frequency diverse array. IEEE Trans. Geosci. Remote Sens. 2023, 61, 1–17. [Google Scholar] [CrossRef]
34. Cui, N.; Duan, K.; Xing, K.; Yu, Z. Beam-space reduced-dimension 3D-STAP for nonside-looking airborne radar. IEEE Geosci. Remote Sens. Lett. 2021, 19, 1–5. [Google Scholar] [CrossRef]
  35. Huang, D.; Cui, G.; Yu, X.; Ge, M.; Kong, L. Joint range–velocity deception jamming suppression for SIMO radar. IET Radar Sonar Navig. 2019, 13, 113–122. [Google Scholar] [CrossRef]
  36. Han, X.; He, H.; Zhang, Q.; Yang, L.; He, Y. Suppression of deception-false-target jamming for active/passive netted radar based on position error. IEEE Sens. J. 2022, 22, 7902–7912. [Google Scholar] [CrossRef]
  37. Aydogdu, C.; Keskin, M.F.; Carvajal, G.K.; Eriksson, O.; Hellsten, H.; Herbertsson, H.; Nilsson, E.; Rydstrom, M.; Vanas, K.; Wymeersch, H. Radar interference mitigation for automated driving: Exploring proactive strategies. IEEE Signal Process. Mag. 2020, 37, 72–84. [Google Scholar] [CrossRef]
  38. Wang, Y.; Huang, Y.; Liu, J.; Zhang, R.; Zhang, H.; Hong, W. Interference mitigation for automotive FMCW radar with tensor decomposition. IEEE Trans. Intell. Transp. Syst. 2024, 25, 9204–9223. [Google Scholar] [CrossRef]
  39. Neemat, S.; Krasnov, O.; Yarovoy, A. An interference mitigation technique for FMCW radar using beat-frequencies interpolation in the STFT domain. IEEE Trans. Microw. Theory Tech. 2018, 67, 1207–1220. [Google Scholar] [CrossRef]
  40. Yang, S.; Zhang, D.; Li, Y.; Hu, Y.; Sun, Q.; Chen, Y. iSense: Enabling radar sensing under mutual device interference. IEEE Trans. Mobile Comput. 2024, 23, 10554–10569. [Google Scholar] [CrossRef]
  41. Raeis, H.; Kazemi, M.; Shirmohammadi, S. CAE-MAS: Convolutional autoencoder interference cancellation for multiperson activity sensing with FMCW microwave radar. IEEE Trans. Instrum. Meas. 2024, 73, 1–10. [Google Scholar] [CrossRef]
  42. Patel, J.S.; Fioranelli, F.; Anderson, D. Review of radar classification and RCS characterisation techniques for small UAVs or drones. IET Radar Sonar Navig. 2018, 12, 911–919. [Google Scholar] [CrossRef]
  43. Pieraccini, M.; Miccinesi, L.; Rojhani, N. RCS measurements and ISAR images of small UAVs. IEEE Aerosp. Electron. Syst. Mag. 2017, 32, 28–32. [Google Scholar] [CrossRef]
44. Xu, J.; Yu, J.; Peng, Y.-N.; Xia, X.-G. Radon-Fourier transform for radar target detection, I: Generalized Doppler filter bank. IEEE Trans. Aerosp. Electron. Syst. 2011, 47, 1186–1202. [Google Scholar] [CrossRef]
  45. Fang, X.; Li, J.; Zhang, Z.; Xiao, G. FMCW-MIMO radar-based pedestrian trajectory tracking under low-observable environments. IEEE Sens. J. 2022, 22, 19675–19687. [Google Scholar] [CrossRef]
  46. Li, X.; Yang, Y.; Sun, Z.; Cui, G.; Yeo, T.S. Multi-frame integration method for radar detection of weak moving target. IEEE Trans. Veh. Technol. 2021, 70, 3609–3624. [Google Scholar] [CrossRef]
  47. Li, X.; Sun, Z.; Yeo, T.S.; Zhang, T.; Yi, W.; Cui, G.; Kong, L. STGRFT for detection of maneuvering weak target with multiple motion models. IEEE Trans. Signal Process. 2019, 67, 1902–1917. [Google Scholar] [CrossRef]
  48. Wang, J.; Cai, D.; Wen, Y. Comparison of matched filter and dechirp processing used in linear frequency modulation. In Proceedings of the 2011 IEEE 2nd International Conference on Computing, Control and Industrial Engineering, Wuhan, China, 20–21 August 2011; Volume 2, pp. 70–73. [Google Scholar]
  49. Robey, F.C.; Coutts, S.; Weikle, D.; McHarg, J.C.; Cuomo, K. MIMO radar theory and experimental results. In Proceedings of the Conference Record of the Thirty-Eighth Asilomar Conference on Signals, Systems and Computers, Pacific Grove, CA, USA, 7–10 November 2004; Volume 1, pp. 300–304. [Google Scholar]
50. Kim, G.; Mun, J.; Lee, J. A peer-to-peer interference analysis for automotive chirp sequence radars. IEEE Trans. Veh. Technol. 2018, 67, 8110–8117. [Google Scholar] [CrossRef]
Figure 1. Illustration of air transportation environments and extended object tracking problem. (a) Urban air transportation environments. (b) Extended object tracking problem.
Figure 2. Framework of the proposed method.
Figure 3. Geometry relationship and the extended range–azimuth spectrum. (a) Geometry relationship between radar and extended UAV. (b) Range–azimuth spectrum of extended UAV.
Figure 4. Cascade MIMO antenna array and the corresponding virtual antenna elements.
Figure 5. Flowchart of the proposed method.
Figure 6. Coherent integration results of 2D-KT. (a) Range–slow time plane of the first RX-TX channel before KT. (b) Range–slow time plane of the first RX-TX channel after KT. (c) Range–channel plane of the first chirp before KT. (d) Range–channel plane of the first chirp after KT. (e) Range–azimuth spectrum before 2D-KT. (f) Range–azimuth spectrum after 2D-KT.
Figure 7. Simulation results of extended UAV tracking. (a) Tracking results. (b) Estimated trajectories. (c) IoU between the tracked shape and the true extension. (d) Error CDF of estimated trajectory.
Figure 8. Experimental platform. (a) Cascade MIMO radar with four AWR2243 chips. (b) Interfering radar. (c) The developed UAV. (d) DJI Tello Micro-UAV.
Figure 9. Tracking results in the presence of mutual interference. (a) Experimental scene. (b) Radar spectrum in the 31st frame. (c) Radar spectrum in the 44th frame. (d) Radar spectrum in the 52nd frame. (e) Tracking results of UAV extension. (f) Tracking results of flying trajectory. (g) IoU between the tracked shape and the true extension. (h) Error CDF of estimated trajectory.
Figure 10. Tracking results in the presence of strong clutter. (a) Experimental scene. (b) Radar spectrum in the 54th frame. (c) Radar spectrum in the 69th frame. (d) Radar spectrum in the 79th frame. (e) Tracking results of UAV extension. (f) Tracking results of flying trajectory. (g) IoU between the tracked shape and the true extension. (h) Error CDF of estimated trajectory.
Figure 11. Tracking results in the presence of mutual interference and strong clutter. (a) Experimental scene. (b) Radar spectrum in the 52nd frame. (c) Radar spectrum in the 63rd frame. (d) Radar spectrum in the 73rd frame. (e) Tracking results of UAV extension. (f) Tracking results of flying trajectory. (g) IoU between the tracked shape and the true extension. (h) Error CDF of estimated trajectory.
Figure 12. Tracking results in the rainy scenario. (a) Experimental scene. (b) Radar spectrum in the 40th frame. (c) Radar spectrum in the 50th frame. (d) Radar spectrum in the 60th frame. (e) Tracking results of UAV extension. (f) Tracking results of flying trajectory. (g) IoU between the tracked shape and the true extension. (h) Error CDF of estimated trajectory.
Figure 13. Tracking results of extended UAV in the occlusion scene. (a) Experimental scene. (b) Radar raw spectrum. (c) Tracking results. (d) Estimated trajectories. (e) IoU between the tracked shape and the true extension. (f) Error CDF of estimated trajectory.
Figure 14. Tracking performance of extended UAV under different SINRs. (a) Tracking RMSEs with increasing SINR. (b) Estimated IoUs with increasing SINR.
Table 1. Parameter configuration of victim and interfering radar systems.

Parameter             Victim Radar     Interfering Radar
Carrier frequency     77 GHz           77 GHz
Bandwidth             2.4 GHz          0.87 GHz
Pulse duration        50 μs            18.98 μs
No. of transmitters   12               2
No. of receivers      16               4
Sweep slope           85.021 MHz/μs    85.021 MHz/μs
Sample rate           9000 kHz         6250 kHz
Range resolution      0.062 m          0.17 m
Azimuth resolution    1.4062°          14.3°
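As a quick sanity check (not part of the paper itself), the range-resolution rows of Table 1 follow directly from the listed chirp bandwidths via the standard FMCW relation ΔR = c/(2B). A minimal Python sketch:

```python
# Verify Table 1 range resolutions from the listed bandwidths using
# the standard FMCW relation: delta_R = c / (2 * B).
C = 299_792_458.0  # speed of light, m/s

def range_resolution(bandwidth_hz: float) -> float:
    """Theoretical FMCW range resolution in metres for a given sweep bandwidth."""
    return C / (2.0 * bandwidth_hz)

if __name__ == "__main__":
    victim = range_resolution(2.4e9)        # 2.4 GHz sweep -> ~0.062 m (Table 1)
    interferer = range_resolution(0.87e9)   # 0.87 GHz sweep -> ~0.172 m, rounds to 0.17 m
    print(f"victim radar:     {victim:.3f} m")
    print(f"interfering radar: {interferer:.3f} m")
```

Both values agree with the table, confirming that the victim radar's wider 2.4 GHz sweep is what gives it roughly 2.8× finer range resolution than the interferer.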

Hu, G.; Fang, X.; Huang, D.; Zhang, Z. ESCI: An End-to-End Spatiotemporal Correlation Integration Framework for Low-Observable Extended UAV Tracking with Cascade MIMO Radar Subject to Mixed Interferences. Electronics 2025, 14, 2181. https://doi.org/10.3390/electronics14112181