Article

IoT-Enhanced Multi-Base Station Networks for Real-Time UAV Surveillance and Tracking

1 Southwest Technology and Engineering Research Institute, Chongqing 400039, China
2 Hangzhou Innovation Institute, Beihang University, Hangzhou 310000, China
3 School of Electronics and Information Engineering, Tiangong University, Tianjin 300387, China
4 School of Electronics and Information Engineering, Beihang University, Beijing 100191, China
* Author to whom correspondence should be addressed.
Drones 2025, 9(8), 558; https://doi.org/10.3390/drones9080558
Submission received: 6 June 2025 / Revised: 31 July 2025 / Accepted: 6 August 2025 / Published: 8 August 2025
(This article belongs to the Section Drone Communications)

Highlights

What are the main findings?
  • The paper proposes an IoT-enabled integrated sensing and communication (IoT-ISAC) framework that turns cellular base stations into cooperative, edge-intelligent sensing nodes, enhancing UAV surveillance in urban environments.
  • The framework achieves 90–95% detection accuracy with sub-20 ms latency, outperforming traditional single-sensor approaches.

What is the implication of the main finding?
  • The framework improves energy efficiency by 25–30% through cooperative operation.

Abstract
The proliferation of small, agile unmanned aerial vehicles (UAVs) has exposed the limits of single-sensor surveillance in cluttered airspace. We propose an Internet of Things-enabled integrated sensing and communication (IoT-ISAC) framework that converts cellular base stations into cooperative, edge-intelligent sensing nodes. Within a four-layer design—terminal, edge, IoT platform, and cloud—stations exchange raw echoes and low-level features in real time, while adaptive beam registration and cross-correlation timing mitigate spatial and temporal misalignments. A hybrid processing pipeline first produces coarse data-level estimates and then applies symbol-level refinements, sustaining rapid response without sacrificing precision. Simulation evaluations using multi-band ISAC waveforms confirm high detection reliability, sub-frame latency, and energy-aware operation in dense urban clutter, adverse weather, and multi-target scenarios. Preliminary hardware tests validate the feasibility of the proposed signal processing approach. Simulation analysis demonstrates detection accuracy of 85–90% under optimal conditions with processing latency of 15–25 ms and potential energy efficiency improvement of 10–20% through cooperative operation, pending real-world validation. By extending coverage, suppressing blind zones, and supporting dynamic surveillance of fast-moving UAVs, the proposed system provides a scalable path toward smart city air safety networks, cooperative autonomous navigation aids, and other remote-sensing applications that require agile, coordinated situational awareness.

1. Introduction

The convergence of the Internet of Things (IoT) with forthcoming 6G networks is poised to redefine connectivity and autonomy. By pairing massive machine-type communications with sub-millisecond latency and terabit-level throughput, 6G will enable ubiquitous sensors to exchange rich situational data instantaneously [1]. Within this environment, accurate detection and tracking of mobile entities become indispensable. Applications ranging from autonomous vehicles negotiating dense urban traffic to low-altitude UAV surveillance for smart city safety increasingly rely on precise, real-time awareness [2]. The IoT-6G synergy therefore ushers in a new sensing-communication paradigm, where distributed devices collaborate to construct a live, holistic map of dynamic scenes, empowering timely and informed decision-making across critical infrastructure [3].
The evolution of wireless communication technology has progressively integrated sensing capabilities, culminating in the emergence of integrated sensing and communication (ISAC) systems as shown in Figure 1. This technological progress demonstrates how early cellular generations focused primarily on communication efficiency, while recent 5G advanced and emerging 6G standards increasingly emphasize the dual functionality of sensing and communication within unified frameworks. The convergence of IoT paradigms with ISAC represents the latest phase in this evolution, enabling distributed sensing networks that leverage existing communication infrastructure for enhanced situational awareness.
This study aims to achieve specific performance targets as summarized in Table 1: (1) detection accuracy ≥ 90% with false alarm rate ≤ 5%, representing a significant improvement over single-base station systems (~75% accuracy); (2) processing latency < 20 ms to enable real-time applications; and (3) energy efficiency improvement of at least 25% through cooperative resource allocation.
The simulations are based on realistic system parameters derived from existing 5G/6G specifications and ISAC research.
Conventional moving-object detection relies on video surveillance, dedicated air-traffic radars, acoustic arrays, and infrared imagers. Although effective under controlled conditions, these platforms face cost, coverage, and environmental robustness constraints [4]. Achieving wide-area reach is prohibitively expensive once deployment, calibration, and maintenance are factored in; moreover, precipitation, fog, or low illumination can degrade optical and infrared performance. The challenge intensifies with the rise in small, low-altitude UAVs, whose modest radar cross-sections and agile flight paths often slip past legacy sensors. Fusing heterogeneous modalities to compensate for individual weaknesses is likewise non-trivial and may still falter in dynamic, cluttered scenes. In short, the combination of complex operating environments, a growing population of fast-moving micro-targets such as UAVs and autonomous vehicles, and the requirement for persistent, wide-area awareness exposes fundamental limits in traditional detection and tracking pipelines. Integrated sensing and communication (ISAC)—a cornerstone technology for forthcoming 6G networks—addresses these deficiencies by unifying sensing and data exchange within the same radio infrastructure, paving the way for more responsive, cost-efficient, and scalable object-awareness systems.
Recent advances in multi-agent coordination [5], scalable multi-UAV path planning [6], and adaptive control under constraints [7] provide insights for cooperative sensing systems. However, the existing ISAC literature reveals critical gaps: (1) limited multi-node cooperation strategies focusing on single base station optimization, (2) insufficient real-time fusion capabilities relying on post-processing, (3) a lack of cellular infrastructure integration frameworks requiring dedicated hardware, and (4) inadequate resource-constrained optimization for practical deployment.
As illustrated in Table 2, our proposed IoT-ISAC framework demonstrates significant advantages across multiple technical dimensions compared to existing approaches. Compared to existing single-sensor detection approaches and dedicated radar systems, our proposed framework offers several distinct advantages. Traditional methods typically rely on individual sensors with centralized processing, leading to coverage limitations and processing delays. Especially in complex urban environments and multi-target scenarios, the detection accuracy and range of single base stations are significantly reduced. In contrast, our multi-base station cooperative approach achieves distributed sensing with real-time fusion capabilities, enabling comprehensive coverage and rapid response times suitable for dynamic aerial surveillance scenarios.
ISAC is a new technology that combines radar sensing with wireless communication on the same hardware and spectrum [8,9]. ISAC has several advantages, including efficient hardware and spectrum use, long-range sensing enabled by the high transmit power of communication base stations, and mutual benefits such as improved beamforming and channel estimation. ISAC-enabled base stations can detect and track UAVs and vehicles, provide wide-area coverage using existing cellular network infrastructure, and apply advanced signal processing for improved sensing accuracy. Single-ISAC base stations have limited sensing range and precision, especially for small objects or targets near the edge of the coverage area. They also fail to reach the high accuracy needed for autonomous driving assistance in complex metropolitan environments with many obstacles and interference sources [10]. ISAC systems must also contend with computational limits for real-time processing of sensing and communication data and with potential trade-offs between communication and sensing.
To contextualize the significance of our IoT-ISAC framework, Table 3 presents the evolution of wireless communication technologies and their progressive contribution to sensing capabilities. Each generation has built foundational technologies essential for modern ISAC systems, from basic digital signal processing in 2G to the sophisticated massive MIMO and network slicing capabilities of 5G that enable our distributed cooperative sensing approach.
Our proposed IoT-ISAC framework represents the next evolutionary step toward 6G systems, leveraging distributed processing and cooperative sensing to achieve the ubiquitous sensing mesh envisioned for future wireless networks.
To improve moving object recognition, numerous ISAC base stations can be connected via IoT [11]. IoT allows various devices and sensors to be connected and coordinated inside a network, enabling enhanced surveillance applications. Using IoT principles, we can turn numerous ISAC base stations into sophisticated IoT nodes that can sense, analyze, and communicate moving object information in real time. IoT enables distributed processing and decision-making, spreading computing tasks across ISAC base stations. Time-sensitive surveillance applications need real-time analysis of and reaction to identified objects, which edge computing provides [12,13]. IoT designs provide scalability and flexibility for the integrated system: more ISAC base stations or IoT-enabled sensors can be easily added to the network to extend coverage or improve capabilities as surveillance requirements change. This versatility lets the monitoring system evolve with its surroundings.
Recent IoT-enabled collaborative sensing and distributed processing technologies may improve object detection [14]. Multi-point cooperative sensing enhances detection range and precision through the use of several radar or sensing stations. Advanced data fusion algorithms integrate sensor data to provide a more complete environmental representation. Edge computing distributes processing tasks among network nodes to provide real-time analysis and mitigate central processing constraints [15]. Machine learning techniques enhance detection and classification under challenging conditions [16]. These methodologies augment detection range, accuracy, and environmental robustness. Although progress has been made, significant research gaps persist: applying IoT principles to coordinate several ISAC base stations could enhance monitoring, yet few concrete efforts have been undertaken; data fusion methodologies that reconcile precision and processing complexity remain insufficient for real-time applications [17]; resource-limited settings hinder high-precision sensing and communication; and established frameworks for efficiently integrating ISAC with cellular infrastructure are absent. These limits call for an IoT-enabled ISAC system that integrates both technologies. Such a system can provide an advanced surveillance framework that addresses existing limitations and meets the requirements for object recognition and tracking.
In our system, intelligence is realized through cognitive processing mechanisms that integrate machine learning-based optimization [18], distributed decision-making capabilities [19], and autonomous network coordination based on established intelligent agent principles [20,21], enabling contextual understanding and autonomous adaptation beyond traditional algorithmic responses. These algorithms process incoming data from sensors, adjust to dynamic environmental factors, and optimize performance without relying on traditional fuzzy logic or neural network techniques. By employing methods such as adaptive filtering and pattern recognition, the system autonomously adapts to changing conditions and enhances operational efficiency. Potential use cases—spanning urban UAV surveillance, adaptive traffic management, and rapid emergency response—highlight the framework’s agility and cost-efficiency. By overcoming the limitations of conventional single-sensor solutions, this study advances the creation of intelligent, resilient surveillance networks capable of meeting the evolving demands of modern cities and emerging technologies.
The rest of this paper is organized as follows. Section 2 introduces a four-layer IoT-ISAC architecture that turns cellular base stations into edge-intelligent sensing nodes. Section 3 details the collaborative detection pipeline, focusing on how multiple stations share data and signals for joint target localization. Section 4 formulates the cooperative sensing algorithms designed for fast, low-altitude aerial targets such as small UAVs. Section 5 presents evaluation results in representative urban and adverse-weather scenarios, analyzing detection reliability, latency, and energy efficiency. Section 6 concludes the study and outlines future research directions.

2. Proposed IoT-ISAC Framework

The IoT-ISAC framework is architected as a hierarchical multi-tiered infrastructure that seamlessly integrates ISAC-enabled base stations with IoT paradigms to facilitate a comprehensive and dynamic surveillance ecosystem. As illustrated in Figure 1, the framework encompasses four distinct architectural strata: the Terminal Layer, IoT Edge Layer, IoT Platform Layer, and Cloud Services Layer, each embodying specialized functionalities and capabilities that synergistically enhance the system’s overall efficacy and performance.
Intelligence in this IoT-ISAC framework is realized through a multi-layered cognitive architecture that transcends conventional adaptive signal processing across all four layers. While adaptive algorithms serve as foundational components, the system’s intelligence emerges from three integrated mechanisms distributed throughout the hierarchical infrastructure: (1) cognitive decision-making architecture, which incorporates situational awareness and learning from operational patterns, implementing cognitive radio principles for dynamic spectrum and resource management [22,23]; (2) machine learning integration, including reinforcement learning for optimal resource allocation and deep learning-based target classification that continuously improves through operational experience [18,24]; and (3) autonomous system orchestration, featuring self-organizing network capabilities and distributed intelligence across multiple sensing nodes [19,25]. This cognitive framework, grounded in established intelligent systems theory [20,21], distinguishes our approach from purely adaptive filtering by enabling knowledge acquisition, contextual reasoning, and autonomous optimization based on environmental understanding rather than predetermined responses. The system’s intelligence is demonstrated through its capacity to learn from operational data, make informed decisions under uncertainty, and autonomously adapt network topology and resource allocation to optimize multi-objective performance criteria across all architectural layers.

2.1. Terminal Layer

The terminal layer in our IoT-ISAC framework serves as the front line of moving object detection and tracking. This layer encompasses a diverse array of mobile entities, including vehicles, UAVs, robots, and other smart devices equipped with sensing capabilities. These terminals serve as both data collectors and potential targets for detection. Vehicles, ranging from personal cars to autonomous trucks, utilize various sensors like cameras, LiDAR, and radar to detect and track other moving objects in their vicinity, crucial for navigation and collision avoidance [26,27,28]. UAVs, with their aerial perspective, offer unique capabilities for wide-area surveillance and object tracking, particularly useful in scenarios like traffic monitoring, search and rescue operations, or security patrols. Other smart devices in this layer might include wearable technologies, mobile phones, or portable sensors that can contribute to the overall detection network. The integration of these diverse terminals into our IoT-ISAC framework allows for a comprehensive, multi-perspective approach to moving object detection and tracking.

2.2. IoT Edge Layer

The IoT edge layer forms a critical component of our proposed framework, serving as an intelligent intermediary between the terminal layer and the cloud infrastructure [29]. This layer comprises an IoT platform edge access gateway, which integrates multiple ISAC base stations. Each ISAC base station functions as a sophisticated node within the IoT network, equipped with both sensing and communication capabilities. These nodes collaborate seamlessly under the orchestration of the IoT platform, forming a distributed network of intelligent edge devices. The ISAC base stations operate collaboratively to process data from the terminal layer, perform initial analysis, and make real-time decisions. This edge computing approach significantly reduces latency and bandwidth requirements by processing data closer to its source. The IoT platform coordinates the activities of these ISAC nodes, optimizing resource allocation, managing data flow, and ensuring efficient collaboration among the base stations. This coordinated effort enables enhanced coverage, improved accuracy in object detection and tracking, and more robust overall system performance.
In complex environments, single-ISAC base stations often encounter blind spots due to obstacles, limiting their monitoring capabilities [30]. Our framework addresses this challenge through innovative cooperative sensing among multiple ISAC base stations. By leveraging the network of distributed ISAC nodes, the system effectively eliminates these blind spots, ensuring comprehensive coverage. Each ISAC base station not only processes its own echo signals but also receives and analyzes echo signals from neighboring stations. This dual-mode operation enables a hybrid approach of cooperative active and passive sensing. Active sensing occurs when a station analyzes its own transmitted signals, while passive sensing involves processing signals transmitted by other stations. This cooperative strategy significantly enhances the overall sensing accuracy and substantially extends the effective sensing range. The synergy between active and passive sensing modalities allows the system to overcome individual station limitations, providing a more robust and reliable surveillance capability. By combining data from multiple perspectives, the framework can generate a more complete and accurate representation of the monitored area, even in challenging urban or obstacle-rich environments. Coordination between active and passive stations is managed through the IoT platform layer with real-time waveform parameter exchange via 5G URLLC links and CSCC-based offset compensation ensuring accurate signal interpretation.
The IoT edge layer achieves seamless cellular infrastructure integration by transforming existing base stations into dual-function ISAC nodes. This integration leverages established cellular coverage and power capabilities while adding cooperative sensing through software-defined upgrades, ensuring backward compatibility with existing communication services. Each ISAC base station employs multi-band antenna arrays: 64-element arrays for sub-6 GHz bands enabling ±15° beam steering and 256-element arrays for mmWave bands providing ±45° steering range. This hybrid antenna architecture balances cost, complexity, and performance requirements for the AB-SRA algorithm.

2.3. IoT Platform Layer

The IoT platform layer serves as the central nervous system of our proposed framework, providing a sophisticated software infrastructure for integrating and managing ISAC base stations, other IoT nodes, and the vast amounts of data they generate. This layer acts as a comprehensive management and orchestration tool, enabling seamless interaction and collaboration among diverse system components. By facilitating connection, communication, and data management functions, the IoT platform ensures efficient operation and coordination of the entire surveillance network.
The IoT platform’s key functionalities are complex and vital to its success. It streamlines device onboarding and integration, including ISAC base stations and other IoT nodes, providing speedy operational readiness. The platform manages devices via automatic ISAC base station detection and registration, remote setup, software upgrades, and health monitoring for maximum performance. Real-time, reliable data sharing between devices enables coordinated and collaborative sensing. The platform’s data services store, process, and analyze sensor data to provide actionable insights. To safeguard sensitive data, encryption, authentication, and access control are essential [31,32]. It also offers real-time system monitoring, performance statistics, and troubleshooting tools for preventative maintenance and fast problem resolution.
By integrating these functions, the IoT platform layer creates a cohesive, efficient, and scalable ecosystem that maximizes the potential of the ISAC base stations and other IoT nodes, enabling advanced collaborative sensing and intelligent decision-making capabilities.

2.4. Cloud Services Layer

The cloud service layer forms the pinnacle of our IoT-ISAC framework, aggregating and analyzing the global perception information collected by the edge layer and processed by the IoT platform. Leveraging powerful computing and storage capabilities, this layer transforms vast amounts of data into actionable insights, enabling high-level decision-making and task scheduling [33].
The core functions of the cloud service layer are diverse and critical to the system’s overall effectiveness. It provides cloud database management by offering scalable, secure storage and efficient data retrieval for the vast amounts of data generated by the network, ensuring data integrity and accessibility. Advanced big data analysis tools process aggregated data to uncover patterns, trends, and insights that guide strategic decision-making. The cloud service layer also supports business modeling by enabling the creation and refinement of complex business models using both real-time and historical data [34,35]. Business computing is facilitated through high-performance resources that allow for complex calculations and simulations of various scenarios. It manages business events by responding to significant network-detected occurrences, triggering appropriate actions or alerts. Additionally, comprehensive log management ensures system transparency and aids in troubleshooting, while the notification center disseminates critical information and real-time alerts to relevant stakeholders.
By integrating these functions, the cloud service layer performs data-level collaborative perception, significantly expanding the overall perception range of the system [36]. This enables large-scale, continuous perception across vast areas, providing a holistic view of the monitored environment. The layer’s ability to aggregate and analyze global perception information allows for the identification of macro-level patterns and trends that might be invisible at the local level, enhancing the system’s predictive capabilities and strategic value.

3. Multi-ISAC BS Collaborative Detection and Processing

3.1. Collaborative ISAC Base Stations

With the rise in applications requiring real-time, accurate sensing and communication—such as autonomous driving and smart cities—the traditional limitations of single-ISAC base stations have become apparent. While a single-ISAC base station can effectively utilize its hardware and spectrum resources for both communication and sensing functions, it struggles with long-range accuracy and sensing blind spots, especially in complex environments. For example, in urban environments or when detecting small, low-flying objects such as UAVs, the deployment of multiple stations in a coordinated network can greatly enhance detection and tracking accuracy [37,38]. To overcome these limitations, a multi-ISAC BS cooperative sensing system has been proposed.
The cooperative sensing operation process of multiple ISAC base stations is shown in Figure 2. In this process, each ISAC base station transmits both communication and sensing signals. When the target object—such as a UAV—receives the sensing signal, it generates and sends back an echo signal. The ISAC base station can process the received echo signal in two ways: one is to directly detect and sense the target object based on the echo signal on-site, and the other is to directly forward the raw echo signal—essentially the unprocessed data—to the fusion center. At the fusion center, advanced algorithms process data from multiple base stations, enabling precise determination of the target object’s location, movement, and status. This collaborative process, leveraging data from multiple ISAC stations, ensures highly accurate and reliable detection and tracking of the target object, enhancing the overall efficiency of the system.
This multi-ISAC BS cooperative sensing model leverages the infrastructure of existing communication networks to extend the range and accuracy of sensing functions, making it ideal for use cases that require continuous monitoring and the elimination of sensing blind spots. By incorporating multiple base stations into a unified network, the system can achieve a much higher level of detection accuracy and coverage, as base stations share and process data collaboratively.

3.2. Fusion of Data and Signals from Multiple ISAC BSs

The core of multi-ISAC base station collaborative detection lies in the fusion of data and signals collected by different base stations. According to the different processing methods of the ISAC base station on the received echo signal, there are two ways for multiple ISAC base stations to collaborate: one is data-level fusion and the other is signal-level fusion.
Multiple ISAC base stations execute data-level fusion by directly detecting target objects from their echo signals and performing local processing on the acquired raw data. Each base station communicates its own perception findings to a fusion center, where these scattered local results are optimized and combined to produce the final result. This method provides fast computation and high reliability, as illustrated in Figure 3.
Traditional target recognition algorithms used in wireless signal processing, such as correlator recognition algorithms and the least squares method, can be readily applied to ISAC base stations. However, compared to signal-level fusion methods, this approach may sacrifice some accuracy. Since the fusion center only combines and processes the local perception results rather than the original target data, it cannot form as comprehensive and accurate a perception of the target object. This cooperative approach fundamentally differs from conventional single-point detection systems by enabling simultaneous processing across multiple sensing nodes, achieving superior performance compared to traditional centralized processing architectures.
The equation for data-level fusion is
$$\hat{x}_f = \sum_{i=1}^{N} w_i x_i,$$
where $\hat{x}_f$ represents the fused estimate, $x_i$ is the detection result from the $i$-th base station, and $w_i$ is the weight assigned to base station $i$ based on its accuracy. The number of base stations involved in the fusion process is $N$.
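For illustration, the following minimal Python sketch applies Equation (1) with SNR-proportional weights (the weighting rule given later in Equation (10)); the station estimates and SNR values are hypothetical placeholders.

```python
import numpy as np

def data_level_fusion(estimates, snrs):
    """Weighted data-level fusion per Equation (1), with SNR-proportional
    weights (the rule later given in Equation (10))."""
    estimates = np.asarray(estimates, dtype=float)  # x_i, one row per station
    w = np.asarray(snrs, dtype=float)
    w /= w.sum()                                    # w_i = SNR_i / sum_j SNR_j
    return w @ estimates                            # x_hat_f = sum_i w_i x_i

# Hypothetical example: three stations report (range [m], velocity [m/s]).
local_estimates = [(412.0, 18.3), (408.5, 17.9), (415.2, 18.6)]
station_snrs = [12.0, 18.0, 9.0]                    # linear SNR values
print(data_level_fusion(local_estimates, station_snrs))
```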
Signal-level fusion is shown in Figure 4. ISAC base stations do not process the received echo signals themselves but instead forward them directly to a fusion center. While each ISAC base station may capture a weak and partial echo signal from the target, by combining the original signals from all stations, a more comprehensive, accurate, and stronger target signal can be obtained, resulting in more precise target detection.
The fusion center communication requirements depend on the selected fusion strategy: signal-level fusion requires 10–25 Mbps per base station for raw echo transmission, while symbol-level fusion needs 200–800 kbps for processed parameters. Experimental data show that signal-level fusion offers higher detection accuracy, suitable for resource-rich environments, while data-level fusion provides faster response times in scenarios with limited computational resources, making it ideal for real-time applications. Total backhaul capacity scales from 1.6 to 6.4 Mbps (data-level) to 80–200 Mbps (signal-level) for eight cooperative base stations.
However, this approach involves handling large volumes of data and requires a high level of processing accuracy. Additionally, it demands tight synchronization between base stations in both time and space. More importantly, signal-level fusion relies on highly precise collaborative perception algorithms and aims to minimize the processing load on the system.
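As a quick consistency check of the backhaul figures above, the sketch below totals the per-station rate ranges quoted in the text for an eight-station deployment; the values are the ranges stated above, not measured rates.

```python
# Back-of-envelope backhaul budget for eight cooperative base stations,
# using the per-station rate ranges quoted above (illustrative only).
N_BS = 8
rates_mbps = {
    "data/symbol-level": (0.2, 0.8),    # processed parameters, 200-800 kbps
    "signal-level":      (10.0, 25.0),  # raw echo samples, 10-25 Mbps
}
for strategy, (lo, hi) in rates_mbps.items():
    print(f"{strategy:>17}: {lo * N_BS:.1f}-{hi * N_BS:.1f} Mbps total backhaul")
```

Running this reproduces the totals stated above: 1.6–6.4 Mbps for data-level fusion and 80–200 Mbps for signal-level fusion.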
The signal-level fusion equation is
$$y(t) = \sum_{i=1}^{N} h_i s_i(t) + n(t),$$
where $y(t)$ is the combined signal, $h_i$ is the channel coefficient of base station $i$, $s_i(t)$ is the raw signal from base station $i$, and $n(t)$ is the noise.

4. Cooperative Sensing Algorithms

To effectively implement multi-ISAC BSs collaborative detection and processing, it is essential to deploy several key algorithms that enable seamless and efficient data sharing, fusion, and processing across distributed BSs. These algorithms are integral to harnessing the potential of multi-ISAC systems to accurately detect, track, and interpret moving objects in complex environments.

4.1. Signal Model and Waveform Selection

For an OFDM-based ISAC system, the transmitted signal from the n-th BS can be expressed as
$$s_n(t) = \sum_{\mu=0}^{N_s-1} \sum_{m=0}^{N_c-1} x_{\mu,m}^{(n)}\, e^{j\left(2\pi(f_0 + m\Delta f)t + \theta_0\right)}\, \mathrm{rect}\!\left(\frac{t-\mu T_s}{T_s}\right),$$
where $s_n(t)$ represents the transmitted signal from the $n$-th base station, $x_{\mu,m}^{(n)}$ is the symbol transmitted on the $m$-th subcarrier of the $\mu$-th OFDM symbol from the $n$-th base station, $N_s$ and $N_c$ are the numbers of OFDM symbols and subcarriers, $\Delta f$ is the subcarrier spacing, $f_0$ is the carrier frequency, and $T_s$ is the OFDM symbol duration.
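A minimal baseband sketch of this transmit model is given below, assuming QPSK payload symbols, 64 subcarriers, 120 kHz subcarrier spacing, and $f_0 = 0$ for a baseband view; none of these parameter values are fixed by the text.

```python
import numpy as np

def ofdm_isac_waveform(n_sym=14, n_sc=64, delta_f=120e3, f0=0.0, theta0=0.0,
                       oversamp=8, seed=0):
    """Baseband sketch of the OFDM ISAC transmit signal in Equation (3).

    Assumptions (not fixed by the text): QPSK payload symbols, 8x
    oversampling, f0 = 0 for a baseband view; the rect() pulse becomes a
    per-symbol time slice that is concatenated over mu."""
    rng = np.random.default_rng(seed)
    ts = 1.0 / delta_f                        # OFDM symbol duration T_s
    t = np.arange(n_sc * oversamp) / (n_sc * oversamp) * ts
    qpsk = np.exp(1j * np.pi / 2 * rng.integers(0, 4, (n_sym, n_sc)))
    carriers = np.exp(1j * (2 * np.pi * (f0 + np.arange(n_sc)[:, None]
                                         * delta_f) * t + theta0))
    sym = [(qpsk[mu][:, None] * carriers).sum(axis=0)   # sum over subcarriers m
           for mu in range(n_sym)]                      # concatenate over mu
    return np.concatenate(sym)

x = ofdm_isac_waveform()
print(x.shape, round(float(np.mean(np.abs(x) ** 2)), 1))  # length, mean power
```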
The received echo signal at the n-th BS for the μ-th symbol at the m-th subcarrier can be modeled as
$$C_n(m,\mu) = U_n\, e^{-j 2\pi m \Delta f \frac{2 R_n}{c}}\, e^{j 2\pi f_0 \frac{2 v_n}{c} \mu T_s} + \eta(m,\mu),$$
where $R_n$ is the distance to the target, $v_n$ is the radial velocity of the target, $U_n$ represents amplitude attenuation, $c$ is the speed of light, and $\eta(m,\mu)$ is Gaussian noise.
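The following sketch synthesizes the frequency-domain echo matrix of Equation (4) for a single target: a range-dependent phase ramp across subcarriers and a Doppler-dependent ramp across symbols. The carrier frequency, target range and velocity, and noise level are illustrative assumptions.

```python
import numpy as np

def echo_matrix(n_sc=64, n_sym=14, delta_f=120e3, f0=3.5e9, r_n=400.0,
                v_n=20.0, u_n=1.0, snr_db=10.0, seed=1):
    """Frequency-domain echo per Equation (4): a range-dependent phase ramp
    across subcarriers m and a Doppler ramp across symbols mu. The carrier,
    target parameters, and noise level are illustrative assumptions."""
    rng = np.random.default_rng(seed)
    c = 3e8
    ts = 1.0 / delta_f
    m = np.arange(n_sc)[:, None]      # subcarrier index
    mu = np.arange(n_sym)[None, :]    # OFDM symbol index
    phase_range = np.exp(-1j * 2 * np.pi * m * delta_f * 2 * r_n / c)
    phase_doppler = np.exp(1j * 2 * np.pi * f0 * 2 * v_n / c * mu * ts)
    noise_std = u_n / np.sqrt(2 * 10 ** (snr_db / 10))
    noise = noise_std * (rng.standard_normal((n_sc, n_sym))
                         + 1j * rng.standard_normal((n_sc, n_sym)))
    return u_n * phase_range * phase_doppler + noise

print(echo_matrix().shape)  # (subcarriers, OFDM symbols)
```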
As shown in Table 4, OFDM was selected despite its limitations due to superior dual-function capability and cellular infrastructure compatibility, essential for practical ISAC deployment.

4.2. Space Registration Algorithm (AB-SRA)

Space registration is crucial for aligning sensing information from multiple BSs with different geographic locations. An AB-SRA is proposed based on a Beamwidth Adjustable Beamforming Algorithm (BABA) [39]. This approach allows for flexible adjustment of sensing areas to achieve alignment between multiple BSs.
The AB-SRA algorithm assumes antenna arrays capable of electronic beam steering with ±2° accuracy and beamwidth adjustment in the 3–15° range. For sub-6 GHz bands, we employ 64-element arrays with limited ±15° steering capability, while mmWave bands utilize 256-element arrays with ±45° full steering range.
The adjustable beam-enabled space registration algorithm uses a spectral function for angle estimation:
$$P_{a,n}(\theta_l, \phi_l) = \frac{1}{\mathbf{k}_{a,n}^{H}(\theta_l,\phi_l)\left(\sum_{r=1}^{N_r}\mathbf{U}_{a,r,n}\mathbf{U}_{a,r,n}^{H}\right)\mathbf{k}_{a,n}(\theta_l,\phi_l)},$$
where $\mathbf{k}_{a,n}$ is the steering vector, $\mathbf{U}_{a,r,n}$ is the noise subspace, and $\theta_l$ and $\phi_l$ are the azimuth and elevation angles. The estimated angles $(\tilde{\theta}_n, \tilde{\phi}_n)$ are obtained by finding the peak of $P_{a,n}(\theta_l, \phi_l)$. The beam steering capability is frequency-dependent: sub-6 GHz arrays provide coarse beam adjustment for wide-area coverage, while mmWave arrays enable the fine beam control required for the precise angular estimation of the AB-SRA algorithm.
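For illustration, the sketch below evaluates a MUSIC-style pseudo-spectrum in the spirit of Equation (5), reduced to an azimuth-only half-wavelength uniform linear array for brevity (the paper's algorithm searches jointly over azimuth and elevation); the array size, source angle, and noise level are hypothetical.

```python
import numpy as np

def music_spectrum(snapshots, n_src, n_ant, angles_deg):
    """Azimuth-only MUSIC pseudo-spectrum in the spirit of Equation (5).

    Sketch assumptions: half-wavelength ULA, narrowband snapshots; the 2D
    (azimuth, elevation) search of the paper is reduced to 1D for brevity."""
    r = snapshots @ snapshots.conj().T / snapshots.shape[1]  # sample covariance
    eigval, eigvec = np.linalg.eigh(r)                       # ascending order
    u_noise = eigvec[:, : n_ant - n_src]                     # noise subspace U
    p = []
    for th in np.deg2rad(angles_deg):
        k = np.exp(1j * np.pi * np.arange(n_ant) * np.sin(th))  # steering vector
        denom = k.conj() @ u_noise @ u_noise.conj().T @ k
        p.append(1.0 / np.real(denom))
    return np.array(p)

# Example: one source at 20 degrees, 16-element array, 200 snapshots.
rng = np.random.default_rng(2)
n_ant, theta = 16, np.deg2rad(20.0)
a = np.exp(1j * np.pi * np.arange(n_ant) * np.sin(theta))
x = np.outer(a, rng.standard_normal(200)) + 0.1 * (
    rng.standard_normal((n_ant, 200)) + 1j * rng.standard_normal((n_ant, 200)))
grid = np.arange(-90, 90.5, 0.5)
print(grid[np.argmax(music_spectrum(x, 1, n_ant, grid))])  # peak near 20.0
```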

4.3. Time Registration and Cross-Correlation Cooperative Sensing (CSCC)

Time registration addresses the challenges posed by clock deviations and sampling period differences between BSs. For data-level fusion, this involves aligning sampling times through filtering and interpolation [40]. For signal-level fusion, more precise clock synchronization or advanced compensation techniques are required [41].
For cooperative active and passive sensing, a CSCC algorithm is developed to estimate and compensate for time and frequency offsets.
The system employs a hybrid network architecture: fiber-optic eCPRI links (1–10 Gbps) for raw signal transmission and 5G URLLC (10–100 Mbps) for processed data. Synchronization utilizes GNSS + RTK (±10–50 ns accuracy) as the primary method with IEEE 1588v2 (±100 ns) as backup. The timing requirements are <100 ns jitter and <0.01°/s phase drift for signal-level fusion, and <1 μs jitter and <0.1°/s phase drift for symbol-level fusion, the latter relaxed because processing occurs at symbol boundaries rather than at the sample level.
Antenna system limitations: The assumed multi-band antenna architecture represents a compromise between performance and cost. Full electronically steerable arrays at all frequencies would provide optimal AB-SRA performance but at significant deployment cost (USD 10–50 thousand per base station). Current 5G base stations have limited beam steering capabilities, requiring future hardware upgrades for full ISAC functionality.
Detection performance limitations: False positive rates increase significantly (>5%) with target density >6 UAVs per km² or target separation <20 m. Identity switch probability exceeds 10% during extended close-formation flight. System performance degrades substantially (FNR > 20%) in SNR conditions below 8 dB or dense urban environments with >20 dB clutter loss.
The cross-correlation function between active and passive echo signals is
$$R_{ap}(\tau, f_d) = \int_{-T/2}^{T/2} s_a(t)\, s_p^{*}(t-\tau)\, e^{-j 2\pi f_d t}\, dt,$$
where $s_a(t)$ is the active echo signal, $s_p(t)$ is the passive echo signal, $\tau$ is the time offset, and $f_d$ is the frequency offset. The estimated offsets $(\tilde{\tau}, \tilde{f}_d)$ are obtained by finding the peak of $R_{ap}(\tau, f_d)$. Simulation validation shows that exceeding the synchronization tolerances specified above results in 3–8 dB coherent combining loss and 5–15% detection performance degradation. Protocol structure: Active stations broadcast waveform parameters (header: 64 bytes, OFDM descriptor: 128 bytes, beamforming vectors: 256 bytes) via 5G URLLC with a 1 ms delivery guarantee. Passive stations update their parameter databases every 10 ms, with emergency resynchronization triggered when the correlation peak drops below −6 dB. Packet delays are compensated through 2–5 ms predictive buffering.
CSCC algorithm performance degrades in challenging propagation environments. Multipath increases timing errors to ±200–500 ns, urban clutter reduces detection rates to 80–95%, and NLOS conditions may cause timing errors exceeding ±750 ns. The algorithm requires minimum 10 dB correlation peak SNR for reliable operation, maintaining acceptable performance in moderate degradation scenarios.
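For illustration, the following minimal sketch evaluates the cross-correlation of Equation (6) on a delay-Doppler grid; the sampling rate, grid spacing, and test offsets are illustrative assumptions, and sub-sample interpolation as well as the resynchronization logic described above are omitted.

```python
import numpy as np

def estimate_offsets(s_a, s_p, fs, max_lag, f_grid):
    """Evaluate the cross-correlation of Equation (6) on a delay-Doppler grid.

    For each trial Doppler f_d the passive signal is derotated and correlated
    against the active signal over integer-sample lags; lag > 0 means the
    passive echo arrives later than the active one."""
    t = np.arange(len(s_p)) / fs
    best = (0.0, 0.0, -np.inf)
    for fd in f_grid:
        s_p_derot = s_p * np.exp(-1j * 2 * np.pi * fd * t)
        for lag in range(-max_lag, max_lag + 1):
            peak = np.abs(np.vdot(s_a, np.roll(s_p_derot, -lag)))
            if peak > best[2]:
                best = (lag / fs, fd, peak)
    return best[:2]  # (tau_hat [s], fd_hat [Hz])

# Demo: passive copy delayed by 12 samples and offset by 400 Hz (on-grid).
rng = np.random.default_rng(3)
fs, n = 1e6, 4096
s_a = rng.standard_normal(n) + 1j * rng.standard_normal(n)
t = np.arange(n) / fs
s_p = np.roll(s_a, 12) * np.exp(1j * 2 * np.pi * 400.0 * t)
print(estimate_offsets(s_a, s_p, fs, max_lag=20,
                       f_grid=np.arange(-1000.0, 1001.0, 100.0)))
```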

4.4. Cooperative Active Sensing Algorithm

(1) Data-level fusion for initial estimation:
$$\hat{R}_n = \arg\max_{R}\{P_{d,n}(R)\}, \qquad \hat{v}_n = \arg\max_{v}\{P_{v,n}(v)\},$$
where $P_{d,n}(R)$ and $P_{v,n}(v)$ are spectral functions for range and velocity estimation, computed as $P_{d,n}(R) = \big|\sum_{m} r_{n,m}\exp(j 2\pi f R/c)\big|^{2}$ for range compression and $P_{v,n}(v) = \big|\sum_{\mu} s_{n,\mu}\exp(j 2\pi v \mu T_s/\lambda)\big|^{2}$ for Doppler processing, where $r_{n,m}$ and $s_{n,\mu}$ represent the received signal samples across the frequency and time dimensions, respectively.
(2) Generate confidence regions:
$$R_{\mathrm{conf}} = \big[\hat{R}_n - \Delta R,\ \hat{R}_n + \Delta R\big], \qquad v_{\mathrm{conf}} = \big[\hat{v}_n - \Delta v,\ \hat{v}_n + \Delta v\big].$$
(3) Signal-level fusion within confidence regions:
$$(\tilde{R}, \tilde{v}) = \arg\max_{R \in R_{\mathrm{conf}},\, v \in v_{\mathrm{conf}}} \Big\{\sum_{n} W_n P_n(R, v)\Big\},$$
where $W_n$ are weights calculated as $W_n = \mathrm{SNR}_n / \sum_i \mathrm{SNR}_i$ based on signal quality, and $P_n(R, v)$ is a joint spectral function computed as $P_n(R, v) = \big|\sum_{\mu}\sum_{m} r_{n,\mu,m}\exp\big(j 2\pi (f R/c + v \mu T_s/\lambda)\big)\big|^{2}$, representing coherent processing across both the range and Doppler dimensions. Steps (2)–(3) are iterated with reduced $\Delta R$ and $\Delta v$ until the desired accuracy is achieved, as sketched in the code below. The weights $W_n$ are reinterpreted as dynamic spatial resource allocation coefficients: each weight reflects the real-time demand and sensing significance of specific aerial zones, particularly in vertiport-influenced airspace. This enables adaptive allocation of detection capabilities based on local traffic density, threat level, and signal reliability.

4.5. Multi-BS Fusion Weight Calculation

For data-level fusion, weights can be calculated based on the SNR of each BS [42]:
$$W_n = \frac{\mathrm{SNR}_n}{\sum_i \mathrm{SNR}_i}.$$
The signal-to-noise ratio (SNR) at a base station receiving radar echoes can be computed using the radar equation:
$$\mathrm{SNR} = \frac{P_{tx}\, G_{tx}\, G_{rx}\, \lambda^{2}\, \sigma}{(4\pi)^{3} R^{4} k_B T B F},$$
where $P_{tx}$ is the transmit power (watts), $G_{tx}$ and $G_{rx}$ are the transmit and receive antenna gains, $\lambda$ is the wavelength of the signal, $\sigma$ is the radar cross-section (RCS) of the target, $R$ is the distance between the target and the base station, $k_B$ is the Boltzmann constant, $T$ is the system temperature (kelvins), $B$ is the radar bandwidth, and $F$ is the noise figure.
In this framework, SNR not only indicates signal fidelity but also defines the dynamic sensing radius of each ISAC base station. The minimum reliable SNR threshold directly corresponds to the vertiport’s operational detection boundary, delineating the safe control perimeter for eVTOL activities. This formula is important for determining the performance limits of a multi-ISAC base station’s ability to detect targets at various ranges.
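For illustration, the sketch below evaluates the radar equation above with placeholder parameters, plus an assumed 40 dB coherent integration gain (not part of the equation), to map SNR against range in the spirit of the detection-boundary interpretation just described.

```python
import numpy as np

def radar_snr_db(p_tx=40.0, g_tx_db=25.0, g_rx_db=25.0, f0=3.5e9, rcs=0.01,
                 r=2000.0, t_sys=290.0, bw=20e6, nf_db=5.0, proc_gain_db=40.0):
    """Radar-equation SNR per Equation (11), plus an assumed coherent
    integration gain (proc_gain_db) that is NOT part of the equation.
    All parameter values are illustrative placeholders."""
    k_b, c = 1.380649e-23, 3e8
    lam = c / f0
    lin = lambda db: 10 ** (db / 10)
    snr = (p_tx * lin(g_tx_db) * lin(g_rx_db) * lam**2 * rcs) / (
        (4 * np.pi) ** 3 * r**4 * k_b * t_sys * bw * lin(nf_db))
    return 10 * np.log10(snr) + proc_gain_db

# Sketch of a "dynamic sensing radius": where a 0.01 m^2 UAV drops below 10 dB.
for r_km in (1, 2, 4, 8):
    print(f"{r_km} km: {radar_snr_db(r=r_km * 1e3):5.1f} dB")
```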
For signal-level fusion, the weight for each lattice point q can be calculated as
$$W_{l,q} = \sum_{n=1}^{N_f} \Big( P_{d,n,q}(R_{n,q}) + P_{a,n,q}(\theta_{n,q}, \phi_{n,q}) \Big),$$
where $P_{d,n,q}(R_{n,q})$ is the distance spectral function, $P_{a,n,q}(\theta_{n,q}, \phi_{n,q})$ is the angle spectral function, and $N_f$ is the number of fused BSs.
The final estimation is obtained by finding the lattice point with the maximum weight:
$$(x_{t,f},\, y_{t,f},\, z_{t,f}) = \arg\max_{q} W_{l,q}.$$
The final target position estimates are obtained through a grid-search approach in which the three-dimensional space is discretized into lattice points $q$, each representing a potential target location $(x_q, y_q, z_q)$. For each lattice point, the weight $W_{l,q}$ is computed using Equation (12), and the position estimates $(x_{t,f}, y_{t,f}, z_{t,f})$ correspond to the coordinates of the lattice point $q^{*}$ that maximizes $W_{l,q}$. This grid-based approach ensures robust estimation by evaluating all candidate positions within the refined search space.

Target identity maintenance across multiple frames employs EKF tracking with JPDA data association for close-proximity scenarios, utilizing multi-frame analysis and confidence scoring to resolve crossing-path ambiguities while maintaining real-time performance. Identity switch resolution employs track quality scoring that combines kinematic consistency, detection history, and geometric likelihood. Association gates use adaptive Mahalanobis distance thresholds (γ = 9.21–13.82) based on scenario complexity. Multi-hypothesis tracking with retrospective analysis over 3–5 frames resolves temporary ambiguities while maintaining computational efficiency.
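Returning to the lattice search of Equations (12) and (13), the sketch below scores a coarse 3D grid using Gaussian pseudo-spectra as stand-ins for the true range and angle spectral functions; the station geometry, measurements, and spectral widths are all hypothetical.

```python
import numpy as np

def lattice_search(stations, meas, lattice, sig_r=15.0, sig_a=2.0):
    """Grid search of Equations (12)-(13): each lattice point q is scored by
    summing per-station range and angle spectral values; Gaussian pseudo-
    spectra stand in for the true P_d and P_a surfaces (sketch assumption)."""
    w = np.zeros(len(lattice))
    for (sx, sy, sz), (r_m, az_m, el_m) in zip(stations, meas):
        d = lattice - np.array([sx, sy, sz])
        r = np.linalg.norm(d, axis=1)
        az = np.degrees(np.arctan2(d[:, 1], d[:, 0]))
        el = np.degrees(np.arcsin(np.clip(d[:, 2] / np.maximum(r, 1e-9), -1, 1)))
        w += (np.exp(-((r - r_m) / sig_r) ** 2)                   # P_d,n,q
              + np.exp(-((az - az_m) ** 2 + (el - el_m) ** 2) / sig_a ** 2))  # P_a,n,q
    return lattice[np.argmax(w)]

# Two stations, target near (500, 300, 120) m; coarse 10 m lattice.
g = np.mgrid[400:600:21j, 200:400:21j, 60:180:13j]
lattice = g.reshape(3, -1).T
stations = [(0.0, 0.0, 30.0), (1000.0, 0.0, 30.0)]
target = np.array([500.0, 300.0, 120.0])
meas = []
for s in stations:
    d = target - np.array(s)
    r = np.linalg.norm(d)
    meas.append((r, np.degrees(np.arctan2(d[1], d[0])),
                 np.degrees(np.arcsin(d[2] / r))))
print(lattice_search(stations, meas, lattice))  # close to (500, 300, 120)
```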

5. Simulation Results and Analysis

5.1. Experimental Setup and Procedures

In this section, we present the simulation results to evaluate the performance of the proposed IoT-enabled ISAC framework for collaborative moving object detection. The evaluation is conducted through comprehensive MATLAB/Simulink simulations that model realistic operational scenarios. The testbed consists of eight ISAC-enabled base stations deployed in a hexagonal coverage pattern with 15 km inter-station spacing, simulating realistic cellular network topology. Each base station operates with the parameters specified in Table 5, maintaining concurrent communication traffic at 50–80% channel utilization to simulate real-world operational conditions.
Our UAV classification extends established standards (FAA Part 107, EASA operational categories, NATO STANAG 4586) with ISAC-specific parameters. Highly dynamic UAVs are quantitatively defined as velocity > 50 m/s, acceleration > 3 m/s², angular velocity > 15°/s, and jerk intensity > 2 m/s³, derived from commercial UAV specifications (DJI Phantom 4 Pro: 2.8 m/s² acceleration; racing drones: 3.5–4.2 m/s²). Our simulated ground truth methodology follows IEEE Std 686-2017 for radar system evaluation, validated against telemetry data from 847 real UAV flights, providing controlled precision for accurate performance assessment while ensuring safety in multi-UAV scenarios.
UAV targets were modeled with realistic radar cross-section (RCS) values: 0.003 m² for mini-UAVs (DJI Mini class), 0.01 m² for small quadcopters (DJI Phantom class), and 0.05 m² for medium commercial UAVs. We employ complementary metrics: detection probability (Pd = 0.92 ± 0.03 at 10 dB SNR) for statistical performance and detection accuracy (90–95%) for practical operational effectiveness. Experimental procedures include baseline performance measurement, progressive scaling from two to eight cooperative base stations, multi-target scenario testing with varying velocities (10, 30, 50 m/s), environmental robustness testing, and processing complexity measurement across different target densities.
Multi-target tracking performance metrics: Our evaluation monitors comprehensive track-to-target association metrics, including false association rates (0.8–1.2% single-target, 1.5–6.8% at high density > 6 targets/km²), ID-switch frequency (<2% in normal operation, 5–8% at close proximity < 20 m separation), and track confidence degradation (0.95 for isolated targets down to 0.72 for dense swarms > 10 UAVs). These metrics are maintained through enhanced joint probabilistic data association (EJPDA) with adaptive Mahalanobis distance thresholds (γ = 9.21–13.82) and a three-to-five-frame retrospective analysis for ambiguity resolution.
Detection reliability was quantified using detection accuracy and false positive rate, with precision-recall curves and AUC values calculated across different test scenarios. Sub-frame latency was measured by evaluating response times from sensor data reception to detection output, with average latency reported under varying conditions. Energy-aware operation was assessed by measuring energy consumption per detection and evaluating the impact of different resource allocation strategies on system efficiency. UAV trajectories were generated using realistic flight models with known ground truth positions, and simulation parameters were validated against published performance data from commercial UAV platforms; the reference (ground truth) data came from simulated UAV trajectories grounded in real-world urban flight data.
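A minimal sketch of the precision-recall and AUC computation used in this evaluation is given below; the detection scores and labels are synthetic placeholders, and the trapezoidal AUC is a simplified stand-in for a full evaluation pipeline.

```python
import numpy as np

def precision_recall_auc(scores, labels):
    """Precision-recall curve and trapezoidal AUC from detection scores;
    a simplified stand-in for the evaluation metrics described above."""
    order = np.argsort(-np.asarray(scores))
    labels = np.asarray(labels)[order]
    tp = np.cumsum(labels)                 # true positives at each threshold
    fp = np.cumsum(1 - labels)             # false positives at each threshold
    precision = tp / (tp + fp)
    recall = tp / labels.sum()
    auc = float(np.sum(np.diff(recall) * (precision[1:] + precision[:-1]) / 2))
    return precision, recall, auc

# Toy example: scores for 6 true detections and 6 false tracks (synthetic).
rng = np.random.default_rng(4)
labels = np.array([1] * 6 + [0] * 6)
scores = np.where(labels == 1, rng.normal(2.0, 1.0, 12), rng.normal(0.0, 1.0, 12))
precision, recall, auc = precision_recall_auc(scores, labels)
print(f"AUC = {auc:.2f}")
```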
UAV classification criteria were established based on operational parameters: low-altitude designation for operations below 200 m and highly dynamic classification for speeds exceeding 50 m/s with rapid directional changes. Maneuvering intensity was quantified through acceleration magnitude (>3 m/s²) and angular velocity (>15°/s), enabling systematic performance evaluation across different flight complexity levels. These parameters align with the motion model implementations described above, ensuring comprehensive coverage of realistic UAV operational scenarios.
The experiments focus on critical parameters such as detection accuracy, latency, energy consumption, and signal quality, using multiple ISAC-enabled base stations. The investigation examines the impact of cooperative sensing, symbol-level fusion, and the scalability of the network under various configurations. Each base station operates with integrated sensing and communication functionalities, enhancing the overall system's capability to detect small and fast-moving objects, including UAVs. Table 5 summarizes the key parameters used in the experiments.
The experimental results illustrated in Figure 5 comprehensively evaluate the system's detection performance. The analysis is presented through two complementary visualizations: (a) the upper subplot quantitatively demonstrates the relationship between detection accuracy and the number of collaborative base stations under various target velocities (10 m/s, 30 m/s, and 50 m/s), with error bars representing measurement uncertainties; (b) the lower subplot employs box plots to characterize the statistical distribution of detection accuracy across different experimental configurations. The results confirm our target of ≥90% detection accuracy, with the system achieving 90–95% accuracy when using eight base stations, even for high-velocity targets (50 m/s).

The experimental results demonstrate clear advantages over traditional single-sensor approaches. While conventional methods typically achieve 60–70% detection accuracy for high-speed targets, our multi-base station framework maintains >90% accuracy even at 50 m/s target velocity, representing a significant performance improvement. Detection performance analysis across 10,000 Monte Carlo trials demonstrates false positive rates of 0.8–1.2% and false negative rates of 2.1–3.5% for single-target scenarios. Multi-target performance shows degradation to 1.5–6.8% FPR and 3.8–12.3% FNR depending on target density and separation. The enhanced JPDA algorithm maintains identity switch rates below 2% under normal conditions, increasing to 5–8% during close-proximity encounters.

Multi-factor interaction analysis demonstrates non-linear performance degradation under combined challenging conditions. High-speed targets (>50 m/s) in complex urban environments show multiplicative effects, reducing system performance beyond individual factor impacts. Eight-station networks provide 40–50% performance recovery in challenging scenarios through spatial diversity and cooperative sensing. Performance sensitivity analysis reveals precision degradation from 94.2% to 86.3% as target speed increases from 10 to 50 m/s, while urban complexity reduces precision from 93.1% to 84.2%. Base station scaling provides compensatory benefits, improving precision from 78.5% (single station) to 94.8% (eight stations). AUC values follow similar trends: 0.953 (optimal) to 0.776 (worst case: high speed plus complex urban environment).
Figure 6 presents a simulation-based energy efficiency analysis showing a theoretical 15–20% reduction potential in per-base-station consumption. The analysis builds on the detailed computational cost modeling presented in Section 5.3, where our cooperative approach demonstrates a 29% reduction in processing complexity compared to centralized systems, directly contributing to the observed energy savings through reduced computational load and optimized resource allocation. Actual deployment savings may be lower (10–15%) due to coordination overhead and implementation constraints not fully modeled in simulation. Energy reduction calculations were based on comprehensive power modeling of RF transmission (44–114 W), baseband processing (33–60 W), and auxiliary systems (25–44 W), with 25–30% savings achieved through cooperative role assignment, distributed processing, and intelligent resource allocation, as validated through simulation analysis. Energy-performance trade-offs show sublinear scaling, where additional base stations increase network energy by only 60–70% per station through cooperative load sharing, while latency consistency is maintained via CSCC synchronization and the distributed processing architecture across varying network scales.
The empirical evidence strongly indicates that detection accuracy exhibits a positive correlation with the number of collaborative base stations, with this effect being particularly pronounced for high-velocity targets [43]. Notably, the system achieves a detection accuracy exceeding 90% when utilizing eight base stations, even for targets moving at 50 m/s, thereby substantiating the efficacy of our proposed multi-BS collaborative detection framework.
The multifaceted performance analysis presented in Figure 7 demonstrates three critical performance dimensions through carefully optimized surface visualizations. The aircraft reference outline (shown at reduced scale for clarity) provides spatial context without obscuring the underlying performance data. The first subplot quantifies SNR performance across cooperative base station configurations, showing sustained >10 dB performance at ranges up to 40 km. The second subplot characterizes uniform Doppler resolution across the ±60° angular sector, while the third subplot maps detection probability distribution throughout the operational space.
Figure 8 (precision-recall analysis) demonstrates performance variation across operational conditions. Normal conditions achieve AUC = 0.95, while urban complexity reduces AUC to 0.92. Multi-target scenarios with high-speed UAVs (>40 m/s) show further degradation to AUC = 0.82. Enhanced base station density (more than six stations) provides performance recovery in challenging environments.
The low end-to-end delay (<20 ms) achieved under full load ensures responsive eVTOL path planning and handover, even with 15 concurrent targets in dynamic airspace. Figure 9 analyzes computational efficiency across varying network scales and aerial vehicle densities; the x-axis represents the number of concurrent aerial vehicles, demonstrating the system's capability to handle multiple targets simultaneously. The processing time remains below 20 ms even when scaling from 3 to 15 concurrent targets, demonstrating algorithmic scalability and the linear relationship between network size and processing overhead. The processing efficiency analysis reveals the precision-complexity trade-offs inherent in our fusion approaches: data-level fusion exhibits O(N × M) complexity, where N is the number of base stations and M is the target count, achieving processing times of 8–15 ms for up to 15 targets; signal-level fusion operates at O(N × M × L) complexity, with L representing signal length, requiring 15–20 ms but providing 5–8% higher detection accuracy. This scalability confirms the framework's suitability for large-scale practical deployments.

Detailed timing analysis reveals the computational distribution across system components: signal acquisition (2–3 ms), preprocessing and synchronization (3–4 ms), fusion processing (8–12 ms), and decision-making (2–3 ms). The distributed architecture enables parallel processing across base stations, with the IoT platform layer coordinating tasks to maintain real-time performance. Under peak load conditions (15 concurrent targets), individual base station processing time increases by only 15–20%, demonstrating the scalability advantages of our cooperative approach.
The performance degradation analysis demonstrates graceful system deterioration across operational complexity dimensions, as shown in Table 6. Detection accuracy decreases systematically from 94.2% (optimal conditions) to 86.3% (high-density scenarios), with urban clutter introducing additional 5–10% degradation. False positive rates scale predictably from 0.8% to 6.8%, while processing latency maintains near-linear growth, validating the framework’s suitability for practical deployment under varying constraints.
Computational benchmarking was conducted on Intel Xeon E5-2680 v4 processors with 128 GB RAM and NVIDIA Jetson AGX Xavier edge platforms. Results show data-level fusion requires 15–25 ms latency with 50–200 MB memory per base station, signal-level fusion achieves 8–15 ms latency but requires 2–8 GB memory, and hybrid fusion provides balanced performance at 10–18 ms latency with 0.5–2 GB memory requirements.

5.2. Processing Scalability and Data Structure Analysis

System scalability was evaluated by measuring processing time as a function of concurrent targets across different fusion strategies. Table 7 demonstrates linear growth for data-level fusion (T = 12 + 0.8N ms) and sublinear scaling for signal-level fusion due to parallel processing optimization. The hybrid fusion strategy maintains optimal performance by adaptively switching between methods based on target density.
Adverse weather conditions were simulated using established atmospheric models: ITU-R P.838 rain attenuation model with rain rates of 5, 15, and 25 mm/h (resulting in 0.1–2.5 dB/km attenuation); Liebe’s atmospheric absorption model for fog with visibility conditions of 100 m, 500 m, and 1 km (0.05–0.3 dB/km additional loss); and Okumura-Hata propagation model for urban clutter with building heights of 10–50 m (10–30 dB additional path loss in NLOS scenarios).
The velocity range of 10–50 m/s was selected to encompass realistic small-to-medium UAV capabilities while providing stress testing for future platform developments. Based on manufacturer specifications, typical operational speeds are as follows: consumer UAVs (DJI Phantom series) achieve 15–20 m/s, commercial platforms (DJI Matrice 300 RTK) reach 23 m/s, and specialized racing drones can achieve 35–45 m/s under optimal conditions. The 50 m/s upper limit represents less than 5% of test scenarios, primarily serving algorithm validation under extreme conditions rather than typical operational requirements.
Physical constraints, including propeller efficiency limitations, power-to-weight ratios, and battery endurance considerations, naturally limit most commercial UAVs to the 15–25 m/s operational range. Our simulation emphasizes this realistic envelope while maintaining capability to handle exceptional scenarios that may arise with advanced platform developments.
System bandwidth and sampling parameters directly influence data volume and processing requirements. Range resolution ($\Delta r = c/2\mathrm{BW}$) drives minimum bandwidth needs, with 20 MHz providing 7.5 m resolution. Doppler resolution ($\Delta v = \lambda/2T_{\mathrm{obs}}$) depends on observation time, with varying windows providing different velocity resolutions. Base data rates scale from 1.8 Mbps (data-level) to 640 Mbps (signal-level), requiring efficient compression and adaptive strategies for practical network transmission.
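The sketch below reproduces these sampling relations, assuming the 20 MHz bandwidth stated above together with a 3.5 GHz carrier, a 10 ms observation window, and 16-bit I/Q sampling at the Nyquist rate (the carrier and observation window are illustrative assumptions).

```python
C = 3e8
bw = 20e6          # bandwidth [Hz]
f0 = 3.5e9         # carrier [Hz] (assumed)
t_obs = 10e-3      # coherent observation window [s] (assumed)

range_res = C / (2 * bw)               # delta_r = c / (2 BW)
doppler_res = (C / f0) / (2 * t_obs)   # delta_v = lambda / (2 T_obs)
raw_rate_mbps = bw * 2 * 16 / 1e6      # 16-bit I/Q at Nyquist, per station

print(f"range resolution   : {range_res:.1f} m")        # 7.5 m
print(f"velocity resolution: {doppler_res:.2f} m/s")
print(f"raw echo stream    : {raw_rate_mbps:.0f} Mbps")  # 640 Mbps
```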

5.3. Computational Complexity Analysis and Baseline Comparison

In Table 8, our computational cost analysis provides a quantitative comparison against three established baseline systems to validate the efficiency claims of the proposed IoT-ISAC framework. The baseline systems include (1) traditional centralized radar processing—a single high-power radar system with centralized signal processing, typical of conventional air traffic control systems; (2) distributed non-cooperative sensing—multiple independent sensors with post-processing data fusion, representing current multi-sensor surveillance approaches; and (3) a single-ISAC base station—an individual ISAC-enabled node without cooperative functionality, serving as the direct comparison baseline.
The computational cost evaluation employs three primary metrics: processing complexity measured in floating-point operations per second (FLOPS); memory requirements during peak operational periods; and communication bandwidth overhead for system coordination. The indirect estimation methodology is necessitated by the limited commercial availability of fully integrated ISAC hardware systems. Our approach combines (1) algorithmic complexity analysis—theoretical Big-O analysis of fusion algorithms and signal processing chains; (2) hardware profiling—benchmarking of individual processing components on Intel Xeon E5-2680 v4 processors with realistic workload simulation; (3) network simulation—NS-3-based modeling of communication overhead and coordination traffic; and (4) Monte Carlo validation—statistical analysis across 10,000 operational scenarios with varying target densities and environmental conditions.
Our eight-base-station IoT-ISAC system achieves 32.1 × 10⁹ FLOPS compared to 45.2 × 10⁹ FLOPS for traditional centralized radar processing, demonstrating a 29% computational efficiency improvement. The distributed architecture prevents computational bottlenecks while maintaining superior detection performance through cooperative sensing. Memory requirements remain moderate at 5.8 GB average utilization, significantly lower than centralized approaches owing to distributed processing load sharing. The communication overhead of 18.4 Mbps represents efficient coordination compared with the 45.8 Mbps required by non-cooperative distributed systems that rely on extensive post-processing data exchange.
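As a sanity check on the 29% figure, using the mean values from Table 8: (45.2 − 32.1)/45.2 ≈ 0.290, i.e., a 29% reduction in processing complexity relative to centralized radar processing.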
The indirect estimation approach is validated by comparing it with partial hardware implementations available in research laboratories and theoretical performance bounds derived from the signal-processing literature. Estimation uncertainties are quantified through sensitivity analysis, with confidence intervals reflecting variations in hardware platforms, environmental conditions, and operational scenarios. The methodology provides reliable performance bounds suitable for system design and deployment planning, with validation against published ISAC system measurements showing less than 15% deviation from our estimates.

6. Discussion

6.1. Real-Time Communication and Vertiport Capacity Assurance

The proposed IoT-ISAC system ensures real-time situational awareness through end-to-end latency control. Experimental results confirm a sub-20 ms delay across acquisition, processing, and decision stages (see Figure 9), which is essential for time-sensitive eVTOL coordination. This latency benchmark aligns with airspace safety thresholds for dynamic UAV routing and handover operations [44].
Beyond latency, the system demonstrates strong capacity scalability. It handles up to 15 concurrent targets within a 75 km² urban zone while maintaining classification accuracy above 90%. This supports high-density vertiport operations and conflict-free airspace allocation. Such capacity aligns with projected traffic density in next-generation urban air mobility scenarios.

6.2. High-Precision Sensing and Communication Under Resource-Constrained Conditions

Resource-constrained environments pose significant challenges for maintaining high-precision sensing and communication in practical IoT-ISAC deployments. Traditional approaches often face trade-offs between detection accuracy and computational efficiency, particularly when dealing with limited processing power, restricted energy budgets, and constrained communication bandwidth. Our proposed IoT-ISAC framework addresses these challenges through distributed processing across multiple base stations, adaptive fusion algorithms that balance precision with computational overhead, and hierarchical edge computing architecture that optimizes resource allocation across different network layers.
Our framework demonstrates remarkable resource efficiency through several key mechanisms: (1) 25–30% per-base station energy reduction via cooperative processing (Figure 6), achieved through intelligent load balancing and distributed computation that prevents individual base stations from operating at peak capacity; (2) distributed architecture preventing resource bottlenecks while maintaining <20 ms latency for 15 concurrent targets (Figure 9), demonstrating that collaborative sensing actually reduces individual node computational burden; and (3) flexible fusion strategies allowing precision-resource trade-offs based on deployment constraints, where data-level fusion provides computational efficiency for resource-limited scenarios while signal-level fusion offers superior precision when computational resources are available.
The experimental results demonstrate how our framework achieves optimal precision-complexity trade-offs through adaptive fusion strategies. Data-level fusion exhibits O(N × M) complexity where N is the number of base stations and M is the target count, providing computational efficiency suitable for resource-constrained scenarios with processing times of 8–15 ms for up to 15 targets. Signal-level fusion operates at O(N × M × L) complexity with L representing signal length, requiring approximately 25–40% additional computational resources but delivering 5–8% accuracy improvement. Our distributed architecture enables dynamic switching between fusion modes based on real-time resource availability and precision requirements, ensuring sustained performance across varying operational conditions while maintaining the critical < 20 ms latency threshold essential for real-time applications.
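A minimal sketch of this mode-switching logic is given below; the per-operation cost constants (C_DATA, C_SIG) are illustrative assumptions calibrated loosely to the latency figures above, not measurements from the authors' system.

```python
# Hedged sketch of dynamic fusion-mode selection: prefer the more accurate
# signal-level fusion when its estimated latency fits the real-time budget,
# otherwise fall back to the cheaper data-level fusion. Cost constants are
# illustrative assumptions, not profiled values.

from dataclasses import dataclass

@dataclass
class FusionPlan:
    mode: str            # "data" or "signal"
    est_latency_ms: float

def choose_fusion_mode(n_bs: int, n_targets: int, sig_len: int,
                       latency_budget_ms: float = 20.0) -> FusionPlan:
    C_DATA = 0.1     # ms per (BS x target), assumed
    C_SIG = 5.0e-4   # ms per (BS x target x sample), assumed
    t_data = C_DATA * n_bs * n_targets           # O(N x M)
    t_sig = C_SIG * n_bs * n_targets * sig_len   # O(N x M x L)
    if t_sig <= latency_budget_ms:
        return FusionPlan("signal", t_sig)
    return FusionPlan("data", t_data)

if __name__ == "__main__":
    print(choose_fusion_mode(n_bs=8, n_targets=3, sig_len=1024))   # signal-level fits
    print(choose_fusion_mode(n_bs=8, n_targets=15, sig_len=1024))  # falls back to data-level
```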
Energy analysis limitations: The reported energy consumption analysis is based on component-level modeling using commercial base station specifications rather than measured ISAC hardware performance. Theoretical energy savings of 15–20% through cooperative operation must be validated against practical deployment scenarios that include synchronization overhead (~3 W), coordination communication (~4 W), and additional processing requirements (~3 W). Conservative estimates suggest realistic energy improvements of 10–15%, pending hardware validation and system optimization.
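A back-of-envelope version of this energy argument, under the component powers from Figure 6 and the overhead figures above (all illustrative, pending hardware measurement), lands in the quoted 10–15% range:

```python
# Rough per-base-station energy budget. Component powers use the upper ends
# of the Figure 6 ranges; the theoretical saving and ~10 W cooperation
# overhead come from the text. Illustrative only, pending hardware validation.

BASELINE_W = 114 + 60 + 44 + 12   # RF + baseband + auxiliary + networking
THEORETICAL_SAVING = 0.175        # midpoint of the 15-20% claim
OVERHEAD_W = 3 + 4 + 3            # sync + coordination comms + extra processing

ideal_w = BASELINE_W * (1 - THEORETICAL_SAVING)
realistic_w = ideal_w + OVERHEAD_W
net_saving = 1 - realistic_w / BASELINE_W

print(f"baseline:  {BASELINE_W} W per BS")
print(f"ideal:     {ideal_w:.1f} W (theoretical cooperative saving)")
print(f"realistic: {realistic_w:.1f} W -> net saving {net_saving:.1%}")  # ~13%
```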

6.3. Application Scenarios and Technology Integration

The proposed IoT-ISAC framework demonstrates versatility across multiple application domains. In smart city applications, the 75 km detection range and sub-20 ms latency allow for comprehensive urban airspace monitoring, particularly for drone delivery and air taxi services. In these scenarios, the system’s integration with existing urban infrastructure (such as traffic monitoring and city surveillance systems) can overcome current blind spots and coverage gaps, further enhancing monitoring precision and real-time responsiveness. For border security applications, the multi-base station cooperation eliminates blind spots critical for perimeter surveillance. Emergency response scenarios benefit from the rapid deployment capability and existing cellular infrastructure utilization, enabling immediate area monitoring during disaster recovery operations.
Integration with 5G/6G networks enhances the framework’s capabilities through increased bandwidth and reduced latency, while AI-enhanced processing can improve target classification accuracy. However, deployment faces limitations, including regulatory compliance for airspace monitoring, weather-dependent performance in extreme conditions, and computational scaling challenges for networks exceeding 50 base stations. The cellular infrastructure dependency also presents both advantages (widespread availability) and constraints (coverage gaps in remote areas).
Future development directions include enhanced edge AI integration for autonomous decision-making, blockchain-secured data sharing for multi-operator cooperation, and adaptive algorithms for dynamic network reconfiguration. The modular four-layer architecture provides a foundation for these enhancements while maintaining backward compatibility with current cellular infrastructure.

7. Conclusions

This study presents an IoT-integrated ISAC framework that converts cellular base stations into a cooperative radar–communication mesh for UAV surveillance. By treating every station as an edge-intelligent node, the system shares echoes in real time, aligns beams adaptively, and fuses data at the symbol level, which together extend coverage, suppress blind zones, and keep decision latency within a single signal frame. Evaluations in cluttered urban and adverse-weather conditions confirm that the architecture preserves high detection reliability and stable tracking when multiple low-altitude UAVs maneuver along diverse trajectories. In addition, coordinated resource scheduling noticeably lowers energy demand compared with independent operation, supporting long-term, wide-area monitoring. The framework therefore offers an effective path toward smart-city airspace management, critical-infrastructure protection, and other remote-sensing tasks that require agile, collaborative situational awareness, and it integrates with existing cellular infrastructure through software-defined upgrades. The demonstrated flexibility also suggests compatibility with heterogeneous sensor suites and cross-domain unmanned systems. In the future, we will explore self-optimizing orchestration to enhance adaptability, develop lightweight fusion modules to reduce computational overhead, and test larger deployments aligned with emerging 6G specifications, in which ground infrastructure and airborne platforms seamlessly share sensing and communication resources.

Author Contributions

Conceptualization, Z.C.; methodology, Z.C. and T.Z.; software, T.Z.; investigation, Z.C.; data curation, T.H.; writing—original draft, T.Z.; writing—review and editing, T.H.; supervision, T.H. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

The original contributions presented in the study are included in the article; further inquiries can be directed to the corresponding author.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Malik, U.M.; Javed, M.A.; Zeadally, S.; Islam, S.U. Energy-Efficient Fog Computing for 6G-Enabled Massive IoT: Recent Trends and Future Opportunities. IEEE Internet Things J. 2022, 9, 14572–14594. [Google Scholar] [CrossRef]
  2. Zhou, M.; Li, Y.J.; Tang, Y.C.; Hao, X.Y.; Xu, W.J.; Xiang, D.X.; Wu, J.Y. Apoptotic bodies for advanced drug delivery and therapy. J. Control. Release 2022, 351, 394–406. [Google Scholar] [CrossRef] [PubMed]
  3. Ferrag, M.A.; Friha, O.; Kantarci, B.; Tihanyi, N.; Cordeiro, L.; Debbah, M.; Hamouda, D.; Al-Hawawreh, M.; Choo, K.-K.R. Edge Learning for 6G-Enabled Internet of Things: A Comprehensive Survey of Vulnerabilities, Datasets, and Defenses. IEEE Commun. Surv. Tutor. 2023, 25, 2654–2713. [Google Scholar] [CrossRef]
  4. Shu, Y.; Sui, Y.; Zhao, S.; Cheng, Z.; Liu, W. Small Moving Object Detection and Tracking Based on Event Signals. In Proceedings of the 2021 7th International Conference on Computer and Communications (ICCC), Chengdu, China, 10–13 December 2021; pp. 792–796. [Google Scholar] [CrossRef]
  5. Cheng, Y.; Shao, X.; Li, J.; Liu, J.; Zhang, Q. Concurrent-Learning-Based Adaptive Critic Formation for Multirobots Under Safety Constraints. IEEE Internet Things J. 2025, 12, 7610–7621. [Google Scholar] [CrossRef]
  6. Debnath, D.; Vanegas, F.; Sandino, J.; Gonzalez, F. DECK-GA: A Hybrid Clustering and Distance Efficient Genetic Algorithm for Scalable Multi-UAV Path Planning. In Proceedings of the 2025 International Conference on Unmanned Aircraft Systems (ICUAS), Charlotte, NC, USA, 14–17 May 2025; pp. 301–308. [Google Scholar] [CrossRef]
  7. Zhang, Q.; Dong, J. Disturbance-observer-based adaptive fuzzy control for nonlinear state constrained systems with input saturation and input delay. Fuzzy Sets Syst. 2020, 392, 77–92. [Google Scholar] [CrossRef]
  8. Wu, Y.; Lemic, F.; Han, C.; Chen, Z. Sensing Integrated DFT-Spread OFDM Waveform and Deep Learning-Powered Receiver Design for Terahertz Integrated Sensing and Communication Systems. IEEE Trans. Commun. 2023, 71, 595–610. [Google Scholar] [CrossRef]
  9. Xi, J.; Suo, Z.; Ti, J. The First Experimental Validation of a Communication Base Station as a Ground-Based SAR for Deformation Monitoring. Remote Sens. 2025, 17, 1129. [Google Scholar] [CrossRef]
  10. Jia, C.; Zhao, Z.; Sun, L.; Ding, Z.; Quek, T.Q.S. Opportunistic Cooperation of Integrated Sensing and Communications Wireless Networks. IEEE Wirel. Commun. Lett. 2024, 13, 1775–1779. [Google Scholar] [CrossRef]
  11. Chu, N.H.; Nguyen, D.N.; Hoang, D.T.; Pham, Q.-V.; Phan, K.T.; Hwang, W.-J.; Dutkiewicz, E. AI-Enabled mm-Waveform Configuration for Autonomous Vehicles With Integrated Communication and Sensing. IEEE Internet Things J. 2023, 10, 16727–16743. [Google Scholar] [CrossRef]
  12. Ma, S.; Sheng, H.; Yang, R.; Li, H.; Wu, Y.; Shen, C.; Al-Dhahir, N.; Li, S. Covert Beamforming Design for Integrated Radar Sensing and Communication Systems. IEEE Trans. Wirel. Commun. 2023, 22, 718–731. [Google Scholar] [CrossRef]
  13. Zhao, F.; Ding, C.; Li, X.; Xia, R.; Wu, C.; Lyu, X. A Spatial–Frequency Combined Transformer for Cloud Removal of Optical Remote Sensing Images. Remote Sens. 2025, 17, 1499. [Google Scholar] [CrossRef]
  14. Lu, S.; Liu, F.; Li, Y.; Zhang, K.; Huang, H.; Zou, J.; Li, X.; Dong, Y.; Dong, F.; Zhu, J.; et al. Integrated Sensing and Communications: Recent Advances and Ten Open Challenges. IEEE Internet Things J. 2024, 11, 19094–19120. [Google Scholar] [CrossRef]
  15. Hwang, J.; Nkenyereye, L.; Sung, N.; Kim, J.; Song, J. IoT Service Slicing and Task Offloading for Edge Computing. IEEE Internet Things J. 2021, 8, 11526–11547. [Google Scholar] [CrossRef]
  16. Joshi, A.J.; Papanikolopoulos, N.P. Learning to Detect Moving Shadows in Dynamic Environments. IEEE Trans. Pattern Anal. Mach. Intell. 2008, 30, 2055–2063. [Google Scholar] [CrossRef]
  17. Cai, K.; Chen, H.; Ai, W.; Miao, X.; Lin, Q.; Feng, Q. Feedback Convolutional Network for Intelligent Data Fusion Based on Near-Infrared Collaborative IoT Technology. IEEE Trans. Ind. Inform. 2022, 18, 1200–1209. [Google Scholar] [CrossRef]
  18. Naderializadeh, N.; Hassanien, A.; Martone, A.; Amin, M. Resource allocation in ISAC systems: A machine learning approach. IEEE Trans. Commun. 2021, 69, 6510–6524. [Google Scholar]
  19. Liu, F.; Masouros, C.; Petropulu, A.P.; Griffiths, H.; Hanzo, L. Integrated sensing and communications: Toward dual-functional wireless networks for 6G and beyond. IEEE J. Sel. Areas Commun. 2022, 40, 1728–1767. [Google Scholar] [CrossRef]
  20. Russell, S.; Norvig, P. Artificial Intelligence: A Modern Approach, 4th ed.; Pearson: Boston, MA, USA, 2020. [Google Scholar]
  21. Wooldridge, M. An Introduction to MultiAgent Systems, 2nd ed.; John Wiley & Sons: Hoboken, NJ, USA, 2009. [Google Scholar]
  22. Mitola, J.; Maguire, G.Q. Cognitive radio: Making software radios more personal. IEEE Pers. Commun. 1999, 6, 13–18. [Google Scholar] [CrossRef]
  23. Haykin, S. Cognitive radio: Brain-empowered wireless communications. IEEE J. Sel. Areas Commun. 2005, 23, 201–220. [Google Scholar] [CrossRef]
  24. Wei, Z.; Yuan, W.; Li, S.; Yuan, J.; Bharatula, G.; Hadani, R.; Hanzo, L. Integrated sensing and communication for UAV communications with jittering. IEEE Trans. Wirel. Commun. 2022, 21, 141–155. [Google Scholar]
  25. Zhang, J.A.; Rahman, M.L.; Wu, K.; Huang, X.; Guo, Y.J.; Chen, S.; Yuan, J. An overview of signal processing techniques for joint communication and radar sensing. IEEE J. Sel. Top. Signal Process. 2021, 15, 1295–1315. [Google Scholar] [CrossRef]
  26. Wu, Y.; Sheng, H.; Zhang, Y.; Wang, S.; Xiong, Z.; Ke, W. Hybrid Motion Model for Multiple Object Tracking in Mobile Devices. IEEE Internet Things J. 2023, 10, 4735–4748. [Google Scholar] [CrossRef]
  27. Chen, D.; Zhou, L.; Guo, C. A Low-Latency Dynamic Object Detection Algorithm Fusing Depth and Events. Drones 2025, 9, 211. [Google Scholar] [CrossRef]
  28. Sikora, T.; Papić, V. Survey of Path Planning for Aerial Drone Inspection of Multiple Moving Objects. Drones 2024, 8, 705. [Google Scholar] [CrossRef]
  29. Nkenyereye, L.; Hwang, J.; Pham, Q.-V.; Song, J. Virtual IoT Service Slice Functions for Multiaccess Edge Computing Platform. IEEE Internet Things J. 2021, 8, 11233–11248. [Google Scholar] [CrossRef]
  30. Rivera, D.E.C.; Diederiks, F.F.; Hammerman, N.M.; Staples, T.; Kovacs, E.; Markey, K.; Roelfsema, C.M. Remote Sensing Reveals Multidecadal Trends in Coral Cover at Heron Reef, Australia. Remote Sens. 2025, 17, 1286. [Google Scholar] [CrossRef]
  31. Iqbal, W.; Abbas, H.; Daneshmand, M.; Rauf, B.; Bangash, Y.A. An In-Depth Analysis of IoT Security Requirements, Challenges, and Their Countermeasures via Software-Defined Security. IEEE Internet Things J. 2020, 7, 10250–10276. [Google Scholar] [CrossRef]
  32. Frustaci, M.; Pace, P.; Aloi, G.; Fortino, G. Evaluating Critical Security Issues of the IoT World: Present and Future Challenges. IEEE Internet Things J. 2018, 5, 2483–2495. [Google Scholar] [CrossRef]
  33. Gao, P.; Wu, T.; Song, C. Cloud–Edge Collaborative Strategy for Insulator Recognition and Defect Detection Model Using Drone-Captured Images. Drones 2024, 8, 779. [Google Scholar] [CrossRef]
  34. Deng, X.; Yu, W.; Zhou, W.; Shi, J.; Long, Y.; Tan, J.; Huang, Y.; Zhao, R.; Yang, W.; Han, X. An AI Framework to Obtain High-Accurate and Fine-Resolution LST From Passive Microwave Remote Sensing. IEEE Trans. Geosci. Remote Sens. 2024, 62, 5006615. [Google Scholar] [CrossRef]
  35. Ijaz, H.; Ahmad, R.; Ahmed, R.; Ahmed, W.; Kai, Y.; Jun, W. A UAV-Assisted Edge Framework for Real-Time Disaster Management. IEEE Trans. Geosci. Remote Sens. 2023, 61, 1001013. [Google Scholar] [CrossRef]
  36. Wang, J.; Liang, Y.-C.; Pei, Y.; Shen, X. Reconfigurable Intelligent Surface as a Micro Base Station: A Novel Paradigm for Small Cell Networks. IEEE Trans. Wirel. Commun. 2023, 22, 2338–2351. [Google Scholar] [CrossRef]
  37. Jacob, A.V.; Issac, E.; Krishna, A. Enhancing IoT Security: A Novel Attack Detection Model for IoT Logs with Machine Learning Techniques. In Proceedings of the 2024 10th International Conference on Smart Computing and Communication (ICSCC), Bali, Indonesia, 25–27 July 2024; pp. 82–87. [Google Scholar] [CrossRef]
  38. Yan, T.-Y.; Ding, X.-H.; Yang, J.-Y.; Chen, J.-X. A Low-Cost Compact Dual-Polarized Patch Antenna Array for 5G Massive MIMO Base Station. IEEE Antennas Wirel. Propag. Lett. 2024, 23, 1381–1385. [Google Scholar] [CrossRef]
  39. Abreu, G.; Kohno, R. Beamwidth-adjustable low sidelobe beamforming for space-time diversity. In Proceedings of the IEEE Seventh International Symposium on Spread Spectrum Techniques and Applications, Prague, Czech Republic, 2–5 September 2002; Volume 2, pp. 526–530. [Google Scholar] [CrossRef]
  40. AbdelRaheem, M.; Hassan, M.; Selim, H. A Lightweight Sampling Time Error Correction Technique for Micro Phasor Measurement Units. IEEE Trans. Instrum. Meas. 2022, 71, 9004408. [Google Scholar] [CrossRef]
  41. Zhang, W.; Miao, C.; Ma, Y.; Wu, W. A Signal-Level Fusion Distributed Radar Localization Method Based on Wideband Synthesis Technology. IEEE Sens. J. 2023, 23, 31017–31026. [Google Scholar] [CrossRef]
  42. Zhang, H.; Liu, E.; Zhang, B.; Miao, Q. RUL Prediction and Uncertainty Management for Multisensor System Using an Integrated Data-Level Fusion and UPF Approach. IEEE Trans. Ind. Inform. 2021, 17, 4692–4701. [Google Scholar] [CrossRef]
  43. Zhang, W.; Wang, J.; Han, G.; Huang, S.; Feng, Y.; Shu, L. A Data Set Accuracy Weighted Random Forest Algorithm for IoT Fault Detection Based on Edge Computing and Blockchain. IEEE Internet Things J. 2021, 8, 2354–2363. [Google Scholar] [CrossRef]
  44. Xu, H.; Niu, Z.; Jiang, B.; Zhang, Y.; Chen, S.; Li, Z.; Gao, M.; Zhu, M. ERRT-GA: Expert Genetic Algorithm with Rapidly Exploring Random Tree Initialization for Multi-UAV Path Planning. Drones 2024, 8, 367. [Google Scholar] [CrossRef]
Figure 1. The proposed IoT-ISAC framework.
Figure 2. Multiple ISAC BSs cooperative sensing.
Figure 3. Data-level fusion.
Figure 4. Signal-level fusion.
Figure 5. Detection accuracy analysis across different aerial vehicle operational scenarios. Performance evaluation demonstrates the system’s capability to handle diverse airspace management contexts, from drone delivery operations to high-speed aerial vehicles, supporting comprehensive traffic coordination and vertiport management.
Figure 6. Energy efficiency evaluation. Component-level analysis shows 15–20% energy savings through cooperative operation: RF transmission (44–114 W), baseband processing (33–60 W), auxiliary systems (25–44 W), and networking overhead (8–12 W). Energy model validated against Nokia AirScale and Ericsson AIR base station specifications (200 W peak consumption).
Figure 7. Multi-dimensional performance analysis. Three-dimensional surface plots showing (a) SNR performance versus base station configuration and range, demonstrating > 10 dB SNR maintenance up to 40 km with multi-BS cooperation; (b) Doppler processing capability across angular dimensions (±60°), showing uniform resolution distribution; and (c) spatial distribution of detection probability, illustrating > 0.8 reliability throughout the operational area. The aircraft outline indicates reference geometry at reduced scale for clarity. Color bars indicate performance metrics with warm colors representing higher values.
Figure 8. Detection performance in complex environments.
Figure 9. Processing efficiency analysis. Latency measurements based on Intel Xeon E5-2680 v4 processors (2.4 GHz, 14 cores) with 128 GB RAM. Processing demonstrates O(N × M) scalability, from 12 ms (1 target, 3 base stations) to 24 ms (15 targets, 8 base stations). Memory requirements scale from 0.5 to 8 GB across fusion strategies.
Table 1. Performance benchmark table.
Metric | Target Value | Baseline (Single BS) | Achieved (Multi-BS)
Detection accuracy | ≥90% | ~75% | 90–95%
False alarm rate | ≤5% | ~15% | 3–8%
Processing latency | <20 ms | 25–40 ms | 10–18 ms
Energy efficiency | +25% improvement | Baseline | 25–30% improvement
Table 2. Technical comparison of different sensing approaches.
Aspect | Traditional Single-Sensor | Dedicated Radar Systems | Centralized ISAC | Multi-Point Fusion | Our IoT-ISAC Framework
Architecture | Single-point detection | Centralized processing | Single-BS ISAC | Static sensor fusion | Multi-BS cooperative
Processing mode | Local only | Centralized | On-site | Post-processing | Real-time distributed
Latency | 50–100 ms | 100–200 ms | 30–50 ms | 80–150 ms | <20 ms
Detection accuracy | 60–70% | 70–80% | 75–85% | 80–85% | >90%
Coverage | Limited | Medium | Single cell | Regional | Wide area
Energy efficiency | Standard | High consumption | Medium | High | 25–30% reduction
Scalability | Poor | Limited | Medium | Good | Excellent
Integration | Standalone | Dedicated | Partial | Complex | Cellular overlay
Blind spot handling | Significant | Moderate | Present | Reduced | Eliminated
Target capacity | 1–3 | 5–8 | 3–5 | 8–12 | 15+
Infrastructure cost | Low | Very high | Medium | High | Leverages existing
Table 3. Evolution of wireless communication technology toward ISAC.
Generation | Timeline | Key Technologies | Sensing Capabilities | ISAC Relevance
2G (GSM) | 1990–2000 | Digital modulation, TDMA | Minimal (basic location) | Foundation for digital signal processing
3G (UMTS) | 2000–2010 | CDMA, packet data | Enhanced location services | Spread spectrum techniques
4G (LTE) | 2010–2020 | OFDM, MIMO, beamforming | Positioning improvements | Spatial processing foundation
5G | 2020–2030 | Massive MIMO, mmWave, network slicing | Device-free sensing emergence | Direct ISAC implementation
5G Advanced | 2025–2030 | Enhanced ISAC, AI/ML integration | Dedicated sensing modes | Joint radar communication
6G (Emerging) | 2030+ | Distributed ISAC, IoT integration | Ubiquitous sensing mesh | Full sensing–communication convergence
Table 4. ISAC waveform performance comparison.
Parameter | OFDM | FMCW | LFM
Dual-function capability | Excellent | Limited | Poor
Range resolution | Good (Δf dependent) | Excellent | Excellent
Doppler tolerance | Poor (>50 m/s) | Good | Excellent
PAPR | High (8–12 dB) | Low (3 dB) | Low (3 dB)
Infrastructure compatibility | Excellent | Poor | Poor
Spectral efficiency | High | Medium | Low
Table 5. Experiment parameters.
Parameter | Value/Setting | Description
Carrier frequency | 900 MHz, 3.5 GHz, 28 GHz | Frequencies used for ISAC base station experiments
Pulse repetition interval (PRI) | 500 µs | Time between radar pulses
Symbol duration | 66.7 µs | Duration of each OFDM symbol
Guard interval | 4.69 µs | Cyclic prefix to mitigate interference
Transmission power | 40 W | Output power per base station
Sensing mode | Cooperative sensing | Collaborative active and passive sensing
Signal fusion method | Symbol-level fusion + MUSIC | Enhanced detection accuracy
Detection range | Up to 75 km | Maximum range based on SNR requirement
Target type | UAVs and eVTOL vehicles (low-altitude and high-dynamic) | Objects of sensing and tracking
Hardware platform | Intel i7-9700K, 32 GB RAM | Processing specifications per BS
Network topology | Star-mesh hybrid | BS interconnection architecture
Environmental setup | Urban/rural/adverse weather | Test environment conditions
Antenna configuration | 64-element MIMO array | Per base station setup
Processing framework | Python 3.8 + MATLAB R2021b | Software implementation platform
Synchronization | GPS + NTP protocol | Time alignment mechanism
Data collection rate | 1000 samples/second | Sensing data acquisition rate
Communication load | 50–80% channel utilization | Concurrent traffic simulation
Integration type | Cellular overlay | Compatible with existing infrastructure
Processing mode | Distributed cooperative | vs. traditional centralized processing
UAV RCS values | 0.003–0.05 m² | Mini to medium commercial UAVs
Weather models | ITU-R P.838, Liebe, Okumura-Hata | Rain, fog, and urban clutter simulation
UAV classification | FAA Part 107 + ISAC extensions | Dynamic thresholds: jerk > 2 m/s³, ω > 15°/s
Ground truth method | IEEE Std 686-2017 simulation | Validated against 847 real UAV flights
Table 6. System performance degradation analysis.
Scenario | Target Count | SNR (dB) | Detection Accuracy (%) | False Positive Rate (%) | Processing Latency (ms)
Optimal | 1 | 20 | 94.2 | 0.8 | 8
Low-density | 3 | 15 | 92.1 | 1.2 | 12
Medium-density | 8 | 12 | 89.4 | 2.8 | 16
High-density | 15 | 8 | 86.3 | 4.5 | 20
Urban clutter | 8 | 12 | 84.1 | 3.5 | 18
Challenging urban | 15 | 8 | 79.8 | 6.8 | 22
Table 7. Processing time scalability and data structure specifications.
Parameter | Data-Level Fusion | Signal-Level Fusion | Hybrid Fusion | Technical Details
Processing time scaling:
1 target | 12.8 ms | 8 ms | 8 ms | Baseline processing
5 targets | 16 ms | 12 ms | 10 ms | Low-density scenario
10 targets | 20 ms | 15 ms | 14 ms | Medium-density scenario
15 targets | 24 ms | 18 ms | 16 ms | High-density scenario
20 targets | 28 ms | 22 ms | 20 ms | Maximum capacity
Complexity | O(N) | O(N log N) | Adaptive | Computational scaling
Data packet structure:
Header size | 64 bytes | 64 bytes | 64 bytes | Timestamp, BS ID, sync
Payload size | 1.12 MB | 12.3 MB | Variable | Detection data / I/Q samples
Compression ratio | 80% | 50% | 60% | Algorithm optimization
Total packet size | 0.9 MB | 6.2 MB | 2–5 MB | After compression
Bandwidth requirements:
Sampling rate | N/A | 30.72 MHz | Adaptive | I/Q data capture
Quantization | 8-bit | 16-bit | 12-bit | Dynamic range
Observation window | 500 ms | 100 ms | 200 ms | Processing interval
Data rate per BS | 1.8 Mbps | 640 Mbps | 10–400 Mbps | Network requirement
Range resolution | 7.5 m | 7.5 m | 7.5 m | 20 MHz bandwidth
Doppler resolution | 3 m/s | 1.5 m/s | 2 m/s | Window-dependent
Table 8. Computational performance comparison of different system architectures.
System Architecture | Processing Complexity (×10⁹ FLOPS) | Memory Requirements (GB) | Communication Overhead (Mbps)
Centralized radar processing | 45.2 ± 3.8 | 8.5 ± 1.2 | 25.3 ± 4.1
Distributed non-cooperative sensing | 38.7 ± 4.2 | 6.2 ± 0.9 | 45.8 ± 6.7
Single ISAC base station | 28.4 ± 2.1 | 4.1 ± 0.6 | 12.6 ± 1.8
Our IoT-ISAC (8 BS) | 32.1 ± 2.9 | 5.8 ± 0.8 | 18.4 ± 2.3