Review

Airport Ground-Based Aerial Object Surveillance Technologies for Enhanced Safety: A Systematic Review

School of Graduate Studies, Embry-Riddle Aeronautical University, Daytona Beach, FL 32114, USA
* Author to whom correspondence should be addressed.
Drones 2026, 10(1), 22; https://doi.org/10.3390/drones10010022
Submission received: 5 December 2025 / Revised: 27 December 2025 / Accepted: 29 December 2025 / Published: 31 December 2025

Highlights

What are the main findings?
  • Layered multi-sensor fusion architectures provide the most reliable detection of Low Slow Small targets in airport environments, since single-sensor systems show persistent gaps in coverage, classification, and situational awareness.
  • Differentiating drones from birds remains difficult because their radar signatures and motion patterns are similar, leading to frequent false alarms and reduced confidence in surveillance systems, especially when studies lack strong validation procedures.
What are the implications of the main findings?
  • Airport detection systems should integrate with Air Traffic Management and UAS Traffic Management through standardized data exchange approaches that improve situational awareness and support timely operational decisions during emerging aerial hazards.
  • Airports should adopt a risk-based deployment strategy, using simple Remote ID reception for low-risk rural environments and more comprehensive multi-sensor fusion systems for high-density hubs, major airports, and advanced air mobility vertiports.

Abstract

Airport airspace safety is increasingly threatened by small unmanned aircraft systems (sUAS) and wildlife that traditional radar cannot detect. While earlier reviews addressed general counter-UAS techniques, individual sensors, or the detection of single objects such as birds or drones, none has systematically reviewed airport-based, multi-sensor surveillance strategies through a safety-theoretical lens. A systematic review, performed according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) 2020 statement, synthesized recent research on fixed, ground-based aerial detection capabilities for small aerial hazards, specifically sUAS and avian targets, within operational airport environments. Searches targeted English-language, peer-reviewed articles from 2016 through 2025 in Web of Science and Scopus. Due to methodological heterogeneity across sensor technologies, a narrative synthesis was conducted. The review of thirty-six studies, analyzed through Reason’s Swiss Cheese Model and Endsley’s Situational Awareness framework, found that only layered multi-sensor fusion architectures effectively address detection gaps for Low-Slow-Small (LSS) threats. Based on these findings, the review proposes seamless integration with Air Traffic Management (ATM) and UAS Traffic Management (UTM) systems through standardized data-exchange interfaces, complemented by theoretically grounded risk-based deployment strategies aligning surveillance technology tiers with operational risk profiles, from basic Remote ID receivers in low-risk rural environments to comprehensive multi-sensor fusion at high-density hubs, major airports, and urban vertiports.

1. Introduction

Airport airspace faces a dual threat: small unmanned aircraft systems (UAS) and wildlife that evade legacy surveillance. Wildlife strikes have long been a safety risk, with 21,426 strikes reported in the U.S. in 2024 and 7212 in 2025 [1,2]. In 2021, an estimated $328 million in damage and 140,000 h of aircraft downtime were reported [3]. The rise of sUAS has added a new intruder to controlled airspace. The Federal Aviation Administration (FAA) receives over one hundred UAS sighting reports per month near airports [4], and Germany has logged over five hundred such incidents since 2015, with numbers climbing sharply [5]. High-profile disruptions, such as the Gatwick Airport closure in December 2018, which shut down a major airport for several hours, affecting 140,000 passengers and 1000 flights [6], illustrate the severe safety and economic risks of unauthorized aerial objects. Improvements in airport airspace monitoring are therefore urgent.
The current airport surveillance infrastructure has critical detection gaps. Conventional air traffic control (ATC) radars and transponder-based systems, such as Automatic Dependent Surveillance-Broadcast (ADS-B), primarily track cooperative, crewed aircraft but often fail to detect drones or wildlife that lack a transponder [6,7]. These legacy radars are optimized for large, fast-moving targets, rather than low, slow, small (LSS) threats such as hobbyist drones or birds [8,9]. Many airports rely on reactive detection methods, such as pilot sightings, which leave dangerously short response times when an untracked object suddenly appears on a flight path.

1.1. Surveillance Technologies

Each sensor modality has inherent tradeoffs [9]. Radar provides long-range detection but can confuse drones with birds. Electro-optical/infrared (EO/IR) offers visual classification but fails in poor visibility. Acoustic sensors have a limited range in noisy environments, and RF scanners miss non-transmitting UAS. Consequently, false alarms, especially from birds, are persistent [5,9]. Multi-sensor fusion offsets these limitations by combining multiple sensor types, such as radar, cameras, and RF receivers, to improve reliability and situational awareness [5,10]. Layered sensing is the leading approach for complex environments [5], with fusion algorithms correlating detections to reduce false alerts and improve rogue UAS detection [10].
The multi-sensor architecture follows a four-stage workflow (detection, classification, identification, and tracking) that turns blips into monitored targets. Figure 1 illustrates the multi-sensor surveillance workflow. Detection (Stage 1) uses radar, EO/IR video, acoustic arrays, and RF receivers. Classification (Stage 2) then leverages ML/DL models; micro-Doppler radar, for instance, can distinguish rotary-wing UAS from birds [11]. Identification (Stage 3) reads transponder/ADS-B codes or Remote ID broadcasts to support ATC threat assessment. Tracking (Stage 4) employs motion models, such as Kalman filtering, to associate detections, bridge occlusions, and handle sensor dropouts [12]. These stages are often linked via cross-cueing (e.g., a radar hit slewing a camera) to reduce false alarms.
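As an illustration of the Stage 4 tracking step, the sketch below runs a minimal constant-velocity Kalman filter over a sequence of one-dimensional range measurements and bridges a sensor dropout (marked `None`) by prediction alone. All parameters (noise variances, update rate) and the function itself are illustrative assumptions, not values from any cited system.

```python
import numpy as np

def kalman_track(measurements, dt=1.0, meas_var=25.0, accel_var=1.0):
    """Constant-velocity Kalman filter over range measurements (m).
    None marks a sensor dropout, bridged by prediction only."""
    F = np.array([[1.0, dt], [0.0, 1.0]])            # state transition
    H = np.array([[1.0, 0.0]])                       # measure position only
    Q = accel_var * np.array([[dt**4 / 4, dt**3 / 2],
                              [dt**3 / 2, dt**2]])   # process noise
    R = np.array([[meas_var]])                       # measurement noise
    x = np.array([[measurements[0]], [0.0]])         # initial state [pos, vel]
    P = np.eye(2) * 100.0                            # initial uncertainty
    estimates = []
    for z in measurements:
        # Predict step: propagate state and covariance forward.
        x = F @ x
        P = F @ P @ F.T + Q
        if z is not None:                            # update only on a detection
            y = np.array([[z]]) - H @ x              # innovation
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
            x = x + K @ y
            P = (np.eye(2) - K @ H) @ P
        estimates.append(float(x[0, 0]))
    return estimates
```

Bridging the gap in this way is what lets a track survive occlusions or momentary sensor dropouts without being re-initialized as a new target.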

1.2. Research Gap and Study Contribution

Despite advances in sUAS detection and wildlife surveillance, prior research remains fragmented. The unique strength of this systematic review is its focus on the ‘sensing-to-mitigation’ workflow, specifically through the application of Reason’s Swiss Cheese Model and Endsley’s Situational Awareness framework. Unlike broader surveys that provide a general overview of all counter-UAS aspects, this study provides a deep, theoretically grounded synthesis of fixed, ground-based barriers within operational airport environments. By analyzing methodological heterogeneity across sensor technologies, this paper transforms disparate evidence into a cohesive risk-based deployment strategy that addresses critical ‘leakage paths’ in current airport safety. It also provides guidance for ATM/UTM integration and stakeholder-specific implementation roadmaps that explicitly address operational constraints in distinguishing LSS targets. Table 1 highlights these gaps.

1.3. Research Purpose and Questions

The purpose of the review is to synthesize recent research on fixed, ground-based aerial detection capabilities for small aerial hazards at airports and identify how they can be strengthened. Focusing on radar, EO/IR, and RF detectors (with acoustic where applicable), the review surveys capabilities and limits of airport-airspace sensing systems for both cooperative and non-cooperative objects. In line with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) 2020 [24], the present review systematically maps current and emerging airport aerial hazard detection technologies. The research questions are:
  • RQ1: What is the current state of ground-based aerial object detection technologies at airports?
  • RQ2: What are the technical interoperability barriers and regulatory gaps limiting deployment?

2. Methodology

The methodology followed the structured guidance of PRISMA 2020, a reporting standard that promotes complete and transparent systematic literature reviews [24]. The process comprised four phases (Identification, Screening, Appraisal, and Synthesis) to ensure methodological rigor, reproducibility, and traceability. Figure 2 shows the workflow.

2.1. Eligibility Criteria

Inclusion and exclusion criteria were established to ensure the review captured relevant, high-quality evidence and retained only studies directly addressing fixed-site detection of UAS or birds within operational airport environments. Studies were eligible if they examined airport-based detection of aerial objects, primarily sUAS and avian targets, using ground or fixed-site sensing (e.g., radar, RF, EO/IR, acoustic, LiDAR). Inclusion was limited to English-language, peer-reviewed journal articles and conference proceedings published from 1 January 2016 through September 2025. Conceptual, experimental, and field studies of detection systems were considered. Exclusions covered onboard sensing, non-airport contexts, and non-scholarly white papers lacking quantitative results. The review differentiated between simulation-based and field experiments, ensuring both provided empirical performance data. Risk of bias and study quality were systematically assessed using the JBI Critical Appraisal Tool, as detailed in Section 2.4. The search was then executed as follows.

2.2. Identification of Information Sources and Search Strategy

A search was conducted in the Web of Science Core Collection (WoS) and Scopus, covering the period from 2016 to September 2025. These databases were selected to guarantee the scientific rigor and impact of the evidence, as they include the most reputable international journals publishing work from major research hubs in Europe, the U.S., and China. To capture relevant gray literature and regulatory reports, a targeted supplementary scoping search was subsequently conducted in IEEE Xplore, Google Scholar, and regulator repositories (e.g., FAA, EASA, ICAO) using the same keyword sets; this process did not materially change the eligible study set under our predefined scope. Concept blocks combined (a) target terms (e.g., sUAS, drone*, bird*, avian), (b) sensing/modality terms (radar, RF, SDR, ADS-B/Remote ID, EO/IR, thermal, acoustic, LiDAR), and (c) context terms (airport*, aerodrome, airfield, runway, approach). Boolean operators and field tags were tailored per database; truncation and phrase searching were applied where appropriate. All retrieved records were exported to a reference manager and deduplicated before screening.
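The concept-block logic described above can be sketched as a small helper that ANDs together OR-groups of terms. The function and its simple quoting rule are illustrative assumptions; real queries must still be tailored to each database’s field tags and truncation syntax.

```python
def build_query(targets, modalities, contexts):
    """Assemble a Boolean search string from the three concept blocks
    (target terms, sensing/modality terms, context terms).
    Multi-word terms are phrase-quoted; truncation (*) passes through."""
    def block(terms):
        # OR the terms within one concept block.
        quoted = [f'"{t}"' if " " in t else t for t in terms]
        return "(" + " OR ".join(quoted) + ")"
    # AND the three blocks together.
    return " AND ".join(block(b) for b in (targets, modalities, contexts))
```

For example, `build_query(["sUAS", "drone*"], ["radar", "acoustic"], ["airport*"])` yields `(sUAS OR drone*) AND (radar OR acoustic) AND (airport*)`.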

2.3. Study Selection and Data Collection Process

Records were deduplicated and screened in two stages (titles/abstracts, then full texts) against the eligibility criteria. A single reviewer completed screening; uncertainties were resolved by rechecking the source text against predefined rules. No automation tools were used for study selection. A PRISMA 2020 flow diagram reports counts at each stage. A structured extraction form captured bibliographic details, sensing modality, fusion approach, target classes, environment (operational airport vs. simulation/test range), scenario design, datasets, and quantitative outcomes. The reviewer performed data extraction without automated tools. When performance data were incomplete, reported figures or tables were used to conservatively estimate values, or the item was omitted. Table 2 lists the extracted data items.

2.4. Study Risk of Bias Assessment

The Joanna Briggs Institute [25] analytical cross-sectional checklist (8 items) was applied to all 36 included studies. For technology and simulation research, “exposure” was interpreted as the sensor, algorithm, or configuration under evaluation, and “outcome” as the reported performance, classification, or tracking result. Each item was coded (Yes, No, Unclear, Not Applicable); a small number of cells were marked context-dependent and treated as not applicable when calculating item-level proportions. A single-reviewer assessment, with predefined rules, was used to maintain consistency across heterogeneous designs.

2.5. Synthesis Methods

Due to the heterogeneity of sensing technologies, experimental protocols, and performance metrics, a quantitative meta-analysis was not feasible. Instead, a narrative synthesis was used. Results were grouped by sensing modality, operational setting, and performance measures. Thematic analysis coded findings by technology capabilities (RQ1) and challenges/opportunities (RQ2). Methodological differences, such as those between real-world and simulated settings and target types, were examined to explain the inconsistencies. When quantitative results differed widely, only the observed ranges were reported, not pooled averages.

3. Results

This section presents the key findings from a systematic review of the recent literature on ground-based aerial object detection at airports. The results are organized to highlight the study selection process, characteristics of the included research, and the range of sensor technologies evaluated.

3.1. Study Selection

Database searches yielded 875 records (Web of Science; Scopus). After removing 255 duplicates and 54 non-article records, 566 titles/abstracts were screened (Figure 3). Of these, 213 were excluded for non-airport context. Full texts were retrieved for 353 reports; 317 were excluded for non-airport context, non-ground-based systems, or non-aerial detection. Thirty-six studies were included: 28 conducted at airports and 8 in simulated airport environments.

3.2. Characteristics of the Included Studies

Figure 4 shows the study demographics and publication years (2016–2025), with Table 3 summarizing the included studies. Section 3.4.1 compares sensor types and tradeoffs. Technologies studied include radar, RF/ADS-B/Remote ID, EO/IR, acoustic arrays, and multi-sensor fusion.

3.3. Risk of Bias in Studies

The JBI appraisal showed generally strong methodological quality: inclusion criteria (92%), study context (94%), outcome validity (80%), and statistical analysis (94%) were well addressed, while confounder identification (66%) and mitigation strategies (52%) were weakest, indicating overall moderate risk of bias. Results are shown in Table 4.
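For transparency, the item-level proportion calculation described in Section 2.4 (excluding context-dependent, not-applicable cells from the denominator) can be sketched as follows; the item names and codes in the example are hypothetical, not the actual appraisal data.

```python
def item_proportions(ratings):
    """Per-item percentage of 'Yes' codes across studies, with 'NA'
    (context-dependent) cells excluded from the denominator.
    `ratings` maps checklist item -> list of codes per study."""
    out = {}
    for item, codes in ratings.items():
        applicable = [c for c in codes if c != "NA"]  # drop not-applicable cells
        out[item] = (round(100 * applicable.count("Yes") / len(applicable))
                     if applicable else None)
    return out
```

A hypothetical item rated Yes, Yes, No, NA across four studies would score 67%, since only the three applicable cells count.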

3.4. Synthesis of Findings

The current synthesis addresses the review’s research questions by organizing the findings thematically. It focuses on hazards and challenges, the current state of detection technologies, integration with air traffic management, and performance evaluation metrics.

3.4.1. Current State of Detection Technologies

To critically evaluate the efficacy of current detection technologies, this review operationalizes Reason’s (1997) [62] Swiss Cheese Model of accident causation and Endsley’s (1995) [63] Situational Awareness (SA) framework. In the context of airport security, no single sensor modality functions as an impenetrable barrier against sUAS. Instead, each technology acts as a distinct filtering stage, while its inherent physical limitations constitute the “leakage paths” (Figure 5).
Operationalizing the Swiss Cheese Model for C-UAS
Safety science dictates that high-reliability systems require redundancy to prevent failure trajectories [62]. The current review operationalizes the model as a sequential filtering process in which different sensor technologies act as barriers against threats. For instance, while the radar filter may allow a “leakage path” for targets with low Radar Cross-Section (RCS) or amidst ground clutter, the subsequent RF monitoring layer can intercept the pilot’s signal, effectively terminating the evasion trajectory [26]. By integrating diverse modalities, the system shifts from reliance on a single point of failure to a robust defense-in-depth architecture. This multi-layered approach not only mitigates specific sensor blindness (e.g., acoustic masking by jet engines or visual obscuration by weather) but also enhances the operator’s SA through cross-verification, reducing false positives and ensuring a validated decision matrix [26].
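Under the simplifying assumption that the layers fail independently, the defense-in-depth logic above can be expressed numerically: the probability that a threat exploits every layer’s leakage path is the product of the per-layer miss probabilities. The function and the example miss rates below are illustrative assumptions, not values reported by the included studies, and real sensor errors can be correlated (e.g., heavy rain degrading radar and EO/IR together).

```python
def leakage_probability(miss_probs):
    """Probability a threat slips through every barrier, assuming
    independent layers (a strong simplification; correlated
    failure modes raise the true leakage probability)."""
    p = 1.0
    for m in miss_probs:
        p *= m
    return p

# Hypothetical per-layer miss rates: radar 20%, RF 40%, EO/IR 30%.
# Layered, the residual leakage is 0.2 * 0.4 * 0.3 = 2.4%.
residual = leakage_probability([0.2, 0.4, 0.3])
```

Even mediocre individual barriers compound into a far smaller residual risk, which is the quantitative core of the Swiss Cheese argument for multi-sensor layering.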
Mapping Sensor Workflows to Situational Awareness
Situational Awareness (SA) is essential for aviation safety and involves three steps: perceiving elements, comprehending their meaning, and projecting future status, as formalized by Endsley’s (1995) [63] model (Figure 6). The multi-source surveillance workflow for LSS targets described earlier closely aligns with this hierarchy. Radar (millimeter-wave, holographic, phased-array) [43], acoustic sensing (TDoA, Doppler stretch) [29], and optical/EO/IR detection (stereovision, gated active imaging) [35,41] provide Level 1 SA by perceiving objects and their current states. Machine-learning classification and identity resolution supply Level 2 SA by categorizing targets (birds, drones, clutter) using dynamic features or signatures [34,39]. Interacting Multiple Model (IMM) trackers estimate target state and intent [30,56]. Multi-Sensor Data Fusion (MSDF) integrates Level 1 inputs to clarify ambiguities and provide a unified air traffic view [37]. Tracking and trajectory prediction support Level 3 SA by estimating the future actions of detected objects, helping anticipate risks and inform mitigation. This includes Collision Risk Estimation, which calculates the predicted minimum distance between hazards and aircraft paths [44]. Prediction is aided by RNN/LSTM models for short-term flight traffic forecasting [50] and long-horizon bird activity projections [40,60,61]. The outcome enables mitigation strategies, such as fast trajectory re-planning [31]. These integrated multi-sensor systems create an evolving mental picture that supports timely decision-making.
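As a concrete instance of the Level 3 collision-risk idea, the sketch below computes the closest point of approach between two constant-velocity 2-D tracks. This linear-extrapolation form is a simplifying assumption for illustration, not the specific estimator of [44].

```python
def min_separation(p1, v1, p2, v2):
    """Predicted minimum distance between two constant-velocity tracks.
    p1, p2: positions (m); v1, v2: velocities (m/s), each as (x, y).
    Returns (time_of_closest_approach_s, min_distance_m), with the
    closest-approach time clamped to the future (t >= 0)."""
    dp = (p2[0] - p1[0], p2[1] - p1[1])        # relative position
    dv = (v2[0] - v1[0], v2[1] - v1[1])        # relative velocity
    vv = dv[0] ** 2 + dv[1] ** 2
    # Minimize |dp + dv*t|; if relative velocity is zero, t = 0.
    t = 0.0 if vv == 0 else max(0.0, -(dp[0] * dv[0] + dp[1] * dv[1]) / vv)
    cx, cy = dp[0] + dv[0] * t, dp[1] + dv[1] * t
    return t, (cx * cx + cy * cy) ** 0.5
```

For two head-on tracks closing at 20 m/s from 100 m apart, the predicted minimum separation is zero at t = 5 s, which is exactly the kind of projection that would trigger a mitigation advisory.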
Sensor-Specific Observations
The included studies explored a range of ground-based sensor technologies. Radar, including high-resolution millimeter-wave (mmW) and specialized avian radars, offers long-range, all-weather detection and can extract micro-Doppler signatures [34]. In the Swiss cheese analogy, radar forms an outer barrier but has limitations: clutter, difficulty differentiating birds and drones, and limited performance against very low-radar-cross-section targets. EO/IR cameras provide visual confirmation and precise localization [35,58]; they strengthen Level 2 SA by helping operators comprehend object type. Cameras are vulnerable to poor weather, have a limited detection range (<1.2 miles), require a line of sight, and impose a high computational load for real-time classification [26,41]. RF monitoring, including ADS-B and proprietary systems such as AeroScope, is invaluable for cooperative surveillance and identification [28,48,50]. RF methods act as an identity barrier but fail against non-cooperative or autonomous UAS, and they face coverage gaps at low altitude. Acoustic sensors detect characteristic sound signatures and can offer a cost-effective warning layer, but airport ambient noise severely limits their range and leads to false detections [37,51]. Table 5 outlines the main advantages and limitations of each modality and classifies them by safety-barrier role and SA level.
No single sensor provides a comprehensive SA or risk control; comprehensive surveillance emerges only when multiple barriers are layered and fused. Multi-Sensor Data Fusion (MSDF) is the most promising approach [37,38], in which data are processed to produce a common best estimate of the track state. By integrating complementary sensors, MSDF enhances detection reliability, accuracy, and classification, while reducing false alarms [31].
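A minimal sketch of the fusion idea, assuming independent sensor estimates with known variances, is inverse-variance weighting: each sensor’s position estimate is weighted by the reciprocal of its variance, yielding a common estimate whose variance is smaller than any single input. Operational MSDF trackers are considerably more elaborate (association, gating, asynchronous updates), so this is illustrative only.

```python
def fuse(estimates):
    """Inverse-variance weighted fusion of independent sensor estimates.
    `estimates` is a list of (value, variance) pairs, e.g. range in m
    from radar and EO/IR. Returns (fused_value, fused_variance)."""
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    value = sum(v * w for (v, _), w in zip(estimates, weights)) / total
    return value, 1.0 / total   # fused variance <= smallest input variance
```

Fusing a radar estimate of 100 m and an EO/IR estimate of 110 m, each with variance 25 m², gives 105 m with variance 12.5 m², illustrating how complementary sensors tighten the common track estimate.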

3.4.2. Aerial Hazards and Detection Challenges

The literature identifies two principal aerial hazards in the airport environment: UAS and wildlife, particularly birds. The proliferation of UAS presents a significant and growing threat to aviation safety, ranging from unintentional endangerment by hobbyists to malicious incursions that can cause severe operational and economic disruption [26,27,48]. Concurrently, bird strikes remain a persistent and costly threat, responsible for substantial aircraft damage and safety incidents, especially during critical phases of flight at low altitudes [33,44,55]. Consequently, modeling and characterizing bird activity patterns using radar data is a key component of risk assessment [40]. These threats are often categorized as Low-Slow-Small (LSS) targets, which pose distinct and formidable detection challenges [34,43]. When appearing in large numbers, these hazards manifest as drone swarms or bird flocks, which exhibit higher RCS but complicate identification due to mutual signal coupling within radar scanning volumes [34,39]. While experimental campaigns have evaluated multi-UAS swarms of up to five vehicles, differentiating their coordinated kinematics from avian flocking remains a significant hurdle for current automated classifiers [37]. Key challenges include their small RCS, low-altitude flight profiles that are subject to significant ground clutter, and high maneuverability [27,34,45]. A recurring difficulty is distinguishing between birds and UAS, as they can exhibit similar radar signatures and flight kinematics, resulting in high false alarm rates [39,43,56]. Furthermore, each sensor modality is subject to environmental and operational limitations. Radar performance can be degraded by clutter [27], while optical systems are hindered by adverse weather and poor lighting conditions [26,41].
Acoustic sensors have a limited effective range in noisy airport environments [26,37,51], and RF detectors are ineffective against autonomous or non-transmitting UAS [26,37,56].

3.4.3. Air Traffic and UAS Management (ATM/UTM)

The integration of UAS and Advanced Air Mobility (AAM) into the National Airspace System (NAS) requires robust ATM and UTM frameworks featuring layered architectures comprising ATM/ATC, the UAS Service Supplier (USS), and the UAS layer [28,31,50]. Surveillance planning must account for new infrastructure, such as vertiports, especially where cooperative surveillance (ADS-B/Remote ID) may be deficient (see Table 6) [28,34]. Effective management relies on Multi-Sensor Data Fusion (MSDF), combining cooperative (ADS-B) and non-cooperative surveillance data [28,50]. Ground-Based Detect and Avoid (GBDAA) systems integrate radar and ADS-B data to provide SA and proximity alerts, essential for safe beyond-visual-line-of-sight (BVLOS) operations [28,53]. AI/ML models, including Recurrent Neural Networks (RNNs), predict traffic patterns and support AAM flight forecasting [50]. AI assists in identifying unusual flight patterns and prioritizing non-participatory or hazardous tracks, thereby improving operator focus and reducing workload [28,50]. GIS platforms help unify data for risk analysis [28]. A persistent challenge remains the lack of a centralized data system to visualize all UAS and crewed aircraft information [28].

3.4.4. Human Factors and Operator Workload

The successful deployment of multi-sensor systems at airports fundamentally hinges on mitigating human factors challenges and operator workload. While these systems offer unprecedented detection capabilities, synthesis underscores the critical need for human-centered design to prevent cognitive overload for ATC and UAS pilots. Studies show that well-designed interfaces can reduce workload by streamlining communication tasks through precise, timely advisories [36]. Automation, such as 24/7 autonomous radar operations for detection, recognition, and tracking, is crucial to prevent human operators from being overwhelmed by continuous monitoring of fluctuating LSS targets [34]. Information display significantly affects usability, with preferences for clear target presentations on primary radar to deliver actionable intelligence efficiently [36]. Experienced operators are needed to interpret cluttered situation displays and maintain a low false alarm rate [37]. Rigorous evaluation using standardized metrics such as SUS, UEQ, NASA-Task Load Index, and SART is crucial for assessing usability, user experience, ergonomics, and SA of new human–machine interfaces (HMIs), including augmented reality systems [31]. Addressing ergonomic issues, such as AR headset fatigue and visibility challenges in varied lighting conditions, is vital for sustained operational effectiveness and safe ATM/UTM integration [31].

3.4.5. Key Performance Metrics and Evaluation

The evaluation of these systems relies on a comprehensive set of metrics. Technical performance is usually assessed using detection range [35], accuracy [26,35,54], precision [26,54], recall [26,54], F1-score [26], and false alarm rate (FAR) [26,35,54]. For vision-based systems, Average Precision (mAP) is a standard metric [52]. Localization and tracking performance are evaluated using Root Mean Square Error (RMSE) against ground-truth data [37]. Operational efficiency is measured by inference speed (frames per second) [52] and bird detection response time (DRT) below 5 s (FAA mandate), critical for real-time applications [34]. System-level evaluations assess the impact on controller workload [36] and overall SA [30,31], often using standardized questionnaires [31,36] and simulations.
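The scalar metrics listed above can be computed from confusion-matrix counts as follows. Note that FAR is computed here as the false-positive rate; some of the included studies instead report false alarms per unit time, so definitions should be checked study by study. The counts in the example are hypothetical.

```python
def detection_metrics(tp, fp, fn, tn):
    """Standard detection metrics from confusion counts:
    tp/fp/fn/tn = true/false positives, false/true negatives.
    FAR here is the false-positive rate, fp / (fp + tn)."""
    precision = tp / (tp + fp)                         # alarms that were real
    recall = tp / (tp + fn)                            # real targets detected
    f1 = 2 * precision * recall / (precision + recall) # harmonic mean
    far = fp / (fp + tn)                               # clutter flagged as target
    return {"precision": precision, "recall": recall, "f1": f1, "far": far}
```

For a hypothetical trial with 80 true detections, 20 false alarms, 20 misses, and 880 correct rejections, precision, recall, and F1 all equal 0.80 while FAR is about 2.2%, showing how a system can look accurate overall yet still generate a steady stream of false alarms.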

4. Discussion and Implications

A review of 36 studies on airport aerial object surveillance highlights rapid technological change driven by increased UAS traffic and ongoing wildlife risks. The following sections summarize the main findings, limitations, and implications for practice, policy, and research.

4.1. Principal Findings

Addressing the first question (current state), the review confirms that a layered multimodal surveillance architecture is now essential. Radar remains the principal sensor for wide-area monitoring, with Holographic Doppler radar providing unique advantages over traditional scanning radars; by eliminating beam scanning in favor of continuous “staring” illumination, it achieves extended accumulation times and higher Doppler resolution to distinguish LSS targets from environmental clutter [43]. This allows for the extraction of fine micro-Doppler features, such as UAV rotor rotations or bird wing flaps, that are often lost in conventional systems. For example, the Gamekeeper 16U holographic radar can identify targets with a radar cross-section as small as 0.01 m² at ranges up to three miles. These architectures integrate EO/IR systems for recognition and visual confirmation, RF detectors (for cooperative drones), and cost-effective acoustic detectors using DL. The necessary capability for non-cooperative targets is provided by MSDF augmented by AI/ML, which synthesizes sensor inputs to create a reliable common track estimate. Deep Learning, notably RNNs, is leveraged for Non-Cooperative Target Recognition (NCTR) by extracting non-linear features from raw radar signals.
Addressing the second question, the most significant challenge is the reliable classification and identification of LSS targets amidst intense environmental clutter, compounded by the high similarity between drone and bird radar echoes. Tests show radar displays frequently present over 50% false targets, rapidly eroding operator trust and demanding experienced manual interpretation. Current systems suffer from limited interoperability and insufficient detection range (many systems cover only 1–4 miles, short of the required Obstacle Limitation Surface (OLS) coverage). Opportunities focus on developing sophisticated MSDF algorithms and generalizable AI/ML models trained on large, high-quality, real-world datasets that leverage techniques such as Causal-Symmetric Transformation (CST) to improve robustness in dynamic environments. Groundbreaking signal processing, such as the DSCR detector and geometry-mapping Automatic Target Recognition (ATR), is required to achieve millisecond-level Detection Response Time (DRT) and circumvent unreliable tracking features. Success relies on seamless system integration into unified UTM/ATM platforms via standardized interfaces.

4.2. Limitations: Evidence and Review Processes

The current review faces several limitations: inconsistent reporting standards, varied metrics and thresholds, and insufficient details on weather or operational conditions. Most studies are short-term and small-scale (typically under three months), which restricts the generalizability of results to continuous 24/7 airport operations and limits the reliability of conclusions for safety-critical decision-making. The heavy reliance on simulation-based models rather than longitudinal field data means that system performance under rare edge cases or complex round-the-clock interference patterns remains partially unverified. Detection rates and classification performance may be optimistic or difficult to reproduce in real operational settings, as ground-truth validation for non-cooperative targets is frequently weak. Many underlying studies rely on human observers, ADS-B data, which is inherently limited to cooperative users, or inferred labels. These proxies may fail to capture the full complexity of adversarial or accidental incursions where no electronic signature is present, potentially overstating system efficacy [27,28]. Furthermore, the review may be subject to publication bias, as several included studies are industry-led or manufacturer-funded (e.g., Leonardo S.p.A. AULOS system, Saab G1X system, Merlin system by DeTect Inc.), which may bias reporting toward optimized performance data. A regional bias is also evident, with a predominance of solutions from Europe and China, potentially limiting the generalizability of findings to other global regulatory or environmental contexts. Methodological constraints include the primary reliance on high-impact academic databases and on English-only studies.
While the inclusion of major repositories ensures a high standard of peer-reviewed evidence, the exclusion of non-English studies may still bias results toward well-funded academic contexts in China, Europe, and North America, potentially omitting valuable regional field trials or unique regulatory findings. Simulated environments and substantial heterogeneity across designs, sensors, and metrics further limit external validity, necessitating narrative synthesis rather than meta-analyses.

4.3. Theoretical and Practical Implications

The findings highlight that airport surveillance effectiveness is grounded in both technological capability and safety theory. Layering radar, EO/IR, RF, and acoustic sensors conforms to the Swiss cheese model [62], reducing the likelihood that the same weakness will be present across all defenses [64]. By aligning sensor workflows with the cognitive stages of situational awareness, system designers can better understand how perception, comprehension, and projection interact. Operationalizing these frameworks involves translating the ‘barriers’ of the Swiss Cheese Model into measurable safety cases, such as a target level of safety (TLS) of less than 10⁻⁷ mid-air collision (MAC) events per flight hour [28]. Certification criteria can be established by auditing surveillance software against standards like MIL-STD-882E to determine risk levels for specific airport operational volumes. Validation benchmarks are further derived by mapping technical performance, such as millisecond-level detection response times and latency thresholds, directly to Situational Awareness levels, ensuring that Stage 1 ‘perception’ provides sufficient time for Stage 3 ‘projection’ and automated mitigation [26,27]. Table 7 summarizes stakeholder-specific implications for airport aerial object surveillance systems.
Consistent with the FAA's risk-based framework, airports should match surveillance technology to operational risk, defined by population density and airspace class rather than airport size; Table 8 operationalizes this alignment. Low-risk rural Class G/E sites (Categories 1–2) need only Remote ID receivers and minimal local sensing. Moderate-risk suburban and regional facilities (Categories 3–4) should add strategic deconfliction via certified services (UTM/ADSP) and selective non-cooperative sensors (e.g., medium-range radar or optical units). High-risk Class B/C hubs and dense urban vertiports (Category 5) require comprehensive layered radar/EO/RF fusion, wide-area Remote ID capture, and full interoperability between ATC and UTM.
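As a sketch of how this risk-based alignment might be encoded in deployment-planning software, the function below maps the three risk bands described above to surveillance capability tiers. The category thresholds and tier contents paraphrase the narrative; the function name and list encoding are our own illustrative choices, not an FAA-defined interface.

```python
# Hypothetical encoding of the risk-based surveillance tiers described above.
# Category bands follow the text: 1-2 rural, 3-4 suburban/regional, 5 hubs/vertiports.

def surveillance_tier(category: int) -> list[str]:
    """Return the surveillance capabilities suggested for an
    FAA-style risk category (1 = lowest risk, 5 = highest)."""
    if category <= 2:   # low-risk rural Class G/E sites
        return ["Remote ID receiver", "minimal local sensing"]
    if category <= 4:   # moderate-risk suburban/regional facilities
        return ["Remote ID receiver",
                "UTM/ADSP strategic deconfliction",
                "selective non-cooperative sensors (radar or optical)"]
    # high-risk Class B/C hubs and dense urban vertiports
    return ["layered radar/EO/RF fusion",
            "wide-area Remote ID capture",
            "ATC-UTM interoperability"]

print(surveillance_tier(1))
print(surveillance_tier(5))
```

The design point is that the tier is keyed to airspace class and population density (here collapsed into the category number), not to airport size, mirroring the alignment in Table 8.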

4.4. Future Research Directions and Outlook

Future research should create open, multimodal, airport-grade datasets and benchmarks for drone-versus-bird classification; conduct multi-airport, longitudinal trials to test generalization across terrain, weather, and traffic; and advance classification and behavior prediction using causality-aware DL with meteorological features. Work is also needed on advanced real-time multimodal fusion, automatic signal extraction and tracking, and verification of human-out-of-the-loop (HOOTL) operations that reduce workload without reducing safety. Priorities now include cyber-resilient, standards-based data exchange and audit trails for sensor-to-UTM integrations; standardized scenario libraries and public benchmarks for runway and vertiport perimeters [10]; digital-twin and AAM testbeds to quantify interface latency and data quality [71,72]; and spectrum- and network-aware sensing (RF, JC&S, 5G) for non-cooperative targets [8].

Advanced Air Mobility (AAM) and Vertiport Surveillance Challenges

The rapid evolution of AAM operations introduces distinctive surveillance challenges that exceed those addressed by traditional airport frameworks. Vertiports, operating in densely populated urban areas, face significant electromagnetic interference and spatial constraints that complicate sensor deployment [73]. Moreover, the complex three-dimensional traffic flow, including multiple approach corridors within limited airspace, necessitates precise, low-latency detection systems capable of distinguishing authorized eVTOL aircraft from rogue UAS and environmental clutter such as construction cranes [74]. Achieving 99.9% detection probability in urban canyons requires research into optimal sensor placement and multi-vertiport data-sharing architectures [75]. The integration of rooftop-mounted millimeter-wave (mmW) radar, visual sensors, and mandated Remote ID broadcasts must be explored in the context of persistent line-of-sight limitations [76]. Living-lab deployments such as BEYOND Phase 2 should produce certification-grade evidence aligned with the FAA's BVLOS NPRM and EB-105A vertiport design guidance [67,69]. Studies should quantify probability of detection (PD) at range, false alarms per hour, latency, and SA outcomes, and define interface requirements for ATM/UTM and AAM/vertiport operations [67,77]. AI and DL will continue to shape surveillance by enabling intelligent software that extracts more from sensor data, allowing models to detect features such as micro-Doppler wing or rotor signatures and thereby improving bird and UAS classification [38,43].
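The 99.9% detection-probability goal can be restated as a fusion requirement: under an (optimistic) assumption of independent sensors with "detect if any sensor detects" fusion, combined PD equals one minus the product of per-sensor miss probabilities. The sketch below inverts that relation to ask how many overlapping sensors of a given quality a vertiport would need; the per-sensor PD values are illustrative assumptions, not measurements from the reviewed studies.

```python
import math

def fused_pd(per_sensor_pd: float, n_sensors: int) -> float:
    """Combined detection probability for n independent sensors
    with OR-rule fusion (detect if any single sensor detects)."""
    return 1.0 - (1.0 - per_sensor_pd) ** n_sensors

def sensors_needed(per_sensor_pd: float, target_pd: float) -> int:
    """Smallest n such that fused_pd(per_sensor_pd, n) >= target_pd."""
    # Solve (1 - pd)^n <= 1 - target  =>  n >= log(1 - target) / log(1 - pd)
    return math.ceil(math.log(1.0 - target_pd) / math.log(1.0 - per_sensor_pd))

# Hypothetical per-sensor PDs, degraded by urban-canyon multipath/occlusion
for pd in (0.70, 0.85, 0.95):
    n = sensors_needed(pd, 0.999)
    print(f"PD={pd:.2f}: {n} overlapping sensors -> fused PD {fused_pd(pd, n):.4f}")
```

In practice sensor errors are correlated (shared weather, occlusion, and interference), so real deployments need stricter margins than this independence model suggests; the point is only that OR-rule fusion makes stringent PD targets tractable with modestly performing individual sensors, which is why overlapping coverage and placement optimization dominate the vertiport research agenda.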

5. Conclusions

Airport surveillance of sUAS and birds sits at the intersection of technological innovation and safety science. A systematic review of thirty-six studies indicates that no single sensor modality adequately detects low, slow, and small targets in the complex airport environment. Applying the Swiss Cheese Model and situational awareness theory clarifies why layering multiple sensors is essential: each modality constitutes an imperfect barrier with its own holes, and only in combination can those holes be covered and a coherent understanding of the airspace achieved. While radar is crucial, its limitations with LSS targets require cameras, RF sensors, and fusion algorithms. Key challenges include improving classification, reducing false alarms, and ensuring reliability amid environmental noise and deception. Progress in machine learning, data sharing, and new sensor types is vital, and technology must be paired with clear protocols and ATC integration. The field is transitioning from experimental to operational, and coordinated innovation among airports, regulators, and technologists is essential for unified airspace security and the safe integration of unmanned systems.

Author Contributions

Conceptualization, J.S. and C.Y.; methodology, J.S. and C.Y.; validation, J.S. and C.Y.; formal analysis, J.S.; resources, J.S. and C.Y.; data curation, J.S.; writing—original draft preparation, J.S.; writing—review and editing, J.S. and C.Y.; visualization, J.S.; supervision, C.Y.; funding acquisition, C.Y. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

The original contributions presented in this study are included in the article; further inquiries can be directed to the corresponding author.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
AAM: Advanced Air Mobility
ACI: Airports Council International
ADSP: Aeronautical Data Service Provider
ATC: Air Traffic Control
ATM: Air Traffic Management
BVLOS: Beyond Visual Line of Sight
DL: Deep Learning
EASA: European Union Aviation Safety Agency
GA: General Aviation
HOOTL: Human Out of the Loop
JC&S: Joint Communications & Sensing
NPRM: Notice of Proposed Rulemaking
OLS: Obstacle Limitation Surfaces
PD: Probability of Detection
PRISMA: Preferred Reporting Items for Systematic Reviews and Meta-Analyses
RF: Radio Frequency
RNN-LSTM: Recurrent Neural Network with Long Short-Term Memory
SA: Situational Awareness
UAS: Unmanned Aircraft System
UTM: UAS Traffic Management

Appendix A

Table A1. Results of JBI Risk of Bias Assessment.
# | Study Context | Q1 Q2 Q3 Q4 Q5 Q6 Q7 Q8
1 | Ali & Nathwani, 2024 | Y N U Y Y U U Y
2 | Ananenkov et al., 2018 | U N Y U N N U U
3 | Case & Hupy, 2025 | Y Y U Y Y Y Y Y
4 | Chervoniak et al., 2020 | Y Y U Y N N U Y
5 | Coombes et al., 2016 | Y Y Y Y N N Y Y
6 | De Luca et al., 2025 | Y Y U N Y N N Y
7 | Gauthreaux Jr et al., 2019 | Y Y Y Y Y Y Y Y
8 | Gerringer et al., 2016 | Y Y Y Y N Y Y Y
9 | Gong et al., 2024 | N Y N N N NA N N
10 | Gradolewski et al., 2021 | Y Y Y Y N Y Y Y
11 | Hale & Stanley, 2017 | Y Y Y Y Y Y Y Y
12 | Heidger et al., 2021 | Y Y Y Y N Y Y Y
13 | Holt et al., 2024 | N Y N U N NA N Y
14 | J. Liu et al., 2021 | Y Y N Y Y Y Y Y
15 | J. Liu et al., 2024 | Y Y Y Y Y Y Y Y
16 | Juranyi et al., 2020 | Y Y Y Y NA N Y Y
17 | Krasnenko et al., 2023 | Y Y N N N N N Y
18 | L. Chen et al., 2025 | Y Y Y Y Y Y Y Y
19 | Lopez-Lago et al., 2017 | Y Y NA N Y N Y Y
20 | Martelli et al., 2019 | Y Y Y Y Y N Y Y
21 | Martelli et al., 2020a | Y Y Y Y Y N Y Y
22 | Martelli et al., 2020b | Y Y NA Y N N Y Y
23 | Mott et al., 2020 | Y Y N Y NA Y Y Y
24 | Phillips et al., 2018 | Y Y Y Y Y Y Y Y
25 | Pothana et al., 2025 | Y Y Y Y Y N Y Y
26 | Rollo et al., 2020 | Y Y NA Y NA NA Y Y
27 | S. Liu et al., 2022 | Y Y Y N Y N Y Y
28 | Stamm et al., 2018 | Y Y NA Y NA Y Y Y
29 | Thai et al., 2019 | Y Y Y Y Y Y Y Y
30 | W. S. Chen et al., 2018 | Y Y Y Y Y Y Y Y
31 | W. S. Chen et al., 2019 | Y Y NA Y NA Y Y Y
32 | W. S. Chen et al., 2021 | Y Y NA Y Y Y Y Y
33 | Wang et al., 2023 | Y Y NA N Y N NA Y
34 | Washburn et al., 2022 | Y Y Y Y Y Y Y Y
35 | Xu et al., 2023 | Y Y Y N Y N Y Y
36 | Xu et al., 2024 | Y Y Y Y N N Y Y
Note. Y: Yes; N: No; U: Unclear; NA: Not Applicable. The criteria Q1–Q8 represent: Q1 (Inclusion criteria defined); Q2 (Subjects and setting described); Q3 (Exposure measured validly); Q4 (Condition measured objectively); Q5 (Confounders identified); Q6 (Confounder strategies stated); Q7 (Outcomes measured reliably); and Q8 (Appropriate statistical analysis). References: Ali and Nathwani (2024) [26]; Ananenkov et al. (2018) [27]; Case and Hupy (2025) [28]; Chervoniak et al. (2020) [29]; Coombes et al. (2016) [30]; De Luca et al. (2025) [31]; Gauthreaux Jr. et al. (2019) [32]; Gerringer et al. (2016) [33]; Gong et al. (2024) [34]; Gradolewski et al. (2021) [35]; Hale and Stanley (2017) [36]; Heidger et al. (2021) [37]; Holt et al. (2024) [38]; J. Liu et al. (2021) [39]; J. Liu et al. (2024) [40]; Juranyi et al. (2020) [41]; Krasnenko et al. (2023) [42]; L. Chen et al. (2025) [43]; Lopez-Lago et al. (2017) [44]; Martelli et al. (2019) [45]; Martelli et al. (2020a) [46]; Martelli et al. (2020b) [47]; Mott et al. (2020) [48]; Phillips et al. (2018) [49]; Pothana et al. (2025) [50]; Rollo et al. (2020) [51]; S. Liu et al. (2022) [52]; Stamm et al. (2018) [53]; Thai et al. (2019) [54]; W. S. Chen et al. (2018) [55]; W. S. Chen et al. (2019) [56]; W. S. Chen et al. (2021) [57]; Wang et al. (2023) [58]; Washburn et al. (2022) [59]; Xu et al. (2023) [60]; Xu et al. (2024) [61].

References

  1. Federal Aviation Administration. FAA Wildlife Strike Database; Federal Aviation Administration: Washington, DC, USA, 2025. Available online: https://wildlife.faa.gov/search (accessed on 1 November 2025).
  2. Wood, J. Wildlife Strikes up 14% in Last Year. General Aviation News, 22 August 2025. Available online: https://generalaviationnews.com/2025/08/22/wildlife-strikes-up-14-in-last-year/ (accessed on 2 November 2025).
  3. Searing, G. How the FAA Is Looking to Counteract Increasing Bird Strikes in U.S. Aviation. Bird Strike Association Canada. 23 March 2023. Available online: https://canadianbirdstrike.ca/how-the-faa-is-looking-to-counteract-increasing-bird-strikes-in-us-aviation/ (accessed on 5 November 2025).
  4. Federal Aviation Administration. FAA Drone Detection Testing; Federal Aviation Administration: Washington, DC, USA, 2025. Available online: https://www.faa.gov/newsroom/faa-drone-detection-testing-0 (accessed on 1 October 2025).
  5. Kies, A. Systematic Detection of Drones at Airports. Transportation Review. Available online: https://airport-management-europe.transportationreview.com/cxoinsight/systematic-detection-of-drones-at-airports--nwid-313.html (accessed on 15 October 2025).
  6. Airports Council International (ACI) Asia-Pacific & Middle East. Challenges for Airports in Addressing Existing and Emerging Threats. In Proceedings of the ICAO MID GASeP Forum, Middle East, Cairo, Egypt, 15–19 June 2025; PowerPoint Presentation. Available online: https://www2023.icao.int/MID/Documents/2025/GASeP-CZ%20Seminars/ICAO%20MID%20GASeP%20forum%20-%20ACI%20presentation%20-%20threat%20and%20risk.pdf (accessed on 25 October 2025).
  7. Barduson, L.; Carrasco, E.; Khan, Z.; Krstulic, K. Cooperative and Non-Cooperative UAS Detection, 2025. Available online: https://www.researchgate.net/publication/2242544_A_Survey_on_UAV_Detection_and_Classification_Techniques (accessed on 30 October 2025).
  8. Khawaja, W.; Ezuma, M.; Semkin, V.; Erden, F.; Ozdemir, O.; Guvenc, I. A Survey on Detection, Classification, and Tracking of UAVs Using Radar and Communications Systems. arXiv 2025, arXiv:2402.05909v2. Available online: https://arxiv.org/html/2402.05909v2 (accessed on 15 November 2025).
  9. Semenyuk, V.; Kurmashev, I.; Lupidi, A.; Alyoshin, D.; Krivoshein, E.; Gasanov, M. Advances in UAV Detection: Integrating Multi-Sensor Systems and AI for Enhanced Accuracy and Efficiency. Int. J. Crit. Infrastruct. Prot. 2025, 49, 100744. [Google Scholar] [CrossRef]
  10. Dong, Y.; Wu, F.; Zhang, S.; Chen, G.; Hu, Y.; Yano, M.; Sun, J.; Huang, S.; Liu, F.; Dai, Q.; et al. Securing the Skies: A Comprehensive Survey on Anti-UAV Methods, Benchmarking, and Future Directions. arXiv 2025, arXiv:2504.11967. Available online: https://arxiv.org/abs/2504.11967 (accessed on 22 November 2025).
  11. Molchanov, P.; Harmanny, R.I.A.; de Wit, J.J.M.; Egiazarian, K.; Astola, J. Classification of small UAVs and birds by micro-Doppler signatures. Int. J. Microw. Wirel. Technol. 2014, 6, 435–444. [Google Scholar] [CrossRef]
  12. Luo, W.; Xing, J.; Milan, A.; Zhang, X.; Liu, W.; Kim, T.-K. Multiple object tracking: A literature review. Artif. Intell. 2021, 293, 103448. [Google Scholar] [CrossRef]
  13. Al-Qubaydhi, N.; Alenezi, A.; Alanazi, T.; Senyor, A.; Alanezi, N.; Alotaibi, B.; Alotaibi, M.; Razaque, A.; Hariri, S. Deep learning for unmanned aerial vehicles detection: A review. Comput. Sci. Rev. 2024, 51, 100614. [Google Scholar] [CrossRef]
  14. Bisio, I.; Garibotto, C.; Haleem, H.; Lavagetto, F.; Sciarrone, A. RF/WiFi-based UAV surveillance systems: A systematic literature review. Internet Things 2024, 26, 101201. [Google Scholar] [CrossRef]
  15. Coluccia, A.; Parisi, G.; Fascista, A. Detection and classification of multirotor drones in radar sensor networks: A review. Sensors 2020, 20, 4172. [Google Scholar] [CrossRef]
  16. de Macedo, S.O.; Caetano, M.; da Costa, R.M. Drone detection in airport environments: A literature review. Array 2025, 28, 100511. [Google Scholar] [CrossRef]
  17. Rahman, M.H.; Sejan, M.A.S.; Aziz, M.A.; Tabassum, R.; Baik, J.-I.; Song, H.-K. A comprehensive survey of unmanned aerial vehicles detection and classification using machine learning approach: Challenges, solutions, and future directions. Remote Sens. 2024, 16, 879. [Google Scholar] [CrossRef]
  18. Samaras, S.; Diamantidou, E.; Ataloglou, D.; Sakellariou, N.; Vafeiadis, A.; Magoulianitis, V.; Lalas, A.; Dimou, A.; Zarpalas, D.; Votis, K.; et al. Deep learning on multi-sensor data for counter-UAV applications: A systematic review. Sensors 2019, 19, 4837. [Google Scholar] [CrossRef]
  19. Seidaliyeva, U.; Ilipbayeva, L.; Taissariyeva, K.; Smailov, N.; Matson, E.T. Advances and challenges in drone detection and classification techniques: A state-of-the-art review. Sensors 2024, 24, 125. [Google Scholar] [CrossRef]
  20. Seidaliyeva, U.; Ilipbayeva, L.; Utebayeva, D.; Smailov, N.; Matson, E.T.; Tashtay, Y.; Turumbetov, M.; Sabibolda, A. LiDAR technology for UAV detection: From fundamentals and operational principles to advanced detection and classification techniques. Sensors 2025, 25, 2757. [Google Scholar] [CrossRef] [PubMed]
  21. Chen, W.; Huang, Y.; Lu, X.; Zhang, J. Review on critical technology development of avian radar system. Aircr. Eng. Aerosp. Technol. 2022, 94, 445–457. [Google Scholar] [CrossRef]
  22. Yang, C.; Huang, C.; Zhao, Y. The intellectual structure and the future of counter-uncrewed aerial systems (UAS) research: A bibliometric and a scoping review. Int. J. Aviat. Aeronaut. Aerosp. 2024, 11, 3. [Google Scholar] [CrossRef]
  23. Liu, Z.; An, P.; Yang, Y.; Qiu, S.; Liu, Q.; Xu, X. Vision-based drone detection in complex environments: A survey. Drones 2024, 8, 643. [Google Scholar] [CrossRef]
  24. Page, M.J.; McKenzie, J.E.; Bossuyt, P.M.; Boutron, I.; Hoffmann, T.C.; Mulrow, C.D.; Moher, D. The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. BMJ 2021, 372, n71. [Google Scholar] [CrossRef]
  25. Joanna Briggs Institute. The Joanna Briggs Institute Critical Appraisal Tools for Use in JBI Systematic Reviews: Critical Appraisal Checklist for Analytical Cross Sectional Studies; Joanna Briggs Institute: Adelaide, Australia, 2017; pp. 1–7. Available online: https://jbi.global/sites/default/files/2019-05/JBI_Critical_Appraisal-Checklist_for_Analytical_Cross_Sectional_Studies2017_0.pdf (accessed on 10 December 2025).
  26. Ali, M.; Nathwani, K. Exploiting wavelet scattering transform and 1D-CNN for unmanned aerial vehicle detection. IEEE Signal Process. Lett. 2024, 31, 1790–1794. [Google Scholar] [CrossRef]
  27. Ananenkov, A.E.; Marin, D.V.; Nuzhdin, V.M.; Rastorguev, V.V.; Sokolov, P.V. Possibilities to observe small-size UAVs in the prospective airfield radar. In Proceedings of the 2018 20th Anniversary International Conference on Transparent Optical Networks (ICTON), Bucharest, Romania, 1–5 July 2018; IEEE: Piscataway, NJ, USA, 2018. [Google Scholar] [CrossRef]
  28. Case, R.P.; Hupy, J.P. Airport Cooperative Research Program Graduate Research Award 11-04: Geographic information system application to unmanned traffic management within the national airspace system. Transp. Res. Rec. 2025, 2679, 1064–1078. [Google Scholar] [CrossRef]
  29. Chervoniak, Y.; Sinitsyn, R.; Yanovsky, F. Passive acoustic radar system for flying vehicle localization. In Proceedings of the 23rd International Microwave and Radar Conference (MIKON 2020), Warsaw, Poland, 5–7 October 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 278–281. [Google Scholar] [CrossRef]
  30. Coombes, M.; Liu, C.; Chen, W.-H. Situation awareness for UAV operating in terminal areas using bearing-only observations and circuit flight rules. In Proceedings of the 2016 American Control Conference (ACC), Boston, MA, USA, 6–8 July 2016; IEEE: Boston, MA, USA, 2016; pp. 479–485. [Google Scholar] [CrossRef]
  31. De Luca, V.; Pascarelli, C.; Colucci, M.; Afrune, P.; Corallo, A.; Avanzini, G. A platform for safe operations of unmanned aircraft systems in critical areas. Engineering 2025, 49, 314–331. [Google Scholar] [CrossRef]
  32. Gauthreaux, S.A., Jr.; Shapiro, A.-M.; Mayer, D.; Clarke, B.L.; Herricks, E.E. Detecting bird movements with L-band avian radar and S-band dual-polarization Doppler weather radar. Remote Sens. Ecol. Conserv. 2019, 5, 237–246. [Google Scholar] [CrossRef]
  33. Gerringer, M.B.; Lima, S.L.; DeVault, T.L. Evaluation of an avian radar system in a midwestern landscape. Wildl. Soc. Bull. 2016, 40, 150–159. [Google Scholar] [CrossRef]
  34. Gong, J.; Yan, J.; Kong, D.; Li, D. Enhancing airport airspace safety: A small radar system for mitigating bird, drone, and wake vortex hazards. In Proceedings of the 2024 AIAA DATC/IEEE 43rd Digital Avionics Systems Conference (DASC), San Diego, CA, USA, 29 September–3 October 2024; IEEE: Piscataway, NJ, USA, 2024; pp. 1–10. [Google Scholar] [CrossRef]
  35. Gradolewski, D.; Dziak, D.; Kaniecki, D.; Jaworski, A.; Skakuj, M.; Kulesza, W.J. A runway safety system based on vertically oriented stereovision. Sensors 2021, 21, 1464. [Google Scholar] [CrossRef] [PubMed]
  36. Hale, M.R.; Stanley, R.M. Evaluating the design and suitability of the wildlife surveillance concept. In Proceedings of the 2017 Integrated Communications, Navigation and Surveillance Conference (ICNS), Herndon, VA, USA, 18–20 April 2017; IEEE: Piscataway, NJ, USA, 2017. [Google Scholar] [CrossRef]
  37. Heidger, R.; Lambercy, V.; Lambers, D. Tracking analysis of drone detection systems at airports: Methodology and results. In Proceedings of the 21st International Radar Symposium (IRS 2021), Berlin, Germany, 21–22 June 2021; IEEE: Piscataway, NJ, USA, 2021. [Google Scholar] [CrossRef]
  38. Holt, D.; Guo, W.; Sun, M.; Panagiotakopoulos, D.; Warston, H. Deep learning for radar classification. In Proceedings of the 2024 International Conference on Unmanned Aircraft Systems (ICUAS), Chania, Greece, 4–7 June 2024; IEEE: Piscataway, NJ, USA, 2024; pp. 1208–1215. [Google Scholar] [CrossRef]
  39. Liu, J.; Xu, Q.; Chen, W.S. Classification of bird and drone targets based on motion characteristics and random forest model using surveillance radar data. IEEE Access 2021, 9, 160135–160144. [Google Scholar] [CrossRef]
  40. Liu, J.; Xu, Q.; Su, M.; Chen, W.S. Activity modeling and characterization for airport bird situation awareness using avian radar datasets. Aerospace 2024, 11, 442. [Google Scholar] [CrossRef]
  41. Juranyi, Z.; Schmidt, J.; Peters, E.; Lehmann, F. Characterization of an affordable and compact gated-viewing system for maritime search and rescue applications. In Electro-Optical Remote Sensing XIV; Kamerman, G.W., Steinvall, O., Eds.; SPIE: Washington, DC, USA, 2020; Volume 11538, p. 1153808. [Google Scholar] [CrossRef]
  42. Krasnenko, N.P.; Vlasov, E.V.; Rakov, A.S. Radio physical methods of ensuring ornithological safety of objects and territories. Russ. Phys. J. 2023, 66, 479–484. [Google Scholar] [CrossRef]
  43. Chen, L.; Zhu, N.; Chen, H.; Dong, Y.; Zhang, Y.; Cai, N. A causality-inspired single-source domain generalized method for low-slow-small threat target recognition through holographic Doppler radar. Expert Syst. Appl. 2025, 287, 128104. [Google Scholar] [CrossRef]
  44. Lopez-Lago, M.; Casado, R.; Bermudez, A.; Serna, J. A predictive model for risk assessment on imminent bird strikes on airport areas. Aerosp. Sci. Technol. 2017, 62, 19–30. [Google Scholar] [CrossRef]
  45. Martelli, T.; Colone, F.; Cardinali, R. Simultaneous short- and long-range surveillance of drones and aircraft with DVB-T-based passive radar. In Proceedings of the 2019 International Radar Conference (RADAR 2019), Toulon, France, 23–27 September 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 561–566. [Google Scholar] [CrossRef]
  46. Martelli, T.; Colone, F.; Cardinali, R. DVB-T-based passive radar for simultaneous counter-drone operations and civil air traffic surveillance. IET Radar Sonar Navig. 2020, 14, 505–515. [Google Scholar] [CrossRef]
  47. Martelli, T.; Filippini, F.; Colone, F. Tackling the different target dynamics issues in counter-drone operations using passive radar. In Proceedings of the 2020 IEEE International Radar Conference (RADAR), Washington DC, USA, 27 April–27 May 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 512–517. [Google Scholar] [CrossRef]
  48. Mott, J.H.; Marshall, Z.A.; Vandehey, M.A.; May, M.; Bullock, D.M. Detection of conflicts between ADS-B-equipped aircraft and unmanned aerial systems. Transp. Res. Rec. 2020, 2674, 197–204. [Google Scholar] [CrossRef]
  49. Phillips, A.C.; Majumdar, S.; Washburn, B.E.; Mayer, D.; Swearingin, R.M.; Herricks, E.E.; Guerrant, T.L.; Beckerman, S.F.; Pullins, C.K. Efficacy of avian radar systems for tracking birds on the airfield of a large international airport. Wildl. Soc. Bull. 2018, 42, 467–477. [Google Scholar] [CrossRef]
  50. Pothana, P.; Snyder, P.; Vidhyadharan, S.; Ullrich, M.; Thornby, J. Air traffic trends and UAV safety: Leveraging automatic dependent surveillance–broadcast data for predictive risk mitigation. Aerospace 2025, 12, 284. [Google Scholar] [CrossRef]
  51. Rollo, M.; Kaiser, V.; Volf, P. Modeling and simulation of sensor placement strategies to detect malicious UAS operations. In Proceedings of the 2020 Integrated Communications Navigation and Surveillance Conference (ICNS), Herndon, VA, USA, 8–10 September 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 2G2-1–2G2-12. [Google Scholar] [CrossRef]
  52. Liu, S.; Qu, J.; Wu, R. HollowBox: An anchor-free UAV detection method. IET Image Process. 2022, 16, 2922–2936. [Google Scholar] [CrossRef]
  53. Stamm, R.J.; Glaneuski, J.; Kennett, P.R.; Belanger, J.M. Advances in the use of NAS infrastructure and GBDAA for UAS operations. In Proceedings of the 2018 IEEE/AIAA 37th Digital Avionics Systems Conference (DASC), London, UK, 23–27 September 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 1507–1515. [Google Scholar] [CrossRef]
  54. Thai, V.-P.; Zhong, W.; Pham, T.; Alam, S.; Duong, V. Detection, tracking, and classification of aircraft and drones in digital towers using machine learning on motion patterns. In Proceedings of the 2019 Integrated Communications, Navigation and Surveillance Conference (ICNS), Washington, DC, USA, 9–11 April 2019; IEEE: Piscataway, NJ, USA, 2019. [Google Scholar] [CrossRef]
  55. Chen, W.S.; Zhang, J.; Li, J. Intelligent decision-making with bird-strike risk assessment for airport bird repellent. Aeronaut. J. 2018, 122, 988–1002. [Google Scholar] [CrossRef]
  56. Chen, W.S.; Liu, J.; Li, J. Classification of UAV and bird target in low-altitude airspace with surveillance radar data. Aeronaut. J. 2019, 123, 191–211. [Google Scholar] [CrossRef]
  57. Chen, W.S.; Huang, Y.F.; Lu, X.F.; Zhang, J. Analysis of bird situation around airports using avian radar. Aeronaut. J. 2021, 125, 2149–2168. [Google Scholar] [CrossRef]
  58. Wang, Z.; Williams, D.; Wong, K.C. Probability map based aerial target detection and localisation using networked cameras. In Proceedings of the AIAA SCITECH 2023 Forum, National Harbor, MD, USA, 23–27 January 2023; American Institute of Aeronautics and Astronautics: Reston, VA, USA, 2023. [Google Scholar] [CrossRef]
  59. Washburn, B.E.; Maher, D.; Beckerman, S.F.; Majumdar, S.; Pullins, C.K.; Guerrant, T.L. Monitoring raptor movements with satellite telemetry and avian radar systems: An evaluation for synchronicity. Remote Sens. 2022, 14, 2658. [Google Scholar] [CrossRef]
  60. Xu, Q.; Liu, J.; Su, M.; Chen, W.S. Multi-scale temporal characters mining for bird activities based on historical avian radar system datasets. Aeronaut. J. 2023, 127, 1452–1472. [Google Scholar] [CrossRef]
  61. Xu, Q.; Liu, J.; Su, M.; Chen, W.S. Bird behaviour characterisation and environment dependence modelling in airport airspace based on radar datasets. Aeronaut. J. 2024, 128, 2815–2831. [Google Scholar] [CrossRef]
  62. Reason, J. Managing the Risks of Organizational Accidents, 1st ed.; Ashgate: Aldershot, UK, 1997. [Google Scholar]
  63. Endsley, M.R. Toward a theory of situation awareness in dynamic systems. Hum. Factors 1995, 37, 32–64. [Google Scholar] [CrossRef]
  64. Wang, W.; Guo, X.; Liu, Y.; Tang, A.; Yang, Q. Factors affecting unmanned aerial vehicles’ unsafe behaviors and influence mechanism based on social network theory. Transp. Res. Rec. 2023, 2677, 1030–1045. [Google Scholar] [CrossRef]
  65. Lykou, G.; Moustakas, D.; Gritzalis, D. Defending airports from UAS: A survey on cyber-attacks and counter-drone sensing technologies. Sensors 2020, 20, 3537. [Google Scholar] [CrossRef]
  66. Swinney, C.J.; Woods, J.C. The effect of real-world interference on CNN feature extraction and machine learning classification of unmanned aerial systems. Aerospace 2021, 8, 179. [Google Scholar] [CrossRef]
  67. Federal Aviation Administration. Engineering Brief No. 105A: Vertiport Design; Supplemental Guidance to Advisory Circular 150/5390-2D, Heliport Design; U.S. Department of Transportation: Washington, DC, USA, 2024. Available online: https://www.faa.gov/airports/engineering/engineering_briefs/eb_105a_vertiports (accessed on 11 December 2025).
  68. U.S. Government Accountability Office. Advanced Air Mobility: Legal Authorities and Issues to Consider for Operations (GAO-24-106451); U.S. Government Accountability Office: Washington, DC, USA, 2024. Available online: https://www.gao.gov/assets/d24106451.pdf (accessed on 16 December 2025).
  69. Federal Aviation Administration. BEYOND 2025. Available online: https://www.faa.gov/uas/programs_partnerships/beyond (accessed on 15 December 2025).
  70. Federal Aviation Administration. Normalizing unmanned aircraft systems beyond visual line of sight operations. Fed. Regist. 2025, 90, 38212–38391. [Google Scholar]
  71. Zahn, D.; Ginn, S.; Eggum, S.; Harris, R.J. NASA Advanced Air Mobility (AAM) Project National Campaign: Development of Airspace Operations, Infrastructure and Data; Report No. AAM-NC-069-001; NASA Armstrong Flight Research Center: Edwards, CA, USA, 2023. Available online: https://ntrs.nasa.gov/citations/20230011793 (accessed on 15 December 2025).
  72. Federal Aviation Administration. UTM Field Test (UFT) Final Report, 1st ed.; Federal Aviation Administration: Washington, DC, USA, 2023. Available online: https://www.faa.gov/uas/advanced_operations/traffic_management/UFT-Final-Report.pdf (accessed on 28 November 2025).
  73. Kopyt, A.; Dylicki, S. Urban air mobility vertiport capacity simulation and analysis. Aerospace 2025, 12, 560. [Google Scholar] [CrossRef]
  74. Vitiello, F.; Causa, F.; Opromolla, R.; Fasano, G.; Dolph, C.; Ferrante, T.; Ippolito, C. Demonstration of data processing and fusion from distributed radars for advanced air mobility surveillance. J. Aerosp. Inf. Syst. 2025, 22, 510–522. [Google Scholar] [CrossRef]
  75. Veneruso, P.; Manica, L.; Miccio, E.; Opromolla, R.; Tiana, C.; Gentile, G.; Fasano, G. FMCW radar-aided navigation for unmanned aircraft approach and landing in AAM scenarios: System requirements and processing pipeline. Sensors 2025, 25, 2429. [Google Scholar] [CrossRef]
  76. Dulia, E.; Shihab, S. Designing a surveillance sensor network with information clearinghouse for advanced air mobility. Sensors 2024, 24, 803. [Google Scholar] [CrossRef]
  77. Ruseno, N.; Lin, C.-Y. Development of UTM monitoring system based on network remote ID with inverted teardrop detection algorithm. Unmanned Syst. 2025, 13, 105–120. [Google Scholar] [CrossRef]
Figure 1. The hierarchical process of aerial object tracking from the literature.
Figure 2. Systematic Review Workflow Aligned with the PRISMA 2020 Framework.
Figure 3. PRISMA flow diagram adapted from PRISMA 2020 guidelines [24].
Figure 4. Distribution of included studies by year and country (2016–2025).
Figure 5. Operationalization of the Swiss Cheese Model for sUAS Surveillance (Adapted from Reason, 1997 [62]).
Figure 6. Mapping sensor workflows to Endsley’s (1995) levels of situational awareness [63].
Table 1. Review comparison: past reviews vs. current review.
Systematic Review, YearPRISMA *Airport FocusRadarRF/ADS-BEO/IRAcousticMulti-Sensor FusionSafety Theory AlignmentFixed-Based System Focus
Al-Qubaydhi et al., 2024 [13]
Bisio et al., 2024 [14]
Coluccia et al., 2020 [15]
de Macedo et al., 2025 [16]
Rahman et al., 2024 [17]
Samaras et al., 2019 [18]
Seidaliyeva et al., 2024 [19]
Seidaliyeva et al., 2025 [20]
W. S. Chen et al., 2022 [21]
Yang et al., 2024 [22]
Z. Liu et al., 2024 [23]
Current Review (2025)
Note. * PRISMA promotes methodological rigor, reproducibility, and transparent reporting in systematic reviews.
Table 2. Summary of data items extracted from included studies.
CategoryItems Extracted
Technology Type (RQ1)Radar Systems, RF-based Detection, Acoustic Monitoring, Visual/Electro-optical Systems, Sensor Fusion/Multimodal
Performance Metrics (RQ2)Detection Range, Accuracy, Precision, Recall, F1, Inference Time, Detection Time, Localization Certainty, IoU
Advantages & Limitations (RQ1)Strengths and Weaknesses
Challenges (RQ2)Differentiating UAS from birds, environmental conditions/clutter/noise, sensor limitations, real-time processing demands, data availability/quality
Opportunities (RQ2)AI/ML/DL Integration, Multi-sensor Fusion, Improved Data Processing/Feature Extraction, Enhanced Datasets, Real-time/Cost-effective Solutions
Table 3. List of the included studies.
# | Authors, Year | Airport/Simulation | Target Combination | Sensor modality columns in the original table: Acoustic; Optical; ADS-B; Passive RF; Passive Radar; WX/Doppler; Avian Radar; Airfield Radar; Sensor Fusion
1 | Ali & Nathwani, 2024 | Airport (simulation) | UAS + Birds + Aircraft
2 | Ananenkov et al., 2018 | Voskresensk airfield | UAS only
3 | Case & Hupy, 2025 | LAF, IND, SAV | UAS + Aircraft
4 | Chervoniak et al., 2020 | Airport (general) | UAS + Aircraft
5 | Coombes et al., 2016 | EGBW (simulation) | UAS + Aircraft
6 | De Luca et al., 2025 | LIBG: ICAO | UAS + Aircraft
7 | Gauthreaux Jr et al., 2019 | KDFW | Birds only
8 | Gerringer et al., 2016 | KHUF | Birds only
9 | Gong et al., 2024 | ZHHH: ICAO | Birds + UAS + Wake
10 | Gradolewski et al., 2021 | EPLL: ICAO | Birds (+UAS surrogates)
11 | Hale & Stanley, 2017 | KPHL (simulation) | Birds only
12 | Heidger et al., 2021 | EDDF, EDDM: ICAO | UAS (+Birds interference) *
13 | Holt et al., 2024 | EGTC: ICAO | UAS + Aircraft
14 | J. Liu et al., 2021 | ZGBH: ICAO | UAS + Birds + Weather
15 | J. Liu et al., 2024 | ZGBH: ICAO | Birds only
16 | Jurányi et al., 2020 | Closed Airstrip | UAS + Surface/Maritime
17 | Krasnenko et al., 2023 | UNTT: ICAO | Birds only
18 | L. Chen et al., 2025 | ZSPD: ICAO | UAS + Birds
19 | Lopez-Lago et al., 2017 | Airport (simulation) | Birds + Aircraft
20 | Martelli et al., 2019 | LIRE: ICAO | UAS + Aircraft
21 | Martelli et al., 2020a | LIRE: ICAO | UAS + Aircraft
22 | Martelli et al., 2020b | LIRE: ICAO | Small Aircraft + Birds
23 | Mott et al., 2020 | KMLB | UAS + Aircraft
24 | Phillips et al., 2018 | KORD | Birds only
25 | Pothana et al., 2025 | LAX, SMF, SMX | Aircraft only
26 | Rollo et al., 2020 | LKPR (simulation) | UAS only
27 | S. Liu et al., 2022 | Airport in Tianjin | UAS only
28 | Stamm et al., 2018 | KSGH (simulation) | UAS + Aircraft
29 | Thai et al., 2019 | KJYO (simulation) | UAS + Aircraft
30 | W. S. Chen et al., 2018 | ZGBH: ICAO | Birds only *
31 | W. S. Chen et al., 2019 | Airport (simulation) | UAS + Birds
32 | W. S. Chen et al., 2021 | Airport SW, NE China | Birds only
33 | Wang et al., 2023 | YSSY: ICAO | Aircraft only
34 | Washburn et al., 2022 | KORD | Birds only
35 | Xu et al., 2023 | Airport (NE China) | Birds + Bats
36 | Xu et al., 2024 | Airport (general) | Birds only
21085321179
Note. * Also includes infrared camera. References: Ali and Nathwani (2024) [26]; Ananenkov et al. (2018) [27]; Case and Hupy (2025) [28]; Chervoniak et al. (2020) [29]; Coombes et al. (2016) [30]; De Luca et al. (2025) [31]; Gauthreaux Jr. et al. (2019) [32]; Gerringer et al. (2016) [33]; Gong et al. (2024) [34]; Gradolewski et al. (2021) [35]; Hale and Stanley (2017) [36]; Heidger et al. (2021) [37]; Holt et al. (2024) [38]; J. Liu et al. (2021) [39]; J. Liu et al. (2024) [40]; Juranyi et al. (2020) [41]; Krasnenko et al. (2023) [42]; L. Chen et al. (2025) [43]; Lopez-Lago et al. (2017) [44]; Martelli et al. (2019) [45]; Martelli et al. (2020) [46]; Martelli et al. (2020) [47]; Mott et al. (2020) [48]; Phillips et al. (2018) [49]; Pothana et al. (2025) [50]; Rollo et al. (2020) [51]; S. Liu et al. (2022) [52]; Stamm et al. (2018) [53]; Thai et al. (2019) [54]; W. S. Chen et al. (2018) [55]; W. S. Chen et al. (2019) [56]; W. S. Chen et al. (2021) [57]; Wang et al. (2023) [58]; Washburn et al. (2022) [59]; Xu et al. (2023) [60]; and Xu et al. (2024) [61].
Table 4. JBI risk of bias assessment.

| Criteria for Risk of Bias Assessment | Yes n/N | Yes % |
|---|---|---|
| (1) Inclusion criteria defined | 33/36 | 91.7 |
| (2) Subjects and setting described | 34/36 | 94.4 |
| (3) Exposure measured validly | 20/29 | 69.0 |
| (4) Outcome measured by standard criteria | 27/36 | 75.0 |
| (5) Confounders identified | 21/32 | 65.6 |
| (6) Strategies to address confounding | 17/33 | 51.5 |
| (7) Outcomes measured validly and reliably | 28/35 | 80.0 |
| (8) Appropriate statistical analysis | 34/36 | 94.4 |
Note. Details of the assessment are provided in Appendix A, Table A1, where the column headers Q1–Q8 correspond directly to the eight appraisal criteria listed in Table 4.
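The "Yes %" column of Table 4 is simply each "Yes" count divided by the number of applicable studies, rounded to one decimal place; a short sketch makes the arithmetic explicit:

```python
# Recomputing the "Yes %" column of Table 4 from the Yes n/N counts.
jbi_counts = {
    "Q1 Inclusion criteria defined": (33, 36),
    "Q2 Subjects and setting described": (34, 36),
    "Q3 Exposure measured validly": (20, 29),
    "Q4 Outcome measured by standard criteria": (27, 36),
    "Q5 Confounders identified": (21, 32),
    "Q6 Strategies to address confounding": (17, 33),
    "Q7 Outcomes measured validly and reliably": (28, 35),
    "Q8 Appropriate statistical analysis": (34, 36),
}
percentages = {q: round(100 * yes / total, 1)
               for q, (yes, total) in jbi_counts.items()}
# e.g., percentages["Q6 Strategies to address confounding"] == 51.5
```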
Table 5. Comparative analysis of aerial object detection sensor types at airports.

| Sensor/Technology | SA Level | Role (Swiss Cheese) | Strengths | Limitations | Operational Implication | Refs. |
|---|---|---|---|---|---|---|
| Radar: millimeter-wave (mmW) radar, phased-array radar (L/S/X-band), holographic Doppler | Primarily Level 1; supports Level 2 via tracking and signatures | Outer barrier; detects LSS objects | Long-range, all-weather, 24/7 autonomous detection/tracking; micro-Doppler for classification | Small-RCS LSS target detection, low-altitude clutter interference, difficulty differentiating birds from drones, degraded micro-Doppler performance at long range | Saturates operators with false positives, erodes trust, and increases cognitive workload | [26,27,32,34,35,39,43,45,49,56,58,59] |
| Visual: electro-optical/infrared (EO/IR) | Level 1 (perception); Level 2 (comprehension via classification) | Visual confirmation barrier | Visual confirmation, precise localization, and object identification using high-resolution cameras/deep learning | Weather- and light-dependent (fog, rain, night), requires line of sight, limited range for small targets, high computational demand | Robust computer-vision algorithms are still needed | [26,31,35,41,52,54,58] |
| RF/Cooperative: RF monitoring, ADS-B, Remote ID, AeroScope, Passive Coherent Location radar | Level 1 (perception); Level 2 (comprehension via ID) | Identity barrier | Real-time identification and position for cooperative aircraft/UAS (ADS-B, AeroScope); critical for ATM/UTM | Ineffective for non-cooperative/autonomous UAS, manufacturer-specific, low-altitude coverage gaps, susceptible to interference | Necessitates universal Remote ID compliance for effectiveness | [26,27,28,50,53] |
| Acoustic: acoustic sensors/arrays | Level 1 (perception); Level 2 (comprehension via signature analysis) | Noise-signature confirmation barrier | Passive, cost-effective, lightweight, weather-resistant UAS detection using sound signatures/ML | Very short range, severely constrained by high ambient airport noise, high computational demand for deep networks | Requires robust acoustic algorithms to function reliably in all conditions | [26,29,37,51] |
| Sensor Fusion: MSDF (radar + RF + EO/IR fusion) | Level 2 (comprehension); enables Level 3 when coupled with predictive models | Integrating barrier | Overcomes single-sensor limitations; enhances detection reliability, accuracy, and situational awareness; reduces false alarms | Integration complexity, data-quality disparities, lack of unified standards, storage and compute burdens | Moves from disjointed alerts to a cohesive safety shield | [31,34,37,38,46,51,53] |
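The fusion role described above, where an alert is raised only when independent modalities corroborate a track, can be sketched as a simple confirmation rule. The class names, thresholds, and two-sensor criterion below are illustrative assumptions, not a design from any included study:

```python
# Hypothetical sketch of cross-sensor confirmation: a track becomes an
# operational alert only when at least two distinct sensor modalities
# report it, reducing single-sensor false alarms.
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str        # e.g., "radar", "eo_ir", "rf", "acoustic"
    track_id: int
    confidence: float  # sensor-local confidence in [0, 1]

def confirmed_alerts(detections: list,
                     min_sensors: int = 2,
                     min_confidence: float = 0.5) -> set:
    """Return track IDs seen by >= min_sensors distinct modalities."""
    seen = {}
    for d in detections:
        if d.confidence >= min_confidence:
            seen.setdefault(d.track_id, set()).add(d.sensor)
    return {tid for tid, sensors in seen.items() if len(sensors) >= min_sensors}

hits = [Detection("radar", 7, 0.8), Detection("eo_ir", 7, 0.6),
        Detection("radar", 9, 0.9)]   # track 9 is radar-only, so not confirmed
alerts = confirmed_alerts(hits)
```

A real multi-sensor data fusion (MSDF) system would also associate tracks in space and time and weight sensors by reliability; this rule captures only the layered-confirmation principle.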
Table 6. Synthesis of ATM/UTM integration approaches and platforms.

| System | Features | Key Components | Benefits for Airport Operations | Ref. |
|---|---|---|---|---|
| AcrOSS Platform | Holistic UTM integration | Three-layered architecture (ATM, USS, UAS), Drone Boxes, N&A service, simulation tools | Manages UAS operations in critical areas, clear communication protocols, handles contingencies | [31] |
| Drone Detection System (DDS) | Multi-sensor system for rogue targets | Central Multi-Sensor Data Fusion (MSDF), radar, RF detection, EO/IR sensors | Detects, tracks, displays, and alerts for UAS; multiple sensors are needed for full coverage | [37] |
| Ground-Based Detect and Avoid (GBDAA) | Multi-sensor data fusion | Primary/secondary radars, ADS-B, multilateration, datalink from Ground Control Stations (GCS) | Provides an integrated air picture, enhances SA, offers proximity alerts | [53] |
| Remote ID | FAA standard for UAS | Sensors and receivers (expected to use infrastructure similar to ADS-B) | Provides an additional data source for UAS tracking; the future challenge is integrating it with existing systems | [28] |
| Geographic Information Systems (GIS) | Spatial data contextualization | Multi-layered mapping, quantitative spatial analysis, comprehensive data aggregation | Improves risk management, identifies spatial patterns of incidents, visualizes complex interactions | [28] |
| AI Integration | Predictive analytics & simulation | Trained AI models (e.g., RNNs), digital twin of airspace, ADS-B, historical radar data | Forecasts flight patterns, identifies non-standard tracks, enables dynamic scheduling, refines ATC focus | [50] |
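The integration approaches above all presuppose a standardized data-exchange record that a detection system can publish to ATM/UTM consumers. As a hedged sketch, with field names that are illustrative rather than drawn from any published standard, such a track message might look like:

```python
# Hypothetical track-exchange record for an airport surveillance -> ATM/UTM
# data feed. Field names and structure are assumptions for illustration;
# they do not represent ASTM Remote ID, ASTERIX, or any other real standard.
import json
from dataclasses import dataclass, asdict

@dataclass
class TrackReport:
    track_id: str
    source: str           # "radar", "rf", "eo_ir", "acoustic", or "fused"
    latitude: float
    longitude: float
    altitude_m: float
    classification: str   # "uas", "bird", "aircraft", "unknown"
    timestamp_utc: str    # ISO 8601

report = TrackReport("TRK-0042", "fused", 28.08, -80.65, 120.0,
                     "uas", "2025-12-05T14:30:00Z")
message = json.dumps(asdict(report))   # serialized for the ATC/UTM consumer
```

Serializing fused tracks into one shared schema is what lets the ATC tower, a UTM service supplier, and a coordination center consume the same air picture.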
Table 7. Stakeholder-specific implications.

Practitioners (Airports, Integrators, UAS Operators):
Multi-sensor deployment:
  • Layered radar/EO/RF/acoustic coverage.
  • Slew-to-cue workflow where one sensor flags, another confirms, and cameras reposition automatically.
  • Operators receive a single fused cue rather than isolated sensor hits.
Procedures & protocols:
  • Operational procedures rather than just technical documentation.
  • Written alert levels and pausing criteria.
  • Integration with ATC procedures.
Human factors & training:
  • Human-centered HMI design.
  • Usability testing with SUS, NASA-TLX, SART.
  • Cross-cueing & SA training so operators judge whether an alert affects active traffic.
Risk-based deployment & cost:
  • Match the sensing stack to the airport category.
  • Modular, upgradeable installations suited to regional airports.
  • Pre-agreed handoff procedures with police & wildlife teams.
AAM & vertiport readiness:
  • Interoperable radar/EO/RF feeds connected to ATC/UTM displays.
  • Pre-arranged handoffs to first responders & wildlife units [65].
  • Alignment with FAA EB-105A for vertiport operations & siting [67,68].

Regulators (Authorities, Standards Bodies):
Cross-sensor confirmation rules:
  • Require sensor-to-sensor verification before operational restrictions.
  • Field inspections to verify accuracy & calibration.
  • Implement multiple-defenses principles to reduce false positives.
Regulatory frameworks & standards:
  • Set performance requirements, latency thresholds, & minimum detection/confirmation rates.
  • Define ATM–UTM interoperability & risk-based surveillance tiers [65].
  • Standardize incident-reporting formats & retention rules.
Oversight, audits, & learning:
  • False-alarm tracking & periodic audits.
  • Central incident-reporting database.
  • Clear audit trails for RF/EO captures to avoid scope creep & maintain public trust.
Governance, privacy, Remote ID:
  • Data-governance and privacy rules.
  • Strict Remote ID enforcement & standardized data-sharing protocols.
  • ICAO/FAA interoperability & privacy guidance.
AAM readiness:
  • Certification pathways for novel sensors, including mmW radar & LiDAR [19].
  • Standardized latency, quality, & failover expectations for AAM–UTM integration.

Academic Researchers (Safety, Tech, Data):
Open multimodal datasets:
  • Airport-grade radar, RF, EO/IR, & acoustic datasets with precise labels.
  • Include clutter, weather variation, ATC noise, & rare events [19].
Methods & benchmarks:
  • Bird vs. UAS classification models.
  • Metrics for PD, FAR, latency, SA.
  • Scenario libraries for runway, taxiway, & vertiport environments.
Human–automation studies:
  • HOOTL validation for 24/7 automation.
  • Workload–safety trade-off studies [66].
  • Cognitive-load effects during high-alert periods.
System-of-systems testbeds:
  • Digital twins for AAM & vertiport layouts.
  • Synthetic data for rare edge cases.
  • Cyber-resilient, standards-based data-exchange frameworks [66].
Future communications & sensing:
  • Spectrum-aware RF/5G/JCS sensing.
  • Living-lab deployments (BEYOND, BVLOS) for certification-grade evidence.
  • Adversarial modeling & counter-spoofing research.
Table 8. Risk-based operational categories and surveillance tiers.

| Category | Tier 1 | Tier 2 | Tier 3 |
|---|---|---|---|
| Risk Level/Site Type | Low-risk sites (rural & uncontrolled) | Moderate-risk facilities (suburban/regional airports) | High-risk/high-density nodes (major hubs & urban vertiports) |
| FAA Category | 1–2 (remote/sparsely populated rural) | 3–4 (suburban, light urban) | 5 (dense urban, major hubs) |
| Airspace | Class G, E | Class D, E, under a Class B shelf | Class B, C |
| Surveillance | Basic cooperative tracking, Remote ID, see-and-avoid | Mixed sensor strategy, strategic deconfliction via UTM, ADSP required | Comprehensive, layered, multi-sensor surveillance; mandatory RID and onboard DAA required |
| Sensor Mix | Minimal: Remote ID (RID) receiver(s), onboard GPS, lightweight detect-and-avoid (DAA), visual observers, or onboard EO/IR DAA | Cooperative (ADS-B receivers, electronic conspicuity devices), selective non-cooperative (medium-range radar, optical detection), RID receivers | High-end primary/4D radars, ADS-B/transponder receivers, wide-area RID capture, acoustic/visual systems, multilateration, LiDAR, IR cameras |
| UTM/ATC Integration | Full UTM integration optional; simple flight rules; traditional ATC uninvolved | Moderate integration; drones file strategic flight plans; data may feed the ATC tower or a coordination center | Highest priority; certified UTM/ADSP platforms; real-time conformance monitoring; ATC–UTM interoperability |
| Example | Small GA airports in rural regions, isolated vertiport sites | Busy regional airports, municipal airports near smaller cities, suburban vertiports, Project ATLAS in NC | Large international airports, urban vertiports, Memphis Airport, North Dakota Vantis network |

Note. Sources include FAA, 2024 [67]; FAA, 2025 [69,70].
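The tiering logic in Table 8 amounts to a decision rule mapping site risk factors to a surveillance tier. The function below is an illustrative sketch of that mapping; its inputs and thresholds are assumptions for demonstration, and a real deployment decision would follow the FAA categories and a formal risk assessment:

```python
# Illustrative mapping of risk factors to the surveillance tiers of Table 8.
# Inputs and the decision rule are assumptions, not an FAA procedure.
def surveillance_tier(faa_category: int, dense_urban: bool, major_hub: bool) -> int:
    """Return tier 1-3: higher operational risk selects a higher tier."""
    if faa_category >= 5 or dense_urban or major_hub:
        return 3   # comprehensive multi-sensor fusion, mandatory RID, onboard DAA
    if faa_category >= 3:
        return 2   # mixed cooperative/non-cooperative sensing, UTM deconfliction
    return 1       # Remote ID receivers, see-and-avoid

# Rural GA airport, busy regional airport, major international hub:
tiers = [surveillance_tier(1, False, False),
         surveillance_tier(4, False, False),
         surveillance_tier(5, dense_urban=True, major_hub=True)]
```

The value of making the rule explicit is auditability: an airport operator can show which risk inputs drove a given sensing investment.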
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Samu, J.; Yang, C. Airport Ground-Based Aerial Object Surveillance Technologies for Enhanced Safety: A Systematic Review. Drones 2026, 10, 22. https://doi.org/10.3390/drones10010022
