Article

Cooperative 4D Trajectory Prediction and Conflict Detection in Integrated Airspace

1 Air Traffic Management Institute, Civil Aviation Flight University of China, Guanghan 618307, China
2 AVIC Chengdu Aircraft Design & Research Institute, Chengdu 610091, China
* Author to whom correspondence should be addressed.
Algorithms 2026, 19(1), 32; https://doi.org/10.3390/a19010032
Submission received: 9 November 2025 / Revised: 12 December 2025 / Accepted: 24 December 2025 / Published: 1 January 2026

Abstract

To ensure the flight safety of unmanned aerial vehicles (UAVs) and manage the risks of integrated airspace operations, this study explores and verifies a set of key technologies. For data processing, the density-based spatial clustering of applications with noise (DBSCAN) algorithm is used to preprocess the features of UAV automatic dependent surveillance-broadcast (ADS-B) data, removing noise and outliers of the track data in both the spatial and spatio-temporal dimensions. This purifies the data at the source, significantly improves data quality, standardizes the data features, and provides a reliable, high-quality foundation for subsequent trajectory analysis and prediction. For trajectory prediction, a convolutional neural network-bidirectional gated recurrent unit (CNN-BiGRU) model is constructed, realizing integrated 'prediction-judgment' computation: the model output supports accurate, forward-looking assessment of the conflict situation and conflict severity between any two trajectories, providing direct technical support for trajectory conflict warning. For conflict detection, the model performance and detection effect are verified through simulation experiments. Comparing the model predictions with the real track data confirms that the CNN-BiGRU model calculates the distance between aircraft with high accuracy and reliability. Further verification with the preset conflict detection method shows that there is no conflict risk between the UAV and the manned aircraft in the integrated airspace during the full 800 s of terminal-area flight. In summary, the proposed trajectory prediction model and conflict detection method provide key technical support for building an active and accurate safety management and control system for integrated airspace, and have practical value for improving airspace management efficiency and preventing flight conflicts.

1. Introduction

With the rapid development of UAV technology, industry and academia generally agree that 'integrated operation' of UAVs and manned aircraft in the same airspace will be an important direction for future development [1]. Integrated (fusion) operation means that UAVs and manned aircraft fly together in the same airspace, with safe and reliable operation ensured through effective airspace management, flight control, and information sharing systems. With the rapid growth of China's UAV industry and the continuous expansion of application scenarios, demand for UAVs and manned aircraft to share the same airspace and airport facilities is gradually emerging. Although low-altitude integrated operation has clear potential advantages, such as improving airspace utilization and promoting the commercial application of UAVs, it also faces obvious difficulties and challenges [2]. The integration of manned and unmanned aircraft is no longer limited to flight training and testing; it now extends to logistics, agriculture, inspection, and other industries. However, as the number of low-altitude aircraft increases, coordination between aircraft becomes more complicated and the risk of flight conflicts grows. Existing conflict detection technology is mainly based on the distance between aircraft or the time to future conflict points. With the expected explosive growth of low-altitude air traffic, the potential safety hazards of low-altitude operations are becoming prominent [3].
The development of efficient trajectory prediction technology conforms to the concepts of 'smart civil aviation' and 'green aviation'; it supports the construction of the next-generation air transportation system and the sustainable development of Unmanned Aerial Systems (UAS) and Urban Air Mobility (UAM), promotes productivity and socio-economic development, and facilitates global political, economic, and cultural exchange [4]. In the fusion-airspace flight environment, accurate 4D trajectory prediction of UAVs is of great significance for conflict detection. A 4D trajectory adds the time dimension to the three-dimensional position (longitude, latitude, altitude) to form the complete flight path of an aircraft. Accurate 4D trajectory prediction aims to achieve minimum deviation from the planned route and a globally optimal flight trajectory [5]. Conflict detection judges whether a flight conflict with an obstacle exists by accurately predicting the trajectory, so that corresponding countermeasures can be taken in advance to ensure the safety of fusion operations. The risk of flight conflict between UAVs and manned aircraft in the terminal area is increasing, and real-time conflict detection for integrated operations will be a focus of the future fine-grained management stage [6].
With the continuous upgrading of communication, navigation, surveillance, and airborne equipment, and with the development of computer technology and data mining methods, more and more data-driven methods have emerged. For actual conflicts between UAVs and manned aircraft at general aviation airports, combining machine-learning-based trajectory prediction with conflict detection has proven effective [7]. Using machine learning for trajectory prediction transforms the problem into an ordered time-series prediction task, in which a prediction model is established by analyzing the relationships within the trajectory training set. Wu [8] studied a four-dimensional trajectory prediction model based on a Back Propagation (BP) neural network; the predicted four-dimensional trajectory closely matches the real flight data and is highly accurate. Han et al. [9] proposed a short-term real-time trajectory coordinate point prediction method based on the Gated Recurrent Unit (GRU), verifying that the GRU neural network has clear advantages in prediction accuracy and applicability. Shi et al. [10] proposed using the Long Short-Term Memory (LSTM) model for trajectory prediction, accounting for the influence of various factors on track data during real-time flight and thereby achieving higher accuracy for the target trajectory. Zhang et al. [11] proposed a 4D trajectory joint-prediction model based on a genetic algorithm; the model is dynamically weighted by the genetic algorithm, enabling accurate prediction of the UAV's trajectory and of its time of entry into the protected area. Han Ping et al. [12] proposed combining the Density-Based Spatial Clustering of Applications with Noise (DBSCAN) algorithm with the GRU algorithm to predict trajectories in the terminal area; DBSCAN clusters the terminal-area trajectories, providing a new solution to trajectory prediction in complex airspace. Research on flight conflict detection in complex airspace, in China and abroad, falls mainly into geometric-determination and probability-analysis approaches. Liu Yang et al. [13] proposed an approximate analytical algorithm to calculate the short-term instantaneous collision probability between aircraft, achieving sufficient accuracy while meeting real-time requirements. Miao et al. [14] proposed a new low-altitude flight conflict-detection algorithm based on a multi-level spatio-temporal grid index, which transforms the traditional path-by-path traversal calculation into a conflict-state query in a distributed database, enabling conflict detection results to be calculated across different paths. Mou [15] proposed a dynamic distance buffer suitable for conflict detection, which not only reduces the missed-alarm rate but is also better suited to high-density, high-flow traffic. Wang Lili et al. [16] established a speed-obstacle conflict detection model in which, for conflict detection in flight, drones actively avoid manned aircraft that may pose a collision risk. Yue Rentian et al. [17] used an improved Kalman filter to predict trajectories and a three-dimensional deterministic conflict detection model to detect conflicts between two aircraft and provide a conflict resolution strategy. Hu et al. [18] proposed a PPO-based deep reinforcement learning model for UAS obstacle avoidance guidance.
By mapping the continuous state space to continuous heading and speed control via a scenario-state representation and a reward function, the model enables the UAS to reach its destination while bypassing obstacles. Experiments on static and moving obstacles (accounting for environmental uncertainties and safety bounds) show that it achieves a conflict-resolution success rate above 99%, supporting UAM and UTM safety with high computational efficiency. Nan et al. [19] proposed the TRGPS meta-model for digital service network requirements. To resolve goal conflicts arising from service integration during evolution, they developed an automatic detection and resolution approach integrating LLMs and TRGPS, using Chain-of-Thought (CoT) prompting and conflict rules to enable LLM-based conflict reasoning and resolution recommendations. A supporting visualization tool and qualitative experiments validate its effectiveness.
To address the limitations of traditional four-dimensional trajectory prediction models for UAVs, this paper proposes a terminal-area trajectory prediction model based on an improved convolutional neural network. Using historical flight data from UAV fusion operations, dynamic weight adjustment is achieved by improving the convolutional neural network algorithm, enabling accurate, real-time UAV trajectory prediction. To address real-time conflict detection between manned and unmanned aircraft in the fusion airspace, a geometric conflict-detection method is proposed. Its purpose is to comprehensively account for data quality, model robustness, conflict detection, and other factors to improve the safety and operational efficiency of aircraft in integrated operations.

2. Trajectory Prediction Model Construction

2.1. UAV Data Preprocessing

2.1.1. Cubic Spline Interpolation of Track Data

The track data from the UAV is location information with a timestamp, including longitude, latitude, speed, and altitude. The historical flight track dataset used in the fusion operation scenario is obtained in real time from the UAV’s ADS-B. However, due to signal loss, statistical deviation, and other factors in actual operation, ADS-B data often contain noise and missing values.
Assume that A is the set of all UAV trajectories, containing n trajectories:
A = \{A_1, A_2, \ldots, A_n\}
Let the k-th trajectory be denoted by A_k. Assuming that each trajectory consists of n track points, we have:
A_k = \{w_{k1}, w_{k2}, \ldots, w_{kn}\}
where w_{ki} is the i-th track point of A_k. Since each track point is characterized by j features, it can be written as:
w_{ki} = \{z_{ki1}, z_{ki2}, \ldots, z_{kij}\}
where z_{kij} is the j-th feature of track point w_{ki}.
The characteristics of each track feature point are shown in Table 1:
As a smooth and continuous data fitting method, cubic spline interpolation can fit new data points by curves between known data points. The cubic spline interpolation is used to fill the data gap to construct a more complete and reliable track dataset to support subsequent high-precision prediction [20].
Define f(x) as a twice continuously differentiable function on the interval [p, q], and divide [p, q] into n subintervals:
[(x_0, x_1), (x_1, x_2), \ldots, (x_{n-1}, x_n)]
These are defined by n + 1 points, with x_0 = p and x_n = q. The spline can then be written piecewise as:
A(x) = \begin{cases} A_1(x), & x \in [x_0, x_1] \\ \;\vdots \\ A_i(x), & x \in [x_{i-1}, x_i] \\ \;\vdots \\ A_n(x), & x \in [x_{n-1}, x_n] \end{cases}
A(x) is the interpolation function, which constructs a complete, smooth, and reasonable track curve from the existing discrete track points. A(x) must satisfy each of the following four conditions.
(1) At the existing track position points, the interpolation result must equal the original data:
A(x_i) = f(x_i), \quad i = 0, 1, \ldots, n
In ADS-B data interpolation, this ensures that the interpolation curve passes exactly through all known track data points: the known latitude, longitude, altitude, and time values are retained in full and cannot deviate from the measured positions due to smoothing.
(2) On each subinterval [x_{i-1}, x_i] (i = 1, 2, \ldots, n), A(x) is a polynomial of degree at most three (or the zero polynomial):
A_i(x) = a_i + b_i x + c_i x^2 + d_i x^3
Between every two adjacent track points, the interpolation curve is therefore a cubic polynomial determined by four coefficients.
(3) A(x) is twice continuously differentiable, i.e., its first and second derivatives are continuous at the interior knots:
\lim_{x \to x_i} A'(x) = A'(x_i), \quad i = 1, 2, \ldots, n-1
\lim_{x \to x_i} A''(x) = A''(x_i), \quad i = 1, 2, \ldots, n-1
Along the trajectory, the aircraft's velocity (first derivative) and acceleration (second derivative) therefore change continuously and smoothly, consistent with the motion characteristics of a real aircraft.
(4) Because the track data are non-negative, a negative interpolation result is corrected to the minimum value of the data on that interval:
A(x) = \min_{[x_{i-1}, x_i]} f, \quad \text{if } A(x) < 0
That is, if the interpolation result is negative, it is replaced by the minimum value on the interval.
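As an illustration of this preprocessing step, the following sketch fills gaps in an ADS-B track with cubic spline interpolation and applies the non-negativity correction of condition (4). It is a minimal example assuming NumPy, pandas, and SciPy; the column names ('timestamp', 'lon', 'lat', 'alt', 'vel') and the resampling step are illustrative assumptions, not the authors' exact implementation.

```python
import numpy as np
import pandas as pd
from scipy.interpolate import CubicSpline

def fill_track_gaps(df, step_s=1.0):
    """Fill gaps in an ADS-B track with cubic spline interpolation.

    df: DataFrame with a numeric, strictly increasing 'timestamp' column (seconds)
        and feature columns 'lon', 'lat', 'alt', 'vel' (hypothetical names).
    step_s: resampling interval in seconds.
    """
    t = df["timestamp"].to_numpy(dtype=float)
    t_new = np.arange(t[0], t[-1] + step_s, step_s)    # dense, regular time grid
    out = {"timestamp": t_new}
    for col in ["lon", "lat", "alt", "vel"]:
        y = df[col].to_numpy(dtype=float)
        spline = CubicSpline(t, y)                      # passes through all known points
        y_new = spline(t_new)
        # Condition (4): non-negative features must stay non-negative;
        # negative interpolants are corrected to the minimum known value.
        if (y >= 0).all():
            y_new = np.where(y_new < 0, y.min(), y_new)
        out[col] = y_new
    return pd.DataFrame(out)
```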

2.1.2. Track Data Normalization Processing

To keep track data in different dimensions comparable, the track data are normalized, which improves trajectory prediction accuracy. Normalization resolves the scale differences among altitude, latitude, longitude, speed, and heading angle, avoiding the numerical compression caused by their different value ranges and reducing wasted computing resources.
In this paper, the min-max normalization method is used. The processing of x is shown in Equation (11), and y and z are processed in the same way.
X_{norm} = \frac{X - X_{min}}{X_{max} - X_{min}}
where X_{norm} is the normalized data, X is the original data, X_{min} is the minimum value of the track data, and X_{max} is the maximum value of the track data.
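A minimal sketch of this step, assuming the track features are stacked column-wise in a NumPy array (the column order is an assumption for illustration):

```python
import numpy as np

def min_max_normalize(X):
    """Scale each feature column of X (e.g. lon, lat, alt, vel, heading) to [0, 1]."""
    X = np.asarray(X, dtype=float)
    x_min = X.min(axis=0)
    x_max = X.max(axis=0)
    span = np.where(x_max - x_min == 0.0, 1.0, x_max - x_min)  # avoid divide-by-zero
    return (X - x_min) / span, x_min, x_max  # keep min/max to de-normalize predictions later
```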

2.1.3. Track Data Clustering Analysis

To improve the accuracy and reliability of the UAV four-dimensional trajectory (longitude, latitude, altitude, time) prediction model, the density-based DBSCAN clustering algorithm is used to identify and remove noise, mutation points, and abnormal trajectories, effectively eliminating outliers in the spatio-temporal dimension and providing high-quality, consistent input data for the subsequent prediction model. The ε-neighborhood of a point p is defined as:
N_\varepsilon(p) = \{\, q \in A \mid \mathrm{dist}(p, q) \le \varepsilon \,\}
where N_\varepsilon(p) is the set of points in the ε-neighborhood of point p, ε is the neighborhood radius, dist(p, q) is the Euclidean distance function, and A is the entire dataset.
Core point determination:
\mathrm{core\_point}(p) = \mathrm{True}, \quad \text{if } \lvert N_\varepsilon(p) \rvert \ge \mathrm{min\_samples}
Outlier determination:
\mathrm{outlier}(p) = \mathrm{True}, \quad \text{if } p \notin N_\varepsilon(q) \text{ for every core point } q
This paper uses UAV data from integrated operations at Zigong Airport. Based on the spatial resolution of the ADS-B coordinates and typical positioning errors observed in the data, eps = 0.5 is selected; based on the UAV's data update rate and the requirement for continuous trajectory segments, min_samples = 5 is selected. A brief sensitivity analysis confirms the robustness of these parameters, indicating that they achieve the best balance between noise removal and trajectory retention. The processing results for track outliers are shown in Figure 1.
The data sample before preprocessing is shown in Table 2.
In summary, DBSCAN clustering identified and removed 11 outlier points. The preprocessed ADS-B data restore the real state of flight operations with higher fidelity, laying a solid data foundation for the 4D trajectory prediction model and significantly improving the accuracy and reliability of the prediction results.
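A minimal sketch of this outlier-removal step, assuming scikit-learn and the parameter values reported above (the input is assumed to be the normalized spatio-temporal track features from Section 2.1.2):

```python
import numpy as np
from sklearn.cluster import DBSCAN

def remove_track_outliers(points, eps=0.5, min_samples=5):
    """Drop spatio-temporal outliers from an array of normalized track points.

    points: array of shape (n_points, n_features), e.g. (time, lon, lat, alt).
    Returns the retained points and the number of removed outliers.
    """
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(points)
    keep = labels != -1  # DBSCAN marks noise/outlier points with label -1
    return points[keep], int(np.count_nonzero(~keep))
```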

2.2. Prediction Model Establishment

In the low-altitude terminal-area fusion operation environment, accurate 4D trajectory prediction is the basis for detecting flight conflicts. Owing to multi-runway operations and a complex airspace structure, mixed aircraft operations are accompanied by frequent low-altitude wind shear. This study therefore proposes a combined prediction model to improve the comprehensiveness and accuracy of 4D trajectory prediction in the fusion terminal area. The expression for the observed track points is established, with N points, denoted {W…}; the expression for the predicted track points is established, with N′ points, denoted {m…}, and CNN-BiGRU is used as the prediction model. Multi-source, multi-layer, and spatio-temporal features are considered to improve prediction accuracy, while prediction horizon and time-series characteristics are used to strengthen the reliability of the model.
The trajectory prediction model consists of three core modules: the data preprocessing module is responsible for cleaning and calibrating the ADS-B raw data; the feature extraction module uses a one-dimensional CNN to capture track space features; the time series prediction module uses a Bi-GRU network to learn complex time dynamic relationships, and finally outputs accurate four-dimensional trajectory prediction results. The structure of the CNN-BiGRU model is shown in Figure 2.
Through analysis and preprocessing, the characteristics of the track position point of the effective track data at time i are as follows:
A i = { t , l o n , l a t , a l t , v e l , h }
Here, at time i, track point A_i contains the time feature t and the position-related features (lon, lat, alt, etc.), and time-series tensors are used as the input format. To strengthen computational capability and accuracy, consecutive track points are used to predict the next target track point, and the number of consecutive track points is set to 6.
Rectified Linear Unit (ReLU) is an activation function used in neural networks and deep learning to introduce nonlinearity into the model. In terms of activation function, the use of ReLU can simplify the calculation and reduce the cost. The ReLU activation function formula is as follows:
f(x) = \mathrm{ReLU}(x) = \begin{cases} x, & x \ge 0 \\ 0, & x < 0 \end{cases}
Through the ReLU function, if the input x is greater than or equal to 0, the output is x ; if it is less than 0, the output is 0. This formula ensures that the positive input is preserved and the negative input is mapped to zero.
To reduce the error of the convolution layer, decrease the data volume, and enhance the capability and efficiency of spatial feature extraction, a pooling layer is used. In addition, to control overfitting, a dropout layer, which randomly resets outputs or weights, is used to reduce the co-dependence between neurons. A multi-layer bidirectional gated recurrent network then integrates the various local features and strengthens the stability of the predicted position, time, and altitude of the next target track point at any moment.
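To make the architecture concrete, the sketch below assembles a CNN-BiGRU of the kind described in this section using Keras. The 6-point, 6-feature input window and the 4-dimensional output (time, longitude, latitude, altitude) follow the text; the layer widths, dropout rate, and optimizer are illustrative assumptions rather than the authors' exact configuration.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def build_cnn_bigru(window=6, n_features=6, n_outputs=4):
    """1D CNN extracts local spatial features of the track window, a stacked
    bidirectional GRU learns the temporal dynamics, and a dense head outputs
    the next 4D track point (time, lon, lat, alt)."""
    model = models.Sequential([
        layers.Input(shape=(window, n_features)),
        layers.Conv1D(64, kernel_size=3, padding="same", activation="relu"),
        layers.MaxPooling1D(pool_size=2),             # pooling: compress, keep salient features
        layers.Dropout(0.2),                          # dropout: control overfitting
        layers.Bidirectional(layers.GRU(64, return_sequences=True)),
        layers.Bidirectional(layers.GRU(32)),         # BiGRU: past and future context
        layers.Dense(n_outputs),                      # next 4D track point
    ])
    model.compile(optimizer="adam", loss="mse", metrics=["mae"])
    return model
```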

2.3. Conflict Detection Method

Flight conflict detection is the basis of flight trajectory planning and flight safety control. The increasing number of aircraft is challenging the timeliness and accuracy of flight conflict detection [21]. Considering the safety range of UAVs and manned aircraft, the geometric optimization of the velocity obstacle model is performed using physical knowledge of relative motion and coordinate system transformations, and the judgment conditions for flight conflict are analyzed to enable effective conflict detection [22]. ‘Safe distance’ needs to be maintained during fusion airspace flights, particularly in the terminal area, which includes both horizontal and vertical separations. The purpose of setting a flight interval in the flight phase is to prevent flight conflicts, ensure flight safety, and optimize the utilization of airspace resources.
The UAV is treated as a particle, and a composite conflict-detection safety area is constructed by superimposing the UAV conflict-detection safety area and the manned (general aviation) aircraft detection protection area. Protected-area models for aircraft include spheres, cylinders, and cuboids. Because the sphere has a relatively low space occupancy, the UAV detection area is defined as a sphere with radius r, the manned aircraft detection protection area is a cylinder with radius H_c and height 2V_c, and the red warning area is the conflict detection area, as shown in Figure 3.
The track point coordinates are transformed into the WGS-84 space coordinate system, and the horizontal distance between the manned aircraft and the UAV is calculated, as shown in Figure 4. The horizontal distance function can be expressed as follows:
\cos\theta = \frac{\vec{d}_{AB} \cdot \vec{v}_{rh}}{\lvert \vec{d}_{AB} \rvert \, \lvert \vec{v}_{rh} \rvert}
D_h = \lvert \vec{d}_{AB} \rvert \sqrt{1 - \cos^2\theta}
In the X-Z vertical plane, let the coordinates of manned aircraft A be (x_1, z_1) and those of UAV B be (x_2, z_2). Relative to manned aircraft A, the slope of the line along which the trajectory of UAV B lies is shown in Equation (17). The vertical distance function can be expressed as follows (Figure 5):
k = \frac{v_{2rv} - v_{1rv}}{v_{2rh}\cos\varphi_2 - v_{1rh}\cos\varphi_1}
D_v = k \,(x_1 - x_2) + z_1 - z_2
The aircraft conflict detection function can be expressed as follows:
I = \bigvee_{i=1}^{n} \left[\, D_h\big(p_i(A), p_i(B)\big) < \beta\,\delta_h \ \text{and}\ D_v\big(p_i(A), p_i(B)\big) < \beta\,\delta_v \,\right]
Here, D_h and D_v are the horizontal and vertical distance functions, respectively; p_i(A) and p_i(B) are the latitude and longitude positions of manned aircraft A and UAV B at time i; δ_h and δ_v are the horizontal and vertical separation standards set by the terminal-area control center; and β is a coefficient set to meet different early-warning needs. An output of I = 1 indicates that a potential conflict is detected and the early-warning mechanism needs to be triggered; an output of I = 0 indicates that the safety separation requirements are met and no warning is required.
By integrating the trajectory distance detection function, the early warning of flight conflict is realized in the space-time dimension. If the space-time distance of the predicted trajectory is lower than the set safety interval threshold, the system will automatically generate warning information and recommend the optimal avoidance strategy to ensure flight safety.
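A minimal sketch of this detection step, assuming the horizontal and vertical distance series for each predicted time step have already been computed, and using the separation standards and β coefficient described in Section 4 as defaults:

```python
def detect_conflict(d_h, d_v, delta_h=3000.0, delta_v=150.0, beta=1.0):
    """Evaluate the conflict indicator I over a predicted trajectory pair.

    d_h, d_v: sequences (same length) of horizontal and vertical distances, in metres,
              between the two aircraft at each predicted time step.
    Returns (I, first_conflict_index): I = 1 if any step violates both thresholds.
    """
    for i, (dh, dv) in enumerate(zip(d_h, d_v)):
        if dh < beta * delta_h and dv < beta * delta_v:
            return 1, i      # potential conflict: trigger the early-warning mechanism
    return 0, None           # safe separation maintained; no warning required
```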

3. Simulation Experiment

The simulation experiment reproduces measured operational data from fusion operations in the terminal area of Zigong Airport. Part of the UAV's approach and arrival track data is selected for the 4D trajectory prediction experiment and analysis, and conflict detection is studied for different aircraft operating on the same route under the prescribed flight separation. The overall simulation workflow is shown in Figure 6.
In the performance evaluation of regression problems, mean absolute percentage error (MAPE) and root mean square error (RMSE) are widely used as two core indicators. MAPE eliminates the influence of the absolute dimension of data in the form of relative error percentage, which is especially suitable for the comparison of the prediction quality of data at different scales. The RMSE effectively amplifies the contribution of the significant error by calculating the square root of the deviation between the predicted value and the true value to more sensitively reflect the accuracy level of the prediction model. In this study, these two indicators with complementary characteristics were used to systematically verify the prediction efficiency of the CNN-BiGRU hybrid model. The calculation formula of the above indicators is as follows:
\mathrm{RMSE} = \sqrt{\frac{1}{n}\sum_{i=1}^{n} (o_i - m_i)^2}
\mathrm{MAPE} = \frac{1}{n}\sum_{i=1}^{n} \left| \frac{o_i - m_i}{o_i} \right| \times 100\%
o_i is the predicted value of the i-th track point, and m_i is the real value of the i-th track point. The optimization goal of the model is to minimize these evaluation indices, whose values decrease as the prediction error decreases. Firstly, the preprocessed track data are defined as features and labels. The dataset is then divided into a 70% training set and a 30% test set, with 10% of the training set reserved as a validation set. Finally, in order to effectively reduce the error, the single-step prediction input structure described in Figure 7 is used to construct the model input.
As shown in Figure 7, the input of the model is the complete flight data of six consecutive time points, and the historical sequence is used to predict the time, longitude, latitude, and height of the track position point at the next time point. Each input sample is thus constructed into a matrix of six rows and six columns.
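A minimal sketch of this sample construction and of the RMSE/MAPE evaluation; the feature order (time, lon, lat, alt, vel, heading) follows Table 1, and the function names are illustrative assumptions:

```python
import numpy as np

def make_windows(track, window=6):
    """Build single-step samples: each input is `window` consecutive track points
    (all 6 features); the label is the (time, lon, lat, alt) of the next point."""
    X, y = [], []
    for i in range(len(track) - window):
        X.append(track[i:i + window])      # input matrix of shape (6, 6)
        y.append(track[i + window, :4])    # next point: time, lon, lat, alt
    return np.array(X), np.array(y)

def rmse(o, m):
    """Root mean square error between predicted values o and real values m."""
    return float(np.sqrt(np.mean((np.asarray(o) - np.asarray(m)) ** 2)))

def mape(o, m):
    """Mean absolute percentage error, with the predicted value o in the
    denominator as in the MAPE formula above."""
    o, m = np.asarray(o, dtype=float), np.asarray(m, dtype=float)
    return float(np.mean(np.abs((o - m) / o)) * 100.0)
```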
In order to verify the accuracy of the prediction, we compare the output of LSTM, GRU, and other comparison models with the actual trajectory. Figure 8 is the two-dimensional latitude and longitude comparison diagram of the predicted trajectory and the actual trajectory of each model. Figure 9 is a three-dimensional trajectory comparison diagram containing height information, which intuitively shows the prediction performance of the model in space.
Experiments show that the CNN-BiGRU model proposed in this paper has the best performance in 4D trajectory prediction, especially in latitude and longitude prediction. The prediction error is significantly lower than that of the LSTM and GRU baseline models. The core advantage of this model is that it integrates the CNN module, which can effectively extract the local spatio-temporal features in the trajectory, and solve the problem of a lack of dimension and insufficient accuracy of traditional methods. The optimized model structure not only alleviates the over-fitting, but also its bidirectional gated recurrent unit (BiGRU) can capture the historical and future context information at the same time, thus achieving the simultaneous improvement of accuracy and prediction length.
Based on the comparison between the predicted and actual trajectories, we quantitatively evaluate the single-step prediction errors of the four features: time, longitude, latitude, and altitude. The detailed statistics are shown in Table 3. Compared with GRU and LSTM, CNN-BiGRU reduces the MSE by 99.53% and 99.38%, respectively (each reduction is computed as one minus the ratio of the CNN-BiGRU error to the baseline error), reflecting its strong ability to constrain the prediction error. In the mean absolute error (MAE) dimension, the reductions are 93.21% and 93.32%, indicating that CNN-BiGRU controls the absolute track-prediction deviation more stably. The root mean square error (RMSE) reductions are 93.11% and 93.18%, further verifying the prediction accuracy advantage of the model. For the relative error index, mean absolute percentage error (MAPE), the reductions are 94.36% and 93.61%, indicating that CNN-BiGRU maintains better consistency of prediction deviation across different track scenarios. The errors of the proposed model are therefore significantly lower than those of the GRU and LSTM baselines, leading comprehensively on all evaluation indicators. In addition, the GRU model also performs steadily better than LSTM overall. These results show that the model achieves higher accuracy in the 4D trajectory prediction task, with the smallest deviation from the actual values and better robustness and stability when processing time-series data.

4. Example Verification

In order to construct a mixed operation scenario of manned aircraft and UAV and verify the reliability of early warning, the experiment introduces the real manned aircraft data in the terminal area, and generates its future trajectory through a high-precision prediction model to realize the interval calculation and conflict detection with the predicted trajectory of UAV. For the purpose of simplifying the calculation, it is assumed that two aircraft are operating simultaneously. The CNN-BiGRU model with the best predictive performance is selected to produce a high-precision trajectory, which serves as the direct basis for calculating intervals and checking conflicts.
Firstly, the effectiveness of the conflict detection function is verified by calculating the horizontal and vertical separations of the generated trajectories. Then, by comparing the differences between the predicted and real trajectories in the horizontal and vertical distances, the accuracy and reliability of the prediction model are comprehensively evaluated. According to the separation regulations of the terminal-area control operation center, the horizontal separation of aircraft in the terminal area is 3000 m and the vertical separation is 150 m; the proportional coefficient of the early-warning system is set to 1, activating the minimum early-warning mechanism that meets the operating standards. To realize conflict detection between the manned aircraft and the UAV, the actual horizontal distance and vertical height difference between any two track points must be calculated. Since the position data collected by the ADS-B system contain only latitude and longitude coordinates, they cannot be used directly for spatial distance calculation. Therefore, the spherical law of cosines is used to compute the horizontal distance: the latitude and longitude are first converted to radians, and the great-circle distance is then obtained from the spherical angular distance and the Earth's radius. The vertical distance is obtained directly as the absolute value of the altitude difference between the two aircraft at the same time point, |z_{1i4} - z_{2i4}|. The specific steps are as follows:
(1) Firstly, the longitude and latitude values of each track point are multiplied by π/180, respectively, and converted into the radian system.
(2) The standard value of the Earth’s radius R is usually set to 6371.01 km as the reference parameter for the subsequent spherical distance calculation.
(3) Based on the spherical cosine theorem, the spherical angle distance between two points is solved as follows:
\cos \Delta\sigma = \sin\varphi_1 \sin\varphi_2 + \cos\varphi_1 \cos\varphi_2 \cos(\lambda_2 - \lambda_1)
where Δσ denotes the spherical angular distance between the two points, and (φ_1, λ_1) and (φ_2, λ_2) are the latitude and longitude coordinates of the two points, respectively;
(4) By multiplying the spherical angle distance by the radius of the Earth, the actual spatial distance is obtained:
d = R × Δ σ
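A minimal sketch of this distance computation (function names are illustrative; inputs are decimal degrees and metres):

```python
import math

EARTH_RADIUS_M = 6_371_010.0   # 6371.01 km, as in step (2)

def horizontal_distance(lat1, lon1, lat2, lon2):
    """Great-circle (horizontal) distance via the spherical law of cosines."""
    phi1, lam1, phi2, lam2 = map(math.radians, (lat1, lon1, lat2, lon2))       # step (1)
    cos_dsigma = (math.sin(phi1) * math.sin(phi2)
                  + math.cos(phi1) * math.cos(phi2) * math.cos(lam2 - lam1))   # step (3)
    dsigma = math.acos(max(-1.0, min(1.0, cos_dsigma)))  # clamp against round-off
    return EARTH_RADIUS_M * dsigma                        # step (4)

def vertical_distance(alt1_m, alt2_m):
    """Vertical separation: absolute altitude difference at the same time point."""
    return abs(alt1_m - alt2_m)
```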
Based on the processed data, the interval calculation and conflict detection (Formula (25)) are completed. In order to focus on the highest risk period, this study selects the 800-s trajectory segment with the highest degree of proximity between the two aircraft, and outputs the horizontal and vertical distance change comparison chart, aiming to intuitively reveal the potential conflict trend. The experimental results are as follows:
In order to evaluate the effectiveness of the model in security applications, Figure 10 and Figure 11 compare the vertical and horizontal intervals between the predicted trajectory and the actual trajectory. The high overlap of the two curves shows that the model can effectively simulate the real distance change between aircraft. Finally, the preset conflict detection method is used for verification. The prediction results show that the two aircraft can maintain a safe interval during the whole flight in the next 800 s, and successfully pass the verification of safe operation in the terminal area.

5. Conclusions

This study proposes a complete scheme for UAV trajectory prediction and conflict detection, with innovations in three main aspects. Firstly, the DBSCAN clustering method is applied to preprocess the ADS-B data features, effectively eliminating noise and outliers in the spatial and temporal dimensions and improving data quality and feature consistency. Secondly, the CNN-BiGRU deep learning model is constructed, and model comparison verifies its superiority; CNN-BiGRU realizes integrated trajectory 'prediction-judgment' and can accurately judge the conflict situation and conflict severity between any two trajectories. Finally, simulation experiments show that the model calculates the distance between aircraft with high accuracy, and it is verified that no conflict risk arises during the 800 s flight in the integrated-airspace terminal area, comprehensively demonstrating the effectiveness and reliability of the proposed model for ensuring flight safety.

Author Contributions

Conceptualization, X.M. and L.Z.; methodology, X.M.; software, L.Z.; validation, L.Z., J.Z. and Y.W.; formal analysis, J.Z.; investigation, L.Z. and Y.W.; resources, X.M.; data curation, Y.W.; writing—original draft preparation, L.Z.; writing—review and editing, X.M.; visualization, J.Z.; supervision, J.Z.; project administration, X.M.; funding acquisition, X.M. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by the Fundamental Research Funds for the National Key Laboratory of Unmanned Aerial Vehicle Technology (Grant No. WRFX-202502) and by the Science and Technology Program of Xizang Autonomous Region (Grant No. XZ202403ZY0014).

Data Availability Statement

The datasets generated and/or analyzed during the current study are available from the corresponding author on reasonable request.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Liu, Z.X.; Cai, K.; Zhu, Y. Civil unmanned aircraft system operation in national airspace: A survey from Air Navigation Service Provider perspective. Chin. J. Aeronaut. 2021, 34, 200–224. [Google Scholar] [CrossRef]
  2. Zhou, H. Legislative Development and Regulatory Challenges of the Integrated Operation of Civilian Unmanned and Manned Aircraft in China. In Emerging Rights Collection—Research on Emerging Rights in the Context of Artificial Intelligence; International Institute of Air and Space Law, Leiden University: Leiden, The Netherlands, 2024; Volume 2, p. 9. [Google Scholar]
  3. Su, Z.; Zhao, L.; Hao, Z.; Bai, R. Survey of Artificial Intelligence Ensuring eVTOL Flight Safety in the Context of Low-Altitude Economy. Comput. Sci. 2025, 52, 177–189. [Google Scholar]
  4. Chen, S.; Jia, M.; Lin, J.; Jin, S.; Gao, Z.; Wang, Y.; Ma, Z.; Li, Z.; Duan, C.; Li, J. Research progress and prospect of the application of generative model-enabled aircraft technology. Acta Aeronaut. Astronaut. Sinica 2025, 46, 48. [Google Scholar]
  5. Pu, V.; Chen, Z.; Liu, Y.; Geng, X.; Zhu, Y.; Ren, K. Air traffic management technologies for digital low-altitude integrated operations. Aeronaut. J. 2025, 46, 1–23. [Google Scholar]
  6. Chen, Y.; Zhang, J.; Zou, X.; Wu, Q. Civil UAV traffic management architecture and key technologies. Sci. Technol. Eng. 2021, 21, 13221–13237. [Google Scholar]
  7. Cheng, C. Research on Aircraft Collision Avoidance and Trajectory Prediction Technology Based on Machine Learning; Nanjing University of Posts and Telecommunications: Nanjing, China, 2022. [Google Scholar]
  8. Wu, Z.; Tian, S.; Ma, L. A 4D trajectory prediction model based on the BP neural network. J. Intell. Syst. 2019, 29, 1545–1557. [Google Scholar] [CrossRef]
  9. Han, P.; Wang, W.; Shi, Q.; Yang, J. Real-time short-term trajectory prediction based on GRU neural network. In Proceedings of the 2019 IEEE/AIAA 38th Digital Avionics Systems Conference (DASC), San Diego, CA, USA, 8 September 2019. [Google Scholar]
  10. Shi, Q.; Wang, W.; Han, P. Short-term 4D trajectory prediction algorithm based on online-updating LSTM network. J. Signal Process. 2021, 7, 6–74. [Google Scholar]
  11. Zhang, H.; Yan, Y.; Li, S.; Hu, Y.; Liu, H. UAV Behavior-Intention Estimation Method Based on 4-D Flight-Trajectory Prediction. Sustainability 2021, 13, 12528. [Google Scholar] [CrossRef]
  12. Han, P.; Zhang, Q.; Shi, Q.; Zhang, Z. 4D trajectory prediction in terminal area based on DBSCAN-GRU algorithm. Signal Process. 2023, 39, 39–449. [Google Scholar]
  13. Liu, Y.; Xiang, J.; Luo, Z.; Jin, W. Short-term conflict detection algorithm for low-altitude free flight. J. Beijing Univ. Aeronaut. Astronaut. 2017, 43, 1873–1881. [Google Scholar]
  14. Miao, S.; Cheng, C.; Zhai, W.; Ren, F.; Zhang, B.; Li, S.; Zhang, J.; Zhang, H. A Low-Altitude Flight Conflict Detection Algorithm Based on a Multilevel Grid Spatiotemporal Index. ISPRS Int. J. Geo-Inf. 2019, 8, 289. [Google Scholar] [CrossRef]
  15. Mou, L. Conflict Detection and Resolution in Shared Airspace Between UAV and Manned Aircraft; Sichuan University: Chengdu, China, 2021. [Google Scholar]
  16. Wang, L.L.; Min, X.X.; Meng, L.H.; Xu, H.X. Investigation of UAV collision avoidance strategies in fusion airspace with redundant distance parameters. J. Saf. Environ. 2025, 25, 978–987. [Google Scholar]
  17. Yue, R.; Niu, M. Conflict detection method and resolution strategy between large fixed-wing UAV and manned aircraft. China Saf. Prod. Sci. Technol. 2025, 21, 70–78. [Google Scholar]
  18. Hu, J.; Yang, X.; Wang, W.; Wei, P.; Ying, L.; Liu, Y. Obstacle avoidance for UAS in continuous action space using deep reinforcement learning. IEEE Access 2022, 10, 90623–90634. [Google Scholar] [CrossRef]
  19. Nan, S.; Qiao, Y.; Xie, Y.; Zhang, Z.; Luo, Y.; Li, B.; Wang, J. Automatic conflict detection and resolution in digital service network requirement models using large language models. In Service Oriented Computing and Applications; Springer Nature: London, UK, 2025. [Google Scholar] [CrossRef]
  20. Zhu, G.; Li, M.; Wang, C.; Sun, H. Premodulation algorithm for excitation signals of parametric array based on cubic spline. J. Harbin Eng. Univ. 2025, 46, 513–520. [Google Scholar]
  21. Yue, R.; Niu, M. Low Altitude Multi-UAV Conflict Detection and Resolution Strategies Based on Flight Conflict Network. J. Sci. Technol. Eng. 2025, 25, 9631–9639. [Google Scholar]
  22. Gao, Y.; Guo, V.; Chen, J.; Li, G.; Wang, X. Prediction and resolution of conflict risk between UAV and manned aircraft in airspace. J. Saf. Environ. 2022, 22, 3288–3294. [Google Scholar]
Figure 1. DBSCAN track outlier detection.
Figure 2. CNN-BiGRU model structure.
Figure 3. Flight interval protection zone.
Figure 4. Schematic diagram of horizontal conflict detection.
Figure 5. Schematic diagram of vertical conflict detection.
Figure 6. Simulation experiment process.
Figure 7. Trajectory sample split chart.
Figure 8. Comparison of longitude and latitude trajectories.
Figure 9. Comparison of 3D trajectories.
Figure 10. Vertical distance of trajectories.
Figure 11. Horizontal distance of trajectories.
Table 1. Illustration of track features.

Number | Unit | Trait Name | Track Point
z_ki1 | – | time | 6 September 2024 11:12:46
z_ki2 | ° | longitude | 105.3775
z_ki3 | ° | latitude | 31.8531
z_ki4 | ft | altitude | 1239
z_ki5 | kt | velocity | 49
z_ki6 | ° | heading angle | 34

Note: z_kij is the j-th feature of track k at time i in the ADS-B data; 1 ft = 0.3048 m; 1 kt = 1.852 km/h.
Table 2. Outlier track data.

Time | Latitude (°) | Longitude (°) | Altitude (ft) | Velocity (kt) | Heading Angle (°)
10:48:02 | 31.5113 | 105.6769 | 1259.8876 | 46 | 78
10:48:08 | 31.5109 | 105.6788 | 1270.3659 | 53 | 98
10:48:13 | 31.5108 | 105.6806 | 1279.3464 | 130 | 99
10:48:17 | 31.5107 | 105.6824 | 1285.3756 | 55 | 79
10:48:25 | 31.5106 | 105.6841 | 1287 | 57 | 77
10:48:29 | 31.5105 | 105.6859 | 1283.2807 | 67 | 78
10:48:34 | 31.5104 | 105.6876 | 1275.3371 | 115 | 74
10:48:40 | 31.5103 | 105.6894 | 1264.8033 | 56 | 45
11:06:22 | 31.4654 | 108.344 | 1221.8534 | 50 | 33
11:06:26 | 31.4653 | 108.346 | 1218.9998 | 52 | 72
11:06:31 | 31.4653 | 108.3481 | 1216.5666 | 122 | 34
Table 3. Comparison of the total error of multi-prediction models.

Evaluation Index | CNN-BiGRU Model | GRU Model | LSTM Model
MSE | 0.006547 | 1.380708 | 1.104536
MAE | 0.0625 | 0.9210 | 0.9350
RMSE | 0.080911 | 1.175036 | 1.185131
MAPE | 0.022646 | 0.401564 | 0.354406