Prediction of 3D Airspace Occupancy Using Machine Learning
Abstract
1. Introduction
2. Materials and Methods
- In a complementary manner, the DSR approach allowed the predictive model to be conceived as a technological artifact, validated on the basis of its usefulness for anticipating high-occupancy zones and its accuracy in predicting future positions within the three-dimensional airspace [30].
2.1. Problem and Objectives
2.2. Data Acquisition
2.3. Data Processing
2.3.1. Cyclical Encoding
2.3.2. Feature Scaling
2.3.3. Dimensionality Reduction with Principal Component Analysis
2.3.4. Dataset Splitting
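The preprocessing steps in Sections 2.3.1 to 2.3.4 can be sketched together in a short pipeline. This is a minimal NumPy sketch, not the authors' code: the `period=24` hour encoding, the z-score scaler, the SVD-based PCA, and the chronological 80/20 split are illustrative assumptions.

```python
import numpy as np

def cyclical_encode(values, period):
    """Map a cyclic quantity (e.g. hour of day) onto the unit circle,
    so that 23:00 and 00:00 end up close in feature space."""
    angle = 2.0 * np.pi * np.asarray(values, dtype=float) / period
    return np.sin(angle), np.cos(angle)

def standardize(X):
    """Z-score each column (zero mean, unit variance)."""
    return (X - X.mean(axis=0)) / X.std(axis=0)

def pca(X, n_components):
    """Project centered data onto its top principal components via SVD."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

def chronological_split(X, y, train_frac=0.8):
    """Time-ordered split: earlier records train, later records test."""
    cut = int(len(X) * train_frac)
    return X[:cut], X[cut:], y[:cut], y[cut:]
```

Cyclical encoding matters for distance-based models such as KNN: hours 23 and 0 map to nearby points on the unit circle instead of being 23 units apart.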
2.4. Model Training
2.4.1. KNN
- Classification: In classification tasks, the label of a new test instance is assigned based on the most frequent label among the selected neighbors (majority voting). This process ensures that the predicted label reflects the dominant category in the local neighborhood. Formally, the classification function for a test instance x is defined as [41]: ŷ(x) = arg max_c Σ_{x_i ∈ N_k(x)} I(y_i = c), where N_k(x) denotes the k nearest neighbors of x and I(·) is the indicator function.
- Regression: In regression problems (where the target variable is continuous), the output value is calculated as the average of the values of the nearest neighbors. Thus, the prediction for a point x is defined as [41]: ŷ(x) = (1/k) Σ_{x_i ∈ N_k(x)} y_i.
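The neighbor-averaging rule for regression can be written directly in NumPy. This is an illustrative sketch under Euclidean distance, not the implementation used in the study:

```python
import numpy as np

def knn_predict(X_train, y_train, x_query, k=5):
    """k-NN regression: average the targets of the k nearest
    training points under Euclidean distance."""
    dists = np.linalg.norm(X_train - x_query, axis=1)
    nearest = np.argsort(dists)[:k]   # indices of the k closest points
    return y_train[nearest].mean()    # regression: neighbor average
```

A distance-weighted variant would replace the plain mean with a weighted mean whose weights decay with distance.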
2.4.2. XGBoost
- Sequential Ensemble (CART): XGBoost uses an ensemble of decision trees built sequentially under the CART (Classification and Regression Trees) framework. In this approach, each new tree learns from the errors made by previous trees and refines its predictions through a process known as gradient descent. This iterative optimization corrects accumulated errors, thereby improving the model’s accuracy at each step. As more trees are added, the model incrementally minimizes the loss function, such as mean squared error, leading to a better fit to the data [43].
- Tree Depth Control: Unlike Random Forest, where trees grow to their maximum depth, XGBoost allows the user to define a maximum tree depth. This helps to control model complexity and prevents overfitting [43].
- Parallel Processing: XGBoost is designed to take advantage of parallel computing capabilities, enabling highly efficient model training. This is particularly valuable for large datasets, significantly reducing training time [43].
- Regularization: XGBoost incorporates regularization terms that penalize model complexity, helping to mitigate overfitting. This adds an additional balance between accuracy and the model’s generalization capacity [43].
- Handling of Missing Values: The algorithm includes mechanisms to automatically manage missing values in the dataset, directing them to the most appropriate branch in the decision trees, which enhances model accuracy and robustness [43].
- Initial Tree: The process begins with the construction of an initial tree, t_1(x), which provides an initial prediction of the target variable, ŷ. This tree produces a residual, defined as the difference between the actual value and the prediction, r_1 = y − t_1(x);
- Subsequent Tree Construction: A second tree, t_2, is trained to fit the residual errors of the initial tree t_1. The goal is for t_2 to learn the residuals and, when combined with t_1, reduce the overall model error;
- Tree Combination: Trees t_1 and t_2 are combined to form a new model T_2, which reduces the mean squared error compared to t_1. This is expressed as: T_2(x) = t_1(x) + t_2(x);
- Iteration Until Error Minimization: This process continues iteratively until the final model T_M is obtained, which minimizes the error as much as possible. Each iteration adds a new tree t_m that fits the residuals of the previous iteration: T_m(x) = T_{m−1}(x) + t_m(x).
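The residual-fitting loop described in the steps above can be sketched with depth-1 trees (stumps). This is a plain gradient-boosting sketch for squared error, omitting XGBoost's regularization, second-order approximation, and parallelism; the function names are illustrative:

```python
import numpy as np

def fit_stump(X, r):
    """Fit a depth-1 regression tree (stump) to the residuals r by
    exhaustive search over features and thresholds."""
    best = None
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j])[:-1]:   # candidate thresholds
            left = X[:, j] <= t
            pred = np.where(left, r[left].mean(), r[~left].mean())
            sse = ((r - pred) ** 2).sum()
            if best is None or sse < best[0]:
                best = (sse, j, t, r[left].mean(), r[~left].mean())
    _, j, t, vl, vr = best
    return lambda Z: np.where(Z[:, j] <= t, vl, vr)

def boost(X, y, n_trees=50, lr=0.1):
    """Gradient boosting for squared error: each new stump fits the
    current residuals, and predictions accumulate additively."""
    base = y.mean()
    pred = np.full(len(y), base)
    trees = []
    for _ in range(n_trees):
        stump = fit_stump(X, y - pred)   # fit the residuals of the ensemble so far
        pred = pred + lr * stump(X)      # shrunken additive update
        trees.append(stump)
    return lambda Z: base + lr * sum(t(Z) for t in trees)
```

The learning rate `lr` shrinks each tree's contribution, which is why many small corrections are preferred over a few large ones.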
2.4.3. LSTM
- Forget Gate (f_t): This gate determines how much of the previous cell state should be retained. It is mathematically defined as: f_t = σ(W_f · [h_{t−1}, x_t] + b_f);
- Input Gate (i_t): This gate decides what new information will be added to the cell state. It consists of two components: the activation of the input gate and the generation of the candidate cell state (C̃_t), representing new information to potentially be stored. These are computed as follows: i_t = σ(W_i · [h_{t−1}, x_t] + b_i), C̃_t = tanh(W_C · [h_{t−1}, x_t] + b_C);
- Cell State Update (C_t): The new cell state is computed by combining the retained information (regulated by f_t) and the new candidate state (regulated by i_t). The update is performed as: C_t = f_t ⊙ C_{t−1} + i_t ⊙ C̃_t;
- Output Gate (o_t): This gate determines which part of the cell state will be used as the cell's output h_t, which is then passed to the next time step and serves as the current output: o_t = σ(W_o · [h_{t−1}, x_t] + b_o);
- Cell Output (h_t): The output is calculated by applying the tanh function to the updated cell state C_t, modulated by the output gate o_t: h_t = o_t ⊙ tanh(C_t).
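The gate equations above can be sketched as a single NumPy cell step. This illustrative implementation stacks the four weight matrices (W_f, W_i, W_C, W_o) into one matrix `W`; it is not the trained architecture used in the study:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, W, b):
    """One LSTM cell step. W maps the concatenated [h_{t-1}, x_t]
    to the four stacked gate pre-activations."""
    z = W @ np.concatenate([h_prev, x_t]) + b
    n = h_prev.size
    f = sigmoid(z[0*n:1*n])        # forget gate f_t
    i = sigmoid(z[1*n:2*n])        # input gate i_t
    c_tilde = np.tanh(z[2*n:3*n])  # candidate state C~_t
    o = sigmoid(z[3*n:4*n])        # output gate o_t
    c = f * c_prev + i * c_tilde   # cell state update C_t
    h = o * np.tanh(c)             # cell output h_t
    return h, c
```

A full sequence prediction iterates this step over the time axis, carrying (h, c) forward; frameworks such as Keras or PyTorch handle that loop and the gradient computation.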
2.4.4. Random Forest
2.4.5. Model Overview, Strengths and Limitations
- KNN: This is a non-parametric, instance-based algorithm that performs regression by estimating the output of a new observation based on the values of its k nearest neighbors in the training set. It uses distance metrics such as Euclidean or Manhattan distance to identify the closest data points in the feature space, under the assumption that instances located near each other tend to have similar target values. In its weighted versions, closer neighbors have a greater influence on the prediction, which enhances accuracy in dense or heterogeneous regions. Since KNN does not involve a formal training phase, it is simple to implement; however, it can be computationally expensive at prediction time and is sensitive to noise and high-dimensional data.
- Random Forest: This algorithm is an ensemble regression method that constructs multiple decision trees using bootstrap sampling and random feature selection and then aggregates their predictions through averaging. Since each tree is trained on a different subset of the data and considers a random subset of features, the resulting model captures diverse patterns, which enhances generalization and reduces the risk of overfitting. By combining numerous weak learners, Random Forest achieves greater robustness and provides stable predictions even in the presence of noisy or complex datasets. Its main strength lies in its ability to model nonlinear relationships and variable interactions; however, it does not incorporate any inherent mechanism to capture temporal dependencies.
- XGBoost: A regression algorithm that builds decision trees sequentially, where each new tree is trained to correct the residual errors of the previous ones. This gradient boosting process minimizes a loss function, such as mean squared error, through iterative optimization. XGBoost includes advanced features such as regularization to prevent overfitting, parallel processing to speed up computation, and the ability to handle missing values efficiently. As a result, it produces highly accurate models for tabular and heterogeneous data, though it tends to reproduce dominant patterns rather than long-term temporal dynamics.
- LSTM: A type of recurrent neural network specifically designed to capture sequential dependencies in time-series data. Each LSTM cell incorporates three gates: forget, input, and output, which regulate how information is retained, updated, or discarded over time. This mechanism allows the network to preserve relevant signals across time steps and helps mitigate the vanishing gradient problem. In regression tasks, LSTM predicts continuous values by learning long-term temporal patterns. This makes it particularly effective in dynamic environments where future outcomes depend heavily on historical sequences, such as flight trajectories.
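The Random Forest procedure summarized above (bootstrap sampling, random feature selection, averaging) can be sketched with depth-1 trees for brevity. This is a toy NumPy sketch, not the CART-based implementation used in the study:

```python
import numpy as np

def fit_stump(X, y, feat_idx):
    """Best single split restricted to a random subset of features
    (the random-feature-selection step of Random Forest)."""
    best = None
    for j in feat_idx:
        for t in np.unique(X[:, j])[:-1]:   # candidate thresholds
            left = X[:, j] <= t
            sse = (((y[left] - y[left].mean()) ** 2).sum()
                   + ((y[~left] - y[~left].mean()) ** 2).sum())
            if best is None or sse < best[0]:
                best = (sse, j, t, y[left].mean(), y[~left].mean())
    if best is None:                        # constant features: predict the mean
        mu = y.mean()
        return lambda Z: np.full(len(Z), mu)
    _, j, t, vl, vr = best
    return lambda Z: np.where(Z[:, j] <= t, vl, vr)

def random_forest(X, y, n_trees=30, seed=0):
    """Bootstrap-sample the rows, subsample ~sqrt(p) features per tree,
    and average the trees' predictions."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    m = max(1, int(np.sqrt(p)))
    trees = []
    for _ in range(n_trees):
        rows = rng.integers(0, n, n)                  # bootstrap sample
        feats = rng.choice(p, size=m, replace=False)  # random feature subset
        trees.append(fit_stump(X[rows], y[rows], feats))
    return lambda Z: np.mean([t(Z) for t in trees], axis=0)
```

Averaging many decorrelated trees is what gives the ensemble its variance reduction relative to any single tree.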
2.5. Model Performance Evaluation
2.5.1. Mean Absolute Error (MAE)
2.5.2. Root Mean Squared Error (RMSE)
2.5.3. Coefficient of Determination (R2)
2.5.4. Pearson Correlation Coefficient (R)
2.5.5. Mean Absolute Percentage Error (MAPE)
2.5.6. Scatter Index (SI)
2.5.7. Discrepancy Ratio (DR)
2.5.8. Complementary Visualizations
3. Results and Analysis
3.1. Data Preparation
3.2. KNN
3.3. XGBoost
3.4. Random Forest
3.5. LSTM
3.6. Model Comparison
4. Discussion
Limitations of the Study
5. Conclusions
6. Future Work
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
Abbreviations
2D | Two-Dimensional |
3D | Three-Dimensional |
4D | Four-Dimensional |
ADS-B | Automatic Dependent Surveillance-Broadcast |
AI | Artificial Intelligence |
AIP | Aeronautical Information Publication |
ANN | Artificial Neural Network |
ANSP | Air Navigation Service Provider |
BLSTM | Bidirectional LSTM |
CNN | Convolutional Neural Network |
CRISP-DM | Cross-Industry Standard Process for Data Mining |
DR | Discrepancy Ratio |
DSR | Design Science Research |
ELM | Extreme Learning Machines |
FN | False Negatives |
FP | False Positives |
FRA | Free Route Airspace |
FT | Feet |
GANP | Global Air Navigation Plan |
GNSS | Global Navigation Satellite Systems |
HMM | Hidden Markov Models |
ICAO | International Civil Aviation Organization |
KNN | K-Nearest Neighbors |
LGBM | Light Gradient Boosting Machine |
LSTM | Long Short-Term Memory |
MAE | Mean Absolute Error |
MAPE | Mean Absolute Percentage Error |
ML | Machine Learning |
MLP | Multilayer Perceptron |
MSE | Mean Squared Error |
PCA | Principal Component Analysis |
PSO | Particle Swarm Optimization |
R | Pearson Correlation Coefficient |
RMSE | Root Mean Squared Error |
RNAV | Area Navigation |
RNP | Required Navigation Performance |
SI | Scatter Index |
SVM | Support Vector Machines |
TBO | Trajectory-Based Operations |
TN | True Negatives |
TP | True Positives |
XGBoost | Extreme Gradient Boosting |
References
- ICAO. Future of Aviation; ICAO: Montreal, QC, Canada, 2019. [Google Scholar]
- ICAO. Plan Mundial de Navegación Aérea 2013–2028. In Capacidad y Eficiencia, 4th ed.; ICAO: Montreal, QC, Canada, 2013; pp. 8–30. [Google Scholar]
- Lutte, B. ICAO aviation system block upgrades: A method for identifying training needs. Int. J. Aviat. Aeronaut. Aerosp. 2015, 2, 2–16. [Google Scholar] [CrossRef]
- Aerocivil. AIP. Available online: https://www.aerocivil.gov.co/proveedor_servicios/publicaciones/3572/aip-publicacion-de-informacion-aeronautica (accessed on 20 March 2025).
- Aerocivil. RAC 91—Reglas Generales de Vuelo y de Operación; Aerocivil: Bogota, Colombia, 2024. [Google Scholar]
- Simões-Spencer, K. Fuel Consumption Optimization using Neural Networks and Genetic Algorithms. Master’s Thesis, Universidade Tecnica de Lisboa, Lisbon, Portugal, 2011. [Google Scholar]
- Medeiros, D.M.C.; Silva, J.M.R.; Bousson, K. RNAV and RNP AR approach systems: The case for Pico Island airport. Int. J. Aviat. Manag. 2012, 1, 181. [Google Scholar] [CrossRef]
- SESAR. SESAR Joint Undertaking|Background on Single European Sky. Available online: https://www.sesarju.eu/ (accessed on 11 May 2022).
- FAA. Next Generation Air Transportation System (NextGen); Federal Aviation Administration: Washington, DC, USA, 2015; pp. 28–30. [Google Scholar]
- Eurocontrol. Free Route Airspace; Eurocontrol: Brussels, Belgium, 2023. [Google Scholar]
- SkyVector. Flight Planning/Aeronautical Charts, SkyVector. Available online: https://skyvector.com/ (accessed on 15 November 2024).
- Zhang, X.; Zhong, S.; Mahadevan, S. Airport surface movement prediction and safety assessment with spatial–temporal graph convolutional neural network. Transp. Res. Part. C Emerg. Technol. 2022, 144, 103873. [Google Scholar] [CrossRef]
- Ma, L.; Tian, S. A Hybrid CNN-LSTM Model for Aircraft 4D Trajectory Prediction. IEEE Access 2020, 8, 134668–134680. [Google Scholar] [CrossRef]
- Ayhan, S.; Samet, H. Aircraft trajectory prediction made easy with predictive analytics. In Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, Association for Computing Machinery, San Francisco, CA, USA, 13–17 August 2016; pp. 21–30. [Google Scholar] [CrossRef]
- Tran, N.; Nguyen, H.Q.V.; Pham, D.T.; Alam, S. Aircraft Trajectory Prediction with Enriched Intent Using Encoder-Decoder Architecture. IEEE Access 2022, 10, 17881–17896. [Google Scholar] [CrossRef]
- Guan, X.; Lv, R.; Sun, L.; Liu, Y. A study of 4D trajectory prediction based on machine deep learning. In Proceedings of the World Congress on Intelligent Control and Automation (WCICA), Guilin, China, 12–17 June 2016; pp. 24–27. [Google Scholar] [CrossRef]
- Olive, X.; Basora, L. Detection and identification of significant events in historical aircraft trajectory data. Transp. Res. Part. C Emerg. Technol. 2020, 119, 102737. [Google Scholar] [CrossRef]
- Gil, D.; Hernandez-Sabate, A.; Enconniere, J.; Asmayawati, S.; Folch, P.; Borrego-Carazo, J.; Piera, M.À. E-Pilots: A System to Predict Hard Landing during the Approach Phase of Commercial Flights. IEEE Access 2022, 10, 7489–7503. [Google Scholar] [CrossRef]
- Alligier, R.; Gianazza, D.; Durand, N. Machine Learning Applied to Airspeed Prediction During Climb. IEEE Access 2021, 10, 7489–7503. [Google Scholar]
- Alligier, R.; Gianazza, D. Learning aircraft operational factors to improve aircraft climb prediction: A large scale multi-airport study. Transp. Res. Part. C Emerg. Technol. 2018, 96, 72–95. [Google Scholar] [CrossRef]
- Alligier, R. Predictive joint distribution of the mass and speed profile to improve aircraft climb prediction. In Proceedings of the 2020 International Conference on Artificial Intelligence and Data Analytics for Air Transportation, AIDA-AT 2020, Singapore, 3–4 February 2020. [Google Scholar] [CrossRef]
- Tong, C.; Yin, X.; Wang, S.; Zheng, Z. A novel deep learning method for aircraft landing speed prediction based on cloud-based sensor data. Future Gener. Comput. Syst. 2018, 88, 552–558. [Google Scholar] [CrossRef]
- Liu, X.; Huang, Y.; Wang, Q.; Song, Q.; Zhao, L. A prediction method for deck-motion of air-carrier based on PSO-KELM. In Proceedings of the International Conference on Sensing Technology, ICST, Nanjing, China, 11–13 November 2016. [Google Scholar] [CrossRef]
- Reitmann, S.; Nachtigall, K. Applying Bidirectional Long Short-Term Memories (BLSTM) to Performance Data in Air Traffic Management for System Identification; Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Springer: Berlin/Heidelberg, Germany, 2017; pp. 528–536. [Google Scholar] [CrossRef]
- Herrema, F.; Treve, V.; Desart, B.; Curran, R.; Visser, D. A novel machine learning model to predict abnormal Runway Occupancy Times and observe related precursors. In Proceedings of the 12th USA/Europe Air Traffic Management R and D Seminar, Seattle, WA, USA, 27–30 June 2017. [Google Scholar]
- Demir, E.; Demir, V.B. Predicting flight delays with artificial neural networks: Case study of an airport. In Proceedings of the 2017 25th Signal Processing and Communications Applications Conference (SIU), Antalya, Turkey, 15–18 May 2017; pp. 1–4. [Google Scholar] [CrossRef]
- Zhang, Q.; Mott, J.H.; Johnson, M.E.; Springer, J.A. Development of a Reliable Method for General Aviation Flight Phase Identification. IEEE Trans. Intell. Transp. Syst. 2022, 23, 11729–11738. [Google Scholar] [CrossRef]
- Ren, K.; Kim, A.M.; Kuhn, K. Exploration of the Evolution of Airport Ground Delay Programs. Transp. Res. Rec. 2018, 2672, 71–81. [Google Scholar] [CrossRef]
- Schröer, C.; Kruse, F.; Gómez, J.M. A systematic literature review on applying CRISP-DM process model. Procedia Comput. Sci. 2021, 181, 526–534. [Google Scholar] [CrossRef]
- Mtsweni, J.; Biermann, E.; Pretorius, L. iSemServ: A model-driven approach for developing semantic web services. S. Afr. Comput. J. 2014, 52, 55–70. [Google Scholar] [CrossRef]
- Zhang, J.; Liu, W.; Zhu, Y. Study of ADS-B data evaluation. Chin. J. Aeronaut. 2011, 24, 461–466. [Google Scholar] [CrossRef]
- García, S.; Luengo, J.; Herrera, F. Data Preprocessing in Data Mining. Intell. Syst. Ref. Libr. 2015, 72, 320. [Google Scholar]
- Han, J.; Kamber, M.; Pei, J. Data Mining: Concepts and Techniques; Elsevier: Amsterdam, The Netherlands, 2011. [Google Scholar] [CrossRef]
- Jolliffe, I.T.; Cadima, J. Principal component analysis: A review and recent developments. Philos. Trans. R. Soc. A 2016, 374, 20150202. [Google Scholar] [CrossRef] [PubMed]
- Kuhn, M.; Johnson, K. Applied Predictive Modeling; Springer: New York, NY, USA, 2013; pp. 1–600. [Google Scholar] [CrossRef]
- Seibold, H.; Hothorn, T.; Zeileis, A. Generalised linear model trees with global additive effects. Adv. Data Anal. Classif. 2019, 13, 703–725. [Google Scholar] [CrossRef]
- Ming, W.; Bao, Y.; Hu, Z.; Xiong, T. Multistep-Ahead Air Passengers Traffic Prediction with Hybrid ARIMA-SVMs Models. Sci. World J. 2014, 2014, 567246. [Google Scholar] [CrossRef]
- Guo, G.; Wang, H.; Bell, D.; Bi, Y.; Greer, K. KNN model-based approach in classification. In Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 2888; Springer: Berlin/Heidelberg, Germany, 2003; pp. 986–996. [Google Scholar] [CrossRef]
- Shalev-Shwartz, S.; Ben-David, S. Understanding Machine Learning: From Theory to Algorithms; Cambridge University Press: Cambridge, UK, 2014. [Google Scholar]
- Gabrillia, C. Implementation of the K-Nearest Neighbor Algorithm to Predict Air Pollution. Inf. Technol. Syst. 2023, 1, 45–54. [Google Scholar] [CrossRef]
- Fredriksson, K. Geometric Near-Neighbor Access Tree (GNAT) Revisited, May 2016. Available online: https://arxiv.org/abs/1605.05944v2 (accessed on 24 February 2025).
- Espinosa, J. Aplicación de algoritmos Random Forest y XGBoost en una base de solicitudes de tarjetas de crédito. Ing. Investig. Tecnol. 2020, 21, 1–16. [Google Scholar] [CrossRef]
- Recarey, R. Métodos de Ensamblado en Machine Learning, Universidade de Santiago de Compostela, 2021. Available online: http://eio.usc.es/pub/mte/descargas/ProyectosFinMaster/Proyecto_1686.pdf (accessed on 24 February 2025).
- Midtfjord, A.D.; De Bin, R.; Huseby, A.B. A decision support system for safer airplane landings: Predicting runway conditions using XGBoost and explainable AI. Cold Reg. Sci. Technol. 2022, 199, 103556. [Google Scholar] [CrossRef]
- Shi, Z.; Xu, M.; Pan, Q.; Yan, B.; Zhang, H. LSTM-based Flight Trajectory Prediction. In Proceedings of the International Joint Conference on Neural Networks, Rio de Janeiro, Brazil, 8–13 July 2018. [Google Scholar] [CrossRef]
- Pang, Y.; Xu, N.; Liu, Y. Aircraft trajectory prediction using lstm neural network with embedded convolutional layer. In Proceedings of the Annual Conference of the Prognostics and Health Management Society, PHM, Prognostics and Health Management Society, Scottsdale, AZ, USA, 21–26 September 2019. [Google Scholar] [CrossRef]
- Breiman, L. Random forests. Mach. Learn. 2001, 45, 5–32. [Google Scholar] [CrossRef]
- Yan, B.; Zhang, X.; Tang, C.; Wang, X.; Yang, Y.; Xu, W. A Random Forest-Based Method for Predicting Borehole Trajectories. Mathematics 2023, 11, 1297. [Google Scholar] [CrossRef]
- Hashemi, S.M.; Botez, R.M.; Ghazi, G. Robust Trajectory Prediction Using Random Forest Methodology Application to UAS-S4 Ehécatl. Aerospace 2024, 11, 49. [Google Scholar] [CrossRef]
- Chai, T.; Draxler, R.R. Root mean square error (RMSE) or mean absolute error (MAE)?—Arguments against avoiding RMSE in the literature. Geosci. Model. Dev. 2014, 7, 1247–1250. [Google Scholar] [CrossRef]
- De Myttenaere, A.; Golden, B.; Le Grand, B.; Rossi, F. Mean Absolute Percentage Error for regression models. Neurocomputing 2016, 192, 38–48. [Google Scholar] [CrossRef]
- Hintze, J.L.; Nelson, R.D. Violin plots: A box plot-density trace synergism. Am. Stat. 1998, 52, 181–184. [Google Scholar] [CrossRef]
- Wang, Z.; Liang, M.; Delahaye, D. A hybrid machine learning model for short-term estimated time of arrival prediction in terminal manoeuvring area. Transp. Res. Part. C Emerg. Technol. 2018, 95, 280–294. [Google Scholar] [CrossRef]
- Mukaka, M.M. A guide to appropriate use of Correlation coefficient in medical research. Malawi Med. J. 2012, 24, 69. Available online: https://pmc.ncbi.nlm.nih.gov/articles/PMC3576830/ (accessed on 10 September 2025). [PubMed]
- Jolliff, J.K.; Kindle, J.C.; Shulman, I.; Penta, B.; Friedrichs, M.A.; Helber, R.; Arnone, R.A. Summary diagrams for coupled hydrodynamic-ecosystem model skill assessment. J. Mar. Syst. 2009, 76, 64–82. [Google Scholar] [CrossRef]
- Doerr, C.; Gnewuch, M.; Wahlström, M. Calculation of Discrepancy Measures and Applications. In A Panorama of Discrepancy Theory; Springer: Cham, Switzerland, 2014. [Google Scholar] [CrossRef]
- Plan, E.L. Modeling and simulation of count data. CPT Pharmacometrics Syst. Pharmacol. 2014, 3, 1–12. [Google Scholar] [CrossRef]
- Wilkinson, L.; Friendly, M. The History of the Cluster Heat Map. Am. Stat. 2009, 63, 179–184. [Google Scholar] [CrossRef]
- Schimpf, N.; Wang, Z.; Li, S.; Knoblock, E.J.; Li, H.; Apaza, R.D. A Generalized Approach to Aircraft Trajectory Prediction via Supervised Deep Learning. IEEE Access 2023, 11, 116183–116195. [Google Scholar] [CrossRef]
- Silvestre, J.; Mielgo, P.; Bregon, A.; Martinez-Prieto, M.A.; Alvarez-Esteban, C. Multi-route aircraft trajectory prediction using Temporal Fusion Transformers. IEEE Access 2024, 12, 174094–174106. [Google Scholar] [CrossRef]
- Wu, Y.; Yu, H.; Du, J.; Liu, B.; Yu, W. An Aircraft Trajectory Prediction Method Based on Trajectory Clustering and a Spatiotemporal Feature Network. Electronics 2022, 11, 3453. [Google Scholar] [CrossRef]
- Zeng, W.; Quan, Z.; Zhao, Z.; Xie, C.; Lu, X. A Deep Learning Approach for Aircraft Trajectory Prediction in Terminal Airspace. IEEE Access 2020, 8, 151250–151266. [Google Scholar] [CrossRef]
Variable | Description | Unit |
---|---|---|
Latitude | Geographic coordinate indicating the north–south position of the aircraft | Decimal degrees (°) |
Longitude | Geographic coordinate indicating the east–west position of the aircraft | Decimal degrees (°) |
Altitude | Vertical position of the aircraft relative to mean sea level (MSL) | Feet (ft) |
Date | Calendar reference corresponding to the recorded observation | YYYY–MM–DD |
Time | Time of the observation, aligned to Colombian standard time (UTC–5) | HH:MM:SS |
Model | Strengths | Limitations
---|---|---
KNN | Simple, non-parametric; no formal training phase; weighted variants improve accuracy in dense regions | Computationally expensive at prediction time; sensitive to noise and high-dimensional data
Random Forest | Robust ensemble; models nonlinear relationships and variable interactions; reduced overfitting risk | No inherent mechanism to capture temporal dependencies
XGBoost | Regularization, parallel processing, and native handling of missing values; accurate on tabular, heterogeneous data | Tends to reproduce dominant patterns rather than long-term temporal dynamics
LSTM | Captures long-term sequential dependencies; mitigates the vanishing gradient problem | Computationally demanding training process
Metric | Description | Unit |
---|---|---|
MAE | Average absolute difference between observed and predicted values | Latitude/Longitude: decimal degrees (°) Altitude: feet (ft) |
RMSE | Square root of the mean of squared errors | Latitude/Longitude: decimal degrees (°) Altitude: feet (ft) |
R2 | Proportion of variance in observed data explained by the model | Dimensionless |
R | Strength of linear association between observed and predicted values | Dimensionless |
MAPE | Average absolute error expressed as a percentage of observed values | % |
SI | RMSE normalized by the mean of observed values | Dimensionless |
DR | Ratio of the sum of predicted values to the sum of observed values | Dimensionless |
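The metrics in the table can be computed from paired observed and predicted vectors. This is a NumPy sketch following the table's definitions; the MAPE line assumes no zero observed values:

```python
import numpy as np

def evaluate(y_true, y_pred):
    """Regression metrics from Section 2.5, computed per coordinate."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    err = y_true - y_pred
    mae = np.abs(err).mean()
    rmse = np.sqrt((err ** 2).mean())
    r2 = 1.0 - (err ** 2).sum() / ((y_true - y_true.mean()) ** 2).sum()
    r = np.corrcoef(y_true, y_pred)[0, 1]       # Pearson correlation
    mape = np.abs(err / y_true).mean() * 100.0  # undefined if any observation is 0
    si = rmse / y_true.mean()                   # Scatter Index
    dr = y_pred.sum() / y_true.sum()            # Discrepancy Ratio
    return {"MAE": mae, "RMSE": rmse, "R2": r2, "R": r,
            "MAPE": mape, "SI": si, "DR": dr}
```

For latitude and longitude the errors are in decimal degrees and for altitude in feet, so the metrics should be computed separately for each coordinate rather than pooled.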
Model | Training Time (h) | Inference Latency (ms/Query) | Memory Usage (GB) | Notes |
---|---|---|---|---|
KNN | ~0.1 (no training) | ~150–200 ms | ~18 | Minimal training cost; inference latency limited by exhaustive neighbor searches. |
Random Forest | ~3.5 h (A100 GPU) | ~12–18 ms | ~8 | Moderate training cost; efficient inference through parallelized tree evaluation. |
XGBoost | ~5.0 h (A100 GPU) | ~20–25 ms | ~10 | Higher training overhead due to boosting iterations; inference latency remained acceptable. |
LSTM | ~7.8 h (A100 GPU) | ~6–10 ms | ~11 | Most demanding training process; inference extremely efficient once model weights were optimized. |
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Tafur, C.L.; Rodríguez, J.O.; Daza, P.M.; Barón, I.R.; Traslaviña, D.S.; Bermúdez, J.A. Prediction of 3D Airspace Occupancy Using Machine Learning. Forecasting 2025, 7, 56. https://doi.org/10.3390/forecast7040056