Tightly-Coupled Air-Ground Collaborative System for Autonomous UGV Navigation in GPS-Denied Environments
Highlights
- A stateful, ID-based mapping mechanism prevents data redundancy from video streams by assigning a unique ID to each detected obstacle, ensuring it is recorded on the global map only once.
- An obstacle inflation algorithm translates an abstract obstacle map into a robust, navigation-ready map. Artificially expanding the footprint of each obstacle based on the UGV’s physical dimensions ensures that any planned path is collision-free.
- ID-based filtering guarantees a clean, non-redundant map, which is crucial for effective path planning: without it, the same static objects would be re-mapped in every video frame, rendering the map unusable.
- The obstacle inflation process ensures that the theoretically optimal path found by the A* algorithm is also physically safe for the UGV. This critical step bridges the gap between abstract planning and real-world execution by preventing the UGV’s body from colliding with obstacles; a minimal code sketch of the idea follows this list.
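A minimal sketch of the inflation idea (not the authors’ implementation) is given below. It assumes a binary occupancy grid and a UGV footprint reduced to a radius expressed in grid cells; every obstacle cell is dilated by that radius so the planner can treat the UGV as a point. The function name and parameters are illustrative only.

```python
import numpy as np

def inflate_obstacles(grid, robot_radius_cells):
    """Dilate each occupied cell by the UGV's radius (in cells).

    grid: 2D array, 1 = obstacle, 0 = free.
    Returns a copy in which every cell within robot_radius_cells
    (Chebyshev distance) of an obstacle is also marked occupied,
    so path planning can treat the UGV as a single point.
    """
    inflated = grid.copy()
    rows, cols = grid.shape
    r = robot_radius_cells
    for i, j in np.argwhere(grid == 1):
        inflated[max(0, i - r):min(rows, i + r + 1),
                 max(0, j - r):min(cols, j + r + 1)] = 1
    return inflated

# Example: with a 1-cell robot radius, a single obstacle cell
# grows into a 3x3 occupied block.
grid = np.zeros((5, 5), dtype=int)
grid[2, 2] = 1
print(inflate_obstacles(grid, robot_radius_cells=1))
```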
Abstract
1. Introduction
- A tightly-coupled, end-to-end system architecture that seamlessly integrates an aerial perception module, a low-latency communication module, and a ground navigation module into a single, automated workflow.
- A real-time stateful mapping framework employing a custom-trained YOLOv8 model with unique ID-based filtering to generate dynamic global obstacle maps, ensuring efficient and non-redundant environmental representation for UGV navigation.
- A validated low-cost prototype demonstrating system feasibility through physical implementation using accessible hardware (Raspberry Pi) and robust middleware (FAST DDS), specifically designed for deployment in resource-constrained environments.
2. Modeling and Problem Description
2.1. System Overview
2.2. Problem Description
- Static Environment: All obstacles within the environment are stationary. The proposed system is not designed to handle dynamic or moving obstacles.
- Planar Terrain: The UGV operates on a flat, 2D plane, simplifying the navigation problem from 3D to 2D path planning.
- Reliable Communication: The wireless link between the UAV and UGV is assumed to be stable and sufficiently low-latency within the operational range, ensuring timely map updates.
- Accurate Perception and Localization: The UAV’s perception system is assumed to detect obstacles accurately, and the geometric transformation from the camera’s image plane to the world frame is assumed to be pre-calibrated and precise. Furthermore, both the UAV and UGV are assumed to have reliable access to their own localization data (i.e., their position and orientation in the environment).
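Under the planar-terrain assumption, the pre-calibrated image-plane-to-world transformation is, in the standard formulation, a planar homography. The sketch below illustrates that standard construction only; the paper’s actual calibration procedure and values are not reproduced, and the matrix H here is a hypothetical stand-in.

```python
import numpy as np

def pixel_to_world(u, v, H):
    """Map an image pixel (u, v) to ground-plane world coordinates
    using a pre-calibrated 3x3 homography H (valid only because the
    terrain is assumed planar)."""
    p = H @ np.array([u, v, 1.0])
    return p[0] / p[2], p[1] / p[2]  # divide out the projective scale

# Hypothetical calibration: 100 px per metre, origin at pixel (320, 240).
H = np.array([[0.01, 0.00, -3.20],
              [0.00, 0.01, -2.40],
              [0.00, 0.00, 1.00]])
print(pixel_to_world(420, 240, H))  # -> (1.0, 0.0), i.e., 1 m from the marker
```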
3. The Proposed Air-Ground Collaborative Method
3.1. Aerial Perception and Mapping Module
Algorithm 1. Simplified Logic for the Aerial Perception and Mapping Module.
Input: UAV video stream, pre-trained YOLOv8 perception model, coordinate transformation parameters.
Output: A list of unique obstacle coordinates relative to a start marker.
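The decisive detail in Algorithm 1 is the stateful, ID-based filter: an obstacle is committed to the global map only the first time its ID is seen. A minimal sketch of that bookkeeping follows; it assumes the perception stack already supplies stable per-object IDs (object trackers such as YOLOv8’s tracking mode can provide these) together with world coordinates, and all names are illustrative.

```python
def update_global_map(detections, global_map):
    """Record each detected obstacle on the global map exactly once.

    detections: iterable of (obstacle_id, world_x, world_y) tuples,
        where obstacle_id is a stable per-object ID from the tracker.
    global_map: dict mapping obstacle_id -> (world_x, world_y).
    Returns the IDs newly added in this frame.
    """
    newly_added = []
    for obstacle_id, x, y in detections:
        if obstacle_id not in global_map:  # first sighting only
            global_map[obstacle_id] = (x, y)
            newly_added.append(obstacle_id)
    return newly_added

# Consecutive frames re-detect obstacle 2, but it is recorded once.
global_map = {}
update_global_map([(2, 79.0, 43.0), (3, -86.0, 261.5)], global_map)
update_global_map([(2, 79.2, 43.1)], global_map)  # duplicate ID, ignored
print(global_map)  # {2: (79.0, 43.0), 3: (-86.0, 261.5)}
```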
3.2. Ground Navigation and Control Module
Algorithm 2. Simplified Logic for the Ground Navigation and Control Module.
Input: obstacle_coords (from Algorithm 1), start_point, goal_point.
Output: Completed physical navigation of the UGV along a collision-free path.
1: procedure UGV_Navigation_Process(obstacle_coords, start_point, goal_point)
2:   navigable_map ← Create_Map_And_Inflate_Obstacles(obstacle_coords)
     // Step 1: Map generation for safe navigation: create a grid map from the obstacle coordinates and apply the obstacle inflation algorithm.
3:   optimal_path ← Find_Path_A_Star(navigable_map, start_point, goal_point)
     // Step 2: Optimal path planning: run the A* algorithm on the inflated (safe) map to find the most efficient path.
4:   // Based on the A* cost function defined in Equations (7) and (8).
5:   if optimal_path is found then // Step 3: Physical path execution.
6:     Follow_Path_With_Gyro_Feedback(optimal_path)
7:     // Implements motion control loops using Equations (9)–(12).
8: end procedure
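For concreteness, a minimal A* sketch of the kind invoked in Step 2 is shown below. It assumes the inflated occupancy grid from Step 1, 4-connected motion, and unit step cost, and uses the Manhattan distance as an admissible heuristic; it illustrates the f(n) = g(n) + h(n) structure that Equations (7) and (8) formalise, and is not the authors’ implementation.

```python
import heapq

def a_star(grid, start, goal):
    """A* on a 4-connected occupancy grid (1 = blocked, 0 = free),
    expanding nodes in order of f(n) = g(n) + h(n)."""
    def h(p):  # Manhattan distance: admissible for 4-connected moves
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    open_set = [(h(start), 0, start, None)]  # (f, g, node, parent)
    came_from, g_cost = {}, {start: 0}
    while open_set:
        _, g, current, parent = heapq.heappop(open_set)
        if current in came_from:
            continue  # already expanded via a cheaper route
        came_from[current] = parent
        if current == goal:  # reconstruct the path by walking parents
            path = []
            while current is not None:
                path.append(current)
                current = came_from[current]
            return path[::-1]
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nb = (current[0] + di, current[1] + dj)
            if (0 <= nb[0] < len(grid) and 0 <= nb[1] < len(grid[0])
                    and grid[nb[0]][nb[1]] == 0
                    and g + 1 < g_cost.get(nb, float("inf"))):
                g_cost[nb] = g + 1
                heapq.heappush(open_set, (g + 1 + h(nb), g + 1, nb, current))
    return None  # no collision-free path exists

# A wall forces the path around the right-hand side of the grid.
grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
print(a_star(grid, (0, 0), (2, 0)))
```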
3.3. Real-Time Air-Ground Communication Module
Algorithm 3. Logic for the Real-Time Air-Ground Communication Module.
Input: map_data, mission_goal.
Output: Successful transmission of the map data from the UAV to the UGV, triggering the start of the UGV’s autonomous navigation task.
// On the UAV (Publisher) side: the UAV acts as the “Publisher” in the Fast DDS framework, writing to a specific topic named “MAP_DATA”.
1: procedure Publish_Map_Data(map_data)
2:   Publish(topic: “MAP_DATA”, data: map_data)
3: end procedure
// On the UGV (Subscriber) side: the UGV acts as the “Subscriber,” continuously listening to the “MAP_DATA” topic.
4: procedure Receive_And_Trigger_Navigation()
5:   map_data ← Subscribe_And_Wait_For_Data(topic: “MAP_DATA”)
6:   Execute_Ground_Navigation(map_data)
     // As soon as the map data is received, it triggers the ground navigation process detailed in Algorithm 2; the received map_data becomes the direct input for the UGV’s path planner.
7: end procedure
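The paper uses eProsima Fast DDS for the topic-based transport, but its API is not reproduced here. The stand-in below illustrates the same publisher/subscriber pattern with plain UDP sockets and JSON serialization; the port number, function names, and message format are all hypothetical, and in the real system Fast DDS handles discovery, reliability, and serialization.

```python
import json
import socket

PORT = 5005  # stand-in for the "MAP_DATA" topic

def publish_map_data(map_data, subscriber_ip="127.0.0.1"):
    """UAV (publisher) side: serialise the obstacle map and send it."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(json.dumps(map_data).encode(), (subscriber_ip, PORT))

def receive_and_trigger_navigation():
    """UGV (subscriber) side: block until map data arrives, then
    hand it directly to the ground navigation process (Algorithm 2)."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.bind(("0.0.0.0", PORT))
        payload, _ = sock.recvfrom(65535)  # wait for one map message
    map_data = json.loads(payload.decode())
    execute_ground_navigation(map_data)

def execute_ground_navigation(map_data):
    print(f"Received {len(map_data)} obstacles; starting path planning.")
```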
4. Experiments and Analysis
4.1. Experimental Setup
4.2. Performance Metrics
4.2.1. Evaluation of Aerial Perception and Mapping Module
4.2.2. Evaluation of Ground Navigation and Control Module
4.2.3. Evaluation of Real-Time Air-Ground Communication Module
5. System Operation Results
6. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
- Munasinghe, I.; Perera, A.; Deo, R.C. A Comprehensive Review of UAV-UGV Collaboration: Advancements and Challenges. J. Sens. Actuator Netw. 2024, 13, 81.
- Ding, Y.; Xin, B.; Chen, J. A Review of Recent Advances in Coordination Between Unmanned Aerial and Ground Vehicles. Unmanned Syst. 2020, 9, 97–117.
- Arafat, M.Y.; Alam, M.M.; Moh, S. Vision-Based Navigation Techniques for Unmanned Aerial Vehicles: Review and Challenges. Drones 2023, 7, 89.
- Kaniewski, P.; Grzywacz, W. Visual-based navigation system for unmanned aerial vehicles. In Proceedings of the 2017 Signal Processing Symposium (SPSympo), Jachranka, Poland, 12–14 September 2017; pp. 1–6.
- Kanellakis, C.; Nikolakopoulos, G. Survey on computer vision for UAVs: Current developments and trends. J. Intell. Robot. Syst. 2017, 87, 141–168.
- Liu, L.; Ouyang, W.; Wang, X.; Fieguth, P.; Chen, J.; Liu, X.; Pietikäinen, M. Deep Learning for Generic Object Detection: A Survey. Int. J. Comput. Vis. 2019, 128, 261–318.
- Ge, Z.; Liu, S.; Wang, F.; Li, Z.; Sun, J. YOLOX: Exceeding YOLO Series in 2021. arXiv 2021, arXiv:2107.08430.
- Lu, Y.; Xue, Z.; Xia, G.-S.; Zhang, L. A survey on vision-based UAV navigation. Geo-Spat. Inf. Sci. 2018, 21, 21–32.
- Liu, H.; Long, Q.; Yi, B.; Jiang, W. A survey of sensors based autonomous unmanned aerial vehicle (UAV) localization techniques. Complex Intell. Syst. 2025, 11, 371–392.
- Shen, Y.; Liu, J.; Luo, Y. Review of Path Planning Algorithms for Unmanned Vehicles. In Proceedings of the 2021 IEEE 2nd International Conference on Information Technology, Big Data and Artificial Intelligence (ICIBA), Chongqing, China, 26–28 December 2021; pp. 400–405.
- Lashin, M.; El-mashad, S.Y.; Elgammal, A.T. Real-time path planning in dynamic environments using LSTM-augmented A* search. Results Eng. 2025, 27, 106324.
- Zhou, C.; Huang, B.; Fränti, P. A review of motion planning algorithms for intelligent robots. J. Intell. Manuf. 2022, 33, 387–424.
- Wang, J.; Zhang, T.; Ma, N.; Meng, M.Q.-H. A Survey of Learning-Based Robot Motion Planning. IET Cyber-Syst. Robot. 2021, 3, 302–314.
- Yue, P.; Xin, J.; Huang, Y.; Zhao, J.; Zhang, C.; Chen, W.; Shan, M. UAV Autonomous Navigation System Based on Air–Ground Collaboration in GPS-Denied Environments. Drones 2025, 9, 442.
- Mittal, M.; Mohan, R.; Burgard, W.; Valada, A. Vision-Based Autonomous UAV Navigation and Landing for Urban Search and Rescue. In Robotics Research; Springer International Publishing: Cham, Switzerland, 2022; pp. 575–592.
- Liu, M.; Wang, X.; Zhou, A.; Fu, X.; Ma, Y.; Piao, C. UAV-YOLO: Small Object Detection on Unmanned Aerial Vehicle Perspective. Sensors 2020, 20, 2238.
- Zhang, Z.; Liu, Y.; Liu, T.; Lin, Z.; Wang, S. DAGN: A Real-Time UAV Remote Sensing Image Vehicle Detection Framework. IEEE Geosci. Remote Sens. Lett. 2020, 17, 1884–1888.
- Katona, K.; Neamah, H.A.; Korondi, P. Obstacle Avoidance and Path Planning Methods for Autonomous Navigation of Mobile Robot. Sensors 2024, 24, 3573.
- Kidane, N.; Avran, M.N.; Pavos, J.M.; Simone, D.W. Trajectory Tracking and Control of Differential Drive Mobile Robots Using Feedback Linearization. In Proceedings of the 2025 7th International Congress on Human-Computer Interaction, Optimization and Robotic Applications (ICHORA), Ankara, Türkiye, 25–27 June 2025; pp. 1–9.
- Tran, Q.-K.; Ryoo, Y.-J. Multi-Sensor Fusion Framework for Reliable Localization and Trajectory Tracking of Mobile Robot by Integrating UWB, Odometry, and AHRS. Biomimetics 2025, 10, 478.
- Albonico, M.; Đorđević, M.; Hamer, E.; Malavolta, I. Software engineering research on the Robot Operating System: A systematic mapping study. J. Syst. Softw. 2023, 197, 111574.
- Paul, S.; Lephuoc, D.; Hauswirth, M. Performance Evaluation of ROS2-DDS middleware implementations facilitating Cooperative Driving in Autonomous Vehicle. arXiv 2024, arXiv:2412.07485.
- Gambo, M.L.; Danasabe, A.; Almadani, B.; Aliyu, F.; Aliyu, A.; Al-Nahari, E. A Systematic Literature Review of DDS Middleware in Robotic Systems. Robotics 2025, 14, 63.
Component | Air-Ground Collaborative System | Typical High-Cost Research Platform |
---|---|---|
Onboard Computer | $35 × 2 = $70 (Raspberry Pi 3B+) | $1500 (NVIDIA Jetson AGX Orin) |
Primary Perception Sensor | $20 (High-Resolution USB Camera) | $500 (Velodyne Puck LiDAR) |
UGV | $25 | $500 |
UAV | $50 | $1500 |
Inertial Measurement Unit | $5 (MPU6050) | $1500 (VectorNav VN-100 IMU/AHRS)
Power Module | $25 | $200 |
Estimated Total Cost | $220 | $5700 |
Feature | Traditional CV Algorithms [4,5] | YOLO [6,7] |
---|---|---|
Detection Principle | Rule-based (e.g., color, shape). | Learning-based (neural network). |
Accuracy | Lower; fragile under visual changes. | Higher; robust to visual changes.
Speed | Slow for complex scenes. | Real-time.
Flexibility/Adaptability | Poor: requires manual recoding for new objects. | High: adapts to new objects via retraining. |
Version | Feature | Impact on Performance |
---|---|---|
YOLOv3 | Multi-Scale Features (FPN). | Improved small object detection. |
YOLOv4 | Speed/Accuracy Optimizations (BoF/BoS). | Superior speed-accuracy balance. |
YOLOv5 | Usability Focus (PyTorch). | Simplified training & deployment. |
YOLOv7 | Advanced Architecture (E-ELAN). | New speed & accuracy benchmark. |
YOLOv8 | Anchor-Free Unified Design. | Enhanced flexibility & performance. |
Feature | Dijkstra’s Algorithm [10,12] | Greedy Best-First Search [10,11,12] | A* Algorithm [10,11,12] |
---|---|---|---|
Principle | Focuses only on past cost g(n). | Focuses only on future cost h(n). | Balances past and future cost, f(n) = g(n) + h(n).
Optimality | Optimal. Finds the best path. | Not Optimal. Finds a path, but often not the best one. | Optimal. Finds the best path. |
Efficiency | Slow; explores in all directions. | Fast, but can make poor choices. | Efficient; uses a heuristic estimate to guide its search.
Feature | ROS [21,22] | DDS [22,23] |
---|---|---|
Design Philosophy | General-purpose robotics framework. | Designed for high-performance, critical tasks. |
Real-time Performance | Can be too slow for real-time tasks. | Optimized for real-time data sharing. |
Primary Use Case | General robotics R&D. | Mission-critical systems. |
System Configuration | Path Planning Time | Communication Latency | Task Success Rate
---|---|---|---
A* + FAST DDS | 110 ms | 95 ms | 92% |
Dijkstra + FAST DDS | 250 ms | 96 ms | 91% |
A* + ROS | 111 ms | 165 ms | 80% |
Class | Images | Instances | Box (P) | R | mAP50 | mAP50-95
---|---|---|---|---|---|---|
all | 1498 | 3241 | 0.846 | 0.774 | 0.845 | 0.639 |
ID | Relative Coordinates |
---|---|
Start (ID: 1) | (X: 0 cm, Y: 0 cm)
Obstacle (ID: 2) | (X: 79.0 cm, Y: 43.0 cm) |
Obstacle (ID: 3) | (X: −86.0 cm, Y: 261.5 cm) |
Obstacle (ID: 4) | (X: 49.0 cm, Y: 167.0 cm) |
Obstacle (ID: 5) | (X: −4.5 cm, Y: 244.5 cm) |
Obstacle (ID: 6) | (X: −95.0 cm, Y: 433.5 cm) |
Obstacle (ID: 7) | (X: 153.5 cm, Y: 402.0 cm) |
Obstacle (ID: 8) | (X: −10.0 cm, Y: 449.5 cm) |
Obstacle (ID: 9) | (X: 70.0 cm, Y: 497.0 cm) |
Obstacle (ID: 10) | (X: 80.5 cm, Y: 443.5 cm) |
Obstacle (ID: 11) | (X: 64.5 cm, Y: 467.5 cm) |
Obstacle (ID: 12) | (X: 73.0 cm, Y: 497.5 cm) |
Obstacle (ID: 13) | (X: 151.0 cm, Y: 477.5 cm) |
Obstacle (ID: 14) | (X: 153.5 cm, Y: 403.0 cm) |
Obstacle (ID: 15) | (X: −46.0 cm, Y: 657.0 cm) |
Obstacle (ID: 16) | (X: 54.0 cm, Y: 665.5 cm) |