Proactive Path Planning Using Centralized UAV-UGV Coordination in Semi-Structured Agricultural Environments
Abstract
1. Introduction
- The use of UAV aerial mapping to generate georeferenced non-traversable zone exclusions for optimal path calculation, thereby reducing inefficient, reactive ground movement.
- A low-latency detection and georeferencing pipeline based on YOLOv8 [27] for dynamic obstacle monitoring, tailored to field conditions.
- A centralized UAV–UGV communication architecture via a farm management information system (FMIS) to ensure reliable data transfer and coordination.
- Experimental validation of a safety-critical redundancy mechanism that ensures worker safety in case of global intelligence failure.
- Quantification of the operational cost (energetic and temporal penalties) required for the UGV to safely recover from vision failure, identifying the system’s primary bottlenecks.
2. Materials and Methods
2.1. Integrated System Operational Protocol
2.2. Hardware and Software of the Robotic Platforms
2.2.1. Unmanned Ground Vehicle Platform
2.2.2. Unmanned Aerial Vehicle Platform
2.3. Field Deployment and Validation Scenarios
2.3.1. Test Environment
2.3.2. Obstacle Classification and System-Level Implications
- Vehicles: (a) Pickup truck: Medium-sized black utility vehicle with an elongated shape, solid enclosed front cab, and open rear cargo bed; (b) Van: Medium-sized fully enclosed black van with a compact, uniform silhouette and continuous roof.
- Workers: Two human participants acted as simulated field workers.
2.3.3. Validation Scenarios
- Scenario A: The field incorporated a single vehicle obstacle (pickup truck) and two workers.
- Scenario B: The field incorporated two vehicle obstacles (pickup truck and van) and one worker.
2.4. Key Metrics for Assessing System Performance
2.4.1. Temporal Performance Metrics
- Total mission completion time: The time interval (in seconds) from the moment the UGV begins its planned path until it signals mission completion.
- Latency: The total time interval (in milliseconds) from the moment the UAV’s onboard computer detects and georeferences an obstacle to the moment the UGV’s navigation system receives and processes the exclusion command from the FMIS.
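The latency metric above spans two platforms, so it is naturally computed from logged timestamps rather than a single clock. A minimal sketch, assuming both platforms log UTC ISO-8601 timestamps (e.g. via GNSS time synchronization); the function name and the example timestamps are illustrative, not from the study:

```python
from datetime import datetime, timezone

def exclusion_latency_ms(t_detect_iso: str, t_ugv_processed_iso: str) -> float:
    """End-to-end latency (ms) from the UAV detecting and georeferencing an
    obstacle to the UGV having processed the FMIS exclusion command."""
    t0 = datetime.fromisoformat(t_detect_iso).astimezone(timezone.utc)
    t1 = datetime.fromisoformat(t_ugv_processed_iso).astimezone(timezone.utc)
    return (t1 - t0).total_seconds() * 1000.0

# Hypothetical log entries from the two platforms:
latency = exclusion_latency_ms("2025-06-01T10:15:02.120+00:00",
                               "2025-06-01T10:15:02.870+00:00")
print(latency)  # 750.0 ms
```

Because the interval crosses the UAV–FMIS–UGV chain, any clock skew between platforms adds directly to the measured value, which is why shared GNSS time is assumed here.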
2.4.2. Vision System Accuracy
- Detected obstacle counts: The total number of vehicles (maximum of 2 present) and workers (maximum of 2 present) that are successfully recognized by the UAV in the field.
- Mean precision: Calculated as the ratio of true positives (TPs) to the sum of TPs and false positives (FPs), averaged across all detected objects.
- Mean recall: Defined as the ratio of TPs to the sum of TPs and false negatives (FNs), averaged across all detected objects.
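The two accuracy metrics above reduce to simple ratios of outcome counts. A minimal sketch; the per-class TP/FP/FN counts below are illustrative placeholders, not the study’s data:

```python
def precision(tp: int, fp: int) -> float:
    # TP / (TP + FP); guard against the no-detections case
    return tp / (tp + fp) if tp + fp else 0.0

def recall(tp: int, fn: int) -> float:
    # TP / (TP + FN); guard against the no-ground-truth case
    return tp / (tp + fn) if tp + fn else 0.0

# Hypothetical counts per obstacle class, pooled over trials:
counts = {"vehicle": {"tp": 58, "fp": 2, "fn": 2},
          "worker":  {"tp": 51, "fp": 3, "fn": 9}}

mean_p = sum(precision(c["tp"], c["fp"]) for c in counts.values()) / len(counts)
mean_r = sum(recall(c["tp"], c["fn"]) for c in counts.values()) / len(counts)
print(round(mean_p, 3), round(mean_r, 3))
```

With counts like these, recall drops well below precision for the worker class (many FNs, few FPs), which mirrors the paper’s observation that workers are the harder detection class.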
2.4.3. Energy Performance Metrics
- UGV battery drain: The percentage of battery capacity consumed by the UGV during the mission.
- UAV battery drain: The percentage of battery capacity consumed by the UAV for its full operational period (pre-mission scan and dynamic monitoring).
3. Results
3.1. Overview of Mission Performance Across Experimental Scenarios and Time Windows
3.2. Aerial Vision System Performance
3.3. Energy Performance
3.4. Temporal Performance
4. Discussion
4.1. Statistical Significance of Time Window on Aerial Detection
4.2. Successful Validation of Safety-Critical Redundancy and Architectural Success
4.3. The Quantified Operational Cost of Mission Success and Safety
4.4. Limitations of the Present Study
4.5. Future Research Directions
5. Conclusions
- The system met its primary design objectives: zero mission failures and guaranteed worker safety in all trials. The UGV’s LiDAR system proved an essential safety redundancy, enabling the immediate reactive maneuvers this required.
- A statistically significant association was found between the afternoon solar period and mission outcome, with workers identified as a particularly challenging detection class.
- Activation of the reactive protocol in partial-success trials resulted in an average increase of ≈200% in UGV battery consumption, accompanied by time penalties of ≈100% in both scenarios. These results confirm that the energetic penalty for safety recovery falls solely on the ground platform.
- The greater time penalty was observed in Scenario B, where higher compositional complexity (maneuvering space restricted by vehicles) increased the time the UGV required to safely re-engage and complete its navigation task.
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Mulungu, K.; Kassie, M.; Tschopp, M. The role of information and communication technologies-based extension in agriculture: Application, opportunities and challenges. Inf. Technol. Dev. 2025, 31, 1117–1146. [Google Scholar] [CrossRef]
- Nkwocha, C.L.; Adewumi, A.; Folorunsho, S.O.; Eze, C.; Jjagwe, P.; Kemeshi, J.; Wang, N. A Comprehensive Review of Sensing, Control, and Networking in Agricultural Robots: From Perception to Coordination. Robotics 2025, 14, 159. [Google Scholar] [CrossRef]
- Ren, Z.; Zheng, H.; Chen, J.; Chen, T.; Xie, P.; Xu, Y.; Deng, J.; Wang, H.; Sun, M.; Jiao, W. Integrating UAV, UGV and UAV-UGV collaboration in future industrialized agriculture: Analysis, opportunities and challenges. Comput. Electron. Agric. 2024, 227, 109631. [Google Scholar] [CrossRef]
- Agelli, M.; Corona, N.; Maggio, F.; Moi, P.V. Unmanned Ground Vehicles for Continuous Crop Monitoring in Agriculture: Assessing the Readiness of Current ICT Technology. Machines 2024, 12, 750. [Google Scholar] [CrossRef]
- Farhan, S.M.; Yin, J.; Chen, Z.; Memon, M.S. A Comprehensive Review of LiDAR Applications in Crop Management for Precision Agriculture. Sensors 2024, 24, 5409. [Google Scholar] [CrossRef]
- Lochan, K.; Khan, A.; Elsayed, I.; Suthar, B.; Seneviratne, L.; Hussain, I. Advancements in Precision Spraying of Agricultural Robots: A Comprehensive Review. IEEE Access 2024, 12, 129447–129483. [Google Scholar] [CrossRef]
- Moysiadis, V.; Benos, L.; Karras, G.; Kateris, D.; Peruzzi, A.; Berruto, R.; Papageorgiou, E.; Bochtis, D. Human–Robot Interaction through Dynamic Movement Recognition for Agricultural Environments. AgriEngineering 2024, 6, 2494–2512. [Google Scholar] [CrossRef]
- Lin, Y.; Song, X.; Xiao, W.; Kuang, D.; Xia, S.; Chang, H.; Wongsuk, S.; He, X.; Liu, Y. Low-altitude remote sensing and deep learning-based canopy detection method for the navigation of orchard unmanned ground vehicles. Comput. Electron. Agric. 2025, 239, 111077. [Google Scholar] [CrossRef]
- Cyriac, R.; Thomas, J. Smart Farming with Cloud Supported Data Management Enabling Real-Time Monitoring and Prediction for Better Yield. In Intelligent Robots and Drones for Precision Agriculture; Balasubramanian, S., Natarajan, G., Chelliah, P.R., Eds.; Springer Nature: Cham, Switzerland, 2024; pp. 283–306. [Google Scholar]
- Guebsi, R.; Mami, S.; Chokmani, K. Drones in Precision Agriculture: A Comprehensive Review of Applications, Technologies, and Challenges. Drones 2024, 8, 686. [Google Scholar] [CrossRef]
- Munasinghe, I.; Perera, A.; Deo, R.C. A Comprehensive Review of UAV-UGV Collaboration: Advancements and Challenges. J. Sens. Actuator Netw. 2024, 13, 81. [Google Scholar] [CrossRef]
- Benos, L.; Asiminari, G.; Busato, P.; Kateris, D.; Aidonis, D.; Bochtis, D. Explainable artificial intelligence-driven geometric feature selection for enhanced field traversing efficiency prediction. Comput. Electron. Agric. 2025, 239, 111049. [Google Scholar] [CrossRef]
- Chen, Y.; Liu, Z.; Xu, Z.; Lin, J.; Guan, X.; Zhou, Z.; Zheng, D.; Hewitt, A. UAVs-UGV cooperative boom sprayer system based on swarm control. Comput. Electron. Agric. 2025, 235, 110339. [Google Scholar] [CrossRef]
- Ciacco, A.; Giallombardo, G.; Guerriero, F.; Saccomanno, F.P. Monitoring agricultural fields through collaboration between autonomous robots and drones. In Proceedings of the 2025 IEEE Conference on Technologies for Sustainability (SusTech), Santa Ana, CA, USA, 20–23 April 2025; IEEE: New York, NY, USA, 2025; pp. 1–8. [Google Scholar]
- Mammarella, M.; Comba, L.; Biglia, A.; Dabbene, F.; Gay, P. Cooperation of unmanned systems for agricultural applications: A theoretical framework. Biosyst. Eng. 2022, 223, 61–80. [Google Scholar] [CrossRef]
- Mammarella, M.; Comba, L.; Biglia, A.; Dabbene, F.; Gay, P. Cooperation of unmanned systems for agricultural applications: A case study in a vineyard. Biosyst. Eng. 2022, 223, 81–102. [Google Scholar] [CrossRef]
- Xu, Y.; Xue, X.; Sun, Z.; Gu, W.; Cui, L.; Jin, Y.; Lan, Y. Global path planning for navigating orchard vehicle based on fruit tree positioning and planting rows detection from UAV imagery. Comput. Electron. Agric. 2025, 236, 110446. [Google Scholar] [CrossRef]
- Katikaridis, D.; Moysiadis, V.; Tsolakis, N.; Busato, P.; Kateris, D.; Pearson, S.; Sørensen, C.G.; Bochtis, D. UAV-Supported Route Planning for UGVs in Semi-Deterministic Agricultural Environments. Agronomy 2022, 12, 1937. [Google Scholar] [CrossRef]
- Shi, M.; Feng, X.; Pan, S.; Song, X.; Jiang, L. A Collaborative Path Planning Method for Intelligent Agricultural Machinery Based on Unmanned Aerial Vehicles. Electronics 2023, 12, 3232. [Google Scholar] [CrossRef]
- Katikaridis, D.; Benos, L.; Busato, P.; Kateris, D.; Papageorgiou, E.; Karras, G.; Bochtis, D. Experimental Comparative Analysis of Centralized vs. Decentralized Coordination of Aerial–Ground Robotic Teams for Agricultural Operations. Robotics 2025, 14, 119. [Google Scholar] [CrossRef]
- Mansur, H.; Gadhwal, M.; Abon, J.E.; Flippo, D. Mapping for Autonomous Navigation of Agricultural Robots Through Crop Rows Using UAV. Agriculture 2025, 15, 882. [Google Scholar] [CrossRef]
- Ma, M. Optimization of YOLOv8 for UAV-Based Object Detection: A Literature Review. Appl. Comput. Eng. 2025, 191, 65–74. [Google Scholar] [CrossRef]
- Rahman, S.; Rony, J.H.; Uddin, J.; Samad, M.A. Real-Time Obstacle Detection with YOLOv8 in a WSN Using UAV Aerial Photography. J. Imaging 2023, 9, 216. [Google Scholar] [CrossRef]
- Xu, N.; Ning, X.; Li, A.; Li, Z.; Song, Y.; Wu, W. Research on Orchard Navigation Line Recognition Method Based on U-Net. Sensors 2025, 25, 6828. [Google Scholar] [CrossRef]
- Li, G.; Le, F.; Si, S.; Cui, L.; Xue, X. Image Segmentation-Based Oilseed Rape Row Detection for Infield Navigation of Agri-Robot. Agronomy 2024, 14, 1886. [Google Scholar] [CrossRef]
- Dixit, B.; Ananthapadmanabha, A.; Thahsin, A.; Pathak, S.; Kasbekar, G.S.; Maity, A. A Novel Cipher for Enhancing MAVLink Security: Design, Security Analysis, and Performance Evaluation Using a Drone Testbed. arXiv 2025, arXiv:2504.20626. [Google Scholar] [CrossRef]
- Khan, Z.; Shen, Y.; Liu, H. Object Detection in Agriculture: A Comprehensive Review of Methods, Applications, Challenges, and Future Directions. Agriculture 2025, 15, 1351. [Google Scholar] [CrossRef]
- Moysiadis, V.; Katikaridis, D.; Benos, L.; Busato, P.; Anagnostis, A.; Kateris, D.; Pearson, S.; Bochtis, D. An Integrated Real-Time Hand Gesture Recognition Framework for Human-Robot Interaction in Agriculture. Appl. Sci. 2022, 12, 8160. [Google Scholar] [CrossRef]
- Ultralytics YOLOv8. Ultralytics. Available online: https://github.com/ultralytics/ultralytics (accessed on 5 November 2025).
- Khan, A.T.; Jensen, S.M.; Khan, A.R. Advancing precision agriculture: A comparative analysis of YOLOv8 for multi-class weed detection in cotton cultivation. Artif. Intell. Agric. 2025, 15, 182–191. [Google Scholar] [CrossRef]
- Shen, Y.; Yang, Z.; Khan, Z.; Liu, H.; Chen, W.; Duan, S. Optimization of Improved YOLOv8 for Precision Tomato Leaf Disease Detection in Sustainable Agriculture. Sensors 2025, 25, 1398. [Google Scholar] [CrossRef] [PubMed]
- Zhou, X.; Chen, W.; Wei, X. Improved Field Obstacle Detection Algorithm Based on YOLOv8. Agriculture 2024, 14, 2263. [Google Scholar] [CrossRef]
- Niu, S.; Nie, Z.; Li, G.; Zhu, W. Early Drought Detection in Maize Using UAV Images and YOLOv8+. Drones 2024, 8, 170. [Google Scholar] [CrossRef]
- Wu, W.; Liu, A.; Hu, J.; Mo, Y.; Xiang, S.; Duan, P.; Liang, Q. EUAVDet: An Efficient and Lightweight Object Detector for UAV Aerial Images with an Edge-Based Computing Platform. Drones 2024, 8, 261. [Google Scholar] [CrossRef]
- Abdalla, A.; Mohammed, M.M.A.; Adedeji, O.; Dotray, P.; Guo, W. Toward resource-efficient UAV systems: Deep learning model compression for onboard-ready weed detection in UAV imagery. Smart Agric. Technol. 2025, 12, 101086. [Google Scholar] [CrossRef]
- Machidon, A.L.; Krašovec, A.; Pejović, V.; Latini, D.; Sasidharan, S.T.; Del Frate, F.; Machidon, O.M. A Low-Cost UAV System and Dataset for Real-Time Weed Detection in Salad Crops. Electronics 2025, 14, 4082. [Google Scholar] [CrossRef]
- Zhang, Z. A flexible new technique for camera calibration. IEEE Trans. Pattern Anal. Mach. Intell. 2000, 22, 1330–1334. [Google Scholar] [CrossRef]
- Matese, A.; Toscano, P.; Di Gennaro, S.F.; Genesio, L.; Vaccari, F.P.; Primicerio, J.; Belli, C.; Zaldei, A.; Bianconi, R.; Gioli, B. Intercomparison of UAV, Aircraft and Satellite Remote Sensing Platforms for Precision Viticulture. Remote Sens. 2015, 7, 2971–2990. [Google Scholar] [CrossRef]
- Christiansen, M.; Laursen, M.; Jørgensen, R.; Skovsen, S.; Gislum, R. Designing and Testing a UAV Mapping System for Agricultural Field Surveying. Sensors 2017, 17, 2703. [Google Scholar] [CrossRef]
- Grimstad, L.; From, P.J. The Thorvald II Agricultural Robotic System. Robotics 2017, 6, 24. [Google Scholar] [CrossRef]
- Lytridis, C.; Kaburlasos, V.G.; Pachidis, T.; Manios, M.; Vrochidou, E.; Kalampokas, T.; Chatzistamatis, S. An Overview of Cooperative Robotics in Agriculture. Agronomy 2021, 11, 1818. [Google Scholar] [CrossRef]
- Baxter, P.; Cielniak, G.; Hanheide, M.; From, P.J. Safe Human-Robot Interaction in Agriculture. In Proceedings of the Companion of the 2018 ACM/IEEE International Conference on Human-Robot Interaction (HRI ’18), Chicago, IL, USA, 5–8 March 2018; Association for Computing Machinery: New York, NY, USA, 2018; pp. 59–60. [Google Scholar]
- Benos, L.; Tsaopoulos, D.; Tagarakis, A.C.; Kateris, D.; Busato, P.; Bochtis, D. Explainable AI-Enhanced Human Activity Recognition for Human–Robot Collaboration in Agriculture. Appl. Sci. 2025, 15, 650. [Google Scholar] [CrossRef]
- Burud, I.; Lange, G.; Lillemo, M.; Bleken, E.; Grimstad, L.; Johan From, P. Exploring Robots and UAVs as Phenotyping Tools in Plant Breeding. IFAC-PapersOnLine 2017, 50, 11479–11484. [Google Scholar] [CrossRef]
- Esser, F.; Marks, E.; Magistri, F.; Weyler, J.; Bultmann, S.; Zaenker, T.; Ahmadi, A.; Schreiber, M.; Kuhlmann, H.; McCool, C.; et al. Automated Leaf-Level Inspection of Crops Combining UAV and UGV Robots. In European Robotics Forum 2025; Huber, M., Verl, A., Kraus, W., Eds.; Springer Nature: Cham, Switzerland, 2025; pp. 35–41. [Google Scholar]
- Baek, E.-T.; Im, D.-Y. ROS-Based Unmanned Mobile Robot Platform for Agriculture. Appl. Sci. 2022, 12, 4335. [Google Scholar] [CrossRef]
- Niu, J.; Zhang, L.; Zhang, T.; Guan, J.; Shi, S. Orchard Robot Navigation via an Improved RTAB-Map Algorithm. Appl. Sci. 2025, 15, 11673. [Google Scholar] [CrossRef]
- Open Source Robotics Foundation. Carrot_Planner. ROS Wiki. Available online: https://wiki.ros.org/carrot_planner (accessed on 5 November 2025).
- Zheng, K. ROS Navigation Tuning Guide. In Robot Operating System (ROS): The Complete Reference; Koubaa, A., Ed.; Springer International Publishing: Cham, Switzerland, 2021; Volume 6, pp. 197–226. ISBN 978-3-030-75472-3. [Google Scholar]
- Vasconez, J.P.; Guevara, L.; Cheein, F.A. Social robot navigation based on HRI non-verbal communication: A case study on avocado harvesting. In Proceedings of the ACM Symposium on Applied Computing; Association for Computing Machinery: New York, NY, USA, 2019; Volume Part F147772, pp. 957–960. [Google Scholar]
- Lv, P.; Wang, B.; Cheng, F.; Xue, J. Multi-Objective Association Detection of Farmland Obstacles Based on Information Fusion of Millimeter Wave Radar and Camera. Sensors 2023, 23, 230. [Google Scholar] [CrossRef]
- Escolà, A.; Planas, S.; Rosell, J.R.; Pomar, J.; Camp, F.; Solanelles, F.; Gracia, F.; Llorens, J.; Gil, E. Performance of an Ultrasonic Ranging Sensor in Apple Tree Canopies. Sensors 2011, 11, 2459–2477. [Google Scholar] [CrossRef]
- Shen, Y.; Shen, Y.; Zhang, Y.; Huo, C.; Shen, Z.; Su, W.; Liu, H. Research Progress on Path Planning and Tracking Control Methods for Orchard Mobile Robots in Complex Scenarios. Agriculture 2025, 15, 1917. [Google Scholar] [CrossRef]
- Tang, Y.; Qiu, J.; Zhang, Y.; Wu, D.; Cao, Y.; Zhao, K.; Zhu, L. Optimization strategies of fruit detection to overcome the challenge of unstructured background in field orchard environment: A review. Precis. Agric. 2023, 24, 1183–1219. [Google Scholar] [CrossRef]
- Ye, L.; Wu, F.; Zou, X.; Li, J. Path planning for mobile robots in unstructured orchard environments: An improved kinematically constrained bi-directional RRT approach. Comput. Electron. Agric. 2023, 215, 108453. [Google Scholar] [CrossRef]
- Han, C.; Wu, W.; Luo, X.; Li, J. Visual Navigation and Obstacle Avoidance Control for Agricultural Robots via LiDAR and Camera. Remote Sens. 2023, 15, 5402. [Google Scholar] [CrossRef]
- StephanST/WALDO30 Whereabouts Ascertainment for Low-lying Detectable Objects. Hugging Face Model Repository. Available online: https://huggingface.co/StephanST/WALDO30 (accessed on 7 January 2025).
- Tariku, G.; Ghiglieno, I.; Simonetto, A.; Gentilin, F.; Armiraglio, S.; Gilioli, G.; Serina, I. Advanced Image Preprocessing and Integrated Modeling for UAV Plant Image Classification. Drones 2024, 8, 645. [Google Scholar] [CrossRef]
- Vélez, S.; Vacas, R.; Martín, H.; Ruano-Rosa, D.; Álvarez, S. A Novel Technique Using Planar Area and Ground Shadows Calculated from UAV RGB Imagery to Estimate Pistachio Tree (Pistacia vera L.) Canopy Volume. Remote Sens. 2022, 14, 6006. [Google Scholar] [CrossRef]
- Lin, Y.-W.; Liu, Y.-H.; Lin, Y.-B.; Hong, J.-C. FenceTalk: Exploring False Negatives in Moving Object Detection. Algorithms 2023, 16, 481. [Google Scholar] [CrossRef]
- Tummala, A.; Baskar, V.; Zaidi, S.H. Impact of Lighting-Based Biases on the Performance of YOLOv8 Object Detection Models. In Proceedings of the 2024 IEEE International Conference on Control & Automation, Electronics, Robotics, Internet of Things, and Artificial Intelligence (CERIA), Bandung, Indonesia, 17–18 October 2024; IEEE: New York, NY, USA, 2024; pp. 1–5. [Google Scholar]
- Gong, B.; Zhang, H.; Ma, B.; Tao, Z. Enhancing real-time low-light object detection via multi-scale edge and illumination-guided features in YOLOv8. J. Supercomput. 2025, 81, 1120. [Google Scholar] [CrossRef]
- Tapia-Mendez, E.; Hernandez-Sandoval, M.; Salazar-Colores, S.; Cruz-Albarran, I.A.; Tovar-Arriaga, S.; Morales-Hernandez, L.A. A Novel Deep Learning Approach for Precision Agriculture: Quality Detection in Fruits and Vegetables Using Object Detection Models. Agronomy 2025, 15, 1307. [Google Scholar] [CrossRef]
- Mat-Desa, S.; Mohd-Isa, W.-N.; Gomez-Krämer, P.; Roslee, M.; Hashim, N.; Abdullah, J.; Ali, A.; Che-Embi, Z.; Ibrahim, A. Dataset for small object detection with shadow (SODwS). Data Br. 2025, 60, 111482. [Google Scholar] [CrossRef]
- Benos, L.; Bechar, A.; Bochtis, D. Safety and ergonomics in human-robot interactive agricultural operations. Biosyst. Eng. 2020, 200, 55–72. [Google Scholar] [CrossRef]
- Pietrantoni, L.; Favilla, M.; Fraboni, F.; Mazzoni, E.; Morandini, S.; Benvenuti, M.; De Angelis, M. Integrating collaborative robots in manufacturing, logistics, and agriculture: Expert perspectives on technical, safety, and human factors. Front. Robot. AI 2024, 11, 1342130. [Google Scholar] [CrossRef]
- Jayaweera, H.M.P.C.; Hanoun, S. Path Planning of Unmanned Aerial Vehicles (UAVs) in Windy Environments. Drones 2022, 6, 101. [Google Scholar] [CrossRef]
- Sumi, Y.; Kim, B.K.; Ogure, T.; Kodama, M.; Sakai, N.; Kobayashi, M. Impact of Rainfall on the Detection Performance of Non-Contact Safety Sensors for UAVs/UGVs. Sensors 2024, 24, 2713. [Google Scholar] [CrossRef] [PubMed]
- Munir, A.; Siddiqui, A.J.; Anwar, S.; El-Maleh, A.; Khan, A.H.; Rehman, A. Impact of Adverse Weather and Image Distortions on Vision-Based UAV Detection: A Performance Evaluation of Deep Learning Models. Drones 2024, 8, 638. [Google Scholar] [CrossRef]
- Chakraborty, S.; Elangovan, D.; Govindarajan, P.L.; ELnaggar, M.F.; Alrashed, M.M.; Kamel, S. A Comprehensive Review of Path Planning for Agricultural Ground Robots. Sustainability 2022, 14, 9156. [Google Scholar] [CrossRef]
- Zhao, Z.; Zhang, Y.; Shi, J.; Long, L.; Lu, Z. Robust Lidar-Inertial Odometry with Ground Condition Perception and Optimization Algorithm for UGV. Sensors 2022, 22, 7424. [Google Scholar] [CrossRef]
- Suvittawat, A. Investigating Farmers’ Perceptions of Drone Technology in Thailand: Exploring Expectations, Product Quality, Perceived Value, and Adoption in Agriculture. Agriculture 2024, 14, 2183. [Google Scholar] [CrossRef]
- Marinoudi, V.; Benos, L.; Villa, C.C.; Lampridi, M.; Kateris, D.; Berruto, R.; Pearson, S.; Sørensen, C.G.; Bochtis, D. Adapting to the Agricultural Labor Market Shaped by Robotization. Sustainability 2024, 16, 7061. [Google Scholar] [CrossRef]
- Koubâa, A.; Allouch, A.; Alajlan, M.; Javed, Y.; Belghith, A.; Khalgui, M. Micro Air Vehicle Link (MAVlink) in a Nutshell: A Survey. IEEE Access 2019, 7, 87658–87680. [Google Scholar] [CrossRef]
- Fountas, S.; Malounas, I.; Athanasakos, L.; Avgoustakis, I.; Espejo-Garcia, B. AI-Assisted Vision for Agricultural Robots. AgriEngineering 2022, 4, 674–694. [Google Scholar] [CrossRef]
- Shan, B. A Computer Vision-Based Anti-Lodging Control System for Agricultural Machinery. In Proceedings of the 2025 6th International Conference on Computer Vision, Image and Deep Learning (CVIDL), Ningbo, China, 23–25 May 2025; IEEE: New York, NY, USA, 2025; pp. 1137–1142. [Google Scholar]
- Alqudsi, Y.; Makaraci, M. UAV swarms: Research, challenges, and future directions. J. Eng. Appl. Sci. 2025, 72, 12. [Google Scholar] [CrossRef]









| Aspect | Centralized Coordination | Decentralized Coordination |
|---|---|---|
| Decision-making | Supervisory node controls all agents | Each agent decides locally |
| Information flow | All data to central node | Agents exchange state and intent directly |
| Situational awareness | Global view at central node | Local awareness; global emerges via interactions |
| Optimization | High: mission-wide planning possible | Local: fleet-level optimality limited |
| Latency sensitivity | High; network delays affect response | Low; local reactions fast |
| Computational load | Centralized; low onboard requirement | Distributed; each agent needs processing power |
| Scalability | Limited by central node capacity | High; new agents easily added |
| Complexity | Lower; deterministic and structured | Higher; emergent behaviors and coordination logic |
| Typical use | Structured, supervised operations | Time-critical or unreliable-comm environments |
| Time Period | Time Window (EET) | Rationale for Vision System Testing |
|---|---|---|
| Low solar angle (morning) | 8:00 a.m.–10:30 a.m. | Low, rising solar angle |
| Peak solar angle (midday) | 11:30 a.m.–2:00 p.m. | High solar angle, centered on solar noon |
| Low solar angle (afternoon) | 2:30 p.m.–4:00 p.m. | Low, descending solar angle |
| Detection Outcome | Definition | System-Level Implication |
|---|---|---|
| True Positive (TP) | The detection system correctly identifies a real obstacle present within a crop row | The FMIS excludes this row from the UGV’s mission plan prior to traversal |
| True Negative (TN) | The detection system correctly determines that no obstacle is present in a crop row | The UGV is allowed to traverse the row as planned |
| False Positive (FP) | The detection system reports an obstacle in a crop row where no obstacle exists | The row is unnecessarily excluded from the mission plan |
| False Negative (FN) | An obstacle is present within a crop row, but the detection system fails to identify it | The UGV initiates an emergency stop, activates onboard LiDAR-based perception for obstacle classification, and executes a reactive avoidance maneuver to safely bypass the obstacle and resume navigation |
| Time Period | Observed: Full Success | Observed: Partial Success | Row Total | Expected: Full Success | Expected: Partial Success |
|---|---|---|---|---|---|
| Morning | 10 | 0 | 10 | 8.67 | 1.33 |
| Midday | 10 | 0 | 10 | 8.67 | 1.33 |
| Afternoon | 6 | 4 | 10 | 8.67 | 1.33 |
| Total | 26 | 4 | 30 | 26 | 4 |
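The expected counts in the table follow from the standard contingency-table formula (row total × column total / grand total), and the association test is an ordinary chi-square of independence. A minimal sketch reproducing both from the observed counts above:

```python
# Observed (full success, partial success) counts per time window, from the table.
observed = {"Morning": (10, 0), "Midday": (10, 0), "Afternoon": (6, 4)}

grand = sum(sum(row) for row in observed.values())                    # 30 trials
col_totals = [sum(r[j] for r in observed.values()) for j in (0, 1)]   # [26, 4]

chi2 = 0.0
expected = {}
for period, row in observed.items():
    row_total = sum(row)
    expected[period] = tuple(row_total * c / grand for c in col_totals)
    for o, e in zip(row, expected[period]):
        chi2 += (o - e) ** 2 / e   # Pearson chi-square contribution

print(expected["Morning"])  # (8.67, 1.33) per the table
print(round(chi2, 2))       # chi-square statistic
```

With df = (3 − 1)(2 − 1) = 2, the statistic computed here (≈9.23) exceeds the 5.99 critical value at α = 0.05, consistent with a significant association between time window and mission outcome; note that the small expected counts in the partial-success column would usually also prompt an exact-test check.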
© 2026 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license.
Share and Cite
Katikaridis, D.; Benos, L.; Kateris, D.; Papageorgiou, E.; Karras, G.; Menexes, I.; Berruto, R.; Sørensen, C.G.; Bochtis, D. Proactive Path Planning Using Centralized UAV-UGV Coordination in Semi-Structured Agricultural Environments. Appl. Sci. 2026, 16, 1143. https://doi.org/10.3390/app16021143

