Article

Experimental Comparative Analysis of Centralized vs. Decentralized Coordination of Aerial–Ground Robotic Teams for Agricultural Operations

by Dimitris Katikaridis 1,2,3, Lefteris Benos 1,*, Patrizia Busato 4, Dimitrios Kateris 1, Elpiniki Papageorgiou 5, George Karras 2 and Dionysis Bochtis 1,3

1 Institute for Bio-Economy and Agri-Technology (IBO), Centre of Research and Technology-Hellas (CERTH), 6th km Charilaou-Thermi Rd., 57001 Thessaloniki, Greece
2 Department of Informatics and Telecommunications, University of Thessaly, 35131 Lamia, Greece
3 farmB Digital Agriculture S.A., Dekatis Evdomis (17th) Noemvriou 79, 55534 Thessaloniki, Greece
4 Interuniversity Department of Regional and Urban Studies and Planning (DIST), Polytechnic of Turin, Viale Pier Andrea Mattioli 39, 10125 Torino, Italy
5 Department of Energy Systems, University of Thessaly, Gaiopolis Campus, 41500 Larisa, Greece
* Author to whom correspondence should be addressed.
Robotics 2025, 14(9), 119; https://doi.org/10.3390/robotics14090119
Submission received: 23 July 2025 / Revised: 20 August 2025 / Accepted: 27 August 2025 / Published: 28 August 2025
(This article belongs to the Special Issue Smart Agriculture with AI and Robotics)

Abstract

Reliable and fast communication between unmanned aerial vehicles (UAVs) and unmanned ground vehicles (UGVs) is essential for effective coordination in agricultural settings, particularly when human involvement is part of the system. This study systematically compares two communication architectures representing centralized and decentralized communication frameworks: (a) MAVLink (decentralized) and (b) Farm Management Information System (FMIS) (centralized). Field experiments were conducted in both empty field and orchard environments, using a rotary UAV for worker detection and a UGV responding to intent signaled through color-coded hats. Across 120 trials, the system performance was assessed in terms of communication reliability, latency, energy consumption, and responsiveness. FMIS consistently demonstrated higher message delivery success rates (97% in both environments) than MAVLink (83% in the empty field and 70% in the orchard). However, it resulted in higher UGV resource usage. Conversely, MAVLink achieved reduced UGV power draw and lower latency, but it was more affected by obstructed settings and also resulted in increased UAV battery consumption. In conclusion, MAVLink is suitable for time-sensitive operations that require rapid feedback, while FMIS is better suited for tasks that demand reliable communication in complex agricultural environments. Consequently, the selection between MAVLink and FMIS should be guided by the specific mission goals and environmental conditions.

1. Introduction

Nowadays, agriculture is experiencing a transformation towards Agriculture 5.0, stemming from the Society 5.0 paradigm [1,2]. The latter is an emerging concept that incorporates a plethora of information and communications technologies (ICT) [3]. A remarkable aspect of this transition is the integration of cyber-physical systems and human expertise, laying the groundwork for improving not only productivity, but safety and sustainability as well [4]. Among the most promising technologies are unmanned ground vehicles (UGVs) and unmanned aerial vehicles (UAVs) [5,6]. For instance, UGVs are used on the ground to transport items, conduct inspections, navigate crop rows, or handle short harvesting windows [7,8]. In turn, UAVs provide aerial support, for example, by capturing multispectral and thermal field imagery, monitoring crop health, and carrying out targeted spraying [9,10].
Within this context, coordinated operation of UAV–UGV teams has also emerged, with complementary perception–action loops assigned to each platform. This synergy can lead to more effective and time-efficient in-field operations, with each platform benefitting from the other's strengths [11,12]. In particular, this cooperation relies on a multi-phase strategy that allocates specific responsibilities to each autonomous agent, which depend on the activities of other agents to achieve the desired result [13]. In practical field conditions, their coordination, whether it involves sharing location data, responding to a worker's request, or navigating around obstacles, depends heavily on how the vehicles are interconnected. In some setups, they communicate directly with each other, which speeds up response when quick decisions are needed [14]. Alternatively, the system relies on a central controller to coordinate the required tasks [15]. This approach can offer a broader view of the processes and support more intelligent task prioritization. However, as direct communication between the UAV and UGV tends to be quicker, centralized systems can introduce delays, resulting in possible bottlenecks when network connectivity is unreliable [16].
For example, UAVs can provide wide-area situational awareness, whereas UGVs can execute localized actions in agricultural scenarios requiring time-sensitive and precise operations, like route planning. In this scenario, route planning can be (a) deterministic, where the entire path is precomputed using an offline description of the environment, or (b) sensor-based, where the path is produced dynamically in real time using data from onboard sensors [17]. Despite its great potential in the agricultural sector, especially in open-field settings, research in this area remains relatively limited and underexplored [18]. The main challenges stem from unstructured environments with moving machinery, terrain variability, and the dynamic nature of human involvement [19]. Some examples of cooperating unmanned vehicles, whether of a single type or hybrid, concern spraying [20,21], autonomous UGV route planning [22], weeding [23], harvesting [24,25], seeding [26], soil monitoring [27], and disease detection [28].
An indicative study is that of Mammarella et al. [20], where a fixed-wing UAV captured aerial imagery from a vineyard that was utilized to create a low-complexity georeferenced three-dimensional (3D) map. Rotary-wing UAVs and a research-oriented, highly maneuverable UGV designed for agricultural applications autonomously navigated the inter-row paths to precisely spray pesticides on crops. In the same vein, Katikaridis et al. [22] developed and tested a UAV-supported route planning approach for UGVs within an orchard environment. In practice, the UAV initially maps the orchard area, and a commercial UGV retrieves the necessary data from an FMIS and determines the optimal navigation path while avoiding obstacles in the field. Notably, apart from the application, the above studies differ mainly in how UAV–UGV coordination is accomplished: direct communication in [20] and a centralized approach via the FMIS in [22].
Recent advances in human–robot interaction (HRI) in agriculture [29,30] have led to more adaptable and resilient systems, prompting interest in how human presence can be effectively integrated into UAV–UGV coordination to maximize overall benefits. To address this critical gap in agricultural robotics research, this study presents a systematic comparison of centralized versus decentralized communication architectures for UAV–UGV synergy, also considering the human-in-the-loop element. Through field testing, both in an empty field and an orchard, we quantitatively evaluate these approaches to determine their effectiveness and responsiveness. The human-in-the-loop multi-robot system presented in this study integrates the following: (a) a rotary-wing UAV equipped with a red, green, blue (RGB) camera and integrated with a custom tree and worker detection algorithm; (b) a commercial, multi-functional autonomous ground vehicle capable of receiving task commands via either direct wireless communication or a server-based middleware; and (c) a human participant equipped with color-coded hats that indicate different UGV responses. This framework can be applied in real-world harvesting scenarios, such as in an orchard. For example, human workers conduct manual harvesting while the UGV assists by transporting crates to predefined drop-off points, with UAV-supported route planning, in alignment with recent studies [31,32]. Consequently, the focus of this work was not on specific task completion, but rather on the quality and responsiveness of coordination between UAV and UGV systems. The color-coded hats can serve as a signaling mechanism to trigger different UGV responses based on situational demands, similar to approaches using visual markers for task assignment [33].

2. Materials and Methods

2.1. System Overview

The experimental system was designed to assess the effectiveness of centralized (FMIS) and decentralized (MAVLink) communication architectures for UAV–UGV collaboration in agricultural environments. It incorporates three main elements: (a) an aerial unit (UAV); (b) a ground unit (UGV); and (c) a human participant. Together, these components form a semi-autonomous, human-in-the-loop framework that allows real-time monitoring, decision-making, and task execution under realistic field conditions.
In particular, the UAV is responsible for aerial monitoring, capturing high-resolution imagery of the participant and surrounding environment. These visual data are either processed onboard (in the case of MAVLink) or transmitted to an external server (for FMIS) to detect the participant’s location and identify behavior cues, such as color-coded hat markers. Based on this information, the UGV executes the corresponding ground tasks, such as approaching the participant, while avoiding obstacles. The human participant acts as a dynamic input to the system, providing non-verbal behavioral cues that trigger UGV responses, thereby enabling assessment of system responsiveness and coordination efficiency under realistic field conditions.
Each component is described in technical detail in the subsections below. Moreover, Table A1 and Table A2 in Appendix A summarize the main hardware and software configurations of the UAV and UGV, respectively, for quick reference.

2.1.1. Aerial Unit: Unmanned Aerial Vehicle Platform

The UAV platform utilized in this study is a custom quadcopter designed specifically for high-resolution monitoring and autonomous operation in agricultural environments. The complete UAV platform, along with the necessary sensors, is depicted in Figure 1a–e. At its core is the Pixhawk 4 autopilot system (Holybro, Shenzhen, China), which is equipped with a 32-bit ARM Cortex-M7 processor running at 216 MHz. This ensures stable and reliable flight control, with support for both ArduPilot and PX4 firmware to enable precise autonomous navigation. High-accuracy localization is achieved via an H-RTK F9P Global Navigation Satellite System (GNSS) receiver (Holybro, Shenzhen, China). This uses multi-band signals and Real-Time Kinematic (RTK) corrections to deliver positional accuracy within 1–2 cm, a level of precision crucial for georeferenced imaging.
As far as the visual data acquisition is concerned, the UAV carries a Sony Alpha 6100 RGB camera featuring a 24.2 MP APS-C Exmor CMOS sensor (Sony Corporation, Tokyo, Japan), capable of capturing detailed imagery at up to 11 fps. This camera is tightly integrated with the navigation system through a Seagull Map-X2 trigger (Seagull UAV, Copenhagen, Denmark), which ensures precise shutter activation based on predefined waypoints or distance intervals. A Raspberry Pi 4 Model B (quad-core Cortex-A72, 4 GB RAM) (Raspberry Pi Foundation, Cambridge, UK) serves as the onboard processing unit, managing real-time image aggregation and communication through a 4G LTE module supporting downlink speeds up to 150 Mbps. The system also includes a SiK Telemetry Radio (433 MHz) (Holybro, Shenzhen, China) for Micro Air Vehicle Link (MAVLink) communication and an R9SX receiver (900 MHz) (FrSky Electronic Co., Ltd., Wuxi, China) for manual remote control via an FrSky Taranis X9D Plus transmitter (FrSky Electronic Co., Ltd., Wuxi, China). The choice of 433/900 MHz radios for MAVLink and 4G LTE for FMIS reflects the differing communication demands of each architecture. MAVLink, focused on lightweight command and control, operates efficiently within the limited throughput of sub-GHz radios (57.6–115.2 kbps) while benefiting from robust non-line-of-sight performance. In contrast, FMIS requires higher bandwidth to support larger JavaScript Object Notation (JSON) payloads and data-rich exchanges, which are effectively handled by 4G LTE links providing tens of Mbps. Mission planning and monitoring are performed using QGroundControl (QGroundControl Development Team, Zurich, Switzerland). Finally, the UAV is powered by a 12,000 mAh LiPo battery (22.2 V, 6 S), providing an effective flight endurance of 30–40 min under typical field conditions.
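To ground the MAVLink link setup in code, the following minimal sketch (Python with the pymavlink library, not the authors' implementation) opens the telemetry connection from a companion computer and reads the autopilot's periodic battery reports; the serial port and 57,600 baud rate are assumptions matching typical SiK radio defaults.

```python
# Minimal sketch (not the study's code): opening the MAVLink telemetry
# link from the companion computer. The serial port and baud rate are
# assumptions; adjust them to how the SiK radio enumerates in practice.
from pymavlink import mavutil

link = mavutil.mavlink_connection("/dev/ttyUSB0", baud=57600)

# Block until the autopilot's HEARTBEAT arrives, learning its system ID
link.wait_heartbeat()
print("Heartbeat from system", link.target_system)

# The autopilot publishes battery status at a default rate of ~1 Hz
msg = link.recv_match(type="BATTERY_STATUS", blocking=True, timeout=5)
if msg is not None:
    print("Battery remaining: %d%%" % msg.battery_remaining)
```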

2.1.2. Ground Unit: Unmanned Ground Vehicle Platform

Concerning the ground unit of the present system, a UGV (Thorvald, SAGA Robotics SA, Oslo, Norway), commonly used in the relevant literature for autonomous agricultural operations [34], was implemented. The present robotic platform integrates a comprehensive sensor suite, including a Velodyne VLP-16 Puck laser scanner (Velodyne Lidar Inc., San Jose, CA, USA). This facilitates detailed 3D environmental mapping and obstacle detection critical for precision farming tasks. Coupled with an Intel NUC (Intel, Santa Clara, CA, USA) featuring an i5 6200U processor, 16 GB of RAM, and 128 GB of onboard storage running Ubuntu Linux 18.04.6 LTS (Canonical Ltd., London, UK), this UGV efficiently manages the complex computational tasks required for real-time data processing and autonomous navigation. Thorvald operates on ROS Melodic (Open Source Robotics Foundation, Mountain View, CA, USA), enabling standardized software modularity, ease of integration, and enhanced interoperability across various robotic components, complemented by the Carrot Planner. The planner takes a goal point from an external user and checks whether the user-specified goal lies in an obstacle; if it does, it walks back along the vector between the user-specified goal and the robot until a goal point that is not in an obstacle is found. It then passes this goal point on as a plan to a local planner or controller. In this way, the Carrot Planner allows the robot to get as close to a user-specified goal point as possible. Communication with the Roboteq motors is facilitated through a PEAK CANbus system (PEAK-System Technik GmbH, Darmstadt, Germany), ensuring reliable and efficient motor control. Navigation accuracy is further enhanced by incorporating an Inertial Measurement Unit (IMU), Xsens MTi 630R SK (Xsens Technologies B.V., Enschede, The Netherlands), and an RTK GPS receiver (S850 GNSS Receiver, Stonex Inc., Concord, NH, USA). This allows the precise localization essential for tasks such as navigation [31]. In addition, MAVProxy (ArduPilot Development Team, Canberra, Australia) is utilized to forward MAVLink messages to the Robot Operating System (ROS) (Open Source Robotics Foundation, Mountain View, CA, USA) for integrated system control [35]. Figure 2a–c illustrates the implemented autonomous ground unit and the essential onboard sensors.
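The Carrot Planner's goal-adjustment behavior can be summarized in a few lines. The sketch below is an illustrative Python reimplementation of the logic described above, not the ROS package itself; the in_obstacle callable stands in for a costmap lookup.

```python
import numpy as np

def carrot_plan(goal, robot, in_obstacle, step=0.05):
    """Illustrative Carrot Planner logic: if the user-specified goal lies
    in an obstacle, walk back along the goal-to-robot vector until a
    collision-free point is found, and return it as the adjusted goal."""
    goal = np.asarray(goal, dtype=float)
    robot = np.asarray(robot, dtype=float)
    direction = robot - goal
    dist = float(np.linalg.norm(direction))
    if dist == 0.0:
        return None if in_obstacle(goal) else goal
    direction /= dist  # unit vector from the goal toward the robot
    d = 0.0
    while d <= dist:
        candidate = goal + d * direction
        if not in_obstacle(candidate):
            return candidate  # closest reachable point to the user goal
        d += step
    return None  # no free point between goal and robot

# Usage with a toy obstacle test: a disk of radius 1 m centered at (5, 5)
blocked = lambda p: np.linalg.norm(p - np.array([5.0, 5.0])) < 1.0
print(carrot_plan([5.0, 5.0], [0.0, 0.0], blocked))
```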

2.1.3. Human Participant

In this system, a human participant was integrated as an essential element of the human-in-the-loop framework to replicate real-world agricultural operations. The participant's interactions with the robotic platforms were structured to evaluate the responsiveness and efficiency of the semi-autonomous coordination mechanism under realistic operational conditions. To enable non-verbal HRI, participants were equipped with color-coded hats that served as visual markers for behavior-triggering algorithms. The UAV identifies the color of the worker's hat through onboard visual recognition and then transmits this information to the UGV, enabling it to respond according to the predefined operational directive. This onboard process applies when the MAVLink protocol is used; when FMIS is utilized, image processing is conducted on an external server. The image shown in Figure 3 was captured by the UAV approximately 10 m above ground level while navigating an orchard and detecting a participant wearing a white hat. The visual data were processed to identify the hat color, enabling the system to initiate the corresponding action from the UGV. This action can be programmed depending on the application at hand.
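As an illustration of how such color-based cues can be extracted, the sketch below thresholds an image in HSV space with OpenCV. The threshold values and the minimum-area parameter are hypothetical; the study's actual detection algorithm is not published in this form.

```python
import cv2
import numpy as np

# Hypothetical HSV thresholds for the three hat colors; the values used
# in the study are not published and would need field calibration for
# lighting conditions and camera settings.
HAT_RANGES = {
    "blue":  (np.array([100, 120,  70]), np.array([130, 255, 255])),
    "red":   (np.array([  0, 120,  70]), np.array([ 10, 255, 255])),
    "white": (np.array([  0,   0, 200]), np.array([179,  40, 255])),
}

def detect_hat_color(image_bgr, min_pixels=500):
    """Return the hat color whose HSV mask covers the largest pixel area,
    or None if no mask exceeds the minimum-area threshold."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    best_color, best_area = None, min_pixels
    for color, (lo, hi) in HAT_RANGES.items():
        area = cv2.countNonZero(cv2.inRange(hsv, lo, hi))
        if area > best_area:
            best_color, best_area = color, area
    return best_color
```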
Data collection was conducted across two distinct agricultural environments, with a single participant assigned to each site. Before participating in any experimental procedures, all individuals were required to sign an informed consent form, which had received prior approval from the Institutional Ethical Committee.

2.2. Inter-Robot Communication Protocol

Two distinct cases were developed and tested in the field, namely UAV–UGV coordination via (a) a commercial FMIS (farmB Digital Agriculture P.C., version 3.28.0, Thessaloniki, Greece) and (b) the MAVLink protocol [36,37]. Both the centralized and decentralized approaches are briefly described below.

2.2.1. Coordination via a Farm-Management Information System

In the first case, communication between the UAV and the UGV was coordinated through a Hypertext Transfer Protocol (HTTP)-based FMIS middleware, which facilitated data exchange via well-defined application programming interface (API) protocols. The UGV employed a dedicated 4G LTE communication module (shown in Figure 2c) to ensure persistent connectivity with cloud-based servers. This provided real-time telemetry transmission, remote supervision, and continuous integration with the external FMIS infrastructure. It enhanced operational scalability, data-driven decision-making, and the potential for long-range deployments across variable agricultural terrains. In short, the FMIS receives a geotagged image and transmits the worker’s GPS location and color indication to the UGV.
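To make the exchange concrete, the sketch below mimics the UAV-side upload of a geotagged image and the retrieval of the detection result. The FMIS API is proprietary, so the endpoint, field names, and response schema are illustrative assumptions only.

```python
import requests

# The FMIS API is proprietary: the endpoint, field names, and response
# schema below are illustrative assumptions, not the actual interface.
FMIS_URL = "https://fmis.example.com/api/v1/detections"

def report_detection(image_path, lat, lon, token):
    """Upload a geotagged UAV image to the FMIS and return the server's
    detection result (worker position and hat-color indication)."""
    with open(image_path, "rb") as f:
        resp = requests.post(
            FMIS_URL,
            headers={"Authorization": "Bearer " + token},
            files={"image": f},
            data={"lat": lat, "lon": lon},
            timeout=10,
        )
    resp.raise_for_status()
    # Hypothetical response:
    # {"worker_lat": ..., "worker_lon": ..., "color": "white"}
    return resp.json()
```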

2.2.2. Coordination via MAVLink Protocol

In the second case, the system employed a MAVLink-based decentralized communication framework to facilitate direct interoperability between the UAV and UGV. MAVLink messages are structured units of data: each includes a header with sender, receiver, and message-type information, a payload carrying the actual data, and a checksum to verify the data's integrity [38]. This protocol enables real-time machine-to-machine data exchange between the aerial and ground robotic units, eliminating reliance on centralized servers [39]. The MAVLink messages were transmitted over a low-overhead protocol, ideal for bandwidth-constrained agricultural environments. Integration with ROS was achieved through the MAVROS interface, which allowed MAVLink telemetry and command data to be seamlessly processed within ROS nodes [40]. This architecture ensured modularity, interoperability, and real-time behavior orchestration between agents. Furthermore, by adopting a server-independent design, the system maintained full operational capability even in remote or low-connectivity field conditions.
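The frame anatomy described above can be inspected directly with pymavlink. The sketch below packs a standard HEARTBEAT message and prints the header, message-type, and checksum bytes of the resulting MAVLink 2 frame; the system and component IDs are arbitrary examples rather than the study's configuration.

```python
from pymavlink.dialects.v20 import common as mavlink2

# Sketch: anatomy of a MAVLink 2 frame packed with the common dialect.
# The source system/component IDs below are arbitrary examples.
mav = mavlink2.MAVLink(None, srcSystem=1, srcComponent=1)
msg = mav.heartbeat_encode(
    mavlink2.MAV_TYPE_QUADROTOR,   # vehicle type
    mavlink2.MAV_AUTOPILOT_PX4,    # autopilot type
    0, 0,                          # base mode, custom mode
    mavlink2.MAV_STATE_ACTIVE,     # system status
)
frame = msg.pack(mav)

print(hex(frame[0]))        # 0xfd start marker of a MAVLink 2 frame
print(frame[5], frame[6])   # header: sender system ID, component ID
print(msg.get_msgId())      # message type (HEARTBEAT = 0)
print(frame[-2:].hex())     # trailing 16-bit checksum verifying integrity
```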

2.3. System Operational Workflow

2.3.1. Centralized Approach

Figure 4 shows the operational workflow between the UAV and the UGV, with the FMIS acting as an intermediary for communication and coordination. Both the aerial and ground vehicles are equipped with 4G communication modules that enable real-time data exchange with the FMIS regardless of physical separation. The UAV begins by executing a survey flight plan, capturing geotagged imagery using an onboard Raspberry Pi, which transmits both image and GPS data to the FMIS. The FMIS performs tree recognition using a machine learning (ML) model developed in [22], which is necessary for planning UGV paths, as well as color-based worker recognition relying on simple visual markers (e.g., hats in this study). These markers trigger UAV decisions and message dispatches to the UGV. The FMIS analyzes these data to generate navigation plans that are then communicated to the UGV. As the UGV navigates towards each worker, it logs its GPS data and stores task-specific buckets of coordinates. Notably, while the UAV provides macro-level mapping, the UGV uses onboard LiDAR for local obstacle avoidance in dynamic field conditions, handling unmapped hazards like moving machinery, using the methodology of [41]. When the UGV reaches the participant, it notifies the FMIS, which, in turn, informs the UAV to update its situational awareness. Once the field has been fully surveyed, the UAV returns to the predefined location. Finally, all mission-related GPS data from both platforms are stored centrally in the FMIS for post-mission analysis and traceability.

2.3.2. Decentralized Approach

The integration between the aerial and ground units in this approach is achieved through a structured communication protocol leveraging MAVLink-ROS messaging over universal asynchronous receiver/transmitter (UART) telemetry and ground control infrastructure. As can be seen in Figure 5, the UAV, equipped with a Raspberry Pi acting as a ROS node within the UGV's roscore [42], performs aerial surveying using a pre-defined flight plan. During flight, the UAV's onboard RGB camera captures real-time imagery and transmits these data to the Raspberry Pi, where the color-detection algorithm runs and returns a color code to the UAV via a trigger port. The UAV forwards that color code to the UGV via a MAVLink message. Each hat color corresponds to a unique operational state. Detected states and associated GPS coordinates are processed onboard and transmitted to the UGV via MAVLink messages, which are converted to C++-readable ROS topics by a primary node responsible for managing inter-vehicle communication. The system supports tree detection and classification through embedded ML algorithms, which enhance the UGV's in-field navigation capabilities, as detailed in [22].
After receiving positional and color information, the UGV autonomously navigates to the corresponding worker using RTK-GPS and confirms arrival to the UAV via the same MAVLink-ROS channel. During operations, the UGV maintains a dynamic record of each worker, including coordinates and state, and stores these data for post-processing. LiDAR-based obstacle detection ensures real-time avoidance of unforeseen obstructions not captured by the UAV, such as incoming machinery or a tree missed by the tree detection algorithm. Once the UAV completes its survey or reaches its battery limits, it sends a specific termination message to the UGV. It then terminates all active MAVLink-ROS connections, clears the onboard Raspberry Pi storage, and returns to the predefined location.
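A minimal sketch of such a UGV-side bridge node is given below. It assumes MAVROS republishes incoming UAV STATUSTEXT messages on /mavros/statustext/recv and that payloads follow a "color:lat:lon" convention; both the topic and the payload format are assumptions, and the paper's primary node converts MAVLink messages to ROS topics in C++ rather than Python.

```python
#!/usr/bin/env python
# Illustrative UGV-side bridge node, not the authors' implementation.
# Assumes MAVROS republishes incoming UAV STATUSTEXT messages on
# /mavros/statustext/recv and that payloads follow a "color:lat:lon"
# convention; both are assumptions made for this sketch.
import rospy
from std_msgs.msg import String
from mavros_msgs.msg import StatusText

def on_statustext(msg, pub):
    parts = msg.text.split(":")
    if len(parts) == 3:  # e.g. "white:39.6401:22.4128"
        # Republish worker intent and position for the navigation stack
        pub.publish(String(data=msg.text))

if __name__ == "__main__":
    rospy.init_node("uav_ugv_bridge")
    pub = rospy.Publisher("/worker/intent", String, queue_size=10)
    rospy.Subscriber("/mavros/statustext/recv", StatusText,
                     on_statustext, callback_args=pub)
    rospy.spin()
```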

2.4. Experimental Setup and Execution

As a means of evaluating the effectiveness of the proposed collaborative framework under varying conditions and communication protocols, a structured experimental methodology was carefully designed and implemented. Two distinct neighboring fields were chosen to evaluate system performance under varying levels of complexity: (a) an empty field, a flat, obstacle-free area of 5.5 hectares with a roughly rectangular shape simulating basic operational conditions, and (b) an almond orchard, a trapezoid-shaped field with an area of 6.4 hectares. In this field, the 995 almond trees have been planted in a grid pattern with consistent 8 × 8 m spacing, which facilitates efficient management. In Figure 6, a satellite view of the two distinct agricultural fields, located in the Thessaly region of central Greece, is presented.
As mentioned above, the UAV was equipped with an RGB camera and onboard processing capabilities that enabled detection of both orchard trees and hat color. Each hat color could indicate a specific worker intent, for example, (a) blue for no assistance, (b) white for logistical support, and (c) red for emergency situations. In this context, the aerial unit also acted as a communication node, transmitting the relevant commands to the UGV via two protocols. These included the MAVLink protocol and the FMIS, which provided direct and indirect communication pathways, respectively.
The aforementioned procedure was repeated for all combinations of experimental variables: two distinct environments (open field, orchard), with one participant involved in each, two communication protocols (MAVLink, FMIS), and three types of worker intent signaled through hat color (blue, white, red). Each configuration was tested in 10 separate trials, resulting in 120 trials in total.

2.5. Critical Performance and System Behavior Metrics

Two distinct phases were examined in this analysis as a means of evaluating how effectively UAVs and UGVs communicate and respond:
  • Phase 1: The UAV transfers a photo to the Raspberry Pi, where the image is analyzed to extract the color information. Once the UGV receives the color result, it exploits these data to make decisions based on the detected colors or objects in the image. At the same time, the UGV receives the worker's latitude and longitude coordinates, enabling it to determine its position relative to the worker.
  • Phase 2: In this phase, the ground vehicle reaches the worker, marking a key point in the collaborative task. Once the UGV has arrived at the worker, it informs the UAV of its position, allowing the UAV to update its situational awareness.
A data collection template (in a Google Sheet format) was used during all experimental sessions to systematically record important data. Table 1 summarizes the variables along with a short description and the corresponding phase. These metrics were selected for the purpose of assessing the performance of the system, energy efficiency, communication reliability, and computational load during key moments of UAV–UGV collaboration. Specifically, timing and latency values show how effectively relevant data are exchanged. Moreover, battery usage signifies the energy demands related to each task, while CPU load corresponds to the processing burden regarding each unit.
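As a simple illustration of how two of these metrics can be instrumented, the sketch below times a handshake round trip and samples CPU load with the psutil library; send_fn and wait_for_ack are hypothetical callables standing in for the protocol-specific message operations.

```python
import time
import psutil  # cross-platform CPU sampling

def handshake_latency_ms(send_fn, wait_for_ack):
    """Elapsed time (ms) from dispatching the detected-color message
    until the UGV's acknowledgement is received. Both arguments are
    hypothetical callables wrapping the protocol-specific operations."""
    t0 = time.perf_counter()
    send_fn()
    wait_for_ack()
    return (time.perf_counter() - t0) * 1000.0

def cpu_load_percent(interval=1.0):
    """Average CPU utilization (%) over the sampling interval."""
    return psutil.cpu_percent(interval=interval)
```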

3. Results

Across 120 field trials, which were conducted in two environments (open field, orchard) involving three types of worker intent signaled through hat color (blue, white, red), we compared MAVLink telemetry with an FMIS-based middleware. Overall, FMIS demonstrated a high success rate of 97% in both environments, outperforming MAVLink, which achieved 83% success in the empty field and 70% in the orchard. The flowchart in Figure 7 depicts the trial breakdown along with the success or failure outcomes for each subcase. In brief, the two failures associated with FMIS occurred during trials conducted in the orchard environment, where dense foliage and intermittent signal interference occasionally caused temporary communication dropouts. In contrast, MAVLink exhibited a lower overall success rate, with five failures in the empty field and nine failures in the orchard environment. The majority of MAVLink failures were attributed to packet loss and intermittent link dropouts, especially in the orchard, mirroring the failure pattern of the FMIS-based trials. These conditions caused frequent communication disruptions, impacting the reliability of the MAVLink protocol. Trial outcomes were not influenced by the specific hat color used, indicating that the color-coded signaling mechanism was reliable across conditions. In particular, we performed a chi-square test of independence comparing success and failure counts across all hat colors (Appendix B). The results showed no significant association between hat color and trial outcome (χ² = 1.01, p = 0.60), confirming that hat color did not influence system performance.
Towards examining various aspects of UAV–UGV coordination, we focused on two distinct phases of the process: Phase 1 (color detection and worker localization initialization) and Phase 2 (ground vehicle reaches worker and updates aerial vehicle with position). As described in Section 2.5, different metrics were employed to evaluate each phase targeting the efficiency of the communication between the aerial and ground units. Subsequently, the comparative performance of the MAVLink-based and FMIS-based systems for the two environments is illustrated using bar charts.
Summary tables pertaining to Phase 1 (Table A4) and Phase 2 (Table A5) across all metrics are presented in Appendix C for the sake of completeness. Moreover, paired-samples t-tests were conducted for each performance metric to assess statistical significance, with results summarized in Table A6 and Table A7 for Phase 1 and Phase 2, respectively. For each metric, the mean difference (MAVLink minus FMIS), the t-statistic, and the corresponding p-value were calculated to assess significance. In addition, Cohen's d was computed to quantify effect size, providing insight into the practical relevance of the observed differences. Cohen's d for a paired-samples t-test was calculated as d = t/√n, where t is the test statistic from the paired t-test and n is the number of paired observations. Statistical significance was set at a threshold of p < 0.05. All analyses were performed using Microsoft Excel.
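Although the study's analyses were performed in Excel, the equivalent computation is straightforward to reproduce; the sketch below applies a paired-samples t-test and the d = t/√n effect-size formula to placeholder arrays (not the study's data).

```python
import numpy as np
from scipy import stats

# Placeholder paired measurements; the study's per-trial data are not public
mavlink = np.array([4.74, 4.81, 4.70, 4.85, 4.77])
fmis    = np.array([2.93, 3.16, 3.01, 3.10, 2.98])

t_stat, p_value = stats.ttest_rel(mavlink, fmis)  # paired-samples t-test
cohens_d = t_stat / np.sqrt(len(mavlink))         # d = t / sqrt(n)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}, Cohen's d = {cohens_d:.2f}")
```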

3.1. Phase 1: Color Detection and Worker Localization Initialization

For the purpose of evaluating the communication performance between the aerial and ground vehicles during the initial localization and detection phase, we first measured the total duration of the associated process. In particular, we measured the time taken from when the camera image was transferred to the Raspberry Pi to when the UGV received the color classification and the worker's GPS coordinates. As depicted in Figure 8a, FMIS consistently outperformed MAVLink in terms of speed in this case. In the empty field environment, the FMIS-based system achieved an average time of 2.93 s (SD = 0.55), while MAVLink required 4.74 s (SD = 0.53). Similarly, in the orchard setting, FMIS completed the phase in 3.16 s (SD = 0.56), while MAVLink completed it in 4.81 s (SD = 0.47).
A paired-samples t-test was conducted to compare total duration between the MAVLink and FMIS protocols across all trials. MAVLink trials (M = 4.78 s, SD = 0.51) were significantly longer than FMIS trials (M = 3.04 s, SD = 0.60). The mean difference of 1.74 s was statistically significant, t = 18.34, p < 0.001, and was associated with a very large effect size (Cohen's d = 2.37). This indicates that the observed difference is both statistically significant and substantial in magnitude. Consequently, during the early stages of UAV–UGV coordination, the FMIS facilitates faster integration and processing. The reason why FMIS outperforms MAVLink in this metric is associated with differences in protocol design. MAVLink is a general-purpose communication protocol that may present higher overhead in this context. In contrast, FMIS, especially its module designed for this purpose, appears to be optimized for quicker data handling and lower latency. In addition, the FMIS middleware may be better integrated with the onboard hardware (Raspberry Pi, etc.), allowing faster processing compared to MAVLink's communication stack.
Subsequently, we focused on the communication latency during the handshake process, defined as the total time elapsed from the UAV sending the detected color information to the UGV and the UGV successfully replying back. This metric reflects the responsiveness of the communication link. The results shown in Figure 8b indicate that for the MAVLink protocol in the empty environment, the average latency was 276.74 ms (SD = 19.52). Under the same conditions, FMIS exhibited a slight increase of approximately 13%. In the orchard environment, MAVLink and FMIS demonstrated almost equal latency values. A paired-samples t-test was also conducted to compare handshake latency between the MAVLink and FMIS protocols across trials. MAVLink trials (M = 293 ms, SD = 52.76) and FMIS trials (M = 309 ms, SD = 34.73) did not differ significantly, t = −0.04, p = 0.97. The mean difference of −16 ms corresponds to a negligible effect size (Cohen's d = −0.005), indicating that handshake performance was statistically and practically equivalent between the two protocols.
As far as the results on energy consumption are concerned, battery consumption data for the UAV were obtained via internal system reports provided by the MAVLink protocol, which includes battery status updates at a default sampling interval of 1 Hz. The accuracy of these internal reports corresponds to the specifications of the UAV's flight controller and battery management system. For the UGV, battery status was monitored using an onboard Battery Management System (BMS) board, which continuously tracks voltage, current, and state of charge. Focusing on the present analysis, FMIS consistently consumed less UAV battery in both environments compared to MAVLink (Figure 9a). This was mainly owing to its architectural design, which offloads significant communication handling and data processing tasks to the UGV. FMIS minimizes onboard UAV computation by delegating responsibilities such as message aggregation, error correction, and link maintenance to the ground system. This design allows the aerial unit to focus on flight control and sensor acquisition with minimal communication overhead. Overall, MAVLink trials (M = 4.30%, SD = 0.50) consumed significantly more battery than FMIS trials (M = 1.06%, SD = 0.53), t = 41.91, p < 0.001. The mean difference of 3.24% represents an extremely large effect size (Cohen's d = 5.41), indicating a consistent battery advantage for the FMIS protocol in this phase.
However, this shift in processing burden results in increased UGV battery usage for the FMIS, as shown in Figure 9b. In contrast, MAVLink's lightweight communication structure imposes less strain on the UGV. Overall, MAVLink trials (M = 0.55%, SD = 0.11) consumed markedly less battery than FMIS trials (M = 1.60%, SD = 0.60), t = −12.27, p < 0.001. The mean difference of −1.05% represents a large effect size (Cohen's d = −1.77), showing that the MAVLink protocol is substantially more energy-efficient in terms of UGV battery consumption in this phase.
Regarding the UAV CPU load measurements for both environments, it can be inferred that FMIS and MAVLink impose similar processing demands on the aerial platform, as can be seen in Figure 9c. This occurs because the CPU of the aerial unit is only used for message delivery to the UGV and vice versa. Specifically, MAVLink trials (M = 49.14%, SD = 5.88) and FMIS trials (M = 50.23%, SD = 5.07) did not differ significantly (t = −1.46, p = 0.15). The mean difference of −1.09% corresponds to a small effect size (Cohen's d = −0.21), indicating similar CPU load for both protocols.

3.2. Phase 2: Ground Vehicle Reaches Worker and Updates Aerial Vehicle with Position

In Phase 2, where the UGV reaches the worker and informs the UAV of the worker's position, the UGV-to-UAV position report latency becomes a useful metric, reflecting the responsiveness of inter-robot coordination. The results provided in Figure 10a illustrate that MAVLink exhibits significantly lower latency in both environments, with approximately 102.65 ms (SD = 8.86) in the empty field and 104.40 ms (SD = 7.29) in the orchard, compared to the FMIS's 308.20 ms (SD = 20.02) and 340.20 ms (SD = 21.06), respectively. This highlights that MAVLink, owing to its low-overhead messaging structure, offers faster transmission and processing of status updates. In contrast, FMIS introduces greater communication delay in this phase, likely due to additional message handling layers, such as reliability checks or buffering mechanisms. Similar to Phase 1, a paired-samples t-test was conducted to compare UGV-to-UAV position report latency between the MAVLink and FMIS protocols. MAVLink trials (M = 103.53 ms, SD = 8.03) showed significantly lower latency than FMIS trials (M = 324.20 ms, SD = 54.44), t = −25.26, p < 0.001. The mean difference of −220.67 ms corresponds to a very large effect size (Cohen's d = −4.10), indicating substantially faster position reporting with the MAVLink protocol.
The UGV battery usage in this phase reflects the energy cost associated with navigating to the worker and the subsequent communication with the UAV. The results in Figure 10b illustrate that FMIS consistently causes higher UGV battery consumption across both environments. This increased consumption under FMIS aligns with its communication model, which offloads more processing and communication responsibilities to the UGV. Specifically, FMIS processes tasks that require additional calculations and continuous transmission activity, hence increasing energy consumption. This trend is consistent with the higher UGV-to-UAV position report latency observed under FMIS, where the extended communication time likely contributes to additional battery drain. A paired-samples t-test was conducted to compare UGV battery use between the MAVLink and FMIS protocols during Phase 2. MAVLink trials (M = 1.71%, SD = 0.45) used significantly less battery than FMIS trials (M = 2.28%, SD = 0.44), t = −5.62, p < 0.001. The mean difference of −0.57% corresponds to a large effect size (Cohen's d = −0.91), indicating a meaningful battery efficiency advantage for MAVLink in Phase 2.
The UGV CPU usage results during Phase 2 (Figure 10c) further support the differences observed in energy consumption and communication behavior between FMIS and MAVLink. In short, FMIS leads to approximately 30% and 60% increases in the empty field and orchard, respectively, compared to MAVLink. A paired-samples t-test was conducted to compare the UGV CPU load between the MAVLink and FMIS protocols. MAVLink trials (M = 18.90%, SD = 3.85) showed significantly lower CPU load than FMIS trials (M = 27.36%, SD = 3.46), t = −10.76, p < 0.001. The mean difference of −8.46% corresponds to a very large effect size (Cohen's d = −1.74), indicating substantially reduced CPU load with the MAVLink protocol in this phase.

3.3. Summary of Observed Patterns and Trends

Overall, the main difference between the two studied systems lies in their communication approach. MAVLink is a peer-to-peer protocol that facilitates direct messaging between UAV and UGV with remarkably low latency, below 105 ms for UGV-to-UAV updates in both open field and orchard environments. This real-time capability indicates that MAVLink is best suited to tasks demanding rapid feedback loops, such as emergency stops. Nevertheless, this speed comes at the expense of reliability. In more obstructed environments, like the almond orchard examined in this study, the system suffers from significant packet loss, reaching up to 30%. As a consequence, MAVLink's performance is influenced significantly by environmental conditions, making it a less reliable choice when consistent message delivery is necessary. On the other hand, FMIS operates as an HTTP-based middleware backed by a solid server architecture. Although it introduces a higher communication delay, it consistently delivered 97% of messages, even under challenging conditions. Its Transmission Control Protocol (TCP) foundation supports reliable message delivery and complex JSON payloads, making it more appropriate for applications requiring rich metadata and centralized coordination.
From a computational and energy perspective, both systems place similar demands on the CPU of the UAV, since CPU usage on the aerial unit is limited to handling message transmission. However, the demands on the UGV differ. In particular, FMIS requires more onboard processing to handle HTTP requests, error checking, and queuing logic, which increases the UGV CPU load and battery consumption. Although this increases the load on the system, FMIS improves data transmission rates and ensures more reliable communication. MAVLink minimizes UGV resource usage but consumes more UAV battery and fails to maintain reliability in signal-constrained environments [43]. In turn, FMIS, by leveraging cellular connectivity and a dependable server infrastructure, maintained stable communication in both environments, although its performance remains dependent on network coverage.

4. Discussion

The present study compares two UAV–UGV communication architectures, namely FMIS and MAVLink, in agricultural environments. The motivation for this investigation comes from the practical need to consider communication strategies that can reliably support UAV–UGV coordination in these complex and unpredictable environments. These strategies must balance trade-offs between speed, reliability, and power consumption to inform mission-based protocol selection. The observed trade-offs reveal a fundamental architectural difference between FMIS and MAVLink. FMIS assigns complex tasks to more powerful systems to ensure that messages are reliable and can carry rich, detailed information; it can be characterized as "quality and detail over speed". In contrast, MAVLink is designed for quick communication and allows devices to make decisions on their own with minimal delay. It uses a simple binary format that keeps messages small and efficient. Unlike more complex data formats, this lightweight design adds very little overhead, simplifying the process of sending messages, even over slower connections [44]. MAVLink-based communication can be characterized as "speed over ultimate reliability and detail". As a consequence, the choice between MAVLink and FMIS should be driven by the specific goals and conditions of the mission. Hence, if fast, real-time responses are of major importance, MAVLink is the better option because of its low communication delay, provided that the environment allows stable signal transmission. Conversely, if the mission requires rich data exchange, reliable message delivery, or coordination across multiple robots or a server, FMIS is more suitable, even if it introduces some delay.

4.1. Study Limitations and Considerations

Although this study offers a valuable comparison of FMIS and MAVLink for UAV–UGV coordination in agricultural settings, it presents some limitations that should be mentioned. Firstly, the present assessment considered specific open-field agricultural environments. However, the communication performance of both architectures, particularly MAVLink's reliability, is known to be highly dependent on the surrounding environment, especially the presence of obstructions and radio interference [45]. For instance, the observed packet loss percentages and latency values for MAVLink might vary considerably in farming settings with different levels of complexity or electromagnetic noise. Moreover, the study assumed a certain level of technical expertise for implementing and maintaining both communication architectures, without exploring the practical challenges and skill requirements for end users [46]. In fact, the study did not consider how easy or difficult it would be for typical agricultural workers, who may not have the same technical background, to install, use, or troubleshoot these communication systems in their daily work. The potential for data loss or corruption beyond simple packet loss metrics, as well as the fault recovery mechanisms in each system, also justify further investigation in real-world deployment scenarios.
Interestingly, the additional operational notes recorded during the present data collection indicate that shadows cast by trees did not significantly impact image processing accuracy. Windy conditions were observed to cause slight UAV instability, which may have contributed to variability in communication latency and energy consumption. However, other environmental factors such as temperature, humidity, solar radiation, and ground unevenness were not systematically monitored or controlled during the trials. While the aforementioned conditions can potentially influence both communication reliability and image processing, they were not the primary focus of this study.

4.2. Prospective Research Directions

Based on the above discussion, future research should examine the performance of MAVLink and FMIS in a wider range of agricultural environments, including those with varying levels of foliage density and electromagnetic interference, to better understand their limitations and adaptability. Exploring the potential of hybrid communication models, where the system can dynamically switch between MAVLink and FMIS based on mission phase, real-time environmental conditions, and signal quality, could provide a more efficient communication strategy. For instance, in an agricultural scenario where a UAV and UGV are collaboratively monitoring an orchard with human workers present, a hybrid communication model could enhance both safety and efficiency [4,47]. During routine operations in open areas, MAVLink can be used for its low-latency communication to quickly relay worker positions and immediate movement commands between the UAV and the UGV, ensuring real-time coordination and avoiding collisions. As the robots enter denser parts of the orchard with weaker telemetry signals, the system could automatically switch to FMIS to maintain reliable communication. By combining both protocols, the system could ensure urgent safety-related commands reach their target quickly via MAVLink, while FMIS guarantees data integrity and coordination in complex or signal-degraded environments, thereby supporting safe and intelligent human–robot collaboration in real-world agricultural tasks [29,48].
Another research direction is to jointly consider communication, planning, and state-estimation design rather than treating them as independent modules, especially in complex agricultural environments where several factors can affect control accuracy and operation efficiency [49]. Our results show that communication method directly impacts the robustness and responsiveness of planning and control. In turn, the distribution of planning tasks (centralized vs. decentralized) also influences communication demand and efficiency. This interdependence is consistent with broader research in agricultural robotics, such as the hybrid dual-extended Kalman filter–adaptive radial basis function neural network (DEKF–ARBFNN) estimation method for unmanned tractors [50]. That work demonstrated that improving estimation robustness enhances control stability in uncertain conditions. Future studies could extend our framework by integrating robust state-estimation approaches with communication-aware planning, allowing UAV–UGV teams to dynamically adapt their coordination strategy to account for both network conditions and control requirements.
While our current study focuses on a single UAV, UGV, and human participant per site to establish a controlled baseline for communication performance, we recognize the critical need to consider scalability for practical deployments involving multiple agents. Both MAVLink and FMIS architectures have inherent characteristics that impact their scalability. MAVLink, being a decentralized protocol, can efficiently handle multiple UAVs and UGVs by enabling peer-to-peer communication. Nevertheless, it may experience increased network congestion and coordination challenges as the number of agents grows. On the other hand, FMIS can potentially face performance degradation under heavier communication loads due to bottlenecks at the central server or communication hub, particularly if the network bandwidth or computational resources are limited. However, FMIS platforms can be scaled through distributed server architectures or cloud-based infrastructures that manage communication loads dynamically. Future work could extend our experimental framework to evaluate these architectures in multi-robot and multi-human scenarios, exploring the trade-offs across the same performance metrics examined here. A more detailed analysis separating image-processing latency from communication transmission delays will also be pursued in future work. Disaggregating these components will help identify whether delays are due to perception algorithms or communication overhead, enabling more targeted system improvements. Finally, future research should incorporate more detailed monitoring and control of environmental variables. This will help better understand their effects on aerial–ground robotic team performance in agricultural settings and ensure more rigorous comparisons between communication architectures.

5. Conclusions

The study compares two communication approaches for coordinating UAV and UGV teams in agricultural settings:
  • Centralized approach (FMIS-based): The UAV and UGV communicate via a cloud-based FMIS. The UAV performs aerial surveying and sends geotagged images and GPS data to the FMIS, which processes tree and worker recognition, plans UGV navigation, and coordinates task execution. The UGV uses onboard LiDAR for local obstacle avoidance and continuously logs mission data, all centrally stored for post-mission analysis. This approach favors reliable data handling and centralized decision-making.
  • Decentralized approach (MAVLink-based): The UAV and UGV directly exchange real-time data via a low-latency MAVLink protocol integrated with ROS over UART telemetry. The UAV runs onboard image processing to detect worker hat colors and sends operational commands and GPS coordinates directly to the UGV. The UGV autonomously navigates using RTK-GPS and local LiDAR for obstacle avoidance. This setup ensures modular, real-time communication without reliance on external servers, supporting operations in remote or low-connectivity environments.
The results from 120 field trials comparing MAVLink and FMIS across empty field and orchard environments indicate that FMIS consistently achieved higher communication success rates (97% in both environments) than MAVLink (83% and 70% in the empty field and orchard, respectively). This can be attributed to FMIS’s server-based architecture and use of reliable TCP communication, which ensures consistent message delivery even in obstructed, foliage-dense conditions. In contrast, MAVLink’s peer-to-peer protocol is more vulnerable to packet loss and signal degradation in complex environments.
In Phase 1 (worker detection and localization), FMIS completed tasks faster than MAVLink. This faster performance is attributed to FMIS's optimized data handling and tighter integration with the onboard hardware, unlike MAVLink's more generic protocol design, which introduces additional delays. Moreover, FMIS consistently resulted in lower UAV battery usage (but higher UGV battery usage) than MAVLink across both environments.
In Phase 2, where the UGV reaches the worker and updates the UAV with its position, MAVLink demonstrated significantly lower latency for UGV-to-UAV position updates compared to FMIS. FMIS also caused higher UGV battery and CPU usage in both environments, whereas MAVLink maintained lower energy and processing demands. Paired-samples t-tests were conducted for all quantitative metrics of both Phases 1 and 2 to assess statistical significance.
The main limitation of this study is that experiments were conducted in specific open-field agricultural environments with controlled conditions. These conditions may not fully capture the variability of communication performance, environmental influences, as well as user expertise in real-world farming scenarios. Future research should evaluate MAVLink and FMIS across diverse and complex agricultural environments and consider scalability to multi-robot and multi-human teams. Moreover, hybrid communication strategies that switch dynamically between protocols, combined with communication-aware planning and robust state estimation, could improve coordination. They can also enhance reliability and operational efficiency in real-world deployments.

Author Contributions

Conceptualization, D.K. (Dimitris Katikaridis) and D.B.; methodology, D.K. (Dimitris Katikaridis), L.B., P.B., E.P. and G.K.; validation, L.B. and D.K. (Dimitrios Kateris); formal analysis, P.B., E.P. and G.K.; investigation, D.K. (Dimitris Katikaridis), L.B. and D.K. (Dimitrios Kateris); writing—original draft preparation, D.K. (Dimitris Katikaridis) and L.B.; writing—review and editing, P.B., D.K. (Dimitrios Kateris), E.P., G.K. and D.B.; visualization, D.K. (Dimitris Katikaridis); supervision, D.B. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Institutional review board statement is not applicable to this article.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The datasets presented in this article are not readily available because the data are part of an ongoing study. Requests to access the datasets should be directed to the authors of the article.

Conflicts of Interest

The authors Dimitris Katikaridis and Dionysis Bochtis were employed by the company farmB Digital Agriculture S.A. The remaining authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Appendix A

The system architecture comprises two primary robotic platforms: an aerial unit (UAV) and a ground unit (UGV). Each of them is equipped with specialized hardware and software components. Table A1 and Table A2 summarize the key hardware elements and software functionalities of the UAV and UGV platforms, respectively.
Table A1. Detailed summary of the main hardware and software components of the unmanned aerial vehicle.

Category | Component | Description
Hardware | Platform | Custom quadcopter designed for autonomous agricultural monitoring
Hardware | Autopilot | Pixhawk 4 with 32-bit ARM Cortex-M7 processor @ 216 MHz
Hardware | Localization | H-RTK F9P GNSS receiver with multi-band RTK corrections; positional accuracy 1–2 cm
Hardware | Camera | Sony Alpha 6100 RGB camera, 24.2 MP APS-C Exmor CMOS sensor, up to 11 fps
Hardware | Camera trigger | Seagull Map-X2 hardware trigger
Hardware | Onboard processing | Raspberry Pi 4 Model B (quad-core Cortex-A72, 4 GB RAM)
Hardware | Communication modules | 4G LTE module (downlink up to 150 Mbps), SiK Telemetry Radio (433 MHz), R9SX receiver (900 MHz)
Hardware | Remote control | FrSky Taranis X9D Plus transmitter
Hardware | Power supply | 12,000 mAh LiPo battery (22.2 V, 6 S); flight time 30–40 min
Software | Flight control | ArduPilot and PX4 firmware on Pixhawk 4 autopilot
Software | Image processing | Custom real-time image aggregation on Raspberry Pi 4
Software | Communication protocol | MAVLink for UAV telemetry and control
Software | ROS integration | MAVROS interface for MAVLink-ROS messaging
Software | Mission planning | QGroundControl
Table A2. Detailed summary of the main hardware and software components of the unmanned ground vehicle.

Category | Component | Description
Hardware | Platform | Thorvald UGV (SAGA Robotics SA), commonly used for autonomous agricultural operations
Hardware | LiDAR sensor | Velodyne VLP-16 Puck laser scanner for 3D mapping and obstacle detection
Hardware | Processing unit | Intel NUC with i5 6200U processor, 16 GB RAM, 128 GB storage, running Ubuntu Linux 18.04.6 LTS
Hardware | Motor control | PEAK CANbus system for reliable motor communication
Hardware | Localization | Xsens MTi 630R SK IMU and Stonex S850 RTK GNSS receiver for precise navigation
Software | Operating system | Ubuntu Linux 18.04.6 LTS
Software | Robotics middleware | ROS Melodic
Software | Navigation planner | Carrot Planner for goal-point adjustment around obstacles
Software | Communication protocols | MAVProxy forwarding MAVLink messages to ROS; MAVLink protocol for decentralized communication

Appendix B

For the purpose of testing whether the hat color influenced trial success, a chi-square test of independence was conducted in Microsoft Excel (“CHISQ.TEST” function) using the aggregated success and failure counts across all experimental conditions. The observed counts of successes and failures for each hat color are summarized in Table A3.
Table A3. Observed and expected trial outcomes by hat color.

| Hat Color | Success | Failure | Row Total | Expected Success | Expected Failure |
|---|---|---|---|---|---|
| Blue | 35 | 5 | 40 | 34.67 | 5.33 |
| White | 33 | 7 | 40 | 34.67 | 5.33 |
| Red | 36 | 4 | 40 | 34.67 | 5.33 |
Expected counts were calculated using the following:

$$E_{i,j} = \frac{\text{Row total}_i \times \text{Column total}_j}{\text{Total trials}},$$

e.g., for the Blue–Success cell, $E = (40 \times 104)/120 = 34.67$,
while the chi-square statistic was computed as follows:

$$\chi^2 = \sum_{i,j} \frac{\left(O_{i,j} - E_{i,j}\right)^2}{E_{i,j}},$$

where $O_{i,j}$ and $E_{i,j}$ denote the observed and expected counts, respectively.
Finally, the degrees of freedom ($df$) for the chi-square test were equal to 2, calculated through the following:

$$df = (\text{number of rows} - 1) \times (\text{number of columns} - 1),$$

where the number of rows and columns are 3 (Blue, White, Red) and 2 (Success, Failure), respectively.
Consequently, the total test statistic was $\chi^2(2, N = 120) \approx 1.01$ with $p \approx 0.60$. Since $p > 0.05$, there is no statistically significant association between hat color and trial outcome.
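
For readers who prefer a scriptable check, the following snippet reproduces this test with SciPy instead of Excel; the counts are taken directly from Table A3.

```python
# Chi-square test of independence on the Table A3 counts, reproducing the
# Excel (CHISQ.TEST) result reported above. Requires numpy and scipy.
import numpy as np
from scipy.stats import chi2_contingency

# Rows: Blue, White, Red; columns: Success, Failure.
observed = np.array([[35, 5],
                     [33, 7],
                     [36, 4]])

# Plain Pearson chi-square (no continuity correction is applied for df > 1).
chi2, p, dof, expected = chi2_contingency(observed)

print(f"chi2({dof}, N={observed.sum()}) = {chi2:.2f}, p = {p:.2f}")
# Prints: chi2(2, N=120) = 1.01, p = 0.60, matching the values above;
# 'expected' holds the 34.67/5.33 expected counts per row.
```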

Appendix C

This appendix presents an overview of the experimental results for Phases 1 and 2 in Table A4 and Table A5, respectively. In particular, these tables report the comparative performance of the MAVLink-based and FMIS-based systems in the two environments, namely the empty field and the orchard. The assessment covers several operational metrics, ranging from total duration and handshake latency to battery consumption and CPU load for both the UAV and UGV platforms. Each environment–protocol combination was tested over 30 trials, corresponding to three different hat colors (white, red, blue) worn by the participant with 10 repetitions per color, for a total of 120 trials. All results are reported as mean values accompanied by their standard deviations.
Table A4. Summary of Phase 1 results comparing the performance of MAVLink-based and FMIS-based systems across various operational metrics in different environments; values are presented as mean ± standard deviation.

| Environment | Protocol | Total Duration (s) | Handshake Latency (ms) | UAV Battery Use (%) | UGV Battery Use (%) | UAV CPU Load (%) |
|---|---|---|---|---|---|---|
| Empty | MAVLink | 4.74 ± 0.53 | 276.74 ± 19.52 | 4.27 ± 0.53 | 0.53 ± 0.12 | 49.60 ± 6.62 |
| Empty | FMIS | 2.93 ± 0.55 | 313.10 ± 18.27 | 1.12 ± 0.56 | 1.59 ± 0.20 | 51.07 ± 5.44 |
| Orchard | MAVLink | 4.81 ± 0.47 | 309.32 ± 34.90 | 4.33 ± 0.46 | 0.56 ± 0.11 | 48.67 ± 5.01 |
| Orchard | FMIS | 3.16 ± 0.56 | 305.07 ± 44.90 | 1.00 ± 0.51 | 1.60 ± 0.22 | 49.38 ± 5.06 |
Table A5. Summary of Phase 2 results comparing the performance of MAVLink-based and FMIS-based systems across various operational metrics in different environments; values are presented as mean ± standard deviation.

| Environment | Protocol | UGV-to-UAV Position Report Latency (ms) | UGV Battery Use (%) | UGV CPU Load (%) |
|---|---|---|---|---|
| Empty | MAVLink | 102.65 ± 8.86 | 1.58 ± 0.34 | 19.51 ± 4.40 |
| Empty | FMIS | 308.20 ± 20.02 | 2.34 ± 0.27 | 25.62 ± 3.27 |
| Orchard | MAVLink | 104.40 ± 7.29 | 1.84 ± 0.27 | 18.29 ± 4.28 |
| Orchard | FMIS | 340.20 ± 21.06 | 2.21 ± 0.40 | 29.10 ± 3.37 |
To assess the differences between the MAVLink and FMIS protocols, paired-samples t-tests were conducted for each performance metric mentioned above. This test is appropriate because measurements were collected from the same trials under both protocols, yielding dependent samples. For each metric, the mean difference (MAVLink minus FMIS), the t-statistic, and the associated p-value were calculated to determine statistical significance. Additionally, Cohen's d was computed to quantify the effect size and provide insight into the practical significance of the observed differences. Statistical significance was set at p < 0.05, and all analyses were performed in Microsoft Excel. Table A6 summarizes the key metrics from Phase 1, while Table A7 summarizes those from Phase 2; an illustrative script for these computations is given after Table A7.
Table A6. Phase 1 comparison of MAVLink and FMIS protocols. Paired t-test results with mean differences, t, p-values, and Cohen's d.

| Metric | Mean Difference 1 | t | p-Value | Cohen's d |
|---|---|---|---|---|
| Total duration (s) | 1.74 | 18.34 | <0.001 | 2.37 |
| Handshake latency (ms) | −16.00 | −0.04 | 0.97 | −0.005 |
| UAV battery use (%) | 3.24 | 41.91 | <0.001 | 5.41 |
| UGV battery use (%) | −1.05 | −12.27 | <0.001 | −1.77 |
| UAV CPU load (%) | −1.09 | −1.46 | 0.15 | −0.21 |

1 Mean differences correspond to MAVLink minus FMIS.
Table A7. Phase 2 comparison of MAVLink and FMIS protocols. Paired t-test results with mean differences, t, p-values, and Cohen's d.

| Metric | Mean Difference 1 | t | p-Value | Cohen's d |
|---|---|---|---|---|
| UGV-to-UAV position report latency (ms) | −220.67 | −25.26 | <0.001 | −4.10 |
| UGV battery use (%) | −0.57 | −5.62 | <0.001 | −0.91 |
| UGV CPU load (%) | −8.46 | −10.76 | <0.001 | −1.74 |

1 Mean differences correspond to MAVLink minus FMIS.
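
As a companion to Tables A6 and A7, the snippet below illustrates how the paired t-statistic and Cohen's d for paired samples can be computed; the two arrays are hypothetical placeholders for the per-trial measurements of one metric, paired by trial index.

```python
# Paired-samples t-test and Cohen's d for one metric, illustrating the
# computations behind Tables A6 and A7 (performed in Excel in the paper).
# The arrays below are hypothetical placeholders, paired by trial index.
import numpy as np
from scipy.stats import ttest_rel

mavlink = np.array([4.74, 4.81, 4.60, 4.92, 4.70])
fmis = np.array([2.93, 3.16, 2.88, 3.10, 3.01])

t_stat, p_value = ttest_rel(mavlink, fmis)   # paired (dependent-samples) t-test

diff = mavlink - fmis                        # MAVLink minus FMIS, as in the tables
cohens_d = diff.mean() / diff.std(ddof=1)    # effect size for paired samples

print(f"mean diff = {diff.mean():.2f}, t = {t_stat:.2f}, "
      f"p = {p_value:.3g}, Cohen's d = {cohens_d:.2f}")
```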

References

  1. Mourtzis, D.; Angelopoulos, J.; Panopoulos, N. A Literature Review of the Challenges and Opportunities of the Transition from Industry 4.0 to Society 5.0. Energies 2022, 15, 6276. [Google Scholar] [CrossRef]
  2. Bissadu, K.D.; Sonko, S.; Hossain, G. Society 5.0 enabled agriculture: Drivers, enabling technologies, architectures, opportunities, and challenges. Inf. Process. Agric. 2025, 12, 112–124. [Google Scholar] [CrossRef]
  3. Taha, M.F.; Mao, H.; Zhang, Z.; Elmasry, G.; Awad, M.A.; Abdalla, A.; Mousa, S.; Elwakeel, A.E.; Elsherbiny, O. Emerging Technologies for Precision Crop Management Towards Agriculture 5.0: A Comprehensive Overview. Agriculture 2025, 15, 582. [Google Scholar] [CrossRef]
  4. Benos, L.; Sørensen, C.G.; Bochtis, D. Field Deployment of Robotic Systems for Agriculture in Light of Key Safety, Labor, Ethics and Legislation Issues. Curr. Robot. Rep. 2022, 3, 49–56. [Google Scholar] [CrossRef]
  5. Agrawal, J.; Arafat, M.Y. Transforming Farming: A Review of AI-Powered UAV Technologies in Precision Agriculture. Drones 2024, 8, 664. [Google Scholar] [CrossRef]
  6. Niu, H.; Chen, Y. The Unmanned Ground Vehicles (UGVs) for Digital Agriculture. In Smart Big Data in Digital Agriculture Applications: Acquisition, Advanced Analytics, and Plant Physiology-Informed Artificial Intelligence; Niu, H., Chen, Y., Eds.; Springer Nature: Cham, Switzerland, 2024; pp. 99–109. ISBN 978-3-031-52645-9. [Google Scholar]
  7. Agelli, M.; Corona, N.; Maggio, F.; Moi, P.V. Unmanned Ground Vehicles for Continuous Crop Monitoring in Agriculture: Assessing the Readiness of Current ICT Technology. Machines 2024, 12, 750. [Google Scholar] [CrossRef]
  8. Ersü, C.; Petlenkov, E.; Janson, K. A Systematic Review of Cutting-Edge Radar Technologies: Applications for Unmanned Ground Vehicles (UGVs). Sensors 2024, 24, 7807. [Google Scholar] [CrossRef]
  9. Toscano, F.; Fiorentino, C.; Capece, N.; Erra, U.; Travascia, D.; Scopa, A.; Drosos, M.; D’Antonio, P. Unmanned Aerial Vehicle for Precision Agriculture: A Review. IEEE Access 2024, 12, 69188–69205. [Google Scholar] [CrossRef]
  10. Zhu, H.; Lin, C.; Liu, G.; Wang, D.; Qin, S.; Li, A.; Xu, J.-L.; He, Y. Intelligent agriculture: Deep learning in UAV-based remote sensing imagery for crop diseases and pests detection. Front. Plant Sci. 2024, 15, 1435016. [Google Scholar] [CrossRef] [PubMed]
  11. Chen, J.; Zhang, X.; Xin, B.; Fang, H. Coordination Between Unmanned Aerial and Ground Vehicles: A Taxonomy and Optimization Perspective. IEEE Trans. Cybern. 2016, 46, 959–972. [Google Scholar] [CrossRef]
  12. Munasinghe, I.; Perera, A.; Deo, R.C. A Comprehensive Review of UAV-UGV Collaboration: Advancements and Challenges. J. Sens. Actuator Netw. 2024, 13, 81. [Google Scholar] [CrossRef]
  13. Mammarella, M.; Comba, L.; Biglia, A.; Dabbene, F.; Gay, P. Cooperation of unmanned systems for agricultural applications: A theoretical framework. Biosyst. Eng. 2022, 223, 61–80. [Google Scholar] [CrossRef]
  14. Cardone, M. Cooperative Autonomous Navigation. In Automated and Autonomous Navigation Powered by GNSS: Navigation Techniques and Applications, Future Automated Navigation; Cardone, M., Ed.; Springer Nature: Cham, Switzerland, 2025; pp. 203–217. ISBN 978-3-031-78753-9. [Google Scholar]
  15. Teschner, G.; Hajdu, C.; Hollósi, J.; Boros, N.; Kovács, A.; Ballagi, Á. Digital Twin of Drone-based Protection of Agricultural Areas. In Proceedings of the 2022 IEEE 1st International Conference on Internet of Digital Reality (IoD), Online, 23–24 June 2022; pp. 99–104. [Google Scholar]
  16. Arbanas, B.; Ivanovic, A.; Car, M.; Orsag, M.; Petrovic, T.; Bogdan, S. Decentralized planning and control for UAV–UGV cooperative teams. Auton. Robot. 2018, 42, 1601–1618. [Google Scholar] [CrossRef]
  17. Chakraborty, S.; Elangovan, D.; Govindarajan, P.L.; ELnaggar, M.F.; Alrashed, M.M.; Kamel, S. A Comprehensive Review of Path Planning for Agricultural Ground Robots. Sustainability 2022, 14, 9156. [Google Scholar] [CrossRef]
  18. Lytridis, C.; Kaburlasos, V.G.; Pachidis, T.; Manios, M.; Vrochidou, E.; Kalampokas, T.; Chatzistamatis, S. An Overview of Cooperative Robotics in Agriculture. Agronomy 2021, 11, 1818. [Google Scholar] [CrossRef]
  19. Bechar, A.; Vigneault, C. Agricultural robots for field operations. Part 2: Operations and systems. Biosyst. Eng. 2017, 153, 110–128. [Google Scholar] [CrossRef]
  20. Mammarella, M.; Comba, L.; Biglia, A.; Dabbene, F.; Gay, P. Cooperation of unmanned systems for agricultural applications: A case study in a vineyard. Biosyst. Eng. 2022, 223, 81–102. [Google Scholar] [CrossRef]
  21. Ivić, S.; Andrejčuk, A.; Družeta, S. Autonomous control for multi-agent non-uniform spraying. Appl. Soft Comput. 2019, 80, 742–760. [Google Scholar] [CrossRef]
  22. Katikaridis, D.; Moysiadis, V.; Tsolakis, N.; Busato, P.; Kateris, D.; Pearson, S.; Sørensen, C.G.; Bochtis, D. UAV-Supported Route Planning for UGVs in Semi-Deterministic Agricultural Environments. Agronomy 2022, 12, 1937. [Google Scholar] [CrossRef]
  23. McAllister, W.; Osipychev, D.; Davis, A.; Chowdhary, G. Agbots: Weeding a field with a team of autonomous robots. Comput. Electron. Agric. 2019, 163, 104827. [Google Scholar] [CrossRef]
  24. Lytridis, C.; Bazinas, C.; Kalathas, I.; Siavalas, G.; Tsakmakis, C.; Spirantis, T.; Badeka, E.; Pachidis, T.; Kaburlasos, V.G. Cooperative Grape Harvesting Using Heterogeneous Autonomous Robots. Robotics 2023, 12, 147. [Google Scholar] [CrossRef]
  25. Millard, A.G.; Ravikanna, R.; Groß, R.; Chesmore, D. Towards a Swarm Robotic System for Autonomous Cereal Harvesting. In Towards Autonomous Robotic Systems; Althoefer, K., Konstantinova, J., Zhang, K., Eds.; Springer International Publishing: Cham, Switzerland, 2019; pp. 458–461. [Google Scholar]
  26. Blender, T.; Buchner, T.; Fernandez, B.; Pichlmaier, B.; Schlegel, C. Managing a Mobile Agricultural Robot Swarm for a seeding task. In Proceedings of the IECON 2016—42nd Annual Conference of the IEEE Industrial Electronics Society, Florence, Italy, 23–26 October 2016; pp. 6879–6886. [Google Scholar]
  27. Tokekar, P.; Hook, J.V.; Mulla, D.; Isler, V. Sensor Planning for a Symbiotic UAV and UGV System for Precision Agriculture. IEEE Trans. Robot. 2016, 32, 1498–1511. [Google Scholar] [CrossRef]
  28. Menendez-Aponte, P.; Garcia, C.; Freese, D.; Defterli, S.; Xu, Y. Software and Hardware Architectures in Cooperative Aerial and Ground Robots for Agricultural Disease Detection. In Proceedings of the 2016 International Conference on Collaboration Technologies and Systems (CTS), Orlando, FL, USA, 31 October–4 November 2016; pp. 354–358. [Google Scholar]
  29. Benos, L.; Moysiadis, V.; Kateris, D.; Tagarakis, A.C.; Busato, P.; Pearson, S.; Bochtis, D. Human-Robot Interaction in Agriculture: A Systematic Review. Sensors 2023, 23, 6776. [Google Scholar] [CrossRef]
  30. Elias, A.; Galvez Trigo, M.J.; Camacho-Villa, C. Analyzing Previous Human-Robot Interaction Implementation in Agriculture: What Can We Learn from the Past? In Proceedings of the 2025 20th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Melbourne, Australia, 4–6 March 2025; pp. 123–131. [Google Scholar]
  31. Moysiadis, V.; Benos, L.; Karras, G.; Kateris, D.; Peruzzi, A.; Berruto, R.; Papageorgiou, E.; Bochtis, D. Human–Robot Interaction through Dynamic Movement Recognition for Agricultural Environments. AgriEngineering 2024, 6, 2494–2512. [Google Scholar] [CrossRef]
  32. Urvina, R.P.; Guevara, C.L.; Vásconez, J.P.; Prado, A.J. An Integrated Route and Path Planning Strategy for Skid–Steer Mobile Robots in Assisted Harvesting Tasks with Terrain Traversability Constraints. Agriculture 2024, 14, 1206. [Google Scholar] [CrossRef]
  33. Vásconez, J.P.; Auat Cheein, F.A. Workload and production assessment in the avocado harvesting process using human-robot collaborative strategies. Biosyst. Eng. 2022, 223, 56–77. [Google Scholar] [CrossRef]
  34. Grimstad, L.; From, P.J. The Thorvald II Agricultural Robotic System. Robotics 2017, 6, 24. [Google Scholar] [CrossRef]
  35. Quigley, M.; Gerkey, B.P.; Conley, K.; Faust, J.; Foote, T.; Leibs, J.; Berger, E.; Wheeler, R.N. ROS: An Open-Source Robot Operating System. In Proceedings of the ICRA Workshop on Open Source Software, IEEE International Conference on Robotics and Automation, Kobe, Japan, 12–17 May 2009; pp. 1–6. [Google Scholar]
  36. MAVLink Development Team. MAVLink Developer Guide. Available online: https://mavlink.io/ (accessed on 12 May 2025).
  37. Koubâa, A.; Allouch, A.; Alajlan, M.; Javed, Y.; Belghith, A.; Khalgui, M. Micro Air Vehicle Link (MAVlink) in a Nutshell: A Survey. IEEE Access 2019, 7, 87658–87680. [Google Scholar] [CrossRef]
  38. Anagnostis, I.; Kotzanikolaou, P.; Douligeris, C. Understanding and Securing the Risks of Uncrewed Aerial Vehicle Services. IEEE Access 2025, 13, 47955–47995. [Google Scholar] [CrossRef]
  39. de Castro, G.G.R.; Santos, T.M.B.; Andrade, F.A.A.; Lima, J.; Haddad, D.B.; Honório, L.d.M.; Pinto, M.F. Heterogeneous Multi-Robot Collaboration for Coverage Path Planning in Partially Known Dynamic Environments. Machines 2024, 12, 200. [Google Scholar] [CrossRef]
  40. ROS Wiki. Mavros. Available online: https://wiki.ros.org/mavros (accessed on 12 May 2025).
  41. Moysiadis, V.; Katikaridis, D.; Benos, L.; Busato, P.; Anagnostis, A.; Kateris, D.; Pearson, S.; Bochtis, D. An Integrated Real-Time Hand Gesture Recognition Framework for Human-Robot Interaction in Agriculture. Appl. Sci. 2022, 12, 8160. [Google Scholar] [CrossRef]
  42. ROS Wiki. Roscore. Available online: https://wiki.ros.org/roscore (accessed on 12 May 2025).
  43. Thahsin, A.; Ananthapadmanabhan, A.; Pathak, S.; Maity, A.; Kasbekar, G.S. Enhancing MAVLink Security: Implementation and Performance Evaluation of Encryption on a Drone Testbed. In Proceedings of the 2025 International Conference on Information Networking (ICOIN), Chiang Mai, Thailand, 15–17 January 2025; pp. 529–534. [Google Scholar]
  44. Reichstein, L.; Schopferer, S.; Jünger, F. A Comparison of Command and Control Communication Protocols for Unmanned Aircraft: STANAG 4586 vs. MAVLink. In Proceedings of the 2022 International Conference on Unmanned Aircraft Systems (ICUAS), Xi’an, China, 23–25 September 2022; pp. 1283–1292. [Google Scholar]
  45. Aljumah, A.; Ahanger, T.A.; Ullah, I. Challenges in Securing UAV IoT Framework: Future Research Perspective. IEEE Trans. Netw. Serv. Manag. 2025, 1, 2607–2629. [Google Scholar] [CrossRef]
  46. Bampasidou, M.; Goldgaber, D.; Gentimis, T.; Mandalika, A. Overcoming ‘Digital Divides’: Leveraging higher education to develop next generation digital agriculture professionals. Comput. Electron. Agric. 2024, 224, 109181. [Google Scholar] [CrossRef]
  47. Vasconez, J.P.; Kantor, G.A.; Auat Cheein, F.A. Human–robot interaction in agriculture: A survey and current challenges. Biosyst. Eng. 2019, 179, 35–48. [Google Scholar] [CrossRef]
  48. Yerebakan, M.O.; Hu, B. Human–Robot Collaboration in Modern Agriculture: A Review of the Current Research Landscape. Adv. Intell. Syst. 2024, 6, 2300823. [Google Scholar] [CrossRef]
  49. Roshanianfard, A.; Noguchi, N.; Okamoto, H.; Ishii, K. A review of autonomous agricultural vehicles (The experience of Hokkaido University). J. Terramech. 2020, 91, 155–183. [Google Scholar] [CrossRef]
  50. Xu, G.; Chen, M.; He, X.; Liu, Y.; Wu, J.; Diao, P. Research on state-parameter estimation of unmanned Tractor—A hybrid method of DEKF and ARBFNN. Eng. Appl. Artif. Intell. 2024, 127, 107402. [Google Scholar] [CrossRef]
Figure 1. The custom quadcopter designed for the present study equipped with the necessary components: (a) complete unmanned aerial vehicle platform; (b) integrated onboard sensor assembly mounted on a carbon fiber plate; (c) RGB camera mounted at the lower part of the platform; (d) red enclosure housing the drone's battery along with a circular ON/OFF switch; and (e) illuminated motor, with a bright green LED strip, for enhanced visibility during flight operations.
Figure 2. (a) The unmanned ground vehicle navigating an orchard inter-row and a close-up of its sensor suite including (b) GPS and LiDAR and (c) an Intel NUC with a 4G LTE module.
Figure 3. Image captured by the UAV at an altitude of approximately 10 m above ground level while navigating an orchard and detecting a participant wearing a white hat.
Figure 4. Illustration of the unified UAV–UGV–FMIS operational framework, emphasizing the centralized coordination architecture.
Figure 5. Schematic representation of the integrated UAV–UGV workflow, highlighting the decentralized architecture of the system.
Figure 6. Satellite view of the two distinct agricultural environments used for the experimental evaluation: an empty field (rectangular) and a structured almond orchard (trapezoid), both located in Thessaly, a region in central Greece.
Figure 7. Experimental design flowchart illustrating how the 120 trials were divided between the different protocols, environments, and worker intent signaled through hat color, along with the success or failure counts for each subcase.
Figure 8. Phase 1: Comparison of MAVLink-based versus FMIS-based system performance in terms of (a) total duration and (b) handshake latency in both empty and orchard environments.
Figure 9. Phase 1: Comparison of MAVLink-based versus FMIS-based system performance in terms of (a) UAV battery use; (b) UGV battery use; and (c) UAV CPU load in both empty and orchard environments.
Figure 10. Phase 2: Comparison of MAVLink-based versus FMIS-based system performance in terms of (a) UGV-to-UAV position report latency; (b) UGV battery use; and (c) UGV CPU load in both empty and orchard environments.
Table 1. Variables of the data collection template utilized during the experiments to record critical performance metrics and system behaviors across multiple experimental trials.

| Column Name/Metric | Description | Phase |
|---|---|---|
| Trial | Unique sequential identifier for each run | 1, 2 |
| Protocol | Communication protocol used ("MAVLink" or "FMIS") | 1, 2 |
| Environment | Test setting ("Empty Field" or "Orchard") | 1, 2 |
| Hat color | Color that indicates worker intent ("blue", "white", or "red") | 1, 2 |
| Total duration | Time elapsed from the camera image being transferred to the Raspberry Pi to the UGV receiving the color result and worker GPS coordinates | 1 |
| UAV battery use (%) | Energy cost associated with the flight and any onboard processing of the UAV | 1 |
| Handshake latency | Time from the UAV sending color info to the UGV response confirming reception | 1 |
| UGV battery use (%) | Energy consumption of the UGV while receiving data and preparing for action | 1 |
| UAV CPU load (%) | Mean CPU usage of the UAV during message broadcasting/receiving | 1 |
| UGV-to-UAV position report latency | Time delay between the UGV initiating the message ("I reached the worker") and the UAV successfully receiving and processing that information | 2 |
| UGV battery use (%) | Energy cost during final navigation to the worker and informing the UAV | 2 |
| UGV CPU load (%) | Average processing load on the UGV during communication with the UAV | 2 |
| Notes | Observations, anomalies, or context-specific info | 1, 2 |

CPU: Central Processing Unit; FMIS: Farm Management Information System; UAV: unmanned aerial vehicle; UGV: unmanned ground vehicle.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
