1. Introduction
Owing to recent breakthroughs in unmanned aerial vehicles (UAVs), or drones, their applications in agriculture, such as crop monitoring, detection of crop diseases, digital surface modeling (DSM), sowing, spraying, irrigation, and mapping, have significantly reduced working hours and labor requirements, greatly improving the efficiency of agricultural work. In particular, UAVs are widely used for remote sensing [1,2]. Currently, remote sensing in agriculture mainly involves remote exploration using a single-UAV system [3,4,5], mapping [6,7,8,9], monitoring [10,11], and DSM generation [12,13]. The remote sensing process varies depending on the characteristics of the crops, the cultivation environment, and the purpose of exploration; however, it mostly relies on ortho-images obtained with red–green–blue (RGB) and multi-spectral cameras. After image registration based on the ortho-images captured by the UAV along the image acquisition path, a post-processing step, such as calculating the normalized difference vegetation index (NDVI), the carotenoid reflectance index (CRI), or the normalized difference red edge index (NDRE), is applied according to the purpose. The agricultural UAV market is growing rapidly, and many commercial UAVs have been released. Major UAV companies, including DJI, Parrot, PrecisionHawk, and AgEagle, are developing products for a variety of agricultural solutions [5]. Most UAVs used for image acquisition employ a multi-rotor platform (quadcopter or hexacopter), and an appropriate UAV is chosen based on the total payload and the attached sensors [14]. Most commercial UAVs are controlled through a ground-installed flight control system called a ground control station (GCS). The GCS coordinates all commands sent to the UAV and all data received from it, and manages all conditions before, during, and after the flight. Commercial GCSs include the open-source MissionPlanner and QGroundControl [15]. The GCS supported by a UAV depends on the flight control (FC) firmware installed on the UAV, so the choice of commercial GCS also changes with the FC selected by the user. MissionPlanner, which supports only ArduPilot and runs only on Windows, is suited to professional use. QGroundControl has the advantage that it supports both ArduPilot and PX4, runs not only on Windows but also on most other operating systems (Linux, iOS, Android, etc.), and can be operated easily, even by beginners.
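The vegetation indices mentioned above are simple per-pixel band combinations. As a minimal illustrative sketch (assuming the reflectance bands are already co-registered NumPy arrays; the band-name parameters are placeholders, not tied to any particular camera):

```python
import numpy as np

def ndvi(nir, red):
    """Normalized difference vegetation index: (NIR - Red) / (NIR + Red)."""
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / (nir + red + 1e-9)  # epsilon guards against division by zero

def ndre(nir, red_edge):
    """Normalized difference red edge index: (NIR - RE) / (NIR + RE)."""
    nir, red_edge = np.asarray(nir, float), np.asarray(red_edge, float)
    return (nir - red_edge) / (nir + red_edge + 1e-9)

def cri(r510, r550):
    """Carotenoid reflectance index (CRI1 variant): 1/R510 - 1/R550."""
    return 1.0 / np.asarray(r510, float) - 1.0 / np.asarray(r550, float)
```

The same functions apply unchanged to single pixels or to whole ortho-mosaic rasters, since NumPy broadcasts the arithmetic element-wise.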
In general, UAVs are widely used for farm work required in agriculture, such as crop monitoring, control, and sowing. As UAVs use batteries as their main power source, a single UAV has a limited operating time and takes considerable time to cover large areas. The precision maps used in precision agriculture require multiple images taken from different directions, and field phenotyping requires high accuracy [16]. Thus, using multiple drones rather than a single drone is more efficient for acquiring images of a large area in a short period of time [17]. Nevertheless, the commercial programs currently controlling UAVs are often developed around single-UAV flight control. Even when multiple UAVs can be controlled, these programs usually provide neither real-time synchronized control nor independent flight control. Research is underway to solve this problem and enable large numbers of UAVs to fly in groups [18,19]; moreover, the cluster flight of many UAVs in agriculture has been actively studied. Ju and Son used seven measurement indices (total time, setup time, flight time, battery consumption, inaccuracy of land, haptic control effort, and coverage ratio) to evaluate the performance of single and multiple UAVs during agricultural work, and showed that a multi-UAV system is more efficient than a single-UAV system (total flight time improved by 18.1%, battery consumption reduced by 59.8%, coverage area increased by 200%, etc.) [20]. Barrientos et al. subdivided a large area of agricultural land among multiple UAVs and applied an efficient flight strategy, suggesting applicability to real large-scale farmland through field flight experiments [21]. Controlling multiple UAVs requires a more complex control system than controlling a single UAV, and various constraints must be considered. The basic flight control of multiple UAVs is the same as that of a single UAV, using a computer or remote controller running appropriate flight control software, such as a GCS. Multi-path generation and multi-UAV collaborative driving algorithms must be developed to enable multiple UAVs to fly simultaneously. Several studies have addressed the efficient path planning and reliable flight control of multiple UAVs. Roberge et al. demonstrated that applying a genetic algorithm and a particle swarm optimization algorithm on a multi-core CPU allows UAV path planning within 10 s [22]. Gu et al. combined a wireless sensor network with multiple UAVs to enable efficient collaborative operation, and proposed a method for recognizing objects and locations using multiple UAVs [23]. Lee et al. performed multi-UAV control using the robot operating system (ROS) global positioning system (GPS) waypoint tracking package and a centralized task allocation network system. To overcome the battery limitations of a single UAV, this system was developed to enable multiple UAVs to take turns and complete continuous flight missions [24].
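To make the idea of multi-path generation concrete, the simplest strategy, echoed in the area-subdivision work cited above, is to split a rectangular field into equal strips and assign each UAV a boustrophedon (lawn-mower) waypoint list over its strip. A minimal sketch under that assumption follows; the coordinates are illustrative local meters, not the planner used by any of the cited systems:

```python
def strip_waypoints(x_min, x_max, y_min, y_max, line_spacing):
    """Boustrophedon (back-and-forth) waypoints covering one strip."""
    waypoints, y, forward = [], y_min, True
    while y <= y_max:
        xs = (x_min, x_max) if forward else (x_max, x_min)
        waypoints += [(xs[0], y), (xs[1], y)]
        y += line_spacing
        forward = not forward  # alternate sweep direction each pass
    return waypoints

def split_field(x_min, x_max, y_min, y_max, n_uavs, line_spacing):
    """Divide the field into equal-width strips, one waypoint list per UAV."""
    width = (x_max - x_min) / n_uavs
    return [
        strip_waypoints(x_min + i * width, x_min + (i + 1) * width,
                        y_min, y_max, line_spacing)
        for i in range(n_uavs)
    ]

# Example: a 90 m x 60 m field covered by 3 UAVs with 20 m line spacing
paths = split_field(0, 90, 0, 60, n_uavs=3, line_spacing=20)
```

In practice, the line spacing would be derived from the camera footprint and the required side overlap, and the resulting waypoints uploaded to each UAV as a mission through the GCS.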
In general, UAVs applied to agriculture scan large areas. Such remote sensing can accelerate the arduous process of producing crop inventories and yield estimates for large areas. The areas typically used for remote sensing and field phenotype generation in agriculture range from 5 to 10 hectares [25]. However, it is difficult to achieve high accuracy for key phenotypic characteristics, such as stem diameter, leaf angle, crop height, and leaf area index (LAI), because of the low spatial resolution of the data caused by the flight height. To solve this issue, phenotyping studies based on unmanned ground vehicles (UGVs) are in progress. Young et al. measured corn height and stem width using a stereo camera and a time-of-flight depth sensor mounted on a UGV. The field phenotyping showed measurement errors of 15% for plant height and 13% for plant stem width [26]. Madec et al. conducted a comparative experiment on the plant phenotypic characteristics observed by a UGV equipped with light detection and ranging (LiDAR) and a UAV equipped with RGB cameras. Although LiDAR showed higher phenotyping accuracy owing to the difference in spatial resolution (LiDAR: 3–5 mm; RGB: 10 mm), both the LiDAR (UGV) and RGB (UAV) measurements showed a high correlation with plant height [27]. As UGVs can capture images under the canopy, important data, such as crop height and stem diameter, can be acquired throughout the development stages of the crop [28]. Manish et al. built a mobile mapping system on a UGV. The developed UGV was equipped with LiDAR and RGB cameras, along with global navigation satellite system (GNSS)/inertial navigation system (INS) devices to align and project each coordinate system. A comparative experiment using a UAV and a PhenoRover was also conducted to compare accuracies. The accuracy of the point cloud generated by the UGV was ±5–8 cm, similar to that of the control group (UAV, PhenoRover), and the developed UGV showed a relatively low noise level during data collection and was able to capture individual plants [29].
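The GNSS/INS alignment step described above amounts to transforming each LiDAR return from the sensor frame into a common world frame using the vehicle pose at the time of measurement. A minimal sketch of that projection, reduced to a 2D yaw-only rotation for brevity (the full systems use 3D attitude from the INS), might look as follows:

```python
import math

def sensor_to_world(points, x, y, yaw):
    """Rigid 2D transform: rotate sensor-frame points by the vehicle yaw,
    then translate by the GNSS position (x, y)."""
    c, s = math.cos(yaw), math.sin(yaw)
    return [(x + c * px - s * py, y + s * px + c * py) for px, py in points]

# Example: a return 1 m ahead of the sensor, vehicle at (10, 5) heading 90 degrees
world = sensor_to_world([(1.0, 0.0)], 10.0, 5.0, math.pi / 2)
```

Accumulating the transformed returns over the drive produces the georeferenced point cloud whose accuracy is compared above.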
Many studies have measured field phenotypes (crop height) using drone-based remote sensing. Christiansen et al. created a point cloud by combining GNSS and inertial measurement unit data with LiDAR data. The resulting point cloud was mapped and analyzed using functions of ROS and the Point Cloud Library. Based on the analyzed crop height and volume, it was possible to estimate crop yield and nitrogen concentration [7]. Torres-Sánchez et al. collected images at flight altitudes of 50 m and 100 m to generate a DSM. From image acquisition and accuracy analyses with different front and side overlaps at each altitude, 95% accuracy and an 85% time saving were achieved using 95% front overlap and 60% side overlap at a 100 m flight altitude [30]. Holman et al. collected images using a UAV equipped with an RGB camera and reconstructed the three-dimensional (3D) terrain through structure from motion (SfM). The crop height estimates were highly accurate (R² ≥ 0.92; root mean square error (RMSE) ≤ 0.07 m). Compared with terrestrial LiDAR (R² = 0.97, RMSE = 0.027 m) in comparative experiments, the SfM method using RGB did not reach the accuracy of LiDAR; however, it appeared more suitable in terms of time and cost effectiveness for the wide areas of agricultural sites, because LiDAR is expensive and requires a long scan time [31]. However, measurement errors may occur when the SfM method generates high-resolution 3D topography or field phenotypes from high-resolution images. Inaccurate GPS positioning [32] and plant movement due to wind or the rotor movement of UAVs [33] are considered the main sources of error. To reduce measurement error and minimize the effect of such disturbances, images should be acquired simultaneously from multiple perspectives. However, the commonly used commercial GCS programs simply assign pre-designed missions to single or multiple UAVs, so images are collected from a single perspective. As the UAVs operate independently to complete their designated mission areas, it is impossible to acquire images simultaneously from multiple perspectives through the collaboration of multiple UAVs to obtain high-precision field phenotypes.
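Crop height in such DSM-based pipelines is commonly obtained by subtracting a bare-ground terrain model from the surface model on a per-cell basis. A minimal sketch with NumPy follows; the arrays stand in for co-registered rasters, and the names are illustrative rather than taken from any of the cited workflows:

```python
import numpy as np

def crop_height_model(dsm, dtm):
    """Per-cell crop height: surface elevation minus ground elevation,
    clipped at zero so sensor noise cannot yield negative heights."""
    chm = np.asarray(dsm, float) - np.asarray(dtm, float)
    return np.clip(chm, 0.0, None)

# Example: 2 x 2 rasters, elevations in meters
dsm = [[101.2, 100.9], [100.4, 100.0]]
dtm = [[100.0, 100.0], [100.1, 100.1]]
chm = crop_height_model(dsm, dtm)
```

Errors in either raster, e.g., GPS drift shifting the DSM or wind-blown plants blurring the SfM reconstruction, propagate directly into the height difference, which is why the disturbance sources above matter.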
Many studies are underway to reconstruct 3D phenotype models of crops from images taken from multiple perspectives [34,35,36,37,38], as these yield higher accuracy than images taken from a single angle and allow the measurement of the major organs of the crop (leaf area, plant height, crown area, calyx size, etc.) required for crop monitoring. Zhu et al. observed the morphological changes (plant height, plant length, plant width, crown height, crown area, etc.) of soybean plants based on low-cost 3D reconstruction technology; the obtained correlation coefficients were higher than 0.98, indicating high accuracy [34]. He et al. used strawberry images taken from multiple angles (360°) to rapidly obtain a quantitative phenotypic analysis of external strawberry traits, such as height, length, width, and calyx size [36]. Zermas et al. created a 3D point cloud from high-resolution RGB images collected from multiple angles (circular orbits) using a UAV and a portable camera, and measured the LAI, individual and average plant height, leaf angle with respect to the stem, and leaf length. To overcome the limitations of two-dimensional (2D) images, the 3D point cloud was reconstructed by capturing images from multiple perspectives, which allowed the main parts of the plant to be measured in detail with excellent accuracy (LAI accuracy of 92.48%, plant height accuracy of 89.2%, and leaf length accuracy of 74.8%) [38]. High-resolution images taken from multiple perspectives are therefore required to improve the 3D model (phenotype) measurement accuracy and to measure the major organs of crops.
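Once such a multi-view point cloud exists, a trait like plant height reduces to a simple statistic over the point elevations. A minimal sketch follows, using percentile bounds rather than the raw minimum and maximum so that stray outlier points do not dominate; the percentile choices are illustrative, not taken from the cited studies:

```python
import numpy as np

def plant_height(points, ground_pct=1.0, top_pct=99.0):
    """Estimate plant height from an (N, 3) point cloud as the spread
    between low and high z-percentiles (robust to outlier points)."""
    z = np.asarray(points, float)[:, 2]
    return float(np.percentile(z, top_pct) - np.percentile(z, ground_pct))

# Example: a synthetic vertical profile of points from 0 m to 1.5 m
pts = [[0.0, 0.0, z] for z in np.linspace(0.0, 1.5, 101)]
height = plant_height(pts)
```

Finer traits such as leaf angle or stem diameter require segmenting the cloud into organs first, which is where the accuracy of the multi-view reconstruction becomes decisive.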
This study aimed to develop a system for the collaborative driving of multiple UAVs to improve the accuracy of field phenotyping in UAV-based agricultural remote sensing, thereby optimizing cost, efficiency, and productivity while reducing the impact on the external environment. The specific objectives of this study were: (1) to design, build, and test multiple UAVs; (2) to develop an algorithm enabling the collaborative driving of multiple UAVs; and (3) to verify the developed multi-UAV collaborative driving algorithm and optimize its variables through comparative verification experiments against a commercial GCS system in both a simulation environment and the real world.