Review

A Comprehensive Survey of the Recent Studies with UAV for Precision Agriculture in Open Fields and Greenhouses

by Muhammet Fatih Aslan 1, Akif Durdu 2,3, Kadir Sabanci 1, Ewa Ropelewska 4,* and Seyfettin Sinan Gültekin 2

1 Department of Electrical and Electronics Engineering, Karamanoglu Mehmetbey University, Karaman 70100, Turkey
2 Department of Electrical and Electronics Engineering, Konya Technical University, Konya 42130, Turkey
3 Robotics Automation Control Laboratory (RAC-LAB), Konya Technical University, Konya 42130, Turkey
4 Fruit and Vegetable Storage and Processing Department, The National Institute of Horticultural Research, Konstytucji 3 Maja 1/3, 96-100 Skierniewice, Poland
* Author to whom correspondence should be addressed.
Appl. Sci. 2022, 12(3), 1047; https://doi.org/10.3390/app12031047
Submission received: 7 December 2021 / Revised: 14 January 2022 / Accepted: 17 January 2022 / Published: 20 January 2022
(This article belongs to the Collection Agriculture 4.0: From Precision Agriculture to Smart Farming)

Abstract:
The increasing world population makes it necessary to confront challenges such as climate change and to make production efficient and fast. At the same time, minimizing cost, maximizing income, protecting against environmental pollution and saving water and energy must all be taken into account in this process. The use of information and communication technologies (ICTs) in agriculture to meet all of these criteria serves the purpose of precision agriculture. Because unmanned aerial vehicles (UAVs) can easily obtain real-time data, they have great potential to address and optimize solutions to the problems faced by agriculture. Despite limitations such as battery life, payload and weather conditions, UAVs will be used frequently in agriculture in the future because of the valuable data they obtain and their efficient applications. According to the known literature, UAVs have been carrying out tasks such as spraying, monitoring, yield estimation and weed detection. In recent years, articles related to agricultural UAVs have been published in journals with high impact factors. Most precision agriculture applications with UAVs take place in outdoor environments where GPS access is available, which provides more reliable control of the UAV in both manual and autonomous flights. On the other hand, there are almost no UAV-based applications in greenhouses, where all-season crop production is possible. This paper emphasizes this deficiency, provides a comprehensive review of the use of UAVs for agricultural tasks and highlights the importance of simultaneous localization and mapping (SLAM) for UAV solutions in the greenhouse.

1. Introduction

It is estimated that the world population will reach 10 billion by 2050 [1]. This situation intensifies the problems of agricultural needs and demands, and the solutions to these problems depend on efficient and fast production. Technologies such as robotics, computer science, artificial intelligence and the Internet of Things (IoT) can provide smart, efficient and fast tools for smart farming. Smart farming aims to produce useful insight into the soil and to use information and communication technology (ICT) services to collect and process information provided by multiple sources. The amounts of water, vegetation, pesticides, humidity, etc., change over time and space and therefore require continuous monitoring of crops for irrigation and spraying. Smart farming makes crop management easier and more efficient by using technological equipment in proportion to the specific needs of crops. In this way, this approach aims to use agricultural chemicals more appropriately, to save energy and products, to prevent agricultural pollution, to employ smart technology solutions and to create environmentally conscious production. All of these contribute significantly to crop productivity. In this context, precision agriculture can combine multiple analysis processes and technological tools related to all stages of production, from planting to harvest [2].
Finding solutions to agricultural demands depends on rapid production. For this reason, the number of robotic and computerized solutions used in smart agriculture has increased significantly in recent years [3,4,5]. As farms grow, so does the size of their equipment, and tasks such as irrigation, spraying, planting and pruning significantly increase the farmer’s workload. These tasks are highly suitable for autonomous robots, as they often require a large number of repetitions over a long period of time and a large area [6]. In addition, the application of industrial developments and technology to agriculture makes production sustainable by enabling more careful farming [7]. Therefore, robotic solutions have been proposed for various agricultural tasks, such as fruit harvesting [8], monitoring [9], loading and unloading of agricultural material [10], irrigation [11], fertilization [12], weed detection [13], automatic grafting [14] and sowing [15]. Most of these studies have been carried out on agricultural fields with suitable ground and structured environments using wheeled robots.

1.1. UAV and Precision Agriculture

In recent years, the use of unmanned aerial vehicles (UAVs) in agriculture has increased due to the development of UAV technology (see Figure 1). As mobile robots, UAVs offer a low-cost alternative for sensing and data analysis [16,17]. There are many types of UAVs, and even low-cost UAVs can collect high-resolution data from different points in space. Although UAVs have not yet been adopted in most precision agriculture applications, they are increasingly playing an active role in this field in terms of sustainable agricultural practices and profitability [18]. Moreover, introducing UAVs into agriculture significantly reduces labor requirements and improves measurement precision. If the data obtained with the UAV are evaluated and interpreted correctly, they can contribute to crop production and increase productivity.
UAVs can be particularly valuable for precision agriculture applications and have strong potential to increase the efficiency of water [19], crop [20] and pest [21] management. They can also perform a wide variety of agricultural operations, including soil health monitoring, fertilizer application and weather analysis. A UAV with flexible movement and a camera provides basic support to manpower in situation assessment and surveillance applications. In addition, multiple sensors can be used simultaneously on the UAV, and analyses can be strengthened with sensor fusion. UAVs can be used to extract vegetation indices that allow farmers to continuously monitor crop variability and stress conditions. Therefore, the agricultural UAV market has gradually expanded, and some UAV companies, such as DJI (https://www.dji.com/ (accessed on 6 December 2021)) and Parrot (https://www.parrot.com/ (accessed on 6 December 2021)), have developed UAVs for agricultural purposes [22].
To date, UAVs have mainly been used for pest detection/control and for monitoring widely grown crops such as corn, rice and beans. The most common precision agriculture applications performed with UAVs are monitoring and spraying [24]. In monitoring applications, information about the crop and vegetation indices is obtained with image analysis and remote sensing, and thus, various diseases or pests, plant health and plant growth can be monitored. In spraying applications, the required amounts of pesticides and fertilizers are sprayed to increase crop yield and prevent plant diseases. Recently, tasks such as mapping, weed detection, irrigation and remote sensing have also been performed with UAVs. Moreover, UAVs are expected to be used for purposes such as planting, transportation and soil and field surveys in the future [25]. These missions can also be performed using satellite imagery or aircraft, but UAV applications are more useful when image resolution, cost and difficulty are considered. In addition, UAVs can carry different types of cameras, such as thermal, multispectral and hyperspectral instruments, and can thus capture aerial images with different characteristics [26]. According to the Association for Unmanned Vehicle Systems International (AUVSI), 80% of UAVs will be used in agricultural applications in the near future. Therefore, it is obvious that the agricultural sector will need UAVs to a much greater extent in the future [4,24].
Although UAVs are suitable for agricultural applications, they have important technical limitations. Therefore, some questions should be answered to develop advanced and smart agricultural systems. Some of these questions are:
  • A UAV that works in a large agricultural area for crop monitoring, spraying, etc., should cover the field fully, but is the UAV battery sufficient for this mission duration? (A back-of-envelope coverage check is sketched after this list.)
  • Are the size of the field and the flight time of the UAV compatible?
  • Can the UAV operate autonomously in a closed environment, and is it reliable?
  • Is communication loss possible during the UAV mission?
  • Can the UAV carry payloads (RGB camera, multispectral camera, etc.) for different missions?
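To make the first two questions above concrete, the sketch below estimates whether one battery charge can cover a field flown in a lawnmower pattern. Every number in it (field size, swath width, speed, endurance) is an illustrative assumption rather than a value from any cited study.

```python
# Back-of-envelope check: can one battery charge cover the field?
# All parameter values are illustrative assumptions, not survey data.

field_area_m2 = 50 * 10_000   # assumed 50 ha field
swath_width_m = 20.0          # assumed effective camera footprint at altitude
speed_m_s = 8.0               # assumed cruise speed
endurance_s = 25 * 60         # assumed usable flight time per battery

# Lawnmower coverage: total path length is roughly area / swath width.
path_length_m = field_area_m2 / swath_width_m
required_time_s = path_length_m / speed_m_s
batteries_needed = required_time_s / endurance_s

print(f"Required flight time: {required_time_s / 60:.0f} min, "
      f"i.e., about {batteries_needed:.1f} battery charges")
# -> about 52 min here, i.e., roughly two battery charges for this assumed field
```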
Some studies claim that problems such as speed and battery life can be solved with multiple UAVs [27]. Multiple UAVs operate as a swarm, with a task assigned to each UAV. In this way, it is predicted that agricultural applications can be carried out quickly, and without battery constraints, over a large area of agricultural land [25]. However, the number of agricultural studies using multiple UAVs is much lower than the number using single UAVs. Performing a cooperative mission in open fields or a confined greenhouse environment with multiple UAVs also poses a reliability problem, and it is a challenge to establish an effective communication network between UAVs [28]. These questions and multi-UAV applications are active research areas that still await solutions.

1.2. Our Study and Contributions

This study reviews and groups previous studies on agricultural tasks recently performed with UAVs. Reflecting the importance of agricultural applications with UAVs, the number of review and survey studies has also increased. The increasing use of UAVs in precision agriculture and various UAV solutions were presented by Gago et al. [29]. Radoglou-Grammatikis, Sarigiannidis, Lagkas and Moscholios [4] discussed potential UAV applications in precision agriculture and reviewed 20 past precision agriculture UAV studies. Kim, Kim, Ju and Son [24] explained the characteristics, control and design of agricultural UAVs and then touched on UAV applications in agriculture such as mapping, spraying, planting, crop monitoring, irrigation and pest detection. Ju and Son [25] discussed multiple UAV systems in agriculture and emphasized that the performance of a multiple-UAV system is significantly superior to that of a single-UAV system. Tsouros, Bibi and Sarigiannidis [3] explained precision agriculture practices with UAVs, the types of UAVs used, data collection and data-processing methods and limitations.
This study surveys UAV studies and, unlike previous surveys, divides agricultural tasks into two settings. In the first, outdoor UAV studies are discussed; in the second, UAV applications in the greenhouse (indoors) are investigated. Attention is drawn to the necessity and importance of the use of UAVs for precision agriculture tasks in greenhouses. This study emphasizes that the number of studies on UAVs in the greenhouse is low and proposes simultaneous localization and mapping (SLAM) as a solution for indoor (greenhouse) autonomous UAVs.
Because the use of UAVs for agricultural purposes is a field that is progressing very rapidly, survey and review studies are important. The main contributions of this study can be summarized as follows:
  • Recent agricultural applications of UAVs, mostly from 2020, are extensively discussed.
  • UAV agricultural applications are discussed in two categories, i.e., indoor and outdoor environments.
  • The importance, necessity and inadequacy of greenhouse UAV missions are emphasized.
  • The importance of SLAM for autonomous agricultural UAV solutions in the greenhouse is explained.
The remainder of this manuscript is organized as follows. In Section 2, information is presented about the different agricultural tasks performed with UAVs in open fields, and previous agricultural studies are discussed. Section 3 highlights the necessity and lack of UAV solutions in the greenhouse and discusses previous UAV studies carried out in the greenhouse. Section 4 states that the SLAM problem is vital for performing autonomous missions with a UAV inside the greenhouse. Finally, Section 5 discusses the entire manuscript and concludes the study.

2. Survey for Outdoor Agricultural UAV Applications

This section addresses outdoor UAV studies carried out for different precision agriculture missions. UAV-based precision agriculture studies include applications such as crop monitoring, mapping and irrigation. Useful information about the crop can be obtained from images acquired through the UAV camera. Torres-Sánchez et al. [30] evaluated the accuracy, spatial and temporal consistency and precision of six vegetation indices (CIVE, ExG, ExGR, Woebbecke index, NGRDI and VEG) in a wheat crop using images from a low-cost camera attached to a UAV. The ExG and VEG indices achieved the best accuracy in vegetation fraction mapping, with values of 87.73–91.99% at a height of 30 m and 83.74–87.82% at a height of 60 m. This study demonstrated that a UAV with a low-cost camera can be used in precision agriculture applications such as weed management. According to Zhang et al. [31], accurately mapping individual plants from UAV images is difficult, given the large variations in plant sizes, geometries and distributions. In this context, the authors used frailejones plants as an example and proposed the Scale Sequence Residual U-Net (SS Res U-Net) semantic segmentation model. The proposed method provided a more successful classification than the FCN, standard U-Net, Res U-Net and MS Res U-Net methods. Johansen et al. [32] used both multispectral UAV and WorldView-3 images to quickly and consistently assess macadamia tree condition from remotely sensed images. In the application, covering three different areas, trees were assigned to excellent, good, moderate, fair and poor classes using a random forest classifier, and tree condition was more successfully predicted with WorldView-3 images.
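Two of the RGB-only indices named above have simple closed forms: ExG = 2g − r − b on normalized chromatic coordinates, and NGRDI = (G − R)/(G + R). The following minimal sketch computes both and a vegetation fraction; the fixed threshold is an assumption for illustration, whereas Torres-Sánchez et al. derive their thresholds from the imagery itself.

```python
import numpy as np

def excess_green(rgb: np.ndarray) -> np.ndarray:
    """ExG = 2g - r - b on chromatic coordinates (rgb: HxWx3, values in [0, 1])."""
    total = rgb.sum(axis=2) + 1e-8                 # avoid division by zero
    r, g, b = (rgb[..., i] / total for i in range(3))
    return 2 * g - r - b

def ngrdi(rgb: np.ndarray) -> np.ndarray:
    """NGRDI = (G - R) / (G + R), another RGB-only vegetation index."""
    r, g = rgb[..., 0], rgb[..., 1]
    return (g - r) / (g + r + 1e-8)

def vegetation_fraction(index_map: np.ndarray, threshold: float = 0.1) -> float:
    """Share of pixels classified as vegetation; the threshold is an assumed value."""
    return float((index_map > threshold).mean())
```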
Numerous further examples could be cited; the number of studies using UAVs for agricultural purposes is very large. Grouping these studies according to their tasks therefore provides the reader with more useful information. For this reason, this study investigates recent state-of-the-art studies on the use of UAVs in precision agriculture obtained from the Web of Science; the surveyed articles appear in journals with high impact factors. Previous studies have generally aimed to perform an outdoor agricultural task with UAVs. Table 1 provides information about the tasks, years, agricultural products handled, types of UAV and purposes of these recent studies. An article can address several tasks at the same time; for example, most of the mapping articles also involve crop monitoring and remote sensing. Table 1 shows that UAV solutions are being actively explored in agriculture and that the UAV missions performed outdoors are very diverse. The definitions of these tasks can be generalized as in Table 1, although the applications cover different areas and different crops.
Table 1 shows that studies of UAVs performing agricultural tasks have recently been widely published in Web of Science indexed journals. This situation is discussed in more detail by Cerro, Cruz Ulloa, Barrientos and de León Rivas [23]. It is therefore clear that interest in agricultural UAVs has increased in recent years and that UAVs will play a major role in the future of agriculture [25]. The studies in Table 1 show that many different crops can benefit from the capabilities of the UAV, such as data collection and flexible mobility; that is, large gains in production and yield can be achieved with the UAV. Most agricultural products require constant control, and changes that are invisible to the human eye can sometimes occur in crops. The UAV is a very important alternative for many crops, especially for addressing such problems. For this reason, studies similar to those in Table 1 are expected to be applied frequently in the future.
The UAV missions with the most applications are described in more detail below. The applications in Table 1 that are not explained in detail below are path planning, field monitoring and artificial pollination. Path-planning and field-monitoring studies aim to control the UAV performing a task in a more optimized way, rather than to carry out an agricultural task directly. For a more realistic and generalizable scenario, some problems regarding the autonomous movement of the UAV need to be resolved; these can be summarized as movement in a dynamic environment, collision avoidance and determination of the shortest path between mission points. Such studies are very few in the field of agriculture. Another agricultural UAV task with a very limited number of studies is artificial pollination, which aims to distribute pollen among plants using small UAVs. This field also has a gap in terms of application and needs new studies.

2.1. Crop Monitoring

Crop monitoring involves taking measurements for the efficient cultivation of a crop based on images captured with remote sensing methods. In this context, the yield, growth and diseases of the crop are generally estimated. These controls are difficult to perform manually, especially on very large farms, which are therefore often monitored via satellites and aircraft. However, satellites and airplanes involve difficulties such as high altitude, higher costs and the effect of clouds on the ability to take clear pictures [33]. These conditions also do not meet the requirements for precision crop monitoring. UAVs, by contrast, can obtain high-resolution images at close range and at low cost [24]. For this reason, it is reasonable to perform these tasks with a UAV at minimum cost. In addition, thanks to the different cameras a UAV can carry, different index information about the crops is obtained; these indices are also important for crop yield control [34]. The sensors used for this purpose are generally thermal, RGB, multispectral and hyperspectral sensors (see Figure 2) [24].
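As an illustration of such index information, the normalized difference vegetation index (NDVI), a standard index computed from the near-infrared and red bands of a multispectral camera (used here as a generic example, not tied to a specific cited study), can be obtained per pixel as follows; the stress threshold is an assumed value.

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """NDVI = (NIR - Red) / (NIR + Red); values near 1 suggest dense healthy
    vegetation, values near 0 bare soil, negative values water or shadow."""
    return (nir - red) / (nir + red + 1e-8)

# Stand-ins for co-registered multispectral bands from a UAV camera:
nir = np.random.rand(100, 100)
red = np.random.rand(100, 100)
stressed_mask = ndvi(nir, red) < 0.4   # 0.4 is an illustrative threshold
```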

2.2. Mapping

Mapping applications aim to track agricultural or forest areas at a local scale. This is achieved by obtaining 2D or 3D maps of agricultural land observed by UAVs. In this way, changes in factors such as current production, efficiency and environmental conditions are observed over time. In this context, mapping can be performed by overlaying information such as the area of the agricultural land, soil diversity and crop status on images taken with the UAV. For example, understanding soil variability is one of the oldest challenges facing farmers and researchers, and soil analysis and periodic observations enable a variable-rate fertilization system [4,24]. Low-altitude flight, high-resolution images and low cost have made UAV mapping applications frequent recently. As an example of mapping agricultural products with a UAV, Johansen, Duan, Tu, Searle, Wu, Phinn, Robson and McCabe [32] used both multispectral UAV and WorldView-3 images to map macadamia orchards and mapped different macadamia trees using a random forest approach.
Agricultural mapping applications often aim to mark or indicate information about plants or soil in an image. In contrast, mapping in mobile robotics, which is discussed later, aims to determine the geometric boundaries of the environment. In mobile robotics, if the boundaries of the environment are known, the mobile vehicle acts with knowledge of its limits in the environment and also localizes itself within it. To perform autonomous tasks in an indoor environment, the map of the environment must be known. This is why mapping in robotics is very different from mapping in the agricultural sense.
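To make the robotics sense of “mapping” concrete, the minimal sketch below maintains a 2D occupancy grid of the kind an indoor robot builds to localize itself; the grid size, resolution and inverse sensor model increments are illustrative assumptions.

```python
import numpy as np

RES = 0.1                                 # assumed metres per grid cell
grid = np.zeros((200, 200))               # log-odds of occupancy, 0 = unknown
L_OCC, L_FREE = 0.85, -0.4                # assumed inverse sensor model increments

def integrate_beam(pose_xy, hit_xy):
    """Mark a range beam's endpoint occupied and the cells along it free."""
    x0, y0 = (int(c / RES) for c in pose_xy)
    x1, y1 = (int(c / RES) for c in hit_xy)
    n = max(abs(x1 - x0), abs(y1 - y0), 1)
    for i in range(n):                    # simple ray traversal toward the hit
        xi = x0 + round(i * (x1 - x0) / n)
        yi = y0 + round(i * (y1 - y0) / n)
        grid[yi, xi] += L_FREE
    grid[y1, x1] += L_OCC

integrate_beam((5.0, 5.0), (8.0, 5.0))    # robot at (5,5) senses an obstacle at (8,5)
occupancy_prob = 1 / (1 + np.exp(-grid))  # convert log-odds back to probability
```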

2.3. Spraying

One of the most popular precision agriculture applications in recent years is UAV-based spraying. Agricultural chemicals are sprayed on crops to increase yield and to reduce plant diseases and pests. However, unbalanced and excessive use harms both environmental and human health, contributing to diseases such as cancer and neurological disorders [4,38]. Compared with fast, indiscriminate sprayers, UAVs can reduce pesticide use while maximizing plant health and yield [4,24]. The chemical pesticide is sprayed on the plants, usually with a spraying system mounted on the UAV. With techniques such as image processing and artificial intelligence, the condition of the soil or plant is predicted, and spraying is performed accordingly. In a sample study on spraying, Martinez-Guanter et al. [39] carried out a low-cost, high-efficiency UAV spraying application for olive and citrus orchards.

2.4. Irrigation

Precision irrigation is a worldwide concern in terms of water consumption. The fact that 70% of the water consumed worldwide is used to irrigate crops highlights its importance [40]. A UAV equipped with spectral and thermal cameras can identify places where water is lacking so that water can be delivered to these areas intelligently. Image processing and artificial intelligence algorithms combined with different imaging techniques ensure that water is used where it is needed and in sufficient amounts. Soil morphology obtained by UAV enables these applications, and water waste is prevented [3]. Irrigation applications can be carried out similarly to spraying applications by loading water instead of pesticide. In future smart farming applications, it is anticipated that a collaborative irrigation system with UAVs, unmanned ground vehicles (UGVs) or swarm robots will be implemented [24]. Figure 3 shows an example of a UAV-based irrigation study: Park et al. [41] used thermal cameras for early detection of water stress in crops and detected areas with water deficiency using an adaptive crop water stress index (CWSI) method.
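For orientation, the classical empirical CWSI normalizes canopy temperature between the temperatures of a well-watered and a non-transpiring reference canopy. The sketch below implements this basic form with assumed values; the adaptive variant used by Park et al. estimates the reference temperatures from the imagery itself, which is not reproduced here.

```python
import numpy as np

def cwsi(t_canopy: np.ndarray, t_wet: float, t_dry: float) -> np.ndarray:
    """Classical empirical CWSI: 0 = fully transpiring (no stress), 1 = fully
    stressed. t_wet / t_dry are the well-watered and non-transpiring canopy
    reference temperatures."""
    return np.clip((t_canopy - t_wet) / (t_dry - t_wet), 0.0, 1.0)

# Illustrative thermal frame (degrees C) and assumed reference temperatures:
thermal = 26 + 6 * np.random.rand(120, 160)
stress_map = cwsi(thermal, t_wet=24.0, t_dry=36.0)
needs_water = stress_map > 0.6            # assumed actionable threshold
```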

2.5. Weed Detection

Weed detection is another precision agriculture application. Weeds in a field adversely affect the growth of the main crop and cause losses in crop yield and growth. To prevent this, weeds should be detected and their growth prevented [42]. However, as the distribution of weeds is irregular (heterogeneous), precise detection is required, and deep learning-based methods have been studied very actively for this purpose recently [43]. After successful detection, weed control is achieved using herbicides. Traditional methods overuse herbicides, resulting in reduced crop yields. Instead, spraying a sufficient amount of herbicide on weeds detected by UAV is much more beneficial in terms of cost, environmental pollution and yield. Marking weedy areas on UAV images (mapping) is required for precise spraying of the herbicide [3,44]. In a sample study, Bah et al. [45] proposed a method that applies deep learning to UAV images to detect weeds in spinach and bean fields. Images of the detected weeds are shown in Figure 4.
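As a sketch of the deep learning approach mentioned above, a UAV orthomosaic can be tiled into small patches that a CNN labels as crop or weed. The architecture and patch size below are illustrative assumptions, not the network of Bah et al., whose method additionally exploits crop-row structure to generate training labels.

```python
import torch
import torch.nn as nn

class WeedPatchNet(nn.Module):
    """Minimal CNN that classifies 64x64 RGB patches as crop vs. weed."""
    def __init__(self, n_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Linear(32 * 16 * 16, n_classes)  # 64 -> 32 -> 16 after pooling

    def forward(self, x):                 # x: (B, 3, 64, 64) image patches
        return self.head(self.features(x).flatten(1))

model = WeedPatchNet()
logits = model(torch.randn(8, 3, 64, 64))        # 8 dummy patches
weed_probability = logits.softmax(dim=1)[:, 1]   # per-patch weed score
```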

2.6. Remote Sensing

Remote sensing covers applications that evaluate the progress of production at different times. The data obtained and the applications built on them generally relate to remote sensing because it enables a low-cost and straightforward evaluation of crop, soil and environmental conditions. Remote sensing is the most widely applied category of precision agriculture due to its necessity and importance. Thanks to remote sensing technologies, potential problems in agriculture are detected early, and measures can be taken to solve them promptly, which maintains crop efficiency. In this context, remote sensing covers a wide range of applications: all studies that aim to increase the productivity of agricultural products by using remotely sensed data can be included in this category [4,46].
In contrast to remote sensing, manual methods for obtaining the phenotypic characteristics of crops directly measure quantities such as biomass, leaf area index (LAI) and chlorophyll content. However, manual methods require device operators to work intensively in the field and are therefore difficult and time-consuming. To eliminate these disadvantages, technological solutions have brought significant developments to this field [34]. Remote sensing attracts more attention today due to developments in robotic technology, sensors and data processing. Most applied remote sensing studies are carried out using aerial images taken by satellites, UAVs and manned aircraft. With these tools, data can be collected at various spatial, temporal and spectral resolutions; they can cover large areas and are nondestructive. Remote sensing by satellites and aircraft provides the ability to collect data on a large spatial scale, but long revisit periods, cloud cover and high costs limit the use of these tools, and satellite images additionally have low spatial resolution [47]. For this reason, UAVs have recently become much more popular than other vehicles, with advantages such as flexibility of movement, ease of use, low cost and high spatial resolution [48].
An example of remote sensing using a UAV was performed by Ye et al. [49], who aimed to detect areas of banana plants infected or uninfected with Fusarium wilt using multispectral images acquired with a UAV. Figure 5 shows the resulting Fusarium wilt disease distribution: green areas indicate healthy areas, and yellow areas show infected banana areas.

3. UAV Solutions in Greenhouses

The use of greenhouses is growing, and thus, the distance between the grower and the crop is increasing, so new technologies are being introduced for the care and preservation of crops. More accurate data on crop growth and local growing conditions show the grower how and where crop problems arise, making it easier to understand when and where productivity problems occur and how to fix them. Unfortunately, current data collection techniques have limited spatial resolution and are labor-intensive [22,77,78].
Greenhouse farming is one of the most suitable fields for combining robotics, automation and computer technologies [79]. Most greenhouses have climate control systems, usually consisting of temperature and humidity sensors together with irrigation, ventilation and heating systems. Thanks to these systems, it is possible to grow plants in the greenhouse throughout the year, and they offer a wide variety of possibilities, including climate control and production monitoring. However, they are costly and complex solutions with reliability issues. The later-emerging wireless sensor networks (WSNs) became very popular for such projects, with advantages including modularity and low power consumption [80]. For this reason, they have been used in greenhouses for monitoring [81] and precision agriculture [82,83] in many studies. In general, however, such automatic systems do not fully meet the requirements for precise temperature, humidity, etc., settings; therefore, yield losses occur in greenhouses [84].
Although greenhouse solutions with WSNs have been presented in many studies, their experimental application in the greenhouse has been limited, usually because WSNs are practical only in small areas. For example, Erazo-Rodas, Sandoval-Moreno, Muñoz-Romero, Huerta, Rivas-Lalaleo, Naranjo and Rojo-Álvarez [81] and Rodríguez, Gualotuña and Grilo [82] implemented WSNs in small greenhouses. Jiang et al. [85] conducted a WSN-based monitoring study in a larger greenhouse, but only by adding a large number of nodes (wireless sensors), resulting in a more costly and more complex solution for precision agriculture.
The recent activity of UAVs in agriculture has overshadowed WSN and wheeled agricultural robot solutions. Unlike WSNs, wheeled robots and other solutions, UAVs can take measurements at nearly any point in the three-dimensional space of the greenhouse, making activities such as local climate control and crop monitoring easier and more reliable. Moreover, with the UAV, crops can be monitored continuously at regular intervals (e.g., weekly or hourly), and changes in the crop can be tracked. Observing the plants from the air makes it easier to reveal problems such as water stress, soil variation and pest infestation. In addition, thanks to developing camera technology, diseases of plants that cannot be detected by the human eye can be easily monitored in multiband images by using different sensors (hyperspectral, multispectral, infrared, etc.) [86].
Despite the above-mentioned advantages of UAVs, their use in the greenhouse has been very limited in the studies reported to date. Studies on the use of UAVs in the greenhouse have generally focused on the measurement of water vapor (H2O), carbon dioxide (CO2) and methane (CH4), three important greenhouse gases [87,88,89,90,91]. These studies exploit the ease of positioning the UAV in the greenhouse and use the sensors on the UAV to take measurements in areas that are difficult to reach. However, in these studies, the UAV was positioned manually.
Apart from gas measurement, almost no precision agriculture studies have been carried out on greenhouse applications such as crop monitoring, weed detection, yield estimation or plant temperature stress. In other words, the tasks performed with UAVs in outdoor environments, shown in Table 1, have not yet been adequately implemented in indoor environments; there are very few studies on precision agriculture with UAVs indoors. Table 2 shows experimental studies in greenhouses for precision agriculture with UAVs that did not involve gas measurement. This study aims to promote the idea that it is reasonable for UAVs to perform autonomous missions in the greenhouse. Although the studies in Table 2 did not perform autonomous UAV missions, they demonstrate that UAVs can operate in the greenhouse environment and thus suggest that autonomous missions there are feasible.
The tasks of UAVs in indoor applications are more difficult than those performed outdoors. A closed environment introduces problems of reliability, control and the unavailability of the global positioning system (GPS). Especially in autonomous applications, the absence of GPS requires extra effort for positioning. For this reason, the number of UAV studies conducted in greenhouses to date is insufficient, as indicated in Table 2, and needs improvement. Among these studies, Roldán, Joossen, Sanz, Del Cerro and Barrientos [86] measured air temperature, humidity, brightness and carbon dioxide concentration using sensors mounted on a mini-UAV, with the UAV controlled manually by the user. In a later study by the same team, Roldán, Garcia-Aunon, Garzón, De León, Del Cerro and Barrientos [79] included a UGV in addition to the UAV, aiming to combine the advantages of the two robots. The measured quantities were the same, and the UAV was carried on the ground robot; if the battery of the ground robot was low or the UGV encountered an obstacle, measurements were taken by flying the UAV manually. Simon et al. [92] tried to localize a UAV in a greenhouse by adding fixed nodes (via a WSN) to the closed area and positioning the UAV according to its distance from these nodes; the applied positioning technique is not practical for large indoor environments. Shi et al. [93] designed a UAV with adjustable deflectors for tomato pollination and stated at the end of the study that manual control of the UAV is quite difficult due to the deflectors and should be improved.
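The fixed-node positioning of Simon et al. can be illustrated with least-squares multilateration, the generic technique for estimating a position from ranges to known anchors; the anchor layout and noise level below are assumptions, not values from that study.

```python
import numpy as np

# Assumed WSN anchor positions (x, y, z in metres) at varied heights,
# and a ground-truth UAV position used only to simulate noisy ranges.
anchors = np.array([[0.0, 0.0, 0.5],
                    [20.0, 0.0, 3.0],
                    [0.0, 30.0, 2.0],
                    [20.0, 30.0, 4.0]])
true_pos = np.array([12.0, 14.0, 1.5])
ranges = np.linalg.norm(anchors - true_pos, axis=1) + 0.05 * np.random.randn(4)

# Subtracting the first anchor's range equation from the others linearizes
# ||x - a_i||^2 = r_i^2 into a solvable system A x = b.
a0, r0 = anchors[0], ranges[0]
A = 2 * (anchors[1:] - a0)
b = r0**2 - ranges[1:]**2 + np.sum(anchors[1:]**2, axis=1) - np.sum(a0**2)
estimate = np.linalg.lstsq(A, b, rcond=None)[0]
print("estimated UAV position:", estimate)
```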
Beyond the studies listed in Table 2, further examples of UAV applications in the greenhouse are as follows. A different greenhouse UAV study was presented by Amador and Hu [94] (see Figure 6a). Motivated by declining bee populations, an artificial pollination study was proposed for flowers grown in the greenhouse: a lily was pollinated by a remote-controlled flying robot. The robot was covered with hairs, like a real bee, to which pollen adhered thanks to an ionic liquid gel. In a similar vein, UAVs can collect environmental data in the greenhouse and perform mapping; UAVs that can estimate yield and monitor plant health can thus provide great advantages for greenhouse workers thanks to their low cost and availability. Aware of these advantages, the four founders of Applied Drone Innovations (ADI) (http://applieddroneinnovations.nl/ (accessed on 6 December 2021)) in the Netherlands have developed UAV prototypes that monitor plant growth and environmental factors such as temperature, humidity, carbon dioxide, brightness and volatile organic compounds in the greenhouse [95] (see Figure 6b). They increased the effectiveness of UAVs in the greenhouse with tasks such as localization and flight path optimization in the indoor area of the greenhouse. In February 2020, the Dutch drone developer PATS (https://pats-drones.com/ (accessed on 6 December 2021)) introduced, with a video, a bat-like UAV solution to eliminate insect pests in the greenhouse environment (see Figure 6c). The aim was to maintain the ecological balance of the greenhouse ecosystem by eliminating pests with autonomously operating UAVs.
Although the usability and importance of the UAV in the greenhouse are obvious, greenhouses are a difficult flight environment for UAVs due to their structure. Greenhouse operators want to benefit from the labor savings, efficiency and business advantages that UAVs can offer, but they hesitate over the reliability of these systems. Indeed, it is difficult to carry out autonomous UAV operations in an indoor environment. Moreover, the type of plant grown is also a major factor in operating a UAV in the greenhouse: it is much more difficult for the UAV to perform tasks where plants grow tall, overlap and obscure the view of other plants, or where greenhouse rope is used for the plants. Apart from such situations, however, autonomous UAV missions in the greenhouse make a great deal of sense. While outdoor applications of UAVs are numerous because they are reliable and easy, the use of UAVs in closed areas such as greenhouses is still limited today. Nevertheless, their wide application range, low cost, versatility and precision indicate a promising future for UAVs in indoor farming. A robotic system that accurately detects insects and pests among thousands of plants and produces maps and data visualizations for greenhouse farmers is highly intriguing.
It is not easy for a UAV to perform autonomous tasks in indoor areas such as greenhouses. In the greenhouse, it is very difficult for the UAV to navigate autonomously while sowing, spraying, irrigating, etc., in an unstructured environment, especially without damaging small leaves, flowers, etc. The UAV shown in Figure 6a performs the pollination task under (manual) control by a human. The UAVs produced by ADI in Figure 6b are used in crop management and disease and pest detection; after implementing manual tasks since 2015, ADI expanded its initiatives last year to fly the UAV autonomously. The newly introduced UAV developed by PATS is used to destroy pests, as shown in Figure 6c; the company aims to develop mini-UAVs against greenhouse insects. The difficulty of developing autonomous UAVs in indoor environments arises from demanding tasks such as UAV control, image processing, artificial intelligence algorithms, payload handling, positioning and navigation, which must be performed simultaneously. Processing the environmental data received by the UAV in real time on a central computer, sending the results obtained from the processed data back to the UAV and assigning tasks require robust software and algorithmic knowledge. At the same time, the hardware requirements to support this software system are also important.

4. Solution Proposal for UAV Applications in Greenhouses

UAVs need to be developed for smart greenhouses. However, to perform tasks in the greenhouse with a fully autonomous UAV, the solutions proposed for autonomous mobile robots should be examined. Since most of the tasks in the greenhouse are repetitive, such as monitoring and control, they are very suitable for mobile robotics. Instead of people constantly visiting the greenhouse to watch for changes, it is more logical for these activities to be performed by a UAV stationed in the greenhouse that performs tasks at certain time intervals. The basic requirement for this is to provide the UAV with localization and mapping ability in the indoor environment. With this ability, the 3D position of the UAV in the greenhouse can be obtained at any time. In this way, UAV solutions that assess plants individually, collect data and develop predictions such as plant yield and growth can be applied in a greenhouse (or other indoor environment).
The SLAM [96,97] solution is essential for autonomous UAVs in an indoor environment. SLAM is one of the main problems to be solved in autonomous mobile robotics. In essence, the goal is simply a robot moving through an unknown environment: the environment must be mapped, and simultaneously, the robot must be localized within the constantly growing map [98]. If a mobile robot can solve the SLAM problem, it can move independently in the environment and perform tasks. SLAM is the first problem to be solved for the localization, mapping and path-planning tasks required for autonomous movement [99]. SLAM simultaneously creates an environment map consisting of different features (markers, obstacles, etc.) and determines the absolute position of the robot using this map. In this way, the robot recognizes its environment, determines its own position (ego-motion) accordingly and finally becomes ready to perform the task assigned to it.
Localization in outdoor environments is generally carried out with GPS. Since the accuracy of GPS is low in indoor environments, UAVs must also be localized in cases where GPS is not available. In addition, GPS signals can easily be degraded by weather, electromagnetic noise and densely placed tall buildings. Therefore, the UAV must be localized indoors with sensors such as a camera (visual SLAM (VSLAM)), an inertial measurement unit (IMU), LIDAR, etc. Although it is difficult for the UAV to perform tasks in indoor environments, studies on SLAM for indoor UAV localization have recently increased. In a SLAM study that we previously conducted [100], we compared state-of-the-art indoor visual-inertial SLAM (VISLAM)/visual-inertial odometry (VIO) methods with our proposed method; the deep learning-based approach achieved more accurate indoor UAV localization (odometry). Figure 7 is a screenshot from the implementation of that study. The trajectory curve shows that the DJI Tello drone is localized in our indoor lab (which can be thought of as a greenhouse) in real time using visual and IMU data. If such applications are deployed in the greenhouse, the instantaneous position of the UAV can be known, as seen in Figure 7. In this way, different tasks can be assigned to the UAV, and the UAV can be directed to a specific coordinate to perform a specific treatment on a particular plant there. Desired measurements and images can thus be taken at certain points or of certain plants. The development of such applications will also make a significant contribution to the smart greenhouses of the future.
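As a flavor of the camera-based localization discussed above, the sketch below estimates the frame-to-frame pose of a monocular camera with OpenCV (ORB features, essential matrix with RANSAC), the front-end building block of many VSLAM systems; the camera intrinsics are assumed values, and the recovered translation is known only up to scale, which is precisely the gap that VIO fills with IMU data.

```python
import cv2
import numpy as np

K = np.array([[920.0, 0, 640],            # assumed camera intrinsics
              [0, 920.0, 360],
              [0, 0, 1]])
orb = cv2.ORB_create(2000)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def relative_pose(img_prev, img_curr):
    """Estimate rotation and unit-scale translation between two grayscale frames."""
    kp1, des1 = orb.detectAndCompute(img_prev, None)
    kp2, des2 = orb.detectAndCompute(img_curr, None)
    matches = matcher.match(des1, des2)   # assumes enough texture for matches
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])
    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
    return R, t                           # chain over frames to build a trajectory
```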
Since SLAM has been an active topic in mobile robotics for many years, new studies and techniques are still being developed; the area evolves rapidly but is still not fully solved. Therefore, a large number of different methods have been proposed for indoor SLAM. To provide examples of SLAM solutions with different sensors, Table 3 describes SLAM, VSLAM and VISLAM studies. ORB-SLAM, large-scale direct SLAM (LSD-SLAM), semi-direct visual odometry (SVO) and the visual-inertial system (VINS) mentioned in Table 3 are frequently used methods in the field of autonomous mobile robotics. Numerous other indoor SLAM studies could be added to Table 3.
In Table 3, LIDAR was used for SLAM in the first two studies. While this sensor makes the application easier, it has a cost disadvantage. Today, the most popular SLAM studies employ low-cost methods that aim to realize SLAM using only cameras. Popular examples of these studies, called VSLAM or monocular SLAM, are listed in rows 3–5 of Table 3. Although VSLAM methods are quite successful, their main weakness is the inability to recover scale information. To solve this scale uncertainty problem, VIO/VISLAM approaches have been developed that combine an inertial sensor (IMU) with a monocular SLAM system to provide scale. Prominent VIO/VISLAM studies are included in rows 6–8 of Table 3, together with detailed information on each study.
Although UAVs have been applied to spraying, imaging, mapping and fertilization in open fields (see Table 1), in indoor environments such as greenhouses, agricultural practices with UAVs have almost never been applied due to localization and mapping difficulties (see Table 2). The UAVs used in previous greenhouse applications were manually controlled, and measurements were made with the sensors they carried. The studies in Table 3 are successful attempts at self-localization and environment recognition by a UAV in indoor environments; they present methodological recommendations for SLAM and compare UAV pose prediction errors with previous studies. Indoor SLAM is often an important requirement in the robotics industry, for example in search and rescue or defense in environments without GPS access, and in the entertainment industry, for example in virtual reality [101,102]. However, to our knowledge, there are no agricultural applications of such advanced SLAM methods in indoor environments such as greenhouses. The need for SLAM indoors arises from the need to perform autonomous tasks using only the sensors on the robot in a GPS-denied environment. Since smart greenhouses are likewise indoor environments where autonomous tasks will be performed by UAVs, SLAM solutions can be applied directly in greenhouses.
Considering the success and practicality of the methods in Table 3, it is clear that the problem of a UAV navigating autonomously in the greenhouse can be solved with modern SLAM techniques. An autonomous greenhouse UAV remains a challenge, as SLAM has not yet achieved the desired level of success; nevertheless, various tasks can already be performed by a UAV in a greenhouse with existing SLAM methods.
By using UAVs, which offer mobility, high-resolution imaging and low cost, the lack of studies on imaging, yield estimation and mapping in the greenhouse environment can be addressed with UAVs that collect data and perform monitoring in the greenhouse. SLAM methods make it possible to collect data and make observations autonomously in greenhouses, where production occurs in all seasons. Moreover, in this big data era, it may become easier and faster to obtain large datasets, which could be an exciting advance for smart greenhouses. An example image of a UAV (e.g., Parrot (https://www.parrot.com/ (accessed on 6 December 2021))) performing autonomous missions in a greenhouse environment is given in Figure 8. If control, monitoring and prediction tasks can be performed autonomously with UAVs, crops can be grown in a greenhouse environment, as in Figure 8, in the future without the need for humans.

5. Conclusions

The role of computerized technologies in precision agriculture has developed rapidly thanks to advancing technology and software. In addition, different technological solutions in different areas are increasingly being developed to improve and optimize agricultural processes. In particular, the striking rise of UAVs in recent precision agriculture studies is noteworthy; the field is evolving rapidly, and innovations are continuously being developed. Therefore, this survey explores recent state-of-the-art studies that present the active use of UAVs for tasks in the agricultural field. As UAV technology has become popular, its applications in agriculture have increased very rapidly. For this reason, applications such as crop monitoring, spraying, irrigation, weed detection and mapping using aerial UAV images have been proposed in recent years. The general purpose of these studies is to achieve maximum gains in factors such as energy, water, labor and yield, which are important for precision agriculture.

5.1. Evaluation of Outdoor UAV Applications

Recent studies in indoor (greenhouse) and outdoor environments were considered for this survey. Recent studies readily show that the UAV has a wide variety of application areas in open fields and over wide areas. Many experimental studies on crop monitoring, weed detection, remote sensing, spraying, etc., have been widely applied in the open field. Large fields, low-altitude UAV flight, high-resolution images and low cost make such applications preferable; conversely, monitoring by plane or satellite is both difficult and expensive. In addition, various index information about the crop in the field can be obtained by equipping the UAV with cameras based on different imaging systems. This enables precision agriculture practices, such as balanced and sufficient irrigation and spraying, that protect human health and the environment. Thanks to GPS in outdoor areas, the UAV can be tracked under manual control, and missions can be carried out at desired sites over a wide area. In addition, since location information is available, UAVs can perform autonomous missions at specified locations. The results of this research indicate that open field UAV applications can be extended to many more agricultural products in the future.

5.2. Evaluation of Indoor UAV Applications (Greenhouse)

Despite the wide variety of work in outdoor areas, there are very few studies on the use of UAVs in the greenhouse, so greenhouse UAV solutions are insufficient, and no previous studies have emphasized this deficiency. Previous reviews have mostly introduced UAV applications and different UAV models in precision agriculture. In fact, the greenhouse environment is well suited to autonomous tasks. Factors such as the lack of GPS information indoors, the presence of obstacles during flight and a limited flight area make UAV control difficult in indoor environments such as greenhouses. However, the intended use of greenhouses requires constant control and monitoring of the condition of plants or crops. Performing such applications in very large greenhouses with an autonomously flying UAV, instead of repeatedly by humans, would make a great contribution to precision agriculture.
As an example of UAV applications in the greenhouse, plant-based (individual) monitoring of the growing conditions of vegetables can be provided, and various metrics directly related to plant growth can be obtained using imaging techniques. Plants showing unexpected growth can be identified, and thus, crop control can be achieved. In addition, continuous control of products in the greenhouse can be provided through processes such as weed detection, yield estimation, disease detection and plant temperature stress assessment. Although these are the first applications that come to mind, greenhouses have not been preferred in studies thus far. In fact, such tasks can be performed by a UAV with a camera using various artificial intelligence and image processing techniques; in this way, precision agriculture practices can be applied, and crop yields can be increased. However, using UAVs in a greenhouse environment involves many technical difficulties, ranging from the automation of the UAV flight controller to the development of the software necessary for archiving, retrieving and interpreting the collected big data. Determining the position of the aircraft will be a major challenge in indoor environments with limited flight space. Real-world difficulties such as insufficient GPS signal, dynamic obstacles in the environment, static obstacles in the task area and lighting changes require a large number of software threads to work in harmony.

5.3. UAV Solution Proposal for Smart Greenhouses

In this study, attention is drawn to the necessity of smart solutions in the greenhouse and the insufficiency of studies in this field. Increasing the number of UAV applications in the greenhouse would ensure smart, fast and efficient production in all seasons; therefore, a UAV that performs autonomous tasks in smart greenhouses is promising. Monitoring each plant or plant community at a known location by UAV and performing specific tasks on certain plants would create truly smart greenhouses. This requires precise positioning and knowledge of the structure of the environment, which can be achieved by solving the SLAM problem, which addresses the autonomous movement of a mobile robot in an indoor environment and of a UAV in the greenhouse. This study introduces SLAM at the end of the survey and gives examples of different state-of-the-art applications. These studies show that, thanks to developing SLAM methods, image processing tools and artificial intelligence algorithms, autonomous UAVs can be developed that collect data on crops in greenhouses and analyze and adjust conditions according to the collected data. As a result, a UAV that can perform autonomous missions using advanced SLAM applications may be the permanent worker of smart greenhouses in the future. Moreover, a UAV that makes autonomous observations can perform collaborative tasks with UGVs. In this way, crop yield is increased with the help of robots, and smart, fast production is realized. Mobile robots performing autonomous tasks provide great convenience, especially for workers in large greenhouse areas.
In conclusion, this survey highlights the need to use SLAM methods for the development of precision agriculture practices in the greenhouse and thus aims to contribute to future research, marketing and applications.

Author Contributions

Conceptualization, M.F.A.; methodology, M.F.A.; investigation, M.F.A., A.D. and K.S.; resources, M.F.A.; data curation, M.F.A.; writing—original draft preparation, M.F.A.; writing—review and editing, M.F.A., A.D., K.S. and E.R.; visualization, M.F.A.; supervision, E.R. and S.S.G. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Acknowledgments

The authors are grateful to the RAC-LAB (www.rac-lab.com (accessed on 6 December 2021)) for training and support.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Hunter, M.C.; Smith, R.G.; Schipanski, M.E.; Atwood, L.W.; Mortensen, D.A. Agriculture in 2050: Recalibrating targets for sustainable intensification. Bioscience 2017, 67, 386–391. [Google Scholar] [CrossRef] [Green Version]
  2. Cisternas, I.; Velásquez, I.; Caro, A.; Rodríguez, A. Systematic literature review of implementations of precision agriculture. Comput. Electron. Agric. 2020, 176, 105626. [Google Scholar] [CrossRef]
  3. Tsouros, D.C.; Bibi, S.; Sarigiannidis, P.G. A review on UAV-based applications for precision agriculture. Information 2019, 10, 349. [Google Scholar] [CrossRef] [Green Version]
  4. Radoglou-Grammatikis, P.; Sarigiannidis, P.; Lagkas, T.; Moscholios, I. A compilation of UAV applications for precision agriculture. Comput. Netw. 2020, 172, 107148. [Google Scholar] [CrossRef]
  5. Quaglia, G.; Visconte, C.; Scimmi, L.S.; Melchiorre, M.; Cavallone, P.; Pastorelli, S. Design of a UGV powered by solar energy for precision agriculture. Robotics 2020, 9, 13. [Google Scholar] [CrossRef] [Green Version]
  6. Pedersen, M.; Jensen, J. Autonomous Agricultural Robot: Towards Robust Autonomy. Master’s Thesis, Aalborg University, Aalborg, Denmark, 2007. [Google Scholar]
  7. Peña, C.; Riaño, C.; Moreno, G. RobotGreen: A teleoperated agricultural robot for structured environments. J. Eng. Sci. Technol. Rev. 2019, 12, 144–145. [Google Scholar] [CrossRef]
  8. Wang, C.; Liu, S.; Zhao, L.; Luo, T. Virtual Simulation of Fruit Picking Robot Based on Unity3d. In Proceedings of the 2nd International Conference on Artificial Intelligence and Computer Science, Hangzhou, China, 25–26 July 2020; p. 012033. [Google Scholar]
  9. Ishibashi, M.; Iida, M.; Suguri, M.; Masuda, R. Remote monitoring of agricultural robot using web application. IFAC Proc. Vol. 2013, 46, 138–142. [Google Scholar] [CrossRef]
  10. Han, L.; Ruijuan, C.; Enrong, M. Design and simulation of a handling robot for bagged agricultural materials. IFAC-PapersOnLine 2016, 49, 171–176. [Google Scholar] [CrossRef]
  11. Chen, M.; Sun, Y.; Cai, X.; Liu, B.; Ren, T. Design and implementation of a novel precision irrigation robot based on an intelligent path planning algorithm. arXiv 2020, arXiv:2003.00676. [Google Scholar]
  12. Adamides, G.; Katsanos, C.; Parmet, Y.; Christou, G.; Xenos, M.; Hadzilacos, T.; Edan, Y. HRI usability evaluation of interaction modes for a teleoperated agricultural robotic sprayer. Appl. Ergon. 2017, 62, 237–246. [Google Scholar] [CrossRef] [PubMed]
  13. Zhang, Q.; Chen, M.S.; Li, B. A visual navigation algorithm for paddy field weeding robot based on image understanding. Comput. Electron. Agric. 2017, 143, 66–78. [Google Scholar] [CrossRef]
  14. Jiang, K.; Zhang, Q.; Chen, L.; Guo, W.; Zheng, W. Design and optimization on rootstock cutting mechanism of grafting robot for cucurbit. Int. J. Agric. Biol. Eng. 2020, 13, 117–124. [Google Scholar] [CrossRef]
  15. Jayakrishna, P.V.S.; Reddy, M.S.; Sai, N.J.; Susheel, N.; Peeyush, K.P. Autonomous Seed Sowing Agricultural Robot. In Proceedings of the 2018 International Conference on Advances in Computing, Communications and Informatics (ICACCI), Bengaluru, India, 19–22 September 2018; pp. 2332–2336. [Google Scholar]
  16. Norasma, C.; Fadzilah, M.; Roslin, N.; Zanariah, Z.; Tarmidi, Z.; Candra, F. Unmanned Aerial Vehicle Applications in Agriculture. In Proceedings of the 1st South Aceh International Conference on Engineering and Technology (SAICOET), Tapaktuan, Indonesia, 8–9 December 2018; p. 012063. [Google Scholar]
  17. Honrado, J.; Solpico, D.B.; Favila, C.; Tongson, E.; Tangonan, G.L.; Libatique, N.J. UAV Imaging with Low-Cost Multispectral Imaging System for Precision Agriculture Applications. In Proceedings of the 2017 IEEE Global Humanitarian Technology Conference (GHTC), San Jose, CA, USA, 19–22 October 2017; pp. 1–7. [Google Scholar]
  18. Pinguet, B. The Role of Drone Technology in Sustainable Agriculture. Available online: https://www.precisionag.com/in-field-technologies/drones-uavs/the-role-of-drone-technology-in-sustainable-agriculture/ (accessed on 10 October 2021).
  19. Alexandris, S.; Psomiadis, E.; Proutsos, N.; Philippopoulos, P.; Charalampopoulos, I.; Kakaletris, G.; Papoutsi, E.-M.; Vassilakis, S.; Paraskevopoulos, A. Integrating drone technology into an innovative agrometeorological methodology for the precise and real-time estimation of crop water requirements. Hydrology 2021, 8, 131. [Google Scholar] [CrossRef]
  20. López, A.; Jurado, J.M.; Ogayar, C.J.; Feito, F.R. A framework for registering UAV-based imagery for crop-tracking in Precision Agriculture. Int. J. Appl. Earth Obs. Geoinf. 2021, 97, 102274. [Google Scholar] [CrossRef]
  21. Yan, X.; Zhou, Y.; Liu, X.; Yang, D.; Yuan, H. Minimizing occupational exposure to pesticide and increasing control efficacy of pests by unmanned aerial vehicle application on cowpea. Appl. Sci. 2021, 11, 9579. [Google Scholar] [CrossRef]
  22. Delavarpour, N.; Koparan, C.; Nowatzki, J.; Bajwa, S.; Sun, X. A technical study on UAV characteristics for precision agriculture applications and associated practical challenges. Remote Sens. 2021, 13, 1204. [Google Scholar] [CrossRef]
  23. Cerro, J.D.; Cruz Ulloa, C.; Barrientos, A.; de León Rivas, J. Unmanned aerial vehicles in agriculture: A survey. Agronomy 2021, 11, 203. [Google Scholar] [CrossRef]
  24. Kim, J.; Kim, S.; Ju, C.; Son, H.I. Unmanned aerial vehicles in agriculture: A review of perspective of platform, control, and applications. IEEE Access 2019, 7, 105100–105115. [Google Scholar] [CrossRef]
  25. Ju, C.; Son, H.I. Multiple UAV systems for agricultural applications: Control, implementation, and evaluation. Electronics 2018, 7, 162. [Google Scholar] [CrossRef] [Green Version]
  26. Raeva, P.L.; Šedina, J.; Dlesk, A. Monitoring of crop fields using multispectral and thermal imagery from UAV. Eur. J. Remote Sens. 2019, 52, 192–201. [Google Scholar] [CrossRef] [Green Version]
  27. Erdelj, M.; Saif, O.; Natalizio, E.; Fantoni, I. UAVs that fly forever: Uninterrupted structural inspection through automatic UAV replacement. Ad Hoc Netw. 2019, 94, 101612. [Google Scholar] [CrossRef] [Green Version]
  28. Chen, W.; Liu, J.; Guo, H.; Kato, N. Toward robust and intelligent drone swarm: Challenges and future directions. IEEE Netw. 2020, 34, 278–283. [Google Scholar] [CrossRef]
  29. Gago, J.; Estrany, J.; Estes, L.; Fernie, A.R.; Alorda, B.; Brotman, Y.; Flexas, J.; Escalona, J.M.; Medrano, H. Nano and micro unmanned aerial vehicles (UAVs): A new grand challenge for precision agriculture? Curr. Protoc. Plant Biol. 2020, 5, e20103. [Google Scholar] [CrossRef] [PubMed]
  30. Torres-Sánchez, J.; Pena, J.M.; de Castro, A.I.; López-Granados, F. Multi-temporal mapping of the vegetation fraction in early-season wheat fields using images from UAV. Comput. Electron. Agric. 2014, 103, 104–113. [Google Scholar] [CrossRef]
  31. Zhang, C.; Atkinson, P.M.; George, C.; Wen, Z.; Diazgranados, M.; Gerard, F. Identifying and mapping individual plants in a highly diverse high-elevation ecosystem using UAV imagery and deep learning. ISPRS J. Photogramm. Remote Sens. 2020, 169, 280–291. [Google Scholar] [CrossRef]
  32. Johansen, K.; Duan, Q.; Tu, Y.-H.; Searle, C.; Wu, D.; Phinn, S.; Robson, A.; McCabe, M.F. Mapping the condition of macadamia tree crops using multi-spectral UAV and WorldView-3 imagery. ISPRS J. Photogramm. Remote Sens. 2020, 165, 28–40. [Google Scholar] [CrossRef]
  33. Park, S.; Lee, H.; Chon, J. Sustainable monitoring coverage of unmanned aerial vehicle photogrammetry according to wing type and image resolution. Environ. Pollut. 2019, 247, 340–348. [Google Scholar] [CrossRef]
  34. Deng, L.; Mao, Z.; Li, X.; Hu, Z.; Duan, F.; Yan, Y. UAV-based multispectral remote sensing for precision agriculture: A comparison between different cameras. ISPRS J. Photogramm. Remote Sens. 2018, 146, 124–136. [Google Scholar] [CrossRef]
  35. Maimaitijiang, M.; Ghulam, A.; Sidike, P.; Hartling, S.; Maimaitiyiming, M.; Peterson, K.; Shavers, E.; Fishman, J.; Peterson, J.; Kadam, S. Unmanned Aerial System (UAS)-based phenotyping of soybean using multi-sensor data fusion and extreme learning machine. ISPRS J. Photogramm. Remote Sens. 2017, 134, 43–58. [Google Scholar] [CrossRef]
  36. Hassanein, M.; Lari, Z.; El-Sheimy, N. A new vegetation segmentation approach for cropped fields based on threshold detection from hue histograms. Sensors 2018, 18, 1253. [Google Scholar] [CrossRef] [Green Version]
  37. Bhandari, S.; Raheja, A.; Chaichi, M.; Green, R.; Do, D.; Pham, F.; Ansari, M.; Wolf, J.; Sherman, T.; Espinas, A. Effectiveness of UAV-Based Remote Sensing Techniques in Determining Lettuce Nitrogen and Water Stresses. In Proceedings of the 14th International Conference on Precision Agriculture, Montreal, QC, Canada, 24–27 June 2018; pp. 1066403–1066415. [Google Scholar]
  38. Dhouib, I.; Jallouli, M.; Annabi, A.; Marzouki, S.; Gharbi, N.; Elfazaa, S.; Lasram, M.M. From immunotoxicity to carcinogenicity: The effects of carbamate pesticides on the immune system. Environ. Sci. Pollut. Res. 2016, 23, 9448–9458. [Google Scholar] [CrossRef]
  39. Martinez-Guanter, J.; Agüera, P.; Agüera, J.; Pérez-Ruiz, M. Spray and economics assessment of a UAV-based ultra-low-volume application in olive and citrus orchards. Precis. Agric. 2020, 21, 226–243. [Google Scholar] [CrossRef]
  40. Chartzoulakis, K.; Bertaki, M. Sustainable water management in agriculture under climate change. Agric. Agric. Sci. Procedia 2015, 4, 88–98. [Google Scholar] [CrossRef] [Green Version]
  41. Park, S.; Ryu, D.; Fuentes, S.; Chung, H.; Hernández-Montes, E.; O’Connell, M. Adaptive estimation of crop water stress in nectarine and peach orchards using high-resolution imagery from an unmanned aerial vehicle (UAV). Remote Sens. 2017, 9, 828. [Google Scholar] [CrossRef] [Green Version]
  42. Islam, N.; Rashid, M.M.; Wibowo, S.; Xu, C.-Y.; Morshed, A.; Wasimi, S.A.; Moore, S.; Rahman, S.M. Early weed detection using image processing and machine learning techniques in an Australian chilli farm. Agriculture 2021, 11, 387. [Google Scholar] [CrossRef]
  43. Hasan, A.M.; Sohel, F.; Diepeveen, D.; Laga, H.; Jones, M.G. A survey of deep learning techniques for weed detection from images. Comput. Electron. Agric. 2021, 184, 106067. [Google Scholar] [CrossRef]
  44. Louargant, M.; Villette, S.; Jones, G.; Vigneau, N.; Paoli, J.N.; Gée, C. Weed detection by UAV: Simulation of the impact of spectral mixing in multispectral images. Precis. Agric. 2017, 18, 932–951. [Google Scholar] [CrossRef] [Green Version]
  45. Bah, M.D.; Hafiane, A.; Canals, R. Deep learning with unsupervised data labeling for weed detection in line crops in UAV images. Remote Sens. 2018, 10, 1690. [Google Scholar] [CrossRef] [Green Version]
  46. Khanal, S.; Kc, K.; Fulton, J.P.; Shearer, S.; Ozkan, E. Remote sensing in agriculture—Accomplishments, limitations, and opportunities. Remote Sens. 2020, 12, 3783. [Google Scholar] [CrossRef]
  47. Zhang, H.; Wang, L.; Tian, T.; Yin, J. A Review of unmanned aerial vehicle low-altitude remote sensing (UAV-LARS) use in agricultural monitoring in China. Remote Sens. 2021, 13, 1221. [Google Scholar] [CrossRef]
  48. Noor, N.M.; Abdullah, A.; Hashim, M. Remote Sensing UAV/Drones and Its Applications for Urban Areas: A Review. In Proceedings of the IOP Conference Series: Earth and Environmental Science; IOP Publishing: London, UK, 2018; p. 012003. [Google Scholar]
  49. Ye, H.; Huang, W.; Huang, S.; Cui, B.; Dong, Y.; Guo, A.; Ren, Y.; Jin, Y. Recognition of banana fusarium wilt based on UAV remote sensing. Remote Sens. 2020, 12, 938. [Google Scholar] [CrossRef] [Green Version]
  50. Allred, B.; Eash, N.; Freeland, R.; Martinez, L.; Wishart, D. Effective and efficient agricultural drainage pipe mapping with UAS thermal infrared imagery: A case study. Agric. Water Manag. 2018, 197, 132–137. [Google Scholar] [CrossRef]
  51. Christiansen, M.P.; Laursen, M.S.; Jørgensen, R.N.; Skovsen, S.; Gislum, R. Designing and testing a UAV mapping system for agricultural field surveying. Sensors 2017, 17, 2703. [Google Scholar] [CrossRef] [Green Version]
  52. Gašparović, M.; Zrinjski, M.; Barković, Đ.; Radočaj, D. An automatic method for weed mapping in oat fields based on UAV imagery. Comput. Electron. Agric. 2020, 173, 105385. [Google Scholar] [CrossRef]
  53. Schiefer, F.; Kattenborn, T.; Frick, A.; Frey, J.; Schall, P.; Koch, B.; Schmidtlein, S. Mapping forest tree species in high resolution UAV-based RGB-imagery by means of convolutional neural networks. ISPRS J. Photogramm. Remote Sens. 2020, 170, 205–215. [Google Scholar] [CrossRef]
  54. Pearse, G.D.; Tan, A.Y.S.; Watt, M.S.; Franz, M.O.; Dash, J.P. Detecting and mapping tree seedlings in UAV imagery using convolutional neural networks and field-verified data. ISPRS J. Photogramm. Remote Sens. 2020, 168, 156–169. [Google Scholar] [CrossRef]
  55. Freitas, H.; Faiçal, B.S.; de Silva, A.V.C.; Ueyama, J. Use of UAVs for an efficient capsule distribution and smart path planning for biological pest control. Comput. Electron. Agric. 2020, 173, 105387. [Google Scholar] [CrossRef]
  56. Tokekar, P.; Vander Hook, J.; Mulla, D.; Isler, V. Sensor planning for a symbiotic UAV and UGV system for precision agriculture. IEEE Trans. Robot. 2016, 32, 1498–1511. [Google Scholar] [CrossRef]
  57. Pan, Z.; Lie, D.; Qiang, L.; Shaolan, H.; Shilai, Y.; Yande, L.; Yongxu, Y.; Haiyang, P. Effects of citrus tree-shape and spraying height of small unmanned aerial vehicle on droplet distribution. Int. J. Agric. Biol. Eng. 2016, 9, 45–52. [Google Scholar]
  58. Faiçal, B.S.; Freitas, H.; Gomes, P.H.; Mano, L.Y.; Pessin, G.; de Carvalho, A.C.; Krishnamachari, B.; Ueyama, J. An adaptive approach for UAV-based pesticide spraying in dynamic environments. Comput. Electron. Agric. 2017, 138, 210–223. [Google Scholar] [CrossRef]
  59. Meng, Y.; Su, J.; Song, J.; Chen, W.-H.; Lan, Y. Experimental evaluation of UAV spraying for peach trees of different shapes: Effects of operational parameters on droplet distribution. Comput. Electron. Agric. 2020, 170, 105282. [Google Scholar] [CrossRef]
  60. Fu, Z.; Jiang, J.; Gao, Y.; Krienke, B.; Wang, M.; Zhong, K.; Cao, Q.; Tian, Y.; Zhu, Y.; Cao, W. Wheat growth monitoring and yield estimation based on multi-rotor unmanned aerial vehicle. Remote Sens. 2020, 12, 508. [Google Scholar] [CrossRef] [Green Version]
  61. Cao, Y.; Li, G.L.; Luo, Y.K.; Pan, Q.; Zhang, S.Y. Monitoring of sugar beet growth indicators using wide-dynamic-range vegetation index (WDRVI) derived from UAV multispectral images. Comput. Electron. Agric. 2020, 171, 105331. [Google Scholar] [CrossRef]
  62. Johansen, K.; Morton, M.J.; Malbeteau, Y.; Aragon, B.; Al-Mashharawi, S.; Ziliani, M.G.; Angel, Y.; Fiene, G.; Negrao, S.; Mousa, M.A. Predicting biomass and yield in a tomato phenotyping experiment using UAV imagery and random forest. Front. Artif. Intell. 2020, 3, 28. [Google Scholar] [CrossRef] [PubMed]
  63. Tetila, E.C.; Machado, B.B.; Astolfi, G.; Belete, N.A.d.S.; Amorim, W.P.; Roel, A.R.; Pistori, H. Detection and classification of soybean pests using deep learning with UAV images. Comput. Electron. Agric. 2020, 179, 105836. [Google Scholar] [CrossRef]
  64. Zhang, M.; Zhou, J.; Sudduth, K.A.; Kitchen, N.R. Estimation of maize yield and effects of variable-rate nitrogen application using UAV-based RGB imagery. Biosyst. Eng. 2020, 189, 24–35. [Google Scholar] [CrossRef]
  65. Wan, L.; Cen, H.; Zhu, J.; Zhang, J.; Zhu, Y.; Sun, D.; Du, X.; Zhai, L.; Weng, H.; Li, Y. Grain yield prediction of rice using multi-temporal UAV-based RGB and multispectral images and model transfer—A case study of small farmlands in the South of China. Agric. For. Meteorol. 2020, 291, 108096. [Google Scholar] [CrossRef]
  66. Kerkech, M.; Hafiane, A.; Canals, R. Vine disease detection in UAV multispectral images using optimized image registration and deep learning segmentation approach. Comput. Electron. Agric. 2020, 174, 105446. [Google Scholar] [CrossRef]
  67. Ashapure, A.; Jung, J.; Chang, A.; Oh, S.; Yeom, J.; Maeda, M.; Maeda, A.; Dube, N.; Landivar, J.; Hague, S.; et al. Developing a machine learning based cotton yield estimation framework using multi-temporal UAS data. ISPRS J. Photogramm. Remote Sens. 2020, 169, 180–194. [Google Scholar] [CrossRef]
  68. Li, B.; Xu, X.; Zhang, L.; Han, J.; Bian, C.; Li, G.; Liu, J.; Jin, L. Above-ground biomass estimation and yield prediction in potato by using UAV-based RGB and hyperspectral imaging. ISPRS J. Photogramm. Remote Sens. 2020, 162, 161–172. [Google Scholar] [CrossRef]
  69. Zheng, J.; Fu, H.; Li, W.; Wu, W.; Yu, L.; Yuan, S.; Tao, W.Y.W.; Pang, T.K.; Kanniah, K.D. Growing status observation for oil palm trees using Unmanned Aerial Vehicle (UAV) images. ISPRS J. Photogramm. Remote Sens. 2021, 173, 95–121. [Google Scholar] [CrossRef]
  70. Gomez Selvaraj, M.; Vergara, A.; Montenegro, F.; Alonso Ruiz, H.; Safari, N.; Raymaekers, D.; Ocimati, W.; Ntamwira, J.; Tits, L.; Omondi, A.B.; et al. Detection of banana plants and their major diseases through aerial images and machine learning methods: A case study in DR Congo and Republic of Benin. ISPRS J. Photogramm. Remote Sens. 2020, 169, 110–124. [Google Scholar] [CrossRef]
  71. Elmokadem, T. Distributed coverage control of quadrotor multi-uav systems for precision agriculture. IFAC-PapersOnLine 2019, 52, 251–256. [Google Scholar] [CrossRef]
  72. Hoffmann, H.; Jensen, R.; Thomsen, A.; Nieto, H.; Rasmussen, J.; Friborg, T. Crop water stress maps for an entire growing season from visible and thermal UAV imagery. Biogeosciences 2016, 13, 6545–6563. [Google Scholar] [CrossRef] [Green Version]
  73. Romero, M.; Luo, Y.; Su, B.; Fuentes, S. Vineyard water status estimation using multispectral imagery from an UAV platform and machine learning algorithms for irrigation scheduling management. Comput. Electron. Agric. 2018, 147, 109–117. [Google Scholar] [CrossRef]
  74. Jiyu, L.; Lan, Y.; Jianwei, W.; Shengde, C.; Cong, H.; Qi, L.; Qiuping, L. Distribution law of rice pollen in the wind field of small UAV. Int. J. Agric. Biol. Eng. 2017, 10, 32–40. [Google Scholar] [CrossRef]
  75. Dos Santos Ferreira, A.; Freitas, D.M.; da Silva, G.G.; Pistori, H.; Folhes, M.T. Weed detection in soybean crops using ConvNets. Comput. Electron. Agric. 2017, 143, 314–324. [Google Scholar] [CrossRef]
  76. Stroppiana, D.; Villa, P.; Sona, G.; Ronchetti, G.; Candiani, G.; Pepe, M.; Busetto, L.; Migliazzi, M.; Boschetti, M. Early season weed mapping in rice crops using multi-spectral UAV data. Int. J. Remote Sens. 2018, 39, 5432–5452. [Google Scholar] [CrossRef]
  77. Song, Y.; Wang, J.; Shan, B. Estimation of winter wheat yield from UAV-based multi-temporal imagery using crop allometric relationship and SAFY model. Drones 2021, 5, 78. [Google Scholar] [CrossRef]
  78. Sagan, V.; Maimaitijiang, M.; Sidike, P.; Maimaitiyiming, M.; Erkbol, H.; Hartling, S.; Peterson, K.; Peterson, J.; Burken, J.; Fritschi, F. UAV/satellite multiscale data fusion for crop monitoring and early stress detection. ISPRS Arch. 2019. [Google Scholar] [CrossRef] [Green Version]
  79. Roldán, J.J.; Garcia-Aunon, P.; Garzón, M.; de León, J.; del Cerro, J.; Barrientos, A. Heterogeneous multi-robot system for mapping environmental variables of greenhouses. Sensors 2016, 16, 1018. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  80. Hamouda, Y.E.; Elhabil, B.H. Precision Agriculture for Greenhouses Using a Wireless Sensor Network. In Proceedings of the 2017 Palestinian International Conference on Information and Communication Technology (PICICT), Gaza, Palestine, 8–9 May 2017; pp. 78–83. [Google Scholar]
  81. Erazo-Rodas, M.; Sandoval-Moreno, M.; Muñoz-Romero, S.; Huerta, M.; Rivas-Lalaleo, D.; Naranjo, C.; Rojo-Álvarez, J.L. Multiparametric monitoring in Equatorian tomato greenhouses (I): Wireless sensor network benchmarking. Sensors 2018, 18, 2555. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  82. Rodríguez, S.; Gualotuña, T.; Grilo, C. A system for the monitoring and predicting of data in precision agriculture in a rose greenhouse based on wireless sensor networks. Procedia Comput. Sci. 2017, 121, 306–313. [Google Scholar] [CrossRef]
  83. Mat, I.; Kassim, M.R.M.; Harun, A.N.; Yusoff, I.M. IoT in Precision Agriculture Applications Using Wireless Moisture Sensor Network. In Proceedings of the 2016 IEEE Conference on Open Systems (ICOS), Kedah, Malaysia, 10–12 October 2016; pp. 24–29. [Google Scholar]
  84. Komarchuk, D.S.; Gunchenko, Y.A.; Pasichnyk, N.A.; Opryshko, O.A.; Shvorov, S.A.; Reshetiuk, V. Use of Drones in Industrial Greenhouses. In Proceedings of the 2021 IEEE 6th International Conference on Actual Problems of Unmanned Aerial Vehicles Development (APUAVD), Kyiv, Ukraine, 19–21 October 2021; pp. 184–187. [Google Scholar]
  85. Jiang, J.-A.; Wang, C.-H.; Liao, M.-S.; Zheng, X.-Y.; Liu, J.-H.; Chuang, C.-L.; Hung, C.-L.; Chen, C.-P. A wireless sensor network-based monitoring system with dynamic convergecast tree algorithm for precision cultivation management in orchid greenhouses. Precis. Agric. 2016, 17, 766–785. [Google Scholar] [CrossRef]
  86. Roldán, J.J.; Joossen, G.; Sanz, D.; del Cerro, J.; Barrientos, A. Mini-UAV based sensory system for measuring environmental variables in greenhouses. Sensors 2015, 15, 3334–3350. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  87. Neumann, P.P.; Kohlhoff, H.; Hüllmann, D.; Lilienthal, A.J.; Kluge, M. Bringing Mobile Robot Olfaction to the Next Dimension—UAV-Based Remote Sensing of Gas Clouds and Source Localization. In Proceedings of the 2017 IEEE International Conference on Robotics and Automation (ICRA), Singapore, 29 May–3 June 2017; pp. 3910–3916. [Google Scholar]
  88. Khan, A.; Schaefer, D.; Tao, L.; Miller, D.J.; Sun, K.; Zondlo, M.A.; Harrison, W.A.; Roscoe, B.; Lary, D.J. Low power greenhouse gas sensors for unmanned aerial vehicles. Remote Sens. 2012, 4, 1355–1368. [Google Scholar] [CrossRef] [Green Version]
  89. Khan, A.; Schaefer, D.; Roscoe, B.; Sun, K.; Tao, L.; Miller, D.; Lary, D.J.; Zondlo, M.A. Open-Path Greenhouse Gas Sensor for UAV Applications. In Proceedings of the Conference on Lasers and Electro-Optics 2012, San Jose, CA, USA, 6 May 2012; p. JTh1L.6. [Google Scholar]
  90. Berman, E.S.F.; Fladeland, M.; Liem, J.; Kolyer, R.; Gupta, M. Greenhouse gas analyzer for measurements of carbon dioxide, methane, and water vapor aboard an unmanned aerial vehicle. Sens. Actuators B Chem. 2012, 169, 128–135. [Google Scholar] [CrossRef]
  91. Malaver, A.; Motta, N.; Corke, P.; Gonzalez, F. Development and integration of a solar powered unmanned aerial vehicle and a wireless sensor network to monitor greenhouse gases. Sensors 2015, 15, 4072–4096. [Google Scholar] [CrossRef] [PubMed]
  92. Simon, J.; Petkovic, I.; Petkovic, D.; Petkovics, A. Navigation and applicability of hexa rotor drones in greenhouse environment. Teh. Vjesn. 2018, 25, 249–255. [Google Scholar]
  93. Shi, Q.; Liu, D.; Mao, H.; Shen, B.; Liu, X.; Ou, M. Study on Assistant Pollination of Facility Tomato by UAV. In Proceedings of the 2019 ASABE Annual International Meeting, Boston, MA, USA, 7–10 July 2019; American Society of Agricultural and Biological Engineers: St. Joseph, MI, USA, 2019; p. 1. [Google Scholar]
  94. Amador, G.J.; Hu, D.L. Sticky solution provides grip for the first robotic pollinator. Chem 2017, 2, 162–164. [Google Scholar] [CrossRef] [Green Version]
  95. Simmonds, W.; Fesselet, L.; Sanders, B.; Ramsay, C.; Heemskerk, C. HiPerGreen: High Precision Greenhouse Farming; Inholland University of Applied Sciences: Diemen, The Netherlands, 2017.
  96. Durrant-Whyte, H.; Bailey, T. Simultaneous localization and mapping: Part I. IEEE Robot. Autom. Mag. 2006, 13, 99–110. [Google Scholar] [CrossRef] [Green Version]
  97. Bailey, T.; Durrant-Whyte, H. Simultaneous localization and mapping (SLAM): Part II. IEEE Robot. Autom. Mag. 2006, 13, 108–117. [Google Scholar] [CrossRef] [Green Version]
  98. Durdu, A.; Korkmaz, M. A novel map-merging technique for occupancy grid-based maps using multiple robots: A semantic approach. Turk. J. Electr. Eng. Comput. Sci. 2019, 27, 3980–3993. [Google Scholar] [CrossRef]
  99. Taheri, H.; Xia, Z.C. SLAM; definition and evolution. Eng. Appl. Artif. Intell. 2021, 97, 104032. [Google Scholar] [CrossRef]
  100. Yusefi, A.; Durdu, A.; Aslan, M.F.; Sungur, C. LSTM and Filter Based Comparison Analysis for Indoor Global Localization in UAVs. IEEE Access 2021, 9, 10054–10069. [Google Scholar] [CrossRef]
  101. Jinyu, L.; Bangbang, Y.; Danpeng, C.; Nan, W.; Guofeng, Z.; Hujun, B. Survey and evaluation of monocular visual-inertial SLAM algorithms for augmented reality. Virtual Real. Intell. Hardw. 2019, 1, 386–410. [Google Scholar] [CrossRef]
  102. Gurturk, M.; Yusefi, A.; Aslan, M.F.; Soycan, M.; Durdu, A.; Masiero, A. The YTU dataset and recurrent neural network based visual-inertial odometry. Measurement 2021, 184, 109878. [Google Scholar] [CrossRef]
  103. Dowling, L.; Poblete, T.; Hook, I.; Tang, H.; Tan, Y.; Glenn, W.; Unnithan, R.R. Accurate indoor mapping using an autonomous unmanned aerial vehicle (UAV). arXiv 2018, arXiv:1808.01940. [Google Scholar]
  104. Qin, H.; Meng, Z.; Meng, W.; Chen, X.; Sun, H.; Lin, F.; Ang, M.H. Autonomous exploration and mapping system using heterogeneous UAVs and UGVs in GPS-denied environments. IEEE Trans. Veh. Technol. 2019, 68, 1339–1350. [Google Scholar] [CrossRef]
  105. Mur-Artal, R.; Montiel, J.M.M.; Tardós, J.D. ORB-SLAM: A versatile and accurate monocular SLAM system. IEEE Trans. Robot. 2015, 31, 1147–1163. [Google Scholar] [CrossRef] [Green Version]
  106. Engel, J.; Schöps, T.; Cremers, D. LSD-SLAM: Large-Scale Direct Monocular SLAM. In European Conference on Computer Vision; Springer: Cham, Switzerland, 2014; pp. 834–849. [Google Scholar]
  107. Forster, C.; Pizzoli, M.; Scaramuzza, D. SVO: Fast Semi-Direct Monocular Visual Odometry. In Proceedings of the 2014 IEEE International Conference on Robotics and Automation (ICRA), Hong Kong, China, 31 May–7 June 2014; pp. 15–22. [Google Scholar]
  108. Qin, T.; Li, P.; Shen, S. VINS-Mono: A robust and versatile monocular visual-inertial state estimator. IEEE Trans. Robot. 2018, 34, 1004–1020. [Google Scholar] [CrossRef] [Green Version]
  109. Delmerico, J.; Scaramuzza, D. A Benchmark Comparison of Monocular Visual-Inertial Odometry Algorithms for Flying Robots. In Proceedings of the 2018 IEEE International Conference on Robotics and Automation (ICRA), Brisbane, QLD, Australia, 21–25 May 2018; pp. 2502–2509. [Google Scholar]
  110. Mourikis, A.I.; Roumeliotis, S.I. A Multi-State Constraint Kalman Filter for Vision-Aided Inertial Navigation. In Proceedings of the 2007 IEEE International Conference on Robotics and Automation, Rome, Italy, 10–14 April 2007; pp. 3565–3572. [Google Scholar]
  111. Leutenegger, S.; Lynen, S.; Bosse, M.; Siegwart, R.; Furgale, P. Keyframe-based visual–inertial odometry using nonlinear optimization. Int. J. Robot. Res. 2015, 34, 314–334. [Google Scholar] [CrossRef] [Green Version]
  112. Bloesch, M.; Omari, S.; Hutter, M.; Siegwart, R. Robust Visual Inertial Odometry Using a Direct EKF-Based Approach. In Proceedings of the 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Hamburg, Germany, 28 September–2 October 2015; pp. 298–304. [Google Scholar]
  113. Forster, C.; Zhang, Z.; Gassner, M.; Werlberger, M.; Scaramuzza, D. SVO: Semidirect visual odometry for monocular and multicamera systems. IEEE Trans. Robot. 2016, 33, 249–265. [Google Scholar] [CrossRef] [Green Version]
  114. Lynen, S.; Achtelik, M.W.; Weiss, S.; Chli, M.; Siegwart, R. A Robust and Modular Multi-Sensor Fusion Approach Applied to mav Navigation. In Proceedings of the 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems, Tokyo, Japan, 3–7 November 2013; pp. 3923–3929. [Google Scholar]
  115. Faessler, M.; Fontana, F.; Forster, C.; Mueggler, E.; Pizzoli, M.; Scaramuzza, D. Autonomous, vision-based flight and live dense 3D mapping with a quadrotor micro aerial vehicle. J. Field Robot. 2016, 33, 431–450. [Google Scholar] [CrossRef]
  116. Forster, C.; Carlone, L.; Dellaert, F.; Scaramuzza, D. On-manifold preintegration for real-time visual-inertial odometry. IEEE Trans. Robot. 2016, 33, 1–21. [Google Scholar] [CrossRef] [Green Version]
  117. Heo, S.; Cha, J.; Park, C.G. EKF-based visual inertial navigation using sliding window nonlinear optimization. IEEE Trans. Intell. Transp. Syst. 2019, 20, 2470–2479. [Google Scholar] [CrossRef]
Figure 1. The number of studies in Web of Science on UAV and UAV applications in agriculture [23].
Figure 2. Camera types used in agricultural UAV applications [3]. (a) Thermal camera [35]. (b) RGB camera [36]. (c) Multispectral camera [34]. (d) Hyperspectral camera [37].
Figure 3. UAV-based adaptive crop water stress index (CWSI) map using thermal infrared images [41].
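The CWSI mapped in Figure 3 is derived from canopy temperature captured by the UAV's thermal camera. As a rough illustration only (not the adaptive procedure of [41]), the sketch below implements the common empirical form CWSI = (Tc - Twet)/(Tdry - Twet); the reference temperatures and pixel values are invented for the example.

```python
# Hedged sketch of a canopy-temperature-based crop water stress index (CWSI).
# Common empirical form: CWSI = (Tc - Twet) / (Tdry - Twet), where Twet/Tdry
# are wet/dry reference temperatures. All values here are illustrative.
import numpy as np

def cwsi(canopy_temp: np.ndarray, t_wet: float, t_dry: float) -> np.ndarray:
    """0 = fully transpiring (unstressed), 1 = non-transpiring (fully stressed)."""
    index = (canopy_temp - t_wet) / (t_dry - t_wet)
    return np.clip(index, 0.0, 1.0)  # clamp values pushed out of range by sensor noise

# Toy thermal pixels (deg C) with wet/dry references at 22 and 34 deg C.
pixels = np.array([23.5, 27.0, 32.8])
print(cwsi(pixels, t_wet=22.0, t_dry=34.0))  # -> [0.125 0.417 0.9]
```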
Figure 4. Weed detection study in a spinach field by deep learning [45]. (a) With no background information, red dots indicate weeds. (b) With background information available, red regions indicate weeds.
Figure 5. Distribution maps of banana fusarium wilt: (a) 5-m resolution. (b) 10-m resolution [49].
Figure 6. UAVs designed for various agricultural tasks in the greenhouse. (a) UAV for artificial pollination. (b) UAV developed by the company ADI. (c) UAV developed by the company PATS.
Figure 7. Our previous work on UAV positioning in an indoor environment [100].
Figure 8. A UAV performing autonomous missions in a greenhouse environment.
Table 1. Recent state-of-the-art studies carried out with UAVs for precision agriculture in the outdoor environment.

1. Zhang, Atkinson, George, Wen, Diazgranados and Gerard [31], "Identifying and mapping individual plants in a highly diverse high-elevation ecosystem using UAV imagery and deep learning". Task: Mapping. Year: 2020. Product/Focus: Frailejones. UAV type: Single UAV. Purpose: In this study, frailejones plants were classified from UAV images using the newly proposed SS Res U-Net deep learning method. The model was then compared with other deep learning-based semantic segmentation methods and shown to be superior to them.

2. Johansen, Duan, Tu, Searle, Wu, Phinn, Robson and McCabe [32], "Mapping the condition of macadamia tree crops using multi-spectral UAV and WorldView-3 imagery". Task: Mapping. Year: 2020. Product/Focus: Macadamia tree. UAV type: Single UAV. Purpose: This study used both multispectral UAV and WorldView-3 images to map the condition of macadamia tree crops. A random forest classifier achieved 98.5% correct matching for both UAV and WorldView-3 images.

3. Allred et al. [50], "Effective and efficient agricultural drainage pipe mapping with UAS thermal infrared imagery: A case study". Task: Mapping. Year: 2018. Product/Focus: Agricultural underground drainage systems. UAV type: Single UAV. Purpose: Pix4D software (Pix4Dmapper Pro) was employed to determine drainage pipe locations using visible (VIS), thermal infrared (TIR) and near-infrared (NIR) imagery obtained by UAV. The study concluded that UAV TIR imagery has considerable potential for detecting drain line locations under dry-surface conditions.

4. Christiansen et al. [51], "Designing and testing a UAV mapping system for agricultural field surveying". Task: Mapping. Year: 2017. Product/Focus: Winter wheat. UAV type: Single UAV. Purpose: Data from sensors mounted on a UAV, such as light detection and ranging (LIDAR), a global navigation satellite system (GNSS) receiver and an inertial measurement unit (IMU), were fused to map a winter wheat field. IMU, GNSS and UAV data were used to estimate the orientation and position (pose), and the LIDAR point cloud was combined with the estimated pose for three-dimensional (3D) mapping.

5. Gašparović et al. [52], "An automatic method for weed mapping in oat fields based on UAV imagery". Task: Mapping. Year: 2020. Product/Focus: Weed. UAV type: Single UAV. Purpose: Four independent classification algorithms derived from the random forest algorithm were tested for the creation of weed maps. Input data were collected using a low-cost RGB camera mounted on a UAV. The automatic object-based classification algorithm achieved the highest overall accuracy, 89.0%.

6. Schiefer et al. [53], "Mapping forest tree species in high-resolution UAV-based RGB-imagery by means of convolutional neural networks". Task: Mapping. Year: 2020. Product/Focus: Forest tree species. UAV type: Single UAV. Purpose: RGB imagery taken from a UAV was assessed with convolutional neural networks (CNNs) and a semantic segmentation approach (U-Net) for mapping tree species in the forest environment. Nine tree species, deadwood, three genus-level classes and forest floor were mapped accurately and quickly.

7. Pearse et al. [54], "Detecting and mapping tree seedlings in UAV imagery using convolutional neural networks and field-verified data". Task: Mapping. Year: 2020. Product/Focus: Tree seedlings. UAV type: Single UAV. Purpose: A deep learning-based method applied to data from a UAV-mounted RGB camera was presented for large-scale and rapid mapping of young conifer seedlings. CNN-based models were trained on two sites and detected seedlings with overall accuracies of 99.5% and 98.8%.

8. Freitas et al. [55], "Use of UAVs for an efficient capsule distribution and smart path planning for biological pest control". Task: Path planning. Year: 2020. Product/Focus: Exotic pests. UAV type: Single UAV. Purpose: A UAV-based coverage algorithm was proposed to cover all areas and to detect exotic pests damaging the area. The algorithm calculated capsule deposition sites across the whole environment and generated a path through the UAV's capsule distribution locations. This planned distribution proved more advantageous and preferable than a zigzag distribution in this study.

9. Tokekar et al. [56], "Sensor planning for a symbiotic UAV and UGV system for precision agriculture". Task: Remote sensing. Year: 2016. Product/Focus: Nitrogen level prediction. UAV type: Single UAV + UGV. Purpose: This study aimed to predict the nitrogen (N) map of an environment and to plan an optimal path for applying fertilizer with a UAV. A UGV helped to measure each point visited by the UAV. The total time spent traveling and measuring was minimized by formulating the task as a traveling salesperson problem with neighborhoods (TSPN).

10. Pan et al. [57], "Effects of citrus tree-shape and spraying height of small unmanned aerial vehicle on droplet distribution". Task: Spraying. Year: 2016. Product/Focus: Citrus trees. UAV type: Single UAV. Purpose: The effects of UAV spraying height and citrus tree shape on droplet distribution were investigated. UAV performance at a 1.0 m working height was better than at the other heights. Additionally, open-center-shaped citrus trees were recommended to improve droplet distribution.

11. Faiçal et al. [58], "An adaptive approach for UAV-based pesticide spraying in dynamic environments". Task: Spraying. Year: 2017. Product/Focus: Pesticide. UAV type: Single UAV. Purpose: A computer-based system controlling a UAV for precise pesticide deposition was combined with metaheuristic route planning (particle swarm optimization, genetic algorithms, hill climbing and simulated annealing) and evaluated for autonomous adaptation to route changes. The spray deposition was tracked by sensors, and the system was controlled through wireless sensor networks (WSNs). The proposed system resulted in less environmental damage, more precise changes to the flight route and more accurate pesticide deposition.

12. Meng et al. [59], "Experimental evaluation of UAV spraying for peach trees of different shapes: Effects of operational parameters on droplet distribution". Task: Spraying. Year: 2020. Product/Focus: Peach trees. UAV type: Single UAV. Purpose: The effects of UAV operational parameters on droplet distribution for orchard trees were evaluated. A UAV was used experimentally for the aerial spraying of Y-shape and CL-shape peach trees, and droplet coverage was shown to improve with increasing nozzle flow rate.

13. Ye, Huang, Huang, Cui, Dong, Guo, Ren and Jin [49], "Recognition of banana fusarium wilt based on UAV remote sensing". Task: Crop monitoring. Year: 2020. Product/Focus: Banana. UAV type: Single UAV. Purpose: UAV-based multispectral imagery was used to determine infested banana regions. Banana fusarium wilt disease was identified with a red-edge band multispectral camera sensor, and binary logistic regression was used to establish the spatial relationships between infested and non-infested plants on the known map.

14. Fu et al. [60], "Wheat growth monitoring and yield estimation based on multi-rotor unmanned aerial vehicle". Task: Crop monitoring. Year: 2020. Product/Focus: Wheat. UAV type: Single UAV. Purpose: This study was performed on wheat trials treated with different seeding densities and nitrogen levels. Images were collected by a multispectral camera mounted on the UAV. Simple linear regression (LR), multiple linear regression (MLR), stepwise multiple linear regression (SMLR), partial least squares regression (PLSR), random forest (RF) and artificial neural network (ANN) models were used to estimate wheat yield. The experimental results showed that the machine learning methods predicted wheat yield better.

15. Cao et al. [61], "Monitoring of sugar beet growth indicators using wide-dynamic-range vegetation index (WDRVI) derived from UAV multispectral images". Task: Crop monitoring. Year: 2020. Product/Focus: Sugar beet. UAV type: Single UAV. Purpose: A UAV equipped with a multispectral camera sensor was used for the experiments. Four wide-dynamic-range vegetation indices (WDRVIs) were calculated by adding α weighting coefficients to the normalized difference vegetation index (NDVI) to estimate the leaf area index (LAI), the fresh weight of leaves (FWL) and the fresh weight of roots (FWR) of the sugar beet (see the index sketch after this table). The effects of the different indices were then compared. According to the study, WDRVI1 can be used as a vegetation index to monitor beet growth.

16. Johansen et al. [62], "Predicting biomass and yield in a tomato phenotyping experiment using UAV imagery and random forest". Task: Crop monitoring. Year: 2020. Product/Focus: Wild tomato species. UAV type: Single UAV. Purpose: UAV images were used with random forest learning to estimate the biomass and yield of 1200 tomato plants. The results from RGB and multispectral UAV images collected 1 and 2 weeks before harvest were compared.

17. Tetila et al. [63], "Detection and classification of soybean pests using deep learning with UAV images". Task: Crop monitoring. Year: 2020. Product/Focus: Soybean pests. UAV type: Single UAV. Purpose: Five deep learning architectures were applied to classify soybean pest images and their results were compared. An accuracy of 93.82% was achieved with transfer learning-based methods on a dataset of 5000 images.

18. Zhang et al. [64], "Estimation of maize yield and effects of variable-rate nitrogen application using UAV-based RGB imagery". Task: Crop monitoring. Year: 2020. Product/Focus: Maize. UAV type: Single UAV. Purpose: Color images captured remotely by a UAV imaging system were used to estimate maize yield. Various linear regression models were developed for three sample area sizes (21, 106 and 1058 m2). Yield estimation with the linear regression models gave a mean absolute percentage error (MAPE) between 6.2% and 15.1%.

19. Wan et al. [65], "Grain yield prediction of rice using multi-temporal UAV-based RGB and multispectral images and model transfer—a case study of small farmlands in the South of China". Task: Crop monitoring. Year: 2020. Product/Focus: Rice. UAV type: Single UAV. Purpose: A UAV platform with RGB and multispectral cameras was used to predict grain yield in rice. Spectral and structural information was extracted from the images to evaluate grain yield and monitor crop growth status, and was then evaluated using random forest models.

20. Kerkech et al. [66], "Vine disease detection in UAV multispectral images using optimized image registration and deep learning segmentation approach". Task: Crop monitoring. Year: 2020. Product/Focus: Vine. UAV type: Single UAV. Purpose: Deep learning segmentation was applied to UAV images to detect mildew disease in vines, using a combination of visible and infrared images. The disease was detected with an accuracy of 92% at the grapevine level and 87% at the leaf level.

21. Ashapure et al. [67], "Developing a machine learning-based cotton yield estimation framework using multi-temporal UAS data". Task: Crop monitoring. Year: 2020. Product/Focus: Cotton. UAV type: Single UAV. Purpose: Multitemporal remote sensing data collected from a UAV were used for cotton yield estimation. In the estimation made using artificial neural networks (ANNs), the highest R2 value was 0.89.

22. Li et al. [68], "Above-ground biomass estimation and yield prediction in potato by using UAV-based RGB and hyperspectral imaging". Task: Crop monitoring. Year: 2020. Product/Focus: Potato. UAV type: Single UAV. Purpose: RGB and hyperspectral images were obtained with a low-altitude UAV to estimate biomass and crop yield in potatoes. High accuracy was obtained in biomass estimation using random forest regression models.

23. Zheng et al. [69], "Growing status observation for oil palm trees using Unmanned Aerial Vehicle (UAV) images". Task: Crop monitoring. Year: 2021. Product/Focus: Palm trees. UAV type: Single UAV. Purpose: A classification method was proposed that reveals both the presence and the growth state of oil palm trees. The approach, based on Faster R-CNN and called multiclass oil palm detection (MOPAD), produced effective results by combining a refined pyramid feature (RPF) module with a hybrid class-balanced loss. Palm trees in two regions of Indonesia were classified into five groups using MOPAD, with F1-scores of 87.91% and 99.04% for the two regions.

24. Gomez Selvaraj et al. [70], "Detection of banana plants and their major diseases through aerial images and machine learning methods: A case study in DR Congo and the Republic of Benin". Task: Crop monitoring. Year: 2020. Product/Focus: Banana plants. UAV type: Single UAV. Purpose: Banana groups and diseases were classified using pixel-based and machine learning models on multilevel satellite images and UAV platforms over the mixed-complex terrain of Africa. Banana bunchy top disease (BBTD), Xanthomonas wilt of banana (BXW), healthy banana clusters and individual banana plants were defined as four classes and classified with 99.4%, 92.8%, 93.3% and 90.8% accuracy, respectively. This approach was reported to have important potential as a decision support system for identifying the major banana diseases encountered in Africa.

25. Elmokadem [71], "Distributed coverage control of quadrotor multi-UAV systems for precision agriculture". Task: Field monitoring. Year: 2019. Product/Focus: Region-based UAV control. UAV type: Multiple UAVs. Purpose: A multiple-UAV control strategy was presented for precision agriculture applications. Using Voronoi partitions, the positions of the UAVs were determined and collisions between them were prevented. Simulations were run in Gazebo and the Robot Operating System (ROS) to show the performance of the proposed method.

26. Hoffmann et al. [72], "Crop water stress maps for an entire growing season from visible and thermal UAV imagery". Task: Irrigation. Year: 2016. Product/Focus: Barley. UAV type: Single UAV. Purpose: A water deficit index (WDI) was obtained from images collected by a UAV and used to measure plant water stress. Both early-season and growing-season images were used to determine the WDI, which, unlike the commonly used vegetation indices, is not based solely on the greenness of the surface. The resulting WDI map had a spatial resolution of 0.25 m.

27. Romero et al. [73], "Vineyard water status estimation using multispectral imagery from a UAV platform and machine learning algorithms for irrigation scheduling management". Task: Irrigation. Year: 2018. Product/Focus: Vine. UAV type: Single UAV. Purpose: A relationship was established between vegetation indices derived from multiband UAV images and the midday stem water potential of grapes. A pattern recognition ANN model classified the results into severe water stress, moderate water stress and no water stress at certain thresholds. The model was determined to be a suitable method for optimal irrigation.

28. Jiyu et al. [74], "Distribution law of rice pollen in the wind field of small UAV". Task: Artificial pollination. Year: 2017. Product/Focus: Rice. UAV type: Single UAV. Purpose: The UAV flight speed required to have a positive effect on rice pollination was determined. The flight speed offering the best pollination opportunity was found to be 4.53 m/s, verified with a Q-Q plot in SPSS. The findings provided the velocity parameters that agricultural UAVs should use to positively affect rice pollination.

29. dos Santos Ferreira et al. [75], "Weed detection in soybean crops using ConvNets". Task: Weed detection. Year: 2017. Product/Focus: Soybean crops. UAV type: Single UAV. Purpose: Images were taken in a soybean field in Brazil using UAVs, and a database was created with classes such as soil, soybean and broadleaf grasses. Classification was performed using convolutional neural networks (ConvNets), and the best accuracy achieved was 98%.

30. Stroppiana et al. [76], "Early season weed mapping in rice crops using multi-spectral UAV data". Task: Weed detection. Year: 2018. Product/Focus: Rice. UAV type: Single UAV. Purpose: Shortly after rice planting, the authors mapped the weeds in the field using a UAV. Images taken with the Parrot Sequoia sensor were classified as weed or non-weed with an unsupervised clustering algorithm, and herbicide was applied by comparing the amount of weed on this map with a threshold level.
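Several of the Table 1 studies reduce multispectral UAV imagery to per-pixel band-ratio vegetation indices before modeling; entry 15 builds its WDRVI variants on NDVI in this way. The sketch below shows NDVI and the standard WDRVI form, assuming reflectance bands have already been extracted from the orthomosaic; the α coefficient and sample values are illustrative and not taken from [61].

```python
# Hedged sketch: NDVI and the wide-dynamic-range vegetation index (WDRVI).
# WDRVI down-weights the NIR band by a coefficient alpha (0 < alpha <= 1) so
# the index stays sensitive at high biomass, where NDVI saturates.
import numpy as np

EPS = 1e-9  # avoids division by zero on dark pixels

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized difference vegetation index from NIR and red reflectance."""
    return (nir - red) / (nir + red + EPS)

def wdrvi(nir: np.ndarray, red: np.ndarray, alpha: float = 0.2) -> np.ndarray:
    """WDRVI = (alpha*NIR - red) / (alpha*NIR + red); alpha is a tunable weight."""
    return (alpha * nir - red) / (alpha * nir + red + EPS)

# Toy reflectance values: a sparse pixel and a dense-canopy pixel.
nir = np.array([0.45, 0.80])
red = np.array([0.20, 0.05])
print(ndvi(nir, red))   # the dense canopy approaches 1, where NDVI saturates
print(wdrvi(nir, red))  # WDRVI spreads the upper end of the dynamic range
```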
Table 2. UAV studies in the greenhouse for precision agriculture.

1. Shi, Liu, Mao, Shen, Liu and Ou [93], "Study on Assistant Pollination of Facility Tomato by UAV". Task: Pollination. Year: 2019. Product/Focus: Tomato. UAV type: Single UAV.

2. Roldán, Garcia-Aunon, Garzón, De León, Del Cerro and Barrientos [79], "Heterogeneous Multi-Robot System for Mapping Environmental Variables of Greenhouses". Task: Mapping. Year: 2016. Product/Focus: Environmental variables. UAV type: Single UAV.

3. Roldán, Joossen, Sanz, Del Cerro and Barrientos [86], "Mini-UAV Based Sensory System for Measuring Environmental Variables in Greenhouses". Task: Monitoring. Year: 2015. Product/Focus: Environmental variables. UAV type: Single UAV.

4. Simon, Petkovic, Petkovic and Petkovics [92], "Navigation and Applicability of Hexa Rotor Drones in Greenhouse Environment". Task: Navigation. Year: 2018. Product/Focus: Positioning. UAV type: Single UAV.
Table 3. Previous popular SLAM studies in indoor environments.

1. Dowling et al. [103]. Information: A UAV system based on an extended Kalman filter (EKF) that can navigate independently in a closed indoor environment, create an area map from 2D laser scan data for navigation and record live video (a generic EKF predict/update sketch follows this table). Sensors: LIDAR, ultrasonic sensor (SLAM). Application environment: ROS. Result: The map was created by a planar laser scanner on a UAV indoors, and the UAV was shown to avoid obstacles correctly.

2. Qin et al. [104]. Information: A UAV and a UGV were used together for autonomous exploration, mapping and navigation in the indoor environment. To exploit the advantages of heterogeneous robots, the exploration and mapping tasks were divided into two layers. In the first layer, a preliminary exploration produces a rough map with a 3D LIDAR mounted on the UGV; this map is shared with the UAV, which then performs complementary precision mapping using an inclined 2D laser module and visual sensors, filling the remaining gaps. The approach was validated both in simulation and experimentally. Sensors: LIDAR, stereo camera (ZED) (SLAM). Application environment: ROS. Result: The advantages of the UAV and UGV were exploited, and the structure of the environment was successfully obtained.

3. Mur-Artal et al. [105]. Information: Feature-based monocular ORB-SLAM was presented for indoor and outdoor environments. For feature extraction, ORB with oriented multiscale FAST corners was used; ORB provides good viewpoint invariance while its computation and matching are extremely fast, which enables powerful optimization of the map. The system combines tracking, local mapping and loop-closing threads running in parallel, and a bag-of-words place recognition module (DBoW2) performs loop detection. Sensors: Monocular camera (VSLAM). Application environment: ROS. Result: A very reliable and successful solution for monocular SLAM was developed with ORB-SLAM.

4. Engel et al. [106]. Information: This study presented monocular large-scale direct SLAM (LSD-SLAM), which is popular among direct SLAM methods. Direct SLAM algorithms do not extract key points from the image but instead use image intensities to estimate pose and map, making them more robust and detailed than feature-based methods (MonoSLAM, PTAM, ORB-SLAM, etc.) at the cost of higher computation. The map is built from selected keyframes containing the camera image, an inverse depth map and the variance of the inverse depth map. The depth map and its variance are created not for all pixels but only for pixels near large image intensity gradients, giving the map a semi-dense structure. Sensors: Monocular camera (VSLAM). Application environment: ROS. Result: Successful real-time monocular SLAM was performed with LSD-SLAM without feature extraction.

5. Forster et al. [107]. Information: This study introduced the semi-direct visual odometry (SVO) algorithm, which is very fast and robust. It eliminates the feature extraction and matching steps that slow down visual odometry, combining the properties of feature-based methods (tracking of sparse features, parallel tracking and mapping, keyframe selection) with the accuracy and speed of direct methods. Sensors: Monocular camera (VSLAM). Application environment: ROS. Result: A successful real-time SLAM algorithm was realized by combining the advantages of direct and indirect SLAM methods.

6. Qin et al. [108]. Information: This study proposed a monocular visual-inertial system (VINS) for 6-degrees-of-freedom (DoF) state estimation using a camera and a low-cost IMU. The initialization procedure provides all necessary values, including pose, velocity, gravity vector, gyroscope bias and 3D feature positions, to bootstrap the subsequent nonlinear-optimization-based VIO; initial values are obtained by aligning the IMU pre-integration with the vision-only structure. After estimator initialization, sliding-window-based monocular VIO is performed for accurate and robust state estimation, with a nonlinear-optimization-based method fusing IMU measurements and visual features. Sensors: Monocular camera, IMU (VISLAM). Application environment: ROS. Result: A successful VISLAM was achieved with efficient IMU pre-integration, automatic estimator initialization, online extrinsic calibration, failure detection and recovery, loop detection and pose graph optimization.

7. Delmerico and Scaramuzza [109]. Information: This study evaluated open-source VIO algorithms on flying-robot hardware configurations. The methods were the multi-state constraint Kalman filter (MSCKF) [110], open keyframe-based visual-inertial SLAM (OKVIS) [111], robust visual-inertial odometry (ROVIO) [112], the monocular visual-inertial system (VINS-Mono) [108], semi-direct visual odometry (SVO) [113] with multisensor fusion (MSF) [114] (SVO+MSF) [115] and SVO with the Georgia Tech Smoothing and Mapping library (GTSAM) (SVO+GTSAM) [116]. The algorithms were run on the EuRoC Micro Aerial Vehicle (MAV) dataset, which contains 6-DoF motion trajectories for flying robots. Sensors: Monocular camera, IMU (VISLAM). Application environment: MATLAB. Result: The comparison revealed that SVO+MSF had the most accurate performance; processing time per frame, CPU usage and memory usage were also taken into consideration.

8. Heo et al. [117]. Information: A new measurement model named the local-optimal multi-state constraint Kalman filter (LOMSCKF) was designed, fusing nonlinear optimization with the MSCKF to perform VIO. In addition, unlike the MSCKF, all measurements and information available in the sliding window were used. Sensors: Monocular camera, IMU (VISLAM). Application environment: MATLAB. Result: The performance of the proposed LOMSCKF was evaluated on both virtual and real-world datasets, where it outperformed the MSCKF.
