Review

Unmanned Aerial Vehicles (UAV) in Precision Agriculture: Applications and Challenges

1 Department of Computer Science and Engineering, Karpagam Academy of Higher Education, Coimbatore 641021, India
2 Department of Electronics and Communication Engineering, Vel Tech Multitech Dr.Rangarajan Dr.Sakunthala Engineering College, Chennai 600062, India
3 Department of Information Technology, University of the Punjab Gujranwala Campus, Gujranwala 52250, Pakistan
4 Department of Information and Communication Engineering, Yeungnam University, Gyeongsan 38541, Korea
* Authors to whom correspondence should be addressed.
Energies 2022, 15(1), 217; https://doi.org/10.3390/en15010217
Submission received: 1 November 2021 / Revised: 14 December 2021 / Accepted: 21 December 2021 / Published: 29 December 2021

Abstract

Agriculture is the primary source of income in developing countries such as India. Agriculture accounts for 17 percent of India's total GDP, and almost 60 percent of the population is directly or indirectly employed in the sector. While researchers and planters focus on many factors to boost productivity, crop loss due to disease is one of the most serious issues they confront. Monitoring crop growth and detecting pest infestations early remain difficult problems. As cultivation expands to wider fields, manual monitoring and diagnosis of insect and pest infestations become increasingly impractical. Failure to apply fertilizers and pesticides on time results in greater crop loss and therefore lower output. Farmers put considerable effort into conserving crops, yet often fail because they cannot adequately monitor the crops once pests and insects infect them. Pest infestation is also difficult to predict because it is not evenly distributed. In the recent past, modern equipment, tools, and approaches have begun to replace manual intervention. In this setting, unmanned aerial vehicles play a critical role in crop disease surveillance and early detection. This paper reviews the most successful techniques for precision-based crop monitoring and pest management in agricultural fields using unmanned aerial vehicles (UAVs). Reports on the various types of UAVs and their applications to early detection of agricultural diseases are rigorously assessed and compared. The paper also discusses the deployment of aerial, satellite, and other remote sensing technologies for disease detection, together with their Quality of Service (QoS).

1. Introduction

Most developed countries have adopted the latest technologies, such as photogrammetry and Remote Sensing (RS) [1,2], for precision agriculture [3,4] using Unmanned Aerial Vehicles (UAVs) to maintain farms with minimal infection. This helps farmers achieve higher crop productivity and quality and, most importantly, reduces their workload. UAVs can also be used for spraying fertilizers and pesticides. Usually, UAVs are built as automated drone systems with sensors and cameras to monitor the condition and height of the crops. Various types of UAV models have been developed, and an appropriate model should be selected based on the characteristics of the farm. UAVs support precision management through the spectral images they capture: a multispectral camera monitors crop condition by scanning the entire field, and actuated drones mounted with cameras identify pest and insect hotspots. The UAVs and remote sensing techniques mentioned above help farmers take appropriate measures at the right time to protect crops from diseases. UAVs with low-altitude remote sensing offer additional advantages such as good mobility, easy construction, and high image resolution [5]. Crop quality and yield depend on biotic and abiotic factors. In the past, farmers relied on their experience for crop production; increasingly, they are moving towards remote sensing platforms such as UAV-based technology to help protect their crops. In the future, precision agriculture will rely on sensors, robotics, the Internet of Things, machine learning, and decision-support systems. In [6], IoT-based technology has also been adapted to agricultural systems, incorporating cloud computing, big data storage, security issues, and analytics.
In [7], an energy harvesting mechanism using solar energy and a wind turbine was implemented by integrating a long-range (LoRa) communication modem in an agricultural field.
This review surveys the most effective solutions for crop protection and pest management, addressing the day-to-day challenges farmers face in the field. We provide a brief overview of the necessity for UAVs and explain how precision farming with remote sensing technologies aims to reduce potential risks and improve agricultural yield. We focus on UAVs and their types, with clear explanations and a comparison between the different types, including their technical specifications. The role of UAVs in precision pest management is discussed, and we conclude with the challenges and future scope of precision agriculture.
The rest of the paper is organized as follows. Section 2 gives a brief overview of precision agriculture. Section 3 describes the different types of UAVs. Section 4 compares the qualitative parameters of the various types of UAVs and their applications in precision agriculture. Section 5 investigates the role of UAVs in precision pest management. In the last section, we draw our conclusions.

2. Precision Agriculture

Precision agriculture (PA) helps farmers make crucial decisions at the right time by analyzing vast amounts of data on the environment and crop conditions. PA thus helps farmers increase both the quantity and quality of production to meet demand. Remote Sensing (RS) plays a vital role in evaluating crops and soil health: it flags problems at the right time and helps resolve them wisely. Figure 1 describes the various remote sensing platforms used for precision agriculture.
UAVs are flexible enough for most applications and address problems faced by other RS platforms [8]. They are easily accessible, provide accurate data, are cost-effective, can be deployed almost anywhere, and can capture real-time spatial images, in contrast with traditional RS platforms. Table 1 presents a detailed comparison of the quality of service provided by the various types of RS platforms in precision agriculture.

3. Types of Unmanned Aerial Vehicle (UAV)

UAVs are vehicles, typically weighing 25 kg or less, that do not need an on-board pilot and can be managed remotely. A quick survey over a wide area can easily be achieved with unmanned aerial vehicles [9]. UAVs can be applied to image analysis, ground monitoring, and in-depth situational analysis of a crop [8]. UAVs can be categorized based on the number of rotors, speed, application, mechanism, and other factors. UAVs weighing 25 kg or more are subject to specific rules and laws, so take-off weight is a significant factor for distinguishing between UAVs. Firstly, there are very heavy UAVs weighing around 2 tons or more; these can carry large amounts of fuel and are mainly used for military purposes. Secondly, some UAVs weigh 200–2000 kg or 50–200 kg; these are used extensively for various applications and can hold enough fuel to travel longer distances. Finally, lightweight UAVs weighing around 5 to 50 kg find use in agriculture.
Further, there are micro-UAVs weighing less than 5 kg; these are easy to launch and less expensive than heavier vehicles. By design, UAVs can be fixed-wing, single-rotor, multi-rotor, or hybrid vertical take-off and landing (VTOL) aircraft. Fixed-wing and multi-rotor structures differ vastly: their flight time, endurance, and energy source differ entirely from each other. A single-rotor design is slightly different from a multi-rotor: it contains two rotors, one oversized rotor on top and a smaller one fixed on the tail. Multi-rotors can be tricopters, quadcopters, hexacopters, or octocopters, based on the number of rotors and the application [10]. The various types of UAVs are depicted in Figure 2 and summarized in Table 2.
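The weight-based taxonomy above can be sketched as a simple lookup. The thresholds below are taken from the figures in this section; actual regulatory weight classes vary by country, so treat this as illustrative only:

```python
def uav_weight_class(takeoff_mass_kg: float) -> str:
    """Map take-off mass to the informal weight classes described above.

    Thresholds follow this section's text; they are illustrative,
    not regulatory definitions.
    """
    if takeoff_mass_kg >= 2000:
        return "very heavy (military)"
    if takeoff_mass_kg >= 200:
        return "heavy"
    if takeoff_mass_kg >= 50:
        return "medium"
    if takeoff_mass_kg >= 5:
        return "lightweight (agriculture)"
    return "micro"
```

For example, a typical agricultural spraying drone with a 10 kg take-off mass falls in the lightweight class.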

3.1. Fixed Winged

As can be seen in Figure 3, a fixed-wing UAV performs data collection through remote operation. A simple fixed-wing UAV can be fabricated with a wingspan of 195 cm and a carbon-fiber body powered by a single propeller engine. This provides excellent aerodynamics, with the added benefit of longer flight time as speed increases over the surveyed area. Usually, such UAVs are equipped with high-resolution cameras for better mapping and surveillance from height. In addition, the flight system is straightforward, and the architecture and maintenance of such UAVs are relatively easy [11].

3.2. Single Rotor

As can be seen in Figure 4, a single-rotor system consists of two components: the helicopter itself and a system that controls it from ground level. The helicopter carries various connected parts, namely a flight controller, a gyroscope, a GPS receiver, a transmitter for image and telemetry data, a heading sensor, and spraying components. The ground-level controlling system contains a telemetry receiver and a transmitter in a remote control. In specific systems, forced-air engine cooling is installed to cool the engine at high altitude and low flight speed. High-precision vertical gyroscopes sense the roll (heel) and pitch angles of the aircraft and detect 3D positional velocity. A magnetic heading sensor makes minor corrections for errors caused by changing flight directions. The elevation and location of the UAV can be determined using an attached pressure altimeter. The various control variables are computed using Kalman filter and Proportional-Integral-Derivative (PID) algorithms [12].
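The PID control loop mentioned above can be illustrated with a minimal discrete-time sketch. The gains and time step below are placeholders, not values from the cited system [12]:

```python
class PID:
    """Minimal discrete PID controller (illustrative; gains are placeholders)."""

    def __init__(self, kp: float, ki: float, kd: float, dt: float):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint: float, measured: float) -> float:
        """Return the control output for one time step."""
        error = setpoint - measured
        self.integral += error * self.dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

In a real single-rotor autopilot, one such loop would run per controlled variable (e.g. pitch, roll, altitude), with the Kalman filter supplying the fused `measured` estimate.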

3.3. Hybrid Vertical Take-Off and Landing (VTOL)

Hybrid VTOL UAVs are vehicles that combine the benefits of fixed-wing and multi-rotor systems. As shown in Figure 5, they take off as efficiently as multi-rotor systems while flying with the efficiency of a fixed-wing system. Because of these combined features, the development and maintenance of hybrid systems are complex, as are their control systems. In this case, three controllers, namely horizontal, vertical, and transition, are used [8].

3.4. Multi Rotor

Multi-rotor UAVs can be classified based on the number of rotors and their configuration. Some of the most frequently used multi-copters are the tricopter, quadcopter, hexacopter, and octocopter.

3.4.1. Tri Copter

As can be seen in Figure 6, the general structure of a tricopter has three rotors, which balance the weight of the tricopter in flight. The rotors move such that the right rotor turns clockwise while the other two rotors turn in the opposite direction. A servo mechanism tilts the tail rotor to negate the unbalanced clockwise torque. A productive pitch is developed by driving the three rotors in different directions to move forward, and by differentiating the left and right rotor thrust, rolling can be achieved. Similarly, the vehicle can also move sideways [13].

3.4.2. Quad Copter

The quadcopter is a superior UAV design with four rotors, which generate the lift of this model. As shown in Figure 7, two oppositely placed rotors rotate clockwise (CW), while the remaining two rotate counterclockwise (CCW). Movement around the axes includes forward/backward movement called 'pitch', lateral movement to the left or right called 'roll', and clockwise or counterclockwise rotation called 'yaw'. Quadcopters come in plus and cross configurations, named for their shapes. The cross model is more popular because of its increased stability over the plus model [14].
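The pitch, roll, and yaw behaviour described above comes from differentially driving the four rotors. The sketch below shows one common motor-mixing convention for a plus-configuration quadcopter; the sign conventions (which rotors spin CW vs. CCW) vary between airframes and are an assumption here:

```python
def mix_plus(throttle: float, pitch: float, roll: float, yaw: float):
    """Illustrative motor mixer for a plus-configuration quadcopter.

    Front/back rotors are assumed to spin CW and left/right CCW, so a
    yaw command raises one pair and lowers the other; pitch and roll
    commands tilt the craft by unbalancing opposite rotors.
    """
    front = throttle + pitch + yaw
    back = throttle - pitch + yaw
    right = throttle - roll - yaw
    left = throttle + roll - yaw
    return front, right, back, left
```

Note that any pure pitch, roll, or yaw command leaves the total thrust (the sum of the four outputs) unchanged, which is why attitude changes do not directly change altitude.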

3.4.3. Hex Copter

The Greek word hexa means six. A hexacopter is a drone with six arms, each attached to a single high-speed BLDC motor. As can be seen in Figure 8, the airframe is made of glass fiber. Aluminum tubes (500 × 25 mm) are fixed to arm mounts on the outer edge of the airframe, and the six motors are mounted at the far ends of these tubes. The airframe plate is the support structure that carries the other parts of the drone, such as the batteries, motors, flight controller, GPS antenna, FPV cameras, ESCs, circuit boards, and sensors. This model is used to spray pesticides for various agricultural purposes, with a fluid tank of up to 5 L capacity attached to the bottom of the airframe; the outlet of this tank feeds the inlet of the spray motor, whose outlet is connected to the spray nozzles. A U-shaped bent aluminum pipe of 14 × 1.5 mm is used to mount parts such as the fluid tank, spray motor, and spray lance. The spray lance has four nozzles spaced 45 cm apart, spanning 1.3 m. The bottom of the drone has landing gear below the spray unit so that take-off and landing are safe during and after spraying [15].

3.4.4. Octocopter

An octocopter has eight rotors and, like the hexacopter, is used for agricultural spraying. As can be seen in Figure 9, this model has a diagonal wheelbase of 1630 mm and can fly for 15 min with a 10 kg payload. It has six nozzles with a 5–8 m spray width. The model was observed using the time-resolved particle image velocimetry (TR-PIV) method to measure the movement of the sprayed droplets and their deposition. The observations showed that two variables, rotor speed and spray nozzle position, influence the movement and deposition of the spray [16].

4. Role of UAV in Precision Pest Management

Precision pest management uses remote sensing technologies to monitor crops and identify pest-affected areas, after which control mechanisms such as pesticide spraying act accordingly to prevent disease. To achieve this, both technologies should be mounted on the UAV.
Unmanned aerial vehicles can also be used for spraying fertilizers and pesticides on agricultural fields [9]. A UAV spraying system offers good speed and accuracy in applying fertilizers and pesticides. The main parts of a UAV used for spraying are:
  • Pressure nozzle;
  • Spraying controller;
  • Pesticide box;
  • Hall-flow sensor;
  • Small diaphragm pump;
  • Field-map interpretation system.
A sprayer connected to the UAV applies pesticides or fertilizers, atomizing the fluid into droplets through the nozzle under pressure. The spray motor produces the pressure needed to spray the fluid. The spraying controller uses the Hall-flow sensor to estimate the fluid flow inside the system and to actuate the sprayer nozzle. UAVs used for spraying vary in their speed, payload, and number of nozzles. UAV-based fertilizer and pesticide spraying is more efficient than traditional systems: it reduces human contact with hazardous gases, requires limited human labor, and reduces both time and expense.
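Hall-flow sensors report flow as a pulse train, so the spraying controller typically converts a pulse count over a sampling window into a flow rate. The sketch below assumes a hypothetical calibration constant of 450 pulses per litre; the real value must come from the sensor's datasheet:

```python
def flow_rate_lpm(pulse_count: int, window_s: float,
                  pulses_per_litre: float = 450.0) -> float:
    """Convert Hall-flow sensor pulses counted over window_s seconds to L/min.

    pulses_per_litre is a hypothetical calibration constant (assumption);
    consult the actual sensor's datasheet.
    """
    litres = pulse_count / pulses_per_litre
    return litres * 60.0 / window_s
```

The controller can then compare this measured rate against the target application rate from the field-map interpretation system and adjust the diaphragm pump accordingly.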
A detailed study was made of pest detection using remote sensing technology. Table 3, Table 4, Table 5 and Table 6 show pest detection in various types of crops, with observations analyzed from spectral images captured by UAV, manned aircraft, satellite, and ground-based technology. Further, Table 3 lists the technical specifications of the UAVs used while capturing images of crops in different agricultural fields and locations.
A large volume of high-resolution spatial images was acquired with the UAV, which helps increase the accuracy of the algorithm for classifying and identifying leaf spot in banana. Quantification, prediction, identification, and classification are performed to observe pests and insects in agricultural crops. Using UAV aerial images and digital image processing (DIP), the severity of a yellow Sigatoka attack can be calculated; this serves as an alternative method for estimating damage in the field [38]. Deep learning architectures were evaluated for classifying soybean pest images obtained from the UAV. The performance of Inception-v3, ResNet-50, VGG-16, VGG-19, and Xception was evaluated under different learning strategies with a dataset of 5000 images captured in actual field conditions [37].
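When architectures such as those above are compared, the headline metric is usually overall classification accuracy on a held-out test set, which can be computed as:

```python
def classification_accuracy(y_true, y_pred) -> float:
    """Fraction of predictions matching the ground-truth labels."""
    if len(y_true) != len(y_pred):
        raise ValueError("label lists must be the same length")
    correct = sum(t == p for t, p in zip(y_true, y_pred))
    return correct / len(y_true)
```

In practice, per-class precision and recall are also reported, since pest classes in field imagery are often imbalanced.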
UAVs mounted with traditional RGB cameras and remote sensing technologies can be used to detect and quantify pests through aerial images. Focusing on 2D geomatic and 3D products, most users of UAV platforms need to improve application utility and accuracy [39]. Recent advances in UAV remote sensing have led to rapid image processing tools for crop management and pest surveillance. This UAV remote sensing technology increases the efficiency of existing human surveillance practices for detecting pests such as grape phylloxera in vineyards. It uses UAVs integrated with advanced digital hyperspectral, multispectral, and RGB sensors, and a predictive model was developed for phylloxera detection. Under different levels of phylloxera infestation, the combination of RGB, multispectral, and hyperspectral images with ground-based data at two separate periods was explored [41]. A comparison of aerial (manned aircraft) remote sensing studies is presented in Table 4.
Table 4. Aerial (manned aircraft) based remote sensing.
| References | Crop Name | Camera | Pest Name | Observations |
|---|---|---|---|---|
| Xuan Li et al., 2021 [48] | Alfalfa | Multispectral | Empoasca fabae | Damage assessments |
| Bhattarai et al., 2019 [49] | Wheat | Multispectral | Hessian fly | Arthropod counts |
| Backoulou et al., 2018a,b [50,51] | Sorghum | Multispectral | Sugarcane aphid | Damage assessments |
| Backoulou et al., 2016 [52] | Wheat | Multispectral | Greenbug | Arthropod counts or visual inspection |
| Elliott et al., 2015 [53] | Sorghum | Multispectral | Sugarcane aphid | Damage assessments |
| Backoulou et al., 2011a,b, 2013, 2015 [54,55,56] | Wheat | Multispectral | Russian wheat aphid | Visual inspections |
| Mirik et al., 2014 [57] | Wheat | Hyperspectral | Russian wheat aphid | Visual inspection of images |
| Reisig and Godfrey, 2010 [58] | Cotton | Multispectral, Hyperspectral | Cotton aphid | Arthropod counts |
| Elliott et al., 2009 [59] | Wheat | Multispectral | Greenbug | Arthropod counts or visual inspection |
| Carroll, 2008 [60] | Corn | Hyperspectral | European corn borer | Damage assessments |
| Elliott et al., 2007 [61] | Wheat | Multispectral | Russian wheat aphid | Proportion of infested plants |
| Reisig and Godfrey, 2006 [62] | Cotton | Multispectral, Hyperspectral | Spider mite | Arthropod counts |
| Willers et al., 2005 [63] | Cotton | Multispectral | Tarnished plant bug | Sweep net sampling |
| Fitzgerald et al., 2004 [34] | Cotton | Hyperspectral | Strawberry spider | Arthropod counts |
| Sudbrink et al., 2003 [64] | Cotton | Multispectral | Beet armyworm | Arthropod counts |
| F. W. Nutter Jr. et al., 2002 [65] | Soya bean | Multispectral | Soya bean cyst nematode | Visual inspection of images |
| Willers et al., 1999 [66] | Cotton | Multispectral | Tarnished plant bug | Sweep net sampling, drop cloth sampling |
| Lobits et al., 1997 [67] | Grape | Multispectral | Grape phylloxera | Root digging |
| Hart and Meyers, 1968 [68] | Citrus | Multispectral | Brown soft scale | Arthropod counts, sooty mold assessments |
| Everitt et al., 1994 [69] | Citrus | Multispectral | Citrus blackfly | Visual inspections, sooty mold assessments |
| Everitt et al., 1996 [70] | Cotton | Multispectral | Silverleaf whitefly | Visual inspections, sooty mold assessments |
| Hart et al., 1973 [71] | Citrus | Multispectral | Citrus blackfly | Arthropod counts, sooty mold assessments |
Remote sensing data can be used to study pest and insect infestations in agricultural fields efficiently. In winter wheat (Triticum aestivum) fields in Kansas, USA, the association between Hessian fly (Mayetiola destructor) infestation and the normalized difference vegetation index (NDVI) was evaluated using aircraft and multispectral satellite data. In each field, Hessian fly infestation was surveyed at multiple sampling points in a uniform grid. The results showed that pest infestation increased as NDVI decreased in both aircraft and satellite data, with satellite NDVI performing better than aircraft NDVI in infested fields. This demonstrates that remote sensing data can be used to monitor the health of wheat plants and identify areas of poor growth [50]. Pest and insect infestations in agricultural fields are not uniform and can proliferate in intensity and size. Multispectral remote sensing data were used to assess sorghum fields for sugarcane aphid infestations. Differencing the NDVI of bi-temporal images and analyzing the changes between captured images is efficient for assessing temporal changes in sugarcane aphid infestation. Experiments comparing field changes and distribution categories with NDVI image classification from aphid-infested sorghum fields showed this to be an essential technique for assessing temporal changes in sugarcane aphid infestations [72]. A comparison of orbital (satellite) remote sensing studies is presented in Table 5.
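The NDVI values compared above are computed per pixel from the near-infrared and red reflectance bands. A minimal sketch with NumPy (the small epsilon is an assumption here, only guarding against division by zero over bare soil or water):

```python
import numpy as np

def ndvi(nir, red, eps: float = 1e-9):
    """Normalized difference vegetation index: (NIR - red) / (NIR + red).

    Healthy vegetation reflects strongly in NIR and absorbs red, so values
    near +1 indicate dense healthy canopy; pest-stressed areas trend lower.
    """
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)
```

Differencing NDVI rasters from two acquisition dates, as in the bi-temporal sorghum study, then highlights where infestation has intensified.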
Table 5. Orbital (Satellite) based remote sensing.
| References | Crop Name | Camera | Pest Name | Observations |
|---|---|---|---|---|
| Marian Adan et al., 2021 [73] | Avocado | Multispectral | Persea mite | Visual inspections |
| Michael Gomez Selvaraj et al., 2020 [74] | Banana | RGB, Multispectral | Yellow sigatoka | Visual inspections |
| Bhattarai et al., 2019 [50] | Wheat | Multispectral | Hessian fly | Arthropod counts |
| Ma et al., 2019 [23] | Wheat | Multispectral | Wheat aphid | Arthropod counts |
| Abdel-Rahman et al., 2017 [75] | Corn | Multispectral | Stem borer | Arthropod counts |
| Zhang et al., 2016 [76] | Corn | Multispectral | Oriental armyworm | Damage assessments |
| Lestina et al., 2016 [77] | Wheat | Multispectral | Wheat stem sawfly | Arthropod counts |
| Luo et al., 2014 [78] | Wheat | Multispectral | Wheat aphid | Arthropod counts, damage assessments |
| Huang et al., 2011 [79] | Wheat | Multispectral | Aphid | Arthropod counts |
| Reisig and Godfrey, 2010 [59] | Cotton | Multispectral | Cotton aphid | Arthropod counts |
| Reisig and Godfrey, 2006 [63] | Cotton | Multispectral | Spider mite | Arthropod counts |
Remote sensing tools coupled with machine learning play a leading role in crop monitoring and pest surveillance. Early warning systems use remote sensing applications to classify crops and pest-affected areas, providing accurate and cost-effective data across different agricultural fields with proper spatial, temporal, and spectral resolution. Monitoring larger landscapes remains challenging, however; combining high-resolution UAV and satellite images with efficient machine learning (ML) models and advanced mobile applications helps detect the diseased areas.
A hybrid model system was developed by combining a custom classifier with an object detection model (RetinaNet) for disease classification and banana localization, using RGB UAV aerial images from fields in the Republic of Benin and DR Congo. The results show better accuracy under different tests and performance metrics, revealing that the mixed RGB-UAV model successfully distinguishes healthy from diseased crops with 99.4% accuracy. This approach thus provides a high-potential support system for managing major banana diseases in Africa [76].
Monitoring pests and diseases is vital for delivering treatment practically in the affected regions. The accuracy of identifying crops affected by insects and pests improves when environmental parameters are coupled with a vegetation index. Furthermore, different pests and diseases can present similar symptoms during crop growth, so information about the growth period helps identify the changes incurred in the crop due to insect and pest infection. An approach was developed that integrates environmental parameters and crop growth, experimenting with image classification performance and discriminating crops affected by pests and diseases using bi-temporal Landsat-8 satellite images.
The integrated model with environmental factors and temporal growth indices achieved good results, with 82.6% accuracy, and performed better at discriminating damage in winter wheat crops using Landsat-8 satellite images. Accuracy can be further enhanced by integrating multi-temporal, multisource remotely sensed data, which provides a detailed spatial distribution of crop pests and diseases to meet the current requirements of precision agriculture [23]. A comparison of ground-based remote sensing studies is presented in Table 6.
Table 6. Ground based Remote Sensing.
| References | Crop Name | Camera | Pest Name | Observations |
|---|---|---|---|---|
| María Gyomar Gonzalez-Gonzalez et al., 2021 [80] | Citrus | Hyperspectral | Tetranychus urticae | Visual inspection of the leaves |
| Martin and Latheef, 2019 [81] | Corn | Multispectral | Banks grass mite, spotted spider mite | Damage assessments |
| Alves et al., 2019, 2013 [82,83] | Soybean | Hyperspectral | Soybean aphid | Arthropod counts |
| Samuel Joalland et al., 2018 [43] | Sugar beet | Multispectral, Hyperspectral | Beet cyst nematode | Visual images |
| Martin and Latheef, 2018 [84] | Pinto bean | Multispectral | Two-spotted spider mite | Controlled infestations |
| Fan et al., 2017 [85] | Rice | Hyperspectral | Striped stem borer | Damage assessments |
| Herrmann et al., 2017 [86] | Bean | Hyperspectral | Two-spotted spider mite | Damage assessments |
| Abdel-Rahman et al., 2013, 2010, 2009 [87,88,89] | Sugarcane | Hyperspectral | Sugarcane thrips | Arthropod counts, damage assessments |
| Mirik et al., 2012 [90] | Wheat | Multispectral | Russian wheat aphid | Visual inspections |
| Zhang et al., 2008 [91]; Luedeling et al., 2009 [92] | Peach | Hyperspectral | Spider mite | Arthropod counts, damage assessments |
| Fraulo et al., 2009 [93] | Strawberry | Hyperspectral | Two-spotted spider mite | Arthropod counts |
| Li et al., 2008 [94] | Sorghum | Hyperspectral | Corn leaf aphid | Arthropod counts |
| Xu et al., 2007 [95] | Tomato | Hyperspectral | Leaf miner | Damage assessments |
| F. W. Nutter Jr. et al., 2002 [65] | Soya bean | Multispectral | Soya bean cyst nematode | Visual inspection of images |
| Everitt et al., 1996 [70] | Cotton | Multispectral | Silverleaf whitefly | Visual inspections |
| Peñuelas et al., 1995 [96] | Apple | Hyperspectral | European red mite | Arthropod counts |
Using spectral sensors with an infrared range and 50 nm sensor bandwidth in soybean fields, the cumulative abundance of A. glycines could be effectively quantified. A. glycines on soybean was detected by simulating multispectral sensors from ground-based hyperspectral data. This approach reduces complexity and cost compared with manual aphid counts, with potential for pest scouting in soybean and crop production systems [82].
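Simulating a broadband multispectral channel from narrow hyperspectral bands, as in the soybean aphid study above, amounts to aggregating all narrow bands falling inside the broad channel's bandwidth. The sketch below uses a simplified flat (boxcar) 50 nm response; real sensors have non-flat spectral response functions, so this is an approximation:

```python
import numpy as np

def simulate_multispectral(wavelengths_nm, reflectance, center_nm: float,
                           bandwidth_nm: float = 50.0) -> float:
    """Average narrow hyperspectral bands within one broad channel.

    Assumes an idealized flat response over [center - bw/2, center + bw/2];
    a real sensor's spectral response curve would weight the bands unevenly.
    """
    wl = np.asarray(wavelengths_nm, dtype=float)
    refl = np.asarray(reflectance, dtype=float)
    mask = np.abs(wl - center_nm) <= bandwidth_nm / 2.0
    if not mask.any():
        raise ValueError("no hyperspectral bands fall inside the channel")
    return float(refl[mask].mean())
```

Applying this per channel turns a full hyperspectral cube into the handful of bands a multispectral camera would record, which is how the cited comparison was made tractable.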
For the last few decades, RS technologies have been used in precision agriculture for applications such as crop monitoring, yield prediction, and pest management; they are also used to assess plant stress and nutritional deficiencies. RS technologies can successfully detect pests and insects in a wide variety of crops and fields. The average usage of the different types of RS platforms is shown in Figure 10. Accuracy is critical to the economic development of agriculture, as it determines how well pest-infected crops and crop quality can be monitored. The precision accuracy rates achieved by RS technologies in agriculture are shown in Figure 11.

5. Economic Benefits of UAV Technologies

UAV-based remote sensing technology helps farmers increase productivity in agricultural fields globally. Regions such as South and Southeast Asia, Western and Central Europe, Central America and the Caribbean, and Southern Africa could adopt these technologies without major human adaptation to increase productivity for a sustainably growing population.
Society could derive further economic benefits from remote sensing technology and unmanned aerial vehicles. Especially in developing countries such as India and African nations, the use of UAVs reduces crop damage and increases yields. If farmers can be encouraged to use this technology commercially, it will eventually help them increase crop production. Once farmers produce crops on a larger scale, they can export agricultural products to other continents. This will, to some extent, balance the economies of developing countries through increased exports and reduced imports of agricultural produce. Moreover, it will gradually increase employment, which will reduce poverty and improve the standard of living.

6. Conclusions

Unmanned aerial vehicles in precision agriculture face critical challenges: payload capacity, the sensors carried by the UAV, UAV cost, flight duration, data analytics, and environmental conditions and requirements. Cost is the main challenge for UAV use, compounded by the various needed sensors, mounting parts, technology-based applications, and the software needed for data analytics. Nowadays, commercial companies offer rental services for various UAVs with all the needed remote sensing devices. Data analytics is also a vital challenge: once data have been collected from the various sensors mounted on the UAVs, results must be produced at periodic intervals. The sensors create many terabytes of data that must be stored, processed, and analyzed adequately with appropriate software. Similarly, it is hard to develop a UAV that can both detect pest hotspots and apply the corresponding treatment, since payload and flight duration are limited for UAVs in the field. Weather conditions such as rain, snowfall, clouds, and fog further limit UAV activities and the sensing process. Farmers can readily adopt this technology when it is compatible with their agricultural requirements and offers cost-effective solutions.

Author Contributions

Conceptualization, P.V., S.R. and R.K.M.; methodology, M.S. and J.-G.C.; software, S.R. and S.N.; validation, P.V., M.S. and J.-G.C.; investigation, M.S. and J.-G.C.; resources, S.R., R.K.M. and S.N.; data curation, P.V., M.S. and J.-G.C.; writing—original draft preparation, P.V.; writing—review and editing, P.V., M.S. and J.-G.C.; visualization, S.R. and R.K.M.; supervision, S.R. and R.K.M.; project administration, M.S. and R.K.M.; funding acquisition, J.-G.C. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the Basic Science Research Program through the National Research Foundation (NRF) of Korea funded by the Ministry of Education under Grant 2018R1D1A1B07048948.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest and no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

References

  1. Colomina, I. Unmanned aerial systems for photogrammetry and remote sensing: A review. ISPRS J. Photogramm. Remote Sens. 2014, 92, 79–97. [Google Scholar] [CrossRef] [Green Version]
  2. Everaerts, J. The use of unmanned aerial vehicles (UAVs) for remote sensing and mapping. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2008, 37, 1187–1192. [Google Scholar]
  3. Natu, A.S.; Kulkarni, S.C. Adoption and Utilization of Drones for Advanced Precision Farming: A Review. Int. J. Recent Innov. Trends Comput. Commun. 2016, 4, 563–565. [Google Scholar]
  4. Zhang, C.; Kovacs, J.M. The application of small unmanned aerial systems for precision agriculture: A review. In Precision Agriculture; Springer: Berlin/Heidelberg, Germany, 2012; Volume 13, pp. 693–712. [Google Scholar]
  5. Zhang, H.L.; Tian, W.T.; Yin, J. A Review of Unmanned Aerial Vehicle Low-Altitude Remote Sensing (UAV-LARS) Use in Agricultural Monitoring in China. Remote Sens. 2021, 13, 1221. [Google Scholar] [CrossRef]
  6. Farooq, M.S.; Riaz, S.; Abid, A.; Abid, K.; Naeem, M.A. A Survey on the Role of IoT in Agriculture for the Implementation of Smart Farming. IEEE Access 2019, 7, 156237–156271. [Google Scholar] [CrossRef]
  7. Swain, M.; Zimon, D.; Singh, R.; Hashmi, M.F.; Rashid, M.; Hakak, S. LoRa-LBO: An Experimental Analysis of LoRa Link Budget Optimization in Custom Build IoT Test Bed for Agriculture 4.0. Agronomy 2021, 11, 820. [Google Scholar] [CrossRef]
  8. Delavarpour, N.; Cengiz, K.; Nowatzki, N.; Bajwa, S.; Sun, X. A Technical Study on UAV Characteristics for Precision Agriculture Applications and Associated Practical Challenges. Remote Sens. 2021, 13, 1204. [Google Scholar] [CrossRef]
  9. Rahman, M.F.F.; Fan, S.; Zhang, Y.; Chen, L. A Comparative Study on Application of Unmanned Aerial Vehicle Systems in Agriculture. Agriculture 2021, 11, 22. [Google Scholar] [CrossRef]
  10. Islam, N.; Rashid, M.M.; Pasandideh, F.; Ray, B.; Moore, S.; Kadel, R. A Review of Applications and Communication Technologies for Internet of Things (IoT) and Unmanned Aerial Vehicle (UAV) based Sustainable Smart Farming. Sustainability 2021, 13, 1821. [Google Scholar] [CrossRef]
  11. Ziliani, M.; Parkes, S.; Hoteit, I.; McCabe, M. Intra-Season Crop Height Variability at Commercial Farm Scales Using a Fixed-Wing UAV. Remote Sens. 2018, 10, 2007. [Google Scholar] [CrossRef] [Green Version]
  12. Xinyu, X.; Lan, Y.; Sun, Z.; Chang, C.; Hoffmann, W.C. Develop an unmanned aerial vehicle based automatic aerial spraying system. Comput. Electron. Agric. 2016, 128, 58–66. [Google Scholar]
  13. McArthur, D.R.; Chowdhury, A.B.; Cappelleri, D.J. Design of the interacting-boomcopter unmanned aerial vehicle for remote sensor mounting. J. Mech. Robot. 2018, 10, 025001. [Google Scholar] [CrossRef] [Green Version]
  14. Sharma, R. Review on Application of Drone Systems in Precision Agriculture. J. Adv. Res. Electron. Eng. Technol. 2021, 7, 520137. [Google Scholar]
  15. Yallappa, D.; Veerangouda, M.; Maski, D.; Palled, V.; Bheemanna, M. Development and evaluation of drone mounted sprayer for pesticide applications to crops. In Proceedings of the 2017 IEEE Global Humanitarian Technology Conference (GHTC), San Jose, CA, USA, 19–22 October 2017; pp. 1–7. [Google Scholar]
  16. Qing, T.; Zhang, R.; Chen, L.; Min, X.; Tongchuan, Y.; Bin, Z. Droplets movement and deposition of an eight-rotor agricultural UAV in downwash flow field. Int. J. Agric. Biol. Eng. 2017, 10, 47–56. [Google Scholar]
  17. Bhoi, S.K.; Kumar Jena, K.; Kumar Panda, S.; Long, H.V.; Kumar, P.R.; Bin Jebreen, S.H. An Internet of Things assisted Unmanned Aerial Vehicle based artificial intelligence model for rice pest detection. Microprocess. Microsyst. 2021, 80, 103607. [Google Scholar] [CrossRef]
  18. Wu, B.; Liang, A.; Zhang, H.; Zhu, T.; Zou, Z.; Yang, D.; Tang, W.; Li, J.; Su, J. Application of conventional UAV-based high-throughput object detection to the early diagnosis of pine wilt disease by deep learning. For. Ecol. Manag. 2021, 486, 118986. [Google Scholar] [CrossRef]
  19. Ishengoma, F.S.; Rai, I.A.; Said, R.N. Identification of maize leaves infected by fall armyworms using UAV-based imagery and convolutional neural networks. Comput. Electron. Agric. 2021, 184, 106124. [Google Scholar] [CrossRef]
  20. Saito Moriya, É.A.; Imai, N.N.; Tommaselli, A.M.G.; Berveglieri, A.; Santos, G.H.; Soares, M.A.; Marino, M.; Reis, T.T. Detection and mapping of trees infected with citrus gummosis using UAV hyperspectral data. Comput. Electron. Agric. 2021, 188, 106298. [Google Scholar] [CrossRef]
  21. An, G.; Xing, M.; He, B.; Kang, H.; Shang, J.; Liao, C.; Huang, X.; Zhang, H. Extraction of Areas of Rice False Smut Infection Using UAV Hyperspectral Data. Remote Sens. 2021, 13, 3185. [Google Scholar] [CrossRef]
  22. Nguyen, C.; Sagan, V.; Maimaitiyiming, M.; Maimaitijiang, M.; Bhadra, S.; Kwasniewski, M.T. Early Detection of Plant Viral Disease Using Hyperspectral Imaging and Deep Learning. Sensors 2021, 21, 742. [Google Scholar] [CrossRef]
  23. Ma, H.; Huang, W.; Jing, Y.; Yang, C.; Han, L.; Dong, Y.; Ye, H.; Shi, Y.; Zheng, Q.; Liu, L.; et al. Integrating growth and environmental parameters to discriminate powdery mildew and aphid of winter wheat using bi-temporal Landsat-8 imagery. Remote Sens. 2019, 11, 846. [Google Scholar] [CrossRef] [Green Version]
  24. Qin, J.; Wang, B.; Wu, Y.; Lu, Q.; Zhu, H. Identifying Pine Wood Nematode Disease Using UAV Images and Deep Learning Algorithms. Remote Sens. 2021, 13, 162. [Google Scholar] [CrossRef]
  25. Xiao, Y.; Dong, Y.; Huang, W.; Liu, L.; Ma, H. Wheat Fusarium Head Blight Detection Using UAV-Based Spectral and Texture Features in Optimal Window Size. Remote Sens. 2021, 13, 2437. [Google Scholar] [CrossRef]
  26. Guo, A.; Huang, W.; Dong, Y.; Ye, H.; Ma, H.; Liu, B.; Wu, W.; Ren, Y.; Ruan, C.; Geng, Y. Wheat Yellow Rust Detection Using UAV-Based Hyperspectral Technology. Remote Sens. 2021, 13, 123. [Google Scholar] [CrossRef]
  27. Castrignanò, A.; Belmonte, A.; Antelmi, I.; Quarto, R.; Quarto, F.; Shaddad, S.; Sion, V.; Muolo, M.R.; Ranieri, N.A.; Gadaleta, G.; et al. A geostatistical fusion approach using UAV data for probabilistic estimation of Xylella fastidiosa subsp. pauca infection in olive trees. Sci. Total Environ. 2020, 752, 141814. [Google Scholar] [CrossRef]
  28. Francesconi, S.; Harfouche, A.; Maesano, M.; Balestra, G.M. UAV-Based Thermal, RGB Imaging and Gene Expression Analysis Allowed Detection of Fusarium Head Blight and Gave New Insights into the Physiological Responses to the Disease in Durum Wheat. Front. Plant. Sci. 2021, 12, 628575. [Google Scholar] [CrossRef] [PubMed]
  29. Yadav, S.; Sengar, N.; Singh, A.; Singh, A.; Dutta, M.K. Identification of disease using deep learning and evaluation of bacteriosis in peach leaf. Ecol. Inform. 2021, 61, 101247. [Google Scholar] [CrossRef]
  30. Görlich, F.; Marks, E.; Mahlein, A.-K.; König, K.; Lottes, P.; Stachniss, C. UAV-Based Classification of Cercospora Leaf Spot Using RGB Images. Drones 2021, 5, 34. [Google Scholar] [CrossRef]
  31. Yu, R.; Ren, L.; Luo, Y. Early detection of pine wilt disease in Pinus tabuliformis in North China using a field portable spectrometer and UAV-based hyperspectral imagery. For. Ecosyst. 2021, 8, 40. [Google Scholar] [CrossRef]
  32. Yu, R.; Luo, Y.; Zhou, Q.; Zhang, X.; Wu, D.; Ren, L. Early detection of pine wilt disease using deep learning algorithms and UAV-based multispectral imagery. For. Ecol. Manag. 2021, 497, 119493. [Google Scholar] [CrossRef]
  33. Chivasa, W.; Mutanga, O.; Burgueño, J. UAV-based high-throughput phenotyping to increase prediction and selection accuracy in maize varieties under artificial MSV inoculation. Comput. Electron. Agric. 2021, 184, 106128. [Google Scholar] [CrossRef]
  34. Fitzgerald, J.G.; Maas, S.J.; Detar, W.R. Spider mite detection and canopy component mapping in cotton using hyperspectral imagery and spectral mixture analysis. Precis. Agric. 2004, 5, 275–289. [Google Scholar] [CrossRef]
  35. Gao, J.; Westergaard, J.C.; Sundmark, E.H.R.; Bagge, M.; Liljeroth, E.; Alexandersson, E. Automatic late blight lesion recognition and severity quantification based on field imagery of diverse potato genotypes by deep learning. Knowl. Based Syst. 2021, 214, 106723. [Google Scholar] [CrossRef]
  36. Deng, X.; Zhu, Z.; Yang, J.; Zheng, Z.; Huang, Z.; Yin, X.; Wei, S.; Lan, Y. Detection of Citrus Huanglongbing Based on Multi-Input Neural Network Model of UAV Hyperspectral Remote Sensing. Remote Sens. 2020, 12, 2678. [Google Scholar] [CrossRef]
  37. Tetila, E.C.; Machado, B.B.; Astolfi, G.; Belete, N.A.D.; Amorim, W.P.; Roel, A.R.; Pistori, H. Detection and classification of soybean pests using deep learning with UAV images. Comput. Electron. Agric. 2020, 179, 105836. [Google Scholar] [CrossRef]
  38. Calou, V.C.; Teixeira, A.d.S.; Moreira, L.C.; Lima, C.S.; de Oliveira, J.; de Oliveira, M. The use of UAVs in monitoring yellow sigatoka in banana. Biosyst. Eng. 2020, 193, 115–125. [Google Scholar] [CrossRef]
  39. Del-Campo-Sanchez, A.; Ballesteros, R.; Hernandez-Lopez, D.; Ortega, J.F.; Moreno, M.A.; Agroforestry and Cartography Precision Research Group. Quantifying the effect of Jacobiasca lybica pest on vineyards with UAVs by combining geometric and computer vision techniques. PLoS ONE 2019, 14, e0215521. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  40. Abdulridha, J.; Batuman, O.; Ampatzidis, Y. UAV-Based Remote Sensing Technique to Detect Citrus Canker Disease Utilizing Hyperspectral Imaging and Machine Learning. Remote Sens. 2019, 11, 1373. [Google Scholar] [CrossRef] [Green Version]
  41. Vanegas, F.; Bratanov, D.; Powell, K.; Weiss, J.; Gonzalez, F. A novel methodology for improving plant pest surveillance in vineyards and crops using UAV-based hyperspectral and spatial data. Sensors 2018, 18, 260. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  42. Huang, H.; Deng, J.; Lan, Y.; Yang, A.; Deng, X.; Zhang, L.; Wen, S.; Jiang, Y.; Suo, G.; Chen, P. A two-stage classification approach for the detection of spider mite-infested cotton using UAV multispectral imagery. Remote Sens. Lett. 2018, 9, 933–941. [Google Scholar] [CrossRef]
  43. Joalland, S.; Screpanti, C.; Varella, H.V.; Reuther, M.; Schwind, M.; Lang, C.; Liebisch, A.W.F. Aerial and Ground Based Sensing of Tolerance to Beet Cyst Nematode in Sugar Beet. Remote Sens. 2018, 10, 787. [Google Scholar] [CrossRef] [Green Version]
  44. Hunt, J.E.R.; Rondon, S.I. Detection of potato beetle damage using remote sensing from small unmanned aircraft systems. J. Appl. Remote Sens. 2017, 11, 026013. [Google Scholar] [CrossRef] [Green Version]
  45. Stanton, C.; Starek, M.J.; Elliott, N.; Brewer, M.; Maeda, M.M.; Chu, T. Unmanned aircraft system-derived crop height and normalized difference vegetation index metrics for sorghum yield and aphid stress assessment. J. Appl. Remote Sens. 2017, 11, 026035. [Google Scholar] [CrossRef] [Green Version]
  46. Severtson, D.; Callow, N.; Flower, K.; Neuhaus, A.; Olejnik, M.; Nansen, C. Unmanned aerial vehicle canopy reflectance data detects potassium deficiency and green peach aphid susceptibility in canola. Precis. Agric. 2016, 17, 659–677. [Google Scholar] [CrossRef] [Green Version]
  47. Nebiker, S.; Lack, N.; Abächerli, M.; Läderach, S. Light-weight multispectral UAV sensors and their capabilities for predicting grain yield and detecting plant diseases. ISPRS Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci 2016, XLI-B1, 963–970. [Google Scholar] [CrossRef] [Green Version]
  48. Li, X.; Giles, D.K.; Andaloro, J.T.; Long, R.; Lang, E.B.; Watson, L.J.; Qandah, I. Comparison of UAV and Fixed-Wing Aerial Application for Alfalfa Insect Pest Control: Evaluating Efficacy, Residues, and Spray Quality. Pest. Manag. Sci. 2021, 77, 4980–4992. [Google Scholar] [CrossRef] [PubMed]
  49. Bhattarai, G.P.; Schmid, R.B.; McCornack, B.P. Remote sensing data to detect hessian fly infestation in commercial wheat fields. Sci. Rep. 2019, 9, 6109. [Google Scholar]
  50. Backoulou, F.G.; Elliott, N.; Giles, K.; Alves, T.; Brewer, M.; Starek, M. Using multispectral imagery to map spatially variable sugarcane aphid infestations in sorghum. Southwest. Entomol. 2018, 43, 37–44. [Google Scholar] [CrossRef]
  51. Backoulou, F.G.; Elliott, K.L.; Brewer, G.M.J.; Starek, M. Detecting change in a sorghum field infested by sugarcane aphid. Southwest. Entomol. 2018, 43, 823–832. [Google Scholar] [CrossRef]
  52. Backoulou, F.G.; Elliott, N.C.; Giles, K.L. Using multispectral imagery to compare the spatial pattern of injury to wheat caused by Russian wheat aphid and greenbug. Southwest. Entomol. 2016, 41, 1–8. [Google Scholar] [CrossRef]
  53. Elliott, C.N.; Backoulou, G.F.; Brewer, M.J.; Giles, K.L. NDVI to detect sugarcane aphid injury to grain sorghum. J. Econ. Entomol. 2015, 108, 1452–1455. [Google Scholar] [CrossRef] [Green Version]
  54. Backoulou, F.G.; Elliott, N.C.; Giles, K.; Phoofolo, M.; Catana, V. Development of a method using multispectral imagery and spatial pattern metrics to quantify stress to wheat fields caused by Diuraphis noxia. Comput. Electron. Agric. 2011, 75, 64–70. [Google Scholar] [CrossRef]
  55. Backoulou, F.G.; Elliott, K.L.; Giles Rao, M.N. Differentiating stress to wheat fields induced by Diuraphis noxia from other stress causing factors. Comput. Electron. Agric. 2013, 90, 47–53. [Google Scholar] [CrossRef]
  56. Backoulou, F.G.; Elliott, N.C.; Giles, K.L.; Mirik, M. Processed multispectral imagery differentiates wheat crop stress caused by greenbug from other causes. Comput. Electron. Agric. 2015, 115, 34–39. [Google Scholar] [CrossRef]
  57. Mirik, M.; Ansley, R.J.; Steddom, K.; Rush, C.M.; Michels, G.J.; Workneh, F.; Cui, S.; Elliott, N.C. High spectral and spatial resolution hyperspectral imagery for quantifying Russian wheat aphid infestation in wheat using the constrained energy minimization classifier. J. Appl. Remote Sens. 2014, 8, 083661. [Google Scholar] [CrossRef] [Green Version]
  58. Reisig, D.D.; Godfrey, L.D. Remotely sensing arthropod and nutrient stressed plants: A case study with nitrogen and cotton aphid (Hemiptera: Aphididae). Environ. Entomol. 2010, 39, 1255–1263. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  59. Elliott, N.; Mirik, M.; Yang, Z.; Jones, D.; Phoofolo, M.; Catana, V.; Giles, K.; Michels, G.J. Airborne remote sensing to detect greenbug stress to wheat. Southwest. Entomol. 2009, 34, 205–221. [Google Scholar] [CrossRef]
  60. Carroll, W.M.; Glaser, J.A.; Hellmich, R.L.; Hunt, T.E.; Sappington, T.W.; Calvin, D.; Copenhaver, K.; Fridgen, J. Use of spectral vegetation indices derived from airborne hyperspectral imagery for detection of European corn borer infestation in Iowa corn plots. J. Econ. Entomol. 2008, 101, 1614–1623. [Google Scholar] [CrossRef] [PubMed]
  61. Elliott, C.N.; Mirik, M.; Yang, Z.; Dvorak, T.; Rao, M.; Michels, J.; Walker, T.; Catana, V.; Phoofolo, M.; Giles, K.L.; et al. Airborne multispectral remote sensing of Russian wheat aphid injury to wheat. Southwest. Entomol. 2007, 32, 213–219. [Google Scholar] [CrossRef]
  62. Reisig, D.; Godfrey, L. Remote sensing for detection of cotton aphid- (Homoptera: Aphididae) and spider mite- (Acari: Tetranychidae) infested cotton in the San Joaquin Valley. Environ. Entomol. 2006, 35, 1635–1646. [Google Scholar] [CrossRef]
  63. Willers, J.L.; Jenkins, J.N.; Ladner, W.L.; Gerard, P.D.; Boykin, D.L.; Hood, K.B.; McKibben, P.L.; Samson, S.A.; Bethel, M.M. Site-specific approaches to cotton insect control. Sampling and remote sensing analysis techniques. Precis. Agric. 2005, 6, 431–445. [Google Scholar] [CrossRef]
  64. Sudbrink, D.; Harris, F.; Robbins, J.; English, P.; Willers, J. Evaluation of remote sensing to identify variability in cotton plant growth and correlation with larval densities of beet armyworm and cabbage looper (Lepidoptera: Noctuidae). Fla. Entomol. 2003, 86, 290–294. [Google Scholar] [CrossRef]
  65. Nutter, F.W., Jr.; Tylka, G.L.; Guan, J.; Moreira, A.J.D.; Marett, C.C.; Rosburg, T.R.; Basart, J.P.; Chong, C.S. Use of Remote Sensing to Detect Soybean Cyst Nematode-Induced Plant Stress. J. Nematol. 2002, 34, 222–231. [Google Scholar] [PubMed]
  66. Willers, L.J.; Seal, M.R.; Luttrell, R.G. Remote sensing, lineintercept sampling for tarnished plant bugs (Heteroptera: Miridae) in midsouth cotton. J. Cotton Sci. 1999, 3, 160–170. [Google Scholar]
  67. Lobits, B.; Johnson, L.; Hlavka, C.; Armstrong, R.; Bell, C. Grapevine remote sensing analysis of phylloxera early stress (GRAPES): Remote sensing analysis summary. NASA Tech. Memo. 1997, 112218. [Google Scholar]
  68. Hart, W.G.; Meyers, V.I. Infrared aerial color photography for detection of populations of brown soft scale in citrus groves. J. Econ. Entomol. 1968, 61, 617–624. [Google Scholar] [CrossRef]
  69. Everitt, J.; Escobar, D.; Summy, K.; Davis, M. Using airborne video, global positioning system, and geographical information system technologies for detecting and mapping citrus blackfly infestations. Southwest. Entomol. 1994, 19, 129–138. [Google Scholar]
  70. Everitt, J.; Escobar, D.; Summy, K.; Alaniz, M.; Davis, M. Using spatial information technologies for detecting and mapping whitefly and harvester ant infestations in south Texas. Southwest. Entomol. 1996, 21, 421–432. [Google Scholar]
  71. Hart, G.W.; Ingle, S.J.; Davis, M.R.; Mangum, C. Aerial photography with infrared color film as a method of surveying for citrus blackfly. J. Econ. Entomol. 1973, 66, 190–194. [Google Scholar] [CrossRef]
  72. Backoulou, F.G.; Elliott, N.C.; Giles, K.; Phoofolo, M.; Catana, V.; Mirik, M.; Michels, J. Spatially discriminating Russian wheat aphid induced plant stress from other wheat stressing factors. Comput. Electron. Agric. 2011, 78, 123–129. [Google Scholar] [CrossRef]
  73. Adan, M.; Abdel-Rahman, E.M.; Gachoki, S.; Muriithi, H.B.W.; Lattorff, M.G.; Kerubo, V.; Landmann, T.; Mohamed, S.A.; Tonnang, H.E.Z.; Dubois, T. Use of earth observation satellite data to guide the implementation of integrated pest and pollinator management (IPPM) technologies in an avocado production system. Remote Sens. Appl. Soc. Environ. 2021, 23, 100566. [Google Scholar] [CrossRef]
  74. Selvaraj, M.G.; Vergara, A.; Montenegro, F.; Ruiz, H.A.; Safari, N.; Raymaekers, D.; Ocimati, W.; Ntamwira, J.; Tits, L.; Blomme, G. Detection of banana plants and their major diseases through aerial images and machine learning methods: A case study in DR Congo and Republic of Benin. J. Photogramm. Remote Sens. 2020, 169, 110–124. [Google Scholar] [CrossRef]
  75. Abdel-Rahman, M.E.; Landmann, T.; Kyalo, R.; Ong’amo, G.; Mwalusepo, S.; Sulieman, S.; LeRu, B. Predicting stem borer density in maize using RapidEye data and generalized linear models. Int. J. Appl. Earth Obs. Geoinf. 2017, 57, 61–74. [Google Scholar] [CrossRef]
  76. Zhang, J.; Huang, Y.; Yuan, L.; Yang, G.; Chen, L.; Zhao, C. Using satellite multispectral imagery for damage mapping of armyworm (Spodoptera frugiperda) in maize at a regional scale. Pest. Manag. Sci. 2016, 72, 335–348. [Google Scholar] [CrossRef] [PubMed]
  77. Lestina, J.; Cook, M.; Kumar, S.; Morisette, J.; Ode, P.J.; Peairs, F. MODIS imagery improves pest risk assessment: A case study of wheat stem sawfly (Cephus cinctus, Hymenoptera: Cephidae) in Colorado, USA. Environ. Entomol. 2016, 45, 1343–1351. [Google Scholar] [CrossRef] [PubMed]
  78. Luo, J.; Huang, W.; Zhao, J.; Zhang, J.; Ma, R.; Huang, M. Predicting the probability of wheat aphid occurrence using satellite remote sensing and meteorological data. Optik 2014, 125, 5660–5665. [Google Scholar] [CrossRef]
  79. Huang, W.; Luo, J.; Zhao, J.; Zhang, J.; Ma, Z. Predicting wheat aphid using 2-dimensional feature space based on multi-temporal Landsat TM. In IEEE International Geoscience and Remote Sensing Symposium; IEEE: New York, NY, USA, 2011; Volume 24–29, pp. 1830–1833. [Google Scholar]
  80. Gonzalez-Gonzalez, M.; Blasco, J.; Cubero, S.; Chueca, P. Automated Detection of Tetranychus urticae Koch in Citrus Leaves Based on Colour and VIS/NIR Hyperspectral Imaging. Agronomy 2021, 11, 1002. [Google Scholar] [CrossRef]
  81. Martin, D.E.; Latheef, M.A. Aerial application methods control spider mites on corn in Kansas, USA. Exp. Appl. Acarol. 2019, 77, 571–582. [Google Scholar] [CrossRef] [PubMed]
  82. Alves, M.T.; Moon, R.D.; MacRae, I.V.; Koch, R.L. Optimizing band selection for spectral detection of Aphis glycines Matsumura in soybean. Pest. Manag. Sci. 2019, 75, 942–949. [Google Scholar] [CrossRef]
  83. Alves, M.T.; Macrae, I.V.; Koch, R.L. Soybean aphid (Hemiptera: Aphididae) affects soybean spectral reflectance. J. Econ. Entomol. 2013, 108, 2655–2664. [Google Scholar] [CrossRef] [Green Version]
  84. Martin, D.E.; Latheef, M.A. Active optical sensor assessment of spider mite damage on greenhouse beans and cotton. Exp. Appl. Acarol. 2018, 74, 147–158. [Google Scholar] [CrossRef]
  85. Fan, Y.; Wang, T.; Qiu, Z.; Peng, J.; Zhang, C.; He, Y. Fast detection of striped stem-borer (Chilo suppressalis Walker) infested rice seedling based on visible/near-infrared hyperspectral imaging system. Sensors 2017, 17, 2470. [Google Scholar] [CrossRef]
  86. Herrmann, I.; Berenstein, M.; Paz-Kagan, T.; Sade, A.; Karnieli, A. Spectral assessment of two-spotted spider mite damage levels in the leaves of greenhouse-grown pepper and bean. Biosyst. Eng. 2017, 157, 72–85. [Google Scholar] [CrossRef]
  87. Abdel-Rahman, M.E.; VandenBerg, M.; Way, M.J.; Ahmed, F.B. Hand-held spectrometry for estimating thrips (Fulmekiola serrata) incidence in sugarcane. In IEEE International Geoscience and Remote Sensing Symposium; IEEE: New York, NY, USA, 2019; Volume 12–17, pp. 268–271. [Google Scholar]
  88. Abdel-Rahman, M.E.; Ahmed, F.B.; vandenBerg, M.; Way, M.J. Potential of spectroscopic data sets for sugarcane thrips (Fulmekiola serrata Kobus) damage detection. Int. J. Remote Sens. 2010, 31, 4199–4216. [Google Scholar] [CrossRef]
  89. Abdel-Rahman, M.E.; Way, M.; Ahmed, F.; Ismail, R.; Adam, E. Estimation of thrips (Fulmekiola serrata Kobus) density in sugarcane using leaf-level hyperspectral data. S. Afr. J. Plant. Soil 2013, 30, 91–96. [Google Scholar] [CrossRef] [Green Version]
  90. Mirik, M.; Ansley, R.; Michels, G.; Elliott, N. Spectral vegetation indices selected for quantifying Russian wheat aphid (Diuraphis noxia) feeding damage in wheat (Triticum aestivum L.). Precis. Agric. 2012, 13, 501–516. [Google Scholar] [CrossRef]
  91. Zhang, M.; Hale, A.; Luedeling, E. Feasibility of using remote sensing techniques to detect spider mite damage in stone fruit orchards. In Proceedings of the IEEE International Geoscience and Remote Sensing Symposium, Boston, MA, USA, 6–11 July 2008; Volume 7–11, pp. I323–I326. [Google Scholar]
  92. Luedeling, E.; Hale, A.; Zhang, M.; Bentley, W.J.; Dharmasri, L.C. Remote sensing of spider mite damage in California peach orchards. Int. J. Appl. Earth Obs. Geoinf. 2009, 11, 244–255. [Google Scholar] [CrossRef]
  93. Fraulo, B.A.; Cohen, M.; Liburd, O.E. Visible/near infrared reflectance (VNIR) spectroscopy for detecting twospotted spider mite (Acari: Tetranychidae) damage in strawberries. Environ. Entomol 2009, 38, 137–142. [Google Scholar] [CrossRef] [Green Version]
  94. Li, H.; Payne, W.A.; Michels, G.J.; Rush, C.M. Reducing plant abiotic and biotic stress: Drought and attacks of greenbugs, corn leaf aphids and virus disease in dryland sorghum. Environ. Exp. Bot. 2008, 63, 305–316. [Google Scholar] [CrossRef]
  95. Xu, H.; Ying, Y.; Fu, X.; Zhu, S. Near-infrared spectroscopy in detecting leaf miner damage on tomato leaf. Biosyst. Eng. 2007, 96, 447–454. [Google Scholar] [CrossRef]
  96. Peñuelas, J.; Filella, I.; Lloret, P.; Munoz, F.; Vilajeliu, M. Reflectance assessment of mite effects on apple trees. Int. J. Remote Sens. 1995, 16, 2727–2733. [Google Scholar] [CrossRef]
Figure 1. RS Platforms in precision agriculture.
Figure 2. Types of UAV.
Figure 3. Fixed Wing.
Figure 4. Single rotor.
Figure 5. Hybrid VTOL.
Figure 6. Tricopter.
Figure 7. Quadcopter.
Figure 8. Hexacopter.
Figure 9. Octocopter.
Figure 10. Average usage of RS platforms.
Figure 11. Precision accuracy rate.
Table 1. QoS comparison of RS platforms used in precision agriculture.
| Quality of Service | UAV | Satellite | Manned Aircraft | Ground Based |
|---|---|---|---|---|
| Flexibility | high | low | low | low |
| Adaptability | high | low | low | low |
| Cost | low | high | high | low |
| Time consumption | low | low | low | high |
| Risk | low | average | high | low |
| Accuracy | high | low | high | moderate |
| Deployment | easy | difficult | complex | moderate |
| Feasibility | yes | no | no | yes |
| Availability | yes | no | yes | no |
| Operability | easy | complex | complex | easy |
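As a rough illustration, Table 1's qualitative ratings can be encoded and scored to compare platforms. The numeric mapping and the subset of rows used here are assumptions for illustration only; the table itself gives only qualitative levels.

```python
# A naive scoring of Table 1: map each qualitative level to a number and
# sum per platform. The mapping is an assumption, not part of the paper.

LEVEL = {"high": 2, "moderate": 1, "average": 1, "low": 0,
         "easy": 2, "difficult": 0, "complex": 0}

# A subset of Table 1's rows, transcribed as data.
QOS = {
    "UAV":             {"flexibility": "high", "cost": "low",  "risk": "low",     "accuracy": "high",     "deployment": "easy"},
    "Satellite":       {"flexibility": "low",  "cost": "high", "risk": "average", "accuracy": "low",      "deployment": "difficult"},
    "Manned Aircraft": {"flexibility": "low",  "cost": "high", "risk": "high",    "accuracy": "high",     "deployment": "complex"},
    "Ground Based":    {"flexibility": "low",  "cost": "low",  "risk": "low",     "accuracy": "moderate", "deployment": "moderate"},
}

LOWER_IS_BETTER = {"cost", "risk"}  # for these rows, "low" is the good outcome

def score(platform):
    total = 0
    for qos, level in QOS[platform].items():
        value = LEVEL.get(level, 1)
        total += (2 - value) if qos in LOWER_IS_BETTER else value
    return total

best = max(QOS, key=score)  # UAVs come out ahead under this scoring
```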
Table 2. Comparison of types of UAV.
| Parameters | Fixed Wing | Single Rotor | Multi-Rotor | Hybrid VTOL |
|---|---|---|---|---|
| No. of rotors | 1 | 1 (one large rotor plus a small rotor on the tail of the drone) | Tricopter: 3; Quadcopter: 4; Hexacopter: 6; Octocopter: 8 | 1 |
| Manufacture and maintenance | Simple | Complex | Complex | Complex |
| Cost | High | High | Low | High |
| Average flying time | 2 h (battery); 16 h (powered by gas engine) | Higher (powered by gas engine) | Limited (20–30 min) | Ability to cover longer distances |
| Endurance | More | More (with gas power) | Limited | More |
| Energy | Battery or gas engine; they never expend energy to stay aloft | Gas power | Battery; they expend energy to stay aloft | Battery |
| Speed | Fast flying speed | Limited | Limited | Fast flying speed |
| Applications | Long-distance aerial mapping and surveillance | Aerial scanning | Aerial photography; short-distance aerial mapping and surveillance | Mapping and land surveying, mining, surveillance and security |
| Drawbacks | Unsuited to aerial photography, since it cannot remain motionless in the air; limited payload | Harder to fly; dangerous to handle | Limited payload | Imperfect hovering |
| Training required in flying | Required (runway or catapult launcher for takeoff; parachute or net for landing) | Not required | Not required | Not required |
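Table 2 can likewise be read as a decision aid: given a mission profile, it narrows down the suitable UAV class. The sketch below encodes a few of its parameters; the hover flags and endurance figures paraphrase the table (the Hybrid VTOL endurance is an assumed stand-in for "covers longer distances"), and the thresholds are assumptions.

```python
# A few Table 2 parameters as a lookup. Endurance values paraphrase the
# table's "average flying time" column; Hybrid VTOL's figure is assumed.

UAV_TYPES = {
    "Fixed Wing":   {"hover": False, "endurance_min": 120},  # 2 h battery
    "Single Rotor": {"hover": True,  "endurance_min": 960},  # 16 h gas engine
    "Multi-Rotor":  {"hover": True,  "endurance_min": 25},   # 20-30 min
    "Hybrid VTOL":  {"hover": True,  "endurance_min": 120},  # assumed
}

def candidates(need_hover, min_endurance_min):
    """UAV classes from Table 2 that satisfy the mission constraints."""
    return [name for name, spec in UAV_TYPES.items()
            if (spec["hover"] or not need_hover)
            and spec["endurance_min"] >= min_endurance_min]

# Close-range crop photography: must hover, short flights are fine.
photo = candidates(need_hover=True, min_endurance_min=20)
# Long-distance mapping: hovering not required, long endurance needed.
mapping = candidates(need_hover=False, min_endurance_min=100)
```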
Table 3. UAV-based remote sensing.
| Reference | Crop | Type of UAV | Camera | No. of Rotors | Pest | Observations |
|---|---|---|---|---|---|---|
| Bhoi et al., 2021 [17] | Rice | Multi-Rotor | RGB, Multispectral | 4 | Leaf hopper | Visual inspection of images |
| Wu et al., 2021 [18] | Pine | Multi-Rotor | Multispectral | 6 | Bursaphelenchus xylophilus | Visual images |
| Ishengoma et al., 2021 [19] | Maize | Multi-Rotor | Multispectral | 6 | Lepidoptera | Visual images |
| Saito Moriya et al., 2021 [20] | Lemon | Multi-Rotor | Hyperspectral | 4 | Phytophthora gummosis | Visual inspection of images |
| An et al., 2021 [21] | Rice | Multi-Rotor | Hyperspectral | 4 | Ustilaginoidea virens | Damage assessments |
| Nguyen et al., 2021 [22] | Grapevine | Multi-Rotor | Hyperspectral | 4 | Grapevine vein-clearing virus | Visual images |
| Ma et al., 2021 [23] | Wheat | Multi-Rotor | Hyperspectral | 4 | Fusarium head blight | Visual inspection of images |
| Qin et al., 2021 [24] | Pine | Multi-Rotor | Multispectral | 6 | Bursaphelenchus xylophilus | Damage assessments |
| Xiao et al., 2021 [25] | Wheat | Multi-Rotor | Hyperspectral | 4 | Fusarium graminearum (Gibberella zeae) | Visual images |
| Guo et al., 2021 [26] | Wheat | Multi-Rotor | Hyperspectral | 4 | Puccinia striiformis | Disease monitoring |
| Castrignanò et al., 2020 [27] | Olive | Multi-Rotor | Multispectral | 6 | Xylella fastidiosa | Visual images |
| Francesconi et al., 2021 [28] | Wheat | Multi-Rotor | Hyperspectral | 4 | Fusarium graminearum (Gibberella zeae) | Visual images |
| Yadav et al., 2021 [29] | Peach | Multi-Rotor | RGB, Multispectral | 4 | Xanthomonas campestris pv. pruni | Visual images |
| Görlich et al., 2021 [30] | Sugar beet | Multi-Rotor | Hyperspectral | 4 | Cercospora beticola | Damage assessments |
| Yu et al., 2021 [31] | Pine | Multi-Rotor | Hyperspectral | 4 | Bursaphelenchus xylophilus | Visual images |
| Shi et al., 2021 [32] | Potato | Multi-Rotor | Hyperspectral | 4 | Phytophthora infestans | Visual images |
| Chivasa et al., 2021 [33] | Maize | Multi-Rotor | Multispectral | 6 | Gemini virus | Visual images |
| de Ocampo and Dadios, 2021 [34] | Solanum melongena | Multi-Rotor (quadcopter) | RGB | 4 | Aphis gossypii | Vision-based monitoring |
| Gao et al., 2020 [35] | Potato | Multi-Rotor | Multispectral | 6 | Phytophthora infestans | Visual images, degree of severity |
| Deng et al., 2020 [36] | Lemon | Multi-Rotor | Hyperspectral | 4 | Candidatus Liberibacter asiaticus | Visual inspection of images |
| Tetila et al., 2020 [37] | Soya | Multi-Rotor (quadcopter) | RGB | 4 | Defoliant pests such as insects and mollusks | Pest segmentation and classification |
| Calou et al., 2020 [38] | Banana | Multi-Rotor (quadcopter) | RGB | 4 | Yellow sigatoka | Visual images, degree of severity |
| Del Campo-Sanchez et al., 2019 [39] | Grape | Multi-Rotor | RGB | 4 | Cotton jassid | Visual inspection of images |
| Abdulridha et al., 2019 [40] | Lemon | Multi-Rotor | Hyperspectral | 4 | Xanthomonas citri | Visual inspection of images |
| Vanegas et al., 2018 [41] | Grape | Multi-Rotor | RGB, Multispectral, Hyperspectral | 4 | Grape phylloxera | Ground traps and root digging, visual vigour assessments |
| Huang et al., 2018 [42] | Cotton | Multi-Rotor | Multispectral | 4 | Two-spotted spider mite | Damage assessments |
| Joalland et al., 2018 [43] | Sugar beet | Multi-Rotor | Hyperspectral | 4 | Beet cyst nematode | Visual images |
| Hunt et al., 2017 [44] | Potato | Multi-Rotor | Multispectral | 6 | Colorado potato beetle | Damage assessments |
| Stanton et al., 2017 [45] | Sorghum | Fixed Wing | Multispectral | 1 | Sugarcane aphid | Arthropod counts |
| Severtson et al., 2016 [46] | Canola | Multi-Rotor | Multispectral | 8 | Green peach aphid | Arthropod counts, soil and plant tissue nutrient analyses |
| Nebiker et al., 2016 [47] | Onion | Fixed Wing | Multispectral | 1 | Thrips | NA |
| Ishengoma et al., 2021 [19] | Wheat | Multi-Rotor | RGB, Multispectral | 4 | Fall armyworm | Outbreak reported by grower |
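Survey tables like Table 3 lend themselves to simple tallies of the kind summarized in Figure 10 (e.g., which platforms and sensors dominate). A sketch follows, using a small hand-transcribed sample of rows; it is partial and for illustration only, not generated from the full table.

```python
from collections import Counter

# A few rows of Table 3 transcribed as records (a partial sample;
# the full table covers roughly 30 studies).
studies = [
    {"ref": 17, "crop": "Rice",    "uav": "Multi-Rotor", "camera": "RGB, Multispectral", "rotors": 4},
    {"ref": 20, "crop": "Lemon",   "uav": "Multi-Rotor", "camera": "Hyperspectral",      "rotors": 4},
    {"ref": 45, "crop": "Sorghum", "uav": "Fixed Wing",  "camera": "Multispectral",      "rotors": 1},
    {"ref": 46, "crop": "Canola",  "uav": "Multi-Rotor", "camera": "Multispectral",      "rotors": 8},
]

# Tally platform types (in the full table, multi-rotors dominate).
by_type = Counter(s["uav"] for s in studies)

# Which of the sampled studies used hyperspectral payloads.
hyperspectral = [s["ref"] for s in studies if "Hyperspectral" in s["camera"]]
```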
