A Review on UAV-Based Applications for Precision Agriculture
Abstract
1. Introduction
- Which are the different types of UAV applications in Precision Agriculture? In this research question, we aim to explore the current trends in the application of UAVs in precision agriculture. The initial goal of UAVs in their early application in agriculture was to derive direct image-based products. Nowadays this has changed: UAV applications in agriculture are intelligence-oriented products that process images and support informed decision-making by farmers. In this question, we provide a thorough description of the different types of applications that UAVs can support based on the different operational needs of agricultural fields.
- What types of crops are monitored by UAV systems? In this research question, our target is to record the different types of crops that have been monitored so far with the help of UAVs. Additionally, we provide general information regarding the geographical distribution of these crops, their size, and the growth stages at which monitoring can take place. By answering this research question, we can identify how the characteristics of each crop and its life cycle affect the use of UAVs.
- Which UAV system technologies are adopted in Precision Agriculture? In this research question, we identify the system characteristics of UAV-based applications for Precision Agriculture. By answering this question, we can pinpoint the specific UAV types and sensors that can be used for monitoring crops.
- What types of data can be acquired by UAVs? In this research question, we record the different types of data that can be acquired by UAVs based on the sensor technology employed. We also review the advantages and disadvantages of the different types of data that can be gathered with different sensors, based on the associated cost and the types of field operations applied.
- Which data processing methods can be used to exploit the agricultural data acquired by UAVs? In this research question, we identify the methods used for image analysis in agricultural crops. We distinguish three types of data processing methods that can be used alone or in combination to gain insights about a field, namely: (a) Photogrammetric techniques; (b) Machine Learning techniques; and (c) Vegetation Indices.
2. UAV-Based Monitoring of Crops
2.1. Types of UAV Applications in Precision Agriculture
2.2. Types and Properties of Crops Monitored by UAV-Based Systems
- To monitor large fields (>10 ha), where data are collected from all areas.
- To monitor small farms or small parts of a field.
- To monitor areas of great heterogeneity. This is achieved by using a UAV equipped with automatic pilot systems and ground-level sensors.
2.3. UAV System Technologies
- One or more UAVs: Flying vehicles with no operator on board that operate either autonomously or are piloted remotely.
- A Ground Control Station (GCS): It is a computer that either communicates with the UAV Control System or controls and monitors the UAV directly. The GCS monitors information related to the flight of the UAV. The user can receive data relevant to the flight of the aircraft, as well as data recorded by the sensors that support the flight (i.e., ground-based sensors or sensors embedded in the aircraft). In addition, the GCS contains the software required for processing the data acquired by the UAV and extracting the information needed by the system operator for crop monitoring.
- UAV Control System (UAV CS): It is used to control the UAV. It can be either a two-way data link, such as a remote control, or a built-in computer (usually with a built-in GPS). The UAV CS includes the flight control system and/or the autopilot system, which controls the operation of the UAV. This system receives and processes data from the autopilot or flight control system for the proper operation of the UAV. It usually contains sensors to monitor flight properties, such as sensors measuring the distance from the ground, airspeed, etc. The control system can process information from these sensors to correct any problems that may arise, and communicates with the GCS wirelessly and in real time by sending and receiving the necessary information.
- Sensors for data acquisition: These are typically cameras used to collect the required information. The next section provides a detailed presentation of possible ways of collecting information and the technologies exploited. When UAVs are not intended to collect information but are used for another purpose, such as spraying, the sensors are replaced with the necessary components.
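As a simplified illustration of the GCS–UAV CS interaction described above, the sketch below models a hypothetical telemetry packet and the kind of checks a ground station might perform. All names, fields and limit values are illustrative and not drawn from any reviewed system:

```python
from dataclasses import dataclass

@dataclass
class Telemetry:
    """Hypothetical minimal telemetry packet a GCS might receive from a UAV."""
    latitude: float      # degrees
    longitude: float     # degrees
    altitude_m: float    # metres above ground level
    battery_pct: float   # remaining battery, 0-100

def check_telemetry(t: Telemetry, max_altitude_m: float = 120.0,
                    min_battery_pct: float = 20.0) -> list[str]:
    """Return the warnings a GCS could raise from one telemetry packet."""
    warnings = []
    if t.altitude_m > max_altitude_m:
        warnings.append("altitude limit exceeded")
    if t.battery_pct < min_battery_pct:
        warnings.append("battery low: return to home")
    return warnings
```

In a real system, such checks would run continuously on the data link between the UAV CS and the GCS, alongside the mission-specific data stream from the acquisition sensors.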
- Fixed-wing: These are unmanned planes with wings that require a runway or a catapult launcher to take off. This type of UAV has high endurance as well as the ability to fly at high speeds. In addition, fixed-wing UAVs can cover large areas on each flight and carry more payload. However, they are more expensive than the other types. In the works reviewed, 22% used fixed-wing UAVs. One type of fixed-wing UAV that was not identified in the reviewed literature, but is a very promising technology, is the solar-powered UAV [124]. Solar-powered UAVs offer significantly increased flight times because they exploit and store the sun’s energy during the day. This is the reason they are preferred for long-endurance operations.
- Rotary-wing: Rotary-wing UAVs, also called rotorcraft or Vertical Take-Off and Landing (VTOL) aircraft, offer the advantage of hovering steadily in one place while retaining maneuverability. These features are useful for many different types of missions. However, they cannot fly at very high speeds or stay in the air for a long time. They are generally the most widely used UAVs in all kinds of applications, and especially in Precision Agriculture. One reason for this is their lower cost compared to the other types of UAVs. In addition, this type of UAV is suitable when the monitored crops are not very large, which is usually the case. A UAV of this type may be:
- An unmanned helicopter: These include main and tail rotors, like conventional helicopters. Overall, 4% of the works used this type.
- Multi-rotor: This category includes rotary-wing UAVs with four or more rotors (quadcopter, hexacopter, octocopter, etc.). These aircraft are generally more stable in flight than unmanned helicopters. Overall, 72% of the works used this type.
- Blimps: This type of UAV is lighter than air, has high endurance, flies at low speeds and is generally larger in size compared to the other types. Their manufacturing characteristics allow them to remain in the air even in the event of a total loss of power, while being considered relatively safe in the event of a collision. They are not usually used in Precision Agriculture applications; in the recent works reviewed, no application was found using this type of UAV.
- Flapping wing: These UAVs are very small and have flexible little wings inspired by the way birds and insects fly. They are not often used in Precision Agriculture as they have high energy consumption relative to their size. No work using this type of UAV was found in the literature review.
- Parafoil-wing: Aircraft of this type usually have one or more propellers at the back to control the course of their flight, while harnessing the power of the air to fly without consuming much energy. They are also capable of carrying a larger payload. They are not usually exploited for PA applications. Only 2% of the works analyzed use this type.
- Easy to operate
- Slower speeds
- Ability to maneuver
- Relatively low cost
3. UAV Data Acquisition
- Visible light sensors (RGB)
- Multispectral sensors
- Hyperspectral sensors
- Thermal sensors
- Visible light sensors (RGB): RGB sensors are the most frequently used sensors in UAV systems for Precision Agriculture applications. They are relatively low cost compared to the other types and can acquire high-resolution images. They are also lightweight and easy to use and operate, and the information they acquire requires only simple processing. Images can be acquired in different conditions, on both sunny and cloudy days, but a specific time frame based on weather conditions is required to avoid under- or over-exposure of the image. The main disadvantage of these sensors is that they are inadequate for analyzing many vegetation parameters that require spectral information outside the visible spectrum. They are commonly used in tandem with the other types of sensors.
- By using multispectral or hyperspectral imaging sensors, UAVs can acquire information about the vegetation’s spectral absorption and reflection in several bands. Spectral information can be significantly helpful in assessing many biological and physical characteristics of crops. For example, unhealthy parts of the crops can be discriminated in an image, as visible radiation in the red channel is absorbed by chlorophyll, while near-infrared (NIR) radiation is strongly reflected. Thus, even if stress is not yet visible in the red channel, it can be identified by the information in the NIR channel. Spectral information can also be used to calculate several vegetation indices and monitor crop characteristics based on them. Multispectral and hyperspectral sensors are frequently used, despite their higher costs. However, a drawback of these sensors is that more complex pre-processing methods are required to extract useful information from the captured images. The pre-processing of spectral images often includes radiometric calibration, geometric correction, image fusion and image enhancement. The main difference between multispectral and hyperspectral sensors is the number of bands (or channels) that each sensor can capture and the width of those bands. Multispectral sensors capture 5–12 channels, while hyperspectral sensors can usually capture hundreds or thousands of bands of narrower bandwidth. Although in the recent works studied multispectral sensors are used much more frequently than hyperspectral ones because of their lower cost, hyperspectral technology seems to have great potential and is considered the future trend for crop phenotyping research [9].
- Thermal infrared sensors capture information about the temperature of objects and generate images based on this information rather than the objects’ visible properties. Thermal cameras use infrared sensors and an optical lens to receive infrared energy. All objects warmer than absolute zero (−273 °C/−459 °F) emit infrared radiation at specific wavelengths (LWIR and MWIR bands) in an amount proportional to their temperature. Hence, thermal cameras focus and detect the radiation in these wavelengths and usually translate it into a grayscale image for heat representation. Many thermal imaging sensors can also generate colored images, which often show warmer objects as yellow and cooler objects as blue. This type of sensor is used for very specific applications (e.g., irrigation management). As a result, they are not frequently used in PA applications of UAV systems that focus on monitoring other characteristics of the crops.
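The relation between temperature and emission wavelength explains why thermal cameras target the LWIR band: by Wien’s displacement law, a body near ambient temperature emits most strongly around 10 μm. A small check in Python (the temperature value is illustrative):

```python
WIEN_B_UM_K = 2898.0  # Wien's displacement constant, in micrometre-kelvins

def peak_emission_wavelength_um(temp_celsius: float) -> float:
    """Wavelength (micrometres) of peak thermal emission for a black body."""
    return WIEN_B_UM_K / (temp_celsius + 273.15)

# A crop canopy near 27 degC emits most strongly around 9.7 um,
# i.e., inside the LWIR band (~8-14 um) that thermal cameras detect.
peak = peak_emission_wavelength_um(27.0)
```

This is why microbolometer sensors used on UAVs (such as those studied in the thermal drift-correction work cited above) are designed for the LWIR window.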
4. UAS Data Processing
- Photogrammetry techniques: Photogrammetry concerns the accurate reconstruction of a scene or an object from several overlapping pictures. Photogrammetric techniques can process the 2D data and establish the geometric relationships between the different images and the object(s), obtaining 3D models. To construct the 3D models, photogrammetry requires at least two overlapping images of the same scene and/or object(s), captured from different points of view. These kinds of techniques can be used for extracting three-dimensional digital surface or terrain models [37,40,43] and/or orthophotos [50,55]. UAV low-altitude data acquisition enables the construction of 3D models with a much higher spatial resolution than other remote sensing technologies (such as satellites). However, many images must be collected to cover the entire field under study; thus, in most cases, many overlapping images are needed to construct Digital Elevation Models (DEMs) of the crops and/or create orthophotos (also referred to as orthomosaics). The 3D models and the orthophotos include information about the 3D characteristics of the crops based on the structure of the vegetation (e.g., the vegetation height, the canopy, the density, etc.) and can be very useful for applications that can exploit only RGB imagery. The works reviewed showed that photogrammetric techniques are very commonly used in all types of applications, as they are also required to create vegetation index maps. In addition, the 3D information they provide is very important and is often used in tandem with other techniques.
- Machine learning methods: Machine Learning (ML) has been used to process the acquired data for prediction and/or identification purposes, with great results in many domains, such as medical systems [126,127], marketing [128], biology [129], etc. Machine learning techniques are often applied in Precision Agriculture to exploit the information in the large amounts of data acquired by UAVs. ML can estimate parameters regarding crop growth, detect diseases or even identify/discriminate objects in the images. The use of machine learning has grown considerably in recent years due to fast advancements, especially in the deep learning field.
- Vegetation Indices calculation: Vegetation Indices (VIs) are among the most popular products of remote sensing applications for Precision Agriculture. They use different mathematical combinations/transformations of at least two spectral bands of the electromagnetic spectrum, designed to maximize the contribution of the vegetation characteristics while minimizing external confounding factors. They can deliver reliable spatial and temporal information about the monitored agricultural crops. In most cases, many VIs are extracted and used to draw conclusions. They can be calculated either from each photograph individually or from orthophotos depicting the whole crop. Calculating vegetation indices may serve in the identification of useful crop characteristics, such as biological and physical parameters of the vegetation.
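The spatial-resolution advantage of low-altitude acquisition mentioned above can be quantified with the standard photogrammetric ground sampling distance (GSD) relation. A minimal Python sketch; the camera parameters below are illustrative values, not taken from any reviewed study:

```python
def ground_sampling_distance_cm(flight_height_m: float,
                                focal_length_mm: float,
                                sensor_width_mm: float,
                                image_width_px: int) -> float:
    """Ground sampling distance (cm/pixel) for a nadir-pointing camera.

    Standard photogrammetric relation: GSD = (H * Sw) / (f * imW),
    with the flight height converted to centimetres.
    """
    return (flight_height_m * 100.0 * sensor_width_mm) / (focal_length_mm * image_width_px)

# Illustrative camera: 13.2 mm sensor width, 8.8 mm lens, 5472 px image width.
# At 100 m altitude this gives roughly 2.7 cm/pixel, far finer than satellites.
gsd = ground_sampling_distance_cm(100.0, 8.8, 13.2, 5472)
```

Halving the flight height halves the GSD, which is the basic trade-off between resolution and the number of overlapping images needed to cover a field.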
4.1. Photogrammetric Techniques
- The Digital Terrain Model (DTM) represents the altitude of the surface of the Earth, i.e., of the terrain. These models do not take into account either artificial or natural (e.g., trees, vegetation, buildings) objects that exist in the field. DTMs just present the elevation of the bare Earth. Figure 3 shows a Digital Terrain Model from Ronchetti et al. [79].
- The Digital Surface Model (DSM) represents the altitude of the surface that is first encountered by the remote sensing system (i.e., when the aerial image captures the top of a building, tree, the vegetation etc.). Hence, the elevation model generated includes the elevation of the bare Earth along with artificial and natural objects that may exist in the field.
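The difference between the two models is what makes them useful together: subtracting the DTM from the DSM yields an approximate height map of the vegetation and objects in the field (often called a Canopy Height Model). A minimal sketch with toy raster values:

```python
import numpy as np

def canopy_height_model(dsm: np.ndarray, dtm: np.ndarray) -> np.ndarray:
    """Subtract bare-earth elevation (DTM) from surface elevation (DSM).

    The result approximates vegetation/object height above the terrain;
    small negative values (interpolation noise) are clipped to zero.
    """
    return np.clip(dsm - dtm, 0.0, None)

# Toy 2x2 rasters in metres: terrain elevation vs. first-surface elevation.
dtm = np.array([[100.0, 100.5], [101.0, 101.5]])
dsm = np.array([[101.2, 100.5], [103.0, 101.4]])
chm = canopy_height_model(dsm, dtm)  # heights: [[1.2, 0.0], [2.0, 0.0]]
```

This is the kind of derived product used in the reviewed works for plant-height estimation from photogrammetric point clouds.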
4.2. Using Machine Learning
- Image segmentation
- Feature extraction and classification
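As an illustration of the image segmentation step, one common simple approach is to threshold a greenness index such as Excess Green (ExG = 2g − r − b) computed from the RGB channels; histogram-derived thresholds of this kind appear in the reviewed segmentation literature. The threshold value below is illustrative and would normally be tuned per dataset:

```python
import numpy as np

def segment_vegetation(rgb: np.ndarray, threshold: float = 0.1) -> np.ndarray:
    """Binary vegetation mask from an RGB image via the Excess Green index.

    One common formulation: ExG = 2g - r - b on channel values normalised
    to [0, 1]; pixels above `threshold` are labelled vegetation.
    """
    img = rgb.astype(np.float64) / 255.0
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    exg = 2.0 * g - r - b
    return exg > threshold

# A green canopy pixel is kept; a brownish soil pixel is rejected.
pixels = np.array([[[60, 180, 50], [120, 100, 80]]], dtype=np.uint8)
mask = segment_vegetation(pixels)  # [[ True, False]]
```

The resulting mask can then feed the feature extraction and classification stage, e.g., as the object-delineation input of an OBIA pipeline.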
4.3. Vegetation Indices
- Vegetation Indices based on multispectral or hyperspectral data. Most of the developed Vegetation Indices use multispectral and/or hyperspectral information that can combine several bands.
- Vegetation Indices based on information from the visible spectrum. Several VIs in the visible spectrum have been developed and are widely used due to the high cost of multispectral and hyperspectral sensors.
- R: Red (620–670 nm)
- G: Green (500–560 nm)
- B: Blue (430–500 nm)
- NIR: Near Infrared (720–1500 nm)
- RE: Red Edge (670–720 nm)
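Using the band notation above, the widely used Normalised Difference Vegetation Index (NDVI) combines the R and NIR bands. A minimal sketch; the reflectance values are illustrative, not measurements from any reviewed study:

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalised Difference Vegetation Index: (NIR - R) / (NIR + R)."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / (nir + red + 1e-9)  # epsilon avoids division by zero

# Healthy vegetation reflects strongly in NIR and absorbs red, so NDVI
# approaches 1; bare soil reflects both bands similarly, so NDVI is near 0.
veg = ndvi(np.array([0.50]), np.array([0.05]))   # ~0.82
soil = ndvi(np.array([0.30]), np.array([0.25]))  # ~0.09
```

Visible-spectrum VIs follow the same pattern with R, G and B in place of NIR, trading some sensitivity for the much lower cost of RGB sensors.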
5. Limitations in the Use of UAVs for Precision Agriculture
- Open category: Operations in this category do not require authorization or a pilot license. This category is limited to operations in visual line of sight (VLOS), up to 120 m flight height, performed with UAVs compliant with defined technical requirements.
- Specific category: This is the second level of operations, which covers operations that are not compliant to the restrictions of the open category and are considered “medium risk” operations. Operators must perform a risk assessment (using a standardized method) and define mitigation measures. Operations involving UAVs of more than 25 kg or not operated in VLOS will typically fall under this category. The technical requirements for this category depend on the specific authorized operation.
- Certified category: This category is considered high risk and includes operations involving large drones in controlled airspace. Rules applicable to this category will be the same as for manned aviation. This category does not concern the use of UAVs for Precision Agriculture.
6. Discussion and Conclusions
- Photogrammetry techniques are used to construct orthophotos or DEMs from the overlapping images acquired. They are used in most applications to create vegetation maps covering several characteristics; however, they are also used standalone, mainly when only RGB sensors are available, to exploit the 3D characteristics of the vegetation.
- Machine Learning methods can be used to monitor several different characteristics of the crops. They can exploit RGB and/or multispectral/hyperspectral images. A very frequent technique in the literature is Object-Based Image Analysis (OBIA).
- Vegetation Indices use combinations of the reflectance in several bands obtained by RGB or spectral sensors, and have proved most effective when multispectral or hyperspectral information is used. Many recent studies have shown them to be very effective in monitoring various crop parameters by using different combinations of spectral bands.
Funding
Conflicts of Interest
References
- FAO. Declaration of the World Summit on Food Security; FAO: Rome, Italy, 2009.
- Mylonas, P.; Voutos, Y.; Sofou, A. A Collaborative Pilot Platform for Data Annotation and Enrichment in Viticulture. Information 2019, 10, 149.
- Mulla, D.J. Twenty five years of remote sensing in precision agriculture: Key advances and remaining knowledge gaps. Biosyst. Eng. 2013, 114, 358–371.
- Bauer, M.E.; Cipra, J.E. Identification of Agricultural Crops by Computer Processing of ERTS MSS Data; LARS Technical Reports; Purdue University: West Lafayette, IN, USA, 1973; p. 20.
- Mora, A.; Santos, T.; Lukasik, S.; Silva, J.; Falcao, A.; Fonseca, J.; Ribeiro, R. Land cover classification from multispectral data using computational intelligence tools: A comparative study. Information 2017, 8, 147.
- Taylor, J.; William, R.; Munson, K. Jane’s Pocket Book of Remotely Piloted Vehicles: Robot Aircraft Today; Collier Books: New York, NY, USA, 1977.
- Zhang, C.; Kovacs, J.M. The application of small unmanned aerial systems for precision agriculture: A review. Precis. Agric. 2012, 13, 693–712.
- Yang, S.; Yang, X.; Mo, J. The application of unmanned aircraft systems to plant protection in China. Precis. Agric. 2018, 19, 278–292.
- Yang, G.; Liu, J.; Zhao, C.; Li, Z.; Huang, Y.; Yu, H.; Xu, B.; Yang, X.; Zhu, D.; Zhang, X.; et al. Unmanned aerial vehicle remote sensing for field-based crop phenotyping: Current status and perspectives. Front. Plant Sci. 2017, 8, 1111.
- Mogili, U.R.; Deepak, B. Review on application of drone systems in precision agriculture. Procedia Comput. Sci. 2018, 133, 502–509.
- Puri, V.; Nayyar, A.; Raja, L. Agriculture drones: A modern breakthrough in precision agriculture. J. Stat. Manag. Syst. 2017, 20, 507–518.
- Kulbacki, M.; Segen, J.; Knieć, W.; Klempous, R.; Kluwak, K.; Nikodem, J.; Kulbacka, J.; Serester, A. Survey of Drones for Agriculture Automation from Planting to Harvest. In Proceedings of the 2018 IEEE 22nd International Conference on Intelligent Engineering Systems (INES), Las Palmas de Gran Canaria, Spain, 21–23 June 2018; pp. 000353–000358.
- Manfreda, S.; McCabe, M.; Miller, P.; Lucas, R.; Pajuelo Madrigal, V.; Mallinis, G.; Ben Dor, E.; Helman, D.; Estes, L.; Ciraolo, G.; et al. On the use of unmanned aerial systems for environmental monitoring. Remote Sens. 2018, 10, 641.
- Adão, T.; Hruška, J.; Pádua, L.; Bessa, J.; Peres, E.; Morais, R.; Sousa, J. Hyperspectral imaging: A review on UAV-based sensors, data processing and applications for agriculture and forestry. Remote Sens. 2017, 9, 1110.
- Maes, W.H.; Steppe, K. Perspectives for remote sensing with unmanned aerial vehicles in precision agriculture. Trends Plant Sci. 2019, 24, 152–164.
- Kamilaris, A.; Prenafeta-Boldu, F.X. Deep learning in agriculture: A survey. Comput. Electron. Agric. 2018, 147, 70–90.
- Tsouros, D.C.; Triantafyllou, A.; Bibi, S.; Sarigannidis, P.G. Data acquisition and analysis methods in UAV-based applications for Precision Agriculture. In Proceedings of the 2019 IEEE 15th International Conference on Distributed Computing in Sensor Systems (DCOSS), Santorini Island, Greece, 29–31 May 2019; pp. 377–384.
- Huang, H.; Deng, J.; Lan, Y.; Yang, A.; Deng, X.; Zhang, L. A fully convolutional network for weed mapping of unmanned aerial vehicle (UAV) imagery. PLoS ONE 2018, 13, 1–19.
- Hunt, E.R.; Horneck, D.A.; Spinelli, C.B.; Turner, R.W.; Bruce, A.E.; Gadler, D.J.; Brungardt, J.J.; Hamm, P.B. Monitoring nitrogen status of potatoes using small unmanned aerial vehicles. Precis. Agric. 2018, 19, 314–333.
- Zhang, J.; Basso, B.; Price, R.F.; Putman, G.; Shuai, G. Estimating plant distance in maize using Unmanned Aerial Vehicle (UAV). PLoS ONE 2018, 13, e0195223.
- Wang, J.J.; Ge, H.; Dai, Q.; Ahmad, I.; Dai, Q.; Zhou, G.; Qin, M.; Gu, C. Unsupervised discrimination between lodged and non-lodged winter wheat: A case study using a low-cost unmanned aerial vehicle. Int. J. Remote Sens. 2018, 39, 2079–2088.
- Näsi, R.; Viljanen, N.; Kaivosoja, J.; Alhonoja, K.; Hakala, T.; Markelin, L.; Honkavaara, E. Estimating Biomass and Nitrogen Amount of Barley and Grass Using UAV and Aircraft Based Spectral and Photogrammetric 3D Features. Remote Sens. 2018, 10, 1082.
- Yonah, I.B.; Mourice, S.K.; Tumbo, S.D.; Mbilinyi, B.P.; Dempewolf, J. Unmanned aerial vehicle-based remote sensing in monitoring smallholder, heterogeneous crop fields in Tanzania. Int. J. Remote Sens. 2018, 39, 5453–5471.
- Kellenberger, B.; Marcos, D.; Tuia, D. Detecting mammals in UAV images: Best practices to address a substantially imbalanced dataset with deep learning. Remote Sens. Environ. 2018, 216, 139–153.
- Mozgeris, G.; Jonikavičius, D.; Jovarauskas, D.; Zinkevičius, R.; Petkevičius, S.; Steponavičius, D. Imaging from manned ultra-light and unmanned aerial vehicles for estimating properties of spring wheat. Precis. Agric. 2018, 19, 876–894.
- Deng, L.; Mao, Z.; Li, X.; Hu, Z.; Duan, F.; Yan, Y. UAV-based multispectral remote sensing for precision agriculture: A comparison between different cameras. ISPRS J. Photogramm. Remote Sens. 2018, 146, 124–136.
- Zheng, H.; Cheng, T.; Li, D.; Zhou, X.; Yao, X.; Tian, Y.; Cao, W.; Zhu, Y. Evaluation of RGB, color-infrared and multispectral images acquired from unmanned aerial systems for the estimation of nitrogen accumulation in rice. Remote Sens. 2018, 10, 824.
- Tewes, A.; Schellberg, J. Towards remote estimation of radiation use efficiency in maize using uav-based low-cost camera imagery. Agronomy 2018, 8, 16.
- Raeva, P.L.; Šedina, J.; Dlesk, A. Monitoring of crop fields using multispectral and thermal imagery from UAV. Eur. J. Remote Sens. 2019, 52, 192–201.
- Huang, Y.; Reddy, K.N.; Fletcher, R.S.; Pennington, D. UAV low-altitude remote sensing for precision weed management. Weed Technol. 2018, 32, 2–6.
- Gracia-Romero, A.; Vergara-Díaz, O.; Thierfelder, C.; Cairns, J.; Kefauver, S.; Araus, J. Phenotyping conservation agriculture management effects on ground and aerial remote sensing assessments of maize hybrids performance in Zimbabwe. Remote Sens. 2018, 10, 349.
- De Castro, A.I.; Torres-Sánchez, J.; Peña, J.M.; Jiménez-Brenes, F.M.; Csillik, O.; López-Granados, F. An Automatic Random Forest-OBIA Algorithm for Early Weed Mapping between and within Crop Rows Using UAV Imagery. Remote Sens. 2018, 10, 285.
- Lambert, J.; Hicks, H.; Childs, D.; Freckleton, R. Evaluating the potential of Unmanned Aerial Systems for mapping weeds at field scales: A case study with Alopecurus myosuroides. Weed Res. 2018, 58, 35–45.
- Uddin, M.A.; Mansour, A.; Jeune, D.L.; Ayaz, M.; Aggoune, E.-H.M. UAV-assisted dynamic clustering of wireless sensor networks for crop health monitoring. Sensors 2018, 18, 555.
- Khan, Z.; Rahimi-Eichi, V.; Haefele, S.; Garnett, T.; Miklavcic, S.J. Estimation of vegetation indices for high-throughput phenotyping of wheat using aerial imaging. Plant Methods 2018, 14, 20.
- Fan, X.; Kawamura, K.; Xuan, T.D.; Yuba, N.; Lim, J.; Yoshitoshi, R.; Minh, T.N.; Kurokawa, Y.; Obitsu, T. Low-cost visible and near-infrared camera on an unmanned aerial vehicle for assessing the herbage biomass and leaf area index in an Italian ryegrass field. Grassl. Sci. 2018, 64, 145–150.
- Ziliani, M.; Parkes, S.; Hoteit, I.; McCabe, M. Intra-Season Crop Height Variability at Commercial Farm Scales Using a Fixed-Wing UAV. Remote Sens. 2018, 10, 2007.
- Zheng, H.; Li, W.; Jiang, J.; Liu, Y.; Cheng, T.; Tian, Y.; Zhu, Y.; Cao, W.; Zhang, Y.; Yao, X. A Comparative Assessment of Different Modeling Algorithms for Estimating Leaf Nitrogen Content in Winter Wheat Using Multispectral Images from an Unmanned Aerial Vehicle. Remote Sens. 2018, 10, 2026.
- Han, X.; Thomasson, J.A.; Bagnall, G.C.; Pugh, N.; Horne, D.W.; Rooney, W.L.; Jung, J.; Chang, A.; Malambo, L.; Popescu, S.C.; et al. Measurement and calibration of plant-height from fixed-wing UAV images. Sensors 2018, 18, 4092.
- Torres-Sánchez, J.; de Castro, A.I.; Peña, J.M.; Jiménez-Brenes, F.M.; Arquero, O.; Lovera, M.; López-Granados, F. Mapping the 3D structure of almond trees using UAV acquired photogrammetric point clouds and object-based image analysis. Biosyst. Eng. 2018, 176, 172–184.
- Comba, L.; Biglia, A.; Aimonino, D.R.; Gay, P. Unsupervised detection of vineyards by 3D point-cloud UAV photogrammetry for precision agriculture. Comput. Electron. Agric. 2018, 155, 84–95.
- Su, J.; Liu, C.; Coombes, M.; Hu, X.; Wang, C.; Xu, X.; Li, Q.; Guo, L.; Chen, W.H. Wheat yellow rust monitoring by learning from multispectral UAV aerial imagery. Comput. Electron. Agric. 2018, 155, 157–166.
- De Castro, A.; Jiménez-Brenes, F.; Torres-Sánchez, J.; Peña, J.; Borra-Serrano, I.; López-Granados, F. 3-D characterization of vineyards using a novel UAV imagery-based OBIA procedure for precision viticulture applications. Remote Sens. 2018, 10, 584.
- Bah, M.D.; Hafiane, A.; Canals, R. Deep Learning with Unsupervised Data Labeling for Weed Detection in Line Crops in UAV Images. Remote Sens. 2018, 10, 1690.
- Wierzbicki, D.; Fryskowska, A.; Kedzierski, M.; Wojtkowska, M.; Delis, P. Method of radiometric quality assessment of NIR images acquired with a custom sensor mounted on an unmanned aerial vehicle. J. Appl. Remote Sens. 2018, 12, 015008.
- Kerkech, M.; Hafiane, A.; Canals, R. Deep leaning approach with colorimetric spaces and vegetation indices for vine diseases detection in UAV images. Comput. Electron. Agric. 2018, 155, 237–243.
- Latif, M.A.; Cheema, M.J.M.; Saleem, M.F.; Maqsood, M. Mapping wheat response to variations in N, P, Zn, and irrigation using an unmanned aerial vehicle. Int. J. Remote Sens. 2018, 39, 7172–7188.
- Jung, J.; Maeda, M.; Chang, A.; Landivar, J.; Yeom, J.; McGinty, J. Unmanned aerial system assisted framework for the selection of high yielding cotton genotypes. Comput. Electron. Agric. 2018, 152, 74–81.
- Wan, L.; Li, Y.; Cen, H.; Zhu, J.; Yin, W.; Wu, W.; Zhu, H.; Sun, D.; Zhou, W.; He, Y. Combining UAV-Based Vegetation Indices and Image Classification to Estimate Flower Number in Oilseed Rape. Remote Sens. 2018, 10, 1484.
- Sa, I.; Popović, M.; Khanna, R.; Chen, Z.; Lottes, P.; Liebisch, F.; Nieto, J.; Stachniss, C.; Walter, A.; Siegwart, R. Weedmap: A large-scale semantic weed mapping framework using aerial multispectral imaging and deep neural network for precision farming. Remote Sens. 2018, 10, 1423.
- Ballesteros, R.; Ortega, J.F.; Hernandez, D.; Moreno, M.A. Onion biomass monitoring using UAV-based RGB imaging. Precis. Agric. 2018, 19, 840–857.
- Simic Milas, A.; Romanko, M.; Reil, P.; Abeysinghe, T.; Marambe, A. The importance of leaf area index in mapping chlorophyll content of corn under different agricultural treatments using UAV images. Int. J. Remote Sens. 2018, 39, 5415–5431.
- Mesas-Carrascosa, F.J.; Pérez-Porras, F.; Meroño de Larriva, J.; Mena Frau, C.; Agüera-Vega, F.; Carvajal-Ramírez, F.; Martínez-Carricondo, P.; García-Ferrer, A. Drift correction of lightweight microbolometer thermal sensors on-board unmanned aerial vehicles. Remote Sens. 2018, 10, 615.
- Varela, S.; Dhodda, P.; Hsu, W.; Prasad, P.; Assefa, Y.; Peralta, N.; Griffin, T.; Sharda, A.; Ferguson, A.; Ciampitti, I. Early-season stand count determination in corn via integration of imagery from unmanned aerial systems (UAS) and supervised learning techniques. Remote Sens. 2018, 10, 343.
- Marino, S.; Alvino, A. Detection of homogeneous wheat areas using multi-temporal UAS images and ground truth data analyzed by cluster analysis. Eur. J. Remote Sens. 2018, 51, 266–275.
- Marcial-Pablo, M.d.J.; Gonzalez-Sanchez, A.; Jimenez-Jimenez, S.I.; Ontiveros-Capurata, R.E.; Ojeda-Bustamante, W. Estimation of vegetation fraction using RGB and multispectral images from UAV. Int. J. Remote Sens. 2019, 40, 420–438.
- Oliveira, H.C.; Guizilini, V.C.; Nunes, I.P.; Souza, J.R. Failure detection in row crops from UAV images using morphological operators. IEEE Geosci. Remote Sens. Lett. 2018, 15, 991–995.
- Mafanya, M.; Tsele, P.; Botai, J.O.; Manyama, P.; Chirima, G.J.; Monate, T. Radiometric calibration framework for ultra-high-resolution UAV-derived orthomosaics for large-scale mapping of invasive alien plants in semi-arid woodlands: Harrisia pomanensis as a case study. Int. J. Remote Sens. 2018, 39, 5119–5140.
- Hassanein, M.; Lari, Z.; El-Sheimy, N. A new vegetation segmentation approach for cropped fields based on threshold detection from hue histograms. Sensors 2018, 18, 1253.
- Jeong, S.; Ko, J.; Choi, J.; Xue, W.; Yeom, J.-m. Application of an unmanned aerial system for monitoring paddy productivity using the GRAMI-rice model. Int. J. Remote Sens. 2018, 39, 2441–2462.
- Iwasaki, K.; Torita, H.; Abe, T.; Uraike, T.; Touze, M.; Fukuchi, M.; Sato, H.; Iijima, T.; Imaoka, K.; Igawa, H. Spatial pattern of windbreak effects on maize growth evaluated by an unmanned aerial vehicle in Hokkaido, northern Japan. Agrofor. Syst. 2019, 93, 1133–1145.
- Li, Y.; Qian, M.; Liu, P.; Cai, Q.; Li, X.; Guo, J.; Yan, H.; Yu, F.; Yuan, K.; Yu, J.; et al. The recognition of rice images by UAV based on capsule network. Clust. Comput. 2018, 1–10.
- Aasen, H.; Bolten, A. Multi-temporal high-resolution imaging spectroscopy with hyperspectral 2D imagers–From theory to application. Remote Sens. Environ. 2018, 205, 374–389.
- Quebrajo, L.; Perez-Ruiz, M.; Pérez-Urrestarazu, L.; Martínez, G.; Egea, G. Linking thermal imaging and soil remote sensing to enhance irrigation management of sugar beet. Biosyst. Eng. 2018, 165, 77–87.
- Han, L.; Yang, G.; Yang, H.; Xu, B.; Li, Z.; Yang, X. Clustering field-based maize phenotyping of plant-height growth and canopy spectral dynamics using a UAV remote-sensing approach. Front. Plant Sci. 2018, 9, 1638.
- Křížová, K.; Kroulík, M.; Haberle, J.; Lukáš, J.; Kumhálová, J. Assessment of soil electrical conductivity using remotely sensed thermal data. Agron. Res. 2018, 16, 784–793. [Google Scholar]
- Huang, C.-y.; Wei, H.L.; Rau, J.Y.; Jhan, J.P. Use of principal components of UAV-acquired narrow-band multispectral imagery to map the diverse low stature vegetation fAPAR. GISci. Remote Sens. 2019, 56, 605–623. [Google Scholar] [CrossRef]
- Souza, I.R.; Escarpinati, M.C.; Abdala, D.D. A curve completion algorithm for agricultural planning. In Proceedings of the 33rd Annual ACM Symposium on Applied Computing, Pau, France, 9–13 April 2018; pp. 284–291.
- Pascuzzi, S.; Anifantis, A.S.; Cimino, V.; Santoro, F. Unmanned aerial vehicle used for remote sensing on an apulian farm in southern Italy. In Proceedings of the 17th International Scientific Conference Engineering for Rural Development, Jelgava, Latvia, 23–25 May 2018; pp. 23–25.
- Bah, M.D.; Hafiane, A.; Canals, R. Weeds detection in UAV imagery using SLIC and the hough transform. In Proceedings of the 2017 Seventh International Conference on Image Processing Theory, Tools and Applications (IPTA), Montreal, QC, Canada, 28 November–1 December 2017; pp. 1–6.
- Pantelej, E.; Gusev, N.; Voshchuk, G.; Zhelonkin, A. Automated field monitoring by a group of light aircraft-type UAVs. In Proceedings of the International Conference on Intelligent Information Technologies for Industry, Sochi, Russia, 17–21 September 2018; Springer: Cham, Switzerland, 2018; pp. 350–358.
- Parraga, A.; Doering, D.; Atkinson, J.G.; Bertani, T.; de Oliveira Andrades Filho, C.; de Souza, M.R.Q.; Ruschel, R.; Susin, A.A. Wheat Plots Segmentation for Experimental Agricultural Field from Visible and Multispectral UAV Imaging. In Proceedings of the SAI Intelligent Systems Conference, London, UK, 6–7 September 2018; Springer: Cham, Switzerland, 2018; pp. 388–399.
- Bah, M.D.; Dericquebourg, E.; Hafiane, A.; Canals, R. Deep Learning Based Classification System for Identifying Weeds Using High-Resolution UAV Imagery. In Proceedings of the Science and Information Conference, London, UK, 10–12 July 2018; Springer: Cham, Switzerland, 2018; pp. 176–187.
- Mancini, A.; Frontoni, E.; Zingaretti, P. Improving Variable Rate Treatments by Integrating Aerial and Ground Remotely Sensed Data. In Proceedings of the 2018 International Conference on Unmanned Aircraft Systems (ICUAS), Dallas, TX, USA, 12–15 June 2018; pp. 856–863.
- Palomino, W.; Morales, G.; Huamán, S.; Telles, J. PETEFA: Geographic Information System for Precision Agriculture. In Proceedings of the 2018 IEEE XXV International Conference on Electronics, Electrical Engineering and Computing (INTERCON), Lima, Peru, 8–10 August 2018; pp. 1–4.
- De Oca, A.M.; Arreola, L.; Flores, A.; Sanchez, J.; Flores, G. Low-cost multispectral imaging system for crop monitoring. In Proceedings of the 2018 International Conference on Unmanned Aircraft Systems (ICUAS), Dallas, TX, USA, 12–15 June 2018; pp. 443–451.
- Montero, D.; Rueda, C. Detection of palm oil bud rot employing artificial vision. In IOP Conference Series: Materials Science and Engineering; IOP Publishing: Bristol, UK, 2018; Volume 437, p. 012004.
- Wang, X.; Sun, H.; Long, Y.; Zheng, L.; Liu, H.; Li, M. Development of Visualization System for Agricultural UAV Crop Growth Information Collection. IFAC-PapersOnLine 2018, 51, 631–636.
- Ronchetti, G.; Pagliari, D.; Sona, G. DTM Generation Through UAV Survey With a FISHEYE Camera On a Vineyard. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2018, 42, 2.
- Hassanein, M.; El-Sheimy, N. An efficient weed detection procedure using low-cost UAV imagery system for precision agriculture applications. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2018.
- Lussem, U.; Bolten, A.; Gnyp, M.; Jasper, J.; Bareth, G. Evaluation of RGB-based vegetation indices from UAV imagery to estimate forage yield in Grassland. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2018, 1215–1219.
- Rudd, J.D.; Roberson, G.T. Using unmanned aircraft systems to develop variable rate prescription maps for cotton defoliants. In Proceedings of the 2018 ASABE Annual International Meeting, Detroit, MI, USA, 29 July–1 August 2018; American Society of Agricultural and Biological Engineers: St. Joseph, MI, USA, 2018; p. 1.
- Soares, G.A.; Abdala, D.D.; Escarpinati, M. Plantation Rows Identification by Means of Image Tiling and Hough Transform. In Proceedings of the 13th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (VISIGRAPP 2018), Madeira, Portugal, 27–29 January 2018; pp. 453–459.
- Zhao, T.; Niu, H.; de la Rosa, E.; Doll, D.; Wang, D.; Chen, Y. Tree canopy differentiation using instance-aware semantic segmentation. In Proceedings of the 2018 ASABE Annual International Meeting, Detroit, MI, USA, 29 July–1 August 2018; American Society of Agricultural and Biological Engineers: St. Joseph, MI, USA, 2018; p. 1.
- Zhao, T.; Yang, Y.; Niu, H.; Wang, D.; Chen, Y. Comparing U-Net convolutional network with mask R-CNN in the performances of pomegranate tree canopy segmentation. Proc. SPIE 2018, 10780, 107801J.
- Pap, M.; Kiraly, S. Comparison of segmentation methods on images of energy plants obtained by UAVs. In Proceedings of the 2018 IEEE International Conference on Future IoT Technologies (Future IoT), Eger, Hungary, 18–19 January 2018; pp. 1–8.
- Wahab, I.; Hall, O.; Jirström, M. Remote Sensing of Yields: Application of UAV Imagery-Derived NDVI for Estimating Maize Vigor and Yields in Complex Farming Systems in Sub-Saharan Africa. Drones 2018, 2, 28.
- Bhandari, S.; Raheja, A.; Chaichi, M.R.; Green, R.L.; Do, D.; Pham, F.H.; Ansari, M.; Wolf, J.G.; Sherman, T.M.; Espinas, A. Effectiveness of UAV-Based Remote Sensing Techniques in Determining Lettuce Nitrogen and Water Stresses. In Proceedings of the 14th International Conference in Precision Agriculture, Montreal, QC, Canada, 24–27 June 2018.
- Xue, X.; Lan, Y.; Sun, Z.; Chang, C.; Hoffmann, W.C. Develop an unmanned aerial vehicle based automatic aerial spraying system. Comput. Electron. Agric. 2016, 128, 58–66.
- Hunt, E., Jr.; Horneck, D.; Hamm, P.; Gadler, D.; Bruce, A.; Turner, R.; Spinelli, C.; Brungardt, J. Detection of nitrogen deficiency in potatoes using small unmanned aircraft systems. In Proceedings of the 12th International Conference on Precision Agriculture, Sacramento, CA, USA, 20–23 July 2014.
- Bellvert, J.; Zarco-Tejada, P.J.; Marsal, J.; Girona, J.; González-Dugo, V.; Fereres, E. Vineyard irrigation scheduling based on airborne thermal imagery and water potential thresholds. Aust. J. Grape Wine Res. 2016, 22, 307–315.
- Romero, M.; Luo, Y.; Su, B.; Fuentes, S. Vineyard water status estimation using multispectral imagery from an UAV platform and machine learning algorithms for irrigation scheduling management. Comput. Electron. Agric. 2018, 147, 109–117.
- Wang, G.; Lan, Y.; Qi, H.; Chen, P.; Hewitt, A.; Han, Y. Field evaluation of an unmanned aerial vehicle (UAV) sprayer: Effect of spray volume on deposition and the control of pests and disease in wheat. Pest Manag. Sci. 2019, 75, 1546–1555.
- Hentschke, M.; Pignaton de Freitas, E.; Hennig, C.; Girardi da Veiga, I. Evaluation of Altitude Sensors for a Crop Spraying Drone. Drones 2018, 2, 25.
- Garre, P.; Harish, A. Autonomous Agricultural Pesticide Spraying UAV. In IOP Conference Series: Materials Science and Engineering; IOP Publishing: Bristol, UK, 2018; Volume 455, p. 012030.
- Lan, Y.; Chen, S. Current status and trends of plant protection UAV and its spraying technology in China. Int. J. Precis. Agric. Aviat. 2018, 1, 1–9.
- Albetis, J.; Jacquin, A.; Goulard, M.; Poilvé, H.; Rousseau, J.; Clenet, H.; Dedieu, G.; Duthoit, S. On the potentiality of UAV multispectral imagery to detect Flavescence dorée and Grapevine Trunk Diseases. Remote Sens. 2019, 11, 23.
- Dos Santos Ferreira, A.; Freitas, D.M.; da Silva, G.G.; Pistori, H.; Folhes, M.T. Weed detection in soybean crops using ConvNets. Comput. Electron. Agric. 2017, 143, 314–324.
- Ballester, C.; Hornbuckle, J.; Brinkhoff, J.; Smith, J.; Quayle, W. Assessment of in-season cotton nitrogen status and lint yield prediction from unmanned aerial system imagery. Remote Sens. 2017, 9, 1149.
- Gnädinger, F.; Schmidhalter, U. Digital counts of maize plants by unmanned aerial vehicles (UAVs). Remote Sens. 2017, 9, 544.
- Percival, D.; Gallant, D.; Harrington, T.; Brown, G. Potential for commercial unmanned aerial vehicle use in wild blueberry production. In XI International Vaccinium Symposium 1180; International Society for Horticultural Science (ISHS): Orlando, FL, USA, 2016; pp. 233–240.
- Yao, X.; Wang, N.; Liu, Y.; Cheng, T.; Tian, Y.; Chen, Q.; Zhu, Y. Estimation of wheat LAI at middle to high levels using unmanned aerial vehicle narrowband multispectral imagery. Remote Sens. 2017, 9, 1304.
- Maimaitijiang, M.; Ghulam, A.; Sidike, P.; Hartling, S.; Maimaitiyiming, M.; Peterson, K.; Shavers, E.; Fishman, J.; Peterson, J.; Kadam, S.; et al. Unmanned Aerial System (UAS)-based phenotyping of soybean using multi-sensor data fusion and extreme learning machine. ISPRS J. Photogramm. Remote Sens. 2017, 134, 43–58.
- Poblete, T.; Ortega-Farías, S.; Moreno, M.; Bardeen, M. Artificial neural network to predict vine water status spatial variability using multispectral information obtained from an unmanned aerial vehicle (UAV). Sensors 2017, 17, 2488.
- Jermthaisong, P.; Kingpaiboon, S.; Chawakitchareon, P.; Kiyoki, Y. Relationship between vegetation indices and SPAD values of waxy corn using an unmanned aerial vehicle. Inf. Model. Knowl. Bases XXX 2019, 312, 312.
- Gao, J.; Liao, W.; Nuyttens, D.; Lootens, P.; Vangeyte, J.; Pižurica, A.; He, Y.; Pieters, J.G. Fusion of pixel and object-based features for weed mapping using unmanned aerial vehicle imagery. Int. J. Appl. Earth Obs. Geoinf. 2018, 67, 43–53.
- Iqbal, F.; Lucieer, A.; Barry, K. Poppy crop capsule volume estimation using UAS remote sensing and random forest regression. Int. J. Appl. Earth Obs. Geoinf. 2018, 73, 362–373.
- Huuskonen, J.; Oksanen, T. Soil sampling with drones and augmented reality in precision agriculture. Comput. Electron. Agric. 2018, 154, 25–35.
- Jorge, J.; Vallbé, M.; Soler, J.A. Detection of irrigation inhomogeneities in an olive grove using the NDRE vegetation index obtained from UAV images. Eur. J. Remote Sens. 2019, 52, 169–177.
- Bhandari, S.; Raheja, A.; Chaichi, M.R.; Green, R.L.; Do, D.; Pham, F.H.; Ansari, M.; Wolf, J.G.; Sherman, T.M.; Espinas, A. Lessons Learned from UAV-Based Remote Sensing for Precision Agriculture. In Proceedings of the 2018 International Conference on Unmanned Aircraft Systems (ICUAS), Dallas, TX, USA, 12–15 June 2018; pp. 458–467.
- Franco, C.; Guada, C.; Rodríguez, J.T.; Nielsen, J.; Rasmussen, J.; Gómez, D.; Montero, J. Automatic detection of thistle-weeds in cereal crops from aerial RGB images. In International Conference on Information Processing and Management of Uncertainty in Knowledge-Based Systems; Springer: Cham, Switzerland, 2018; pp. 441–452.
- Sobayo, R.; Wu, H.H.; Ray, R.; Qian, L. Integration of Convolutional Neural Network and Thermal Images into Soil Moisture Estimation. In Proceedings of the 2018 1st International Conference on Data Intelligence and Security (ICDIS), South Padre Island, TX, USA, 8–10 April 2018; pp. 207–210.
- Oliveira, R.; Khoramshahi, E.; Suomalainen, J.; Hakala, T.; Viljanen, N.; Honkavaara, E. Real-time and post-processed georeferencing for hyperspectral drone remote sensing. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2018, 42, 789–795.
- Liu, J.; Chen, P.; Xu, X. Estimating Wheat Coverage Using Multispectral Images Collected by Unmanned Aerial Vehicles and a New Sensor. In Proceedings of the 2018 7th International Conference on Agro-geoinformatics (Agro-geoinformatics), Hangzhou, China, 6–9 August 2018; pp. 1–5.
- Maurya, A.K.; Singh, D.; Singh, K. Development of Fusion Approach for Estimation of Vegetation Fraction Cover with Drone and Sentinel-2 Data. In Proceedings of the IGARSS 2018-2018 IEEE International Geoscience and Remote Sensing Symposium, Valencia, Spain, 22–27 July 2018; pp. 7448–7451.
- Kumpumäki, T.; Linna, P.; Lipping, T. Crop Lodging Analysis from UAS Orthophoto Mosaic, Sentinel-2 Image and Crop Yield Monitor Data. In Proceedings of the IGARSS 2018-2018 IEEE International Geoscience and Remote Sensing Symposium, Valencia, Spain, 22–27 July 2018; pp. 7723–7726.
- Falco, N.; Wainwright, H.; Ulrich, C.; Dafflon, B.; Hubbard, S.S.; Williamson, M.; Cothren, J.D.; Ham, R.G.; McEntire, J.A.; McEntire, M. Remote Sensing to UAV-Based Digital Farmland. In Proceedings of the IGARSS 2018-2018 IEEE International Geoscience and Remote Sensing Symposium, Valencia, Spain, 22–27 July 2018; pp. 5936–5939.
- Albornoz, C.; Giraldo, L.F. Trajectory design for efficient crop irrigation with a UAV. In Proceedings of the 2017 IEEE 3rd Colombian Conference on Automatic Control (CCAC), Cartagena, Colombia, 18–20 October 2017; pp. 1–6.
- Chartzoulakis, K.; Bertaki, M. Sustainable water management in agriculture under climate change. Agric. Agric. Sci. Procedia 2015, 4, 88–98.
- Saccon, P. Water for agriculture, irrigation management. Appl. Soil Ecol. 2018, 123, 793–796.
- Gupta, S.G.; Ghonge, M.M.; Jawandhiya, P. Review of unmanned aircraft system (UAS). Int. J. Adv. Res. Comput. Eng. Technol. (IJARCET) 2013, 2, 1646–1658.
- Cai, G.; Dias, J.; Seneviratne, L. A survey of small-scale unmanned aerial vehicles: Recent advances and future development trends. Unmanned Syst. 2014, 2, 175–199.
- Palossi, D.; Gomez, A.; Draskovic, S.; Keller, K.; Benini, L.; Thiele, L. Self-sustainability in nano unmanned aerial vehicles: A blimp case study. In Proceedings of the Computing Frontiers Conference, Siena, Italy, 15–17 May 2017; pp. 79–88.
- Oettershagen, P.; Stastny, T.; Mantel, T.; Melzer, A.; Rudin, K.; Gohl, P.; Agamennoni, G.; Alexis, K.; Siegwart, R. Long-endurance sensing and mapping using a hand-launchable solar-powered UAV. In Field and Service Robotics; Springer: Cham, Switzerland, 2016.
- Maltamo, M.; Naesset, E.; Vauhkonen, J. Forestry applications of airborne laser scanning. Concepts Case Stud. Manag. For. Ecosyst. 2014, 27, 460.
- Tsouros, D.C.; Smyrlis, P.N.; Tsipouras, M.G.; Tsalikakis, D.G.; Giannakeas, N.; Tzallas, A.T.; Manousou, P. Automated collagen proportional area extraction in liver biopsy images using a novel classification via clustering algorithm. In Proceedings of the 2017 IEEE 30th International Symposium on Computer-Based Medical Systems (CBMS), Thessaloniki, Greece, 22–24 June 2017; pp. 30–34.
- Bonotis, P.A.; Tsouros, D.C.; Smyrlis, P.N.; Tzallas, A.T.; Giannakeas, N.; Evripidis, G.; Tsipouras, M.G. Automated Assessment of Pain Intensity based on EEG Signal Analysis. In Proceedings of the IEEE 19th International Conference on BioInformatics and BioEngineering, Athens, Greece, 28–30 October 2019.
- Cui, D.; Curry, D. Prediction in marketing using the support vector machine. Mark. Sci. 2005, 24, 595–615.
- Tarca, A.L.; Carey, V.J.; Chen, X.W.; Romero, R.; Drăghici, S. Machine learning and its applications to biology. PLoS Comput. Biol. 2007, 3, e116.
- Leica Geosystems. ERDAS Imagine; Leica Geosystems: Atlanta, GA, USA, 2004.
- Baatz, M.; Benz, U.; Dehghani, S.; Heynen, M.; Holtje, A.; Hofmann, P.; Lingenfelder, I.; Mimler, M.; Sohlbach, M.; Weber, M. eCognition Professional User Guide 4; Definiens Imaging: Munich, Germany, 2004.
- Tetracam, Inc. ADC Users Guide V2.3; Tetracam, Inc.: Chatsworth, CA, USA, 2011.
- Westoby, M.J.; Brasington, J.; Glasser, N.F.; Hambrey, M.J.; Reynolds, J. Structure-from-Motion photogrammetry: A low-cost, effective tool for geoscience applications. Geomorphology 2012, 179, 300–314.
- Liakos, K.; Busato, P.; Moshou, D.; Pearson, S.; Bochtis, D. Machine learning in agriculture: A review. Sensors 2018, 18, 2674.
- Hossain, M.D.; Chen, D. Segmentation for Object-Based Image Analysis (OBIA): A review of algorithms and challenges from remote sensing perspective. ISPRS J. Photogramm. Remote Sens. 2019, 150, 115–134.
- Wiegand, C.; Richardson, A.; Escobar, D.; Gerbermann, A. Vegetation indices in crop assessments. Remote Sens. Environ. 1991, 35, 105–119.
- Tanriverdi, C. A review of remote sensing and vegetation indices in precision farming. J. Sci. Eng. 2006, 9, 69–76.
- Bannari, A.; Morin, D.; Bonn, F.; Huete, A. A review of vegetation indices. Remote Sens. Rev. 1995, 13, 95–120.
- Xue, J.; Su, B. Significant remote sensing vegetation indices: A review of developments and applications. J. Sens. 2017, 2017, 1353691.
- Eurostat. Agriculture, Forestry and Fishery Statistics; Eurostat: Luxembourg, 2018.
- European Commission. Commission Delegated Regulation (EU) 2019/945 of 12 March 2019 on unmanned aircraft systems and on third-country operators of unmanned aircraft systems. Off. J. Eur. Union 2019, L 152, 1–40.
- European Commission. Commission Implementing Regulation (EU) 2019/947 of 24 May 2019 on the rules and procedures for the operation of unmanned aircraft. Off. J. Eur. Union 2019, L 152, 45–70.
| Category | Crop Features |
|---|---|
| Vegetation | biomass [22,103] |
| | nitrogen status [22,99,103,110] |
| | moisture content [109,110] |
| | vegetation color [49,54] |
| | spectral behavior of chlorophyll [64,99] |
| | temperature [64,69] |
| | spatial position of an object [32,106] |
| | size and shape of different elements and plants |
| | vegetation indices [54,55,56] |
| Soil | moisture content [109,112] |
| | temperature [66,69] |
| | electrical conductivity [66] |
| Software Tool | Description |
|---|---|
| Adobe Photoshop [21,100] | Applied to correct image distortion and for other image-processing tasks |
| Agisoft Photoscan [22,36,37] | Used to construct 3D models and orthomosaics; also supports the calculation of vegetation indices |
| QGIS [23,55] | Usually used to calculate vegetation indices from multispectral data |
| MATLAB [35,100] | Applied mainly to calculate vegetation indices; can also be used for other image-processing methods |
| Pix4D [29,35,55] | The most commonly used tool; supports both the calculation of vegetation indices and the construction of 3D models and orthomosaics |
| Vegetation Index | Abbreviation | Formula |
|---|---|---|
| Vegetation Indices derived from multispectral information | | |
| Ratio Vegetation Index | RVI | NIR / Red |
| Normalized Difference Vegetation Index | NDVI | (NIR − Red) / (NIR + Red) |
| Normalized Difference Red Edge Index | NDRE | (NIR − RedEdge) / (NIR + RedEdge) |
| Green Normalized Difference Vegetation Index | GNDVI | (NIR − Green) / (NIR + Green) |
| RGB-based Vegetation Indices | | |
| Excess Greenness Index | ExG | 2g − r − b |
| Normalized Difference Index | NDI | (g − r) / (g + r) |
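The indices in the table above reduce to simple per-band arithmetic on reflectance values. As a minimal illustrative sketch (the function names and the scalar-reflectance interface are our own, not from the review; in practice these formulas are applied per pixel to calibrated band rasters, e.g. as NumPy arrays):

```python
def ndvi(nir, red):
    # Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)
    return (nir - red) / (nir + red)

def ndre(nir, red_edge):
    # Normalized Difference Red Edge Index: (NIR - RedEdge) / (NIR + RedEdge)
    return (nir - red_edge) / (nir + red_edge)

def gndvi(nir, green):
    # Green NDVI: (NIR - Green) / (NIR + Green)
    return (nir - green) / (nir + green)

def rvi(nir, red):
    # Ratio Vegetation Index: simple band ratio NIR / Red
    return nir / red

def exg(r, g, b):
    # Excess Greenness on chromatic coordinates: 2g - r - b,
    # where r, g, b are the RGB values normalized by their sum
    total = r + g + b
    return (2 * g - r - b) / total

def ndi(r, g):
    # Normalized Difference Index for RGB imagery: (g - r) / (g + r)
    return (g - r) / (g + r)
```

Normalized-difference indices are bounded in [−1, 1], which is why they are preferred over raw ratios such as RVI when comparing flights acquired under different illumination.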
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Tsouros, D.C.; Bibi, S.; Sarigiannidis, P.G. A Review on UAV-Based Applications for Precision Agriculture. Information 2019, 10, 349. https://doi.org/10.3390/info10110349