Technical Note

A Review of Unmanned Aerial Vehicle Low-Altitude Remote Sensing (UAV-LARS) Use in Agricultural Monitoring in China

1 Institute of Agricultural Sciences in Taihu Area of Jiangsu, Suzhou 215155, China
2 Key Laboratory of Land Surface Pattern and Simulation, Institute of Geographical Sciences and Natural Resources Research, Chinese Academy of Sciences, Beijing 100101, China
3 Suzhou Polytechnic Institute of Agriculture, Suzhou 215008, China
* Author to whom correspondence should be addressed.
Remote Sens. 2021, 13(6), 1221; https://doi.org/10.3390/rs13061221
Submission received: 22 February 2021 / Revised: 12 March 2021 / Accepted: 16 March 2021 / Published: 23 March 2021
(This article belongs to the Special Issue Digital Agriculture with Remote Sensing)

Abstract

Precision agriculture relies on the rapid acquisition and analysis of agricultural information. An emerging method of agricultural monitoring is unmanned aerial vehicle low-altitude remote sensing (UAV-LARS), which offers the significant advantages of simple construction, strong mobility, and high spatial-temporal resolution, with image and spatial information obtained synchronously. UAV-LARS could provide a high degree of overlap between X and Y during key crop growth periods that is currently lacking in satellite remote sensing data. Simultaneously, UAV-LARS overcomes limitations of ground-based platforms, such as their small monitoring scope. Overall, UAV-LARS has demonstrated great potential as a tool for monitoring agriculture at fine and regional scales. Here, we systematically summarize the history and current application of UAV-LARS in Chinese agriculture. Specifically, we outline the technical characteristics and sensor payloads of the available types of unmanned aerial vehicles and discuss their advantages and limitations. Finally, we provide suggestions for overcoming current limitations of UAV-LARS and directions for future work.

Graphical Abstract

1. Introduction

China has the world's largest population, about 1.4 billion people, and the world's largest agricultural population, about 600 million. Agriculture is the most important industry in China. At present, China produces 25 percent of the world's food and feeds around 19 percent of the world's population with only 7 percent of the world's arable land. For a long time, farming in China has been a labor-intensive industry. In recent decades, the Chinese government has paid close attention to agricultural science and technology, driving high growth in China's output of various agricultural products and accelerating the transformation from traditional to modern agriculture.
Agricultural remote sensing is the process of systematically monitoring and analyzing multiple agricultural factors using remote sensing platforms such as satellites, manned aircraft and unmanned aerial vehicles (UAVs) [1]. Since agricultural remote sensing was introduced to China in the 1970s, it has rapidly outstripped traditional agricultural monitoring to become one of the primary methods of quickly and accurately obtaining agricultural information, and its application in this field continues to grow exponentially [2]. While an advantage of satellite remote sensing is its ability to collect data over a large spatial scale, long revisit periods, cloud occlusion, and high acquisition costs limit its practical application in fine-scale agricultural monitoring [3]. Similarly, ground-based sensing platforms have the limitations of poor mobility and an inability to monitor multiple plots simultaneously. UAVs, commonly known as unmanned aircraft systems (UAS) or drones, can be used to overcome these shortcomings [4]. UAVs are a quickly evolving technology, and the use of unmanned aerial vehicle low-altitude remote sensing (UAV-LARS) in precision agriculture is currently an area of great research interest and development [5].
UAV-LARS uses a traditional UAV as an aerial platform to carry various sensor payloads, including digital cameras and multispectral and hyperspectral high-resolution aerial imaging cameras [6]. Advances in UAV navigation and obstacle avoidance technology and the development of modularized, light-weight sensor payloads enable real-time target positioning, and on-board decision-support software further extends their applications in precision agriculture. The key advantages of operational flexibility, strong timeliness, higher spatial resolution of imagery and lower operational costs have widened the scope of UAV-LARS application in agricultural monitoring [7]. According to the Association for Unmanned Vehicle Systems International (AUVSI), 80% of UAVs will be utilized for agricultural purposes in the coming years [3]. UAVs are therefore expected to play a crucial role in the development of the agricultural sector.
A number of surveys concerning UAV-LARS have been published. A decade ago, Zhang and Kovacs [8] provided an early overview of the application of small UAV systems in precision agriculture. Radoglou-Grammatikis et al. [3] analyzed 20 UAV-based crop monitoring applications in detail, examining the methodology adopted, the proposed UAV architecture, the UAV type, and the UAV technical characteristics and payload of each application. Considering the rapid evolution of UAV-LARS, Xiang et al. [9] presented a specific review of recent advances in technologies and applications. Other reviews addressed specific applications of UAV-LARS in forestry [10,11] and in natural resource and environmental management [12]. However, most of these studies were carried out in developed countries such as Japan, Greece, the U.K., and the U.S.A., and there is no systematic review of UAV-LARS in China covering its development history, recent developments in UAV image processing, and its agricultural applications.
At present, UAV-LARS plays a crucial role in precision agriculture and is developing rapidly in China, so a comprehensive survey is essential to gain a clear understanding of its current status. We hope this review provides an overall picture of recent developments of UAV-LARS in China and helps guide the direction of future research. In this review, we provide an in-depth overview of the agricultural research applications of UAV-LARS and specifically detail the (1) history of UAV-LARS development (Section 3), (2) composition of the UAV-LARS platform (Section 4) and research trends (Section 5), (3) scope of UAV data post-processing (Section 6), (4) agricultural research applications of UAV-LARS (Section 7), (5) current limitations of this method (Section 8), and (6) conclusions and potential for future agricultural application of UAV-LARS (Section 9).

2. Methodology

A systematic literature review (SLR) analyzes multiple research studies through a systematic process. In this paper, an SLR was performed to describe UAV applications in various agricultural monitoring scenarios in China and to explore their limitations. We first set out the general procedure for the literature analysis: formulating the question, identifying studies, statistical analysis, and reporting the results. The literature search is the most vital step of the methodology; an electronic search of related studies indexed in the China National Knowledge Infrastructure (CNKI) was therefore conducted using keywords. The CNKI is the largest Chinese digital library, with an integrated collection of journals, master's and doctoral theses, conference papers, newspapers, yearbooks, reference books and other literature resources. Next, the selected studies were combined and analyzed to address specific topics, including the development history of UAV-LARS, UAV image processing issues, and new applications and research trends. The limitations of UAV-LARS were also explored to gain a clearer panorama and to promote further progress. Finally, the main conclusions were drawn and directions for future work were outlined. The steps are as follows:
  • Identification of the need of review
  • Inclusion criterion
  • Identification of keywords
  • Literature collection
  • Literature management
  • Information extraction and analysis
  • Summary and future prospect

3. History of UAV-LARS Development in China

3.1. Exploratory Stage (1982–2000)

The UAV industry in China began later than in developed countries such as the U.S.A. and Japan. Prior to the 1980s, Nanjing University of Aeronautics and Astronautics and Beijing University of Aeronautics and Astronautics designed a UAV remote sensing prototype; however, this UAV was primarily used in military applications [13]. In 1982, Northwestern Polytechnical University developed four D-4 prototypes and two sets of ground equipment for multi-purpose UAVs [14], marking the beginning of civil UAV remote sensing. During this period, UAV-LARS was mainly used in aerial mapping and physical prospecting, with few applications in the agricultural sector.

3.2. Initial Stage (2001–2011)

Development of UAV-LARS technology for agricultural monitoring began in the early 21st century. The Soil and Fertilizer Research Institute of Chinese Academy of Agricultural Sciences first carried out a UAV-LARS experiment in Shanghai Jiangfeng farm in 2004, digitizing land boundaries, measuring plot area, identifying crop species, and analyzing crop growth [15]. Subsequently, UAVs equipped with digital cameras were increasingly used to quantify land use and land cover change [16], investigate medicinal plant resources [17], collect paddy field information [18], and monitor crop diseases [19]. At this point, UAVs became an important part of agricultural remote sensing monitoring.

3.3. General Application Stage (2012–2020)

As the UAV industry chain in China developed, so too did the stability and maneuverability of the UAV platform. The domestic UAV market grew by an order of magnitude: for example, the multi-rotors produced by DJI-Innovations Technology Co., Ltd. (DJI company, Shenzhen, China) came to dominate the industry with >70% of the global market share and >90% of the domestic market share. Applications in agriculture, forestry, land surveying and mapping, disaster rescue and other industries rapidly accelerated the development of UAV-LARS technology in China [13]. In agricultural monitoring, UAV-LARS data are now widely used for field extraction, soil nutrient inversion, crop growth monitoring, yield prediction, crop stress detection, crop phenotypic analysis and farmland water management.

4. Technical Details of the UAV-LARS Platform

4.1. System Architecture

UAV-LARS is a new type of aerial remote sensing system that uses a UAV as the aerial platform to carry sensor payloads such as digital, multispectral, and hyperspectral cameras. These sensors acquire high-resolution images from low altitudes with little interference from clouds, making UAV-LARS a flexible, efficient remote sensing tool that can be repeatedly deployed for data acquisition [20]. The system architecture of a UAV is generally designed for particular tasks. Weight should be considered first for energy-efficient operation: the UAV should be as light as possible. To achieve sufficient flight time, a powerful propulsion system must be included. Maneuverability and obstacle avoidance are two important characteristics for safe, autonomous flight. In addition, the UAV software system should be rigorously designed for high performance.
In general, the basic UAV-LARS system is composed of a flight system, a mission load system and a ground control system (Figure 1). Beyond these, other modules such as the autopilot, inertial navigation system, imaging sensor and its mounting, and route planning software are considered critical. Drones come in a wide variety of shapes and configurations.

4.2. Types of UAVs

There are many types of UAV systems that differ in size, weight, load, power, and endurance time. The UAVs used in agricultural remote sensing typically weigh less than 116 kg, belonging to the "small" (≤15 kg) or "light" (≤7 kg) UAV classes, and fly lower than 1000 m, falling into either the low-altitude (100–1000 m) or ultra-low-altitude (1–100 m) category [21]. While there are many flight mechanisms for UAVs, the most popular types for remote sensing are fixed-wing, rotary-wing, and unmanned helicopter (Table 1). By intended use, UAVs are divided into civilian and military platforms [22]. Most agricultural remote sensing monitoring tasks are carried out by civilian UAVs, mainly fixed-wing UAVs and industrial rotary-wing UAVs; the latter are driven by four to eight propellers and can fly along a set route and hover at different altitudes. Multirotors are increasingly used since they are naturally more agile, compact and easier to launch than fixed-wing UAVs. The X-frame and H-frame are the two main multirotor airframe types, designed for different flight behaviors: the X-frame is symmetric and provides higher stability, while the H-frame provides a higher payload capacity.
With the rapid growth of expertise and of agricultural aviation information technology, the number of consumer-grade UAVs in China has increased explosively over the last few years. The DJI company, for example, has developed small and light UAVs that differ in configuration, flight control system, payload and endurance time (Table 2) and can carry a variety of sensor equipment. The high cost-effectiveness and advanced technology of these products have promoted the rapid development of industrial UAV applications.

4.3. Payload

Based on the requirements of different agricultural monitoring missions, a range of sensors can be mounted on UAV platforms. Depending on the UAV's payload lift capability, these sensors can include high-resolution CCD digital cameras, light optical cameras, multispectral cameras, hyperspectral cameras, thermal cameras, polarization payloads and light detection and ranging (LiDAR) systems, among others (Table 3). For agricultural use, it is preferable to have a small, light payload to enhance precision and performance. Increasingly, integrated sensors that perform multiple functions are being developed. Payloads differ in the type of imaging sensor they contain, the spectral bands collected and the resulting image size. For example, digital cameras and multispectral sensors are most frequently used because of their low cost and the many available options. Hyperspectral cameras and LiDAR are also used in agricultural applications to obtain crop height, biomass, and leaf area index (LAI) [23].

5. Agricultural UAV-LARS Literature Statistics

We conducted an SLR of UAV-LARS applications for agricultural monitoring in China from 1995 to 2019. A total of 1955 peer-reviewed studies were identified using the search terms "agriculture", "UAV", "spectrum", and "low-altitude remote sensing" in the CNKI literature database. The number of scientific papers published shows a trend of rapid growth over recent years (Figure 2). Most of the research on UAV-LARS has been performed since 2012, and papers published during 2016–2019 account for 76.6% of the total. We now see an average of 374 UAV publications per year, reflecting the increasing adoption of UAV technology in agricultural monitoring. Keyword frequency statistics generated with the theme visualization tool of the CNKI literature database indicate that keywords such as "vegetation index", "remote sensing image" and "hyperspectral" appear frequently in the 1955 identified papers (Figure 3), further reinforcing the current scientific interest in agricultural UAV-LARS applications.
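As a simple illustration of this type of keyword frequency analysis (a minimal sketch, not the CNKI theme visualization tool itself), the following Python snippet counts author keywords over a hypothetical set of bibliographic records:

```python
from collections import Counter

# Hypothetical records exported from a literature search (title plus author keywords).
records = [
    {"title": "UAV remote sensing of winter wheat", "keywords": ["vegetation index", "UAV", "winter wheat"]},
    {"title": "Hyperspectral estimation of rice nitrogen", "keywords": ["hyperspectral", "UAV", "nitrogen"]},
    {"title": "Mosaicking of low-altitude remote sensing images", "keywords": ["remote sensing image", "SIFT", "UAV"]},
]

# Count how often each author keyword appears across the corpus.
keyword_counts = Counter(kw.lower() for rec in records for kw in rec["keywords"])

# Print the ten most frequent keywords, analogous to the ranking shown in Figure 3.
for keyword, count in keyword_counts.most_common(10):
    print(f"{keyword}: {count}")
```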

6. UAV Image Processing

Image processing is a prominent concern in UAV-LARS. UAV images have high resolution, a high degree of overlap, large data volume and small individual image footprints. However, due to UAV weight restrictions and therefore the types of sensors that can be carried, limitations remain in data collection, such as low pose accuracy, large rotation angles, and image distortion. In the early days, traditional photogrammetry was used for UAV image geometric correction and aerial triangulation; this has since been replaced by algorithms in professional UAV image processing software. These algorithms integrate radiometric calibration, geometric correction, and image matching and mosaicking to generate point cloud models [37], which take UAV sensor data as input and output digital orthophoto maps (DOMs) and digital elevation models (DEMs). UAV software packages such as Pix4Dmapper (Pix4D SA, Lausanne, Switzerland), PhotoScan (Agisoft LLC, St. Petersburg, Russia), Smart3D (Bentley Systems, Inc., Exton, PA, USA), ImageStation SSK (Intergraph Corp., Huntsville, AL, USA), ERDAS/LPS (Intergraph Corp., Huntsville, AL, USA) and OneButton (Research System Inc., Manassas, VA, USA) are popular because they are accessible and easy to use [9]. Chinese software programs including PixelGrid (Chinese Academy of Surveying and Mapping, Beijing, China), DPGrid (Wuhan University, Wuhan, China), DPMatrix (Wuhan University, Wuhan, China), FlightMatrix (Wuhan Visiontek Inc., Wuhan, China), GodWork (Wuhan University, Wuhan, China), MAP-AT (Chinese Academy of Surveying and Mapping, Beijing, China), Cloud-AT (Guangzhou Remote Sensing Information Technology Co., Ltd., Guangzhou, China), and Heli-Mapping (Wuhan University, Wuhan, China) are also gaining in popularity. Of these, Pix4Dmapper is one of the most highly optimized and accurate packages. Other researchers maintain that the spectral reflectance and coefficient of variation of PhotoScan mosaics are more consistent with the original single images than those of Pix4Dmapper, and that PhotoScan generates more accurate DEMs and DOMs, making it more advantageous in agricultural applications [38].
Most methods of matching and mosaicking UAV images are based on the scale-invariant feature transform (SIFT) matching algorithm proposed by Lowe [39]. SIFT features improve on previous approaches by being largely invariant to changes in scale, illumination, and local affine distortions [40]. The SIFT algorithm has the advantages of high matching accuracy and good tolerance to illumination and small angle changes. However, with large image volumes, time efficiency is low due to the complexity of the algorithm [41]. Some studies have optimized the SIFT algorithm, for example by using bidirectional matching or random sample consensus (RANSAC) algorithms [42]. Still, most UAV image matching and mosaicking methods are relatively simple and suited to images with small rotation (rotation angle <15°) or offset (<5–10 pixels) [43]. Recently, new UAV image processing methods have been introduced. These methods match images based on feature points and search for the best stitching line with a dynamic programming algorithm to obtain seamless, color-uniform stitching using image fusion methods [44]. For example, Jia et al. [45] proposed a non-minimum value suppression method based on image sharpening to adaptively modify the sampling step size; this increased the number of feature points by 77.5% and the number of matched pairs by 15, making it more suitable for mosaicking low-contrast remote sensing images.
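To make the matching step concrete, the following Python/OpenCV sketch detects SIFT features in two overlapping UAV frames, filters matches with Lowe's ratio test, and estimates a homography with RANSAC. It is a minimal illustration of the generic SIFT-plus-RANSAC approach discussed above, not the optimized algorithms of the cited studies, and the image file names are hypothetical.

```python
import cv2
import numpy as np

# Load two overlapping UAV frames (hypothetical file names).
img1 = cv2.imread("frame_001.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("frame_002.jpg", cv2.IMREAD_GRAYSCALE)

# Detect SIFT keypoints and compute descriptors in both images.
sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

# Match descriptors and keep only matches passing Lowe's ratio test.
matcher = cv2.BFMatcher()
raw_matches = matcher.knnMatch(des1, des2, k=2)
good = [m for m, n in raw_matches if m.distance < 0.75 * n.distance]

# Estimate a homography between the frames, rejecting outliers with RANSAC.
src = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
dst = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
H, inlier_mask = cv2.findHomography(src, dst, cv2.RANSAC, ransacReprojThreshold=3.0)

# Warp the first frame into the second frame's coordinate system as a crude mosaicking step.
h, w = img2.shape
warped = cv2.warpPerspective(img1, H, (w, h))
print(f"{int(inlier_mask.sum())} inlier matches out of {len(good)} candidates")
```

In a full mosaicking pipeline, the seam-line search and image fusion steps described above would then blend the warped frames into a uniform mosaic.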

7. UAV-LARS Application in Agriculture

The development of UAV technology and the concomitant decline in costs have accelerated the widespread adoption of UAVs for agricultural monitoring. UAV-LARS information acquisition methods, including photography, spectral inversion and spectral integration, are used to study farmland dynamics, crop growth, pest and weed stress, disaster assessment, and the farmland water and fertilizer environment. We elaborate on these applications below.

7.1. Dynamic Monitoring of Cultivated Land

China is a developing country where agriculture forms the basis of the national economy. As such, cultivated land represents economic output and food security. Understanding the distribution, quality, and dynamic changes of cultivated land is crucial for formulating land use policy. At present, UAV monitoring is used in small- and medium-sized cultivated areas to quantify changes in plot size, shape, and crop planting type. Crop monitoring is essential to identify the most profitable crop planting regime. The earliest research using UAVs to monitor cultivated land in China was performed by Ma et al. [46], who used a UAV equipped with a Canon EOS 300D digital camera (Canon Inc., Tokyo, Japan) to conduct a land resource survey in Wuming county, Guangxi Province. More recently, UAV-LARS technology was used as part of the Third National Land Survey to evaluate cultivated land in Gulang county, Gansu Province [47]. UAV oblique photogrammetry and object-oriented classification technology have been used to reduce the influence of surrounding ground features on the boundaries of cultivated land, enabling accurate discrimination of cultivated land and planting characteristics based on perspective conversion [48]. Yu et al. [49] compared UAV images with different spatial and temporal resolutions and different forward and lateral overlaps and demonstrated that UAV images with 6 cm spatial resolution could satisfactorily classify small-scale land use in agricultural parks. UAV RGB images can also be analyzed to extract vegetation index, texture and shape information and thereby identify cultivated land that has not been planted with crops, with an overall accuracy of 89.2% [50].
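As a minimal sketch of how an RGB-derived vegetation index can separate vegetated from unplanted cultivated land (the cited studies combine several features and classifiers), the following Python example computes the excess-green (ExG) index and applies Otsu thresholding; the input file name is hypothetical.

```python
import numpy as np
import cv2

# Read a UAV RGB orthophoto (hypothetical file); OpenCV loads channels in BGR order.
bgr = cv2.imread("plot_rgb.png").astype(np.float32)
b, g, r = bgr[..., 0], bgr[..., 1], bgr[..., 2]

# Normalize channels so the index is robust to brightness differences between images.
total = b + g + r + 1e-6
rn, gn, bn = r / total, g / total, b / total

# Excess-green index: high for green vegetation, low for bare soil.
exg = 2.0 * gn - rn - bn

# Scale to 8-bit and apply Otsu's threshold to split vegetation from soil.
exg_8bit = cv2.normalize(exg, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
_, veg_mask = cv2.threshold(exg_8bit, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# Fraction of pixels classified as vegetated (planted) area.
print("Vegetated fraction:", (veg_mask > 0).mean())
```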

7.2. Crop Growth Monitoring

Crop growth monitoring involves measuring changes in crop growth, vegetation cover, and element content. UAV-based crop growth monitoring has the potential to provide information for yield estimates and agricultural management decisions. As previously noted, much of the previous work in UAV-based precision agriculture has focused on estimating important agricultural attributes of staple or other widely grown crops (that is, rice, wheat, maize, etc.).
Many researchers currently use UAVs equipped with high-definition and multispectral cameras to quantify numerous crop-related vegetation indices (that is, the normalized difference vegetation index (NDVI), enhanced vegetation index (EVI), difference vegetation index (DVI), ratio vegetation index (RVI), soil regulated vegetation index (SRVI), green normalized vegetation index (GNVI), water band index (WBI), chlorophyll absorption ratio index (CARI), photochemical reflectance index (PRI) and red edge vegetation stress index (RVTI) [51]). This can be accomplished by combining UAVs with methods such as spectral angle mapping, maximum likelihood classification, wavelet fusion algorithms, Fisher discriminant analysis, and support vector machines to extract crop parameters (that is, crop emergence rate, plant height, biomass, nitrogen content, and chlorophyll content) during different growth periods. Using a Canon PowerShot G16 digital camera (Canon Inc., Tokyo, Japan) and an ADC Lite multispectral sensor (Tetracam Inc., Chatsworth, CA, USA) on a multi-rotor UAV, Gao et al. [52] built univariate and multivariate LAI inversion models from images of soybean plots, demonstrating that the bulging (seed-filling) stage was the best time for soybean LAI inversion and that NDVI linear regression modelling could predict soybean growth in real time. Pei et al. [53] established a high-precision inversion model for the wheat growth period by analyzing the correlations among comprehensive growth parameters extracted from UAV images. Other studies have shown that high-resolution UAV images of corn fields taken at different growth stages can be used to generate highly accurate measures of LAI [54] and that digital and multispectral UAV images contain the data necessary to accurately estimate rice yield [55].
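To make the index-based inversion workflow concrete, the sketch below computes NDVI from red and near-infrared bands and fits a simple linear regression against ground-measured LAI. All arrays and the resulting coefficients are illustrative placeholders, not values from the cited studies.

```python
import numpy as np

# Illustrative red and near-infrared reflectance bands (e.g., from a multispectral mosaic).
red = np.array([[0.08, 0.10], [0.12, 0.09]])
nir = np.array([[0.45, 0.50], [0.40, 0.48]])

# Normalized difference vegetation index, guarding against division by zero.
ndvi = (nir - red) / (nir + red + 1e-6)

# Hypothetical plot-level NDVI means paired with ground-measured LAI values.
ndvi_plots = np.array([0.55, 0.62, 0.70, 0.78, 0.83])
lai_plots = np.array([1.8, 2.4, 3.1, 3.9, 4.4])

# Ordinary least-squares fit: LAI ~ slope * NDVI + intercept.
slope, intercept = np.polyfit(ndvi_plots, lai_plots, deg=1)
lai_pred = slope * ndvi_plots + intercept
r2 = 1 - np.sum((lai_plots - lai_pred) ** 2) / np.sum((lai_plots - lai_plots.mean()) ** 2)
print(f"LAI = {slope:.2f} * NDVI + {intercept:.2f}, R^2 = {r2:.3f}")
```

In practice the regression would be calibrated against field LAI measurements for the specific crop, sensor and growth stage.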
Helge [56] was the first to use frame-type (snapshot) hyperspectral imagery from UAVs to study crop growth; Chinese scholars quickly followed suit. Qin et al. [28] demonstrated that UAV hyperspectral images could be used to calculate the total nitrogen content of rice leaves using a linear estimation model based on the ratio spectral index of the first-derivative reflectance at 738 and 522 nm. Infrared thermal imaging systems and micro-LiDAR systems mounted on UAV platforms can also be used to obtain crop growth information, but these methods have limitations (for example, high cost, low image signal-to-noise ratio, and blurred object edges) [57].
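The sketch below illustrates how such a first-derivative ratio index can be computed from a reflectance spectrum. The 738 nm and 522 nm wavelengths follow the cited study, but the synthetic spectrum and the regression coefficients are placeholders.

```python
import numpy as np

# Illustrative reflectance spectrum sampled every 2 nm from 400 to 1000 nm
# (synthetic shape with a green peak and a red-edge transition).
wavelengths = np.arange(400, 1001, 2, dtype=float)
reflectance = (0.05
               + 0.4 / (1 + np.exp(-(wavelengths - 720) / 15))
               + 0.05 * np.exp(-((wavelengths - 550) / 30) ** 2))

# First derivative of reflectance with respect to wavelength.
first_deriv = np.gradient(reflectance, wavelengths)

# Ratio spectral index of the first derivative at 738 nm and 522 nm.
d738 = first_deriv[np.argmin(np.abs(wavelengths - 738))]
d522 = first_deriv[np.argmin(np.abs(wavelengths - 522))]
ratio_index = d738 / d522

# Linear estimate of leaf nitrogen content; a and b would be calibrated on field samples.
a, b = 1.5, 0.2  # placeholder coefficients, not from the cited study
leaf_n = a * ratio_index + b
print(f"Ratio index = {ratio_index:.2f}, estimated leaf N = {leaf_n:.2f}")
```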

7.3. Monitoring Soil Water and Fertilizer in Cultivated Land

Soil water and fertilizer are essential for crop growth and yield, and new applications for monitoring these factors using UAV-LARS are under active development.
Soil moisture content (SMC) determines soil aggregate structure and nutrient status, and managing SMC in cultivated land through appropriate irrigation can improve soil conditions during critical crop growth periods and thereby improve the crop yield and quality. Wang et al. [29] developed a regional farmland SMC hyperspectral quantitative estimation model that uses UAV hyperspectral sensor data, providing a new method for remote sensing monitoring of SMC. Other researchers have calculated the crop water deficit index (strongly related to SMC) based on thermal infrared imaging, which can reveal the spatial distribution of crop water deficit [58]. Furthermore, thermal infrared images taken by the UAV can be used to calculate the water temperature comprehensive index (the sum of crop water stress index, canopy relative temperature difference and surface relative temperature difference) and quantify SMC at different soil depths [33].
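A common formulation underlying such thermal approaches is the crop water stress index, CWSI = (Tc - Twet) / (Tdry - Twet), where Tc is canopy temperature and Twet and Tdry are well-watered and non-transpiring reference temperatures. The sketch below applies this standard formula to an illustrative thermal patch; the temperatures are made up, and the cited studies use their own composite indices.

```python
import numpy as np

# Illustrative canopy temperature patch (degrees C) from a UAV thermal camera.
canopy_temp = np.array([[28.5, 30.2, 33.1],
                        [29.0, 31.8, 34.5],
                        [27.9, 32.4, 35.0]])

# Reference temperatures: well-watered (wet) and non-transpiring (dry) baselines (placeholders).
t_wet, t_dry = 26.0, 38.0

# Crop water stress index per pixel: 0 = unstressed, 1 = fully stressed.
cwsi = np.clip((canopy_temp - t_wet) / (t_dry - t_wet), 0.0, 1.0)

print("Mean CWSI over the field patch:", round(float(cwsi.mean()), 3))
```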
Soil organic matter (SOM) content is an important indicator of soil fertility. The visible and near-infrared (NIR) spectral region contains important information regarding hydrogen-bearing groups in SOM [59]. Using UAVs equipped with multispectral and hyperspectral sensors, Wang [60] built an empirical model that can effectively calculate SOM in cultivated land. This model factors in the relationships between flight altitude, imaging width, pixel scale, monitoring model, and monitoring accuracy by using wavelet transform, principal component analysis and partial least squares methods in a multi-scale diagnosis of farmland SOM content, and can be used to overcome the issue of low accuracy in SOM inversion. Guo et al. [61] used hyperspectral UAV data to create an SOM estimation model for paddy soil based on the original reflectance ratio index, which reduced the influence of soil background (e.g., water and straw) on the SOM calculation and substantially improved estimation accuracy. Overall, however, the complexity of SOM structure and composition, soil type, soil moisture, straw mulching, terrain and other factors make it difficult to build a high-precision model based on UAV images, and further research and refinement in this area are necessary.
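As a hedged illustration of the partial least squares (PLS) step mentioned above (the cited models additionally use wavelet transforms and PCA, which are omitted here), the following scikit-learn sketch regresses SOM content on hyperspectral reflectance using synthetic data:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in data: 120 soil samples x 200 spectral bands, plus SOM content (g/kg).
n_samples, n_bands = 120, 200
spectra = rng.normal(0.3, 0.05, size=(n_samples, n_bands))
som = 20 + 15 * spectra[:, 50] - 10 * spectra[:, 150] + rng.normal(0, 0.5, n_samples)

# Hold out part of the samples to check predictive accuracy.
X_train, X_test, y_train, y_test = train_test_split(spectra, som, test_size=0.3, random_state=0)

# Partial least squares regression with a small number of latent components.
pls = PLSRegression(n_components=5)
pls.fit(X_train, y_train)

print("Test R^2:", round(pls.score(X_test, y_test), 3))
```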

7.4. Diseases, Insect Pests and Weeds Identification

Diseases, insect pests, and weeds significantly restrict crop yield. According to the Food and Agriculture Organization of the United Nations (FAO), diseases and insect pests in China account for an average annual loss of 30% of total yield [62,63]. The spectra of afflicted and healthy crops differ substantially due to physiological changes in cell structure, pigments, water content, LAI and biomass caused by disease spots, wilting, defoliation, necrosis, slow growth and other symptoms. Generally, reflectance in the visible light (400–760 nm) region from infected plants is higher than that from healthy ones, while reflectance in the near-infrared (760–1000 nm) range is lower for infected plants [64], which is associated with the "blue shift" phenomenon. Furthermore, different diseases, insect pests and weeds produce slightly different spectral responses [57,65]. For example, Ma et al. [66] showed that the best spectral features for monitoring the degree of damage by the Chinese chestnut red mite are the low position and the red edge, with determination coefficients exceeding 0.6; using these two features, a blue shift of the characteristic wavelength could be detected even in mild red mite infestations. Tian et al. [67] showed that the rice leaf-roll rate was significantly negatively correlated with spectral reflectance at the green (560 nm), red-edge (717 nm) and NIR (840 nm) bands and positively correlated with reflectance at the red (668 nm) band. The leaf-roll rate was also negatively correlated with NDVI and DVI.
UAVs equipped with multispectral and hyperspectral sensors can be used to monitor crop diseases, insect infestations, and weeds. Hyperspectral images have higher spectral resolution than multispectral images and can capture detailed information on spectral properties, giving this tool a wider application in agricultural monitoring. Diseases such as wheat stripe rust have been identified based on the photochemical reflectance index (PRI) using hyperspectral UAV imaging [68], and an optimal index model of this disease was developed based on UAV-imaged wheat canopy reflectance [14]. Wang et al. [69] designed a small multi-rotor UAV system that identified rice disease with high accuracy and effectively reduced the influence of complex backgrounds such as rice leaf occlusion, panicle adhesion, and natural illumination. For insect pests, Cui [70] used the Akaike information criterion (AIC) to select characteristics of cotton field mites from remote sensing data that enable the identification of mite damage in real time. Low-altitude imaging (a DJI M100 equipped with a Zenmuse 100 PTZ camera (DJI company, Shenzhen, China)) in conjunction with convolutional neural network analysis has also been used to identify grass weed categories and to evaluate weed density [71]. Further studies found that an image resolution of 0.29 cm was optimal for high-accuracy weed detection in cotton fields [72]. While the identification of diseases and weeds (that is, discrimination of healthy vs. unhealthy crops) by UAV-LARS is common practice, classification accuracy still needs to be improved. Due to the camouflage and migration of pests, coupled with the complex field background, crop pests remain difficult to identify [57].
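As a minimal sketch of the CNN-based classification idea (not the networks, datasets or classes used in the cited studies), the following PyTorch example adapts a generic ImageNet-pretrained backbone to a hypothetical set of weed categories and classifies a single UAV image patch; the file path and class names are assumptions.

```python
import torch
import torch.nn as nn
from torchvision import models, transforms
from PIL import Image

# Hypothetical weed categories, not those of the cited studies.
classes = ["crabgrass", "barnyard grass", "sedge", "crop"]

# Start from a generic ImageNet-pretrained backbone and replace the classifier head.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, len(classes))
model.eval()  # assumed to have been fine-tuned on labeled UAV patches (training loop omitted)

# Standard ImageNet preprocessing for a single UAV image patch (hypothetical path).
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])
patch = preprocess(Image.open("weed_patch.jpg").convert("RGB")).unsqueeze(0)

# Predict the weed category for the patch.
with torch.no_grad():
    probs = torch.softmax(model(patch), dim=1)
print(classes[int(probs.argmax())], float(probs.max()))
```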

7.5. Natural Disaster Assessment

After agricultural disasters such as heavy rainfall, gales and hail, UAV-LARS can quickly procure images to accurately determine the extent of crop loss. With its high efficiency and low cost, UAV-LARS has become a foundational component of agricultural insurance loss estimation. At present, the identification of crop lodging by UAV is the most studied task. Multiple methods, including color and texture features, maximum likelihood classification, and image segmentation networks, are used to recognize lodged corn, wheat, cotton, and rice and to delineate the affected area [73,74,75,76,77]. Gan et al. [78] built a waterlogging disaster recognition model based on differences in corn canopy height calculated from UAV LiDAR data. To date, research into agricultural natural disaster assessment using UAV-LARS in China has been concentrated on plain areas, and monitoring accuracy remains low.
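A minimal sketch of the canopy-height-difference idea behind such lodging and waterlogging assessments (not the cited model): subtract a digital terrain model from a digital surface model to obtain a canopy height model, then flag pixels whose height falls well below the expected crop height. The grids and threshold below are illustrative.

```python
import numpy as np

# Illustrative 4 x 4 grids (meters): surface heights from LiDAR/SfM and bare-ground terrain heights.
dsm = np.array([[102.1, 102.2, 101.4, 101.3],
                [102.0, 102.1, 101.3, 101.2],
                [102.2, 102.0, 101.5, 101.4],
                [102.1, 102.2, 101.4, 101.3]])
dtm = np.full_like(dsm, 100.0)

# Canopy height model: crop height above ground.
chm = dsm - dtm

# Expected canopy height for a healthy crop at this stage (placeholder value).
expected_height = 2.0
damaged = chm < 0.7 * expected_height  # flag pixels with strongly reduced canopy height

print("Damaged (lodged/waterlogged) area fraction:", damaged.mean())
```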

8. Existing Problems

8.1. Endurance Capability

Because UAVs carry multiple sensors and are limited by lithium battery capacity and the UAV power-to-weight ratio, their endurance time is usually limited to about 30 min. Even under low-load conditions, endurance generally does not exceed 60 min, significantly limiting the area that can be monitored in a single flight. Multiple batteries are needed for large-scale remote sensing monitoring, increasing the cost, reducing the convenience of field operation, and overall limiting the development of UAV-LARS in agricultural applications [79]. As an example, the maximum safe endurance time for a UAV equipped with a Zenmuse X3 camera (247 g) (DJI company, Shenzhen, China) is about 20 min. When the flight altitude is 120 m and the overlap between UAV images is 70–80%, only 40–50 ha can be monitored in a single flight; if the payload weight increases, the monitoring area decreases further. Solving the endurance limitation is a pressing issue for the development of the UAV industry. Future breakthroughs in battery and fuel cell technology (such as hydrogen fuel cells and graphene batteries), power management chip technology, in-flight wireless charging, and other technologies will be important for continuing to expand UAV applications.
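To make the coverage arithmetic explicit, the sketch below estimates the area covered in a single flight from endurance, flight speed, ground swath width and side overlap. All values are illustrative assumptions, not manufacturer specifications; real mission planning also accounts for turns, forward overlap and battery reserves.

```python
# Illustrative flight-planning arithmetic for single-flight coverage (all values assumed).
endurance_min = 20          # usable flight time (min)
speed_m_s = 8.0             # cruise speed (m/s)
swath_m = 160.0             # image footprint width on the ground at the chosen altitude (m)
side_overlap = 0.75         # 75% side overlap between adjacent flight lines

# Effective new ground covered per flight line is the swath minus the overlapped part.
effective_swath_m = swath_m * (1.0 - side_overlap)

# Distance flown during the mission (ignoring turns and climb for simplicity).
distance_m = endurance_min * 60 * speed_m_s

# Covered area in hectares (1 ha = 10,000 m^2).
area_ha = distance_m * effective_swath_m / 10_000
print(f"Approximate single-flight coverage: {area_ha:.0f} ha")
```

With these assumed numbers the result is roughly 38 ha, which is consistent in magnitude with the 40–50 ha per flight figure quoted above.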

8.2. Safety of Air UAV Operation

In 2008, the “civil UAV pilot management regulations of China” introduced rules on UAV piloting. These regulations stipulate that only a UAV with an empty weight of less than 4 kg or a takeoff weight of less than 7 kg, flying at an altitude of less than 120 m, can be operated by a pilot without a UAV license. Regulations and legislation regarding UAV use are still imperfect, which is becoming an increasing issue as the number of civil UAVs in China rises rapidly. The “unregistered flight” phenomenon (flight without a UAV flight license or legal identity) has caused security risks such as the disruption of aviation order. In addition, complex geographical environments, climate perturbations, and power grid structures can lead to UAV crashes, resulting in economic losses and threats to public safety. UAV pilots should strictly follow the “Basic flight rules of the People’s Republic of China”, the “Civil UAV air traffic management measures” and other laws and regulations, apply for airspace in advance, plan flights reasonably, and operate with the required certificates in order to maximize the safety of UAV operations. The civil UAV supervision department should also establish and improve its systems of supervision and clearly delineate the cleared and no-fly zones designated by the regulatory department, enabling further development of UAV applications in agricultural monitoring.

8.3. Monitoring Effectiveness

The majority of images procured by UAVs are irregular, and real-time return has not yet been realized. Complex data processing and analysis can be conducted using professional software, but this approach is not optimal for users who require immediate monitoring results. In addition, the huge amount of data requires large storage space. With the development of 5G mobile communication technology, the transmission bandwidth of the UAV data link may be improved to enable long-distance, low-latency return of UAV data. Moving forward, commonly used algorithms for agricultural remote sensing monitoring require further refinement into interactive processing programs. Finally, new developments should ideally enable routine, real-time processing to be carried out on board the UAV based on characteristic bands or band combinations, providing accurate and efficient analysis of agricultural information [80].

9. Conclusions and Prospects

UAV-LARS is an emerging information monitoring technology that provides high spatial and temporal resolution. In this article, the development history, data processing, current applications and future prospects of UAV-LARS have been systematically reviewed. With the ongoing development of UAV technology and the rapid growth of the UAV industry, UAV-LARS has seen varying degrees of research and application in crop growth monitoring, yield prediction, and disaster monitoring in China. By analyzing the spatiotemporal dynamic changes in UAV images, crop growth and environmental information can be obtained in real time. Limitations of the technology clearly exist, such as low payload, short endurance time and a narrow image collection area. Nevertheless, UAV-LARS has significant advantages over ground-based surveys and satellite-based monitoring, including high mobility, low cost, near real-time application, and the ability to provide high-precision crop samples and fine texture information for precision agriculture. Overall, this method provides a promising platform for effective management of crops, soil, fertilization and irrigation.
Agricultural monitoring based on UAV-LARS started relatively late in China; however, it is foreseeable that applications will proliferate in the near future, driven by the flexibility of accurate, low-cost products. Given the complexity of application scenarios, UAV-LARS urgently needs to realize intelligence and automation (e.g., multi-agent UAV swarms [81]). Research and development on UAV and UAV-swarm self-charging, autonomous take-off and landing, automatic obstacle avoidance, intelligent tracking, scheduled cruising, real-time data return, automatic image interpretation and other technologies will further raise the level of low-altitude remote sensing applications in agricultural monitoring.

Author Contributions

H.Z. and L.W. contributed to the manuscript discussion and writing; T.T. gave comments on the manuscript and checked the writing; J.Y. contributed to the conception of the study. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Foundation of the Suzhou Academy of Agricultural Sciences (grant number 8111722) and the Suzhou Agricultural Science and Technology Innovation Project (grant number SNG2020072).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

No new data were created or analyzed in this study. Data sharing is not applicable to this article.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Wang, D.; Shao, Q.; Yue, H. Surveying Wild Animals from Satellites, Manned Aircraft and Unmanned Aerial Systems (UASs): A Review. Remote Sens. 2019, 11, 1308. [Google Scholar] [CrossRef] [Green Version]
  2. Tang, H.J. Progress and Prospect of Agricultural Remote Sensing Research. J. Agric. 2018, 8, 167–171. [Google Scholar] [CrossRef]
  3. Radoglou-Grammatikis, P.; Sarigiannidis, P.; Lagkas, T.; Moscholios, I. A Compilation of UAV Applications for Precision Agriculture. Comput. Netw. 2020, 172, 107148. [Google Scholar] [CrossRef]
  4. Araus, J.L.; Cairns, J.E. Field high-throughput phenotyping: The new crop breeding frontier. Trends Plant Sci. 2014, 19, 52–61. [Google Scholar] [CrossRef]
  5. Zappa, C.J.; Brown, S.M.; Laxague, N.J.M.; Dhakal, T.; Subramaniam, A. Using Ship-Deployed High-Endurance Unmanned Aerial Vehicles for the Study of Ocean Surface and Atmospheric Boundary Layer Processes. Front. Mar. Sci. 2020, 6, 777. [Google Scholar] [CrossRef]
  6. Bagaram, M.B.; Giuliarelli, D.; Chirici, G.; Giannetti, F.; Barbati, A. UAV Remote Sensing for Biodiversity Monitoring: Are Forest Canopy Gaps Good Covariates? Remote Sens. 2018, 10, 1397. [Google Scholar] [CrossRef]
  7. Matese, A.; Toscano, P.; Di Gennaro, S.F.; Genesio, L.; Vaccari, F.P.; Primicerio, J.; Belli, C.; Zaldei, A.; Bianconi, R.; Gioli, B. Intercomparison of UAV, Aircraft and Satellite Remote Sensing Platforms for Precision Viticulture. Remote Sens. 2015, 7, 2971–2990. [Google Scholar] [CrossRef] [Green Version]
  8. Zhang, C.; Kovacs, J.M. The application of small unmanned aerial systems for precision agriculture: A review. Precis. Agric. 2012, 13, 693–712. [Google Scholar] [CrossRef]
  9. Xiang, T.Z.; Xia, G.S.; Zhang, L. Mini-Unmanned Aerial Vehicle-Based Remote Sensing: Techniques, applications, and prospects. IEEE Geosc. Rem. Sen. M. 2019, 7, 29–63. [Google Scholar] [CrossRef] [Green Version]
  10. Torresan, C.; Berton, A.; Carotenuto, F.; Di Gennaro, S.F.; Gioli, B.; Matese, A.; Miglietta, F.; Vagnoli, C.; Zaldei, A.; Wallace, L. Forestry applications of UAVs in Europe: A review. Int. J. Remote Sens. 2017, 38, 2427–2447. [Google Scholar] [CrossRef]
  11. Pádua, L.; Vanko, J.; Hruška, J.; Adão, T.; Sousa, J.J.; Peres, E.; Morais, R. UAS, sensors, and data processing in agroforestry: A review towards practical applications. Int. J. Remote Sens. 2017, 2349–2391. [Google Scholar] [CrossRef]
  12. Stanisław, A.; Dawid, P.; Roman, U. Unmanned aerial vehicles for environmental monitoring with special reference to heat loss. E3S Web Conf. 2017, 19, 02005. [Google Scholar] [CrossRef] [Green Version]
  13. Yan, L.; Liao, X.H.; Zhou, C.H.; Fan, B.K.; Gong, J.Y.; Cui, P.; Zheng, Y.Q.; Tan, X. The impact of UAV remote sensing technology on the industrial development of China: A review. J. Geo Inf. Sci. 2019, 21, 476–495. [Google Scholar] [CrossRef]
  14. Yu, C.W. Development of civil UAV in China. Robot. Ind. 2017, 1, 52–58. [Google Scholar] [CrossRef]
  15. Bai, Y.L.; Jin, J.Y.; Yang, L.P.; Zhang, N.; Wang, L. Low altitude remote sensing technology and its application in precision agriculture. Soil Fert. Sci. Chin. 2004, 1, 3–6, 52. [Google Scholar] [CrossRef]
  16. Ma, R.S.; Sun, H.; Ma, L.J.; Lin, Z.G.; Wu, C.H. Land use Survey Based on Image from Miniature Unmanned Aerial Vehicle. Remote Sens. Inf. 2006, 1, 43–45. [Google Scholar] [CrossRef]
  17. Xie, C.X.; Chen, S.L.; Lin, Z.J.; Zhou, Y.Q.; Li, Y. Application of UAV remote sensing technology in investigation of medicinal plant resources. Mod. Chin. Med. 2007, 9, 4–6. [Google Scholar] [CrossRef]
  18. Li, J.Y.; Zhang, T.M.; Peng, X.D.; Yan, G.Q.; Chen, Y. The application of small UAV(SUAV) in Farmland information monitoring system. J. Agric. Mech. Res. 2010, 32, 183–186. [Google Scholar] [CrossRef]
  19. Leng, W.F.; Wang, H.G.; Xu, Y.; Ma, Z.H. Preliminary study on monitoring wheat stripe rust with using UAV. Acta Phytopath. Sin. 2012, 42, 202–205. [Google Scholar] [CrossRef]
  20. Mukherjee, A.; Misra, S.; Raghuwanshi, N.S. A survey of unmanned aerial sensing solutions in precision agriculture. J. Netw. Comput. Appl. 2019, 148, 102461. [Google Scholar] [CrossRef]
  21. Civil Aviation Administration of China. Interim Regulations on Flight Management of Unmanned Aerial Vehicles. 2018. Available online: http://www.caac.gov.cn/HDJL/YJZJ/201801/t20180126_48853.html (accessed on 23 March 2021).
  22. Park, M.; Lee, S.G.; Lee, S. Dynamic Topology Reconstruction Protocol for UAV Swarm Networking. Symmetry 2020, 12, 1111. [Google Scholar] [CrossRef]
  23. Anthony, D.; Elbaum, S.; Lorenz, A.; Detweiler, C. On crop height estimation with UAVs. IEEE RSJ Int. Conf. Intell. Robots Syst. 2014. [Google Scholar] [CrossRef] [Green Version]
  24. Sun, G.; Huang, W.J.; Chen, P.F.; Gao, S.; Wang, X. Advances in UAV-based Multispectral Remote Sensing Applications. Trans. Chin. Soc. Agric. Mach. 2018, 49, 1–17. [Google Scholar] [CrossRef]
  25. Li, B.; Liu, R.Y.; Liu, S.H.; Liu, Q.; Liu, F.; Zhou, G.Q. Monitoring vegetation coverage variation of winter wheat by low-altitude UAV remote sensing system. Trans. Chin. Soc. Agric. Eng. 2012, 28, 160–165. [Google Scholar] [CrossRef]
  26. Mao, Z.H.; Deng, L.; Sun, J.; Zhang, A.W.; Chen, X.Y.; Zhao, Y. Research on the application of UAV multispectral remote sensing in the maize chlorophyll prediction. Spectrosc. Spect. Anal. 2018, 38, 2923–2931. [Google Scholar] [CrossRef]
  27. Wei, Q.; Zhang, B.Z.; Wei, Z.; Han, X.; Duan, C.F. Estimation of Canopy Chlorophyll Content in Winter Wheat by UAV Multispectral Remote Sensing. J. Triticeae Crop. 2020, 3, 365–372. [Google Scholar] [CrossRef]
  28. Qin, Z.F.; Chang, Q.R.; Xie, B.N.; Shen, J. Rice leaf nitrogen content estimation based on hysperspectral imagery of UAV in Yellow River diversion irrigation district. Trans. Chin. Soc. Agric. Eng. 2016, 32, 77–85. [Google Scholar] [CrossRef]
  29. Wang, J.Z.; Ding, J.L.; Ma, X.K.; Ge, X.Y.; Liu, B.H.; Liang, J. Detection of Soil Moisture Content Based on UAV-derived hyperspectral imagery and spectral index in oasis cropland. Trans. Chin. Soc. Agric. Mach. 2018, 49, 164–172. [Google Scholar] [CrossRef]
  30. Liang, H.; Liu, H.H.; He, J. Rice Photosynthetic performance monitoring system based on UAV hyperspectral. J. Agric. Mech. Res. 2020, 42, 214–218. [Google Scholar] [CrossRef]
  31. Bian, J. Diagnostic Model for Crops Moisture Status Based on UAV Thermal Infrared. Master’s Thesis, Northwest A&F University, Yanglin, China, 2019. [Google Scholar]
  32. Duan, C.F.; Hu, Z.H.; Wei, Z.; Zhang, B.Z.; Chen, H.; Li, R. Estimation of summer maize evapotranspiration and its influencing factors based on UAVs thermal infrared remote sensing. Water Saving Irrigation 2019, 12, 110–116. [Google Scholar]
  33. Zhang, Z.T.; Xu, C.H.; Tan, C.X.; Bian, J.; Han, W.T. Influence of coverage on soil moisture content of field corn inversed from thermal infrared remote sensing of UAV. Trans. Chin. Soc. Agric. Mach. 2019, 50, 213–225. [Google Scholar] [CrossRef]
  34. Wang, K.L. Cotton Canopy Information Recognition Based on Visible Light and Thermal Infrared Image of UAV. Master’s Thesis, Chinese Academy of Agricultural Sciences, Beijing, China, 2019. [Google Scholar]
  35. Yang, F. Estimation of Winter Wheat Aboveground Biomass with UAV LiDAR and Hyperspectral Data; Xi’an University of Science and Technology: Xi’an, China, 2017. [Google Scholar]
  36. Chen, H. LAI Inversion Method for Crop Based on LiDAR and Multispectral Remote Sensing. Master’s Thesis, Shihezi University, Shihezi, China, 2018. [Google Scholar]
  37. Hruska, R.; Mitchell, J.; Anderson, M.; Glenn, N.F. Radiometric and Geometric Analysis of Hyperspectral Imagery Acquired from an Unmanned Aerial Vehicle. Remote Sens. 2012, 4, 2736–2752. [Google Scholar] [CrossRef] [Green Version]
  38. Chen, P.F.; Xu, X.G. A comparison of photogrammetric software packages for mosaicking unmanned aerial vehicle (UAV) images in agricultural application. Acta Agron. Sin. 2020, 46, 1112–1119. [Google Scholar] [CrossRef]
  39. Lowe, D.G. Object Recognition from Local Scale-Invariant Features. In Proceedings of the International Conference on Computer Vision, Kerkyra, Greece, 20–27 September 1999; pp. 1150–1157. [Google Scholar] [CrossRef]
  40. Lowe, D.G. Distinctive Image Features from Scale-Invariant Keypoints. Int. J. Comput. Vis. 2004, 60, 91–110. [Google Scholar] [CrossRef]
  41. Yang, J.B.; Jiang, Y.T.; Yang, X.B.; Guo, G.M. A fast mosaic algorithm of UAV images based on Dense SIFT feature matching. J. Geo Inf. Sci. 2019, 21, 588–599. [Google Scholar] [CrossRef]
  42. Li, F.F.; Xiao, B.L.; Jia, Y.H.; Mao, X.L. Improved SIFT algorithm and its application in automatic registration of remotely-sensed imagery. Geomat. Inf. Sci. Wuhan Univ. 2009, 34, 1245–1249. [Google Scholar] [CrossRef] [Green Version]
  43. Zhao, Z.; Ling, X.; Sun, C.K.; Li, Y.Z. UAV tilted images matching research based on POS. Remote Sens. Land Resour. 2016, 28, 87–92. [Google Scholar] [CrossRef]
  44. Chen, X.L.; Chen, X.L.; Peng, Y.Y. The key technology research on image processing in unmanned aerial vehicle. Beijing Surv. Mapp. 2016, 3, 24–27. [Google Scholar] [CrossRef]
  45. Jia, Y.J.; Xu, Z.A.; Su, Z.B.; Rizwan, A.M. Mosaic of crop remote sensing images from UAV based on improved SIFT algorithm. Trans. Chin. Soc. Agric. Eng. 2017, 33, 123–129. [Google Scholar] [CrossRef]
  46. Ma, L.J.; Ma, R.S.; Lin, Z.G.; Wu, C.H.; Sun, H. Application of micro UAV Remote Sensing. J. Meteorol. Res. Appl. 2005, 26, 180–181. [Google Scholar] [CrossRef]
  47. Wang, Y.H.; Zhang, Y.Y.; Men, L.J.; Liu, B. UAV survey in the third national land survey application of pilot project in Gansu. Geomat. Spat. Inf. Technol. 2019, 42, 219–221. [Google Scholar] [CrossRef]
  48. Lei, Y.; Zhou, W.Z. Application prospect of UAV tilt photogrammetry technology in land survey. Resour. Habitant Environ. 2019, 7, 11–13. [Google Scholar] [CrossRef]
  49. Yu, K.; Shan, J.; Wang, Z.M.; Lu, B.H.; Qiu, L.; Mao, L.J. Land use status monitoring in small scale by unmanned aerial vehicles (UAVs) observations. Jiangsu Agric. Sci. 2019, 35, 853–859. [Google Scholar] [CrossRef]
  50. Xu, P.; Xu, W.C.; Luo, Y.F.; Han, Y.W.; Wang, J.Y. Precise classification of cultivated land based on visible remote sensing image of UAV. J. Agric. Sci. Tech. 2019, 21, 79–86. [Google Scholar] [CrossRef]
  51. Liu, Z.; Wan, W.; Huang, J.Y.; Han, Y.W.; Wang, J.Y. Progress on key parameters inversion of crop growth based on unmanned aerial vehicle remote sensing. Trans. Chin. Soc. Agric. Eng. 2018, 34, 60–71. [Google Scholar] [CrossRef]
  52. Gao, L.; Yang, G.J.; Wang, B.S.; Yu, H.Y.; Xu, B.; Feng, H.K. Soybean leaf area index retrieval with UAV (unmanned aerial vehicle) remote sensing imagery. Chin. J. Eco Agric. 2015, 23, 868–876. [Google Scholar] [CrossRef]
  53. Pei, H.J.; Feng, H.K.; Li, C.C.; Jin, X.L.; Li, Z.H.; Yang, G.J. Remote sensing monitoring of winter wheat growth with UAV based on comprehensive index. Trans. Chin. Soc. Agric. Eng. 2017, 33, 74–82. [Google Scholar] [CrossRef]
  54. Chu, H.L.; Xiao, Q.; Bai, J.H.; Cheng, J. The retrieval of leaf area index based on remote sensing by unmanned aerial vehicle. Remote Sens. Tech. Appl. 2017, 32, 140–148. [Google Scholar] [CrossRef]
  55. Zhou, X.; Zheng, H.; Xu, X.; He, J.Y.; Ge, X.K.; Yao, X.; Cheng, T.; Zhu, Y.; Cao, W.X.; Tian, Y.C. Predicting grain yield in rice using multi-temporal vegetation indices from UAV-based multispectral and digital imagery. ISPRS J. Photogramm. 2017, 130, 246–255. [Google Scholar] [CrossRef]
  56. Helge, A.; Andreas, B.; Andreas, B.; Georg, B. Generating 3D hyperspectral information with lightweight UAV snapshot cameras for vegetation monitoring: From camera calibration to quality assurance. ISPRS J. Photogramm. 2015, 108, 245–259. [Google Scholar] [CrossRef]
  57. Lan, Y.; Deng, X.; Zeng, G. Advances in diagnosis of crop diseases, pests and weeds by UAV remote sensing. Smart Agric. 2019, 1, 1–19. [Google Scholar] [CrossRef]
  58. Chen, Z.; Ma, C.Y.; Sun, H.; Cheng, Q.; Duan, F.Y. The inversion methods for water stress of irrigation crop based on unmanned aerial vehicle remote sensing. China Agric. Inf. 2019, 31, 23–35. [Google Scholar] [CrossRef]
  59. Song, H.Y. Detection of Soil by Near Infrared Spectroscopy; Chemical Industry Press: Beijing, China, 2013. [Google Scholar]
  60. Wang, L. A Research About Remote Sensing Monitoring Method of Soil Organic Matter Based on Imaging Spectrum Technology. Master’s Thesis, Henan Polytechnic University, Zhengzhou, China, 2016. [Google Scholar]
  61. Guo, H.; Zhang, X.; Lu, Z.; Tian, T.; Xu, F.F.; Luo, M.; Wu, Z.G.; Sun, Z.J. Estimation of organic matter content in southern paddy soil based on airborne hyperspectral images. J. Agric. Sci. Tech. 2020, 22, 60–71. [Google Scholar]
  62. Deutsch, C.A.; Tewksbury, J.J.; Tigchelaar, M.; Battisti, D.S.; Merrill, S.C.; Huey, R.B.; Naylor, R.L. Increase in crop losses to insect pests in a warming climate. Science 2018, 361, 916–919. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  63. Huang, W.; Shi, Y.; Dong, Y.; Zhang, Y.J.; Liu, L.Y.; Wang, J.H. Progress and prospects of crop diseases and pests monitoring by remote sensing. Smart Agric. 2019, 1, 1–11. [Google Scholar] [CrossRef]
  64. Zhao, H.; Yang, C.; Guo, W.; Zhang, L.; Zhang, D. Automatic Estimation of Crop Disease Severity Levels Based on Vegetation Index Normalization. Remote Sens. 2020, 12, 1930. [Google Scholar] [CrossRef]
  65. Viera-Torres, M.; Sinde-González, I.; Gil-Docampo, M.; Bravo-Yandún, V.; Toulkeridis, T. Generating the baseline in the early detection of bud rot and red ring disease in oil palms by geospatial technologies. Remote Sens. 2020, 12, 3229. [Google Scholar] [CrossRef]
  66. Ma, S.Y.; Guo, Z.Z.; Wang, S.T.; Zhang, K. Hyperspectral remote sensing monitoring of Chinese chestnut red mite diseases and insect pests in UAV. Trans. Chin. Soc. Agric. Mach. 2021, 1–12. Available online: https://kns.cnki.net/kcms/detail/11.1964.S.20210204.1844.004.html (accessed on 23 March 2021).
  67. Tian, M.L.; Ban, S.T.; Yuan, T.; Wang, Y.Y.; Ma, C.; Li, L.Y. Monitoring of rice damage by rice leaf roller using UAV-based remote sensing. Acta Agric. Shanghai 2020, 36, 137–142. [Google Scholar] [CrossRef]
  68. Huang, W.; Lamb, D.W.; Niu, Z.; Zhang, Y.J.; Liu, L.Y.; Wang, J.H. Identification of yellow rust in wheat using in-situ spectral reflectance measurements and airborne hyperspectral imaging. Precis. Agric. 2007, 8, 187–197. [Google Scholar] [CrossRef]
  69. Wang, Z.; Chu, G.K.; Zhang, H.J.; Liu, S.X.; Huang, X.C.; Gao, F.R.; Zhang, C.Q.; Wang, J.X. Identification of diseased empty rice panicles based on Haar-like feature of UAV optical image. Trans. Chin. Soc. Agric. Eng. 2018, 34, 73–82. [Google Scholar] [CrossRef]
  70. Cui, M.N. Study on Dynamic Monitoring of Cotton Spider Mites Based on Remote Sensing of UAV. Master’s Thesis, Shihezi University, Shihezi, China, 2019. [Google Scholar]
  71. Wang, S.B.; Han, Y.; Chen, J.; Pan, Y.; Cao, Y.; Meng, H. Weed classification of remote sensing by UAV in ecological irrigation areas based on deep learning. J. Drain. Irrig. Mach. Eng. 2018, 36, 1137–1141. [Google Scholar] [CrossRef]
  72. Xue, J.L.; Dai, J.G.; Zhao, Q.Z.; Zhang, G.S.; Cui, M.N.; Jiang, N. Cotton field weed detection based on low-altitude drone image and YOLOv3. J. Shihezi Univ. Nat. Sci. 2019, 37, 21–27. [Google Scholar] [CrossRef]
  73. Dong, J.H.; Yang, X.D.; Gao, L.; Wang, B.S.; Wang, L. Information extraction of winter wheat lodging area based on UAV remote sensing image. Heilongjiang Agric. Sci. 2016, 147–152. [Google Scholar] [CrossRef]
  74. Tian, M.L.; Ban, S.T.; Chang, Q.R.; You, M.M.; Luo, D.; Wang, L.; Wang, S. Use of hyperspectral images from UAV-based imaging spectroradiometer to estimate cotton leaf area index. Trans. Chin. Soc. Agric. Eng. 2016, 32, 102–108. [Google Scholar] [CrossRef]
  75. Zheng, E.G.; Tian, Y.F.; Chen, T. Region extraction of corn lodging in UAV images based on deep learning. J. Henan Agric. Sci. 2018, 47, 155–160. [Google Scholar] [CrossRef]
  76. Zhang, X.L.; Guan, H.X.; Liu, H.J.; Meng, X.T.; Yang, H.X.; Ye, Q.; Yu, W.; Zhang, H.S. Extraction of maize lodging area in mature period based on UAV multispectral image. Trans. Chin. Soc. Agric. Eng. 2019, 35, 98–106. [Google Scholar] [CrossRef]
  77. Dai, J.G.; Zhang, G.S.; Guo, P.; Zeng, T.J.; Cui, M.N.; Xue, J.L. Information extraction of cotton lodging based on multi-spectral image from UAV remote sensing. Trans. Chin. Soc. Agric. Eng. 2019, 35, 63–70. [Google Scholar] [CrossRef]
  78. Gan, P.; Dong, Y.S.; Sun, L.; Yang, G.J.; Li, Z.H.; Yang, F.; Wang, L.Z.; Wang, J.W. Evaluation of Maize Waterlogging Disaster Using UAV LiDAR Data. Sci. Agric. Sin. 2017, 50, 2983–2992. [Google Scholar] [CrossRef]
  79. Zhou, H.; Yan, Z.X. Application of OneButton Software in Remote Sensing Image Processing. Bull. Surv. Mapp. 2017, S1, 59–61, 78. [Google Scholar] [CrossRef]
  80. Su, R.D. Application of UAV in modern agriculture. Jiangsu Agric. Sci. 2019, 47, 75–79. [Google Scholar] [CrossRef]
  81. Bajo, J.; Hallenborg, K.; Pawlewski, P.; Botti, V.; Sánchez-Pi, N.; Duque Méndez, N.D.; Lopes, F.; Julian, V. [Communications in Computer and Information Science] Highlights of Practical Applications of Agents, Multi-Agent Systems, and Sustainability—The PAAMS Collection. In Proceedings of the International Workshops of PAAMS 2015, Salamanca, Spain, 3–4 June 2015; Volume 524. [Google Scholar] [CrossRef]
Figure 1. Composition of the unmanned aerial vehicle low-altitude remote sensing (UAV-LARS) monitoring system.
Figure 2. Publication trend of UAV-LARS applications in agricultural monitoring of China (1995–2019).
Figure 3. Top ten high-frequency words of UAV-LARS applications in agricultural monitoring of China (1995–2019).
Table 1. Characteristics of unmanned aerial vehicles (UAVs) for agricultural remote sensing monitoring.
  • Fixed-wing. Benefits: long endurance; large load; fast flight speed; large operation range. Limitations: takeoff needs a run-up; landing needs a glide; no hovering capability.
  • Multirotor. Benefits: flies horizontally and vertically; vertical takeoff and landing; hovering at a given location; autonomous navigation; simple structure. Limitations: short endurance time; small load; poor resistance to harsh environments.
  • Unmanned helicopter. Benefits: vertical takeoff and landing; hovering at a given location; flight stability. Limitations: complex wing structure; high maintenance cost.
Table 2. Comparison of small multirotor UAVs (<15 kg) of the DJI company.

Type       No. of Propellers   Fuselage Weight (kg)   Endurance Time (min)   Payload (kg)   Price Range (USD)
M210       4                   4.8                    38                     2.3            5000–15,000
M600 Pro   6                   9.1                    38                     6.0            4999–15,000
S800       6                   3.7                    16                     2.5            1800
S900       6                   3.3                    18                     4.9            1300
S1000      8                   4.4                    15                     5.6            3400
Table 3. Different types of sensors used in agricultural monitoring.
  • Digital camera: DJI ZENMUSE Z3 (DJI Technology Co., Ltd., Shenzhen, China), Canon 5D Mark II (Canon Inc., Tokyo, Japan), Nikon D800E (Nikon Corp., Tokyo, Japan), SONY α7r (Sony Corp., Tokyo, Japan), PhaseOne IQ180 and PhaseOne iXM (PhaseOne Corp., Copenhagen, Denmark), Hasselblad H6D-100c (F. W. Hasselblad and Co., Gothenburg, Sweden). Characteristics: 10–100 million pixels; small- and middle-size frame; weight 100–2500 g. Memory: 50–2000 Mb/min. Price: 900–35,000 USD. Software: Pix4Dmapper (Pix4D SA, Lausanne, Switzerland), PhotoScan (Agisoft LLC, St. Petersburg, Russia), OneButton (Research System Inc., Manassas, VA, USA). Citations: [24].
  • Multispectral imager: Parrot Sequoia (Parrot Inc., Paris, France), MicaSense RedEdge (MicaSense Inc., Seattle, WA, USA), Tetracam ADC (Tetracam Inc., Chatsworth, CA, USA), Cubert S128 (Cubert GmbH, Ulm, Germany), DJI multispectral camera (DJI Technology Co., Ltd., Shenzhen, China). Characteristics: high automation; staring imaging; weight 30–700 g; spectral range 400–1100 nm. Memory: 800–4000 Mb/min. Price: 5000–16,000 USD. Software: Pix4Dmapper, PhotoScan, OneButton, ICE (Microsoft Corp., Redmond, WA, USA). Citations: [25,26,27].
  • Hyperspectral imager: Cubert UHD185 (Cubert GmbH, Ulm, Germany), Headwall Nano-Hyperspec (Headwall Photonics Inc., Fitchburg, WI, USA), SENOP RIKOLA (Senop Oy, Kangasala, Finland), Gaiasky-mini (Sichuan Dualix Spectral Imaging Technology Co., Ltd., Chengdu, China). Characteristics: spectral range 350–2500 nm; spectral sampling interval 0.4–4.5 nm; spectral resolution 4–10 nm; 100–400 spectral bands; weight 400–2000 g. Memory: 600–3000 Mb/min. Price: 70,000–150,000 USD. Software: Pix4Dmapper, PhotoScan, OneButton, ICE. Citations: [28,29,30].
  • Thermal infrared imager: DJI XT TIR camera (DJI Technology Co., Ltd., Shenzhen, China), Vue Pro 640R (FLIR Systems Inc., Wilsonville, OR, USA), FLIR Tau2 camera (FLIR Systems Inc., Wilsonville, OR, USA), FLIR ThermoCAM SC3000 (FLIR Systems Inc., Wilsonville, OR, USA), Optris PI450 (Optris GmbH, Berlin, Germany). Characteristics: spectral range 3.5–13.5 μm; spatial resolution 640 × 512 pixels; temperature resolution 0.05 °C; temperature range −20–100 °C; ground spatial resolution <10 cm. Memory: 3–100 Mb/min. Price: 10,000–15,000 USD. Software: Pix4Dmapper, PhotoScan. Citations: [31,32,33,34].
  • LiDAR: RIEGL VUX-1 (RIEGL Laser Measurement Systems GmbH, Vienna, Austria). Characteristics: weight 3600 g; wavelength 1550 nm; spot diameter 25 mm. Memory: 1000–60,000 Mb/min. Price: 150,000–200,000 USD. Software: LiDAR360 (Beijing Digital Green Earth Technology Co., Ltd., Beijing, China), CloudStation (YellowScan company, Montpellier, France). Citations: [35,36].
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
