Review

Advances in UAV Remote Sensing for Monitoring Crop Water and Nutrient Status: Modeling Methods, Influencing Factors, and Challenges

1 College of Water Resources and Architectural Engineering, Northwest A&F University, Xianyang 712100, China
2 Key Laboratory of Agricultural Soil and Water Engineering in Arid and Semiarid Areas, Ministry of Education, Northwest A&F University, Xianyang 712100, China
3 Xinjiang Research Institute of Agriculture in Arid Areas, Northwest A&F University, Xianyang 712100, China
* Author to whom correspondence should be addressed.
Plants 2025, 14(16), 2544; https://doi.org/10.3390/plants14162544
Submission received: 30 June 2025 / Revised: 8 August 2025 / Accepted: 12 August 2025 / Published: 15 August 2025

Abstract

With the advancement of precision agriculture, Unmanned Aerial Vehicle (UAV)-based remote sensing has been increasingly employed for monitoring crop water and nutrient status due to its high flexibility, fine spatial resolution, and rapid data acquisition capabilities. This review systematically examines recent research progress and key technological pathways in UAV-based remote sensing for crop water and nutrient monitoring. It provides an in-depth analysis of UAV platforms, sensor configurations, and their suitability across diverse agricultural applications. The review also highlights critical data processing steps—including radiometric correction, image stitching, segmentation, and data fusion—and compares three major modeling approaches for parameter inversion: vegetation index-based, data-driven, and physically based methods. Representative application cases across various crops and spatiotemporal scales are summarized. Furthermore, the review explores factors affecting monitoring performance, such as crop growth stages, spatial resolution, illumination and meteorological conditions, and model generalization. Despite significant advancements, current limitations include insufficient sensor versatility, labor-intensive data processing chains, and limited model scalability. Finally, the review outlines future directions, including the integration of edge intelligence, hybrid physical–data modeling, and multi-source, three-dimensional collaborative sensing. This work aims to provide theoretical insights and technical support for advancing UAV-based remote sensing in precision agriculture.

1. Introduction

Crop water and nutrient status are critical determinants of agricultural productivity. Accurately characterizing their spatial and temporal distribution is essential for enhancing water use efficiency, optimizing fertilization practices, and ensuring food security [1,2]. Conventional monitoring of water and nutrient status has relied on manual sampling, point-based measurements, and empirical judgment. These methods are labor-intensive, lack temporal responsiveness, and offer limited spatial representativeness, rendering them inadequate for large-scale, high-frequency, and precision agricultural management. To improve the timeliness and accuracy of water and nutrient monitoring, remote sensing—characterized by its non-contact nature, wide spatial coverage, and operational efficiency—has become a critical tool for agricultural information acquisition.
Unmanned Aerial Vehicles (UAVs), with their flexible flight capabilities, operational simplicity, and cost-effectiveness, have become a key focus in agricultural remote sensing [3,4]. Compared with satellite- and manned aircraft-based systems, UAV-based remote sensing provides higher spatial resolution, greater temporal flexibility, and better repeatability, making it particularly well-suited for fine-scale agricultural monitoring at the field level [5]. Equipped with multispectral, hyperspectral, thermal infrared (TIR), and microwave sensors, UAVs can rapidly acquire multidimensional data, including canopy structure, spectral reflectance, and temperature distribution. This enables accurate diagnosis of crop water stress, nitrogen status, and optimal fertilization timing [6,7,8].
Advances in sensor technology, flight control systems, and data processing algorithms have driven substantial progress in UAV-based remote sensing for monitoring crop water and nutrient status. Methodological frameworks based on vegetation indices (VIs), thermal indicators, spectral inversion models, and multi-source data-driven algorithms have enabled high-precision estimation of crop parameters such as water status, nitrogen content, and canopy structure [9,10,11,12,13]. The quality and utility of UAV-based remote sensing data have been markedly enhanced by techniques such as image stitching, radiometric correction, segmentation, and data fusion [14].
Despite these advances, several challenges persist in scaling UAV-based remote sensing applications to large agricultural areas. Key challenges include the influence of crop growth stages and canopy coverage on spectral responses, scale mismatches and mixed pixel effects, environmental interference from illumination and meteorological conditions, and the complexity and low automation of current data processing workflows. In particular, the stability and adaptability of remote sensing models require further improvement to perform reliably across diverse crops, regions, and management conditions.
Based on this, the review aims to comprehensively examine the key technologies and research progress of UAV-based remote sensing in monitoring crop water and nutrient status. It systematically reviews the principles, advantages, and application scenarios of different monitoring methods, while analyzing the challenges and development trends in practical applications. To enhance the clarity of the technical workflow, a conceptual framework (Figure 1) is proposed, illustrating the end-to-end process of UAV-based remote sensing for crop monitoring—from sensor selection and data acquisition, through preprocessing and modeling, to field-level decision support. This review seeks to provide theoretical insights and technical references for the development of an efficient, intelligent, and scalable agricultural water and nutrient monitoring system.

2. Data Acquisition and Processing

High-quality data acquisition and standardized processing are fundamental to UAV-based remote sensing for monitoring crop water and nutrient status. UAV platforms equipped with various types of remote sensing sensors can capture multidimensional information reflecting crop physiological conditions and field environments. These data support parameter extraction, spatiotemporal variation analysis, and diagnostic model development for crop water and nutrient monitoring. However, raw remote sensing data are often affected by platform motion, atmospheric conditions, and illumination variability, leading to radiometric errors, geometric distortions, and noise contamination, which constrain their direct applicability. Therefore, preprocessing steps—such as radiometric correction, image stitching, image segmentation, and multi-source data fusion—are required to convert raw imagery into standardized data products suitable for quantitative analysis, thereby providing a reliable foundation for accurate and efficient monitoring of crop water and nutrient dynamics. This section outlines the main UAV platforms and remote sensing sensors currently used for monitoring crop water and nutrient status and introduces the key steps in remote sensing data processing.

2.1. UAV Platforms

In agricultural remote sensing, the choice of UAV platform directly affects the efficiency, cost, and suitability of data acquisition in different contexts. UAVs are commonly classified by rotor configuration into multirotor, fixed-wing, and hybrid-wing types [15]. The main technical parameters of these UAV platforms are summarized in Table 1, providing a quick comparison of endurance, altitude capability, cost, and maintenance requirements.
Multirotor UAVs typically have more than three rotors, with common configurations including quadcopters, hexacopters, and octocopters. Lift is generated by high-speed rotor rotation driven by electric motors, and flight attitude is controlled by adjusting individual rotor speeds. These UAVs offer advantages in vertical takeoff, landing, and stable hovering. However, their endurance is limited by battery capacity and power system constraints. They are primarily used for high-resolution data acquisition at the field scale.
Fixed-wing UAVs utilize an aerodynamic design similar to that of conventional aircraft, generating lift through airflow velocity differences above and below the wings. With their efficient aerodynamic structure, these UAVs offer advantages such as long endurance, high payload capacity, and extended flight range. However, they require longer runways for takeoff and landing or depend on catapult launches and gliding, which imposes greater demands on field conditions. They are mainly used for high-resolution data acquisition over irrigation districts and small watersheds.
Hybrid-wing UAVs integrate the advantages of both multirotor and fixed-wing platforms. During takeoff and landing, they operate in multirotor mode to enable vertical takeoff and landing without requiring designated runways. During cruising, they switch to fixed-wing mode, utilizing aerodynamic lift for efficient long-distance flight. This design combines the flexibility of multirotor systems and the long endurance and range of fixed-wing platforms, enhancing operational efficiency and spatial coverage while maintaining adaptability in complex environments. However, their complex structure results in higher manufacturing costs and greater maintenance requirements. They are mainly used for applications requiring both vertical takeoff and long-range flights under complex field conditions.

2.2. Remote Sensing Payload

Remote sensing sensors are essential payloads for acquiring data on surface features. Based on their operating spectral bands, they are classified into optical, TIR, and microwave types, each providing complementary data for monitoring crop water and nutrient status. The key technical parameters and representative applications of these sensor types are summarized in Table 2, which provides a concise overview of their spectral ranges, typical payloads, and main agricultural monitoring uses.
Among optical sensors, multispectral and hyperspectral sensors are the most commonly used. Multispectral sensors detect reflected solar radiation in the visible and near-infrared (NIR) range (0.4–1.1 μm). Owing to their relatively low cost and high spatial and temporal resolution, they are extensively used in agricultural water and nutrient monitoring, particularly for calculating various VIs to assess crop water stress, nitrogen status, and leaf area index (LAI) [16,17,18]. Compared with multispectral sensors, hyperspectral sensors capture hundreds of contiguous narrow spectral bands spanning the 0.4–2.5 μm range. Their high spectral resolution enables superior inversion accuracy in applications such as soil organic matter estimation, crop nutrient deficiency diagnosis, and crop water stress assessment [19,20,21].
TIR sensors detect longwave thermal radiation emitted from the land surface, typically in the 8–14 μm range, enabling retrieval of land surface temperature (LST). This capability makes them essential for detecting crop water stress and estimating evapotranspiration. Under water stress, reduced transpiration causes canopy temperature to rise. TIR sensors are sensitive to these thermal changes, enabling effective diagnosis of crop water stress [22,23]. While highly effective for water status monitoring, TIR data provide limited information for assessing crop nutrient status, as nutrient deficiencies generally do not induce direct and measurable changes in surface thermal emissions.
Microwave remote sensing plays an important role in crop water and nutrient monitoring due to its all-weather, day-and-night capability and sensitivity to the dielectric properties of surface materials. It is typically categorized into active and passive systems, each with distinct mechanisms and advantages.
Active microwave sensing, exemplified by Synthetic Aperture Radar (SAR), transmits microwave pulses and records their backscattered signals to generate high-resolution information about the land surface. Analysis of SAR data with varying frequencies and polarization modes enables the retrieval of canopy water content (e.g., vegetation or leaf water content), which directly indicates crop water stress [24]. In addition, SAR’s sensitivity to surface roughness facilitates indirect estimation of soil moisture, since soil water content modulates the dielectric constant and alters surface scattering behavior.
Passive microwave sensing, typically using microwave radiometers, detects naturally emitted microwave radiation from the land surface, expressed as brightness temperature. Brightness temperature depends on the physical temperature and emissivity of the surface, which are influenced by soil moisture, vegetation water content, and canopy coverage. These parameters make passive microwave observations valuable for large-scale soil and vegetation moisture retrieval.

2.3. Data Processing

Raw remote sensing images acquired by UAVs capture digital signals influenced by surface features, atmospheric conditions, and sensor characteristics. To ensure accurate retrieval of crop physiological and biochemical parameters, raw data must be preprocessed to produce consistent and reliable reflectance products. The UAV remote sensing data processing workflow typically includes radiometric correction, image stitching, segmentation, and data fusion.

2.3.1. Radiometric Correction

Radiometric correction is essential for quantitative remote sensing analysis. Its goal is to eliminate inconsistencies arising from solar irradiance variability, terrain effects, and sensor errors. Since the advent of satellite remote sensing, radiometric correction techniques for satellite imagery have been extensively developed and are now relatively mature [25,26,27,28]. In contrast, radiometric correction for UAV-based remote sensing imagery remains an emerging research area. Because of the low flight altitude of UAVs, atmospheric effects during image acquisition are significantly reduced compared with satellite-based systems. As a result, atmospheric correction procedures are typically simplified, focusing on solar irradiance variability, terrain-induced effects, and intrinsic sensor biases [29]. During repeated flights or extended UAV missions, acquired imagery is often affected by substantial variations in illumination conditions. Without correction, radiometric discrepancies between images can reduce the stability of inversion results and limit model generalizability.
Multispectral systems, owing to their simple structure, low cost, and broad applicability, are the most widely used imaging systems in UAV-based remote sensing. To address radiometric inconsistencies, Luo et al. [30] proposed a Piecewise Empirical Linear (PEL) method for estimating reflectance across diverse surface types, which significantly improved correlations between remote sensing indices and crop parameters. Using the MicaSense Altum system as a case study, Wang et al. [31] compared seven radiometric correction methods and found that relying on a single reference panel captured at one time point yielded poor results under rapidly changing illumination conditions. To improve reflectance consistency, they proposed using multiple reference panel placements combined with flight altitude calibration. In addition, optimizing low-cost reference panel materials has also garnered research attention. Rosas et al. [32] evaluated the reflectance performance and durability of various materials under different climatic conditions and recommended matte-coated wood panels and synthetic leather as practical alternatives, providing physical support for radiometric correction in low- to mid-cost UAV systems.
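The empirical line methods discussed above share a common core: per-band linear mappings from raw digital numbers (DN) to reflectance, fitted against reference panels of known reflectance. The following is a minimal sketch of that core idea, not a reproduction of the PEL method of Luo et al. [30]; the array shapes and the two-panel setup are illustrative assumptions.

```python
import numpy as np

def empirical_line(dn_image, panel_dn, panel_reflectance):
    """Per-band empirical line calibration (illustrative sketch).

    dn_image: (rows, cols, bands) raw digital numbers
    panel_dn: (n_panels, bands) mean DN measured over each reference panel
    panel_reflectance: (n_panels, bands) known panel reflectance values
    """
    corrected = np.empty(dn_image.shape, dtype=float)
    for b in range(dn_image.shape[2]):
        # Least-squares fit of reflectance = gain * DN + offset for this band
        gain, offset = np.polyfit(panel_dn[:, b], panel_reflectance[:, b], 1)
        corrected[..., b] = gain * dn_image[..., b] + offset
    # Physical reflectance is bounded to [0, 1]
    return np.clip(corrected, 0.0, 1.0)
```

A piecewise variant would fit separate lines over different DN intervals; placing several panels across the brightness range, as recommended by Wang et al. [31], makes the fitted line less sensitive to illumination changes between flights.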
TIR cameras are widely used for detecting crop water stress and estimating evapotranspiration; however, their uncooled design makes them vulnerable to thermal drift and ambient temperature fluctuations, often leading to brightness temperature shifts and vignetting effects. To address this issue, Aragon et al. [33] developed a calibration function based on blackbody targets and ambient temperature responses, applicable to various infrared camera models (e.g., FLIR, TeAx), which significantly improves temperature estimation accuracy. Elfarkh et al. [34] found that flight altitude, acquisition time, and meteorological conditions significantly affect the consistency of LST. They recommended acquiring TIR data during radiometrically stable periods to ensure higher data quality. Wang et al. [35] further proposed the DRAT method, which integrates probability density fitting with a radiative transfer model to eliminate temporal drift errors without relying on extensive ground-based observations.
Although hyperspectral imaging systems provide high spectral resolution and fine classification capability, they are highly sensitive to platform orientation and illumination changes, often resulting in radiometric striping and geometric distortion. Jakob et al. [36] developed the MEPHySTo toolbox, which integrates geometric and radiometric processing of hyperspectral imagery and is suitable for high-precision applications such as mineral exploration. Building on this, Song et al. [37] proposed an integrated correction framework for such systems, combining attitude correction, solar irradiance modeling, geometric registration, and image stitching. This approach significantly enhances radiometric consistency and image continuity under large-scale operational conditions.
Collectively, these studies highlight the critical role of radiometric correction in ensuring the accuracy and consistency of UAV-based remote sensing data. As UAV applications expand in agricultural monitoring, developing robust, efficient, and scalable radiometric correction methods—particularly those that reduce reliance on physical targets and support multi-sensor integration—will be essential for reliable crop parameter retrieval across diverse spatial and temporal conditions.

2.3.2. Image Stitching

UAV-based remote sensing has received considerable attention for its capability to capture high-spatial-resolution imagery. However, such high-resolution images typically cover only a limited portion of the land surface per frame. To address this limitation and acquire a comprehensive view of the study area, image stitching has become an essential technique. Image stitching combines multiple overlapping images into a larger composite, thereby improving the applicability of UAV-based remote sensing across diverse research and monitoring scenarios.
To address the dual challenge of achieving high spatial resolution and broad coverage in UAV imagery, researchers have proposed various image stitching techniques. Traditional image stitching techniques primarily rely on feature point matching strategies, such as the Scale-Invariant Feature Transform (SIFT) algorithm [38]. By extracting scale-invariant features and applying dynamic thresholding in the Difference-of-Gaussian (DoG) space, SIFT improves stitching efficiency and matching robustness in agricultural remote sensing. Ren et al. [39] developed a coarse-to-fine image stitching strategy based on a multilayer perceptron (MLP) to address spectral discrepancies and spatial misalignments in multispectral imagery. By incorporating feature-based error regression correction, this approach balances computational efficiency and stitching accuracy. Yang et al. [40] further proposed an enhancement-based approach to improve multispectral image stitching in complex riparian environments by reinforcing the spectral features of individual bands.
To mitigate common issues such as artifacts and structural distortions in traditional image stitching, Xu et al. [41] introduced a method that integrates Accelerated-KAZE (AKAZE) feature extraction with the As-Projective-As-Possible (APAP) model. By incorporating structure-preserving mechanisms and multi-level constraints, this approach improves structural consistency and visual coherence in stitched imagery. For high-dimensional data such as hyperspectral imagery, Mo et al. [42] proposed an image stitching method that integrates deep feature representations with a graph neural network (GNN). By incorporating spectral consistency correction and multi-scale fusion, their method achieves seamless, low-distortion, high-quality stitching results.
To address the challenges of multi-sensor and multi-modal image co-registration and stitching, Turner et al. [43] achieved spatial alignment and joint stitching of visible, multispectral, and TIR imagery in a study on Antarctic moss monitoring, with a root mean square error (RMSE) as low as 1.78 pixels. In addition, Kapil et al. [44] addressed low-contrast TIR image stitching by developing a texture mapping framework that uses synchronized RGB imagery as an intermediary. This approach improved the geometric consistency and radiometric quality of TIR orthomosaics.

2.3.3. Image Segmentation

During early growth stages or under conditions of low planting density, high-resolution UAV imagery often exhibits mixed spectral responses due to insufficient vegetation cover. In such cases, image pixels can generally be classified into three categories: bare soil, crop canopy, and other background noise. If raw UAV imagery is used directly for vegetation monitoring or parameter inversion, background noise may reduce the descriptive capacity of remote sensing data and compromise model accuracy. Therefore, to improve the efficiency and accuracy of remote sensing data analysis, image segmentation is essential for isolating meaningful land surface components from raw imagery. In agricultural remote sensing, commonly used segmentation methods include threshold-based techniques, machine learning (ML) algorithms, and increasingly adopted deep learning (DL) approaches.
Traditional threshold-based segmentation methods, such as NDVI-based thresholding, typically require a manually defined fixed threshold to separate vegetated from non-vegetated regions [45]. Although computationally efficient and relatively easy to implement, the manual selection of threshold values is subjective and poorly adapted to varying illumination and land surface conditions across different images. To improve objectivity and adaptability, adaptive thresholding techniques such as the Otsu algorithm have been widely adopted. The Otsu algorithm automatically determines the optimal threshold by maximizing the between-class variance of two groups—foreground (e.g., vegetation) and background (e.g., soil)—from the image histogram, eliminating the need for manual intervention [46]. This approach enhances the robustness of threshold-based segmentation to some extent. However, whether using fixed or adaptive methods, segmentation performance often declines when significant spectral overlap among land surface components exists or when imagery is affected by noise or uneven illumination.
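The Otsu criterion described above can be stated compactly: scan candidate thresholds over the histogram and keep the one that maximizes the between-class variance of foreground and background. A minimal NumPy sketch (applied here to a flattened array of index values such as NDVI; the bin count is an illustrative choice):

```python
import numpy as np

def otsu_threshold(values, nbins=256):
    """Return the threshold that maximizes between-class variance (Otsu)."""
    hist, edges = np.histogram(values, bins=nbins)
    p = hist / hist.sum()                       # per-bin probability
    centers = 0.5 * (edges[:-1] + edges[1:])
    w0 = np.cumsum(p)                           # background class weight
    w1 = 1.0 - w0                               # foreground class weight
    mu0 = np.cumsum(p * centers)                # cumulative background mean * w0
    mu_total = mu0[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        m0 = mu0 / w0
        m1 = (mu_total - mu0) / w1
        between = w0 * w1 * (m0 - m1) ** 2      # between-class variance
    between = np.nan_to_num(between)            # empty classes contribute nothing
    return centers[np.argmax(between)]
```

A vegetation mask then follows as `ndvi > otsu_threshold(ndvi.ravel())`, with no manually chosen cutoff.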
Supervised segmentation methods such as Support Vector Machines (SVM) and Random Forests (RF) rely on training samples to learn the spectral characteristics of land surface types [47]. These methods can achieve more accurate segmentation in complex scenarios, improving the ability to identify various land surface features. In recent years, DL approaches—particularly Convolutional Neural Networks (CNNs)—have demonstrated significant advantages in remote sensing image segmentation due to their strong capabilities in hierarchical feature extraction and spatial context modeling [48]. When processing farmland images with complex textured backgrounds and small-scale targets (e.g., individual seedlings), DL models can adaptively learn high-level semantic features, improving segmentation accuracy and model robustness.
Effective image segmentation removes bare soil, shadows, and other background interferences, improving the accuracy of vegetation information extraction and providing a more reliable data foundation for subsequent tasks such as crop parameter estimation, water stress detection, and field condition assessment. Therefore, for high-resolution UAV imagery, segmentation is an indispensable step in the preprocessing pipeline and plays a crucial role in improving the accuracy and stability of farmland remote sensing analysis.

2.3.4. Data Fusion

In complex agricultural environments, a single sensor is often insufficient to capture the multidimensional growth characteristics of crops, including canopy structure, physiological and biochemical status, and water and thermal conditions. As a result, multi-sensor collaborative observation has become a dominant trend. Against this backdrop, data fusion techniques have been developed to integrate imagery from different sensors at the spatial, spectral, temporal, and semantic levels, aiming to enhance the accuracy and robustness of crop parameter retrieval.
Image fusion methods can be classified into three categories based on processing level: pixel-level, feature-level, and decision-level fusion. Pixel-level fusion integrates information directly at the pixel level and is suitable for data sources with similar spatial resolutions. For example, in cotton aphid monitoring, the Gram–Schmidt transformation was used to fuse multispectral and panchromatic images, enhancing the sensitivity of VIs to aphid infestation severity (R2 = 0.88) [49]. In rice straw burning detection, fusing infrared and visible imagery improved object detection performance, with mAP@0.5 increasing by approximately 5% [50]. Feature-level fusion combines intermediate features such as texture, spectral indices, and VIs for joint modeling. For instance, in cotton yield estimation, combining LiDAR-derived plant height and multispectral-based leaf chlorophyll content (LCC) in an XGBoost model improved prediction accuracy (R2 = 0.802) [51]. Similarly, in potato aboveground biomass estimation, a RF model combining texture, spectral, and meteorological features achieved the best performance (R2 = 0.79) [52]. Decision-level fusion integrates outputs from multiple models or sensors and is particularly suitable for handling heterogeneous data sources. In multimodal fusion frameworks, optical and SAR imagery can be modeled independently and then combined through weighted integration to improve the robustness of LAI and canopy cover estimation [53].
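The decision-level weighted integration mentioned above can be illustrated with a small sketch: independent per-pixel estimates (e.g., LAI from an optical model and a SAR model) are combined with inverse-variance weights, so the more reliable source contributes more. The weighting scheme is one common choice, not necessarily the one used in [53].

```python
import numpy as np

def fuse_estimates(estimates, error_vars):
    """Decision-level fusion of independent per-pixel estimates.

    estimates: (n_sources, ...) stacked model outputs (e.g., LAI maps)
    error_vars: (n_sources,) error variance of each source
    """
    estimates = np.asarray(estimates, dtype=float)
    w = 1.0 / np.asarray(error_vars, dtype=float)  # lower error -> higher weight
    w = w / w.sum()                                # normalize weights to 1
    # Weighted sum over the source axis
    return np.tensordot(w, estimates, axes=1)
```

For two sources with error variances 1 and 3, the weights become 0.75 and 0.25, so the fused map leans toward the better-characterized model.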
In recent years, multimodal fusion combined with DL modeling has become an important advancement in remote sensing data integration. Multimodal approaches typically integrate RGB, TIR, and multispectral imagery to improve the capacity of models to capture crop physiological traits and environmental conditions. For example, in a citrus orchard under drip irrigation, combining RGB, TIR, and multispectral data with a CNN-LSTM model improved the prediction accuracy of 5 cm soil moisture content (SMC), achieving R2 = 0.88 [54]. Song et al. [55] integrated visible imagery with enhanced multispectral data to generate outputs with higher spatial resolution and richer spectral information, improving the accuracy of sunflower lodging detection from 84.4% to 89.8%.

3. UAV-Based Remote Sensing Modeling Methods for Crop Water and Nutrient Status

3.1. VI-Based Approaches

In remote sensing-based crop monitoring, physiological parameters such as leaf water content and chlorophyll concentration are key indicators of water and nutrient status, but are often difficult to measure directly. VIs, as commonly used and effective remote sensing parameters, provide an indirect means of assessment by enhancing vegetation signals and suppressing non-target interference, such as soil background and illumination variability. This is typically achieved through band-based operations such as ratios, differences, or normalization. The theoretical foundation of VIs lies in the distinct spectral reflectance characteristics of vegetation across different wavelengths—particularly the strong contrast between visible and NIR bands—which are highly sensitive to variations in canopy structure, physiological activity, and water stress. Therefore, VIs play a vital role in the remote sensing-based inversion and assessment of crop water and nutrient status. Commonly used VIs are summarized in Table 3. These indices demonstrate strong adaptability across sensor types and spatial scales, serving as effective indicators of crop water and nutrient status.
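The band-based operations described above (ratios, differences, normalization) all reduce to simple arithmetic on reflectance rasters. As a minimal illustration, the normalized-difference form that underlies NDVI and many related indices can be written as:

```python
import numpy as np

def normalized_difference(band_a, band_b, eps=1e-10):
    """Generic normalized-difference index, e.g. NDVI = (NIR - Red)/(NIR + Red).

    band_a, band_b: reflectance values or arrays; eps guards against
    division by zero over dark pixels (an implementation choice).
    """
    band_a = np.asarray(band_a, dtype=float)
    band_b = np.asarray(band_b, dtype=float)
    return (band_a - band_b) / (band_a + band_b + eps)

# Example: NDVI from NIR and red reflectance rasters
# ndvi = normalized_difference(nir, red)
```

The same function yields other normalized-difference indices from Table 3 simply by substituting the appropriate band pair.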

3.1.1. Applications of VIs in Monitoring Crop Water Status

VIs reflect canopy coverage and leaf water status and are therefore widely used to monitor soil moisture and crop water stress [69,70]. Drought stress reduces leaf water content and transpiration, altering spectral reflectance characteristics and leading to a decline in VI values. NDVI is widely used to monitor water stress; however, its sensitivity to soil background and illumination conditions limits its accuracy.
To improve monitoring accuracy, researchers have developed various water indices based on NIR and shortwave infrared (SWIR) bands. These indices are constructed by leveraging the distinct absorption and reflectance characteristics of vegetation at specific wavelengths (e.g., 970 nm, 1200 nm, 1450 nm, 1940 nm, and 2500 nm), which are closely associated with crop water content. Gao [71] proposed the Normalized Difference Water Index (NDWI), which estimates leaf water content using the reflectance difference between the 860 and 1240 nm bands. Building on this, Wang and Qu [72] introduced the Normalized Multiband Drought Index (NMDI), which integrates two strong water absorption bands at 1640 and 2130 nm. This index improves the detection of compound drought conditions and demonstrates greater stability in heterogeneous farmland. In addition to traditional VIs, recent studies have developed novel indices and their lightweight applications in orchards or crop-specific scenarios. For example, Gao et al. [73] developed two convolution-based VIs (CNRVI and CNGVI) and proposed a lightweight CVCR-Net model for root-zone soil moisture inversion in kiwifruit orchards. The model reduced both size and parameter count, enhancing deployment efficiency while maintaining accuracy, offering a novel approach to crop water monitoring on embedded platforms.
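The two water indices above have compact forms: NDWI contrasts the 860 and 1240 nm bands, while NMDI replaces the single absorption band with the difference of the 1640 and 2130 nm bands. A direct sketch (band reflectances are assumed to be already corrected to surface reflectance):

```python
def ndwi(r860, r1240):
    """NDWI of Gao [71]: leaf water from the 860/1240 nm contrast."""
    return (r860 - r1240) / (r860 + r1240)

def nmdi(r860, r1640, r2130):
    """NMDI of Wang and Qu [72], using both strong water absorption bands."""
    diff = r1640 - r2130
    return (r860 - diff) / (r860 + diff)
```

Because the inputs are plain arithmetic expressions, both functions also accept NumPy reflectance rasters unchanged.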
TIR remote sensing enables the analysis of the relationship between canopy temperature (Tc) and land surface evapotranspiration, focusing on how crop water stress influences canopy thermal radiation. In the 1960s, Tanner [74] was the first to employ the temperature difference between canopy and air (Tc–Ta) to qualitatively assess crop water status. Building on this concept, Idso et al. [75] introduced a non-transpiring baseline and proposed the prototype of the Crop Water Stress Index (CWSI), which incorporates both Tc–Ta and vapor pressure deficit (VPD). Later, Jackson et al. [76] integrated land surface energy balance theory into the model, establishing a physically based version of CWSI. This advancement enabled CWSI to be widely applied across crops, regions, and temporal scales as a robust indicator of crop water status. Compared with traditional VIs, CWSI exhibits greater stability in areas with low vegetation cover and can indirectly reflect root-zone moisture dynamics [77]. Owing to these features, CWSI serves as a reliable indicator for irrigation scheduling and water stress monitoring in precision agriculture.
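In its empirical form following Idso et al. [75], CWSI normalizes the observed canopy-air temperature difference between a non-water-stressed baseline (a linear function of VPD) and a non-transpiring upper limit. A sketch of that normalization, where the baseline coefficients `a`, `b` and the dry limit `dT_dry` are crop-specific values that would be fitted from field data (the numbers in the test are purely illustrative):

```python
import numpy as np

def cwsi(tc, ta, vpd, a, b, dT_dry):
    """Empirical CWSI in the spirit of Idso et al. [75].

    tc, ta : canopy and air temperature (same units, e.g. degrees C)
    vpd    : vapor pressure deficit (kPa)
    a, b   : non-water-stressed baseline, dT_wet = a + b * vpd (b < 0)
    dT_dry : Tc - Ta of a fully stressed, non-transpiring canopy
    """
    dT = np.asarray(tc, dtype=float) - np.asarray(ta, dtype=float)
    dT_wet = a + b * np.asarray(vpd, dtype=float)
    # 0 = well watered, 1 = fully stressed; clip numerical overshoot
    return np.clip((dT - dT_wet) / (dT_dry - dT_wet), 0.0, 1.0)
```

The physically based version of Jackson et al. [76] instead derives the wet and dry limits from the surface energy balance, removing the need for empirically fitted baselines.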
In addition to vegetation- and temperature-based indices, recent studies have developed feature space analysis methods to integrate LST and VIs to improve soil moisture retrieval accuracy. Based on the surface energy balance principle, these methods construct a negatively correlated Ts–VI feature space, reflecting the mechanism whereby surface temperature rises under dry conditions and decreases in moist areas. Sandholt et al. [78] proposed the Temperature Vegetation Dryness Index (TVDI), which enables quantitative monitoring of regional-scale water stress by normalizing LST along the dry and wet edges of the Ts–VI feature space. However, the accuracy of the Ts–VI method is affected by atmospheric conditions, surface heterogeneity, and seasonal variations, necessitating cross-validation with ground-based measurements and multi-source data in practical applications.
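In practice, TVDI is obtained by fitting a dry edge to the upper envelope of the Ts–VI scatter and normalizing each pixel between the wet and dry edges. The sketch below uses synthetic data and a flat wet edge taken as the scene minimum, which is a common simplification rather than the only formulation:

```python
import numpy as np

def tvdi(ts, ndvi, n_bins=20):
    """Temperature Vegetation Dryness Index (Sandholt et al., 2002).
    Dry edge: linear fit through per-bin maximum Ts versus NDVI.
    Wet edge: scene minimum Ts (simplified, flat)."""
    bins = np.linspace(ndvi.min(), ndvi.max(), n_bins + 1)
    centers, ts_max = [], []
    for lo, hi in zip(bins[:-1], bins[1:]):
        m = (ndvi >= lo) & (ndvi < hi)
        if m.any():
            centers.append((lo + hi) / 2)
            ts_max.append(ts[m].max())
    b, a = np.polyfit(centers, ts_max, 1)   # dry edge: Ts = a + b * NDVI
    ts_wet = ts.min()                       # wet edge
    ts_dry = a + b * ndvi
    return (ts - ts_wet) / (ts_dry - ts_wet)

# Synthetic scene: LST decreases with NDVI, wetter pixels cooler
rng = np.random.default_rng(0)
ndvi = rng.uniform(0.2, 0.8, 1000)
ts = 320 - 25 * ndvi + rng.uniform(-8, 0, 1000)   # invented LST (K)
print(round(tvdi(ts, ndvi).mean(), 2))
```

Values near 1 fall on the dry edge (water-limited pixels); values near 0 fall on the wet edge.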
Complementing optical and TIR methods, SAR technology offers advantages for soil moisture retrieval, especially under cloudy or low-light conditions. Zhang et al. [79] proposed a dual-polarization SAR vegetation index (DRVI) that integrates backscatter contrast and polarization parameters. Combined with the water-cloud model, this approach achieved accurate moisture estimation across crop types, significantly outperforming conventional indicators such as NDVI and LAI.
In summary, remote sensing technologies based on VIs—ranging from optical water indices to canopy temperature and SAR polarization indices—have established a multidimensional, cross-scale framework for crop water diagnosis. This framework provides a theoretical basis and technical support for precision irrigation, water resource management, and early drought warning.

3.1.2. Applications of VIs in Monitoring Crop Nutrient Status

Among the three primary macronutrients in crops, nitrogen has been the most extensively studied using remote sensing, with a relatively mature methodological framework. This is largely attributed to the pivotal role nitrogen plays in photosynthesis, where variations in N content affect chlorophyll synthesis and vegetation spectral reflectance—particularly the contrast between the visible and NIR bands. Since the 1980s, numerous nitrogen-sensitive VIs have been developed, providing a foundation for high-resolution spatial and temporal monitoring of crop nitrogen status [80].
NDVI, one of the earliest and most widely used indices, can detect nitrogen-induced variations. However, it often suffers from spectral saturation under dense canopy or high LAI conditions, limiting its effectiveness in monitoring crop nitrogen status during the mid to late growth stages [81]. To address this limitation, researchers have developed improved VIs by incorporating more sensitive spectral bands, such as the GNDVI [60], the MERIS Terrestrial Chlorophyll Index (MTCI) [82], and the Chlorophyll Index Red-Edge (CIred-edge) [83]. Notably, the inclusion of the red-edge spectral region (700–750 nm) has improved the sensitivity of these indices to nitrogen variation, making them effective for monitoring nitrogen status during the mid to late growth stages [84,85]. For example, Zhang et al. [86] found that the red-edge wavelength at 718 nm exhibited the highest correlation with leaf nitrogen concentration (LNC) (r = 0.92) when analyzing hyperspectral imagery of maize leaves under varying nitrogen treatments. Based on this band, the derived Normalized Difference Spectral Index (NDSI) and Ratio Spectral Index (RSI) achieved strong predictive performance, with R2 values of 0.90 and 0.86, respectively. Furthermore, nitrogen level classification based on these indices achieved an accuracy of 91.7%, underscoring the advantages of red-edge and NIR band combinations in nitrogen discrimination.
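The two-band index families used in these studies share a common algebraic form, so they are straightforward to compute for any band pair. The sketch below illustrates NDSI, RSI, and a red-edge chlorophyll index with invented reflectance values; the 718 nm band follows the finding of Zhang et al. [86], but the numbers themselves are not from that study:

```python
def ndsi(r1, r2):
    """Normalized Difference Spectral Index for an arbitrary band pair."""
    return (r1 - r2) / (r1 + r2)

def rsi(r1, r2):
    """Ratio Spectral Index for an arbitrary band pair."""
    return r1 / r2

def ci_red_edge(r_nir, r_red_edge):
    """Chlorophyll Index Red-Edge: NIR / red-edge reflectance - 1."""
    return r_nir / r_red_edge - 1.0

# Invented reflectances: NIR (790 nm) and red edge (718 nm)
r790, r718 = 0.48, 0.24
print(round(ndsi(r790, r718), 3))
print(round(rsi(r790, r718), 3))
print(round(ci_red_edge(r790, r718), 3))
```

Because chlorophyll absorption steepens the red-edge slope, nitrogen-rich canopies lower r718 relative to r790, which raises all three index values.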
Progress has also been made in the remote sensing of other essential nutrients, such as phosphorus (P) and potassium (K), but related research remains at an early stage. The primary challenge in monitoring phosphorus is its low concentration and comparatively weak influence on spectral reflectance relative to nitrogen. Nevertheless, Wang et al. [87] utilized the successive projections algorithm (SPA) to identify key wavelengths from hyperspectral data and developed a three-band VI, which improved the estimation accuracy of phosphorus content in sedge leaves to R2 > 0.68, outperforming traditional two-band indices. Furthermore, the study emphasized that NIR and SWIR bands are particularly sensitive to phosphorus variation, indicating a promising direction for future large-scale phosphorus monitoring.
Potassium (K) plays an important role in regulating cellular osmotic balance, enzyme activity, and stress resistance, and displays distinct spectral response features in the red-edge and SWIR regions. Lu et al. [88] developed two novel SWIR indices—NDSI (R1705, R1385) and RSI (R1385, R1705)—to estimate potassium content in rice leaves, achieving R2 = 0.68. When the red-edge band was incorporated to form a three-band combination, the predictive performance improved to R2 = 0.74. Similarly, Yang et al. [89] identified reflectance near 1883 and 2305 nm as strongly indicative of leaf potassium content in wheat. The partial least squares regression (PLSR) models produced R2 values of 0.74 and 0.65, respectively. Collectively, these findings highlight the important role of the SWIR region in remotely sensing crop potassium status.

3.2. Data-Driven Approaches

With the increasing availability of high-dimensional, multi-source UAV-based remote sensing data, data-driven models have become essential for retrieving crop water and nutrient status. Traditional linear models, such as multiple linear regression (MLR) and PLSR, have been widely used to establish quantitative relationships between VIs, spectral and thermal features, and crop physiological parameters [90]. However, the nonlinear and heterogeneous nature of remote sensing data often limits their accuracy and generalizability. In contrast, ML and DL methods offer powerful capabilities in nonlinear modeling, feature extraction, and temporal dynamics analysis. Recent advances in tree-based models [91], kernel methods [92], artificial neural networks (ANNs) [93], and deep architectures have significantly improved the performance of remote sensing-based crop status assessment [94]. The following sections review representative models within these categories.

3.2.1. Data-Driven Approaches for Crop Water Status Estimation

Estimating crop water status from remote sensing data often presents challenges such as high dimensionality, strong nonlinearity, and multi-source data integration. Traditional parametric models struggle to capture the complex interactions between remote sensing features and crop water status, making data-driven approaches increasingly important. Early studies often employed linear regression models to quantify the relationships between remote sensing indicators—such as VIs (e.g., NDVI, NDWI) and LST—and crop water content. Zhang et al. [95] developed a leaf water content inversion model using ground-based hyperspectral data and evaluated several regression techniques. Results showed that PLSR consistently achieved the highest prediction accuracy across growth stages. However, linear models are limited in capturing nonlinear interactions between crops and environmental factors, which reduces both accuracy and generalizability.
To address the limitations of linear models, researchers have increasingly adopted ML approaches. Algorithms such as RF, SVM, and XGBoost have been successfully applied to remote sensing-based estimation of soil moisture and canopy water content, owing to their robust feature modeling capabilities and resistance to noise. For example, Chen et al. [96] developed the Thermal-Derived Drought Index (TDDI) using UAV-based multisource imagery and temperature features. By integrating this index with ML models such as RF and SVM, they estimated the water content of maize and sorghum. Results showed that RF achieved the highest prediction accuracy under both single-crop and multi-crop scenarios. Savchik et al. [97] employed RF and ANN models, integrating canopy spectral data and soil information to predict stem water potential in almond orchards. Divya Dharshini et al. [98] evaluated four ML algorithms for predicting relative water content under varying irrigation regimes using sorghum as a case study. SVM achieved the highest accuracy under irrigated conditions (R2 = 0.94), followed by XGBoost. Under rainfed conditions, XGBoost demonstrated greater robustness, while PLSR performed poorly in both scenarios.
With the increasing availability of large-scale and time-series remote sensing data, DL methods have demonstrated significant advantages in modeling dynamic changes in crop water status. Babaeian et al. [99] integrated UAV-based multispectral imagery with soil physicochemical properties to develop an AutoML framework for estimating root-zone soil moisture, achieving high accuracy (RMSE < 0.02 cm3/cm3). Yang et al. [100] proposed a UAV-based strategy for crop water status estimation by constructing a deep fusion framework with a dynamic model updating mechanism, which improved adaptability across growth stages. In addition, Yang et al. [101] applied continuous wavelet transform to extract features from hyperspectral and TIR data, integrating them into a neural network model to estimate leaf water content in winter wheat, achieving high accuracy (R2 > 0.9). These findings underscore the potential of integrating remote sensing feature fusion with DL for accurate crop water assessment.

3.2.2. Data-Driven Approaches for Crop Nutrient Estimation

Crop nutrient content, particularly nitrogen (N), phosphorus (P), and potassium (K), is a key determinant of plant growth and yield. In recent years, remote sensing technologies—particularly hyperspectral and UAV-based platforms—have become important tools for assessing crop nutrient status due to their rapid, non-destructive, and large-scale monitoring capabilities. However, complex and nonlinear relationships between nutrient levels and remote sensing features often reduce the performance of traditional statistical models. In this context, data-driven approaches present a promising alternative for improving the accuracy and robustness of nutrient estimation.
Among ML methods, ensemble algorithms such as RF and XGBoost have been widely applied in nutrient estimation due to their strong feature selection and nonlinear modeling capabilities. Jiang et al. [102] achieved accurate estimation of leaf nitrogen content in Annona squamosa by integrating UAV-based hyperspectral data with ensemble learning models. Similarly, Zha et al. [103] combined UAV data with RF to improve prediction accuracy of the nitrogen nutrition index in rice (R2 = 0.94–0.96), outperforming both VIs (R2 = 0.43–0.63) and MLR (R2 = 0.54–0.75). These studies demonstrate the effectiveness of ensemble learning in capturing complex relationships between VIs and nutrient status.
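A hypothetical sketch of such an ensemble workflow is given below: an RF regressor is trained on synthetic vegetation-index features with a deliberately nonlinear, saturating response, mimicking the VI–nutrient relationships described above. The response function and data are invented, not taken from the cited studies:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(1)
n = 300
vis = rng.uniform(0, 1, (n, 5))   # columns stand in for NDVI, GNDVI, etc.

# Invented nonlinear, saturating response of a nutrition index to two VIs
y = np.tanh(2 * vis[:, 0]) + 0.5 * vis[:, 2] ** 2 + rng.normal(0, 0.05, n)

X_tr, X_te, y_tr, y_te = train_test_split(vis, y, random_state=0)
rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)
print(f"test R2: {r2_score(y_te, rf.predict(X_te)):.2f}")
print("feature importances:", rf.feature_importances_.round(2))
```

The feature importances recover which indices drive the response, which is one reason ensemble methods are favored for screening candidate VIs before building operational models.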
With increasing dimensionality and spatiotemporal resolution of remote sensing data, DL approaches have shown clear advantages in nutrient monitoring. Chen et al. [104] developed a DL framework integrating hyperspectral reflectance and phenological information to predict nitrogen content in apple leaves, achieving R2 = 0.79 on the validation set. Xiao et al. [105] proposed a CNN model based on visible and NIR spectra, which predicted nitrogen concentration in cotton leaves (RMSE = 3.36) and classified nitrogen status with 83.34% accuracy. Zhang et al. [106] introduced a self-supervised spectral–spatial vision transformer (SSVT) that jointly learns spatial and spectral features, improving nitrogen status prediction in wheat (accuracy = 0.96).
Recent research highlights multi-source data fusion and cross-scale modeling as key directions in nutrient monitoring. Du et al. [107] developed an incremental learning model to predict LCC and LAI across crops including soybean, rapeseed, and wheat, achieving R2 values from 0.56 to 0.82. Similarly, Dehghan-Shoar et al. [108] constructed a physics-informed neural network that incorporates prior knowledge from radiative transfer models, improving the robustness of nitrogen estimation across scales (R2 = 0.71). These approaches offer valuable technical support for precision fertilization in agriculture.

3.3. Physically Based Approaches

Physically based models are the most mechanistically grounded approach in agricultural remote sensing, as they quantitatively simulate energy and matter transfer within crop systems to elucidate interactions between physiological processes and environmental conditions. Compared with empirical models, physically based models offer better interpretability and extrapolation capabilities, as they do not depend on region-specific training datasets. These advantages make them effective for large-scale crop water and nutrient monitoring across diverse spatial and temporal contexts. Based on their underlying mechanisms, these models can be categorized into three types: canopy radiative transfer models, microwave radiative transfer models, and energy balance models.

3.3.1. Canopy Radiative Transfer Models

Canopy radiative transfer models constitute the theoretical basis for retrieving crop structural and physiological–biochemical parameters from remote sensing data. These models simulate the interactions of solar radiation with crop canopies, including absorption, reflection, transmission, and multiple scattering. Through such physical modeling, critical parameters—such as LAI, chlorophyll content, and leaf water content—can be quantitatively estimated from spectral reflectance, thereby providing support for precision water and nutrient management in agriculture.
Based on the canopy structural characteristics, radiative transfer models can be broadly classified into two categories: continuous medium models and discrete medium models. Continuous models assume a homogeneous canopy structure, where leaves are uniformly distributed along the vertical profile and spatial variability among individual plants is ignored. This approach is well-suited for crop canopies with dense coverage and uniform architecture. A representative example is the SAIL (Scattering by Arbitrarily Inclined Leaves) model, which uses a four-stream radiative transfer formulation to simulate the vertical propagation and scattering of solar radiation within the canopy [109]. The spectral characteristics of individual leaves are described using the PROSPECT model, which incorporates leaf internal structure, chlorophyll concentration, water content, and dry matter effects on reflectance and transmittance [110]. The combination of SAIL and PROSPECT forms the PROSAIL model, which is widely recognized as one of the most prevalent canopy radiative transfer models. With a limited number of input variables (e.g., LAI, Cab, Cw, and structure parameter N), the PROSAIL model can generate canopy reflectance spectra, making it a standard tool in remote sensing inversion studies [111].
In contrast, discrete models do not assume a homogeneous canopy structure. Instead, they account for the spatial distribution of individual plants and are typically based on geometric-optical theory. A representative example is the Li–Strahler geometric-optical model, which characterizes projection, occlusion, shadowing, and multiple scattering among plant components (e.g., individual crowns or crop rows) [112]. This model is suited for sparsely vegetated fields, agricultural plots with distinct row structures, and heterogeneous landscapes. Discrete models resolve the combined effects of direct illumination, canopy shadows, and background elements such as bare soil, making them advantageous for analyzing high-resolution UAV remote sensing imagery. Additionally, these models form the basis for advanced three-dimensional radiative transfer simulations, including ray-tracing methods based on Monte Carlo techniques.
A key application of canopy radiative transfer models is retrieval of crop biophysical parameters from remote sensing data. By constructing forward models to simulate reflectance and generating a look-up table (LUT), essential parameters such as LAI and chlorophyll content (Cab) can be estimated through spectral matching with field observations. Alternatively, inversion can be achieved via fitting or optimization algorithms, such as least squares, genetic algorithms, or Bayesian inference. Due to their strong physical interpretability, these models enhance the generalizability and cross-site applicability of remote sensing monitoring frameworks. For instance, in agricultural water stress assessments, radiative transfer models facilitate crop water status evaluation by retrieving key canopy attributes such as Cab, LAI, and canopy chlorophyll content (CCC). Yang et al. [113] developed a water status monitoring framework by integrating UAV-based multispectral imagery with the PROSAIL model. The retrieved parameters—Cab, LAI, and CCC—showed strong correlations with stomatal conductance (Gs). Furthermore, incorporating meteorological factors into the model improved prediction accuracy, demonstrating the robustness and utility of the PROSAIL model when coupled with weather variables for Gs estimation across growth stages. Pasqualotto et al. [114] proposed two indices—the Water Absorption Area Index (WAAI) and the Derivative Water Index (DWI)—by combining PROSAIL-simulated spectra with hyperspectral imagery. Under heterogeneous crop conditions, these indices substantially outperformed traditional ones in estimating canopy water content, achieving R2 values of 0.80 and 0.70, respectively. These results highlight the potential of PROSAIL-based indices for scalable remote sensing applications in crop water status monitoring.
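The LUT workflow described above can be sketched generically: simulate spectra over a parameter grid with a forward model, then select the parameter set whose simulated spectrum best matches the observation. The toy two-band forward model below merely stands in for PROSAIL; its functional forms and all numbers are invented for illustration:

```python
import numpy as np

def forward(lai, cab):
    """Toy two-band forward model standing in for PROSAIL (illustrative):
    red reflectance falls with Cab and LAI, NIR rises with LAI."""
    red = 0.30 * np.exp(-0.025 * cab) * np.exp(-0.3 * lai) + 0.03
    nir = 0.15 + 0.45 * (1.0 - np.exp(-0.6 * lai))
    return np.stack([red, nir], axis=-1)

# 1) Build the look-up table over a parameter grid
lai_g, cab_g = np.meshgrid(np.linspace(0.5, 6, 60), np.linspace(10, 80, 60))
lut_params = np.column_stack([lai_g.ravel(), cab_g.ravel()])
lut_spectra = forward(lut_params[:, 0], lut_params[:, 1])

# 2) Invert an "observed" spectrum by minimum spectral RMSE
obs = forward(3.0, 45.0)                    # pretend field observation
rmse = np.sqrt(((lut_spectra - obs) ** 2).mean(axis=1))
lai_hat, cab_hat = lut_params[rmse.argmin()]
print(round(lai_hat, 2), round(cab_hat, 2))
```

Retrieval accuracy is bounded by the grid spacing; in practice, LUTs are densified around plausible parameter ranges or replaced by the optimization and Bayesian schemes mentioned above.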
Similarly, canopy radiative transfer models have demonstrated strong generalizability and physical advantages in nutrient monitoring. Tripathi et al. [115] employed the PROSAIL model to estimate the LAI and average leaf angle (ALA) of mustard crops by constructing a LUT that linked simulated directional reflectance with canopy structural parameters. Their results confirmed the effectiveness of radiative transfer models in retrieving crop physiological traits. Li et al. [116] proposed the N-PROSAIL model, which integrates a nitrogen-specific PROSPECT model with the SAIL model to enable accurate retrieval of winter wheat nitrogen status at both the leaf and canopy levels (i.e., LNC and CND), outperforming conventional VI-based approaches. Du et al. [107] further combined PROSAIL-simulated spectra with a deep neural network to develop a joint estimation model for LCC and LAI. By incorporating an incremental learning strategy, the model enabled cross-crop nitrogen diagnosis for soybean, rapeseed, and wheat, demonstrating the feasibility of integrating DL with physically based models. Li et al. [117] recently proposed the PROSAIL-NAM framework, which integrates the PROSAIL-PRO model with a nitrogen allocation model. In this framework, CNC is divided into photosynthetic and non-photosynthetic components, driven by CCC and CDM, respectively. This approach achieved accurate CNC estimation across different ecosystems and remote sensing platforms (RMSE = 0.49–2.25 g/m2), offering methodological support for crop nitrogen monitoring at regional to global scales.

3.3.2. Microwave Radiative Transfer Models

In contrast to optical remote sensing, which depends on solar radiation as the primary energy source, microwave remote sensing acquires data through active emission (e.g., SAR) or passive sensing of land surface emissions (e.g., microwave radiometers). This capability enables continuous, all-weather monitoring and provides advantages for observing crop–soil systems under cloudy or rainy conditions. Microwave radiative transfer models simulate the propagation, scattering, absorption, and attenuation of microwave signals through vegetation and soil. These models facilitate the retrieval of key biophysical parameters of soil and vegetation, particularly those to which microwave signals are sensitive.
The interaction between microwave signals and the land surface is influenced by factors such as soil volumetric water content, surface roughness, vegetation water content, structural density, polarization, and incidence angle. Microwave radiative transfer models describe these interactions using electromagnetic propagation and scattering theories, such as Maxwell’s equations and the radiative transfer equation. Their primary aim is to establish quantitative linkages between microwave observations and biophysical surface parameters. Based on the simulation targets and underlying mechanisms, these models are categorized into soil scattering models and coupled vegetation–soil models.
The Integral Equation Model (IEM) is one of the most widely adopted physically based models for simulating microwave backscatter over bare or sparsely vegetated surfaces [118]. It conceptualizes the land surface as a rough dielectric interface and calculates backscatter under varying polarizations and incidence angles by incorporating surface conductivity, geometric descriptors, and both volume and surface scattering. IEM has been extensively applied in multi-frequency (e.g., C-, L-, and X-band) and multi-polarization SAR analyses, serving as a key approach for surface soil moisture retrieval. However, accurate implementation necessitates detailed measurements of surface roughness parameters—such as root mean square (RMS) height and correlation length—rendering it experimentally demanding. To enhance practicality, many studies adopt simplified assumptions or calibrate the model with in situ observations [119,120].
In vegetated environments, microwave signals are influenced by both soil backscatter and vegetation absorption, scattering, and attenuation. Under these conditions, soil scattering models alone fail to fully capture the composite microwave response. To address this limitation, coupled models have been developed that incorporate both vegetation and soil contributions. The Water Cloud Model (WCM), a widely used empirical approach, treats vegetation as a homogeneous water cloud and characterizes its microwave interactions using exponential functions [121]. WCM expresses total backscatter as the sum of vegetation and soil components and is commonly applied in low-relief, uniformly vegetated areas such as croplands and grasslands. Its main advantages include structural simplicity and minimal parameterization, making it well suited for integration with ML–based inversion frameworks. In contrast, the Michigan Microwave Canopy Scattering Model (MIMICS) is a physically based model that simulates multiple scattering within crop canopies and the underlying soil [122]. It accounts for volume scattering, canopy geometry, and dielectric variability. Unlike the empirical WCM, MIMICS supports multilayer canopy representation, making it suitable for complex structures and medium-to-high vegetation densities. However, it requires more detailed inputs and higher computational cost.
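For reference, the WCM expresses total backscatter as a canopy term plus a soil term attenuated by two-way passage through the canopy. The sketch below uses the standard formulation with a linearized soil term; the coefficients A–D are empirical and crop- and sensor-specific, and the values here are illustrative only:

```python
import numpy as np

def wcm_backscatter(v, mv, theta_deg, A=0.12, B=0.09, C=-15.0, D=30.0):
    """Water Cloud Model (after Attema & Ulaby): total backscatter (dB).
    v        : vegetation descriptor (e.g., VWC in kg/m2)
    mv       : volumetric soil moisture (cm3/cm3)
    theta_deg: incidence angle (degrees)
    A, B     : canopy scattering / attenuation coefficients (illustrative)
    C, D     : linearized bare-soil response, sigma0_soil(dB) = C + D*mv
               (illustrative)."""
    theta = np.radians(theta_deg)
    tau2 = np.exp(-2 * B * v / np.cos(theta))        # two-way attenuation
    sigma_veg = A * v * np.cos(theta) * (1 - tau2)   # canopy term (linear)
    sigma_soil = 10 ** ((C + D * mv) / 10)           # soil term (linear)
    return 10 * np.log10(sigma_veg + tau2 * sigma_soil)

# VWC = 2 kg/m2, soil moisture = 0.25 cm3/cm3, 35 deg incidence
print(round(wcm_backscatter(v=2.0, mv=0.25, theta_deg=35), 2))
```

In inversion, the model is run in the opposite direction: given observed backscatter and a vegetation descriptor (often from optical VIs), the soil term is isolated and solved for mv, which is why WCM pairs naturally with the ML-based frameworks mentioned above.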
Microwave radiative transfer models are extensively employed to retrieve key parameters of crop water status, such as SMC, vegetation water content (VWC), and aboveground biomass (AGB). For instance, Romshoo et al. [123] conducted a typical C-band microwave remote sensing experiment using a scatterometer to collect radar backscatter signals from vegetated surfaces under multi-temporal conditions. By integrating IEM, volume scattering, and empirical models, the study evaluated the effects of soil moisture and vegetation variables on radar backscatter. The results confirmed the sensitivity of microwave signals to both soil and vegetation water content, particularly under different biomass conditions. Zhang et al. [124] developed an SMC retrieval framework using multi-frequency SAR data. By integrating an improved CIEM with a lookup table optimization algorithm, the framework avoided predefined surface roughness parameters—a common limitation in traditional models. The method demonstrated strong estimation accuracy across multiple agricultural bare-soil sites, significantly improving the robustness and practicality of microwave-based physical models. Yahia et al. [125] further fused SAR, multispectral, and TIR data to construct a neural network-based framework for SMC estimation. By implementing a joint weighting strategy at both feature and decision levels, the framework incorporated EA-IEM outputs with PDI and TVDI indices. It was validated across three agricultural sites with distinct soil characteristics in the UK and Algeria, achieving high accuracy and generalizability. These findings highlight the potential of multi-source remote sensing fusion for improving SMC retrieval.

3.3.3. Energy Balance Models

Energy balance models are physically based approaches that quantify the partitioning of surface energy and water fluxes by simulating solar radiation absorption, longwave radiation exchange, and heat transfer processes within soil and vegetation. These models are based on the principle that incoming solar radiation at the land surface is either converted into latent heat through evaporation and transpiration (evapotranspiration, ET), or dissipated as sensible heat. Based on the attribution of energy fluxes to surface components, energy balance models are classified into single-source and dual-source types.
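The bookkeeping shared by both model families is the surface energy balance, Rn = G + H + LE, with ET obtained from the latent heat flux. A minimal sketch with illustrative midday flux values (the conversion to a water depth uses the latent heat of vaporization):

```python
def latent_heat_flux(rn, g, h):
    """Surface energy balance residual: LE = Rn - G - H (all in W/m2).
    rn: net radiation, g: soil heat flux, h: sensible heat flux."""
    return rn - g - h

def et_rate_mm_per_hour(le, lam=2.45e6):
    """Convert latent heat flux (W/m2) to an ET rate (mm/h).
    lam: latent heat of vaporization (J/kg), ~2.45e6 near 20 deg C."""
    return le / lam * 3600.0   # kg/m2/s equals mm/s; times 3600 s

# Illustrative midday fluxes over a transpiring canopy
le = latent_heat_flux(rn=550.0, g=60.0, h=140.0)
print(le)                                  # 350.0 W/m2
print(round(et_rate_mm_per_hour(le), 3))   # 0.514 mm/h
```

Single- and dual-source models differ in how H is estimated (one bulk surface versus separate soil and canopy resistances), but both recover LE, and hence ET, as this residual.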
Single-source models treat the land surface as a homogeneous medium, without distinguishing heat exchange between soil and vegetation. These models are most applicable in areas with low vegetation cover or spatially uniform soil–vegetation energy interactions. The Surface Energy Balance Algorithm for Land (SEBAL) is a representative single-source model that estimates surface energy fluxes based on remote sensing inputs, such as surface albedo, radiative fluxes, and land surface temperature [126]. The Mapping Evapotranspiration at High Resolution with Internalized Calibration (METRIC) model extends SEBAL by incorporating high-resolution remote sensing data and introducing an internal calibration mechanism for automatic parameter adjustment [127].
Dual-source energy balance models distinguish between energy contributions of soil and vegetation, enabling the separate estimation of soil evaporation and plant transpiration. These models are suitable for areas with high vegetation cover, such as crop fields or forests, where soil and vegetation contribute unequally to evapotranspiration. The Two-Source Energy Balance (TSEB) model is a representative example. It independently calculates sensible and latent heat fluxes from soil and vegetation, allowing for more accurate simulation of surface energy partitioning in densely vegetated areas [128].
The key outputs of energy balance models are ET and crop water use efficiency (WUE), both of which are vital indicators for assessing crop growth and water consumption. With advances in UAV-based TIR remote sensing, integrating UAV-derived LST data into the TSEB model facilitates high-precision ET estimation. Research has shown that the TSEB model, along with its improved variant DTD, which incorporates LST error correction, can accurately simulate ET under both clear and cloudy conditions in barley fields. The results showed strong consistency with eddy covariance observations, confirming the feasibility and accuracy of applying UAV-based thermal imagery in energy balance modeling [129]. UAV-based multispectral remote sensing, when integrated with energy balance models, has also demonstrated strong capabilities in ET monitoring. For instance, multispectral UAV imagery combined with the SEBAL model was used to estimate ET in a pistachio orchard in an arid region, generating high-resolution ET and crop coefficient (Kc) maps at 10 cm spatial resolution. The Kc maps, derived using NDVI and the FAO-56 guidelines, enabled precise identification of drought-affected trees, thereby providing technical support for precision irrigation and drought response strategies [130].
Moreover, UAV-based remote sensing enables high-throughput and non-destructive assessment of WUE. Na et al. [131] integrated the SEBAL model, Kc, and a soil water balance equation to estimate daily ET through the growth period of winter wheat. They also used UAV-derived multispectral data combined with the RF algorithm to retrieve aboveground biomass, enabling quantification of WUE under different irrigation regimes across multiple wheat cultivars. The results indicated that biomass-based WUE at the flowering stage was highly correlated with final yield WUE, providing an effective indicator for selecting water-saving cultivars and demonstrating the potential of UAVs in precision breeding and agricultural water-saving management.

3.4. Hybrid Models

With advancements in remote sensing technologies and the diversification of data acquisition methods, the limitations of purely physically based models in accuracy and applicability have become increasingly apparent. This is especially true under complex environmental conditions, where physical models struggle to account for the intricate interactions among influencing factors. In contrast, ML algorithms excel at processing large-scale, high-dimensional datasets but lack physical interpretability. As a result, integrating physical models with data-driven approaches—such as ML and DL—to form hybrid models has emerged as a promising strategy for improving the accuracy and generalizability of remote sensing-based crop monitoring.
Hybrid models preserve the physical interpretability of process-based models while incorporating the adaptive capabilities of ML algorithms, thereby achieving high predictive accuracy and robustness under varying environmental conditions. One common approach embeds ML algorithms within physical models, using outputs of physical simulations as inputs for data-driven inversion or parameter optimization. Alternatively, ML can train and calibrate key parameters within physical models, enhancing adaptability to complex and heterogeneous field conditions. For example, Impollonia et al. [132] compared multiple PROSAIL inversion strategies and proposed a hybrid method that combines physical model simulations with ML regression. This approach improved the estimation accuracy of LCC and LAI in industrial hemp under different nitrogen treatments, confirming the value of physically simulated data in enhancing the performance of ML models during the training phase. Ling et al. [133] addressed the spectral discrepancy between simulated and measured data in traditional hybrid models by proposing an iterative hybrid inversion method. The approach employs a BPNN for inversion, then iteratively optimizes PROSAIL parameters using the inversion results, generating training samples that better resemble real remote sensing imagery. This iterative process progressively enhances model accuracy and improves the stability and robustness of LAI estimation for winter wheat. Bhadra et al. [134] incorporated PROSAIL-simulated data as prior knowledge into a neural network and applied a transfer learning strategy to enhance the model’s generalization capability on real UAV-based hyperspectral imagery. This architecture, which combines DL with physically based modeling, leverages the physical interpretability of PROSAIL and improves the estimation accuracy of LCC and ALA under complex agricultural field conditions through a 1D-CNN.
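The first strategy, training a data-driven regressor on physically simulated samples, can be sketched as follows. The toy forward model below only stands in for PROSAIL, and all numbers (bands, noise levels, LAI range) are invented for illustration:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(7)

def toy_rtm(lai):
    """Toy radiative-transfer stand-in for PROSAIL (illustrative):
    three-band reflectance as a smooth function of LAI."""
    lai = np.asarray(lai)
    return np.stack([0.28 * np.exp(-0.35 * lai) + 0.03,       # red
                     0.32 * np.exp(-0.20 * lai) + 0.05,       # red edge
                     0.15 + 0.45 * (1 - np.exp(-0.6 * lai))], # NIR
                    axis=-1)

# 1) Physics step: simulate a training set from the forward model
lai_train = rng.uniform(0.2, 7.0, 2000)
X_train = toy_rtm(lai_train) + rng.normal(0, 0.005, (2000, 3))  # sensor noise

# 2) Data-driven step: learn the inverse mapping, reflectance -> LAI
model = GradientBoostingRegressor().fit(X_train, lai_train)

# 3) Apply the learned inverse to "observed" spectra
lai_hat = model.predict(toy_rtm([1.5, 4.0]))
print(lai_hat.round(2))
```

Adding realistic sensor noise to the simulated spectra, as in step 1, is one simple way to narrow the simulation-to-observation gap that the iterative scheme of Ling et al. [133] addresses more systematically.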

3.5. Comparative Evaluation of Modeling Approaches

Although diverse modeling approaches—including VI-based, data-driven, physically based, and hybrid models—have been applied to UAV remote sensing of crop water and nutrient status, their comparative performance varies under different agricultural conditions. VI-based methods are computationally simple and interpretable but often suffer from spectral saturation and limited robustness in heterogeneous fields. Data-driven models achieve high predictive accuracy and flexibility, yet their effectiveness strongly depends on large, high-quality training datasets, and they often lack interpretability. Physically based models exhibit stronger generalizability across regions and crops because they are grounded in mechanistic principles, but they usually demand extensive parameterization and are computationally intensive. Hybrid models represent a promising pathway by combining the interpretability of physical models with the adaptability of AI, thereby improving robustness and transferability; however, their operational applications are still at an early stage. A comparative summary of these modeling approaches is presented in Table 4.

4. Key Factors Affecting UAV-Based Remote Sensing of Crop Water and Nutrient Status

With the widespread application of UAV-based remote sensing in agricultural water and nutrient monitoring, data acquisition accuracy has improved significantly. However, practical implementation still faces many challenges. The key factors affecting monitoring performance include crop canopy coverage and growth stages, spatial resolution and scale compatibility, environmental disturbances, and data processing and model generalization capability.

4.1. Influence of Canopy Coverage and Growth Stage

Canopy coverage and growth stages jointly determine the structural and physiological characteristics of crop canopies and are key factors affecting the accuracy of UAV-based remote sensing for monitoring water and nutrient status. As crops develop from emergence to maturity, canopy structures become denser, accompanied by significant changes in spectral and radiative properties. These dynamics directly influence the remote sensing response and the accuracy of inversion models.
At early growth stages with low vegetation coverage, exposed soil dominates the field surface and introduces background interference to remote sensing signals. For example, in optical remote sensing, even after image segmentation removes most soil pixels, mixed pixels remain at canopy–soil boundaries, weakening the sensitivity of VIs such as NDVI to canopy variation [135]. As vegetation coverage increases and the canopy closes, soil interference is reduced. However, overlap among multiple leaf layers can lead to optical saturation. Specifically, VIs such as NDVI and EVI become insensitive to further increases in LAI once it exceeds a threshold (typically around 4), resulting in a plateau effect [136,137], which compromises the accuracy of LAI and chlorophyll content retrieval [138]. For example, in fields with high vegetation coverage, spectral saturation effects can cause large estimation errors, with the normalized root mean square error (NRMSE) of potato aboveground biomass exceeding 20% [139].
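The plateau effect described above can be reproduced with a toy two-band canopy model. This sketch is an assumption-laden illustration (Beer–Lambert gap fraction with fixed leaf and soil reflectances), not a validated radiative transfer model, but it shows why NDVI stops responding once the canopy closes.

```python
import numpy as np

def canopy_reflectance(lai, r_soil=(0.15, 0.20), r_leaf=(0.05, 0.50), k=0.6):
    """Toy two-band (red, NIR) canopy model: reflectance is a mixture of
    leaf and soil signals weighted by a Beer-Lambert gap fraction."""
    gap = np.exp(-k * lai)          # fraction of soil visible through canopy
    red = gap * r_soil[0] + (1.0 - gap) * r_leaf[0]
    nir = gap * r_soil[1] + (1.0 - gap) * r_leaf[1]
    return red, nir

lai = np.array([0.5, 2.0, 4.0, 6.0, 8.0])
red, nir = canopy_reflectance(lai)
ndvi = (nir - red) / (nir + red)
# NDVI rises steeply at low LAI, then the per-unit-LAI gain collapses
# once LAI exceeds ~4: the saturation (plateau) effect.
```

With these assumed reflectances, the NDVI increase from LAI 0.5 to 2 is more than an order of magnitude larger than that from LAI 6 to 8.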
More importantly, transitions between crop growth stages alter canopy structure and induce temporal shifts in water and nutrient responses. During the vegetative stage, nitrogen demand dominates, and reflectance in the red-edge (700–750 nm) and green bands is highly sensitive to nitrogen variation [140]. In contrast, during the reproductive stage, water becomes the primary limiting factor, and TIR signals as well as NIR reflectance show stronger responses [141]. These stage-dependent spectral sensitivities challenge the development of universally applicable models for water and nutrient inversion.

4.2. The Impact of Scale Effects and Spatial Resolution

The monitoring complexity introduced by variations in vegetation coverage and growth stages is further compounded by the inherent characteristics of UAV-based remote sensing. While UAVs offer centimeter-level spatial resolution for fine-scale agricultural monitoring, this precision also leads to a “scale mismatch” problem when applied to field-level management practices. Specifically, the spatial scale of data acquisition often does not align with the scale of agricultural decision-making, resulting in scale effects. These effects compromise the accuracy of parameter inversion and limit the generalization and scalability of models for regional applications.
The mixed pixel problem remains prevalent even under high-resolution conditions. Although UAV-based imagery can achieve spatial resolutions of 1–10 cm, a single pixel may still contain multiple ground components, such as vegetation, bare soil, shadows, and water bodies, resulting in spectral mixing and distortion of water or nutrient indices [142]. In the optical domain, reflectance differences among surface features are substantial. In fields with low canopy cover or wide inter-row spacing, the combined reflectance of soil and vegetation can easily lead to misestimation of water-related indices [143]. Spatial heterogeneity is especially prominent at the scale of management units (e.g., plots or fields), where crop water status or nutrient availability often shows patchy, non-uniform distribution. Such local stress may result from factors like groundwater variation or uneven fertilizer application [144]. At low spatial resolution (e.g., >1 m), pixel averaging may obscure localized stress signals, resulting in missed detections. Although high-resolution imagery enables detailed observation, the challenge remains one of effectively aggregating such fine-scale information into actionable insights at the management scale.
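How pixel averaging masks a localized stress signal can be demonstrated with a synthetic NDVI map. The values below (healthy canopy at 0.85, a stressed patch at 0.45) are illustrative assumptions, not measured data.

```python
import numpy as np

# Synthetic 40 x 40 centimeter-scale NDVI map: healthy canopy (~0.85)
# containing an 8 x 8 stressed patch (~0.45).
ndvi = np.full((40, 40), 0.85)
ndvi[16:24, 16:24] = 0.45

def block_average(img, block):
    """Aggregate a fine-resolution image to coarser pixels by block averaging."""
    h, w = img.shape
    return img.reshape(h // block, block, w // block, block).mean(axis=(1, 3))

coarse = block_average(ndvi, 40)   # one coarse "field-scale" pixel
fine_min = float(ndvi.min())       # 0.45 -> stress clearly detectable
coarse_val = float(coarse[0, 0])   # ~0.834 -> stress signal averaged away
```

The stressed patch covers 64 of 1600 fine pixels, so the coarse pixel drops only from 0.85 to about 0.834, well within normal canopy variability; the same averaging logic applies when centimeter-scale UAV maps are resampled to management-unit grids.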

4.3. Effects of Illumination, Meteorology, and Other Environmental Factors

In addition to crop characteristics and observational scale, variations in illumination and atmosphere are major external factors contributing to data uncertainty and inversion errors in field-scale remote sensing. UAV-based remote sensing primarily relies on passive optical and TIR sensors to capture surface reflectance and thermal radiation, making its observations highly susceptible to interference from factors such as solar angle, wind, and atmospheric humidity [145,146]. These environmental variables fluctuate with time of day and weather and, if left uncorrected, can lead to systematic misinterpretation in crop water and nutrient monitoring. Specifically, changes in solar elevation directly influence the amount of incoming radiation received by optical and thermal sensors. During early morning and late afternoon, the low solar angle increases shadow coverage between surface features, particularly in ridged farmland, which causes diurnal fluctuations in VIs and compromises the stability of physiological assessments of crops [147].
Similarly, variations in wind speed, air temperature, and humidity directly affect TIR remote sensing. TIR sensors are widely used to monitor canopy temperature, serving as a critical tool for estimating evapotranspiration (ET) and diagnosing water stress. When wind speed exceeds 2 m/s, increased canopy turbulence and air mixing reduce the sensitivity of thermal imagery to water stress conditions [148]. Therefore, concurrent monitoring and correction of environmental parameters are essential for ensuring the reliability of remote sensing assessments.
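Canopy-temperature-based water stress diagnosis typically uses the empirical Crop Water Stress Index (CWSI), which normalizes canopy temperature between wet (non-stressed) and dry (fully stressed) reference temperatures. The sketch below uses the basic baseline form with illustrative temperatures; the reference values in practice depend on the very meteorological variables (wind, humidity, radiation) discussed above, which is why concurrent environmental monitoring matters.

```python
def cwsi(t_canopy, t_wet, t_dry):
    """Empirical Crop Water Stress Index:
    0 = unstressed (canopy at wet reference), 1 = fully stressed (dry reference).
    t_wet and t_dry are wet/dry reference temperatures in the same units."""
    if t_dry <= t_wet:
        raise ValueError("t_dry must exceed t_wet")
    raw = (t_canopy - t_wet) / (t_dry - t_wet)
    return max(0.0, min(1.0, raw))  # clamp measurement noise to [0, 1]

# Illustrative values: canopy at 31 C, references at 26 C (wet) / 36 C (dry).
stress = cwsi(31.0, 26.0, 36.0)   # -> 0.5, moderate water stress
```

Under windy conditions (>2 m/s), turbulent mixing narrows the gap between canopy temperature and air temperature, compressing the usable range between the wet and dry references and reducing the index's sensitivity, consistent with [148].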

5. Challenges and Future Prospects for UAV-Based Monitoring of Crop Water and Nutrient Status

With advances in agricultural remote sensing systems, UAV platforms and AI algorithms have been widely applied to monitor crop water and nutrient status. However, several technical and application bottlenecks persist in practical implementation. These challenges span from sensor performance at the hardware level to the efficiency of data processing workflows and the adaptability and interpretability of monitoring models. To better support precision field management, it is imperative to systematically identify and address the following limitations.

5.1. Current Challenges

(1)
Delayed development of proximal remote sensing sensors and insufficient specialization and adaptability. At present, spectral sensors used for crop water and nutrient monitoring are still dominated by general-purpose multispectral or TIR devices, lacking wavelength configurations and structural designs optimized for agricultural scenarios. For example, high-resolution sensors tailored to key spectral bands sensitive to crop water and nutrient status have not yet been developed, making it difficult to detect subtle changes in crop physiological conditions. In addition, under complex field conditions such as high temperature, humidity, and wind, current sensors often suffer from limited stability and adaptability, which restricts both data quality and monitoring frequency.
(2)
Lengthy data processing chains with limited real-time performance and intelligence. Although UAV-based remote sensing platforms offer high-resolution observation capabilities, the massive volume and diversity of the acquired imagery still require complex and time-consuming post-processing workflows. These steps include image mosaicking, radiometric correction, and parameter inversion, which are often labor-intensive and heavily dependent on manual intervention. These workflows hinder timely analysis, reduce data utilization efficiency, and fail to meet the rapid response demands of agricultural decision-making or support high-frequency, dynamic monitoring tasks.
(3)
Insufficient depth in multi-source data fusion and underutilization of spatial information. Current research efforts are largely focused on processing data from a single platform, with limited integration of remote sensing information across multiple platforms—including satellites, UAVs, and ground-based sensors—and modalities such as optical, TIR, and radar. Particularly, systematic fusion methods are lacking in areas such as scale transformation, temporal gap-filling, and canopy structural reconstruction. This limits the ability to capture in-field variability and support regional-scale decision-making, constraining the spatial adaptability of precision agriculture.
(4)
Significant environmental interference in remote sensing inversion leads to high data uncertainty. UAV-based remote sensing primarily relies on passive sensors to capture surface reflectance, making it highly susceptible to solar angle, wind speed, atmospheric humidity, and cloud cover. During the early growth stages, strong soil background signals may obscure crop spectral features, while in later stages, dense canopy overlap can result in spectral saturation. Moreover, shadow occlusion and terrain variation further challenge radiometric consistency. Without correction, these factors can compromise the stability and reliability of model outputs.
(5)
Limited model generalization, cross-regional adaptability, and interpretability. Most existing models rely heavily on locally trained datasets, making them difficult to generalize across different crops, regions, seasons, and management practices. In complex agricultural environments, these models are prone to overfitting and transfer failures. Although DL-based “black-box” models often achieve high accuracy, they lack explicit physiological or physical interpretability, undermining their credibility in intelligent field diagnostics. This limitation hinders their practical deployment and scalability in real-world agricultural management.
(6)
Practical barriers in real-world adoption. Despite the rapid development of UAV-based remote sensing technologies, their widespread adoption in agricultural practice faces several practical barriers. High initial investment costs for UAV platforms and sensors, as well as the need for trained personnel to operate and maintain these systems, often limit their accessibility for smallholder farmers. In addition, regulatory restrictions—such as flight permits, operational safety requirements, and data privacy concerns—pose institutional challenges, especially in regions with evolving UAV policies. These factors hinder the scalability and routine use of UAV-based diagnostics, emphasizing the need for cost-effective solutions, simplified user interfaces, and policy support frameworks to facilitate broader implementation.

5.2. Future Prospects

(1)
Development of application-specific sensors and edge-intelligent devices for agricultural scenarios. To address the limitations of general-purpose sensors in crop monitoring, future efforts will focus on developing sensor modules designed to capture crop-sensitive spectral bands related to water and nutrient status. These sensors will be optimized for lightweight design, low power consumption, and enhanced resistance to field interference. Moreover, smart terminals with edge computing modules—such as UAVs or in-field sensor nodes—will enable real-time data processing, including image stitching, VI computation, and key region extraction during flight. Only essential information will be transmitted to the cloud, thereby reducing bandwidth demand and latency. This approach ensures stable, high-frequency sensing data to support in-field diagnosis of crop water and nutrient status.
(2)
Establishing an edge–cloud collaborative architecture to enhance processing efficiency and intelligence. Leveraging 5G communication and AI algorithms, a new dual-layer architecture can be established for UAV-based remote sensing that integrates data acquisition, transmission, and processing. This architecture consists of edge-level preprocessing and cloud-based deep analysis. Onboard UAV systems will handle basic tasks such as noise reduction, VI computation, and target segmentation, while high-complexity operations—such as radiometric correction, inversion modeling, and large-scale data analysis—are performed in the cloud. This division improves processing efficiency, enables near real-time diagnosis, and promotes the transformation of UAV-based remote sensing from a “data acquisition tool” to an “intelligent decision-making system.”
(3)
Multi-scale collaborative remote sensing and 3D data fusion to enhance spatial awareness and decision support. Integrating UAVs, satellites, and ground-based platforms enables unified crop monitoring across centimeter- to kilometer-scale resolutions. Combining LiDAR and multispectral imagery, high-precision 3D canopy models can be generated, allowing joint feature extraction of crop height, canopy structure, and spectral responses. Based on these models, AI models can automatically detect spatially heterogeneous regions, such as zones of water stress or uneven fertilization. These outputs support precision fertilization and site-specific irrigation, improving the efficiency and operability of precision agriculture.
(4)
Developing dynamic correction mechanisms to improve environmental adaptability and reduce remote sensing uncertainty. To address the common environmental interferences in remote sensing inversion, adaptive models and correction frameworks should be developed, including coverage-adaptive models, physiology–spectrum coupled models, and real-time environmental correction systems. For instance, in the early growth stage, soil reflectance modeling can reduce spectral contamination, while in the grain-filling stage, multi-angle imaging can alleviate saturation effects. Additionally, integrating real-time meteorological variables—such as wind speed, humidity, and solar radiation—can guide radiometric correction algorithms to perform shadow compensation, angle normalization, and wind–temperature coupling adjustment, improving data consistency and reliability across time and space.
(5)
Integrating physical mechanisms with AI for interpretable models to improve generalization and robustness. To overcome limitations in model transferability and interpretability, a “pretraining–fine-tuning” paradigm can be adopted. Large-scale, multi-crop datasets can build generalized base models, which are then rapidly adapted to specific regions or crops using small local samples. Furthermore, integrating physical priors with neural networks—such as embedding radiative transfer models (e.g., PROSAIL) into the architecture—enables the incorporation of physical constraints during training. This hybrid approach enhances model stability, interpretability, and data efficiency, facilitating a transition from opaque “black-box” systems to transparent, knowledge- and data-driven solutions in agricultural remote sensing.
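The "pretraining–fine-tuning" paradigm described above can be sketched with a small neural network: pretrain on a large (here, synthetic) dataset standing in for multi-crop or RTM-simulated samples, then continue training on a handful of local samples whose response is shifted by a domain gap. Everything below is a toy illustration under stated assumptions, not an operational model; the linear response and the +0.5 shift are arbitrary choices.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)

# "Pretraining": a large base dataset (e.g., multi-crop or simulated
# spectra-to-trait pairs); here a toy linear response with noise.
X_sim = rng.uniform(0.0, 1.0, (3000, 4))
y_sim = 5.0 * X_sim[:, 0] - 2.0 * X_sim[:, 1] + rng.normal(0.0, 0.1, 3000)
model = MLPRegressor(hidden_layer_sizes=(32,), max_iter=600, random_state=0)
model.fit(X_sim, y_sim)

# Local target domain: same features, but the response is shifted (+0.5),
# mimicking a cross-region or cross-crop domain gap.
X_loc = rng.uniform(0.0, 1.0, (80, 4))
y_loc = 5.0 * X_loc[:, 0] - 2.0 * X_loc[:, 1] + 0.5 + rng.normal(0.0, 0.1, 80)
X_test = rng.uniform(0.0, 1.0, (200, 4))
y_test = 5.0 * X_test[:, 0] - 2.0 * X_test[:, 1] + 0.5

err_before = float(np.sqrt(np.mean((model.predict(X_test) - y_test) ** 2)))

# "Fine-tuning": continue training on the small local sample only.
for _ in range(300):
    model.partial_fit(X_loc, y_loc)
err_after = float(np.sqrt(np.mean((model.predict(X_test) - y_test) ** 2)))
```

Embedding physical priors (e.g., PROSAIL as a differentiable layer or as a simulator of pretraining data, as in [134]) goes beyond this sketch but follows the same pattern: a generalized base model adapted with few local samples.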
(6)
Promoting practical deployment through cost-effective design, policy support, and user-friendly tools. To overcome real-world barriers, future research and development should prioritize cost-effective UAV systems and simplified operational workflows tailored for agricultural end-users. Developing modular, low-cost UAV platforms with plug-and-play sensors can reduce entry barriers for small and medium-sized farms. In parallel, intuitive software interfaces and semi-automated workflows will lower the technical threshold for non-expert users. Moreover, establishing supportive regulatory frameworks and providing training programs or service outsourcing models will facilitate broader and safer UAV adoption in agricultural practice, bridging the gap between research innovations and field-level implementation.

Author Contributions

Conceptualization, J.C. and Z.Z.; investigation, H.L., X.B. and L.Q.; writing—original draft preparation, X.Y., X.L. and Y.L.; writing—review and editing, J.C., Z.Z. and X.Y.; funding acquisition, J.C. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Natural Science Foundation of China, grant number 52279047.

Conflicts of Interest

The authors declare that they have no known competing financial interests or personal relationships that could appear to influence the work reported in this paper.

References

  1. Katsoulas, N.; Elvanidi, A.; Ferentinos, K.P.; Kacira, M.; Bartzanas, T.; Kittas, C. Crop reflectance monitoring as a tool for water stress detection in greenhouses: A review. Biosyst. Eng. 2016, 151, 374–398. [Google Scholar] [CrossRef]
  2. Zhang, Y.; Xiao, J.; Yan, K.; Lu, X.; Li, W.; Tian, H.; Wang, L.; Deng, J.; Lan, Y. Advances and Developments in Monitoring and Inversion of the Biochemical Information of Crop Nutrients Based on Hyperspectral Technology. Agronomy 2023, 13, 2163. [Google Scholar] [CrossRef]
  3. Hunt, E.; Daughtry, C. What good are unmanned aircraft systems for agricultural remote sensing and precision agriculture? Int. J. Remote Sens. 2018, 39, 5345–5376. [Google Scholar] [CrossRef]
  4. Maes, W.H.; Steppe, K. Perspectives for Remote Sensing with Unmanned Aerial Vehicles in Precision Agriculture. Trends Plant Sci. 2019, 24, 152–164. [Google Scholar] [CrossRef] [PubMed]
  5. Wang, J.; Zhang, S.; Lizaga, I.; Zhang, Y.; Ge, X.; Zhang, Z.; Zhang, W.; Huang, Q.; Hu, Z. UAS-based remote sensing for agricultural Monitoring: Current status and perspectives. Comput. Electron. Agric. 2024, 227, 109501. [Google Scholar] [CrossRef]
  6. García-Berná, J.; Ouhbi, S.; Benmouna, B.; García-Mateos, G.; Fernández-Alemán, J.; Molina-Martínez, J. Systematic Mapping Study on Remote Sensing in Agriculture. Appl. Sci. 2020, 10, 3456. [Google Scholar] [CrossRef]
  7. Dong, H.; Dong, J.; Sun, S.; Bai, T.; Zhao, D.; Yin, Y.; Shen, X.; Wang, Y.; Zhang, Z.; Wang, Y. Crop water stress detection based on UAV remote sensing systems. Agric. Water Manag. 2024, 303, 109059. [Google Scholar] [CrossRef]
  8. Zhu, H.; Lin, C.; Liu, G.; Wang, D.; Qin, S.; Li, A.; Xu, J.L.; He, Y. Intelligent agriculture: Deep learning in UAV-based remote sensing imagery for crop diseases and pests detection. Front. Plant Sci. 2024, 15, 1435016. [Google Scholar] [CrossRef] [PubMed]
  9. Fu, Y.Y.; Yang, G.J.; Pu, R.L.; Li, Z.H.; Li, H.L.; Xu, X.G.; Song, X.Y.; Yang, X.D.; Zhao, C.J. An overview of crop nitrogen status assessment using hyperspectral remote sensing: Current status and perspectives. Eur. J. Agron. 2021, 124, 126241. [Google Scholar] [CrossRef]
  10. Yang, G.J.; Liu, J.G.; Zhao, C.J.; Li, Z.H.; Huang, Y.B.; Yu, H.Y.; Xu, B.; Yang, X.D.; Zhu, D.M.; Zhang, X.Y.; et al. Unmanned Aerial Vehicle Remote Sensing for Field-Based Crop Phenotyping: Current Status and Perspectives. Front. Plant Sci. 2017, 8, 1111. [Google Scholar] [CrossRef]
  11. Karthikeyan, L.; Chawla, I.; Mishra, A.K. A review of remote sensing applications in agriculture for food security: Crop growth and yield, irrigation, and crop losses. J. Hydrol. 2020, 586, 124905. [Google Scholar] [CrossRef]
  12. Wu, B.F.; Zhang, M.; Zeng, H.W.; Tian, F.Y.; Potgieter, A.B.; Qin, X.L.; Yan, N.N.; Chang, S.; Zhao, Y.; Dong, Q.H.; et al. Challenges and opportunities in remote sensing-based crop monitoring: A review. Natl. Sci. Rev. 2023, 10, nwac290. [Google Scholar] [CrossRef] [PubMed]
  13. Virnodkar, S.S.; Pachghare, V.K.; Patil, V.C.; Jha, S.K. Remote sensing and machine learning for crop water stress determination in various crops: A critical review. Precis. Agric. 2020, 21, 1121–1155. [Google Scholar] [CrossRef]
  14. Liu, S.; Cheng, J.; Liang, L.; Bai, H.; Dang, W. Light-Weight Semantic Segmentation Network for UAV Remote Sensing Images. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2021, 14, 8287–8296. [Google Scholar] [CrossRef]
  15. Vergouw, B.; Nagel, H.; Bondt, G.; Custers, B. Drone Technology: Types, Payloads, Applications, Frequency Spectrum Issues and Future Developments. In The Future of Drone Use: Opportunities and Threats from Ethical and Legal Perspectives; Custers, B., Ed.; T.M.C. Asser Press: The Hague, The Netherlands, 2016; pp. 21–45. [Google Scholar]
  16. Sun, G.; Hu, T.; Chen, S.; Sun, J.; Zhang, J.; Ye, R.; Zhang, S.; Liu, J. Using UAV-based multispectral remote sensing imagery combined with DRIS method to diagnose leaf nitrogen nutrition status in a fertigated apple orchard. Precis. Agric. 2023, 24, 2522–2548. [Google Scholar] [CrossRef]
  17. Lazarević, B.; Carović-Stanko, K.; Safner, T.; Poljak, M. Study of High-Temperature-Induced Morphological and Physiological Changes in Potato Using Nondestructive Plant Phenotyping. Plants 2022, 11, 3534. [Google Scholar] [CrossRef]
  18. Ahmad, U.; Alvino, A.; Marino, S. A Review of Crop Water Stress Assessment Using Remote Sensing. Remote Sens. 2021, 13, 4155. [Google Scholar] [CrossRef]
  19. Hosseinpour-Zarnaq, M.; Omid, M.; Sarmadian, F.; Ghasemi-Mobtaker, H.; Alimardani, R.; Bohlol, P. Exploring the capabilities of hyperspectral remote sensing for soil texture evaluation. Ecol. Inform. 2025, 90, 103336. [Google Scholar] [CrossRef]
  20. Raj, R.; Walker, J.P.; Pingale, R.; Banoth, B.N.; Jagarlapudi, A. Leaf nitrogen content estimation using top-of-canopy airborne hyperspectral data. Int. J. Appl. Earth Obs. Geoinf. 2021, 104, 102584. [Google Scholar] [CrossRef]
  21. Surase, R.R.; Kale, K.V.; Varpe, A.B.; Vibhute, A.D.; Gite, H.R.; Solankar, M.M.; Gaikwad, S.; Nalawade, D.B. Estimation of Water Contents from Vegetation Using Hyperspectral Indices; Springer: Singapore, 2019; pp. 247–255. [Google Scholar]
  22. Kumar, A.; Tripathi, R.P. Thermal Infrared Radiation for Assessing Crop Water Stress in Wheat. J. Agron. Crop Sci. 1990, 164, 268–272. [Google Scholar] [CrossRef]
  23. Mangus, D.L.; Sharda, A.; Zhang, N. Development and evaluation of thermal infrared imaging system for high spatial and temporal resolution crop water stress monitoring of corn within a greenhouse. Comput. Electron. Agric. 2016, 121, 149–159. [Google Scholar] [CrossRef]
  24. Eweys, O.A.; Elwan, A.; Borham, T.I. Retrieving topsoil moisture using RADARSAT-2 data, a novel approach applied at the east of the Netherlands. J. Hydrol. 2017, 555, 670–682. [Google Scholar] [CrossRef]
  25. Honkavaara, E.; Saari, H.; Kaivosoja, J.; Pölönen, I.; Hakala, T.; Litkey, P.; Mäkynen, J.; Pesonen, L. Processing and Assessment of Spectrometric, Stereoscopic Imagery Collected Using a Lightweight UAV Spectral Camera for Precision Agriculture. Remote Sens. 2013, 5, 5006–5039. [Google Scholar] [CrossRef]
  26. Tuominen, S.; Pekkarinen, A. Local radiometric correction of digital aerial photographs for multi source forest inventory. Remote Sens. Environ. 2004, 89, 72–82. [Google Scholar] [CrossRef]
  27. Weinreb, M.P.; Fleming, H.E. Empirical radiance corrections: A technique to improve satellite soundings of atmospheric temperature. Geophys. Res. Lett. 1974, 1, 298–301. [Google Scholar] [CrossRef]
  28. Hall, F.G.; Strebel, D.E.; Nickeson, J.E.; Goetz, S.J. Radiometric rectification: Toward a common radiometric response among multidate, multisensor images. Remote Sens. Environ. 1991, 35, 11–27. [Google Scholar] [CrossRef]
  29. Honkavaara, E.; Khoramshahi, E. Radiometric Correction of Close-Range Spectral Image Blocks Captured Using an Unmanned Aerial Vehicle with a Radiometric Block Adjustment. Remote Sens. 2018, 10, 256. [Google Scholar] [CrossRef]
  30. Luo, S.; Jiang, X.; Yang, K.; Li, Y.; Fang, S. Multispectral remote sensing for accurate acquisition of rice phenotypes: Impacts of radiometric calibration and unmanned aerial vehicle flying altitudes. Front. Plant Sci. 2022, 13, 958106. [Google Scholar] [CrossRef]
  31. Wang, Y.; Kootstra, G.; Yang, Z.; Khan, H.A. UAV multispectral remote sensing for agriculture: A comparative study of radiometric correction methods under varying illumination conditions. Biosyst. Eng. 2024, 248, 240–254. [Google Scholar] [CrossRef]
  32. Rosas, J.T.F.; de Carvalho Pinto, F.D.A.; Queiroz, D.M.D.; de Melo Villar, F.M.; Martins, R.N.; Silva, S.D.A. Low-cost system for radiometric calibration of UAV-based multispectral imagery. J. Spat. Sci. 2022, 67, 395–409. [Google Scholar] [CrossRef]
  33. Aragon, B.; Johansen, K.; Parkes, S.; Malbeteau, Y.; Al-Mashharawi, S.; Al-Amoudi, T.; Andrade, C.F.; Turner, D.; Lucieer, A.; McCabe, M.F. A Calibration Procedure for Field and UAV-Based Uncooled Thermal Infrared Instruments. Sensors 2020, 20, 3316. [Google Scholar] [CrossRef]
  34. Elfarkh, J.; Johansen, K.; Angulo, V.; Camargo, O.L.; McCabe, M.F. Quantifying Within-Flight Variation in Land Surface Temperature from a UAV-Based Thermal Infrared Camera. Drones 2023, 7, 617. [Google Scholar] [CrossRef]
  35. Wang, Z.; Zhou, J.; Ma, J.; Wang, Y.; Liu, S.; Ding, L.; Tang, W.; Pakezhamu, N.; Meng, L. Removing temperature drift and temporal variation in thermal infrared images of a UAV uncooled thermal infrared imager. ISPRS J. Photogramm. 2023, 203, 392–411. [Google Scholar] [CrossRef]
  36. Jakob, S.; Zimmermann, R.; Gloaguen, R. The Need for Accurate Geometric and Radiometric Corrections of Drone-Borne Hyperspectral Data for Mineral Exploration: MEPHySTo—A Toolbox for Pre-Processing Drone-Borne Hyperspectral Data. Remote Sens. 2017, 9, 88. [Google Scholar] [CrossRef]
  37. Song, L.; Li, H.; Chen, T.; Chen, J.; Liu, S.; Fan, J.; Wang, Q. An Integrated Solution of UAV Push-Broom Hyperspectral System Based on Geometric Correction with MSI and Radiation Correction Considering Outdoor Illumination Variation. Remote Sens. 2022, 14, 6267. [Google Scholar] [CrossRef]
  38. Zhao, J.; Zhang, X.; Gao, C.; Qiu, X.; Tian, Y.; Zhu, Y.; Cao, W. Rapid Mosaicking of Unmanned Aerial Vehicle (UAV) Images for Crop Growth Monitoring Using the SIFT Algorithm. Remote Sens. 2019, 11, 1226. [Google Scholar] [CrossRef]
  39. Ren, X.; Sun, M.; Zhang, X.; Liu, L. A Simplified Method for UAV Multispectral Images Mosaicking. Remote Sens. 2017, 9, 962. [Google Scholar] [CrossRef]
  40. Yang, Z.; Pu, F.; Chen, H.; He, Y.; Xu, X. IBEWMS: Individual Band Spectral Feature Enhancement Based Waterfront Environment UAV Multispectral Image Stitching. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2024, 18, 221–240. [Google Scholar] [CrossRef]
  41. Xu, J.; Zhao, D.; Ren, Z.; Fu, F.; Sun, Y.; Fang, M. A Parallax Image Mosaic Method for Low Altitude Aerial Photography with Artifact and Distortion Suppression. J. Imaging 2022, 9, 5. [Google Scholar] [CrossRef]
  42. Mo, Y.; Kang, X.; Duan, P.; Li, S. A Robust UAV Hyperspectral Image Stitching Method Based on Deep Feature Matching. IEEE Trans. Geosci. Remote Sens. 2021, 60, 1–14. [Google Scholar] [CrossRef]
  43. Turner, D.; Lucieer, A.; Malenovský, Z.; King, D.; Robinson, S. Spatial Co-Registration of Ultra-High Resolution Visible, Multispectral and Thermal Images Acquired with a Micro-UAV over Antarctic Moss Beds. Remote Sens. 2014, 6, 4003–4024. [Google Scholar] [CrossRef]
  44. Kapil, R.; Castilla, G.; Marvasti-Zadeh, S.M.; Goodsman, D.; Erbilgin, N.; Ray, N. Orthomosaicking Thermal Drone Images of Forests via Simultaneously Acquired RGB Images. Remote Sens. 2023, 15, 2653. [Google Scholar] [CrossRef]
  45. Bosilj, P.; Duckett, T.; Cielniak, G. Connected attribute morphology for unified vegetation segmentation and classification in precision agriculture. Comput. Ind. 2018, 98, 226–240. [Google Scholar] [CrossRef] [PubMed]
  46. Wu, J.; Liu, C.; Ouyang, A.; Li, B.; Chen, N.; Wang, J.; Liu, Y.d. Early Detection of Slight Bruises in Yellow Peaches (Amygdalus persica) Using Multispectral Structured-Illumination Reflectance Imaging and an Improved Ostu Method. Foods 2024, 13, 3843. [Google Scholar] [CrossRef]
  47. Singh, M.P.; Gayathri, V.; Chaudhuri, D. A Simple Data Preprocessing and Postprocessing Techniques for SVM Classifier of Remote Sensing Multispectral Image Classification. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2022, 15, 7248–7262. [Google Scholar] [CrossRef]
  48. Wang, Y.; Lv, J.; Xu, L.; Gu, Y.; Zou, L.; Ma, Z. A segmentation method for waxberry image under orchard environment. Sci. Hortic. 2020, 266, 109309. [Google Scholar] [CrossRef]
  49. Ren, C.N.; Liu, B.; Liang, Z.; Lin, Z.L.; Wang, W.; Wei, X.Z.; Li, X.J.; Zou, X.J. An Innovative Method of Monitoring Cotton Aphid Infestation Based on Data Fusion and Multi-Source Remote Sensing Using Unmanned Aerial Vehicles. Drones 2025, 9, 229. [Google Scholar] [CrossRef]
  50. Wen, H.; Hu, X.K.; Zhong, P. Detecting rice straw burning based on infrared and visible information fusion with UAV remote sensing. Comput. Electron. Agric. 2024, 222, 109078. [Google Scholar] [CrossRef]
  51. Wu, B.; Fan, L.Q.; Xu, B.W.; Yang, J.J.; Zhao, R.M.; Wang, Q.; Ai, X.T.; Zhao, H.X.; Yang, Z.R. UAV-based LiDAR and multispectral sensors fusion for cotton yield estimation: Plant height and leaf chlorophyll content as a bridge linking remote sensing data to yield. Ind. Crops Prod. 2025, 230, 121110. [Google Scholar] [CrossRef]
  52. Xian, G.L.; Liu, J.A.; Lin, Y.X.; Li, S.; Bian, C.S. Multi-Feature Fusion for Estimating Above-Ground Biomass of Potato by UAV Remote Sensing. Plants 2024, 13, 3356. [Google Scholar] [CrossRef] [PubMed]
  53. Veloso, A.; Mérmoz, S.; Bouvet, A.; Toan, T.; Planells, M.; Dejoux, J.; Ceschia, E. Understanding the temporal behavior of crops using Sentinel-1 and Sentinel-2-like data for agricultural applications. Remote Sens. Environ. 2017, 199, 415–426. [Google Scholar] [CrossRef]
  54. Wu, Z.J.; Cui, N.B.; Zhang, W.J.; Yang, Y.A.; Gong, D.Z.; Liu, Q.S.; Zhao, L.; Xing, L.W.; He, Q.Y.; Zhu, S.D.; et al. Estimation of soil moisture in drip-irrigated citrus orchards using multi-modal UAV remote sensing. Agric. Water Manag. 2024, 302, 108972. [Google Scholar] [CrossRef]
  55. Song, Z.S.; Zhang, Z.T.; Yang, S.Q.; Ding, D.Y.; Ning, J.F. Identifying sunflower lodging based on image fusion and deep semantic segmentation with UAV remote sensing imaging. Comput. Electron. Agric. 2020, 179, 105812. [Google Scholar] [CrossRef]
  56. Pinder, J.E.; McLeod, K.W. Indications of Relative Drought Stress in Longleaf Pine from Thematic Mapper Data. Photogramm. Eng. Remote Sens. 1999, 65, 495–501. [Google Scholar]
  57. Roujean, J.-L.; Breon, F.-M. Estimating PAR absorbed by vegetation from bidirectional reflectance measurements. Remote Sens. Environ. 1995, 51, 375–384. [Google Scholar] [CrossRef]
  58. Huete, A. A comparison of vegetation indices over a global set of TM images for EOS-MODIS. Remote Sens. Environ. 1997, 59, 440–451. [Google Scholar] [CrossRef]
  59. Woebbecke, D.M.; Meyer, G.E.; Von Bargen, K.; Mortensen, D.A. Color Indices for Weed Identification Under Various Soil, Residue, and Lighting Conditions. Trans. ASAE 1995, 38, 259–269. [Google Scholar] [CrossRef]
  60. Gitelson, A.A.; Kaufman, Y.J.; Merzlyak, M.N. Use of a green channel in remote sensing of global vegetation from EOS-MODIS. Remote Sens. Environ. 1996, 58, 289–298. [Google Scholar] [CrossRef]
  61. Wilson, E.H.; Sader, S.A. Detection of forest harvest type using multiple dates of Landsat TM imagery. Remote Sens. Environ. 2002, 80, 385–396. [Google Scholar] [CrossRef]
  62. Pettorelli, N.; Vik, J.; Mysterud, A.; Gaillard, J.; Tucker, C.; Stenseth, N. Using the satellite-derived NDVI to assess ecological responses to environmental change. Trends Ecol. Evol. 2005, 20, 503–510. [Google Scholar] [CrossRef]
  63. Zeng, Y.; Hao, D.; Badgley, G.; Damm, A.; Rascher, U.; Ryu, Y.; Johnson, J.; Krieger, V.; Wu, S.; Qiu, H.; et al. Estimating near-infrared reflectance of vegetation from hyperspectral data. Remote Sens. Environ. 2021, 267, 112723. [Google Scholar] [CrossRef]
  64. Rondeaux, G.; Steven, M.; Baret, F. Optimization of soil-adjusted vegetation indices. Remote Sens. Environ. 1996, 55, 95–107. [Google Scholar] [CrossRef]
  65. Birth, G.S.; McVey, G.R. Measuring the Color of Growing Turf with a Reflectance Spectrophotometer. Agron. J. 1968, 60, 640–643. [Google Scholar] [CrossRef]
  66. Huete, A.R. A soil-adjusted vegetation index (SAVI). Remote Sens. Environ. 1988, 25, 295–309. [Google Scholar] [CrossRef]
  67. Bannari, A.; Asalhi, H.; Teillet, P.M. Transformed difference vegetation index (TDVI) for vegetation cover mapping. In Proceedings of the IEEE International Geoscience and Remote Sensing Symposium, Toronto, ON, Canada, 24–28 June 2002; Volume 3055, pp. 3053–3055. [Google Scholar]
  68. Rouse, J.W., Jr.; Haas, R.H.; Schell, J.A.; Deering, D.W. Monitoring Vegetation Systems in the Great Plains with Erts. In NASA Special Publication; Freden, S.C., Mercanti, E.P., Becker, M.A., Eds.; National Aeronautics and Space Administration: Washington, DC, USA, 1974; Volume 351, p. 309. [Google Scholar]
  69. Nguyen, C.T.; Chidthaisong, A.; Diem, P.K.; Huo, L. A Modified Bare Soil Index to Identify Bare Land Features during Agricultural Fallow-Period in Southeast Asia Using Landsat 8. Land 2021, 10, 231. [Google Scholar] [CrossRef]
  70. Decsi, K.; Kutasy, B.; Hegedűs, G.; Alföldi, Z.P.; Kálmán, N.; Nagy, Á.; Virág, E. Natural immunity stimulation using ELICE16INDURES® plant conditioner in field culture of soybean. Heliyon 2023, 9, e12907. [Google Scholar] [CrossRef]
  71. Gao, B.-C. NDWI—A normalized difference water index for remote sensing of vegetation liquid water from space. Remote Sens. Environ. 1996, 58, 257–266. [Google Scholar] [CrossRef]
  72. Wang, L.; Qu, J.J. NMDI: A normalized multi-band drought index for monitoring soil and vegetation moisture with satellite remote sensing. Geophys. Res. Lett. 2007, 34, L20405. [Google Scholar] [CrossRef]
  73. Gao, W.; Gao, Z.; Niu, Z.; Roussos, P.; Sygrimis, N.; Zhang, D.; Li, M. Lightweighting of kiwifruit root soil water content inversion model based on novel vegetation indices. Smart Agric. Technol. 2025, 11, 100995. [Google Scholar] [CrossRef]
  74. Tanner, C.B. Plant Temperatures. Agron. J. 1963, 55, 210–211. [Google Scholar] [CrossRef]
  75. Idso, S.; Jackson, R.D.; Pinter, P.; Reginato, R.; Hatfield, J. Normalizing the stress-degree-day parameter for environmental variability. Agric. Meteorol. 1981, 24, 45–55. [Google Scholar] [CrossRef]
  76. Jackson, R.D.; Idso, S.B.; Reginato, R.J.; Pinter, P.J. Canopy temperature as a crop water stress indicator. Water Resour. Res. 1981, 17, 1133–1138. [Google Scholar] [CrossRef]
  77. Xu, J.; Zhao, Y.; Liu, Z. Research on Ecological Environment Change of Middle and Western Inner-Mongolia Region Using RS and GIS. Natl. Remote Sens. Bull. 2002, 6, 142–149. [Google Scholar] [CrossRef]
  78. Sandholt, I.; Rasmussen, K.; Andersen, J.A. A simple interpretation of the surface temperature/vegetation index space for assessment of surface moisture status. Remote Sens. Environ. 2002, 79, 213–224. [Google Scholar] [CrossRef]
  79. Zhang, R.; Bao, X.; Hong, R.; He, X.; Yin, G.; Chen, J.; Ouyang, X.; Wang, Y.; Liu, G. Soil moisture retrieval over croplands using novel dual-polarization SAR vegetation index. Agric. Water Manag. 2024, 306, 109159. [Google Scholar] [CrossRef]
  80. Rouse, J.; Haas, R.H.; Deering, D.; Schell, J.A.; Harlan, J. Monitoring the Vernal Advancement and Retrogradation (Green Wave Effect) of Natural Vegetation. [Great Plains Corridor]; National Aeronautics and Space Administration: Washington, DC, USA, 1973.
  81. Tian, Y.C.; Yao, X.; Yang, J.; Cao, W.X.; Hannaway, D.B.; Zhu, Y. Assessing newly developed and published vegetation indices for estimating rice leaf nitrogen concentration with ground- and space-based hyperspectral reflectance. Field Crops Res. 2011, 120, 299–310. [Google Scholar] [CrossRef]
  82. Dash, J.; Curran, P.J. The MERIS terrestrial chlorophyll index. Int. J. Remote Sens. 2004, 25, 5403–5413. [Google Scholar] [CrossRef]
  83. Gitelson, A.A.; Gritz, Y.; Merzlyak, M.N. Relationships between leaf chlorophyll content and spectral reflectance and algorithms for non-destructive chlorophyll assessment in higher plant leaves. J. Plant Physiol. 2003, 160, 271–282. [Google Scholar] [CrossRef]
  84. Li, F.; Miao, Y.X.; Feng, G.H.; Yuan, F.; Yue, S.C.; Gao, X.W.; Liu, Y.Q.; Liu, B.; Ustine, S.L.; Chen, X.P. Improving estimation of summer maize nitrogen status with red edge-based spectral vegetation indices. Field Crops Res. 2014, 157, 111–123. [Google Scholar] [CrossRef]
  85. Zhao, B.; Duan, A.W.; Ata-Ul-Karim, S.T.; Liu, Z.D.; Chen, Z.F.; Gong, Z.H.; Zhang, J.Y.; Xiao, J.F.; Liu, Z.G.; Qin, A.Z.; et al. Exploring new spectral bands and vegetation indices for estimating nitrogen nutrition index of summer maize. Eur. J. Agron. 2018, 93, 113–125. [Google Scholar] [CrossRef]
  86. Zhang, Y.; Wang, Z.C.; Spohrer, K.; Reineke, A.J.; He, X.K.; Müller, J. Vegetation indices for the detection and classification of leaf nitrogen deficiency in maize. Eur. J. Agron. 2025, 168, 127665. [Google Scholar] [CrossRef]
  87. Wang, J.J.; Shi, T.Z.; Liu, H.Z.; Wu, G.F. Successive projections algorithm-based three-band vegetation index for foliar phosphorus estimation. Ecol. Indic. 2016, 67, 12–20. [Google Scholar] [CrossRef]
  88. Lu, J.S.; Yang, T.C.; Su, X.; Qi, H.; Yao, X.; Cheng, T.; Zhu, Y.; Cao, W.X.; Tian, Y.C. Monitoring leaf potassium content using hyperspectral vegetation indices in rice leaves. Precis. Agric. 2020, 21, 324–348. [Google Scholar] [CrossRef]
  89. Yang, T.C.; Lu, J.S.; Liao, F.; Qi, H.; Yao, X.; Cheng, T.; Zhu, Y.; Cao, W.X.; Tian, Y.C. Retrieving potassium levels in wheat blades using normalised spectra. Int. J. Appl. Earth Obs. Geoinf. 2021, 102, 102412. [Google Scholar] [CrossRef]
  90. Chandel, A.; Khot, L.; Yu, L.-X. Alfalfa (Medicago sativa L.) crop vigor and yield characterization using high-resolution aerial multispectral and thermal infrared imaging technique. Comput. Electron. Agric. 2021, 182, 105999. [Google Scholar] [CrossRef]
  91. Shukla, G.; Garg, R.; Srivastava, H.; Garg, P. Performance analysis of different predictive models for crop classification across an aridic to ustic area of Indian states. Geocarto Int. 2018, 33, 240–259. [Google Scholar] [CrossRef]
  92. Huang, Y. Improved SVM-Based Soil-Moisture-Content Prediction Model for Tea Plantation. Plants 2023, 12, 2309. [Google Scholar] [CrossRef] [PubMed]
  93. Zhao, L.; Qing, S.; Li, H.; Qiu, Z.; Niu, X.; Shi, Y.; Chen, S.; Xing, X. Estimating maize evapotranspiration based on hybrid back-propagation neural network models and meteorological, soil, and crop data. Int. J. Biometeorol. 2024, 68, 511–525. [Google Scholar] [CrossRef]
  94. Han, X.; Zhong, Y.; Zhang, L. An Efficient and Robust Integrated Geospatial Object Detection Framework for High Spatial Resolution Remote Sensing Imagery. Remote Sens. 2017, 9, 666. [Google Scholar] [CrossRef]
  95. Zhang, C.; Liu, J.; Shang, J.; Cai, H. Capability of crop water content for revealing variability of winter wheat grain yield and soil moisture under limited irrigation. Sci. Total Environ. 2018, 631–632, 677–687. [Google Scholar] [CrossRef]
  96. Chen, H.; Chen, H.; Zhang, S.; Chen, S.; Cen, F.; Zhao, Q.; Huang, X.; He, T.; Gao, Z. Comparison of CWSI and Ts-Ta-VIs in moisture monitoring of dryland crops (sorghum and maize) based on UAV remote sensing. J. Integr. Agric. 2024, 23, 2458–2475. [Google Scholar] [CrossRef]
  97. Savchik, P.; Nocco, M.; Kisekka, I. Mapping almond stem water potential using machine learning and multispectral imagery. Irrig. Sci. 2024, 43, 105–120. [Google Scholar] [CrossRef]
  98. Divya Dharshini, S.; Anurag; Kumar, A.; Satpal; Kumar, M.; Priyanka, P.; Pugazenthi, K. Evaluation of machine-learning algorithms in estimation of relative water content of sorghum under different irrigated environments. J. Arid. Environ. 2025, 229, 105390. [Google Scholar] [CrossRef]
  99. Babaeian, E.; Paheding, S.; Siddique, N.; Devabhaktuni, V.K.; Tuller, M. Estimation of root zone soil moisture from ground and remotely sensed soil information with multisensor data fusion and automated machine learning. Remote Sens. Environ. 2021, 260, 112434. [Google Scholar] [CrossRef]
  100. Yang, N.; Zhang, Z.; Yang, X.; Dong, N.; Xu, Q.; Chen, J.; Sun, S.; Cui, N.; Ning, J. Evaluation of crop water status using UAV-based images data with a model updating strategy. Agric. Water Manag. 2025, 312, 109445. [Google Scholar] [CrossRef]
  101. Yang, N.; Zhang, Z.; Zhang, J.; Yang, X.; Liu, H.; Chen, J.; Ning, J.; Sun, S.; Shi, L. Accurate estimation of winter-wheat leaf water content using continuous wavelet transform-based hyperspectral combined with thermal infrared on a UAV platform. Eur. J. Agron. 2025, 168, 127624. [Google Scholar] [CrossRef]
  102. Jiang, X.T.; Gao, L.T.; Xu, X.; Wu, W.B.; Yang, G.J.; Meng, Y.; Feng, H.K.; Li, Y.F.; Xue, H.Y.; Chen, T.E. Combining UAV Remote Sensing with Ensemble Learning to Monitor Leaf Nitrogen Content in Custard Apple (Annona squamosa L.). Agronomy 2025, 15, 38. [Google Scholar] [CrossRef]
  103. Zha, H.N.; Miao, Y.X.; Wang, T.T.; Li, Y.; Zhang, J.; Sun, W.C.; Feng, Z.Q.; Kusnierek, K. Improving Unmanned Aerial Vehicle Remote Sensing-Based Rice Nitrogen Nutrition Index Prediction with Machine Learning. Remote Sens. 2020, 12, 215. [Google Scholar] [CrossRef]
  104. Chen, R.Q.; Liu, W.P.; Yang, H.; Jin, X.L.; Yang, G.J.; Zhou, Y.; Zhang, C.J.; Han, S.Y.; Meng, Y.; Zhai, C.Y.; et al. A novel framework to assess apple leaf nitrogen content: Fusion of hyperspectral reflectance and phenology information through deep learning. Comput. Electron. Agric. 2024, 219, 108816. [Google Scholar] [CrossRef]
  105. Xiao, Q.; Wu, N.; Tang, W.; Zhang, C.; Feng, L.; Zhou, L.; Shen, J.; Zhang, Z.; Gao, P.; He, Y. Visible and near-infrared spectroscopy and deep learning application for the qualitative and quantitative investigation of nitrogen status in cotton leaves. Front. Plant Sci. 2022, 13, 1080745. [Google Scholar] [CrossRef]
  106. Zhang, X.; Han, L.X.; Sobeih, T.; Lappin, L.; Lee, M.A.; Howard, A.; Kisdi, A. The Self-Supervised Spectral-Spatial Vision Transformer Network for Accurate Prediction of Wheat Nitrogen Status from UAV Imagery. Remote Sens. 2022, 14, 1400. [Google Scholar] [CrossRef]
  107. Du, R.Q.; Chen, J.Y.; Xiang, Y.Z.; Zhang, Z.T.; Yang, N.; Yang, X.Z.; Tang, Z.J.; Wang, H.; Wang, X.; Shi, H.Z.; et al. Incremental learning for crop growth parameters estimation and nitrogen diagnosis from hyperspectral data. Comput. Electron. Agric. 2023, 215, 108356. [Google Scholar] [CrossRef]
  108. Dehghan-Shoar, M.H.; Kereszturi, G.; Pullanagari, R.R.; Orsi, A.A.; Yule, I.J.; Hanly, J. A physically informed multi-scale deep neural network for estimating foliar nitrogen concentration in vegetation. Int. J. Appl. Earth Obs. Geoinf. 2024, 130, 103917. [Google Scholar] [CrossRef]
  109. Verhoef, W. Light scattering by leaf layers with application to canopy reflectance modeling: The SAIL model. Remote Sens. Environ. 1984, 16, 125–141. [Google Scholar] [CrossRef]
  110. Jacquemoud, S.; Baret, F. PROSPECT: A model of leaf optical properties spectra. Remote Sens. Environ. 1990, 34, 75–91. [Google Scholar] [CrossRef]
  111. Jacquemoud, S.; Verhoef, W.; Baret, F.; Bacour, C.; Zarco-Tejada, P.J.; Asner, G.P.; François, C.; Ustin, S.L. PROSPECT+SAIL models: A review of use for vegetation characterization. Remote Sens. Environ. 2009, 113, S56–S66. [Google Scholar] [CrossRef]
  112. Li, X.; Strahler, A.H. Geometric-Optical Modeling of a Conifer Forest Canopy. IEEE Trans. Geosci. Remote Sens. 1985, GE-23, 705–721. [Google Scholar] [CrossRef]
  113. Yang, N.; Zhang, Z.; Yang, X.; Zhang, J.; Zhang, B.; Xie, P.; Wang, Y.; Chen, J.; Shi, L. UAV-based stomatal conductance estimation under water stress using the PROSAIL model coupled with meteorological factors. Int. J. Appl. Earth Obs. Geoinf. 2025, 137, 104425. [Google Scholar] [CrossRef]
  114. Pasqualotto, N.; Delegido, J.; Van Wittenberghe, S.; Verrelst, J.; Rivera, J.P.; Moreno, J. Retrieval of canopy water content of different crop types with two new hyperspectral indices: Water Absorption Area Index and Depth Water Index. Int. J. Appl. Earth Obs. Geoinf. 2018, 67, 69–78. [Google Scholar] [CrossRef] [PubMed]
  115. Tripathi, R.; Sahoo, R.N.; Sehgal, V.K.; Tomar, R.K.; Chakraborty, D.; Nagarajan, S. Inversion of PROSAIL Model for Retrieval of Plant Biophysical Parameters. J. Indian Soc. Remote Sens. 2011, 40, 19–28. [Google Scholar] [CrossRef]
  116. Li, Z.; Jin, X.; Yang, G.; Drummond, J.; Yang, H.; Clark, B.; Li, Z.; Zhao, C. Remote Sensing of Leaf and Canopy Nitrogen Status in Winter Wheat (Triticum aestivum L.) Based on N-PROSAIL Model. Remote Sens. 2018, 10, 1463. [Google Scholar] [CrossRef]
  117. Li, D.; Wu, Y.; Berger, K.; Kuang, Q.; Feng, W.; Chen, J.M.; Wang, W.; Zheng, H.; Yao, X.; Zhu, Y.; et al. Estimating canopy nitrogen content by coupling PROSAIL-PRO with a nitrogen allocation model. Int. J. Appl. Earth Obs. Geoinf. 2024, 135, 104280. [Google Scholar] [CrossRef]
  118. Fung, A.K.; Li, Z.; Chen, K.S. Backscattering from a randomly rough dielectric surface. IEEE Trans. Geosci. Remote Sens. 1992, 30, 356–369. [Google Scholar] [CrossRef]
  119. Zhang, L.; Li, H.; Xue, Z. Calibrated Integral Equation Model for Bare Soil Moisture Retrieval of Synthetic Aperture Radar: A Case Study in Linze County. Appl. Sci. 2020, 10, 7921. [Google Scholar] [CrossRef]
  120. Chen, K.S.; Wu, T.-D.; Tsang, L.; Li, Q.; Shi, J.; Fung, A.K. Emission of rough surfaces calculated by the integral equation method with comparison to three-dimensional moment method simulations. IEEE Trans. Geosci. Remote Sens. 2003, 41, 90–101. [Google Scholar] [CrossRef]
  121. Attema, E.P.W.; Ulaby, F.T. Vegetation modeled as a water cloud. Radio Sci. 1978, 13, 357–364. [Google Scholar] [CrossRef]
  122. Ulaby, F.T.; McDonald, K.; Sarabandi, K.; Dobson, M.C. Michigan Microwave Canopy Scattering Models (MIMICS). In Proceedings of the International Geoscience and Remote Sensing Symposium, ‘Remote Sensing: Moving Toward the 21st Century’, Edinburgh, UK, 12–16 September 1988; p. 1009. [Google Scholar]
  123. Romshoo, S.A.; Koike, M.; Onaka, S.; Oki, T.; Musiake, K. Influence of surface and vegetation characteristics on C-band radar measurements for soil moisture content. J. Indian Soc. Remote Sens. 2002, 30, 229–244. [Google Scholar] [CrossRef]
  124. Zhang, X.; Chen, B.; Zhao, H.; Li, T.; Chen, Q. Physical-based soil moisture retrieval method over bare agricultural areas by means of multi-sensor SAR data. Int. J. Remote Sens. 2018, 39, 3870–3890. [Google Scholar] [CrossRef]
  125. Yahia, O.; Guida, R.; Iervolino, P. Novel Weight-Based Approach for Soil Moisture Content Estimation via Synthetic Aperture Radar, Multispectral and Thermal Infrared Data Fusion. Sensors 2021, 21, 3457. [Google Scholar] [CrossRef]
  126. Bastiaanssen, W.G.M.; Menenti, M.; Feddes, R.A.; Holtslag, A.A.M. A remote sensing surface energy balance algorithm for land (SEBAL). 1. Formulation. J. Hydrol. 1998, 212–213, 198–212. [Google Scholar] [CrossRef]
  127. Allen, R.G.; Tasumi, M.; Trezza, R. Satellite-Based Energy Balance for Mapping Evapotranspiration with Internalized Calibration (METRIC)—Model. J. Irrig. Drain. Eng. 2007, 133, 380–394. [Google Scholar] [CrossRef]
  128. Colaizzi, P. Advances in a Two-Source Energy Balance Model: Partitioning of Evaporation and Transpiration for Cotton. Trans. ASABE 2016, 59, 181–197. [Google Scholar] [CrossRef]
  129. Hoffmann, H.; Nieto, H.; Jensen, R.; Guzinski, R.; Zarco-Tejada, P.; Friborg, T. Estimating evaporation with thermal UAV data and two-source energy balance models. Hydrol. Earth Syst. Sci. 2016, 20, 697–713. [Google Scholar] [CrossRef]
  130. Khormizi, H.Z.; Malamiri, H.R.G.; Ferreira, C.S.S. Estimation of Evaporation and Drought Stress of Pistachio Plant Using UAV Multispectral Images and a Surface Energy Balance Approach. Horticulturae 2024, 10, 515. [Google Scholar] [CrossRef]
  131. Na, L.; Qingshan, L.; Zimeng, L.; Yang, L.; Zongzheng, Y.; Liwei, S. Non-destructive method using UAVs for high-throughput water productivity assessment for winter wheat cultivars. Agric. Water Manag. 2025, 314, 109526. [Google Scholar] [CrossRef]
  132. Impollonia, G.; Croci, M.; Blandinières, H.; Marcone, A.; Amaducci, S. Comparison of PROSAIL Model Inversion Methods for Estimating Leaf Chlorophyll Content and LAI Using UAV Imagery for Hemp Phenotyping. Remote Sens. 2022, 14, 5801. [Google Scholar] [CrossRef]
  133. Ling, J.; Zeng, Z.; Shi, Q.; Li, J.; Zhang, B. Estimating Winter Wheat LAI Using Hyperspectral UAV Data and an Iterative Hybrid Method. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2023, 16, 8782–8794. [Google Scholar] [CrossRef]
  134. Bhadra, S.; Sagan, V.; Sarkar, S.; Braud, M.; Mockler, T.C.; Eveland, A.L. PROSAIL-Net: A transfer learning-based dual stream neural network to estimate leaf chlorophyll and leaf angle of crops from UAV hyperspectral images. ISPRS J. Photogramm. 2024, 210, 1–24. [Google Scholar] [CrossRef]
  135. Qi, J.; Chehbouni, A.; Huete, A.R.; Kerr, Y.H.; Sorooshian, S. A modified soil adjusted vegetation index. Remote Sens. Environ. 1994, 48, 119–126. [Google Scholar] [CrossRef]
  136. Gitelson, A.A. Wide Dynamic Range Vegetation Index for Remote Quantification of Biophysical Characteristics of Vegetation. J. Plant Physiol. 2004, 161, 165–173. [Google Scholar] [CrossRef]
  137. Haboudane, D.; Miller, J.R.; Pattey, E.; Zarco-Tejada, P.J.; Strachan, I.B. Hyperspectral vegetation indices and novel algorithms for predicting green LAI of crop canopies: Modeling and validation in the context of precision agriculture. Remote Sens. Environ. 2004, 90, 337–352. [Google Scholar] [CrossRef]
  138. Mutanga, O.; Masenyama, A.; Sibanda, M. Spectral saturation in the remote sensing of high-density vegetation traits: A systematic review of progress, challenges, and prospects. ISPRS J. Photogramm. 2023, 198, 297–309. [Google Scholar] [CrossRef]
  139. Liu, Y.; Feng, H.; Fan, Y.; Yue, J.; Chen, R.; Ma, Y.; Bian, M.; Yang, G. Improving potato above ground biomass estimation combining hyperspectral data and harmonic decomposition techniques. Comput. Electron. Agric. 2024, 218, 108699. [Google Scholar] [CrossRef]
  140. Pineda, M.; Barón, M.; Pérez-Bueno, M.-L. Thermal Imaging for Plant Stress Detection and Phenotyping. Remote Sens. 2021, 13, 68. [Google Scholar] [CrossRef]
  141. Anconelli, S.; Mannini, P.; Battilani, A. CWSI and Baseline Studies to Increase Quality of Processing Tomatoes; International Society for Horticultural Science: Leuven, Belgium, 1994; pp. 303–306. [Google Scholar]
  142. Yu, F.H.; Zhao, D.; Guo, Z.H.; Jin, Z.Y.; Guo, S.; Chen, C.L.; Xu, T.Y. Characteristic Analysis and Decomposition of Mixed Pixels From UAV Hyperspectral Images in Rice Tillering Stage. Spectrosc. Spect. Anal. 2022, 42, 947–953. [Google Scholar] [CrossRef]
  143. Bayat, B.; Van der Tol, C.; Verhoef, W. Remote Sensing of Grass Response to Drought Stress Using Spectroscopic Techniques and Canopy Reflectance Model Inversion. Remote Sens. 2016, 8, 557. [Google Scholar] [CrossRef]
  144. Wang, J.J.; Ding, J.L.; Ge, X.Y.; Zhang, Z.; Han, L.J. Application of Fractional Differential Technique in Estimating Soil Water Content from Airborne Hyperspectral Data. Spectrosc. Spect. Anal. 2022, 42, 3559–3567. [Google Scholar] [CrossRef]
  145. Li, J.; Wu, W.; Zhao, C.; Bai, X.; Dong, L.; Tan, Y.; Yusup, M.; Akelebai, G.; Dong, H.; Zhi, J. Effects of solar elevation angle on the visible light vegetation index of a cotton field when extracted from the UAV. Sci. Rep. 2025, 15, 18497. [Google Scholar] [CrossRef]
  146. Yuan, X.; Lv, Z.; Laakso, K.; Han, J.; Liu, X.; Meng, Q.; Xue, S. Observation Angle Effect of Near-Ground Thermal Infrared Remote Sensing on the Temperature Results of Urban Land Surface. Land 2024, 13, 2170. [Google Scholar] [CrossRef]
  147. Shafiee, S.; Mroz, T.; Burud, I.; Lillemo, M. Evaluation of UAV multispectral cameras for yield and biomass prediction in wheat under different sun elevation angles and phenological stages. Comput. Electron. Agric. 2023, 210, 107874. [Google Scholar] [CrossRef]
  148. Liu, S.; Lu, L.; Mao, D.; Jia, L. Evaluating parameterizations of aerodynamic resistance to heat transfer using field measurements. Hydrol. Earth Syst. Sci. 2007, 11, 769–783. [Google Scholar] [CrossRef]
Figure 1. Framework of research progress on UAV-based remote sensing for monitoring crop water and nutrient status.
Table 1. UAV platform types and their technical parameters for agricultural monitoring.
| Rotor Type | Endurance Time | Maximum Altitude | Equipment Cost | Maintenance Complexity | Typical Platform |
|---|---|---|---|---|---|
| Multirotor | <1 h | Within a few hundred meters | Relatively low | Relatively simple | DJI M300 |
| Fixed-wing | Several hours | Several kilometers | Moderate | Moderate | HC-141 |
| Hybrid-wing | Several hours | Several kilometers | Relatively high | Relatively complex | CW-100 |
Table 2. Classification and technical parameters of remote sensing sensor types.
| Sensor Type | Spectral Range | Typical Payload | Applications |
|---|---|---|---|
| Optical remote sensing | 0.4–1.1 μm (visible/NIR) | MicaSense RedEdge-P | Vegetation index-based assessment of nitrogen status; estimation of LAI |
| Hyperspectral sensor | 0.4–2.5 μm (hundreds of narrow bands) | Cubert ULTRIS 5 | Inversion of soil organic matter; early detection of crop pests and diseases; nutrient deficiency diagnosis |
| Thermal infrared sensor | 8–14 μm | FLIR Vue Pro | Monitoring canopy temperature for water stress; estimation of surface water content via thermal inertia |
| Microwave remote sensing | 0.1 cm–1 m (C/X/L bands) | FSAR miniSAR | Soil moisture inversion; crop biomass estimation; flood and waterlogging monitoring |
Table 3. Common vegetation indices and their formulas.
| Vegetation Index | Formula | Reference |
|---|---|---|
| Drought Stress Index (DSI) | DSI = S1 / N | [56] |
| Difference Vegetation Index (DVI) | DVI = N − R | [57] |
| Enhanced Vegetation Index (EVI) | EVI = g (N − R) / (N + C1·R − C2·B + L) | [58] |
| Excess Green Index (ExG) | ExG = 2G − R − B | [59] |
| Green Normalized Difference Vegetation Index (GNDVI) | GNDVI = (N − G) / (N + G) | [60] |
| Normalized Difference Moisture Index (NDMI) | NDMI = (N − S1) / (N + S1) | [61] |
| Normalized Difference Vegetation Index (NDVI) | NDVI = (N − R) / (N + R) | [62] |
| Hyperspectral Near-Infrared Reflectance of Vegetation (NIRvH2) | NIRvH2 = N − R − k (λN − λR) | [63] |
| Optimized Soil-Adjusted Vegetation Index (OSAVI) | OSAVI = (N − R) / (N + R + 0.16) | [64] |
| Ratio Vegetation Index (RVI) | RVI = RE2 / R | [65] |
| Soil-Adjusted Vegetation Index (SAVI) | SAVI = (1 + L)(N − R) / (N + R + L) | [66] |
| Transformed Difference Vegetation Index (TDVI) | TDVI = 1.5 (N − R) / √(N² + R + 0.5) | [67] |
| Transformed Vegetation Index (TVI) | TVI = √((N − R)/(N + R) + 0.5) | [68] |

Note: N = near-infrared reflectance; R = red reflectance; G = green reflectance; B = blue reflectance; RE2 = red-edge reflectance; S1 = shortwave infrared reflectance; λN and λR = central wavelengths of the NIR and red bands; k = correction factor; L = soil adjustment factor; g = gain factor; C1 and C2 = atmospheric correction coefficients.
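As a minimal illustration of how the indices in Table 3 are evaluated from per-band reflectances, the sketch below computes a few of them with NumPy. The band symbols follow the table's note; the EVI coefficients (g = 2.5, C1 = 6, C2 = 7.5, L = 1) are the widely used MODIS defaults, an assumption on our part since the table leaves them symbolic.

```python
import numpy as np

def vegetation_indices(N, R, G, B, L=0.5):
    """Compute several Table 3 indices from band reflectances.

    N = NIR, R = red, G = green, B = blue reflectance (scalars or arrays;
    NumPy broadcasting makes the same code work pixel-wise on whole images).
    L is the SAVI soil adjustment factor (0.5 is a common default).
    """
    eps = 1e-12  # guard against division by zero on very dark pixels
    ndvi  = (N - R) / (N + R + eps)
    gndvi = (N - G) / (N + G + eps)
    osavi = (N - R) / (N + R + 0.16)
    savi  = (1.0 + L) * (N - R) / (N + R + L + eps)
    # EVI with assumed MODIS coefficients g=2.5, C1=6, C2=7.5, L=1
    evi   = 2.5 * (N - R) / (N + 6.0 * R - 7.5 * B + 1.0 + eps)
    exg   = 2.0 * G - R - B
    return {"NDVI": ndvi, "GNDVI": gndvi, "OSAVI": osavi,
            "SAVI": savi, "EVI": evi, "ExG": exg}

# Example: a healthy-canopy pixel (high NIR, low red)
vi = vegetation_indices(N=0.45, R=0.05, G=0.10, B=0.04)
print(round(float(vi["NDVI"]), 3))  # (0.45-0.05)/(0.45+0.05) = 0.8
```

Because the functions broadcast, the same call accepts full orthomosaic bands (2-D arrays) and returns index maps of the same shape.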
Table 4. Comparative summary of modeling approaches for UAV-based monitoring of crop water and nutrient status.
| Modeling Approach | Accuracy | Scalability | Data Requirements | Robustness | Advantages |
|---|---|---|---|---|---|
| VI-based methods | Moderate; sensitive to canopy coverage | High; easy to implement | Low (few spectral bands) | Limited under heterogeneous or stressed conditions | Simple, interpretable, cost-effective |
| Data-driven models | High with sufficient training data | Moderate; limited cross-region transfer | High (large, labeled datasets required) | Variable; prone to overfitting | Nonlinear modeling capacity; flexible |
| Physically based models | Moderate–high; depends on parameterization | High; applicable across crops and regions | Medium–high (field + meteorological inputs) | Strong generalization across conditions | Mechanistic; interpretable |
| Hybrid models | High; balance of accuracy and interpretability | Promising, but not fully validated | Medium–high (multi-source data required) | Strong; potential for transferability | Combine physical priors with AI adaptability |
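To make the first row of Table 4 concrete, the sketch below shows the simplest VI-based empirical workflow: calibrate a linear relation between a vegetation index and a ground-measured crop variable at a few plots, then apply it pixel-wise. All sample values here are synthetic and purely illustrative, not taken from any study cited in this review.

```python
import numpy as np

# Synthetic calibration data: UAV-derived NDVI at ground plots and the
# leaf water content (%) measured destructively at the same plots.
ndvi_samples = np.array([0.35, 0.48, 0.55, 0.62, 0.71, 0.80])
lwc_samples  = np.array([58.0, 62.5, 65.0, 68.0, 71.5, 75.0])

# Least-squares fit of the empirical relation LWC = a * NDVI + b
a, b = np.polyfit(ndvi_samples, lwc_samples, deg=1)

def predict_lwc(ndvi_map):
    """Apply the calibrated relation to an NDVI raster (any array shape)."""
    return a * np.asarray(ndvi_map) + b

# Report the fit; a high correlation is expected only because the
# synthetic samples were constructed to be nearly linear.
r = np.corrcoef(ndvi_samples, lwc_samples)[0, 1]
print(f"slope={a:.2f}, intercept={b:.2f}, r={r:.3f}")
```

This also makes the table's trade-offs tangible: the model needs only one band ratio and six ground samples (low data requirement), but the fitted coefficients are specific to this crop, stage, and site, which is exactly the limited transferability noted for VI-based methods.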
Yang, X.; Chen, J.; Lu, X.; Liu, H.; Liu, Y.; Bai, X.; Qian, L.; Zhang, Z. Advances in UAV Remote Sensing for Monitoring Crop Water and Nutrient Status: Modeling Methods, Influencing Factors, and Challenges. Plants 2025, 14, 2544. https://doi.org/10.3390/plants14162544
