Search Results (1,031)

Search Parameters:
Keywords = UAV-multispectral

19 pages, 5340 KiB  
Article
Potential of Multi-Source Multispectral vs. Hyperspectral Remote Sensing for Winter Wheat Nitrogen Monitoring
by Xiaokai Chen, Yuxin Miao, Krzysztof Kusnierek, Fenling Li, Chao Wang, Botai Shi, Fei Wu, Qingrui Chang and Kang Yu
Remote Sens. 2025, 17(15), 2666; https://doi.org/10.3390/rs17152666 - 1 Aug 2025
Abstract
Timely and accurate monitoring of crop nitrogen (N) status is essential for precision agriculture. UAV-based hyperspectral remote sensing offers high-resolution data for estimating plant nitrogen concentration (PNC), but its cost and complexity limit large-scale application. This study compares the performance of UAV hyperspectral data (S185 sensor) with simulated multispectral data from DJI Phantom 4 Multispectral (P4M), PlanetScope (PS), and Sentinel-2A (S2) in estimating winter wheat PNC. Spectral data were collected across six growth stages over two seasons and resampled to match the spectral characteristics of the three multispectral sensors. Three variable selection strategies (one-dimensional (1D) spectral reflectance, optimized two-dimensional (2D), and three-dimensional (3D) spectral indices) were combined with Random Forest Regression (RFR), Support Vector Machine Regression (SVMR), and Partial Least Squares Regression (PLSR) to build PNC prediction models. Results showed that, while hyperspectral data yielded slightly higher accuracy, optimized multispectral indices, particularly from PS and S2, achieved comparable performance. Among the models, SVMR and RFR showed consistent effectiveness across strategies. These findings highlight the potential of low-cost multispectral platforms for practical crop N monitoring. Future work should validate these models using real satellite imagery and explore multi-source data fusion with advanced learning algorithms. Full article
(This article belongs to the Special Issue Perspectives of Remote Sensing for Precision Agriculture)
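The band-simulation step this abstract describes (resampling hyperspectral reflectance to a multispectral sensor's band set) can be sketched as below. This is a hedged illustration, not the authors' code: the rectangular band windows, the synthetic spectrum, and simple in-band averaging are all assumptions; the study presumably used the sensors' actual spectral response functions.

```python
import numpy as np

def resample_to_multispectral(wavelengths, reflectance, band_ranges):
    """Average hyperspectral reflectance over each (lo, hi) wavelength window."""
    bands = []
    for lo, hi in band_ranges:
        mask = (wavelengths >= lo) & (wavelengths <= hi)
        bands.append(reflectance[mask].mean())
    return np.array(bands)

# 450-950 nm hyperspectral grid at 4 nm sampling (S185-like range, assumed)
wl = np.arange(450, 951, 4)
# synthetic vegetation-like curve: low visible reflectance, red-edge rise, high NIR
spectrum = 0.05 + 0.4 / (1 + np.exp(-(wl - 720) / 15))

# Illustrative P4M-style band windows: blue, green, red, red-edge, NIR (assumed)
p4m_like = [(450, 480), (545, 575), (650, 680), (720, 750), (830, 870)]
ms = resample_to_multispectral(wl, spectrum, p4m_like)
print(ms.round(3))
```

The same function, fed different band windows, would yield the PS- and S2-like simulations the study compares.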
22 pages, 8105 KiB  
Article
Extraction of Sparse Vegetation Cover in Deserts Based on UAV Remote Sensing
by Jie Han, Jinlei Zhu, Xiaoming Cao, Lei Xi, Zhao Qi, Yongxin Li, Xingyu Wang and Jiaxiu Zou
Remote Sens. 2025, 17(15), 2665; https://doi.org/10.3390/rs17152665 - 1 Aug 2025
Abstract
The unique characteristics of desert vegetation, such as different leaf morphology, discrete canopy structures, sparse and uneven distribution, etc., pose significant challenges for remote sensing-based estimation of fractional vegetation cover (FVC). The Unmanned Aerial Vehicle (UAV) system can accurately distinguish vegetation patches, extract weak vegetation signals, and navigate through complex terrain, making it suitable for applications in small-scale FVC extraction. In this study, we selected the floodplain fan with Caragana korshinskii Kom as the constructive species in Hatengtaohai National Nature Reserve, Bayannur, Inner Mongolia, China, as our study area. We investigated the remote sensing extraction method of desert sparse vegetation cover by placing samples across three gradients: the top, middle, and edge of the fan. We then acquired UAV multispectral images; evaluated the applicability of various vegetation indices (VIs) using methods such as supervised classification, linear regression models, and machine learning; and explored the feasibility and stability of multiple machine learning models in this region. Our results indicate the following: (1) We discovered that the multispectral vegetation index is superior to the visible vegetation index and more suitable for FVC extraction in vegetation-sparse desert regions. (2) By comparing five machine learning regression models, it was found that the XGBoost and KNN models exhibited relatively lower estimation performance in the study area. The spatial distribution of plots appeared to influence the stability of the SVM model when estimating fractional vegetation cover (FVC). In contrast, the RF and LASSO models demonstrated robust stability across both training and testing datasets. Notably, the RF model achieved the best inversion performance (R2 = 0.876, RMSE = 0.020, MAE = 0.016), indicating that RF is one of the most suitable models for retrieving FVC in naturally sparse desert vegetation. 
This study provides a valuable contribution to the limited existing research on remote sensing-based estimation of FVC and characterization of spatial heterogeneity in small-scale desert sparse vegetation ecosystems dominated by a single species. Full article
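The RF-based FVC retrieval this abstract reports as most robust can be illustrated with a minimal sketch. Everything here is assumed for illustration (synthetic NDVI/SAVI features, a linear FVC relation, default hyperparameters); it shows the modeling pattern, not the study's implementation.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(0)

# Synthetic plot-level features in a sparse-vegetation range (assumptions)
n = 300
ndvi = rng.uniform(0.05, 0.45, n)            # low NDVI typical of desert cover
savi = ndvi * rng.uniform(0.9, 1.1, n)       # correlated soil-adjusted index
fvc = 0.6 * ndvi + 0.2 * savi + rng.normal(0, 0.01, n)  # assumed FVC relation

X = np.column_stack([ndvi, savi])
X_tr, X_te, y_tr, y_te = train_test_split(X, fvc, test_size=0.3, random_state=0)

# Random forest regression, the model family the study found most stable
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
pred = rf.predict(X_te)
r2 = r2_score(y_te, pred)
rmse = mean_squared_error(y_te, pred) ** 0.5
print(f"R2={r2:.3f}  RMSE={rmse:.3f}")
```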
21 pages, 4657 KiB  
Article
A Semi-Automated RGB-Based Method for Wildlife Crop Damage Detection Using QGIS-Integrated UAV Workflow
by Sebastian Banaszek and Michał Szota
Sensors 2025, 25(15), 4734; https://doi.org/10.3390/s25154734 - 31 Jul 2025
Abstract
Monitoring crop damage caused by wildlife remains a significant challenge in agricultural management, particularly in the case of large-scale monocultures such as maize. This study presents a semi-automated process for detecting wildlife-induced damage using RGB imagery acquired from unmanned aerial vehicles (UAVs). The method is designed for non-specialist users and is fully integrated within the QGIS platform. The proposed approach involves calculating three vegetation indices, namely Excess Green (ExG), Green Leaf Index (GLI), and Modified Green-Red Vegetation Index (MGRVI), from a standardized orthomosaic generated from the UAV-collected RGB images. Subsequently, an unsupervised k-means clustering algorithm was applied to divide the field into five vegetation vigor classes. Within each class, the 25% of pixels with the lowest average index values were preliminarily classified as damaged. A dedicated QGIS plugin enables drone data analysts (DDAs) to adjust the index thresholds interactively, based on visual interpretation. The method was validated on a 50-hectare maize field, where 7 hectares of damage (15% of the area) were identified. The results indicate a high level of agreement between the automated and manual classifications, with an overall accuracy of 81%. The highest concentration of damage occurred in the “moderate” and “low” vigor zones. Final products included vigor classification maps, binary damage masks, and summary reports in HTML and DOCX formats with visualizations and statistical data. The results confirm the effectiveness and scalability of the proposed RGB-based procedure for crop damage assessment. The method offers a repeatable, cost-effective, and field-operable alternative to multispectral or AI-based approaches, making it suitable for integration with precision agriculture practices and wildlife population management. Full article
(This article belongs to the Section Remote Sensors)
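The damage-detection rule described above (k-means into five vigor classes, then flagging the lowest quartile of index values within each class) can be sketched as follows. The synthetic RGB values and the ExG-only shortcut are assumptions; the actual workflow computes three indices on an orthomosaic inside QGIS with operator-tunable thresholds.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)

# Synthetic 60x60 RGB reflectance patch standing in for the orthomosaic
h, w = 60, 60
r = rng.uniform(0.1, 0.5, (h, w))
g = rng.uniform(0.2, 0.8, (h, w))
b = rng.uniform(0.1, 0.4, (h, w))
exg = 2 * g - r - b                      # Excess Green index

# Step 1: k-means clustering of the index into five vigor classes
labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(
    exg.reshape(-1, 1)
)

# Step 2: within each class, flag the lowest 25% of index values as damage
flat = exg.ravel()
damage = np.zeros(h * w, dtype=bool)
for k in range(5):
    idx = np.where(labels == k)[0]
    thr = np.quantile(flat[idx], 0.25)   # lowest quartile within the class
    damage[idx[flat[idx] <= thr]] = True

print(f"flagged {damage.mean():.1%} of pixels as candidate damage")
```

By construction the flagged fraction sits near 25% of the field; the plugin's manual threshold adjustment would then refine this preliminary mask.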
28 pages, 5503 KiB  
Article
Feature Selection Framework for Improved UAV-Based Detection of Solenopsis invicta Mounds in Agricultural Landscapes
by Chun-Han Shih, Cheng-En Song, Su-Fen Wang and Chung-Chi Lin
Insects 2025, 16(8), 793; https://doi.org/10.3390/insects16080793 - 31 Jul 2025
Abstract
The red imported fire ant (RIFA; Solenopsis invicta) is an invasive species that severely threatens ecology, agriculture, and public health in Taiwan. In this study, the feasibility of applying multispectral imagery captured by unmanned aerial vehicles (UAVs) to detect red fire ant mounds was evaluated in Fenlin Township, Hualien, Taiwan. A DJI Phantom 4 multispectral drone collected reflectance in five bands (blue, green, red, red-edge, and near-infrared), derived indices (normalized difference vegetation index, NDVI, soil-adjusted vegetation index, SAVI, and photochemical pigment reflectance index, PPR), and textural features. According to analysis of variance F-scores and random forest recursive feature elimination, vegetation indices and spectral features (e.g., NDVI, NIR, SAVI, and PPR) were the most significant predictors of ecological characteristics such as vegetation density and soil visibility. Texture features exhibited moderate importance and the potential to capture intricate spatial patterns in nonlinear models. Despite limitations in the analytics, including trade-offs related to flight height and environmental variability, the study findings suggest that UAVs are an inexpensive, high-precision means of obtaining multispectral data for RIFA monitoring. These findings can be used to develop efficient mass-detection protocols for integrated pest control, with broader implications for invasive species monitoring. Full article
(This article belongs to the Special Issue Surveillance and Management of Invasive Insects)
22 pages, 12611 KiB  
Article
Banana Fusarium Wilt Recognition Based on UAV Multi-Spectral Imagery and Automatically Constructed Enhanced Features
by Ye Su, Longlong Zhao, Huichun Ye, Wenjiang Huang, Xiaoli Li, Hongzhong Li, Jinsong Chen, Weiping Kong and Biyao Zhang
Agronomy 2025, 15(8), 1837; https://doi.org/10.3390/agronomy15081837 - 29 Jul 2025
Abstract
Banana Fusarium wilt (BFW, also known as Panama disease) is a highly infectious and destructive disease that threatens global banana production, requiring early recognition for timely prevention and control. Current monitoring methods primarily rely on continuous variable features—such as band reflectances (BRs) and vegetation indices (VIs)—collectively referred to as basic features (BFs)—which are prone to noise during the early stages of infection and struggle to capture subtle spectral variations, thus limiting the recognition accuracy. To address this limitation, this study proposes a discretized enhanced feature (EF) construction method, the automated kernel density segmentation-based feature construction algorithm (AutoKDFC). By analyzing the differences in the kernel density distributions between healthy and diseased samples, the AutoKDFC automatically determines the optimal segmentation threshold, converting continuous BFs into binary features with higher discriminative power for early-stage recognition. Using UAV-based multi-spectral imagery, BFW recognition models are developed and tested with the random forest (RF), support vector machine (SVM), and Gaussian naïve Bayes (GNB) algorithms. The results show that EFs exhibit significantly stronger correlations with BFW’s presence than original BFs. Feature importance analysis via RF further confirms that EFs contribute more to the model performance, with VI-derived features outperforming BR-based ones. The integration of EFs results in average performance gains of 0.88%, 2.61%, and 3.07% for RF, SVM, and GNB, respectively, with SVM achieving the best performance, averaging over 90%. Additionally, the generated BFW distribution map closely aligns with ground observations and captures spectral changes linked to disease progression, validating the method’s practical utility. Overall, the proposed AutoKDFC method demonstrates high effectiveness and generalizability for BFW recognition. 
Its core concept of “automatic feature enhancement” has strong potential for broader applications in crop disease monitoring and supports the development of intelligent early warning systems in plant health management. Full article
(This article belongs to the Section Pest and Disease Management)
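The core idea behind AutoKDFC, as described in the abstract, is to pick a segmentation threshold from the kernel density distributions of healthy versus diseased samples and binarize a continuous feature at that point. A hedged sketch under assumed Gaussian sample distributions follows; the paper's actual threshold-search details may differ.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(2)

# Assumed index distributions: healthy canopy higher, wilt-affected lower
healthy = rng.normal(0.75, 0.05, 500)
diseased = rng.normal(0.55, 0.06, 500)

# Kernel density estimates of both groups on a common grid
grid = np.linspace(0.3, 1.0, 1000)
kde_h = gaussian_kde(healthy)(grid)
kde_d = gaussian_kde(diseased)(grid)

# Threshold: the grid point between the two modes where the densities cross
between = (grid > 0.55) & (grid < 0.75)
threshold = grid[between][np.argmin(np.abs(kde_h[between] - kde_d[between]))]

# Binarize the continuous feature into a higher-contrast enhanced feature
binary_feature = (np.concatenate([healthy, diseased]) >= threshold).astype(int)
print(f"threshold = {threshold:.3f}")
```

The resulting 0/1 feature is the kind of discretized enhanced feature the study feeds to RF, SVM, and GNB classifiers.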
25 pages, 5776 KiB  
Article
Early Detection of Herbicide-Induced Tree Stress Using UAV-Based Multispectral and Hyperspectral Imagery
by Russell Main, Mark Jayson B. Felix, Michael S. Watt and Robin J. L. Hartley
Forests 2025, 16(8), 1240; https://doi.org/10.3390/f16081240 - 28 Jul 2025
Abstract
There is growing interest in the use of herbicide for the silvicultural practice of tree thinning (i.e., chemical thinning or e-thinning) in New Zealand. Potential benefits of this approach include improved stability of the standing crop in high winds, and safer and lower-cost operations, particularly in steep or remote terrain. As uptake grows, tools for monitoring treatment effectiveness, particularly during the early stages of stress, will become increasingly important. This study evaluated the use of UAV-based multispectral and hyperspectral imagery to detect early herbicide-induced stress in a nine-year-old radiata pine (Pinus radiata D. Don) plantation, based on temporal changes in crown spectral signatures following treatment with metsulfuron-methyl. A staggered-treatment design was used, in which herbicide was applied to a subset of trees in six blocks over several weeks. This staggered design allowed a single UAV acquisition to capture imagery of trees at varying stages of herbicide response, with treated trees ranging from 13 to 47 days after treatment (DAT). Visual canopy assessments were carried out to validate the onset of visible symptoms. Spectral changes either preceded or coincided with the development of significant visible canopy symptoms, which started at 25 DAT. Classification models developed using narrow band hyperspectral indices (NBHI) allowed robust discrimination of treated and non-treated trees as early as 13 DAT (F1 score = 0.73), with stronger results observed at 18 DAT (F1 score = 0.78). Models that used multispectral indices were able to classify treatments with a similar accuracy from 18 DAT (F1 score = 0.78). Across both sensors, pigment-sensitive indices, particularly variants of the Photochemical Reflectance Index, consistently featured among the top predictors at all time points. 
These findings address a key knowledge gap by demonstrating practical, remote sensing-based solutions for monitoring and characterising herbicide-induced stress in field-grown radiata pine. The 13-to-18 DAT early detection window provides an operational baseline and a target for future research seeking to refine UAV-based detection of chemical thinning. Full article
(This article belongs to the Section Forest Health)
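The abstract singles out variants of the Photochemical Reflectance Index (PRI) as top predictors of early stress. The standard PRI uses reflectance at 531 nm and 570 nm; the reflectance values below are illustrative, not measurements from the study.

```python
def pri(r531, r570):
    """Photochemical Reflectance Index: (R531 - R570) / (R531 + R570)."""
    return (r531 - r570) / (r531 + r570)

# Assumed reflectances: the xanthophyll-cycle response to stress depresses R531
healthy = pri(r531=0.12, r570=0.10)
stressed = pri(r531=0.08, r570=0.10)
print(round(healthy, 3), round(stressed, 3))  # → 0.091 -0.111
```

The sign flip between the two cases is the kind of pigment-sensitive contrast that allowed discrimination of treated trees before visible symptoms.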
27 pages, 2978 KiB  
Article
Dynamic Monitoring and Precision Fertilization Decision System for Agricultural Soil Nutrients Using UAV Remote Sensing and GIS
by Xiaolong Chen, Hongfeng Zhang and Cora Un In Wong
Agriculture 2025, 15(15), 1627; https://doi.org/10.3390/agriculture15151627 - 27 Jul 2025
Abstract
We propose a dynamic monitoring and precision fertilization decision system for agricultural soil nutrients, integrating UAV remote sensing and GIS technologies to address the limitations of traditional soil nutrient assessment methods. The proposed method combines multi-source data fusion, including hyperspectral and multispectral UAV imagery with ground sensor data, to achieve high-resolution spatial and spectral analysis of soil nutrients. Real-time data processing algorithms enable rapid updates of soil nutrient status, while a time-series dynamic model captures seasonal variations and crop growth stage influences, improving prediction accuracy (RMSE reductions of 43–70% for nitrogen, phosphorus, and potassium compared to conventional laboratory-based methods and satellite NDVI approaches). The experimental validation compared the proposed system against two conventional approaches: (1) laboratory soil testing with standardized fertilization recommendations and (2) satellite NDVI-based fertilization. Field trials across three distinct agroecological zones demonstrated that the proposed system reduced fertilizer inputs by 18–27% while increasing crop yields by 4–11%, outperforming both conventional methods. Furthermore, an intelligent fertilization decision model generates tailored fertilization plans by analyzing real-time soil conditions, crop demands, and climate factors, with continuous learning enhancing its precision over time. The system also incorporates GIS-based visualization tools, providing intuitive spatial representations of nutrient distributions and interactive functionalities for detailed insights. Our approach significantly advances precision agriculture by automating the entire workflow from data collection to decision-making, reducing resource waste and optimizing crop yields. 
The integration of UAV remote sensing, dynamic modeling, and machine learning distinguishes this work from conventional static systems, offering a scalable and adaptive framework for sustainable farming practices. Full article
(This article belongs to the Section Agricultural Soils)
19 pages, 5166 KiB  
Article
Estimating Wheat Chlorophyll Content Using a Multi-Source Deep Feature Neural Network
by Jun Li, Yali Sheng, Weiqiang Wang, Jikai Liu and Xinwei Li
Agriculture 2025, 15(15), 1624; https://doi.org/10.3390/agriculture15151624 - 26 Jul 2025
Abstract
Chlorophyll plays a vital role in wheat growth and fertilization management. Accurate and efficient estimation of chlorophyll content is crucial for providing a scientific foundation for precision agricultural management. Unmanned aerial vehicles (UAVs), characterized by high flexibility, spatial resolution, and operational efficiency, have emerged as effective tools for estimating chlorophyll content in wheat. Although multi-source data derived from UAV-based multispectral imagery have shown potential for wheat chlorophyll estimation, the importance of multi-source deep feature fusion has not been adequately addressed. Therefore, this study aims to estimate wheat chlorophyll content by integrating spectral and textural features extracted from UAV multispectral imagery, in conjunction with partial least squares regression (PLSR), random forest regression (RFR), deep neural network (DNN), and a novel multi-source deep feature neural network (MDFNN) proposed in this research. The results demonstrate the following: (1) Except for the RFR model, models based on texture features exhibit superior accuracy compared to those based on spectral features. Furthermore, the estimation accuracy achieved by fusing spectral and texture features is significantly greater than that obtained using a single type of data. (2) The MDFNN proposed in this study outperformed other models in chlorophyll content estimation, with an R2 of 0.850, an RMSE of 5.602, and an RRMSE of 15.76%. Compared to the second-best model, the DNN (R2 = 0.799, RMSE = 6.479, RRMSE = 18.23%), the MDFNN achieved a 6.4% increase in R2, and 13.5% reductions in both RMSE and RRMSE. (3) The MDFNN exhibited strong robustness and adaptability across varying years, wheat varieties, and nitrogen application levels. The findings of this study offer important insights into UAV-based remote sensing applications for estimating wheat chlorophyll under field conditions. Full article
(This article belongs to the Section Artificial Intelligence and Digital Agriculture)
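The fusion finding above (spectral plus texture features beating spectral features alone) can be reproduced in miniature: when the texture channel carries independent information, a model trained on the concatenated features scores higher. Synthetic features stand in for the UAV-derived ones, and a random forest stands in for the MDFNN, so this is a sketch of the comparison, not of the proposed network.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(5)

# Assumed data: chlorophyll depends on one spectral and one texture feature
n = 400
spectral = rng.normal(size=(n, 5))
texture = rng.normal(size=(n, 5))
chl = 3 * spectral[:, 0] + 2 * texture[:, 0] + rng.normal(0, 0.5, n)

def fit_r2(X):
    """Train/test split, fit a random forest, return test-set R^2."""
    X_tr, X_te, y_tr, y_te = train_test_split(X, chl, test_size=0.3,
                                              random_state=0)
    rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
    return r2_score(y_te, rf.predict(X_te))

r2_spec = fit_r2(spectral)                       # spectral features only
r2_fused = fit_r2(np.hstack([spectral, texture]))  # spectral + texture fusion
print(f"spectral only: {r2_spec:.3f}  fused: {r2_fused:.3f}")
```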
24 pages, 12286 KiB  
Article
A UAV-Based Multi-Scenario RGB-Thermal Dataset and Fusion Model for Enhanced Forest Fire Detection
by Yalin Zhang, Xue Rui and Weiguo Song
Remote Sens. 2025, 17(15), 2593; https://doi.org/10.3390/rs17152593 - 25 Jul 2025
Abstract
UAVs are essential for forest fire detection due to vast forest areas and inaccessibility of high-risk zones, enabling rapid long-range inspection and detailed close-range surveillance. However, aerial photography faces challenges like multi-scale target recognition and complex scenario adaptation (e.g., deformation, occlusion, lighting variations). RGB-Thermal fusion methods integrate visible-light texture and thermal infrared temperature features effectively, but current approaches are constrained by limited datasets and insufficient exploitation of cross-modal complementary information, ignoring cross-level feature interaction. A time-synchronized multi-scene, multi-angle aerial RGB-Thermal dataset (RGBT-3M) with “Smoke–Fire–Person” annotations and modal alignment via the M-RIFT method was constructed as a way to address the problem of data scarcity in wildfire scenarios. Finally, we propose a CP-YOLOv11-MF fusion detection model based on the advanced YOLOv11 framework, which can learn heterogeneous features complementary to each modality in a progressive manner. Experimental validation proves the superiority of our method, with a precision of 92.5%, a recall of 93.5%, a mAP50 of 96.3%, and a mAP50-95 of 62.9%. The model’s RGB-Thermal fusion capability enhances early fire detection, offering a benchmark dataset and methodological advancement for intelligent forest conservation, with implications for AI-driven ecological protection. Full article
(This article belongs to the Special Issue Advances in Spectral Imagery and Methods for Fire and Smoke Detection)
30 pages, 5734 KiB  
Article
Evaluating Remote Sensing Products for Pasture Composition and Yield Prediction
by Karen Melissa Albacura-Campues, Izar Sinde-González, Javier Maiguashca, Myrian Herrera, Judith Zapata and Theofilos Toulkeridis
Remote Sens. 2025, 17(15), 2561; https://doi.org/10.3390/rs17152561 - 23 Jul 2025
Abstract
Vegetation and soil indices are able to indicate patterns of gradual plant growth. Therefore, productivity data may be used to predict performance in the development of pastures prior to grazing, since the morphology of the pasture follows repetitive cycles through the grazing of animals. Accordingly, in recent decades, much attention has been paid to the monitoring and development of vegetation by means of remote sensing using remote sensors. The current study seeks to determine the differences between three remote sensing products in the monitoring and development of white clover and perennial ryegrass ratios. Various grass and legume associations (perennial ryegrass, Lolium perenne, and white clover, Trifolium repens) were evaluated in different proportions to determine their yield and relationship through vegetation and soil indices. Four proportions (%) of perennial ryegrass and white clover were used, being 100:0; 90:10; 80:20 and 70:30. Likewise, to obtain spectral indices, a Spectral Evolution PSR-1100 spectroradiometer was used, and two UAVs with a MAPIR 3W RGNIR camera and a Parrot Sequoia multispectral camera, respectively, were employed. The data collection was performed before and after each cut or grazing period in each experimental unit, and post-processing and the generation of spectral indices were conducted. The results indicate that there were no significant differences between treatments for yield or for vegetation indices. However, there were significant differences in the index variables between sensors, with the spectroradiometer and Parrot obtaining similar values for the indices both pre- and post-grazing. The NDVI values were closely correlated with the yield of the forage proportions (R2 = 0.8948), constituting an optimal index for the prediction of pasture yield. Full article
(This article belongs to the Special Issue Application of Satellite and UAV Data in Precision Agriculture)
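The NDVI-yield relationship behind the reported R2 = 0.8948 is a simple plot-level linear fit, which can be sketched as below. The synthetic NDVI and yield values, and the assumed linear relation, are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(3)

# Assumed plot-level data: 40 pasture plots with NDVI and forage yield
ndvi = rng.uniform(0.5, 0.9, 40)
yield_t_ha = 2.0 + 8.0 * ndvi + rng.normal(0, 0.4, 40)  # assumed linear relation

# Ordinary least-squares fit and coefficient of determination
slope, intercept = np.polyfit(ndvi, yield_t_ha, 1)
pred = slope * ndvi + intercept
ss_res = np.sum((yield_t_ha - pred) ** 2)
ss_tot = np.sum((yield_t_ha - yield_t_ha.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(f"R2 = {r2:.3f}")
```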
21 pages, 16254 KiB  
Article
Prediction of Winter Wheat Yield and Interpretable Accuracy Under Different Water and Nitrogen Treatments Based on CNNResNet-50
by Donglin Wang, Yuhan Cheng, Longfei Shi, Huiqing Yin, Guangguang Yang, Shaobo Liu, Qinge Dong and Jiankun Ge
Agronomy 2025, 15(7), 1755; https://doi.org/10.3390/agronomy15071755 - 21 Jul 2025
Abstract
Winter wheat yield prediction is critical for optimizing field management plans and guiding agricultural production. To address the limitations of conventional manual yield estimation methods, including low efficiency and poor interpretability, this study innovatively proposes an intelligent yield estimation method based on a convolutional neural network (CNN). A comprehensive two-factor (fertilization × irrigation) controlled field experiment was designed to thoroughly validate the applicability and effectiveness of this method. The experimental design comprised two irrigation treatments, sufficient irrigation (C) at 750 m3 ha−1 and deficit irrigation (M) at 450 m3 ha−1, along with five fertilization treatments (at a rate of 180 kg N ha−1): (1) organic fertilizer alone, (2) organic–inorganic fertilizer blend at a 7:3 ratio, (3) organic–inorganic fertilizer blend at a 3:7 ratio, (4) inorganic fertilizer alone, and (5) no fertilizer control. The experimental protocol employed a DJI M300 RTK unmanned aerial vehicle (UAV) equipped with a multispectral sensor to systematically acquire high-resolution growth imagery of winter wheat across critical phenological stages, from heading to maturity. The acquired multispectral imagery was meticulously annotated using the Labelme professional annotation tool to construct a comprehensive experimental dataset comprising over 2000 labeled images. These annotated data were subsequently employed to train an enhanced CNN model based on ResNet50 architecture, which achieved automated generation of panicle density maps and precise panicle counting, thereby realizing yield prediction. 
Field experimental results demonstrated significant yield variations among fertilization treatments under sufficient irrigation, with the 3:7 organic–inorganic blend achieving the highest actual yield (9363.38 ± 468.17 kg ha−1) significantly outperforming other treatments (p < 0.05), confirming the synergistic effects of optimized nitrogen and water management. The enhanced CNN model exhibited superior performance, with an average accuracy of 89.0–92.1%, representing a 3.0% improvement over YOLOv8. Notably, model accuracy showed significant correlation with yield levels (p < 0.05), suggesting more distinct panicle morphological features in high-yield plots that facilitated model identification. The CNN’s yield predictions demonstrated strong agreement with the measured values, maintaining mean relative errors below 10%. Particularly outstanding performance was observed for the organic fertilizer with full irrigation (5.5% error) and the 7:3 organic-inorganic blend with sufficient irrigation (8.0% error), indicating that the CNN network is more suitable for these management regimes. These findings provide a robust technical foundation for precision farming applications in winter wheat production. Future research will focus on integrating this technology into smart agricultural management systems to enable real-time, data-driven decision making at the farm scale. Full article
(This article belongs to the Section Precision and Digital Agriculture)
28 pages, 7545 KiB  
Article
Estimation of Rice Leaf Nitrogen Content Using UAV-Based Spectral–Texture Fusion Indices (STFIs) and Two-Stage Feature Selection
by Xiaopeng Zhang, Yating Hu, Xiaofeng Li, Ping Wang, Sike Guo, Lu Wang, Cuiyu Zhang and Xue Ge
Remote Sens. 2025, 17(14), 2499; https://doi.org/10.3390/rs17142499 - 18 Jul 2025
Abstract
Accurate estimation of rice leaf nitrogen content (LNC) is essential for optimizing nitrogen management in precision agriculture. However, challenges such as spectral saturation and canopy structural variations across different growth stages complicate this task. This study proposes a robust framework for LNC estimation that integrates both spectral and texture features extracted from UAV-based multispectral imagery through the development of novel Spectral–Texture Fusion Indices (STFIs). Field data were collected under nitrogen gradient treatments across three critical growth stages: heading, early filling, and late filling. A total of 18 vegetation indices (VIs), 40 texture features (TFs), and 27 STFIs were derived from UAV images. To optimize the feature set, a two-stage feature selection strategy was employed, combining Pearson correlation analysis with model-specific embedded selection methods: Recursive Feature Elimination with Cross-Validation (RFECV) for Random Forest (RF) and Extreme Gradient Boosting (XGBoost), and Sequential Forward Selection (SFS) for Support Vector Regression (SVR) and Deep Neural Networks (DNNs). The models—RFECV-RF, RFECV-XGBoost, SFS-SVR, and SFS-DNN—were evaluated using four feature configurations. The SFS-DNN model with STFIs achieved the highest prediction accuracy (R2 = 0.874, RMSE = 2.621 mg/g). SHAP analysis revealed the significant contribution of STFIs to model predictions, underscoring the effectiveness of integrating spectral and texture information. The proposed STFI-based framework demonstrates strong generalization across phenological stages and offers a scalable, interpretable approach for UAV-based nitrogen monitoring in rice production systems. Full article
(This article belongs to the Section Remote Sensing in Agriculture and Vegetation)
32 pages, 6589 KiB  
Article
Machine Learning (AutoML)-Driven Wheat Yield Prediction for European Varieties: Enhanced Accuracy Using Multispectral UAV Data
by Krstan Kešelj, Zoran Stamenković, Marko Kostić, Vladimir Aćin, Dragana Tekić, Tihomir Novaković, Mladen Ivanišević, Aleksandar Ivezić and Nenad Magazin
Agriculture 2025, 15(14), 1534; https://doi.org/10.3390/agriculture15141534 - 16 Jul 2025
Abstract
Accurate and timely wheat yield prediction is valuable globally for enhancing agricultural planning, optimizing resource use, and supporting trade strategies. This study addresses the need for precision in yield estimation by applying machine-learning (ML) regression models to high-resolution Unmanned Aerial Vehicle (UAV) multispectral (MS) and Red-Green-Blue (RGB) imagery. The research analyzes five European wheat cultivars across 400 experimental plots created by combining 20 nitrogen, phosphorus, and potassium (NPK) fertilizer treatments. Yield variations from 1.41 to 6.42 t/ha strengthen model robustness with diverse data. The ML approach is automated using PyCaret, which optimized and evaluated 25 regression models based on 65 vegetation indices and yield data, resulting in 66 feature variables across 400 observations. The dataset, split into training (70%) and testing (30%) sets, was used to predict yields at three growth stages: 9 May, 20 May, and 6 June 2022. Key models achieved high accuracy, with the Support Vector Regression (SVR) model reaching R2 = 0.95 on 9 May and R2 = 0.91 on 6 June, and the Multi-Layer Perceptron (MLP) Regressor attaining R2 = 0.94 on 20 May. The findings underscore the effectiveness of precisely measured MS indices and a rigorous experimental approach in achieving high-accuracy yield predictions. This study demonstrates how a precise experimental setup, large-scale field data, and AutoML can harness the potential of UAVs and machine learning to enhance wheat yield predictions. The main limitations of this study lie in its focus on experimental fields under specific conditions; future research could explore adaptability to diverse environments and wheat varieties for broader applicability. Full article
(This article belongs to the Special Issue Applications of Remote Sensing in Agricultural Soil and Crop Mapping)
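The AutoML workflow summarized above (PyCaret comparing many regressors on 66 index features for 400 plots with a 70/30 split) can be approximated manually with scikit-learn. The features and yields below are simulated, and the two models shown (SVR and an MLP) mirror the abstract's best performers rather than reproduce its results.

```python
import numpy as np
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

# Simulated stand-in: 400 plots x 66 vegetation-index features,
# with yields placed roughly in the paper's 1.41-6.42 t/ha range.
rng = np.random.default_rng(1)
X = rng.uniform(size=(400, 66))
y = 1.41 + 5.0 * X[:, :5].mean(axis=1) + rng.normal(scale=0.2, size=400)

# 70% training / 30% testing split, as in the study.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)

# PyCaret's compare_models() automates this loop over ~25 regressors;
# here only two are shown, each scaled via a pipeline.
models = {
    "SVR": make_pipeline(StandardScaler(), SVR(C=10.0)),
    "MLP": make_pipeline(StandardScaler(),
                         MLPRegressor(hidden_layer_sizes=(64,),
                                      max_iter=1000, random_state=1)),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(name, "R2 =", round(r2_score(y_te, model.predict(X_te)), 3))
```

In PyCaret itself, the equivalent would be a `setup()` call on the feature table followed by `compare_models()`, which handles the cross-validated leaderboard automatically.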

23 pages, 3492 KiB  
Article
A Multimodal Deep Learning Framework for Accurate Biomass and Carbon Sequestration Estimation from UAV Imagery
by Furkat Safarov, Ugiloy Khojamuratova, Misirov Komoliddin, Xusinov Ibragim Ismailovich and Young Im Cho
Drones 2025, 9(7), 496; https://doi.org/10.3390/drones9070496 - 14 Jul 2025
Abstract
Accurate quantification of above-ground biomass (AGB) and carbon sequestration is vital for monitoring terrestrial ecosystem dynamics, informing climate policy, and supporting carbon neutrality initiatives. However, conventional methods—ranging from manual field surveys to remote sensing techniques based solely on 2D vegetation indices—often fail to capture the intricate spectral and structural heterogeneity of forest canopies, particularly at fine spatial resolutions. To address these limitations, we introduce ForestIQNet, a novel end-to-end multimodal deep learning framework designed to estimate AGB and associated carbon stocks from UAV-acquired imagery with high spatial fidelity. ForestIQNet combines dual-stream encoders for processing multispectral UAV imagery and a voxelized Canopy Height Model (CHM), fused via a Cross-Attentional Feature Fusion (CAFF) module, enabling fine-grained interaction between spectral reflectance and 3D structure. A lightweight Transformer-based regression head then performs multitask prediction of AGB and CO2e, capturing long-range spatial dependencies and enhancing generalization. The proposed method achieves an R2 of 0.93 and RMSE of 6.1 kg for AGB prediction, compared to an R2 of 0.78 and RMSE of 11.7 kg for XGBoost and an R2 of 0.73 and RMSE of 13.2 kg for Random Forest. Despite its architectural complexity, ForestIQNet maintains a low inference cost (27 ms per patch) and generalizes well across species, terrain, and canopy structures. These results establish a new benchmark for UAV-enabled biomass estimation and provide scalable, interpretable tools for climate monitoring and forest management. Full article
(This article belongs to the Special Issue UAVs for Nature Conservation Tasks in Complex Environments)
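The cross-attentional fusion idea in the abstract above (spectral tokens attending to structural CHM tokens) can be illustrated with a minimal single-head cross-attention in NumPy. The projection weights are random and the token shapes are invented for this sketch, so it demonstrates the mechanism only, not ForestIQNet's actual CAFF module.

```python
import numpy as np

def cross_attention(q_feats, kv_feats, d_k=16, seed=0):
    """Single-head cross-attention: query tokens (spectral) attend to
    key/value tokens (structural). Random projections stand in for
    learned weight matrices."""
    rng = np.random.default_rng(seed)
    Wq = rng.normal(size=(q_feats.shape[1], d_k))
    Wk = rng.normal(size=(kv_feats.shape[1], d_k))
    Wv = rng.normal(size=(kv_feats.shape[1], d_k))
    Q, K, V = q_feats @ Wq, kv_feats @ Wk, kv_feats @ Wv
    scores = Q @ K.T / np.sqrt(d_k)          # scaled dot-product
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)  # row-wise softmax
    return weights @ V                       # fused representation

# Invented token sets: 64 patch tokens with 5 multispectral bands,
# and 64 matching tokens from a voxelized CHM summary.
spectral = np.random.default_rng(1).normal(size=(64, 5))
structure = np.random.default_rng(2).normal(size=(64, 1))
fused = cross_attention(spectral, structure)
print(fused.shape)
```

In the full model, the fused tokens would feed the Transformer-based regression head that predicts AGB and CO2e jointly.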

25 pages, 16927 KiB  
Article
Improving Individual Tree Crown Detection and Species Classification in a Complex Mixed Conifer–Broadleaf Forest Using Two Machine Learning Models with Different Combinations of Metrics Derived from UAV Imagery
by Jeyavanan Karthigesu, Toshiaki Owari, Satoshi Tsuyuki and Takuya Hiroshima
Geomatics 2025, 5(3), 32; https://doi.org/10.3390/geomatics5030032 - 13 Jul 2025
Abstract
Individual tree crown detection (ITCD) and tree species classification are critical for forest inventory, species-specific monitoring, and ecological studies. However, accurately detecting tree crowns and identifying species in structurally complex forests with overlapping canopies remains challenging. This study was conducted in a complex mixed conifer–broadleaf forest in northern Japan, aiming to improve ITCD and species classification by employing two machine learning models and different combinations of metrics derived from very high-resolution (2.5 cm) UAV red–green–blue (RGB) and multispectral (MS) imagery. We first enhanced ITCD by integrating different combinations of metrics into multiresolution segmentation (MRS) and DeepForest (DF) models. ITCD accuracy was evaluated across dominant forest types and tree density classes. Next, nine tree species were classified using the ITCD outputs from both MRS and DF approaches, applying Random Forest and DF models, respectively. Incorporating structural, textural, and spectral metrics improved MRS-based ITCD, achieving F-scores of 0.44–0.58. The DF model, which used only structural and spectral metrics, achieved higher F-scores of 0.62–0.79. For species classification, the Random Forest model achieved a Kappa value of 0.81, while the DF model attained a higher Kappa value of 0.91. These findings demonstrate the effectiveness of integrating UAV-derived metrics and advanced modeling approaches for accurate ITCD and species classification in heterogeneous forest environments. The proposed methodology offers a scalable and cost-efficient solution for detailed forest monitoring and species-level assessment. Full article
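The Random Forest species-classification step above, evaluated with a Kappa coefficient, can be sketched with scikit-learn on simulated per-crown metrics. The feature construction, class definitions, and resulting scores here are illustrative only and do not correspond to the study's nine species or its reported Kappa values.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import cohen_kappa_score
from sklearn.model_selection import train_test_split

# Simulated per-crown metrics (structural, textural, spectral) for
# 900 detected crowns; three invented species classes.
rng = np.random.default_rng(3)
X = rng.normal(size=(900, 12))
species = (X[:, :3].sum(axis=1) > 0).astype(int) + (X[:, 3] > 1).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, species, test_size=0.3, random_state=3, stratify=species)

# Random Forest classifier on the crown-level metrics, scored with
# Cohen's Kappa as in the abstract.
clf = RandomForestClassifier(n_estimators=200, random_state=3).fit(X_tr, y_tr)
kappa = cohen_kappa_score(y_te, clf.predict(X_te))
print("Kappa =", round(kappa, 2))
```

In practice the feature matrix would come from the ITCD outputs (MRS or DeepForest segments) with metrics computed per delineated crown from the RGB and MS orthomosaics.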
