Search Results (218)

Search Parameters:
Keywords = UAV phenotyping

21 pages, 6740 KB  
Article
Co-Registration of UAV and Handheld LiDAR Data for Fine Phenotyping of Rubber Plantations with Complex Canopies
by Junxiang Tan, Hao Chen, Kaihui Zhang, Hao Yang, Xiongjie Wang, Ronghao Yang, Guyue Hu, Shaoda Li, Jianfei Liu and Xiangjun Wang
Plants 2026, 15(3), 376; https://doi.org/10.3390/plants15030376 - 26 Jan 2026
Abstract
Rubber tree phenotyping is transitioning from labor-intensive manual techniques toward high-throughput intelligent sensing platforms. However, the advancement of high-throughput phenotyping remains hindered by complex canopy architectures and pronounced seasonal morphological variations. To address these challenges, this paper introduces a unified phenotyping framework that leverages a novel Wood Salient Keypoint (WSK)-based registration algorithm to achieve seamless data fusion from unmanned aerial vehicle laser scanning (ULS) and handheld laser scanning (HLS) systems. The proposed approach begins by extracting stable wooden structures through a region-of-interest (ROI) segmentation process. Repeatable WSKs are then generated using a newly proposed wood structure significance (WSS) score, which quantifies and identifies salient regions across multi-view data. For transformation estimation, descriptor matching, WSS constraints, and geometric consistency optimization are integrated into a fast global registration (FGR) pipeline. Extensive evaluation across 25 plots covering 5 sites at the National rubber plantation base in Danzhou, Hainan, China, demonstrates that the method achieves a mean co-registration accuracy of 9 cm. Further analysis under varying seasonal canopy complexities confirms its robustness and critical role in enabling high-precision rubber tree phenotyping. Full article
(This article belongs to the Special Issue Advances in Artificial Intelligence for Plant Research—2nd Edition)
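
The WSK extraction and WSS scoring are specific to this paper, but the final alignment step builds on the standard fast global registration (FGR) pipeline. A minimal sketch of that generic step with Open3D and FPFH descriptors is given below; the voxel size, search radii, and point-cloud file names are illustrative assumptions, not values from the article.

```python
# Generic FGR alignment of two point clouds with Open3D (illustrative sketch;
# the paper's WSK/WSS keypoint extraction and constraints are not reproduced here).
import open3d as o3d

def preprocess(pcd, voxel=0.10):
    # Downsample, estimate normals, and compute FPFH descriptors.
    down = pcd.voxel_down_sample(voxel)
    down.estimate_normals(o3d.geometry.KDTreeSearchParamHybrid(radius=voxel * 2, max_nn=30))
    fpfh = o3d.pipelines.registration.compute_fpfh_feature(
        down, o3d.geometry.KDTreeSearchParamHybrid(radius=voxel * 5, max_nn=100))
    return down, fpfh

# Hypothetical file names for the UAV (ULS) and handheld (HLS) scans.
uls = o3d.io.read_point_cloud("uls_plot.ply")
hls = o3d.io.read_point_cloud("hls_plot.ply")

uls_down, uls_fpfh = preprocess(uls)
hls_down, hls_fpfh = preprocess(hls)

result = o3d.pipelines.registration.registration_fgr_based_on_feature_matching(
    hls_down, uls_down, hls_fpfh, uls_fpfh,
    o3d.pipelines.registration.FastGlobalRegistrationOption(
        maximum_correspondence_distance=0.05))
print("Estimated HLS -> ULS transform:\n", result.transformation)
```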

26 pages, 3744 KB  
Article
Analysis of Vegetation Dynamics and Phenotypic Differentiation in Five Triticale (×Triticosecale Wittm.) Varieties Using UAV-Based Multispectral Indices
by Asparuh I. Atanasov, Hristo P. Stoyanov, Atanas Z. Atanasov and Boris I. Evstatiev
Agronomy 2026, 16(3), 303; https://doi.org/10.3390/agronomy16030303 - 25 Jan 2026
Abstract
This study investigates the vegetation dynamics and phenotypic differentiation of five triticale (×Triticosecale Wittm.) varieties under the region-specific agroecological conditions of Southern Dobruja, Bulgaria, across two growing seasons (2024–2025), with the aim of evaluating how local climatic variability shapes vegetation index patterns. UAV-based multispectral imaging was employed throughout key phenological stages to obtain reflectance indices, including NDVI, SAVI, EVI2, and NIRI, which served as indicators of canopy development and physiological status. NDVI was used as the primary reference index, and a baseline value (NDVIbase), defined as the mean NDVI across all varieties on a given date, was applied to evaluate relative varietal deviations over time. Multiple linear regression analyses were performed to assess the relationship between NDVI and baseline biometric parameters for each variety, revealing that varieties 22/78 and 20/52 exhibited reflectance dynamics most closely aligned with expected developmental trends in 2025. In addition, the relationship between NDVI and meteorological variables was examined for the variety Kolorit, demonstrating that relative humidity exerted a pronounced influence on index variability. The findings highlight the sensitivity of triticale vegetation indices to both varietal characteristics and short-term climatic fluctuations. Overall, the study provides a methodological framework for integrating UAV-based multispectral data with meteorological information, emphasizing the importance of region-specific, time-resolved monitoring for improving precision agriculture practices, optimizing crop management, and supporting informed variety selection. Full article
(This article belongs to the Section Precision and Digital Agriculture)
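
The NDVIbase deviation defined above reduces to a per-date mean comparison. A minimal pandas sketch, assuming a hypothetical long-format table with variety, date, and plot-mean NDVI columns:

```python
import pandas as pd

# Hypothetical long-format table: one row per variety x acquisition date.
df = pd.DataFrame({
    "variety": ["22/78", "20/52", "Kolorit", "22/78", "20/52", "Kolorit"],
    "date":    ["2025-04-10"] * 3 + ["2025-05-05"] * 3,
    "ndvi":    [0.61, 0.58, 0.55, 0.74, 0.72, 0.69],
})

# NDVIbase = mean NDVI across all varieties on a given date (as defined in the abstract).
df["ndvi_base"] = df.groupby("date")["ndvi"].transform("mean")

# Relative varietal deviation from the baseline on each date.
df["deviation"] = df["ndvi"] - df["ndvi_base"]
print(df.pivot(index="date", columns="variety", values="deviation"))
```
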
20 pages, 2028 KB  
Review
Advances in Boron, Iron, Manganese, and Zinc Signaling, Transport, and Functional Integration for Enhancing Cotton Nutrient Efficiency and Yield—A Review
by Unius Arinaitwe, Dalitso Noble Yabwalo, Abraham Hangamaisho, Shillah Kwikiiriza and Francis Akitwine
Int. J. Plant Biol. 2026, 17(1), 7; https://doi.org/10.3390/ijpb17010007 - 20 Jan 2026
Viewed by 133
Abstract
Micronutrients, particularly boron (B), iron (Fe), manganese (Mn), and zinc (Zn), are pivotal for cotton (Gossypium spp.) growth, reproductive success, and fiber quality. However, their critical roles are often overlooked in fertility programs focused primarily on macronutrients. This review synthesizes recent advances in the physiological, molecular, and agronomic understanding of B, Fe, Mn, and Zn in cotton production. The overarching goal is to elucidate their impact on cotton nutrient use efficiency (NUE). Drawing from the peer-reviewed literature, we highlight how these micronutrients regulate essential processes, including photosynthesis, cell wall integrity, hormone signaling, and stress remediation. These processes directly influence root development, boll retention, and fiber quality. As a result, deficiencies in these micronutrients contribute to significant yield gaps even when macronutrients are sufficiently supplied. Key genes, including Boron Transporter 1 (BOR1), Iron-Regulated Transporter 1 (IRT1), Natural Resistance-Associated Macrophage Protein 1 (NRAMP1), Zinc-Regulated Transporter/Iron-Regulated Transporter-like Protein (ZIP), and Gossypium hirsutum Zinc/Iron-regulated transporter-like Protein 3 (GhZIP3), are crucial for mediating micronutrient uptake and homeostasis. These genes can be leveraged in breeding for high-yielding, nutrient-efficient cotton varieties. In addition to molecular hacks, advanced phenotyping technologies, such as unmanned aerial vehicles (UAVs) and single-cell RNA sequencing (scRNA-seq; a technology that measures gene expression at single-cell level, enabling the high-resolution analysis of cellular diversity and the identification of rare cell types), provide novel avenues for identifying nutrient-efficient genotypes and elucidating regulatory networks. Future research directions should include leveraging microRNAs, CRISPR-based gene editing, and precision nutrient management to enhance the use efficiency of B, Fe, Mn, and Zn. These approaches are essential for addressing environmental challenges and closing persistent yield gaps within sustainable cotton production systems. Full article

27 pages, 7545 KB  
Article
Winter Wheat Yield Estimation Under Different Management Practices Using Multi-Source Data Fusion
by Hao Kong, Jingxu Wang, Taiyi Cai, Jun Du, Chang Zhao, Chanjuan Hu and Han Jiang
Agronomy 2026, 16(1), 71; https://doi.org/10.3390/agronomy16010071 - 25 Dec 2025
Viewed by 287
Abstract
Accurate crop yield estimation under differentiated management practices is a core requirement for the development of smart agriculture. However, current yield estimation models face two major challenges: limited adaptability to different management practices, thus exhibiting poor generalizability, and ineffective integration of multi-source remote sensing features, limiting further improvements in estimation accuracy. To address these issues, this study integrated UAV-based multispectral and thermal infrared remote sensing data to propose a yield estimation framework based on multi-source feature fusion. First, three machine learning algorithms—Partial Least Squares Regression (PLSR), Random Forest (RF), and Extreme Gradient Boosting (XGBoost)—were employed to retrieve key biochemical parameters of winter wheat. The RF model demonstrated superior performance, with retrieval accuracies for chlorophyll, nitrogen, and phosphorus contents of R2 = 0.8347, 0.5914, and 0.9364 and RMSE = 0.2622, 0.4127, and 0.0236, respectively. Subsequently, yield estimation models were constructed by integrating the retrieved biochemical parameters with phenotypic traits such as plant height and biomass. The RF model again exhibited superior performance (R2 = 0.66, RMSE = 867.28 kg/ha). SHapley Additive exPlanations (SHAP) analysis identified May chlorophyll content (Chl-5) and March chlorophyll content (Chl-3) as the most critical variables for yield prediction, with stable positive contributions to yield when their values exceeded 2.80 mg/g and 2.50 mg/g, respectively. The quantitative assessment of management practices revealed that the straw return + 50% inorganic fertilizer + 50% organic fertilizer (RIO50) treatment under the combined organic–inorganic fertilization regime achieved the highest measured grain yield (11,469 kg/ha). Consequently, this treatment can be regarded as an optimized practice for attaining high yield. This study confirms that focusing on chlorophyll dynamics during key physiological stages is an effective approach for enhancing yield estimation accuracy under varied management practices, providing a technical basis for precise field management. Full article
(This article belongs to the Section Precision and Digital Agriculture)
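
As an illustration of the RF-plus-SHAP workflow described above (not the authors' code), the sketch below fits a random forest to a synthetic feature matrix and ranks features by mean absolute SHAP value; all data and feature names are placeholders.

```python
import numpy as np
import shap
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Hypothetical feature matrix: multispectral + thermal features per plot,
# with chlorophyll content as the retrieval target (values are synthetic).
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 10))            # 120 plots x 10 remote sensing features
y = X[:, 0] * 0.8 + X[:, 3] * 0.3 + rng.normal(scale=0.2, size=120)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
rf = RandomForestRegressor(n_estimators=500, random_state=0).fit(X_train, y_train)
print("R2 on held-out plots:", rf.score(X_test, y_test))

# SHAP values attribute each prediction to individual features,
# analogous to the Chl-5 / Chl-3 importance analysis in the abstract.
explainer = shap.TreeExplainer(rf)
shap_values = explainer.shap_values(X_test)
print("Mean |SHAP| per feature:", np.abs(shap_values).mean(axis=0))
```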

17 pages, 2184 KB  
Article
Soybean Yield Prediction with High-Throughput Phenotyping Data and Machine Learning
by Predrag Ranđelović, Vuk Đorđević, Jegor Miladinović, Simona Bukonja, Marina Ćeran, Vojin Đukić and Marjana Vasiljević
Agriculture 2026, 16(1), 22; https://doi.org/10.3390/agriculture16010022 - 21 Dec 2025
Cited by 1 | Viewed by 546
Abstract
The non-destructive estimation of grain yield could increase the efficiency of soybean breeding through early genotype testing, allowing for more precise selection of superior varieties. High-throughput phenotyping (HTPP) data can be combined with machine learning (ML) to develop accurate prediction models. In this study, an unmanned aerial vehicle (UAV) equipped with a multispectral camera was utilized to collect data on plant density (PD), plant height (PH), canopy cover (CC), biomass (BM), and various vegetation indices (VIs) from different stages of soybean development. These traits were used within random forest (RF) and partial least squares regression (PLSR) algorithms to develop models for soybean yield estimation. The initial RF model produced more accurate results, as it had a smaller error between actual and predicted yield compared with the PLSR model. To increase the efficiency of the RF model and optimize the data collection process, the number of predictors was gradually decreased by eliminating highly correlated VIs and selecting the most important variables. The final prediction was based only on several VIs calculated from a few mid-soybean stages. Although the reduction in the number of predictors increased the yield estimation error to some extent, the R2 in the final model remained high (R2 = 0.79). Therefore, the proposed ML model based on specific HTPP variables represents an optimal balance between efficiency and prediction accuracy for in-season soybean yield estimation. Full article
(This article belongs to the Section Artificial Intelligence and Digital Agriculture)
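
A minimal sketch of the predictor-reduction step described above, assuming a hypothetical plot-level table of HTPP variables: highly correlated columns are dropped first, then the remaining predictors are ranked by random forest importance.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

def prune_correlated(df, threshold=0.95):
    # Drop one of every pair of vegetation indices whose |r| exceeds the threshold.
    corr = df.corr().abs()
    upper = corr.where(np.triu(np.ones(corr.shape, dtype=bool), k=1))
    drop = [col for col in upper.columns if (upper[col] > threshold).any()]
    return df.drop(columns=drop)

# Hypothetical plot-level table of HTPP predictors (VIs, PH, CC, BM ...) and yield.
rng = np.random.default_rng(1)
X = pd.DataFrame(rng.normal(size=(150, 12)),
                 columns=[f"vi_{i}" for i in range(10)] + ["plant_height", "canopy_cover"])
y = X["vi_0"] * 0.6 + X["plant_height"] * 0.4 + rng.normal(scale=0.3, size=150)

X_pruned = prune_correlated(X, threshold=0.95)
rf = RandomForestRegressor(n_estimators=300, random_state=1).fit(X_pruned, y)

# Keep only the most important remaining predictors for the reduced model.
importance = pd.Series(rf.feature_importances_, index=X_pruned.columns)
print(importance.sort_values(ascending=False).head(5))
```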

18 pages, 3498 KB  
Article
Improved Estimation of Cotton Aboveground Biomass Using a New Developed Multispectral Vegetation Index and Particle Swarm Optimization
by Guanyu Wu, Mingyu Hou, Yuqiao Wang, Hongchun Sun, Liantao Liu, Ke Zhang, Lingxiao Zhu, Xiuliang Jin, Cundong Li and Yongjiang Zhang
Agriculture 2025, 15(24), 2608; https://doi.org/10.3390/agriculture15242608 - 17 Dec 2025
Viewed by 316
Abstract
Accurate and rapid estimation of aboveground biomass (AGB) in cotton is crucial for precise agricultural management. However, current AGB estimation methods are limited by data homogeneity and insufficient model accuracy, which fail to comprehensively reflect the cotton growth status. This study introduces a novel approach by coupling cotton canopy Soil and Plant Analyzer Development (SPAD) values with multispectral (MS) data to achieve precise estimation of cotton AGB. Two experimental treatments, involving varied nitrogen fertilizer rates and organic manure applications, were conducted from 2022 to 2023. MS data from UAVs were collected across multiple cotton growth stages, while AGB and canopy SPAD values were synchronously measured. Using the coefficient of variation method, SPAD values were coupled with existing vegetation indices to develop a novel vegetation index termed CGSIVI. Moreover, the applicability of various machine learning algorithms—including Random Forest Regressor (RFR), eXtreme Gradient Boosting (XGBoost), Categorical Boosting (CatBoost), Particle Swarm Optimization-XGBoost (PSO-XGBoost), and Particle Swarm Optimization-CatBoost (PSO-CatBoost)—was evaluated for inverting cotton AGB. The results indicated that, compared to the original vegetation indices, the correlation between the improved vegetation index (CGSIVI) and AGB was enhanced by 13.60% overall, with the CGSICIre exhibiting the highest correlation with cotton AGB (R2 = 0.87). The overall AGB estimation accuracy across different growth stages, spanning the entire growth period, ranged from 0.768 to 0.949, peaking during the flowering stage. Furthermore, when the CGSIVI was used as an input parameter in comparisons of different machine learning algorithms, the PSO-XGBoost algorithm demonstrated superior estimation accuracy across the entire growth stage and within individual growth stages. This high-throughput crop phenotyping analysis method enables rapid and accurate estimation. It reveals the spatial heterogeneity of cotton growth status, thereby providing a powerful tool for accurately identifying growth differences in the field. Full article
(This article belongs to the Special Issue Unmanned Aerial System for Crop Monitoring in Precision Agriculture)
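
The abstract does not give the CGSIVI formula, so the sketch below shows one plausible coefficient-of-variation weighting that couples scaled SPAD values with an existing index (here a hypothetical CIre series); it is an assumed formulation for illustration, not the published definition.

```python
import numpy as np

def cv_weight_couple(spad, vi):
    """Couple SPAD with a vegetation index using coefficient-of-variation weights.

    Assumed formulation for illustration only: each variable is min-max scaled,
    its weight is proportional to its coefficient of variation (std / mean),
    and the coupled index is the weighted sum. The published CGSIVI may differ.
    """
    def scale(x):
        return (x - x.min()) / (x.max() - x.min())

    cv_spad = spad.std() / spad.mean()
    cv_vi = vi.std() / vi.mean()
    w_spad = cv_spad / (cv_spad + cv_vi)
    w_vi = cv_vi / (cv_spad + cv_vi)
    return w_spad * scale(spad) + w_vi * scale(vi)

# Synthetic plot-level SPAD readings and a red-edge chlorophyll index (CIre).
rng = np.random.default_rng(2)
spad = rng.normal(45, 6, size=80)
cire = rng.normal(2.0, 0.4, size=80)
coupled = cv_weight_couple(spad, cire)
print(coupled[:5])
```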

20 pages, 3159 KB  
Article
Photosynthetic and Canopy Trait Characterization in Soybean (Glycine max L.) Using Chlorophyll Fluorescence and UAV Imaging
by Harmeet Singh-Bakala, Francia Ravelombola, Jacob D. Washburn, Grover Shannon, Ru Zhang and Feng Lin
Agriculture 2025, 15(24), 2576; https://doi.org/10.3390/agriculture15242576 - 12 Dec 2025
Viewed by 554
Abstract
Photosynthesis (PS) is the cornerstone of crop productivity, directly influencing yield potential. Photosynthesis remains an underexploited target in soybean breeding, partly because field-based photosynthetic traits are difficult to measure at scale. Also, it is unclear which reproductive stage(s) provide the most informative physiological signals for yield. Few studies have evaluated soybean PS in elite germplasm under field conditions, and the integration of chlorophyll fluorescence (CF) with UAV imaging for PS traits remains largely unexplored. This study evaluated genotypic variation in photosynthetic and canopy traits among elite soybean germplasm across environments and developmental stages using CF and UAV imaging. Linear mixed-model analysis revealed significant genotypic and G×E effects for yield, canopy and several photosynthetic parameters. Broad-sense heritability (H2) estimates indicated dynamic genetic control, ranging from 0.12 to 0.77 at the early stage (S1) and 0.20–0.81 at the mid-reproductive stage (S2). Phi2, SPAD and FvP/FmP exhibited the highest heritability, suggesting their potential as stable selection targets. Correlation analyses showed that while FvP/FmP and SPAD were modestly associated with yield at S1, stronger positive relationships with Phi2, PAR and FvP/FmP emerged during S2, underscoring the importance of sustained photosynthetic efficiency during pod formation. Principal component analysis identified photosynthetic efficiency and leaf structural traits as key axes of physiological variation. UAV-derived indices such as NDRE, MTCI, SARE, MExG and CIRE were significantly correlated with CF-based traits and yield, highlighting their utility as high-throughput proxies for canopy performance. These findings demonstrate the potential of integrating CF and UAV phenotyping to enhance physiological selection and yield improvement in soybean breeding. Full article
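
The broad-sense heritability values quoted above come from a linear mixed model; a simplified, single-environment sketch using the textbook one-way ANOVA estimate (an assumption, not the authors' exact model) is shown below.

```python
import numpy as np
import pandas as pd

def broad_sense_h2(df, trait, genotype="genotype", reps=3):
    """Broad-sense heritability from a balanced one-way genotype ANOVA.

    Standard single-environment estimate: sigma2_G = (MS_genotype - MS_error) / r,
    H2 = sigma2_G / (sigma2_G + sigma2_e / r). The paper fits a fuller mixed model
    with G x E terms; this is a simplified sketch.
    """
    groups = df.groupby(genotype)[trait]
    grand_mean = df[trait].mean()
    n_g = groups.ngroups
    ms_g = reps * ((groups.mean() - grand_mean) ** 2).sum() / (n_g - 1)
    ss_e = ((df[trait] - groups.transform("mean")) ** 2).sum()
    ms_e = ss_e / (n_g * (reps - 1))
    sigma2_g = max((ms_g - ms_e) / reps, 0.0)
    return sigma2_g / (sigma2_g + ms_e / reps)

# Synthetic example: 20 genotypes x 3 reps of a fluorescence trait such as Phi2.
rng = np.random.default_rng(3)
geno_effect = np.repeat(rng.normal(0, 0.04, 20), 3)
data = pd.DataFrame({"genotype": np.repeat(range(20), 3),
                     "phi2": 0.55 + geno_effect + rng.normal(0, 0.03, 60)})
print("H2 estimate:", round(broad_sense_h2(data, "phi2"), 2))
```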

25 pages, 12181 KB  
Article
Characterizing Growth and Estimating Yield in Winter Wheat Breeding Lines and Registered Varieties Using Multi-Temporal UAV Data
by Liwei Liu, Xinxing Zhou, Tao Liu, Dongtao Liu, Jing Liu, Jing Wang, Yuan Yi, Xuecheng Zhu, Na Zhang, Huiyun Zhang, Guohua Feng and Hongbo Ma
Agriculture 2025, 15(24), 2554; https://doi.org/10.3390/agriculture15242554 - 10 Dec 2025
Cited by 1 | Viewed by 504
Abstract
Grain yield is one of the most critical indicators for evaluating the performance of wheat breeding. However, the assessment process, from early-stage breeding lines to officially registered varieties that have passed the DUS (Distinctness, Uniformity, and Stability) test, is often time-consuming and labor-intensive. Multispectral remote sensing based on unmanned aerial vehicles (UAVs) has demonstrated significant potential in crop phenotyping and yield estimation due to its high throughput, non-destructive nature, and ability to rapidly collect large-scale, multi-temporal data. In this study, multi-temporal UAV-based multispectral imagery, RGB images, and canopy height data were collected throughout the entire wheat growth stage (2023–2024) in Xuzhou, Jiangsu Province, China, to characterize the dynamic growth patterns of both breeding lines and registered cultivars. Vegetation indices (VIs), texture parameters (Tes), and a time-series crop height model (CHM), including the logistic-derived growth rate (GR) and the projected area (PA), were extracted to construct a comprehensive multi-source feature set. Four machine learning algorithms, namely a random forest (RF), support vector machine regression (SVR), extreme gradient boosting (XGBoost), and partial least squares regression (PLSR), were employed to model and estimate yield. The results demonstrated that spectral, texture, and canopy height features derived from multi-temporal UAV data effectively captured phenotypic differences among wheat types and contributed to yield estimation. Features obtained from later growth stages generally led to higher estimation accuracy. The integration of vegetation indices and texture features outperformed models using single-feature types. Furthermore, the integration of time-series features and feature selection further improved predictive accuracy, with XGBoost incorporating VIs, Tes, GR, and PA yielding the best performance (R2 = 0.714, RMSE = 0.516 t/ha, rRMSE = 5.96%). Overall, the proposed multi-source modeling framework offers a practical and efficient solution for yield estimation in early-stage wheat breeding and can support breeders and growers by enabling earlier, more accurate selection and management decisions in real-world production environments. Full article
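
The logistic-derived growth rate (GR) feature can be illustrated with a simple curve fit to a plot's canopy height time series; the dates, heights, and starting values below are synthetic.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, k, t0, h_max):
    # Three-parameter logistic canopy height curve.
    return h_max / (1.0 + np.exp(-k * (t - t0)))

# Synthetic plot-level canopy heights (m) from a multi-temporal CHM,
# sampled at days after sowing (illustrative values only).
days = np.array([60, 90, 120, 150, 180, 200], dtype=float)
height = np.array([0.08, 0.18, 0.45, 0.70, 0.78, 0.80])

params, _ = curve_fit(logistic, days, height, p0=[0.05, 120.0, 0.8], maxfev=10000)
k, t0, h_max = params

# A growth-rate feature in the spirit of the abstract's logistic-derived GR:
# the maximum slope of the fitted curve, k * h_max / 4, reached at t = t0.
print("max growth rate (m/day):", k * h_max / 4, "at day", t0)
```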

31 pages, 1530 KB  
Article
Towards Resilient Agriculture: A Novel UAV-Based Lightweight Deep Learning Framework for Wheat Head Detection
by Na Luo, Yao Yang, Xiwei Yang, Di Yang, Jiao Tang, Siyuan Duan, Hou Huang and He Zhu
Mathematics 2025, 13(23), 3844; https://doi.org/10.3390/math13233844 - 1 Dec 2025
Viewed by 410
Abstract
Precision agriculture increasingly relies on unmanned aerial vehicle (UAV) imagery for high-throughput crop phenotyping, yet existing deep learning detection models face critical constraints limiting practical deployment: computational demands incompatible with edge computing platforms and insufficient accuracy for multi-scale object detection across diverse environmental conditions. We present LSM-YOLO, a lightweight detection framework specifically designed for aerial wheat head monitoring that achieves state-of-the-art performance while maintaining minimal computational requirements. The architecture integrates three synergistic innovations: a Lightweight Adaptive Extraction (LAE) module that reduces parameters by 87.3% through efficient spatial rearrangement and adaptive feature weighting while preserving critical boundary information; a P2-level high-resolution detection head that substantially improves small object recall in high-altitude imagery; and a Dynamic Head mechanism employing unified multi-dimensional attention across scale, spatial, and task dimensions. Comprehensive evaluation on the Global Wheat Head Detection dataset demonstrates that LSM-YOLO achieves 91.4% mAP@0.5 and 51.0% mAP@0.5:0.95—representing 21.1% and 37.1% improvements over baseline YOLO11n—while requiring only 1.29 M parameters and 3.4 GFLOPs, constituting 50.0% parameter reduction and 46.0% computational cost reduction compared to the baseline. Full article
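
The mAP@0.5 figures above rest on IoU-based matching of predicted and ground-truth boxes. A generic sketch of that matching step (not LSM-YOLO itself, and without the full precision-recall integration) is shown below with toy boxes.

```python
def iou(a, b):
    # Boxes as [x1, y1, x2, y2]; returns intersection-over-union.
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    union = (a[2] - a[0]) * (a[3] - a[1]) + (b[2] - b[0]) * (b[3] - b[1]) - inter
    return inter / (union + 1e-9)

def match_detections(preds, gts, thr=0.5):
    # Greedy one-to-one matching of confidence-sorted predictions to ground-truth boxes.
    preds = sorted(preds, key=lambda p: -p["score"])
    used, tp = set(), 0
    for p in preds:
        best_i, best_ov = None, thr
        for i, g in enumerate(gts):
            ov = iou(p["box"], g)
            if i not in used and ov >= best_ov:
                best_i, best_ov = i, ov
        if best_i is not None:
            used.add(best_i)
            tp += 1
    fp, fn = len(preds) - tp, len(gts) - tp
    return tp, fp, fn

# Toy wheat-head boxes with hypothetical coordinates.
gts = [[10, 10, 40, 40], [50, 50, 80, 80]]
preds = [{"box": [12, 11, 41, 39], "score": 0.9},
         {"box": [100, 100, 120, 120], "score": 0.4}]
print(match_detections(preds, gts))  # expected (1, 1, 1): one TP, one FP, one FN
```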

20 pages, 6710 KB  
Article
A Phenotyping Perception Mechanism of Fusing Spatial and Channel Reconstruction Convolution Employing Maize-Breeding UAV Visual Images
by Huanzhe Wang, Jian Chen, Xiqing Wang and Shuaisong Zhang
Drones 2025, 9(12), 830; https://doi.org/10.3390/drones9120830 - 30 Nov 2025
Viewed by 452
Abstract
As precision agriculture advances, UAV-based aerial image object detection has emerged as a pivotal technology for maize-phenotyping perception operations. Complex backgrounds reduce a model’s ability to extract maize tassel features, while increasing model computational complexity to improve feature expression hinders deployment on UAVs. To balance model size and deployability, an enhanced model incorporating spatial-channel convolution is proposed. First, a maize-breeding UAV was built and used to collect maize tassel image data. Second, Spatial and Channel Reconstruction Convolution (SCConv) was integrated into the neck network of the YOLOv8 baseline model, reducing computational complexity while maintaining detection accuracy. Finally, the constructed maize tassel dataset and the public Maize Tasseling Stage (MTS) dataset were used to train and evaluate the enhanced model. The results showed that the enhanced model achieved a precision of 92.2%, a recall of 84.3%, and an mAP@0.5 of 91.7%, with 7.3 G floating-point operations (FLOPs) and a model size of 5.16 MB. Compared with the original model, it exhibited increases of 3.2%, 3.4%, and 3.4% in precision, recall, and mAP@0.5, respectively, along with reductions of 0.8 G FLOPs in computational cost and 0.79 MB in model size. Compared with YOLOv10n, its precision, recall, and mAP@0.5 increased by 1.8%, 3.1%, and 2.9%, respectively, while its computational cost and model size decreased by 0.3 G FLOPs and 0.42 MB. The improved model is accurate, performs well on UAV aerial images of complex scenes, and provides a methodological basis for on-board deployment, supporting maize tassel detection and holding potential for application in maize breeding. Full article
(This article belongs to the Special Issue Advances of UAV in Precision Agriculture—2nd Edition)
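
For readers who want to reproduce the baseline side of such an experiment, a minimal Ultralytics YOLOv8 training and validation call is sketched below; the dataset YAML is hypothetical, and integrating SCConv into the neck would additionally require a custom model definition not shown here.

```python
from ultralytics import YOLO

# Baseline YOLOv8 nano model; swapping neck blocks for SCConv would require a
# custom model YAML (not shown here) passed in place of "yolov8n.yaml".
model = YOLO("yolov8n.yaml")

# Hypothetical dataset config pointing at UAV maize-tassel images in YOLO format.
model.train(data="maize_tassels.yaml", epochs=100, imgsz=640)

metrics = model.val()     # precision, recall, mAP@0.5, mAP@0.5:0.95
print(metrics.box.map50)  # mAP@0.5 on the validation split
```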

19 pages, 1965 KB  
Article
RGB-Derived Indices Accurately Detect Genotypic and Agronomic Differences in Canopy Variation in Durum Wheat
by Fabio Fania, Ivano Pecorella, Elio Romano, Patrizio Spadanuda, Nicola Pecchioni, Salvatore Esposito and Pasquale De Vita
Crops 2025, 5(6), 85; https://doi.org/10.3390/crops5060085 - 19 Nov 2025
Viewed by 606
Abstract
Durum wheat (Triticum turgidum ssp. durum) represents a strategic crop for the Mediterranean basin and global semiarid regions, being the raw material for pasta and a key component of sustainable cereal production. Improving early vigor and canopy development is essential to enhance resource-use efficiency and yield stability under variable agronomic conditions. For these reasons, we report the application of a series of RGB-derived vegetation indices (VIs) from Unmanned Aerial Vehicles (UAVs) to evaluate their effectiveness in capturing canopy variation in the early growth stages of a large collection of durum wheat varieties, and their validation under different agronomic managements. Digital RGB images from seedling emergence to grain filling were taken in two field experiments, and RGB-based indices were calculated over four consecutive growing seasons. In the first experiment, 521 durum wheat varieties were evaluated, showing highly significant genotypic differences for all VIs (p < 0.001) and explaining up to 72% of the phenotypic variance at the end of tillering. In addition, TGI explained more variation than CSI when recorded at the end of the tillering stage. In the second experiment, two contrasting genotypes managed under two sowing rates and six nitrogen (N) treatments showed that NGRDI and TGI had a strong capacity to discriminate genotype and sowing density (η2 = 0.50). These results highlight the potential use of RGB-derived VIs for high-throughput phenotypic selection of soil coverage ability in durum wheat, even under different agronomic conditions. Full article
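
For reference, the two indices highlighted above can be computed directly from RGB bands; NGRDI uses its standard definition, and TGI is shown in its common simplified broadband form, which may differ slightly from the formulation used in the paper.

```python
import numpy as np

def ngrdi(r, g, b):
    # Normalized Green-Red Difference Index: (G - R) / (G + R).
    return (g - r) / (g + r + 1e-9)

def tgi(r, g, b):
    # Triangular Greenness Index, simplified broadband form (Hunt et al.);
    # the paper may use a slightly different formulation.
    return g - 0.39 * r - 0.61 * b

# Hypothetical RGB orthomosaic bands scaled to [0, 1].
rng = np.random.default_rng(4)
r, g, b = rng.uniform(0.1, 0.6, size=(3, 256, 256))
print("plot-mean NGRDI:", ngrdi(r, g, b).mean())
print("plot-mean TGI:", tgi(r, g, b).mean())
```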

27 pages, 2833 KB  
Article
From Molecules to Fields: Mapping the Thematic Evolution of Intelligent Crop Breeding via BERTopic Text Mining
by Xiaohe Liang, Yu Wu, Jiayu Zhuang, Jiajia Liu, Jie Lei, Qi Wang and Ailian Zhou
Agriculture 2025, 15(22), 2373; https://doi.org/10.3390/agriculture15222373 - 16 Nov 2025
Viewed by 991
Abstract
The convergence of agricultural biotechnology and artificial intelligence is reshaping modern crop improvement. Despite a surge of studies integrating artificial intelligence and biotechnology, the rapidly expanding literature on intelligent crop breeding remains fragmented across molecular, phenotypic, and computational dimensions. Existing reviews often rely on traditional bibliometric or narrative approaches that fail to capture the deep semantic evolution of research themes. To address this gap, this study employs the BERTopic model to systematically analyze 1867 articles (1995–2025, WoS Core Collection), mapping the thematic landscape and temporal evolution of intelligent crop breeding and revealing how methodological and application-oriented domains have co-evolved over time. Eight core topics emerge, i.e., (T0) genomic prediction and genotype–environment modeling; (T1) UAV remote sensing and multimodal phenotyping; (T2) stress-tolerant breeding and root phenotypes; (T3) ear/pod counting with deep learning; (T4) grain trait representation and evaluation; (T5) CRISPR and genome editing; (T6) spike structure recognition and 3D modeling; and (T7) maize tassel detection and developmental staging. Topic-evolution analyses indicate a co-development pattern, where genomic prediction provides a stable methodological backbone, while phenomics (UAV/multimodal imaging, organ-level detection, and 3D reconstruction) propels application-oriented advances. Attention dynamics reveal increasing momentum in image-based counting (T3), grain quality traits (T4), and CRISPR-enabled editing (T5), alongside a plateau in traditional mainstays (T0, T1) and mild cooling in root phenotyping under abiotic stress (T2). Quality stratification (citation quartiles, Q1–Q4) shows high-impact concentration in T0/T1 and a growing tail of application-driven work across T3–T7. Journal analysis reveals a complementary publication ecosystem: Frontiers in Plant Science and Plant Methods anchor cross-disciplinary dissemination; Remote Sensing and Computers and Electronics in Agriculture host engineering-centric phenomics; genetics/breeding journals sustain T0/T2; and molecular journals curate T5. These findings provide an integrated overview of methods, applications, and publication venues, offering practical guidance for research planning, cross-field collaboration, and translational innovation in intelligent crop breeding. Full article
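
A minimal BERTopic sketch of the workflow described above is given below; the corpus is synthetic (the study used 1867 Web of Science records), and the topics it produces are purely illustrative.

```python
import random
from bertopic import BERTopic

# Build a small synthetic corpus standing in for the abstracts analyzed in the paper
# (theme phrases, crops, and years are illustrative only).
random.seed(0)
crops = ["maize", "wheat", "rice", "soybean", "cotton"]
themes = ["genomic prediction of hybrid yield",
          "UAV multispectral phenotyping of canopy traits",
          "CRISPR genome editing of grain quality genes",
          "deep learning ear counting from field images"]
docs = [f"{random.choice(themes)} in {random.choice(crops)} breeding programs"
        for _ in range(300)]
years = [random.choice(range(2015, 2026)) for _ in docs]

topic_model = BERTopic(min_topic_size=10)
topics, probs = topic_model.fit_transform(docs)
print(topic_model.get_topic_info())

# Thematic evolution over time, analogous to the paper's topic-evolution analysis.
topics_over_time = topic_model.topics_over_time(docs, years)
print(topics_over_time.head())
```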

30 pages, 2612 KB  
Article
Uncrewed Aerial Vehicle (UAV)-Based High-Throughput Phenotyping of Maize Silage Yield and Nutritive Values Using Multi-Sensory Feature Fusion and Multi-Task Learning with Attention Mechanism
by Jiahao Fan, Jing Zhou, Natalia de Leon and Zhou Zhang
Remote Sens. 2025, 17(21), 3654; https://doi.org/10.3390/rs17213654 - 6 Nov 2025
Viewed by 899
Abstract
The forage quality of maize (Zea mays L.) silage significantly impacts dairy animal performance and the profitability of the livestock industry. Recently, using uncrewed aerial vehicles (UAVs) equipped with advanced sensors has become a research frontier in maize high-throughput phenotyping (HTP). However, most existing studies consider only a single sensor modality, and the models developed for estimating forage quality are single-task ones that fail to exploit the relatedness among quality traits. To fill this research gap, we propose MUSTA, a MUlti-Sensory feature fusion model that uses MUlti-Task learning and the Attention mechanism to simultaneously estimate dry matter yield and multiple nutritive values for silage maize breeding hybrids in the field environment. Specifically, we conducted UAV flights over maize breeding sites and extracted multi-temporal optical- and LiDAR-based features from the UAV-deployed hyperspectral, RGB, and LiDAR sensors. Then, we constructed an attention-based feature fusion module, which included an attention convolutional layer and an attention bidirectional long short-term memory layer, to combine the multi-temporal features and discern the patterns within them. Subsequently, we employed a multi-head attention mechanism to obtain comprehensive crop information. We trained MUSTA end-to-end and evaluated it on multiple quantitative metrics. Our results showed that it produces practically useful quality estimates, as evidenced by the agreement between the estimated quality traits and the ground truth data, with weighted Kendall’s tau coefficients (τw) of 0.79 for dry matter yield, 0.74 for MILK2006, 0.68 for crude protein (CP), 0.42 for starch, 0.39 for neutral detergent fiber (NDF), and 0.51 for acid detergent fiber (ADF). Additionally, we implemented a retrieval-augmented method that maintained comparable prediction performance even when certain costly features were unavailable. The comparison experiments showed that the proposed approach is effective in estimating maize silage yield and nutritive values, providing a digitized alternative to traditional field-based phenotyping. Full article
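
The weighted Kendall's tau used for evaluation is available in SciPy; a minimal sketch with synthetic measured and predicted yields:

```python
import numpy as np
from scipy.stats import weightedtau

# Synthetic predicted vs. measured dry matter yield for 30 breeding hybrids.
rng = np.random.default_rng(5)
measured = rng.normal(20.0, 3.0, size=30)                  # t/ha, illustrative scale
predicted = measured + rng.normal(0.0, 1.5, size=30)       # imperfect model estimates

# Weighted Kendall's tau emphasizes rank agreement among the top hybrids,
# matching the tau_w metric reported in the abstract.
tau_w, _ = weightedtau(measured, predicted)
print("weighted Kendall's tau:", round(tau_w, 2))
```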

18 pages, 3632 KB  
Article
Rapeseed Yield Estimation Using UAV-LiDAR and an Improved 3D Reconstruction Method
by Na Li, Zhiwei Hou, Haiyong Jiang, Chongchong Chen, Chao Yang, Yanan Sun, Lei Yang, Tianyu Zhou, Jingyu Chu, Qingzhe Fan and Lijie Zhang
Agriculture 2025, 15(21), 2265; https://doi.org/10.3390/agriculture15212265 - 30 Oct 2025
Cited by 1 | Viewed by 840
Abstract
Quantitative estimation of rapeseed yield is important for precision crop management and sustainable agricultural development. Traditional manual measurements are inefficient and destructive, making them unsuitable for large-scale applications. This study proposes a canopy-volume estimation and yield-modeling framework based on unmanned aerial vehicle light detection and ranging (UAV-LiDAR) data combined with a HybridMC-Poisson reconstruction algorithm. At the early yellow ripening stage, 20 rapeseed plants were reconstructed in 3D, and field data from 60 quadrats were used to establish a regression relationship between plant volume and yield. The results indicate that the proposed method achieves stable volume reconstruction under complex canopy conditions and yields a volume–yield regression model. When applied at the field scale, the model produced predictions with a relative error of approximately 12% compared with observed yields, within an acceptable range for remote sensing–based yield estimation. These findings support the feasibility of UAV-LiDAR–based volumetric modeling for rapeseed yield estimation and help bridge the scale from individual plants to entire fields. The proposed method provides a reference for large-scale phenotypic data acquisition and field-level yield management. Full article
(This article belongs to the Section Artificial Intelligence and Digital Agriculture)
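
The paper's HybridMC-Poisson reconstruction is custom, but the plain screened Poisson step it builds on is available in Open3D; the sketch below (with a hypothetical file name) reconstructs a single plant and reads off its mesh volume.

```python
import open3d as o3d

# Load a single-plant point cloud segmented from the UAV-LiDAR scan
# (hypothetical file name; the paper's HybridMC-Poisson variant is not reproduced).
pcd = o3d.io.read_point_cloud("rapeseed_plant_01.ply")
pcd.estimate_normals(o3d.geometry.KDTreeSearchParamHybrid(radius=0.05, max_nn=30))

# Screened Poisson surface reconstruction; depth controls mesh resolution.
mesh, densities = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(pcd, depth=9)

# Volume is only defined for a watertight mesh; the paper's hybrid method
# targets exactly this robustness issue under complex canopies.
if mesh.is_watertight():
    volume = mesh.get_volume()
    print("plant volume (m^3):", volume)
    # A simple linear volume-yield regression could then map volume to plot yield.
```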

28 pages, 12549 KB  
Article
An Enhanced Faster R-CNN for High-Throughput Winter Wheat Spike Monitoring to Improved Yield Prediction and Water Use Efficiency
by Donglin Wang, Longfei Shi, Yanbin Li, Binbin Zhang, Guangguang Yang and Serestina Viriri
Agronomy 2025, 15(10), 2388; https://doi.org/10.3390/agronomy15102388 - 14 Oct 2025
Viewed by 751
Abstract
This study develops an innovative unmanned aerial vehicle (UAV)-based intelligent system for winter wheat yield prediction, addressing the inefficiencies of traditional manual counting methods (with approximately 15% error rate) and enabling quantitative analysis of water–fertilizer interactions. By integrating an enhanced Faster Region-Based Convolutional Neural Network (Faster R-CNN) architecture with multi-source data fusion and machine learning, the system significantly improves both spike detection accuracy and yield forecasting performance. Field experiments during the 2022–2023 growing season captured high-resolution multispectral imagery for varied irrigation regimes and fertilization treatments. The optimized detection model incorporates ResNet-50 as the backbone feature extraction network, with residual connections and channel attention mechanisms, achieving a mean average precision (mAP) of 91.2% (calculated at IoU threshold 0.5) and 88.72% recall while reducing computational complexity. The model outperformed YOLOv8 by a statistically significant 2.1% margin (p < 0.05). Using model-generated spike counts as input, the random forest (RF) regressor demonstrated superior yield prediction performance (R2 = 0.82, RMSE = 324.42 kg·ha⁻¹), exceeding the Partial Least Squares Regression (PLSR) (R2 +46%, RMSE −44.3%), Least Squares Support Vector Machine (LSSVM) (R2 +32.3%, RMSE −32.4%), Support Vector Regression (SVR) (R2 +30.2%, RMSE −29.6%), and Backpropagation (BP) Neural Network (R2 +22.4%, RMSE −24.4%) models. Analysis of different water–fertilizer treatments revealed that while organic fertilizer under full irrigation (750 m³ ha⁻¹) conditions achieved the maximum yield benefit (13,679.26 CNY·ha⁻¹), it showed relatively low water productivity (WP = 7.43 kg·m⁻³). Conversely, under deficit irrigation (450 m³ ha⁻¹) conditions, the 3:7 organic/inorganic fertilizer treatment achieved optimal WP (11.65 kg·m⁻³) and WUE (20.16 kg·ha⁻¹·mm⁻¹) while increasing yield benefit by 25.46% compared to organic fertilizer alone. This research establishes an integrated technical framework for high-throughput spike monitoring and yield estimation, providing actionable insights for synergistic water–fertilizer management strategies in sustainable precision agriculture. Full article
(This article belongs to the Section Water Use and Irrigation)
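
A stock torchvision Faster R-CNN with a ResNet-50 FPN backbone can serve as a starting point for this kind of spike counter; the sketch below runs inference on a random tensor and thresholds the scores, and is only a baseline, not the enhanced architecture described in the paper.

```python
import torch
import torchvision
from torchvision.models.detection import FasterRCNN_ResNet50_FPN_Weights

# Stock torchvision Faster R-CNN with a ResNet-50 FPN backbone; the paper adds
# channel attention and other modifications that are not reproduced here. The
# COCO-pretrained weights are only a starting point and would need fine-tuning
# on annotated wheat spike images before they detect spikes.
weights = FasterRCNN_ResNet50_FPN_Weights.DEFAULT
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights=weights).eval()

# Hypothetical UAV image tile (a random tensor stands in for a real RGB tile).
image = torch.rand(3, 512, 512)

with torch.no_grad():
    prediction = model([image])[0]

# Spike-count proxy: detections above a confidence threshold; per-plot counts
# would then feed the random forest yield regressor described in the abstract.
keep = prediction["scores"] > 0.5
print("detected objects:", int(keep.sum()))
```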
