Review

Application of Deep Learning Technology in Monitoring Plant Attribute Changes

1 College of Land Science and Technology, China Agricultural University, 17 Qinghua East Road, Haidian, Beijing 100083, China
2 College of Information and Electrical Engineering, China Agricultural University, 17 Qinghua East Road, Haidian, Beijing 100083, China
* Author to whom correspondence should be addressed.
Sustainability 2025, 17(17), 7602; https://doi.org/10.3390/su17177602
Submission received: 7 July 2025 / Revised: 14 August 2025 / Accepted: 20 August 2025 / Published: 22 August 2025
(This article belongs to the Section Sustainable Agriculture)

Abstract

With the advancement of remote sensing imagery and multimodal sensing technologies, monitoring plant trait dynamics has emerged as a critical area of research in modern agriculture. Traditional approaches, which rely on handcrafted features and shallow models, struggle to effectively address the complexity inherent in high-dimensional and multisource data. In contrast, deep learning, with its end-to-end feature extraction and nonlinear modeling capabilities, has substantially improved monitoring accuracy and automation. This review summarizes recent developments in the application of deep learning methods—including CNNs, RNNs, LSTMs, Transformers, GANs, and VAEs—to tasks such as growth monitoring, yield prediction, pest and disease identification, and phenotypic analysis. It further examines prominent research themes, including multimodal data fusion, transfer learning, and model interpretability. Additionally, it discusses key challenges related to data scarcity, model generalization, and real-world deployment. Finally, the review outlines prospective directions for future research, aiming to inform the integration of deep learning with phenomics and intelligent IoT systems and to advance plant monitoring toward greater intelligence and high-throughput capabilities.

1. Introduction

Monitoring plant trait dynamics involves continuously tracking morphological, physiological, biochemical, and phenotypic characteristics of plants across varying temporal and environmental conditions [1,2,3]. This process integrates multisource data from remote sensing, proximal sensing, and experimental observations to support precision agriculture, ecological management, and plant phenotyping [4,5].
Key traits—including plant height, the leaf area index (LAI), chlorophyll content, stomatal conductance, and phenological stages—serve as critical indicators of plant growth, health status, and stress responses [6,7,8,9]. These traits not only reflect genotype–phenotype–environment interactions but also guide yield forecasting and sustainable crop management [10,11,12]. Their dynamic characterization supports precision fertilization, irrigation scheduling, and ecosystem assessments under climate variability [13].
Despite their significance, conventional monitoring approaches are often labor-intensive, low in spatiotemporal resolution, and susceptible to subjectivity. Classical machine learning methods, such as SVM, LDA, KNN, and random forests, have been explored to automate trait classification and prediction tasks [14,15,16,17,18,19]. While effective on small-scale or low-dimensional datasets, these models face limitations in representing nonlinearities, integrating multimodal data, and scaling to high-throughput applications [20,21].
Recent advances in deep learning offer new opportunities to address these challenges [22,23,24,25,26]. Unlike traditional approaches requiring handcrafted features, deep learning models—particularly convolutional neural networks (CNNs), recurrent neural networks (RNNs), Transformers, and generative adversarial networks (GANs)—can automatically extract hierarchical features from raw data and generalize across diverse conditions [27,28,29,30]. These architectures have demonstrated superior performance in stress detection, yield estimation, pest and disease identification, and phenotypic prediction tasks [31,32,33,34,35,36,37,38].
For example, Wang et al. [39] developed a CNN-based deep neural network for genomic prediction that integrates omics data and employs techniques like batch normalization and dropout to enhance generalization. Similarly, transfer learning approaches using pretrained networks (e.g., VGG and ResNet) have shown strong adaptability in plant phenotyping from UAV and satellite imagery [40,41]. These methods reduce training cost and improve model robustness across spatial scales and crop types.
Given the growing need for automated, scalable, and accurate trait monitoring systems, deep learning has emerged as a transformative tool to model complex plant–environment interactions, fuse heterogeneous data sources, and improve prediction under real-world uncertainties [27].
The remainder of this paper is organized as follows: Section 2 introduces commonly used deep learning models for plant trait monitoring, including CNNs, LSTMs, RNNs, Transformers, and GANs. Section 3 provides a systematic review of specific applications of deep learning in this field, covering growth monitoring and yield prediction, pest and disease diagnosis, phenotypic and genetic trait analysis, and environmental impact assessments. Section 4 discusses the current advantages, challenges, and future research directions of deep learning techniques in plant trait monitoring. Section 5 presents the conclusions. Figure 1 illustrates the main structure of this paper.

2. Deep Learning Models for Plant Attribute Monitoring

2.1. Convolutional Neural Networks (CNNs)

Convolutional neural networks (CNNs) originated in the 1980s and were initially used for handwritten digit recognition in postal code reading. With the evolution of deep learning, CNNs have gained prominence in agriculture for their ability to autonomously extract features and deliver high performance in crop yield estimation and plant phenotypic monitoring [42,43,44].
Unlike traditional feedforward networks, CNNs employ convolutional, pooling, and fully connected layers to capture local spatial features and reduce dimensionality. Convolutional layers apply sliding filters to generate feature maps, while pooling layers enhance robustness through downsampling [45]. CNNs can process data in one, two, or three dimensions, making them suitable for integrating heterogeneous sources such as meteorological and soil data.
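To make these layer roles concrete, the following is a minimal PyTorch sketch of a small patch-level classifier (illustrative only: the channel widths, the 64 × 64 RGB input, and the five-class output are assumptions, not taken from any cited study):

```python
import torch
import torch.nn as nn

class SmallTraitCNN(nn.Module):
    """Minimal CNN: convolution -> pooling -> fully connected head.
    Input: 3-channel RGB patches of 64x64 pixels (illustrative sizes)."""
    def __init__(self, num_classes: int = 5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),   # sliding filters -> feature maps
            nn.ReLU(),
            nn.MaxPool2d(2),                               # downsampling for robustness
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_classes)  # fully connected layer

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(x.flatten(1))

logits = SmallTraitCNN()(torch.randn(1, 3, 64, 64))  # one 64x64 RGB patch
```

Replacing Conv2d/MaxPool2d with their 1D or 3D counterparts yields the 1D- and 3D-CNN variants discussed below.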
In trait analysis and yield prediction, CNN variants include 2D-CNNs, 3D-CNNs, and YOLO [46]. Specifically, 2D-CNNs extract spatial–spectral features via 2D kernels [47], whereas 3D-CNNs incorporate temporal dynamics for multitemporal prediction [42]. YOLO achieves real-time detection by dividing an image into grid cells and predicting bounding boxes and class probabilities in a single forward pass [48]. For instance, a YOLOv3-based soybean yield prediction model matched ResNet101’s performance with only half the parameters, demonstrating the efficacy of lightweight CNNs [43].
CNNs have become indispensable for spatial and spatiotemporal feature learning in plant monitoring due to their generalization capability and architectural flexibility. Zhuang et al. [44] introduced an enhanced Faster R-CNN with DCNv2 and optimized anchors for maize seedling detection (Figure 2), achieving 99.53% accuracy under complex field conditions. This was the first integration of a two-stage deep learning model with an orbital high-throughput platform, enabling dynamic monitoring of maize emergence and leaf development. Field trials across 52 maize varieties confirmed the system’s robustness and generalizability (R² = 0.997, RMSE = 43.382).

2.2. Recurrent Neural Networks (RNNs) and Long Short-Term Memory (LSTM)

Recurrent neural networks (RNNs) are designed to model sequential data by retaining information from previous time steps via recurrent connections, enabling effective learning of temporal dependencies in variable-length sequences [49]. They have been applied to diverse domains such as handwriting recognition, image captioning, machine translation, crop yield forecasting, and phenotypic time-series analysis [50,51,52]. However, traditional RNNs are prone to vanishing or exploding gradients during training [53], hindering long-term dependency modeling—which is particularly problematic in monitoring extended plant phenotypic changes. Moreover, the fixed-length input requirement often leads to truncation or padding, risking information loss and reduced model efficiency.
To overcome these limitations, Hochreiter and Schmidhuber [54] introduced Long Short-Term Memory (LSTM) networks, which incorporate input, forget, and output gates to control information flow, allowing the retention and selective updating of relevant temporal features [50]. Enhanced variants such as the attention-based LSTM (ALSTM) proposed by Tian et al. [55] further improve sequence modeling through attention mechanisms and regularization layers.
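For reference, the gate mechanism described above corresponds to the standard LSTM update equations:

```latex
\begin{aligned}
f_t &= \sigma(W_f x_t + U_f h_{t-1} + b_f) && \text{(forget gate)}\\
i_t &= \sigma(W_i x_t + U_i h_{t-1} + b_i) && \text{(input gate)}\\
o_t &= \sigma(W_o x_t + U_o h_{t-1} + b_o) && \text{(output gate)}\\
\tilde{c}_t &= \tanh(W_c x_t + U_c h_{t-1} + b_c),\\
c_t &= f_t \odot c_{t-1} + i_t \odot \tilde{c}_t, \qquad h_t = o_t \odot \tanh(c_t)
\end{aligned}
```

where x_t is the input at time step t, h_t the hidden state, c_t the cell state, σ the logistic sigmoid, and ⊙ elementwise multiplication. The forget and input gates control what the cell state discards and retains, which is what mitigates the vanishing-gradient problem noted above.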
In plant trait dynamics monitoring, RNNs—particularly LSTMs—demonstrate superior performance in modeling crop growth stages, LAI variation, and moisture stress by capturing nonlinear temporal patterns across multisource inputs (e.g., meteorological, spectral, and soil data). These models offer robustness in handling irregular-length sequences and adaptability through transfer learning, enabling accurate estimation of yield, biomass, and phenotypic traits even under limited or imbalanced data conditions [56]. Compared to traditional statistical methods, LSTM-based models exhibit higher predictive accuracy and better generalizability across regions and crops.
Applications of RNNs and LSTMs in this field include stress response modeling and multitemporal climate adaptation analysis [57,58]. For instance, Ali et al. [59] developed a multimodal LSTM framework integrating molecular and phenotypic features to predict drought stress (Figure 3), achieving 97% accuracy—outperforming the RNN (94%), Gradient Boosting (96%), and SVM (82%). This study, covering 101 plant genera, underscores LSTM’s effectiveness in early stress detection and crop resilience management, offering valuable insights into precision agriculture and food security enhancement.

2.3. Transformer

The Transformer, which was proposed by Vaswani et al. [60], is a deep learning architecture that replaces traditional RNN and CNN structures with self-attention mechanisms to model global dependencies. Its encoder–decoder structure allows for efficient sequence modeling and has achieved significant success in natural language processing, computer vision, and time-series forecasting. In agriculture, its application is expanding to areas such as remote sensing and plant phenotyping [61].
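At the core of the architecture is the scaled dot-product self-attention of Vaswani et al. [60], computed over query, key, and value projections of the input sequence:

```latex
\mathrm{Attention}(Q, K, V) = \mathrm{softmax}\!\left(\frac{Q K^{\top}}{\sqrt{d_k}}\right) V
```

where d_k is the key dimension; multi-head attention applies this operation in several learned subspaces in parallel and concatenates the outputs, allowing distant time steps or image patches to attend to one another directly.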
In crop monitoring and trait analysis, Transformer variants—including the Vision Transformer and the Swin Transformer—offer notable advantages. They effectively capture long-range temporal and spatial dependencies in multitemporal imagery and growth sequences, making them well suited for UAV/satellite-based plant monitoring. Compared to CNNs, their multi-head self-attention mechanism provides greater flexibility in handling complex field conditions like occlusion and variable lighting while also facilitating multimodal data fusion (e.g., hyperspectral, RGB, and meteorological inputs) [62]. Moreover, their strong pretraining and transfer learning capabilities reduce reliance on large labeled datasets and enhance generalizability.
Transformers have been applied to weed detection, pest and disease recognition [63], and yield prediction. Rahman et al. [64] pioneered the use of Transformers in drought phenotyping of blueberries by applying them to plant spectral reflectance data (Figure 4). Their model introduced a patch-based dimensionality reduction module to handle high-dimensional spectral inputs and employed a Transformer encoder with multi-head self-attention and an MLP regressor to estimate leaf water content (LWC). The model achieved an R² of 0.81 in cross-cultivar prediction—surpassing traditional and deep learning baselines—demonstrating its robustness and adaptability. Emerging Transformer-like models such as Mamba have also shown promise in agricultural domains and merit further exploration [65,66].

2.4. Generative Models

Data scarcity and limited diversity are major challenges in applying deep learning to plant trait dynamics monitoring, particularly due to high variability in environmental conditions, crop types, and disease or pest manifestations. Generative models, especially generative adversarial networks (GANs) and variational autoencoders (VAEs), offer powerful solutions by synthesizing realistic and diverse training data to improve model robustness and generalization [67].
GANs comprise a generator and a discriminator trained adversarially, enabling the creation of synthetic images that closely mimic real-world data [68]. In plant disease and pest detection, GANs enhance datasets by simulating varied lesion patterns, pest morphologies, lighting conditions, and complex backgrounds. In contrast, VAEs learn latent data distributions to generate structurally coherent and continuously interpolated samples [69], making them particularly suitable for modeling subtle phenotypic variations and addressing class imbalance [70].
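The two generative objectives can be stated compactly: a GAN trains the generator G against the discriminator D in a minimax game, while a VAE maximizes the evidence lower bound (ELBO) on the data likelihood:

```latex
\min_G \max_D \; \mathbb{E}_{x \sim p_{\mathrm{data}}}[\log D(x)]
  + \mathbb{E}_{z \sim p_z}[\log(1 - D(G(z)))]
\qquad
\mathcal{L}_{\mathrm{ELBO}} = \mathbb{E}_{q_\phi(z \mid x)}[\log p_\theta(x \mid z)]
  - D_{\mathrm{KL}}\big(q_\phi(z \mid x) \,\|\, p(z)\big)
```

The KL term regularizes the VAE's latent space toward the prior p(z), which underlies the smooth, continuously interpolated samples noted above.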
Integrating GANs and VAEs into data augmentation pipelines reduces reliance on large annotated datasets, captures fine-grained crop-specific traits, and enhances scalability across diverse geographic and climatic contexts [71,72]. Although agricultural applications are still emerging, successes in biomedical imaging and natural image generation validate their effectiveness in improving data quality and model performance [73,74,75].
For instance, Debnath et al. [76] proposed a GAN–U-Net hybrid framework that integrates multisource vegetation indices and deep residual regression for high-precision crop area and biomass segmentation. The approach incorporates Taylor expansion and the Coot optimization algorithm to accelerate convergence and stabilize segmentation performance. Tested on the DeepWheat dataset, this model significantly outperformed conventional deep learning baselines in emergence and biomass estimation, demonstrating the potential of combining generative models with swarm intelligence optimization for phenotypic monitoring. Table 1 summarizes the main characteristics, strengths and weaknesses, representative algorithms and case studies, and potential application scenarios of deep learning models used in plant attribute monitoring.

3. Applications of Deep Learning in Monitoring Plant Attributes

3.1. Growth Monitoring and Yield Prediction

Crop yield forecasting aims to quantitatively estimate crop quantity or quality over a defined future period by integrating multisource data on climate, soil, crop varieties, and management practices [82]. It plays a vital role in guiding agricultural production, policy formulation, and market regulation. Amid rising global food demand—which is projected to increase by 60% by 2050—and mounting challenges from climate change and geopolitical instability, accurate yield prediction is essential for ensuring food security and production resilience [83,84].
Traditional forecasting models struggle to handle the nonlinear interactions, high-dimensional features, and incomplete data inherent in complex agroecosystems [85]. Deep learning, with its multilayer architectures and end-to-end modeling capabilities, offers powerful solutions to these challenges. It has been successfully applied to major crops such as maize [86,87], soybeans [86,88], rice [89,90], wheat [56,91,92], tomatoes [93,94], and lettuce [95,96].
Chu et al. [89] proposed a hybrid model integrating dual multilayer perceptrons and an RNN to extract spatial–temporal features for rice yield prediction, achieving low MAE and RMSE across both growing seasons. The model demonstrated rapid convergence and strong generalization over 81 counties. Khan et al. [86] compared DNN, 1D-CNN, and GRU networks for maize and soybean yield estimation using GPP data, with the 1D-CNN achieving R² = 0.864 (maize) and 0.750 (soybean). Their transfer learning strategy enabled cross-crop and cross-region adaptation, reaching R² = 0.903 and 0.931 for crop-type and climate zone transfers, respectively—highlighting the feasibility of remote sensing-based yield prediction under data-scarce conditions (see Figure 5).
Chakraborty et al. [97] developed a U-Net-based segmentation model for quantifying almond flower density from UAV RGB imagery, enabling early yield prediction. The model outperformed the Enhanced Bloom Index (EBI) baseline (R² = 0.78 vs. 0.14) and maintained robustness across three years of multi-location data. By validating against diverse meteorological conditions and using five-fold cross-validation, the study provided a scalable method for floral monitoring and laid the foundation for cross-crop model transfer.

3.2. Disease and Pest Diagnosis

Plant diseases and pests—which are caused by fungi, bacteria, viruses, nematodes, and insects—pose serious biotic stress throughout the crop life cycle, leading to physiological disorders and yield losses [98,99,100]. Their complexity, diversity, and spatial variability present major challenges for accurate identification, particularly under complex field conditions [101]. To facilitate automation, deep learning-based diagnosis tasks are typically decomposed into classification (“what”), detection (“where”), and segmentation (“how”) subtasks [102,103], as illustrated in Figure 6.
Traditional methods (e.g., manual inspection and rule-based image processing) suffer from limited feature representation, sensitivity to environmental variation, and poor adaptability [104,105]. In contrast, deep learning approaches—especially CNNs, GANs, and Transformers—offer superior feature learning, cross-task transferability, and robustness under limited data through techniques such as data augmentation, pretraining, and multimodal fusion [106,107].
At present, a variety of deep learning strategies have been developed to address different monitoring scenarios and data characteristics.
Close-range imagery with deep learning has achieved high classification and segmentation accuracy. CNN-based models (e.g., VGG and ResNet) exceed 95% accuracy in leaf disease classification [108], while segmentation networks such as U-Net and Mask R-CNN enable lesion quantification. Xiong et al. [109] proposed AISA (with MobileNet + improved GrabCut + the excess green index) to achieve automated leaf segmentation, reaching 98% top-three accuracy. Wang et al. [110] developed DUNet (DeepLabV3+ + U-Net) for cucumber disease severity grading, achieving 92.85% accuracy and high robustness under complex field conditions (Figure 7). Recent advances have demonstrated that integrating UAV-captured imagery with deep learning models—such as dual-branch CNN-attention frameworks—can significantly improve the accuracy and robustness of rice lodging detection under complex field conditions [111].
Field-scale monitoring using UAV imagery enables spatial mapping and dynamic tracking of diseases and pests in major crops (e.g., rice, wheat, and maize) [112,113,114]. Deep models—such as 1D-/3D-CNNs, RNNs, and Transformers—extract spatial–spectral–temporal features from RGB, multispectral, and hyperspectral UAV data for severity grading, lesion extraction, and risk prediction [115].
IoT-based multimodal monitoring further enables real-time risk forecasting by integrating environmental sensor data (e.g., temperature, humidity, and CO₂) with deep learning models [116]. For example, Sharma et al. [117] deployed a wireless sensor network combined with fuzzy logic to identify pest/disease outbreaks in rice and millet fields. Cai et al. [118] integrated IoT sensing with neural networks for vegetable pest warning, achieving 98% accuracy and an F1-score of 0.97 on 761 samples.
Generative and transfer learning techniques help mitigate small or imbalanced datasets. GANs and VAEs synthesize lesion images, while transfer learning fine-tunes pretrained models [119]. Mallick et al. [120] built a real-time MobileNetV3-based SoC platform for soybean/mustard pest diagnosis, achieving >93% accuracy with 24× inference speedup and 19% energy savings. Wang et al. [121] introduced SCM-BDC, a causal feature extraction–based zero-shot model for pest/disease classification, improving H-scores by 5.2% and 1.4% on APTV99 and ADTV68, respectively (Figure 8).
Lightweight deployment and mobile monitoring further enable real-time detection on edge devices. Model compression (e.g., quantization and pruning) enhances performance on smartphones and UAVs. Mallick et al. [120] developed an Android app linked to an SoC inference platform for high-efficiency mobile detection. Chen et al. [122] introduced an AR glasses–based rice pest monitoring system using an RDP detector (YOLOv5X + CLIP), achieving 87.4% mAP and 86.5% recall—14.6% higher precision than vision-only models—highlighting the potential of multimodal fusion and wearable AI in smart agriculture.

3.3. Phenotyping and Genetic Trait Analysis

A plant phenotype refers to the set of observable traits resulting from the interaction between the genotype and the environment, encompassing morphological, physiological, biochemical, and molecular characteristics [4]. Plant phenomics integrates multimodal imaging—including RGB, thermal infrared, multispectral, hyperspectral, X-ray, and LiDAR—with deep learning and genetic approaches to enable high-throughput, non-destructive, dynamic monitoring of phenotypes across multiple scales, accelerating the understanding of genotype–phenotype interactions and advancing crop improvement [2].
In digital plant phenomics, deep learning has become a key technology for addressing the high dimensionality, complexity, and heterogeneity of phenotypic data. CNNs, RNNs, and their variants support automatic extraction, classification, and regression prediction of phenotypic traits and are widely applied in stress detection, trait quantification, and gene function discovery [123].

3.3.1. Automatic Extraction of Phenotypic Traits from Images

Traditional phenotyping relies on manual measurements, which are time-consuming and highly subjective. High-throughput imaging has enabled the acquisition of RGB, multispectral, thermal infrared, and LiDAR data; however, extracting features from large-scale unstructured imagery remains a major bottleneck [79]. Figure 9 illustrates common computer vision tasks in plant phenotyping research. Leveraging end-to-end feature learning, CNNs have shown strong performance in tasks such as leaf area measurement, fruit counting, and canopy density estimation. Among them, VGGNet is particularly well suited for extracting high-resolution leaf phenotypes [124]; ResNet utilizes residual connections to capture fine-grained morphological variations [125]; and Inception applies parallel multiscale convolutions, facilitating the analysis of complex structures.
Integrating transfer learning with Inception–ResNet architectures can enhance the accuracy of multimodal image-based phenotypic recognition [79]. Mask R-CNN has been applied to single-plant segmentation, fruit counting, and 3D point cloud reconstruction. For example, Bheemanahalli et al. [126] used Mask R-CNN to automatically extract sorghum stomatal phenotypic traits and combined the results with genome-wide association studies to reveal their genetic basis. Atkins et al. [127] employed Cascade Mask R-CNN for Arabidopsis silique detection and segmentation, supporting QTL analysis.
Given the temporal dependency of phenotypes, combining CNNs with RNNs (LSTM/GRU) or Transformers enables modeling of growth dynamics. For instance, Namin et al. [81] developed a CNN-LSTM framework to extract dynamic growth features, improving genotype classification accuracy from 76.8% to 93.0%, significantly outperforming traditional SVM and CNN-CRF methods.
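The CNN-plus-recurrent pattern used in such studies can be sketched as follows (a generic illustration, not the exact architecture of Namin et al. [81]; the encoder size, sequence length, and genotype-class count are assumptions):

```python
import torch
import torch.nn as nn

class CNNLSTMClassifier(nn.Module):
    """Encode each time step with a CNN, then model growth dynamics with an LSTM."""
    def __init__(self, feat_dim: int = 64, num_classes: int = 4):
        super().__init__()
        self.encoder = nn.Sequential(              # tiny per-frame CNN encoder
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(4),
            nn.Flatten(), nn.Linear(16 * 4 * 4, feat_dim),
        )
        self.lstm = nn.LSTM(feat_dim, 32, batch_first=True)
        self.head = nn.Linear(32, num_classes)

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        # frames: (batch, time, 3, H, W) image sequence of one plant
        b, t = frames.shape[:2]
        feats = self.encoder(frames.flatten(0, 1)).view(b, t, -1)
        _, (h_n, _) = self.lstm(feats)             # last hidden state summarizes the sequence
        return self.head(h_n[-1])

logits = CNNLSTMClassifier()(torch.randn(2, 10, 3, 64, 64))  # two 10-frame sequences
```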
Multimodal image fusion further improves phenotypic recognition precision. Zhou et al. [128] developed Maize-IAS, a Mask R-CNN-based software for maize phenotyping that enables pixel-level leaf segmentation and trait measurement. Li et al. [129] created the Angle_Measurement software, which integrates YOLOv5, Canny edge detection, and the Hough transform to automatically measure cotton fruit branch angles (as shown in Figure 10). The detection accuracy was highly consistent with AutoCAD (maximum correlation 0.95) and was 500 times faster than manual measurements. Yu et al. [130] proposed the AUNet segmentation model for dynamic lettuce phenotyping, achieving an mIoU of 0.9561, outperforming multiple classical models.

3.3.2. Applications of Deep Learning in Genotype–Phenotype Association

Traditional GWAS and genomic selection approaches primarily utilize linear models, which are limited in capturing nonlinear interactions among polygenic loci, epigenetic factors, and environmental variables [131]. Deep learning models, by contrast, can autonomously extract latent features from high-dimensional omics data without explicit feature engineering [79]. CNNs, RNNs, Transformers, and graph neural networks (GNNs) have been applied to predict regulatory elements, identify cis-acting sites, and perform phenotypic regression, thereby enhancing predictive accuracy and generalizability [132].
Early models such as DeepBind and DeepSEA employed CNNs to predict transcription factor binding and chromatin accessibility, while Basset learned epigenetic features such as DNA methylation patterns [133]. Champigny et al. [133] further demonstrated the feasibility of multitrait prediction using whole-genome DNA methylation data. In parallel, integrative multi-omics learning has gained momentum, with deep learning architectures—such as multichannel CNNs and Transformer variants—capable of fusing SNPs, transcriptomics, and epigenetics for multitask learning.
For instance, Wang et al. [134] developed WheatNet, a CNN-based field phenotyping and prediction system (Figure 11), combining deep phenotypic prediction with GWAS. This approach identified key loci (e.g., Ppd-D1, Ppd-B1, and chromosome 1B) associated with wheat flowering time and revealed epistatic interactions. The study also proposed the “spike emergence rate” as a novel developmental trait, providing new insights into the genetic control of flowering phenology.
Interpretability methods such as SHAP and DeepLIFT, combined with deep networks, have enabled in silico functional variant analysis. Peleke et al. [135] built a CNN model to decode cis-regulatory sequences across four species (Figure 12), achieving 80–87% accuracy and AUC values of 0.85–0.94. Their approach—combining DeepLIFT with TF-MoDISco—extracted conserved regulatory motifs across tissues and species, confirming cross-species transferability and validating results using a multi-genotype tomato dataset.
In genomic selection, deep regression models directly map SNPs to traits, facilitating multitask breeding applications. Rairdin et al. [136] used RetinaNet to identify over 40 novel SNPs associated with SDS resistance. Wang et al. [137] proposed Cropformer, a Transformer-based model integrating SHAP interpretability, which improved trait prediction by up to 7.5% and boosted accuracy by 12.2–19.1% through multimodal fusion.
Overall, the integration of deep learning with multi-omics data is reshaping functional genomics and precision breeding by revealing complex causal links between genotype and phenotype.

3.4. Monitoring the Environmental Impact on Plants

Plant stress—triggered when environmental conditions deviate from the optimum—induces physiological responses that hinder growth and productivity [138]. Stressors are broadly categorized into biotic (e.g., pathogens and pests) and abiotic (e.g., drought, salinity, heat, cold, nutrient deficiency, and heavy metals) factors (Figure 13) [139,140]. Among them, abiotic stress is a major cause of yield reduction, adversely affecting morphology, physiology, and molecular processes such as photosynthesis inhibition, oxidative stress, and hormonal disruption [139,141,142,143].
With intensifying climate change, land degradation, and resource depletion, the frequency and severity of abiotic stresses are escalating, threatening sustainable agriculture and food security. Breeding stress-tolerant crops is critical yet challenging. Thus, real-time, dynamic, and accurate monitoring technologies are essential for early detection and timely mitigation [38,144,145].
Traditional methods (e.g., manual inspection and lab assays) lack scalability, consistency, and precision. Emerging technologies—such as UAVs, satellites, and ground sensors—generate high-dimensional and nonlinear data that conventional statistical and machine learning models struggle to analyze effectively. Deep learning overcomes these limitations by automatically extracting latent, nonlinear features from complex datasets, enabling early detection, severity assessments, and spatial localization of stress signals with high accuracy [38].
Recent studies have applied CNNs, RNNs, LSTMs, and GANs to classify various abiotic stress types—including drought, heat, salinity, and nutrient deficiency—using multimodal data sources such as RGB, fluorescence, thermal, multispectral, hyperspectral, and in situ sensor data [2]. These deep models enable end-to-end learning, reduce reliance on manual feature engineering, and demonstrate strong generalization across stress types and environmental conditions [123]. As a result, deep learning has become a core component of intelligent crop stress monitoring systems, supporting precision agriculture through robust, automated, and scalable solutions.

3.4.1. Water Stress Assessment

An accurate assessment of crop water stress is critical for understanding plant–environment interactions and forms the basis for precision irrigation. Traditional methods commonly rely on direct in situ measurements of soil moisture, leaf water potential, and stomatal conductance [146]. While informative, these approaches are labor-intensive, temporally and spatially limited, and often unsuitable for large-scale dynamic monitoring.
Recent advances in deep learning offer new avenues for efficient and scalable water stress assessments. With end-to-end modeling capabilities, deep neural networks can autonomously extract relevant phenotypic and environmental features from multimodal inputs, thereby improving both accuracy and generalizability. Models such as CNNs, LSTM networks, and temporal convolutional networks (TCNs) have shown promising results in soil moisture estimation, evapotranspiration prediction, water status evaluation, and stress-level classification [147]. In particular, deep learning approaches leveraging RGB, thermal infrared, multispectral, and hyperspectral imagery have demonstrated superior performance over traditional machine learning and empirical models in crops such as maize, rice, sugarcane, and chickpeas. These methods report improved classification accuracy, a reduced root mean square error (RMSE), and a higher coefficient of determination (R²) across diverse growing conditions.
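For clarity, the two goodness-of-fit metrics reported throughout this review are defined over n observations y_i with predictions ŷ_i and observed mean ȳ as:

```latex
\mathrm{RMSE} = \sqrt{\frac{1}{n}\sum_{i=1}^{n}(y_i - \hat{y}_i)^2},
\qquad
R^2 = 1 - \frac{\sum_{i=1}^{n}(y_i - \hat{y}_i)^2}{\sum_{i=1}^{n}(y_i - \bar{y})^2}
```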
For example, Yu et al. [148] developed a predictive framework integrating a high-throughput phenotyping platform with deep learning, combining an Inception–residual–attention CNN with multiple recurrent neural networks (Figure 14). This approach achieved the highest Rp² (up to 0.9583) for predicting SSC, NO₂⁻, pH, and Ca²⁺; reached a water stress classification accuracy of 98.86%; and, for the first time, enabled pixel-level visualization of pH distribution based on hyperspectral data, providing technical support for precision irrigation and quality management.

3.4.2. Nutrient Deficiency

Crop nutrient deficiency refers to the insufficient supply of essential elements during plant growth and development, leading to a series of physiological and morphological changes that ultimately impact yield and quality [149,150]. A nutrient stress assessment relies not only on absolute measurements but also on relative indicators to comprehensively reflect the nutritional status of both soil and plants.
Traditional nutrient monitoring methods depend on in situ chemical measurements and visual inspection, which are labor-intensive, time-consuming, and difficult to scale for high-throughput, dynamic assessments in large agricultural settings [151]. Deep learning, with its end-to-end modeling and automatic feature extraction capabilities, has significantly improved the accuracy and efficiency of nutrient deficiency detection [151]. Its applications in plant nutrient deficiency monitoring span key technical approaches such as multi-level image analysis, feature learning strategies, ensemble methods, and multimodal data fusion.
CNNs have been extensively applied to visual symptom-based crop nutrient deficiency diagnosis using RGB, near-infrared, and hyperspectral imagery. Custom-designed CNNs and ResNet-based transfer learning models have achieved over 98% accuracy in identifying nitrogen, iron, manganese, and calcium deficiencies in crops such as sorghum and rice. Ensemble learning and feature fusion have further improved generalization and robustness. Deep learning integrated with multispectral and hyperspectral data has proven effective in capturing spectral–spatial features for diagnosing nutrient stress in quinoa, cowpea, and cauliflower. Moreover, combining deep models with physiological indicators—such as chlorophyll content and photosynthetic efficiency—enhances both interpretability and practical value. For instance, Chandel et al. [152] developed an AlexNet-based nitrogen stress classification and variable-rate fertilization system, achieving an F1 score of 0.973 for three-category classification and outperforming GoogLeNet. As shown in Figure 15a, field validation demonstrated that the system reduced nitrogen fertilizer use by 37.5% while maintaining stable yields, underscoring the potential of deep learning for precision nutrient management in sustainable agriculture.

3.4.3. Salinity and Heavy Metal Stress

Soil salinity and heavy metal stress are major factors limiting crop yield in global agriculture. However, the physiological responses of plants to salinity are often highly complex and variable, posing challenges for experimental design and interpretation [142,154]. Conventional detection methods—such as ion analysis, electrical conductivity measurements, and physiological index assessments—are accurate but typically time-consuming, costly, and demanding in terms of field operations, making large-scale, real-time monitoring difficult [155]. Deep learning models, with their superior feature extraction and nonlinear modeling capabilities, can directly mine complex phenotypic information from high-dimensional multimodal data, significantly improving the accuracy and efficiency of stress assessments. They have shown remarkable advantages in salinity and heavy metal stress detection, driving the rapid development of noninvasive, automated, and high-throughput monitoring technologies [28].
In salinity stress research, CNNs, LSTMs, and hybrid models have been widely applied to electrical signal time-series analysis, spectral interpretation, and phenotypic classification. Yao et al. [156] developed the SLSTM-TCNN model, which effectively predicts wheat leaf signal dynamics under varying NaCl concentrations. Combined with a NaCl stress discrimination model and a salt tolerance classification model, this approach enabled high-precision identification of salt-tolerant lines. Feng et al. [157] integrated hyperspectral imaging with deep learning for precise segmentation and quantitative analysis of okra phenotypes, revealing strong correlations between spectral data and chlorophyll content, photosynthesis, and ion accumulation under stress conditions (r > 0.7). Additionally, Khotimah et al. [158] proposed the SC-CAN network, which incorporates spectral convolution and channel attention mechanisms to enable early diagnosis of wheat salt stress, achieving an average F1 score of 83.03.
Deep learning has likewise shown superior performance in detecting heavy metal and other ionic stresses. For example, Deng et al. [153] addressed the severe impact of salinity stress on soybean seedling growth and photosynthetic function by proposing a noninvasive detection method that combines a deep CNN with chlorophyll fluorescence (ChlF) imaging (Figure 15b). The study collected image data for three key chlorophyll fluorescence parameters (Fv/Fm, Y(NO), and Inh) from soybean seedlings subjected to different salinity levels and systematically evaluated six classical CNN models (AlexNet, GoogLeNet, ResNet50, ShuffleNet, SqueezeNet, and MobileNetv2) for salinity stress recognition. A feature fusion strategy was then implemented to enhance recognition accuracy by integrating multisource fluorescence image information. The experiments showed that fusing the three ChlF image features extracted from the average pooling layer of ResNet50 and classifying them with an SVM increased recognition accuracy to 98.61%. Compared with traditional visual assessments and SPAD measurements, the deep learning approach demonstrated higher resolution and consistency across five salinity stress levels. The study also employed UMAP visualization to illustrate the discriminative power of the high-dimensional deep features under different salinity conditions, supporting model interpretability.
Overall, deep learning, through end-to-end learning, feature selection, and multisource data integration, provides high-accuracy and scalable solutions for the early detection, phenotypic quantification, and grading of salinity and heavy metal stress. In the future, the deep integration of deep neural networks with hyperspectral imaging, fluorescence imaging, and UAV-based remote sensing is expected to further advance crop stress monitoring toward greater intelligence and high-throughput capacity.

3.4.4. Temperature Stress

Temperature fluctuations directly impact chemical reaction rates, gas solubility, and water uptake, significantly affecting plant metabolic activities [150]. To cope with temperature stress, plants establish complex regulatory mechanisms at the cellular and metabolic levels, including hormone signaling, light signaling, circadian rhythm regulation, and reactive oxygen species homeostasis [159,160,161]. Leveraging digital technologies to accurately detect and mitigate yield losses caused by heat and cold stress is essential for safeguarding food security.
Temperature stress induces complex responses across metabolic, physiological, and epigenetic layers. Traditional assessment approaches relying on manual observation or single-sensor measurements often suffer from subjectivity and limited timeliness [141,149]. In contrast, deep learning has emerged as a key technology in this field, offering powerful feature extraction and nonlinear modeling capabilities.
In image-based applications, combining RGB, multispectral, and thermal infrared remote sensing data with deep convolutional neural networks enables precise identification of temperature stress. For cold stress, Yang et al. [162] employed a 10-layer CNN to interpret spectral reflectance of maize seedlings in the visible–near-infrared range, achieving a correlation of 0.829 between model outputs and chemical measurements and demonstrating the suitability of deep convolutional architectures for quantifying cold damage phenotypes.
For heat stress, Ryu et al. [163] compared the LS-SVM, RP-CNN, DNN, and PLS-DA algorithms in garlic, showing that LS-SVM achieved the best performance for heat stress detection using multispectral snapshot imagery. Xing et al. [164] developed the CNN-LSTM-att model by incorporating recurrent networks and attention mechanisms to improve classification of heat–drought stress in Chinese fir seedlings, reaching an accuracy of 96.91%. In cotton anther dehiscence detection, Tan et al. [165] combined YOLOv5 and Faster R-CNN models, significantly improving detection precision and adaptability, with suitability for mobile deployment.
Additionally, research has shown that multimodal data fusion is critical for improving the accuracy of temperature stress diagnosis. Integrating RGB, thermal infrared, and hyperspectral channels not only strengthens the identification of temperature anomalies but also provides a more comprehensive basis for quantifying stress across crops and growth stages. Deep learning models can fully exploit complex spectral, spatial, and temporal features, addressing the limitations of traditional machine learning in handling nonlinear relationships.
Overall, deep learning, through end-to-end feature learning and model integration, has markedly enhanced the efficiency and accuracy of temperature stress phenotyping. For example, Mao et al. [166] addressed the limitations of conventional freeze damage assessment methods—manual operations, time consumption, and subjectivity—by proposing and validating an automated monitoring strategy combining UAV-based multisource remote sensing with deep learning (as shown in Figure 16). The prediction accuracy reached Rp² = 0.862, and the CNN-GRU model significantly outperformed baseline methods, marking the first demonstration of the value of background information in freeze damage estimation for dense-canopy crops. Looking ahead, integrating deep networks with multisource sensing and edge computing to build high-throughput, real-time intelligent monitoring systems will be key to advancing precision agriculture and climate-resilient crop management.

4. Discussion

4.1. Advantages

In the field of plant trait dynamics monitoring, deep learning technologies have demonstrated clear advantages over traditional machine learning and manual feature engineering approaches, owing to their distinctive algorithmic strengths and powerful feature learning capabilities. First, deep learning models possess end-to-end automatic feature extraction capacity. Through multilayer nonlinear network structures, they can autonomously discover the most discriminative feature representations from raw multisource data, substantially reducing dependence on domain expertise and manual intervention and overcoming the limitations of conventional algorithms in feature design and selection. Compared with shallow models reliant on prior knowledge, architectures such as CNNs, RNNs, and their derivatives—like LSTMs and U-Nets—can fully capture higher-order spatiotemporal correlations embedded in images, spectra, and time-series data, thereby enhancing the precision in characterizing plant physiological status, phenotypic dynamics, and stress responses.
Second, deep learning exhibits significant advantages in handling high-dimensional, large-scale, and multimodal datasets. In recent years, with the rapid proliferation of remote sensing imagery, UAV multispectral data, and high-throughput phenomics datasets, deep learning technologies have leveraged deep network architectures to integrate multichannel information, enabling high-resolution monitoring and prediction of continuous plant trait variations. Studies have shown that deep learning models substantially improve prediction accuracy and robustness in typical plant phenotyping tasks such as leaf area index estimation, canopy structure segmentation, pest and disease recognition, and drought stress detection. In many such tasks, reported classification accuracy and detection performance exceed 95%, far surpassing comparable traditional machine learning models.
Third, deep learning models possess strong nonlinear modeling capabilities and cross-domain transfer learning capacity, enabling them to effectively accommodate complex phenotypic variations across different crop types, observation scales, and environmental conditions. By incorporating transfer learning and pretrained models such as ResNet, VGGNet, and InceptionNet, researchers can rapidly develop high-performance models even under limited labeled sample conditions, thereby mitigating data scarcity challenges while enhancing model generalizability and applicability.
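A minimal sketch of this transfer learning recipe with torchvision (the frozen backbone and the five-class head are illustrative assumptions, not a specific published configuration):

```python
import torch.nn as nn
from torchvision import models

# Load an ImageNet-pretrained ResNet and adapt it to a small phenotyping task.
model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
for param in model.parameters():
    param.requires_grad = False                     # freeze the pretrained backbone
model.fc = nn.Linear(model.fc.in_features, 5)       # new trainable head for 5 trait classes
# Only model.fc.parameters() are then passed to the optimizer for fine-tuning.
```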
Furthermore, deep learning demonstrates a high degree of automation and intelligence in plant trait monitoring. Leveraging deep convolutional neural networks and derivative architectures for semantic segmentation and instance segmentation—such as U-Net and Mask R-CNN—these approaches achieve precise segmentation and feature extraction of leaves, fruits, and canopy structures, enabling continuous tracking and quantitative analysis of growth status, nutrient levels, and stress responses.
In summary, the primary strengths of deep learning in monitoring plant trait dynamics lie in its end-to-end feature learning, capacity to model complex nonlinear relationships, large-scale data integration, enhanced generalization through transfer learning, and automated, intelligent feature extraction. Collectively, these advantages have accelerated high-throughput phenotyping, precision agriculture decision support, and crop breeding research, providing essential technological support for efficient, accurate, and sustainable plant monitoring.

4.2. Challenges

The high predictive accuracy of deep learning models in plant trait monitoring fundamentally depends on large-scale, high-quality annotated datasets. However, unlike industrial computer vision tasks, data acquisition in agricultural settings faces considerable challenges [167]. On the one hand, the uncertainty of field environments, dynamic crop growth cycles, and the resolution heterogeneity of multimodal sensors make obtaining high-quality labeled samples extremely costly. Most existing studies rely on phenotype datasets collected over limited areas and time periods, which are insufficient to meet the training demands of deep models.
To address data scarcity, researchers have proposed several strategies: first, using data augmentation techniques such as rotation, translation, scaling, and color perturbation to enhance sample diversity and improve generalization; second, introducing transfer learning, in which networks are pretrained on large public datasets such as ImageNet and then fine-tuned for agricultural phenotyping tasks to improve performance under limited sample conditions; third, generating synthetic data with GANs to expand training datasets with simulated imagery. Although these methods partially alleviate data limitations, deep models remain highly dependent on data quality and sample diversity, especially in cross-regional and cross-growth-stage scenarios.
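The first of these strategies maps directly onto standard image transforms; a representative torchvision pipeline (parameter values are illustrative) might look like:

```python
from torchvision import transforms

# Illustrative augmentation pipeline: rotation, translation/scaling, color perturbation.
augment = transforms.Compose([
    transforms.RandomRotation(degrees=30),                     # rotation
    transforms.RandomAffine(degrees=0, translate=(0.1, 0.1),
                            scale=(0.8, 1.2)),                 # translation and scaling
    transforms.ColorJitter(brightness=0.2, contrast=0.2,
                           saturation=0.2),                    # color perturbation
    transforms.RandomHorizontalFlip(),
    transforms.ToTensor(),
])
# Applied per image at load time, e.g., via a torchvision Dataset's `transform=` argument.
```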
Deploying deep learning models in diverse agricultural environments faces severe generalization challenges. Different crop types, climatic conditions, field management practices, and sensor configurations lead to significant distributional discrepancies in input data, creating large performance gaps between training and application domains. For example, in tasks such as leaf phenotype analysis, pest and disease detection, and drought stress monitoring, convolutional and temporal models often overfit to specific acquisition conditions, making it difficult to maintain stable predictive performance across fields or growth stages. Furthermore, the “black box” nature of deep neural networks limits the interpretability of predictions in agricultural contexts, undermining the credibility of resulting decisions.
To improve model transparency and interpretability, researchers have proposed several methods: first, using feature attribution techniques such as integrated gradients, SHAP values, and DeepLIFT to quantify the contributions of input variables to model outputs; second, combining feature visualization to map intermediate layer outputs as intuitive heatmaps or feature channels, facilitating the identification of key phenotypic regions; third, exploring attention mechanisms and rule-based hybrid models to enhance interpretability of decision logic. Although these techniques improve transparency to some extent, balancing predictive accuracy with interpretability in large-scale agricultural applications remains an urgent research challenge.
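As one concrete instance of these attribution techniques, integrated gradients is available for PyTorch models through the Captum library; the sketch below uses a placeholder model and input rather than any cited system:

```python
import torch
import torch.nn as nn
from captum.attr import IntegratedGradients

# Placeholder classifier standing in for any trained phenotyping CNN.
model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 64 * 64, 2)).eval()

ig = IntegratedGradients(model)
image = torch.randn(1, 3, 64, 64)                  # placeholder input image tensor
# Attribute the class-0 score to input pixels along a straight-line path
# from a zero baseline; attributions have the same shape as the input.
attributions = ig.attribute(image, target=0, baselines=torch.zeros_like(image))
heatmap = attributions.abs().sum(dim=1)            # per-pixel importance map for plotting
```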
The integration of deep learning with IoT and robotics is driving plant trait monitoring toward a new era of real-time, high-throughput, and automated assessments. By deploying multimodal sensors—including RGB cameras, thermal imagers, multispectral and hyperspectral cameras, and LiDAR—combined with UAV imagery and ground-based mobile platforms, researchers can continuously collect morphological and physiological crop data to dynamically sense growth status, pest and disease incidence, and stress responses. However, this integration still faces multiple challenges. First, the heterogeneous data generated by IoT devices differ substantially in temporal and spatial resolution, spectral ranges, and acquisition conditions, requiring deep models to achieve cross-modal feature alignment and consistent representation; otherwise, predictive performance can be unstable. Second, real-time monitoring and decision-making depend on efficient inference in edge computing environments, but conventional deep networks are computationally heavy and parameter-intensive, limiting deployment on low-power UAVs and robotic platforms.
To address these issues, several technical pathways have emerged: first, the development of lightweight network architectures such as MobileNet, EfficientNet, and depthwise separable convolution, which significantly reduce model size and inference latency to enable real-time operation on embedded platforms; second, the use of compression techniques such as pruning, quantization, and distillation to shrink parameter counts while retaining key representational capacity and improving deployment flexibility; third, the proposal of multimodal data fusion frameworks that leverage attention mechanisms, cross-modal alignment, and multisource feature interactions to achieve robust representation and perception in complex field scenarios. For example, some studies have combined convolutional neural networks with Transformer architectures to jointly process imagery and temporal sensor data, using attention modules to focus on critical physiological indicators.
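As a concrete example of the second pathway, PyTorch's post-training dynamic quantization converts linear layers to 8-bit integer weights in a single call (a sketch on a toy model; actual gains depend on the architecture and target hardware):

```python
import torch
import torch.nn as nn

# Toy model standing in for a trained monitoring network.
model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 8)).eval()

# Post-training dynamic quantization: weights stored as int8, activations
# quantized on the fly; typically shrinks model size and speeds up CPU inference.
quantized = torch.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)
out = quantized(torch.randn(1, 128))
```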
Although these methods have demonstrated feasibility in experimental environments, their universality and stability in large-scale agricultural production remain limited. First, while lightweight models improve inference speed, they often sacrifice some phenotypic detail extraction capabilities, making it difficult to balance high predictive accuracy and real-time performance. Second, insufficient sensor calibration and environmental adaptability still cause data noise and observation bias to significantly impact model performance. Additionally, autonomous navigation and task scheduling for robots remain underdeveloped in complex field environments, imposing higher demands on path planning algorithms and deep perception networks. In the future, combining edge–cloud collaborative computing, ultra-lightweight networks, and self-supervised pretraining on multisource data holds promise to further enhance the deployment efficiency and application value of deep learning models in automated precision agriculture.

4.3. Future Perspectives

With the rapid advancement of deep learning algorithms, sensor technologies, and agricultural digitalization, plant attribute monitoring is accelerating toward a new stage characterized by high-throughput, automation, and intelligence. However, translating research innovations into real-world agricultural settings remains a significant challenge. Future research and applications are expected to achieve breakthroughs in several key areas:
Deep integration of multimodal data: Although deep learning has already demonstrated excellent performance on single-image or single-sensor data, future efforts must further fuse hyperspectral, thermal infrared, LiDAR, 3D point cloud, and multi-omics data to build a unified multimodal feature space, enabling holistic perception of plant morphology, physiology, and molecular characteristics [168]. Such integration can enhance robustness under complex field conditions.
Dynamic monitoring across spatiotemporal scales: For precision agriculture and ecological monitoring, it is essential to establish large-scale, cross-temporal monitoring systems covering scales from leaves to canopies and from individual fields to regions. Integrating deep neural networks with temporal modeling approaches such as Transformers and LSTM will facilitate continuous prediction of plant growth, stress responses, and yield dynamics. Furthermore, addressing temporal model drift—where prediction accuracy degrades over time due to changing field conditions—will be critical to maintaining long-term deployment reliability.
Interpretability and decision transparency: As deep models are increasingly adopted in agricultural production, their “black box” nature has become more prominent. Future research should prioritize improving model interpretability and transparency by employing feature attribution, attention visualization, and causal inference to enhance the comprehensibility of model outputs and strengthen user trust and adoption. In particular, agricultural end-users such as farmers, breeders, and technicians often lack data science backgrounds, making model interpretability essential for real-world adoption. Future studies should explore how interpretable outputs (e.g., visual heatmaps, rule-based summaries, and confidence scores) can be effectively translated into intuitive decision support tools. Integrating these tools into user-friendly platforms or interfaces will help bridge the gap between complex model predictions and practical field use, thereby increasing trust, usability, and decision confidence among non-expert users.
Few-shot and zero-shot learning: Given the high costs and difficulties associated with agricultural data collection and annotation, intelligent data generation and transfer learning based on GANs, VAEs, and ZSLs will become critical approaches to improving model robustness. Moreover, these techniques can mitigate data scarcity under new tasks or rare crop varieties.
Lightweight deployment and edge computing: To enable real-time inference on field-deployed UAVs, robots, and mobile devices, lightweight deep networks (e.g., MobileNet and EfficientNet), model compression, and edge–cloud collaborative computing will become core research directions, supporting the large-scale implementation of these technologies [169]. In addition, field conditions such as strong light variation, occlusion by leaves or soil, sensor disturbance (e.g., dust, vibration), and unstable network transmission must be considered during deployment. Solutions such as adaptive image enhancement, real-time calibration, multi-view fusion, and fault-tolerant system design should be explored to improve operational stability.
Bridging the gap between research and deployment: Future work should explicitly address real-world challenges such as domain shift, unseen environmental noise, hardware limitations, and long-term model drift. Building large-scale benchmark datasets under realistic field scenarios and promoting open-source deployment pipelines will help accelerate the transition of deep learning technologies from research laboratories to farms and agricultural enterprises [170,171].

5. Conclusions

Deep learning technologies, with their end-to-end feature learning capabilities, advantages in modeling nonlinear relationships, and powerful data fusion capacity, are reshaping the research paradigm of plant attribute monitoring. This review systematically summarized recent advances and state-of-the-art achievements in applying deep learning models—including CNNs, RNNs/LSTMs, Transformers, and GANs—to key applications such as plant growth monitoring, yield prediction, pest and disease identification, abiotic stress detection, and phenotypic analysis. Existing studies demonstrate that deep learning consistently outperforms traditional machine learning approaches under complex environmental conditions. Its ability to extract nonlinear features from high-dimensional multimodal datasets provides precise and reliable technical support for plant monitoring. However, challenges related to data scarcity, cross-domain generalization, model interpretability, and edge deployment remain critical constraints to large-scale adoption. In the future, integrating multimodal remote sensing imagery, high-throughput phenomics, and agricultural IoT, together with advanced techniques such as transfer learning, generative models, and causal inference, is expected to further enhance the applicability and accessibility of deep learning in plant attribute monitoring. Through interdisciplinary innovation, deep learning holds great promise for providing more robust technological support to drive the digitalization, intelligence, and greening of agricultural production, thereby promoting yield improvement, efficient resource utilization, and sustainable agricultural development.

Author Contributions

Conceptualization, H.W.; methodology, H.W.; validation, S.H.; formal analysis, S.H.; investigation, S.H.; writing—original draft preparation, S.H.; writing—review and editing, H.W.; visualization, S.H.; supervision, H.W.; project administration, H.W. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

No new data were created in this work.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. He, L.; Shan, Y.Q.; Liu, C.; Cao, H.; Liu, X.N.; Guo, Y. Prediction of bedload transport inside vegetation canopies with natural morphology. J. Hydrodyn. 2024, 36, 556–569. [Google Scholar] [CrossRef]
  2. Wang, R.F.; Qu, H.R.; Su, W.H. From Sensors to Insights: Technological Trends in Image-Based High-Throughput Plant Phenotyping. Smart Agric. Technol. 2025, 12, 101257. [Google Scholar] [CrossRef]
  3. He, Q.; Zhan, J.; Liu, X.; Dong, C.; Tian, D.; Fu, Q. Multispectral polarimetric bidirectional reflectance research of plant canopy. Opt. Lasers Eng. 2025, 184, 108688. [Google Scholar] [CrossRef]
  4. Pieruschka, R.; Schurr, U. Plant phenotyping: Past, present, and future. Plant Phenomics 2019, 2019, 7507131. [Google Scholar] [CrossRef] [PubMed]
  5. Yang, Z.Y.; Xia, W.K.; Chu, H.Q.; Su, W.H.; Wang, R.F.; Wang, H. A comprehensive review of deep learning applications in cotton industry: From field monitoring to smart processing. Plants 2025, 14, 1481. [Google Scholar] [CrossRef] [PubMed]
  6. Liao, Q.; Ding, R.; Du, T.; Kang, S.; Tong, L.; Li, S. Salinity-specific stomatal conductance model parameters are reduced by stomatal saturation conductance and area via leaf nitrogen. Sci. Total Environ. 2023, 876, 162584. [Google Scholar] [CrossRef]
  7. Scafaro, A.P.; Posch, B.C.; Evans, J.R.; Farquhar, G.D.; Atkin, O.K. Rubisco deactivation and chloroplast electron transport rates co-limit photosynthesis above optimal leaf temperature in terrestrial plants. Nat. Commun. 2023, 14, 2820. [Google Scholar] [CrossRef]
  8. Zavafer, A.; Bates, H.; Mancilla, C.; Ralph, P.J. Phenomics: Conceptualization and importance for plant physiology. Trends Plant Sci. 2023, 28, 1004–1013. [Google Scholar] [CrossRef]
  9. Hwang, Y.; Kim, J.; Ryu, Y. Canopy structural changes explain reductions in canopy-level solar induced chlorophyll fluorescence in Prunus yedoensis seedlings under a drought stress condition. Remote Sens. Environ. 2023, 296, 113733. [Google Scholar] [CrossRef]
  10. Meerdink, S.K.; Roberts, D.A.; King, J.Y.; Roth, K.L.; Gader, P.D.; Caylor, K.K. Using hyperspectral and thermal imagery to monitor stress of Southern California plant species during the 2013–2015 drought. ISPRS J. Photogramm. Remote Sens. 2025, 220, 580–592. [Google Scholar] [CrossRef]
  11. Polk, S.L.; Cui, K.; Plemmons, R.J.; Murphy, J.M. Active diffusion and VCA-assisted image segmentation of hyperspectral images. In Proceedings of the IGARSS 2022-2022 IEEE International Geoscience and Remote Sensing Symposium, Kuala Lumpur, Malaysia, 17–22 July 2022; IEEE: New York, NY, USA, 2022; pp. 1364–1367. [Google Scholar]
  12. Polk, S.L.; Cui, K.; Chan, A.H.; Coomes, D.A.; Plemmons, R.J.; Murphy, J.M. Unsupervised diffusion and volume maximization-based clustering of hyperspectral images. Remote Sens. 2023, 15, 1053. [Google Scholar] [CrossRef]
  13. Li, L.; Li, J.; Wang, H.; Georgieva, T.; Ferentinos, K.; Arvanitis, K.; Sygrimis, N. Sustainable energy management of solar greenhouses using open weather data on MACQU platform. Int. J. Agric. Biol. Eng. 2018, 11, 74–82. [Google Scholar] [CrossRef]
  14. Wang, H.; Mei, S.L. Shannon wavelet precision integration method for pathologic onion image segmentation based on homotopy perturbation technology. Math. Probl. Eng. 2014, 2014, 601841. [Google Scholar] [CrossRef]
  15. Yang, K.; Li, Y. Effects of water stress and fertilizer stress on maize growth and spectral identification of different stresses. Spectrochim. Acta Part A Mol. Biomol. Spectrosc. 2023, 297, 122703. [Google Scholar] [CrossRef] [PubMed]
  16. Ali, S.; Hassan, M.; Kim, J.Y.; Farid, M.I.; Sanaullah, M.; Mufti, H. FF-PCA-LDA: Intelligent feature fusion based PCA-LDA classification system for plant leaf diseases. Appl. Sci. 2022, 12, 3514. [Google Scholar] [CrossRef]
  17. Meerdink, S.; Roberts, D.; Hulley, G.; Gader, P.; Pisek, J.; Adamson, K.; King, J.; Hook, S.J. Plant species’ spectral emissivity and temperature using the hyperspectral thermal emission spectrometer (HyTES) sensor. Remote Sens. Environ. 2019, 224, 421–435. [Google Scholar] [CrossRef]
  18. Zhang, S. Challenges in KNN classification. IEEE Trans. Knowl. Data Eng. 2021, 34, 4663–4675. [Google Scholar] [CrossRef]
  19. Sai, K.; Sood, N.; Saini, I. Abiotic stress classification through spectral analysis of enhanced electrophysiological signals of plants. Biosyst. Eng. 2022, 219, 189–204. [Google Scholar] [CrossRef]
  20. Wang, Y.; Zhang, Q.; Yu, F.; Zhang, N.; Zhang, X.; Li, Y.; Wang, M.; Zhang, J. Progress in Research on Deep Learning-Based Crop Yield Prediction. Agronomy 2024, 14, 2264. [Google Scholar] [CrossRef]
  21. Zhou, G.; Wang, R.F. The Heterogeneous Network Community Detection Model Based on Self-Attention. Symmetry 2025, 17, 432. [Google Scholar] [CrossRef]
  22. Yao, M.; Huo, Y.; Ran, Y.; Tian, Q.; Wang, R.; Wang, H. Neural radiance field-based visual rendering: A comprehensive review. arXiv 2024, arXiv:2404.00714. [Google Scholar] [CrossRef]
  23. Kamilaris, A.; Prenafeta-Boldú, F.X. Deep learning in agriculture: A survey. Comput. Electron. Agric. 2018, 147, 70–90. [Google Scholar] [CrossRef]
  24. Tu, Y.-H.; Wang, R.-F.; Su, W.-H. Active Disturbance Rejection Control—New Trends in Agricultural Cybernetics in the Future: A Comprehensive Review. Machines 2025, 13, 111. [Google Scholar] [CrossRef]
  25. Sun, H.; Chu, H.Q.; Qin, Y.M.; Hu, P.; Wang, R.F. Empowering Smart Soybean Farming with Deep Learning: Progress, Challenges, and Future Perspectives. Agronomy 2025, 15, 1831. [Google Scholar] [CrossRef]
  26. Zhou, L.; Cai, J.; Ding, S. The identification of ice floes and calculation of sea ice concentration based on a deep learning method. Remote Sens. 2023, 15, 2663. [Google Scholar] [CrossRef]
  27. Wang, R.F.; Tu, Y.H.; Li, X.C.; Chen, Z.Q.; Zhao, C.T.; Yang, C.; Su, W.H. An Intelligent Robot Based on Optimized YOLOv11l for Weed Control in Lettuce. In Proceedings of the 2025 ASABE Annual International Meeting. American Society of Agricultural and Biological Engineers, Toronto, ON, Canada, 13–16 July 2025; p. 1. [Google Scholar]
  28. Wang, R.F.; Su, W.H. The application of deep learning in the whole potato production Chain: A Comprehensive review. Agriculture 2024, 14, 1225. [Google Scholar] [CrossRef]
  29. Zhao, C.T.; Wang, R.F.; Tu, Y.H.; Pang, X.X.; Su, W.H. Automatic lettuce weed detection and classification based on optimized convolutional neural networks for robotic weed control. Agronomy 2024, 14, 2838. [Google Scholar] [CrossRef]
  30. Wang, R.F.; Tu, Y.H.; Chen, Z.Q.; Zhao, C.T.; Su, W.H. A Lettpoint-Yolov11l Based Intelligent Robot for Precision Intra-Row Weeds Control in Lettuce. SSRN 2025. [Google Scholar] [CrossRef]
  31. Dong, S.; Wang, P.; Abbas, K. A survey on deep learning and its applications. Comput. Sci. Rev. 2021, 40, 100379. [Google Scholar] [CrossRef]
  32. Zhou, G.; Wang, R.F.; Cui, K. A Local Perspective-based Model for Overlapping Community Detection. arXiv 2025, arXiv:2503.21558. [Google Scholar] [CrossRef]
  33. Dargan, S.; Kumar, M.; Ayyagari, M.R.; Kumar, G. A survey of deep learning and its applications: A new paradigm to machine learning. Arch. Comput. Methods Eng. 2020, 27, 1071–1092. [Google Scholar] [CrossRef]
  34. Li, Z.; Sun, C.; Wang, H.; Wang, R.F. Hybrid optimization of phase masks: Integrating Non-Iterative methods with simulated annealing and validation via tomographic measurements. Symmetry 2025, 17, 530. [Google Scholar] [CrossRef]
  35. Li, Y.; Pan, L.; Peng, Y.; Li, X.; Wang, X.; Qu, L.; Song, Q.; Liang, Q.; Peng, S. Application of deep learning-based multimodal fusion technology in cancer diagnosis: A survey. Eng. Appl. Artif. Intell. 2025, 143, 109972. [Google Scholar] [CrossRef]
  36. de Souza Rodrigues, L.; Caixeta Filho, E.; Sakiyama, K.; Santos, M.F.; Jank, L.; Carromeu, C.; Silveira, E.; Matsubara, E.T.; Junior, J.M.; Goncalves, W.N. Deep4Fusion: A Deep FORage Fusion framework for high-throughput phenotyping for green and dry matter yield traits. Comput. Electron. Agric. 2023, 211, 107957. [Google Scholar] [CrossRef]
  37. Elavarasan, D.; Durai Raj Vincent, P. Fuzzy deep learning-based crop yield prediction model for sustainable agronomical frameworks. Neural Comput. Appl. 2021, 33, 13205–13224. [Google Scholar] [CrossRef]
  38. Singh, A.; Jones, S.; Ganapathysubramanian, B.; Sarkar, S.; Mueller, D.; Sandhu, K.; Nagasubramanian, K. Challenges and opportunities in machine-augmented plant stress phenotyping. Trends Plant Sci. 2021, 26, 53–69. [Google Scholar] [CrossRef]
  39. Wang, K.; Abid, M.A.; Rasheed, A.; Crossa, J.; Hearne, S.; Li, H. DNNGP, a deep neural network-based method for genomic prediction using multi-omics data in plants. Mol. Plant 2023, 16, 279–293. [Google Scholar] [CrossRef]
  40. Hossen, M.I.; Awrangjeb, M.; Pan, S.; Mamun, A.A. Transfer learning in agriculture: A review. Artif. Intell. Rev. 2025, 58, 97. [Google Scholar] [CrossRef]
  41. Cui, K.; Shao, Z.; Larsen, G.; Pauca, V.; Alqahtani, S.; Segurado, D.; Pinheiro, J.; Wang, M.; Lutz, D.; Plemmons, R.; et al. PalmProbNet: A Probabilistic Approach to Understanding Palm Distributions in Ecuadorian Tropical Forest via Transfer Learning. In Proceedings of the 2024 ACM Southeast Conference, Marietta, GA, USA, 18–20 April 2024; pp. 272–277. [Google Scholar]
  42. Khaki, S.; Wang, L.; Archontoulis, S.V. A CNN-RNN framework for crop yield prediction. Front. Plant Sci. 2020, 10, 1750. [Google Scholar] [CrossRef]
  43. Lu, W.; Du, R.; Niu, P.; Xing, G.; Luo, H.; Deng, Y.; Shu, L. Soybean yield preharvest prediction based on bean pods and leaves image recognition using deep learning neural network combined with GRNN. Front. Plant Sci. 2022, 12, 791256. [Google Scholar] [CrossRef]
  44. Zhuang, L.; Wang, C.; Hao, H.; Li, J.; Xu, L.; Liu, S.; Guo, X. Maize emergence rate and leaf emergence speed estimation via image detection under field rail-based phenotyping platform. Comput. Electron. Agric. 2024, 220, 108838. [Google Scholar] [CrossRef]
  45. El Sakka, M.; Ivanovici, M.; Chaari, L.; Mothe, J. A review of CNN applications in smart agriculture using multimodal data. Sensors 2025, 25, 472. [Google Scholar] [CrossRef] [PubMed]
  46. Huo, Y.; Yao, M.; Tian, Q.; Wang, T.; Wang, R.; Wang, H. FA-YOLO: Research On Efficient Feature Selection YOLO Improved Algorithm Based On FMDS and AGMF Modules. arXiv 2024, arXiv:2408.16313. [Google Scholar] [CrossRef]
  47. Peyal, H.I.; Nahiduzzaman, M.; Pramanik, M.A.H.; Syfullah, M.K.; Shahriar, S.M.; Sultana, A.; Ahsan, M.; Haider, J.; Khandakar, A.; Chowdhury, M.E. Plant disease classifier: Detection of dual-crop diseases using lightweight 2d cnn architecture. IEEE Access 2023, 11, 110627–110643. [Google Scholar] [CrossRef]
  48. Huo, Y.; Wang, R.F.; Zhao, C.T.; Hu, P.; Wang, H. Research on Obtaining Pepper Phenotypic Parameters Based on Improved YOLOX Algorithm. AgriEngineering 2025, 7, 209. [Google Scholar] [CrossRef]
  49. Turkoglu, M.O.; D’Aronco, S.; Wegner, J.D.; Schindler, K. Gating revisited: Deep multi-layer RNNs that can be trained. IEEE Trans. Pattern Anal. Mach. Intell. 2021, 44, 4081–4092. [Google Scholar] [CrossRef]
  50. Vural, N.M.; Ilhan, F.; Yilmaz, S.F.; Ergüt, S.; Kozat, S.S. Achieving online regression performance of LSTMs with simple RNNs. IEEE Trans. Neural Netw. Learn. Syst. 2021, 33, 7632–7643. [Google Scholar] [CrossRef]
  51. Basiri, M.E.; Nemati, S.; Abdar, M.; Cambria, E.; Acharya, U.R. ABCDM: An attention-based bidirectional CNN-RNN deep model for sentiment analysis. Future Gener. Comput. Syst. 2021, 115, 279–294. [Google Scholar] [CrossRef]
  52. Chen, J.; Zhuge, H. Extractive summarization of documents with images based on multi-modal RNN. Future Gener. Comput. Syst. 2019, 99, 186–196. [Google Scholar] [CrossRef]
  53. Dera, D.; Ahmed, S.; Bouaynaya, N.C.; Rasool, G. Trustworthy uncertainty propagation for sequential time-series analysis in rnns. IEEE Trans. Knowl. Data Eng. 2023, 36, 882–896. [Google Scholar] [CrossRef]
  54. Hochreiter, S.; Schmidhuber, J. Long short-term memory. Neural Comput. 1997, 9, 1735–1780. [Google Scholar] [CrossRef]
  55. Tian, H.; Wang, P.; Tansey, K.; Han, D.; Zhang, J.; Zhang, S.; Li, H. A deep learning framework under attention mechanism for wheat yield estimation using remotely sensed indices in the Guanzhong Plain, PR China. Int. J. Appl. Earth Obs. Geoinf. 2021, 102, 102375. [Google Scholar] [CrossRef]
  56. Cheng, E.; Wang, F.; Peng, D.; Zhang, B.; Zhao, B.; Zhang, W.; Hu, J.; Lou, Z.; Yang, S.; Zhang, H.; et al. A GT-LSTM spatio-temporal approach for winter wheat yield prediction: From the field scale to county scale. IEEE Trans. Geosci. Remote Sens. 2024, 62, 4409318. [Google Scholar] [CrossRef]
  57. Wang, L.; Zhang, H.; Bian, L.; Zhou, L.; Wang, S.; Ge, Y. Poplar seedling varieties and drought stress classification based on multi-source, time-series data and deep learning. Ind. Crops Prod. 2024, 218, 118905. [Google Scholar] [CrossRef]
  58. Chandel, N.S.; Rajwade, Y.A.; Dubey, K.; Chandel, A.K.; Subeesh, A.; Tiwari, M.K. Water stress identification of winter wheat crop with state-of-the-art AI techniques and high-resolution thermal-RGB imagery. Plants 2022, 11, 3344. [Google Scholar] [CrossRef] [PubMed]
  59. Ali, T.; Rehman, S.U.; Ali, S.; Mahmood, K.; Obregon, S.A.; Iglesias, R.C.; Khurshaid, T.; Ashraf, I. Smart agriculture: Utilizing machine learning and deep learning for drought stress identification in crops. Sci. Rep. 2024, 14, 30062. [Google Scholar] [CrossRef] [PubMed]
  60. Vaswani, A.; Shazeer, N.; Parmar, N.; Uszkoreit, J.; Jones, L.; Gomez, A.N.; Kaiser, Ł.; Polosukhin, I. Attention is all you need. In Advances in Neural Information Processing Systems, Proceedings of the 31st International Conference on Neural Information Processing Systems, Long Beach, CA, USA, 4–9 December 2017; Curran Associates Inc.: Red Hook, NY, USA, 2017; Volume 30. [Google Scholar]
  61. Wang, Z.; Wang, R.; Wang, M.; Lai, T.; Zhang, M. Self-supervised transformer-based pre-training method with General Plant Infection dataset. In Proceedings of the Chinese Conference on Pattern Recognition and Computer Vision (PRCV), Urumqi, China, 18–20 October 2024; Springer: Berlin/Heidelberg, Germany, 2024; pp. 189–202. [Google Scholar]
  62. Zhang, W.T.; Bai, Y.; Zheng, S.D.; Cui, J.; Huang, Z.Z. Tensor Transformer for hyperspectral image classification. Pattern Recognit. 2025, 163, 111470. [Google Scholar] [CrossRef]
  63. Guo, Y.; Lan, Y.; Chen, X. CST: Convolutional Swin Transformer for detecting the degree and types of plant diseases. Comput. Electron. Agric. 2022, 202, 107407. [Google Scholar] [CrossRef]
  64. Rahman, M.H.; Busby, S.; Ru, S.; Hanif, S.; Sanz-Saez, A.; Zheng, J.; Rehman, T.U. Transformer-Based hyperspectral image analysis for phenotyping drought tolerance in blueberries. Comput. Electron. Agric. 2025, 228, 109684. [Google Scholar] [CrossRef]
  65. Yao, M.; Huo, Y.; Tian, Q.; Zhao, J.; Liu, X.; Wang, R.; Xue, L.; Wang, H. FMRFT: Fusion mamba and DETR for query time sequence intersection fish tracking. Comput. Electron. Agric. 2025, 237, 110742. [Google Scholar] [CrossRef]
  66. Wu, A.Q.; Li, K.L.; Song, Z.Y.; Lou, X.; Hu, P.; Yang, W.; Wang, R.F. Deep Learning for Sustainable Aquaculture: Opportunities and Challenges. Sustainability 2025, 17, 5084. [Google Scholar] [CrossRef]
  67. Akkem, Y.; Biswas, S.K.; Varanasi, A. A comprehensive review of synthetic data generation in smart farming by using variational autoencoder and generative adversarial network. Eng. Appl. Artif. Intell. 2024, 131, 107881. [Google Scholar] [CrossRef]
  68. Wang, K.; Gou, C.; Duan, Y.; Lin, Y.; Zheng, X.; Wang, F.Y. Generative adversarial networks: Introduction and outlook. IEEE/CAA J. Autom. Sin. 2017, 4, 588–598. [Google Scholar] [CrossRef]
  69. Pu, Y.; Gan, Z.; Henao, R.; Yuan, X.; Li, C.; Stevens, A.; Carin, L. Variational autoencoder for deep learning of images, labels and captions. In Advances in Neural Information Processing Systems, Proceedings of the 30th International Conference on Neural Information Processing Systems, Barcelona, Spain, 5–10 December 2016; Curran Associates Inc.: Red Hook, NY, USA, 2016; Volume 29. [Google Scholar]
  70. Miranda, M.; Zabawa, L.; Kicherer, A.; Strothmann, L.; Rascher, U.; Roscher, R. Detection of anomalous grapevine berries using variational autoencoders. Front. Plant Sci. 2022, 13, 729097. [Google Scholar] [CrossRef] [PubMed]
  71. Creswell, A.; White, T.; Dumoulin, V.; Arulkumaran, K.; Sengupta, B.; Bharath, A.A. Generative adversarial networks: An overview. IEEE Signal Process. Mag. 2018, 35, 53–65. [Google Scholar] [CrossRef]
  72. Gui, J.; Sun, Z.; Wen, Y.; Tao, D.; Ye, J. A review on generative adversarial networks: Algorithms, theory, and applications. IEEE Trans. Knowl. Data Eng. 2021, 35, 3313–3332. [Google Scholar] [CrossRef]
  73. Yu, H.; Welch, J.D. MichiGAN: Sampling from disentangled representations of single-cell data using generative adversarial networks. Genome Biol. 2021, 22, 158. [Google Scholar] [CrossRef]
  74. Sun, R.; Huang, C.; Zhu, H.; Ma, L. Mask-aware photorealistic facial attribute manipulation. Comput. Vis. Media 2021, 7, 363–374. [Google Scholar] [CrossRef]
  75. Qin, C.; Liu, J.; Ma, S.; Du, J.; Jiang, G.; Zhao, L. Inverse design of semiconductor materials with deep generative models. J. Mater. Chem. A 2024, 12, 22689–22702. [Google Scholar] [CrossRef]
  76. Debnath, S.; Preetham, A.; Vuppu, S.; Kumar, S.N.P. Optimal weighted GAN and U-Net based segmentation for phenotypic trait estimation of crops using Taylor Coot algorithm. Appl. Soft Comput. 2023, 144, 110396. [Google Scholar] [CrossRef]
  77. Li, J.; Wang, Y.; Zheng, L.; Zhang, M.; Wang, M. Towards end-to-end deep RNN based networks to precisely regress of the lettuce plant height by single perspective sparse 3D point cloud. Expert Syst. Appl. 2023, 229, 120497. [Google Scholar] [CrossRef]
  78. Singh, A.K.; Rao, A.; Chattopadhyay, P.; Maurya, R.; Singh, L. Effective plant disease diagnosis using Vision Transformer trained with leafy-generative adversarial network-generated images. Expert Syst. Appl. 2024, 254, 124387. [Google Scholar] [CrossRef]
  79. Murphy, K.M.; Ludwig, E.; Gutierrez, J.; Gehan, M.A. Deep learning in image-based plant phenotyping. Annu. Rev. Plant Biol. 2024, 75, 771–795. [Google Scholar] [CrossRef]
  80. Chawla, T.; Mittal, S.; Azad, H.K. MobileNet-GRU fusion for optimizing diagnosis of yellow vein mosaic virus. Ecol. Inform. 2024, 81, 102548. [Google Scholar] [CrossRef]
  81. Taghavi Namin, S.; Esmaeilzadeh, M.; Najafi, M.; Brown, T.B.; Borevitz, J.O. Deep phenotyping: Deep learning for temporal phenotype/genotype classification. Plant Methods 2018, 14, 66. [Google Scholar] [CrossRef]
  82. Sagar, B.; Cauvery, N. Agriculture data analytics in crop yield estimation: A critical review. Indones. J. Electr. Eng. Comput. Sci. 2018, 12, 1087–1093. [Google Scholar] [CrossRef]
  83. Godfray, H.C.J.; Beddington, J.R.; Crute, I.R.; Haddad, L.; Lawrence, D.; Muir, J.F.; Pretty, J.; Robinson, S.; Thomas, S.M.; Toulmin, C. Food security: The challenge of feeding 9 billion people. Science 2010, 327, 812–818. [Google Scholar] [CrossRef] [PubMed]
  84. Nelson, W.J.; Lee, B.C.; Gasperini, F.A.; Hair, D.M. Meeting the challenge of feeding 9 billion people safely and securely. J. Agromed. 2012, 17, 347–350. [Google Scholar] [CrossRef] [PubMed]
  85. Muruganantham, P.; Wibowo, S.; Grandhi, S.; Samrat, N.H.; Islam, N. A systematic literature review on crop yield prediction with deep learning and remote sensing. Remote Sens. 2022, 14, 1990. [Google Scholar] [CrossRef]
  86. Khan, S.N.; Li, D.; Maimaitijiang, M. Using gross primary production data and deep transfer learning for crop yield prediction in the US Corn Belt. Int. J. Appl. Earth Obs. Geoinf. 2024, 131, 103965. [Google Scholar] [CrossRef]
  87. Yang, F.; Zhang, D.; Zhang, Y.; Zhang, Y.; Han, Y.; Zhang, Q.; Zhang, Q.; Zhang, C.; Liu, Z.; Wang, K. Prediction of corn variety yield with attribute-missing data via graph neural network. Comput. Electron. Agric. 2023, 211, 108046. [Google Scholar] [CrossRef]
  88. Maimaitijiang, M.; Sagan, V.; Sidike, P.; Hartling, S.; Esposito, F.; Fritschi, F.B. Soybean yield prediction from UAV using multimodal data fusion and deep learning. Remote Sens. Environ. 2020, 237, 111599. [Google Scholar] [CrossRef]
  89. Chu, Z.; Yu, J. An end-to-end model for rice yield prediction using deep learning fusion. Comput. Electron. Agric. 2020, 174, 105471. [Google Scholar] [CrossRef]
  90. Wei, J.; Tian, X.; Ren, W.; Gao, R.; Ji, Z.; Kong, Q.; Su, Z. A precise plot-level rice yield prediction method based on panicle detection. Agronomy 2024, 14, 1618. [Google Scholar] [CrossRef]
  91. Liu, Y.; Wang, S.; Wang, X.; Chen, B.; Chen, J.; Wang, J.; Huang, M.; Wang, Z.; Ma, L.; Wang, P.; et al. Exploring the superiority of solar-induced chlorophyll fluorescence data in predicting wheat yield using machine learning and deep learning methods. Comput. Electron. Agric. 2022, 192, 106612. [Google Scholar] [CrossRef]
  92. Jeong, S.; Ko, J.; Yeom, J.M. Predicting rice yield at pixel scale through synthetic use of crop and deep learning models with satellite data in South and North Korea. Sci. Total Environ. 2022, 802, 149726. [Google Scholar] [CrossRef] [PubMed]
  93. Cho, W.; Kim, S.; Na, M.; Na, I. Forecasting of tomato yields using attention-based LSTM network and ARMA model. Electronics 2021, 10, 1576. [Google Scholar] [CrossRef]
  94. Pisharody, S.N.; Duraisamy, P.; Rangarajan, A.K.; Whetton, R.L.; Herrero-Langreo, A. Precise Tomato Ripeness Estimation and Yield Prediction using Transformer Based Segmentation-SegLoRA. Comput. Electron. Agric. 2025, 233, 110172. [Google Scholar] [CrossRef]
  95. Zhang, Y.; Wu, M.; Li, J.; Yang, S.; Zheng, L.; Liu, X.; Wang, M. Automatic non-destructive multiple lettuce traits prediction based on DeepLabV3+. J. Food Meas. Charact. 2023, 17, 636–652. [Google Scholar] [CrossRef]
  96. Qin, Y.M.; Tu, Y.H.; Li, T.; Ni, Y.; Wang, R.F.; Wang, H. Deep Learning for sustainable agriculture: A systematic review on applications in lettuce cultivation. Sustainability 2025, 17, 3190. [Google Scholar] [CrossRef]
  97. Chakraborty, M.; Pourreza, A.; Zhang, X.; Jafarbiglu, H.; Shackel, K.A.; DeJong, T. Early almond yield forecasting by bloom mapping using aerial imagery and deep learning. Comput. Electron. Agric. 2023, 212, 108063. [Google Scholar] [CrossRef]
  98. Deguine, J.P.; Aubertot, J.N.; Flor, R.J.; Lescourret, F.; Wyckhuys, K.A.; Ratnadass, A. Integrated pest management: Good intentions, hard realities. A review. Agron. Sustain. Dev. 2021, 41, 38. [Google Scholar] [CrossRef]
  99. Das, A.; Pathan, F.; Jim, J.R.; Kabir, M.M.; Mridha, M. Deep learning-based classification, detection, and segmentation of tomato leaf diseases: A state-of-the-art review. Artif. Intell. Agric. 2025, 15, 192–220. [Google Scholar] [CrossRef]
  100. Polk, S.L.; Chan, A.H.; Cui, K.; Plemmons, R.J.; Coomes, D.A.; Murphy, J.M. Unsupervised detection of ash dieback disease (Hymenoscyphus fraxineus) using diffusion-based hyperspectral image clustering. In Proceedings of the IGARSS 2022-2022 IEEE International Geoscience and Remote Sensing Symposium, Kuala Lumpur, Malaysia, 17–22 July 2022; IEEE: New York, NY, USA, 2022; pp. 2287–2290. [Google Scholar]
  101. Wilke, C. Remote Sensing for Crops Spots Pests and Pathogens. ACS Cent. Sci. 2023, 9, 339–342. [Google Scholar] [CrossRef] [PubMed]
  102. Wang, Z.; Zhang, H.W.; Dai, Y.Q.; Cui, K.; Wang, H.; Chee, P.W.; Wang, R.F. Resource-Efficient Cotton Network: A Lightweight Deep Learning Framework for Cotton Disease and Pest Classification. Plants 2025, 14, 2082. [Google Scholar] [CrossRef] [PubMed]
  103. Shoaib, M.; Sadeghi-Niaraki, A.; Ali, F.; Hussain, I.; Khalid, S. Leveraging deep learning for plant disease and pest detection: A comprehensive review and future directions. Front. Plant Sci. 2025, 16, 1538163. [Google Scholar] [CrossRef]
  104. Wyckhuys, K.; Sanchez-Bayo, F.; Aebi, A.; van Lexmond, M.B.; Bonmatin, J.M.; Goulson, D.; Mitchell, E. Stay true to integrated pest management. Science 2021, 371, 133. [Google Scholar] [CrossRef]
  105. Chen, P.; Wang, R.; Yang, P. Deep learning in crop diseases and insect pests. Front. Plant Sci. 2023, 14, 1145458. [Google Scholar] [CrossRef]
  106. Liu, J.; Wang, X. Plant diseases and pests detection based on deep learning: A review. Plant Methods 2021, 17, 22. [Google Scholar] [CrossRef]
  107. Gao, Y.; Alyokhin, A.; Prager, S.M.; Reitz, S.; Huseth, A. Complexities in the Implementation and Maintenance of Integrated Pest Management in Potato. Annu. Rev. Entomol. 2024, 70, 45–63. [Google Scholar] [CrossRef]
  108. Tassis, L.M.; de Souza, J.E.T.; Krohling, R.A. A deep learning approach combining instance and semantic segmentation to identify diseases and pests of coffee leaves from in-field images. Comput. Electron. Agric. 2021, 186, 106191. [Google Scholar] [CrossRef]
  109. Xiong, Y.; Liang, L.; Wang, L.; She, J.; Wu, M. Identification of cash crop diseases using automatic image segmentation algorithm and deep learning with expanded dataset. Comput. Electron. Agric. 2020, 177, 105712. [Google Scholar] [CrossRef]
  110. Wang, C.; Du, P.; Wu, H.; Li, J.; Zhao, C.; Zhu, H. A cucumber leaf disease severity classification method based on the fusion of DeepLabV3+ and U-Net. Comput. Electron. Agric. 2021, 189, 106373. [Google Scholar] [CrossRef]
  111. Rehman, M.U.; Eesaar, H.; Abbas, Z.; Seneviratne, L.; Hussain, I.; Chong, K.T. Advanced drone-based weed detection using feature-enriched deep learning approach. Knowl.-Based Syst. 2024, 305, 112655. [Google Scholar] [CrossRef]
  112. Zhang, J.; Huang, Y.; Pu, R.; Gonzalez-Moreno, P.; Yuan, L.; Wu, K.; Huang, W. Monitoring plant diseases and pests through remote sensing technology: A review. Comput. Electron. Agric. 2019, 165, 104943. [Google Scholar] [CrossRef]
  113. Xu, W.; Li, W.; Wang, L.; Pompelli, M.F. Enhancing corn pest and disease recognition through deep learning: A comprehensive analysis. Agronomy 2023, 13, 2242. [Google Scholar] [CrossRef]
  114. Di, X.; Cui, K.; Wang, R.F. Toward Efficient UAV-Based Small Object Detection: A Lightweight Network with Enhanced Feature Fusion. Remote Sens. 2025, 17, 2235. [Google Scholar] [CrossRef]
  115. Zhu, H.; Lin, C.; Liu, G.; Wang, D.; Qin, S.; Li, A.; Xu, J.L.; He, Y. Intelligent agriculture: Deep learning in UAV-based remote sensing imagery for crop diseases and pests detection. Front. Plant Sci. 2024, 15, 1435016. [Google Scholar] [CrossRef]
  116. Muhammed, D.; Ahvar, E.; Ahvar, S.; Trocan, M.; Montpetit, M.J.; Ehsani, R. Artificial Intelligence of Things (AIoT) for smart agriculture: A review of architectures, technologies and solutions. J. Netw. Comput. Appl. 2024, 228, 103905. [Google Scholar] [CrossRef]
  117. Sharma, R.P.; Ramesh, D.; Pal, P.; Tripathi, S.; Kumar, C. IoT-enabled IEEE 802.15.4 WSN monitoring infrastructure-driven fuzzy-logic-based crop pest prediction. IEEE Internet Things J. 2021, 9, 3037–3045. [Google Scholar] [CrossRef]
  118. Cai, J.; Xiao, D.; Lv, L.; Ye, Y. An early warning model for vegetable pests based on multidimensional data. Comput. Electron. Agric. 2019, 156, 217–226. [Google Scholar] [CrossRef]
  119. Hari, P.; Singh, M.P. Adaptive knowledge transfer using federated deep learning for plant disease detection. Comput. Electron. Agric. 2025, 229, 109720. [Google Scholar] [CrossRef]
  120. Mallick, M.T.; Murty, D.O.; Pal, R.; Mandal, S.; Saha, H.N.; Chakrabarti, A. High-speed system-on-chip-based platform for real-time crop disease and pest detection using deep learning techniques. Comput. Electr. Eng. 2025, 123, 110182. [Google Scholar] [CrossRef]
  121. Wang, S.; Zeng, Q.; Yuan, G.; Ni, W.; Li, C.; Duan, H.; Xie, N.; Xiao, F.; Yang, X. Generalized zero-shot pest and disease image classification based on causal gating model. Comput. Electron. Agric. 2025, 230, 109827. [Google Scholar] [CrossRef]
  122. Chen, X.; Yang, B.; Liu, Y.; Feng, Z.; Lyu, J.; Luo, J.; Wu, J.; Yao, Q.; Liu, S. Intelligent survey method of rice diseases and pests using AR glasses and image-text multimodal fusion model. Comput. Electron. Agric. 2025, 237, 110574. [Google Scholar] [CrossRef]
  123. Jiang, Y.; Li, C. Convolutional neural networks for image-based high-throughput plant phenotyping: A review. Plant Phenomics 2020, 2020, 4152816. [Google Scholar] [CrossRef]
  124. Zarboubi, M.; Bellout, A.; Chabaa, S.; Dliou, A. CustomBottleneck-VGGNet: Advanced tomato leaf disease identification for sustainable agriculture. Comput. Electron. Agric. 2025, 232, 110066. [Google Scholar] [CrossRef]
  125. Weihs, B.J.; Tang, Z.; Tian, Z.; Heuschele, D.J.; Siddique, A.; Terrill, T.H.; Zhang, Z.; York, L.M.; Zhang, Z.; Xu, Z. Phenotyping alfalfa (Medicago sativa L.) root structure architecture via integrating confident machine learning with ResNet-18. Plant Phenomics 2024, 6, 0251. [Google Scholar] [CrossRef]
  126. Bheemanahalli, R.; Wang, C.; Bashir, E.; Chiluwal, A.; Pokharel, M.; Perumal, R.; Moghimi, N.; Ostmeyer, T.; Caragea, D.; Jagadish, S.K. Classical phenotyping and deep learning concur on genetic control of stomatal density and area in sorghum. Plant Physiol. 2021, 186, 1562–1579. [Google Scholar] [CrossRef]
  127. Atkins, K.; Garzón-Martínez, G.A.; Lloyd, A.; Doonan, J.H.; Lu, C. Unlocking the power of AI for phenotyping fruit morphology in Arabidopsis. GigaScience 2025, 14, giae123. [Google Scholar] [CrossRef]
  128. Zhou, S.; Chai, X.; Yang, Z.; Wang, H.; Yang, C.; Sun, T. Maize-IAS: A maize image analysis software using deep learning for high-throughput plant phenotyping. Plant Methods 2021, 17, 48. [Google Scholar] [CrossRef]
  129. Li, L.; Chang, H.; Zhao, S.; Liu, R.; Yan, M.; Li, F.; El-Sheery, N.I.; Feng, Z.; Yu, S. Combining high-throughput deep learning phenotyping and GWAS to reveal genetic variants of fruit branch angle in upland cotton. Ind. Crops Prod. 2024, 220, 119180. [Google Scholar] [CrossRef]
  130. Yu, H.; Dong, M.; Zhao, R.; Zhang, L.; Sui, Y. Research on precise phenotype identification and growth prediction of lettuce based on deep learning. Environ. Res. 2024, 252, 118845. [Google Scholar] [CrossRef]
  131. Uffelmann, E.; Huang, Q.Q.; Munung, N.S.; De Vries, J.; Okada, Y.; Martin, A.R.; Martin, H.C.; Lappalainen, T.; Posthuma, D. Genome-wide association studies. Nat. Rev. Methods Prim. 2021, 1, 59. [Google Scholar] [CrossRef]
  132. Arya, S.; Sandhu, K.S.; Singh, J.; Kumar, S. Deep learning: As the new frontier in high-throughput plant phenotyping. Euphytica 2022, 218, 47. [Google Scholar] [CrossRef]
  133. Champigny, M.J.; Unda, F.; Skyba, O.; Soolanayakanahally, R.Y.; Mansfield, S.D.; Campbell, M.M. Learning from methylomes: Epigenomic correlates of Populus balsamifera traits based on deep learning models of natural DNA methylation. Plant Biotechnol. J. 2020, 18, 1361–1375. [Google Scholar] [CrossRef] [PubMed]
  134. Wang, X.; Xuan, H.; Evers, B.; Shrestha, S.; Pless, R.; Poland, J. High-throughput phenotyping with deep learning gives insight into the genetic architecture of flowering time in wheat. GigaScience 2019, 8, giz120. [Google Scholar] [CrossRef] [PubMed]
  135. Peleke, F.F.; Zumkeller, S.M.; Gültas, M.; Schmitt, A.; Szymański, J. Deep learning the cis-regulatory code for gene expression in selected model plants. Nat. Commun. 2024, 15, 3488. [Google Scholar] [CrossRef] [PubMed]
  136. Rairdin, A.; Fotouhi, F.; Zhang, J.; Mueller, D.S.; Ganapathysubramanian, B.; Singh, A.K.; Dutta, S.; Sarkar, S.; Singh, A. Deep learning-based phenotyping for genome wide association studies of sudden death syndrome in soybean. Front. Plant Sci. 2022, 13, 966244. [Google Scholar] [CrossRef]
  137. Wang, H.; Yan, S.; Wang, W.; Chen, Y.; Hong, J.; He, Q.; Diao, X.; Lin, Y.; Chen, Y.; Cao, Y.; et al. Cropformer: An interpretable deep learning framework for crop genomic prediction. Plant Commun. 2025, 6, 101223. [Google Scholar] [CrossRef]
  138. Waadt, R.; Seller, C.A.; Hsu, P.K.; Takahashi, Y.; Munemasa, S.; Schroeder, J.I. Plant hormone regulation of abiotic stress responses. Nat. Rev. Mol. Cell Biol. 2022, 23, 680–694. [Google Scholar] [CrossRef]
  139. Zhang, H.; Zhu, J.; Gong, Z.; Zhu, J.K. Abiotic stress responses in plants. Nat. Rev. Genet. 2022, 23, 104–119. [Google Scholar] [CrossRef] [PubMed]
  140. Subeesh, A.; Chauhan, N. Deep learning based abiotic crop stress assessment for precision agriculture: A comprehensive review. J. Environ. Manag. 2025, 381, 125158. [Google Scholar] [CrossRef]
  141. Kidokoro, S.; Shinozaki, K.; Yamaguchi-Shinozaki, K. Transcriptional regulatory network of plant cold-stress responses. Trends Plant Sci. 2022, 27, 922–935. [Google Scholar] [CrossRef] [PubMed]
  142. Yu, Z.; Duan, X.; Luo, L.; Dai, S.; Ding, Z.; Xia, G. How plant hormones mediate salt stress responses. Trends Plant Sci. 2020, 25, 1117–1130. [Google Scholar] [CrossRef] [PubMed]
  143. Mittler, R.; Zandalinas, S.I.; Fichman, Y.; Van Breusegem, F. Reactive oxygen species signalling in plant stress responses. Nat. Rev. Mol. Cell Biol. 2022, 23, 663–679. [Google Scholar] [CrossRef]
  144. Cai, J.; Shen, L.; Kang, H.; Xu, T. RNA modifications in plant adaptation to abiotic stresses. Plant Commun. 2025, 6, 101229. [Google Scholar] [CrossRef]
  145. Zhu, J.K. Abiotic stress signaling and responses in plants. Cell 2016, 167, 313–324. [Google Scholar] [CrossRef]
  146. Zhou, Z.; Majeed, Y.; Naranjo, G.D.; Gambacorta, E.M. Assessment for crop water stress with infrared thermal imagery in precision agriculture: A review and future prospects for deep learning applications. Comput. Electron. Agric. 2021, 182, 106019. [Google Scholar] [CrossRef]
  147. Chandel, N.S.; Chakraborty, S.K.; Rajwade, Y.A.; Dubey, K.; Tiwari, M.K.; Jat, D. Identifying crop water stress using deep learning models. Neural Comput. Appl. 2021, 33, 5353–5367. [Google Scholar] [CrossRef]
  148. Yu, S.; Fan, J.; Lu, X.; Wen, W.; Shao, S.; Liang, D.; Yang, X.; Guo, X.; Zhao, C. Deep learning models based on hyperspectral data and time-series phenotypes for predicting quality attributes in lettuces under water stress. Comput. Electron. Agric. 2023, 211, 108034. [Google Scholar] [CrossRef]
  149. Buitrago, M.F.; Groen, T.A.; Hecker, C.A.; Skidmore, A.K. Changes in thermal infrared spectra of plants caused by temperature and water stress. ISPRS J. Photogramm. Remote Sens. 2016, 111, 22–31. [Google Scholar] [CrossRef]
  150. Bouain, N.; Krouk, G.; Lacombe, B.; Rouached, H. Getting to the root of plant mineral nutrition: Combinatorial nutrient stresses reveal emergent properties. Trends Plant Sci. 2019, 24, 542–552. [Google Scholar] [CrossRef] [PubMed]
  151. Theodore Armand, T.P.; Nfor, K.A.; Kim, J.I.; Kim, H.C. Applications of artificial intelligence, machine learning, and deep learning in nutrition: A systematic review. Nutrients 2024, 16, 1073. [Google Scholar] [CrossRef]
  152. Chandel, N.S.; Jat, D.; Chakraborty, S.K.; Upadhyay, A.; Subeesh, A.; Chouhan, P.; Manjhi, M.; Dubey, K. Deep learning assisted real-time nitrogen stress detection for variable rate fertilizer applicator in wheat crop. Comput. Electron. Agric. 2025, 237, 110545. [Google Scholar] [CrossRef]
  153. Deng, Y.; Xin, N.; Zhao, L.; Shi, H.; Deng, L.; Han, Z.; Wu, G. Precision Detection of Salt Stress in Soybean Seedlings Based on Deep Learning and Chlorophyll Fluorescence Imaging. Plants 2024, 13, 2089. [Google Scholar] [CrossRef]
  154. Miura, G. Surviving salt stress. Nat. Chem. Biol. 2023, 19, 1291. [Google Scholar] [CrossRef]
  155. Ma, X.; Zeng, X.; Huang, Y.; Liu, S.H.; Yin, J.; Yang, G.F. Visualizing plant salt stress with a NaCl-responsive fluorescent probe. Nat. Protoc. 2024, 20, 902–933. [Google Scholar] [CrossRef]
  156. Yao, J.P.; Wang, Z.Y.; de Oliveira, R.F.; Wang, Z.Y.; Huang, L. A deep learning method for the long-term prediction of plant electrical signals under salt stress to identify salt tolerance. Comput. Electron. Agric. 2021, 190, 106435. [Google Scholar] [CrossRef]
  157. Feng, X.; Zhan, Y.; Wang, Q.; Yang, X.; Yu, C.; Wang, H.; Tang, Z.; Jiang, D.; Peng, C.; He, Y. Hyperspectral imaging combined with machine learning as a tool to obtain high-throughput plant salt-stress phenotyping. Plant J. 2020, 101, 1448–1461. [Google Scholar] [CrossRef]
  158. Khotimah, W.N.; Boussaid, F.; Sohel, F.; Xu, L.; Edwards, D.; Jin, X.; Bennamoun, M. SC-CAN: Spectral convolution and Channel Attention network for wheat stress classification. Remote Sens. 2022, 14, 4288. [Google Scholar] [CrossRef]
  159. Jin, X.; Lin, Q.; Zhang, X.; Zhang, S.; Ma, W.; Sheng, P.; Zhang, L.; Wu, M.; Zhu, X.; Li, Z.; et al. OsPRMT6b balances plant growth and high temperature stress by feedback inhibition of abscisic acid signaling. Nat. Commun. 2025, 16, 5173. [Google Scholar] [CrossRef] [PubMed]
  160. Kan, Y.; Mu, X.R.; Gao, J.; Lin, H.X.; Lin, Y. The molecular basis of heat stress responses in plants. Mol. Plant 2023, 16, 1612–1634. [Google Scholar] [CrossRef] [PubMed]
  161. Lantzouni, O.; Alkofer, A.; Falter-Braun, P.; Schwechheimer, C. GROWTH-REGULATING FACTORS interact with DELLAs and regulate growth in cold stress. Plant Cell 2020, 32, 1018–1034. [Google Scholar] [CrossRef]
  162. Yang, W.; Yang, C.; Hao, Z.; Xie, C.; Li, M. Diagnosis of plant cold damage based on hyperspectral imaging and convolutional neural network. IEEE Access 2019, 7, 118239–118248. [Google Scholar] [CrossRef]
  163. Ryu, J.; Wi, S.; Lee, H. Snapshot-Based Multispectral Imaging for Heat Stress Detection in Southern-Type Garlic. Appl. Sci. 2023, 13, 8133. [Google Scholar] [CrossRef]
  164. Xing, D.; Wang, Y.; Sun, P.; Huang, H.; Lin, E. A CNN-LSTM-att hybrid model for classification and evaluation of growth status under drought and heat stress in Chinese fir (Cunninghamia lanceolata). Plant Methods 2023, 19, 66. [Google Scholar] [CrossRef]
  165. Tan, Z.; Shi, J.; Lv, R.; Li, Q.; Yang, J.; Ma, Y.; Li, Y.; Wu, Y.; Zhang, R.; Ma, H.; et al. Fast anther dehiscence status recognition system established by deep learning to screen heat tolerant cotton. Plant Methods 2022, 18, 53. [Google Scholar] [CrossRef]
  166. Mao, Y.; Li, H.; Wang, Y.; Wang, H.; Shen, J.; Xu, Y.; Ding, S.; Wang, H.; Ding, Z.; Fan, K. Rapid monitoring of tea plants under cold stress based on UAV multi-sensor data. Comput. Electron. Agric. 2023, 213, 108176. [Google Scholar] [CrossRef]
  167. Wang, J.; Liu, N.; Liu, G.; Liu, Y.; Zhuang, X.; Zhang, L.; Wang, Y.; Yu, G. RSD-YOLO: A Defect Detection Model for Wind Turbine Blade Images. In Proceedings of the International Conference on Computer Engineering and Networks, Kashi, China, 18–21 October 2024; Springer: Berlin/Heidelberg, Germany, 2025; pp. 423–435. [Google Scholar]
  168. Yang, Z.X.; Li, Y.; Wang, R.F.; Hu, P.; Su, W.H. Deep Learning in Multimodal Fusion for Sustainable Plant Care: A Comprehensive Review. Sustainability 2025, 17, 5255. [Google Scholar] [CrossRef]
  169. Jiang, D.; Wang, H.; Li, T.; Gouda, M.A.; Zhou, B. Real-time tracker of chicken for poultry based on attention mechanism-enhanced YOLO-Chicken algorithm. Comput. Electron. Agric. 2025, 237, 110640. [Google Scholar] [CrossRef]
  170. Teklegiorgis, S.; Belayneh, A.; Gebermeskel, K.; Akomolafe, G.F.; Dejene, S.W. Modelling the current and future agro-ecological distribution potential of Mexican prickly poppy (Argemone mexicana L.) invasive alien plant species in South Wollo, Ethiopia. Geol. Ecol. Landscapes 2024, 1–15. [Google Scholar] [CrossRef]
  171. Sekar, K.C.; Thapliyal, N.; Pandey, A.; Joshi, B.; Mukherjee, S.; Bhojak, P.; Bisht, M.; Bhatt, D.; Singh, S.; Bahukhandi, A. Plant species diversity and density patterns along altitude gradient covering high-altitude alpine regions of west Himalaya, India. Geol. Ecol. Landscapes 2024, 8, 559–573. [Google Scholar] [CrossRef]
Figure 1. The scheme diagram of deep learning technologies in plant attribute monitoring.
Figure 2. Faster R-CNN network architecture for maize seedling detection [44].
Figure 3. Deep learning-based multimodal framework for drought stress identification: (a) the RNN model and (b) the LSTM model [59].
Figure 4. LWC-Former model for hyperspectral Transformer-based analysis of drought tolerance in blueberries [64].
Figure 5. Deep learning model-based prediction of maize and soybean yields [86].
Figure 6. Deep learning-based workflow for plant disease and pest image recognition: (a) input raw data, (b) leaf classification, (c) lesion detection, and (d) lesion segmentation [103].
Figure 7. Severity classification of cucumber leaf diseases based on integrated DeepLabV3+ and U-Net: (a) example cucumber images, (b) image segmentation and annotation, (c) image augmentation, and (d) the DUNet network architecture [110].
Figure 8. Generalized zero-shot pest and disease image classification based on a causal gating model: (A) redundancy loss differences computed by SCM and non-SCM base feature extractors, (B) the structural causal model, (C) the causal feature extraction module, and (D) the dataset [121].
Figure 9. Common computer vision tasks in plant phenomics [79].
Figure 10. High-throughput software-based extraction of cotton fruit branch angles: (a,b) collection and imaging of cotton branches; (c) object detection using YOLOv5 combined with grayscale conversion, edge detection, and the Hough transform; and (d) calculation of branch angles using the Hough transform method [129].
Figure 11. High-throughput phenotypic analysis of the genetic architecture of wheat flowering time using deep learning networks: (A) a phenotyping platform and breeder training image dataset and (B) the schematic of the WheatNet neural network [134].
Figure 12. Deep learning CNN-based framework revealing the cis-regulatory code of plant gene expression [135].
Figure 13. Common biotic and abiotic stress factors in plants [140].
Figure 14. Deep learning model based on hyperspectral data and a time series for predicting lettuce quality under water stress: (a) experimental materials and site and (b) the deep learning model architecture [148].
Figure 15. (a) Two deep learning models for real-time monitoring of nitrogen stress in wheat [152]. (b) Noninvasive detection of salinity stress in soybean seedlings using a deep learning CNN and chlorophyll fluorescence imaging [153].
Figure 16. Rapid and accurate monitoring of tea plants under winter cold stress using a GRU-CNN neural network architecture [166].
Table 1. Deep learning models applied to plant trait dynamics monitoring.
| Model Type | Main Characteristics | Advantages | Disadvantages | Algorithms and Applications | Potential Scenarios |
|---|---|---|---|---|---|
| CNN | Multi-layer convolution and pooling structures that automatically extract local spatial features | Excellent performance in image classification and object detection; strong feature extraction capacity | Limited in capturing long-range dependencies; less effective for temporal modeling | ResNet, DenseNet, and VGG; maize emergence rate and leaf emergence speed monitoring [44] | Plant pest and disease identification, leaf phenotyping, and crop distribution classification |
| RNN | Processes sequential data through recurrent connections | Strong sequence modeling capabilities; suited for temporal dependencies | Prone to vanishing/exploding gradients; limited in capturing long-term dependencies | Basic RNN; accurate regression of lettuce plant height from single-view sparse 3D point clouds [77] | Crop growth modeling and yield prediction |
| LSTM | Adds gating mechanisms to RNNs to enhance long-term memory | Effectively learns complex dynamics in long sequences | Computationally intensive; sensitive to parameters | Stacked LSTM; poplar seedling variety and drought stress classification [57] | Multitemporal yield prediction, phenological monitoring, and stress analysis |
| Transformer | Self-attention mechanism capturing global dependencies | Highly efficient parallel computation; supports multimodal data fusion | Large model size; requires substantial training data | Vision Transformer (ViT); hyperspectral imagery for blueberry drought phenotyping [64] | Hyperspectral feature extraction and multimodal phenotypic prediction |
| GAN | Generates high-quality synthetic data via adversarial training | Mitigates sample scarcity; enhances model robustness | Training instability; difficult generator–discriminator balance | Pix2Pix GAN; leaf disease image augmentation for plant disease diagnosis [78] | Pest and disease sample expansion, data augmentation, and rare phenotype simulation |
| VAE | Probabilistic generative model learning latent representations | Generates diverse samples; interpretable latent variables | Lower sample clarity compared to GANs | β-VAE; dimensionality reduction and generation of hyperspectral plant phenotyping data [79] | Phenotypic representation, anomaly detection, and phenomics data generation |
| GRU | Simplified LSTM variant | Fewer parameters; efficient training | Slightly lower expressiveness than LSTM | GRU network; optimized diagnosis of yellow vein mosaic virus [80] | Pest and disease monitoring and continuous growth dynamics prediction |
| Ensemble DL | Integrates multiple models to improve robustness | High accuracy; good generalization | High computational resource consumption | Ensemble CNN + LSTM; automated genotype classification and dynamic phenotype recognition during plant growth [81] | Cross-scale yield prediction and stress adaptation modeling |