Article

A Non-Destructive System Using UVE Feature Selection and Lightweight Deep Learning to Assess Wheat Fusarium Head Blight Severity Levels

1 College of Mechanical and Electronic Engineering, Northwest A&F University, Yangling 712100, China
2 Key Laboratory of Agricultural Internet of Things, Ministry of Agriculture and Rural Affairs, Yangling 712100, China
3 Shaanxi Key Laboratory of Agricultural Information Perception and Intelligent Service, Yangling 712100, China
4 Information Management Office, Northwest A&F University, Yangling 712100, China
* Author to whom correspondence should be addressed.
Agronomy 2025, 15(9), 2051; https://doi.org/10.3390/agronomy15092051
Submission received: 18 July 2025 / Revised: 21 August 2025 / Accepted: 22 August 2025 / Published: 26 August 2025

Abstract

Fusarium head blight (FHB), a globally significant agricultural disaster, causes annual losses of tens of millions of tons of wheat. Toxins produced by FHB, such as deoxynivalenol (DON), further pose serious threats to human and livestock health. Consequently, rapid and non-destructive determination of FHB severity is crucial for implementing timely and precise control measures, thereby ensuring wheat supply security. This study therefore adopts hyperspectral imaging (HSI) combined with a lightweight deep learning model. First, wheat ears were inoculated with Fusarium fungi at the spike’s midpoint, and HSI data were acquired, yielding 1660 samples representing varying disease severities. By integrating multiplicative scatter correction (MSC) and uninformative variable elimination (UVE), spectral features were extracted in a way that minimizes feature dimensionality while preserving high classification accuracy. Finally, a lightweight FHB severity discrimination model based on MobileNetV2 was developed and deployed as an easy-to-use analysis system. Analysis revealed that the UVE-selected characteristic bands for FHB severity fell predominantly within 590–680 nm (related to chlorophyll degradation), 930–1043 nm (related to water stress) and at 738 nm (related to cell wall polysaccharide decomposition). This distribution aligns with the synergistic effect of rapid chlorophyll degradation and structural damage that accompanies disease progression. The resulting MobileNetV2 model achieved a mean average precision (mAP) of 99.93% on the training set and 98.26% on the independent test set. Crucially, it maintains a parameter size of only 8.50 MB and processes data 2.36 times faster than the heavier EfficientNet-B0, optimally balancing accuracy and operational efficiency for field-deployed equipment. This advancement enables agricultural workers to implement timely control measures with markedly improved precision.

1. Introduction

Wheat, as one of the world’s three major staple crops, plays a vital role in global food security. Recent years have witnessed growing international concern over food security, with wheat disease control emerging as a key research focus [1,2]. Fusarium Head Blight (FHB), caused by Fusarium pathogens, is a highly destructive disease. Its mycotoxins—notably deoxynivalenol (DON)—exhibit significant cytotoxicity, immunosuppressive effects, and carcinogenic potential in humans and animals [3,4,5]. The increasing prevalence of FHB is driven by multiple factors, including climate warming, microbial source accumulation from extensive straw incorporation, and enhanced Fusarium fungicide resistance. In pandemic years, yield losses can exceed 40%, costing global agriculture over $3 billion annually [6]. Global regulators such as FAO and EFSA have consequently imposed strict limits on DON levels in food and feed products to mitigate health risks [7].
Current FHB management primarily relies on fungicide application and resistant cultivars to block disease spread and suppress DON production [8]. However, traditional chemical control faces a dilemma: excessive use exacerbates environmental burdens and accelerates pathogen resistance, while insufficient application fails to contain epidemics, compromising yield and quality [9,10]. Early FHB symptoms are often subtle and easily missed during manual scouting, delaying optimal intervention. Thus, developing intelligent detection technologies and equipment to efficiently and accurately assess FHB severity, while enhancing smart monitoring capabilities, has become a critical breakthrough for establishing a wheat safety system and a sustainable disease management framework [11]. Machine and deep learning-based non-destructive detection technologies have advanced rapidly in recent years [12]. Yet conventional RGB imaging lacks sensitivity to biochemical changes in latent infections, limiting early detection and causing high misclassification rates for mild symptoms. Hyperspectral imaging (HSI) offers a promising alternative for wheat FHB monitoring [13], simultaneously capturing spatial and spectral data to provide comprehensive diagnostic information [14,15,16]. Nevertheless, HSI’s high dimensionality, large data volume, and field environmental variability significantly challenge identification accuracy and model robustness. Deep learning advancements help overcome these bottlenecks.
In recent years, feature selection and dimensionality reduction have become essential in hyperspectral image analysis, not only for computational efficiency but also for improving interpretability and diagnostic reliability. While feature selection methods such as uninformative variable elimination (UVE), random frog (RF), the successive projections algorithm (SPA) and competitive adaptive reweighted sampling (CARS) identify a small subset of wavelengths with strong discriminative power, a complementary body of research has focused on nonlinear dimensionality reduction techniques that preserve the intrinsic structure of high-dimensional data. Lespinats et al. [17] provided a comprehensive treatment of such methods, emphasizing the importance of maintaining the topology and neighborhood relationships of the original data when reducing dimensionality. This perspective is highly relevant to disease detection, where subtle spectral differences correspond to real physiological states.
Beyond generic dimensionality reduction, several studies have proposed interpretative frameworks for analysing and validating reduced representations. Colange et al. [18] introduced the MING method as a visual exploration tool, allowing practitioners to relate projection space structures back to their high-dimensional origins. This is particularly valuable in plant pathology, where interpretability of the selected features is critical for linking spectral signatures to biological processes. Colange et al. [19] further developed a technique to superimpose neighborhood graphs onto reduced spaces, enabling the identification and quantification of distortions caused by dimensionality reduction. Supervised dimensionality reduction can also benefit from explicit control over class separability and neighbor preservation. Colange et al. [20] proposed a method for steering distortions in supervised embeddings, allowing improved discrimination of classes while retaining the intrinsic structure of the data. Such approaches are conceptually aligned with the biological motivation of UVE, which aims to preserve relevant variance while removing noise and redundancy.
The reliability of dimensionality reduction in diagnostic and monitoring applications has also been critically assessed in other domains. Geoffroy et al. [21] evaluated the robustness of various techniques for fault detection in building performance monitoring, highlighting the need to assess stability under varying operational conditions—an aspect equally important for in-field plant disease detection. Similarly, Geoffroy et al. [22] demonstrated how multidimensional scaling can support anomaly detection in continuous commissioning scenarios, illustrating the general applicability of structure-preserving methods in diverse diagnostic contexts.

2. Materials and Methods

To address these challenges, this study first generated gradient-disease samples through artificial inoculation and acquired HSI data. After preprocessing, optimal feature bands were selected by comparative algorithm evaluation. A high-performance lightweight model was subsequently developed and deployed as a rapid identification system; the overall process is shown in Figure 1. This solution delivers real-time, cost-effective decision support for large-scale FHB monitoring and precision chemical application, advancing intelligent equipment development for field deployment.

2.1. Experimental Materials and Sample Preparation

The wheat cultivar Fielder, provided by the College of Plant Protection, Northwest A&F University, was used in this research. The selected seeds were planted in plastic pots and cultivated under controlled greenhouse conditions (25 °C, 70% relative humidity, and a 12 h/12 h light/dark cycle). At anthesis, wheat spikes were inoculated following established protocols [23] during 5–8 August 2024. Key procedures included:
① Preparing Fusarium graminearum spore suspensions (1 × 10⁵–5 × 10⁵ cells/mL); ② Selecting spikes at a uniform developmental stage, with completely extended anthers and a glume opening angle > 45°; and ③ Vortex-mixing suspensions before inoculation to ensure homogeneity. Inoculation was carried out via single-floret injection, delivering 10 μL of spore suspension into selected florets. Injected sites were marked, and inoculated spikes were bagged for 2 d to maintain humidity.
From 16 days post-inoculation, diseased wheat ears representing graded severity levels were collected. Following China’s National Standard GB/T 3543.5-2022 (Field Investigation Specifications for Wheat Fusarium Head Blight), samples with mechanical damage or external contamination were excluded. This process established a four-tier severity classification library comprising 255 infected wheat ears.

2.2. Data Acquisition

In the Key Laboratory of Agricultural Internet of Things, Ministry of Agriculture and Rural Affairs, Northwest A&F University, RGB images and HSI data were collected from wheat ear samples selected for experiments.

2.2.1. Phenotypic Image Data

RGB images were captured under natural light using a Xiaomi 13 mobile phone (Xiaomi Corporation, Beijing, China) equipped with a triple rear camera system. The primary camera features a 1/1.49-inch IMX800 sensor (54 MP resolution), 23 mm equivalent focal length, f/1.8 aperture, and seven aspherical lenses. Disease severity was classified into four grades according to China’s National Standard GB/T 15796-2011 (Technical Specification for Detection and Forecast of Wheat Fusarium Head Blight), based on the percentage of infected spikelets per ear [24]. To achieve precise severity classification, we implemented a multi-angle data fusion approach. Each wheat ear was imaged at four orthogonal orientations (0°, 90°, 180°, 270°). The final disease index was calculated as the mean infected area proportion across these angles. According to the national standard classification thresholds, samples were categorized into four severity levels based on this index. The results matched the visual assessment by agronomic experts, with partial samples from each level shown in Figure 2.
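The grading logic described above can be sketched as follows; the level thresholds used here are illustrative assumptions, not values quoted from GB/T 15796-2011:

```python
def disease_index(angle_proportions):
    """Mean infected-area proportion over the 0/90/180/270-degree views."""
    return sum(angle_proportions) / len(angle_proportions)

def severity_level(index, thresholds=(0.25, 0.50, 0.75)):
    """Map a disease index in [0, 1] to a severity level 1-4.
    Thresholds are illustrative, not the official GB/T 15796-2011 cut-offs."""
    for level, upper in enumerate(thresholds, start=1):
        if index <= upper:
            return level
    return len(thresholds) + 1

print(severity_level(disease_index([0.10, 0.15, 0.05, 0.10])))  # → 1
```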
To mitigate model bias from sample imbalance, data augmentation was performed through geometric transformation (180° rotation), brightness adjustment (factors of 0.7–1.4) and additive Gaussian noise. The distribution of original versus augmented samples is detailed in Table 1. Symptom severity is represented as the ratio of infected spikelets to the total number of spikelets per ear.
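A minimal sketch of the three augmentations listed above, applied to an image array scaled to [0, 1]; the Gaussian noise standard deviation is an assumed value:

```python
import numpy as np

rng = np.random.default_rng(0)

def augment(img):
    """Return the three augmented variants described in the text
    for an image array with values in [0, 1]."""
    variants = [np.rot90(img, k=2)]                    # 180-degree rotation
    gain = rng.uniform(0.7, 1.4)                       # brightness factor
    variants.append(np.clip(img * gain, 0.0, 1.0))
    noise = rng.normal(0.0, 0.01, size=img.shape)      # assumed sigma
    variants.append(np.clip(img + noise, 0.0, 1.0))
    return variants

variants = augment(np.full((4, 4, 3), 0.5))
print(len(variants))  # → 3
```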

2.2.2. Hyperspectral Image Data

Hyperspectral images were acquired using the SOC-710VP imaging system (Surface Optics Corporation, San Diego, CA, USA) with a 400–1000 nm spectral range covering the visible to near-infrared band at 2.1 nm resolution. The system configuration included three 100-W halogen lamps arranged at 45° incidence to minimize specular reflection, a 50 mm C-mount lens at f/2.1 aperture, and a 55 cm lens-to-sample working distance. Image acquisition was controlled via HyperScanner software (17.5.3) under default low-gain mode with 24 ms integration time. Scanner settings utilized quick scan mode (5–10 s acquisition time), capturing 696 × 520 spatial pixels across 256 spectral channels.
To minimize environmental interference, experiments were conducted in a climate-controlled darkroom with 30 min system preheating prior to imaging. The acquisition method of HSI data is shown in Figure 2. During the collection process, the samples were mounted on a motorized lift stage with spike axes perpendicular to the scanning direction. Three wheat ears per group were arranged on black velvet (reflectance less than 0.02) with 2 cm inter-ear spacing, ensuring complete coverage within a single scan. Each ear was imaged separately at 0°, 90°, 180°, and 270° orientations, yielding 900 hyperspectral cubes from 225 ears to enhance data representativeness.
Raw digital numbers (DN) were converted to reflectance using SRAna710e software (3.6.3) with white reference calibration [25], eliminating sensor drift and dark current effects. ENVI 5.6 software then extracted regions of interest (ROIs) encompassing entire wheat ears that were cropped to 150 × 420 pixels (256 channels). The specific ROI extraction steps are shown in Figure 3. Subsequently, binary masking removed background artifacts, isolating ear-specific pixels for downstream analysis.
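The white-reference calibration step corresponds to the standard conversion below; the exact procedure inside SRAna710e is proprietary, so this is only the textbook formula:

```python
import numpy as np

def dn_to_reflectance(dn, white, dark, eps=1e-9):
    """R = (DN - dark) / (white - dark), clipped to [0, 1]."""
    refl = (dn - dark) / np.maximum(white - dark, eps)
    return np.clip(refl, 0.0, 1.0)

dn = np.array([0.0, 50.0, 150.0])
print(dn_to_reflectance(dn, white=100.0, dark=0.0))  # reflectance 0, 0.5, 1
```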

2.3. Data Processing

2.3.1. HSI Pretreatment

During HSI acquisition, dynamic illumination variations and minor instrument parameter deviations introduce significant noise, compromising data accuracy. Four preprocessing methods were therefore applied in this study [26,27,28,29]: normalization, standard normal variate (SNV) transformation, multiplicative scatter correction (MSC), and Savitzky–Golay (SG) smoothing. Their efficacy in noise suppression was evaluated using Support Vector Machine (SVM)-based metrics to enhance feature selection reliability.
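Two of the compared preprocessing methods, SNV and MSC, can be sketched on a (samples × bands) spectral matrix as follows:

```python
import numpy as np

def snv(X):
    """Standard normal variate: zero mean, unit variance per spectrum."""
    return (X - X.mean(axis=1, keepdims=True)) / X.std(axis=1, keepdims=True)

def msc(X):
    """Multiplicative scatter correction against the mean spectrum."""
    ref = X.mean(axis=0)
    corrected = np.empty_like(X, dtype=float)
    for i, spectrum in enumerate(X):
        slope, intercept = np.polyfit(ref, spectrum, 1)  # spectrum ≈ a·ref + b
        corrected[i] = (spectrum - intercept) / slope
    return corrected
```

After MSC, every spectrum is rescaled onto the mean spectrum's baseline, which is exactly the flattening effect described for Figure 5d.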

2.3.2. Selection of HSI Characteristic Wavelength

Although hyperspectral pixels contain continuous spectral data, adjacent bands exhibit high correlation. To eliminate redundancy and extract biophysically meaningful features, four feature selection algorithms were implemented, including uninformative variable elimination (UVE), random forest (RF), successive projection algorithm (SPA) and competitive adaptive reweighted sampling (CARS) [30,31,32,33]. SVM evaluation metrics assessed their performance in information retention, class separability, and data structure preservation.
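A conceptual sketch of UVE: artificial noise columns are appended to the spectra, regression coefficients are re-estimated under leave-one-out resampling, and a real band is kept only if its coefficient stability exceeds that of the best noise column. UVE is usually paired with PLS regression; plain least squares is substituted here purely to keep the sketch short:

```python
import numpy as np

def uve_select(X, y, n_noise=20, seed=0):
    """Keep only bands whose coefficient stability beats the best
    artificial noise column (the core idea of UVE)."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    Xa = np.hstack([X, rng.normal(size=(n, n_noise))])
    coefs = []
    for i in range(n):                               # leave-one-out resampling
        keep = np.arange(n) != i
        b, *_ = np.linalg.lstsq(Xa[keep], y[keep], rcond=None)
        coefs.append(b)
    coefs = np.asarray(coefs)
    stability = np.abs(coefs.mean(axis=0) / (coefs.std(axis=0) + 1e-12))
    cutoff = stability[p:].max()                     # best noise column
    return np.where(stability[:p] > cutoff)[0]
```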

2.3.3. Lightweight Model

For field-deployable solutions requiring low computing power and real-time processing, three lightweight models were compared: MobileNetV2, EfficientNet-B0, and ShuffleNetV2 [34,35,36]. In MobileNetV2, the architecture employs a residual structure that initially increases the number of channels via 1 × 1 convolution, subsequently applies depthwise separable convolution, and ultimately reduces the channel count. The linear bottleneck mechanism incorporates linear activation within the dimensionality reduction layer to mitigate information loss; EfficientNet-B0 achieves a balance among network depth, width, and input resolution through a composite scaling strategy, while also integrating inverted residuals, the Squeeze-and-Excitation (SE) attention mechanism, and Swish activation with an enhanced MBConv module. ShuffleNetV2 emphasizes a design that is compatible with hardware constraints by implementing channel division, which separates the input into two streams: constant mapping and convolution. Additionally, it employs channel shuffling to enhance information interaction. This architecture adheres to four engineering principles aimed at reducing memory access costs and optimizing branch structures.
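The parameter savings that make MobileNetV2 lightweight come largely from depthwise separable convolution; counting the weights of one layer illustrates the effect:

```python
def standard_conv_params(c_in, c_out, k):
    """Weights in a standard k x k convolution layer (bias ignored)."""
    return c_in * c_out * k * k

def depthwise_separable_params(c_in, c_out, k):
    """Depthwise (one k x k filter per input channel) + pointwise 1 x 1."""
    return c_in * k * k + c_in * c_out

c_in, c_out, k = 32, 64, 3
print(standard_conv_params(c_in, c_out, k))        # → 18432
print(depthwise_separable_params(c_in, c_out, k))  # → 2336
```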

2.4. Experimental Hyperparameter Setting

Experimental configurations are detailed in Table 2. All models were implemented in Python (3.8.20)/PyTorch (2.4.1) with consistent hyperparameters: an initial learning rate of 0.001, a batch size of 100, a total of 50 training epochs, and optimization of model parameters via the Adam optimizer. During the training process, the datasets were partitioned into training, validation and test sets according to 6:2:2. Each epoch involved training the model on the training set and subsequently evaluating its performance on the validation set. Early stopping (patience = 10) preserved the best-performing models when validation accuracy plateaued, effectively mitigating overfitting. To improve the reliability of the findings, ten repeated experiments were performed to minimize variability, and the results were reported as the mean values derived from these ten trials.
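The 6:2:2 split and patience-based early stopping can be sketched as follows; the validation accuracies passed to the helper are placeholders:

```python
import random

def split_622(items, seed=0):
    """Shuffle and split into 60/20/20 train/validation/test subsets."""
    items = list(items)
    random.Random(seed).shuffle(items)
    n = len(items)
    a, b = n * 6 // 10, n * 8 // 10
    return items[:a], items[a:b], items[b:]

def early_stop_epoch(val_acc_per_epoch, patience=10):
    """Index of the epoch whose weights early stopping would keep."""
    best, best_epoch = float("-inf"), 0
    for epoch, acc in enumerate(val_acc_per_epoch):
        if acc > best:
            best, best_epoch = acc, epoch
        elif epoch - best_epoch >= patience:
            break
    return best_epoch

train, val, test = split_622(range(1660))
print(len(train), len(val), len(test))  # → 996 332 332
```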

2.5. Evaluating Indicator

In the present study, the evaluation of deep learning model performance was conducted using several metrics, including Precision, Recall, F1-score, mean average precision (mAP), frames per second (FPS), logarithmic efficiency index (LEI), the number of model parameters and model size measured in megabytes (MB) [37,38]. These metrics are chosen to provide a comprehensive assessment across three dimensions: classification accuracy, target detection efficacy, and model complexity. The definitions and calculation methodologies for each of these indices are detailed below:
Precision quantifies the ratio of true positive instances to the total number of instances that the model has predicted as positive.
Precision = TP / (TP + FP) × 100
where true positive (TP) denotes the count of accurately identified positive samples, while false positive (FP) indicates the number of incorrectly identified positive samples.
The Recall rate is defined as the ratio of all actual positive samples that the model successfully identifies.
Recall = TP / (TP + FN) × 100
where false negative (FN) denotes the quantity of false negative samples identified.
F1-score is defined as the harmonic average of Precision and Recall, serving as a comprehensive metric for assessing the classification efficacy of a model. This metric is particularly advantageous in scenarios characterized by imbalanced data distributions, as it effectively reconciles Precision and Recall.
F1-score = 2 × Precision × Recall / (Precision + Recall)
FPS quantifies the real-time processing capability of the model, reflecting the number of image frames that can be recognized within a one-second interval. This measure is intrinsically linked to the model’s responsiveness in practical applications.
FPS = 1 / t
Additionally, the variable t signifies the average duration required for the model to process a single frame of imagery.
mAP is a fundamental metric for evaluating the performance of object detection models. It averages the per-class average precision (AP) across all FHB severity levels, where each AP integrates the area under the Precision–Recall curve at various confidence thresholds. A higher mAP value signifies superior and more balanced detection performance across categories.
mAP = (Σ_{i=1}^{S} AP(i)) / S
The LEI is a comprehensive metric for measuring the utilization of computational resources by a model, which can directly reflect the performance output of the model under limited resources (memory/computational power).
LEI = log₁₀(FPS / Parameters)
The term Parameters refers to the total count of learnable parameters within a model, which reflects the theoretical complexity of the model and is a critical determinant of computational load and memory usage. Furthermore, the model size, MB, indicates the amount of storage space required for the model file on disk, thereby directly influencing the feasibility of deploying the model on devices with limited storage capacity.
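The metrics above can be computed directly from raw counts; mAP here is the simple mean of per-class AP values, matching the formula in the text:

```python
import math

def precision(tp, fp):
    return tp / (tp + fp) * 100

def recall(tp, fn):
    return tp / (tp + fn) * 100

def f1_score(tp, fp, fn):
    p, r = precision(tp, fp), recall(tp, fn)
    return 2 * p * r / (p + r)

def mean_average_precision(ap_per_class):
    return sum(ap_per_class) / len(ap_per_class)

def lei(fps, n_parameters):
    """Logarithmic efficiency index: log10(FPS / Parameters)."""
    return math.log10(fps / n_parameters)

print(f1_score(90, 10, 10))  # → 90.0 (precision == recall == 90)
```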

2.6. System Development

The lightweight detection model was deployed via a B/S architecture web platform using Python’s Flask framework. This lightweight backend handles API services while rendering frontend pages through a modern stack: Tailwind CSS for responsive design and JavaScript for client-server communication. The system comprises an FHB detection interface and a results dashboard.
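The two backend endpoints described in Section 3.3 can be sketched with a stdlib-only WSGI callable; the real system uses Flask, and `run_model` here is a hypothetical stand-in for MobileNetV2 inference:

```python
import json

def run_model(image_bytes):
    """Hypothetical stand-in for MobileNetV2 inference."""
    return {"severity_level": 1}

def application(environ, start_response):
    """Bare WSGI callable exposing the two API routes described above."""
    path = environ.get("PATH_INFO", "/")
    if path == "/api/detect":
        body = json.dumps(run_model(b"")).encode()
    elif path == "/api/heatmap":
        body = json.dumps({"heatmap": "pending"}).encode()
    else:
        start_response("404 Not Found", [("Content-Type", "text/plain")])
        return [b"not found"]
    start_response("200 OK", [("Content-Type", "application/json")])
    return [body]
```

Any WSGI server could host this callable; in the deployed system the same routes would be registered as Flask views instead.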

3. Results

3.1. Results of Data Processing

3.1.1. HSI Reflectivity Characteristics

The spectral curves of wheat FHB at various severity levels, as obtained through the methodology outlined in Section 2.2.2, are illustrated in Figure 4.
The spectral reflectance data distinctly demonstrate the variations within the 400–900 nm range in wheat samples with different FHB severity. Analysis within the green spectral region (500–600 nm) demonstrates that healthy plants display a distinct reflectance peak near 550 nm, corresponding to the green light peak. As disease severity escalates, the reflectance at this green peak generally diminishes. Interestingly, samples exhibiting the highest disease severity level 4 show a green peak reflectance reaching up to 0.36, substantially higher than that observed in level 1 samples. Within the red spectral range (650–680 nm), the characteristic chlorophyll-induced absorption trough progressively diminishes with increasing disease severity, resulting in a continuous rise in reflectance. The red edge region (680–750 nm) exhibits the most pronounced response, characterized by a marked increase in reflectance, a blue shift in the red edge position, and a reduction in the red edge slope. In the near-infrared region (750–900 nm), reflectance decreases concomitantly with the severity of damage, with level 1 samples showing the highest reflectance values.

3.1.2. HSI Data Preprocessing

Figure 5 compares the spectral reflectance data of Fusarium-infected wheat under different preprocessing methods.
As illustrated in Figure 5b, normalization constrains the spectral amplitudes to [0, 1], mitigating absolute reflection intensity disparities while highlighting spectral shapes and relative variation. SNV standardization, shown in Figure 5c, centers spectra at zero mean with unit variance, enhancing waveform similarity, compressing dynamic range, and enabling precise feature comparison through uniform scaling. MSC mitigates or eliminates spectral baseline shifts and tilts induced by physical factors of the sample, such as particle size, uneven distribution or surface scattering. The corrected spectral baseline appears flatter, and the overall shape of the spectrum more accurately reflects the true chemical composition of the sample, as depicted in Figure 5d, thereby enhancing the correlation between the spectrum and the target properties. SG filtering, presented in Figure 5e, suppresses high-frequency random noise, producing a smoother reflectance curve that retains the principal spectral features and significantly improves the signal-to-noise ratio (SNR).
The results of cross-validation using SVM are presented in Table 3, illustrating that the preprocessing techniques affect model performance differently. The SG filter yielded F1-scores of 95.12% and 91.06% on the training and testing datasets, respectively. The original spectral data and normalization achieved Precision rates of 95.54% and 97.81% on the training set; however, their performance deteriorated markedly on the testing set. SNV preprocessing attained an exceptional F1-score of 99.92% during training but dropped substantially to 93.25% on the testing data. Among the methods assessed, MSC demonstrated the highest testing-set F1-score, at 94.27%, with a Precision of 94.31%.

3.1.3. Selection of Characteristic Wavelengths

Following MSC preprocessing, spectral dimensionality reduction was implemented using four feature selection algorithms (UVE, RF, SPA and CARS) to identify minimal feature sets preserving classification efficacy. An initial SVM-based evaluation determined the optimal feature count for each algorithm, ranging from 1 to 20, with final performance metrics detailed in Table 4. UVE demonstrated superior overall performance and generalization capability, achieving a test set F1-score of 94.75%, 0.48 percentage points above that of the original dataset. Furthermore, the disparity between the training and test sets was a mere 4.22 percentage points, indicating a minimal risk of overfitting. Notably, UVE accomplished an 81% feature reduction with only 11 features, thereby significantly enhancing model efficiency.
In contrast, RF exhibited a tendency towards overfitting, evidenced by a substantial 5.82 percentage point gap in precision between the training and test sets. SPA algorithm yielded a test set precision of only 92.78%, which was inferior to that of the original data. Meanwhile, the F1-score for the CARS test set was recorded at 93.75%, reflecting a balanced performance, albeit not as high as that of UVE.
Figure 6 illustrates the characteristic spectral bands selected by each feature selection algorithm. The UVE algorithm selects 11 spectral bands from a wide range spanning 407 to 1043 nm, representing an 81% reduction in dimensionality. Its selection predominantly targets the chlorophyll-sensitive region between 590 and 680 nm, the characteristic cell wall peak at 738 nm, and the moisture-responsive band from 930 to 1043 nm. In contrast, the RF algorithm concentrates primarily on the near-infrared region between 663 and 819 nm, with the majority of its 20 selected bands located within this interval, except for a single band at 540 nm in the visible spectrum. The SPA algorithm identifies 16 bands that are mainly dispersed across the 453 to 700 nm range, encompassing the transition from visible to near-infrared light. Meanwhile, the CARS algorithm extracts 14 bands covering 372 to 962 nm, with seven bands situated in the near-infrared segment from 703 to 962 nm and others scattered within the visible range of 468 to 671 nm. Importantly, only the UVE algorithm concurrently captures critical physiological response bands associated with chlorophyll degradation (590–680 nm), water stress (above 930 nm), and cellular structural alterations (738 nm).

3.2. Comparison of Different Detection Models

Following the feature extraction of HSI data using UVE, three representative lightweight models were comparatively evaluated: MobileNetV2, EfficientNet-B0, and ShuffleNetV2. The training process illustrated in Figure 7 reveals that ShuffleNetV2 achieved the fastest convergence, while MobileNetV2 and EfficientNet-B0 exhibited comparable convergence rates. EfficientNet-B0, however, showed a more stable training trajectory, albeit requiring the most training epochs.
To assess performance, the datasets were partitioned into training, validation and test sets at a 6:2:2 ratio. This partitioning was randomly executed ten times, with each split used independently for model training and evaluation; the reported metrics are the means over these ten iterations. The performance metrics for MobileNetV2, EfficientNet-B0 and ShuffleNetV2 are presented in Table 5. All models performed exceptionally on the training set, with mAP, F1-score, and Recall surpassing 99%. On the test set, EfficientNet-B0 achieved a Precision of 99.47%, MobileNetV2 attained 98.26%, and ShuffleNetV2 reached 96.36%. Concerning model size, ShuffleNetV2 occupied 4.80 MB, MobileNetV2 8.50 MB, and EfficientNet-B0 15.30 MB. In terms of processing speed, ShuffleNetV2 ran at 1178 FPS, MobileNetV2 at 310 FPS, and EfficientNet-B0 at 131 FPS.
Figure 8 presents the test set confusion matrices for each model. All three models demonstrate exceptional accuracy in identifying Level 1 disease. Crucially, MobileNetV2 and EfficientNet-B0 show comparable recognition for Levels 3 and 4, while ShuffleNetV2 performs comparatively worse on Levels 2, 3 and 4. In summary, field-deployable solutions must balance detection accuracy against hardware constraints. Although MobileNetV2 shows a marginal 1.21% precision deficit versus EfficientNet-B0, its parameter count is only 55.5% of EfficientNet-B0’s, and it achieves a frame rate 2.36 times greater. This accuracy-efficiency trade-off makes MobileNetV2 well suited for resource-limited field detection equipment.

3.3. Model Deployment

The MobileNetV2-based lightweight disease detection system was implemented as a B/S architecture web platform, as illustrated in Figure 9, enabling cross-platform accessibility. Developed with Python’s Flask framework, this lightweight solution handles both backend APIs and frontend rendering. At the backend, Flask provides two primary API interfaces. The first is dedicated to model inference: it accepts uploaded image data, invokes the lightweight detection model to assess disease severity, and returns the detection results. The second generates heat maps of disease characteristics from the model’s output, visually representing the distribution and severity of the disease. Additionally, Flask’s multithreading improves concurrent request handling. The frontend is built on a contemporary web technology stack and comprises three functional pages. The disease detection homepage features a hierarchical design built with Tailwind CSS, incorporating a navigation bar, a drag-and-drop file upload area, and a system description example. The disease severity identification page interacts with the backend API via JavaScript, displays a dynamic loading animation during processing, and employs Chart.js to create quantitative analysis charts conveying information such as disease type once results arrive. The heatmap visualization page renders the disease heat map with Canvas, using color gradients to indicate disease severity. For interaction logic, the frontend leverages JavaScript and AJAX to enable file uploads without page refreshes and incorporates smooth transition animations in the results display area. The system is designed responsively to accommodate various device sizes.
A blue color scheme is predominantly used to convey a professional and trustworthy image, complemented by orange to highlight disease areas.

4. Discussion

Effective control of FHB in wheat relies on early intervention, making rapid, non-destructive detection in the field crucial for safeguarding yield. Since implementing control measures at the earliest infection stage (Level 1) most effectively curbs disease spread, developing lightweight detection models suitable for field use is essential. While RGB imaging is commonly used, it primarily detects symptomatic color changes [39]. HSI, by contrast, captures discriminative spectral features that enable earlier detection of subtle symptoms. For instance, Zhou et al. [40] achieved 90.1% accuracy under field conditions using a model combining multiscale feature pyramids, quadtree refinement, and transformers; Diao et al. [41] selected 15 spectral bands with a lightweight network, reducing parameters tenfold with only a 3% accuracy decline in crop-weed discrimination. Almoujahed et al. [42] mapped DON contamination at 93.42% accuracy by combining HSI with severity data. These results demonstrate that feature selection enhanced with deep learning improves FHB identification. Nevertheless, subtle inter-class spectral variations across severity levels and mobile deployment remain key challenges for developing accurate, real-time, lightweight models.
Figure 4 shows that in the green spectrum (500–600 nm), healthy plants exhibit a reflection peak near 550 nm (green peak) due to low chlorophyll absorption [43]. With increasing disease severity, reflectance generally declines, indicating chlorophyll degradation and loss of photosynthetic capacity. Notably, Level 4 samples show higher green peak reflectance (~0.36) than Level 1, likely due to necrosis and cell lysis enhancing light scattering [44]. In the red light region (650–680 nm), the chlorophyll absorption valley weakens as the infection progresses, elevating reflectance alongside pigment loss [45]. The red-edge region (680–750 nm) is most responsive: a sharp reflectance rise, blue-shift [46], and slope decrease indicate accelerated chlorophyll loss, senescence, and structural damage [47]. Within the near-infrared region of 750–900 nm, reflectivity decreases with severity, mainly due to hyphal invasion, cell rupture, and tissue collapse, reducing internal scattering. Level 1 samples maintain high near-infrared reflectance, likely from stress responses like dehydration, cell wall thickening, and wax accumulation that enhance scattering. These samples also show minimal carotenoid alteration and structural disruption, leading to lower absorption and higher reflectance [48,49]. Reflectance variation around 970 nm requires evaluating both tissue dehydration and structural damage, as severe dehydration typically reduces near-infrared reflectivity [50].
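The red-edge behavior described above can be made concrete: the red-edge position is commonly estimated as the wavelength of the steepest reflectance rise within 680–750 nm, and a move of that position toward shorter wavelengths is the blue-shift associated with chlorophyll loss. The sketch below uses synthetic sigmoid curves in place of real spectra and is illustrative only, not the paper’s code.

```python
# Illustrative red-edge position estimate on synthetic spectra.
import numpy as np

def red_edge_position(wavelengths, reflectance):
    # Wavelength of the maximum first derivative within 680-750 nm.
    mask = (wavelengths >= 680) & (wavelengths <= 750)
    wl, r = wavelengths[mask], reflectance[mask]
    d = np.gradient(r, wl)          # first derivative of reflectance
    return wl[np.argmax(d)]         # steepest-slope wavelength

wl = np.arange(400, 1000, 1.0)
# Synthetic sigmoids: a healthy-like curve with its inflection near 720 nm
# and a diseased-like curve shifted to ~705 nm with a lower NIR plateau.
healthy  = 0.05 + 0.45 / (1 + np.exp(-(wl - 720) / 10))
diseased = 0.05 + 0.30 / (1 + np.exp(-(wl - 705) / 12))

# The diseased curve's red edge sits at a shorter wavelength (blue-shift).
assert red_edge_position(wl, diseased) < red_edge_position(wl, healthy)
```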
In summary, the spectral response to FHB infection in wheat includes increased red edge reflectance, blue shift, slope decrease, and significant variations in near-infrared reflectance and Level 1 reflectance. These reflect disease impacts on physiology and biochemistry, such as chlorophyll loss, cellular damage, water change, and stress responses. These responses provide a critical basis for precisely identifying sensitive spectral bands to enable early, non-destructive detection and classification of FHB.
As presented in Table 3, the SG method limited the train–test precision gap to 3.96 percentage points, yet its overall precision and F1-score were constrained by excessive suppression of high-frequency features. Both the raw spectra and normalization showed high training precision but test declines of 5.38 and 7.61 percentage points, respectively, indicating clear overfitting. Although SNV achieved the highest training precision, its generalization was limited, with a 6.60-point precision drop. In contrast, MSC produced the most balanced outcome, with the smallest precision gap (4.68 points) between training and test sets. By compensating for light scattering, MSC improved spectral interpretability and generalization. It was therefore selected as the optimal method for its robustness and accuracy, reducing overfitting while ensuring stability for subsequent feature selection.
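MSC as used here follows its common formulation: each spectrum is regressed against the mean spectrum of the set, and the fitted offset and gain (additive and multiplicative scatter) are removed. The NumPy sketch below is a generic implementation of that formulation, not the authors’ code.

```python
# Generic multiplicative scatter correction (MSC) sketch.
import numpy as np

def msc(spectra):
    """spectra: (n_samples, n_bands) array; returns the corrected array."""
    ref = spectra.mean(axis=0)                    # reference = mean spectrum
    out = np.empty_like(spectra, dtype=float)
    for i, x in enumerate(spectra):
        slope, intercept = np.polyfit(ref, x, 1)  # fit x ~ slope*ref + intercept
        out[i] = (x - intercept) / slope          # remove offset and gain
    return out

# Toy check: spectra differing from the reference only by scatter
# (gain and offset) collapse onto the same corrected curve.
base = np.linspace(0.1, 0.9, 50)
spectra = np.vstack([1.2 * base + 0.05, 0.8 * base - 0.02])
corrected = msc(spectra)
assert np.allclose(corrected[0], corrected[1])
```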
As shown in Table 4 and Figure 6, the UVE algorithm selects features based on regression coefficient stability: it compares the coefficients of real bands against those of artificial noise variables to set a threshold that removes unstable or weakly informative bands, improving robustness on high-dimensional, collinear spectral data and suppressing noise effectively. UVE thus identifies key bands, spanning 407 to 1043 nm, linked to critical physiological processes: chlorophyll degradation (590–680 nm), moisture stress (930–1043 nm), and cell wall polysaccharide breakdown (near 738 nm). Notably, these insights were achieved with very few features.
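The stability-based selection UVE performs can be sketched as follows. This is a simplified illustration, not the authors’ implementation: it uses ordinary least squares on random half-samples where UVE typically uses PLS regression with leave-one-out jackknifing, but the core idea is the same — append artificial noise bands and discard any real band whose coefficient stability does not beat the best noise band.

```python
# Simplified UVE-style band selection (OLS on random half-samples).
import numpy as np

def uve_select(X, y, n_noise=20, n_splits=50, seed=0):
    rng = np.random.default_rng(seed)
    n, p = X.shape
    Xa = np.hstack([X, rng.normal(size=(n, n_noise))])  # append noise "bands"
    coefs = []
    for _ in range(n_splits):               # coefficient sets on random halves
        idx = rng.choice(n, size=n // 2, replace=False)
        b, *_ = np.linalg.lstsq(Xa[idx], y[idx], rcond=None)
        coefs.append(b)
    coefs = np.array(coefs)
    # Stability = |mean/std| of each coefficient across the resamples.
    stability = np.abs(coefs.mean(axis=0) / (coefs.std(axis=0) + 1e-12))
    cutoff = stability[p:].max()            # worst-case stability of pure noise
    return np.where(stability[:p] > cutoff)[0]

# Toy data: 8 mock "bands", only bands 2 and 5 carry signal.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 8))
y = 3 * X[:, 2] - 2 * X[:, 5] + rng.normal(0, 0.5, 200)
selected = uve_select(X, y)
assert 2 in selected and 5 in selected      # informative bands survive
```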
This study combines MSC preprocessing with UVE for feature selection, effectively reducing HSI dimensionality and retaining key FHB-related information. As shown in Figure 6, RF and CARS concentrate too narrowly on limited spectral ranges, risking loss of critical pathological information. Furthermore, UVE uses a noise-based stability assessment to improve robustness against interference, making it better for interpreting the biological basis of disease-induced spectral changes while keeping the model simple. By removing redundancy, it optimally balances feature reduction and generalization, making it the recommended choice.
Based on the identified FHB-sensitive bands, a lightweight detection model was developed. As shown in Table 5, all three models performed strongly on the training set, with high overall precision, balanced metrics, and detection rates. On the test set, however, EfficientNet-B0 achieved the highest precision, followed by MobileNetV2, with ShuffleNetV2 trailing notably. In terms of computational efficiency, ShuffleNetV2 was the lightest and fastest owing to its minimal parameter count, MobileNetV2 offered a middle ground, and EfficientNet-B0 demanded the most resources. The LEI scores reflect this trade-off: ShuffleNetV2 scored 2.39, MobileNetV2 1.56, and EfficientNet-B0 0.93. Accordingly, EfficientNet-B0 is recommended for high-precision tasks such as lab-based physiological analysis, while ShuffleNetV2 is ideal for resource-constrained or latency-sensitive environments (e.g., microcontrollers and the Raspberry Pi Zero). MobileNetV2 strikes the best balance between accuracy and speed, making it suitable for real-time FHB monitoring under limited computational budgets, such as smartphone applications and embedded systems; it was therefore selected as the optimal architecture.
An interactive and user-friendly visualization system was developed based on MobileNetV2, featuring a simple interface and intuitive result presentation. This system offers an effective solution for assessing FHB severity. While this model enables non-destructive and accurate FHB detection, its performance under diverse field conditions remains limited. Future work should focus on domain adaptation, synthetic data generation using generative adversarial networks (GANs), and hybrid training strategies combining lab and field data to improve robustness.

5. Conclusions

This study developed a practical framework for rapid and accurate assessment of FHB severity by selecting informative hyperspectral features, constructing a lightweight model, and implementing an accessible visualization system, thereby supporting timely disease management. The key findings and conclusions are as follows:
  • Comparison of preprocessing methods revealed significant variation in discrimination ability across severity levels within the 370–1100 nm range. The combined MSC-UVE approach performed best, identifying 11 characteristic bands with improved discrimination. FHB spectral features are mainly concentrated in regions linked to chlorophyll degradation (590–680 nm), water stress (930–1043 nm), and cell wall breakdown (approximately 738 nm), reflecting chlorophyll loss, photosystem II damage, and structural changes. Using these features, a MobileNetV2 model reached 99.93% mAP in training and 98.26% precision on test data, balancing high accuracy with operational efficiency at a compact size of 8.50 MB.
  • Nevertheless, lab-developed models show limited robustness in complex field environments with varying light, background, and cultivars. Further study of subtle early FHB traits and accurate selection of representative spectral bands is needed to improve the model’s usefulness in early precision management. To reduce the high cost of HSI technology, data collection can be optimized by focusing on key growth and stress stages. Alternatively, lower-cost multispectral systems with custom filters based on HSI-identified bands can be adopted, significantly cutting hardware cost while preserving critical information.

Author Contributions

X.L.: Writing—Original Draft, Methodology, and Investigation; S.Y.: Software; H.S.: Data collation and annotation; L.M.: Review and Editing; Z.Y.: Paper Revising; X.C.: Supervision. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Natural Science Foundation Project (32201662) and the Key Research and Development Plan Project of Shaanxi Province (2025NC-YBXM-215).

Acknowledgments

The authors would like to thank Yang Yang from the College of Plant Protection of NWAFU for invaluable assistance during the wheat inoculation experiments.

Conflicts of Interest

The authors declare no conflicts of interest. The funder had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

References

  1. Wegulo, N.S.; Baenziger, S.P.; Nopsa, H.J.; Bockus, W.W.; Adams, H.H. Management of Fusarium head blight of wheat and barley. Crop Prot. 2015, 73, 100–107. [Google Scholar] [CrossRef]
  2. Mesterhazy, A. What is Fusarium Head Blight (FHB) Resistance and What Are Its Food Safety Risks in Wheat? Problems and Solutions—A Review. Toxins 2024, 16, 31. [Google Scholar] [CrossRef]
  3. Shang, S.; He, S.; Zhao, R.; Li, H.; Fang, Y.; Hu, Q. Fumarylacetoacetate hydrolase targeted by a Fusarium graminearum effector positively regulates wheat FHB resistance. Nat. Commun. 2025, 16, 5582. [Google Scholar] [CrossRef]
  4. Francis, F.; Florian, R.; Tarek, A.; Thierry, L.; Ludovic, B. Searching for FHB Resistances in Bread Wheat: Susceptibility at the Crossroad. Front. Plant Sci. 2020, 11, 731. [Google Scholar] [CrossRef]
  5. Li, X.; Zhao, L.; Fan, Y.; Jia, Y.; Sun, L.; Ma, S.; Ji, C.; Ma, Q.; Zhang, J. Occurrence of mycotoxins in feed ingredients and complete feeds obtained from the Beijing region of China. Anim. Sci. Biotechnol. 2014, 5, 37. [Google Scholar] [CrossRef]
  6. Bruce, D.; William, W.W. Risk premiums due to Fusarium Head Blight (FHB) in wheat and barley. Agric. Syst. 2018, 162, 145–153. [Google Scholar] [CrossRef]
  7. Alikarami, M.; Saremi, H. Fusarium Head Blight management with nanotechnology: Advances and future prospects. Physiol. Mol. Plant Pathol. 2025, 139, 102782. [Google Scholar] [CrossRef]
  8. Zhang, J.; Sun, S.; Guo, Y.; Ren, F.; Sheng, G.; Wu, H.; Zhao, B.; Cai, Y.; Gu, C.; Duan, Y. Risk assessment and resistant mechanism of Fusarium graminearum to fluopyram. Pestic. Biochem. Physiol. 2025, 212, 106449. [Google Scholar] [CrossRef]
  9. Chen, B.; Shen, X.; Li, Z.; Wang, J.; Li, X.; Xu, Z.; Shen, Y.; Lei, Y.; Huang, X.; Wang, X.; et al. Antibody generation and rapid immunochromatography using time-resolved fluorescence microspheres for propiconazole: Fungicide abused as growth regulator in vegetable. Foods 2022, 11, 324. [Google Scholar] [CrossRef] [PubMed]
  10. Dominik, R.; Lukas, P.; Ludwig, R.; Anja, H.; Daniel, C.; Ole, P.N.; Torstem, S. Efficient Noninvasive FHB Estimation using RGB Images from a Novel Multiyear, Multirater Dataset. Plant Phenomics 2023, 5, 0068. [Google Scholar] [CrossRef] [PubMed]
  11. Qiu, M.; Zheng, S.; Tang, L.; Hu, X.; Xu, Q.; Zheng, L.; Weng, S. Raman Spectroscopy and Improved Inception Network for Determination of FHB-Infected Wheat Kernels. Foods 2022, 11, 578. [Google Scholar] [CrossRef] [PubMed]
  12. Ba, W.; Jin, X.; Lu, J.; Rao, Y.; Zhang, T.; Zhang, X.; Zhou, J.; Li, S. Research on predicting early Fusarium head blight with asymptomatic wheat grains by micro-near infrared spectrometer. Spectrochim. Acta Part A Mol. Biomol. Spectrosc. 2023, 287 Pt 1, 122047. [Google Scholar] [CrossRef]
  13. Liang, K.; Ren, Z.; Song, J.; Yuan, R.; Zhang, Q. Wheat FHB resistance assessment using hyperspectral feature band image fusion and deep learning. Int. J. Agric. Biol. Eng. 2024, 17, 240–249. [Google Scholar] [CrossRef]
  14. Song, A.; Guo, X.; Wen, W.; Wang, C.; Gu, S.; Chen, X.; Wang, J.; Zhao, C. Improving accuracy and generalization in single kernel oil characteristics prediction in maize using NIR-HSI and a knowledge-injected spectral tabtransformer. Artif. Intell. Agric. 2025, 15, 802–815. [Google Scholar] [CrossRef]
  15. Guerri, M.F.; Distante, C.; Spagnolo, P.; Ahmed, T.A. Boosting hyperspectral image classification with Gate-Shift-Fuse mechanisms in a novel CNN-Transformer approach. Comput. Electron. Agric. 2025, 237, 110489. [Google Scholar] [CrossRef]
  16. Li, S.; Sun, L.; Jin, X.; Feng, G.; Zhang, L.; Bai, H.; Wang, Z. Research on variety identification of common bean seeds based on hyperspectral and deep learning. Spectrochim. Acta Part A Mol. Biomol. Spectrosc. 2025, 326, 125212. [Google Scholar] [CrossRef]
  17. Sylvain, L.; Olivier, D.C.; Benoît, C.; Vera, G.; Delphine, G.; Eric, M.; Dominique, V.D.S.; Fabrice, R.; Olivier, B. Phylogeny and Sequence Space: A Combined Approach to Analyze the Evolutionary Trajectories of Homologous Proteins. The Case Study of Aminodeoxychorismate Synthase. Acta Biotheor. 2020, 68, 139–156. [Google Scholar] [CrossRef]
  18. Benoît, C.; Laurent, V.; Sylvain, L.; Denys, D. MING: An interpretative support method for visual exploration of multidimensional data. Inf. Vis. 2022, 21, 246–269. [Google Scholar] [CrossRef]
  19. Colange, B.; Vuillon, L.; Lespinats, S.; Dutykh, D. Interpreting Distortions in Dimensionality Reduction by Superimposing Neighbourhood Graphs. In Proceedings of the IEEE Visualization Conference (VIS), Vancouver, BC, Canada, 20–25 October 2019; pp. 211–215. [Google Scholar] [CrossRef]
  20. Colange, B.; Peltonen, J.; Aupetit, M.; Dutykh, D.; Lespinats, S. Steering distortions to preserve classes and neighbors in supervised dimensionality reduction. In Proceedings of the 34th International Conference on Neural Information Processing Systems (NIPS ‘20), Vancouver, BC, Canada, 6–12 December 2020; Volume 1108, pp. 13214–13225. [Google Scholar]
  21. Geoffroy, H.; Berger, J.; Colange, B.; Lespinats, S.; Dutykh, D. The use of dimensionality reduction techniques for fault detection and diagnosis in a AHU unit: Critical assessment of its reliability. J. Build. Perform. Simul. 2023, 16, 249–267. [Google Scholar] [CrossRef]
  22. Chardome, G.; Feldheim, V. Thermal Modelling of Earth Air Heat Exchanger (EAHE) and Analyse of Health Risk. In Proceedings of the Building Simulation 2019: 16th Conference of IBPSA, Rome, Italy, 2–4 September 2019; Volume 16, pp. 1964–1970. [Google Scholar] [CrossRef]
  23. Yu, J.; Bai, G.; Cai, S.; Ban, T. Marker-assisted characterization of Asian wheat lines for resistance to Fusarium head blight. Theor. Appl. Genet. 2006, 113, 308–320. [Google Scholar] [CrossRef] [PubMed]
  24. GB/T 15796-2011; Ministry of Agriculture of the People’s Republic of China. Rules for Monitoring and Forecast of the Wheat Head Blight (Fusarium graminearum Schw./Gibberella zeae (Schw.) Petch). China Standards Press: Beijing, China, 2011.
  25. Plaza, A.; Benediktsson, J.A.; Boardman, W.J.; Brazile, J.; Bruzzone, L.; Valls, C.G.; Chanussot, J.; Fauvel, M.; Gamba, P.; Gualtieri, A.; et al. Recent advances in techniques for hyperspectral image processing. Remote Sens. Environ. 2007, 113 (Suppl. 1), S110–S122. [Google Scholar] [CrossRef]
  26. Gai, Z.; Sun, L.; Bai, H.; Li, X.; Wang, J.; Bai, S. Convolutional neural network for apple bruise detection based on hyperspectral. Spectrochim. Acta Part A: Mol. Biomol. Spectrosc. 2022, 279, 121432. [Google Scholar] [CrossRef]
  27. He, Y.; Zhao, Y.; Xu, J.; Zhou, D.; Shi, W.; Wang, Y.; Wang, Y.; Wang, X.; Zhang, M.; Kang, N.; et al. Hyperspectral Imaging for Benign and Malignant Diagnosis of Breast Tumors. J. Biophotonics 2025, e202500188. [Google Scholar] [CrossRef] [PubMed]
  28. Jiang, X.; Cao, X.; Liu, Q.; Wang, F.; Fan, S.; Yan, L.; Wei, Y.; Chen, Y.; Yang, G.; Xu, B.; et al. Prediction of multi-task physicochemical indices based on hyperspectral imaging and analysis of the relationship between physicochemical composition and sensory quality of tea. Food Res. Int. 2025, 211, 116455. [Google Scholar] [CrossRef] [PubMed]
  29. Chang, H.; Meng, Q.; Wu, Z.; Tang, L.; Qiu, Z.; Ni, C.; Chu, J.; Fang, J.; Huang, Y.; Li, Y. Accurate ripening stage classification of pineapple based on visible and near-infrared hyperspectral imaging system. J. AOAC Int. 2025, 108, 293–303. [Google Scholar] [CrossRef]
  30. Yuan, M.; Ding, L.; Bai, R.; Yang, J.; Zhan, Z.; Zhao, Z.; Hu, Q.; Huang, L. Feature-level hyperspectral data fusion with CNN modeling for non-destructive authentication of “Weilian” from different origins. Microchem. J. 2025, 215, 114201. [Google Scholar] [CrossRef]
  31. Zhang, M.; Tang, S.; Lin, C.; Lin, Z.; Zhang, L.; Dong, W.; Zhong, N. Hyperspectral Imaging and Machine Learning for Diagnosing Rice Bacterial Blight Symptoms Caused by Xanthomonas oryzae pv. oryzae, Pantoea ananatis and Enterobacter asburiae. Plants 2025, 14, 733. [Google Scholar] [CrossRef]
  32. Wei, X.; Deng, C.; Fang, W.; Xie, C.; Liu, S.; Lu, M.; Wang, F.; Wang, Y. Classification method for folded flue-cured tobacco based on hyperspectral imaging and conventional neural networks. Ind. Crops Prod. 2024, 212, 118279. [Google Scholar] [CrossRef]
  33. Tyagi, N.; Porwal, S.; Singh, P.; Raman, B.; Garg, N. Nondestructive Identification of Wheat Species using Deep Convolutional Networks with Oversampling Strategies on Near-Infrared Hyperspectral Imagery. J. Nondestruct. Eval. 2024, 44, 5. [Google Scholar] [CrossRef]
  34. Li, Y.; Zhao, B.; Li, S.; Yang, X.; Yu, M.; Li, Z. Multimodal Diagnostic Approach for Osteosarcoma and Bone Callus Using Hyperspectral Imaging and Deep Learning. J. Biophotonics 2025, 18, e202500087. [Google Scholar] [CrossRef]
  35. Liu, B.; Gao, K.; Yu, A.; Ding, L.; Qiu, C.; Li, J. ES2FL: Ensemble Self-Supervised Feature Learning for Small Sample Classification of Hyperspectral Images. Remote Sens. 2022, 14, 4236. [Google Scholar] [CrossRef]
  36. Li, X.; Peng, F.; Wei, Z.; Han, G. Identification of yellow vein clearing disease in lemons based on hyperspectral imaging and deep learning. Front. Plant Sci. 2025, 16, 1554514. [Google Scholar] [CrossRef]
  37. Mao, R.; Wang, Z.; Li, F.; Zhou, J.; Chen, Y.; Hu, X. GSEYOLOX-s: An Improved Lightweight Network for Identifying the Severity of Wheat Fusarium Head Blight. Agronomy 2023, 13, 242. [Google Scholar] [CrossRef]
  38. Khandagale, P.H.; Patil, T.S.; Gavali, S.V.; Manjrekar, A.A.; Halkarnikar, P.P. Enhancing fruit disease classification with an advanced 3D shallow deep neural network for precise and efficient identification. Expert Syst. Appl. 2025, 293, 128559. [Google Scholar] [CrossRef]
  39. Kurmi, Y.; Saxena, P.; Kirar, S.B.; Gangwar, S.; Chaurasia, V.; Goel, A. Deep CNN model for crops’ diseases detection using leaf images. Multidimens. Syst. Signal Process. 2022, 33, 981–1000. [Google Scholar] [CrossRef]
  40. Zhou, Q.; Huang, Z.; Liu, L.; Wang, F.; Teng, Y.; Liu, H.; Zhang, Y.; Wang, R. High-throughput spike detection and refined segmentation for wheat Fusarium head blight in complex field environments. Comput. Electron. Agric. 2024, 227, 109552. [Google Scholar] [CrossRef]
  41. Diao, Z.; Yan, J.; He, Z.; Zhao, S.; Guo, P. Corn seedling recognition algorithm based on hyperspectral image and lightweight-3D-CNN. Comput. Electron. Agric. 2022, 201, 107343. [Google Scholar] [CrossRef]
  42. Almoujahed, M.B.; Apolo, O.E.A.; Alhussein, M.; Kazlauskas, M.; Kriaučiūnienė, Z.; Šarauskis, E.; Mouazen, M.A. Deoxynivalenol prediction and spatial mapping in wheat based on online hyperspectral imagery scanning. Smart Agric. Technol. 2025, 11, 100947. [Google Scholar] [CrossRef]
  43. Mahlein, K.A.; Oerke, C.E.; Steiner, U.; Dehne, W.H. Recent advances in sensing plant diseases for precision crop protection. Eur. J. Plant Pathol. 2012, 133, 197–209. [Google Scholar] [CrossRef]
  44. Carter, G.A.; Knapp, A.K. Leaf optical properties in higher plants: Linking spectral characteristics to stress and chlorophyll concentration. Am. J. Bot. 2001, 88, 677–684. [Google Scholar] [CrossRef]
  45. Blackburn, A.G. Hyperspectral remote sensing of plant pigments. J. Exp. Bot. 2006, 58, 855–867. [Google Scholar] [CrossRef]
  46. Zhu, L.; Chen, Z.; Wang, J.; Ding, J.; Yu, Y.; Li, J.; Xiao, N.; Jiang, L.; Zheng, Y.; Rimmington, M.G. Monitoring plant response to phenanthrene using the red edge of canopy hyperspectral reflectance. Mar. Pollut. Bull. 2014, 86, 332–341. [Google Scholar] [CrossRef]
  47. Zhang, D.; Wang, Q.; Lin, F.; Yin, X.; Gu, C.; Qiao, H. Development and evaluation of a new spectral disease index to detect wheat Fusarium head blight using hyperspectral imaging. Sensors 2020, 20, 2260. [Google Scholar] [CrossRef] [PubMed]
  48. Weber, V.S.; Araus, J.L.; Cairns, J.E.; Sanchez, C.; Melchinger, A.E.; Orsini, E. Prediction of grain yield using reflectance spectra of canopy and leaves in maize plants grown under different water regimes. Field Crops Res. 2012, 128, 82–90. [Google Scholar] [CrossRef]
  49. Sankaran, S.; Mishra, A.; Reza, E.; Davis, C. A review of advanced techniques for detecting plant diseases. Comput. Electron. Agric. 2010, 72, 1–13. [Google Scholar] [CrossRef]
  50. Damien, V.; Damien, E.; Guillaume, J.; Anne, C.; Antonio, J.P.F.; François, S.; Vincent, B.; Benoît, M.; Philippe, V. Near infrared hyperspectral imaging method to assess Fusarium head blight infection on winter wheat ears. Microchem. J. 2023, 191, 108812. [Google Scholar] [CrossRef]
Figure 1. Overall flowchart of the study.
Figure 2. Phenotypic images of wheat FHB samples at different severity levels.
Figure 3. Diagram of HSI data collection and processing workflow.
Figure 4. Comparison of FHB reflectance curves at different levels.
Figure 5. Reflectance curve graphs after different pretreatment methods.
Figure 6. Feature distribution maps under different feature selection algorithms.
Figure 7. Model training process.
Figure 8. Confusion matrix of the model test set.
Figure 9. FHB detection system interface.
Table 1. FHB grading standards and sample distribution.

| FHB Level | Infection Area | Sample Size (Before Expansion) | Sample Size (After Expansion) |
|---|---|---|---|
| Level 1 | <1/4 | 123 | 372 |
| Level 2 | 1/4–1/2 | 196 | 392 |
| Level 3 | 1/2–3/4 | 448 | 448 |
| Level 4 | >3/4 | 224 | 448 |
Table 2. Hardware and software configuration.

| Configuration Item | Value |
|---|---|
| CPU | AMD Ryzen 9 7940H |
| Integrated GPU | AMD Radeon 780M Graphics |
| Dedicated GPU | NVIDIA GeForce RTX 4060 Laptop GPU |
| System Memory (RAM) | 16 GB |
| CUDA Toolkit Version | 12.4 |
| Operating System | Windows 10 |
| Deep Learning Framework | PyTorch 2.4.1 |
Table 3. Results of different preprocessing algorithms.

| Pretreatment | Train Precision/% | Train F1-Score/% | Test Precision/% | Test F1-Score/% | Precision Discrepancy/% |
|---|---|---|---|---|---|
| Original | 95.54 | 95.53 | 90.16 | 90.09 | 5.38 |
| Normalized | 97.81 | 97.81 | 90.20 | 90.09 | 7.61 |
| SNV | 99.92 | 99.92 | 93.32 | 93.25 | 6.60 |
| MSC | 98.99 | 98.99 | 94.31 | 94.27 | 4.68 |
| SG | 95.12 | 95.12 | 91.16 | 91.06 | 3.96 |
Table 4. Results of different feature selection algorithms.

| Feature Selection | Train Precision/% | Train F1-Score/% | Test Precision/% | Test F1-Score/% | Precision Discrepancy/% | Feature Count |
|---|---|---|---|---|---|---|
| UVE | 98.97 | 98.97 | 94.88 | 94.75 | 4.09 | 11 |
| RF | 95.88 | 95.88 | 90.06 | 89.74 | 5.82 | 20 |
| SPA | 98.09 | 98.09 | 93.10 | 92.78 | 4.99 | 16 |
| CARS | 97.79 | 97.79 | 93.93 | 93.75 | 3.86 | 14 |
Table 5. Different model detection results.

| Methods | mAP (Train)/% | F1-Score (Train)/% | Recall (Train)/% | Precision (Train)/% | Precision (Validation)/% | Precision (Test)/% | Parameters (MB) | FPS | LEI |
|---|---|---|---|---|---|---|---|---|---|
| MobileNetV2 | 99.93 | 99.40 | 99.40 | 99.40 | 98.21 | 98.26 | 8.50 | 310 | 1.56 |
| EfficientNet-B0 | 99.99 | 99.97 | 99.97 | 99.97 | 99.55 | 99.47 | 15.30 | 131 | 0.93 |
| ShuffleNetV2 | 99.34 | 99.12 | 99.12 | 99.12 | 96.50 | 96.36 | 4.80 | 1178 | 2.39 |

Share and Cite

MDPI and ACS Style

Liang, X.; Yang, S.; Mu, L.; Shi, H.; Yao, Z.; Chen, X. A Non-Destructive System Using UVE Feature Selection and Lightweight Deep Learning to Assess Wheat Fusarium Head Blight Severity Levels. Agronomy 2025, 15, 2051. https://doi.org/10.3390/agronomy15092051

