Article

Monitoring Rice Blast Disease Progression Through the Fusion of Time-Series Hyperspectral Imaging and Deep Learning

1 Guangdong Provincial Key Laboratory of High Technology for Plant Protection, Plant Protection Research Institute, Guangdong Academy of Agricultural Sciences, Guangzhou 510640, China
2 College of Electronic Engineering & College of Artificial Intelligence, South China Agricultural University, Guangzhou 510642, China
3 Rice Research Institute, Guangdong Academy of Agricultural Sciences, Guangdong Key Laboratory of New Technology for Rice Breeding, Guangzhou 510640, China
4 School of Mechanical and Electrical Engineering, Guangzhou University, Guangzhou 510006, China
* Authors to whom correspondence should be addressed.
These authors contributed equally to this work.
Agronomy 2026, 16(1), 136; https://doi.org/10.3390/agronomy16010136
Submission received: 21 November 2025 / Revised: 24 December 2025 / Accepted: 29 December 2025 / Published: 5 January 2026

Abstract

Rice blast, caused by Magnaporthe oryzae, is a devastating disease that jeopardizes global rice production and food security. Precision agriculture demands timely and accurate monitoring tools to enable targeted intervention. This study introduces a novel deep learning framework that fuses time-series hyperspectral imaging with an advanced Autoformer model (AutoMSD) to dynamically track rice blast progression. The proposed AutoMSD model integrates multi-scale convolution and adaptive sequence decomposition, effectively decoding complex spatio-temporal patterns associated with disease development. When deployed on a 7-day hyperspectral dataset, AutoMSD achieved 86.67% prediction accuracy using only 3 days of historical data, surpassing conventional approaches. This accuracy at an early infection stage underscores the model’s strong potential for practical field deployment. Our work provides a scalable and robust decision-support tool that paves the way for site-specific disease management, reduced pesticide usage, and enhanced sustainability in rice cultivation systems.

1. Introduction

Rice (Oryza sativa) serves as a staple food for nearly half of the global population, and its production is crucial for ensuring global food security [1]. Rice blast, caused by the fungal pathogen Magnaporthe oryzae, is a devastating disease worldwide, affecting over 80 major rice-producing countries [2]. This disease can occur at any growth stage of rice and is classified into leaf blast, neck blast, and node blast based on the infection sites. Among these, neck blast can reduce yields by 30–50%, with severe cases leading to total crop loss [3]. With climate change and the widespread adoption of monoculture rice varieties, the frequency and severity of rice blast outbreaks have intensified, making it a major biotic stress factor threatening global food security [4,5]. To mitigate large-scale epidemics and minimize yield losses, there is an urgent need to establish accurate and efficient early detection and monitoring methods for rice blast.
In recent years, hyperspectral imaging (HSI) has demonstrated significant potential for rapid, non-destructive detection of plant diseases and has been widely applied in early diagnosis and pathogenesis monitoring of various plant diseases. For early detection and pathogenesis monitoring of wheat powdery mildew, HSI technology enables the acquisition of hyperspectral characteristics at different infection stages. Xuan et al. [6] effectively distinguished samples with different infection levels by combining Principal Component Analysis (PCA) with the Successive Projections Algorithm (SPA) for characteristic-wavelength extraction; the resulting partial least squares discriminant analysis model for wheat powdery mildew achieved an overall classification accuracy of 91.4% on the validation set. Deng et al. [7] confirmed that models integrating multiple vegetation indices outperformed those using a single index, enabling timely and precise monitoring and early warning of wheat stripe rust development. Liang et al. [8] developed an optimal lightweight support vector machine model combining Standard Normal Variate transformation and Linear Discriminant Analysis, which achieved accuracy rates of 96.05% and 94.71% on the test and training sets, respectively. HSI technology also shows extensive application potential in rice disease detection. For rice sheath blight monitoring, hyperspectral remote sensing was integrated with stepwise spectral feature selection methods to establish an early detection model with an overall accuracy of 87% [9]. In rice blast monitoring, researchers calculated whole-leaf reflectance using HSI technology and constructed SVM models to evaluate disease severity across multiple rice growth stages [10]. To achieve precise detection of rice blast occurrence, researchers developed a large-scale dataset incorporating leaf-scale and canopy-scale reflectance spectra as well as satellite imagery.
Through spectral index band optimization methods, critical spectral positions were identified. Using multi-year, multi-site field data collected at leaf, near-ground canopy, and satellite platforms, the study systematically evaluated the performance of rice blast indices (RIBIs) in estimating leaf blast severity at different scales [11].
In parallel with hyperspectral imaging techniques, deep learning-based approaches have introduced powerful object detection frameworks that show promise in agricultural applications. The YOLO (You Only Look Once) series of algorithms, renowned for their real-time processing capabilities, have been successfully applied to various crop disease detection tasks. These methods excel in identifying visible disease symptoms from RGB images with remarkable speed and accuracy, making them valuable tools for automated field scouting and rapid symptom assessment. However, while object detection methods demonstrate excellent performance in recognizing apparent disease manifestations, they face inherent challenges in capturing pre-symptomatic physiological changes and forecasting temporal disease progression patterns, particularly when dealing with subtle spectral variations that precede visual symptom development.
Time-series data analysis has been extensively applied in plant disease monitoring. Su et al. [12] conducted spatiotemporal monitoring of wheat yellow rust using time-series multispectral images. Hassan et al. [13] employed UAV-based time-series multispectral imagery combined with spectral indices to rapidly and accurately estimate senescence rates during wheat maturation. Kim et al. [14] integrated environmental conditions with time-series data and applied Long Short-Term Memory (LSTM) recurrent neural networks for early prediction of rice blast occurrence, achieving a disease prediction accuracy of 67.4%. Guo et al. [15] proposed B-ResNet50 for wheat stripe rust severity classification, achieving an average recognition accuracy of 97.3% for infected wheat leaves. Verma et al. [16] utilized video stream data analysis with LSTM-convolutional neural networks to predict rice diseases, demonstrating that prediction models based on fuzzy features yielded excellent forecasting performance. Guo et al. [17] developed an improved LSTM model incorporating multi-year meteorological data and disease survey records, which showed superior predictive capability for peanut leaf spot disease.
Beyond the widely used LSTM [18], Transformer models have garnered significant attention in time-series analysis following their remarkable success in natural language processing and computer vision. Their core strength in capturing temporal dependencies and interactions has translated into strong modeling capability, leading to breakthrough progress in various time-series applications [19,20]. Numerous Transformer-based time-series analysis models have been developed recently [21]. Autoformer, an enhanced time-series forecasting model based on the Transformer architecture, effectively handles time-series data with varying frequencies and periodicities through its Auto-Correlation mechanism and series decomposition [22]. FEDformer combines seasonal-trend decomposition with frequency-enhanced Transformers for flexible modeling of complex time-series data [23]. Pyraformer employs a low-complexity pyramidal attention mechanism to capture multi-scale temporal dependencies through multi-resolution representations, enabling efficient forecasting [24]. Informer improves prediction efficiency and accuracy through ProbSparse self-attention, self-attention distillation, and a generative decoder [25].
The existing literature reveals a complementary relationship between different technological approaches. While object detection methods like YOLO provide efficient solutions for real-time symptom identification, and traditional time-series analyses offer valuable insights into disease patterns, there remains a significant research gap in integrating temporal hyperspectral data with advanced deep learning architectures for predictive disease modeling. This study addresses this gap by proposing a novel framework that leverages the temporal dynamics of hyperspectral data through an enhanced Autoformer model, enabling prediction of disease development before visible symptoms emerge, thereby offering a complementary approach to existing detection methodologies.
To facilitate precise monitoring of rice blast disease progression and spatiotemporal dynamics, we collected hyperspectral data continuously for seven days from pathogen inoculation to full symptom development. Through systematic analysis of Magnaporthe oryzae-infected samples with varying disease severity under controlled greenhouse conditions, we acquired and analyzed multi-temporal hyperspectral data to extract spectral features characterizing disease progression. We proposed AutoMSD, an enhanced Autoformer-based temporal prediction network, to achieve early prediction and dynamic monitoring of rice blast. This study aimed to integrate hyperspectral technology with time-series analytical methods to develop an efficient and accurate dynamic prediction model for rice blast, addressing the critical need for large-scale, high-temporal-resolution monitoring with superior precision.

2. Materials and Methods

The experimental design and data processing workflow implemented in this study comprise three main modules: data acquisition, dimensionality reduction, and prediction. A detailed schematic of the workflow is provided in Figure 1.

2.1. Experimental Design

2.1.1. Rice Cultivation and Artificial Inoculation

Greenhouse experiments were carried out in 2023 at the experimental base of the Institute of Plant Protection, Guangdong Academy of Agricultural Sciences, Guangzhou, China. Two rice varieties were used: the susceptible cultivar Lijiang Xin Tuan Hei Gu (LTH), which carries no rice blast resistance genes, served as the control, and the resistant cultivar NIL-e1, developed by the Institute of Plant Protection, Guangdong Academy of Agricultural Sciences [26], was used for comparison. Rice seeds were germinated and sown in pots (12.2 cm diameter) with three holes per pot and five to six plants per hole (Figure S1). A total of 50 pots were prepared, including 40 pots of the susceptible variety and 10 pots of the resistant variety. Seedlings were fertilized with ammonium sulfate (Guangzhou Chemical Reagent Factory, Guangzhou, China) (0.5 g per pot) at the one-leaf one-heart stage, with a total of three fertilizer applications prior to inoculation. Inoculation was performed by artificial spraying when the seedlings reached the 3.5–4 leaf stage.
On 30 December 2023, inoculation was conducted using a spore suspension of the fungal isolates provided by the Institute of Plant Protection. Spores were eluted from infected corn kernels using sterile water and filtered through a double-layer fine mesh to remove debris. The spore concentration was adjusted to 1 × 10⁵ spores/mL before spray inoculation. After inoculation, the plants were kept in a dark incubation chamber at 25 °C and >95% relative humidity for 24 h. They were then transferred to a glass greenhouse maintained at 25–28 °C with high humidity to promote disease development [27]. Hyperspectral data were acquired from the leaves of the inoculated seedlings over seven consecutive days.

2.1.2. Definition of the Disease Infection Process

Disease severity was defined to characterize the progression of rice blast infection. The assessment was based on the seedling blast classification criteria established by the International Rice Research Institute (IRRI), as follows: Grade 0: no lesions; Grade 1: needle-point-sized lesions with diameter < 0.50 mm; Grade 2: small brown pinpoint lesions, 0.50 mm ≤ diameter < 1.00 mm; Grade 3: circular to elliptical gray lesions with brown edges, 1.00 mm ≤ diameter < 2.00 mm; Grade 4: elliptical or spindle-shaped lesions confined between two leaf veins, affecting <2% of leaf area; Grade 5: typical blast lesions affecting 2% to <10% of leaf area; Grade 6: typical blast lesions affecting 10% to <25% of leaf area; Grade 7: typical blast lesions affecting 25% to <50% of leaf area; Grade 8: typical blast lesions affecting 50% to <75% of leaf area; Grade 9: typical blast lesions affecting ≥75% of leaf area. Hyperspectral measurements were initiated on the day of inoculation and continued daily for the next six days. Each measurement was taken at 2:00 p.m., with leaves measured in a fixed numerical order to maintain consistent time intervals. Figure 2 illustrates the disease severity levels observed during the experiment; the highest severity recorded over the seven-day observation period was grade 7.

2.1.3. Experimental Setup

To investigate the hyperspectral dynamics during rice blast infection, 40 pots of susceptible rice cultivars (LTH) were cultivated to collect spectral data spanning from pre-symptomatic to advanced infection stages. Additionally, a control group consisting of ten pots of resistant rice variety (NIL-e1) was included to enhance the robustness and comparability of the study. All plants were grown under uniform environmental conditions and nutrient regimes to ensure the reliability of experimental results.
Due to the small leaf size of the seedlings, each pot contained three planting holes, which were systematically analyzed as individual sampling units. During spectral measurement, multiple leaves were gently flattened to ensure accurate and representative data acquisition. Reflectance spectra were collected at three positions along the leaves: the basal one-third, mid-point, and distal two-thirds, to comprehensively capture spectral characteristics. Leaf clips were used to stabilize the blades during measurement, ensuring consistency and precision. The obtained spectra from each plant were averaged to represent the mean reflectance of the sampling site, thereby reducing random measurement errors.

2.2. Data Collection

2.2.1. Acquisition of Leaf Reflectance Spectra

In this study, hyperspectral data of rice leaves were measured using an ASD FieldSpec4 (Analytical Spectral Devices, Inc., Boulder, CO, USA) portable ground spectrometer (350–2500 nm) equipped with a plant leaf clip. The ASD FieldSpec4 is a high-precision spectral measurement device widely used for measuring surface reflectance (Figure S2). The attached leaf clip includes a white reference panel with approximately 99% reflectance and a black background panel with less than 1% reflectance. With the halogen lamp activated, dark-current correction and optimization were performed using the white reference panel. Multiple leaves were flattened for hyperspectral measurements, and three reflectance spectra were collected from each rice hole at positions 1/3, 1/2, and 2/3 from the base of the rice leaves. After seven consecutive days of measurements, a total of 3150 spectral curves were obtained: 50 pots × 3 planting holes per pot × 3 measurement positions per hole × 7 days of consecutive measurement. All spectra collected from the same rice hole on the same day were averaged, so each hole yielded one representative spectrum per day, or 7 averaged spectral curves over the measurement period.

2.2.2. Dataset Construction

In this study, a total of 50 pot-grown rice plants were cultivated, with three planting holes per pot. Hyperspectral data were acquired daily from the top, middle, and bottom positions of each hole. The three measurements per hole were averaged to obtain a representative spectrum for the respective rice plant. Data collection was conducted over a continuous seven-day period, yielding time-series hyperspectral data for 150 rice holes. Concurrently, the disease severity of each rice plant was assessed and classified on a daily basis according to the seedling blast grading standards established by the International Rice Research Institute (IRRI).
Each rice hole’s seven-day hyperspectral data were treated as one sample, resulting in a total of 150 samples. To reduce data dimensionality, two feature selection methods were applied: Random Forest (RF) [28] and Pearson correlation analysis [29]. The dataset was then randomly split into training and testing subsets at an 8:2 ratio. The training set, consisting of 120 samples with complete 7-day spectral profiles, was used for model training and hyperparameter tuning. The remaining 30 samples were allocated to the test set for independent evaluation of the model’s generalization performance.

2.3. Model Development

2.3.1. Dimensionality Reduction in Spectral Data

The high dimensionality and strong inter-band correlation of hyperspectral data often result in substantial redundancy. Direct use of raw spectral data for modeling imposes significant computational demands and increases the risk of model overfitting [30]. Therefore, dimensionality reduction is essential to mitigate these issues. This process not only reduces data complexity but also enhances the extraction of salient features, thereby streamlining model development and improving generalization performance. In this study, a two-stage strategy was adopted for data reduction and feature extraction. Initially, spectral preprocessing was applied to the data. Subsequently, the Pearson correlation coefficient and random forest (RF) algorithm were employed for further data compression and feature selection, respectively.
The Pearson correlation coefficient, also referred to as the Pearson product-moment correlation coefficient, is a widely used statistical measure for assessing the strength and direction of the linear relationship between two continuous variables [30]. Compared to other correlation measures, it offers greater stability and is less influenced by sample size in the absence of outliers. The coefficient is calculated as shown in Equation (1), where r ranges from −1 to 1: a value of 1 denotes a perfect positive linear relationship, −1 a perfect negative linear relationship, and 0 no linear correlation.
$$r = \frac{\sum_{i=1}^{n}(x_i-\bar{x})(y_i-\bar{y})}{\sqrt{\sum_{i=1}^{n}(x_i-\bar{x})^2}\,\sqrt{\sum_{i=1}^{n}(y_i-\bar{y})^2}} \tag{1}$$
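The band-selection rule built on Equation (1) can be sketched as follows; `pearson_band_selection`, `spectra`, and `severity` are illustrative names rather than the authors' code, and the vectorized evaluation over all bands is an assumption about implementation, not the paper's.

```python
import numpy as np

def pearson_band_selection(spectra, severity, threshold=0.5):
    """Select spectral bands whose Pearson r with disease severity exceeds
    `threshold`, i.e. Equation (1) evaluated for every band at once.
    spectra: (n_samples, n_bands) reflectance matrix.
    severity: (n_samples,) vector of disease grades."""
    x = spectra - spectra.mean(axis=0)            # centre each band
    y = severity - severity.mean()                # centre the response
    num = x.T @ y                                 # numerator of Eq. (1), per band
    den = np.sqrt((x ** 2).sum(axis=0) * (y ** 2).sum())
    r = num / den
    return np.flatnonzero(r > threshold), r       # r > 0.5 / 0.6 as in Sect. 3.1.1
```

With a threshold of 0.5 or 0.6 this reproduces the characteristic-band rule described in Section 3.1.1.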
The Random Forest (RF) algorithm provides an effective approach for feature extraction, facilitating data reduction and enhancing model optimization [31]. Feature importance in RF is typically assessed through a permutation-based method applied to out-of-bag (OOB) samples. Specifically, the importance of each feature is quantified by the increase in the OOB error after randomly permuting the values of that feature, while keeping others unchanged [32]. In this study, each spectral dataset covers the wavelength range from 350 nm to 2500 nm, resulting in 2151 spectral bands—each treated as an individual feature. The RF algorithm was employed to identify the most influential wavelengths by evaluating the impact of permuting each feature on the prediction error [33].
Assume a random forest model with $N_{tree}$ decision trees and $n$ original spectral features. The feature-importance evaluation based on the OOB error proceeds as follows:
(1) Determine the out-of-bag data: calculate the number of error samples, $\mathrm{ErrOOB}_i$, for the out-of-bag data $\mathrm{OOB}_i$ corresponding to the $i$th decision tree.
(2) Feature perturbation: while keeping the other features unchanged, randomly perturb feature $X_j$ in $\mathrm{OOB}_i$ to obtain the perturbed out-of-bag data $\mathrm{OOB}_i^{j}$ for feature $j$.
(3) Recalculate the error: use the perturbed out-of-bag data $\mathrm{OOB}_i^{j}$ to recalculate the number of error samples, $\mathrm{ErrOOB}_i^{j}$.
(4) Iteration: repeat steps 1 to 3 for each decision tree to obtain a set of error sample counts $\mathrm{ErrOOB}_i^{j}$ for feature $X_j$.
(5) The importance score $S_{im}(X_j)$ of feature $X_j$ is calculated using Equation (2), which averages the change in out-of-bag error across the decision trees. If permuting a feature results in a significant increase in error, the feature has a substantial impact on the model's predictions and therefore receives a higher importance score.
$$S_{im}(X_j) = \frac{1}{N_{tree}}\sum_{i=1}^{N_{tree}}\left(\mathrm{ErrOOB}_i^{j} - \mathrm{ErrOOB}_i\right) \tag{2}$$
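A minimal sketch of the permutation procedure above; for clarity the per-tree OOB sets are simplified to a single evaluation set, and `predict` stands for any fitted model's prediction function (both simplifications are ours, not the paper's).

```python
import numpy as np

def permutation_importance(predict, X, y, seed=0):
    """Score each feature by the increase in mean squared error after
    randomly permuting that feature alone (steps 1-5 above, with the
    per-tree out-of-bag sets collapsed to one held-out set)."""
    rng = np.random.default_rng(seed)
    base_err = np.mean((predict(X) - y) ** 2)     # error on intact data
    scores = np.empty(X.shape[1])
    for j in range(X.shape[1]):
        Xp = X.copy()
        Xp[:, j] = rng.permutation(Xp[:, j])      # perturb feature j only
        scores[j] = np.mean((predict(Xp) - y) ** 2) - base_err
    return scores
```

Features whose permutation sharply raises the error receive high scores, which is exactly how the 596 characteristic wavelengths were ranked in Section 3.1.2.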

2.3.2. Autoformer Model

The Autoformer model provides an effective and accurate framework for time series forecasting by integrating deep decomposition with an auto-correlation mechanism. This approach demonstrates strong feature-extraction capability and effectively captures long-term dependencies as well as periodic patterns within the data [22].
The model architecture consists of two main components: an encoder and a decoder, which together form a sequence-to-sequence prediction structure. The encoder transforms the input time series into informative context vectors, while the decoder utilizes these representations to generate future predictions.
Within the encoder, a deep decomposition technique is applied to progressively separate the input series into trend and seasonal components. Specifically, trend features were extracted via a moving average operation implemented through average pooling. The seasonal component is then derived by subtracting the estimated trend from the original data, as illustrated in Equations (3) and (4):
$$X_{trend} = \mathrm{AvgPool}(X) \tag{3}$$
$$X_{seasonal} = X - X_{trend} \tag{4}$$
where $X$ represents the original time series data, $X_{trend}$ denotes the trend component, and $X_{seasonal}$ represents the seasonal feature data.
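Equations (3) and (4) amount to a moving-average smoothing followed by a residual. A minimal NumPy sketch (the kernel size and edge padding are our assumptions; Autoformer itself applies this inside the network):

```python
import numpy as np

def series_decomp(x, kernel_size=5):
    """Decomposition in the style of Eqs. (3)-(4): a moving average
    (average pooling) extracts the trend; the residual is the seasonal part."""
    pad = kernel_size // 2
    # Edge padding keeps the output the same length as the input.
    xp = np.concatenate([np.repeat(x[:1], pad), x, np.repeat(x[-1:], pad)])
    trend = np.convolve(xp, np.ones(kernel_size) / kernel_size, mode="valid")
    seasonal = x - trend
    return seasonal, trend
```

By construction the two components sum back to the original series, which is what lets the decoder refine them separately.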
The encoder further incorporates an autocorrelation mechanism to capture periodic patterns within the data by assessing the similarity between the original time series and its delayed versions. The autocorrelation computation is defined by Equation (5), as follows:
$$\mathrm{AutoCorr}(X,\tau) = \frac{\sum_{t=1}^{T-\tau}(X_t-\mu)(X_{t+\tau}-\mu)}{\sqrt{\sum_{t=1}^{T}(X_t-\mu)^2}\,\sqrt{\sum_{t=1}^{T-\tau}(X_{t+\tau}-\mu)^2}} \tag{5}$$
where $\mathrm{AutoCorr}(X,\tau)$ represents the autocorrelation coefficient between the sequence $X$ and its copy lagged by $\tau$, and $\mu$ denotes the mean of the sequence.
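Equation (5) can be computed directly; a sketch with an illustrative function name (real Autoformer evaluates this in the frequency domain via FFT for efficiency, which is omitted here):

```python
import numpy as np

def autocorr(x, tau):
    """Autocorrelation between the series and its tau-lagged copy (Eq. 5)."""
    mu = x.mean()
    a = x[: len(x) - tau] - mu          # X_t - mu,       t = 1 .. T - tau
    b = x[tau:] - mu                    # X_{t+tau} - mu, t = 1 .. T - tau
    num = np.sum(a * b)
    den = np.sqrt(np.sum((x - mu) ** 2) * np.sum(b ** 2))
    return num / den
```

Lags at which this coefficient peaks mark the dominant periods that the Auto-Correlation mechanism then aggregates.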
The decoder generates future predictions based on the output of the encoder (i.e., the trend and seasonal components). The decoder consists of two main parts: a cumulative structure for trend-cyclical components and a stacked auto-correlation mechanism for seasonal components. The trend-cyclical component gradually refines the accuracy of trend predictions through the cumulative structure, while the seasonal component effectively leverages historical seasonal information via the stacked auto-correlation mechanism. The architecture of the Autoformer model is illustrated in Figure 3.

2.3.3. Multi-Scale Convolution

In time series classification and forecasting tasks, traditional neural network models are often limited to single-scale feature extraction, which hinders their ability to comprehensively capture multi-scale characteristics in time series data. To address this issue, researchers have proposed the Multi-Scale Convolutional Neural Network (MSCNN), which introduces multi-scale convolutional blocks to simultaneously extract features at different scales, thereby enhancing the model’s understanding of time series data and improving its predictive performance [34]. The core component of MSCNN consists of multiple convolutional units with varying scales. These units employ convolutional kernels of different lengths (e.g., lengths 3, 5, 7, etc.) to process input features concurrently, extracting both short-term and long-term patterns from the time series data through multi-scale convolutional operations. The output of the multi-scale convolutional block is a weighted sum of the outputs from all convolutional units, as illustrated in Equation (6).
$$O = W_f \cdot \mathrm{concat}(C_3, C_5, C_7, \dots) \tag{6}$$
Here, $O$ is the final output of the multi-scale convolutional block, $C_k$ is the output of the convolutional unit with kernel size $k$ (e.g., 3, 5, 7), $W_f$ denotes the weights of the $1 \times 1$ convolutional layer, and $\mathrm{concat}$ indicates concatenation of the outputs from the different-scale convolutional units along the channel dimension.
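A toy NumPy sketch of Equation (6): fixed averaging kernels stand in for the learned convolutions, and the $1 \times 1$ convolution reduces to a weighted sum over the stacked channels (both simplifications are ours).

```python
import numpy as np

def multi_scale_block(x, kernel_sizes=(3, 5, 7), weights=None):
    """Multi-scale convolutional block (Eq. 6): parallel convolutions of
    different kernel lengths, concatenated and fused by a pointwise (1x1)
    weighting. Averaging kernels are placeholders for learned filters."""
    outs = []
    for k in kernel_sizes:
        xp = np.pad(x, k // 2, mode="edge")                         # keep length
        outs.append(np.convolve(xp, np.ones(k) / k, mode="valid"))  # C_k
    C = np.stack(outs)                        # channel stack: (n_scales, L)
    if weights is None:                       # W_f: uniform fusion by default
        weights = np.full(len(kernel_sizes), 1.0 / len(kernel_sizes))
    return weights @ C                        # 1x1 conv = weighted channel sum
```

Small kernels respond to short-term fluctuations, large kernels to long-term patterns, and the fusion weights decide how much of each scale reaches the output.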

2.3.4. Adaptive Decomposition

Based on the decomposition module of Autoformer, this study introduces an adaptive sequence decomposition module named AdaptiveSeriesDecomp. This module progressively extracts trend features from the intermediate hidden variables within the prediction process. Unlike the moving-average approach used in the original Autoformer decomposition, the proposed method employs a convolutional neural network to adaptively learn and extract trend components from the data, thereby capturing more complex temporal patterns. Specifically, one-dimensional convolution is applied to smooth short-term fluctuations and emphasize underlying trend features. For an input sequence $X \in \mathbb{R}^{L \times d}$ of length $L$, the procedure is formulated as follows:
Trend Extraction: The transformed input is processed through a one-dimensional convolutional layer to derive the trend component of the time series, as expressed in Equation (7):
$$X_t = \mathrm{Conv1d}(X) \tag{7}$$
Seasonal Component Calculation: The seasonal component is obtained by subtracting the extracted trend component from the original input data, as illustrated in Equation (8).
$$X_s = X - X_t \tag{8}$$
Summarizing the process described above, the AdaptiveSeriesDecomp module can be represented as illustrated in Equation (9).
$$(X_s, X_t) = \mathrm{AdaptiveSeriesDecomp}(X) \tag{9}$$
The AdaptiveSeriesDecomp module substantially improves the performance of time series decomposition by integrating the adaptive learning capacity of convolutional neural networks. In comparison to the SeriesDecomp module used in Autoformer, the proposed AdaptiveSeriesDecomp achieves higher decomposition accuracy and exhibits enhanced capability in modeling complex temporal patterns. These advantages establish a robust foundation for downstream time series analysis and forecasting tasks, as demonstrated in Figure 4.
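Under the assumption that the learnable trend filter is a depthwise Conv1d with length-preserving padding (design choices consistent with Equations (7)-(9) but not spelled out in the text), the module might look like:

```python
import torch
from torch import nn

class AdaptiveSeriesDecomp(nn.Module):
    """Adaptive decomposition (Eqs. 7-9): a learnable Conv1d extracts the
    trend component; the seasonal component is the residual. A sketch only;
    the depthwise layout and kernel size are assumptions."""

    def __init__(self, d_model, kernel_size=25):
        super().__init__()
        # One smoothing filter per feature channel (depthwise convolution).
        self.trend_conv = nn.Conv1d(d_model, d_model, kernel_size,
                                    padding=kernel_size // 2, groups=d_model)

    def forward(self, x):                     # x: (batch, L, d_model)
        x_t = self.trend_conv(x.transpose(1, 2)).transpose(1, 2)  # Eq. (7)
        x_s = x - x_t                                             # Eq. (8)
        return x_s, x_t                                           # Eq. (9)
```

Because the seasonal part is defined as the residual, the two outputs always sum exactly to the input, mirroring the original SeriesDecomp contract.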

2.3.5. Architecture of the AutoMSD Model

Based on the Autoformer model, this study introduces optimizations through the incorporation of adaptive series decomposition and a multi-scale convolutional module. The input sequence is first processed by the adaptive decomposition module to separate trend and seasonal components. The encoder then encodes the input and performs multi-scale feature extraction using convolutional operations. The decoder receives both the initial seasonal components and the encoded representations from the encoder. The final prediction is obtained by aggregating the trend and seasonal components output by the decoder, as illustrated in Figure 5.

2.4. Model Training and Evaluation

2.4.1. Model Training

This study was implemented using the PyTorch (1.10.2) open-source deep learning framework. The hardware platform comprised an NVIDIA GeForce RTX3090 GPU, an Intel(R) Core(TM) i7-11700 @ 2.50 GHz processor, and 64 GB of memory. We constructed predictive models leveraging temporal hyperspectral data with Autoformer and AutoMSD, with the aim of predicting the trend of rice blast disease and comparing the performance of the different models. Key hyperparameters after tuning were: batch size 32, 100 epochs, the Adam optimizer with a learning rate of 0.001, and the MSE loss function. These parameters were optimized through grid search to ensure model performance.
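With these hyperparameters, the training procedure reduces to a standard PyTorch pattern; `model` and `loader` below are placeholders for AutoMSD and the spectral data loader, so this is a sketch of the setup rather than the authors' training script.

```python
import torch
from torch import nn

def train(model, loader, epochs=100, lr=0.001):
    """Training loop with the hyperparameters from Section 2.4.1:
    Adam optimizer, learning rate 0.001, MSE loss, 100 epochs."""
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    criterion = nn.MSELoss()
    model.train()
    for _ in range(epochs):
        for x, y in loader:            # batches of size 32 in the paper
            optimizer.zero_grad()
            loss = criterion(model(x), y)
            loss.backward()
            optimizer.step()
    return model
```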

2.4.2. Model Evaluation

This study employs a regression model and adopts three major metrics, namely MSE (Mean Squared Error), MAE (Mean Absolute Error), and RMSE (Root Mean Squared Error), to rigorously evaluate the predictive performance of the model. These metrics are widely used in regression analysis, and can precisely quantify the subtle discrepancies between predicted results and ground truth values.
In light of the continuous and progressive nature of disease grade evolution, we put forth a novel evaluation method, Tolerance Accuracy (as shown in Equation (10)). In contrast to conventional approaches to accuracy calculation, the Tolerance Accuracy method takes into account the continuous nature of disease grade evolution. A reasonable tolerance threshold was established to accommodate moderate discrepancies between the predicted grade and the actual grade. A prediction result was considered correctly classified if the difference between the predicted grade and the actual grade falls within the tolerance range. This evaluation approach more closely aligns with the actual situation of disease development and provides a certain level of flexibility and tolerance to model predictions.
$$\mathrm{Tolerance\ Accuracy} = \frac{1}{n}\sum_{i=1}^{n}\mathbb{1}\left(\left|\mathrm{Prediction}_i - \mathrm{Truth}_i\right| < \mathrm{Tolerance}\right) \tag{10}$$
In the initial phase of the disease’s progression, we employed a tolerance accuracy assessment to gauge the model’s efficacy. By calculating the tolerance accuracy, we were able to gain a clear understanding of the accuracy of the model in predicting disease grades, including the proportion of correct predictions and the range of errors. This evaluation method was more closely aligned with the requirements of practical applications, as in real-world scenarios, minor grade differences often have a limited impact on disease prevention measures. The assessment tool enabled the acquisition of more precise insights into the early signals of disease occurrence, thereby providing robust data support for the implementation of timely and scientifically effective prevention and control measures.
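Equation (10) translates directly into code; the function name is illustrative, and the absolute difference follows the description above ("the difference between the predicted grade and the actual grade falls within the tolerance range").

```python
import numpy as np

def tolerance_accuracy(pred, truth, tolerance=1.0):
    """Fraction of predictions whose grade differs from the ground truth
    by less than `tolerance` (Equation 10)."""
    pred = np.asarray(pred, dtype=float)
    truth = np.asarray(truth, dtype=float)
    return float(np.mean(np.abs(pred - truth) < tolerance))
```

For example, with a tolerance of 1.5 grades a prediction that is off by one grade still counts as correct, reflecting the limited practical impact of minor grade differences.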

3. Results

To evaluate the performance of the model in conjunction with the actual experimental conditions of rice blast disease, historical data from the first two, three, and four days after disease onset were selected to predict the disease situation in the subsequent days.

3.1. Dimensionality Reduction in Spectral Data Analysis

The original hyperspectral dataset comprised 2151 bands covering the wavelength range from 350 nm to 2500 nm. To address data redundancy and enhance feature extraction, we applied two dimensionality reduction methods: Random Forest (RF) and Pearson correlation analysis. This section details the results of these approaches.

3.1.1. Analysis of Dimensionality Reduction Results Using Pearson Correlation

The data were subjected to dimensionality reduction using Pearson's correlation coefficient to establish spectral feature correlations. As illustrated in Figure 6, the correlation coefficients of the majority of the bands were relatively low, while a few bands exhibited higher correlation coefficients. When bands with r > 0.5 were considered characteristic bands, most of the characteristic bands were distributed between 573 and 711 nm, with a total of 96 characteristic bands within this range. When bands with r > 0.6 were regarded as characteristic bands, all of them fell within the range of 692–705 nm, totaling 14 bands.

3.1.2. Analysis of RF Dimensionality Reduction Results

To determine the optimal number of decision trees in the random forest algorithm, we evaluated tree counts between 10 and 2150 using 5-fold cross-validation, which identified 80 as the optimal number. The resulting forest yielded 596 feature bands, which were extracted to construct the RF feature-wavelength dataset.
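The workflow of cross-validating the tree count and then keeping high-importance bands can be sketched with scikit-learn. The synthetic data, the reduced tree-count grid, and the mean-importance cutoff are all assumptions for illustration; the paper does not state its exact band-selection criterion:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.random((60, 30))                                # 60 samples x 30 toy "bands"
y = 3 * X[:, 5] + 2 * X[:, 12] + 0.1 * rng.random(60)   # grade driven by two bands

# 5-fold CV over a small grid of tree counts (the study searched 10-2150).
best_n = max((10, 40, 80), key=lambda n: cross_val_score(
    RandomForestRegressor(n_estimators=n, random_state=0), X, y, cv=5).mean())

# One possible selection rule: keep bands whose importance exceeds the mean
# importance (an assumed criterion, used here only to show the mechanics).
rf = RandomForestRegressor(n_estimators=best_n, random_state=0).fit(X, y)
keep = np.flatnonzero(rf.feature_importances_ > rf.feature_importances_.mean())
print(best_n, keep)
```

With a strong signal in two bands, the forest's importance scores concentrate on those bands, and the cutoff recovers them regardless of which tree count wins the cross-validation.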

3.2. Autoformer Modeling Analysis

Through systematic processing and analysis of the hyperspectral data, four datasets with different numbers of characteristic bands were obtained: the original dataset containing 2151 bands; the dataset reduced to 596 bands by the RF algorithm; and two datasets produced by Pearson dimensionality reduction at different correlation thresholds, containing 96 bands (threshold 0.5) and 14 bands (threshold 0.6). Training analyses were performed on all four datasets, and the results are summarized in Table 1. As the table shows, increasing the number of historical days from 2 to 4 generally lowers the MSE, and the MAE and RMSE follow comparable trends. Notably, for the same number of historical days, datasets with fewer characteristic bands tend to have lower MSE, MAE, and RMSE, indicating that reducing the feature dimension improves the model’s prediction accuracy. Further analysis demonstrated a negative correlation between tolerance accuracy and the three regression metrics (MSE, MAE, RMSE): as the regression metrics decrease, tolerance accuracy rises, indicating an intrinsic link between reduced prediction error and improved tolerance accuracy.
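For reference, the three regression metrics in Table 1 are related (RMSE is the square root of MSE) and can be computed as follows; the toy grade values are illustrative:

```python
import math

def regression_metrics(y_true, y_pred):
    """MSE, MAE, and RMSE as reported in Tables 1 and 2."""
    errs = [p - t for t, p in zip(y_true, y_pred)]
    mse = sum(e * e for e in errs) / len(errs)    # mean squared error
    mae = sum(abs(e) for e in errs) / len(errs)   # mean absolute error
    return {"MSE": mse, "MAE": mae, "RMSE": math.sqrt(mse)}

# One prediction off by two grades out of three samples:
print(regression_metrics([1, 2, 3], [1, 2, 5]))
```

Because RMSE squares errors before averaging, a single large grade error inflates MSE and RMSE more than MAE, which is why the three metrics can rank datasets slightly differently in Tables 1 and 2.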
Figure 7 shows the time-series test results of the Autoformer model trained on the four datasets, each using three days of historical data. The horizontal axis denotes the sample number and the vertical axis the disease grade; each sample comprises the true seven-day disease trend curve and the trend predicted from three days of data. As the figure illustrates, the model performs better on the 14- and 96-band datasets than on the 2151- and 596-band datasets. This result is consistent with the tolerance accuracy and regression metrics in Table 1, confirming the importance of effective feature-band selection and dimensionality reduction strategies for prediction performance.

3.3. AutoMSD Modeling Analysis

Building upon the Autoformer model, this study integrates multi-scale convolution and adaptive sequence decomposition to construct the AutoMSD model. To evaluate the model comprehensively, the same datasets used previously were employed, and the results are reported in Table 2. The data reveal a clear pattern: as the number of bands decreases and the amount of historical data increases, the MSE, MAE, and RMSE all decline consistently, while tolerance accuracy rises correspondingly. This trend is highly consistent with the behavior of the Autoformer model, further validating the effectiveness of the model improvements.
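The multi-scale convolution idea can be sketched with fixed kernels: convolving one series with several kernel lengths yields short- and long-horizon views of the same signal. This is an illustrative numpy analogue with assumed kernel sizes, not the published AutoMSD implementation, in which the kernel weights are learned:

```python
import numpy as np

def multi_scale_features(series, kernel_sizes=(3, 5, 7)):
    """Convolve one time series with kernels of several lengths and stack
    the outputs. Fixed moving-average kernels stand in for the learned
    multi-scale convolution branch of AutoMSD (illustrative only)."""
    feats = []
    for k in kernel_sizes:
        kernel = np.ones(k) / k                    # uniform kernel of length k
        feats.append(np.convolve(series, kernel, mode="same"))
    return np.stack(feats)                         # shape: (num_scales, len(series))

print(multi_scale_features(np.arange(10.0)).shape)  # (3, 10)
```

Short kernels preserve rapid grade changes while long kernels emphasize the overall trend; stacking the outputs gives downstream layers both views at once.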
Figure 8 presents the time-series test results of the AutoMSD model trained on the four datasets using three-day historical data, illustrating the model’s performance across diverse datasets and facilitating evaluation of its generalization capability and prediction accuracy. For consistency, the scales of Figure 7 and Figure 8 are unified to enable direct comparison.
Figure 9 directly compares the regression metrics of the Autoformer and AutoMSD models across the four datasets under the three-day historical setting. The AutoMSD model consistently outperforms the Autoformer on most regression metrics for the same datasets, as further quantified in Table 2. An exception is the 596-band dataset, on which the Autoformer shows slightly lower MSE and RMSE; this may reflect the stability of the baseline model on certain data distributions. Overall, these results highlight the substantial improvement in predictive performance and accuracy achieved by the AutoMSD model.

4. Discussion

4.1. Effectiveness of the Proposed AutoMSD Model in Rice Blast Prediction

The present study aimed to develop an early prediction and dynamic monitoring method for rice blast disease using time-series hyperspectral data combined with deep learning. We introduced a dynamic prediction framework built on an optimized Autoformer architecture, termed AutoMSD, which integrates multi-scale convolution and adaptive sequence decomposition. Dimensionality reduction was first conducted via Random Forest (RF) and Pearson correlation analysis to select informative spectral features. The proposed AutoMSD model demonstrated clear superiority over the baseline Autoformer, achieving a tolerance accuracy of 86.67% when using three days of historical data to predict rice blast progression. These results indicate that the model significantly enhances both the accuracy and timeliness of disease forecasting, providing a robust technical foundation for field management and preventive strategies. Furthermore, this work highlights the promise of time-series hyperspectral analysis in plant disease surveillance.

4.2. Technical Advantages of Multi-Scale Convolution and Adaptive Decomposition

With regard to multi-scale convolution [34], this study addresses the shortcomings of conventional neural networks in single-scale feature extraction by incorporating convolution kernels of varying lengths to concurrently extract both short-term and long-term patterns in time-series data. The adaptive sequence decomposition module further enhances time-series decomposition through convolutional neural networks, which adaptively extract trend features from the data. Together, these techniques enable the AutoMSD model to capture the core features and dependencies in the data more accurately, improving prediction accuracy. Chen et al. [35] employed multi-scale convolution to substantially enhance time-series classification accuracy, particularly on datasets of varying lengths and from diverse domains. Qian et al. [36] built a DMS-CNN model that captured multi-scale temporal features in time series, overcoming the limitation that traditional CNNs cannot flexibly extract multi-scale features with a fixed-length filter. Deng et al. [37] constructed a TCMS-CNN model for electricity-load forecasting and, by comparison with single-scale convolutional neural networks, substantiated the efficacy of multi-scale convolution in enhancing model performance.
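The decomposition step described above can be sketched as splitting a series into a smooth trend plus a remainder via a moving average, as in Autoformer's series decomposition. The fixed, edge-padded kernel below is an illustrative analogue; the adaptive module in AutoMSD instead learns the smoothing weights with a convolutional network:

```python
import numpy as np

def series_decomp(x, kernel_size=5):
    """Split a series into a smooth trend and a remainder via an
    edge-padded moving average. A fixed-kernel analogue of adaptive
    sequence decomposition (illustrative, not the learned module)."""
    pad = kernel_size // 2
    # Replicate edge values so the trend has the same length as the input.
    padded = np.concatenate([np.full(pad, x[0]), x, np.full(pad, x[-1])])
    trend = np.convolve(padded, np.ones(kernel_size) / kernel_size, mode="valid")
    return trend, x - trend

trend, resid = series_decomp(np.arange(10.0))
print(trend[5], resid[5])  # interior of a linear ramp: trend 5.0, remainder 0.0
```

On a linear disease-progression ramp the trend component reproduces the series exactly in the interior, leaving the remainder to carry short-term fluctuations, which is the property the decomposition exploits.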

4.3. Model Evaluation and Comparative Analysis

To evaluate the model, several metrics were employed, including MSE, MAE, RMSE, and tolerance accuracy, allowing a comprehensive assessment of prediction performance. The results demonstrated that prediction accuracy trends upward as the number of historical days increases and the number of dataset bands decreases. In particular, the 96-band dataset achieved the best tolerance accuracy with three days of historical data, further substantiating the efficacy of the model optimization strategy. Compared with existing rice blast prediction models, the AutoMSD model offers advantages in both prediction performance and accuracy, and this study represents an advance over previous work in the field. For example, Kim et al. [14] employed a long short-term memory (LSTM) recurrent neural network for early prediction of rice blast, achieving a prediction accuracy of only 67.4%, whereas the AutoMSD model achieved a tolerance accuracy of 86.67% using the first three days of historical data, a significant improvement. Moreover, Verma et al. [16] predicted rice diseases using video-stream analysis and an LSTM convolutional neural network; by integrating hyperspectral technology with time-series data analysis, our study not only improves prediction accuracy but also enables dynamic monitoring of the disease development process. Nevertheless, other studies have demonstrated high prediction accuracy under specific conditions. For instance, Zhang et al. [38] put forth a prediction method for wheat stripe rust based on a knowledge graph (KG) and Bi-LSTM, achieving a prediction accuracy of 93.21%.
While this study did not reach the same accuracy for rice blast, the discrepancy is understandable given the differences between rice blast and wheat stripe rust in disease mechanism and spectral characteristics. Additionally, this study focuses on dynamic monitoring rather than a single disease prediction, which may contribute to the observed difference in accuracy. Furthermore, through multi-scale convolution and adaptive sequence decomposition, the AutoMSD model enables dynamic monitoring of rice blast progression, offering a more comprehensive basis of information for disease prevention and control.

4.4. Limitations and Future Work

Notwithstanding the noteworthy outcomes of this investigation with respect to forecasting rice blast, certain constraints remain. Chief among them are greenhouse-field discrepancies, a barrier to model generalization. All training and validation data in this study were derived from controlled greenhouse experiments, where environmental variables (e.g., temperature, humidity, light intensity) were stabilized to minimize confounding factors. In field environments, however, diurnal temperature variation, sudden rainfall events, leaf overlap, and pest and disease infestations may influence rice plant development and alter leaf spectral reflectance.
For instance, greenhouse-grown rice exhibits uniform spectral reflectance due to consistent illumination, whereas field-grown plants may show uneven reflectance caused by shading—potentially leading the AutoMSD model to misclassify healthy tissue as diseased, or vice versa. This discrepancy raises concerns about the model’s spectral robustness and ability to generalize beyond controlled conditions.
To fully realize the AutoMSD model’s field potential, we propose a phased approach centered on transfer learning: (1) preliminary field validation: collect hyperspectral and environmental data from 2 to 3 representative rice-growing regions to establish a “bridge dataset” between greenhouse and field conditions; (2) domain-adaptive fine-tuning: use this bridge dataset to adapt the pre-trained AutoMSD via DANN or contrastive domain adaptation, aligning spectral distributions across environments; (3) few-shot scaling: deploy the adapted model in new regions, refining it with fewer than 50 labeled field samples per region using meta-learning; (4) multi-source integration: transfer knowledge from pre-trained meteorological/soil models to incorporate these variables, building a hybrid prediction framework.
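One lightweight ingredient of the domain-adaptive fine-tuning stage can be sketched as per-band moment matching: shifting and scaling field spectra so each band matches the greenhouse distribution before inference. This is a simple stand-in for the heavier DANN or contrastive alignment named above, with synthetic spectra for illustration:

```python
import numpy as np

def align_spectra(source, target):
    """Shift and scale each band of target-domain (field) spectra to match
    the source-domain (greenhouse) mean and standard deviation. A minimal
    moment-matching stand-in for learned domain adaptation."""
    s_mu, s_sd = source.mean(axis=0), source.std(axis=0)
    t_mu, t_sd = target.mean(axis=0), target.std(axis=0)
    # Guard against constant bands (zero std) in the target domain.
    return (target - t_mu) / np.where(t_sd == 0, 1, t_sd) * s_sd + s_mu

rng = np.random.default_rng(1)
greenhouse = rng.normal(0.4, 0.05, size=(100, 6))   # synthetic 6-band spectra
field = rng.normal(0.6, 0.10, size=(80, 6))
aligned = align_spectra(greenhouse, field)
print(np.allclose(aligned.mean(axis=0), greenhouse.mean(axis=0)))  # True
```

Moment matching corrects global illumination shifts but not structured effects such as shading, which is why the learned adaptation methods above remain necessary.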
Transfer learning emerges as a critical means of mitigating these limitations, leveraging the model’s existing knowledge from greenhouse data to adapt to field conditions with minimal new data. It can also facilitate the integration of auxiliary environmental data: models pre-trained on meteorological or soil datasets (e.g., a CNN trained to predict soil moisture from hyperspectral data) can transfer knowledge to the AutoMSD model, enabling it to incorporate these variables without retraining from scratch. Such multi-source transfer enhances the model’s ability to contextualize hyperspectral signals, reducing misclassifications caused by abiotic stressors.
This research presented a novel deep learning framework that supports early and dynamic prediction of rice blast disease using time-series hyperspectral data. By improving the accuracy and timeliness of disease monitoring, the AutoMSD model facilitates targeted and sustainable crop protection strategies. The methodology proposed here not only offers a practical tool for rice blast management but also establishes an adaptable framework that could be extended to other plant diseases, promoting the advancement of precision agriculture.

5. Conclusions

In this study, we proposed a dynamic prediction method for rice blast disease based on time-series hyperspectral data and a deep learning model. By integrating the Autoformer model with multi-scale convolution and adaptive sequence decomposition, we constructed an efficient and accurate prediction model. The experimental results demonstrated that the model can accurately predict the disease development trend at the early stage of rice blast onset, providing a scientific basis and technical support for farm management and disease prevention and control. Furthermore, this study indicates the significant potential of time-series data for plant disease prediction, offering novel insights and methodologies for future research in related disciplines. While the scope of this study was limited, we believe that with continued technological advancement, more crops and disease types could be accurately predicted and controlled in the future.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/agronomy16010136/s1, Figure S1. The diagram of rice seedlings grown under greenhouse experimental conditions. Figure S2. The diagram of the ASD FieldSpec4 device working on the test site.

Author Contributions

W.W.: Writing—original draft, Conceptualization, Data curation, Funding acquisition. Y.Z.: Writing—review and editing, Investigation, Methodology, Validation. H.H.: Writing—review and editing, Investigation, Methodology, Validation. T.L.: Writing—original draft, Investigation, Data curation, Formal analysis. M.Z.: Investigation, Validation, Formal analysis. Y.F.: Writing—review and editing, Resources, Formal analysis. H.S.: Writing—review and editing, Methodology, Supervision. J.Y.: Writing—review and editing, Supervision, Resources, Project administration. L.Y.: Writing—review and editing, Conceptualization, Methodology, Supervision. All authors have read and agreed to the published version of the manuscript.

Funding

This study was supported by the Introduction of Scientific and Technological Talents of Guangdong Academy of Agricultural Sciences (grant numbers R2021YJ-YB3020), the Guangdong Province Rice Industry Technology System (grant numbers 2024CXTD05), the Guangdong Province’s “Hundreds of Millions Project” Rural Science and Technology Commissioner Program (grant numbers KTP20240367), Guangdong Province Science and Technology Innovation Strategic Fund (grant numbers: 2025 ZC 06).

Data Availability Statement

The original contributions presented in this study are included in the article/Supplementary Material. Further inquiries can be directed to the corresponding authors.

Conflicts of Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

References

  1. Wang, G.L.; Valent, B. Durable resistance to rice blast. Science 2017, 355, 906–907. [Google Scholar] [CrossRef]
  2. Zhang, H.F.; Yang, J.; Liu, M.X.; Xu, X.Z.; Yang, L.Y.; Liu, X.Y.; Peng, Y.L.; Zhang, Z.G. Early molecular events in the interaction between Magnaporthe oryzae and rice. Phytopathol. Res. 2024, 6, 9. [Google Scholar] [CrossRef]
  3. Shahriar, S.A.; Imtiaz, A.A.; Hossain, M.B.; Husna, A.; Eaty, M.N.K. Review: Rice blast disease. Annu. Res. Rev. Biol. 2020, 35, 50–64. [Google Scholar] [CrossRef]
  4. Wu, J.; Kou, Y.J.; Bao, J.D.; Li, Y.; Tang, M.Z.; Zhu, X.L.; Ponaya, A.; Xiao, G.; Li, J.B.; Li, C.Y.; et al. Comparative genomics identifies the Magnaporthe oryzae avirulence effector AvrPi9 that triggers Pi9-mediated blast resistance in rice. New Phytol. 2015, 206, 1463–1475. [Google Scholar] [CrossRef] [PubMed]
  5. Xiao, G.; Wang, W.J.; Liu, M.X.; Li, Y.; Liu, J.B.; Franceschetti, M.; Yi, Z.F.; Zhu, X.Y.; Zhang, Z.G.; Lu, G.D.; et al. The Piks allele of the NLR immune receptor Pik breaks the recognition of AvrPik effectors of rice blast fungus. J. Integr. Plant Biol. 2023, 65, 810–824. [Google Scholar] [CrossRef] [PubMed]
  6. Xuan, G.T.; Li, Q.K.; Shao, Y.Y.; Shi, Y.K. Early diagnosis and pathogenesis monitoring of wheat powdery mildew caused by blumeria graminis using hyperspectral imaging. Comput. Electron. Agric. 2022, 197, 106921. [Google Scholar] [CrossRef]
  7. Deng, J.; Wang, R.; Yang, L.J.; Lv, X.; Yang, Z.Q.; Zhang, K.; Zhou, C.Y.; Pengju, L.; Wang, Z.F.; Abdullah, A.; et al. Quantitative estimation of wheat stripe rust disease index using unmanned aerial vehicle hyperspectral imagery and innovative vegetation indices. IEEE Trans. Geosci. Remote Sens. 2023, 61, 1–11. [Google Scholar] [CrossRef]
  8. Liang, X.Y.; Zhang, Z.T.; Yang, S.; Chen, X.; Yao, Z.F.; Song, H.B. Hyperspectral imaging-based lightweight detection method for rapid detection of Fusarium head blight severity in wheat. J. Agric. Mach. 2025, 56, 218–227. [Google Scholar]
  9. Lin, F.F.; Li, B.R.; Zhou, R.Y.; Chen, H.Z.; Zhang, J.C. Early detection of rice sheath blight using hyperspectral remote sensing. Remote Sens. 2024, 16, 2047. [Google Scholar] [CrossRef]
  10. Zhang, G.S.; Xu, T.Y.; Tian, Y.W. Hyperspectral imaging-based classification of rice leaf blast severity over multiple growth stages. Plant Methods 2022, 18, 123. [Google Scholar] [CrossRef]
  11. Tian, L.; Wang, Z.Y.; Xue, B.W.; Li, D.; Zheng, H.B.; Yao, X.; Zhu, Y.; Cao, W.X.; Cheng, T. A disease-specific spectral index tracks Magnaporthe oryzae infection in paddy rice from ground to space. Remote Sens. Environ. 2023, 285, 113384. [Google Scholar] [CrossRef]
  12. Su, J.Y.; Liu, C.J.; Hu, X.P.; Xu, X.M.; Guo, L.; Chen, W.H. Spatio-temporal monitoring of wheat yellow rust using UAV multispectral imagery. Comput. Electron. Agric. 2019, 167, 105035. [Google Scholar] [CrossRef]
  13. Hassan, M.A.; Yang, M.J.; Rasheed, A.; Jin, X.L.; Xia, X.C.; Xiao, Y.G.; He, Z.H. Time-series multispectral indices from unmanned aerial vehicle imagery reveal senescence rate in bread wheat. Remote Sens. 2018, 10, 809. [Google Scholar] [CrossRef]
  14. Kim, Y.; Roh, J.H.; Kim, H.Y. Early forecasting of rice blast disease using long short-term memory recurrent neural networks. Sustainability 2018, 10, 34. [Google Scholar] [CrossRef]
  15. Guo, W.; Dang, M.J.; Jia, X.; He, Q.; Gao, C.F.; Dong, P. Grade classification of wheat stripe rust disease based on deep learning. J. South China Agric. Univ. 2023, 44, 604–612. [Google Scholar]
  16. Verma, T.; Dubey, S. Prediction of diseased rice plant using video processing and LSTM-simple recurrent neural network with comparative study. Multimed. Tools Appl. 2021, 80, 29267–29298. [Google Scholar] [CrossRef]
  17. Guo, Z.Q.; Chen, X.H.; Li, M.; Chi, Y.C.; Shi, D.Y. Construction and validation of peanut leaf spot disease prediction model based on long time series data and deep learning. Agronomy 2024, 14, 294. [Google Scholar] [CrossRef]
  18. Hochreiter, S.; Schmidhuber, J. Long short-term memory. Neural Comput. 1997, 9, 1735–1780. [Google Scholar] [CrossRef]
  19. Wen, Q.S.; Zhou, T.; Zhang, C.L.; Chen, W.Q.; Ma, Z.Q.; Yan, J.C.; Sun, L. Transformers in time series: A survey. In Proceedings of the International Joint Conference on Artificial Intelligence (IJCAI), Macao, China, 19–25 August 2023. [Google Scholar]
  20. Liang, Y.X.; Xia, Y.T.; Ke, S.Y.; Wang, Y.W.; Wen, Q.S.; Zhang, J.B.; Zheng, Y.; Zimmermann, R. Airformer: Predicting nationwide air quality in China with transformers. In Proceedings of the AAAI Conference on Artificial Intelligence, Washington, DC, USA, 7–14 February 2023; Volume 37, pp. 14329–14337. [Google Scholar]
  21. Vaswani, A.; Shazeer, N.; Parmar, N.; Uszkoreit, J.; Jones, L.; Gomez, A.N.; Kaiser, L.; Polosukhin, I. Attention is all you need. In Proceedings of the 31st Conference on Neural Information Processing Systems (NIPS 2017), Long Beach, CA, USA, 4–7 December 2017; pp. 5998–6008. [Google Scholar]
  22. Wu, H.X.; Xu, J.H.; Wang, J.M.; Long, M.S. Autoformer: Decomposition transformers with auto-correlation for long-term series forecasting. Adv. Neural Inf. Process. Syst. 2021, 34, 22419–22430. [Google Scholar]
  23. Zhou, T.; Ma, Z.Q.; Wen, Q.S.; Wang, X.; Sun, L.; Jin, R. FEDformer: Frequency enhanced decomposed transformer for long-term series forecasting. arXiv 2022, arXiv:2201.12740. [Google Scholar] [CrossRef]
  24. Liu, S.Z.; Yu, H.; Liao, C.; Li, J.G.; Lin, W.Y.; Liu, A.X.; Dustdar, S. Pyraformer: Low-complexity pyramidal attention for long-range time series modeling and forecasting. In Proceedings of the International Conference on Learning Representations, Virtual, 25 April 2022. [Google Scholar]
  25. Zhou, H.Y.; Zhang, S.H.; Peng, J.Q.; Zhang, S.; Li, J.X.; Xiong, H.; Zhang, W.C. Informer: Beyond efficient transformer for long sequence time-series forecasting. In Proceedings of the AAAI Conference on Artificial Intelligence, New York, NY, USA, 7–12 February 2020. [Google Scholar]
  26. Zhu, X.Y.; Chen, S.; Yang, J.Y.; Zhou, S.C.; Zeng, L.X.; Han, J.L.; Su, J.; Wang, L.; Pan, Q.H. The identification of Pi50(t), a new member of the rice blast resistance Pi2/Pi9 multigene family. Theor. Appl. Genet. 2012, 124, 1295–1304. [Google Scholar] [CrossRef]
  27. Greer, C.A.; Webster, R.K. Occurrence, distribution, epidemiology, cultivar reaction, and management of rice blast disease in California. Plant Dis. 2001, 85, 1096–1102. [Google Scholar] [CrossRef]
  28. Ham, J.; Chen, Y.C.; Crawford, M.M.; Ghosh, J. Investigation of the random forest framework for classification of hyperspectral data. IEEE. Trans. Geosci. Remote Sens. 2005, 43, 492–501. [Google Scholar] [CrossRef]
  29. Kirch, W. (Ed.) Pearson’s correlation coefficient. In Encyclopedia of Public Health; Springer: Dordrecht, The Netherlands, 2008; pp. 1090–1091. [Google Scholar]
  30. Liu, T.; Li, Z.M.; Feng, S.; Wang, W.Q.; Yuan, Q.Y.; Xu, T.Y. Classification detection of hyperspectral rice blast disease based on LMPSO-SVM. Trans. Chin. Soc. Agric. Mach. 2023, 54, 208–216. [Google Scholar]
  31. Wang, A.L.; Wang, Y.; Chen, Y.S. Hyperspectral image classification based on convolutional neural network and random forest. Remote Sens. Lett. 2019, 10, 1086–1094. [Google Scholar] [CrossRef]
  32. Speiser, J.L.; Miller, M.E.; Tooze, J.; Ip, E. A comparison of random forest variable selection methods for classification prediction modeling. Expert Syst. Appl. 2019, 134, 93–101. [Google Scholar] [CrossRef] [PubMed]
  33. Tan, K.; Wang, H.M.; Chen, L.H.; Du, Q.; Du, P.J.; Pan, C.C. Estimation of the spatial distribution of heavy metal in agricultural soils using airborne hyperspectral imaging and random forest. J. Hazard. Mater. 2020, 382, 120987. [Google Scholar] [CrossRef]
  34. Cui, Z.C.; Chen, W.L.; Chen, Y.X. Multi-scale convolutional neural networks for time series classification. arXiv 2016, arXiv:1603.06995. [Google Scholar]
  35. Chen, W.; Shi, K. Multi-scale attention convolutional neural network for time series classification. Neural Netw. 2021, 136, 126–140. [Google Scholar] [CrossRef]
  36. Qian, B.; Xiao, Y.; Zheng, Z.J.; Zhou, M.; Zhuang, W.Q.; Li, S.; Ma, Q.L. Dynamic multi-scale convolutional neural network for time series classification. IEEE Access 2020, 8, 109732–109746. [Google Scholar] [CrossRef]
  37. Deng, Z.F.; Wang, B.B.; Xu, Y.L.; Xu, T.T.; Liu, C.X.; Zhu, Z.L. Multi-scale convolutional neural network with time-cognition for multi-step short-term load forecasting. IEEE Access 2019, 7, 88058–88071. [Google Scholar] [CrossRef]
  38. Zhang, S.W.; Wang, Z.; Wang, Z.L. Prediction of wheat stripe rust disease by combining knowledge graph and bidirectional long short term memory network. Trans. Chin. Soc. Agric. Eng. 2020, 36, 172–178. [Google Scholar]
Figure 1. Experimental design and data processing flow.
Figure 2. Comparison of Disease Severity Levels.
Figure 3. Autoformer Model Architecture.
Figure 4. AdaptiveSeriesDecomp Module.
Figure 5. AutoMSD model architecture.
Figure 6. Spectral feature Pearson correlation coefficient.
Figure 7. Time Series Prediction with Autoformer.
Figure 8. Time Series Prediction with AutoMSD.
Figure 9. Comparison of regression metrics between Autoformer and AutoMSD under a three-day historical data setting.
Table 1. Training Results of Autoformer.

| Model | MSE | MAE | RMSE | Tolerance Accuracy |
| --- | --- | --- | --- | --- |
| Autoformer + 2151 + 2 day | 3.519 | 1.317 | 1.876 | 23.33% |
| Autoformer + 2151 + 3 day | 2.395 | 1.031 | 1.547 | 44.67% |
| Autoformer + 2151 + 4 day | 1.438 | 0.625 | 1.199 | 50.83% |
| Autoformer + 596 + 2 day | 2.915 | 1.210 | 1.707 | 36.67% |
| Autoformer + 596 + 3 day | 1.384 | 0.763 | 1.177 | 54.00% |
| Autoformer + 596 + 4 day | 0.498 | 0.381 | 0.705 | 82.22% |
| Autoformer + 96 + 2 day | 2.739 | 1.131 | 1.655 | 37.33% |
| Autoformer + 96 + 3 day | 0.736 | 0.502 | 0.858 | 68.89% |
| Autoformer + 96 + 4 day | 0.460 | 0.337 | 0.678 | 87.78% |
| Autoformer + 14 + 2 day | 2.657 | 1.052 | 1.630 | 37.50% |
| Autoformer + 14 + 3 day | 0.507 | 0.391 | 0.712 | 74.17% |
| Autoformer + 14 + 4 day | 0.501 | 0.320 | 0.707 | 78.89% |
Table 2. Training Results of AutoMSD.

| Model | MSE | MAE | RMSE | Tolerance Accuracy |
| --- | --- | --- | --- | --- |
| AutoMSD + 2151 + 2 day | 2.521 | 1.103 | 1.588 | 39.17% |
| AutoMSD + 2151 + 3 day | 2.257 | 0.981 | 1.502 | 48.67% |
| AutoMSD + 2151 + 4 day | 1.899 | 0.811 | 1.378 | 48.67% |
| AutoMSD + 596 + 2 day | 2.289 | 1.003 | 1.513 | 44.67% |
| AutoMSD + 596 + 3 day | 1.703 | 0.848 | 1.305 | 50.67% |
| AutoMSD + 596 + 4 day | 0.429 | 0.338 | 0.655 | 88.33% |
| AutoMSD + 96 + 2 day | 2.429 | 1.056 | 1.558 | 42.67% |
| AutoMSD + 96 + 3 day | 0.473 | 0.345 | 0.688 | 86.67% |
| AutoMSD + 96 + 4 day | 0.507 | 0.327 | 0.712 | 74.17% |
| AutoMSD + 14 + 2 day | 2.468 | 1.093 | 1.571 | 40.67% |
| AutoMSD + 14 + 3 day | 0.487 | 0.355 | 0.698 | 84.44% |
| AutoMSD + 14 + 4 day | 0.425 | 0.278 | 0.652 | 88.33% |