Article

PLCNet: A 3D-CNN-Based Plant-Level Classification Network Hyperspectral Framework for Sweetpotato Virus Disease Detection

1 Xuzhou Institute of Agricultural Sciences in Jiangsu Xuhuai District, Xuzhou 221131, China
2 College of Agronomy, Henan Agricultural University, Zhengzhou 450046, China
* Author to whom correspondence should be addressed.
Remote Sens. 2025, 17(16), 2882; https://doi.org/10.3390/rs17162882
Submission received: 19 May 2025 / Revised: 15 August 2025 / Accepted: 17 August 2025 / Published: 19 August 2025

Abstract

Sweetpotato virus disease (SPVD) poses a significant threat to global sweetpotato production; therefore, early, accurate field-scale detection is necessary. To address the limitations of the currently utilized assays, we propose PLCNet (Plant-Level Classification Network), a rapid, non-destructive SPVD identification framework using UAV-acquired hyperspectral imagery. High-resolution data from early sweetpotato growth stages were processed via three feature selection methods—Random Forest (RF), Minimum Redundancy Maximum Relevance (mRMR), and Local Covariance Matrix (LCM)—in combination with 24 vegetation indices. Variance Inflation Factor (VIF) analysis reduced multicollinearity, yielding an optimized SPVD-sensitive feature set. First, using the RF-selected bands and vegetation indices, we benchmarked four classifiers—Support Vector Machine (SVM), Gradient Boosting Decision Tree (GBDT), Residual Network (ResNet), and 3D Convolutional Neural Network (3D-CNN). Under identical inputs, the 3D-CNN achieved superior performance (OA = 96.55%, Macro F1 = 95.36%, UA_mean = 0.9498, PA_mean = 0.9504), outperforming SVM, GBDT, and ResNet. Second, with the same spectral–spatial features and 3D-CNN backbone, we compared a pixel-level baseline (CropdocNet) against our plant-level PLCNet. CropdocNet exhibited spatial fragmentation and isolated errors, whereas PLCNet’s two-stage pipeline—deep feature extraction followed by connected-component analysis and majority voting—aggregated voxel predictions into coherent whole-plant labels, substantially reducing noise and enhancing biological interpretability. By integrating optimized feature selection, deep learning, and plant-level post-processing, PLCNet delivers a scalable, high-throughput solution for precise SPVD monitoring in agricultural fields.

1. Introduction

Sweetpotato (Ipomoea batatas (L.) Lam.), an herbaceous member of the genus Ipomoea in the family Convolvulaceae, is an important food crop that also serves as an industrial raw material and as fodder. It is easy to cultivate, versatile, and highly productive. Globally, it ranks as the seventh most important staple food; in developing countries, it ranks fifth, after rice, wheat, maize, and cassava. As with other crops, diseases and pests are among the most important factors jeopardizing sweetpotato production, and virus diseases are among the most damaging. Sweetpotato virus disease (SPVD) is caused by co-infection with sweetpotato chlorotic stunt virus (SPCSV) and sweetpotato feathery mottle virus (SPFMV); it can reduce sweetpotato yield by more than 90% and, in severe cases, cause complete crop failure [1]. In addition, the prevalence of virus diseases is an important factor in the decline of sweetpotato quality and in germplasm degradation [2]. Currently, no high-quality virus-resistant varieties exist, and effective control agents are lacking. Therefore, early diagnosis and warning of SPVD, the timely removal of virus-carrying plants, and early prevention of virus transmission are the most effective control measures [3].
Traditional SPVD detection methods mainly include serological detection [4], indicator plant detection [5], and molecular biology detection [6]. Although these methods are highly sensitive, they require specialized equipment and trained personnel, are costly and inefficient, and cannot meet the demands of large-scale field surveys; their accuracy is also affected by operator subjectivity, which can cause the optimal window for prevention and control to be missed [7].
Hyperspectral remote sensing captures subtle spectral variations in plant leaves across the visible, near-infrared, and short-wave infrared regions by collecting hundreds of continuous, narrow-band spectral data points [8]. This high-resolution spectral information reveals minute, disease-induced changes in chlorophyll content, moisture levels, cellular structure, and other physiological and biochemical parameters, providing a theoretical foundation and data support for early disease diagnosis [9,10].
Unmanned Aerial Vehicles (UAVs) equipped with hyperspectral cameras offer key advantages, including high spatial resolution, flexible deployment, and rapid data acquisition over large areas under natural lighting conditions. By conducting air-to-ground remote sensing, UAVs can collect canopy-level spectral data that reflect crop structural attributes (e.g., leaf area index (LAI), biomass) [11], spatial heterogeneity (e.g., pest and disease patches, fertility gradients), and interactions between plants and their environment [12].
Several studies have demonstrated the effectiveness of UAV hyperspectral sensing in crop disease detection. For example, after identifying relevant spectral bands associated with changes in pigment concentration and leaf structure caused by sugarcane mosaic virus, researchers developed the anthocyanin red-edge index, which effectively distinguishes virus-affected areas in sugarcane [13]. Shi et al. proposed an end-to-end deep learning model, CropdocNet, which combines multi-level spectral–spatial features for accurate and automated diagnosis of crop diseases and pests, including sugarcane mosaic virus and potato late blight, using UAV-based hyperspectral imagery [14]. Similarly, Mickey et al. analyzed two grapevine viruses—grapevine leafroll disease (GLD) and Shiraz disease (SD)—in Australian vineyards, identifying their unique spectral signatures and optimal detection windows during the growing season to support targeted disease management [15].
Currently, hyperspectral disease identification techniques have been primarily applied to major crops such as rice [16], maize [17], and wheat [18]. However, there is currently a limited body of research and insufficient mechanistic understanding regarding hyperspectral responses to SPVD. Therefore, investigating UAV-based hyperspectral remote sensing for characterizing the spectral response patterns of SPVD holds significant potential for improving early detection and precise disease monitoring in sweetpotato cultivation.
In recent years, Convolutional Neural Networks (CNNs) have been increasingly applied in hyperspectral crop classification tasks [19,20]. CNNs are capable of learning underlying patterns in data without requiring knowledge of its statistical distribution, and they can extract both linear and nonlinear features without relying on prior domain-specific knowledge [21,22,23]. For instance, Trivedi et al. employed deep-learning-based CNN models to classify normal and abnormal potato leaves affected by fungal infections (e.g., early and late blight) into multiple categories [24]. Zeng et al. developed a multiscale selective-attention CNN (MSA-CNN) model for the early detection of powdery mildew in rubber trees [25]. Similarly, Bhatti et al. introduced a mobile application powered by a CNN model that can identify plant diseases and offer management recommendations, enabling farmers and agricultural practitioners to diagnose crop diseases quickly and accurately [26].
Despite the advantages of deep learning (DL) methods in disease detection, most existing approaches primarily focus on pixel- or leaf-level classification tasks [27,28]. While these methods have shown promising performance under controlled laboratory conditions, they often face significant limitations in real-world field environments. Challenges such as illumination variation between plants, strong background interference, and inconsistent predictions across different regions of the same plant canopy reduce the spatial consistency and reliability of classification results. Compared with existing convolutional neural network frameworks (e.g., ResNet, DenseNet, U-Net), the PLCNet framework proposed here not only provides higher classification accuracy but also achieves stronger spatial consistency and enhanced biological interpretability, making it particularly suitable for efficient disease detection in complex field environments.
To address the limitations of existing methods, this study proposes PLCNet (Plant-Level Classification Network), a plant-level SPVD recognition framework for UAV-based hyperspectral imaging, built upon a 3D Convolutional Neural Network (3D-CNN) (Figure 1). Unlike traditional approaches that classify each pixel independently, PLCNet utilizes high-resolution hyperspectral images captured by UAV-mounted sensors and employs the Random Forest (RF) algorithm for optimal spectral band selection. To address multicollinearity, Variance Inflation Factor (VIF) analysis is performed on the combined RF-selected bands and vegetation indices, ensuring the selection of SPVD-sensitive, non-redundant features. These filtered feature bands are then input into a 3D-CNN, which extracts deep spectral–spatial features for robust and discriminative classification.
To enhance spatial consistency in the resulting classification maps, PLCNet integrates a post-classification refinement module that applies connected component analysis and majority voting, ensuring that each individual plant is assigned a consistent and biologically meaningful label. This comprehensive framework significantly improves the classification accuracy, spatial coherence, and biological interpretability of the method, offering a scalable and practical solution for the early detection of SPVD and high-throughput field-level monitoring.

2. Materials and Methods

2.1. Study Area and Sample Collection

The experimental site for this study was located at the base of the Xuzhou Academy of Agricultural Sciences in the Xuzhou Economic Development Zone, Xuzhou City, Jiangsu Province, China (N34°16′57″–N34°16′59″, E117°17′25″–E117°17′27″, elevation: 36 m). A sweetpotato field within this area was selected for hyperspectral remote sensing data acquisition. Xuzhou’s topography is primarily composed of plains, and it lies within the warm-temperate monsoon climate zone. Spanning east to west, the region exhibits climatic variation influenced by proximity to the ocean. The eastern part of Xuzhou experiences a warm-temperate humid monsoon climate, while the western part is characterized by a warm-temperate semi-humid climate. With four distinct seasons, abundant sunlight, moderate rainfall, and a synchrony of heat and precipitation, the region provides favorable climatic conditions for crop cultivation.
Three widely promoted and representative sweetpotato varieties were selected for the study. Xu Zishu No. 8 (designated FR-1) exhibits mostly crested or shallowly lobed leaves that are dark green in color with a purple halo and purplish-red veins. Xu Shu 37 (FR-2) has predominantly heart-shaped or shallowly lobed leaves that are dark green in color. Shangshu 19 (FR-3) features heart-shaped or shallowly lobed leaves with single notches that are also dark green in appearance. The sweetpotato seedlings used in the experiment were obtained from the Key Laboratory of Sweetpotato Biology and Genetic Breeding of the Ministry of Agriculture and Rural Affairs [29]. The healthy group consisted of virus-free seedlings verified through virus detection. Correspondingly, the infected group included seedlings that only tested positive for the following two viruses: sweetpotato feathery mottle virus (SPFMV) and sweetpotato chlorotic stunt virus (SPCSV). The infected variants were designated as SPVD-1, SPVD-2, and SPVD-3, corresponding to FR-1, FR-2, and FR-3, respectively.
Virus detection was conducted using RT-PCR (Eppendorf AG, Hamburg, Germany), with a TaKaRa MiniBEST Plant RNA Extraction Kit (TaKaRa Bio Dalian Co., Ltd., Dalian, China) [30]. A total of six experimental plots were established (Figure 2), each measuring 10 m × 8 m. Rows were spaced 1 m apart, with a 1 m distance between individual plants, resulting in 80 plants per plot. For each variety, virus-free (healthy) seedlings were planted on the left side of the plot, and SPVD-infected seedlings were planted on the right. To prevent cross-contamination, a 2 m-high insect-proof net and a protective buffer zone were installed between healthy and infected plots. All plots were managed under uniform irrigation and fertilization protocols.
After the acquisition of hyperspectral images via unmanned aerial vehicle (UAV) flights, chlorophyll content was measured using a SPAD-502 portable chlorophyll meter (Konica Minolta, Inc., Tokyo, Japan) to further assess the health and disease status of the sweetpotato plants. Within each plot, three rows of representative sweetpotato plants were selected for measurement. For each plant, chlorophyll readings were taken at three upper leaf positions, avoiding the main veins, using a SPAD instrument. Each leaf was measured three times, and the average value was recorded to ensure accuracy. To maintain statistical reliability, a minimum of six valid chlorophyll measurements were collected from each row.

2.2. UAV Data Collection and Preprocessing

Airborne hyperspectral images were acquired with a DJI Matrice 300 RTK (DJI Innovation, Shenzhen, China), which primarily consists of a flight platform, a flight control system, a motorized gimbal, a ground station control system, and a data export module. The hyperspectral data acquisition system was a Cubert S185 imaging spectrometer (Cubert GmbH, Ulm, Germany; http://cubert-gmbh.com/), which covers a spectral range of 450–998 nm and provides 138 spectral bands at a spectral resolution of 4 nm.
Hyperspectral data were acquired under optimal environmental conditions: clear skies, no cloud cover, and minimal wind. Data collection was conducted between 10:00 a.m. and 12:00 p.m. during the early growth stage of sweetpotato (on 4 August 2024) to minimize spectral interference and maximize disease detection sensitivity. The UAV was flown at an altitude of 12 m, with an 80% image overlap rate and a flight speed of 1 m·s−1, yielding a spatial resolution of 0.005 m. An additional UAV hyperspectral image was acquired on 17 August 2024 for model validation. Early-stage image acquisition was prioritized because of its lower background interference and reduced leaf overlap, which are essential for enhancing the detectability of early disease symptoms and facilitating timely intervention.
During data collection, hyperspectral signals can be affected by external environmental factors or intrinsic sensor noise, often manifesting as fluctuations or spikes in the spectral curve. To improve spectral smoothness, enhance the signal-to-noise ratio, and increase the accuracy of information extraction, spatial-domain smoothing was applied. The Savitzky–Golay (S-G) filter, a low-pass smoothing filter based on local polynomial least-squares fitting, was used for this purpose. The S-G filter is particularly effective in preserving the original shape (e.g., peaks and troughs) and width characteristics of spectral signals, while reducing high-frequency noise. In this study, a window size of 13 and a polynomial order of five were selected for convolutional smoothing, which significantly enhanced the spectral quality [31].
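The smoothing step described above can be reproduced with SciPy's `savgol_filter`; the window size (13) and polynomial order (5) follow the study, while the synthetic red-edge-shaped spectrum and noise level below are illustrative assumptions.

```python
import numpy as np
from scipy.signal import savgol_filter

# Simulated reflectance spectrum for one pixel: 138 bands, 450-998 nm
rng = np.random.default_rng(0)
wavelengths = np.linspace(450, 998, 138)
spectrum = 0.4 / (1 + np.exp(-(wavelengths - 710) / 15)) + 0.05  # red-edge-like curve
noisy = spectrum + rng.normal(0, 0.01, spectrum.shape)           # additive sensor noise

# S-G convolutional smoothing with window 13 and polynomial order 5, as in the study
smoothed = savgol_filter(noisy, window_length=13, polyorder=5)
```

Because the S-G filter fits a local polynomial by least squares, it suppresses high-frequency noise while largely preserving peak and trough shapes such as the green peak and red-edge slope.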
To enable accurate identification of SPVD, regions of interest (ROIs) were manually annotated using ENVI 5.3 (L3Harris Geospatial, Broomfield, CO, USA), guided by field-based ground truth observations and expert knowledge of crop health conditions. The dataset was classified into three categories as follows: healthy plants, diseased plants, and others. Approximately 50,000 pixels were selected per category to ensure balanced class representation and minimize training bias. These ROIs were subsequently converted into labeled maps (ground truth images), where each pixel was assigned a category label corresponding to its class. This labeling process ensured spatial alignment with the original hyperspectral imagery and provided high-quality, pixel-level annotations for supervised learning.
The resulting dataset served as the foundation for the subsequent steps, including sample extraction, feature selection, and model training. By integrating expert-guided annotations with precise UAV hyperspectral imagery, the study established a robust and representative dataset to support plant-level classification of SPVD.

2.3. Feature Selection

Given the high dimensionality and strong inter-band correlations in hyperspectral data, it is crucial to perform feature selection to improve model accuracy and eliminate redundant information. In this study, a two-step feature selection strategy was adopted to effectively reduce data dimensionality and enhance model performance.
In the first step, three widely used feature selection methods—Local Covariance Matrix (LCM), Minimum Redundancy Maximum Relevance (mRMR), and Random Forest (RF)—were employed to evaluate and extract the most informative hyperspectral bands from different perspectives.
(1)
LCM evaluates the discriminative power of each spectral band by calculating the local variance within target regions of interest, involving steps such as local domain construction, covariance matrix computation, and feature screening based on local variability [32].
(2)
mRMR selects features by maximizing relevance with the target variable while minimizing redundancy among features, using an objective function and an incremental search strategy to iteratively select the most informative bands [33].
(3)
RF constructs an ensemble of decision trees and ranks spectral bands based on their contribution to model accuracy. Feature importance scores are calculated, and the top-k-ranked bands are selected as the optimal feature subset [34].
To ensure a fair comparison, the number of selected features was fixed at 30 for each method, and ten-fold cross-validation was used to validate the stability and reliability of the selected features.
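The RF-based ranking and fixed top-30 selection with ten-fold cross-validation can be sketched with scikit-learn as follows; the synthetic 138-band data and the choice of bands 60 and 75 as class-informative are hypothetical stand-ins for the real hyperspectral cube.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Toy stand-in for the hyperspectral data: 500 pixels x 138 bands, 3 classes
rng = np.random.default_rng(42)
X = rng.normal(size=(500, 138))
y = rng.integers(0, 3, size=500)
X[:, 60] += y          # make two bands class-informative (hypothetical)
X[:, 75] += 0.5 * y

# Rank bands by impurity-based feature importance and keep the top 30
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
top_k = 30  # fixed at 30 features per method, as in the study
selected = np.argsort(rf.feature_importances_)[::-1][:top_k]

# Ten-fold cross-validation on the reduced band set to check stability
scores = cross_val_score(
    RandomForestClassifier(n_estimators=200, random_state=0),
    X[:, selected], y, cv=10)
```

The same pattern applies to the mRMR and LCM subsets: each method produces its own 30-band ranking, which is then validated with the identical ten-fold protocol.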
In the second step, Variance Inflation Factor (VIF) analysis was applied to the feature subsets obtained from the first stage to eliminate highly collinear bands and ensure the stability of subsequent model parameters [35]. This two-step process not only reduces the initial feature space, thereby simplifying VIF computation, but also addresses the problem of multicollinearity among features.
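The VIF screening can be sketched in plain NumPy using the identity that, for standardized variables, the VIF values are the diagonal entries of the inverse correlation matrix; the VIF < 10 cut-off and the toy data are illustrative assumptions (the paper does not state its threshold here).

```python
import numpy as np

def vif_scores(X):
    """VIF for each column of X: diagonal of the inverse correlation matrix."""
    corr = np.corrcoef(X, rowvar=False)
    return np.diag(np.linalg.inv(corr))

# Toy feature matrix: column 2 nearly duplicates column 0 (strong collinearity)
rng = np.random.default_rng(1)
a = rng.normal(size=200)
b = rng.normal(size=200)
X = np.column_stack([a, b, a + 0.05 * rng.normal(size=200)])

vifs = vif_scores(X)
keep = [i for i, v in enumerate(vifs) if v < 10]  # assumed VIF < 10 threshold
```

In this sketch the two collinear columns receive very large VIF values and are flagged for removal, while the independent column has a VIF near 1 and is retained.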
Additionally, 24 vegetation indices, grouped into three categories based on their biophysical and biochemical significance, were incorporated into the analysis (Table 1). VIF was used to identify and retain the vegetation indices most sensitive to SPVD, ensuring the robustness of the disease detection models. Through this comprehensive feature selection process, the dimensionality of the hyperspectral dataset is significantly reduced, leading to improved generalization ability, enhanced computational efficiency, and more stable performance of subsequent regression or classification models.

2.4. Modeling Methods

In this study, four commonly used hyperspectral disease classification methods—Support Vector Machine (SVM), Gradient Boosting Decision Tree (GBDT), Residual Network (ResNet), and 3D Convolutional Neural Network (3D-CNN)—were employed for comparative analysis.
SVM is one of the most widely used classification models in remote sensing applications [52]. Its core principle involves projecting low-dimensional feature variables into a higher-dimensional space using kernel functions (e.g., linear, Gaussian), thereby constructing an optimal decision hyperplane that maximizes the margin between different classes.
GBDT is an ensemble learning algorithm based on the Boosting framework, which incrementally improves model performance by combining multiple weak learners. It has shown strong capabilities in handling complex, nonlinear relationships in hyperspectral data [53].
CNNs are a class of deep neural networks involving convolutional operations within a feedforward architecture. They are capable of automatically learning both shallow and deep discriminative features from data [54]. ResNet employs a deep architecture with residual connections, and has been widely used in tasks such as plant disease identification, object detection, and semantic segmentation [55]. By simultaneously processing spatial and spectral dimensions, 3D-CNNs further extend this capability, enabling the extraction of rich spectral–spatial features. To reduce computational complexity while maintaining high classification accuracy, depthwise separable convolutions are used, which significantly reduce the parameter count and training time [56,57,58,59].
In this study, a tailored 3D-CNN architecture was designed with three 3D convolutional layers to increase the number of spatial–spectral feature maps. This ensures that spatial information within different spectral bands is effectively captured without information loss. Each convolutional layer is followed by batch normalization and a ReLU activation function, enhancing training stability and nonlinearity. A hyperbolic tangent activation is applied in the fully connected layer to perform final feature mapping (Figure 1). For model optimization, the Adam optimizer with cosine annealing learning rate scheduling was adopted. The initial learning rate was set to 0.001, with a decay rate of 3% per epoch. An early stopping mechanism was also introduced to prevent overfitting; training was halted if the validation loss did not decrease for five consecutive epochs.
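The tailored 3D-CNN and its optimization setup can be sketched in PyTorch as follows (three Conv3D–BatchNorm–ReLU blocks, a tanh fully connected layer, Adam with cosine annealing); the channel widths, input patch size, hidden width of 64, and `T_max` are assumptions, as the paper does not list them here.

```python
import torch
import torch.nn as nn

class Simple3DCNN(nn.Module):
    """Sketch of the tailored 3D-CNN: three Conv3D-BN-ReLU blocks followed by
    a tanh fully connected layer. Channel widths are illustrative assumptions."""
    def __init__(self, n_bands=25, patch=5, n_classes=3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(1, 8, kernel_size=3, padding=1), nn.BatchNorm3d(8), nn.ReLU(),
            nn.Conv3d(8, 16, kernel_size=3, padding=1), nn.BatchNorm3d(16), nn.ReLU(),
            nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.BatchNorm3d(32), nn.ReLU(),
        )
        self.fc = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * n_bands * patch * patch, 64), nn.Tanh(),
            nn.Linear(64, n_classes),
        )

    def forward(self, x):  # x: (batch, 1, bands, height, width)
        return self.fc(self.features(x))

model = Simple3DCNN()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)   # initial lr = 0.001
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=50)
out = model(torch.randn(2, 1, 25, 5, 5))                    # two sample patches
```

In training, `scheduler.step()` would be called once per epoch, with early stopping triggered after five consecutive epochs without a decrease in validation loss.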
Finally, we also implemented the CropdocNet model as a direct pixel-level baseline. CropdocNet consists of a spectral encoder, with two consecutive Conv3D–BatchNorm3D–ReLU layers (channels 1→8→16, kernel 3 × 3 × 3); a spectral–spatial encoder, with one Conv3D–BatchNorm3D–ReLU layer (16→32, kernel 3 × 3 × 3); adaptive pooling, comprising an AdaptiveAvgPool3d to reshape features to (bands, patch_size, patch_size); and a classifier, composed of a 512-unit fully connected layer and a final output layer (num_classes). It was trained on the same 70/30 split with patch_size = 5, batch_size = 128, Adam (lr = 1 × 10−3), and 20 epochs, without any post-processing. This allows for a fair comparison of raw voxel-level performance against our two-stage PLCNet pipeline.
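The baseline configuration described above can be written down as the following PyTorch sketch; the layer channels, kernel sizes, pooling target, and 512-unit classifier follow the description, while the padding and exact forward wiring are assumptions.

```python
import torch
import torch.nn as nn

class CropdocNetBaseline(nn.Module):
    """Sketch of the CropdocNet pixel-level baseline as described in the text;
    undocumented details (padding, activations in the classifier) are assumed."""
    def __init__(self, n_bands=25, patch_size=5, num_classes=3):
        super().__init__()
        self.spectral = nn.Sequential(              # spectral encoder: 1 -> 8 -> 16
            nn.Conv3d(1, 8, 3, padding=1), nn.BatchNorm3d(8), nn.ReLU(),
            nn.Conv3d(8, 16, 3, padding=1), nn.BatchNorm3d(16), nn.ReLU(),
        )
        self.spectral_spatial = nn.Sequential(      # spectral-spatial encoder: 16 -> 32
            nn.Conv3d(16, 32, 3, padding=1), nn.BatchNorm3d(32), nn.ReLU(),
        )
        self.pool = nn.AdaptiveAvgPool3d((n_bands, patch_size, patch_size))
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * n_bands * patch_size * patch_size, 512), nn.ReLU(),
            nn.Linear(512, num_classes),
        )

    def forward(self, x):  # x: (batch, 1, bands, patch, patch)
        return self.classifier(self.pool(self.spectral_spatial(self.spectral(x))))

logits = CropdocNetBaseline()(torch.randn(4, 1, 25, 5, 5))
```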

2.5. Post-Processing Module

To further improve the spatial consistency of model prediction results and reduce the salt-and-pepper noise caused by local misclassifications, this study introduces a post-processing strategy for whole-plant-level classification, integrating connected component analysis with a majority voting mechanism. This strategy is grounded in the biological reality that virus infections in sweetpotato typically affect the entire plant. As such, the goal is to assign a consistent category label to each individual plant in the classification map, thereby enhancing both the biological validity and the practical reliability of the results.
The post-processing pipeline consists of the following three steps:
(1)
Plant Mask Extraction (PME):
A binary plant mask is generated from the initial classification map by removing non-plant regions, such as soil and shadows. This step ensures that subsequent analysis is restricted to valid plant areas only.
(2)
Connected-Components Labeling:
Within the extracted plant mask, spatially connected regions are identified using 8-neighborhood connectivity analysis. Each connected component corresponds to a single sweetpotato plant or a tightly grouped cluster of plants in the field [60].
(3)
Plant-Level Majority Voting (PLMV):
For each connected region, the predicted category of all pixels is aggregated, and a majority voting rule is applied to determine the dominant class. This class is then assigned uniformly to all pixels within the connected component, ensuring that each plant receives a consistent label [61,62].
By applying this three-step refinement process, the inconsistencies typically observed along plant edges in initial classification maps are significantly reduced. The method improves spatial smoothness, enhances regional coherence, and strengthens the biological interpretability of the model output. Consequently, this approach is highly suited to real-world field applications, enabling more accurate, robust, and scalable identification of SPVD at the plant level.
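The three refinement steps can be sketched with `scipy.ndimage`; the toy classification map and class codes below are illustrative, not the study's data.

```python
import numpy as np
from scipy import ndimage

# Toy classification map: 0 = background, 1 = healthy, 2 = diseased
class_map = np.array([
    [0, 1, 1, 0, 0, 0],
    [0, 1, 2, 1, 0, 0],   # isolated "diseased" pixel inside a healthy plant
    [0, 1, 1, 0, 0, 2],
    [0, 0, 0, 0, 2, 2],
])

# Step 1: plant mask extraction (drop non-plant pixels such as soil/shadow)
mask = class_map > 0

# Step 2: connected-components labelling with 8-neighbourhood connectivity
labels, n_components = ndimage.label(mask, structure=np.ones((3, 3)))

# Step 3: plant-level majority voting within each connected component
refined = class_map.copy()
for comp in range(1, n_components + 1):
    pixels = class_map[labels == comp]
    refined[labels == comp] = np.bincount(pixels).argmax()
```

In this example the stray "diseased" pixel inside the healthy plant is overwritten by the component's majority label, while the genuinely diseased plant keeps its label, illustrating how the voting suppresses salt-and-pepper noise without erasing true detections.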

2.6. Model Performance Evaluation

In this study, the dataset was divided into a training set and a testing set at a ratio of 7:3, ensuring that the class distribution remained consistent across both sets. To comprehensively evaluate the performance of the model in hyperspectral remote sensing classification tasks, three commonly used classification performance metrics were employed as follows:
Overall Accuracy (OA): OA refers to the proportion of correctly classified samples among all test samples. It is the most straightforward indicator for assessing the overall classification performance of a model. The calculation formula is
\( \mathrm{OA} = \dfrac{N_{\text{correct}}}{N_{\text{total}}} \)
where \(N_{\text{correct}}\) is the number of correctly classified samples, and \(N_{\text{total}}\) is the total number of samples.
The F1-score is the harmonic mean of Precision and Recall, balancing both completeness and exactness. It is particularly suitable for situations where there is a class imbalance.
\( F1 = \dfrac{2 \cdot \text{Precision} \cdot \text{Recall}}{\text{Precision} + \text{Recall}} \)
Macro F1-Score: the Macro F1-Score is the arithmetic mean of the F1-scores across all C classes:
\( \text{Macro-}F1 = \dfrac{1}{C}\displaystyle\sum_{i=1}^{C} F1_i \)
where \(F1_i\) is the F1-score for class \(i\), computed by treating class \(i\) as the positive class and all other classes as negative. This macro-averaging ensures that each class contributes equally to the final metric, which is particularly useful in imbalanced class scenarios.
Mean User’s Accuracy (UA_mean): UA_mean is the average of the user’s accuracy (precision) calculated for each class:
\( \mathrm{UA}_{\text{mean}} = \dfrac{1}{C}\displaystyle\sum_{i=1}^{C} \dfrac{TP_i}{TP_i + FP_i} \)
where \(TP_i\) and \(FP_i\) represent the true positives and false positives for class \(i\), respectively.
Mean Producer’s Accuracy (PA_mean): PA_mean is the average of the producer’s accuracy (recall) calculated for each class:
\( \mathrm{PA}_{\text{mean}} = \dfrac{1}{C}\displaystyle\sum_{i=1}^{C} \dfrac{TP_i}{TP_i + FN_i} \)
where \(FN_i\) is the number of false negatives for class \(i\).
In this study, the Kappa coefficient was not used due to its known limitations in remote sensing accuracy assessment [63]. Instead, UA_mean and PA_mean were included as they provide more interpretable, class-specific accuracy information that complements OA and F1.
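These four metrics can be computed directly from a confusion matrix, as in the following sketch; this mirrors the formulas above and is not the authors' code.

```python
import numpy as np

def evaluation_metrics(y_true, y_pred, n_classes):
    """OA, Macro F1, UA_mean (mean per-class precision), and PA_mean
    (mean per-class recall), computed from a confusion matrix."""
    cm = np.zeros((n_classes, n_classes), dtype=int)
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1                  # rows = true class, columns = predicted class
    tp = np.diag(cm).astype(float)
    ua = tp / cm.sum(axis=0)           # user's accuracy (precision) per class
    pa = tp / cm.sum(axis=1)           # producer's accuracy (recall) per class
    f1 = 2 * ua * pa / (ua + pa)
    return {"OA": tp.sum() / cm.sum(), "MacroF1": f1.mean(),
            "UA_mean": ua.mean(), "PA_mean": pa.mean()}

metrics = evaluation_metrics([0, 0, 1, 1, 2, 2], [0, 0, 1, 2, 2, 2], 3)
```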

3. Results

3.1. Spectral Characteristics and SPAD of Healthy and Diseased Sweetpotato

To compare the spectral differences between healthy and diseased sweetpotato plants, the normalized average spectra of diseased pixels relative to healthy pixels were plotted for each plot (Figure 3). Generally, a distinct reflectance peak was observed at around 550 nm, followed by a plateau beginning near 770 nm. In the visible region (450–690 nm), the spectral reflectance of diseased sweetpotato leaves across all three varieties was consistently higher than that of healthy leaves. This region exhibited a characteristic pattern of increasing and then decreasing reflectance, with a pronounced “green peak” near 550 nm and a “red valley” around 690 nm.
In the near-infrared region (700–980 nm), a sharp rise in reflectance was observed. Healthy sweetpotato plants of all three varieties showed a more pronounced increase in reflectance compared to SPVD-infected plants. Notably, after 730 nm, the spectral reflectance of diseased leaves was consistently lower than that of healthy leaves, indicating a clear spectral distinction associated with disease presence.
This box plot (Figure 4) illustrates the distribution of leaf chlorophyll content (SPAD value) across three sweetpotato varieties (FR-1, FR-2, FR-3) under healthy and SPVD-infected conditions. The key finding is that SPVD induces significant chlorophyll loss. Both the median and mean SPAD values of all infected groups (SPVD-1/2/3) are substantially lower than those of their corresponding healthy groups (FR-1/2/3), confirming severe pigment degradation following viral infection. There were varietal differences in disease resistance; for FR-1 (susceptible), SPVD-1 shows the greatest decline in chlorophyll, with a median SPAD of ~28. For FR-2 (moderately resistant), SPVD-2 has a median SPAD of ~32. For FR-3 (highly resistant), even when infected (SPVD-3), the median SPAD remains at ~35, near the level of healthy FR-1, while healthy FR-3 exhibits the highest median (~38), underscoring its intrinsic physiological advantage. These results not only demonstrate SPVD’s detrimental impact on chlorophyll content, but also provide a strong rationale for integrating chlorophyll-related vegetation indices into hyperspectral models to enhance their accuracy and interpretability.

3.2. Feature Selection for SPVD

To identify the most informative spectral bands for SPVD classification, we compared three widely used feature selection methods—Local Covariance Matrix (LCM), Minimum Redundancy–Maximum Relevance (mRMR), and Random Forest (RF). As summarized in Table 1, each method produced a distinct subset of wavelength variables.
LCM was concentrated in the 790–926 nm region, which is sensitive to plant structural and physiological changes. mRMR yielded a broader distribution spanning both visible and near-infrared bands, notably around 674, 678, and 686 nm, displaying key chlorophyll absorption features. RF selected bands across the full spectrum, with emphasis on the 650–750 nm red-edge region and other chlorophyll-sensitive wavelengths.
To reduce multicollinearity and enhance robustness, we applied Variance Inflation Factor (VIF) analysis to each subset, removing highly collinear bands and yielding the following stable sets (Table 2).
These filtered wavelengths minimize redundancy, improving model interpretability and stability. Among the three approaches, RF’s impurity-based ranking criterion delivered the best model performance (OA = 91.36%); therefore, its selected bands were adopted for all subsequent model training (Table 3).
To further enhance the biological interpretability and classification performance of the model, 24 vegetation indices commonly associated with chlorophyll content, carotenoids, and plant stress were initially evaluated. VIF analysis was then applied to remove redundant and highly collinear indices, resulting in a subset of five optimal indices closely related to SPVD (Table 4).
Among the selected indices, three—PSSRb, DATT, and MDATT—are associated with chlorophyll content, reflecting the decline in chlorophyll commonly observed in virus-infected leaves. One index, the PRI (Photochemical Reflectance Index), is indicative of carotenoid activity and photosynthetic efficiency, often altered under stress conditions. The final index, ND800,530, is a normalized difference index that combines the near-infrared and visible bands to capture structural and pigment-related changes in vegetation.
These SPVD-sensitive indices were subsequently incorporated into the feature sets of the LCM, mRMR, and RF selection methods to build classification models. The integration of spectral features with biologically relevant vegetation indices is expected to improve the robustness and physiological interpretability of the disease classification models.

3.3. Evaluating the Effect of Feature Selection Methods and Classifiers on Recognition Performance

To evaluate the impact of different feature selection methods on classification performance, we tested three feature selection + vegetation index (FS + VIs) combinations (LCM + VIs, mRMR + VIs, RF + VIs) across four classifiers: SVM, GBDT, ResNet, and 3D-CNN. The results are summarized in Table 5.
Among the four classifiers, the 3D-CNN consistently achieved the highest accuracy, with the best configuration—RF + VIs + 3D-CNN—reaching OA = 96.55%, Macro F1 = 95.36%, UA_mean = 0.9498, and PA_mean = 0.9504. ResNet outperformed the two traditional models but fell slightly short of the 3D-CNN, achieving OA = 93.67%, F1 = 92.96%, UA_mean = 0.9351, and PA_mean = 0.9303 under RF + VIs. Both SVM and GBDT yielded similar results with LCM + VIs or mRMR + VIs (OA ≈ 91.3–91.5%, F1 ≈ 90.7–91.1%, UA_mean ≈ 0.9133–0.9333, PA_mean ≈ 0.9093–0.9299), demonstrating that while LCM and mRMR extract informative features, they cannot capture complex nonlinear spectral–spatial relationships as effectively as deep networks.
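The 3D-CNN's advantage stems from its convolutions mixing neighboring bands and neighboring pixels simultaneously. A minimal numpy sketch of one such spectral–spatial convolution makes this concrete; it is illustrative only, since the actual network stacks many learned filters with nonlinearities and pooling.

```python
import numpy as np

def conv3d_valid(cube, kernel):
    """Single-filter 3-D convolution (cross-correlation), 'valid' mode.

    cube:   (bands, rows, cols) hyperspectral patch
    kernel: (kb, kr, kc) spectral-spatial filter
    Every output voxel mixes neighboring bands AND neighboring pixels,
    which is what lets a 3D-CNN learn joint spectral-spatial features
    that purely 2-D models cannot.
    """
    B, R, C = cube.shape
    kb, kr, kc = kernel.shape
    out = np.zeros((B - kb + 1, R - kr + 1, C - kc + 1))
    for b in range(out.shape[0]):
        for i in range(out.shape[1]):
            for j in range(out.shape[2]):
                out[b, i, j] = np.sum(cube[b:b+kb, i:i+kr, j:j+kc] * kernel)
    return out
```

In a real framework this triple loop is a single `Conv3d` layer call; the sketch only exposes the indexing that the layer performs.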
Notably, RF-based feature selection consistently outperformed LCM and mRMR for all classifiers—especially when paired with deep backbones—indicating that RF is more adept at identifying SPVD-sensitive bands and indices.
Overall, these findings highlight that (1) feature selection quality critically influences downstream performance; (2) deep learning models—particularly our 3D-CNN—excel at hyperspectral representation; and (3) the combination of RF-selected features with a deep 3D-CNN backbone offers the most robust and generalizable pipeline for SPVD detection.
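The RF band-ranking step itself is straightforward: train a forest on labelled spectra and rank bands by mean decrease in impurity. The sketch below uses the scikit-learn API on synthetic data; the band indices, sample counts, and hyperparameters are illustrative assumptions, not those of the study.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)

# Synthetic stand-in for a labelled spectral dataset: 300 samples x 20
# "bands", where only bands 3 and 7 actually carry class signal.
X = rng.normal(size=(300, 20))
y = (X[:, 3] + X[:, 7] > 0).astype(int)

# Impurity-based (mean-decrease-in-impurity) ranking, the criterion that
# RF feature selection relies on.
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
ranking = np.argsort(rf.feature_importances_)[::-1]
top_bands = sorted(ranking[:2].tolist())
print(top_bands)  # the two informative bands should rank first
```

In practice the top-ranked wavelengths are then passed through the VIF filter before model training, as described above.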

3.4. Classification Performance of Hyperspectral Images of SPVD

To assess the spatial distribution of sweetpotato virus infections within the study area, selected characteristic spectral bands were used as input features for classification using the 3D-CNN model. The resulting classification maps of SPVD, based on hyperspectral imagery, are shown in Figure 5a–f. Overall, the healthy sweetpotato plants from all three varieties were accurately identified, with classification results largely aligning with ground truth observations. However, a few instances of misclassification were observed, particularly in Figure 5d; these may be attributed to spectral interference from non-plant elements such as shadows, senescent (yellow) leaves, or overlapping vegetation.
Figure 5g–l presents the classification results for virus-infected sweetpotato plants located within the central “protected rows.” The majority of these areas were accurately identified, and notably, the healthy plants in the protected rows were not misclassified, despite their proximity to infected plants. This suggests that the model effectively distinguished between diseased and healthy plants under spatially constrained conditions. Nonetheless, minor misclassifications occurred in a few plots, specifically in Figure 5k,l, which may be attributed to the low infection severity in certain diseased plants. Such cases can result in spectral signatures that closely resemble those of healthy plants, thus reducing the model’s discriminative capability.
Figure 6 qualitatively compares the pixel-level CropdocNet output with the plant-level PLCNet result on both a healthy FR-2 patch and an SPVD-3 infected patch.
Healthy FR-2 (Figure 6a–f): CropdocNet (Figure 6a–c) correctly identifies the majority of the healthy canopy, but leaves small holes within the mask and produces isolated green speckles on the bare soil. PLCNet (Figure 6d–f), after connected-component filtering and majority voting, yields a continuous, gap-free canopy outline that faithfully matches the true plant boundary and virtually eliminates stray soil pixels.
Infected SPVD-3 (Figure 6g–l): CropdocNet (Figure 6g–i) correctly delineates the general infection area but leaves small “holes” within lesions—misclassifying a few diseased pixels as healthy—and produces fragmented gaps inside individual infected plants. PLCNet (Figure 6j–l) fills these gaps via connected-component analysis and majority voting, resulting in solid, contiguous infection masks that better reflect the true extent of disease.
Overall, PLCNet’s post-processing markedly enhances spatial coherence, removing “salt-and-pepper” noise and consolidating fragmented predictions while preserving the accurate delineation of both healthy and diseased tissue in field conditions.
To further evaluate model robustness over time, hyperspectral images were acquired again 13 days later, and the trained model was used to generate a new classification map, as shown in Figure 7. The results indicate the high temporal stability of the model, with only two instances of misclassification in the healthy FR-1 plots (Figure 7f), which were potentially caused by weeds or fallen leaves being incorrectly identified as diseased sweetpotato. However, the model showed a slight increase in false negatives during this later growth stage. This may be explained by tissue regeneration and new biomass accumulation in plants that were initially infected but showed reduced visible symptoms, causing attenuated spectral signals and making detection more difficult (Figure 7k,l).

4. Discussion

4.1. Challenges in Hyperspectral Remote Sensing Diagnosis of SPVD

Remote-sensing-based diagnosis of SPVD faces significant technical and theoretical challenges, arising both from inherent complexities in virus detection and from the unique agronomic characteristics of sweetpotato crops. Unlike diseases caused by fungi or bacteria, plant viral diseases often involve simultaneous infection by multiple virus species, resulting in heterogeneous spectral responses that complicate the extraction of stable and discriminative spectral features [64,65]. Direct quantification of virus loads under field conditions also remains challenging, limiting the ability to establish precise quantitative relationships between remote sensing signals and viral infection severity. Additionally, observed spectral variations in infected plants largely reflect indirect physiological responses, such as reduced chlorophyll content, which can also be influenced by non-viral environmental stressors, reducing the specificity of spectral-based disease models [66,67,68].
These challenges are particularly pronounced under the dense planting conditions typical of sweetpotato cultivation. The multi-layered canopy structure can induce significant spectral mixing effects, diminishing the sensitivity of disease-specific spectral bands. Furthermore, early-stage SPVD symptoms are often subtle and obscured within the canopy, restricting the reliable detection of initial infection stages. Compared to structurally simpler crops like rice or wheat, SPVD diagnosis necessitates enhanced spatial–spectral coupling across multiple scales and requires robust classification models capable of mitigating background interference.
The transient masking of SPVD symptoms can be further understood by considering the vascular-confined spread of SPFMV and SPCSV, and sweetpotato’s inherent source–sink transport dynamics [2,65]. Both viruses colonize the plant’s phloem sieve tubes and move systemically through the nutrient-transport network. Older leaves and stem tissues, which act as primary “source” organs, typically accumulate higher viral titers earlier, displaying chlorosis, leaf curling, and mosaic symptoms. Conversely, newly emerging “sink” tissues (such as apical meristems and unfolding young leaves) initially remain less exposed to high viral concentrations due to their role in drawing photoassimilates rather than exporting nutrients. During periods of vigorous plant growth, these sink tissues may develop with relatively low viral loads and exhibit nearly normal spectral signatures, temporarily masking symptoms and generating false negatives in hyperspectral classifications. Over time, as the virus progressively invades these sink tissues, symptoms become apparent again, but the initial growth phase offers a critical window during which symptom expression is significantly attenuated.
To effectively overcome these challenges, hyperspectral remote-sensing-based detection of SPVD should integrate synergistic optimization across algorithm development, targeted data acquisition strategies, and deeper elucidation of underlying phenotypic mechanisms. This comprehensive, multidisciplinary approach is essential for substantially improving the accuracy, stability, and generalization capabilities of SPVD identification models in practical agricultural settings.

4.2. Spectral Response Mechanisms of SPVD-Infected Sweetpotato Leaves

SPVD-infected sweetpotato leaves commonly exhibit yellowing, vein clearing, leaf curling, and plant dwarfing (Figure 4), symptoms typically accompanied by a reduction in chlorophyll content. The spectral characteristics observed in Figure 3 are highly consistent with findings from previous hyperspectral near-surface remote sensing studies [15,25,69]. This consistency underscores the reliability of low-altitude UAV-based hyperspectral remote sensing for detecting sweetpotato viral diseases. It also indicates that SPVD-infected and healthy plants exhibit significant spectral differences across multiple wavelength regions, which can be leveraged to differentiate between healthy and diseased sweetpotato plants.
Notably, across all SPVD-infected samples, higher reflectance was observed in the red-edge region (700–740 nm), a spectral range known to be highly sensitive to plant stress responses [70,71]. In this study, the elevated red-edge reflectance is likely attributable to physiological stress caused by viral infection. In contrast, spectral variations in other regions were more variety-specific. In the visible region (450–700 nm), the selected wavelengths were strongly correlated with the primary absorption bands of chlorophyll a and b. SPVD-induced chlorophyll degradation resulted in noticeable increases in reflectance, particularly in the red region (approximately 660–690 nm), where weakened absorption indicates reduced photosynthetic efficiency in diseased plants [2]. These reflectance changes serve as sensitive indicators of virus-induced photosynthetic inhibition.
The red-edge region, representing the transition from red to near-infrared reflectance, is particularly sensitive to chlorophyll concentration and cellular structural changes. Alterations in the slope of the red-edge curve—or the observed “blue shift” of its position—reflect leaf health deterioration, making it an essential spectral feature for early-stage viral disease detection [72,73,74,75]. In this study, red-edge wavelengths were repeatedly selected by multiple feature selection methods, highlighting their robustness and strong biological interpretability, which serve as a solid foundation for subsequent modeling and variable integration.
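The red-edge "blue shift" described above can be quantified with the maximum-first-derivative definition of the red-edge position (REP), one of several common formulations; the sketch below is an illustration of the concept, not the feature computed in this study.

```python
import numpy as np

def red_edge_position(wavelengths, reflectance, lo=680.0, hi=750.0):
    """Red-edge position (REP): wavelength of the maximum first derivative
    of reflectance within [lo, hi] nm. A stress-induced 'blue shift'
    appears as a decrease in the returned wavelength."""
    mask = (wavelengths >= lo) & (wavelengths <= hi)
    w, r = wavelengths[mask], reflectance[mask]
    d = np.gradient(r, w)  # dR/dlambda; handles uneven band spacing
    return w[np.argmax(d)]
```

For a sigmoid-shaped red edge, the REP tracks the inflection point, so comparing REP values between healthy and stressed spectra exposes the blue shift directly.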
In the near-infrared region (750–950 nm), reflectance is primarily influenced by leaf cellular structure, tissue density, and water content. Following SPVD infection, tissue degradation and moisture loss led to a general decline in reflectance within this region, providing additional evidence of physiological stress and deterioration.
Collectively, these spectral bands reflect the distinct physiological and biochemical changes between healthy and SPVD-infected sweetpotato plants. The spectral response patterns observed are largely consistent with prior research on hyperspectral reflectance in plant disease detection [50,51]. These findings support the conclusion that hyperspectral technology can effectively enable the indirect, yet efficient, identification of SPVD by capturing photochemical signals indicative of plant health status.

4.3. From Pixel to Plant: A Post-Processing Strategy for Practical SPVD Monitoring

UAV-based hyperspectral screening offers non-destructive, high-throughput, and cost-effective advantages over PCR for large-scale SPVD surveillance. Our model achieves OA = 96.55% and Macro F1 = 95.36%, yet field-scale imagery still produces “salt-and-pepper” noise due to canopy shading, leaf-edge regions, and soil background.
To illustrate this issue, we compared the pixel-level CropdocNet with our plant-level PLCNet. CropdocNet’s dual-branch design fuses spectral and spatial features only at a shallow layer and is limited in depth, so it often leaves isolated false positives and small gaps within true canopy regions. In contrast, PLCNet’s deep 3D-CNN backbone extracts multi-layer spectral–spatial representations, and its connected-component post-processing removes scattered noise and enforces whole-plant consistency.
To mitigate residual misclassifications, we employed the following two-stage strategy:
Pixel-level deep feature extraction: a volumetric 3D-CNN captures localized disease signals (e.g., red-edge shift, NIR suppression).
Plant-level post-processing: connected-component analysis and majority voting aggregate voxel predictions into coherent plant-level labels.
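The two steps above can be sketched with scipy’s connected-component labelling followed by a per-component vote. The 8-connectivity and the minimum-component-size threshold below are assumptions made for illustration; PLCNet’s exact parameters may differ.

```python
import numpy as np
from scipy import ndimage

def plant_level_vote(pixel_labels, background=0, min_size=5):
    """Aggregate per-pixel class predictions into whole-plant labels.

    1. Connected-component analysis groups non-background pixels into
       candidate plants (8-connectivity).
    2. Components smaller than `min_size` pixels are treated as
       salt-and-pepper noise and reset to background.
    3. Majority voting assigns each remaining component the most
       frequent class among its pixels.
    """
    fg = pixel_labels != background
    comps, n = ndimage.label(fg, structure=np.ones((3, 3), dtype=int))
    out = np.full_like(pixel_labels, background)
    for cid in range(1, n + 1):
        mask = comps == cid
        if mask.sum() < min_size:
            continue                 # drop speckle noise
        votes = np.bincount(pixel_labels[mask])
        votes[background] = 0        # background never wins a vote
        out[mask] = np.argmax(votes)
    return out
```

On a classification map, this converts fragmented per-pixel output into contiguous single-class plant masks, which is exactly the qualitative difference visible between the CropdocNet and PLCNet panels in Figure 6.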
This hybrid pipeline balances model complexity with field practicality, yielding cleaner, biologically plausible infection maps that support early warning, high-risk area identification, and targeted sampling for optimized disease management.
In this study, the Kappa coefficient was excluded due to its well-documented limitations in remote sensing accuracy assessment, particularly its potential to misrepresent classification performance in imbalanced datasets [63]. Instead, UA_mean and PA_mean were included as they provide more interpretable, class-specific accuracy information, enabling a more comprehensive evaluation of classification reliability across all classes.
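For completeness, OA, UA_mean, and PA_mean all follow directly from the confusion matrix. The sketch below assumes rows index the reference class and columns the predicted class; that orientation is a convention of this illustration.

```python
import numpy as np

def accuracy_report(cm):
    """Per-class User's and Producer's Accuracy from a confusion matrix.

    cm[i, j] = count of samples with reference class i predicted as j.
    UA (user's accuracy)     = precision: correct / total predicted as class
    PA (producer's accuracy) = recall:    correct / total reference in class
    """
    cm = np.asarray(cm, dtype=float)
    diag = np.diag(cm)
    ua = diag / cm.sum(axis=0)   # column sums = predicted totals
    pa = diag / cm.sum(axis=1)   # row sums = reference totals
    oa = diag.sum() / cm.sum()
    return {"OA": oa, "UA": ua, "PA": pa,
            "UA_mean": ua.mean(), "PA_mean": pa.mean()}
```

Unlike Kappa, the per-class UA and PA vectors expose exactly which classes drive errors, which is why their means were reported here.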

4.4. Toward More Robust and Scalable SPVD Detection

We will partner with agricultural institutions across multiple regions to build a standardized, open-access SPVD hyperspectral dataset covering diverse infection stages, virus combinations, and environmental gradients. By adopting uniform sampling protocols and metadata standards and including additional varieties beyond the three leaf-morphology/susceptibility types already studied, this multi-center resource will support cross-domain transfer and mitigate geographic bias.
To improve both accuracy and interpretability, we will incorporate ground-truth physiological measurements (chlorophyll, leaf nitrogen, water content) as auxiliary input channels or regularization terms. Simultaneously, we will deploy a two-stage classification pipeline (“coarse segmentation → fine refinement”) augmented by ensemble voting and uncertainty quantification, and leverage semi-supervised techniques (consistency regularization, pseudo-labeling) to exploit unlabeled field data and reduce annotation costs.
Building on PLCNet’s demonstrated efficiency, we will systematically evaluate deeper residual backbones (e.g., ResNet3D-50) and transformer-based models (hyperspectral ViT) under appropriate data and computation conditions to assess their spectral–spatial representation potential and trade-offs between complexity and performance.

5. Conclusions

This study introduced PLCNet, an innovative framework designed for the rapid, non-destructive identification of SPVD using UAV-acquired hyperspectral imagery. PLCNet uniquely addresses SPVD detection by integrating optimized spectral feature selection, deep learning, and a plant-level post-processing pipeline. Unlike conventional pixel-level methods, PLCNet treats each sweetpotato plant as a cohesive spatial unit, effectively capturing whole-plant disease symptomology.
Benchmark evaluations showed that PLCNet achieved superior classification performance (OA = 96.55%, Macro F1 = 95.36%, UA_mean = 0.9498, PA_mean = 0.9504), outperforming traditional classifiers (SVM, GBDT) and simpler CNN architectures (ResNet, CropdocNet). The inclusion of connected-component analysis and a majority voting post-processing module significantly reduced classification noise, improving the spatial coherence and biological interpretability of classification maps.
The demonstrated effectiveness, high throughput, and operational practicality of PLCNet underline its substantial potential for the large-scale precision monitoring and management of SPVD. By combining deep spectral–spatial feature extraction with robust plant-level refinement, the proposed framework provides a scalable, cost-effective, and accurate solution suitable for deployment across diverse agricultural production regions. Future research directions include expanding multi-regional datasets, incorporating physiological parameters, exploring advanced model architectures, and integrating semi-supervised techniques to further enhance the robustness and generalization capabilities of SPVD detection systems.

Author Contributions

Conceptualization, Q.Z., W.W. and G.Y.; Data Collection, Q.Z., W.W., H.S., X.G. and H.H.; Methodology, H.S., G.Y. and Q.Z.; Validation, G.Y., W.W. and Q.Z.; Writing—original draft, Q.Z. and Z.X.; Writing—review and editing, Q.Z., Q.C., G.Y., H.S., Z.X. and J.X.; funding acquisition, Z.X. and Q.Z. All authors have read and agreed to the published version of the manuscript.

Funding

Support for this research was provided by the Xuzhou Science and Technology Program under ID number KC23127 and the Scientific Research Fund of Xuzhou Academy of Agricultural Sciences under ID number XM2023008.

Data Availability Statement

All datasets presented in this study are available upon request from the corresponding author.

Acknowledgments

The authors would like to thank Chen Li and Yuan Yi for supporting the field campaigns. The authors would also like to thank the reviewers, whose comments and suggestions were helpful in improving the quality of this manuscript.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Adero, J.; Wokorach, G.; Stomeo, F.; Yao, N.; Machuka, E.; Njuguna, J.; Byarugaba, D.K.; Kreuze, J.; Yencho, G.C.; Otema, M.A.; et al. Next Generation Sequencing and Genetic Analyses Reveal Factors Driving Evolution of Sweetpotato Viruses in Uganda. Pathogens 2024, 13, 833. [Google Scholar] [CrossRef]
  2. Zhang, K.; Lu, H.; Wan, C.; Tang, D.; Zhao, Y.; Luo, K.; Li, S.; Wang, J. The Spread and Transmission of Sweet Potato Virus Disease (SPVD) and Its Effect on the Gene Expression Profile in Sweet Potato. Plants 2020, 9, 492. [Google Scholar] [CrossRef]
  3. Fang, D.; Fan, Z.C. Research Progress and Prospects on Control Measures of Sweet Potato Virus Diseases. Crops 2016, 3, 6–11. [Google Scholar] [CrossRef]
  4. Karyeija, R.F.; Kreuze, J.F.; Gibson, R.W.; Valkonen, J.P.T. Two Serotypes of Sweetpotato Feathery Mottle Virus in Uganda and Their Interaction with Resistant Sweetpotato Cultivars. Phytopathology 2000, 90, 1250–1255. [Google Scholar] [CrossRef]
  5. He, Y.; Chen, Z.; Li, Y.; He, M.; Zhang, X.; Zhi, S.; Shen, W.; Qin, S.; Zhang, K.; Ni, Q. Research Progress on Virus Elimination Techniques for Sweet Potato. J. Chang. Veg. 2018, 8, 36–39. [Google Scholar]
  6. Sun, Z.; Gong, Y.; Zhao, L.; Shi, J.; Mao, B. Advances in Researches on Molecular Biology of SPVD. J. Nucl. Agric. Sci. 2020, 34, 71–77. [Google Scholar] [CrossRef]
  7. Zeng, F.; Ding, Z.; Song, Q.; Xiao, J.; Zheng, J.; Li, H.; Luo, Z.; Wang, Z.; Yue, X.; Huang, L. Feasibility of Detecting Sweet Potato (Ipomoea Batatas) Virus Disease from High-Resolution Imagery in the Field Using a Deep Learning Framework. Agronomy 2023, 13, 2801. [Google Scholar] [CrossRef]
  8. Sarkar, A.; Nandi, U.; Kumar Sarkar, N.; Changdar, C.; Paul, B. Deep Learning Based Hyperspectral Image Classification: A Review For Future Enhancement. Int. J. Comput. Digit. Syst. 2024, 15, 419–435. [Google Scholar] [CrossRef] [PubMed]
  9. Wan, L.; Li, H.; Li, C.; Wang, A.; Yang, Y.; Wang, P. Hyperspectral Sensing of Plant Diseases: Principle and Methods. Agronomy 2022, 12, 1451. [Google Scholar] [CrossRef]
  10. Lowe, A.; Harrison, N.; French, A.P. Hyperspectral Image Analysis Techniques for the Detection and Classification of the Early Onset of Plant Disease and Stress. Plant Methods 2017, 13, 80. [Google Scholar] [CrossRef]
  11. Zhang, J.; Cheng, T.; Guo, W.; Xu, X.; Qiao, H.; Xie, Y.; Ma, X. Leaf Area Index Estimation Model for UAV Image Hyperspectral Data Based on Wavelength Variable Selection and Machine Learning Methods. Plant Methods 2021, 17, 49. [Google Scholar] [CrossRef]
  12. Adão, T.; Hruška, J.; Pádua, L.; Bessa, J.; Peres, E.; Morais, R.; Sousa, J. Hyperspectral Imaging: A Review on UAV-Based Sensors, Data Processing and Applications for Agriculture and Forestry. Remote Sens. 2017, 9, 1110. [Google Scholar] [CrossRef]
  13. Moriya, É.A.S.; Imai, N.N.; Tommaselli, A.M.G.; Honkavaara, E.; Rosalen, D.L. Design of Vegetation Index for Identifying the Mosaic Virus in Sugarcane Plantation: A Brazilian Case Study. Agronomy 2023, 13, 1542. [Google Scholar] [CrossRef]
  14. Shi, Y.; Han, L.; Kleerekoper, A.; Chang, S.; Hu, T. Novel CropdocNet Model for Automated Potato Late Blight Disease Detection from Unmanned Aerial Vehicle-Based Hyperspectral Imagery. Remote Sens. 2022, 14, 396. [Google Scholar] [CrossRef]
  15. Mickey Wang, Y.; Ostendorf, B.; Pagay, V. Evaluating the Potential of High-Resolution Hyperspectral UAV Imagery for Grapevine Viral Disease Detection in Australian Vineyards. Int. J. Appl. Earth Obs. Geoinf. 2024, 130, 103876. [Google Scholar] [CrossRef]
  16. Wang, Y.; Xing, M.; Zhang, H.; He, B.; Zhang, Y. Rice False Smut Monitoring Based on Band Selection of UAV Hyperspectral Data. Remote Sens. 2023, 15, 2961. [Google Scholar] [CrossRef]
  17. Gao, J.; Ding, M.; Sun, Q.; Dong, J.; Wang, H.; Ma, Z. Classification of Southern Corn Rust Severity Based on Leaf-Level Hyperspectral Data Collected under Solar Illumination. Remote Sens. 2022, 14, 2551. [Google Scholar] [CrossRef]
  18. Deng, J.; Zhang, X.; Yang, Z.; Zhou, C.; Wang, R.; Zhang, K.; Lv, X.; Yang, L.; Wang, Z.; Li, P.; et al. Pixel-Level Regression for UAV Hyperspectral Images: Deep Learning-Based Quantitative Inverse of Wheat Stripe Rust Disease Index. Comput. Electron. Agric. 2023, 215, 108434. [Google Scholar] [CrossRef]
  19. Zhang, E.; Zhang, J.; Bai, J.; Bian, J.; Fang, S.; Zhan, T.; Feng, M. Attention-Embedded Triple-Fusion Branch CNN for Hyperspectral Image Classification. Remote Sens. 2023, 15, 2150. [Google Scholar] [CrossRef]
  20. Datta, D.; Mallick, P.K.; Gupta, D.; Chae, G.-S. Hyperspectral Image Classification Based on Novel Hybridization of Spatial-Spectral-Superpixelwise Principal Component Analysis and Dense 2D-3D Convolutional Neural Network Fusion Architecture. Can. J. Remote Sens. 2022, 48, 663–680. [Google Scholar] [CrossRef]
  21. Gao, H.; Chen, Z.; Li, C. Sandwich Convolutional Neural Network for Hyperspectral Image Classification Using Spectral Feature Enhancement. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2021, 14, 3006–3015. [Google Scholar] [CrossRef]
  22. Xue, Z.; Yu, X.; Liu, B.; Tan, X.; Wei, X. HResNetAM: Hierarchical Residual Network With Attention Mechanism for Hyperspectral Image Classification. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2021, 14, 3566–3580. [Google Scholar] [CrossRef]
  23. Mu, Q.; Kang, Z.; Guo, Y.; Chen, L.; Wang, S.; Zhao, Y. Hyperspectral Image Classification of Wolfberry with Different Geographical Origins Based on Three-Dimensional Convolutional Neural Network. Int. J. Food Prop. 2021, 24, 1705–1721. [Google Scholar] [CrossRef]
  24. Trivedi, A.K.; Mahajan, T.; Maheshwari, T.; Mehta, R.; Tiwari, S. Leveraging Feature Fusion Ensemble of VGG16 and ResNet-50 for Automated Potato Leaf Abnormality Detection in Precision Agriculture. Soft Comput. 2025, 29, 2263–2277. [Google Scholar] [CrossRef]
  25. Zeng, T.; Wang, Y.; Yang, Y.; Liang, Q.; Fang, J.; Li, Y.; Zhang, H.; Fu, W.; Wang, J.; Zhang, X. Early Detection of Rubber Tree Powdery Mildew Using UAV-Based Hyperspectral Imagery and Deep Learning. Comput. Electron. Agric. 2024, 220, 108909. [Google Scholar] [CrossRef]
  26. Bhatti, U.A.; Bazai, S.U.; Hussain, S.; Fakhar, S.; Ku, C.S.; Marjan, S.; Yee, P.L.; Jing, L. Deep Learning-Based Trees Disease Recognition and Classification Using Hyperspectral Data. Comput. Mater. Contin. 2023, 77, 681–697. [Google Scholar] [CrossRef]
  27. Zheng, J.; Sun, C.; Zhao, S.; Hu, M.; Zhang, S.; Li, J. Classification of Salt Marsh Vegetation in the Yangtze River Delta of China Using the Pixel-Level Time-Series and XGBoost Algorithm. J. Remote Sens. 2023, 3, 0036. [Google Scholar] [CrossRef]
  28. Yang, R.; Kan, J. Classification of Tree Species at the Leaf Level Based on Hyperspectral Imaging Technology. J. Appl. Spectrosc. 2020, 87, 184–193. [Google Scholar] [CrossRef]
  29. Zhang, C.L.; Sun, H.J.; Yang, D.J.; Ma, J.K.; Xie, Y.P. Effects of Leaf Curl Virus on Growth Characteristic and Yield of Sweet Potato. J. North. Agric. 2020, 48, 94–99. [Google Scholar] [CrossRef]
  30. Ping, Y. Lipid Metabolism Patterns in SPVD-Infected Sweet Potato Leaves Under Different Temperature Regimes; Jiangsu Normal University: Xuzhou, China, 2018. [Google Scholar]
  31. Wei, X.; Johnson, M.A.; Langston, D.B.; Mehl, H.L.; Li, S. Identifying Optimal Wavelengths as Disease Signatures Using Hyperspectral Sensor and Machine Learning. Remote Sens. 2021, 13, 2833. [Google Scholar] [CrossRef]
  32. Fang, L.; He, N.; Li, S.; Plaza, A.J.; Plaza, J. A New Spatial–Spectral Feature Extraction Method for Hyperspectral Images Using Local Covariance Matrix Representation. IEEE Trans. Geosci. Remote Sens. 2018, 56, 3534–3546. [Google Scholar] [CrossRef]
  33. Jiang, Y.; Li, C. mRMR-Based Feature Selection for Classification of Cotton Foreign Matter Using Hyperspectral Imaging. Comput. Electron. Agric. 2015, 119, 191–200. [Google Scholar] [CrossRef]
  34. Wang, Z.; Yuan, F.; Li, R.; Zhang, M.; Luo, X. Hidden AS Link Prediction Based on Random Forest Feature Selection and GWO-XGBoost Model. Comput. Netw. 2025, 262, 111164. [Google Scholar] [CrossRef]
  35. Allouis, T.; Durrieu, S.; Vega, C.; Couteron, P. Stem Volume and Above-Ground Biomass Estimation of Individual Pine Trees From LiDAR Data: Contribution of Full-Waveform Signals. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2013, 6, 924–934. [Google Scholar] [CrossRef]
  36. Blackburn, G.A. Spectral Indices for Estimating Photosynthetic Pigment Concentrations: A Test Using Senescent Tree Leaves. Int. J. Remote Sens. 1998, 19, 657–675. [Google Scholar] [CrossRef]
  37. Blackburn, G.A. Relationships between Spectral Reflectance and Pigment Concentrations in Stacks of Deciduous Broadleaves. Remote Sens. Environ. 1999, 70, 224–237. [Google Scholar] [CrossRef]
  38. Chappelle, E.W.; Kim, M.S.; McMurtrey, J.E., III. Ratio Analysis of Reflectance Spectra (RARS): An Algorithm for the Remote Estimation of the Concentrations of Chlorophyll A, Chlorophyll B, and Carotenoids in Soybean Leaves. Remote Sens. Environ. 1992, 39, 239–247. [Google Scholar] [CrossRef]
  39. Becker, F.; Choudhury, B.J. Relative sensitivity of normalized difference vegetation Index (NDVI) and microwave polarization difference Index (MPDI) for vegetation and desertification monitoring. Remote Sens. Environ. 1988, 24, 297–311. [Google Scholar] [CrossRef]
  40. Sims, D.A.; Gamon, J.A. Relationships between Leaf Pigment Content and Spectral Reflectance across a Wide Range of Species, Leaf Structures and Developmental Stages. Remote Sens. Environ. 2002, 81, 337–354. [Google Scholar] [CrossRef]
  41. Gitelson, A.A.; Merzlyak, M.N. Signature Analysis of Leaf Reflectance Spectra: Algorithm Development for Remote Sensing of Chlorophyll. J. Plant Physiol. 1996, 148, 494–500. [Google Scholar] [CrossRef]
  42. Maccioni, A.; Agati, G.; Mazzinghi, P. New Vegetation Indices for Remote Measurement of Chlorophylls Based on Leaf Directional Reflectance Spectra. J. Photochem. Photobiol. B 2001, 61, 52–61. [Google Scholar] [CrossRef]
  43. Datt, B. A New Reflectance Index for Remote Sensing of Chlorophyll Content in Higher Plants: Tests Using Eucalyptus Leaves. J. Plant Physiol. 1999, 154, 30–36. [Google Scholar] [CrossRef]
  44. Lu, S.; Lu, F.; You, W.; Wang, Z.; Liu, Y.; Omasa, K. A Robust Vegetation Index for Remotely Assessing Chlorophyll Content of Dorsiventral Leaves across Several Species in Different Seasons. Plant Methods 2018, 14, 15. [Google Scholar] [CrossRef]
  45. Haboudane, D.; Miller, J.R.; Tremblay, N.; Zarco-Tejada, P.J.; Dextraze, L. Integrated Narrow-Band Vegetation Indices for Prediction of Crop Chlorophyll Content for Application to Precision Agriculture. Remote Sens. Environ. 2002, 81, 416–426. [Google Scholar] [CrossRef]
  46. Gitelson, A.A.; Keydan, G.P.; Merzlyak, M.N. Three-band Model for Noninvasive Estimation of Chlorophyll, Carotenoids, and Anthocyanin Contents in Higher Plant Leaves. Geophys. Res. Lett. 2006, 33, 431–433. [Google Scholar] [CrossRef]
  47. Gamon, J.A.; Peñuelas, J.; Field, C.B. A Narrow-Waveband Spectral Index That Tracks Diurnal Changes in Photosynthetic Efficiency. Remote Sens. Environ. 1992, 41, 35–44. [Google Scholar] [CrossRef]
  48. Gitelson, A.A.; Yoav, Z.; Olga, B. Assessing Carotenoid Content in Plant Leaves with Reflectance Spectroscopy. Photochem. Photobiol. 2002, 75, 272–281. [Google Scholar] [CrossRef] [PubMed]
  49. Mahlein, A.K.; Rumpf, T.; Welke, P. Development of Spectral Indices for Detecting and Identifying Plant Diseases. Remote Sens. Environ. 2013, 128, 21–30. [Google Scholar] [CrossRef]
50. Merzlyak, M.N.; Gitelson, A.A.; Chivkunova, O.B.; Rakitin, V.Y. Non-destructive Optical Detection of Pigment Changes during Leaf Senescence and Fruit Ripening. Physiol. Plant. 1999, 106, 135–141.
51. Carter, G.A. Ratios of Leaf Reflectances in Narrow Wavebands as Indicators of Plant Stress. Int. J. Remote Sens. 1994, 15, 697–703.
52. Tarabalka, Y.; Fauvel, M.; Chanussot, J.; Benediktsson, J.A. SVM- and MRF-Based Method for Accurate Classification of Hyperspectral Images. IEEE Geosci. Remote Sens. Lett. 2010, 7, 736–740.
53. Li, S.; Sun, L.; Tian, Y.; Lu, X.; Fu, Z.; Lv, G.; Zhang, L.; Xu, Y.; Che, W. Research on Non-Destructive Identification Technology of Rice Varieties Based on HSI and GBDT. Infrared Phys. Technol. 2024, 142, 105511.
54. Roy, S.K.; Krishna, G.; Dubey, S.R.; Chaudhuri, B.B. HybridSN: Exploring 3D-2D CNN Feature Hierarchy for Hyperspectral Image Classification. IEEE Geosci. Remote Sens. Lett. 2020, 17, 277–281.
55. Kalaivani, S.; Tharini, C.; Viswa, T.M.S.; Sara, K.Z.F.; Abinaya, S.T. ResNet-Based Classification for Leaf Disease Detection. J. Inst. Eng. India Ser. B 2025, 106, 1–14.
56. Zhang, C.; Bengio, S.; Hardt, M.; Recht, B.; Vinyals, O. Understanding Deep Learning (Still) Requires Rethinking Generalization. Commun. ACM 2021, 64, 107–115.
57. Smirnov, E.A.; Timoshenko, D.M.; Andrianov, S.N. Comparison of Regularization Methods for ImageNet Classification with Deep Convolutional Neural Networks. AASRI Procedia 2014, 6, 89–94.
58. Chen, S.; Jin, M.; Ding, J. Hyperspectral Remote Sensing Image Classification Based on Dense Residual Three-Dimensional Convolutional Neural Network. Multimed. Tools Appl. 2021, 80, 1859–1882.
59. Ahmad, M.; Khan, A.M.; Mazzara, M.; Distefano, S.; Ali, M.; Sarfraz, M.S. A Fast and Compact 3-D CNN for Hyperspectral Image Classification. IEEE Geosci. Remote Sens. Lett. 2022, 19, 1–5.
60. Zhang, D.; Ma, H.; Pan, L. A Gamma-Signal-Regulated Connected Components Labeling Algorithm. Pattern Recognit. 2019, 91, 281–290.
61. Lam, L.; Suen, C.Y. Application of Majority Voting to Pattern Recognition: An Analysis of Its Behavior and Performance. IEEE Trans. Syst. Man Cybern. A Syst. Hum. 1997, 27, 553–568.
62. Gao, J.; Westergaard, J.C.; Sundmark, E.H.R.; Bagge, M.; Liljeroth, E.; Alexandersson, E. Automatic Late Blight Lesion Recognition and Severity Quantification Based on Field Imagery of Diverse Potato Genotypes by Deep Learning. Knowl.-Based Syst. 2021, 214, 106723.
63. Foody, G.M. Explaining the Unsuitability of the Kappa Coefficient in the Assessment and Comparison of the Accuracy of Thematic Maps Obtained by Image Classification. Remote Sens. Environ. 2020, 239, 111630.
64. Untiveros, M.; Fuentes, S.; Salazar, L.F. Synergistic Interaction of Sweet Potato Chlorotic Stunt Virus (Crinivirus) with Carla-, Cucumo-, Ipomo-, and Potyviruses Infecting Sweet Potato. Plant Dis. 2007, 91, 669–676.
65. Kokkinos, C.D.; Clark, C.A.; McGregor, C.E.; LaBonte, D.R. The Effect of Sweet Potato Virus Disease and Its Viral Components on Gene Expression Levels in Sweetpotato. J. Am. Soc. Hortic. Sci. 2006, 131, 657–666.
66. Römer, C.; Wahabzada, M.; Ballvora, A.; Pinto, F.; Rossini, M.; Panigada, C.; Behmann, J.; Léon, J.; Thurau, C.; Bauckhage, C.; et al. Early Drought Stress Detection in Cereals: Simplex Volume Maximisation for Hyperspectral Image Analysis. Funct. Plant Biol. 2012, 39, 878.
67. Grisham, M.P.; Johnson, R.M.; Zimba, P.V. Detecting Sugarcane Yellow Leaf Virus Infection in Asymptomatic Leaves with Hyperspectral Remote Sensing and Associated Leaf Pigment Changes. J. Virol. Methods 2010, 167, 140–145.
68. Chávez, P.; Zorogastúa, P.; Chuquillanqui, C.; Salazar, L.F.; Mares, V.; Quiroz, R. Assessing Potato Yellow Vein Virus (PYVV) Infection Using Remotely Sensed Data. Int. J. Pest Manag. 2009, 55, 251–256.
69. Wang, Y.M.; Ostendorf, B.; Pagay, V. Detecting Grapevine Virus Infections in Red and White Winegrape Canopies Using Proximal Hyperspectral Sensing. Sensors 2023, 23, 2851.
70. Feng, L.; Zhang, Z.; Ma, Y.; Du, Q.; Williams, P.; Drewry, J.; Luck, B. Alfalfa Yield Prediction Using UAV-Based Hyperspectral Imagery and Ensemble Learning. Remote Sens. 2020, 12, 2028.
71. Dierssen, H.M.; Ackleson, S.G.; Joyce, K.E.; Hestir, E.L.; Castagna, A.; Lavender, S.; McManus, M.A. Living up to the Hype of Hyperspectral Aquatic Remote Sensing: Science, Resources and Outlook. Front. Environ. Sci. 2021, 9, 649528.
72. Zhang, M.; Qin, Z.; Liu, X.; Ustin, S.L. Detection of Stress in Tomatoes Induced by Late Blight Disease in California, USA, Using Hyperspectral Remote Sensing. Int. J. Appl. Earth Obs. Geoinf. 2003, 4, 295–310.
73. Ali, M.M.; Bachik, N.A.; Muhadi, N.A.; Tuan Yusof, T.N.; Gomes, C. Non-Destructive Techniques of Detecting Plant Diseases: A Review. Physiol. Mol. Plant Pathol. 2019, 108, 101426.
74. Larsolle, A.; Hamid Muhammed, H. Measuring Crop Status Using Multivariate Analysis of Hyperspectral Field Reflectance with Application to Disease Severity and Plant Density. Precis. Agric. 2007, 8, 37–47.
75. Rumpf, T.; Mahlein, A.-K.; Steiner, U.; Oerke, E.-C.; Dehne, H.-W.; Plümer, L. Early Detection and Classification of Plant Diseases with Support Vector Machines Based on Hyperspectral Reflectance. Comput. Electron. Agric. 2010, 74, 91–99.
Figure 1. Technical process of PLCNet.
Figure 2. Location of the sweetpotato experimental field in the study area.
Figure 3. Comparison of the average reflectance curves of different experimental plots. (a) Average reflectance curves from 450 to 998 nm; (b) the green peak; (c) the red valley; (d) the NIR region.
Figure 4. Boxplot of leaf SPAD values by group (SPVD-1, FR-1, SPVD-2, FR-2, SPVD-3, and FR-3).
Figure 5. Spatial distribution of healthy and diseased sweetpotato in the study area (scale bar = 0.7 m). (a–c) RGB mosaics of healthy plants FR-1, FR-2, and FR-3; (d–f) corresponding PLCNet classification maps; (g–i) RGB mosaics of infected plants SPVD-1, SPVD-2, and SPVD-3; (j–l) corresponding PLCNet classification maps (green = healthy, orange = infected, white = background).
Figure 6. Comparison of pixel-level (CropdocNet) and plant-level (PLCNet) classifications on a healthy (FR-2; (a–f)) and an infected (SPVD-3; (g–l)) sweetpotato plot. (a,d,g,j) RGB; (b,e,h,k) classification (green = healthy, orange = infected, white = background); (c,f,i,l) overlay.
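The plant-level aggregation that separates PLCNet from the pixel-level baseline (connected-component analysis followed by majority voting over each component's pixel predictions) can be sketched as below. This is a minimal illustration using `scipy.ndimage`, not the authors' implementation; the class codes (0 = background, 1 = healthy, 2 = infected) are assumptions for the example.

```python
import numpy as np
from scipy import ndimage

def plant_level_labels(pixel_pred, background=0):
    """Aggregate per-pixel class predictions into whole-plant labels:
    label connected components on the vegetation mask, then assign each
    component the majority class of its member pixels."""
    pixel_pred = np.asarray(pixel_pred)
    veg_mask = pixel_pred != background
    components, n_components = ndimage.label(veg_mask)  # 4-connectivity by default
    out = np.full_like(pixel_pred, background)
    for comp_id in range(1, n_components + 1):
        member = components == comp_id
        votes = pixel_pred[member]
        out[member] = np.bincount(votes).argmax()  # majority vote within the plant
    return out
```

Applied to a map with one stray "infected" pixel inside a mostly healthy plant, the vote smooths the outlier away while leaving isolated components unchanged, which is the noise-suppression behavior described for PLCNet.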
Figure 7. Validation of PLCNet on an independent dataset (scale bar = 0.7 m; green = healthy; orange = infected; white = background). (a–c) UAV RGB mosaics of healthy plots FR-1, FR-2, and FR-3; (d–f) corresponding PLCNet classification maps; (g–i) UAV RGB mosaics of infected plots SPVD-1, SPVD-2, and SPVD-3; (j–l) corresponding PLCNet classification maps.
Table 1. Spectral indices used in this study.

| Type | Spectral Index | Short | Formulation | Reference |
|---|---|---|---|---|
| Chlorophyll | Pigment-Specific Simple Ratio | PSSRa | R800/R675 | [36] |
| | | PSSRb | R800/R650 | [37] |
| | Ratio Analysis of Reflectance Spectra | RARSa | R675/R700 | [38] |
| | | RARSb | R675/(R650 × R700) | |
| | Normalized Difference Vegetation Index | NDVI | (RNIR − RR)/(RNIR + RR) | [39] |
| | Red-Edge NDVI | mNDVI | (R750 − R705)/(R750 + R705) | [40] |
| | Green NDVI | gNDVI | (R750 − RG)/(R750 + RG) | [41] |
| | Macc01 | Macc01 | (R780 − R710)/(R780 − R680) | [42] |
| | DATT | DATT | (R850 − R710)/(R850 − R680) | [43] |
| | Modified DATT | MDATT | (R721 − R744)/(R721 − R714) | [44] |
| | Red-Edge Chlorophyll Index | CI | R750/R710 | [45] |
| | Chl_red edge | Chl_red edge | Rnir/Rred_edge − 1 | [46] |
| Carotenoid | Photochemical Reflectance Index | PRI | (R531 − R570)/(R531 + R570) | [47] |
| | Carotenoid Reflectance Index | CRI550 | (1/R510) − (1/R550) | [48] |
| | | CRI700 | (1/R510) − (1/R700) | |
| | | CRI515,550 | (1/R515) − (1/R550) | |
| | | CRI515,700 | (1/R515) − (1/R700) | |
| | | RI530,800 | R530/R800 | |
| | | ND800,530 | (R800 − R530)/(R800 + R530) | |
| Plant Stress | Health Index (534, 698, 704) | HI_2013 | (R534 − R698)/(R534 + R698) − 0.5 × R704 | [49] |
| | Plant Senescence Reflectance Index | PSRI | (R680 − R500)/R750 | [50] |
| | Simple Ratio | RR | R695/R670 | [51] |
| | | | R695/R760 | |
| | | | R710/R760 | |
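Each index in the table is a simple algebraic combination of narrow-band reflectances. A minimal sketch of computing two of them (NDVI and PRI) from a hyperspectral cube follows; the `band` lookup helper and the 4 nm wavelength grid are illustrative assumptions, not the paper's code.

```python
import numpy as np

def band(cube, wavelengths, nm):
    """Return the reflectance plane at the sampled wavelength closest to `nm`.
    `cube` has shape (rows, cols, n_bands); `wavelengths` lists band centers in nm."""
    idx = int(np.argmin(np.abs(np.asarray(wavelengths) - nm)))
    return cube[..., idx]

def ndvi(cube, wavelengths, nir=800, red=670):
    """NDVI = (R_NIR - R_Red) / (R_NIR + R_Red), per Table 1."""
    r_nir, r_red = band(cube, wavelengths, nir), band(cube, wavelengths, red)
    return (r_nir - r_red) / (r_nir + r_red)

def pri(cube, wavelengths):
    """Photochemical Reflectance Index = (R531 - R570) / (R531 + R570)."""
    r531, r570 = band(cube, wavelengths, 531), band(cube, wavelengths, 570)
    return (r531 - r570) / (r531 + r570)
```

The remaining indices follow the same pattern of nearest-band lookup and element-wise arithmetic, so each produces a per-pixel feature map that can be stacked with the selected spectral bands.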
Table 2. Optimal wavelength variable selection results.

| Method | Selected wavelengths (nm) | Retained after VIF screening (nm) |
|---|---|---|
| LCM | 794, 798, 802, 806, 810, 814, 818, 822, 826, 830, 834, 854, 858, 862, 866, 870, 874, 878, 882, 886, 890, 894, 898, 902, 906, 910, 914, 918, 922, 926 | 794, 926 |
| mRMR | 674, 678, 682, 686, 770, 774, 778, 782, 786, 790, 794, 798, 802, 806, 810, 814, 818, 822, 826, 830, 834, 838, 842, 846, 906, 914, 922, 926, 930, 934 | 674, 686, 906 |
| RF | 450, 454, 458, 646, 650, 662, 666, 670, 674, 678, 682, 686, 690, 694, 698, 702, 706, 710, 714, 718, 722, 790, 794, 802, 806, 810, 814, 818, 822, 834 | 458, 650, 706, 714, 914 |
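The VIF screening summarized in the last column can be reproduced with ordinary least squares: each band is regressed on the remaining candidate bands and VIF_j = 1/(1 − R²_j); bands exceeding a chosen threshold are dropped. Below is a minimal NumPy sketch (illustrative only; the threshold and regression setup are assumptions, not the authors' code).

```python
import numpy as np

def vif(X):
    """Variance Inflation Factor for each column of X (n_samples x n_features).
    VIF_j = 1 / (1 - R^2_j), where R^2_j is from regressing column j on the
    other columns (with intercept). Large values flag multicollinear bands."""
    X = np.asarray(X, dtype=float)
    vifs = []
    for j in range(X.shape[1]):
        y = X[:, j]
        others = np.delete(X, j, axis=1)
        A = np.column_stack([np.ones(len(y)), others])  # design matrix with intercept
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ beta
        r2 = 1.0 - resid.var() / y.var()
        vifs.append(1.0 / (1.0 - r2))
    return np.array(vifs)
```

An independent band yields a VIF near 1, while a band that is nearly a linear combination of the others yields a very large VIF, which is why the screened sets in Table 2 are so much smaller than the selected sets.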
Table 3. Comparison of the performance of different feature selection methods for SPVD identification using the SVM classifier (%). (OA: Overall Accuracy; F1: Macro F1-Score).

| Method | OA/% | F1/% |
|---|---|---|
| LCM | 76.75 | 75.37 |
| mRMR | 89.15 | 87.88 |
| RF | 91.36 | 90.41 |
Table 4. Optimal vegetation indices selection results.

| Type | Chlorophyll | Carotenoid |
|---|---|---|
| Index | PSSRb, DATT, MDATT | PRI, ND800,530 |
Table 5. Performance comparison of feature selection methods combined with vegetation indices (VIs) and classifiers for SPVD identification. OA and F1 are expressed in %; UA_mean and PA_mean are macro-averaged user's and producer's accuracies expressed as proportions.

| Method | Classifier | OA/% | F1/% | UA_mean | PA_mean |
|---|---|---|---|---|---|
| LCM + VIs | SVM | 91.33 | 90.81 | 0.9332 | 0.9299 |
| LCM + VIs | GBDT | 91.50 | 91.07 | 0.9136 | 0.9098 |
| LCM + VIs | ResNet | 93.07 | 92.41 | 0.9238 | 0.9257 |
| LCM + VIs | 3D-CNN | 93.86 | 93.66 | 0.9394 | 0.9369 |
| mRMR + VIs | SVM | 91.18 | 90.65 | 0.9333 | 0.9297 |
| mRMR + VIs | GBDT | 91.46 | 91.02 | 0.9133 | 0.9093 |
| mRMR + VIs | ResNet | 93.37 | 92.67 | 0.9294 | 0.9257 |
| mRMR + VIs | 3D-CNN | 94.90 | 94.68 | 0.9303 | 0.9280 |
| RF + VIs | SVM | 91.49 | 91.00 | 0.9343 | 0.9311 |
| RF + VIs | GBDT | 91.90 | 91.49 | 0.9422 | 0.9411 |
| RF + VIs | ResNet | 93.67 | 92.96 | 0.9351 | 0.9303 |
| RF + VIs | 3D-CNN | **96.55** | **95.36** | **0.9498** | **0.9504** |

Note: the best value for each metric is shown in bold.
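For reference, all four metrics reported in Tables 3 and 5 can be derived from a single confusion matrix. The sketch below is a generic illustration (the convention that rows are reference classes and columns are predictions is an assumption), not the evaluation code used in the paper.

```python
import numpy as np

def classification_metrics(cm):
    """OA, macro F1, and macro-averaged user's (UA, precision) and producer's
    (PA, recall) accuracies from a confusion matrix with rows = reference
    classes and columns = predicted classes."""
    cm = np.asarray(cm, dtype=float)
    oa = np.trace(cm) / cm.sum()                 # overall accuracy
    pa = np.diag(cm) / cm.sum(axis=1)            # producer's accuracy per class
    ua = np.diag(cm) / cm.sum(axis=0)            # user's accuracy per class
    f1 = 2 * ua * pa / (ua + pa)                 # per-class F1
    return {"OA": oa, "MacroF1": f1.mean(),
            "UA_mean": ua.mean(), "PA_mean": pa.mean()}
```

For a two-class matrix such as [[90, 10], [5, 95]], OA is 185/200 = 0.925 and the macro averages follow from the per-class precision and recall, matching the proportion-valued UA_mean/PA_mean convention used in Table 5.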

Zhang, Q.; Wang, W.; Su, H.; Yang, G.; Xue, J.; Hou, H.; Geng, X.; Cao, Q.; Xu, Z. PLCNet: A 3D-CNN-Based Plant-Level Classification Network Hyperspectral Framework for Sweetpotato Virus Disease Detection. Remote Sens. 2025, 17, 2882. https://doi.org/10.3390/rs17162882
