Review

Advances in Hyperspectral and Diffraction Imaging for Agricultural Applications

1 Department of Food & Biological Engineering, Jiangsu University, Zhenjiang 212013, China
2 School of Electrical & Information Engineering, Jiangsu University, Zhenjiang 212013, China
* Author to whom correspondence should be addressed.
Agriculture 2025, 15(16), 1775; https://doi.org/10.3390/agriculture15161775
Submission received: 15 July 2025 / Revised: 11 August 2025 / Accepted: 15 August 2025 / Published: 19 August 2025
(This article belongs to the Section Artificial Intelligence and Digital Agriculture)

Abstract

Hyperspectral imaging and diffraction imaging technologies, owing to their non-destructive nature, high efficiency, and superior resolution, have found widespread application in agricultural diagnostics. This review synthesizes recent advancements in the deployment of these two technologies across various agricultural domains, including the detection of plant diseases and pests, crop growth monitoring, and animal health diagnostics. Hyperspectral imaging utilizes multi-band spectral and image data to accurately identify diseases and nutritional status, while combining deep learning and other technologies to improve detection accuracy. Diffraction imaging, by exploiting the diffraction properties of light waves, facilitates the detection of pathogenic spores and the assessment of cellular vitality, making it particularly well-suited for microscopic structural analysis. The paper also critically examines prevailing challenges such as the complexity of data processing, environmental adaptability, and the cost of instrumentation. Finally, it envisions future directions wherein the integration of hyperspectral and diffraction imaging, through multisource data fusion and the optimization of intelligent algorithms, holds promise for constructing highly precise and efficient agricultural diagnostic systems, thereby advancing the development of smart agriculture.

1. Introduction

Agriculture serves as the cornerstone of global food security and environmental sustainability. As fundamental pillars of the agricultural system, crop production and livestock management play dual vital roles: ensuring reliable food provision while stimulating rural economic development [1]. Contemporary agriculture confronts escalating challenges from resource limitations, climate variability, and increasing demands for production efficiency. These compounding pressures necessitate transformative innovations in agricultural diagnostics. Modern diagnostic technologies empower precise tracking of crop physiology, livestock welfare, and soil ecosystems, enabling optimized yields and superior resource management. Hyperspectral and diffraction imaging exemplify this technological progression, offering immediate plant disease detection, real-time livestock health monitoring, and early risk identification to safeguard productivity.
These advanced tools generate evidence-based insights for agricultural stakeholders, supporting data-driven policymaking and farm-level decisions that advance sustainable practices. The precision agriculture revolution has consequently amplified the strategic value of diagnostic technologies as essential components of modern agri-food systems [2]. As critical enablers in the agricultural value chain, these imaging technologies enhance both production quality and sustainable practices through improved resource management. Their non-destructive nature, high precision, and operational efficiency position hyperspectral and diffraction imaging as transformative solutions with extensive diagnostic applications across agricultural systems [1,3,4]. Within this context, this review provides (1) a systematic examination of hyperspectral and diffraction imaging fundamentals, (2) an in-depth exploration of their cutting-edge agricultural diagnostic applications, and (3) a critical analysis of current limitations and promising future research directions.

1.1. Hyperspectral Imaging Technology

Hyperspectral imaging represents an advanced diagnostic approach that combines imaging capabilities with spectroscopic analysis. This technique acquires detailed spectral signatures across numerous narrow bands, while simultaneously preserving spatial information [5]. The technology’s principal advantage stems from its dual-output capacity to provide both the spatial distribution and spectral characteristics of targets, enabling precise identification and classification [6]. Hyperspectral imaging captures continuous spectral data across ultraviolet to near-infrared wavelengths (approximately 200–1000 nm), enabling accurate material characterization through detailed optical property analysis [7]. This capability establishes a robust foundation for numerous diagnostic applications [8,9], including classification tasks [10,11,12,13,14], moisture [15,16] and elemental quantification [17,18,19,20,21,22,23,24], colony counting [25,26,27], and sensory assessment [28]. The fundamental principles of hyperspectral imaging are illustrated in Figure 1. A typical hyperspectral imaging system consists of a light source, a sample stage, a spectrometer, and a computer for data acquisition and processing. The light source uniformly illuminates the sample, and the reflected light is captured by the spectrometer. For each spatial point on the sample surface, the system records a complete spectral signature across a wide range of wavelengths (e.g., 400–1000 nm). This results in a three-dimensional hyperspectral data cube containing both spatial (image) and spectral (wavelength) information. Conventional diagnostic methods in agriculture, such as visual inspection, microscopy, and chemical analysis, are often labor-intensive, time-consuming, and destructive. 
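As a concrete sketch of the data cube just described, the snippet below builds a toy cube in NumPy and shows its two complementary views: one pixel's full spectral signature and one wavelength's full spatial image. The 100 × 100 × 200 shape and random values are purely illustrative, not drawn from any cited system.

```python
import numpy as np

# A hyperspectral data cube: two spatial axes plus one spectral axis.
# Shapes here are illustrative (100 x 100 pixels, 200 bands over 400-1000 nm).
rows, cols, n_bands = 100, 100, 200
wavelengths = np.linspace(400, 1000, n_bands)            # band centres in nm
cube = np.random.default_rng(0).random((rows, cols, n_bands))

# Each spatial pixel carries a complete spectral signature ...
signature = cube[50, 50, :]                              # shape: (200,)

# ... and each wavelength gives a complete grayscale image of the scene.
band_700nm = cube[:, :, np.argmin(np.abs(wavelengths - 700))]  # shape: (100, 100)

print(signature.shape, band_700nm.shape)
```

Analysis pipelines then treat the cube either band-wise (as a stack of images) or pixel-wise (as a collection of spectra), depending on whether spatial or spectral structure is of interest.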
Compared to conventional diagnostic methods, hyperspectral imaging provides distinct advantages, including non-destructive operation [11,29,30,31], rapid data acquisition, high spatial resolution, and comprehensive spectral information content [32,33,34]. These capabilities enable both precise target identification and comprehensive data acquisition for analytical modeling [35]. The advancement of hyperspectral imaging technology stems from dual technological progress: (1) improved sensor hardware performance and (2) sophisticated developments in computational processing methods. The following sections present a comprehensive examination of hyperspectral imaging applications with variant computational algorithms and their diagnostic advantages in agriculture, as well as applications in agricultural science [19,36,37,38,39,40]. As demonstrated in Table 1, these techniques employ diverse analytical approaches, including principal component analysis (PCA), convolutional neural networks (CNNs), support vector machines (SVMs), random forests, graph neural networks, and spectral unmixing methods. Figure 2 presents various types of hyperspectral imaging equipment. Specifically, Figure 2A shows a UAV-based hyperspectral imaging system commonly used for aerial crop monitoring. Figure 2B displays a portable handheld hyperspectral device suitable for the field detection of plant stress. Figure 2C,D illustrate two indoor hyperspectral imaging setups: one for sample scanning under controlled conditions (2C), and another using a fixed frame structure for multi-angle imaging (2D). Figure 2E,F depict two other indoor systems with integrated control modules and optical units for laboratory use. Figure 2G presents a compact handheld hyperspectral camera used for rapid object scanning. Finally, Figure 2H illustrates another UAV-based platform designed for outdoor deployment in agricultural environments.
Figure 1. Basic principles of hyperspectral imaging technology.
Table 1. Applications of hyperspectral imaging associated with various computational algorithms.
| Application Scenarios | Algorithm Type | Advantages | Limitations |
| --- | --- | --- | --- |
| Remote sensing classification | Super PCA [41] | Superpixel-based PCA; preserves spatial structure; enhances computational efficiency; unsupervised learning | Computational complexity |
| | PCA [42] | Effective dimensionality reduction | Potential loss of some important information |
| | CNN [43] | Generates spectral images from RGB images; low cost | Resulting image quality depends on training data |
| | Linear mixing model [44] | Suitable for spectral unmixing; good model generalization | Model may oversimplify complex spectral variations |
| | Gabor filter + unsupervised discriminant analysis [45] | High classification accuracy | High model complexity |
| | CNN + dual Swin transformer [46] | Captures both local and global features; high classification accuracy | Complex architecture; high training cost; requires large datasets |
| Crop image classification | Feature selection + Folded-PCA [47] | Combines information-theoretic optimized feature selection | Implementation complexity |
| | Information theory-based feature selection + Folded-PCA [48] | Combines information-theoretic feature selection; enhances classification accuracy | High computational complexity |
| | CNN + SVM [49] | Combines CNN feature extraction with SVM classification; adaptable to various crop types | High computational resource demands |
| | Spatial-spectral homogeneous block extraction [50] | Integrates spatial and spectral information; high accuracy | High model complexity |
| | CNN [51] | Strong adaptability; high classification accuracy | Long training time |
| | 3D CNN (LeNet-5) [52] | Integrates spatial and spectral information; high classification accuracy | High computational complexity |
| | GNN + ARMA filter + parallel CNN [53] | Superior classification performance | High model complexity |
| | CNN + GAT + C-means [54] | Combines spatial and spectral data; high segmentation accuracy | High model complexity |
| Soil | Optimal band selection + Random Forest [55] | Improves soil salinity estimation accuracy | Relies on band selection methods |
| | Evaluation of ML models (SVM, RF, CNN, etc.) [56] | Provides a comprehensive comparison of classification performance | Model accuracy depends on sample quality; affected by spectral interference |
| | HSI + SVM/RF/PLS-DA [57] | Enables accurate identification of PE and PA microplastics in soil | Feature selection relies on manual design; limited generalization ability |
| | HSI combined with radar + PLSR [58] | Reduces soil moisture and surface roughness interference in SOC estimation | Complex fusion process; requires consistent data sources |
| | Spectral feature extraction methods (PCA, CARS, GA) [59] | Enhances SOC prediction accuracy through multi-feature integration | High computational cost; uncertainty in feature selection |
| | HSI + PLSR + RBF neural network [60] | Enables non-destructive detection of silicon and moisture across regions | Requires retraining for different soil types; limited model transferability |
Figure 2. Hyperspectral imaging equipment: (A) UAV-based hyperspectral imaging system [61]; (B) Portable outdoor handheld hyperspectral device [62]; (C) Indoor hyperspectral imaging system [63]; (D) Indoor hyperspectral imaging system [64]; (E) Indoor hyperspectral imaging system [65]; (F) Handheld hyperspectral device [66]; (G) Handheld hyperspectral device [67]; (H) UAV-based hyperspectral imaging system [68].
While offering superior classification accuracy, efficient feature extraction, and robust computational performance, these algorithms confront persistent challenges, including computational intensity, implementation complexity, and stringent data quality requirements.
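Several of the tabulated methods share a dimensionality-reduction step before classification. As a minimal sketch of that step, the snippet below applies PCA to a toy cube via the SVD of the centred pixel-by-band matrix; the cube contents and the choice of 10 components are assumptions for illustration only.

```python
import numpy as np

# Flatten a (rows, cols, bands) cube to a (pixels, bands) matrix, then reduce
# the spectral dimension with PCA computed from the SVD of the centred data.
rng = np.random.default_rng(1)
cube = rng.random((60, 60, 150))                 # illustrative cube
X = cube.reshape(-1, cube.shape[-1])             # (3600, 150)
Xc = X - X.mean(axis=0)                          # centre each band

U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 10                                           # keep 10 principal components
scores = Xc @ Vt[:k].T                           # (3600, 10) component scores
reduced_cube = scores.reshape(60, 60, k)

explained = (S[:k] ** 2).sum() / (S ** 2).sum()  # fraction of variance retained
print(reduced_cube.shape, round(float(explained), 3))
```

The reduced cube keeps the spatial layout intact while shrinking the spectral axis, which is what makes downstream classifiers (SVMs, CNNs, random forests) tractable on full scenes.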

1.2. Diffraction Imaging Technology

Diffraction imaging is an analytical technique based on light-wave diffraction phenomena. Through the interpretation of diffraction patterns generated by light-object interactions, this method provides high-resolution structural characterization [69]. Its exceptional capability lies in resolving microscopic features with superior sensitivity [70], offering unique advantages for non-destructive investigation of internal material architectures. Diffraction imaging is experiencing rapid growth in modern agricultural diagnostics, attracting significant research interest for its non-destructive evaluation and microscopic analysis capabilities. This technology not only provides a scientific foundation for assessing crop internal quality but also enables livestock health monitoring and detection of subtle compositional variations in animal feed. Continued improvements in hardware performance and computational processing are facilitating the synergistic combination of diffraction imaging with complementary diagnostic approaches, thereby expanding its agricultural applications. The following sections will provide a detailed examination of diffraction imaging’s diagnostic value in agriculture. The principle of diffraction imaging technology is illustrated in Figure 3. Diffraction imaging is a technique that captures and analyzes diffraction patterns resulting from the interaction between light waves, micro-apertures, and the sample. When coherent light illuminates the sample, the sample modulates the incoming wavefront and generates a characteristic intensity distribution pattern. This diffraction pattern is recorded by an image sensor (e.g., color sensor or CMOS) and transmitted to a computer for image reconstruction and feature extraction.
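The reconstruction step at the end of this pipeline can be sketched with the angular spectrum method, a standard way to numerically propagate a diffracted field between sample and sensor planes. The sketch below is illustrative only: it propagates a complex field, whereas a real lens-free sensor records intensity alone, so practical systems add a phase-retrieval step; the geometry (pixel pitch, wavelength, propagation distance) is assumed rather than taken from any cited system.

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, dx, z):
    """Propagate a complex optical field a distance z (metres) using the
    angular spectrum method; dx is the sensor pixel pitch in metres."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=dx)
    fy = np.fft.fftfreq(ny, d=dx)
    FX, FY = np.meshgrid(fx, fy)
    # Spatial frequencies beyond 1/wavelength are evanescent; suppress them.
    arg = (1.0 / wavelength) ** 2 - FX ** 2 - FY ** 2
    kz = 2j * np.pi * z * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(kz) * (arg > 0)
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Toy "sample": a plane wave transmitted through an opaque disc (a mock spore).
n, dx, wl = 256, 2e-6, 532e-9            # 256 px, 2 um pitch, 532 nm light
y, x = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
obj = 1.0 - (x ** 2 + y ** 2 < 15 ** 2)  # object-plane transmission
sensor = angular_spectrum_propagate(obj.astype(complex), wl, dx, 1e-3)
# Numerically back-propagate the recorded field to refocus the object.
refocus = angular_spectrum_propagate(sensor, wl, dx, -1e-3)
print(round(float(np.abs(refocus - obj).max()), 6))
```

Because forward and backward propagation are exact inverses for the non-evanescent frequencies, the back-propagated field recovers the object plane, which is the principle behind the image-reconstruction stage described above.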
Recent technological progress in high-performance sensors, advanced data processing, and novel algorithms has substantially expanded the agricultural applications of hyperspectral and diffraction imaging [71]. These cutting-edge technologies provide strong support for intelligent agricultural diagnostics and establish a solid technical foundation for sustainable farming practices. However, their widespread implementation still faces significant challenges, particularly regarding high equipment costs and data processing complexity [72]. Future developments through technological innovation and cross-disciplinary cooperation are expected to gradually address these limitations. As these technologies mature, hyperspectral and diffraction imaging are becoming increasingly vital for agricultural applications. Their integration offers significant potential to enhance crop productivity, ensure food security, and support sustainable resource management. This review systematically evaluates recent advances in hyperspectral and diffraction imaging for agricultural diagnostics, with three primary objectives: Comprehensive analysis of technological breakthroughs, critical assessment of existing limitations, and projection of future development pathways. By providing both state-of-the-art synthesis and forward-looking perspectives, this work aims to serve as a foundational reference for researchers while stimulating innovation and practical implementation in the field.

2. Applications of Hyperspectral Imaging Technology in Agriculture

2.1. Disease Detection

Global agriculture continues to experience significant annual crop losses from plant diseases across all major growing regions [73], and China is no exception, with substantial losses reported throughout its agricultural regions [74]. This makes timely and precise disease detection essential for protecting crop yields. Current detection methodologies depend predominantly on labor-intensive field inspections and laboratory chemical tests. Although these conventional techniques provide dependable diagnoses, their operational constraints—particularly the extensive time requirements and limited scalability—render them ineffective for regional-scale monitoring [75]. Hyperspectral imaging provides a non-destructive, high-throughput solution for crop disease detection, leveraging its unique combination of high spatial and spectral resolution [76]. The technology acquires detailed reflectance profiles across numerous narrow bands, enabling detection of minute pathological changes in plant physiology and structure. This advanced capability permits identification of pre-symptomatic stress responses, allowing for timely, precision disease management interventions [77,78]. Plant diseases modify leaf optical properties through biochemical and structural changes, including altered chlorophyll content, water status, and cellular organization. These pathological alterations generate diagnostic spectral signatures [79], particularly reduced visible-range reflectance (400–700 nm) from chlorophyll degradation and elevated near-infrared reflectance (700–1300 nm) due to tissue structural damage.
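Because chlorophyll degradation and tissue damage shift visible and near-infrared reflectance in characteristic directions, a simple band-ratio index over these regions can flag candidate stress pixels. The sketch below computes an NDVI-style index on synthetic reflectance maps; the band choices, reflectance ranges, and the 0.4 threshold are illustrative assumptions, not values from the cited studies.

```python
import numpy as np

# Per-pixel index contrasting visible (chlorophyll-sensitive) and
# near-infrared reflectance, in the spirit of NDVI.
rng = np.random.default_rng(2)
red = rng.uniform(0.05, 0.20, (50, 50))    # mock reflectance near 670 nm
nir = rng.uniform(0.30, 0.60, (50, 50))    # mock reflectance near 800 nm

ndvi = (nir - red) / (nir + red + 1e-9)    # range roughly -1 .. 1
stressed = ndvi < 0.4                      # flag pixels with a weak vegetation signal
print(ndvi.shape, int(stressed.sum()))
```

Real diagnostic pipelines replace this two-band index with the full narrow-band signature, but the same per-pixel logic underlies most disease-mapping workflows.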
Hyperspectral imaging has become a principal tool for early rice disease detection by capturing these characteristic spectral responses. Feng et al. [80] successfully combined hyperspectral imaging with deep transfer learning to improve rice disease identification. The study acquired hyperspectral images of diseased rice leaves, extracted spectral features, and applied transfer learning algorithms, demonstrating enhanced accuracy. This methodology achieved high classification accuracy despite limited labeled training data, significantly advancing disease monitoring capabilities for rice production systems.
Zhang et al. [81] constructed nine spectral band selection models based on three consecutive years of in situ canopy hyperspectral imaging data to reduce spectral redundancy. Additionally, two novel vegetation indices were developed and combined with a CARS-ridge regression algorithm to identify Fusarium head blight (FHB) in wheat. The results showed that the optimized model achieved a detection accuracy of 91.4%, significantly outperforming traditional methods. The hyperspectral image data used in this study provided high spatial and temporal resolution, which contributed to precise localization of disease spots and enhanced the generalization capability of the model across different regions. Rady et al. [82] employed VIS-NIR hyperspectral imaging (400–900 nm) in push-broom scanning mode to acquire reflectance images of GoldRush apples under different storage temperatures. The reflectance features were classified using five machine learning models: LDA, decision tree, KNN, PLS-DA, and FFNN. Among them, the decision tree achieved the highest overall classification accuracy of 82% on the test set, with class-wise accuracies of 81% for healthy apples and 86% for CM-infested ones. The other models also demonstrated promising performance: LDA achieved 72% overall accuracy and KNN 76%. The decision tree model's classification accuracy suggests that, in practical non-destructive detection, the model can reliably identify most infested fruit, thereby improving sorting efficiency and reducing manual inspection. However, the residual risk of misclassification could result in a small proportion of infested fruit passing as healthy or of healthy fruit being discarded. Forward feature selection starts with an empty set and iteratively adds the wavelength that yields the greatest accuracy improvement, repeating until the performance gain becomes negligible or a set limit is reached, thereby identifying an optimal wavelength subset.
To reduce the number of wavelengths and simplify the model, the authors applied sequential forward selection (SFS) for feature selection. Using only stem orientation data and a significantly reduced number of wavelengths, the SFS-selected subset achieved nearly the same classification performance as the full-wavelength model, with 82% overall accuracy (81% for healthy, 86% for infested). This demonstrates the effectiveness of SFS in dimensionality reduction without compromising detection performance, which is particularly useful for developing compact multispectral systems for practical deployment.
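The forward-selection loop described above can be sketched as follows. The greedy routine, the toy nearest-centroid scoring function, and the synthetic two-class spectra (in which only bands 3 and 7 carry class information) are all illustrative assumptions, not the exact procedure or data of the cited study.

```python
import numpy as np

def sequential_forward_selection(X, y, score_fn, max_features, min_gain=1e-3):
    """Greedy wavelength selection: start empty, repeatedly add the band that
    most improves score_fn, and stop when the gain becomes negligible."""
    selected, best_score = [], -np.inf
    remaining = list(range(X.shape[1]))
    while remaining and len(selected) < max_features:
        scored = [(score_fn(X[:, selected + [j]], y), j) for j in remaining]
        top_score, top_j = max(scored)
        if selected and top_score - best_score < min_gain:
            break
        selected.append(top_j)
        remaining.remove(top_j)
        best_score = top_score
    return selected, best_score

def centroid_accuracy(Xs, y):
    """Toy score: nearest-class-centroid training accuracy."""
    centroids = {c: Xs[y == c].mean(axis=0) for c in np.unique(y)}
    classes = np.array(sorted(centroids))
    d = np.stack([np.linalg.norm(Xs - centroids[c], axis=1) for c in classes])
    return float((classes[d.argmin(axis=0)] == y).mean())

# Synthetic two-class "spectra": only bands 3 and 7 are informative.
rng = np.random.default_rng(3)
y = rng.integers(0, 2, 200)
X = rng.normal(0, 1, (200, 10))
X[:, 3] += 2.0 * y
X[:, 7] -= 1.5 * y

bands, acc = sequential_forward_selection(X, y, centroid_accuracy, max_features=4)
print(sorted(bands), round(acc, 2))
```

The same greedy skeleton works with any classifier-based score, which is why SFS pairs naturally with the compact multispectral systems mentioned above: the selected subset directly defines which filters the hardware needs.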
The integration of hyperspectral imaging with deep learning has proven particularly effective for simultaneous classification of multiple crop diseases. Additionally, hyperspectral imaging technology demonstrates strong potential for environmental stress monitoring in crops. Controlled experiments with rice have confirmed its ability to detect pre-symptomatic stress responses, significantly improving both detection accuracy and early warning capacity. These capabilities provide scientifically validated tools for precision agriculture management. Key applications in disease diagnostics are systematically compared in Table 2.

2.2. Disease Early Warning

Plant diseases cause measurable biochemical and structural modifications in leaves, including changes in chlorophyll content, water status, and cellular organization. These pathological alterations generate distinctive spectral reflectance patterns across the 400–2500 nm range. Hyperspectral imaging detects disease-specific spectral signatures through high-resolution, contiguous spectral band acquisition, allowing identification of minute variations indicative of physiological stress responses. Hyperspectral imaging detects early disease indicators through leaf spectral variation analysis. Terentev et al. [110] acquired hyperspectral data and applied interdisciplinary methods to establish correlations between spectral features and plant biochemical changes. Using SVM classification, the system achieved 97–100% detection accuracy from day 4 post-inoculation, confirming its early-diagnosis potential. Moriya et al. [111] utilized UAV-based hyperspectral imaging (25-band sensor, 500–840 nm range) with concurrent multispectral simulation (3-band) for citrus canker detection and spatial mapping. The hyperspectral imaging system outperformed multispectral analysis in disease classification due to its spectral dimensionality, yielding 0.94 health map accuracy. This methodology has proven effective for both woody (citrus) and monocot crops (cereals), demonstrating versatile agricultural disease monitoring capabilities.
Researchers are increasingly integrating machine learning algorithms with hyperspectral data to improve plant disease identification accuracy. This combined approach, when applied to large-scale agricultural analytics, enhances the reliability of early warning systems for crop disease management. Bai et al. [112] explored hyperspectral imaging for rapid, spatially resolved plant disease monitoring. By analyzing plant tissue spectral features, these methods enable early disease detection and precise outbreak localization, providing a non-destructive tool for agricultural disease management. UAV-based hyperspectral systems further facilitate large-scale field monitoring, allowing timely interventions before widespread outbreaks occur, thus enhancing agricultural sustainability. Xu et al. [113] emphasize multisource data fusion for disease early-warning systems. They integrated hyperspectral images with RGB and thermal infrared data to monitor wheat powdery mildew, demonstrating improved early diagnosis rates and reduced false positives compared to single-data approaches. Representative applications are systematically compared in Table 2. Figure 4 provides representative applications of hyperspectral imaging in early disease detection across multiple crops, such as cotton (A), rice (B), wheat (C), and strawberry (D).

2.3. Crop Remote Sensing and Monitoring

Hyperspectral imaging has become indispensable in crop growth monitoring due to its ability to capture detailed spectral signatures reflecting plant physiological status, including nutrient concentrations and stress responses. The integration of UAV-based hyperspectral systems further enhances this capability by enabling high-resolution, real-time field-scale monitoring, which supports data-driven precision agriculture practices, such as variable-rate fertilization and early-stage disease detection [118]. Recently, numerous research teams have successfully integrated hyperspectral data with machine learning algorithms for the real-time monitoring of crop growth dynamics. Moghimi et al. [62] demonstrated a synergistic combination of UAV-based hyperspectral imaging and DNNs for high-throughput wheat phenotyping and yield prediction. Researchers have employed hyperspectral imagery to collect data from experimental wheat plots and applied deep learning models to extract key features, achieving precise wheat growth stage classification and accurate yield estimation. Furthermore, Wang et al. [119] revealed significant differences in the absorption characteristics of maize leaves in the near-infrared and shortwave-infrared spectral bands during different developmental stages. These spectral variations serve as highly valuable biomarkers for monitoring crop growth dynamics and developmental progress.
Hyperspectral imaging also plays a critical role in assessing crop nutrient status. For example, studies have shown that leaf reflectance varies significantly within specific spectral regions in response to varying nitrogen concentrations. This finding has led to the development of hyperspectral-based nitrogen prediction models, allowing their application in precision fertilization strategies to optimize agricultural productivity. Additionally, UAV-based hyperspectral imaging has been used to monitor chlorophyll density in maize canopies subjected to lodging stress. In one study, controlled experiments were conducted to identify sensitive spectral parameters for modeling. The results revealed that increased stem reflectance, compared to leaf reflectance, was the primary driver of spectral variation under lodging conditions. Furthermore, vegetation index-based models exhibited the highest predictive accuracy, enabling effective classification of maize lodging severity [120]. Hyperspectral imaging remote sensing has demonstrated significant potential for crop type discrimination, substantially improving the precision of agricultural census data. Recent research has explored the integration of remote sensing data with deep transfer learning for crop classification. A particularly innovative approach involves adapting pre-trained CNNs—initially designed for general image recognition—to process multispectral satellite imagery for precise crop type identification across diverse agricultural systems. This methodology has proven particularly effective in overcoming the constraints of limited annotated training data, thereby enhancing the scalability and accuracy of crop monitoring programs [121]. Table 2 provides a comprehensive overview of hyperspectral imaging applications in agricultural remote sensing and surveillance.
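As a minimal sketch of such a reflectance-based nutrient model, the snippet below fits an ordinary least-squares regression on synthetic data. Published studies typically use PLSR to cope with collinear bands; the three band positions, coefficients, and noise level here are assumptions chosen only to make the fitting step concrete.

```python
import numpy as np

# Regress leaf nitrogen content on reflectance at a few nitrogen-sensitive
# bands (e.g. ~550, ~710, ~780 nm; positions are illustrative assumptions).
rng = np.random.default_rng(4)
n_samples = 120
refl = rng.uniform(0.1, 0.6, (n_samples, 3))          # mock band reflectances
true_w = np.array([-4.0, -2.0, 3.0])                  # synthetic "ground truth"
nitrogen = refl @ true_w + 2.5 + rng.normal(0, 0.05, n_samples)  # % dry mass

A = np.hstack([refl, np.ones((n_samples, 1))])        # add intercept column
coef, *_ = np.linalg.lstsq(A, nitrogen, rcond=None)
pred = A @ coef

r2 = 1 - np.sum((nitrogen - pred) ** 2) / np.sum((nitrogen - nitrogen.mean()) ** 2)
print(round(float(r2), 3))
```

In a precision-fertilization workflow, a model of this shape is calibrated against laboratory nitrogen assays and then applied per pixel across a canopy image to produce a nutrient map.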

2.4. Extended Applications of Hyperspectral Imaging Techniques

Hyperspectral imaging has emerged as a powerful tool for animal health assessment and disease diagnostics. This technology enables non-invasive differentiation between healthy and pathological tissues through spectral signature analysis of animal surfaces or organs. A notable application includes the rapid identification of microplastic contamination in fish intestinal tracts, demonstrating its utility in assessing environmental health impacts on aquatic species [98]. Furthermore, hyperspectral imaging facilitates early detection of livestock infections by analyzing spectral variations in skin or hair associated with parasitic or bacterial pathogens, thereby supporting prompt clinical intervention. In addition to the applications summarized in Table 2, hyperspectral and multispectral imaging technologies have also been utilized in the tea industry for quality evaluation and fermentation monitoring, as illustrated in Figure 5. Panel A depicts the spectroscopic acquisition of spectral and image data from tea samples, whereas panel B outlines an integrated workflow combining multispectral imaging, chemical index acquisition, data augmentation, PLS- and CNN-based modeling, and pixel-level visualization to monitor the dynamic fermentation process of black tea.
In comparison with conventional destructive techniques and visual inspection, hyperspectral imaging integrated with machine learning offers a non-invasive and accurate approach for detecting pest infestations at an early stage. This technique is capable of identifying internal physiological changes prior to the manifestation of visible symptoms. Nevertheless, several limitations persist, including the high cost of instrumentation, the complexity of data preprocessing, and susceptibility to variations in surface conditions. In the future, detection accuracy could be improved by enlarging the dataset to include more varieties and orchard sources, refining wavelength selection and spectral preprocessing, and adopting ensemble methods such as random forests or gradient boosting, combined with cross-validation to reduce overfitting.

3. Applications of Diffraction Imaging Technology in Agriculture

Microstructural analysis plays a pivotal role in agricultural research, with critical applications spanning crop growth assessment, cellular characterization, pathogen identification, and postharvest quality control. Conventional approaches, including optical microscopy, scanning electron microscopy (SEM), and laser confocal microscopy, offer high-resolution imaging capabilities but are constrained by demanding sample preparation protocols and substantial operational costs, particularly for field-deployable diagnostics. Recent advances in diffraction-based imaging technologies, especially lens-free configurations, have introduced transformative capabilities for agricultural microanalysis. These systems provide distinct advantages through their lensless design, operational simplicity, and capacity for high-throughput processing, making them particularly suitable for in-field applications (Figure 6). Figure 6A illustrates lens-free imaging for real-time cell viability monitoring, where phase contrast generated by light diffraction is used to distinguish between live and dead cells. Figure 6B depicts a lens-free imaging system designed for capturing transient processes, such as rapid biochemical reactions, by combining microfluidics with a compact optical setup. Figure 6C demonstrates a high-throughput lens-free microscopic imaging platform, which employs a diffractive optical element and time-sequenced illumination to extract dynamic spectral data across frames. Figure 6D shows an application of lens-free imaging in fungal spore detection, where pathogenic fungi generate unique diffraction fingerprints on a CMOS sensor, enabling precise identification and classification of different fungal species through computational reconstruction.

3.1. Classification and Identification of Pathogenic Spores

In the transmission of crop diseases, the production and dispersion of pathogenic spores represent one of the most critical early indicators. As vectors of various fungal diseases, spores are often present in the air or on plant surfaces before visible symptoms appear. Therefore, the detection and identification of spores are considered key steps in early warning and precise control of plant diseases. Especially in protected agriculture and open-field farming environments, the rapid and high-throughput capture and recognition of airborne or surface-bound spores are essential for early intervention. Compared with conventional single-wavelength diffraction imaging techniques, the lens-free diffraction imaging system enhances the differentiation of morphologically similar spores by incorporating multi-wavelength illumination and defocus-based diffraction acquisition. Due to differences in pigment distribution or internal composition, different types of spores exhibit distinct absorption characteristics at specific wavelengths. These variations enable morphologically similar spores to display distinguishable features through the combination of diffraction and spectral information, thereby significantly improving classification accuracy and robustness.
Recent years have witnessed significant technological progress in spore detection [128,129], with numerous innovative methodologies being developed to enhance agricultural disease management. Particularly in rice pathogen surveillance, microfluidic-integrated diffraction reconstruction techniques have emerged as a prominent research focus [130,131,132,133,134,135,136]. Yang et al. [137] developed a novel rice blast spore detection method utilizing diffraction image light field analysis combined with advanced texture feature extraction. Comparative experiments revealed superior identification accuracy compared to conventional manual microscopy. Following neural network optimization, the system demonstrated rapid diagnostic capability, achieving spore classification within seconds while maintaining 97.18% test accuracy. This breakthrough represents a significant advancement in field-deployable phytopathogen detection technology. In a complementary approach, researchers have implemented a microfluidic-based purification system for airborne spores. Through meticulous optimization of chip architecture, material selection, and fabrication protocols, the team successfully overcame key challenges, including environmental contaminant interference and low target spore concentrations [138]. For greenhouse crop applications, the integration of diffraction–polarization imaging signatures with machine learning algorithms has demonstrated particular efficacy. A specialized imaging system was developed to acquire spore diffraction fingerprints across multiple polarization states. Subsequent feature extraction and classification using an SVM algorithm yielded exceptional taxonomic discrimination among prevalent phytopathogens: Botrytis cinerea (tomato gray mold), Pseudoperonospora cubensis (cucumber downy mildew), and Podosphaera xanthii (cucumber powdery mildew). The system achieved a mean classification accuracy of 95.85%, establishing a robust framework for automated spore identification and categorization [139].
To overcome the limitations of conventional diffraction methods in discriminating morphologically similar spores, researchers have pioneered large-field, lens-free, multispectral diffraction recognition systems. Through the development of customized optical configurations coupled with advanced computational algorithms, these innovative techniques achieve precise taxonomic differentiation of various spore species. Customized optical configurations play a critical role in enhancing the accuracy of spore taxonomic identification. By tailoring the optical path—including the illumination wavelength range, angular uniformity, and sample-to-sensor distance—the system maximizes the resolution and contrast of diffraction patterns. These adjustments allow for clearer visualization of species-specific fringe details that may otherwise be lost in standard setups. Furthermore, spectral band selection is optimized to highlight subtle differences in the spores’ optical responses, improving the separability of morphologically similar species. This targeted enhancement of optical contrast directly contributes to improved classification performance and model robustness in downstream machine learning analysis. Meanwhile, advanced computational algorithms play a pivotal role in decoding multispectral diffraction data for precise spore classification. Given that diffraction fingerprint images are often visually indistinguishable due to their complex and subtle patterns, manual analysis is impractical. Each spore species, however, exhibits unique spectral–diffractive responses across different illumination wavelengths. Machine learning models, particularly convolutional neural networks (CNNs), are capable of automatically extracting discriminative features from these high-dimensional datasets. By integrating spectral and spatial information, these algorithms enhance class separability and classification robustness. Additionally, techniques such as spectral normalization, dimensionality reduction, and multi-channel fusion further contribute to improved model performance and generalization. The integration of lens-free multispectral diffraction technology improves the scalability of agricultural pathogen surveillance by offering a compact, cost-effective, and easily deployable solution. Its lensless architecture eliminates bulky optical components, simplifying the system’s structure and enabling miniaturization. This makes the platform highly adaptable to field environments such as greenhouses, open farms, and storage facilities. Furthermore, its compatibility with automated sampling devices and microfluidic modules facilitates high-throughput and distributed deployment, supporting large-scale pathogen monitoring in real-world agricultural settings [140]. Spore detection based on diffraction imaging techniques is summarized in Table 3.
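As a toy illustration of the processing chain just described (spectral normalization, dimensionality reduction, and fusion of per-band features), the sketch below separates two hypothetical spore classes with a nearest-centroid rule standing in for a trained CNN; all feature values and class parameters are synthetic.

```python
import numpy as np

def normalize_spectral(X):
    """Spectral normalization: zero mean, unit variance per illumination band."""
    return (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-12)

def reduce_pca(X, n_components):
    """Dimensionality reduction via SVD-based principal component analysis."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

def nearest_centroid(train_X, train_y, test_X):
    """Assign each sample to the class whose training centroid is closest."""
    classes = np.unique(train_y)
    centroids = np.stack([train_X[train_y == c].mean(axis=0) for c in classes])
    dists = np.linalg.norm(test_X[:, None, :] - centroids[None, :, :], axis=2)
    return classes[dists.argmin(axis=1)]

# Synthetic fused features: mean fringe contrast under three illumination
# bands for two hypothetical spore species with overlapping morphology.
rng = np.random.default_rng(0)
a = rng.normal([1.0, 2.0, 3.0], 0.2, size=(100, 3))
b = rng.normal([1.0, 2.5, 2.2], 0.2, size=(100, 3))
X = normalize_spectral(np.vstack([a, b]))
y = np.array([0] * 100 + [1] * 100)
Z = reduce_pca(X, 2)
pred = nearest_centroid(Z[::2], y[::2], Z[1::2])
accuracy = (pred == y[1::2]).mean()
```

The point of the sketch is that species that overlap in any single band become separable once the per-band responses are normalized and fused, which is exactly the property the multispectral systems above exploit.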

3.2. Pathogenic Spore Counting

Spore counting constitutes a fundamental procedure in microbiological and agricultural research. Conventional methodologies predominantly rely on microscopic enumeration and culture-based approaches. The microscopic counting technique typically utilizes a hemocytometer, where a spore suspension aliquot is loaded into the counting chamber. Spores within predetermined grid regions are then visually quantified under optical magnification, with the total concentration derived through dilution factor calculations. Although this approach offers procedural simplicity, it exhibits significant susceptibility to observer-induced variability and demands substantial technical proficiency to maintain counting precision. Alternatively, culture-based quantification involves inoculating spore samples onto selective growth media. Subsequent incubation under controlled environmental parameters enables colony formation, with the enumerated colony-forming units serving as a proxy for initial spore concentration. This methodology provides the distinct advantage of discriminating between metabolically active and inactive spores. Nevertheless, these techniques necessitate extended processing durations and rigorous maintenance of cultivation parameters to ensure reproducible outcomes.
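The dilution arithmetic behind hemocytometer counting is simple enough to make explicit. The sketch below assumes a standard Neubauer chamber, where one large square holds 0.1 µL; the counts and dilution factor in the example are invented for illustration.

```python
def hemocytometer_concentration(counts, square_volume_ul=0.1, dilution_factor=1):
    """Estimate spore concentration (spores/mL) from a counting chamber.

    counts: spores counted in each large grid square
    square_volume_ul: volume above one square in microliters
                      (0.1 uL for a standard Neubauer large square)
    dilution_factor: fold dilution applied before loading the chamber
    """
    mean_per_square = sum(counts) / len(counts)
    spores_per_ml = mean_per_square / square_volume_ul * 1000.0  # uL -> mL
    return spores_per_ml * dilution_factor

# Five squares counted on a 10-fold diluted suspension:
concentration = hemocytometer_concentration([42, 38, 45, 40, 35], 0.1, 10)
# mean count 40 -> 40 / 0.1 uL * 1000 * 10 = 4.0e6 spores/mL
```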
In recent years, diffraction imaging—particularly lens-free diffraction imaging—has emerged as a promising technique for spore quantification. This approach enables rapid and accurate spore counting through analysis of characteristic diffraction patterns. Zhang et al. [151] presented a crop disease monitoring system that utilizes airborne spore sensors based on diffraction light scattering patterns to identify and track pathogen sources. By analyzing the unique diffraction signatures of fungal spores in real time, the sensor network enables precise disease source localization and field-scale surveillance, offering a novel approach for early disease detection and management in agricultural fields. The system acquires high-resolution, real-time images of individual spores and precisely quantifies spore counts. By integrating with an Internet of Things (IoT) platform and incorporating environmental data from multiple sensors, it facilitates real-time disease evaluation and delivers actionable, high-accuracy monitoring information to farmers. Such studies demonstrate the notable benefits of diffraction imaging for spore enumeration and its potential as a reliable tool for early detection and prevention of agricultural diseases.
In a greenhouse environment, a method for the rapid detection of Botrytis cinerea spores in tomatoes was proposed based on microfluidic chip enrichment and lensless diffraction image processing. In this approach, airborne spores were effectively concentrated using a microfluidic device, and their diffraction patterns were captured by a lensless imaging system [136]. Image processing algorithms were then applied to analyze the spore information, enabling rapid identification and detection. Although this study did not specifically conduct quantitative spore counting experiments, its image acquisition and processing framework demonstrates strong scalability, providing both theoretical and technical support for future applications in spore concentration estimation and automatic counting. In addition, several other studies, while not directly aimed at spore quantification, have successfully achieved species-level spore classification using diffraction texture features or polarization fingerprint images. These findings suggest that combining diffraction image texture extraction techniques with deep learning-based counting networks may allow simultaneous identification and quantification of spores from a single image, thus advancing spore detection toward higher precision and automation. Related studies are summarized in the ‘Other applications’ section of Table 3.
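Once spores have been reconstructed or localized in an image, enumeration reduces to segmenting compact bright regions. The following is a minimal numpy sketch using a global threshold and 4-connected flood fill; the threshold and minimum-area values are illustrative and would need tuning for real diffraction reconstructions.

```python
import numpy as np

def count_blobs(image, threshold, min_area=4):
    """Count connected bright regions (candidate spores) in a 2-D image."""
    mask = image > threshold
    visited = np.zeros_like(mask, dtype=bool)
    h, w = mask.shape
    count = 0
    for sy in range(h):
        for sx in range(w):
            if mask[sy, sx] and not visited[sy, sx]:
                # Flood-fill one 4-connected component and measure its area.
                stack, area = [(sy, sx)], 0
                visited[sy, sx] = True
                while stack:
                    y, x = stack.pop()
                    area += 1
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] and not visited[ny, nx]:
                            visited[ny, nx] = True
                            stack.append((ny, nx))
                if area >= min_area:  # reject single-pixel noise
                    count += 1
    return count
```

Production systems would typically replace the global threshold with adaptive segmentation and add size or texture filters to exclude dust and other non-target particles, as the microfluidic enrichment work above does upstream of imaging.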

3.3. Cell Viability Detection

Cell viability detection holds significant importance in agricultural microbiology, pesticide toxicity assessment, and the development of biocontrol agents. Cell viability directly impacts the efficacy of microbial fertilizers and the precision of pesticide toxicity evaluations. Viability, as a biological indicator, reflects the collective behavior of large cell populations and thus requires statistical analysis of high-throughput sample data for accurate assessment. Traditional detection methods—such as the MTT colorimetric assay, trypan blue staining, and flow cytometry—offer high sensitivity and accuracy. However, these techniques typically require complex sample preparation, expensive instruments, or chemical reagents, which limits their applicability in field-based diagnostics. In recent years, diffraction imaging-based methods for assessing cell viability have attracted growing research interest. In particular, lens-free diffraction imaging technology has demonstrated significant potential for evaluating the viability of agricultural pathogenic microorganisms. Its advantages include label-free operation, low cost, and high-throughput capacity, which makes it well-suited for on-site viability assessment in agricultural settings [152].
Recent studies have leveraged diffraction-based technologies to develop high-throughput, lens-free three-dimensional (3D) imaging techniques capable of label-free 3D tracking of over 1500 sperm cells simultaneously. These investigations revealed that only 4% to 5% of motile sperm exhibited characteristic helical swimming patterns, a behavior found to be inhibited by seminal plasma. Moreover, approximately 90% of helical sperm displayed right-handed helices, and key motility parameters such as helix radius were quantified, providing a powerful high-throughput tool for microbiological research [153]. Furthermore, Li et al. [126] proposed two quantitative parameters—fringe intensity contrast (FIC) and fringe dispersion (FD)—to describe the clarity and spatial distribution of cell diffraction patterns. It was observed that increasing methylmercury concentrations led to a marked decrease in FIC and an increase in FD, reflecting a loss of membrane integrity and fringe pattern degradation, thereby indirectly indicating reduced cell viability. Compared to conventional biochemical assays requiring staining or labeling, this diffraction-based approach enables non-invasive, real-time monitoring of live cells. In that study, BRL cells were exposed to different concentrations of methylmercury, and the FIC and FD parameters were measured for each cell using a self-developed lens-free diffraction imaging system. In parallel, the classical MTT colorimetric assay was used to assess cell viability, and the results from both methods were compared to evaluate their consistency under varying methylmercury exposures. The FIC and FD parameters showed high consistency with the MTT results, confirming their reliability in reflecting cell viability. However, the generalizability of this technique to other cell types may be limited by factors such as cell size, morphological consistency, and diffraction signal strength, suggesting the need for further standardization and image processing enhancement. Notably, this technique eliminates the need for labeling agents and avoids periodic biochemical reaction steps, thereby improving the accuracy and efficiency of cell viability monitoring.
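The cited work defines FIC and FD precisely; as a rough sketch of how such scalar descriptors are computed from a diffraction pattern, the functions below use plausible stand-ins, a Michelson-style contrast for FIC and the radial spread of intensity about the pattern centroid for FD. They are not the authors' definitions.

```python
import numpy as np

def fringe_intensity_contrast(pattern):
    """Michelson-style contrast of a diffraction pattern (assumed FIC stand-in):
    high for sharp, well-defined fringes, low for washed-out patterns."""
    i_max, i_min = pattern.max(), pattern.min()
    return (i_max - i_min) / (i_max + i_min + 1e-12)

def fringe_dispersion(pattern):
    """Radial spread of intensity about the centroid (assumed FD stand-in):
    grows as fringe energy disperses away from the pattern center."""
    h, w = pattern.shape
    y, x = np.mgrid[0:h, 0:w]
    total = pattern.sum() + 1e-12
    cy = (y * pattern).sum() / total
    cx = (x * pattern).sum() / total
    r2 = (y - cy) ** 2 + (x - cx) ** 2
    return np.sqrt((r2 * pattern).sum() / total)
```

Under these stand-in definitions, a degraded cell whose fringes blur and spread would show exactly the reported trend: decreasing contrast and increasing dispersion.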
Meanwhile, studies have employed high-throughput lensless imaging technologies to monitor heterogeneous cell suspensions, providing valuable insights for agricultural applications such as cell counting and activity assessment. Su et al. [154] proposed a chip-based platform utilizing lens-free diffraction imaging (LUCAS), which is capable of capturing diffraction patterns of multilayered cells within a 4 mL solution across a field of view of approximately 10 cm² and a depth of 4 mm, all within less than one second. The integrated algorithm enables simultaneous 3D localization and classification of cells. The platform’s automation and high-throughput capabilities offer technological inspiration for the concurrent analysis of cell viability and population in agricultural microbial fertilizers. Drawing from the imaging and analytical strategies of such lensless diffraction systems, there is considerable potential for broader applications in agriculture, including plant tissue culture and seed viability assessment. For instance, in plant tissue culture, this methodology could be adapted for non-destructive viability screening of cell clusters, thereby improving the stability of the cultivation system [154]. In seed screening, the diffraction texture patterns generated by active microcells within the embryo may serve as a promising non-contact indicator of germination potential [155]. These cross-disciplinary applications highlight the bridging potential of diffraction imaging technology between agricultural fundamental research and field-based applications.

3.4. Other Agricultural Applications of Diffraction Imaging Technology

Lens-free diffraction imaging technology offers unique advantages in agricultural microbiological detection. For example, Jiang et al. [144] proposed a microbial species identification method based on colony fingerprinting. Using lens-free imaging, researchers captured images of microcolonies (with diameters ranging from 10 to 500 μm) and distinguished different microbial species by analyzing the morphological and physiological characteristics present in these images. Beyond microbial detection, this technology also shows transformative potential in livestock and poultry health monitoring. Chen et al. [156] introduced a particle separation and detection system that integrates microfluidics with lens-free imaging. The system employs standing surface acoustic waves (SSAW) to achieve efficient particle separation, followed by real-time detection of the separated particles using lens-free imaging. Compared to conventional PCR-based methods, lens-free diffraction imaging offers greater convenience, making it ideally suited for large-scale, field-based monitoring applications.
Lens-free diffraction technology has broad and far-reaching applications. Beyond the domains discussed above, it also plays a critical role in fields such as biomedical research and materials science and demonstrates tremendous potential in areas including environmental monitoring and food safety. These capabilities provide strong technological support for advancements across a wide range of industries. Moreover, diffraction technology holds significant promise in animal disease detection. For instance, in the context of veterinary disease prevention, researchers have employed this technology to monitor the infection process of the pseudorabies virus (PRV) in swine. By analyzing the optical diffraction fingerprint patterns of cells during viral infection, they achieved high-throughput, real-time visualization of the infection process. An online, location-based detection system based on lens-free diffraction imaging was also developed. This method exhibited a linear correlation of 98.9% with conventional “gold-standard” diagnostic techniques and offers key advantages such as high automation, high throughput, and low cost. As such, it provides a novel avenue for viral disease prevention and presents a new perspective for cell-level selection and breeding of superior livestock and poultry strains [146].
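Agreement with a gold standard of the kind reported above is typically quantified as a linear (Pearson) correlation over paired measurements. The values below are invented purely to illustrate the computation; they are not data from the cited study.

```python
import numpy as np

# Hypothetical paired infection-rate measurements (illustrative values only):
lensfree = np.array([0.12, 0.25, 0.41, 0.55, 0.73, 0.88])  # lens-free imaging
gold = np.array([0.10, 0.27, 0.40, 0.57, 0.70, 0.90])      # gold-standard assay

# Pearson linear correlation coefficient between the two methods.
r = np.corrcoef(lensfree, gold)[0, 1]
```

A correlation near 1.0 over a sufficiently varied sample set is what justifies substituting the cheaper, automated measurement for the laborious reference assay.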

4. Conclusions

Both hyperspectral and diffraction imaging offer distinct advantages for agricultural diagnostics, delivering efficient, non-destructive solutions across disease monitoring, crop growth analysis, agricultural product quality assessment, and pathogen detection. Capitalizing on their complementarity, recent studies are pioneering the integration of these technologies. For instance, a novel image reconstruction framework fuses diffraction-induced rotationally blurred images and sharp reference images to achieve high-quality hyperspectral image reconstructions [157]. Jeon et al. [158] introduced a hyperspectral imaging system that integrates diffraction optics with rotational snapshot imaging, aimed at addressing the limitations of conventional hyperspectral systems in terms of imaging speed and device size. This system replaces traditional dispersive components, coded apertures, and relay lenses with diffraction optical elements (DOEs), thereby achieving a more compact design and significantly faster imaging. Additionally, a novel multispectral snapshot diffraction-based metrology technique was presented, wherein the linear relationship between off-diagonal elements of the Mueller matrix and stacking errors was analyzed. This enabled wide-spectrum measurements in a single exposure, dramatically improving both measurement speed and system stability compared to traditional methods. These innovations demonstrate cross-domain potential, with applicability extending to nanoscale overlay metrology in semiconductor manufacturing [159].
Although these initial integration efforts are promising, several bottlenecks remain in practical applications [160]. The widespread adoption of these technologies continues to be constrained by factors such as the complexity of data processing, equipment costs, and environmental adaptability. Future research must therefore prioritize the following:

4.1. Integration of Hyperspectral and Diffraction Imaging

To date, relatively few studies have explored the deep integration of hyperspectral and diffraction imaging technologies. Future research may adopt multimodal data fusion strategies to combine the rich spectral information of hyperspectral imaging with the microstructural analysis capabilities of diffraction imaging, thereby enhancing diagnostic precision. For example, in disease diagnostics, hyperspectral imaging may facilitate large-scale screening, while diffraction imaging could further identify pathogenic spores, enabling precise, multiscale detection.

4.2. Advanced Data Analytics and Artificial Intelligence

Both hyperspectral and diffraction imaging generate vast volumes of data, and traditional analytical methods incur high computational costs, limiting their feasibility for real-time agricultural monitoring. Future work could leverage deep learning and machine learning algorithms to optimize data processing and improve both accuracy and speed. For instance, multimodal analysis models combining CNNs with attention mechanisms could be developed to enhance early disease recognition capabilities.
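As a toy illustration of the attention idea in this suggestion, the sketch below fuses feature vectors from two modalities with softmax weights; the norm-based scoring is a placeholder for the scoring function a trained network would learn, and the modality names are arbitrary labels.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(z - z.max())
    return e / e.sum()

def attention_fuse(features):
    """Score each modality's feature vector and fuse by attention weights.

    features: dict mapping modality name -> 1-D feature vector (equal length).
    The norm-based score below is a stand-in; a trained model learns this.
    """
    names = list(features)
    F = np.stack([features[n] for n in names])
    scores = np.array([np.linalg.norm(f) for f in F])
    weights = softmax(scores)
    fused = weights @ F  # weighted combination of modality features
    return fused, dict(zip(names, weights))
```

The practical appeal of attention-style fusion is that the weighting is input-dependent: when one modality is uninformative for a given sample (e.g. diffraction features of a clean leaf), its contribution to the fused representation shrinks automatically.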

4.3. Development of Low-Cost, Portable Devices

At present, the high cost of hyperspectral and diffraction imaging equipment constrains their application in small- to medium-scale agricultural settings. Future efforts must prioritize the development of portable, low-cost sensing devices that integrate optoelectronic sensors, MEMS technologies, and mobile computing platforms to improve accessibility. For example, compact hyperspectral diffraction imaging systems mounted on small UAVs may facilitate real-time field monitoring.

4.4. Enhanced Environmental Adaptability and Robustness

Field environments are highly variable, with factors such as lighting, temperature, and humidity affecting imaging quality. Future research could employ adaptive optical correction and environmental factor modeling to improve system stability. For instance, integrating spectral correction algorithms could mitigate the influence of ambient lighting on hyperspectral data, thereby enhancing detection reliability.
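One standard spectral correction of this kind is white/dark reference calibration, which normalizes raw hyperspectral counts against reference frames so that reflectance estimates are comparable across lighting conditions; a minimal sketch, with frame shapes and values chosen purely for illustration:

```python
import numpy as np

def reflectance_calibration(raw, white, dark):
    """Flat-field correction: convert raw sensor counts to relative reflectance
    using a white reference frame and a dark (shutter-closed) frame."""
    return (raw - dark) / (white - dark + 1e-12)

# A pixel reading halfway between the dark and white references maps to
# reflectance 0.5 regardless of the local illumination level.
```

In field deployments the white reference is re-acquired as the ambient light changes, which is precisely how calibration of this form decouples the measured spectra from illumination drift.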

4.5. Interdisciplinary Integration and Smart Agriculture Applications

The advancement of hyperspectral and diffraction imaging technologies requires deep integration with disciplines such as agricultural science, materials science, computer vision, and robotics, to drive the development of intelligent agriculture. For example, intelligent agricultural decision-support systems could be designed to integrate multisource sensor data and enable precision agricultural management.

5. Future Prospects

Hyperspectral and diffraction imaging technologies hold significant promise for advancing agricultural diagnostics. However, key technical challenges remain, including equipment costs, data processing complexity, and environmental adaptability. In addition to these general challenges, deploying lens-free multispectral diffraction systems in practical agricultural settings presents specific obstacles. Variations in lighting conditions, airborne particulates, and humidity can compromise image consistency. Moreover, real-world samples often contain non-target contaminants, making it difficult to isolate spores without upstream filtration. Ensuring long-term operational stability and preventing optical misalignment or sensor fouling during field use are critical concerns that must be addressed for reliable deployment. Looking forward, the integration of multimodal data fusion, artificial intelligence-driven optimization, development of low-cost portable devices, and incorporation into intelligent agricultural systems will further enhance the precision and intelligence of agricultural diagnostics. These advancements will contribute to the sustainable development of modern agriculture.

Author Contributions

Conceptualization, L.C.; methodology, Y.W.; data curation, Z.S.; image analysis, L.C.; writing—original draft, L.C. and Y.W.; writing—review and editing, N.Y.; funding acquisition, N.Y. and Z.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Key Research and Development Program for Young Scientists Project (2022YFD2000200) of the Ministry of Science and Technology of China, the National Natural Science Foundation of China (General Program) (32171895), the Key Research and Development Program of Jiangsu Province (Project) (BE2023017-2), the Agricultural Science and Technology Independent Innovation Fund of Jiangsu Province (CX(23)3041), and the Key Research and Development Program of Zhenjiang City (NY2023002).

Institutional Review Board Statement

Not applicable.

Data Availability Statement

The original contributions presented in this study are included in the article. Further inquiries can be directed to the corresponding author.

Acknowledgments

We thank the editor and reviewers for their helpful suggestions to improve the quality of this manuscript.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
PCA: principal component analysis
CNN: convolutional neural network
SVM: support vector machine
GNN: graph neural network
ARMA: autoregressive moving average
GAT: graph attention network
CARS-Ridge: competitive adaptive reweighted sampling-ridge
VIS-NIR: visible-near infrared
LDA: linear discriminant analysis
KNN: k-nearest neighbor
PLS-DA: partial least squares discriminant analysis
FFNN: feedforward neural network
UAV: unmanned aerial vehicle
RGB: red–green–blue
DNN: deep neural network
ANN-BPN: artificial neural network with backpropagation
HSI: hyperspectral imaging
SEM: scanning electron microscopy
MTT: 3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyltetrazolium bromide
FOV: field-of-view
FIC: fringe intensity contrast
FD: fringe dispersion
LUCAS: lensfree ultra-wide-field cell monitoring array platform based on shadow imaging
SSAW: standing surface acoustic wave
DOE: diffraction optical element
SPA: successive projections algorithm
GLCM: gray-level co-occurrence matrix

  39. Yao, K.S.; Sun, J.; Cheng, J.H.; Xu, M.; Chen, C.; Zhou, X.; Dai, C.X. Development of Simplified Models for Non-Destructive Hyperspectral Imaging Monitoring of S-ovalbumin Content in Eggs during Storage. Foods 2022, 11, 2024. [Google Scholar] [CrossRef]
  40. Yang, F.Y.; Sun, J.; Cheng, J.H.; Fu, L.H.; Wang, S.M.; Xu, M. Detection of starch in minced chicken meat based on hyperspectral imaging technique and transfer learning. J. Food Process Eng. 2023, 46, e14304. [Google Scholar] [CrossRef]
  41. Zhang, X.; Jiang, X.; Jiang, J.; Zhang, Y.; Liu, X.; Cai, Z. Spectral–Spatial and Superpixelwise PCA for Unsupervised Feature Extraction of Hyperspectral Imagery. IEEE T. Geosci. Remote 2022, 60, 1–10. [Google Scholar] [CrossRef]
  42. Gao, S.; Xu, J.H. Hyperspectral image information fusion-based detection of soluble solids content in red globe grapes. Comput. Electron. Agr. 2022, 196, 106822. [Google Scholar] [CrossRef]
  43. Mei, S.H.; Geng, Y.H.; Hou, J.H.; Du, Q. Learning hyperspectral images from RGB images via a coarse-to-fine CNN. Sci. China Inf. Sci. 2021, 65, 152102. [Google Scholar] [CrossRef]
  44. Yang, J.Y.; Lee, Y.K.; Chi, J. Spectral unmixing-based Arctic plant species analysis using a spectral library and terrestrial hyperspectral Imagery: A case study in Adventdalen, Svalbard. Int. J. Appl. Earth Obs. 2023, 125, 103583. [Google Scholar] [CrossRef]
  45. Jia, S.; Zhao, Q.Q.; Zhuang, J.Y.; Tang, D.D.; Long, Y.Q.; Xu, M. Flexible Gabor-Based Superpixel-Level Unsupervised LDA for Hyperspectral Image Classification. IEEE T. Geosci. Remote 2021, 59, 10394–10409. [Google Scholar] [CrossRef]
  46. Li, D.; Neira-Molina, H.; Huang, M.; Syam, S.M.; Zhang, Y.; Feng, Z.J.; Bhatti, U.A.; Asif, M.; Sarhan, N.; Awwad, E.M. CSTFNet: A CNN and Dual Swin-Transformer Fusion Network for Remote Sensing Hyperspectral Data Fusion and Classification of Coastal Areas. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2025, 18, 5853–5865. [Google Scholar] [CrossRef]
  47. Uddin, M.P.; Mamun, M.A.; Hossain, M.A. Effective feature extraction through segmentation-based folded-PCA for hyperspectral image classification. Int. J. Remote Sens. 2019, 40, 7190–7220. [Google Scholar] [CrossRef]
  48. Uddin, M.P.; Mamun, M.A.; Afjal, M.I.; Hossain, M.A. Information-theoretic feature selection with segmentation-based folded principal component analysis (PCA) for hyperspectral image classification. Int. J. Remote Sens. 2021, 42, 286–321. [Google Scholar] [CrossRef]
  49. Chen, Y.X.; Liu, Z.H.; Chen, Z.Q. AMS: A hyperspectral image classification method based on SVM and multi-modal attention network. Knowl. Based Syst. 2025, 314, 113236. [Google Scholar] [CrossRef]
  50. Sahadevan, A.S. Extraction of spatial-spectral homogeneous patches and fractional abundances for field-scale agriculture monitoring using airborne hyperspectral images. Comput. Electron. Agr. 2021, 188, 106325. [Google Scholar] [CrossRef]
  51. Zhang, Z.; Huang, L.H.; Tang, B.; Wang, Q.W.; Ge, Z.X.; Jiang, L.H. Non-Euclidean Spectral-Spatial feature mining network with Gated GCN-CNN for hyperspectral image classification. Expert Syst. Appl. 2025, 272, 126811. [Google Scholar] [CrossRef]
  52. Lin, J.; Mou, L.; Zhu, X.X.; Ji, X.; Wang, Z.J. Attention-Aware Pseudo-3-D Convolutional Neural Network for Hyperspectral Image Classification. IEEE T. Geosci. Remote 2021, 59, 7790–7802. [Google Scholar] [CrossRef]
  53. Yang, J.; Sun, J.; Ren, Y.P.; Li, S.B.; Ding, S.J.; Hu, J.J. GACP: Graph neural networks with ARMA filters and a parallel CNN for hyperspectral image classification. Int. J. Digit. Earth 2023, 16, 1770–1800. [Google Scholar] [CrossRef]
  54. Peng, M.; Liu, Y.X.; Qadri, I.A.; Bhatti, U.A.; Ahmed, B.; Sarhan, N.M.; Awwad, E.M. Advanced image segmentation for precision agriculture using CNN-GAT fusion and fuzzy C-means clustering. Comput. Electron. Agr. 2024, 226, 109431. [Google Scholar] [CrossRef]
  55. Zhu, C.M.; Ding, J.L.; Zhang, Z.P.; Wang, Z. Exploring the potential of UAV hyperspectral image for estimating soil salinity: Effects of optimal band combination algorithm and random forest. Spectrochim. Acta Part A Mol. Biomol. Spectrosc. 2022, 279, 121416. [Google Scholar] [CrossRef] [PubMed]
  56. Ali, M.A.; Lyu, X.; Ersan, M.S.; Xiao, F. Critical evaluation of hyperspectral imaging technology for detection and quantification of microplastics in soil. J. Hazard. Mater. 2024, 476, 135041. [Google Scholar] [CrossRef]
  57. Chen, H.; Shin, T.; Park, B.; Ro, K.; Jeong, C.; Jeon, H.J.; Tan, P. Coupling hyperspectral imaging with machine learning algorithms for detecting polyethylene (PE) and polyamide (PA) in soils. J. Hazard. Mater. 2024, 471, 134346. [Google Scholar] [CrossRef]
  58. Jiang, R.Z.; Sui, Y.Y.; Zhang, X.; Lin, N.; Zheng, X.M.; Li, B.Z.; Zhang, L.; Li, X.K.; Yu, H.Y. Estimation of soil organic carbon by combining hyperspectral and radar remote sensing to reduce coupling effects of soil surface moisture and roughness. Geoderma 2024, 444, 116874. [Google Scholar] [CrossRef]
  59. Li, X.Y.; Qiu, H.M.; Fan, P.P. A review of spectral feature extraction and multi-feature fusion methods in predicting soil organic carbon. Appl. Spectrosc. Rev. 2025, 60, 78–101. [Google Scholar] [CrossRef]
  60. Xu, H.; Ran, J.; Chen, M.X.; Sui, B.W.; Bai, X.Y. Non-Destructive Testing Based on Hyperspectral Imaging for Determination of Available Silicon and Moisture Contents in Ginseng Soils of Different Origins. J. Food Sci. 2025, 90, e70285. [Google Scholar] [CrossRef]
  61. Bohnenkamp, D.; Behmann, J.; Mahlein, A. In-Field Detection of Yellow Rust in Wheat on the Ground Canopy and UAV Scale. Remote Sens. 2019, 11, 2495. [Google Scholar] [CrossRef]
  62. Moghimi, A.; Yang, C.; Anderson, J.A. Aerial hyperspectral imagery and deep neural networks for high-throughput yield phenotyping in wheat. Comput. Electron. Agr. 2020, 172, 105299. [Google Scholar] [CrossRef]
  63. Ahmed, M.T.; Monjur, O.; Khaliduzzaman, A.; Kamruzzaman, M. A comprehensive review of deep learning-based hyperspectral image reconstruction for agri-food quality appraisal. Artif. Intell. Rev. 2025, 58, 96. [Google Scholar] [CrossRef]
  64. Genangeli, A.; Avola, G.; Bindi, M.; Cantini, C.; Cellini, F.; Riggi, E.; Gioli, B. A Novel Correction Methodology to Improve the Performance of a Low-Cost Hyperspectral Portable Snapshot Camera. Sensors 2023, 23, 9685. [Google Scholar] [CrossRef]
  65. Arad, O.; Cheplanov, L.; Afgin, Y.; Reshef, L.; Brikman, R.; Elatrash, S.; Stern, A.; Tsror, L.; Bonfil, D.J.; Klapp, I. Low-Cost Dispersive Hyperspectral Sampling Scanner for Agricultural Imaging Spectrometry. IEEE Sens. J. 2023, 23, 18292–18303. [Google Scholar] [CrossRef]
  66. Li, Q.F.; Yang, Y.P.; Tan, M.; Xia, H.; Peng, Y.X.; Fu, X.R.; Huang, Y.G.; Yang, X.P.; Ma, X.Y. Rapid pesticide residues detection by portable filter-array hyperspectral imaging. Spectrochim. Acta. Part A Mol. Biomol. Spectrosc. 2025, 330, 125703. [Google Scholar] [CrossRef] [PubMed]
  67. Habib, A.; Han, Y.; Xiong, W.F.; He, F.N.; Zhang, Z.; Crawford, M. Automated Ortho-Rectification of UAV-Based Hyperspectral Data over an Agricultural Field Using Frame RGB Imagery. Remote Sens. 2016, 8, 796. [Google Scholar] [CrossRef]
  68. Mäkynen, J.; Litkey, P.; Hakala, T.; Pölönen, I.; Kaivosoja, J.; Saari, H.; Honkavaara, E.; Pesonen, L. Processing and Assessment of Spectrometric, Stereoscopic Imagery Collected Using a Lightweight UAV Spectral Camera for Precision Agriculture. Remote Sens. 2013, 5, 5006–5039. [Google Scholar]
  69. Sun, M.J.; Jing, X.T.; Ma, Y.X.; Huang, H.X. Lensless imaging via LED array based computational ghost imaging. Opt. Laser Technol. 2025, 180, 111401. [Google Scholar] [CrossRef]
  70. Greenbaum, A.; Zhang, Y.B.; Feizi, A.; Chung, P.; Luo, W.; Kandukuri, S.R.; Ozcan, A. Wide-field computational imaging of pathology slides using lens-free on-chip microscopy. Sci. Transl. Med. 2014, 6, 267ra175. [Google Scholar] [CrossRef]
  71. Xu, M.; Sun, J.; Zhou, X.; Tang, N.Q.; Shen, J.F.; Wu, X.H. Research on nondestructive identification of grape varieties based on EEMD-DWT and hyperspectral image. J. Food Sci. 2021, 86, 2011–2023. [Google Scholar] [CrossRef] [PubMed]
  72. Jia, W.Y.; Ferragina, A.; Hamill, R.; Koidis, A. Modelling and numerical methods for identifying low-level adulteration in ground beef using near-infrared hyperspectral imaging (NIR-HSI). Talanta 2024, 276, 126199. [Google Scholar] [CrossRef]
  73. Savary, S.; Willocquet, L.; Pethybridge, S.J.; Esker, P.; McRoberts, N.; Nelson, A. The global burden of pathogens and pests on major food crops. Nat. Ecol. Evol. 2019, 3, 430–439. [Google Scholar] [CrossRef]
  74. Shen, G.H.; Cao, Y.Y.; Yin, X.C.; Dong, F.; Xu, J.H.; Shi, J.R.; Lee, Y. Rapid and nondestructive quantification of deoxynivalenol in individual wheat kernels using near-infrared hyperspectral imaging and chemometrics. Food Control 2022, 131, 108420. [Google Scholar] [CrossRef]
  75. Liu, J.; Wang, X.W. Plant diseases and pests detection based on deep learning: A review. Plant Methods 2021, 17, 22. [Google Scholar] [CrossRef]
  76. Lu, B.; Sun, J.; Yang, N.; Wu, X.H.; Zhou, X. Identification of tea white star disease and anthrax based on hyperspectral image information. J. Food Process Eng. 2021, 44, e13584. [Google Scholar] [CrossRef]
  77. Wan, L.; Li, H.; Li, C.S.; Wang, A.C.; Yang, Y.H.; Wang, P. Hyperspectral Sensing of Plant Diseases: Principle and Methods. Agronomy 2022, 12, 1451. [Google Scholar] [CrossRef]
  78. Zhang, X.D.; Wang, Y.F.; Zhou, Z.K.; Zhang, Y.X.; Wang, X.Z. Detection Method for Tomato Leaf Mildew Based on Hyperspectral Fusion Terahertz Technology. Foods 2023, 12, 535. [Google Scholar] [CrossRef]
  79. Jiang, X.P.; Zhen, J.N.; Miao, J.; Zhao, D.M.; Shen, Z.; Jiang, J.C.; Gao, C.J.; Wu, G.F.; Wang, J.J. Newly-developed three-band hyperspectral vegetation index for estimating leaf relative chlorophyll content of mangrove under different severities of pest and disease. Ecol. Indic. 2022, 140, 108978. [Google Scholar] [CrossRef]
  80. Feng, L.; Wu, B.H.; He, Y.; Zhang, C. Hyperspectral Imaging Combined With Deep Transfer Learning for Rice Disease Detection. Front. Plant Sci. 2021, 12, 693521. [Google Scholar] [CrossRef]
  81. Zhang, H.; Zhao, J.; Huang, L.; Huang, W.; Dong, Y.; Ma, H.; Ruan, C. Development of new indices and use of CARS-Ridge algorithm for wheat fusarium head blight detection using in-situ hyperspectral data. Biosyst. Eng. 2024, 237, 13–25. [Google Scholar] [CrossRef]
  82. Rady, A.; Ekramirad, N.; Adedeji, A.A.; Li, M.; Alimardani, R. Hyperspectral imaging for detection of codling moth infestation in GoldRush apples. Postharvest Biol. Technol. 2017, 129, 37–44. [Google Scholar] [CrossRef]
  83. Cheshkova, A.F. A review of hyperspectral image analysis techniques for plant disease detection and identification. Vavilovskii Zh Genet. 2022, 26, 202–213. [Google Scholar]
  84. Shi, Y.; Han, L.X.; Kleerekoper, A.; Chang, S.; Hu, T.L. A Novel CropdocNet for Automated Potato Late Blight Disease Detection from the Unmanned Aerial Vehicle-based Hyperspectral Imagery. arXiv 2021, arXiv:2107.13277. [Google Scholar] [CrossRef]
  85. Gu, Q.; Sheng, L.; Zhang, T.H.; Lu, Y.W.; Zhang, Z.J.; Zheng, K.F.; Hu, H.; Zhou, H.K. Early detection of tomato spotted wilt virus infection in tobacco using the hyperspectral imaging technique and machine learning algorithms. Comput. Electron. Agr. 2019, 167, 105066. [Google Scholar] [CrossRef]
  86. Bai, Y.L.; Nie, C.W.; Yu, X.; Gou, M.Y.; Liu, S.B.; Zhu, Y.Q.; Jiang, T.T.; Jia, X.; Liu, Y.D.; Nan, F.; et al. Comprehensive analysis of hyperspectral features for monitoring canopy maize leaf spot disease. Comput. Electron. Agr. 2024, 225, 109350. [Google Scholar] [CrossRef]
  87. Shi, Y.; Wang, Y.Y.; Hu, X.T.; Li, Z.H.; Huang, X.W.; Liang, J.; Zhang, X.A.; Zhang, D.; Zou, X.B.; Shi, J.Y. Quantitative characterization of the diffusion behavior of sucrose in marinated beef by HSI and FEA. Meat Sci. 2023, 195, 109002. [Google Scholar] [CrossRef] [PubMed]
  88. Liu, X.; Liu, X.; Meng, K.X.; Zhang, K.X.; Yang, W.J.; Yang, J.T.; Feng, L.Y.; Gong, H.R.; Zhou, C.A. Discrimination of leaf diseases in Maize/Soybean intercropping system based on hyperspectral imaging. Front. Plant Sci. 2024, 15, 1434163. [Google Scholar] [CrossRef]
  89. Baek, I.; Kim, M.S.; Cho, B.K.; Mo, C.; Barnaby, J.Y.; McClung, A.M.; Oh, M. Selection of Optimal Hyperspectral Wavebands for Detection of Discolored, Diseased Rice Seeds. Appl. Sci. 2019, 9, 1027. [Google Scholar] [CrossRef]
  90. Qi, C.; Sandroni, M.; Westergaard, J.C.; Sundmark, E.H.R.; Bagge, M.; Alexandersson, E.; Gao, J.F. In-field early disease recognition of potato late blight based on deep learning and proximal hyperspectral imaging. arXiv 2021, arXiv:2111.12155. [Google Scholar] [CrossRef]
  91. Zhang, Y.; Li, X.; Wang, M.; Xu, T.; Huang, K.; Sun, Y.; Yuan, Q.; Lei, X.; Qi, Y.; Lv, X. Early detection and lesion visualization of pear leaf anthracnose based on multi-source feature fusion of hyperspectral imaging. Front. Plant Sci. 2024, 15, 1461855. [Google Scholar] [CrossRef]
  92. Bharadwaj, S.; Midha, A.; Sharma, S.; Sidhu, G.S.; Kumar, R. Optical screening of citrus leaf diseases using label-free spectroscopic tools: A review. J. Agr. Food Res. 2024, 18, 101303. [Google Scholar] [CrossRef]
  93. Ferreira, L.D.C.; Carvalho, I.C.B.; Jorge, L.A.D.C.; Quezado-Duval, A.M.; Rossato, M. Hyperspectral imaging for the detection of plant pathogens in seeds: Recent developments and challenges. Front. Plant Sci. 2024, 15, 1387925. [Google Scholar] [CrossRef] [PubMed]
  94. Jia, Z.C.; Duan, Q.F.; Wang, Y.; Wu, K.; Jiang, H.Z. Detection Model and Spectral Disease Indices for Poplar (Populus L.) Anthracnose Based on Hyperspectral Reflectance. Forests 2024, 15, 1309. [Google Scholar] [CrossRef]
  95. Rouš, R.; Peller, J.; Polder, G.; Hageraats, S.; Ruigrok, T.; Blok, P.M. Apple scab detection in orchards using deep learning on colour and multispectral images. arXiv 2023, arXiv:2302.08818. [Google Scholar] [CrossRef]
  96. Guo, L.; Sun, X.R.; Fu, P.; Shi, T.Z.; Dang, L.N.; Chen, Y.Y.; Linderman, M.; Zhang, G.L.; Zhang, Y.; Jiang, Q.H.; et al. Mapping soil organic carbon stock by hyperspectral and time-series multispectral remote sensing images in low-relief agricultural areas. Geoderma 2021, 398, 115118. [Google Scholar] [CrossRef]
  97. Mehedi, I.M.; Bilal, M.; Hanif, M.S.; Palaniswamy, T.; Vellingiri, M.T. Leveraging Hyperspectral Remote Sensing Imaging for Agricultural Crop Classification Using Coot Bird Optimization With Entropy-Based Feature Fusion Model. IEEE Access 2024, 12, 130214–130227. [Google Scholar] [CrossRef]
  98. Zhang, Y.T.; Wang, X.; Shan, J.J.; Zhao, J.B.; Zhang, W.; Liu, L.F.; Wu, F.C. Hyperspectral Imaging Based Method for Rapid Detection of Microplastics in the Intestinal Tracts of Fish. Environ. Sci. Technol. 2019, 53, 5151–5158. [Google Scholar] [CrossRef]
  99. Duma, Z.; Zemcik, T.; Bilik, S.; Sihvonen, T.; Honec, P.; Reinikainen, S.; Horak, K. Varroa destructor detection on honey bees using hyperspectral imagery. Comput. Electron. Agr. 2024, 224, 109219. [Google Scholar] [CrossRef]
  100. Chen, S.Y.; Chang, C.Y.; Ou, C.S.; Lien, C.T. Detection of Insect Damage in Green Coffee Beans Using VIS-NIR Hyperspectral Imaging. Remote Sens. 2020, 12, 2348. [Google Scholar] [CrossRef]
  101. Dashti, A.; Müller-Maatsch, J.; Roetgerink, E.; Wijtten, M.; Weesepoel, Y.; Parastar, H.; Yazdanpanah, H. Comparison of a portable Vis-NIR hyperspectral imaging and a snapscan SWIR hyperspectral imaging for evaluation of meat authenticity. Food Chem. X 2023, 18, 100667. [Google Scholar] [CrossRef]
  102. Zheng, L.; Zhao, M.Y.; Zhu, J.C.; Huang, L.S.; Zhao, J.L.; Liang, D.; Zhang, D.Y. Fusion of hyperspectral imaging (HSI) and RGB for identification of soybean kernel damages using ShuffleNet with convolutional optimization and cross stage partial architecture. Front. Plant Sci. 2022, 13, 1098864. [Google Scholar] [CrossRef]
  103. Sun, Y.; Liang, D.D.; Wang, X.C.; Hu, Y.H. Assessing and detection of multiple bruises in peaches based on structured hyperspectral imaging. Spectrochim. Acta Part A Mol. Biomol. Spectrosc. 2024, 304, 123378. [Google Scholar] [CrossRef] [PubMed]
  104. Vignati, S.; Tugnolo, A.; Giovenzana, V.; Pampuri, A.; Casson, A.; Guidetti, R.; Beghi, R. Hyperspectral Imaging for Fresh-Cut Fruit and Vegetable Quality Assessment: Basic Concepts and Applications. Appl. Sci. 2023, 13, 9740. [Google Scholar] [CrossRef]
  105. Jin, S.S.; Liu, X.H.; Wang, J.L.; Pan, L.Q.; Zhang, Y.M.; Zhou, G.H.; Tang, C.B. Hyperspectral imaging combined with fluorescence for the prediction of microbial growth in chicken breasts under different packaging conditions. LWT 2023, 181, 114727. [Google Scholar] [CrossRef]
  106. Wang, S.M.; Sun, J.; Fu, L.H.; Xu, M.; Tang, N.Q.; Cao, Y.; Yao, K.S.; Jing, J.P. Identification of red jujube varieties based on hyperspectral imaging technology combined with CARS-IRIV and SSA-SVM. J. Food Process Eng. 2022, 45, e14137. [Google Scholar] [CrossRef]
  107. Ge, X.; Sun, J.; Lu, B.; Chen, Q.S.; Xun, W.; Jin, Y.T. Classification of oolong tea varieties based on hyperspectral imaging technology and BOSS-LightGBM model. J. Food Process Eng. 2019, 42, e13289. [Google Scholar] [CrossRef]
  108. Tian, Y.; Sun, J.; Zhou, X.; Yao, K.S.; Tang, N.Q. Detection of soluble solid content in apples based on hyperspectral technology combined with deep learning algorithm. J. Food Process. Pres. 2022, 46, e16414. [Google Scholar] [CrossRef]
  109. Nirere, A.; Sun, J.; Atindana, V.A.; Hussain, A.; Zhou, X.; Yao, K.S. A comparative analysis of hybrid SVM and LS-SVM classification algorithms to identify dried wolfberry fruits quality based on hyperspectral imaging technology. J. Food Process. Pres. 2022, 46, e16320. [Google Scholar] [CrossRef]
  110. Terentev, A.; Badenko, V.; Shaydayuk, E.; Emelyanov, D.; Eremenko, D.; Klabukov, D.; Fedotov, A.; Dolzhenko, V. Hyperspectral Remote Sensing for Early Detection of Wheat Leaf Rust Caused by Puccinia triticina. Mol. Plant Pathol. 2023, 13, 1186. [Google Scholar]
  111. Moriya, É.A.S.; Imai, N.N.; Tommaselli, A.M.G.; Berveglieri, A.; Santos, G.H.; Soares, M.A.; Marino, M.; Reis, T.T. Detection and mapping of trees infected with citrus gummosis using UAV hyperspectral data. Comput. Electron. Agr. 2021, 188, 106298. [Google Scholar] [CrossRef]
  112. Bai, Y.L.; Jin, X.L. Hyperspectral approaches for rapid and spatial plant disease monitoring. Trends Plant Sci. 2024, 29, 711–712. [Google Scholar] [CrossRef]
  113. Xu, P.; Fu, L.; Xu, K.; Sun, W.B.; Tan, Q.; Zhang, Y.P.; Zha, X.T.; Yang, R.B. Investigation into maize seed disease identification based on deep learning and multi-source spectral information fusion techniques. J. Food Compos. Anal. 2023, 119, 105254. [Google Scholar] [CrossRef]
  114. Tan, F.; Cang, H.; Gao, X.W.; Wu, N.Y.; Di, R.Y.; Zhang, Y.; Gao, P.; Lv, X.; Zhang, C. Early detection of cotton Verticillium wilt based on generative adversarial networks and hyperspectral imaging technology. Ind. Crop. Prod. 2025, 231, 121167. [Google Scholar] [CrossRef]
  115. Liu, T.; Qi, Y.; Yang, F.; Yi, X.Y.; Guo, S.L.; Wu, P.Y.; Yuan, Q.Y.; Xu, T.Y. Early detection of rice blast using UAV hyperspectral imagery and multi-scale integrator selection attention transformer network (MS-STNet). Comput. Electron. Agr. 2025, 231, 110007. [Google Scholar] [CrossRef]
  116. Bohnenkamp, D.; Behmann, J.; Paulus, S.; Steiner, U.; Mahlein, A. A Hyperspectral Library of Foliar Diseases of Wheat. Phytopathology 2021, 111, 1583–1593. [Google Scholar] [CrossRef]
  117. Wu, G.S.; Fang, Y.L.; Jiang, Q.Y.; Cui, M.; Li, N.; Ou, Y.M.; Diao, Z.H.; Zhang, B.H. Early identification of strawberry leaves disease utilizing hyperspectral imaging combing with spectral features, multiple vegetation indices and textural features. Comput. Electron. Agr. 2023, 204, 107553. [Google Scholar] [CrossRef]
  118. Zhu, W.X.; Sun, Z.G.; Huang, Y.H.; Yang, T.; Li, J.; Zhu, K.Y.; Zhang, J.Q.; Yang, B.; Shao, C.X.; Peng, J.B.; et al. Optimization of multi-source UAV RS agro-monitoring schemes designed for field-scale crop phenotyping. Precis. Agric. 2021, 22, 1768–1802. [Google Scholar] [CrossRef]
  119. Wang, L.J.; Jin, J.; Song, Z.H.; Wang, J.L.; Zhang, L.B.; Rehman, T.U.; Ma, D.; Carpenter, N.R.; Tuinstra, M.R. LeafSpec: An accurate and portable hyperspectral corn leaf imager. Comput. Electron. Agr. 2020, 169, 105209. [Google Scholar] [CrossRef]
  120. Sun, Q.; Gu, X.H.; Chen, L.P.; Xu, X.B.; Wei, Z.H.; Pan, Y.C.; Gao, Y.B. Monitoring maize canopy chlorophyll density under lodging stress based on UAV hyperspectral imagery. Comput. Electron. Agr. 2022, 193, 106671. [Google Scholar] [CrossRef]
  121. Thangadeepiga, E.; Alagu Raja, R.A. Remote Sensing-Based Crop Identification Using Deep Learning. In Intelligent Data Engineering and Analytics; Springer: Berlin/Heidelberg, Germany, 2021; pp. 109–122. [Google Scholar]
  122. Li, D.S.; Chen, Q.S.; Ouyang, Q.; Liu, Z.H. Advances of Vis/NIRS and imaging techniques assisted by AI for tea processing. Crit. Rev. Food Sci. 2025, 1–19. [Google Scholar] [CrossRef] [PubMed]
  123. Ouyang, Q.; Chang, H.L.; Fan, Z.Z.; Ma, S.Z.; Chen, Q.S.; Liu, Z.H. Monitoring changes in constituents during black tea fermentation using snapshot multispectral imaging and 1D-CNN enhanced with data augmentation. Comput. Electron. Agr. 2025, 237, 110643. [Google Scholar] [CrossRef]
  124. Wang, Y.Y.D.; Duan, Z.J. Advances in Mask-Modulated Lensless Imaging. Electronics 2024, 13, 617. [Google Scholar] [CrossRef]
  125. Zhao, J.; Li, M.S. Lensless ultrafast optical imaging. Light Sci. Appl. 2022, 11, 97. [Google Scholar] [CrossRef]
  126. Li, T.G.; Wang, Y.F.; Yang, N.; Wang, A.Y.; Dong, S.Z.; Wang, S.H.; Jiang, F.Y.; Li, S.F. Portable multispectral diffraction microfluidic sensing system for pathogenic fungal detection. Sens. Actuators B Chem. 2024, 411, 135775. [Google Scholar] [CrossRef]
  127. Zhang, X.D.; Song, H.J.; Wang, Y.F.; Hu, L.; Wang, P.; Mao, H.P. Detection of Rice Fungal Spores Based on Micro- Hyperspectral and Microfluidic Techniques. Biosensors 2023, 13, 278. [Google Scholar] [CrossRef]
  128. Zhang, X.D.; Guo, B.X.; Wang, Y.F.; Hu, L.; Yang, N.; Mao, H.P. A Detection Method for Crop Fungal Spores Based on Microfluidic Separation Enrichment and AC Impedance Characteristics. J. Fungi 2022, 8, 1168. [Google Scholar] [CrossRef]
  129. Yang, N.; Hu, J.Q.; Zhou, X.; Wang, A.Y.; Yu, J.J.; Tao, X.Y.; Tang, J. A rapid detection method of early spore viability based on AC impedance measurement. J. Food Process Eng. 2020, 43, e13520. [Google Scholar] [CrossRef]
  130. Yang, N.; Qian, Y.; Mesery, H.S.E.; Zhang, R.B.; Wang, A.Y.; Tang, J. Rapid detection of rice disease using microscopy image identification based on the synergistic judgment of texture and shape features and decision tree–confusion matrix method. J. Sci. Food Agr. 2019, 99, 6589–6600. [Google Scholar] [CrossRef]
  131. Xu, P.F.; Zhang, R.B.; Yang, N.; Kwabena Oppong, P.; Sun, J.; Wang, P. High-precision extraction and concentration detection of airborne disease microorganisms based on microfluidic chip. Biomicrofluidics 2019, 13, 024110. [Google Scholar] [CrossRef] [PubMed]
  132. Yang, N.; Ji, Y.Y.; Wang, A.Y.; Tang, J.; Liu, S.H.; Zhang, X.D.; Xu, L.J.; He, Y. An integrated nucleic acid detection method based on a microfluidic chip for collection and culture of rice false smut spores. Lab Chip 2022, 22, 4894–4904. [Google Scholar] [CrossRef]
  133. Yang, N.; Chen, C.Y.; Li, T.; Li, Z.; Zou, L.R.; Zhang, R.B.; Mao, H.P. Portable Rice Disease Spores Capture and Detection Method Using Diffraction Fingerprints on Microfluidic Chip. Micromachines 2019, 10, 289. [Google Scholar] [CrossRef]
  134. Zhang, X.D.; Bian, F.; Wang, Y.F.; Hu, L.; Yang, N.; Mao, H.P. A Method for Capture and Detection of Crop Airborne Disease Spores Based on Microfluidic Chips and Micro Raman Spectroscopy. Foods 2022, 11, 3462. [Google Scholar] [CrossRef]
  135. Wang, Y.F.; Shi, Q.; Ren, S.J.; Li, T.Z.; Yang, N.; Zhang, X.D.; Ma, G.X.; Mohamed, F.T.; Mao, H.P. Application of a spore detection system based on diffraction imaging to tomato gray mold. Int. J. Agr. Biol. Eng. 2024, 6, 212–217. [Google Scholar] [CrossRef]
  136. Wang, Y.F.; Mao, H.P.; Zhang, X.D.; Liu, Y.; Du, X.X. A Rapid Detection Method for Tomato Gray Mold Spores in Greenhouse Based on Microfluidic Chip Enrichment and Lens-Less Diffraction Image Processing. Foods 2021, 10, 3011. [Google Scholar] [CrossRef]
  137. Yang, N.; Yu, J.J.; Wang, A.Y.; Tang, J.; Zhang, R.B.; Xie, L.L.; Shu, F.Y.; Kwabena, O.P. A rapid rice blast detection and identification method based on crop disease spores’ diffraction fingerprint texture. J. Sci. Food Agr. 2020, 100, 3608–3621. [Google Scholar] [CrossRef]
  138. Li, X.X.; Zhang, X.L.; Liu, Q.; Zhao, W.; Liu, S.X.; Sui, G.D. Microfluidic System for Rapid Detection of Airborne Pathogenic Fungal Spores. ACS Sens. 2018, 3, 2095–2103. [Google Scholar] [CrossRef]
  139. Wang, Y.F.; Zhang, X.D.; Taha, M.F.; Chen, T.H.; Yang, N.; Zhang, J.R.; Mao, H.P. Detection Method of Fungal Spores Based on Fingerprint Characteristics of Diffraction-Polarization Images. J. Fungi 2023, 9, 1131. [Google Scholar] [CrossRef] [PubMed]
  140. Wang, Y.F.; Mao, H.P.; Xu, G.L.; Zhang, X.D.; Zhang, Y.K. A Rapid Detection Method for Fungal Spores from Greenhouse Crops Based on CMOS Image Sensors and Diffraction Fingerprint Feature Processing. J. Fungi 2022, 8, 374. [Google Scholar] [CrossRef]
  141. Li, G.X.; Zhang, R.B.; Yang, N.; Yin, C.S.; Wei, M.J.; Zhang, Y.C.; Sun, J. An approach for cell viability online detection based on the characteristics of lensfree cell diffraction fingerprint. Biosens. Bioelectron. 2018, 107, 163–169. [Google Scholar] [CrossRef] [PubMed]
  142. Chen, P.; Liu, S.H.; Kong, Y.J.; Tang, J.; Wang, Y.F.; Yang, N.; Hu, R.B. Crop Disease Source Location and Monitoring System Based on Diffractive Light Identification Airborne Spore Sensor Network. IEEE Internet Things J. 2022, 13, 11030–11042. [Google Scholar]
  143. Shi, Z.S.; Zhao, Y.; Liu, S.; Wang, Y.T.; Yu, Q.L. Size-Dependent Impact of Magnetic Nanoparticles on Growth and Sporulation of Aspergillus niger. Molecules 2022, 27, 5840. [Google Scholar] [CrossRef] [PubMed]
  144. Jiang, S.W.; Guo, C.F.; Bian, Z.C.; Wang, R.H.; Zhu, J.K.; Song, P.M.; Hu, P.; Hu, D.; Zhang, Z.B.; Hoshino, K.; et al. Ptychographic sensor for large-scale lensless microbial monitoring with high spatiotemporal resolution. Biosens. Bioelectron. 2022, 196, 113699. [Google Scholar] [CrossRef]
  145. Wang, J.Y.; You, W.; Jiao, Y.H.; Zhu, Y.H.; Liu, X.J.; Jiang, X.Q.; Hu, C.F.; Lu, W.L. Quantitative phase imaging of opaque specimens with flexible endoscopic microscopy. Opt. Laser. Eng. 2024, 180, 108342. [Google Scholar] [CrossRef]
  146. Li, T.G.; Yang, N.; Xiao, Y.; Liu, Y.; Pan, X.Q.; Wang, S.H.; Jiang, F.Y.; Zhang, Z.Y.; Zhang, X.C. Virus detection light diffraction fingerprints for biological applications. Sci. Adv. 2024, 10, eadl3466. [Google Scholar] [CrossRef] [PubMed]
  147. Kalwa, U.; Legner, C.; Wlezien, E.; Tylka, G.; Pandey, S. New methods of removing debris and high-throughput counting of cyst nematode eggs extracted from field soil. PLoS ONE 2019, 14, e0223386. [Google Scholar] [CrossRef]
  148. Sasagawa, K.; Ohta, Y.; Kawahara, M.; Haruta, M.; Tokuda, T.; Ohta, J. Wide field-of-view lensless fluorescence imaging device with hybrid bandpass emission filter. AIP Adv. 2019, 9, 035108. [Google Scholar] [CrossRef]
  149. Feshki, M.; De Koninck, Y.; Gosselin, B. Deep Learning Empowered Fresnel-based Lensless Fluorescence Microscopy. In Proceedings of the 2023 45th Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC), Sydney, Australia, 24–27 July 2023; pp. 1–4. [Google Scholar]
  150. Gurkan, U.A.; Moon, S.; Geckil, H.; Xu, F.; Wang, S.Q.; Lu, T.J.; Demirci, U. Miniaturized lensless imaging systems for cell and microorganism visualization in point-of-care testing. Biotechnol J. 2011, 6, 138–149. [Google Scholar] [CrossRef]
  151. Chen, J.; Han, W.; Fu, L.; Lv, Z.; Chen, H.; Fang, W.; Hou, J.; Yu, H.; Huang, X.; Sun, L. A Miniaturized and Intelligent Lensless Holographic Imaging System With Auto-Focusing and Deep Learning-Based Object Detection for Label-Free Cell Classification. IEEE Photonics J. 2024, 16, 1–8. [Google Scholar] [CrossRef]
  152. Bian, Y.X.; Jiang, Y.N.; Wang, J.X.; Yang, S.M.; Deng, W.J.; Yang, X.F.; Shen, R.B.; Shen, H.; Kuang, C.F. Deep learning colorful ptychographic iterative engine lens-less diffraction microscopy. Opt. Laser. Eng. 2022, 150, 106843. [Google Scholar] [CrossRef]
  153. Su, T.; Xue, L.; Ozcan, A. High-throughput lensfree 3D tracking of human sperms reveals rare statistics of helical trajectories. Proc. Natl. Acad. Sci. USA 2012, 109, 16018–16022. [Google Scholar] [CrossRef]
  154. Su, T.; Seo, S.; Erlinger, A.; Ozcan, A. High-throughput lensfree imaging and characterization of a heterogeneous cell solution on a chip. Biotechnol. Bioeng. 2009, 102, 856–868. [Google Scholar] [CrossRef]
  155. Young, L.W.; Parham, C.; Zhong, Z.; Chapman, D.; Reaney, M.J.T. Non-destructive diffraction enhanced imaging of seeds. J. Exp. Bot. 2007, 58, 2513–2523. [Google Scholar] [CrossRef]
  156. Chen, J.; Huang, X.W.; Xu, X.F.; Wang, R.J.; Wei, M.Y.; Han, W.T.; Cao, J.F.; Xuan, W.P.; Ge, Y.K.; Wang, J.C.; et al. Microfluidic Particle Separation and Detection System Based on Standing Surface Acoustic Wave and Lensless Imaging. IEEE Trans. Bio-Med. Eng. 2022, 69, 2165–2175. [Google Scholar] [CrossRef] [PubMed]
  157. Xu, H.; Hu, H.Q.; Chen, S.Q.; Xu, Z.H.; Li, Q.; Jiang, T.T.; Chen, Y.T. Hyperspectral image reconstruction based on the fusion of diffracted rotation blurred and clear images. Opt. Laser. Eng. 2023, 160, 107274. [Google Scholar] [CrossRef]
  158. Jeon, D.S.; Baek, S.H.; Yi, S.; Fu, Q.; Dun, X.; Heidrich, W.; Kim, M.H. Compact snapshot hyperspectral imaging with diffracted rotation. ACM Trans. Graph. 2019, 38, 117. [Google Scholar] [CrossRef]
  159. Chen, X.G.; Hu, J.; Chen, W.L.; Yang, S.L.; Wang, Y.F.; Tang, Z.R.; Liu, S.Y. Multi-spectral snapshot diffraction-based overlay metrology. Opt. Lett. 2023, 48, 3383–3386. [Google Scholar] [CrossRef]
  160. Bai, J.; Niu, Z.; Bi, K.; Yang, X.; Huang, Y.; Fu, Y.; Wu, M.; Wang, L. Toward an Advanced Method for Full-Waveform Hyperspectral LiDAR Data Processing. IEEE T. Geosci. Remote 2024, 62, 1–16. [Google Scholar] [CrossRef]
Figure 3. Basic principles of diffraction imaging technology.
Figure 4. Applications of hyperspectral imaging technology for disease early warning: (A) Cotton Verticillium wilt [114]; (B) Diseases of rice blast [115]; (C) Diseases of wheat [116]; (D) Strawberry leaf disease [117].
Figure 5. Applications of spectroscopic techniques in tea science and technology: (A) Applications of spectroscopic techniques in tea science [122]; (B) Monitoring black tea fermentation using multispectral imaging [123].
Figure 6. Various applications of lens-free imaging technology: (A) Lens-free imaging for real-time cell viability monitoring [124]; (B) Lens-free imaging for transient process capture [125]; (C) Lens-free imaging for high-throughput microscopic detection [126]; (D) Lens-free imaging technology for fungal spore detection [127].
Table 2. Applications of hyperspectral imaging associated with various target diseases.
| Application Type | Target Disease | Detection Technology | Main Achievements |
|---|---|---|---|
| Disease detection | Wheat powdery mildew [83] | HSI | Successfully identified the degree of wheat powdery mildew infection using hyperspectral imaging; sensitive bands at 560 nm, 680 nm, and 758 nm. |
| | Potato late blight [84] | UAV-based HSI | Achieved early detection of potato late blight by integrating UAV-mounted hyperspectral sensors with deep learning models. |
| | Tomato spotted wilt virus [85] | HSI | Successfully detected tomato spotted wilt virus using hyperspectral imaging combined with machine learning; detection accuracy reached 85%. |
| | Maize leaf blight [86] | HSI | Achieved high-precision detection of maize leaf blight by integrating hyperspectral imaging with biochemical and spectral features; overall accuracy reached 86.12%. |
| | Sucrose diffusion in beef [87] | HSI | Enabled quantitative and spatial visualization of sucrose diffusion dynamics. |
| | Leaf diseases in intercropping [88] | HSI | Successfully distinguished multiple leaf diseases using hyperspectral features. |
| | Rice seeds [89] | HSI | Identified optimal wavebands to improve detection performance and reduce data redundancy. |
| Disease early warning | Potato late blight [90] | Proximal HSI | Achieved early recognition of potato late blight by combining deep learning models with proximal hyperspectral imaging; test set accuracy reached 73.9%. |
| | Pear leaf anthracnose [91] | HSI | Applied hyperspectral imaging to enable early warning and visual diagnosis of Sclerotinia-infected tomato, improving detection accuracy and providing a basis for timely disease management. |
| | Citrus leaf diseases [92] | HSI | Summarized the application of multiple hyperspectral imaging technologies in citrus leaf disease detection; emphasized advantages for fast, non-invasive detection and future development directions. |
| | Crop seed-borne pathogens [93] | HSI | Demonstrated that hyperspectral imaging combined with AI can accurately distinguish infected from healthy seeds, providing a novel approach for seed pathogen detection. |
| | Poplar anthracnose [94] | HSI | Developed a spectral model for early and accurate disease detection. |
| Crop remote sensing and monitoring | Apple black rot [95] | Multispectral imaging | Deep learning combined with multispectral imaging enabled early detection of apple black rot, improving orchard disease management efficiency. |
| | Wheat [62] | UAV-based HSI | Achieved yield prediction using UAV-based hyperspectral data combined with deep neural network (DNN) analysis of wheat spectral characteristics. |
| | Rice [80] | HSI | Combined hyperspectral imaging with advanced transfer learning methods to enable rapid detection of various rice upper-leaf diseases. |
| | Soil [96] | Multispectral imaging | Significantly improved the accuracy of soil organic carbon estimation by integrating precise spectral features from hyperspectral imaging with dynamic temporal information from multispectral data. |
| | Seeds [93] | HSI | Demonstrated that hyperspectral imaging combined with AI can accurately distinguish infected from healthy seeds, providing a novel approach for seed pathogen detection. |
| | Crop [97] | HSI | Improved classification accuracy and robustness by combining HSI with feature fusion strategies. |
| Other applications | Fish species [98] | HSI | Proposed a method combining hyperspectral imaging with an SVM model for rapid detection of microplastics in fish intestines, providing a new approach for assessing the impact of environmental pollution on aquatic organism health. |
| | Honeybees [99] | HSI | Utilized hyperspectral imaging combined with multivariate statistical analysis to detect Varroa destructor mites on honeybee bodies. |
| | Wheat [62] | UAV-based HSI + Deep learning | Combined UAV-based hyperspectral data with a DNN to perform wheat pest and yield prediction, analyzing the correlation between spectral features and pest damage. |
| | Coffee bean [100] | HSI | Applied hyperspectral imaging to detect damage caused by coffee berry borers, enabling high-precision, non-destructive pest identification in coffee beans. |
| | Beef, lamb, and chicken [101] | HSI + SVM and ANN-BPN | Vis-NIR HSI outperformed SWIR HSI with higher accuracy (96% vs. 88%), better Rp (~0.99 vs. 0.86), and lower RMSEP (4–9% vs. 15–24%). |
| | Soybean kernel damage [102] | HSI + RGB + DNN | Achieved high accuracy of 98.36% using spectrum–RGB fusion and an optimized convolutional architecture. |
| | Peaches [103] | HSI + Multivariate analysis | Effectively identified multiple bruises at different stages, enabling improved quality inspection. |
| | Fruits and vegetables [104] | HSI | Enabled rapid, non-destructive assessment of moisture, color, damage, and spoilage. |
| | Microorganism [105] | HSI | Combined modalities accurately predicted microbial growth trends under various packaging conditions. |
| | Red jujube [106] | HSI + CARS-IRIV and SSA-SVM | Achieved high-accuracy varietal identification through effective feature selection and model optimization. |
| | Oolong tea [107] | HSI + BOSS-LightGBM | Enabled accurate and rapid classification of tea varieties by combining band selection with advanced machine learning. |
| | Egg [31] | HSI + SVM | Provided a nondestructive and reliable method for predicting egg freshness with improved regression accuracy. |
| | Soluble solid content in apples [108] | HSI + DNN | Realized precise prediction of soluble solid content, enhancing nondestructive apple quality evaluation. |
| | Wolfberry [109] | HSI + Hybrid SVM and LS-SVM | Demonstrated effective differentiation of dried wolfberry quality grades through comparative classification approaches. |
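Many of the entries in Table 2 follow the same pattern: extract per-pixel (or per-sample) reflectance spectra from a hyperspectral cube and feed them to a supervised classifier such as an SVM. The sketch below illustrates that workflow on synthetic data; the band count, the "red edge" stress model, and all numeric values are illustrative assumptions, not taken from any cited study.

```python
# Minimal sketch of the HSI + SVM workflow common to Table 2:
# classify reflectance spectra as healthy vs. diseased.
# All spectra here are synthetic and purely illustrative.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_bands = 120                                  # e.g. 400-1000 nm in ~5 nm steps
wavelengths = np.linspace(400, 1000, n_bands)

def synth_spectra(n, red_edge_shift):
    """Toy reflectance curves: a sigmoid 'red edge' whose position shifts
    under stress, mimicking how disease alters leaf spectra."""
    edge = 720 + red_edge_shift
    base = 1 / (1 + np.exp(-(wavelengths - edge) / 15))
    return base + rng.normal(0, 0.02, size=(n, n_bands))

X = np.vstack([synth_spectra(300, 0), synth_spectra(300, -20)])
y = np.array([0] * 300 + [1] * 300)            # 0 = healthy, 1 = diseased

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
model.fit(X_tr, y_tr)
print(f"test accuracy: {model.score(X_te, y_te):.2f}")
```

In practice the reported studies add band selection (e.g. CARS-IRIV, BOSS) before classification to reduce the redundancy noted for rice seeds [89]; that step would replace the raw `X` above with a subset of informative wavelengths.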
Table 3. Applications of different imaging associated with various target diseases.
| Application Type | Target | Detection Technology | Main Achievements |
|---|---|---|---|
| Spore detection | Rice disease spores [141] | Microfluidic chip combined with diffraction-based optical technology | Optimized the microfluidic chip structure to achieve purification and detection of low-concentration airborne spores, significantly improving detection sensitivity. |
| | Fungal spores [139] | Diffraction–polarization imaging combined with machine learning | Developed a method based on diffraction–polarization imaging features of fungal spores; integrated with an SVM algorithm to achieve high-accuracy identification of multiple fungal spores. |
| | Rice virus spores [133] | Portable microfluidic chip combined with lens-free diffraction imaging | Proposed a portable device using a microfluidic chip and lens-free diffraction imaging for rice virus spore capture and detection; results highly correlated with microscopic identification. |
| | Rice blast spores [137] | Diffraction-based texture feature analysis | Developed a rapid rice blast spore detection and identification method based on diffraction texture analysis, enabling effective classification of rice blast spores and other spores. |
| | Airborne crop pathogen spores [142] | Diffraction-based optical identification sensor network | Built a diffraction-based optical identification sensor network for airborne crop pathogen spores, enabling source localization and monitoring and enhancing disease early-warning capabilities. |
| | Pathogenic fungal spores [127] | Multispectral diffraction-based microfluidic sensing system | Developed a portable multispectral diffraction-based microfluidic sensing system for pathogenic fungal spore detection; demonstrated high sensitivity and specificity. |
| Pathogenic spore counting | Spores (tomato gray mold, cucumber downy and powdery mildew) [139] | Diffraction–polarization fingerprint + SVM | ~95.85% accurate classification of multiple spore types using diffraction–polarization texture features. |
| | Greenhouse crop fungal spores [140] | CMOS sensor + Diffraction fingerprint processing | Compact, high-throughput detection with ~92.7% SVM accuracy. |
| | Rice disease spores in microfluidic chip [136] | Microfluidic capture + Lensless diffraction | Portable system enabling automatic spore detection via diffraction fingerprint imaging. |
| | Aspergillus niger spore sporulation [143] | Magnetic nanoparticle exposure + Spore count analysis | Demonstrated that larger magnetic nanoparticles significantly inhibit spore production. |
| Other applications | Microorganisms in sediment [144] | Lens-free imaging technology | Used lens-free imaging to capture images of microorganisms in sediment; analyzed morphological and physiological characteristics to achieve species identification. |
| | Virus-infected cells [145] | Lens-free diffraction imaging technology | Analyzed diffraction pattern signatures of virus-infected cells; achieved high-throughput, visual monitoring with a linear correlation of 98.9%. |
| | Virus-infected cells [146] | Lens-free imaging technology | Proposed a novel method using lens-free diffraction pattern analysis for biological detection of virus-infected cells, enabling efficient, flexible identification and differentiation of various viruses. |
| | Cyst nematode eggs in soil samples [147] | Lensless or deep learning-based counting | Greatly improved egg purity and automated counting efficiency with minimal manual error. |
| | Rice blast fungal spores [137] | Lensless diffraction imaging + CNN | Enabled rapid (few seconds) identification of fungal spores with 97.18% accuracy. |
| | Fluorescently labeled particles or cells [148] | Lensless fluorescence imaging + Hybrid bandpass filters | Achieved wide field-of-view imaging with a high signal-to-noise ratio using a compact, low-cost system. |
| | Cells and microorganisms [149] | Lensless imaging + Deep learning analysis | Enabled automated identification of cell morphology with high resolution in a portable format. |
| | Cells and microorganisms [150] | Lensless holographic imaging | Presented a compact lensless holographic imaging system featuring auto-focusing and deep learning-based detection for label-free cell classification. |
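The lensless diffraction entries in Table 3 share one numerical core: the sensor records an in-line hologram (the "diffraction fingerprint"), which is refocused to the object plane by angular-spectrum propagation before any classification step. The sketch below simulates that round trip for a single spore-like opaque disk; the wavelength, pixel pitch, and propagation distance are illustrative values, not parameters of any cited system.

```python
# Sketch of angular-spectrum propagation, the core numerical step behind the
# lensless diffraction imaging entries in Table 3. The hologram is simulated;
# optical parameters below are illustrative assumptions.
import numpy as np

def angular_spectrum(field, wavelength, pitch, z):
    """Propagate a complex field a distance z (meters) via the angular
    spectrum method; negative z back-propagates toward the object plane."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=pitch)
    FX, FY = np.meshgrid(fx, fx)
    arg = 1 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = 2 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0))  # drop evanescent waves
    H = np.exp(1j * kz * z)                                    # transfer function
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Simulate a spore-like opaque disk, propagate to the sensor, record intensity.
n, pitch, wl, z = 256, 2e-6, 532e-9, 1e-3   # pixels, pixel pitch, wavelength, distance
yy, xx = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
obj = np.where(xx ** 2 + yy ** 2 < 8 ** 2, 0.1, 1.0).astype(complex)
hologram = np.abs(angular_spectrum(obj, wl, pitch, z)) ** 2

# Back-propagate the measured amplitude to refocus the object; classification
# features ("fingerprints") are typically extracted from this reconstruction.
recon = np.abs(angular_spectrum(np.sqrt(hologram), wl, pitch, -z))
print("refocused center amplitude:", float(recon[n // 2, n // 2]))
```

A single back-propagation like this leaves a twin-image artifact; the phase-retrieval and deep-learning reconstruction schemes cited above ([150], [152]) exist largely to suppress it.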
Share and Cite

Chen, L.; Wu, Y.; Yang, N.; Sun, Z. Advances in Hyperspectral and Diffraction Imaging for Agricultural Applications. Agriculture 2025, 15, 1775. https://doi.org/10.3390/agriculture15161775