Review

Analysis of the Current Situation and Trends of Optical Sensing Technology Application for Facility Vegetable Life Information Detection

1 School of Agriculture Engineering, Jiangsu University, Zhenjiang 212013, China
2 Department of Agricultural Engineering, Jiangsu University, Zhenjiang 212013, China
3 College of Information Engineering, Northwest A & F University, Yangling 712100, China
4 Basic Engineering Training Center, Jiangsu University, Zhenjiang 212013, China
* Authors to whom correspondence should be addressed.
Agronomy 2025, 15(9), 2229; https://doi.org/10.3390/agronomy15092229
Submission received: 15 August 2025 / Revised: 17 September 2025 / Accepted: 19 September 2025 / Published: 21 September 2025
(This article belongs to the Special Issue Crop Nutrition Diagnosis and Efficient Production)

Abstract

Facility vegetable production is of great significance, yet it still faces limitations in yield and quality. Optical sensing technology offers a rapid, non-destructive alternative to traditional destructive methods for phenotypic analysis. This article reviews nine optical sensing technologies, including RGB imaging, and describes how various algorithms, combined with the underlying detection principles, are applied across the entire growth cycle and to the key phenotypic traits of facility vegetables. Each technology has its strengths: RGB and multispectral/hyperspectral imaging are the most frequently used, while thermal imaging is particularly suited to early detection of abiotic and biotic stress responses; together, these technologies can effectively capture physiological, biochemical, yield, and quality information about crops. However, current research remains largely confined to laboratory validation, leaving a significant gap to practical production. Future progress will depend on the integration of multiple sensing technologies, artificial-intelligence-based data analysis, and improvements in model interpretability. These developments will be crucial for achieving precision breeding and intelligent greenhouse management, gradually moving from basic phenotypic analysis to comprehensive decision support systems.

1. Introduction

Facility vegetables refer to vegetable varieties cultivated in artificially controlled environments, such as greenhouses and plastic tunnels, to enable year-round production; common examples include tomatoes, cucumbers, and lettuce. The stable production of protected-culture vegetables plays a vital role in global agriculture, effectively alleviating seasonal supply gaps while ensuring consistent vegetable availability throughout the year. According to the "National Modern Facility Agriculture Construction Plan (2023–2030)" issued by the Ministry of Agriculture and Rural Affairs, as of 2021, China's facility planting area reached about 2.67 million hectares, of which facility vegetables accounted for more than 80%, ranking first in the world. Facility vegetable production reached 230 million tons, accounting for 30% of total vegetable production. However, the development of facility vegetables still faces the problems of insufficient total quantity and low quality [1].
With population growth and resource constraints, improving the yield and quality of facility vegetables has become a key challenge. Plant phenotyping is essential for measuring and evaluating complex traits related to plant growth, yield, and other important agricultural characteristics [2]; these traits reflect the physiological, biochemical, yield, and quality characteristics of plants, as well as their morphological and structural characteristics during growth [3]. By analyzing large amounts of crop phenotypic data, it is possible to gain insight into how genes regulate the expression of crop traits and how environmental factors affect crop growth and development, which in turn can feed into crop genetic improvement and molecular design breeding [4,5]. Therefore, the accurate acquisition of phenotypic information about vegetables can help to breed and develop high-yielding, high-quality vegetable germplasm, which not only helps ensure food security but also promotes the development of the facility vegetable industry [6].
Traditionally, vegetable traits have been assessed by visual observation and manual measurement, which is not only time-consuming and laborious but also highly subjective [6]. High-performance liquid chromatography (HPLC) [7] and electrochemical analysis [8] can provide accurate physiological and biochemical parameters, but their destructive sampling precludes continuous in situ detection and offers poor real-time performance. Optical sensing technologies, including RGB imaging, multispectral imaging (MSI), hyperspectral imaging (HSI), thermal imaging, chlorophyll fluorescence imaging (CFI), Raman spectroscopy (RS), terahertz (THz) imaging, X-ray computed tomography (CT), optical coherence tomography (OCT), and other optical detection methods, offer rapid, non-destructive measurement and have been widely used in crop phenotyping research [9]. By analyzing the interactions between light and plant tissues, such research can obtain physiological, biochemical, yield, quality, and internal/external structural information about facility vegetables.
RGB imaging detects how plants reflect or absorb different light colors when illuminated, converting them into electrical/digital signals to form color images, and enables extraction of facility vegetables’ apparent morphology [10]. Multi-spectral and hyperspectral imaging reflect the composition and content of chemical elements, such as nitrogen, phosphorus, potassium, etc., within facility vegetables through the differences in light absorption, reflection, and transmission characteristics of different substances [11,12]. Thermal imaging, based on the theory of infrared radiation and high sensitivity to temperature, can obtain high-precision heat distribution maps of the area measured in a short period of time, such as the canopy temperature and leaf temperature of facility vegetables [13], to understand the physiology and water stress status of facility vegetables [14]. Chlorophyll fluorescence (CFI) can be used to assess the photosynthetic activity of facility vegetables by studying the spatial and temporal heterogeneity of fluorescence emission patterns within cells, leaves, or whole plants [15]. Raman spectroscopy utilizes the Raman scattering effect resulting from the interaction of light with molecules to analyze the chemical composition of facility vegetables [16]. Terahertz technology utilizes the interaction of terahertz waves with matter to detect the internal structure of facility vegetables. Terahertz waves have the characteristics of high penetrability and sensitivity to moisture, and are able to penetrate the surface layer of plants and detect the moisture content of plant tissues [17]. X-rays are able to penetrate vegetables and, based on differences in the degree of X-ray absorption by different tissues, images of the interior of facility vegetables are generated, which can be used to check internal defects [18].
From planting to consumption, facility vegetables undergo critical stages including seedling growth, vegetative growth, flowering and fruit set, fruit enlargement and ripening, harvesting and grading, and transportation and storage. Due to differences in principles and characteristics, various optical sensing technologies exhibit distinct application scenarios and value across these stages. Simultaneously, the four core phenotypic traits of facility vegetables—biochemical, physiological, yield, and quality—directly correlate with growth status assessment and economic value determination. Optical sensing technology serves as the core means for achieving non-destructive, efficient detection of these traits. Based on this, this article systematically reviews the current application status of nine mainstream optical sensing technologies throughout the entire growth cycle of protected-culture vegetables. It analyzes the operational mechanisms and practical effectiveness of each technology in detecting the four types of phenotypic traits, further identifies the bottlenecks currently facing technological application, and outlines future development directions. The aim is to provide theoretical support for technology selection and results transformation in protected-culture vegetable phenotyping research, thereby advancing the industry toward precision and intelligence.

2. Overview of the Application of Optical Sensing Technology in Different Growth Cycles of Facility Vegetables

This section provides an overview of the application scenarios of nine technologies in six stages, from seedlings to transportation and storage. Different sensors are applied to different growth cycles, as shown in Figure 1.

2.1. RGB Imaging

RGB imaging technology captures the reflected or transmitted light signals of an object in the visible band (380 nm~780 nm) and converts the acquired signals into digital image data by combining optoelectronic conversion and digital signal processing techniques [11]. In the field of agriculture, RGB imaging technology has a wide range of applications and is commonly used for crop phenotype extraction [19], biomass estimation [20], maturity detection [21], and so on.
Detecting the growth stage of seedlings is a key issue in plant science. This involves identifying key milestones, such as seedling emergence from the soil, cotyledon unfolding, and the appearance of the first true leaf, which mark the initial stages of plant development. Whether these stages succeed, and the rate at which they progress, can significantly affect the subsequent development of the plant [22]. RGB imaging plays an important role in measuring seed germination rates: by analyzing seed images, germinated and non-germinated seeds can be accurately distinguished and the germination rate calculated. Using image recognition algorithms, morphological features of seeds, such as rupture of the seed coat and protrusion of the radicle, can be detected to determine whether a seed has sprouted [23]. RGB imaging also shows unique advantages in assessing seedling vigor. By analyzing the color, shape, and size of a seedling, its growth and health can be judged: healthy seedlings usually have bright green leaves, while yellowing or wilting leaves may indicate growth problems. Seedling growth can also be assessed by measuring morphological parameters such as stem thickness and leaf area [24].
During the flowering and fruiting stage of facility vegetables, RGB imaging plays an important role in counting flowers and monitoring their status, thanks to its sensitivity to color and morphology. By processing and analyzing plant images with image recognition algorithms, flowers can be accurately identified and counted based on differences in color and morphology between flowers and leaves, stems, and other organs. For instance, the "bwconncomp" function in the Matlab image processing toolbox can be used to group neighboring pixels and determine whether they belong to the same flower [25]. By analyzing features such as the color, shape, and texture of flowers, their developmental stage and health can be determined. Fully open flowers are brightly colored with spreading petals, while flowers that are stunted or affected by pests and diseases may be dull in color, or have misshapen petals or disease spots. Studies have shown that quantitative analysis of flower color can be used to assess the physiological status of flowers and predict their potential for fruit set [26].
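To make the connected-component idea behind "bwconncomp" concrete, the following Python sketch segments bright flower pixels against green-dominant foliage and groups them with SciPy's equivalent labeling routine. The function name, color rule, and thresholds are illustrative assumptions, not the cited authors' pipeline, and would need tuning per crop and lighting setup.

```python
import numpy as np
from scipy import ndimage

def count_flowers(rgb, bright=180, min_area=20):
    """Count flower-like regions in an RGB canopy image.

    Bright (e.g., white or yellow) flowers are segmented against
    green-dominant foliage, then neighboring pixels are grouped into
    connected components -- the same idea as MATLAB's "bwconncomp".
    Thresholds here are illustrative, not calibrated values.
    """
    r, b = rgb[..., 0].astype(int), rgb[..., 2].astype(int)
    # Flowers reflect strongly in red and blue; leaves are green-dominant.
    mask = (r > bright) & (b > bright // 2)
    labels, n = ndimage.label(mask)                     # group connected pixels
    sizes = np.asarray(ndimage.sum(mask, labels, range(1, n + 1)))
    return int((sizes >= min_area).sum())               # drop speckle noise
```

In practice the binary mask would come from a trained segmentation model rather than fixed thresholds, but the component-grouping step is the same.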
Accurate flower counts and condition monitoring are important for predicting fruit set. Flower number is directly related to the potential number of fruit set, while flower status affects the success of pollination and fertilization [27]. Through long-term image monitoring and data analysis, the relationship between flower number, condition and fruit set rate can be modeled to provide growers with predictive information on fruit set rate, which can help them make production management decisions in advance, such as rationally adjusting fertilizer application, irrigation, and pest control measures to improve fruit set rate and fruit yield [28].
During the harvest and grading period of facility vegetables, RGB imaging technology is mainly used for appearance detection. The RGB imaging system can capture color images of objects and use instance segmentation algorithms such as SOLO to analyze their positions, facilitating harvesting [29].

2.2. Three-Dimensional Imaging

3D imaging technology is based on the principles of optical triangulation, time-of-flight, or interferometric measurement, implemented through sensing systems such as LiDAR, structured light, or stereoscopic vision. These systems actively emit or passively receive light signals and, through point cloud processing and 3D reconstruction algorithms, ultimately obtain three-dimensional information about the measured object: its spatial coordinates, surface topography, and structural characteristics [30].
During the nursery period of facility vegetables, 3D imaging technology can obtain three-dimensional morphological information about seedlings, providing more comprehensive data support for the study of seedling growth and development. It can non-contact measure key phenotypic parameters such as plant height, stem thickness, leaf area, leaf inclination, etc., quantify seedling growth and population uniformity, and accurately identify weak, overgrown, or lagging areas, thus guiding precise seedling spacing, seedling replenishment, or adjustment of environmental control strategies [31]. It can also be combined with multispectral or thermal imaging data to detect early stages of pest infestation, disease, water stress, or nutrient deficiency through changes in leaf morphology (e.g., wilting, curling) in three dimensions, or abnormalities in canopy temperature [32]. In addition, 3D imaging can assist automated nursery equipment, such as guiding transplanting robots to accurately grasp seedlings to avoid mechanical damage or optimizing supplemental lighting systems to dynamically adjust light angle and intensity according to seedling canopy structure to ensure uniform growth [33]. This technology continuously monitors growth dynamics and provides data support for precise agricultural management during the nursery period, which significantly improves the quality of nursery and production efficiency [34,35].
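The non-contact phenotypic parameters mentioned above (plant height, projected leaf area, and so on) can be derived directly from a point cloud. The sketch below is a minimal illustration assuming a soil-aligned cloud in meters; `seedling_metrics` and its parameters are hypothetical, and a real pipeline would first segment soil from plant material.

```python
import numpy as np

def seedling_metrics(points, ground_z=0.0, cell=0.005):
    """Estimate plant height and projected canopy area from a point cloud.

    points : (N, 3) array of x, y, z coordinates in meters.
    Height is taken as a high percentile of z above the ground plane
    (robust to stray points); canopy area is approximated by counting
    occupied cells in a top-down 2-D grid of side `cell` meters.
    """
    z = points[:, 2] - ground_z
    height = float(np.percentile(z, 99))
    # Rasterize x-y coordinates into grid cells; count unique occupied cells.
    ij = np.floor(points[:, :2] / cell).astype(int)
    occupied = len({tuple(p) for p in ij})
    area = occupied * cell * cell
    return height, area
```

The grid-occupancy step is a common cheap substitute for a full convex-hull or mesh-based leaf area computation.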
Three-dimensional imaging technology provides a brand-new technical means for the fine management of facility vegetables during the flowering and fruiting period [36]. This technology can accurately obtain three-dimensional spatial structure information about plants and construct high-precision three-dimensional models of them by means of advanced methods such as LiDAR and structured light. These models can clearly present the spatial distribution characteristics of flowers on the plant, including key parameters such as specific position, opening angle, and growth orientation, which are crucial for morphometrics, mechanical performance research, and mechanized equipment development [37]. Taking strawberry cultivation as an example, 3D imaging can visualize the distribution density and growth status of flowers at different canopy locations, helping growers to accurately assess plant light uniformity and growth and providing a scientific basis for optimizing planting density and light management, and thus improving productivity [38].

2.3. Multispectral and Hyperspectral Imaging

Multispectral imaging is a technique that acquires images of a target at multiple wavelengths [39]. The information obtained is richer than that provided by monochrome or color images. Compared with hyperspectral imaging, multispectral imaging has a lower spectral resolution; in the visible and near-infrared bands, its bands are about 30 nm~50 nm wide, while the spectral resolution of hyperspectral imaging is usually less than 10 nm [40]. Hyperspectral imaging, by contrast, produces a set of images formed by successively imaging the sample in different wavelength bands under the same conditions, yielding a hyperspectral image data cube. This data cube combines the spatial and spectral information of the sample, which is extremely helpful for its further study [41].
During the vegetative growth period of facility vegetables, these techniques can accurately monitor growth dynamics [42]. For example, through multispectral indices combined with ordinary linear regression (OLR), multiple stepwise regression (MSR), and ridge regression (RR) inversion modeling, the nutrient content of leaves can be estimated for diagnosis, providing a basis for precise fertilization [43]. At the same time, these techniques can efficiently diagnose deficiency symptoms, for example, abnormal reflectance in the red and near-infrared wavelengths under nitrogen deficiency and changes in blue–green wavelength characteristics under phosphorus deficiency. Studies have confirmed that the accuracy of cucumber deficiency diagnosis based on hyperspectral imaging can exceed 90% [44], which significantly improves the scientific rigor and accuracy of facility vegetable production management and provides reliable technical support for high quality and high yield.
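Of the inversion models named above, ridge regression (RR) has a simple closed form, shown below as a generic NumPy sketch. The feature matrix would hold spectral indices per sample and the target a laboratory-measured nutrient value; the code is a textbook implementation, not the cited studies' calibration, and the regularization strength `lam` would be chosen by cross-validation.

```python
import numpy as np

def ridge_fit(X, y, lam=1.0):
    """Closed-form ridge regression: solve (X'X + lam*P) w = X'y.

    X : (n_samples, n_features) spectral indices; y : measured nutrient values.
    An intercept column is appended and left unpenalized.
    """
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])  # append intercept column
    P = np.eye(Xb.shape[1])
    P[-1, -1] = 0.0                                # do not shrink the intercept
    return np.linalg.solve(Xb.T @ Xb + lam * P, Xb.T @ y)

def ridge_predict(w, X):
    """Predict targets using coefficients returned by ridge_fit."""
    return np.hstack([X, np.ones((X.shape[0], 1))]) @ w
```

With `lam` near zero this reduces to ordinary least squares (OLR); larger values trade bias for stability when spectral bands are strongly correlated, which is the usual motivation for RR in spectral inversion.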
In addition, near-infrared hyperspectral imaging has been widely used during the fruit expansion and ripening period of facility vegetables. The core principle is that different chemical components have characteristic absorption spectra in the near-infrared band (780–2500 nm), in which key quality components such as sugars, organic acids, and water produce specific molecular vibration absorption peaks [45]. By collecting diffuse reflectance or transmission spectra from the fruit surface and applying multivariate statistical methods (e.g., partial least squares regression, principal component analysis) to establish quantitative models relating spectral features to quality parameters, non-destructive testing can be realized [46]. For Brix detection of tomato fruits, the prediction accuracy of near-infrared spectroscopy combined with PLS modeling exceeds 90% [47]. This provides multi-dimensional data support for fruit harvesting, irrigation control, and quality grading, and significantly improves the accuracy and efficiency of fruit and vegetable quality management in facilities.

2.4. Chlorophyll Fluorescence Imaging

The release process for chlorophyll fluorescence is closely related to photochemical reactions, which can be used to sense photosynthesis, plant physiology, and environmental stress. Chlorophyll fluorescence detection plays an important role in the study of plant tissues such as leaves on a macro level, or chloroplasts and other cellular organelles on a micro level, and this detection does not cause damage to plants. Therefore, the chlorophyll fluorescence assay is known as a fast, non-destructive detection probe for plant photosynthetic function [48].
Currently, the chlorophyll fluorescence technique is mainly based on Butler’s energy competition model of photosynthesis. The model suggests that light energy absorbed by chlorophyll molecules releases energy through three pathways: photochemical reactions in the photosynthetic system, heat dissipation, and chlorophyll fluorescence. Detecting the chlorophyll fluorescence of plants can indirectly reflect the process of absorption, transfer, dissipation, and distribution of light energy in the plant photosynthetic system [49].
During the vegetative growth period of facility vegetables, chlorophyll fluorescence imaging assesses photosynthetic performance by quantifying core parameters such as maximum photochemical efficiency (Fv/Fm) and actual photochemical efficiency (ΦPSII); in healthy plants, Fv/Fm is typically about 0.8 [50]. The technique enables early diagnosis of abiotic stresses, such as water deficit, which causes abnormal fluorescence parameters [51], and of biotic stresses, such as the changes in fluorescence characteristics at the early stage of cucumber downy mildew infection [52]. Its spatially resolved imaging can localize the area of stress onset, providing a basis for precise, visualized control measures (e.g., targeted irrigation, early disease prevention) [53]. This approach to visualizing and quantifying photosynthetic physiological responses significantly improves the timeliness of stress early warning and the accuracy of management measures in facility vegetable production [54].
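The two parameters cited above are simple ratios of measured fluorescence levels. Assuming the standard definitions (F0 and Fm from a dark-adapted leaf; Fm' and Fs under actinic light), they can be computed as:

```python
def fv_fm(f0, fm):
    """Maximum photochemical efficiency of PSII: Fv/Fm = (Fm - F0) / Fm,
    from minimal (F0) and maximal (Fm) fluorescence of a dark-adapted leaf."""
    return (fm - f0) / fm

def phi_psii(fm_prime, fs):
    """Actual photochemical efficiency in the light:
    PhiPSII = (Fm' - Fs) / Fm', from light-adapted maximal (Fm')
    and steady-state (Fs) fluorescence."""
    return (fm_prime - fs) / fm_prime
```

Applied pixel-by-pixel to fluorescence image pairs, these ratios yield the spatially resolved stress maps described in the text.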

2.5. Thermal Imaging

Thermal imaging technology is mainly based on the principle of infrared radiation temperature measurement through the detection of infrared energy spontaneously radiating from the surface of the object to obtain temperature distribution information. The technology uses thermal imaging sensors to convert infrared radiation into electrical signals, which are processed to generate intuitive thermal images in which different color gradients accurately reflect the temperature changes of the object under testing [13].
The application of this technology during the fruit expansion and ripening period of facility vegetables is mainly through non-contact monitoring of crop surface temperature distribution, which provides a scientific basis for precision agriculture management [55]. For example, in water management, leaf temperature changes are detected to infer the plant's transpiration status, so that water stress does not impair fruit expansion, and the irrigation strategy is optimized accordingly [56]. Thermal imaging can also identify localized temperature anomalies caused by diseases or nutrient deficiencies, providing early disease warning and guiding precise control [57]. In addition, for fruit ripening, the technology can capture subtle temperature changes caused by metabolic activity; because materials differ in specific heat capacity, thermal images can serve as an indicator of maturity and help determine the optimal harvest period [58]. For light management, thermal imaging can monitor leaf overheating under intense light and help adjust supplemental lighting or shading programs so that photoinhibition does not reduce photosynthetic efficiency [59]. In the future, with deep integration with the Internet of Things and the spread of low-cost equipment, thermal imaging technology will further improve the yield and quality of facility vegetables.
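A standard way to turn canopy temperature into a water-status number, not named in the text but widely used in thermal-imaging irrigation scheduling, is the Crop Water Stress Index (CWSI). A minimal sketch of the empirical form:

```python
def cwsi(t_canopy, t_wet, t_dry):
    """Crop Water Stress Index from canopy temperature (deg C or K).

    t_wet : reference temperature of a fully transpiring (well-watered) leaf;
    t_dry : reference temperature of a non-transpiring leaf.
    Returns ~0 for unstressed plants and ~1 for fully water-stressed plants.
    """
    return (t_canopy - t_wet) / (t_dry - t_wet)
```

In practice the wet and dry references come from artificial reference surfaces or an energy-balance model measured alongside the thermal image.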

2.6. Raman Imaging

Raman imaging is a molecular spectral analysis method based on the Raman scattering effect. The principle is that when a laser interacts with sample molecules, the photons collide inelastically with the vibrational energy levels of the molecules, resulting in a change in the frequency of the scattered light, which produces a characteristic spectrum containing information about the molecular structure [60]. Due to the specificity of the vibrational modes of different chemical bonds, each chemical component will exhibit unique Raman spectral features, which makes this technique capable of qualitative and quantitative analysis of substances [61].
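The frequency change described above is conventionally reported as a Raman shift in wavenumbers (cm⁻¹), computed from the excitation and scattered wavelengths; assuming wavelengths given in nanometers:

```python
def raman_shift_cm1(laser_nm, scattered_nm):
    """Raman shift in wavenumbers (cm^-1) between the excitation laser and
    the inelastically scattered light: 1e7/lambda_laser - 1e7/lambda_scattered,
    with both wavelengths in nm. Positive values are Stokes shifts."""
    return 1e7 / laser_nm - 1e7 / scattered_nm
```

Reporting shifts rather than absolute wavelengths is what makes Raman peak positions comparable across instruments with different laser lines.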
Raman imaging plays an important role in the harvesting and grading of facility vegetables by combining the molecular fingerprinting capability of Raman spectroscopy with the spatial analysis capability of high-resolution imaging, achieving non-destructive, rapid detection and accurate grading [62]. It can non-destructively map the distribution of chemical components on the surface of vegetables, such as sugars, organic acids, vitamins, pigments, and pesticide residues, to assess maturity, nutritional value, and safety [63]. In automated sorting systems, Raman imaging can classify vegetables in real time based on preset criteria (e.g., sugar–acid ratio, carotenoid content) [64] and identify moldy or diseased areas [65], which improves quality, reduces wastage, and provides reliable technical support for intelligent harvesting and quality control in facility agriculture.
In addition, during the transportation and storage period for facility vegetables, Raman imaging technology can realize the accurate assessment of quality changes through dynamic monitoring of the chemical composition of fruit. The mechanism of action is as follows: over time, vegetable tissues undergo a series of biochemical reactions, such as degradation of polysaccharides, oxidative decomposition of antioxidants such as ascorbic acid, and metabolic transformation of cell wall components [66]. These changes in chemical components will directly cause changes in molecular vibration modes, which in turn leads to regular changes in the intensity, displacement, and other parameters of the characteristic peaks of Raman spectroscopy. The technique can also detect external damage suffered by fruits and vegetables during harvesting, transportation, storage, sorting, packaging, and marketing, as well as internal damage that occurs during the growth process [67]. These research results provide a reliable technical solution for the intelligent monitoring of quality in the cold chain logistics process of vegetables.

2.7. Terahertz Imaging

Terahertz (THz) radiation refers to electromagnetic waves in the frequency range of 0.1–10 THz (corresponding to wavelengths of 30 μm–3 mm), lying between the microwave and infrared regions. Terahertz waves penetrate deep into a medium, and their coherent detection makes it possible to determine the refractive index and absorption coefficient of a given sample. Because of this transmissive nature, terahertz spectroscopy can be used to analyze macromolecules and constituents within crops, making it uniquely suited to bioinformatic detection applications [68].
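Assuming a simple Beer-Lambert model and ignoring surface reflection losses, the absorption coefficient mentioned above can be recovered from a transmission measurement as follows; this is a sketch, not a full THz time-domain parameter extraction, and the function name is illustrative.

```python
import math

def thz_absorption_coeff(transmittance, thickness_mm):
    """Effective absorption coefficient (cm^-1) from a THz transmission
    measurement via the Beer-Lambert law: alpha = -ln(I/I0) / d.

    transmittance : I/I0, the fraction of intensity transmitted (0 < T <= 1);
    thickness_mm  : sample thickness in millimeters (converted to cm).
    Fresnel reflection at the sample surfaces is neglected in this sketch.
    """
    d_cm = thickness_mm / 10.0
    return -math.log(transmittance) / d_cm
```

Because water absorbs THz radiation strongly, this coefficient tracks tissue water content, which is the basis of the freshness assessments described below.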
Terahertz imaging, which penetrates the skin of vegetables to obtain internal information, plays an important role in the harvesting and grading period of facility vegetables, realizing non-destructive quality detection and automated grading [69]. It can detect the internal water distribution and ripeness of fruits [70], and its spectral features can distinguish key indicators such as sugar [71] and starch content [72], allowing fruit to be evaluated and graded from multiple perspectives. Compared with traditional visible-light or near-infrared technology, terahertz imaging penetrates vegetables wrapped in multiple layers of leaves more effectively and is not disturbed by surface humidity, significantly raising the level of intelligence in facility agriculture and reducing post-harvest losses. In the future, with the development of portable terahertz equipment, it is expected to become a key technology for post-harvest treatment of facility vegetables, improving the scientific rigor and accuracy of grading and meeting market demand for vegetables of different qualities.
Terahertz imaging technology also plays an important role in quality monitoring and safety assurance during the transportation and storage period of facility vegetables. By analyzing the absorption and scattering characteristics of vegetable tissues in the 0.2–1.2 THz band, changes in cellular water content can be accurately quantified, allowing the freshness of vegetables to be assessed [73]. During storage, terahertz imaging can identify early signs of spoilage that are difficult to detect by traditional methods; based on the characteristic absorption peaks of spoilage metabolites (e.g., ethylene), it enables early deterioration diagnosis and shelf-life prediction, preventing the greater economic losses caused by improper storage [74]. At the same time, this technology can accurately detect surface corrosion of vegetables [74] and crop residues [75], which safeguards the quality and safety of agricultural products, provides an innovative solution for intelligent management of cold chain logistics, and promotes the development of the agricultural economy.

2.8. X-Ray Imaging

The principle of X-ray imaging technology is based on the penetrating and material attenuation properties of X-rays. When high-energy X-rays pass through an object, substances of different densities absorb X-rays to different degrees, and the unabsorbed X-rays that penetrate the object are received by the detector, forming signals of different intensities, which are then converted into gray-scale images through computer processing [76].
X-ray imaging technology is used to improve the quality management and storage efficiency of vegetables mainly through non-destructive testing during the transportation and storage period of facility vegetables [77]. It is an effective tool for internal quality assessment and histological analysis of fresh fruits and vegetables because of its immunity to magnetic fields, its penetrating ability, and its ability to blacken photographic film, making it possible to examine multiple samples in greater detail in a shorter period of time [78]. During storage, the technology can rapidly detect internal defects in vegetables [79], such as mold [80] and root block damage [81], as well as mechanical damage to fruits and vegetables during transportation [82,83], to ensure that only high-quality products enter the distribution chain. In addition, studies have shown that X-ray irradiation also has the effect of extending shelf life and maintaining quality, which is of great commercial significance for food safety and preservation [84].
The application of this imaging technology in the transportation and storage period for facility vegetables can detect quality problems in vegetables in time, take corresponding measures to deal with them, prolong the storage life of vegetables, reduce losses, and provide key technical support for intelligent quality control of the facility vegetable supply chain.

2.9. Optical Coherence Tomography

Optical coherence tomography (OCT) enables non-invasive, high-resolution imaging of biological tissues and materials [85]. It operates based on the principle of low-coherence interferometry. A low-coherence light source emits light that is split into two beams: one travels through the sample (sample arm), and the other through a reference path (reference arm). Light reflected from different depths within the sample interferes with the reference light. By detecting and analyzing this interference signal, OCT can reconstruct cross-sectional images of the sample with a resolution in the order of micrometers [86].
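For a source with a Gaussian spectrum, the micrometer-order axial resolution mentioned above follows directly from the source's center wavelength and bandwidth. A small sketch of the standard formula, with illustrative example values:

```python
import math

def oct_axial_resolution_um(center_wavelength_nm, bandwidth_nm):
    """Theoretical axial resolution of OCT with a Gaussian-spectrum source:
    dz = (2 ln 2 / pi) * lambda0^2 / delta_lambda.

    center_wavelength_nm : source center wavelength lambda0 (nm);
    bandwidth_nm         : FWHM spectral bandwidth delta_lambda (nm).
    Returns the resolution in micrometers.
    """
    dz_nm = (2.0 * math.log(2.0) / math.pi) * center_wavelength_nm ** 2 / bandwidth_nm
    return dz_nm / 1000.0
```

The inverse dependence on bandwidth is why OCT systems use broadband (low-coherence) sources: a wider spectrum shortens the coherence length and sharpens depth discrimination.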
In the study of facility vegetables, OCT has significant potential applications. During the vegetative growth stage, OCT can provide detailed images of leaf internal structures, including the arrangement of palisade and spongy mesophyll cells. Changes in these structures can indicate the physiological status of the plant, such as water stress or nutrient deficiencies [87]. At the mature harvest stage, when evaluating fruit quality, OCT can be employed to detect subtle defects beneath the fruit surface, such as incipient bruising or internal structural abnormalities, which are difficult to identify with other optical techniques [88]. Compared to other imaging methods, OCT offers a unique combination of high resolution and non-invasiveness, making it a valuable tool for complementing existing optical sensing technologies in facility vegetable research. However, its current application in this field is still at the exploratory stage, with limitations such as a relatively shallow imaging depth (typically a few millimeters) and higher equipment costs than some conventional techniques like RGB imaging.
In conclusion, the nine optical sensing technologies reviewed in this chapter exhibit functional complementarity throughout the entire growth cycle of facility vegetables: RGB imaging and 3D imaging, with their low cost, ease of operation, and adaptability across the entire cycle, have become the core technologies for morphological monitoring during the seedling stage, plant type analysis during the vegetative growth stage, and appearance inspection during the harvest stage; multispectral and hyperspectral imaging, leveraging the spectral resolution advantage, perform outstandingly in nutrient diagnosis during the vegetative growth stage (such as nitrogen and phosphorus content detection) and quality component inversion during the fruit expansion and ripening stage (such as sugar content and pigments); thermal imaging and chlorophyll fluorescence imaging focus on physiological dynamic monitoring, achieving early warning for water stress, pests, and diseases, and extreme environmental responses throughout the cycle through temperature anomalies and changes in photosynthetic parameters; Raman spectroscopy, terahertz, and X-ray imaging, due to their high specificity or strong penetration, are mostly used for post-harvest quality grading and internal defect detection during storage; and OCT, although still at the exploratory stage, provides a new path for micrometer-resolution analysis of leaf microstructure and identification of internal hidden damage. The differences in application scenarios of the various technologies offer diversified technical options for precise management of the different growth stages of facility vegetables.

3. Application of Optical Sensing Technology in Phenotyping of Facility Vegetables

We categorize the phenotypic traits of facility vegetables into biochemical traits, physiological traits, yield traits, and quality traits, covering key indicators spanning from basal metabolism across the entire growth cycle to final economic value. We also clarify the application directions of technologies such as RGB, multi-/hyperspectral imaging, and thermal imaging in detecting these four categories of traits, as shown in Figure 2.
Based on Web of Science, we summarized the number of articles published each year since 2000 on the application of optical sensors to facility vegetables. The literature was retrieved using combinations of “optical technology type + facility vegetables + phenotypic research”, excluding irrelevant scenarios such as open-field cultivation, and the data were classified according to the following framework: biochemical traits (divided into photosynthetic pigments, enzyme activities, etc., based on metabolic function), physiological traits (divided into abiotic/biotic stress responses, photosynthetic efficiency, etc.), yield traits (divided into fruit quantity, biomass, etc., based on formation indicators), and quality traits (divided into external color/shape and internal soluble solids/freshness, based on value); the corresponding detection capabilities of optical technologies were matched within each category to ensure comprehensive coverage (as shown in Figure 3).

3.1. Biochemical Traits

RGB imaging, multi/hyperspectral imaging, terahertz imaging, and Raman imaging are commonly used for the detection of biochemical traits in facility vegetables. These traits (e.g., photosynthetic pigments, nutrient elements) are the material basis of physiological functions, directly affecting vegetables’ resistance, nutritional quality, and flavor formation [89,90]. For example, water content, an important biochemical trait of lettuce, directly influences its yield and quality [91], and multispectral technology is widely applied here by analyzing the characteristic spectral response of lettuce leaves in the visible and near-infrared bands [92]. When light irradiates lettuce leaves, different water contents lead to differences in the absorption and reflection of light in each band, so water content information can be obtained by analyzing the intensity changes of reflected or transmitted light in each band. This capability underpins improved refrigeration and storage optimization, helping to maintain freshness and extend shelf life [93].
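The band-differencing logic described above can be sketched as a simple normalized-difference index. This is an illustrative sketch only: the band roles (a water-insensitive near-infrared reference band versus a water-absorption band) and the reflectance values are assumptions, not parameters from the cited lettuce studies.

```python
# Hypothetical sketch: relative leaf water status from two-band reflectance.
# r_ref: reflectance in a water-insensitive NIR reference band.
# r_abs: reflectance in a water-absorption band; wetter tissue absorbs
#        more strongly there, so r_abs is lower for hydrated leaves.
def water_index(r_ref: float, r_abs: float) -> float:
    """Normalized-difference water index; higher values suggest wetter tissue."""
    return (r_ref - r_abs) / (r_ref + r_abs)

hydrated_leaf = water_index(0.55, 0.10)  # strong water absorption
drier_leaf = water_index(0.55, 0.30)     # weaker absorption
```

Applied per pixel of a multispectral image, such an index yields a spatial map of water status rather than a single leaf-level value.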
Hyperspectral imaging technology shows unique advantages in the detection of pigments in facility vegetables. Chlorophyll and carotenoid contents are important biochemical characteristics that affect the photosynthetic capacity of vegetation and govern crop productivity [94]. The technique exploits the fact that chlorophyll has significant absorption peaks near 660 nm and 450 nm, while carotenoids absorb mainly around 450 nm, so pigment content can be accurately estimated by acquiring continuous spectral information and analyzing changes in reflectance or absorbance at these characteristic wavelengths [95]. Taking cucumber as an example, a hyperspectral image of the leaf is acquired by a hyperspectral imager, and pixel-level spectral analysis combined with specific algorithms can not only quantitatively determine the pigment content but also intuitively show its spatial distribution across the leaf [96].
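The reflectance-based estimation just described can be sketched with a ratio-style pigment index: stronger absorption near 660 nm (lower reflectance there) relative to a pigment-insensitive NIR band indicates more chlorophyll. The index form and all reflectance values below are illustrative assumptions, not the calibration used in the cited cucumber study.

```python
# Minimal sketch: ratio-style chlorophyll index from two reflectances.
# Chlorophyll absorbs strongly near 660 nm, so reflectance there drops as
# pigment content rises, while NIR reflectance stays largely unaffected.
def chlorophyll_index(r_nir: float, r_660: float) -> float:
    """Higher index ~ more chlorophyll (stronger 660 nm absorption)."""
    return r_nir / r_660 - 1.0

high_chl = chlorophyll_index(0.50, 0.05)  # dark-green, strongly absorbing leaf
low_chl = chlorophyll_index(0.50, 0.20)   # chlorotic leaf, weaker absorption
```

Evaluating the index at every pixel of a hyperspectral cube is what produces the spatial pigment-distribution maps mentioned above.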
In addition, the large amount of spectral information available in hyperspectral images allows for high-precision assessment of crop nutrient status and supports corresponding fertilizer treatments for optimal crop production [97]. For example, leaf nitrogen content is an important factor influencing canopy light-use efficiency and the canopy photosynthesis rate [98], and hyperspectral images can be used to estimate the nitrogen content of different crop types and thus gauge their growth status.
This method provides an important means of studying the mechanism of plant photosynthesis, growth, and development, and response to environmental stresses, and also helps growers to identify problems such as deficiencies, pests, and diseases in a timely manner, implement precise interventions, and safeguard the healthy growth of facility vegetables. Table 1 provides a detailed list of the applications of optical sensing technology in the biochemical traits of facility vegetables.

3.2. Physiological Traits

Physiological traits are the dynamic functional properties exhibited by plants during growth, such as photosynthesis, respiration, transpiration, water and nutrient uptake, and utilization [90]. These traits directly affect plant growth and development and stress tolerance, and ultimately yield and quality formation [114]. Chlorophyll fluorescence imaging, thermal imaging, and multispectral and hyperspectral imaging are commonly used to monitor abiotic and biotic stress responses in the physiological traits of facility vegetables [115,116].
Abiotic stresses, such as drought, high/low temperature, salinity, and nutrient deficiencies, and biotic stresses, such as diseases and pests, can significantly affect the physiological metabolism of facility vegetables, which in turn reduces yield and quality. In abiotic stress detection, chlorophyll fluorescence imaging has been used to grade cold damage in tomato seedlings under low-temperature stress from six fluorescence parameters (Y(II), qP, qL, Y(NPQ), Y(NO), and Fv/Fm) and shows considerable promise for the non-destructive diagnosis of low-temperature injury in plants [117]. The chlorophyll a fluorescence OJIP transient provides a non-destructive, simple, and rapid technique for assessing salt stress in tomato leaves and fruits [118] and has also been used to characterize their responses to high- and low-temperature stress [119]. Under drought stress, the water content of vegetable leaves decreases, leading to an increase in leaf temperature, so thermography can be used to estimate transpiration rates and assess plant water status [120]. Multispectral imaging can assess the effects of drought stress on vegetable growth by analyzing changes in leaf spectral features and derived parameters such as leaf area index and leaf nitrogen content [121].
In biotic stress monitoring, when facility vegetables are attacked by pests and diseases, the plants will produce a series of physiological changes, and these changes will lead to changes in leaf temperature. Thermal imaging technology reflects the stress condition of plants by detecting changes in leaf surface temperature [14]. For example, when a cucumber is infected with downy mildew, the temperature of the diseased spot will be different from that of the healthy part, and thermal imaging can clearly show the abnormal temperature area, which can help the grower to detect the disease in time [122]. Multispectral imaging, on the other hand, monitors disease by analyzing changes in spectral reflectance of leaves at multiple specific wavelengths [123]. Chlorophyll (Chl) and carotenoid (Car) contents in cucumber leaves were measured by hyperspectral imaging to determine whether they were infected with angular leaf spot [96]. Table 2 provides a detailed list of the applications of optical sensing technology in the physiological traits of facility vegetables.

3.3. Yield Traits

Yield traits are important indicators for evaluating the production performance of facility vegetables. RGB imaging, multi/hyperspectral imaging, and chlorophyll fluorescence imaging are commonly used non-destructive testing methods for yield traits in facility vegetables. RGB imaging plays an important role in monitoring and evaluating these traits, as plant phenotypic information can be obtained quickly and non-destructively through high-resolution imaging, providing data support for precision agriculture management. Specifically, the visible light band (red, green, and blue) images captured by RGB cameras can resolve the growth status of vegetable crops, such as plant height, leaf area, and canopy cover [138], and physiological characteristics, such as leaf color change and fruit coloration [139]. Combined with image processing algorithms, key yield-related parameters such as fruit number and size distribution can be quantified [140]. In large-scale production in facility environments, the combination of RGB imaging and automated systems enables high-throughput phenotyping of vegetable yield traits, providing a scientific basis for variety selection and cultivation decisions.
Multispectral and hyperspectral imaging technologies have significant advantages in monitoring yield traits of vegetables in facility settings, capturing vegetation reflectance features in multiple discrete bands to assess chlorophyll content [141], biomass accumulation, and water status [142]; furthermore, with the fine spectral data provided by successive narrow wavelength bands, it is possible to identify more deeply the biochemical components related to yield [143] and early stress responses [144]. These techniques accurately analyze crop physiological status and yield potential through richer spectral information, analyze characteristic spectral reflectance parameters related to biomass, such as the Normalized Difference Vegetation Index (NDVI) and the Photochemical Reflectance Index (PRI), and establish regression models between hyperspectral data and biomass [39]. By monitoring changes in biomass, we can understand the growth dynamics of facility vegetables, analyze the relationship between biomass and yield, and provide scientific guidance for optimizing planting management and improving yield.
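The index-then-regression workflow described above can be sketched end to end: compute NDVI from NIR and red reflectance, then fit an ordinary least-squares line against measured biomass to calibrate a predictive model. All reflectances and biomass values below are synthetic illustrations, not data from the cited work.

```python
# Sketch of an NDVI-based biomass calibration (synthetic numbers).
def ndvi(r_nir: float, r_red: float) -> float:
    """Normalized Difference Vegetation Index from two band reflectances."""
    return (r_nir - r_red) / (r_nir + r_red)

def fit_line(xs, ys):
    """Ordinary least squares fit y = a + b*x; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx
    return my - b * mx, b

# Synthetic calibration set: (NIR, red) reflectance and measured biomass (g).
plots = [(0.50, 0.30), (0.55, 0.25), (0.60, 0.20), (0.65, 0.15)]
biomass = [120.0, 180.0, 250.0, 330.0]
x = [ndvi(nir, red) for nir, red in plots]
intercept, slope = fit_line(x, biomass)

# Predict biomass for a new plant from its reflectances alone.
predicted = intercept + slope * ndvi(0.58, 0.22)
```

In practice, the calibration set would come from destructive biomass sampling paired with imaging, and more flexible regressors (PLS, random forests) often replace the straight line.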
Chlorophyll fluorescence imaging provides visualization data for key physiological processes of yield formation by non-destructively and rapidly detecting the efficiency of light energy absorption, transfer, and conversion in the plant photosynthetic system [145], reflecting in real time the photosynthetic physiological status of the plant and its response to environmental stress [146]. In the facility environment, this technique can accurately reflect the spatial distribution and dynamic changes of leaf photosynthetic activity (e.g., the maximum photochemical efficiency of PSII, Fv/Fm, and the actual photochemical efficiency, ΦPSII). In addition, chlorophyll fluorescence imaging can diagnose photosynthetic inhibition caused by stresses such as pigment deficiency, drought [147], and disease [148] at an early stage, providing a basis for optimizing regulation of the facility environment, intervening in a timely manner to reduce yield loss, and ultimately promoting an increase in vegetable yields by enhancing photosynthetic productivity. Table 3 provides a detailed list of the applications of optical sensing technology in the yield traits of facility vegetables.
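The two parameters named above follow standard definitions: Fv/Fm = (Fm − F0)/Fm from dark-adapted minimal (F0) and maximal (Fm) fluorescence, and ΦPSII = (Fm′ − Fs)/Fm′ from steady-state (Fs) and light-adapted maximal (Fm′) fluorescence. A minimal sketch with illustrative instrument readings (the formulas are standard; the numbers are made up):

```python
# Standard chlorophyll fluorescence parameters (per pixel or per leaf).
def fv_fm(f0: float, fm: float) -> float:
    """Maximum PSII photochemical efficiency, Fv/Fm = (Fm - F0) / Fm."""
    return (fm - f0) / fm

def phi_psii(fs: float, fm_prime: float) -> float:
    """Actual PSII photochemical efficiency, PhiPSII = (Fm' - Fs) / Fm'."""
    return (fm_prime - fs) / fm_prime

healthy = fv_fm(300.0, 1500.0)   # ~0.8, typical of unstressed leaves
stressed = fv_fm(300.0, 1000.0)  # Fm drops under photoinhibition, lowering Fv/Fm
```

Computing Fv/Fm pixel by pixel across a fluorescence image yields the spatial maps of photosynthetic activity referred to in the text; values falling well below about 0.8 flag stressed regions.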

3.4. Quality Traits

RGB imaging, Raman imaging, and multispectral and hyperspectral imaging are commonly used for non-destructive testing of facility vegetables’ quality traits [158,159]. These traits, covering external (color, shape), textural (hardness, crispness), and internal (soluble solids, vitamins) characteristics, directly determine commercial value and nutritional function. For example, color, a critical quality trait, not only affects consumer purchase intent but also reflects fruit ripeness; RGB imaging quantifies fruit color via color space conversion and image processing algorithms [160]. Combined with deep learning architectures such as Transformer, ResNet, and MobileNetV3, RGB imaging technology can be used to assess the freshness of fruits and vegetables such as apples and lettuce [161]. For shape analysis, RGB imaging describes and evaluates shape using morphological parameters extracted from external contour information. Shape indices, such as roundness and aspect ratio, reflect whether the shape is regular and uniform [162]. Regular and uniform shapes usually have better quality and market value, so shape analysis allows products with desirable traits to be screened out, improving the commercial rate of the product. RGB imaging can also be used to detect surface defects of fruits, such as diseased spots and wounds, so that quality problems are detected in a timely manner and product quality is guaranteed [163].
Moisture, ripeness, internal defects, and soluble solids content (SSC), including sugars, acids, and minerals, are important internal quality characteristics of facility vegetables and are active topics of research. In tomato fruit quality assessment, an RGB imaging system captures images of tomato fruits, the images are converted from the RGB color space to the HSV (hue, saturation, and value) color space, and ripeness can be determined by analyzing changes in the hue (H) values [164]. As tomato fruits ripen, the hue value changes regularly, with higher hue values for unripe fruits and lower hue values for ripe fruits [165]. By setting an appropriate hue threshold, the ripeness of tomato fruits can be accurately judged, providing a basis for selecting picking time. Raman technology is based on the principle of inelastic scattering of light: when light irradiates vegetables, the molecules scatter it, and the inelastically (Raman) scattered component carries the vibrational and rotational information of the molecules, with each molecule showing a unique Raman spectral signature. By analyzing Raman spectra, internal components of vegetables, such as carotenoids [166] and nitrate [167], can be identified and quantitatively detected. By measuring the spectra and pigmentation of green ripe tomatoes and using them as an indicator to classify first-day spectral data as ripe or unripe, a multispectral imager can be used to predict tomato ripeness [168]. Defects in pepper fruits can be accurately and reliably identified and classified by extracting spectral data from the region of interest (ROI) of pepper fruit samples, integrating hyperspectral imaging, partial least squares discriminant analysis, band selection methods, ANOVA-based classification, and improved weighted spectral analysis [169].
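The hue-thresholding step described for tomato can be sketched with the standard library's RGB-to-HSV conversion. The sample colors and the 40° threshold below are illustrative assumptions, not values from the cited studies; red fruit has hue near 0°, green fruit near 120°, matching the "lower hue = riper" relationship noted above.

```python
import colorsys

# Hypothetical hue-based ripeness grading from a fruit's mean RGB color.
# The 40-degree threshold is an assumption for illustration.
def ripeness_from_rgb(r: int, g: int, b: int, hue_threshold_deg: float = 40.0) -> str:
    """Classify ripeness: low hue (toward red) = ripe, high hue (toward green) = unripe."""
    h, _s, _v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    hue_deg = h * 360.0
    return "ripe" if hue_deg < hue_threshold_deg else "unripe"

red_fruit = ripeness_from_rgb(220, 40, 30)    # hue ~ 3 degrees
green_fruit = ripeness_from_rgb(80, 160, 60)  # hue ~ 108 degrees
```

In a real pipeline, the mean color would be computed over a segmented fruit region rather than a single triplet, and the threshold calibrated against reference maturity classes.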
Due to its relatively limited penetration depth, OCT is primarily applicable for microstructural characterization of superficial tissue layers. Nevertheless, a key advantage of OCT lies in its ability to avoid compression and deformation of cells or interstitial spaces, thereby providing higher accuracy in depicting near-surface features. Consequently, by monitoring alterations in fruit cell arrangement and tissue density, OCT can effectively detect variations in internal moisture content, enabling the determination of the fruit’s drying level [170]. Furthermore, the high axial resolution depth-resolved images generated by OCT allow for the detection of changes in the healthy cell layers of infected fruits, facilitating the early visualization of internal defects caused by pathogenic infection [171]. Although both OCT and X-ray imaging can penetrate biological samples, OCT experiences greater contrast degradation with increasing penetration depth than X-rays. As a result, X-ray imaging offers superior capabilities for reconstructing and analyzing the 3D microstructure of fruits [85]. Table 4 provides a detailed list of the applications of optical sensing technology in the quality traits of facility vegetables.
This section reviews the application of various optical sensing technologies in detecting four key phenotypic characteristics (biochemistry, physiology, yield, and quality) of vegetables in agricultural facilities. Comparative analysis of their working principles reveals functional complementarity for specific phenotypic characteristics, but there are still significant differences in the maturity and readiness of the various technologies for large-scale application, as shown in Table 5. Hyperspectral and Raman imaging excel in quantifying biochemical components such as water, pigments, and nutrients. Thermal imaging and chlorophyll fluorescence imaging have unique advantages in non-invasive monitoring of physiological states, enabling early detection of stress through canopy temperature and photosynthetic efficiency signals. Yield estimation mainly utilizes RGB and 3D imaging to extract structural parameters, including plant height, canopy width, leaf area, and fruit number. In quality assessment, a multi-scale assessment framework has been established: RGB imaging is used to evaluate appearance, X-ray and OCT are used to detect the integrity of internal structures (such as tissue density, internal defects, and morphological disorders), and Raman spectroscopy can interpret molecular-level properties, including sugar accumulation and pesticide residues. However, in practical applications, it is still necessary to quantify performance differences across technologies, environmental conditions, and crop varieties. This is crucial for transforming these sensing methods from experimental concepts into reliable, integrated phenotypic analysis systems that can support precision agriculture in actual production environments.

4. Challenges and Perspectives of Optical Sensing Technology in Facility-Based Vegetable Phenotyping Research

4.1. Problems in Reality

Despite the significant progress of optical sensing technology in laboratory-based facility vegetable phenotyping, its translation to large-scale practical production is hindered by three core bottlenecks that limit its popularization among small and medium-sized growers and integration into existing greenhouse management systems.
(i)
Complexity of data processing: The multi-dimensional data generated by optical sensors need to undergo specialized processing to achieve effective fusion and artificial intelligence modeling [191]. The core challenge lies in evaluating the appropriate processing methods and standardizing them, which is crucial for ensuring the reliability of the data and the consistency of the model. Data from different sensors (such as multispectral cameras from different manufacturers) may be processed using different spectral normalization techniques, resulting in incompatible and ineffective fusion outcomes and preventing comprehensive phenotypic analysis.
Although growing interest in deep learning has brought innovative algorithms that can address these problems, such as reducing data preprocessing steps, retaining original features, and automatically learning from the data to improve the accuracy and efficiency of detecting vegetable phenotypic features, the collection, construction, and annotation of large-scale datasets are both time-consuming and prone to subjectivity [100]. Even with effective data and models, the reliability of some indirect phenotypic indicators used for analyzing facility vegetables is still limited by environmental factors. In particular, optical measurements are highly sensitive to ambient light intensity and temperature, and this environmental interference makes it difficult to compare data across time points or between greenhouses, further increasing the complexity of data processing and fusion.
(ii)
Sensor cost and robustness: The high cost of high-precision optical sensors remains a primary barrier to widespread adoption, and this issue is particularly prominent for multispectral imaging technology. For example, commercial mid-range multispectral cameras equipped with 5–8 spectral bands (covering visible, near-infrared, and red-edge bands, essential for traits like leaf nitrogen content and chlorophyll estimation) typically cost 30,000–50,000 RMB (≈4200–7000 USD). Even entry-level multispectral sensors (with 3–4 spectral bands) cost 15,000–20,000 RMB, still exceeding the affordability of most smallholders [192], and their high development and production costs make them unaffordable for many small-scale facility vegetable growers [193].
Furthermore, humidity in the greenhouse is relatively high and temperature fluctuates frequently. Occasionally, it may also be affected by pesticide fog or dust [194], which poses a significant challenge to the robustness of the sensors. The changes in instrument temperature and the external environment during operation can cause spectral drift in the data. Due to the shift in the characteristic peak positions of elements, the inversion results of the material composition will be inaccurate [195]. To ensure the accuracy and reliability of sensor measurements, the sensors need to be calibrated and maintained regularly, which not only requires specialized technicians and equipment but also incurs additional costs. Some high-end optical sensors need to be calibrated under specific environmental conditions, which increases the difficulty and cost of calibration [196].
(iii)
Poor interaction with the greenhouse: Most optical sensors operate as standalone devices, with limited integration into existing greenhouse intelligent systems (e.g., irrigation controllers, fertilization machines, environmental regulators) [197]. For example, hyperspectral sensors can detect nitrogen deficiency in cucumber leaves and output recommended fertilization rates, but most of the existing greenhouse fertilization systems (e.g., drip irrigation fertilization machines) lack data interfaces to receive these recommendations, requiring growers to manually input parameters, which delays action and increases human error. Similarly, thermal imaging-based crop water stress index (CWSI) data cannot be directly transmitted to irrigation controllers. This results in a certain delay between the stress detection and the irrigation adjustment [198].
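The CWSI mentioned above is a simple normalization of canopy temperature between wet (fully transpiring) and dry (non-transpiring) reference temperatures, which makes the integration gap concrete: the index itself is trivial to compute from thermal data, yet the resulting value cannot reach the irrigation controller. A minimal sketch with illustrative readings (the 0.4 trigger threshold is an assumption):

```python
# Empirical crop water stress index from canopy and reference temperatures:
#   0 = fully transpiring (canopy as cool as a wet reference surface),
#   1 = fully stressed (canopy as warm as a dry, non-transpiring surface).
def cwsi(t_canopy: float, t_wet: float, t_dry: float) -> float:
    return (t_canopy - t_wet) / (t_dry - t_wet)

# Illustrative thermal-image readings in degrees Celsius.
index = cwsi(t_canopy=30.0, t_wet=25.0, t_dry=35.0)

# If such data were piped to an irrigation controller, a simple threshold
# could trigger irrigation; the 0.4 value here is an assumed example.
needs_irrigation = index > 0.4
```

Closing the loop would mean transmitting `needs_irrigation` (or the raw index) over a standard interface to the irrigation controller instead of leaving it on the imaging workstation.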
In summary, resolving real-world application bottlenecks requires cross-disciplinary collaboration between sensor hardware engineers, software developers, and agricultural extension personnel. Future research should prioritize low-cost, robust sensor development and user-centric system integration to bridge the gap between laboratory innovation and practical production.

4.2. Prospects for Future Applications

4.2.1. Data Fusion and AI Applications

The data acquired by different types of optical sensors have their own unique characteristics, which means data fusion faces many difficulties. For example, RGB imaging data mainly present visual information such as the color and shape of the object [199], while multispectral imaging data contain rich spectral features and can be used to analyze the chemical composition of the object [39]. Thermal imaging data, on the other hand, focus on the temperature distribution of an object [13]. These data differ not only in dimensionality but also in data format, resolution, and sampling frequency [200]. Because sensors also differ in installation position, resolution, and other factors, fusing their data can bias the detailed expression of the fused result, making it difficult to accurately reflect the real characteristics of the object [201].
By integrating different sensing data, the advantages of multi-source sensing can be exploited to compensate for the limitations of a single sensor, thus improving assessment accuracy [202,203]. However, the data fusion process suffers from data redundancy and difficulties in mining complementary information. Data acquired by different sensors may partially overlap, which generates redundancy and increases the burden and cost of data processing. To mine truly complementary information from these complex data, it is necessary to deeply understand the intrinsic connections and characteristics of each sensor's data and to apply advanced data processing algorithms and models [202]. However, the relevant research is not yet mature, and effective data fusion methods still need to be further explored and improved.
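One common remedy for the scale and format mismatches just described is feature-level fusion: standardize each modality's features separately so their scales are comparable, then concatenate them into a single vector for a downstream model. A minimal sketch under that assumption (the feature names and values are invented for illustration):

```python
# Feature-level fusion sketch: per-modality z-score normalization,
# then concatenation into one feature vector.
def zscore(values):
    """Standardize one modality's features to zero mean, unit variance."""
    n = len(values)
    mean = sum(values) / n
    std = (sum((v - mean) ** 2 for v in values) / n) ** 0.5 or 1.0  # guard constant inputs
    return [(v - mean) / std for v in values]

def fuse(*modalities):
    """Normalize each modality independently, then concatenate."""
    fused = []
    for feats in modalities:
        fused.extend(zscore(feats))
    return fused

rgb_feats = [0.42, 0.31, 0.27]       # e.g., mean R/G/B fractions
spectral_feats = [0.45, 0.62, 0.55]  # e.g., selected band reflectances
thermal_feats = [26.4, 27.1, 25.8]   # e.g., leaf temperatures (deg C)
vector = fuse(rgb_feats, spectral_feats, thermal_feats)
```

Normalizing per modality before concatenation prevents large-magnitude features (here, temperatures in the tens) from dominating small-magnitude ones (reflectance fractions) in whatever model consumes the fused vector; it does not, by itself, resolve redundancy between modalities.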
A single data source still has large limitations, which calls not only for multi-sensor data but also for artificial intelligence-driven analysis. With the rapid development of artificial intelligence technology, it shows great potential in optical sensor data processing and analysis [204]. Machine learning algorithms can efficiently process and analyze large amounts of optical sensor data; by learning from historical data, machine learning models can establish complex relationships within the data and accurately predict the growth status of facility vegetables [205]. Deep learning algorithms excel in image recognition and classification. It is difficult to judge the quality of vegetables stored for different periods by the senses alone, but identifying vegetable freshness using hyperspectral imaging and deep learning methods allows quality to be evaluated accurately [206]. For RGB imaging data, deep learning models can accurately recognize the variety, growth stage, and pest and disease symptoms of vegetables. Deep convolutional neural networks can identify the disease classes of vegetable pests and diseases, often before symptoms are visible to the human eye [207], with accuracy exceeding 90% and an average detection time of less than 1 s, providing strong support for timely disease control [208]. Through federated learning (FL), emerging artificial intelligence models are deployed on edge devices and trained for tasks such as crop classification and yield prediction, marking a significant advance from traditional agriculture towards smart agriculture [209]. Meanwhile, existing studies have utilized artificial intelligence (AI) and explainable artificial intelligence (XAI) technologies to predict crop yields and assess the impact of climate change on agriculture [210].
In conclusion, through the combination of artificial intelligence technology and data fusion, and in-depth analysis of the fused data to dig out more valuable information, the intelligent level of facility vegetable production management can be further improved [202].

4.2.2. Construction of Low-Cost and Reliable Sensor Networks

The development of low-cost and reliable sensor networks for environmental control in scenarios such as greenhouses is necessary [211], as it addresses the critical gap between advanced optical sensing technology and economic feasibility for small-to-medium-scale growers. In terms of technical realization, ensuring sensor performance and stability while reducing cost is the key issue [212]. Low-cost sensors, often constructed from consumer-grade components (e.g., off-the-shelf CMOS cameras, inexpensive narrow-band filters, and open-source microcontrollers such as Arduino or Raspberry Pi) [213], may exhibit declines in accuracy, spectral resolution, or temporal stability compared to their high-end counterparts; finding a balance between cost and performance through technological innovation and design optimization is thus a challenge to be solved. Existing studies have applied the principles of wireless sensor networks (WSN) and employed low-cost non-dispersive infrared (NDIR) CO2 sensors to conduct in situ measurements of greenhouse soil gas emissions, achieving an R² value of 0.96 and showing no significant difference from the 1:1 relationship at a significance level of α = 0.05 [214]. However, designing and integrating a large number of low-cost optical sensors, and applying them reliably to greenhouse data collection, remains a significant challenge.
In terms of application promotion, growers’ acceptance of low-cost sensor networks is also an important factor. Many growers have a low level of understanding and trust in the new technology and are worried about the reliability and practicality of low-cost sensor networks, which requires strengthening publicity and training to improve the cognitive level and application ability of growers. It is also necessary to establish a perfect after-sales service system to provide growers with timely technical support and maintenance services to lessen their worries [215].

4.2.3. Intelligent Sensing System and Precision Agriculture

The future intelligent perception system for facility vegetables is expected to be highly integrated and intelligent. The systematic integration of a variety of advanced optical sensors, such as thermal imaging sensors, fluorescence imaging sensors, hyperspectral sensors, etc., to realize all-around real-time monitoring of the growth environment and growth state of facility vegetables has been evidenced in research on cereal crops [216]. Through wireless transmission technology, data collected by the sensors are transmitted in real time to the cloud server for storage and analysis. Using big data analysis and artificial intelligence algorithms, the massive data are deeply mined and processed to realize accurate prediction and intelligent regulation of the growth process of facility vegetables. When the system detects the symptoms of pests and diseases in vegetables, it can automatically analyze the type and severity of the pests and diseases, and issue timely warning information as well as provide corresponding suggestions for prevention and control measures. At the same time, according to the growth status of vegetables and environmental conditions, it automatically adjusts the operating parameters of irrigation, fertilization, ventilation and other equipment to realize the automation and intelligent management of facility vegetable production [217].
The development of intelligent perception systems will strongly promote precise planting and management of facility vegetables. In terms of precision planting, real-time monitoring and analysis of the growing environment and crop status allow growers to irrigate, fertilize, and control pests according to the actual needs of the crop, avoiding resource waste and environmental pollution while improving vegetable yield and quality [218]. In terms of management, such systems can provide growers with comprehensive production data and decision support, helping them optimize planting schemes, schedule production rationally, and improve production efficiency and economic returns. They also enable remote monitoring and management: growers can follow the growth of facility vegetables anytime and anywhere through cell phones, computers, and other terminal devices and carry out remote operations, improving the convenience and flexibility of management [219].

5. Conclusions

This article provides a comprehensive review and analysis of the application of various optical sensing technologies throughout the entire growth cycle of facility vegetables, covering their performance in detecting biochemical, physiological, yield, and quality traits.
RGB imaging is the most widely applied technology, covering most of the crop growth cycle and most phenotypic traits; hyperspectral and multispectral imaging are widely used to evaluate biochemical components and physiological states; and fluorescence and thermal imaging are particularly effective for precise, early detection of plant stress. For yield-related traits, 3D and RGB imaging provide reliable morphological data, while Raman imaging, X-ray, and optical coherence tomography show great potential for non-destructive internal quality assessment after harvest.
The current challenge lies in translating laboratory-validated results into reliable, integrated systems suitable for commercial greenhouse environments. Overcoming it requires addressing sensor cost, data complexity, and hardware reliability under real production conditions.
Future development presents several important trends. Multi-technology fusion, integrating complementary sensor types, can provide a more complete phenotypic picture and overcome the limitations of any single technology. Artificial intelligence, especially deep learning, will be crucial for the automated analysis of complex, multi-dimensional sensor data, for improving prediction accuracy, and for building scalable models. At the same time, sensor data must be linked closely to the physiological functions and metabolic processes of plants so that the resulting information is more valuable and actionable for growers.
The continuous development of optical sensing technology has provided greater impetus for more integrated and intelligent decision support systems in greenhouse vegetable production. However, most applications are limited to controlled experimental environments, and transitioning to practical applications still requires the continuous overcoming of key challenges such as multi-sensor data fusion, environmental adaptability, and better interaction with greenhouse systems. By continuously narrowing the gap between optical sensing technology and practical applications and management, the development of facility agriculture will be promoted towards greater adaptability, efficiency, and sustainability.

Author Contributions

Conceptualization, Z.L. (Zonghua Leng), X.Z., X.W., and S.T.; methodology, Z.L. (Zonghua Leng), X.Z., X.W., and Y.Z.; validation, S.T. and Y.Z.; formal analysis, Y.Z., X.H., and Z.L. (Zhaowei Li); investigation, X.H. and Z.L. (Zhaowei Li); writing—original draft preparation, Z.L. (Zonghua Leng) and X.W.; writing—review and editing, X.H. and Z.L. (Zhaowei Li); supervision, X.Z. and S.T.; funding acquisition, X.Z., S.T., and Y.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the project of the National Key Research and Development Program of China (Grant No. 2022YFD2002302); Jiangsu Province Industry Forward-looking Program Project (Grant No. BE2023017); National Key Research and Development Program for Young Scientists (Grant No. 2022YFD2000200); and Agricultural Equipment Department of Jiangsu University (Grant No. NZXB20210106).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data are contained within the article. The data presented in this study can be requested from the authors.

Acknowledgments

The authors express their deep gratitude to the Agricultural Engineering College of Jiangsu University for the support provided, and also thank the reviewers for their important feedback. During the preparation of this manuscript, the authors used generative artificial intelligence (AI), specifically the large language model [Doubao, seed-1.6, ByteDance Ltd.], as an auxiliary tool to enhance research efficiency during specific, non-interpretive stages of manuscript preparation. The authors have reviewed and edited the output and take full responsibility for the content of this publication. The use of AI was strictly limited to preparatory and editing assistance; it was not employed to generate any scientific interpretations, conclusions, or original data analysis, and the authors are solely responsible for the entire intellectual content, critical analysis, and final synthesis presented in this work. The application of AI was implemented in the following phases, with explicit oversight. Ideation and outline structuring (Phase 1): in the initial exploratory phase, AI was prompted to generate potential high-level structures and thematic headings for a literature review on optical sensing in facility vegetables. The authors critically evaluated all AI-suggested outlines, selecting and significantly modifying a logical framework that served as a preliminary scaffold; this scaffold was then entirely restructured, expanded, and refined based on the authors’ deep domain knowledge and the actual literature findings. Literature sifting and initial summarization (Phase 2): for a subset of the collected papers, AI was used to generate very brief initial summaries of abstracts or specific sections (e.g., methodology) to aid rapid triaging.
Crucially, every AI-generated summary was rigorously fact-checked, expanded upon, and critically integrated by the authors against the full text of the original source material; no summary was adopted without full author validation and synthesis. Language polishing and editing (Phase 3): in the final editing stage, AI was used as an advanced grammar and syntax checker to improve the clarity and fluency of sentences originally drafted by the authors, and to suggest alternative wordings for specific phrases. All suggestions were accepted, rejected, or edited at the sole discretion of the authors to ensure they accurately reflected the intended scientific meaning. The core academic contributions, including the formulation of the research scope, the critical analysis and interpretation of the reviewed literature, the identification of research gaps, the drawing of conclusions, and the crafting of all original arguments, are exclusively those of the authors. The AI tool functioned purely as an assistive instrument, analogous to an advanced search or editing tool; all prompts, inputs, and, most importantly, the final output were under the complete control and judgment of the human authors. The manuscript in its entirety represents the authors’ original scholarship and critical thinking.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Zuo, L. Analysis and Suggestions on High-quality Development of China’s Modern Facility Animal Husbandry. Agric. Outlook 2024, 20, 3–6. [Google Scholar]
  2. Zhu, B.; Zhang, Y.; Sun, Y.; Shi, Y.; Ma, Y.; Guo, Y. Quantitative estimation of organ-scale phenotypic parameters of field crops through 3D modeling using extremely low altitude UAV images. Comput. Electron. Agric. 2023, 210, 107910. [Google Scholar] [CrossRef]
  3. Xiao, Q.; Bai, X.; Zhang, C.; He, Y. Advanced high-throughput plant phenotyping techniques for genome-wide association studies: A review. J. Adv. Res. 2022, 35, 215–230. [Google Scholar] [CrossRef]
  4. Zhang, H.; Zhou, J.; Zheng, X.; Zhang, Z.; Wang, Z.; Tan, X. Characterization of a Desiccation Stress Induced Lipase Gene from Brassica napus L. J. Agric. Sci. Technol. 2016, 18, 1129–1141. [Google Scholar]
  5. Movahedi, A.; Aghaei-Dargiri, S.; Barati, B.; Kadkhodaei, S.; Wei, H.; Sangari, S.; Yang, L.; Xu, C. Plant immunity is regulated by biological, genetic, and epigenetic factors. Agronomy 2022, 12, 2790. [Google Scholar] [CrossRef]
  6. Du, J.; Fan, J.; Wang, C.; Lu, X.; Zhang, Y.; Wen, W.; Liao, S.; Yang, X.; Guo, X.; Zhao, C. Greenhouse-based vegetable high-throughput phenotyping platform and trait evaluation for large-scale lettuces. Comput. Electron. Agric. 2021, 186, 106193. [Google Scholar] [CrossRef]
  7. Shi, J.; Wang, Y.; Li, Z.; Huang, X.; Shen, T.; Zou, X. Simultaneous and nondestructive diagnostics of nitrogen/magnesium/potassium-deficient cucumber leaf based on chlorophyll density distribution features. Biosyst. Eng. 2021, 212, 458–467. [Google Scholar] [CrossRef]
  8. Zhu, W.; Li, L.; Zhou, Z.; Yang, X.; Hao, N.; Guo, Y.; Wang, K. A colorimetric biosensor for simultaneous ochratoxin A and aflatoxins B1 detection in agricultural products. Food Chem. 2020, 319, 126544. [Google Scholar] [CrossRef] [PubMed]
  9. Liu, F.; Yang, R.; Chen, R.; Guindo, M.L.; He, Y.; Zhou, J.; Lu, X.; Chen, M.; Yang, Y.; Kong, W. Digital techniques and trends for seed phenotyping using optical sensors. J. Adv. Res. 2024, 63, 1–16. [Google Scholar] [CrossRef] [PubMed]
  10. Zhao, G.; Cai, W.; Wang, Z.; Wu, H.; Peng, Y.; Cheng, L. Phenotypic parameters estimation of plants using deep learning-based 3-D reconstruction from single RGB image. IEEE Geosci. Remote Sens. Lett. 2022, 19, 1–5. [Google Scholar] [CrossRef]
  11. Zhang, L.; Song, X.; Niu, Y.; Zhang, H.; Wang, A.; Zhu, Y.; Zhu, X.; Chen, L.; Zhu, Q. Estimating winter wheat plant nitrogen content by combining spectral and texture features based on a low-cost UAV RGB system throughout the growing season. Agriculture 2024, 14, 456. [Google Scholar] [CrossRef]
  12. Zhou, X.; Sun, J.; Mao, H.; Wu, X.; Zhang, X.; Yang, N. Visualization research of moisture content in leaf lettuce leaves based on WT-PLSR and hyperspectral imaging technology. J. Food Process Eng. 2018, 41, e12647. [Google Scholar] [CrossRef]
  13. Vadivambal, R.; Jayas, D.S. Applications of thermal imaging in agriculture and food industry—A review. Food Bioprocess Technol. 2011, 4, 186–199. [Google Scholar] [CrossRef]
  14. Wen, T.; Li, J.-H.; Wang, Q.; Gao, Y.-Y.; Hao, G.-F.; Song, B.-A. Thermal imaging: The digital eye facilitates high-throughput phenotyping traits of plant growth and stress responses. Sci. Total Environ. 2023, 899, 165626. [Google Scholar] [CrossRef]
  15. Qiu, X.; Wu, M.; Mukai, K.; Shimasaki, Y.; Oshima, Y. Effects of elevated irradiance, temperature, and rapid shifts of salinity on the chlorophyll a fluorescence (OJIP) transient of Chattonella marina var. antiqua. J. Fac. Agric. Kyushu Univ. 2019, 64, 293–300. [Google Scholar] [CrossRef]
  16. Guo, Z.; Wu, X.; Jayan, H.; Yin, L.; Xue, S.; El-Seedi, H.R.; Zou, X. Recent developments and applications of surface enhanced Raman scattering spectroscopy in safety detection of fruits and vegetables. Food Chem. 2024, 434, 137469. [Google Scholar] [CrossRef]
  17. Roitsch, T.; Cabrera-Bosquet, L.; Fournier, A.; Ghamkhar, K.; Jiménez-Berni, J.; Pinto, F.; Ober, E.S. New sensors and data-driven approaches—A path to next generation phenomics. Plant Sci. 2019, 282, 2–10. [Google Scholar] [CrossRef] [PubMed]
  18. Du, Z.; Hu, Y.; Ali Buttar, N.; Mahmood, A. X-ray computed tomography for quality inspection of agricultural products: A review. Food Sci. Nutr. 2019, 7, 3146–3160. [Google Scholar] [CrossRef]
  19. Songtao, H.; Ruifang, Z.; Yinghua, W.; Zhi, L.; Jianzhong, Z.; He, R.; Wanneng, Y.; Peng, S. Extraction of potato plant phenotypic parameters based on multi-source data. Smart Agric. 2023, 5, 132. [Google Scholar]
  20. Wei, L.; Yang, H.; Niu, Y.; Zhang, Y.; Xu, L.; Chai, X. Wheat biomass, yield, and straw-grain ratio estimation from multi-temporal UAV-based RGB and multispectral images. Biosyst. Eng. 2023, 234, 187–205. [Google Scholar] [CrossRef]
  21. Sun, Y.; Luo, Y.; Zhang, Q.; Xu, L.; Wang, L.; Zhang, P. Estimation of crop height distribution for mature rice based on a moving surface and 3D point cloud elevation. Agronomy 2022, 12, 836. [Google Scholar] [CrossRef]
  22. Hong, F.; Qu, C.; Wang, L. Cerium improves growth of maize seedlings via alleviating morphological structure and oxidative damages of leaf under different stresses. J. Agric. Food Chem. 2017, 65, 9022–9030. [Google Scholar] [CrossRef]
  23. Garbouge, H.; Rasti, P.; Rousseau, D. Enhancing the tracking of seedling growth using RGB-Depth fusion and deep learning. Sensors 2021, 21, 8425. [Google Scholar] [CrossRef]
  24. Yamaguchi, N.; Fukuda, O.; Okumura, H.; Yeoh, W.L.; Tanaka, M. Estimating tomato plant leaf area in greenhouse environment using multiple RGB-D cameras. In Proceedings of the 2024 Twelfth International Symposium on Computing and Networking Workshops (CANDARW), Naha, Japan, 26–29 November 2024; pp. 179–182. [Google Scholar]
  25. Diago, M.P.; Sanz-Garcia, A.; Millan, B.; Blasco, J.; Tardaguila, J. Assessment of flower number per inflorescence in grapevine by image analysis under field conditions. J. Sci. Food Agric. 2014, 94, 1981–1987. [Google Scholar] [CrossRef]
  26. Bhattarai, U.; Karkee, M. A weakly-supervised approach for flower/fruit counting in apple orchards. Comput. Ind. 2022, 138, 103635. [Google Scholar] [CrossRef]
  27. Wu, S.; Liu, J.; Lei, X.; Zhao, S.; Lu, J.; Jiang, Y.; Xie, B.; Wang, M. Research progress on efficient pollination technology of crops. Agronomy 2022, 12, 2872. [Google Scholar] [CrossRef]
  28. Fukuda, M.; Okuno, T.; Yuki, S. Central object segmentation by deep learning to continuously monitor fruit growth through RGB images. Sensors 2021, 21, 6999. [Google Scholar] [CrossRef]
  29. Tang, S.; Xia, Z.; Gu, J.; Wang, W.; Huang, Z.; Zhang, W. High-precision apple recognition and localization method based on RGB-D and improved SOLOv2 instance segmentation. Front. Sustain. Food Syst. 2024, 8, 1403872. [Google Scholar] [CrossRef]
  30. Gu, W.; Wen, W.; Wu, S.; Zheng, C.; Lu, X.; Chang, W.; Xiao, P.; Guo, X. 3D reconstruction of wheat plants by integrating point cloud data and virtual design optimization. Agriculture 2024, 14, 391. [Google Scholar] [CrossRef]
  31. Liang, X.; Yu, W.; Qin, L.; Wang, J.; Jia, P.; Liu, Q.; Lei, X.; Yang, M. Stem and Leaf Segmentation and Phenotypic Parameter Extraction of Tomato Seedlings Based on 3D Point. Agronomy 2025, 15, 120. [Google Scholar] [CrossRef]
  32. Sun, G.; Ding, Y.; Wang, X.; Lu, W.; Sun, Y.; Yu, H. Nondestructive determination of nitrogen, phosphorus and potassium contents in greenhouse tomato plants based on multispectral three-dimensional imaging. Sensors 2019, 19, 5295. [Google Scholar] [CrossRef]
  33. Yuan, X.; Liu, J.; Wang, H.; Zhang, Y.; Tian, R.; Fan, X. Prediction of Useful Eggplant Seedling Transplants Using Multi-View Images. Agronomy 2024, 14, 2016. [Google Scholar] [CrossRef]
  34. Wu, Q.; Wu, J.; Hu, P.; Zhang, W.; Ma, Y.; Yu, K.; Guo, Y.; Cao, J.; Li, H.; Li, B. Quantification of the three-dimensional root system architecture using an automated rotating imaging system. Plant Methods 2023, 19, 11. [Google Scholar] [CrossRef] [PubMed]
  35. Sun, B.; Zain, M.; Zhang, L.; Han, D.; Sun, C. Stem-Leaf Segmentation and Morphological Traits Extraction in Rapeseed Seedlings Using a Three-Dimensional Point Cloud. Agronomy 2025, 15, 276. [Google Scholar] [CrossRef]
  36. Wang, J.; Zhang, Y.; Gu, R. Research status and prospects on plant canopy structure measurement using visual sensors based on three-dimensional reconstruction. Agriculture 2020, 10, 462. [Google Scholar] [CrossRef]
  37. Wu, W.; Hu, Y.; Lu, Y. Parametric Surface Modelling for tea leaf point cloud based on Non-uniform rational basis spline technique. Sensors 2021, 21, 1304. [Google Scholar] [CrossRef]
  38. Kochi, N.; Tanabata, T.; Hayashi, A.; Isobe, S. A 3D shape-measuring system for assessing strawberry fruits. Int. J. Autom. Technol. 2018, 12, 395–404. [Google Scholar] [CrossRef]
  39. Silva, J.M.; Jacinto, A.C.P.; Ribeiro, A.L.A.; Damascena, I.R.; Ballador, L.M.; Lacerra, P.H.; Vargas, P.F.; Martins, G.D.; Castoldi, R. Phenotyping in Green Lettuce Populations Through Multispectral Imaging. Agriculture 2025, 15, 1295. [Google Scholar] [CrossRef]
  40. Wang, J.; Zhao, Y. Lensless multispectral camera based on a coded aperture array. Sensors 2021, 21, 7757. [Google Scholar] [CrossRef]
  41. Sun, J.; Lu, X.; Mao, H.; Jin, X.; Wu, X. A method for rapid identification of rice origin by hyperspectral imaging technology. J. Food Process Eng. 2017, 40, e12297. [Google Scholar] [CrossRef]
  42. Lee, H.; Kim, M.S.; Jeong, D.; Chao, K.; Cho, B.-K.; Delwiche, S.R. Hyperspectral near-infrared reflectance imaging for detection of defect tomatoes. In Proceedings of the SPIE 8027, Sensing for Agriculture and Food Quality and Safety III, 80270J, Orlando, FL, USA, 2 June 2011. [Google Scholar]
  43. Sun, G.; Hu, T.; Chen, S.; Sun, J.; Zhang, J.; Ye, R.; Zhang, S.; Liu, J. Using UAV-based multispectral remote sensing imagery combined with DRIS method to diagnose leaf nitrogen nutrition status in a fertigated apple orchard. Precis. Agric. 2023, 24, 2522–2548. [Google Scholar] [CrossRef]
  44. Tian, Y.; Zhang, L. Study on the methods of detecting cucumber downy mildew using hyperspectral imaging technology. Phys. Procedia 2012, 33, 743–750. [Google Scholar] [CrossRef]
  45. Zhou, L.; Zhou, L.; Wu, H.; Jing, T.; Li, T.; Li, J.; Kong, L.; Chen, L. Estimation of Cadmium Content in Lactuca sativa L. Leaves Using Visible–Near-Infrared Spectroscopy Technology. Agronomy 2024, 14, 644. [Google Scholar] [CrossRef]
  46. Jiang, J.; Cen, H.; Zhang, C.; Lyu, X.; Weng, H.; Xu, H.; He, Y. Nondestructive quality assessment of chili peppers using near-infrared hyperspectral imaging combined with multivariate analysis. Postharvest Biol. Technol. 2018, 146, 147–154. [Google Scholar] [CrossRef]
  47. Jiang, X.; Peng, M.; Liu, Y.; Ouyang, A. Influence of tomato storage period on the generalization of a near-infrared spectroscopy-based brix prediction mode. J. Food Compos. Anal. 2025, 146, 107829. [Google Scholar] [CrossRef]
  48. Pan, Q.; Lu, Y.; Hu, Y. Influences of Sprinkler Frost Protection on Air and Soil Temperature and Chlorophyll Fluorescence of Tea Plants in Tea Gardens. Agriculture 2024, 14, 2302. [Google Scholar] [CrossRef]
  49. Zhang, C.; Zhang, W.; Yan, H.; Ni, Y.; Akhlaq, M.; Zhou, J.; Xue, R. Effect of micro–spray on plant growth and chlorophyll fluorescence parameter of tomato under high temperature condition in a greenhouse. Sci. Hortic. 2022, 306, 111441. [Google Scholar] [CrossRef]
  50. Wang, Y.-x.; Sun, Y.; Li, Y.-h.; Sun, G.-x.; Wang, X.-c. Classification and monitoring of disease cucumber plants in greenhouse based on chlorophyll fluorescence imaging technology. J. Nanjing Agric. Univ. 2020, 43, 770–780. [Google Scholar]
  51. Kobori, H.; Tsuchikawa, S. Time-resolved principal component imaging analysis of chlorophyll fluorescence induction for monitoring leaf water stress. Appl. Spectrosc. 2013, 67, 594–599. [Google Scholar] [CrossRef]
  52. Chen, X.; Shi, D.; Zhang, H.; Pérez, J.A.S.; Yang, X.; Li, M. Early diagnosis of greenhouse cucumber downy mildew in seedling stage using chlorophyll fluorescence imaging technology. Biosyst. Eng. 2024, 242, 107–122. [Google Scholar] [CrossRef]
  53. Park, B.; Wi, S.; Chung, H.; Lee, H. Chlorophyll fluorescence imaging for environmental stress diagnosis in crops. Sensors 2024, 24, 1442. [Google Scholar] [CrossRef] [PubMed]
  54. Takayama, K.; Nishina, H.; Mizutani, K.; Arima, S.; Hatou, K.; Miyoshi, Y. Chlorophyll fluorescence imaging for health condition monitoring of tomato plants in greenhouse. In Proceedings of the International Symposium on High Technology for Greenhouse Systems: GreenSys2009, Quebec, QC, Canada, 14–19 June 2009; Volume 893, pp. 333–340. [Google Scholar]
  55. Awais, M.; Li, W.; Hussain, S.; Cheema, M.J.M.; Li, W.; Song, R.; Liu, C. Comparative evaluation of land surface temperature images from unmanned aerial vehicle and satellite observation for agricultural areas using in situ data. Agriculture 2022, 12, 184. [Google Scholar] [CrossRef]
  56. Zhang, S.; Yang, H.; Du, T. Evaluating the water status of greenhouse tomatoes using thermal infrared imaging. Trans. CSAE 2022, 38, 229–236. [Google Scholar]
  57. Yao, G. Phenotyping for the Real-time Detection of Plant Leaf Stress Using Infrared Thermal Imaging Sensors. In Proceedings of the 2024 3rd International Symposium on Sensor Technology and Control (ISSTC), Zhuhai, China, 25–27 October 2024; pp. 312–318. [Google Scholar]
  58. Gurupatham, S.K.; Ilksoy, E.; Jacob, N.; Van Der Horn, K.; Fahad, F. Fruit ripeness estimation for avocado using thermal imaging. In Proceedings of the ASME International Mechanical Engineering Congress and Exposition, Pittsburgh, PA, USA, 9–15 November 2018; p. V08BT10A051. [Google Scholar]
  59. Zhong, L.; Wang, X.; Zhang, X.; Shi, Y. Detection method of high temperature stress of tomato at seedling stage based on thermal infrared and RGB images. J. South China Agric. Univ. 2023, 44, 110–122. [Google Scholar]
  60. Zhu, J.; Agyekum, A.A.; Kutsanedzie, F.Y.; Li, H.; Chen, Q.; Ouyang, Q.; Jiang, H. Qualitative and quantitative analysis of chlorpyrifos residues in tea by surface-enhanced Raman spectroscopy (SERS) combined with chemometric models. LWT 2018, 97, 760–769. [Google Scholar] [CrossRef]
  61. Tahir, H.E.; Xiaobo, Z.; Zhihua, L.; Jiyong, S.; Zhai, X.; Wang, S.; Mariod, A.A. Rapid prediction of phenolic compounds and antioxidant activity of Sudanese honey using Raman and Fourier transform infrared (FT-IR) spectroscopy. Food Chem. 2017, 226, 202–211. [Google Scholar] [CrossRef]
  62. Hu, X.; Shi, J.; Zhang, F.; Zou, X.; Holmes, M.; Zhang, W.; Huang, X.; Cui, X.; Xue, J. Determination of retrogradation degree in starch by mid-infrared and Raman spectroscopy during storage. Food Anal. Methods 2017, 10, 3694–3705. [Google Scholar] [CrossRef]
  63. Ma, L.; Yang, X.; Xue, S.; Zhou, R.; Wang, C.; Guo, Z.; Wang, Y.; Cai, J. “Raman plus X” dual-modal spectroscopy technology for food analysis: A review. Compr. Rev. Food Sci. Food Saf. 2025, 24, e70102. [Google Scholar] [CrossRef] [PubMed]
  64. Juárez, I.D.; Steczkowski, M.X.; Chinnaiah, S.; Rodriguez, A.; Gadhave, K.R.; Kurouski, D. Using Raman spectroscopy for early detection of resistance-breaking strains of tomato spotted wilt orthotospovirus in tomatoes. Front. Plant Sci. 2024, 14, 1283399. [Google Scholar] [CrossRef] [PubMed]
  65. Lee, H.; Lim, H.-S.; Cho, B.-K. Classification of cucumber green mottle mosaic virus (CGMMV) infected watermelon seeds using Raman spectroscopy. In Proceedings of the Sensing for Agriculture and Food Quality and Safety VIII, Baltimore, MD, USA, 20–21 April 2016; pp. 50–55. [Google Scholar]
  66. Chylińska, M.; Szymańska-Chargot, M.; Zdunek, A. Imaging of polysaccharides in the tomato cell wall with Raman microspectroscopy. Plant Methods 2014, 10, 14. [Google Scholar] [CrossRef]
  67. Tong-tong, Z.; Xiao-lin, S.; Zhi-zhong, S.; He-huan, P.; Tong, S.; Dong, H. Current status and future perspective of spectroscopy and imaging technique applications in bruise detection of fruits and vegetables: A review. Spectrosc. Spectr. Anal. 2022, 42, 2657–2665. [Google Scholar]
  68. Zhang, X.; Wang, Y.; Zhou, Z.; Zhang, Y.; Wang, X. Detection method for tomato leaf mildew based on hyperspectral fusion terahertz technology. Foods 2023, 12, 535. [Google Scholar] [CrossRef]
  69. Huang, Y.; Li, Z.; Bian, Z.; Jin, H.; Zheng, G.; Hu, D.; Sun, Y.; Fan, C.; Xie, W.; Fang, H. Overview of deep learning and nondestructive detection technology for quality assessment of tomatoes. Foods 2025, 14, 286. [Google Scholar] [CrossRef]
  70. Xu-ting, Z.; Shu-juan, Z.; Bin, L.; Yin-kun, L. Study on moisture content of soybean canopy leaves under drought stress using terahertz technology. Spectrosc. Spectr. Anal. 2018, 38, 2350–2354. [Google Scholar]
  71. Long, Y.; Li, B.; Liu, H. Analysis of fluoroquinolones antibiotic residue in feed matrices using terahertz spectroscopy. Appl. Opt. 2018, 57, 544–550. [Google Scholar] [CrossRef]
  72. Li, Y.; Li, X.-q.; Yang, J.-y.; Sun, L.-j.; Chen, Y.-y.; Yu, L.; Wu, J.-z. Visualisation of Starch Distribution in Corn Seeds Based on Terahertz Time-Domain Spectral Reflection Imaging Technology. Spectrosc. Spectr. Anal. 2023, 43, 2722–2728. [Google Scholar] [CrossRef]
  73. Shen, Y.; Bin, L.; Zhao, C.; Li, G. Study on Terahertz Time-domain Spectral Signatures of Wheat from Different Years. J. Phys. Conf. Ser. 2021, 1871, 012001. [Google Scholar] [CrossRef]
  74. Wu, J.; Li, X.; Sun, L.; Liu, C.; Sun, X.; Yu, L. Advances in the application of terahertz time-domain spectroscopy and imaging technology in crop quality detection. Spectrosc. Spectr. Anal. 2022, 42, 358–367. [Google Scholar]
  75. Tong, Y.; Ding, L.; Han, K.; Zou, X.; Wang, S.; Wen, Z.; Ye, Y.; Ren, X. Detection of carbendazim in oranges with metal grating integrated microfluidic sensor in terahertz. Food Addit. Contam. Part A 2022, 39, 1555–1564. [Google Scholar] [CrossRef]
  76. Guan, C.; Fu, J.; Cui, Z.; Wang, S.; Gao, Q.; Yang, Y. Evaluation of the tribological and anti-adhesive properties of different materials coated rotary tillage blades. Soil Tillage Res. 2021, 209, 104933. [Google Scholar] [CrossRef]
  77. Schoeman, L.; Williams, P.; Du Plessis, A.; Manley, M. X-ray micro-computed tomography (μCT) for non-destructive characterisation of food microstructure. Trends Food Sci. Technol. 2016, 47, 10–24. [Google Scholar] [CrossRef]
  78. Nugraha, B.; Verboven, P.; Janssen, S.; Wang, Z.; Nicolaï, B.M. Non-destructive porosity mapping of fruit and vegetables using X-ray CT. Postharvest Biol. Technol. 2019, 150, 80–88. [Google Scholar] [CrossRef]
  79. Mathanker, S.K.; Weckler, P.R.; Bowser, T.J. X-ray applications in food and agriculture: A review. Trans. ASABE 2013, 56, 1227–1239. [Google Scholar] [CrossRef]
  80. Gadgile, D.; Lande, B.; Dhabde, A.; Kamble, S. X-ray imaging for detecting lack mould rot of sapota (Achras sapota) fruit. Plant Pathol. Quar. 2017, 7, 92–94. [Google Scholar] [CrossRef]
  81. Mao, H.; Liu, Y.; Han, L.; Sheng, B.; Ma, G.; Li, Y. X-ray computerized tomography for characterization of pick-up destruction and pick-up parameter optimization of tomato root lumps. Span. J. Agric. Res. 2019, 17, e0202. [Google Scholar] [CrossRef]
  82. Lin, M.; Fawole, O.A.; Saeys, W.; Wu, D.; Wang, J.; Opara, U.L.; Nicolai, B.; Chen, K. Mechanical damages and packaging methods along the fresh fruit supply chain: A review. Crit. Rev. Food Sci. Nutr. 2023, 63, 10283–10302. [Google Scholar] [CrossRef]
  83. Mei, M.; Li, J. An overview on optical non-destructive detection of bruises in fruit: Technology, method, application, challenge and trend. Comput. Electron. Agric. 2023, 213, 108195. [Google Scholar] [CrossRef]
  84. Liu, J.; Sun, J.; Wang, Y.; Liu, X.; Zhang, Y.; Fu, H. Non-Destructive Detection of Fruit Quality: Technologies, Applications and Prospects. Foods 2025, 14, 2137. [Google Scholar] [CrossRef]
  85. Li, M.; Landahl, S.; East, A.R.; Verboven, P.; Terry, L.A. Optical coherence tomography—A review of the opportunities and challenges for postharvest quality evaluation. Postharvest Biol. Technol. 2019, 150, 9–18. [Google Scholar] [CrossRef]
  86. Podoleanu, A.G. Optical coherence tomography. Br. J. Radiol. 2005, 78, 976–988. [Google Scholar] [CrossRef]
  87. Loeb, G.; Barton, J. Imaging botanical subjects with optical coherence tomography: A feasibility study. Trans. ASAE 2003, 46, 1751–1757. [Google Scholar] [CrossRef]
  88. Zhou, Y.; Mao, J.; Liu, T.; Zhou, W.; Chen, Z. Discriminating hidden bruises in loquat by attenuation coefficients estimated from optical coherence tomography images. Postharvest Biol. Technol. 2017, 130, 1–6. [Google Scholar] [CrossRef]
  89. Akter, S.; Castaneda-Méndez, O.; Beltrán, J. Synthetic reprogramming of plant developmental and biochemical pathways. Curr. Opin. Biotechnol. 2024, 87, 103139. [Google Scholar] [CrossRef]
  90. Zhang, H.; Wang, L.; Jin, X.; Bian, L.; Ge, Y. High-throughput phenotyping of plant leaf morphological, physiological, and biochemical traits on multiple scales using optical sensing. Crop J. 2023, 11, 1303–1318. [Google Scholar] [CrossRef]
  91. Liu, H.; Li, H.; Ning, H.; Zhang, X.; Li, S.; Pang, J.; Wang, G.; Sun, J. Optimizing irrigation frequency and amount to balance yield, fruit quality and water use efficiency of greenhouse tomato. Agric. Water Manag. 2019, 226, 105787. [Google Scholar] [CrossRef]
  92. Zhang, L.; Zhang, B.; Zhang, H.; Yang, W.; Hu, X.; Cai, J.; Wu, C.; Wang, X. Multi-Source Feature Fusion Network for LAI Estimation from UAV Multispectral Imagery. Agronomy 2025, 15, 988. [Google Scholar] [CrossRef]
  93. Vitalis, F.; Muncan, J.; Anantawittayanon, S.; Kovacs, Z.; Tsenkova, R. Aquaphotomics monitoring of lettuce freshness during cold storage. Foods 2023, 12, 258. [Google Scholar] [CrossRef]
  94. Gonzalles, G.; Geng, N.; Luo, S.; Zhang, C.; Wu, C.; Li, D.; Li, Y.; Song, J. Effects of different water activities on the stability of carotenoids in puff-dried yellow peach powder during storage. Qual. Assur. Saf. Crops Foods 2021, 13, 1–8. [Google Scholar] [CrossRef]
  95. Tian, X.; He, S.-L.; Lü, Q.; Yi, S.-L.; Xie, R.-J.; Zheng, Y.-Q.; Liao, Q.-H.; Deng, L. Determination of photosynthetic pigments in citrus leaves based on hyperspectral images datas. Spectrosc. Spectr. Anal. 2014, 34, 2506–2512. [Google Scholar]
  96. Zhao, Y.-R.; Li, X.; Yu, K.-Q.; Cheng, F.; He, Y. Hyperspectral imaging for determining pigment contents in cucumber leaves in response to angular leaf spot disease. Sci. Rep. 2016, 6, 27790. [Google Scholar] [CrossRef]
  97. Zhang, L.; Wang, X.; Zhang, H.; Zhang, B.; Zhang, J.; Hu, X.; Du, X.; Cai, J.; Jia, W.; Wu, C. UAV-Based Multispectral Winter Wheat Growth Monitoring with Adaptive Weight Allocation. Agriculture 2024, 14, 1900. [Google Scholar] [CrossRef]
  98. Kergoat, L.; Lafont, S.; Arneth, A.; Le Dantec, V.; Saugier, B. Nitrogen controls plant canopy light-use efficiency in temperate and boreal ecosystems. J. Geophys. Res. Biogeosci. 2008, 113, G04017. [Google Scholar] [CrossRef]
  99. Niu, Y.; Xu, L.; Zhang, Y.; Xu, L.; Zhu, Q.; Wang, A.; Huang, S.; Zhang, L. Enhancing the Performance of Unmanned Aerial Vehicle-Based Estimation of Rape Chlorophyll Content by Reducing the Impact of Crop Coverage. Drones 2024, 8, 578. [Google Scholar] [CrossRef]
  100. Zuo, Z.; Mu, J.; Li, W.; Bu, Q.; Mao, H.; Zhang, X.; Han, L.; Ni, J. Study on the detection of water status of tomato (Solanum lycopersicum L.) by multimodal deep learning. Front. Plant Sci. 2023, 14, 1094142. [Google Scholar] [CrossRef] [PubMed]
  101. Yang, X.; Zhang, J.; Guo, D.; Xiong, X.; Chang, L.; Niu, Q.; Huang, D. Measuring and evaluating anthocyanin in lettuce leaf based on color information. IFAC-Pap. 2016, 49, 96–99. [Google Scholar] [CrossRef]
  102. Elmetwalli, A.H.; Derbala, A.; Alsudays, I.M.; Al-Shahari, E.A.; Elhosary, M.; Elsayed, S.; Al-Shuraym, L.A.; Moghanm, F.S.; Elsherbiny, O. Machine learning-driven assessment of biochemical qualities in tomato and mandarin using RGB and hyperspectral sensors as nondestructive technologies. PLoS ONE 2024, 19, e0308826. [Google Scholar] [CrossRef]
  103. Taha, M.F.; Mao, H.; Wang, Y.; ElManawy, A.I.; Elmasry, G.; Wu, L.; Memon, M.S.; Niu, Z.; Huang, T.; Qiu, Z. High-throughput analysis of leaf chlorophyll content in aquaponically grown lettuce using hyperspectral reflectance and RGB images. Plants 2024, 13, 392. [Google Scholar] [CrossRef] [PubMed]
  104. Li, Y.; Xu, X.; Wu, W.; Zhu, Y.; Yang, G.; Gao, L.; Meng, Y.; Jiang, X.; Xue, H. Hyperspectral Estimation of Leaf Nitrogen Content in White Radish Based on Feature Selection and Integrated Learning. Remote Sens. 2024, 16, 4479. [Google Scholar] [CrossRef]
  105. Xiaobo, Z.; Jiewen, Z.; Holmes, M.; Hanpin, M.; Jiyong, S.; Xiaopin, Y.; Yanxiao, L. Independent component analysis in information extraction from visible/near-infrared hyperspectral imaging data of cucumber leaves. Chemom. Intell. Lab. Syst. 2010, 104, 265–270. [Google Scholar] [CrossRef]
  106. Kim, H.-s.; Yoo, J.H.; Park, S.H.; Kim, J.-S.; Chung, Y.; Kim, J.H.; Kim, H.S. Measurement of environmentally influenced variations in anthocyanin accumulations in Brassica rapa subsp. Chinensis (Bok Choy) using hyperspectral imaging. Front. Plant Sci. 2021, 12, 693854. [Google Scholar] [CrossRef]
  107. Lü, M.; Song, Y.; Weng, H.; Sun, D.; Dong, X.; Fang, H. Effect of near infrared hyperspectral imaging scanning speed on prediction of water content in Arabidopsis. Spectrosc. Spectr. Anal. 2020, 40, 3508–3514. [Google Scholar]
  108. Ni, J.; Xue, Y.; Liao, J. Dynamic changes and spectrometric quantitative analysis of antioxidant enzyme activity of TYLCV-infected tomato plants. Comput. Electron. Agric. 2025, 232, 110109. [Google Scholar] [CrossRef]
  109. Zhang, J.; Zhang, D.; Cai, Z.; Wang, L.; Wang, J.; Sun, L.; Fan, X.; Shen, S.; Zhao, J. Spectral technology and multispectral imaging for estimating the photosynthetic pigments and SPAD of the Chinese cabbage based on machine learning. Comput. Electron. Agric. 2022, 195, 106814. [Google Scholar] [CrossRef]
  110. Zhang, Y.; Wang, X.; Wang, Y.; Hu, L.; Wang, P. Detection of tomato water stress based on terahertz spectroscopy. Front. Plant Sci. 2023, 14, 1095434. [Google Scholar] [CrossRef]
  111. Zhang, X.; Wang, P.; Mao, H.; Gao, H.; Li, Q. Detection of the nutritional status of phosphorus in lettuce using THz time-domain spectroscopy. Eng. Agrícola 2021, 41, 599–608. [Google Scholar] [CrossRef]
  112. Hara, R.; Ishigaki, M.; Ozaki, Y.; Ahamed, T.; Noguchi, R.; Miyamoto, A.; Genkawa, T. Effect of Raman exposure time on the quantitative and discriminant analyses of carotenoid concentrations in intact tomatoes. Food Chem. 2021, 360, 129896. [Google Scholar] [CrossRef]
  113. Radu, A.I.; Ryabchykov, O.; Bocklitz, T.W.; Huebner, U.; Weber, K.; Cialla-May, D.; Popp, J. Toward food analytics: Fast estimation of lycopene and β-carotene content in tomatoes based on surface enhanced Raman spectroscopy (SERS). Analyst 2016, 141, 4447–4455. [Google Scholar] [CrossRef] [PubMed]
  114. Azeem, A.; Javed, Q.; Sun, J.; Nawaz, M.I.; Ullah, I.; Kama, R.; Du, D. Functional traits of okra cultivars (Chinese green and Chinese red) under salt stress. Folia Hortic. 2020, 32, 159–170. [Google Scholar] [CrossRef]
  115. Gu, S.; Qian, R.; Xing, G.; Zhu, L.; Wang, X. Monitoring of Alternaria alternata infection in postharvest green pepper fruit during storage by multicolor fluorescence and chlorophyll fluorescence. Postharvest Biol. Technol. 2025, 219, 113246. [Google Scholar] [CrossRef]
  116. Yang, H.; Kim, Y.; Bae, Y.; Hyeon, S.; Choi, M.; Jang, D. Investigating growth, root development, and chlorophyll fluorescence of tomato scions and rootstocks under UV-B stress in a plant factory with artificial lighting. Sci. Hortic. 2025, 347, 114191. [Google Scholar] [CrossRef]
  117. Dong, Z.; Men, Y.; Liu, Z.; Li, J.; Ji, J. Application of chlorophyll fluorescence imaging technique in analysis and detection of chilling injury of tomato seedlings. Comput. Electron. Agric. 2020, 168, 105109. [Google Scholar] [CrossRef]
  118. Zushi, K.; Matsuzoe, N. Using of chlorophyll a fluorescence OJIP transients for sensing salt stress in the leaves and fruits of tomato. Sci. Hortic. 2017, 219, 216–221. [Google Scholar] [CrossRef]
  119. Zushi, K.; Kajiwara, S.; Matsuzoe, N. Chlorophyll a fluorescence OJIP transient as a tool to characterize and evaluate response to heat and chilling stress in tomato leaf and fruit. Sci. Hortic. 2012, 148, 39–46. [Google Scholar] [CrossRef]
  120. Hang, T.; Lu, N.; Takagaki, M.; Mao, H. Leaf area model based on thermal effectiveness and photosynthetically active radiation in lettuce grown in mini-plant factories under different light cycles. Sci. Hortic. 2019, 252, 113–120. [Google Scholar] [CrossRef]
  121. Li, M.; Wang, Z.; Cheng, T.; Zhu, Y.; Li, J.; Quan, L.; Song, Y. Phenotyping terminal heat stress tolerance in wheat using UAV multispectral imagery to monitor leaf stay-greenness. Smart Agric. Technol. 2025, 11, 100996. [Google Scholar] [CrossRef]
  122. Oerke, E.-C.; Steiner, U.; Dehne, H.-W.; Lindenthal, M. Thermal imaging of cucumber leaves affected by downy mildew and environmental conditions. J. Exp. Bot. 2006, 57, 2121–2132. [Google Scholar] [CrossRef]
  123. Zhu, W.; Feng, Z.; Dai, S.; Zhang, P.; Wei, X. Using UAV multispectral remote sensing with appropriate spatial resolution and machine learning to monitor wheat scab. Agriculture 2022, 12, 1785. [Google Scholar] [CrossRef]
  124. Gonzalez-Huitron, V.; León-Borges, J.A.; Rodriguez-Mata, A.; Amabilis-Sosa, L.E.; Ramírez-Pereda, B.; Rodriguez, H. Disease detection in tomato leaves via CNN with lightweight architectures implemented in Raspberry Pi 4. Comput. Electron. Agric. 2021, 181, 105951. [Google Scholar] [CrossRef]
  125. Liu, Y.; Ban, S.; Wei, S.; Li, L.; Tian, M.; Hu, D.; Liu, W.; Yuan, T. Estimating the frost damage index in lettuce using UAV-based RGB and multispectral images. Front. Plant Sci. 2024, 14, 1242948. [Google Scholar] [CrossRef]
  126. Zhang, Y.; Zhang, D.; Zhang, Y.; Cheng, F.; Zhao, X.; Wang, M.; Fan, X. Early detection of verticillium wilt in eggplant leaves by fusing five image channels: A deep learning approach. Plant Methods 2024, 20, 173. [Google Scholar] [CrossRef]
  127. Zhang, J.; Ai, Y.; Liang, H.; Zhang, D.; Liu, Y.; Li, L.; Qi, S.; Ma, H.; Zhao, S.; Xue, J. A Salt-Tolerance evaluation system for Chinese cabbage using multispectral image data fusion and Fine-Tuned, pruned convolutional-LSTM-ResNet networks. Comput. Electron. Agric. 2025, 231, 110005. [Google Scholar] [CrossRef]
  128. Yang, L.; Cui-ling, L.; Xiu, W.; Peng-fei, F.; Yu-kang, L.; Chang-yuan, Z. Identification of cucumber disease and insect pest based on hyperspectral imaging. Spectrosc. Spectr. Anal. 2024, 44, 301–309. [Google Scholar]
  129. Wan-jie, L.; Hui, F.; Dong, J.; Wen-yu, Z.; Jing, C.; Hong-xin, C. Early recognition of Sclerotinia stem rot on oilseed rape by hyperspectral imaging combined with deep learning. Spectrosc. Spectr. Anal. 2023, 43, 2220–2225. [Google Scholar]
  130. Zhou, X.; Sun, J.; Tian, Y.; Wu, X.; Dai, C.; Li, B. Spectral classification of lettuce cadmium stress based on information fusion and VISSA-GOA-SVM algorithm. J. Food Process Eng. 2019, 42, e13085. [Google Scholar] [CrossRef]
  131. Zhu, W.-J.; Mao, H.-P.; Li, Q.-L.; Liu, H.-Y.; Sun, J.; Zuo, Z.-Y.; Chen, Y. Study on the polarized reflectance-hyperspectral information fusion technology of tomato leaves nutrient diagnoses. Spectrosc. Spectr. Anal. 2014, 34, 2500–2505. [Google Scholar]
  132. Zhou, X.; Zhao, C.; Sun, J.; Yao, K.; Xu, M. Detection of lead content in oilseed rape leaves and roots based on deep transfer learning and hyperspectral imaging technology. Spectrochim. Acta Part A Mol. Biomol. Spectrosc. 2023, 290, 122288. [Google Scholar] [CrossRef]
  133. Wen-jing, Z.; Lin, L.; Mei-Qing, L.; Ji-zhan, L.; Xin-hua, W. Rapid detection of tomato mosaic disease in incubation period by infrared thermal imaging and near infrared spectroscopy. Spectrosc. Spectr. Anal. 2018, 38, 2757–2762. [Google Scholar]
  134. Takács, S.; Pék, Z.; Bíró, T.; Szuvandzsiev, P.; Palotás, G.; Czinkoczki, E.; Helyes, L. Detection and simulation of water stress in processing tomato. In Proceedings of the XVI International Symposium on Processing Tomato, San Juan, Argentina, 21 March–1 April 2022; Volume 1351, pp. 39–46. [Google Scholar]
  135. Zhou, X.; Zhao, C.; Sun, J.; Cao, Y.; Fu, L. Classification of heavy metal Cd stress in lettuce leaves based on WPCA algorithm and fluorescence hyperspectral technology. Infrared Phys. Technol. 2021, 119, 103936. [Google Scholar] [CrossRef]
  136. Zhou, X.; Zhao, C.; Sun, J.; Yao, K.; Xu, M.; Cheng, J. Nondestructive testing and visualization of compound heavy metals in lettuce leaves using fluorescence hyperspectral imaging. Spectrochim. Acta Part A Mol. Biomol. Spectrosc. 2023, 291, 122337. [Google Scholar] [CrossRef] [PubMed]
  137. Hsiao, S.-C.; Chen, S.; Yang, I.-C.; Chen, C.-T.; Tsai, C.-Y.; Chuang, Y.-K.; Wang, F.-J.; Chen, Y.-L.; Lin, T.-S.; Lo, Y.M. Evaluation of plant seedling water stress using dynamic fluorescence index with blue LED-based fluorescence imaging. Comput. Electron. Agric. 2010, 72, 127–133. [Google Scholar] [CrossRef]
  138. Gang, M.-S.; Kim, H.-J.; Kim, D.-W. Estimation of greenhouse lettuce growth indices based on a two-stage CNN using RGB-D images. Sensors 2022, 22, 5499. [Google Scholar] [CrossRef] [PubMed]
  139. Won, H.S.; Lee, E.; Lee, S.; Nam, J.-H.; Jung, J.; Cho, Y.; Evert, T.; Kan, N.; Kim, S.; Kim, D.S. Image analysis using smartphones: Relationship between leaf color and fresh weight of lettuce under different nutritional treatments. Front. Plant Sci. 2025, 16, 1589825. [Google Scholar] [CrossRef]
  140. Ohashi, Y.; Ishigami, Y.; Goto, E. Monitoring the growth and yield of fruit vegetables in a greenhouse using a three-dimensional scanner. Sensors 2020, 20, 5270. [Google Scholar] [CrossRef]
  141. Zhang, L.; Wang, A.; Zhang, H.; Zhu, Q.; Zhang, H.; Sun, W.; Niu, Y. Estimating leaf chlorophyll content of winter wheat from UAV multispectral images using machine learning algorithms under different species, growth stages, and nitrogen stress conditions. Agriculture 2024, 14, 1064. [Google Scholar] [CrossRef]
  142. Lu, B.; Sun, J.; Yang, N.; Wu, X.; Zhou, X.; Shen, J. Quantitative detection of moisture content in rice seeds based on hyperspectral technique. J. Food Process Eng. 2018, 41, e12916. [Google Scholar] [CrossRef]
  143. Sabzi, S.; Pourdarbani, R.; Rohban, M.H.; García-Mateos, G.; Arribas, J.I. Estimation of nitrogen content in cucumber plant (Cucumis sativus L.) leaves using hyperspectral imaging data with neural network and partial least squares regressions. Chemom. Intell. Lab. Syst. 2021, 217, 104404. [Google Scholar] [CrossRef]
  144. Zhang, Y.; Sun, J.; Li, J.; Wu, X.; Dai, C. Quantitative analysis of cadmium content in tomato leaves based on hyperspectral image and feature selection. Appl. Eng. Agric. 2018, 34, 789–798. [Google Scholar] [CrossRef]
  145. Jun, S.; Xin, Z.; Hanping, M.; Xiaohong, W.; Xiaodong, Z.; Hongyan, G. Identification of pesticide residue level in lettuce based on hyperspectra and chlorophyll fluorescence spectra. Int. J. Agric. Biol. Eng. 2016, 9, 231–239. [Google Scholar]
  146. Moustaka, J.; Moustakas, M. Early-stage detection of biotic and abiotic stress on plants by chlorophyll fluorescence imaging analysis. Biosensors 2023, 13, 796. [Google Scholar] [CrossRef]
  147. Akhlaq, M.; Miao, H.; Zhang, C.; Yan, H.; Run, X.; Chauhdary, J.N.; Rehman, M.M.u.; Li, J.; Ren, J. Resilience assessment of tomato crop chlorophyll fluorescence against water stress under elevated CO2 and protective cultivation. Irrig. Drain. 2025, 74, 1234–1252. [Google Scholar] [CrossRef]
  148. Rodríguez-Moreno, L.; Pineda, M.; Soukupová, J.; Macho, A.P.; Beuzón, C.R.; Barón, M.; Ramos, C. Early detection of bean infection by Pseudomonas syringae in asymptomatic leaf areas using chlorophyll fluorescence imaging. Photosynth. Res. 2008, 96, 27–35. [Google Scholar] [CrossRef]
  149. Wang, A.; Xu, Y.; Hu, D.; Zhang, L.; Li, A.; Zhu, Q.; Liu, J. Tomato Yield Estimation Using an Improved Lightweight YOLO11n Network and an Optimized Region Tracking-Counting Method. Agriculture 2025, 15, 1353. [Google Scholar] [CrossRef]
  150. Fernando, H.; Ha, T.; Attanayake, A.; Benaragama, D.; Nketia, K.A.; Kanmi-Obembe, O.; Shirtliffe, S.J. High-resolution flowering index for canola yield modelling. Remote Sens. 2022, 14, 4464. [Google Scholar] [CrossRef]
  151. Andujar, D.; Ribeiro, A.; Fernández-Quintanilla, C.; Dorado, J. Using depth cameras to extract structural parameters to assess the growth state and yield of cauliflower crops. Comput. Electron. Agric. 2016, 122, 67–73. [Google Scholar] [CrossRef]
  152. Mortensen, A.K.; Bender, A.; Whelan, B.; Barbour, M.M.; Sukkarieh, S.; Karstoft, H.; Gislum, R. Segmentation of lettuce in coloured 3D point clouds for fresh weight estimation. Comput. Electron. Agric. 2018, 154, 373–381. [Google Scholar] [CrossRef]
  153. Zhang, C.; Akhlaq, M.; Yan, H.; Ni, Y.; Liang, S.; Zhou, J.; Xue, R.; Li, M.; Adnan, R.M.; Li, J. Chlorophyll fluorescence parameter as a predictor of tomato growth and yield under CO2 enrichment in protective cultivation. Agric. Water Manag. 2023, 284, 108333. [Google Scholar] [CrossRef]
  154. Zhang, Y.; Niu, Y.; Cui, Z.; Chai, X.; Xu, L. Cross-Year Rapeseed Yield Prediction for Harvesting Management Using UAV-Based Imagery. Remote Sens. 2025, 17, 2010. [Google Scholar] [CrossRef]
  155. Elsayed, S.; El-Hendawy, S.; Elsherbiny, O.; Okasha, A.M.; Elmetwalli, A.H.; Elwakeel, A.E.; Memon, M.S.; Ibrahim, M.E.; Ibrahim, H.H. Estimating chlorophyll content, production, and quality of sugar beet under various nitrogen levels using machine learning models and novel spectral indices. Agronomy 2023, 13, 2743. [Google Scholar] [CrossRef]
  156. Li, Y.; Hu, X.; Zhang, F.; Shi, J.; Qiu, B. Colony image segmentation and counting based on hyperspectral technology. Trans. Chin. Soc. Agric. Eng. 2020, 36, 326–332. [Google Scholar]
  157. Ni, J.; Xue, Y.; Zhou, Y.; Miao, M. Rapid identification of greenhouse tomato senescent leaves based on the sucrose-spectral quantitative prediction model. Biosyst. Eng. 2024, 238, 200–211. [Google Scholar] [CrossRef]
  158. Huang, Y.; Xiong, J.; Li, Z.; Hu, D.; Sun, Y.; Jin, H.; Zhang, H.; Fang, H. Recent advances in light penetration depth for postharvest quality evaluation of fruits and vegetables. Foods 2024, 13, 2688. [Google Scholar] [CrossRef]
  159. Yu, K.; Zhong, M.; Zhu, W.; Rashid, A.; Han, R.; Virk, M.S.; Duan, K.; Zhao, Y.; Ren, X. Advances in computer vision and spectroscopy techniques for non-destructive quality assessment of citrus fruits: A comprehensive review. Foods 2025, 14, 386. [Google Scholar] [CrossRef]
  160. Zhang, B.; Huang, W.; Li, J.; Zhao, C.; Fan, S.; Wu, J.; Liu, C. Principles, developments and applications of computer vision for external quality inspection of fruits and vegetables: A review. Food Res. Int. 2014, 62, 326–343. [Google Scholar] [CrossRef]
  161. Maraveas, C.; Kalitsios, G.; Kotzabasaki, M.I.; Giannopoulos, D.V.; Dimitropoulos, K.; Vatsanidou, A. Real-time freshness prediction for Apples and Lettuces using imaging recognition and advanced algorithms in a user-friendly mobile application. Smart Agric. Technol. 2025, 12, 101129. [Google Scholar] [CrossRef]
  162. Guan, X.; Shi, L.; Yang, W.; Ge, H.; Wei, X.; Ding, Y. Multi-Feature Fusion Recognition and Localization Method for Unmanned Harvesting of Aquatic Vegetables. Agriculture 2024, 14, 971. [Google Scholar] [CrossRef]
  163. Barnes, M.; Duckett, T.; Cielniak, G.; Stroud, G.; Harper, G. Visual detection of blemishes in potatoes using minimalist boosted classifiers. J. Food Eng. 2010, 98, 339–346. [Google Scholar] [CrossRef]
  164. Louro, A.H.F.; Mendonça, M.M.; Gonzaga, A. Classificação de tomates utilizando redes neurais artificiais [Classification of tomatoes using artificial neural networks]. In Proceedings of the II Workshop de Visão Computacional, São Carlos, SP, Brazil, 16–18 October 2006. [Google Scholar]
  165. López Camelo, A.F.; Gómez, P.A. Comparison of color indexes for tomato ripening. Hortic. Bras. 2004, 22, 534–537. [Google Scholar] [CrossRef]
  166. Hara, R.; Ishigaki, M.; Kitahama, Y.; Ozaki, Y.; Genkawa, T. Excitation wavelength selection for quantitative analysis of carotenoids in tomatoes using Raman spectroscopy. Food Chem. 2018, 258, 308–313. [Google Scholar] [CrossRef] [PubMed]
  167. Matteini, P.; Distefano, C.; de Angelis, M.; Agati, G. Assessment of nitrate levels in greenhouse-grown spinaches by Raman spectroscopy: A tool for sustainable agriculture and food security. J. Agric. Food Res. 2025, 21, 101839. [Google Scholar] [CrossRef]
  168. Hahn, F. AE—Automation and Emerging Technologies: Multi-spectral prediction of unripe tomatoes. Biosyst. Eng. 2002, 81, 147–155. [Google Scholar] [CrossRef]
  169. Faqeerzada, M.A.; Kim, Y.-N.; Kim, H.; Akter, T.; Kim, H.; Park, M.-S.; Kim, M.S.; Baek, I.; Cho, B.-K. Hyperspectral imaging system for pre-and post-harvest defect detection in paprika fruit. Postharvest Biol. Technol. 2024, 218, 113151. [Google Scholar] [CrossRef]
  170. Rizzolo, A.; Vanoli, M.; Cortellino, G.; Spinelli, L.; Contini, D.; Herremans, E.; Bongaers, E.; Nemeth, A.; Leitner, M.; Verboven, P. Characterizing the tissue of apple air-dried and osmo-air-dried rings by X-CT and OCT and relationship with ring crispness and fruit maturity at harvest measured by TRS. Innov. Food Sci. Emerg. Technol. 2014, 24, 121–130. [Google Scholar] [CrossRef]
  171. Wijesinghe, R.E.; Lee, S.-Y.; Ravichandran, N.K.; Shirazi, M.F.; Moon, B.; Jung, H.-Y.; Jeon, M.; Kim, J. Bio-photonic detection method for morphological analysis of anthracnose disease and physiological disorders of Diospyros kaki. Opt. Rev. 2017, 24, 199–205. [Google Scholar] [CrossRef]
  172. Guo, J.; Yang, Y.; Lin, X.; Memon, M.S.; Liu, W.; Zhang, M.; Sun, E. Revolutionizing agriculture: Real-time ripe tomato detection with the enhanced tomato-yolov7 system. IEEE Access 2023, 11, 133086–133098. [Google Scholar] [CrossRef]
  173. Huang, X.y.; Pan, S.h.; Sun, Z.y.; Ye, W.t.; Aheto, J.H. Evaluating quality of tomato during storage using fusion information of computer vision and electronic nose. J. Food Process Eng. 2018, 41, e12832. [Google Scholar] [CrossRef]
  174. Huang, X.; Yu, S.; Xu, H.; Aheto, J.H.; Bonah, E.; Ma, M.; Wu, M.; Zhang, X. Rapid and nondestructive detection of freshness quality of postharvest spinaches based on machine vision and electronic nose. J. Food Saf. 2019, 39, e12708. [Google Scholar] [CrossRef]
  175. Ochiai, S.; Ishii, T.; Kamada, E. Combining Machine Learning with Images from RGB and Multispectral Cameras to Estimate Fresh Weight and Height of Spinach for Processing Use. Hortic. J. 2025, 94, 222–231. [Google Scholar] [CrossRef]
  176. Lunadei, L.; Diezma, B.; Lleó, L.; Ruiz-Garcia, L.; Cantalapiedra, S.; Ruiz-Altisent, M. Monitoring of fresh-cut spinach leaves through a multispectral vision system. Postharvest Biol. Technol. 2012, 63, 74–84. [Google Scholar] [CrossRef]
  177. Du, D.; Ye, Y.; Li, D.; Fan, J.; Scharff, R.B.; Wang, J.; Shan, F. Nondestructive evaluation of harvested cabbage texture quality using 3D scanning technology. J. Food Eng. 2024, 378, 112123. [Google Scholar] [CrossRef]
  178. Gu, J.; Zhang, Y.; Yin, Y.; Wang, R.; Deng, J.; Zhang, B. Surface defect detection of cabbage based on curvature features of 3D point cloud. Front. Plant Sci. 2022, 13, 942040. [Google Scholar] [CrossRef]
  179. Muñoz-Postigo, J.; Valero, E.; Martínez-Domingo, M.; Lara, F.; Nieves, J.; Romero, J.; Hernández-Andrés, J. Band selection pipeline for maturity stage classification in bell peppers: From full spectrum to simulated camera data. J. Food Eng. 2024, 365, 111824. [Google Scholar] [CrossRef]
  180. Eshkabilov, S.; Simko, I. Assessing Contents of Sugars, Vitamins, and Nutrients in Baby Leaf Lettuce from Hyperspectral Data with Machine Learning Models. Agriculture 2024, 14, 834. [Google Scholar] [CrossRef]
  181. Zhu, W.; Li, J.; Li, L.; Wang, A.; Wei, X.; Mao, H. Nondestructive diagnostics of soluble sugar, total nitrogen and their ratio of tomato leaves in greenhouse by polarized spectra–hyperspectral data fusion. Int. J. Agric. Biol. Eng. 2020, 13, 189–197. [Google Scholar] [CrossRef]
  182. Fu, L.; Sun, J.; Wang, S.; Xu, M.; Yao, K.; Zhou, X. Nondestructive evaluation of Zn content in rape leaves using MSSAE and hyperspectral imaging. Spectrochim. Acta Part A Mol. Biomol. Spectrosc. 2022, 281, 121641. [Google Scholar] [CrossRef]
  183. Shi, L.; Sun, J.; Cong, S.; Ji, X.; Yao, K.; Zhang, B.; Zhou, X. Fluorescence hyperspectral imaging for detection of selenium content in lettuce leaves under cadmium-free and cadmium environments. Food Chem. 2025, 481, 144055. [Google Scholar] [CrossRef] [PubMed]
  184. Huang, S.; Yan, W.; Liu, M.; Hu, J. Detection of difenoconazole pesticides in pak choi by surface-enhanced Raman scattering spectroscopy coupled with gold nanoparticles. Anal. Methods 2016, 8, 4755–4761. [Google Scholar] [CrossRef]
  185. Trebolazabala, J.; Maguregui, M.; Morillas, H.; de Diego, A.; Madariaga, J.M. Portable Raman spectroscopy for an in-situ monitoring the ripening of tomato (Solanum lycopersicum) fruits. Spectrochim. Acta Part A Mol. Biomol. Spectrosc. 2017, 180, 138–143. [Google Scholar] [CrossRef] [PubMed]
  186. Fu, X.; He, X.; Xu, H.; Ying, Y. Nondestructive and rapid assessment of intact tomato freshness and lycopene content based on a miniaturized Raman spectroscopic system and colorimetry. Food Anal. Methods 2016, 9, 2501–2508. [Google Scholar] [CrossRef]
  187. Gao, H.; Zhu, Z.; Sun, D.-W. Determination of Porosity and Permeability Correlation of Leafy Vegetable based on X-ray Computed Tomography and Cell Segmentation. J. Food Eng. 2025, 401, 112545. [Google Scholar] [CrossRef]
  188. Retta, M.; Verlinden, B.; Verboven, P.; Nicolaï, B. Texture-microstructure relationship of leafy vegetables during postharvest storage. In Proceedings of the VI International Conference Postharvest Unlimited, Madrid, Spain, 14–20 October 2017; Volume 1256, pp. 169–178. [Google Scholar]
  189. Landahl, S.; Terry, L.; Ford, H. Investigation of diseased onion bulbs using data processing of optical coherence tomography images. In Proceedings of the VI International Symposium on Edible Alliaceae, Fukuoka, Japan, 21–24 May 2012; Volume 969, pp. 261–270. [Google Scholar]
  190. Ullah, H.; Hussain, F.; Ahmad, E.; Ikram, M. A rapid and non-invasive bio-photonic technique to monitor the quality of onions. Opt. Spectrosc. 2015, 119, 295–299. [Google Scholar] [CrossRef]
  191. Buchaillot, M.L.; Soba, D.; Shu, T.; Liu, J.; Aranjuelo, I.; Araus, J.L.; Runion, G.B.; Prior, S.A.; Kefauver, S.C.; Sanz-Saez, A. Estimating peanut and soybean photosynthetic traits using leaf spectral reflectance and advance regression models. Planta 2022, 255, 93. [Google Scholar] [CrossRef]
  192. Scutelnic, D.; Muradore, R.; Daffara, C. A multispectral camera in the VIS–NIR equipped with thermal imaging and environmental sensors for non invasive analysis in precision agriculture. HardwareX 2024, 20, e00596. [Google Scholar] [CrossRef]
  193. Maeda, Y. Cost Constrained Cultivation Management With Sensors Considering Fluctuation of Crop Price. Electron. Commun. Jpn. 2025, 108, e12489. [Google Scholar] [CrossRef]
  194. Bhujel, A.; Basak, J.K.; Khan, F.; Arulmozhi, E.; Jaihuni, M.; Sihalath, T.; Lee, D.; Park, J.; Kim, H.T. Sensor systems for greenhouse microclimate monitoring and control: A review. J. Biosyst. Eng. 2020, 45, 341–361. [Google Scholar] [CrossRef]
  195. Liu, S.; Zeng, H.; Wang, H.; Tong, X.; Zhao, H.; Du, K.; Zhang, J. A novel adaptive spectral drift correction method for recalibrating the MarSCoDe LIBS data in China’s Tianwen-1 Mars mission. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2023, 16, 5430–5440. [Google Scholar] [CrossRef]
  196. Henriksen, M.B.; Garrett, J.L.; Prentice, E.F.; Stahl, A.; Johansen, T.A.; Sigernes, F. Real-time corrections for a low-cost hyperspectral instrument. In Proceedings of the 2019 10th Workshop on Hyperspectral Imaging and Signal Processing: Evolution in Remote Sensing (WHISPERS), Amsterdam, The Netherlands, 24–26 September 2019; pp. 1–5. [Google Scholar]
  197. Rahman, H.; Shah, U.M.; Riaz, S.M.; Kifayat, K.; Moqurrab, S.A.; Yoo, J. Digital twin framework for smart greenhouse management using next-gen mobile networks and machine learning. Future Gener. Comput. Syst. 2024, 156, 285–300. [Google Scholar] [CrossRef]
  198. Zou, B.; Dong, J.; Yang, Y. Greenhouse management system based on cloud platform. Transducer Microsyst. Technol. 2021, 40, 112–114+118. [Google Scholar]
  199. Kang, Z.; Zhou, B.; Fei, S.; Wang, N. Predicting the Greenhouse Crop Morphological Parameters Based on RGB-D Computer Vision. Smart Agric. Technol. 2025, 11, 100968. [Google Scholar] [CrossRef]
  200. Sun, D.; Robbins, K.; Morales, N.; Shu, Q.; Cen, H. Advances in optical phenotyping of cereal crops. Trends Plant Sci. 2022, 27, 191–208. [Google Scholar] [CrossRef]
  201. Cruz, M.; Aguilera, C.; Vintimilla, B.; Toledo, R.; Sappa, A. Cross-spectral image registration and fusion: An evaluation study. In Proceedings of the 2nd International Conference on Machine Vision and Machine Learning, Vancouver, BC, Canada, 13–19 July 2015; pp. 315–331. [Google Scholar]
  202. Singh, R.; Nisha, R.; Naik, R.; Upendar, K.; Nickhil, C.; Deka, S.C. Sensor fusion techniques in deep learning for multimodal fruit and vegetable quality assessment: A comprehensive review. J. Food Meas. Charact. 2024, 18, 8088–8109. [Google Scholar] [CrossRef]
  203. Xu, S.; Xu, X.; Zhu, Q.; Meng, Y.; Yang, G.; Feng, H.; Yang, M.; Zhu, Q.; Xue, H.; Wang, B. Monitoring leaf nitrogen content in rice based on information fusion of multi-sensor imagery from UAV. Precis. Agric. 2023, 24, 2327–2349. [Google Scholar] [CrossRef]
  204. Yu, J.; Liu, J.; Sun, C.; Wang, J.; Ci, J.; Jin, J.; Ren, N.; Zheng, W.; Wei, X. Sensing Technology for Greenhouse Tomato Production: A Systematic Review. Smart Agric. Technol. 2025, 11, 101020. [Google Scholar] [CrossRef]
  205. Liu, X.; Zhang, Q.; Min, W.; Geng, G.; Jiang, S. Solutions and challenges in AI-based pest and disease recognition. Comput. Electron. Agric. 2025, 238, 110775. [Google Scholar] [CrossRef]
  206. He, M.; Li, C.; Cai, Z.; Qi, H.; Zhou, L.; Zhang, C. Leafy vegetable freshness identification using hyperspectral imaging with deep learning approaches. Infrared Phys. Technol. 2024, 138, 105216. [Google Scholar] [CrossRef]
  207. Kose, U.; Prasath, V.S.; Mondal, M.R.H.; Podder, P.; Bharati, S. Artificial Intelligence and Smart Agriculture Applications; CRC Press: Boca Raton, FL, USA, 2022. [Google Scholar]
  208. Qiang, Z.; Shi, F. Pest disease detection of Brassica chinensis in wide scenes via machine vision: Method and deployment. J. Plant Dis. Prot. 2022, 129, 533–544. [Google Scholar] [CrossRef]
  209. Žalik, K.R.; Žalik, M. A review of federated learning in agriculture. Sensors 2023, 23, 9566. [Google Scholar] [CrossRef]
  210. Mohan, R.J.; Rayanoothala, P.S.; Sree, R.P. Next-gen agriculture: Integrating AI and XAI for precision crop yield predictions. Front. Plant Sci. 2025, 15, 1451607. [Google Scholar] [CrossRef] [PubMed]
  211. Qureshi, W.A.; Gao, J.; Elsherbiny, O.; Mosha, A.H.; Tunio, M.H.; Qureshi, J.A. Boosting Aeroponic System Development with Plasma and High-Efficiency Tools: AI and IoT—A Review. Agronomy 2025, 15, 546. [Google Scholar] [CrossRef]
  212. Montoya, A.P.; Obando, F.A.; Osorio, J.; Morales, J.; Kacira, M. Design and implementation of a low-cost sensor network to monitor environmental and agronomic variables in a plant factory. Comput. Electron. Agric. 2020, 178, 105758. [Google Scholar] [CrossRef]
  213. Glykos, I.; Loukatos, D.; Papangelis, A.; Androulidakis, N.; Arvanitis, K. A scalable, flexible, low cost and secure architectural framework for greenhouse IoT. In Proceedings of the 28th Pan-Hellenic Conference on Progress in Computing and Informatics, Athens, Greece, 13–15 December 2024; pp. 81–88. [Google Scholar]
  214. Debbagh, M. Development of a Low-Cost Wireless Sensor Network for Passive In Situ Measurement of Soil Greenhouse Gas Emissions; McGill University: Montréal, QC, Canada, 2019. [Google Scholar]
  215. O’sullivan, C.; Bonnett, G.; McIntyre, C.; Hochman, Z.; Wasson, A. Strategies to improve the productivity, product diversity and profitability of urban agriculture. Agric. Syst. 2019, 174, 133–144. [Google Scholar] [CrossRef]
  216. Mahlein, A.-K.; Alisaac, E.; Al Masri, A.; Behmann, J.; Dehne, H.-W.; Oerke, E.-C. Comparison and combination of thermal, fluorescence, and hyperspectral imaging for monitoring fusarium head blight of wheat on spikelet scale. Sensors 2019, 19, 2281. [Google Scholar] [CrossRef] [PubMed]
  217. Chamara, N.; Islam, M.D.; Bai, G.F.; Shi, Y.; Ge, Y. Ag-IoT for crop and environment monitoring: Past, present, and future. Agric. Syst. 2022, 203, 103497. [Google Scholar] [CrossRef]
  218. Mondal, P.; Basu, M. Adoption of precision agriculture technologies in India and in some developing countries: Scope, present status and strategies. Prog. Nat. Sci. 2009, 19, 659–666. [Google Scholar] [CrossRef]
  219. Tripodi, P.; Massa, D.; Venezia, A.; Cardi, T. Sensing technologies for precision phenotyping in vegetable crops: Current status and future challenges. Agronomy 2018, 8, 57. [Google Scholar] [CrossRef]
Figure 1. Application of optical sensing technology in different growth cycles of facility vegetables.
Figure 2. Application of optical sensing technology in the phenotypic study of facility vegetables.
Figure 3. Statistics on the application of optical sensing technology in phenotyping studies of facility vegetables from 2000 to 2024: (a) number of relevant articles published per year since 2000; (b) percentage of studies on four phenotypic traits; and (c) percentage of studies on common traits in phenotypic studies on facility vegetables.
Table 1. Application of optical sensing technology in biochemical traits of facility vegetables.
| Optical Imaging Technology | Facility Vegetable | Characteristics | Application | References |
| --- | --- | --- | --- | --- |
| RGB imaging | oilseed rape | chlorophyll | Studied the influence of soil pixels in RGB images on the estimation of leaf chlorophyll content (LCC). Removing the soil background improved estimation accuracy, raising the coefficient of determination (R2) from 0.58 to 0.68, and broadened the applicability of the LCC estimation model. | [99] |
| RGB imaging | tomato | leaf water content | Detected tomato water status by fusing RGB, NIR, and depth image information through deep learning. Multimodal deep learning reached 93.09–99.18% detection accuracy, outperforming the 88.97–93.09% achieved by unimodal deep learning. | [100] |
| RGB imaging | lettuce | anthocyanin | Color component combinations such as R/G, B/G, and G/(R + B) correlated significantly with anthocyanin content; S/H showed the highest correlation coefficient (0.850) and was used for quantitative prediction of anthocyanin content. | [101] |
| RGB imaging, hyperspectral imaging | tomato | chlorophyll, carotenoids, etc. | Evaluated image processing, spectral reflectance indices (SRIs), and machine learning models such as decision tree (DT) and random forest (RF) for estimating fruit traits of citrus and tomato at different ripening stages. The DT-5HV model performed best for tomato Chl a prediction, with R2 = 0.905 (RMSE = 0.077) on the training dataset and R2 = 0.785 (RMSE = 0.077) on the validation dataset. | [102] |
| RGB imaging, hyperspectral imaging | lettuce | chlorophyll | Used an open-source AutoML framework to build a chlorophyll estimation model for aquaponically grown lettuce from spectral and color vegetation indices (Rp2 up to 0.98), outperforming standard machine learning methods such as random forest and suiting nutrient management in aquaponic systems. | [103] |
| Hyperspectral imaging | white radish | nitrogen | A stacked ensemble learning approach based on hyperspectral data effectively estimated nitrogen content in white radish across multiple growth stages. The stacking model using spectral features achieved strong predictive accuracy at full fertility, with R2 = 0.70, MAE = 0.16, and MSE = 0.05. | [104] |
| Hyperspectral imaging | cucumber | chlorophyll | A multiple linear regression model achieved a correlation of up to 0.774 in predicting chlorophyll content in cucumber leaves, demonstrating the potential of hyperspectral imaging for non-destructive, in situ pigment prediction in live plants. | [105] |
| Hyperspectral imaging | Chinese cabbage | anthocyanin | No significant differences were found between anthocyanin accumulation measured by hyperspectral imaging and by destructive analysis for four cabbage varieties at different growth stages, indicating that hyperspectral imaging can match destructive analysis in tracking anthocyanin accumulation under different light and temperature conditions on indoor farms. | [106] |
Hyperspectral imagingChinese cabbageanthocyaninNo significant differences were found in anthocyanin accumulation measured by hyperspectral imaging and destructive analytical means for four cabbage varieties at different fertility stages. The results suggest that hyperspectral imaging can provide analytical capabilities comparable to destructive analysis to measure changes in anthocyanin accumulation occurring under different light and temperature conditions on indoor farms.[106]
Hyperspectral imagingArabidopsiscanopy water contentThe high accuracy of water content prediction in plant canopies can be ensured while appropriately increasing the scanning speed of hyperspectral imaging. When the scanning speed was increased from 20 mm.s(−1) to 40 mm.s(−1), the coefficient of determination of water content in the canopy of Arabidopsis thaliana was reduced by only 2.3%, which improved the efficiency of image acquisition and reduced the time of data processing for practical production applications.[107]
Hyperspectral imagingtomatoenzyme activityBy utilizing hyperspectral technology and support vector machine regression to achieve rapid, non-destructive detection of superoxide dismutase, peroxidase, and catalase activities in tomato yellow leaf curl virus (TYLCV)-infected tomato leaves. The catalase prediction model performed best (R2 = 0.82), while peroxidase activity was correlated with cultivar resistance, providing a new method for virus-resistant breeding and disease early warning.[108]
Multispectral imagingbok choyphotosynthetic pigmentsConvolutional neural network (CNN), multiple linear regression (MLR), and generalized linear model (GLM) were used to build machine learning models for estimation of photosynthetic pigments. The R2 and RMSE of CNN model validation set for estimation of SPAD were 0.87 and 2.31 respectively. The R2 and RMSE of GLM model validation set for estimation of SPAD were 0.88 and 2.39, respectively.[109]
Terahertz imagingtomatoleaf water contentA prediction model for tomato moisture fusion based on three-dimensional terahertz feature band fusion was developed using support vector machine (SVM). The correlation coefficient of the prediction set was 0.9792, and the root mean square error was 0.0531, which was better than the other three one-dimensional models.[110]
Terahertz imaginglettucephosphorus contentThe partial least squares method was used to build a lettuce phosphorus terahertz-TDS model with five principal component variables. The coefficient of determination of the model reached 0.7005, and the root mean square error of prediction was 0.003273, having a high prediction accuracy.[111]
Raman imagingtomatocarotenoidsPartial least squares regression (PLSR) and partial least squares discriminant analysis (PLS-DA) models were developed by obtaining Raman spectra of tomato at different carotenoid concentrations. The accuracy of the PLS-DA model was not affected by the time of exposure (hit rate: 90%).[112]
Raman imagingtomatolycopene, β-caroteneA combination of principal component analysis and partial least squares regression (PCA-PLSR) was used to process the data, and the results obtained were compared with HPLC measurements of the same extracts. Good agreement was obtained between HPLC and SERS results for most tomato samples.[113]
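Several entries in Table 1 regress pigment content on simple color-band ratios such as R/G. A minimal sketch of that workflow, with invented band intensities and chlorophyll values (not from any cited study) and an ordinary least squares fit in NumPy, illustrates the general idea:

```python
import numpy as np

# Hypothetical per-leaf mean band intensities (0-255) and lab-measured
# chlorophyll values (mg/g); real studies use calibrated RGB images.
red   = np.array([90.0, 80.0, 70.0, 60.0, 50.0, 40.0])
green = np.array([120.0, 118.0, 115.0, 110.0, 105.0, 100.0])
chl   = np.array([1.1, 1.4, 1.8, 2.2, 2.7, 3.2])

ratio = red / green                           # color-index feature, e.g. R/G
slope, intercept = np.polyfit(ratio, chl, 1)  # ordinary least squares fit

pred = slope * ratio + intercept
ss_res = np.sum((chl - pred) ** 2)
ss_tot = np.sum((chl - chl.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot                    # coefficient of determination
print(f"R2 = {r2:.3f}")
```

The cited studies differ mainly in which color indices they screen and whether a multivariate or machine learning model replaces the simple linear fit, but the calibration-against-destructive-measurement structure is the same.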
Table 2. Application of optical sensing technology in physiological traits of facility vegetables.
Optical Imaging Technology | Facility Vegetables | Characteristics | Application | References
RGB imaging | tomato | disease stress | Four convolutional neural network models were trained and evaluated for tomato leaf diseases and quantized for deployment on a Raspberry Pi 4. The Xception model showed the most consistent detection, with 100% accuracy, recall, and precision. | [124]
RGB imaging, multispectral imaging | lettuce | frost stress | A UAV-based multispectral and RGB imaging system was developed for high-throughput frost damage detection in lettuce. The optimized multisource-green-NN model achieved the best performance (R2 = 0.715, RMSE = 0.014), providing an efficient tool for frost monitoring and cold-resistant breeding. | [125]
Multispectral imaging | eggplant | yellow wilt | A low-cost multispectral camera was combined with deep learning to detect early yellow wilt in eggplant. The VGG16-ternary group attention model performed best, achieving 86.73% accuracy on the test set. | [126]
Multispectral imaging | cabbage | salt stress | A dual-input data fusion model for salt tolerance assessment was proposed. Salt tolerance classification reached 95.00% accuracy on day 5 after salt stress, and all salt-sensitive varieties were fully identified by day 9. | [127]
Hyperspectral imaging | cucumber | downy mildew | Hyperspectral images of asymptomatic, downy mildew, and leafminer-infected leaves were acquired, and full-band and characteristic-wavelength data were modeled with support vector machine (SVM), Elman neural network, and random forest (RF). Moving average (MA) preprocessing of the full-wavelength spectra gave the best recognition: the MA–RF model reached an OA of 97.89% with a Kappa coefficient of 0.97. | [128]
Hyperspectral imaging | oilseed rape | botrytis | An early recognition model for oilseed rape mycosphaerella disease onset was established from hyperspectral images combined with a deep learning model; recognition accuracy, precision, and recall all exceeded 97.97%, with good generalization ability. | [129]
Hyperspectral imaging | lettuce | cadmium stress | Vis–NIR hyperspectral imaging was applied to detect cadmium stress gradients in lettuce leaves. A classification model built with RTD combined with VISSA–GOA–SVM reached calibration and prediction accuracies of 100% and 98.57%. | [130]
Hyperspectral imaging | tomato | nitrogen, phosphorus, and potassium stress | A high-precision tomato leaf nutrition detection model was established using polarized reflectance spectroscopy combined with hyperspectral information fusion. For nitrogen, r = 0.9618 and RMSE = 0.451; for phosphorus, r = 0.9163 and RMSE = 0.620; for potassium, r = 0.9406 and RMSE = 0.494. | [131]
Hyperspectral imaging | oilseed rape | lead stress | A T-SAE model built via transfer learning on hyperspectral images achieved precise identification of oilseed rape leaves and roots under different lead stress conditions, with a prediction accuracy of 98.75%. | [132]
Thermal imaging | tomato | phytophthora | Tomato leaf spectral data were collected using a thermal imaging collection method (TCM) and a random collection method (RCM). Discriminant analysis used a support vector machine (SVM), with spectral information compressed by principal component analysis (PCA). The total recognition rates of the models built by the two methods were 92.59% and 99.77%, respectively. | [133]
Thermal imaging | tomato | water stress | Water supply levels were set at 50%, 75%, and 100% of crop evapotranspiration (ETc); a thermal imager measured leaf surface temperature to compute the crop water stress index (CWSI), which detected different degrees of water stress. Although CWSI did not correlate well with the simulated stress indicators, a clear trend emerged under severe stress treatments. | [134]
Terahertz imaging | tomato | leaf mold | Terahertz hyperspectral imaging was used for multi-source detection of tomato leaf mold, establishing a fusion diagnosis and health evaluation model. The identification rate for tomato leaf mold samples reached 97.12%, improving accuracy by 0.45% over a single detection method. | [68]
Fluorescence imaging | lettuce | cadmium stress | Based on wavelet principal component analysis (WPCA) dimensionality reduction, support vector machine (SVM) and cuckoo search–optimized SVM (CS–SVM) models were developed to detect cadmium stress in lettuce. The optimal WPCA–CS–SVM model, using third-level sym7 wavelet decomposition, achieved 99.79% calibration and 94.19% prediction accuracy. | [135]
Fluorescence imaging | lettuce | cadmium and lead stress | A multiple linear regression (MLR) model using 1st Der–WT–SR (first-order derivative–wavelet transform–stepwise regression) features effectively detected composite heavy metal content. For Cd (fifth-layer decomposition), performance reached R2 = 0.7905, RMSEP = 0.0269 mg/kg, RPD = 2.477; for Pb (first-layer decomposition), R2 = 0.8965, RMSEP = 0.0096 mg/kg, RPD = 3.211. | [136]
Fluorescence imaging | cabbage | water stress | A dynamic fluorescence imaging (DFI) system capable of non-destructive assessment of water stress in cabbage seedlings was developed. DFI imaging time was short (under 200 s), and the 720 nm channel gave the best results: r = 0.944 and SEE = 0.286 MPa. | [137]
Raman imaging | tomato | virus stress | Raman spectroscopy (RS) combined with partial least squares discriminant analysis (PLS-DA) enabled non-invasive, early detection of tomato spotted wilt (TSW), effectively distinguishing strain-specific symptoms across multiple varieties. | [64]
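The thermal-imaging entry on tomato water stress in Table 2 relies on the crop water stress index (CWSI), which normalizes canopy temperature between a well-watered (wet) and a non-transpiring (dry) reference. A minimal sketch, with illustrative temperatures rather than measured ones:

```python
import numpy as np

def cwsi(t_canopy, t_wet, t_dry):
    """Crop water stress index: ~0 = well watered, ~1 = fully stressed.

    t_wet and t_dry are the non-water-stressed and non-transpiring
    baseline temperatures for the same ambient conditions.
    """
    return (t_canopy - t_wet) / (t_dry - t_wet)

# Illustrative mean leaf temperatures (deg C) extracted from a thermal
# image region, for 100%, 75%, and 50% ETc irrigation treatments.
canopy = np.array([26.5, 28.0, 31.0])
print(cwsi(canopy, t_wet=25.0, t_dry=35.0))  # index rises with water deficit
```

In practice the wet/dry baselines are obtained from reference surfaces or from empirical baselines as a function of vapor pressure deficit, which is where most of the measurement difficulty lies.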
Table 3. Application of optical sensing technology in yield traits of facility vegetables.
Optical Imaging Technology | Facility Vegetables | Characteristics | Application | References
RGB imaging | tomato | number of fruits | An improved lightweight YOLO11n network and an optimized region-tracking counting method were proposed for estimating tomato counts at different ripening stages. The mean counting error (MCE) was 6.6%, a reduction of 5.0% and 2.1% versus the ByteTrack and cross-line counting methods, respectively. | [149]
RGB imaging | lettuce | fresh weight | Fresh weight of lettuce was predicted from leaf color using smartphone RGB imaging. Color intensity and the proportion of dark green correlated with fresh lettuce weight (p = 0.005, 0.003, 0.014 and p < 0.001). | [139]
RGB imaging | oilseed rape | oilseed yield | Four models were developed using digitized pixel area as an indicator of seed yield. The proposed High-Resolution Flowering Index (HrFI) and Modified Yellowness Index (MYI) predicted canola yield better than the Normalized Difference Yellowness Index (NDYI) and Red–Blue Normalized Index (RBNI). | [150]
3D imaging | tomato, cucumber, pepper | leaf area, plant height | Plant and canopy surface models were constructed with a 3D scanner, and leaf area, leaf area index (LAI), and plant height were estimated from polygon vertex coordinates. Significant correlations were found for all three traits, with R2 greater than 0.8 (except for tomato LAI). | [140]
3D imaging | cauliflower | fruit volume | Using the Kinect Fusion algorithm, 3D models generated by a depth camera were compared with actual structural parameters. The model deviated less than 2 cm in diameter/height from ground truth, and the fruit volume estimation error was below 0.6%. Estimated and actual plant volume and fruit weight were significantly correlated. | [151]
3D imaging | lettuce | fresh weight | A method for segmenting lettuce and estimating fresh weight from a color 3D point cloud was proposed. The segmentation successfully separated lettuce (F1-score = 0.88–0.91), and the calculated surface area of the segmented model was closely related to measured fresh weight (R2 = 0.84–0.94). | [152]
Fluorescence imaging | tomato | photosynthetic activity | Principal component analysis showed a strong effect of ΦPSII on plant growth and yield under e[CO2]. Hierarchical analysis found that EC1000 in fall/winter 2020 and EC700 in spring/summer 2021 contributed most to the best crop yield, with no significant difference between the ranking scores of EC1000 and EC700 in fall/winter 2020. | [153]
Multispectral imaging | oilseed rape | oilseed yield | Yield prediction using single-stage vegetation indices (VI), selected multi-stage VIs, and multivariate VI–TF (vegetation index–texture feature) inputs showed that multi-stage VI with RF achieved the highest training accuracy (R2 = 0.93, rRMSE = 7.36%), while multi-stage VI with PLSR performed best on the test dataset (R2 = 0.62, rRMSE = 15.20%). | [154]
Multispectral imaging | beets | root and sugar yield | The GBR–ASF–6 SRIs and GBR–ASF–7 SRIs models performed better in predicting chlorophyll (Chl) content and sugar content (SC), with R2 values of 0.99 and 0.99 (RMSE = 0.073 and 1.568) for the training dataset, and 0.65 and 0.78 (RMSE = 0.354 and 6.294) for the validation dataset. | [155]
Hyperspectral imaging | — | microbial count | A genetic algorithm–least squares support vector machine (GA–LS–SVM) model was developed using 11 characteristic wavelengths to distinguish colonies from background, achieving a 97.22% identification rate and providing a novel method for rapid microbial detection in food and agricultural products. | [156]
Hyperspectral imaging | tomato | sucrose content | Petiole sucrose concentration was used to assess tomato leaf senescence. The optimal model, combining moving-average preprocessing, PCA, and partial least squares regression, achieved an Rp2 of 0.98 and RPD of 7.12 for sucrose prediction, aiding yield improvement through timely leaf removal. | [157]
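Several yield entries in Table 3 map an image-derived geometric quantity (pixel area, surface area, canopy volume) to fresh weight or yield through a calibrated regression. A minimal sketch with an allometric power-law model, using invented area and fresh-weight values purely for illustration:

```python
import numpy as np

# Hypothetical projected canopy areas (cm^2) from segmented top-view
# images, and measured fresh weights (g); values are illustrative only.
area = np.array([50.0, 90.0, 140.0, 200.0, 280.0, 360.0])
fw   = np.array([8.0, 18.0, 33.0, 55.0, 88.0, 125.0])

# Allometric model fw = a * area^b, fitted as a line in log-log space.
b, log_a = np.polyfit(np.log(area), np.log(fw), 1)
a = np.exp(log_a)

est = a * area ** b
mape = np.mean(np.abs(est - fw) / fw) * 100  # mean absolute percent error
print(f"fw = {a:.4f} * area^{b:.2f}, MAPE = {mape:.1f}%")
```

The 3D-imaging studies replace the projected area with a surface area or volume computed from the point cloud, which typically tightens the relationship because it captures canopy depth that a single top view misses.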
Table 4. Application of optical sensing technology in quality traits of facility vegetables.
Optical Imaging Technology | Facility Vegetables | Characteristics | Application | References
RGB imaging | tomato | ripeness | A tomato detection method based on improved YOLOv7 was proposed. ODConv improves the feature extraction ability for small targets and captures more feature information about ripe tomatoes than YOLOv7; the mAP@.5 metric of Tomato-YOLOv7 increased by 1.3%. | [172]
RGB imaging | tomato | maturity, hardness | A non-destructive method combining computer vision and an electronic nose was developed to evaluate tomato quality. Data fusion significantly improved ripening-stage recognition and firmness prediction: the best models achieved up to 94.20% classification accuracy (SVC) and a 95.14% correlation coefficient with low error (SVR), demonstrating the effectiveness of multi-sensor fusion for robust tomato quality assessment. | [173]
RGB imaging | spinach | freshness | Machine vision and an e-nose captured image and odor data from samples. Using K-nearest neighbor (KNN), support vector machine (SVM), and back-propagation artificial neural network (BPNN) models for spinach freshness prediction, the BPNN model with multisensory data fusion significantly enhanced detection accuracy, achieving a classification rate of 93.75%. | [174]
RGB imaging, multispectral imaging | spinach | fresh weight, plant height | RGB and multispectral images were combined with machine learning to estimate spinach fresh weight and plant height. The support vector regression models for weight and height had the highest detection accuracy, with test-set RMSE of 0.720 and 4.19, respectively. | [175]
Multispectral imaging | spinach | freshness | An image-based algorithm was developed to classify defective spinach leaves. Optimal results were achieved using the R and B spectral bands, demonstrating that a vision system operating in these ranges provides a simple, rapid method for detecting deterioration in RTU-packaged spinach under various refrigeration conditions. | [176]
3D imaging | cabbage | morphology | Textural quality of cabbage was evaluated using 3D scanning. The texture index was well predicted by the XGBoost regression algorithm, with R2 above 0.89 and low RMSE; linear discriminant analysis also discriminated well, with accuracy above 98.3%. | [177]
3D imaging | cabbage | external defects | A method to detect external defects of cabbages based on 3D point cloud curvature features was proposed. Average detection accuracy was 96.25%, including 93.3% for dents and 96.67% for cracks. | [178]
Hyperspectral imaging | bell pepper | maturity | A workflow for classifying bell pepper ripeness using hyperspectral imaging and machine learning was presented. Four classifiers (RBFNN, PLS-DA, SVM, and LDA) predicted maturity stages from spectral reflectance, with the LDA algorithm achieving the best results. | [179]
Hyperspectral imaging | lettuce | sugar, vitamins, nutrients | Hyperspectral imaging combined with machine learning effectively estimated glucose, fructose, and dry matter content. An artificial neural network model achieved high accuracy (r = 0.85–0.99) in predicting fresh leaf weight and contents of chlorophyll, anthocyanin, N, P, K, and β-carotene. | [180]
Hyperspectral imaging | tomato | soluble sugar | Polarized hyperspectral data fusion was used to estimate soluble sugars (SS), total nitrogen (N), and their ratio (SS/N) in greenhouse tomato leaves. The SS/N model outperformed the individual SS and N models, with the support vector machine (SVM) giving the strongest predictive performance. | [181]
Fluorescence imaging, hyperspectral imaging | lettuce | zinc content | A modified stacked sparse autoencoder with least squares support vector regression (MSSAE-LSSVR), integrating deep learning and hyperspectral imaging, achieved accurate prediction of zinc content in oilseed rape leaves, with optimal results of R2 = 0.9566 and RMSE = 1.0240 mg/kg. | [182]
Fluorescence imaging, hyperspectral imaging | lettuce | selenium content | The multimodal difference-aware competitive adaptive reweighted sampling (MDCARS) method was proposed to extract relevant features in complex environments and was integrated with a ResNet convolutional neural network (RCNN) for quantitative prediction of selenium content. The combined RCNN–MDCARS model achieved optimal performance, with R2p = 0.8975, RMSEP = 0.0487 mg·kg−1, and RPD = 3.1240. | [183]
Fluorescence imaging, hyperspectral imaging | cabbage | pesticide residues | A surface-enhanced Raman scattering (SERS) spectroscopic method was proposed for detecting phenyl ether metronidazole in cabbage using a portable Raman analyzer. The prediction model's correlation coefficient (Rp) was 0.9458, with a root mean square error of prediction (RMSEP) of 3.27 mg L−1. | [184]
Fluorescence imaging, hyperspectral imaging | tomato | polysaccharides | Raman microspectroscopy was used to visualize the distribution of polysaccharides in fruit cell walls. Multivariate image analysis supported the Raman measurements and helped distinguish cellulose and pectin in tomato cell walls. | [66]
Raman imaging | tomato | ripeness | Raman measurements showed that tomato fruit composition changed from green to brown during transportation. Carotenoids increased from the unripe to the ripe stage, with lycopene the characteristic carotenoid of the optimal ripening stage. | [185]
Raman imaging | tomato | freshness | The ratio of the chromaticity indices a*/b* on the tomato surface increased as freshness decreased; the correlation coefficient (r) of a second-order polynomial fit was 0.908. A freshness discrimination model based on Raman spectroscopy gave 85.6% correctness. | [186]
X-ray | spinach | porosity, permeability | High- and low-resolution X-ray computed tomography (CT) combined with advanced cell segmentation was used to analyze porosity–permeability relationships in spinach. Grayscale–porosity and porosity–permeability correlations characterized the non-uniformity of intact spinach, with porosity ranging from 2.742% to 53.30% and permeability from 4.925 × 10−14 m2 to 2.829 × 10−11 m2. | [187]
X-ray | spinach, lettuce | brittleness, crispness | This study investigated how 3D microstructural changes in leaves during storage affect texture loss. Leaf thickness correlated strongly with pre-fracture strength and displacement, while tissue and pore specific surface areas were better linked to toughness, providing insight into their effects on brittleness and crunchiness. | [188]
Optical coherence tomography | onion | postharvest pathological quality deterioration | OCT measurements of onion sections before and after infection showed that healthy and diseased tissues could be distinguished in OCT images; in infected areas, the average cell size was smaller and the cell shape less round. | [189]
Optical coherence tomography | onion | internal defect | Internal defect characteristics of the outer layer and the entire onion head were characterized by cross-sectional imaging of normal and highly water-containing (defective) onion scales. | [190]
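Several quality-grading entries in Table 4 train classifiers (SVM, LDA, RBFNN) on per-fruit spectral reflectance features. As a dependency-light stand-in for those models, a nearest-centroid classifier on synthetic two-band reflectance features shows the shape of such a pipeline; the wavelengths, reflectance values, and class labels below are invented for illustration:

```python
import numpy as np

# Synthetic mean reflectance at two wavelengths (e.g. ~550 nm, ~680 nm)
# for fruits labeled by ripeness class; values are illustrative only.
X_train = np.array([[0.42, 0.35], [0.40, 0.33], [0.44, 0.36],   # unripe
                    [0.30, 0.52], [0.28, 0.55], [0.31, 0.50]])  # ripe
y_train = np.array([0, 0, 0, 1, 1, 1])

# Nearest-centroid classification: assign each sample to the class
# whose mean feature vector is closest in Euclidean distance.
centroids = np.array([X_train[y_train == c].mean(axis=0) for c in (0, 1)])

def classify(x):
    return int(np.argmin(np.linalg.norm(centroids - x, axis=1)))

print(classify(np.array([0.29, 0.53])))  # → 1 (ripe)
```

Real hyperspectral pipelines work with hundreds of bands, so a feature-selection or dimensionality-reduction step (PCA, characteristic-wavelength selection) usually precedes the classifier, as the cited studies describe.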
Table 5. Comparison of Different Optical Sensing Technologies. The color bar indicates the current specification score of the sensing technology. The darker the color, the higher the score.
Optical Sensing Technology | The Most Suitable Traits | The Most Suitable Stages
RGB imaging | morphological traits | plant shape at planting stage, fruit appearance at harvest, entire growth cycle
3D imaging | morphological traits | plant height, canopy volume, three-dimensional fruit dimensions; vegetative growth period to harvest
Multispectral/hyperspectral imaging | biochemical, physiological, and quality traits | nitrogen content during vegetative growth, initial stress response, maturity grading
Thermal imaging | physiological traits | temperature anomalies caused by water stress or pests/diseases, entire growth cycle
Fluorescence imaging | physiological traits | photosynthetic efficiency, vegetative growth to fruit enlargement
Raman imaging | biochemical traits | precise component analysis, maturity/harvest stage
Terahertz imaging | internal quality traits | internal defects/composition during transport/storage
X-ray imaging | internal quality traits | seed viability at seedling stage, internal damage during transport
Optical coherence tomography | internal quality traits | vegetative growth period, storage period
[The technical maturity, application range, and economy scores in the original table were shown as color bars on a 0–100 scale and are not reproducible in text.]
Share and Cite

Zhang, X.; Leng, Z.; Wang, X.; Tian, S.; Zhang, Y.; Han, X.; Li, Z. Analysis of the Current Situation and Trends of Optical Sensing Technology Application for Facility Vegetable Life Information Detection. Agronomy 2025, 15, 2229. https://doi.org/10.3390/agronomy15092229