Review

A Synthetic Review of Various Dimensions of Non-Destructive Plant Stress Phenotyping

1 College of Mechanical and Electrical Engineering, Fujian Agriculture and Forestry University, Fuzhou 350002, China
2 Fujian Key Laboratory of Agricultural Information Sensing Technology, College of Mechanical and Electrical Engineering, Fujian Agriculture and Forestry University, Fuzhou 350002, China
* Author to whom correspondence should be addressed.
Plants 2023, 12(8), 1698; https://doi.org/10.3390/plants12081698
Submission received: 20 March 2023 / Revised: 8 April 2023 / Accepted: 16 April 2023 / Published: 18 April 2023

Abstract

Non-destructive plant stress phenotyping begins with traditional one-dimensional (1D) spectroscopy, followed by two-dimensional (2D) imaging, three-dimensional (3D) or even temporal-three-dimensional (T-3D), spectral-three-dimensional (S-3D), and temporal-spectral-three-dimensional (TS-3D) phenotyping, all of which are aimed at observing subtle changes in plants under stress. However, a comprehensive review that covers all these dimensional types of phenotyping, ordered in a spatial arrangement from 1D to 3D, as well as temporal and spectral dimensions, is lacking. In this review, we look back at the development of data-acquiring techniques for various dimensions of plant stress phenotyping (1D spectroscopy, 2D imaging, 3D phenotyping), as well as their corresponding data-analyzing pipelines (mathematical analysis, machine learning, or deep learning), and look forward to the trends and challenges of high-performance multi-dimensional (integrated spatial, temporal, and spectral) phenotyping. We hope this article can serve as a reference for implementing various dimensions of non-destructive plant stress phenotyping.

1. Introduction

Unfavorable factors that affect the metabolism, growth, or development of plants are known as stressors [1,2]. Plants have developed a variety of adaptive strategies to deal with environmental stressors, enabling them to survive and even thrive in unfavorable conditions [3]. These observable traits, which arise from the interaction between a plant’s genotypes and the environment, are the phenotypes we aim to obtain. Understanding these stress-induced phenotypes can aid plant breeders in developing stress-tolerant plant varieties, and can inform the development of management strategies to mitigate the effects of environmental stress on plants [3,4,5]. This is crucial for ensuring food safety and ecosystem conservation [6,7]. To accomplish this, a comprehensive understanding of plant stress phenotyping techniques is necessary [3,8].
Although plant stress phenotyping can be carried out using either destructive (biochemical analysis) or non-destructive (optical sensing) techniques [9,10], it is almost a consensus that non-invasive optical sensing technologies are highly suitable for observing and examining plants under stress, as these methods are rapid, reliable, and repeatable [3,11]. Optical sensing utilizes optical waves (electromagnetic radiation) to interact with the object and return spectral characteristics. Under stress, plant physiology and morphological properties undergo a series of changes, affecting the plant’s spectral characteristics [12,13]. Specifically, when light waves strike a plant, they may be absorbed, transmitted, or reflected. In some cases, they may also be re-emitted as fluorescence. All of these can be detected as spectral characteristics [10,14]. These spectral signals can be non-destructively sensed by optical sensors. Non-destructive optical plant stress phenotyping begins with traditional one-dimensional (1D) spectroscopy, which originated mainly from remote sensing of vegetation research [15,16]. One-dimensional spectroscopy technologies used for plant phenotyping are mainly based on the reflectance or fluorescence properties of the plant, and can be further classified into (i) Visible and Infrared (Vis-IR) reflectance spectroscopy and (ii) chlorophyll fluorescence (ChlF) spectroscopy. Vis-IR spectra reflect not only morphological leaf traits (such as texture, thickness, or internal structure) but also biochemical component content [17,18]. ChlF spectroscopy can provide insights into the leaf’s photosynthetic function, particularly its capacity to withstand stressors and the degree to which the stressors have harmed the photosynthetic apparatus [19,20,21].
Vis-IR, covering 380 nm–2400 nm, especially hyperspectral spectroscopy, which provides abundant spectral information, can also be viewed as spectral-1D (S-1D), while ChlF spectroscopy is more like temporal-1D (T-1D). However, both Vis-IR and ChlF spectroscopy can only represent a limited part of the leaf.
With the development of imaging sensors, 2D imaging can provide both spatial and spectral information. Hence, 2D imaging is capable of differentiating spatial heterogeneity and also has the potential to realize high throughput. Moreover, imaging is more intuitive to human vision. Therefore, 2D imaging is now the most widely used form of plant stress phenotyping [22,23,24]. Two-dimensional imaging comprises visible, multispectral, hyperspectral, IR, thermal-IR, and ChlF imaging [22,25,26]. Similarly, 2D imaging can be divided into S-2D (visible, multispectral, hyperspectral, and IR imaging) and T-2D (ChlF imaging) in general conditions.
Projecting 3D objects onto 2D images results in the loss of some information. With the ability to differentiate plant architecture and parameterize organs, whole plants, or even population canopies, 3D measuring is crucial for a comprehensive understanding of plant stress resistance [27,28,29]. Additionally, the effect of occlusion, which is a major obstacle in 2D phenotyping methods, can be significantly diminished. In order to obtain the dynamic process of acclimation under stress, data in temporal, spectral, and spatial dimensions should be collected together. Though there remain challenges in technical issues, the development of data-acquiring and data-analyzing techniques keeps advancing to meet ever-changing needs [12,30,31].
To summarize, Figure 1 shows an overview of the data-acquisition process for non-destructive optical-based plant stress phenotyping. According to the type of data being collected, non-destructive plant stress phenotyping can be further divided into (i) 1D spectroscopy; (ii) 2D imaging phenotyping; (iii) 3D phenotyping (including T-3D and/or S-3D). However, a synthetic review that encompasses these various dimensions of phenotyping, ordered from 1D and 2D to 3D, along with spectral and temporal dimensions, has rarely been presented. In this review, we provide an overview of the development of various types of dimensional plant stress phenotyping, from non-imaging to imaging or even videoing techniques, and their corresponding data-analyzing algorithms, from mathematical statistics to machine learning and deep learning.
Undoubtedly, there are a large number of off-the-shelf commercial phenotyping platforms equipped with various sensors (data-acquiring) and corresponding data-processing devices to extract stress phenotypes. However, they require a relatively high capital investment, and at times, may not be flexible enough to meet our specialized measuring requirements. A complete set of plant phenotyping systems mainly includes (i) the carrying platform; (ii) the data-acquiring (sensing) equipment; and (iii) the data-analyzing techniques [31,32,33]. In this context, we are not focused on the commercialized synthetic phenotyping system or the carrying platform but only on the data-acquiring and downstream data-processing techniques.
The article is divided into six sections, with the first introducing the background information and setting the context, emphasizing the importance of a synthetic review for various dimensions of phenotyping. Section 2, Section 3 and Section 4 cover phenotyping of 1D, 2D, and 3D, respectively, including data-acquiring technologies and corresponding data-analyzing methods. Section 5 discusses the advantages and disadvantages of different plant stress phenotyping dimensions, and proposes trends for multi-dimensional phenotyping, namely, combining spatial, spectral, and temporal dimensions. The conclusion summarizes the various dimensions of plant stress phenotyping techniques, and emphasizes the need for an appropriate blend of these dimensions for precise and timely stress detection.

2. Phenotyping of 1D

One-dimensional spectroscopy technologies used for plant phenotyping can be classified into (i) Vis-IR spectroscopy and (ii) ChlF spectroscopy, as Figure 1 shows.

2.1. Vis-IR Reflectance Spectroscopy

Spectral reflectance characteristics of the leaf in Vis-IR regions (380–2500 nm) can offer details about the leaf’s structure, chlorophyll, water content, and other biochemical components [13,17]. The visible region (380–780 nm) provides information about spectral features of the pigments (such as chlorophyll a and b, carotenoids, and phytochrome) [17]. Chlorophyll a absorbs most strongly at 430 nm and 660 nm, chlorophyll b at 450 nm and 640 nm, while carotenoids absorb in the range of 425 to 475 nm [34,35]. In addition, a sharp change in leaf reflectance between 680 and 750 nm, also termed the “Red Edge”, has been measured on leaves of various species [36]. This parameter is meaningful for assessing chlorophyll status and is suitable for early stress detection. The IR region (800–2500 nm), covering NIR (Near Infrared) and SWIR (Short Wave Infrared), is associated with the measurement of overtones and combination tones of molecular vibrations, such as the O-H, N-H, C-H, and C-O covalent bonds of macromolecules in water, proteins, sugars, and cellulose [17]. Near Infrared spectroscopy is able to identify the presence of water and the physical structure of cells, as its light can penetrate deeper than visible light. With the development of hyperspectral spectroscopy, we can obtain even more spectral information [37,38]. In particular, hyperspectral spectroscopy can cover a broader waveband (350–2500 nm) with higher spectral resolution (0.1 to 1 nm), spanning the Ultra Violet, Visible, NIR, and SWIR wavebands [2,39]. This spectral “fingerprint” contains abundant information.
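As an illustration of how the “Red Edge” can be located, one common approach takes the wavelength of the steepest rise of the reflectance curve between 680 and 750 nm; a minimal pure-Python sketch (the reflectance values below are hypothetical):

```python
def red_edge_position(wavelengths, reflectance, lo=680, hi=750):
    """Return the wavelength (midpoint) of the maximum first derivative
    (steepest rise) of the reflectance curve within [lo, hi] nm."""
    best_wl, best_slope = None, float("-inf")
    for i in range(len(wavelengths) - 1):
        wl_mid = 0.5 * (wavelengths[i] + wavelengths[i + 1])
        if lo <= wl_mid <= hi:
            slope = (reflectance[i + 1] - reflectance[i]) / (
                wavelengths[i + 1] - wavelengths[i])
            if slope > best_slope:
                best_slope, best_wl = slope, wl_mid
    return best_wl

# Hypothetical leaf reflectance sampled every 10 nm around the red edge
wl = [660, 670, 680, 690, 700, 710, 720, 730, 740, 750, 760]
refl = [0.05, 0.05, 0.06, 0.10, 0.22, 0.38, 0.48, 0.52, 0.54, 0.55, 0.55]
print(red_edge_position(wl, refl))  # steepest rise lies near 705 nm
```

A shift of this inflection point toward shorter wavelengths ("blue shift") is commonly interpreted as an early sign of declining chlorophyll content.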
By applying appropriate spectral analysis, the changes in the reflectance signal can be extracted to characterize the plant’s phenotype and even assess plant-specific genotype responses to biotic and abiotic stress [40,41,42]. For instance, Matthew [42] discusses the use of reflectance spectroscopy for the early detection of plant physiological responses to different levels of water stress. One-dimensional (1D) spectral analysis has the advantage of providing abundant information with relatively small amounts of data. In addition, it is also the basis for feature extraction in images or higher-dimension phenotyping. Hence, 1D spectroscopy is still widely used for sensing plant stress conditions [43,44,45]. Nevertheless, 1D spectroscopy is mainly performed on a limited spot of a leaf, so only the spectral information of a small region is used to represent the whole plant, which is often insufficient. Moreover, environmental factors are among the most common difficulties during outdoor data acquisition: meteorological conditions can affect spectral reflectance characteristics and cause biases [17].

2.2. Chlorophyll Fluorescence (ChlF) Spectroscopy

Chlorophyll fluorescence (ChlF) can provide insights into a leaf’s photosynthesis function. More precisely, it can give information about the state of PS-II (Photosystem II: the light reaction of photosynthesis includes two photosystems, namely PS-I and PS-II, and PS-II is the first protein complex in the light reaction process) by reflecting the ability of PS-II to use the light absorbed by chlorophyll and the extent to which PS-II has been damaged under stress [14,20,46]. Chlorophyll fluorometry was developed by Kautsky [47] in the late 1960s. Since then, different ChlF testing methods have been developed, such as PAM (Pulse Amplitude Modulation) [48], OJIP (the fast polyphasic rise of the induction by continuous excitation) [49], and FRR (Fast Repetition Rate Fluorometry) [50]. Among these, PAM remains the most influential ChlF tool, as its detector can measure fluorescence regardless of ambient light interference [48]. The PAM method measures ChlF through high-frequency modulation pulses to test the actual photosynthetic performance of the plant. Further, we can obtain a number of basic chlorophyll fluorescence parameters, including the maximum quantum yield of PS-II photochemistry (Fv/Fm), the effective quantum yield of PS-II photochemistry (ΦPSII), and the non-photochemical quenching (NPQ) of ChlF [51,52]. There are also alternative methods, namely, the OJIP-test or fast (prompt) ChlF. OJIP refers to a fast (often less than 1 s) rising phase of chlorophyll a fluorescence induced by constant light excitation: O is the origin (the minimum fluorescence), J and I are intermediate levels, and P is the peak [47]. One of the most typical testing methods uses a PEA (Photosynthetic Efficiency Analyzer) instrument [47], which can measure the fast fluorescence induction kinetics, reflecting the state of the electron transport chain in PS-II.
Such analyses offer detailed insights into the functional and structural status of PS-II reaction centers and antennae, as well as the donor and acceptor sides [53,54].
Although fluorescence measurements can give insights into a plant’s stress tolerance through its photosynthetic capacity, the approach has limitations. For instance, ChlF spectroscopy can only measure a small area of the leaf, so many leaves must be measured within a short time to represent the whole plant, as the parameters change over time. In addition, not all stresses manifest themselves in the leaves being measured. However, as K. Maxwell et al. [14,20] suggested, combining it with other techniques is the most powerful and elegant application of fluorescence.

2.3. Data Analysis of the 1D Spectral Curves

Prior to the advent of machine learning, most data processing tasks were typically undertaken using mathematical and statistical software packages, such as SPSS, Microsoft Excel, MATLAB, or other specialized software kits.

2.3.1. Vis-IR Reflectance Spectral Curve

The reflectance and absorption of different wavelengths of light are determined by the leaf’s cellular structure, chlorophyll, phytochrome, water content, and other biochemical components [17,55]. To extract meaningful information from the spectral curves, mathematical and statistical methods are employed. The data processing pipeline comprises three essential steps: preprocessing, calibration, and validation (prediction) steps, as Figure 2 shows.
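As a concrete instance of the preprocessing step, standard normal variate (SNV) correction is commonly applied to reflectance spectra to reduce scatter effects; a minimal pure-Python sketch (the example spectrum is hypothetical):

```python
import math

def snv(spectrum):
    """Standard normal variate: center each spectrum to zero mean and
    scale to unit standard deviation, reducing multiplicative scatter."""
    n = len(spectrum)
    mean = sum(spectrum) / n
    std = math.sqrt(sum((x - mean) ** 2 for x in spectrum) / n)
    return [(x - mean) / std for x in spectrum]

corrected = snv([0.2, 0.4, 0.6, 0.8])
print([round(v, 3) for v in corrected])  # -> [-1.342, -0.447, 0.447, 1.342]
```

Calibration (e.g., regressing corrected spectra against laboratory-measured traits) and validation on held-out samples then follow the same pipeline.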
Vegetation indices (VIs), derived from plant spectral reflectance characteristics, are widely used to assess plant status [40,56]. Many empirical and semi-empirical spectral indices have been derived from the leaf reflectance spectra and proved to be related to plant physiological status [56,57,58]. For instance, the well-known NDVI (Normalized Difference Vegetation Index) can measure the chlorophyll absorption in the red spectrum relative to the scattering by the cellular structure, and has been used to monitor stress from proximal or remote sensing images [57,59]. Additional VIs, such as those used for growth monitoring, crop yield estimation, and plant distribution, can also be extracted. VIs can be grouped into two categories: Plant Activity and Plant Productivity VIs. Plant Activity VIs (e.g., LAI: Leaf Area Index, NDWI: Normalized Difference Water Index, SAVI: Soil Adjusted Vegetation Index, EVI: Enhanced Vegetation Index) are suited to estimating the current physical state of plants, while Plant Productivity VIs (e.g., CHI: Chlorophyll Index, NDRE: Normalized Difference Red Edge, NIRV: Near Infrared of Vegetation) provide information on yields and biochemical states [60,61].
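For example, NDVI is computed directly from the red and near-infrared reflectance bands; a minimal sketch (the reflectance values are hypothetical):

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).
    Healthy vegetation reflects strongly in NIR and absorbs red light,
    so values approach 1; stressed or sparse vegetation scores lower."""
    return (nir - red) / (nir + red)

print(round(ndvi(0.50, 0.08), 3))  # healthy leaf
print(round(ndvi(0.40, 0.20), 3))  # stressed leaf, lower NDVI
```

Other reflectance-based VIs follow the same pattern of band arithmetic over selected wavelengths.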

2.3.2. Chlorophyll Fluorescence (ChlF) Kinetic Curve

From a typical chlorophyll fluorescence kinetic curve, one can obtain the photosynthetic function parameters of the leaf. Light absorbed by chlorophyll molecules functions in three processes: (1) driving photosynthesis (photochemistry quenching); (2) being re-emitted as heat (non-photochemical quenching); or (3) being re-emitted as light (fluorescence) [14]. The total amount of energy involved in these three processes follows the law of energy conservation. Therefore, by measuring the amount of fluorescence emission, we can also obtain information about the other two processes. In other words, if the amount of fluorescence can be measured, then the photochemical and non-photochemical parameters of the sample can be deduced, which in turn allows for the estimation of the sample’s photosynthetic ability [14]. Figure 3 depicts a stylized fluorescence trace experiment on leaves to measure ChlF parameters. Here, the parameters of chlorophyll fluorescence are divided into two groups: preliminary parameters (in the order of testing steps, as shown in Figure 3) and deduced parameters (obtained from preliminary parameters), as listed in Table 1.
To date, a variety of ChlF parameters have been calculated to be used as stress indicators [21,63]. Table 1 shows some commonly used ones deduced from the PAM test [20,49]. Alternatively, the fast ChlF test, such as OJIP, can provide different parameters representing different photosynthetic states of chlorophyll a. For a more comprehensive list, please refer to [47,52,64,65]. All these parameters can either be obtained directly from the accompanying software or be calculated using standard mathematical methods.
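For instance, the deduced parameters follow directly from the preliminary fluorescence levels using the standard PAM definitions; a minimal sketch (all fluorescence readings below are hypothetical):

```python
def fv_fm(f0, fm):
    """Maximum quantum yield of PS-II: Fv/Fm = (Fm - F0) / Fm,
    measured on a dark-adapted leaf."""
    return (fm - f0) / fm

def phi_psii(fm_prime, fs):
    """Effective quantum yield of PS-II in the light: (Fm' - Fs) / Fm'."""
    return (fm_prime - fs) / fm_prime

def npq(fm, fm_prime):
    """Non-photochemical quenching: (Fm - Fm') / Fm'."""
    return (fm - fm_prime) / fm_prime

# Hypothetical readings from a dark-adapted, then light-adapted leaf
print(round(fv_fm(300, 1500), 2))    # healthy leaves are typically ~0.8
print(round(phi_psii(900, 400), 2))
print(round(npq(1500, 900), 2))
```

A sustained drop of Fv/Fm below the typical ~0.8 of healthy leaves is a widely used early indicator of photoinhibition under stress.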

3. Phenotyping of 2D

3.1. Visible Imaging

Visible-band imaging captures visible light and records images through sensitive materials such as charge-coupled devices (CCD) or Complementary Metal Oxide Semiconductors (CMOS) [25]. A mono-spectrum camera images a single color or other single visible band with varying grayscale, whereas RGB cameras capture three spectral bands: red (about 600 nm), green (about 550 nm), and blue (about 450 nm) [22,25]. Analyzing visible-band images can provide information about morphologic and geometric properties, pigment distribution, and stress status [66,67,68]. This is similar to visible spectroscopy.
For example, Enders [68] provides a method to classify cold-stress responses of inbred maize seedlings using RGB imaging. Tackenberg [67] utilized RGB images for measuring biomass, including above-ground fresh biomass and dry matter content. These methods can be useful for high-throughput plant phenotyping. However, visible imaging has the drawback of providing only visual information, with limited color and gray values in the visible spectral bands. Machine vision, unlike human vision, is capable of sensing invisible light such as UV or IR wavebands, which can provide valuable information for assessing the plant’s stress state; downstream image processing tailored to human vision may therefore overlook crucial details [22].

3.2. Multispectral and Hyperspectral Imaging

Multispectral and hyperspectral imaging techniques can cover the visible (400–700 nm), near-infrared (700–1100 nm), and shortwave infrared (1100–2500 nm) spectral regions [69,70]. These wavelength bands are similar to those used in Vis-NIR spectroscopy. The visible region reflects photosynthetic pigment information; NIR and SWIR regions show water content and nitrogen information; SWIR can be used to estimate the amounts of minerals, hemicellulose, protein, and phosphorus in plant materials [2,71]. In other words, multispectral and hyperspectral imaging can be viewed as an upgraded version of non-imaging Vis-IR spectroscopy, but they can capture both spectral and spatial (2D) information. Spectral imaging achieves this by integrating imaging and spectroscopy, thus simultaneously measuring spectral and spatial information. Hyperspectral imaging can be realized through four basic techniques: point scanning, line scanning, area scanning, and snapshot, listed in the order of their development. As of now, the line scanning (or push broom) movement of the sensor is the most typical imaging system utilized for close-range hyperspectral image collection [2,71]. Multispectral imaging is similar to hyperspectral imaging but with sparse wavelength information.
For instance, Anika [72] utilized high-throughput hyperspectral imaging to detect cadmium stress in crops, and a machine learning model was developed to automatically classify the crops based on their spectral features. However, compared to RGB imaging, the acquisition of multispectral and hyperspectral images can be more complex and slower. Additionally, hyperspectral imaging is more susceptible to the effects of illumination and environmental factors, and relies heavily on accurate data-analysis techniques and reliable sensing systems [71,73]. Nevertheless, ongoing technological advancements are making multispectral and hyperspectral imaging increasingly high-performing and lightweight [74,75].

3.3. Chlorophyll Fluorescence (ChlF) Imaging

Compared to point measurement ChlF spectroscopy, ChlF imaging can detect spatiotemporal heterogeneity over the entire leaf surface [19]. One ChlF imaging system typically includes an excitation source, which can be either natural solar light or artificial UV light (with a wavelength ranging from 340 to 360 nm), and a sensitive camera (such as CCD or CMOS) that records the re-emitted fluorescence light [76,77]. As ChlF imaging can detect spatial heterogeneity, the re-emitted fluorescence light comes not only from chlorophyll, but also from epidermal cell walls and leaf veins [20,78]. To be more precise, when excited by UV radiation, plants exhibit fluorescent light from two different wavelength bands: (i) the Red to Far-Red region, which is mainly related to chlorophyll a; (ii) the Blue to Green region, which is primarily emitted from the epidermis cell walls and the leaf veins by compounds such as ferulic acid [14,64].
Specifically, the Blue/Red and Blue/Far-Red fluorescence ratios are effective indicators of early plant stress. The former is related to plant structural components, while the latter is related to changes in the photosynthetic apparatus. By analyzing these ratios, researchers can identify early signs of stress before visible changes occur [21,79,80]. ChlF imaging offers advantages in providing information on spatial heterogeneity and temporal changes, making it a useful T-2D phenotyping tool. However, fluorescence parameters are not stress-specific and can be affected by various environmental conditions [81,82,83]. That is to say, many conditions would affect photosynthetic machinery, and so would fluorescence parameters. To ensure accuracy, combining ChlF imaging with other sensing technologies or developing better hardware and data-processing techniques is necessary, especially in uncontrolled field conditions.

3.4. Infrared (IR) Imaging

The IR region can be divided into two types: reflected-IR (NIR and SWIR: 0.7–3.0 µm) and emitted-IR (thermal-IR: 3.0–100 µm) [84]. IR imaging primarily refers to reflected-IR imaging, which covers the NIR and SWIR regions and is similar to multispectral or hyperspectral imaging. This imaging can measure various biochemicals such as water content, mesophyll cell structure, nitrogen protein, and cellulose [26,71,75]. IR imaging is often combined with RGB imaging to detect physiological status and screen morphological traits under stress.
For example, Anna Kicherer [85] utilized RGB and NIR imaging to analyze grapevines under drought stress and applied machine learning to identify patterns associated with drought tolerance. The study showed that RGB and NIR imaging can accurately differentiate between drought-tolerant and drought-sensitive grapevine varieties. IR imaging is thus a valuable tool for plant stress phenotyping. However, IR alone fails to offer robust data on chemical composition. Furthermore, the reflectance pattern in the NIR region can be influenced by leaf thickness, growth condition, and canopy architecture. IR imaging is therefore often combined with other sensing techniques [86].

3.5. Thermal-IR Imaging

Thermal-IR imaging, also called thermography, differs from other optical imaging techniques as it measures the emitted infrared radiation from the surface of samples rather than reflected radiation [84]. Generally, any object with a temperature above Absolute Zero (−273.15 °C) will emit thermal-IR radiation [10]. Then, these radiation data are demonstrated as thermal images by a false-color temperature gradient [87]. In other words, thermal-IR imaging allows the visualization of temperature differences on the surface of plants caused by stress [88]. For instance, adjustments in the water status of a plant under stress lead to changes in leaf transpiration and gas conductance. These associated changes result in temperature differences and thus can be instantly and remotely sensed by thermal-IR imaging [88].
With a continuous increase in resolution (0.01 °C) [89], thermal-IR imaging is a promising approach for the detection of subtle changes in plants upon stress [84,90,91]. For example, in one study, Martínez [90] presents a methodology for the early detection of fungal infections by measuring the temperature changes in grapes using infrared thermography. However, thermal-IR imaging is vulnerable to environmental changes, such as meteorological conditions and the crosstalk of other emitted and reflected thermal radiation sources. For this reason, calibration and ground data collection are necessary. Meanwhile, data processing needs to be carried out for correct temperature retrieval [84]. In addition, just like fluorescence imaging, thermal-IR imaging lacks stress specificity since it is unable to distinguish between different stresses; thus, additional tests will be necessary to identify results.
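One common way to turn canopy temperatures from thermography into a water-stress indicator is the Crop Water Stress Index (CWSI), which normalizes the measured canopy temperature between wet (fully transpiring) and dry (non-transpiring) reference temperatures; a minimal sketch with hypothetical temperatures:

```python
def cwsi(t_canopy, t_wet, t_dry):
    """Crop Water Stress Index: 0 = fully transpiring (unstressed),
    1 = transpiration fully stopped (severe water stress)."""
    return (t_canopy - t_wet) / (t_dry - t_wet)

# Hypothetical reference surfaces: wet at 24 C, dry at 34 C
print(round(cwsi(28.0, 24.0, 34.0), 2))  # -> 0.4, moderate stress
```

The wet and dry baselines must be calibrated for the crop and weather conditions, which is part of the ground-data collection mentioned above.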

3.6. Image Processing of 2D Phenotyping

Two-dimensional imaging provides more spatial information than one-dimensional spectroscopy, but also generates more data to analyze, making it challenging to process using traditional mathematical or statistical methods. Image processing techniques are used to overcome this challenge. Image processing is a type of computer technology that processes, analyzes, and extracts useful information from images. In machine vision, a digital mono-spectrum image is represented as a matrix of definite size, with pixel values ranging from 0 to 255 for 8-bit images. An RGB image consists of three matrices, while a multispectral or hyperspectral image has more matrices, or in other words, more spectral channels, as Figure 1 shows. Machine learning algorithms are typically used in image processing to analyze these matrices and obtain relevant information.
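To illustrate such a matrix operation, the Excess Green index (ExG = 2G − R − B) is a simple per-pixel computation often used to separate green plant tissue from the background; a toy sketch on a 2 × 2 RGB image with hypothetical pixel values:

```python
def excess_green(r, g, b):
    """Compute ExG = 2G - R - B per pixel on same-sized channel matrices."""
    return [[2 * g[i][j] - r[i][j] - b[i][j]
             for j in range(len(r[0]))] for i in range(len(r))]

# Tiny 2x2 image: top row plant pixels, bottom row soil pixels
R = [[40, 50], [120, 130]]
G = [[150, 160], [100, 110]]
B = [[30, 40], [90, 100]]
print(excess_green(R, G, B))  # -> [[230, 230], [-10, -10]]
```

Thresholding the ExG map (e.g., keeping pixels above zero) yields a rough plant mask, a common first step before feature extraction.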
Machine learning means the ability to learn without being explicitly programmed. It was more precisely defined by Tom Mitchell [92] in the classic 1997 textbook: “The field of machine learning is concerned with the question of how to construct computer programs that automatically improve with experience.” That is why ML provides an excellent solution for image processing. To date, a wide variety of ML techniques has emerged. That is to say, ML is not a specific algorithm but a general term for many algorithms.
Deep learning is one branch of machine learning; more specifically, it evolved from artificial neural networks (neural networks for short), which attempt to simulate the behavior of the human brain [93,94,95]. It is the number of layers that sets a deep learning algorithm apart from a single neural network. Multi-layer neural networks in deep learning can achieve feature extraction automatically, unlike the handcrafted feature extraction of traditional machine learning. Deep learning is now widely used, particularly in image processing [95,96,97].
To make a distinction, machine learning in image processing can be further divided into traditional machine learning (TML) and deep learning (DL) as they possess different data-analysis pipelines, as Figure 4 shows. Compared to TML, DL techniques have the advantage of automated feature extraction.
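A minimal illustration of the TML pipeline: a handcrafted feature (here simply the mean of each channel) is extracted first and then fed to a simple classifier (a nearest-centroid rule); the centroids and image values below are hypothetical:

```python
def channel_means(image):
    """Handcrafted feature extraction: mean value of each channel matrix."""
    return [sum(sum(row) for row in ch) / (len(ch) * len(ch[0]))
            for ch in image]

def nearest_centroid(feature, centroids):
    """Classify by the closest class centroid (squared Euclidean distance)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist(feature, centroids[label]))

# Hypothetical centroids learned from labeled training images
centroids = {"healthy": [60.0, 140.0, 50.0], "stressed": [110.0, 90.0, 70.0]}
# A new 2x2 RGB image as three channel matrices
img = [[[55, 65], [60, 60]], [[135, 145], [140, 140]], [[45, 55], [50, 50]]]
feats = channel_means(img)
print(nearest_centroid(feats, centroids))  # -> healthy
```

In a DL pipeline, by contrast, both the feature extractor and the classifier would be learned jointly from the raw pixel matrices.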
As for the deployment of DL models for plant stress phenotyping, three strategies are presented, ranging from directly using pre-trained models to designing a custom model from scratch.
(1) Off-the-shelf models. If the tasks and datasets are similar, then trained models can be used directly. Some shared pre-trained deep learning models, such as plant leaf disease diagnosis models, can be found on public data-sharing websites such as GitHub. However, this is rarely the case.
(2) Transfer learning. If the tasks and datasets are slightly different, a pre-trained model cannot be used directly, but its parameters can be fine-tuned or its architecture adjusted to fit our tasks. The term transfer refers to the fact that a major portion of the model weights and trainable parameters is frozen, facilitating the use of previously learned features [98,99]. The changes usually occur at the end of the network, concerning the number of classes or the type of outputs (discrete or continuous) [100,101].
(3) Build from scratch. If no existing tasks and datasets are applicable, then the learning model must be built from scratch. A deep learning model built from scratch requires enormous amounts of data for training and a network customized to the data. The major cost lies in data acquisition and preprocessing [102,103]. The customized model architecture should be designed based on the dataset, the GPU, and the task of interest [104]. However, this can improve the model’s robustness on new datasets [93,105].
To summarize the above three methods, “off the shelf” and “build from scratch” are two relatively extreme cases. “Transfer learning” is the most common condition: it requires neither a large amount of training data nor long training time, has lower computational requirements, and still achieves appreciable performance [104]. However, transfer learning for a given deep learning model can only be used for similar tasks, not for radically different ones. Moreover, at present, the availability of large-scale public datasets and pre-trained models is not exhaustive or extensive; therefore, building a model from scratch and further research are often necessary.

4. Phenotyping of 3D

Three-dimensional phenotyping can be achieved using various techniques, but almost all rely on reconstruction models generated by computers [106]. According to the information being measured, 3D phenotyping can be further divided into surface and internal (anatomical) traits. (i) Surface traits can be obtained using laser scanners (such as Light Detection and Ranging) and photogrammetry (such as stereo cameras) techniques. Laser scanners obtain spatial 3D “point clouds” through scanning, while photogrammetry uses stereovision 2D images to reconstruct 3D models [107,108,109]. (ii) Internal traits capture subsurface information invisible to ordinary optical methods; they can be determined through X-ray CT (computed tomography) [110], MRI (magnetic resonance imaging) [111], or PET (Positron Emission Tomography) [112].
As for plant stress phenotyping, the surface traits of 3D models may not be fine enough to identify nuanced changes in plants under stress. Due to limitations in point cloud density, surfaces are often incomplete or sparse. For this reason, multi-view 2D imaging is often better suited for capturing surface information. However, with the development of Light Detection and Ranging (LiDAR), high-resolution 3D models are becoming increasingly promising. In the following, we describe some commonly used methods for capturing 3D features, including LiDAR for surface features, and X-ray CT, MRI, and PET for internal characteristics.

4.1. Light Detection and Ranging (LiDAR)

LiDAR uses pulsed lasers to build the “point cloud”. It comprises a laser light source (mainly NIR at 950 nm or 1550 nm) and a receiving system. LiDAR actively emits laser pulses and uses the time of flight between the source and the target to calculate distance; many such returns together create the “point cloud” describing the 3D surface structure [113,114,115,116].
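The ranging principle can be illustrated numerically. A round-trip pulse gives distance = c·t/2, and each return is placed in space by the beam direction; the scan angles and timing below are invented for illustration, since real LiDAR firmware performs this conversion internally.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def tof_to_point(t_seconds, azimuth_deg, elevation_deg):
    """Convert a single time-of-flight return to a 3D point (metres).

    The pulse travels to the target and back, so range = c * t / 2;
    the beam direction (azimuth, elevation) then places the point in
    Cartesian coordinates, one point of the eventual point cloud.
    """
    r = C * t_seconds / 2.0
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = r * math.cos(el) * math.cos(az)
    y = r * math.cos(el) * math.sin(az)
    z = r * math.sin(el)
    return (x, y, z)

# A return after ~66.7 ns corresponds to a target ~10 m away.
point = tof_to_point(2 * 10.0 / C, azimuth_deg=0.0, elevation_deg=0.0)
```

Sweeping the beam over many azimuth/elevation angles and repeating this conversion yields the dense point cloud used for 3D surface reconstruction.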
The LiDAR laser can partially penetrate the vegetation canopy and is not vulnerable to natural sunlight [117], so LiDAR sensors have been mounted on ground and aerial phenotyping platforms to measure various phenotypic traits [114,118,119]. Although LiDAR can monitor the 3D surfaces of plants from one meter up to thousands of meters [113,119], some disadvantages remain, such as matching errors caused by illumination and shadowing, incomplete reconstruction caused by occlusion, and tradeoffs between accuracy and efficiency [113,117]; this is why LiDAR accuracy is low in large-scale scanning. In the future, the achievable density of the 3D point cloud needs to be increased to better describe 3D plant structures.

4.2. X-ray Computed Tomography (CT)

X-rays are a form of electromagnetic radiation with wavelengths ranging from 0.01 to 10 nm and extremely high frequencies ranging from 3 × 10¹⁶ Hz to 3 × 10¹⁹ Hz. They can penetrate optically opaque materials. When the transmitted X-rays are recorded by CCD cameras or other sensitive materials, X-ray images are produced. In short, X-ray imaging produces a transmittance image [106,120] showing discontinuities (due to different attenuation coefficients) in the material, which is quite different from typical Vis-NIR reflectance images.
Standard X-ray imaging can only produce 2D images, from which one can differentiate between different tissues, but it cannot be used to acquire 3D structures [110,121]. Fortunately, with the development of computed tomography (CT) techniques, X-ray CT can now provide detailed cross-sectional images of internal organs by rotating scanning [110,122]. Contrary to X-ray imaging, which compresses a 3D object into a plane, X-ray CT reconstructs the 3D structure from those cross-sectional images. However, performing X-ray CT requires attention to the radiation dose. In addition, X-ray CT can show spatial information, but it is incompatible with metabolite analysis.
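The contrast mechanism behind a transmittance image can be sketched with the Beer-Lambert law, I = I₀·exp(−Σμᵢdᵢ): each pixel records the line integral of attenuation coefficients along one ray, which is the quantity CT inverts to recover the 3D volume. The NumPy toy below uses an invented two-material voxel phantom and is an illustration of the forward projection only, not a reconstruction algorithm.

```python
import numpy as np

# Toy 3D phantom: attenuation coefficients (1/mm) on a voxel grid.
# Background "tissue" has mu = 0.02, with a denser inclusion of mu = 0.2.
mu = np.full((32, 32, 32), 0.02)
mu[10:20, 10:20, 10:20] = 0.2

voxel_mm = 1.0
I0 = 1000.0  # incident X-ray intensity (arbitrary units)

# Parallel-beam projection along the z axis: each pixel of the 2D
# radiograph records I = I0 * exp(-line integral of mu along the ray).
line_integrals = mu.sum(axis=2) * voxel_mm
radiograph = I0 * np.exp(-line_integrals)

# Rays through the dense inclusion are attenuated far more strongly,
# producing the discontinuities visible in a transmittance image.
through_inclusion = radiograph[15, 15]
through_background = radiograph[0, 0]
```

CT acquires many such radiographs at different rotation angles and solves the inverse problem (e.g. filtered back-projection) to recover the full μ volume.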

4.3. Magnetic Resonance Imaging (MRI)

MRI is based on nuclei with magnetic moments (such as ¹H, ¹³C, ¹⁴N, ¹⁵N, and ³¹P, and the compounds containing them), which are highly abundant in living tissues [123]. Their magnetic moments can be manipulated using strong magnetic fields and radio-frequency pulses. Differences in resonance absorption frequencies [124] can be detected to differentiate the contents of tissues and generate images of the internal structure [111,123,125], with the different contents translated into various shades of grey.
MRI can thus recognize internal tissues and visualize internal structures and metabolites; therefore, it has the potential to monitor physiological processes occurring in vivo [125,126]. It has been used to detect underground root internal structure and stem water transport [127,128,129]. Since MRI necessitates large electromagnets (commonly between 0.2 and 7.0 T), it is hard to operate directly in the field [130,131], and data acquisition is slow and costly. However, there are several transportable MRI devices, such as the NMR-MOUSE (mobile universal surface explorer) [124] and the cut-open force-free NMR (NMR-CUFF) [132]. While their resolution is limited, this restriction could be overcome through ongoing technical improvements.
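The radio frequency required scales with field strength via the Larmor relation, f = γB/(2π). For ¹H, γ/(2π) ≈ 42.577 MHz/T (a standard physical constant), so the 0.2 to 7.0 T range mentioned above corresponds to:

```python
GAMMA_BAR_H1 = 42.577  # MHz per tesla: gyromagnetic ratio gamma/(2*pi) for 1H

def larmor_mhz(field_tesla):
    """Resonance (Larmor) frequency of 1H protons in a given static field."""
    return GAMMA_BAR_H1 * field_tesla

low = larmor_mhz(0.2)   # ~8.5 MHz: portable/low-field devices
high = larmor_mhz(7.0)  # ~298 MHz: high-field research magnets
```

This is one reason low-field portable devices such as the NMR-MOUSE trade resolution and sensitivity for transportability: signal strength falls with the lower resonance frequency.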

4.4. Positron Emission Tomography (PET)

The Positron Emission Tomography (PET) process is as follows: first, one atom of the compound of interest is substituted with a radionuclide without changing the host's chemical properties; second, the emitted positrons annihilate with electrons in the plant tissue, producing a distinct external signal consisting of two almost collinear gamma rays; finally, the gamma rays are detected to map the density of the labelled compound [133,134]. Plant PET can provide a quantitative analysis of the dynamic function of stressed plants in a 3D view [112,135]. In this way, 3D functional tomography can trace changes between different tissues and offer a quantitative measurement of the transport and allocation of metabolites in plants under stress.
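The coincidence geometry can be sketched as follows (a simplified illustration with invented detector coordinates and timing): each annihilation lies on the line of response (LOR) between the two detectors that record the collinear gamma pair, and, in time-of-flight PET, the arrival-time difference Δt localizes the event at c·Δt/2 from the LOR midpoint.

```python
import math

C_MM_PER_NS = 299.792458  # speed of light in mm/ns

def annihilation_point(det_a, det_b, dt_ns):
    """Locate an annihilation on the line of response (LOR).

    det_a, det_b: 3D positions (mm) of the two coincident gamma detections.
    dt_ns: arrival time at A minus arrival time at B (ns); a positive dt
    means the gamma reached A later, i.e. the event was closer to B.
    """
    ax, ay, az = det_a
    bx, by, bz = det_b
    mx, my, mz = ((ax + bx) / 2, (ay + by) / 2, (az + bz) / 2)
    length = math.dist(det_a, det_b)
    # Offset from the midpoint along the A->B direction.
    s = C_MM_PER_NS * dt_ns / 2.0
    ux, uy, uz = ((bx - ax) / length, (by - ay) / length, (bz - az) / length)
    return (mx + s * ux, my + s * uy, mz + s * uz)

# Simultaneous arrival (dt = 0): the event sits at the LOR midpoint.
p = annihilation_point((-400.0, 0.0, 0.0), (400.0, 0.0, 0.0), 0.0)
```

Accumulating many such localized events over time is what yields the density and transport maps of the labelled compound described above.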
For example, the authors of [135] note that PET imaging has been successfully used to investigate the dynamic transport of nutrients, phytohormones, and photoassimilates. Thus, PET is a promising tool for 3D functional imaging, enabling the study of the complex interactions between plants and their environment [112,135]. However, quantitative measurement within the plant requires a higher spatial resolution, which is challenging for currently available PET systems. The combination of PET and MRI is expected to be a powerful tool for understanding stress responses [134,135].

4.5. Data Processing of 3D Phenotyping

Three-dimensional measurement enables geometric information on the whole plant and on individual organs to be gathered. As described earlier, 3D information includes surface traits and internal structure, so the data-processing methods vary. Internal structure reconstruction is mainly undertaken in the accompanying specialized software. Surface traits, according to the data-processing procedure, can be divided into (i) primary traits at the whole-plant level (such as height, width, and volumetric measures); and (ii) derived traits at the organ level (such as exact leaf area, stem length, and branch number) [109,136,137]. Primary traits can be obtained from the complete plant point cloud, while derived traits require prior segmentation of plant organs, from which the information is then derived. Figure 5 shows a stylized pipeline for 3D data processing starting from a common point cloud.
Using routines from standard data-processing software libraries such as MATLAB [138], OpenCV [139,140], or the Point Cloud Library [141,142], primary traits can be extracted. For example, after cropping the point cloud to the region of interest and performing a data-cleaning step, non-complex parameters such as height and width can be derived [143]. Machine learning approaches can then be employed for further processing, such as segmenting plant organs (leaves, stems, and flowers) [117,144,145,146]. For instance, Li [116] used a U-Net architecture, a popular CNN architecture for image segmentation, modified to take point cloud data as input and output a segmentation mask identifying the different plant organs. Finally, the corresponding traits of interest can be derived.
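Extracting primary traits of this kind reduces to simple geometry once the cloud is cropped and cleaned. The NumPy sketch below uses a synthetic point cloud and a naive 3-sigma outlier filter; real pipelines would use the libraries cited above and more robust statistical filters.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic "plant" point cloud (x, y, z in cm) plus two outlier points
# (a spurious high return and a point outside the plot).
plant = rng.uniform([-10, -10, 0], [10, 10, 50], size=(2000, 3))
outliers = np.array([[0.0, 0.0, 500.0], [300.0, 0.0, 10.0]])
cloud = np.vstack([plant, outliers])

# Crop to the region of interest (a bounding box around the pot).
roi = cloud[(np.abs(cloud[:, 0]) < 50) & (np.abs(cloud[:, 1]) < 50)]

# Simple cleaning step: discard statistical outliers along z
# (points more than 3 standard deviations from the mean height).
z = roi[:, 2]
clean = roi[np.abs(z - z.mean()) < 3 * z.std()]

# Primary (whole-plant) traits follow directly from the cleaned cloud.
height = clean[:, 2].max() - clean[:, 2].min()
width_x = clean[:, 0].max() - clean[:, 0].min()
width_y = clean[:, 1].max() - clean[:, 1].min()
```

Derived, organ-level traits would require the segmentation step described above before the same kind of per-organ geometry can be computed.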

5. Discussion

Overall, plant phenotyping sensing methods have evolved from 1D spectroscopy to 2D imaging, then to 3D phenotyping, and even to T-3D and S-3D. The related data types vary from 1D spectral curves and 2D images to 3D models. Consequently, analysis methods range from mathematical and statistical methods to machine learning and deep learning algorithms, as summarized in Table 2.
Plant response to stress is a dynamic equilibrium process accompanied by a series of morphological, physiological, and biochemical changes in different organs [147,148]. Data obtained at a single time point or in a single spectral band are therefore inherently limited. To monitor the dynamic process of acclimation under stress, a necessary complement is to combine temporal, spatial, and spectral information: temporal and spectral information integrated with spatial architecture can be used to track changes in the growth and movement of whole plants or particular organs.
One-dimensional spectroscopy comprises Vis-IR and ChlF spectroscopy, which can be attributed to spectral-1D and temporal-1D, respectively. Two-dimensional imaging can be divided into spectral-2D (visible imaging, multispectral and hyperspectral imaging, and IR imaging) and temporal-2D (ChlF imaging) phenotyping. Three-dimensional models mainly focus on spatial architecture. Technical obstacles to realizing spectral-3D may remain; however, this approach can already be seen in research using methods such as multispectral image registration [149,150] or the fusion of LiDAR data with multispectral imaging [136]. By repeating measurements and/or analysis methods over time series, the temporal dimension can be derived [151,152]. For example, MRI and/or PET can also be viewed as temporal-3D phenotyping [134,153,154]: MRI scanning allows the measurement of both the distribution and dynamics of water and other metabolites [124,125], and PET is a time-dynamical acquisition approach used for measuring transport fluxes [112,135]. Thus, spectral-3D or temporal-spectral-3D imaging systems can provide more spectral bands and temporal information. In this regard, multi-dimension phenotyping combines (i) spatial information (from the 1D spot and 2D plane to 3D stereoscopy); (ii) spectral information (multispectral, hyperspectral, IR, thermal-IR); and (iii) temporal information (the temporal evolution of phenotyping variables).
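Conceptually, LiDAR-multispectral fusion of this kind attaches spectral attributes to each 3D point by sampling the co-registered image at the point's footprint. The toy NumPy sketch below invents a small point cloud and a two-band raster; real workflows require careful sensor calibration and registration [136,149,150].

```python
import numpy as np

# Toy fused dataset: a LiDAR point cloud over a 1 m x 1 m plot and a
# co-registered 2-band (e.g. red, NIR) raster with 10 cm pixels.
rng = np.random.default_rng(1)
points = rng.uniform(0.0, 1.0, size=(500, 3))      # x, y, z in metres
raster = rng.uniform(0.05, 0.6, size=(10, 10, 2))  # rows, cols, bands

pixel_m = 0.1  # raster resolution

# Spectral-3D fusion: sample the band values under each point's (x, y)
# footprint, yielding a point cloud with per-point reflectance attributes.
rows = np.clip((points[:, 1] / pixel_m).astype(int), 0, 9)
cols = np.clip((points[:, 0] / pixel_m).astype(int), 0, 9)
spectra = raster[rows, cols]            # (500, 2) reflectance values
fused = np.hstack([points, spectra])    # x, y, z, band1, band2 per point

# A per-point vegetation index (here NDVI-style, from the two bands)
# then becomes a 3D-distributed quantity rather than a flat 2D map.
red, nir = fused[:, 3], fused[:, 4]
index = (nir - red) / (nir + red)
```

Repeating such a fused acquisition over a time series would add the temporal dimension, giving the temporal-spectral-3D data discussed above.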
In summary, each dimensional approach has its advantages and disadvantages. One-dimensional spectroscopy may gradually fall out of use because of its limited spatial information; however, 1D spectral analysis is the basis and reference for higher-dimensional image processing, requires less data storage, and is rich in spectral information. Two-dimensional imaging technologies are the mainstream, thanks to advances in imaging sensors and computational devices and the availability of a wide range of processing algorithms. Three-dimensional and multi-dimension approaches are important contemporary and future trends, capable of providing an excellent solution for comprehensive stress identification and prediction, although collecting varied images and processing vast amounts of data remain practical issues. Supported by the development of high-resolution, compact, portable data-acquisition devices and high-performance computational data-processing algorithms, multi-dimension phenotyping will drive new research in agriculture.
Table 2. Data-acquiring and corresponding data-processing methods of various dimensions of plant stress phenotyping.
| Dimensions | Data-Acquiring Methods | Data Types | Data-Analyzing Methods/Tools | References |
|---|---|---|---|---|
| 1D Phenotyping | Vis-IR reflectance spectroscopy | S-1D | 1. Chemometric methods (PCA or PLS regression), statistical methods, and tools such as SPSS, the R language, etc. 2. For devices that calculate automatically, typical mathematical methods are then used to further process the data. | [45,55,62,155,156] |
| | ChlF spectroscopy | 1D, T-1D | | |
| 2D Phenotyping | Visible imaging | 2D, S-2D, T-2D | 1. Image preprocessing, segmentation algorithms (watershed and color segmentation methods), and image conversion (wavelet analysis). 2. Image-processing tools such as PlantCV (https://plantcv.danforthcenter.org, accessed on 20 March 2023), IAP (http://iap.ipkgatersleben.de, accessed on 20 March 2023), and ImageJ (http://imagej.nih.gov/ij, accessed on 20 March 2023), etc. 3. Machine learning and/or deep learning for the identification, classification, quantification, and prediction of stress phenotypes, such as support vector machines (SVM), Random Forest, Gaussian processes (GP), CNNs, and LSTMs. 4. Specialized data-analysis software, such as Envi, Evince, and SpecSight, as well as other in-house image-processing software solutions. | [22,66,95,157,158,159] |
| | Multispectral imaging | 2D, S-2D, T-2D | | |
| | Hyperspectral imaging | 2D, S-2D, T-2D | | |
| | ChlF imaging | 2D, T-2D | | |
| | IR imaging | 2D, S-2D, T-2D | | |
| | Thermal-IR imaging | 2D, T-2D | | |
| 3D Phenotyping | LiDAR | 3D | 1. Specialized 3D software solutions for displaying 3D models, with algorithms such as SFM (structure from motion), voxel-based volume carving, and stereo correspondence algorithms. 2. Dimensionality reduction for image processing and feature extraction, with machine learning and/or deep learning for data analysis. 3. To capture temporal information, optical-flow-based tracking and adaptive hierarchical segmentation methods can be utilized. | [143,160] |
| | X-ray CT | 3D | | |
| | MRI | 3D, T-3D | | |
| | PET | 3D, T-3D | | |
| | Multispectral LiDAR | 3D, S-3D | | |
Vis-IR: Visible-Infrared; ChlF: Chlorophyll Fluorescence; CT: Computed Tomography; MRI: Magnetic Resonance Imaging; PET: Positron Emission Tomography; PCA: Principal Component Analysis; PLS: Partial Least Squares; S: Spectral; T: Temporal.

6. Conclusions

In this article, we provide an overview of the various spectroscopic and imaging techniques across spatial, temporal, and spectral dimensions that have been used for data acquisition in plant phenotyping, as well as the corresponding data-processing methods. Overall, 1D spectroscopy provides spectral information at certain spots on the leaf, 2D image-based phenotyping provides both spatial and spectral information of the plant in plane vision, and 3D image-based phenotyping provides stereoscopic features of the plant. In comparison, temporal-3D and/or spectral-3D can provide additional dimensional insights into the morphology/anatomy of opaque samples, thus allowing an assessment of a range of physiological, morphological, and biochemical parameters. In short, incorporating appropriate dimensions in plant stress phenotyping can improve the accuracy and efficiency of stress detection. Hence, multi-dimension approaches that combine spatial, spectral, and temporal dimensions can be helpful, especially for complex plant stress identification and prediction.
However, multi-dimension phenotyping will inevitably produce big data, varying in spatial, spectral, and temporal type, and data fusion from different optical sources adds further complexity. Analyzing such heterogeneous and enormous amounts of data is therefore challenging, and more research is needed to handle it, including the development of high-resolution sensing systems, high-performance computational graphics-processing technologies, and robust analysis pipelines. Furthermore, the integration of big data and AI-driven smart agriculture makes it increasingly feasible to achieve real-time, multi-dimensional monitoring of plants. This can aid in identifying plant stress, making predictions and recommendations for management, and accelerating breeding programs.

Author Contributions

D.Y. made a substantial contribution to the concept and design of this paper, L.W. wrote the original manuscript, X.L., T.O.A. and W.W. helped to collect data for this paper. H.W. provided constructive suggestions. D.Y. and L.W. contributed equally to this work. All authors have read and agreed to the published version of the manuscript.

Funding

This work is supported by the program of Interdisciplinary Integration Promoting the Development of Intelligent Agriculture (Horticulture) of Fujian Agriculture and Forestry University (No. 000-71202103B).

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Lichtenthaler, H.K. The stress concept in plants: An introduction. Ann. N. Y. Acad. Sci. 1998, 851, 187–198. [Google Scholar] [CrossRef] [PubMed]
  2. Sarić, R.; Nguyen, V.D.; Burge, T.; Berkowitz, O.; Trtílek, M.; Whelan, J.; Lewsey, M.G.; Čustović, E. Applications of hyperspectral imaging in plant phenotyping. Trends Plant Sci. 2022, 27, 301–315. [Google Scholar] [CrossRef] [PubMed]
  3. Deery, D.M.; Jones, H.G. Field Phenomics: Will It Enable Crop Improvement? Plant Phenomics 2021, 2021, 9871989. [Google Scholar] [CrossRef] [PubMed]
  4. Rivero, R.M.; Mittler, R.; Blumwald, E.; Zandalinas, S.I. Developing climate-resilient crops: Improving plant tolerance to stress combination. Plant J. 2022, 109, 373–389. [Google Scholar] [CrossRef]
  5. Zandalinas, S.I.; Mittler, R. Plant responses to multifactorial stress combination. N. Phytol. 2022, 234, 1161–1167. [Google Scholar] [CrossRef]
  6. Masson-Delmotte, V.; Zhai, P.; Pirani, A.; Connors, S.L.; Péan, C.; Berger, S.; Caud, N.; Chen, Y.; Goldfarb, L.; Gomis, M. Climate change 2021, The physical science basis. In Contribution of Working Group I to the Sixth Assessment Report of the Intergovernmental Panel on Climate Change; IPCC: Saint-Aubin, France, 2021; Volume 2. [Google Scholar]
  7. World Health Organization. The State of Food Security and Nutrition in the World 2021, Transforming Food Systems for Food Security, Improved Nutrition and Affordable Healthy Diets for All; Food & Agriculture Organization: Geneva, Switzerland, 2021; Volume 2021. [Google Scholar]
  8. Furbank, R.T.; Tester, M. Phenomics-technologies to relieve the phenotyping bottleneck. Trends Plant Sci. 2011, 16, 635–644. [Google Scholar] [CrossRef] [PubMed]
  9. Araus, J.L.; Kefauver, S.C.; Zaman-Allah, M.; Olsen, M.S.; Cairns, J.E. Translating High-Throughput Phenotyping into Genetic Gain. Trends Plant Sci. 2018, 23, 451–466. [Google Scholar] [CrossRef]
  10. Sun, D.; Robbins, K.; Morales, N.; Shu, Q.; Cen, H. Advances in optical phenotyping of cereal crops. Trends Plant Sci. 2022, 27, 191–208. [Google Scholar] [CrossRef]
  11. Fountas, S.; Malounas, I.; Athanasakos, L.; Avgoustakis, I.; Espejo-Garcia, B. AI-Assisted Vision for Agricultural Robots. AgriEngineering 2022, 4, 674–694. [Google Scholar] [CrossRef]
  12. Waiphara, P.; Bourgenot, C.; Compton, L.J.; Prashar, A. Optical Imaging Resources for Crop Phenotyping and Stress Detection. Methods Mol. Biol. 2022, 2494, 255–265. [Google Scholar]
  13. Sun, D.; Xu, Y.; Cen, H. Optical sensors: Deciphering plant phenomics in breeding factories. Trends Plant Sci. 2022, 27, 209–210. [Google Scholar] [CrossRef] [PubMed]
  14. Maxwell, K.; Johnson, G.N. Chlorophyll fluorescence—A practical guide. J. Exp. Bot. 2000, 51, 659–668. [Google Scholar] [CrossRef]
  15. Myneni, R.B.; Hall, F.G.; Sellers, P.J.; Marshak, A.L. The interpretation of spectral vegetation indexes. IEEE Trans. Geosci. Remote Sens. 1995, 33, 481–486. [Google Scholar] [CrossRef]
  16. Buschmann, C.; Nagel, E. In vivo spectroscopy and internal optics of leaves as basis for remote sensing of vegetation. Int. J. Remote Sens. 1993, 14, 711–722. [Google Scholar] [CrossRef]
  17. Zahir, S.A.D.M.; Omar, A.F.; Jamlos, M.F.; Azmi, M.A.M.; Muncan, J. A review of visible and near-infrared (Vis-NIR) spectroscopy application in plant stress detection. Sens. Actuat A-Phys. 2022, 338, 113468. [Google Scholar] [CrossRef]
  18. Peñuelas, J.; Filella, I. Visible and near-infrared reflectance techniques for diagnosing plant physiological status. Trends Plant Sci. 1998, 3, 151–156. [Google Scholar] [CrossRef]
  19. Moustakas, M.; Calatayud, Á.; Guidi, L. Chlorophyll fluorescence imaging analysis in biotic and abiotic stress. Front. Plant Sci. 2021, 12, 658500. [Google Scholar] [CrossRef]
  20. Krause, G.; Weis, E. Chlorophyll fluorescence and photosynthesis: The basics. Annu. Rev. Plant Biol. 1991, 42, 313–349. [Google Scholar] [CrossRef]
  21. Schreiber, U.; Bilger, W.; Neubauer, C. Chlorophyll fluorescence as a nonintrusive indicator for rapid assessment of in vivo photosynthesis. In Ecophysiology of Photosynthesis; Springer: Berlin/Heidelberg, Germany, 1995; pp. 49–70. [Google Scholar]
  22. Li, L.; Zhang, Q.; Huang, D. A review of imaging techniques for plant phenotyping. Sensors 2014, 14, 20078–20111. [Google Scholar] [CrossRef]
  23. Duarte-Carvajalino, J.M.; Silva-Arero, E.A.; Goez-Vinasco, G.A.; Torres-Delgado, L.M.; Ocampo-Paez, O.D.; Castano-Marin, A.M. Estimation of Water Stress in Potato Plants Using Hyperspectral Imagery and Machine Learning Algorithms. Horticulturae 2021, 7, 176. [Google Scholar] [CrossRef]
  24. Al-Tamimi, N.; Langan, P.; Bernad, V.; Walsh, J.; Mangina, E.; Negrao, S. Capturing crop adaptation to abiotic stress using image-based technologies. Open Biol. 2022, 12, 210353. [Google Scholar] [CrossRef]
  25. Udayakumar, N. Visible Light Imaging. In Imaging with Electromagnetic Spectrum: Applications in Food and Agriculture; Manickavasagan, A., Jayasuriya, H., Eds.; Springer: Berlin/Heidelberg, Germany, 2014; pp. 67–86. [Google Scholar]
  26. Ruffing, A.M.; Anthony, S.M.; Strickland, L.M.; Lubkin, I.; Dietz, C.R. Identification of Metal Stresses in Arabidopsis thaliana Using Hyperspectral Reflectance Imaging. Front. Plant Sci. 2021, 12, 624656. [Google Scholar] [CrossRef] [PubMed]
  27. Liu, H.; Bruning, B.; Garnett, T.; Berger, B. Hyperspectral imaging and 3D technologies for plant phenotyping: From satellite to close-range sensing. Comput. Electron. Agric. 2020, 175, 105621. [Google Scholar] [CrossRef]
  28. Golbach, F.; Kootstra, G.; Damjanovic, S.; Otten, G.; van de Zedde, R. Validation of plant part measurements using a 3D reconstruction method suitable for high-throughput seedling phenotyping. Mach. Vision Appl. 2016, 27, 663–680. [Google Scholar] [CrossRef]
  29. Rossi, R.; Costafreda-Aumedes, S.; Leolini, L.; Leolini, C.; Bindi, M.; Moriondo, M. Implementation of an algorithm for automated phenotyping through plant 3D-modeling: A practical application on the early detection of water stress. Comput. Electron. Agric. 2022, 197, 106937. [Google Scholar] [CrossRef]
  30. Fu, P.; Montes, C.M.; Siebers, M.H.; Gomez-Casanovas, N.; McGrath, J.M.; Ainsworth, E.A.; Bernacchi, C.J. Advances in field-based high-throughput photosynthetic phenotyping. J. Exp. Bot. 2022, 73, 3157–3172. [Google Scholar] [CrossRef] [PubMed]
  31. Atkinson, J.A.; Jackson, R.J.; Bentley, A.R.; Ober, E.; Wells, D.M. Field Phenotyping for the Future. In Annual Plant Reviews Online; Wiley Online Library: Hoboken, NJ, USA, 2018; pp. 719–736. [Google Scholar]
  32. Feng, L.; Chen, S.; Zhang, C.; Zhang, Y.; He, Y. A comprehensive review on recent applications of unmanned aerial vehicle remote sensing with various sensors for high-throughput plant phenotyping. Comput. Electron. Agric. 2021, 182, 106033. [Google Scholar] [CrossRef]
  33. Amarasingam, N.; Ashan Salgadoe, A.S.; Powell, K.; Gonzalez, L.F.; Natarajan, S. A review of UAV platforms, sensors, and applications for monitoring of sugarcane crops. Remote Sens. Appl. Soc. Environ. 2022, 26, 100712. [Google Scholar] [CrossRef]
  34. Bartley, G.E.; Scolnik, P.A. Plant carotenoids: Pigments for photoprotection, visual attraction, and human health. Plant Cell 1995, 7, 1027. [Google Scholar]
  35. Tokarz, D.; Cisek, R.; Garbaczewska, M.; Sandkuijl, D.; Qiu, X.; Stewart, B.; Levine, J.D.; Fekl, U.; Barzda, V. Carotenoid based bio-compatible labels for third harmonic generation microscopy. Phys. Chem. Chem. Phys. 2012, 14, 10653–10661. [Google Scholar] [CrossRef]
  36. Horler, D.; Dockray, M.; Barber, J. The red edge of plant leaf reflectance. Int. J. Remote Sens. 1983, 4, 273–288. [Google Scholar] [CrossRef]
  37. Gogoi, N.; Deka, B.; Bora, L. Remote sensing and its use in detection and monitoring plant diseases: A review. Agric. Rev. 2018, 39, 307–313. [Google Scholar] [CrossRef]
  38. Meacham-Hensold, K.; Montes, C.M.; Wu, J.; Guan, K.; Fu, P.; Ainsworth, E.A.; Pederson, T.; Moore, C.E.; Brown, K.L.; Raines, C. High-throughput field phenotyping using hyperspectral reflectance and partial least squares regression (PLSR) reveals genetic modifications to photosynthetic capacity. Remote Sens. Environ. 2019, 231, 111176. [Google Scholar] [CrossRef] [PubMed]
  39. Junttila, S.; Hölttä, T.; Saarinen, N.; Kankare, V.; Yrttimaa, T.; Hyyppä, J.; Vastaranta, M. Close-range hyperspectral spectroscopy reveals leaf water content dynamics. Remote Sens. Environ. 2022, 277, 113071. [Google Scholar] [CrossRef]
  40. El-Hendawy, S.E.; Al-Suhaibani, N.A.; Elsayed, S.; Hassan, W.M.; Dewir, Y.H.; Refay, Y.; Abdella, K.A. Potential of the existing and novel spectral reflectance indices for estimating the leaf water status and grain yield of spring wheat exposed to different irrigation rates. Agric. Water Manag. 2019, 217, 356–373. [Google Scholar] [CrossRef]
  41. El-Hendawy, S.; Al-Suhaibani, N.; Hassan, W.; Tahir, M.; Schmidhalter, U. Hyperspectral reflectance sensing to assess the growth and photosynthetic properties of wheat cultivars exposed to different irrigation rates in an irrigated arid region. PLoS ONE 2017, 12, e0183262. [Google Scholar] [CrossRef]
  42. Maimaitiyiming, M.; Ghulam, A.; Bozzolo, A.; Wilkins, J.L.; Kwasniewski, M.T. Early detection of plant physiological responses to different levels of water stress using reflectance spectroscopy. Remote Sens. 2017, 9, 745. [Google Scholar] [CrossRef]
  43. Yuan, M.; Couture, J.J.; Townsend, P.A.; Ruark, M.D.; Bland, W.L. Spectroscopic Determination of Leaf Nitrogen Concentration and Mass Per Area in Sweet Corn and Snap Bean. Agron. J. 2016, 108, 2519–2526. [Google Scholar] [CrossRef]
  44. Gold, K.M.; Townsend, P.A.; Herrmann, I.; Gevens, A.J. Investigating potato late blight physiological differences across potato cultivars with spectroscopy and machine learning. Plant Sci. 2020, 295, 110316. [Google Scholar] [CrossRef]
  45. Neto, A.J.S.; Lopes, D.C.; Pinto, F.A.; Zolnier, S. Vis/NIR spectroscopy and chemometrics for non-destructive estimation of water and chlorophyll status in sunflower leaves. Biosyst. Eng. 2017, 155, 124–133. [Google Scholar] [CrossRef]
  46. Muller, P.; Li, X.-P.; Niyogi, K.K. Non-photochemical quenching. A response to excess light energy. Plant Physiol. 2001, 125, 1558–1566. [Google Scholar] [CrossRef]
  47. Stirbet, A.; Govindjee. The slow phase of chlorophyll a fluorescence induction in silico: Origin of the S-M fluorescence rise. Photosynth. Res. 2016, 130, 193–213. [Google Scholar] [CrossRef] [PubMed]
  48. Schreiber, U.; Schliwa, U.; Bilger, W. Continuous recording of photochemical and non-photochemical chlorophyll fluorescence quenching with a new type of modulation fluorometer. Photosynth. Res. 1986, 10, 51–62. [Google Scholar] [CrossRef] [PubMed]
  49. Strasser, R.J.; Srivastava, A.; Tsimilli-Michael, M. The fluorescence transient as a tool to characterize and screen photosynthetic samples. In Probing Photosynthesis: Mechanisms, Regulation and Adaptation; CRC Press: Boca Raton, FL, USA, 2000; pp. 445–483. [Google Scholar]
  50. Kolber, Z.S.; Prášil, O.; Falkowski, P.G. Measurements of variable chlorophyll fluorescence using fast repetition rate techniques: Defining methodology and experimental protocols. Biochim. Et Biophys. Acta (BBA)-Bioenerg. 1998, 1367, 88–106. [Google Scholar] [CrossRef]
  51. Van Kooten, O.; Snel, J.F. The use of chlorophyll fluorescence nomenclature in plant stress physiology. Photosynth. Res. 1990, 25, 147–150. [Google Scholar] [CrossRef]
  52. Guo, Y.; Tan, J. Recent advances in the application of chlorophyll a fluorescence from photosystem II. Photochem. Photobiol. 2015, 91, 1–14. [Google Scholar] [CrossRef]
  53. Kalaji, H.M.; Jajoo, A.; Oukarroum, A.; Brestic, M.; Zivcak, M.; Samborska, I.A.; Cetner, M.D.; Łukasik, I.; Goltsev, V.; Ladle, R.J. Chlorophyll a fluorescence as a tool to monitor physiological status of plants under abiotic stress conditions. Acta Physiol. Plant 2016, 38, 102. [Google Scholar] [CrossRef]
  54. Kalaji, H.M.; Schansker, G.; Ladle, R.J.; Goltsev, V.; Bosa, K.; Allakhverdiev, S.I.; Brestic, M.; Bussotti, F.; Calatayud, A.; Dabrowski, P.; et al. Frequently asked questions about in vivo chlorophyll fluorescence: Practical issues. Photosynth. Res. 2014, 122, 121–158. [Google Scholar] [CrossRef]
  55. Ryckewaert, M.; Héran, D.; Simonneau, T.; Abdelghafour, F.; Boulord, R.; Saurin, N.; Moura, D.; Mas-Garcia, S.; Bendoula, R. Physiological variable predictions using VIS–NIR spectroscopy for water stress detection on grapevine: Interest in combining climate data using multiblock method. Comput. Electron. Agric. 2022, 197, 106973. [Google Scholar] [CrossRef]
  56. Glenn, E.P.; Huete, A.R.; Nagler, P.L.; Nelson, S.G. Relationship between remotely-sensed vegetation indices, canopy attributes and plant physiological processes: What vegetation indices can and cannot tell us about the landscape. Sensors 2008, 8, 2136–2160. [Google Scholar] [CrossRef]
  57. Rouse, J.W., Jr.; Haas, R.H.; Schell, J.; Deering, D. Monitoring the Vernal Advancement and Retrogradation (Green Wave Effect) of Natural Vegetation; Goddard Space Flight Center: Greenbelt, MD, USA, 1973. [Google Scholar]
  58. Garbulsky, M.F.; Peñuelas, J.; Gamon, J.; Inoue, Y.; Filella, I. The photochemical reflectance index (PRI) and the remote sensing of leaf, canopy and ecosystem radiation use efficiencies: A review and meta-analysis. Remote Sens. Environ. 2011, 115, 281–297. [Google Scholar] [CrossRef]
  59. Huang, S.; Tang, L.; Hupy, J.P.; Wang, Y.; Shao, G. A commentary review on the use of normalized difference vegetation index (NDVI) in the era of popular remote sensing. J. For. Res. 2021, 32, 1–6. [Google Scholar] [CrossRef]
  60. Xue, J.; Su, B. Significant Remote Sensing Vegetation Indices: A Review of Developments and Applications. J. Sens. 2017, 2017, 1353691. [Google Scholar] [CrossRef]
  61. Bannari, A.; Morin, D.; Bonn, F.; Huete, A. A review of vegetation indices. Remote Sens. Rev. 1995, 13, 95–120. [Google Scholar] [CrossRef]
  62. Murchie, E.H.; Lawson, T. Chlorophyll fluorescence analysis: A guide to good practice and understanding some new applications. J. Exp. Bot. 2013, 64, 3983–3998. [Google Scholar] [CrossRef] [PubMed]
  63. Gitelson, A.A.; Buschmann, C.; Lichtenthaler, H.K. Leaf chlorophyll fluorescence corrected for re-absorption by means of absorption and reflectance measurements. J. Plant Physiol. 1998, 152, 283–296. [Google Scholar] [CrossRef]
  64. Baker, N.R.; Rosenqvist, E. Applications of chlorophyll fluorescence can improve crop production strategies: An examination of future possibilities. J. Exp. Bot. 2004, 55, 1607–1621. [Google Scholar] [CrossRef]
  65. Kalaji, H.M.; Schansker, G.; Brestic, M.; Bussotti, F.; Calatayud, A.; Ferroni, L.; Goltsev, V.; Guidi, L.; Jajoo, A.; Li, P.; et al. Frequently asked questions about chlorophyll fluorescence, the sequel. Photosynth. Res. 2017, 132, 13–66. [Google Scholar] [CrossRef]
  66. Rahaman, M.M.; Chen, D.; Gillani, Z.; Klukas, C.; Chen, M. Advanced phenotyping and phenotype data analysis for the study of plant growth and development. Front. Plant Sci. 2015, 6, 619. [Google Scholar] [CrossRef]
  67. Tackenberg, O. A New Method for Non-destructive Measurement of Biomass, Growth Rates, Vertical Biomass Distribution and Dry Matter Content Based on Digital Image Analysis. Ann. Bot. 2007, 99, 777–783. [Google Scholar] [CrossRef]
  68. Enders, T.A.S.; Dennis, S.; Oakland, J.; Callen, S.T.; Gehan, M.A.; Miller, N.D.; Spalding, E.P.; Springer, N.M.; Hirsch, C.D. Classifying cold-stress responses of inbred maize seedlings using RGB imaging. Plant Direct. 2019, 3, e00104. [Google Scholar] [CrossRef]
  69. Qin, J.W.; Monje, O.; Nugent, M.R.; Finn, J.R.; O’Rourke, A.E.; Fritsche, R.F.; Baek, I.; Chan, D.E.; Kim, M.S. Development of a Hyperspectral Imaging System for Plant Health Monitoring in Space Crop Production. In Proceedings of the Conference on Sensing for Agriculture and Food Quality and Safety XIV, Online, 3 April–12 June 2022. [Google Scholar]
  70. Moghimi, A.; Yang, C.; Marchetto, P.M. Ensemble Feature Selection for Plant Phenotyping: A Journey From Hyperspectral to Multispectral Imaging. IEEE Access 2018, 6, 56870–56884. [Google Scholar] [CrossRef]
  71. Mishra, P.; Asaari, M.S.M.; Herrero-Langreo, A.; Lohumi, S.; Diezma, B.; Scheunders, P. Close range hyperspectral imaging of plants: A review. Biosyst. Eng. 2017, 164, 49–67. [Google Scholar] [CrossRef]
  72. Zea, M.; Souza, A.; Yang, Y.; Lee, L.; Nemali, K.; Hoagland, L. Leveraging high-throughput hyperspectral imaging technology to detect cadmium stress in two leafy green crops and accelerate soil remediation efforts. Environ. Pollut. 2022, 292, 118405. [Google Scholar] [CrossRef] [PubMed]
  73. Cui, L.H.; Yan, L.J.; Zhao, X.H.; Yuan, L.; Jin, J.; Zhang, J.C. Detection and Discrimination of Tea Plant Stresses Based on Hyperspectral Imaging Technique at a Canopy Level. Phyton-Int. J. Exp. Bot. 2021, 90, 621–634. [Google Scholar] [CrossRef]
  74. Paulus, S.; Mahlein, A.K. Technical workflows for hyperspectral plant image assessment and processing on the greenhouse and laboratory scale. Gigascience 2020, 9, giaa090. [Google Scholar] [CrossRef]
  75. Zubler, A.V.; Yoon, J.Y. Proximal Methods for Plant Stress Detection Using Optical Sensors and Machine Learning. Biosensors 2020, 10, 193. [Google Scholar] [CrossRef]
  76. Bandopadhyay, S.; Rastogi, A.; Juszczak, R. Review of top-of-canopy sun-induced fluorescence (SIF) studies from ground, UAV, airborne to spaceborne observations. Sensors 2020, 20, 1144. [Google Scholar] [CrossRef]
  77. Moustakas, M.; Bayçu, G.; Gevrek, N.; Moustaka, J.; Csatári, I.; Rognes, S.E. Spatiotemporal heterogeneity of photosystem II function during acclimation to zinc exposure and mineral nutrition changes in the hyperaccumulator Noccaea caerulescens. Environ. Sci. Pollut. Res. 2019, 26, 6613–6624. [Google Scholar] [CrossRef]
  78. Dong, Z.; Men, Y.; Li, Z.; Zou, Q.; Ji, J. Chlorophyll fluorescence imaging as a tool for analyzing the effects of chilling injury on tomato seedlings. Sci. Hortic. 2019, 246, 490–497. [Google Scholar] [CrossRef]
  79. Buschmann, C.; Lichtenthaler, H.K. Principles and characteristics of multi-colour fluorescence imaging of plants. J. Plant Physiol. 1998, 152, 297–314. [Google Scholar] [CrossRef]
  80. Pérez-Bueno, M.L.; Pineda, M.; Barón, M. Phenotyping plant responses to biotic stress by chlorophyll fluorescence imaging. Front. Plant Sci. 2019, 10, 1135. [Google Scholar] [CrossRef] [PubMed]
  81. Kolber, Z.; Klimov, D.; Ananyev, G.; Rascher, U.; Berry, J.; Osmond, B. Measuring photosynthetic parameters at a distance: Laser induced fluorescence transient (LIFT) method for remote measurements of photosynthesis in terrestrial vegetation. Photosynth. Res. 2005, 84, 121–129. [Google Scholar] [CrossRef]
  82. Peng, H.; Cendrero-Mateo, M.P.; Bendig, J.; Siegmann, B.; Acebron, K.; Kneer, C.; Kataja, K.; Muller, O.; Rascher, U. HyScreen: A Ground-Based Imaging System for High-Resolution Red and Far-Red Solar-Induced Chlorophyll Fluorescence. Sensors 2022, 22, 9443. [Google Scholar] [CrossRef]
  83. Sun, D.; Zhu, Y.; Xu, H.; He, Y.; Cen, H. Time-Series Chlorophyll Fluorescence Imaging Reveals Dynamic Photosynthetic Fingerprints of sos Mutants to Drought Stress. Sensors 2019, 19, 2649. [Google Scholar] [CrossRef]
  84. Messina, G.; Modica, G. Applications of UAV Thermal Imagery in Precision Agriculture: State of the Art and Future Research Outlook. Remote Sens. 2020, 12, 1491. [Google Scholar] [CrossRef]
  85. Briglia, N.; Montanaro, G.; Petrozza, A.; Summerer, S.; Cellini, F.; Nuzzo, V. Drought phenotyping in Vitis vinifera using RGB and NIR imaging. Sci. Hortic. 2019, 256, 108555. [Google Scholar] [CrossRef]
  86. Weiping, Y.; Xuezhi, W.; Wheaton, A.; Cooley, N.; Moran, B. Automatic optical and IR image fusion for plant water stress analysis. In Proceedings of the 2009 12th International Conference on Information Fusion, Seattle, WA, USA, 6–9 July 2009; pp. 1053–1059. [Google Scholar]
  87. Khanal, S.; Fulton, J.; Shearer, S. An overview of current and potential applications of thermal remote sensing in precision agriculture. Comput. Electron. Agric. 2017, 139, 22–32. [Google Scholar] [CrossRef]
  88. Pineda, M.; Barón, M.; Pérez-Bueno, M.-L. Thermal Imaging for Plant Stress Detection and Phenotyping. Remote Sens. 2020, 13, 68. [Google Scholar] [CrossRef]
  89. Balakrishnan, G.K.; Yaw, C.T.; Koh, S.P.; Abedin, T.; Raj, A.A.; Tiong, S.K.; Chen, C.P. A Review of Infrared Thermography for Condition-Based Monitoring in Electrical Energy: Applications and Recommendations. Energies 2022, 15, 6000. [Google Scholar] [CrossRef]
  90. Mastrodimos, N.; Lentzou, D.; Templalexis, C.; Tsitsigiannis, D.I.; Xanthopoulos, G. Development of thermography methodology for early diagnosis of fungal infection in table grapes: The case of Aspergillus carbonarius. Comput. Electron. Agric. 2019, 165, 104972. [Google Scholar] [CrossRef]
  91. Khorsandi, A.; Hemmat, A.; Mireei, S.A.; Amirfattahi, R.; Ehsanzadeh, P. Plant temperature-based indices using infrared thermography for detecting water status in sesame under greenhouse conditions. Agric. Water Manag. 2018, 204, 222–233. [Google Scholar] [CrossRef]
  92. Jordan, M.I.; Mitchell, T.M. Machine Learning; McGraw-Hill: New York, NY, USA, 1997; Volume 1. [Google Scholar]
  93. Singh, A.K.; Ganapathysubramanian, B.; Sarkar, S.; Singh, A. Deep learning for plant stress phenotyping: Trends and future perspectives. Trends Plant Sci. 2018, 23, 883–898. [Google Scholar] [CrossRef] [PubMed]
  94. LeCun, Y.; Bengio, Y.; Hinton, G. Deep learning. Nature 2015, 521, 436–444. [Google Scholar] [CrossRef]
  95. Gao, Z.; Luo, Z.; Zhang, W.; Lv, Z.; Xu, Y. Deep Learning Application in Plant Stress Imaging: A Review. AgriEngineering 2020, 2, 430–446. [Google Scholar] [CrossRef]
  96. Tausen, M.; Clausen, M.; Moeskjaer, S.; Shihavuddin, A.S.M.; Dahl, A.B.; Janss, L.; Andersen, S.U. Greenotyper: Image-Based Plant Phenotyping Using Distributed Computing and Deep Learning. Front. Plant Sci. 2020, 11, 1181. [Google Scholar] [CrossRef]
  97. Dobrescu, A.; Giuffrida, M.V.; Tsaftaris, S.A. Doing More With Less: A Multitask Deep Learning Approach in Plant Phenotyping. Front. Plant Sci. 2020, 11, 141. [Google Scholar] [CrossRef]
  98. Chen, J.; Zhang, D.; Nanehkaran, Y.A.; Li, D. Detection of rice plant diseases based on deep transfer learning. J. Sci. Food Agric. 2020, 100, 3246–3256. [Google Scholar] [CrossRef]
  99. Kaya, A.; Keceli, A.S.; Catal, C.; Yalic, H.Y.; Temucin, H.; Tekinerdogan, B. Analysis of transfer learning for deep neural network based plant classification models. Comput. Electron. Agric. 2019, 158, 20–29. [Google Scholar] [CrossRef]
  100. Barbedo, J.G.A. Impact of dataset size and variety on the effectiveness of deep learning and transfer learning for plant disease classification. Comput. Electron. Agric. 2018, 153, 46–53. [Google Scholar] [CrossRef]
  101. Ghazi, M.M.; Yanikoglu, B.; Aptoula, E. Plant identification using deep neural networks via optimization of transfer learning parameters. Neurocomputing 2017, 235, 228–235. [Google Scholar] [CrossRef]
  102. Weidman, S. Deep Learning from Scratch: Building with Python from First Principles; O’Reilly Media: Sebastopol, CA, USA, 2019. [Google Scholar]
  103. Mohanty, S.P.; Hughes, D.P.; Salathe, M. Using Deep Learning for Image-Based Plant Disease Detection. Front. Plant Sci. 2016, 7, 1419. [Google Scholar] [CrossRef]
  104. Arya, S.; Sandhu, K.S.; Singh, J.; Kumar, S. Deep learning: As the new frontier in high-throughput plant phenotyping. Euphytica 2022, 218, 47. [Google Scholar] [CrossRef]
  105. Rivas, P. Deep Learning for Beginners: A Beginner’s Guide to Getting Up and Running with Deep Learning from Scratch Using Python; Packt Publishing Ltd.: Mumbai, India, 2020. [Google Scholar]
  106. Li, Z.; Guo, R.; Li, M.; Chen, Y.; Li, G. A review of computer vision technologies for plant phenotyping. Comput. Electron. Agric. 2020, 176, 105672. [Google Scholar] [CrossRef]
  107. Wu, S.; Wen, W.L.; Gou, W.B.; Lu, X.J.; Zhang, W.Q.; Zheng, C.X.; Xiang, Z.W.; Chen, L.P.; Guo, X.Y. A miniaturized phenotyping platform for individual plants using multi-view stereo 3D reconstruction. Front. Plant Sci. 2022, 13, 897746. [Google Scholar] [CrossRef] [PubMed]
  108. Forero, M.G.; Murcia, H.F.; Mendez, D.; Betancourt-Lozano, J. LiDAR Platform for Acquisition of 3D Plant Phenotyping Database. Plants 2022, 11, 2199. [Google Scholar] [CrossRef]
  109. Sampaio, G.S.; Silva, L.A.; Marengoni, M. 3D Reconstruction of Non-Rigid Plants and Sensor Data Fusion for Agriculture Phenotyping. Sensors 2021, 21, 4115. [Google Scholar] [CrossRef]
  110. Kehoe, S.; Byrne, T.; Spink, J.; Barth, S.; Ng, C.K.; Tracy, S. A novel 3D X-ray computed tomography (CT) method for spatio-temporal evaluation of waterlogging-induced aerenchyma formation in barley. Plant Phenome J. 2022, 5, e20035. [Google Scholar] [CrossRef]
  111. Zhou, Y.F.; Maitre, R.; Hupel, M.; Trotoux, G.; Penguilly, D.; Mariette, F.; Bousset, L.; Chevre, A.M.; Parisey, N. An automatic non-invasive classification for plant phenotyping by MRI images: An application for quality control on cauliflower at primary meristem stage. Comput. Electron. Agric. 2021, 187, 106303. [Google Scholar] [CrossRef]
  112. Arino-Estrada, G.; Mitchell, G.S.; Saha, P.; Arzani, A.; Cherry, S.R.; Blumwald, E.; Kyme, A.Z. Imaging Salt Uptake Dynamics in Plants Using PET. Sci. Rep. 2019, 9, 18626. [Google Scholar] [CrossRef]
  113. Jin, S.C.; Sun, X.L.; Wu, F.F.; Su, Y.J.; Li, Y.M.; Song, S.L.; Xu, K.X.; Ma, Q.; Baret, F.; Jiang, D.; et al. Lidar sheds new light on plant phenomics for plant breeding and management: Recent advances and future prospects. ISPRS J. Photogramm. Remote Sens. 2021, 171, 202–223. [Google Scholar] [CrossRef]
  114. Su, Y.J.; Wu, F.F.; Ao, Z.R.; Jin, S.C.; Qin, F.; Liu, B.X.; Pang, S.X.; Liu, L.L.; Guo, Q.H. Evaluating maize phenotype dynamics under drought stress using terrestrial lidar. Plant Methods 2019, 15, 1–16. [Google Scholar] [CrossRef] [PubMed]
  115. Perez-Sanz, F.; Navarro, P.J.; Egea-Cortines, M. Plant phenomics: An overview of image acquisition technologies and image data analysis algorithms. Gigascience 2017, 6, gix092. [Google Scholar] [CrossRef]
  116. Li, Y.L.; Wen, W.L.; Miao, T.; Wu, S.; Yu, Z.T.; Wang, X.D.; Guo, X.Y.; Zhao, C.J. Automatic organ-level point cloud segmentation of maize shoots by integrating high-throughput data acquisition and deep learning. Comput. Electron. Agric. 2022, 193, 106702. [Google Scholar] [CrossRef]
  117. Ao, Z.; Wu, F.; Hu, S.; Sun, Y.; Su, Y.; Guo, Q.; Xin, Q. Automatic segmentation of stem and leaf components and individual maize plants in field terrestrial LiDAR data using convolutional neural networks. Crop J. 2022, 10, 1239–1250. [Google Scholar] [CrossRef]
  118. Zhu, Y.L.; Sun, G.; Ding, G.H.; Zhou, J.; Wen, M.X.; Jin, S.C.; Zhao, Q.; Colmer, J.; Ding, Y.F.; Ober, E.S.; et al. Large-scale field phenotyping using backpack LiDAR and CropQuant-3D to measure structural variation in wheat. Plant Physiol. 2021, 187, 716–738. [Google Scholar] [CrossRef]
  119. Wang, Q.; Che, Y.P.; Shao, K.; Zhu, J.Y.; Wang, R.L.; Sui, Y.; Guo, Y.; Li, B.G.; Meng, L.; Ma, Y.T. Estimation of sugar content in sugar beet root based on UAV multi-sensor data. Comput. Electron. Agric. 2022, 203, 107433. [Google Scholar] [CrossRef]
  120. Piovesan, A.; Vancauwenberghe, V.; Van De Looverbosch, T.; Verboven, P.; Nicolai, B. X-ray computed tomography for 3D plant imaging. Trends Plant. Sci. 2021, 26, 1171–1185. [Google Scholar] [CrossRef]
  121. Kotwaliwale, N.; Singh, K.; Kalne, A.; Jha, S.N.; Seth, N.; Kar, A. X-ray imaging methods for internal quality evaluation of agricultural produce. J. Food Sci. Technol. 2014, 51, 1–15. [Google Scholar] [CrossRef]
  122. Okochi, T.; Hoshino, Y.; Fujii, H.; Mitsutani, T. Nondestructive tree-ring measurements for Japanese oak and Japanese beech using micro-focus X-ray computed tomography. Dendrochronologia 2007, 24, 155–164. [Google Scholar] [CrossRef]
  123. Blümich, B.; Callaghan, P.T. Principles of Nuclear Magnetic Resonance Microscopy; Wiley Online Library: Hoboken, NJ, USA, 1995. [Google Scholar]
  124. Borisjuk, L.; Rolletschek, H.; Neuberger, T. Surveying the plant’s world by magnetic resonance imaging. Plant J. 2012, 70, 129–146. [Google Scholar] [CrossRef] [PubMed]
  125. Van As, H.; Scheenen, T.; Vergeldt, F.J. MRI of intact plants. Photosynth. Res. 2009, 102, 213–222. [Google Scholar] [CrossRef] [PubMed]
  126. Van As, H. Intact plant MRI for the study of cell water relations, membrane permeability, cell-to-cell and long distance water transport. J. Exp. Bot. 2007, 58, 743–756. [Google Scholar] [CrossRef] [PubMed]
  127. Van Dusschoten, D.; Metzner, R.; Kochs, J.; Postma, J.A.; Pflugfelder, D.; Buhler, J.; Schurr, U.; Jahnke, S. Quantitative 3D Analysis of Plant Roots Growing in Soil Using Magnetic Resonance Imaging. Plant Physiol. 2016, 170, 1176–1188. [Google Scholar] [CrossRef]
  128. Pflugfelder, D.; Metzner, R.; van Dusschoten, D.; Reichel, R.; Jahnke, S.; Koller, R. Non-invasive imaging of plant roots in different soils using magnetic resonance imaging (MRI). Plant Methods 2017, 13, 1–9. [Google Scholar] [CrossRef]
  129. Scheenen, T.W.J.; Vergeldt, F.J.; Heemskerk, A.M.; Van As, H. Intact plant magnetic resonance imaging to study dynamics in long-distance sap flow and flow-conducting surface area. Plant Physiol. 2007, 144, 1157–1165. [Google Scholar] [CrossRef]
  130. Meixner, M.; Tomasella, M.; Foerst, P.; Windt, C.W. A small-scale MRI scanner and complementary imaging method to visualize and quantify xylem embolism formation. N. Phytol. 2020, 226, 1517–1529. [Google Scholar] [CrossRef]
  131. Lambert, J.; Lampen, P.; von Bohlen, A.; Hergenroder, R. Two- and three-dimensional mapping of the iron distribution in the apoplastic fluid of plant leaf tissue by means of magnetic resonance imaging. Anal. Bioanal. Chem. 2006, 384, 231–236. [Google Scholar] [CrossRef]
  132. Windt, C.W.; Soltner, H.; van Dusschoten, D.; Blumler, P. A portable Halbach magnet that can be opened and closed without force: The NMR-CUFF. J. Magn. Reson. 2011, 208, 27–33. [Google Scholar] [CrossRef]
  133. Galieni, A.; D’Ascenzo, N.; Stagnari, F.; Pagnani, G.; Xie, Q.G.; Pisante, M. Past and Future of Plant Stress Detection: An Overview From Remote Sensing to Positron Emission Tomography. Front. Plant Sci. 2021, 11, 609155. [Google Scholar] [CrossRef]
  134. Hubeau, M.; Steppe, K. Plant-PET Scans: In Vivo Mapping of Xylem and Phloem Functioning. Trends Plant Sci. 2015, 20, 676–685. [Google Scholar] [CrossRef] [PubMed]
  135. Mincke, J.; Courtyn, J.; Vanhove, C.; Vandenberghe, S.; Steppe, K. Guide to Plant-PET Imaging Using (CO2)-C-11. Front. Plant Sci. 2021, 12, 602550. [Google Scholar] [CrossRef] [PubMed]
  136. Gao, T.; Zhu, F.Y.; Paul, P.; Sandhu, J.; Doku, H.A.; Sun, J.X.; Pan, Y.; Staswick, P.; Walia, H.; Yu, H.F. Novel 3D Imaging Systems for High-Throughput Phenotyping of Plants. Remote Sens. 2021, 13, 2113. [Google Scholar] [CrossRef]
  137. Wang, Y.J.; Wen, W.L.; Wu, S.; Wang, C.Y.; Yu, Z.T.; Guo, X.Y.; Zhao, C.J. Maize Plant Phenotyping: Comparing 3D Laser Scanning, Multi-View Stereo Reconstruction, and 3D Digitizing Estimates. Remote Sens. 2019, 11, 63. [Google Scholar] [CrossRef]
  138. Fuentes, S.; Palmer, A.R.; Taylor, D.; Zeppel, M.; Whitley, R.; Eamus, D. An automated procedure for estimating the leaf area index (LAI) of woodland ecosystems using digital imagery, MATLAB programming and its application to an examination of the relationship between remotely sensed and field measurements of LAI. Funct. Plant Biol. 2008, 35, 1070–1079. [Google Scholar] [CrossRef] [PubMed]
  139. Cabrera-Bosquet, L.; Fournier, C.; Brichet, N.; Welcker, C.; Suard, B.; Tardieu, F. High-throughput estimation of incident light, light interception and radiation-use efficiency of thousands of plants in a phenotyping platform. N. Phytol. 2016, 212, 269–281. [Google Scholar] [CrossRef]
  140. Lou, L.; Liu, Y.; Han, J.; Doonan, J.H. Accurate multi-view stereo 3D reconstruction for cost-effective plant phenotyping. In Proceedings of the Image Analysis and Recognition: 11th International Conference, ICIAR 2014, Vilamoura, Portugal, 22–24 October 2014; Springer: Berlin/Heidelberg, Germany, 2014; pp. 349–356. [Google Scholar]
  141. Rusu, R.B.; Cousins, S. 3d is here: Point cloud library (pcl). In Proceedings of the 2011 IEEE International Conference on Robotics and Automation, Shanghai, China, 9–13 May 2011; IEEE: New York, NY, USA, 2011; pp. 1–4. [Google Scholar]
  142. Qiu, Q.; Sun, N.; Bai, H.; Wang, N.; Fan, Z.; Wang, Y.; Meng, Z.; Li, B.; Cong, Y. Field-based high-throughput phenotyping for maize plant using 3D LiDAR point cloud generated with a “Phenomobile”. Front. Plant Sci. 2019, 10, 554. [Google Scholar] [CrossRef]
  143. Paturkar, A.; Sen Gupta, G.; Bailey, D. Making use of 3D models for plant physiognomic analysis: A review. Remote Sens. 2021, 13, 2232. [Google Scholar] [CrossRef]
  144. Elnashef, B.; Filin, S.; Lati, R.N. Tensor-based classification and segmentation of three-dimensional point clouds for organ-level plant phenotyping and growth analysis. Comput. Electron. Agric. 2019, 156, 51–61. [Google Scholar] [CrossRef]
  145. Paulus, S.; Dupuis, J.; Riedel, S.; Kuhlmann, H. Automated analysis of barley organs using 3D laser scanning: An approach for high throughput phenotyping. Sensors 2014, 14, 12670–12686. [Google Scholar] [CrossRef]
  146. Paulus, S. Measuring crops in 3D: Using geometry for plant phenotyping. Plant Methods 2019, 15, 1–13. [Google Scholar] [CrossRef] [PubMed]
  147. Taiz, L.; Zeiger, E.; Møller, I.M.; Murphy, A. Plant Physiology and Development; Sinauer Associates: Sunderland, MA, USA, 2015. [Google Scholar]
  148. Alscher, R.G.; Cumming, J.R. Stress Responses in Plants: Adaptation and Acclimation Mechanisms; Wiley-Liss: Hoboken, NJ, USA, 1990. [Google Scholar]
  149. Liu, H.; Lee, S.-H.; Chahl, J.S. Registration of multispectral 3D points for plant inspection. Precis. Agric. 2018, 19, 513–536. [Google Scholar] [CrossRef]
  150. Sun, G.; Wang, X.; Sun, Y.; Ding, Y.; Lu, W. Measurement method based on multispectral three-dimensional imaging for the chlorophyll contents of greenhouse tomato plants. Sensors 2019, 19, 3345. [Google Scholar] [CrossRef] [PubMed]
  151. Chebrolu, N.; Läbe, T.; Stachniss, C. Spatio-temporal non-rigid registration of 3d point clouds of plants. In Proceedings of the 2020 IEEE International Conference on Robotics and Automation (ICRA), Online, 31 May–31 August 2020; IEEE: New York, NY, USA, 2020; pp. 3112–3118. [Google Scholar]
  152. Schunck, D.; Magistri, F.; Rosu, R.A.; Cornelißen, A.; Chebrolu, N.; Paulus, S.; Léon, J.; Behnke, S.; Stachniss, C.; Kuhlmann, H. Pheno4D: A spatio-temporal dataset of maize and tomato plant point clouds for phenotyping and advanced plant analysis. PLoS ONE 2021, 16, e0256340. [Google Scholar] [CrossRef]
  153. Van As, H. MRI of water transport in intact plants: Characteristics and dynamics. Comp. Biochem. Physiol. A-Mol. Integr. Physiol. 2006, 143, S42. [Google Scholar]
  154. Zwieniecki, M.A.; Melcher, P.J.; Ahrens, E.T. Analysis of spatial and temporal dynamics of xylem refilling in Acer rubrum L. using magnetic resonance imaging. Front. Plant Sci. 2013, 4, 265. [Google Scholar] [CrossRef]
  155. Cozzolino, D. Use of Infrared Spectroscopy for In-Field Measurement and Phenotyping of Plant Properties: Instrumentation, Data Analysis, and Examples. Appl. Spectrosc. Rev. 2014, 49, 564–584. [Google Scholar] [CrossRef]
  156. Jansen, M.; Gilmer, F.; Biskup, B.; Nagel, K.A.; Rascher, U.; Fischbach, A.; Briem, S.; Dreissen, G.; Tittmann, S.; Braun, S.; et al. Simultaneous phenotyping of leaf growth and chlorophyll fluorescence via GROWSCREEN FLUORO allows detection of stress tolerance in Arabidopsis thaliana and other rosette plants. Funct. Plant Biol. 2009, 36, 902–914. [Google Scholar] [CrossRef]
  157. Nakhle, F.; Harfouche, A.L. Ready, Steady, Go AI: A practical tutorial on fundamentals of artificial intelligence and its applications in phenomics image analysis. Patterns 2021, 2, 100323. [Google Scholar] [CrossRef]
  158. Gehan, M.A.; Fahlgren, N.; Abbasi, A.; Berry, J.C.; Callen, S.T.; Chavez, L.; Doust, A.N.; Feldman, M.J.; Gilbert, K.B.; Hodge, J.G.; et al. PlantCV v2: Image analysis software for high-throughput plant phenotyping. PeerJ 2017, 5, e4088. [Google Scholar] [CrossRef]
  159. Koh, J.C.O.; Spangenberg, G.; Kant, S. Automated Machine Learning for High-Throughput Image-Based Plant Phenotyping. Remote Sens. 2021, 13, 858. [Google Scholar] [CrossRef]
  160. Choudhury, S.D.; Samal, A.; Awada, T. Leveraging Image Analysis for High-Throughput Plant Phenotyping. Front. Plant Sci. 2019, 10, 508. [Google Scholar] [CrossRef] [PubMed]
Figure 1. Data-acquisition process for non-destructive optical-based plant stress phenotyping, involving 1D spectroscopy, 2D imaging, and 3D phenotyping. One-dimensional spectroscopy techniques include Visible-IR and ChlF spectroscopy; two-dimensional imaging techniques include visible, multispectral, hyperspectral, IR, and thermal-IR imaging; three-dimensional phenotyping techniques include LiDAR, X-ray CT, MRI, and PET. The corresponding acquired raw data and data types are also shown in the figure.
Figure 2. One-dimensional reflectance spectral data processing pipeline. After data collection, preprocessing removes irrelevant information and background noise from the results. Model calibration finds the correlation between the samples' properties and their absorbance, tests the model's fitness, and links the sample attributes to the preprocessed measured spectra. Model validation predicts the properties of unknown samples from their spectral signals using the calibrated model and evaluates the model's accuracy.
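The calibration–validation pipeline of Figure 2 can be sketched numerically. The snippet below is an illustrative, self-contained example, not the authors' implementation: the function names are assumptions, and standard normal variate (SNV) preprocessing with ordinary least squares stands in for the chemometric models (e.g., PLS or PCR) typically used in practice. It preprocesses synthetic "spectra", calibrates on one subset, and validates on a held-out subset.

```python
import numpy as np

def snv(spectra):
    """Standard Normal Variate: center and scale each spectrum to
    suppress baseline offsets and multiplicative scatter effects."""
    spectra = np.asarray(spectra, dtype=float)
    mean = spectra.mean(axis=1, keepdims=True)
    std = spectra.std(axis=1, keepdims=True)
    return (spectra - mean) / std

def calibrate(X, y):
    """Model calibration: ordinary least squares with an intercept,
    a simple stand-in for PLS/PCR calibration on preprocessed spectra."""
    A = np.hstack([X, np.ones((X.shape[0], 1))])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def predict(X, coef):
    """Apply the calibrated model to (preprocessed) spectra."""
    return np.hstack([X, np.ones((X.shape[0], 1))]) @ coef

# Synthetic demo: 40 "spectra" over 10 bands whose target property is a
# weighted band sum, corrupted by per-sample gain and offset artifacts.
rng = np.random.default_rng(0)
clean = rng.normal(size=(40, 10))
y = clean @ rng.normal(size=10)
raw = clean * rng.uniform(0.8, 1.2, (40, 1)) + rng.uniform(-1.0, 1.0, (40, 1))

X = snv(raw)                               # preprocessing
coef = calibrate(X[:30], y[:30])           # calibration set
rmse = np.sqrt(np.mean((predict(X[30:], coef) - y[30:]) ** 2))  # validation set
```

The held-out root-mean-square error plays the role of the "model accuracy" evaluated in the validation step.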
Figure 3. A typical chlorophyll fluorescence kinetic curve used to measure the leaf’s photochemical and non-photochemical parameters. The measuring beam is light too weak to drive photosynthesis but strong enough to elicit chlorophyll fluorescence. Actinic light is light sufficient to drive photosynthesis. A pulse is a saturating flash that transiently closes all PS-II reaction centers but is short enough that no increase in non-photochemical quenching occurs. (Reprinted with permission from Ref. [62]).
Figure 4. Two-dimensional image processing pipeline based on traditional machine learning (TML) and deep learning (DL). The TML workflow comprises data preprocessing, feature extraction, model selection, and training, yielding a well-trained model that satisfies the accuracy requirement. The DL workflow comprises dataset preparation and preprocessing, model selection, and training, likewise yielding a well-trained model.
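The TML branch of Figure 4 can be sketched end to end. The example below is a minimal illustration under stated assumptions: the hand-crafted color features and the nearest-centroid classifier are hypothetical choices standing in for whichever feature set and ML model a study would actually select. It extracts features from synthetic "leaf" patches, trains on half of them, and evaluates on the rest.

```python
import numpy as np

def extract_features(img):
    """Hand-crafted features for the TML branch: per-channel mean and
    std plus a simple greenness index (an illustrative choice only)."""
    img = img.astype(float)
    means = img.mean(axis=(0, 1))
    stds = img.std(axis=(0, 1))
    greenness = (2 * img[..., 1] - img[..., 0] - img[..., 2]).mean()
    return np.concatenate([means, stds, [greenness]])

class NearestCentroid:
    """Minimal classifier standing in for the 'choose the ML model'
    and 'train the model' steps of the pipeline."""
    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.centroids_ = np.array([X[y == c].mean(axis=0) for c in self.classes_])
        return self

    def predict(self, X):
        d = np.linalg.norm(X[:, None, :] - self.centroids_[None, :, :], axis=2)
        return self.classes_[d.argmin(axis=1)]

# Synthetic "healthy vs. stressed" leaf patches: stressed patches shift
# from green toward red, loosely mimicking chlorosis.
rng = np.random.default_rng(1)
healthy = [rng.integers(0, 60, (8, 8, 3)) + np.array([0, 120, 0]) for _ in range(20)]
stressed = [rng.integers(0, 60, (8, 8, 3)) + np.array([90, 60, 0]) for _ in range(20)]
X = np.array([extract_features(im) for im in healthy + stressed])
y = np.array([0] * 20 + [1] * 20)

model = NearestCentroid().fit(X[::2], y[::2])     # train on every other sample
acc = (model.predict(X[1::2]) == y[1::2]).mean()  # held-out accuracy
```

In the DL branch of the figure, feature extraction is learned by the network itself, so only the dataset preparation and training steps remain explicit.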
Figure 5. Three-dimensional phenotyping data processing pipeline. The data processing flow includes data collection, data preprocessing, point cloud creation and organ segmentation, and derived-trait extraction.
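The pipeline of Figure 5 can be illustrated with a toy point cloud. Everything below is a simplified sketch: the function names, the height-threshold segmentation rule, and the bounding-box area trait are assumptions for illustration, not methods from any cited system. It performs ground removal as preprocessing, a naive stem/canopy segmentation, and extraction of two derived traits.

```python
import numpy as np

def preprocess(points, z_ground=0.0):
    """Simplified preprocessing: drop ground returns below z_ground
    (real pipelines also denoise, register, and downsample scans)."""
    return points[points[:, 2] > z_ground]

def segment_by_height(points, split=0.3):
    """Naive organ segmentation: points in the lowest `split` fraction
    of the plant's height are labeled 'stem', the rest 'canopy'."""
    z = points[:, 2]
    cut = z.min() + split * (z.max() - z.min())
    return np.where(z < cut, "stem", "canopy")

def extract_traits(points):
    """Derived traits: plant height and the axis-aligned bounding-box
    area of the xy footprint (a crude projected-area proxy)."""
    height = points[:, 2].max() - points[:, 2].min()
    span = points[:, :2].max(axis=0) - points[:, :2].min(axis=0)
    return {"height": float(height), "projected_area": float(span[0] * span[1])}

# Toy plant scan: a ground plane, a thin vertical stem, and a canopy blob.
rng = np.random.default_rng(2)
cloud = np.vstack([
    rng.uniform([-0.5, -0.5, 0.00], [0.5, 0.5, 0.02], (200, 3)),    # ground
    rng.uniform([-0.02, -0.02, 0.0], [0.02, 0.02, 1.0], (100, 3)),  # stem
    rng.uniform([-0.3, -0.3, 0.7], [0.3, 0.3, 1.2], (300, 3)),      # canopy
])

pts = preprocess(cloud, z_ground=0.05)
labels = segment_by_height(pts)
traits = extract_traits(pts)
```

Production pipelines replace each of these stand-ins with far richer methods, e.g., statistical outlier removal, deep point-cloud segmentation, and convex-hull or voxel-based trait estimators.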
Table 1. ChlF parameters measured from the typical chlorophyll fluorescence kinetic curve.

| Parameter | Measurement and Calculation | Description |
| --- | --- | --- |
| **Preliminary parameters** | | |
| Fo | Switch on the measuring light and record Fo. | Minimal fluorescence of chlorophyll a in dark-adapted leaves, indicating the baseline fluorescence of the sample. |
| Fm | Apply a saturating pulse, which induces Fm. | Maximal fluorescence of chlorophyll a in dark-adapted leaves. |
| F′ | Switch on the actinic light; fluorescence rises initially and then quenches as photochemical and non-photochemical processes increasingly compete (the light-adapted state). | Chlorophyll fluorescence yield in the light-adapted state in the presence of actinic light. |
| Fm′ | Apply a saturating pulse in the light-adapted state and record Fm′. | Maximal fluorescence of chlorophyll a in light-adapted leaves. |
| Fo′ | Switch off the actinic light and measure immediately, recording Fo′; because accurate measurement is difficult, an alternative approach is to calculate Fo′. | Minimal fluorescence of chlorophyll a in light-adapted leaves. |
| **Deduced parameters** | | |
| Fv | Fv = Fm − Fo; the difference between Fm and Fo is the variable fluorescence. | Related to the maximum quantum yield of PS-II, reflecting the fraction of reaction centers that are open and photochemically active. |
| Fv/Fm | Fv/Fm = (Fm − Fo)/Fm; in unstressed leaves this is a consistent value of roughly 0.83. | Maximum photochemical efficiency of PS-II, used as an indicator of stress or damage to the photosynthetic system. |
| ΦPSII | ΦPSII = (Fm′ − F′)/Fm′; this parameter needs no dark-adapted measurement, so it is a commonly measured light-adapted parameter. | Operating efficiency of PS-II photochemistry. |
| Fq′/Fv′ | Fq′/Fv′ = (Fm′ − F′)/Fv′; this parameter also needs no dark-adapted measurement. | Reflects the level of photoprotective quenching of fluorescence and indicates the onset of photoinhibition. |
| NPQ | NPQ = (Fm − Fm′)/Fm′, also calculated as (Fm/Fm′) − 1. | Non-photochemical quenching coefficient, which evaluates the rate constant for heat loss from PS-II. |

PS-II: Photosystem II.
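The deduced parameters in Table 1 follow directly from the four measured quantities. A small helper (hypothetical name; the Fo′ estimate uses the widely cited Oxborough–Baker approximation, since direct Fo′ measurement is difficult) might look like:

```python
def chlf_parameters(Fo, Fm, F_prime, Fm_prime):
    """Compute the deduced ChlF parameters of Table 1 from the four
    preliminary measurements (dark-adapted Fo/Fm, light-adapted F'/Fm')."""
    Fv = Fm - Fo                                # variable fluorescence
    Fv_Fm = Fv / Fm                             # max PS-II efficiency
    phi_PSII = (Fm_prime - F_prime) / Fm_prime  # operating efficiency
    NPQ = Fm / Fm_prime - 1.0                   # non-photochemical quenching
    # Fo' is hard to measure directly; a common estimate (assumed here,
    # after Oxborough & Baker) is Fo' = Fo / (Fv/Fm + Fo/Fm').
    Fo_prime = Fo / (Fv / Fm + Fo / Fm_prime)
    Fq_Fv = (Fm_prime - F_prime) / (Fm_prime - Fo_prime)
    return {"Fv": Fv, "Fv/Fm": Fv_Fm, "PhiPSII": phi_PSII,
            "NPQ": NPQ, "Fo'": Fo_prime, "Fq'/Fv'": Fq_Fv}

# Example with plausible values: Fv/Fm = 0.8, near the ~0.83 typical of
# unstressed leaves, with partial quenching under actinic light.
params = chlf_parameters(Fo=0.2, Fm=1.0, F_prime=0.4, Fm_prime=0.6)
```

With these inputs, ΦPSII = (0.6 − 0.4)/0.6 ≈ 0.33 and NPQ = 1/0.6 − 1 ≈ 0.67, illustrating how the same raw curve yields both photochemical and non-photochemical indicators.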
Share and Cite

Ye, D.; Wu, L.; Li, X.; Atoba, T.O.; Wu, W.; Weng, H. A Synthetic Review of Various Dimensions of Non-Destructive Plant Stress Phenotyping. Plants 2023, 12, 1698. https://doi.org/10.3390/plants12081698
