Review

Image-Based High-Throughput Phenotyping in Horticultural Crops

Department of Agricultural Biotechnology, National Institute of Agricultural Science, Rural Development Administration, Jeonju 54874, Republic of Korea
*
Author to whom correspondence should be addressed.
Plants 2023, 12(10), 2061; https://doi.org/10.3390/plants12102061
Submission received: 12 April 2023 / Revised: 12 May 2023 / Accepted: 18 May 2023 / Published: 22 May 2023
(This article belongs to the Special Issue Plant Phenomics for Precision Agriculture)

Abstract
Plant phenotyping is the primary task of any plant breeding program, and accurate measurement of plant traits is essential to select genotypes with better quality, high yield, and climate resilience. The majority of currently used phenotyping techniques are destructive and time-consuming. Recently, the development of various sensors and imaging platforms for rapid and efficient quantitative measurement of plant traits has become the mainstream approach in plant phenotyping studies. Here, we reviewed the trends of image-based high-throughput phenotyping methods applied to horticultural crops. High-throughput phenotyping is carried out using various types of imaging platforms developed for indoor or field conditions. We highlighted the applications of different imaging platforms in the horticulture sector with their advantages and limitations. Furthermore, the principles and applications of commonly used imaging techniques for high-throughput plant phenotyping, visible light (RGB) imaging, thermal imaging, chlorophyll fluorescence, hyperspectral imaging, and tomographic imaging, are discussed. High-throughput phenotyping has been widely used to phenotype various horticultural traits, including morphological, physiological, biochemical, and yield traits as well as biotic and abiotic stress responses. Moreover, high-throughput phenotyping with various optical sensors will lead to the discovery of new phenotypic traits, which remain to be explored in the future. We summarized the applications of image analysis for the quantitative evaluation of various traits with several examples of horticultural crops in the literature. Finally, we summarized the current trend of high-throughput phenotyping in horticultural crops and highlighted future perspectives.

1. Introduction

The world population keeps increasing and is expected to reach ten billion by 2050, and so will the demand for food and energy. This underscores the need to maximize the yield and quality of food crops and to reduce postharvest losses. Breeding for high yield, better quality, and resistance to biotic (disease, pest, weed) and abiotic (drought, salt, heat, cold) stresses should be the priority to meet the projected food demand. Plant phenotyping is the core of any plant breeding program, and accurate measurement of plant traits is essential for the selection of the best genotypes. A phenotype is the result of the interactions between the genotype and all the surrounding environmental conditions during the plant growth cycle, whereas phenotyping refers to the measurement of any aspect of plant traits, including growth, development, and physiology [1]. Plant phenomics is the high-throughput collection and analysis of multidimensional phenotypes of the whole plant throughout its life span [2,3]. The advancement of next-generation sequencing and marker technology has accelerated genomic studies, allowing the mapping and identification of genes controlling complex traits [4]. However, phenomic information is not adequately available because environmental effects and the lack of accurate measurements limit the phenotypic characterization of crop traits.
Conventional phenotyping has been the bottleneck for breeding for a long time as it is labor-intensive, time-consuming, and does not have high throughput [5]. The recent introduction of high-throughput phenotyping methods is accelerating plant phenotyping, enabling high-throughput measurement of several phenotypic data nondestructively and objectively [3,6]. Furthermore, high-throughput phenotyping with the help of optical sensors, computer vision, and robotics will bring new traits under consideration which were difficult to measure via the conventional method.
High-throughput phenotyping platforms can image hundreds of plants daily using various types of optical sensors, allowing the measurement of morphological, physiological, biochemical, and performance traits in a non-destructive way [7,8,9,10]. The principle of image-based phenotyping is based on the interaction of electromagnetic radiation and the plant surface, including absorption, reflection, emission, transmission, and fluorescence, which differ between normal and stressed plants or among genotypes [9]. These interactions will help to estimate various types of phenotypic traits of the plant with the help of optical sensors. Image-based high-throughput phenotyping aims to quantify numerous traits, which requires the use of various types of optical sensors. Some of the currently available sensors for plant phenotyping include visible light (red–green–blue), thermal, fluorescence, hyperspectral, multispectral, light detection and ranging (LiDAR), magnetic resonance (MRI) imaging, X-ray computed tomography (X-ray CT), and positron emission tomography (PET) [3,11,12]. The applications of various types of sensors may differ depending on imaging platforms, accessibility, cost, and the target trait under consideration. The different sensors can be used separately or in combination for fast and accurate plant phenotyping, each of which comes with its own advantages and limitations.
High-throughput phenotyping is used in breeding, crop cultivation, and even postharvest handling, depending on the purpose of phenotyping. In plant breeding, phenotyping a large number of samples (a population) aims to increase selection intensity and accuracy and to characterize various crop traits to select the best genotypes, while phenotyping in crop cultivation is used to monitor the occurrence of plant stresses such as disease, pests, nutrient stress, weeds, or abiotic stress at early stages [1,8]. Real-time phenotypic data acquisition and analysis help to make immediate management decisions for the crop. Hence, image-based phenotyping will play a pivotal role in the precision cultivation of horticultural crops. Horticultural crops such as vegetables and fruits are mostly utilized in the fresh state and are highly perishable due to their high water content. The market value of horticultural products is highly dependent on external (color, shape, size, texture) and internal (soluble solids content, firmness) quality attributes. These quality attributes change over time during maturity, ripening, and postharvest storage and should be routinely monitored [13,14]. Currently, external quality attributes are mostly evaluated by visual inspection in the horticulture chain, which is slow and subjective. Internal quality attributes, on the other hand, are quantified using destructive laboratory analysis or handheld/portable instruments, which are also limited in speed and sample size. Given the highly perishable nature of horticultural products and the dynamics of their quality attributes over time, image-based phenotyping will greatly improve the speed, volume, and accuracy of postharvest phenotyping.
In this paper, we reviewed the applications of image-based phenotyping for the assessment of various traits in horticultural crops. Commonly used imaging platforms and sensors for high-throughput phenotyping are described. The application of these technologies for phenotyping various traits of horticultural crops is discussed with several examples in the literature. Finally, the current trends and future perspectives of high-throughput phenotyping in horticultural crops are highlighted. Using multiscale imaging platforms equipped with state-of-the-art imaging technologies will enable rapid and accurate quantitative measurement of diverse plant phenotypic traits to accelerate crop improvement, precision agriculture, and objective postharvest phenotyping (Figure 1).

2. High-Throughput Phenotyping Platforms

High-throughput phenotyping is carried out using various types of imaging platforms. The suitability of a platform depends on the imaging environment, e.g., laboratory, growth chamber, greenhouse, or field. In most controlled-environment conditions, imaging is carried out either by moving the sensor towards the plant (sensor movement type) or by transporting the plant to a fixed imaging setup using a conveyor belt or other transport method (plant movement type) for routine phenotypic measurement. For example, a greenhouse-based sensor-to-plant platform was used to measure static and dynamic traits such as geometric, structural, color, and textural phenotypes of lettuce [15]. Image-based phenotyping in controlled environments offers high precision, high repeatability, continuous automated operation, and freedom from interference by external environmental conditions. However, such systems are generally expensive and can monitor only a limited number of samples. Conveyor-type and benchtop-type platforms are the most widely used and well-established systems for controlled-environment conditions [6].
Imaging platforms for field conditions target phenotyping of plant characteristics at the individual-plant or area level and are grouped into ground-based and aerial-based platforms according to the structure on which the sensor is mounted. Ground-based imaging platforms, such as pole/tower-based, mobile (vehicle), gantry-based, and cable-suspended systems, are flexible to deploy and offer good spatial resolution. However, they are subject to varying environmental conditions because they cover a large field slowly. Aerial-based imaging platforms include unmanned aerial platforms, manned aerial platforms, and satellites. These platforms can cover a wide area in a short period of time and are thus less affected by varying environmental conditions. Their disadvantages are a limited payload and an image spatial resolution that depends on the speed and altitude of the aerial structure [6,9]. Unmanned aerial vehicle (UAV) platforms have been used to measure various traits in horticultural crops. For example, UAV-based remote sensing coupled with different machine learning approaches has been used for disease detection and classification in potato, tomato, banana, pear, and apple [16,17,18,19,20,21,22], for tree detection in orchards such as banana and citrus [23,24,25], for aboveground biomass estimation in onion, potato, tomato, and strawberry [26,27,28,29], and for other traits of fruits and vegetables [23,30,31].

3. Commonly Used Imaging Techniques for High-Throughput Plant Phenotyping

3.1. Visible Light Imaging

Visible light sensors detect light in a wavelength spectrum of 400–700 nm and provide reflected values of red, green, and blue (RGB). Visible light imaging is widely used for high-throughput phenotyping because of its accessibility, simplicity, and low cost [32]. High-resolution RGB images can be used to accurately measure plant biomass [28,33,34,35,36,37], root architecture [38,39], plant growth rate [40,41,42,43], germination rate [44], yield [45,46,47], disease detection and quantification [17,48,49,50], and abiotic stresses [51]. Their application in the field can be hampered by low color contrast between leaves and background and by variable illumination, which complicate automatic image processing [32].
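As an illustration of RGB-based trait extraction, projected leaf area from a top-view image can be estimated with a simple greenness index. The sketch below uses the Excess Green index (ExG = 2g − r − b on chromatic coordinates), a common choice in RGB phenotyping; the index choice and the threshold of 0.1 are illustrative assumptions, not values from this review:

```python
import numpy as np

def excess_green(rgb):
    """Excess Green index (ExG = 2g - r - b) per pixel.

    rgb: float array of shape (H, W, 3) with values in [0, 1].
    Returns an (H, W) array in which vegetation pixels score high.
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    total = r + g + b
    total[total == 0] = 1e-9                    # avoid division by zero
    r, g, b = r / total, g / total, b / total   # chromatic coordinates
    return 2 * g - r - b

def projected_leaf_area(rgb, threshold=0.1):
    """Fraction of pixels classified as plant (ExG above threshold),
    a common proxy for shoot biomass in top-view RGB phenotyping."""
    return float(np.mean(excess_green(rgb) > threshold))
```

In practice, the threshold would be tuned per camera and lighting setup, which is exactly the sensitivity to illumination noted above.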

3.2. Thermal Imaging

Thermal infrared imaging visualizes the distribution of infrared radiation over a leaf or plant surface. A thermal camera converts infrared radiation (heat) emitted from the object into visible images showing the spatial distribution of surface temperature. The thermal sensor records the radiation emitted by the object in the thermal range of 3–5 μm or 7–14 μm, producing an image with a temperature value per pixel. Thermal imaging can be used to detect the physiological status of the plant in response to biotic and abiotic stress, such as canopy or leaf temperature [52], transpiration and stomatal conductance [53], and plant water status [9,11]. Under water deficit conditions, plants close their stomata and reduce water loss through transpiration, which is also highly linked with the soil moisture content. The reduction in transpirational cooling results in increased leaf temperature. Hence, thermal imaging can be used to manage water and irrigation in precision agriculture [54].
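The link between stomatal closure and elevated leaf temperature is often summarized by the Crop Water Stress Index (CWSI), computed from canopy temperature together with wet and dry reference temperatures. A minimal sketch; obtaining the reference temperatures (from reference surfaces or energy-balance models) is assumed and outside this review's scope:

```python
def crop_water_stress_index(t_canopy, t_wet, t_dry):
    """CWSI = (Tcanopy - Twet) / (Tdry - Twet), clamped to [0, 1].

    t_wet: temperature of a fully transpiring reference surface (deg C).
    t_dry: temperature of a non-transpiring reference surface (deg C).
    0 indicates a well-watered canopy, 1 a fully stressed one.
    """
    if t_dry <= t_wet:
        raise ValueError("t_dry must exceed t_wet")
    cwsi = (t_canopy - t_wet) / (t_dry - t_wet)
    return min(max(cwsi, 0.0), 1.0)
```

Applied per pixel of a thermal image, this yields a stress map that can drive irrigation scheduling.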

3.3. Hyperspectral Imaging

Hyperspectral imaging captures electromagnetic spectra (λ) and spatial (x, y) data at every pixel in an image to reconstruct a 3D data matrix called a hypercube, containing thousands of images in the spectral range of 250–2500 nm, encompassing UV, VIS, NIR, and SWIR [55]. It offers a large amount of information, allowing the extraction of a wide range of phenotypic traits, although the storage and analysis of the vast amount of hyperspectral data is challenging. Some of its applications include the estimation of nutrient content, disease detection [16,56,57,58], fruit maturity and ripening [59,60], and other physiological and biochemical traits used to infer plant growth, development, and yield [55].
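Working with a hypercube typically means selecting the band planes nearest to target wavelengths and combining them into spectral indices. A minimal sketch, assuming a NumPy hypercube of shape (H, W, bands) and using the standard NDVI formulation as the example index (not specific to this review):

```python
import numpy as np

def band(cube, wavelengths, target_nm):
    """Return the image plane closest to target_nm from a hypercube
    of shape (H, W, bands) with a matching list of band wavelengths."""
    idx = int(np.argmin(np.abs(np.asarray(wavelengths) - target_nm)))
    return cube[..., idx].astype(float)

def ndvi_map(cube, wavelengths):
    """Per-pixel NDVI = (NIR - Red) / (NIR + Red),
    using the bands nearest ~800 nm (NIR) and ~670 nm (red)."""
    nir = band(cube, wavelengths, 800)
    red = band(cube, wavelengths, 670)
    return (nir - red) / (nir + red + 1e-9)   # epsilon avoids 0/0
```

Other indices mentioned in this review (e.g., anthocyanin-related indices) follow the same band-selection pattern with different wavelengths.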

3.4. Fluorescence Imaging

The light energy absorbed by chlorophyll can be used for photosynthesis, dissipated as heat, or re-emitted. Fluorescence is the light re-emitted, mainly by the chlorophyll complex, after the plant absorbs radiation of shorter wavelengths; it is very small (<3%) compared with the total radiation reaching the object from the light sources. The amount of re-emitted light (fluorescence) is a good indicator of the plant’s ability to utilize the absorbed light and is used to estimate the overall plant health status [61]. Fluorescence imaging is used to estimate photosynthetic efficiency and other associated metabolic processes of the plant affected by biotic and abiotic stresses [62,63,64,65]. Plants under stress show an altered fluorescence pattern compared with non-stressed plants. Sensors sensitive to fluorescence capture fluorescence signals after illumination of the plant or tissue with visible, infrared, or UV light. Maximum quantum efficiency (Fv/Fm), non-photochemical quenching (energy dissipated as heat from the photosynthetic reaction center), the effective quantum yield of PSII (ΦPSII or Fq′/Fm′), and the relative electron transport rate are some of the parameters derived from chlorophyll fluorescence that are used to assess the physiological status of the plant under different stress conditions, where Fm is the maximum fluorescence of a dark-adapted leaf and Fv is the difference between Fm and the minimum fluorescence of a dark-adapted leaf (F0) [10]. The limitation of fluorescence imaging under field conditions is that it does not specify the cause of signal changes in the plant, e.g., light, temperature, or other environmental factors [11,61].
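Once F0, Fm, Fq′, and Fm′ have been extracted from the fluorescence images, the parameters above reduce to simple ratios, following the definitions given in the text:

```python
def fv_fm(f0, fm):
    """Maximum quantum efficiency of PSII from a dark-adapted leaf:
    Fv/Fm = (Fm - F0) / Fm, where Fv = Fm - F0."""
    return (fm - f0) / fm

def phi_psii(fq_prime, fm_prime):
    """Effective quantum yield of PSII in the light: Fq'/Fm'."""
    return fq_prime / fm_prime
```

Computed per pixel, these ratios yield spatial maps of photosynthetic performance across a leaf or canopy.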

3.5. Tomographic Imaging

Other imaging techniques, such as magnetic resonance imaging (MRI), X-ray computed tomography (X-ray CT), and positron emission tomography (PET), provide high-resolution 3D images of a single plant or plant parts [66]. MRI captures 3D images of the internal structures of the sample, enabling non-invasive quantification of both static and dynamic traits such as structural, biochemical, and temporal changes inside the plant. MRI can be used to monitor changes in growth and development (seed and bulb germination, seed development, fruit growth, and root growth), water dynamics within the plant, abiotic stress responses (drought, salt, cold, and heat stress), and plant–pathogen interactions [67]. X-ray CT is used to visualize the 3D structures of internal and external features of the plant at the micro or macro level. When the X-ray beam passes through the sample, part of the beam is absorbed, and the transmitted portion is recorded by the detector as a radiograph. Multiple 2D projections are recorded by moving the sample or the sensor, which are then used to reconstruct the 3D images [68]. For example, X-ray CT has been used for the characterization of size- and shape-related morphological traits of seeds and fruits [69,70]. These imaging techniques are time-consuming and are not suitable when a large number of samples is under consideration. In addition, due to the large size and heavy weight of the equipment, they are not usable on aerial imaging platforms [55]. Examples of images from commonly used sensors in high-throughput phenotyping are shown in Figure 2.
Imaging technology is the primary component of high-throughput plant phenotyping, as the acquired phenotypic traits are determined by the type of sensor (imaging technique). Visible light (RGB) imaging and multi/hyperspectral imaging techniques are widely used to acquire morphological, physiological, biochemical, and biotic and abiotic stress-related traits. Fluorescence imaging and thermal imaging are used to capture the photosynthetic activity and surface temperature of the plant, respectively, which are physiological traits. LiDAR, X-ray CT, and MRI are mainly used to acquire morphological traits [3]. Different imaging techniques come with specific advantages and disadvantages for capturing certain plant traits. A summary of imaging techniques and measurable phenotypic traits with potential applications in high-throughput plant phenotyping is presented in Table 1. Among the variety of commercially available sensors for different imaging techniques, the choice depends on the cost, robustness, and other specifications of the sensor with respect to the target trait [74]. Examples of different sensors used for high-throughput phenotyping of some horticultural crops are presented in Table 2. High-throughput phenotyping will benefit from the increasing capabilities and advancements of sensor technologies.

4. Applications of Image-Based High-Throughput Phenotyping in Horticultural Crops

4.1. Measurement of Morphological Traits

The morphological traits, including color, size, shape, and surface texture, determine the appearance of the produce and are used as quality parameters for visual inspection of horticultural crops. Although visual evaluation is a widely used nondestructive method for grading and sorting in horticultural crops, utilization of high-throughput phenotyping platforms is essential to obtain robust, faster, and objective results [14]. Nowadays, quantitative measurement of these traits (color, size, shape, and surface texture) using image analysis is increasingly used in different horticultural crops [13,84,85,86,87].
For example, grape berry color is a very important trait in grape breeding, which is qualitatively classified into six classes (green–yellow, rose, red, grey, dark red–violet, or blue–black) according to the International Organization of Vine and Wine [88], or simply as noir (red, blue, or black) and non-noir (green or white). However, qualitative assessment makes it difficult to differentiate genotypes within the noir and non-noir groups. Image-based phenotyping using different color spaces, RGB (red–green–blue), L*a*b (lightness, red–green, blue–yellow), and HSI (hue, saturation, intensity), allows easy discrimination of grape berry genotypes with different colors. RGB and HSI are able to separate genotypes within the noir and non-noir groups and enabled the identification of minor QTLs controlling grape berry color that were not identified previously using qualitative evaluation [89]. Quantitative measurement of strawberry fruit shape based on elliptic Fourier descriptors (EFDs) [90] and image analysis allowed the identification of two QTLs for shape via a genome-wide association study [84]. The fruit shape was highly correlated with the fruit length-to-width ratio.
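As a toy illustration of hue-based color phenotyping, noir and non-noir berries can be separated from segmented berry pixels with the standard library's `colorsys`; the green-hue band used below is an illustrative assumption, not a threshold from [89]:

```python
import colorsys

def mean_berry_hue(pixels):
    """Average hue (degrees, 0-360) of RGB pixels in [0, 1],
    e.g. pixels segmented from a berry image. Hue near 120 deg
    indicates green; hues near 0/360 deg indicate red tones."""
    hues = [colorsys.rgb_to_hsv(r, g, b)[0] * 360 for r, g, b in pixels]
    return sum(hues) / len(hues)

def classify_noir(pixels, green_band=(60, 180)):
    """Crude noir / non-noir call from mean hue: hues inside the
    (assumed) green band count as non-noir (green/white berries)."""
    h = mean_berry_hue(pixels)
    return "non-noir" if green_band[0] <= h <= green_band[1] else "noir"
```

A real pipeline would average hue circularly and calibrate the band per cultivar and camera, but the band-selection idea is the same.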
The application of computer vision for shape quantification using images of sweet potatoes has shown that shape features such as length-to-width ratio, curvature, cross-sectional roundness, and cross-sectional diameters are highly predictive of shape classes. A neural network-based shape classifier was able to predict marketable (high market value) and non-marketable sweet potato classes with 84.59% accuracy [13]. In most food industries, quality is mainly assessed by experts based on subjective evaluation, which is slow and inconsistent. The application of image-based phenotyping in the food industry is therefore important for fast, reliable, and objective evaluation. The browning of apple slices was quantified using the L*a*b color space and textural features (entropy, contrast, and homogeneity) from RGB images taken over time, showing that cv. Golden Delicious browned less than Honey Crispy and Granny Smith [87].
In most horticultural crops, color, size, and texture are used as indicators of maturity and ripening. The maturity and ripening of plum and banana fruits were estimated based on these features using image analysis in which color was the dominant feature for the classification of maturity and ripening levels [91,92]. In general, color, size, shape, and texture are used to evaluate the external qualities of horticultural crops that greatly affect the market value of the produce. The applications of these quality attributes for the assessment of external qualities of horticultural crops based on hyperspectral imaging were previously reviewed [14].

4.2. Measurement of Physiological Traits

Physiological traits indicate the processes that occur within the plant, such as photosynthesis, transpiration, and canopy temperature. These traits determine how the plant is functioning under certain environmental conditions and are used to characterize the plant response to biotic and abiotic stresses, plant growth, and plant development [3]. Physiological traits can be quantified using various types of sensors, including RGB, ChlF, multispectral/hyperspectral, and thermal.
In horticultural crops, physiological processes continue after harvesting (postharvest physiology). Postharvest physiology deals with the response of horticultural produce during postharvest storage and handling along the processing or marketing chain. It determines the ripening, shelf life, and quality of the produce. Due to their highly perishable nature, the quality and shelf life of horticultural crops are dependent on pre- and postharvest handling and storage conditions [14]. Hence, high-throughput postharvest phenotyping is necessary for rapid, robust, and accurate measurement of ripening, shelf life, quality, food safety, and biochemical contents of horticultural produce [86]. This will help to track the physiological status of the produce and make immediate decisions to avoid economic losses. For example, visual inspection and analytical methods such as spectroscopy and HPLC (high-performance liquid chromatography) are widely used for fruit quality assessment but are labor-intensive, destructive, time-consuming, and not robust. Therefore, high-throughput methods that can accurately and efficiently measure fruit and vegetable quality attributes are essential. Chilling injury, one of the most common postharvest physiological disorders in horticultural products, was detected using hyperspectral imaging with more than 91% accuracy in apple, peach, and kiwi fruit [93,94,95,96].

4.3. Biochemical Component Analysis

Horticultural crops are rich sources of pigments such as anthocyanins, carotenoids, and chlorophylls, which serve as strong antioxidants and promote human health [31]. Quantification of these pigments has mainly been based on laboratory extraction, which is laborious and time-consuming. Handheld nondestructive devices such as chlorophyll meters and chroma meters were developed to avoid destructive measurement, but they are still of limited use in large-scale production or breeding programs. Hence, image-based phenotyping of pigment content is receiving increasing attention because it is nondestructive, robust, and fast. The anthocyanin, carotenoid, and chlorophyll contents of red lettuce genotypes showed a high correlation with vegetation indices calculated from images taken by remotely piloted aircraft [31]. The total carotenoid content in cassava root was estimated from colorimetric indices extracted from RGB images of root pulp and showed a high correlation with the color indices b* and chroma [97]. In addition, optical sensors can be used to nondestructively measure the amount of nutrients in the plant, such as nitrogen, phosphorus, and potassium, enabling accurate monitoring of plant growth and precise management of crop production [98].
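Calibrating a colorimetric index against lab-measured pigment content, as in the cassava example, is at its core a least-squares regression. A minimal sketch with hypothetical calibration data (the numbers below are illustrative, not from [97]):

```python
import numpy as np

def fit_linear_index_model(color_index, measured_content):
    """Least-squares fit content ~ a * index + b, mimicking calibration
    of a colorimetric index (e.g. chroma) against lab-measured pigment
    content. Returns (slope, intercept)."""
    A = np.column_stack([color_index, np.ones(len(color_index))])
    coeffs, *_ = np.linalg.lstsq(A, measured_content, rcond=None)
    return coeffs

def predict(coeffs, color_index):
    """Predict pigment content from new index values."""
    return coeffs[0] * np.asarray(color_index) + coeffs[1]
```

Once fitted on a destructively measured calibration set, the model converts image-derived indices into pigment estimates for the full population.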

4.4. Disease Detection and Quantification

Plant diseases are among the major challenges of crop production worldwide, causing significant yield losses every year. Early detection and accurate measurement of disease is a vital part of phytopathology and breeding [10]. Assessment of disease using conventional visual scores and laboratory-based analysis is time-consuming, laborious, and subjective. In recent years, rapid and high-throughput methods for measuring disease extent and severity based on images captured by various types of sensors have been widely used [22,48,79,99,100]. High-throughput detection and quantification of disease is especially essential in horticultural crops, which are prone to diverse pathogens during pre-harvest and post-harvest handling stages. Image analysis has been widely used for the detection and quantification of horticultural crop diseases such as apple scab [101,102], fire blight [20,21,56,103], powdery mildew [48,104,105,106], Fusarium wilt [22], bacterial blight [107], bacterial wilt [108], early blight, and late blight [19,99,109,110,111]. Image analysis can be used to closely monitor plant health status, as infection can be detected at early stages before the development of typical symptoms. This enables appropriate management measures to be taken to reduce yield or quality losses.

4.5. Abiotic Stress Responses

Abiotic stresses are environmental conditions, such as drought, salinity, heat stress, and cold stress, that impair normal plant growth and yield. Rapid and accurate phenotyping of plant responses to various abiotic stresses is essential to accelerate plant breeding programs aimed at developing climate-resilient genotypes. Image-based high-throughput phenotyping is especially important when screening a large number of accessions. Various imaging techniques have been used to measure the response of horticultural plants to different abiotic stresses [82,112,113,114]. Hyperspectral images were used to detect heat stress tolerance in ginseng [115]. Cadmium stress in kale and basil was detected using high-throughput hyperspectral images. Among the vegetation indices analyzed, only the anthocyanin reflectance index was able to detect all levels of cadmium stress in both kale and basil; it was significantly higher in cadmium-stressed plants than in the respective controls [116]. The applications of high-throughput phenotyping using image analysis to assess various traits in selected horticultural crops are summarized in Table 3.
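The anthocyanin reflectance index used in the cadmium study is computed from just two reflectance bands, ARI = 1/R550 − 1/R700 (Gitelson et al.'s formulation); a minimal sketch:

```python
def anthocyanin_reflectance_index(r550, r700):
    """ARI = 1/R550 - 1/R700. Rises with anthocyanin accumulation,
    e.g. under cadmium or other stress.

    r550, r700: reflectance (0-1, exclusive of 0) at ~550 nm and ~700 nm.
    """
    if r550 <= 0 or r700 <= 0:
        raise ValueError("reflectance values must be positive")
    return 1.0 / r550 - 1.0 / r700
```

Applied per pixel of a hyperspectral image, this produces an anthocyanin map whose elevation over control plants flags the stressed accessions.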

5. Current Status and Future Perspectives

The statistics of publications related to high-throughput phenotyping in horticultural crops over the last two decades were surveyed and summarized in Figure 3. The number of publications dealing with high-throughput phenotyping using image analysis is increasing every year, mostly in the agricultural and biological sciences (Figure 3a,b). Among the search keywords, ‘fruit’ was the most mentioned word in these publications and showed an exponential increase in the last five years (Figure 3c), indicating the increasing interest of researchers in automating the measurement of various fruit traits during growth, maturity, ripening, and postharvest stages. Most of these documents are research articles (71%), followed by review papers (16%) (Figure 3d). Image-based phenotyping studies are actively conducted in many countries, with the USA and China leading in the number of publications (Figure 3e). The applications of various types of sensors may depend on imaging platforms, accessibility, cost, and the target trait under consideration. Hyperspectral sensors are the most popular for phenotyping horticultural crops, followed by thermal sensors (Figure 3f).
One of the upcoming challenges in phenomics is handling the massive amounts of data generated by image-based phenotyping and extracting important knowledge from such big data [3]. Here, the application of computer science is inevitable for digital phenotyping. Machine learning, which specializes in handling multidimensional and multivariate data automatically, is well suited to high-throughput phenotyping [190]. In machine learning (ML), mathematical algorithmic models are trained to solve given problems such as classification, regression, and clustering. ML provides prompt results for classifying and identifying plants or their phenotypes and for predicting and estimating yield or the influence of external factors [191].
ML models start with training on a dataset. Various algorithms, such as support vector machines (SVMs), decision trees, random forests, k-nearest neighbors (KNN), logistic regression, clustering, dimensionality reduction, and artificial neural networks (ANNs), empower the training. All models require the accumulation of sufficient data for accurate and efficient outputs [192]; insufficient training data leads to common errors in the output. This requirement can easily be met by the massive data streams produced by image-based high-throughput phenotyping.
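As a concrete example of one of the listed algorithms, a k-nearest-neighbours classifier can be written in a few lines of NumPy; the feature vectors and class labels below are hypothetical stand-ins for image-derived features:

```python
import numpy as np

def knn_predict(train_X, train_y, query, k=3):
    """Minimal k-nearest-neighbours classifier: label a query sample
    (e.g. a feature vector extracted from a plant image) by majority
    vote among its k closest training samples (Euclidean distance)."""
    d = np.linalg.norm(np.asarray(train_X) - np.asarray(query), axis=1)
    nearest = np.argsort(d)[:k]
    labels, counts = np.unique(np.asarray(train_y)[nearest],
                               return_counts=True)
    return labels[np.argmax(counts)]
```

In practice a library implementation (e.g. scikit-learn) would be used, but the voting logic is exactly this.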
However, when the amount of data to be processed is extremely large, deep learning is more suitable than other ML algorithms [193,194]. Machine learning based on deep ANNs is known as ‘deep learning’. Deep learning shares the conventional ML paradigms of supervised, unsupervised, and reinforcement learning [195] but differs in the degree of human intervention in feature extraction after training. Deep learning requires no human intervention if the training data are well annotated for the targeted feature, whereas conventional ML requires manual feature extraction before classification [196]. The most common deep learning architectures are convolutional neural networks (CNNs), long short-term memory (LSTM) networks, recurrent neural networks (RNNs), multi-layer perceptrons (MLPs), and radial basis function networks (RBFNs).
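The core operation of a CNN layer is a discrete 2D convolution (implemented as cross-correlation in most frameworks), which lets the network learn its own image features instead of relying on hand-crafted ones. A minimal single-channel sketch:

```python
import numpy as np

def conv2d(image, kernel):
    """'Valid' 2D cross-correlation, as computed by CNN convolution
    layers, of a single-channel image with a small kernel."""
    kh, kw = kernel.shape
    H, W = image.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # dot product of the kernel with the image patch at (i, j)
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out
```

With a horizontal-difference kernel such as [[1, -1]], the output highlights vertical edges; a trained CNN stacks many such learned kernels with nonlinearities.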
Despite their differences, both conventional ML and deep learning provide powerful results. Therefore, an appropriate selection based on the purpose and the limitations of each approach provides more advantages. For instance, ML is generally used in the evaluation and prediction of stress, biomass, and yield [16,28,171,173,186]. Deep learning is better suited to the detection, recognition, and counting of objects in large and complex datasets, such as biotic/abiotic stresses and individual plants [197,198]. Recently, ML and DL methods have been increasingly used in high-throughput plant phenotyping of various traits (Table 4).
Available aerial high-throughput phenotyping platforms target the measurement of above-ground parts, while field-scale root phenotyping using images remains a bottleneck. Novel technologies enabling root phenotyping at the field level will be a breakthrough, especially for root and tuber crops, allowing root system architecture to be captured and the yield of underground parts to be predicted in the field. With the development of novel sensing and data analysis methods, image-based phenotyping can reveal new traits that were difficult to measure or detect using conventional phenotyping [3]. These newly discovered traits, in combination with available omics data, need to be explored as a new frontier of crop improvement. The storage and sharing of the large amounts of phenomic data obtained from image-based phenotyping remain challenging and need to be resolved. The data should be standardized and easily accessible among research communities, academia, industry, and farmers. Minimizing the cost of sensors and phenotyping platforms, along with the automation of big data analysis methods, will greatly increase the significance of phenomics in crop improvement to meet the projected global food demand.

6. Conclusions

Image-based phenotyping methods have become an integral part of plant breeding, cultivation, and quality assessment of economically important crops. This review highlights the progress and applications of image-based phenotyping as applied to horticultural crops. We explored commonly used imaging techniques: RGB, thermal, hyperspectral, fluorescence, and tomographic imaging, with their advantages and drawbacks in relation to high-throughput phenotyping of horticultural crop traits. These techniques are used to measure morphological, physiological, and biochemical traits, as well as disease, pest, and abiotic stress responses.
In addition to accelerating the breeding cycle by enabling rapid and accurate measurement of phenotypic traits in large populations, image-based phenotyping will help monitor plant condition and support immediate management decisions such as pesticide spraying, fertilization, or harvesting, which will greatly improve the yield and quality of the produce. Moreover, the physiological processes of horticultural crops continue even after harvest, and their quality is highly dependent on postharvest storage and handling conditions. Hence, image-based techniques are especially important for postharvest phenotyping of horticultural crops, allowing real-time monitoring of the internal and external qualities of horticultural products, and will continue to play a significant role along the horticultural value chain. Machine learning and deep learning technologies should be well integrated into image-based phenotyping to mine knowledge from the massive amount of data generated.

Author Contributions

A.M.A. wrote the original draft, Y.K. and J.K. contributed to writing the machine learning and deep learning section, S.L.K. advised on the contents of the manuscript, and J.B. reviewed and edited the manuscript. All authors have read and agreed to the published version of the manuscript.

Funding

This work was carried out with the support of the “Cooperative Research Program for Agriculture Science and Technology Development (Project No. PJ01673501)”, Rural Development Administration, Republic of Korea.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Hickey, L.T.; Hafeez, A.N.; Robinson, H.; Jackson, S.A.; Leal-Bertioli, S.C.M.; Tester, M.; Gao, C.; Godwin, I.D.; Hayes, B.J.; Wulff, B.B.H. Breeding crops to feed 10 billion. Nat. Biotechnol. 2019, 37, 744–754. [Google Scholar] [CrossRef]
  2. Yang, W.; Feng, H.; Zhang, X.; Zhang, J.; Doonan, J.H.; Batchelor, W.D.; Xiong, L.; Yan, J. Crop Phenomics and High-Throughput Phenotyping: Past Decades, Current Challenges, and Future Perspectives. Mol. Plant 2020, 13, 187–214. [Google Scholar] [CrossRef]
  3. Sun, D.; Robbins, K.; Morales, N.; Shu, Q.; Cen, H. Advances in optical phenotyping of cereal crops. Trends Plant Sci. 2022, 27, 191–208. [Google Scholar] [CrossRef]
  4. Werner, T. Next generation sequencing in functional genomics. Brief. Bioinform. 2010, 11, 499–511. [Google Scholar] [CrossRef]
  5. Fasoula, D.A.; Ioannides, I.M.; Omirou, M. Phenotyping and Plant Breeding: Overcoming the Barriers. Front. Plant Sci. 2020, 10, 1713. [Google Scholar] [CrossRef]
  6. Li, D.; Quan, C.; Song, Z.; Li, X.; Yu, G.; Li, C.; Muhammad, A. High-Throughput Plant Phenotyping Platform (HT3P) as a Novel Tool for Estimating Agronomic Traits From the Lab to the Field. Front. Bioeng. Biotechnol. 2021, 8, 623705. [Google Scholar] [CrossRef]
  7. Das Choudhury, S.; Samal, A.; Awada, T. Leveraging Image Analysis for High-Throughput Plant Phenotyping. Front. Plant Sci. 2019, 10, 508. [Google Scholar] [CrossRef]
  8. Chawade, A.; van Ham, J.; Blomquist, H.; Bagge, O.; Alexandersson, E.; Ortiz, R. High-Throughput Field-Phenotyping Tools for Plant Breeding and Precision Agriculture. Agronomy 2019, 9, 258. [Google Scholar] [CrossRef]
  9. Jangra, S.; Chaudhary, V.; Yadav, R.C.; Yadav, N.R. High-Throughput Phenotyping: A Platform to Accelerate Crop Improvement. Phenomics 2021, 1, 31–53. [Google Scholar] [CrossRef]
  10. Mutka, A.M.; Bart, R.S. Image-based phenotyping of plant disease symptoms. Front. Plant Sci. 2015, 5, 734. [Google Scholar] [CrossRef]
  11. Li, L.; Zhang, Q.; Huang, D. A review of imaging techniques for plant phenotyping. Sensors 2014, 14, 20078–20111. [Google Scholar] [CrossRef]
  12. He, F.; Meng, Q.; Tang, L.; Huang, X.; Lu, X.; Wang, R.; Zhang, K.; Li, Y. Research progress in hyperspectral imaging technology for fruit quality detection. J. Fruit Sci. 2021, 38, 1590–1599. [Google Scholar]
  13. Haque, S.; Lobaton, E.; Nelson, N.; Yencho, G.C.; Pecota, K.V.; Mierop, R.; Kudenov, M.W.; Boyette, M.; Williams, C.M. Computer vision approach to characterize size and shape phenotypes of horticultural crops using high-throughput imagery. Comput. Electron. Agric. 2021, 182, 106011. [Google Scholar] [CrossRef]
  14. Lu, Y.; Saeys, W.; Kim, M.; Peng, Y.; Lu, R. Hyperspectral imaging technology for quality and safety evaluation of horticultural products: A review and celebration of the past 20-year progress. Postharvest. Biol. Technol. 2020, 170, 111318. [Google Scholar] [CrossRef]
  15. Du, J.; Fan, J.; Wang, C.; Lu, X.; Zhang, Y.; Wen, W.; Liao, S.; Yang, X.; Guo, X.; Zhao, C. Greenhouse-based vegetable high-throughput phenotyping platform and trait evaluation for large-scale lettuces. Comput. Electron. Agric. 2021, 186, 106193. [Google Scholar] [CrossRef]
  16. Abdulridha, J.; Ampatzidis, Y.; Qureshi, J.; Roberts, P. Laboratory and UAV-based identification and classification of tomato yellow leaf curl, bacterial spot, and target spot diseases in tomato utilizing hyperspectral imaging and machine learning. Remote Sens. 2020, 12, 2732. [Google Scholar] [CrossRef]
  17. Calou, V.B.C.; Teixeira, A.D.S.; Moreira, L.C.J.; Lima, C.S.; de Oliveira, J.B.; de Oliveira, M.R.R. The use of UAVs in monitoring yellow sigatoka in banana. Biosyst. Eng. 2020, 193, 115–125. [Google Scholar] [CrossRef]
  18. Lizarazo, I.; Rodriguez, J.L.; Cristancho, O.; Olaya, F.; Duarte, M.; Prieto, F. Identification of symptoms related to potato Verticillium wilt from UAV-based multispectral imagery using an ensemble of gradient boosting machines. Smart Agric. Technol. 2023, 3, 100138. [Google Scholar] [CrossRef]
  19. Rodríguez, J.; Lizarazo, I.; Prieto, F.; Angulo-Morales, V. Assessment of potato late blight from UAV-based multispectral imagery. Comput. Electron. Agric. 2021, 184, 106061. [Google Scholar] [CrossRef]
  20. Schoofs, H.; Delalieux, S.; Deckers, T.; Bylemans, D. Fire blight monitoring in pear orchards by unmanned airborne vehicles (UAV) systems carrying spectral sensors. Agronomy 2020, 10, 615. [Google Scholar] [CrossRef]
  21. Xiao, D.; Pan, Y.; Feng, J.; Yin, J.; Liu, Y.; He, L. Remote sensing detection algorithm for apple fire blight based on UAV multispectral image. Comput. Electron. Agric. 2022, 199, 107137. [Google Scholar] [CrossRef]
  22. Zhang, S.; Li, X.; Ba, Y.; Lyu, X.; Zhang, M.; Li, M. Banana Fusarium Wilt Disease Detection by Supervised and Unsupervised Methods from UAV-Based Multispectral Imagery. Remote Sens. 2022, 14, 1231. [Google Scholar] [CrossRef]
  23. Aeberli, A.; Johansen, K.; Robson, A.; Lamb, D.W.; Phinn, S. Detection of banana plants using multi-temporal multispectral UAV imagery. Remote Sens. 2021, 13, 2123. [Google Scholar] [CrossRef]
  24. Donmez, C.; Villi, O.; Berberoglu, S.; Cilek, A. Computer vision-based citrus tree detection in a cultivated environment using UAV imagery. Comput. Electron. Agric. 2021, 187, 106273. [Google Scholar] [CrossRef]
  25. Osco, L.P.; Nogueira, K.; Marques Ramos, A.P.; Faita Pinheiro, M.M.; Furuya, D.E.G.; Gonçalves, W.N.; de Castro Jorge, L.A.; Marcato Junior, J.; dos Santos, J.A. Semantic segmentation of citrus-orchard using deep neural networks and multispectral UAV-based imagery. Precis. Agric. 2021, 22, 1171–1188. [Google Scholar] [CrossRef]
  26. Ballesteros, R.; Ortega, J.F.; Hernandez, D.; Moreno, M.A. Onion biomass monitoring using UAV-based RGB imaging. Precis. Agric. 2018, 19, 840–857. [Google Scholar] [CrossRef]
  27. Liu, Y.; Feng, H.; Yue, J.; Fan, Y.; Jin, X.; Zhao, Y.; Song, X.; Long, H.; Yang, G. Estimation of Potato Above-Ground Biomass Using UAV-Based Hyperspectral images and Machine-Learning Regression. Remote Sens. 2022, 14, 5449. [Google Scholar] [CrossRef]
  28. Zheng, C.; Abd-Elrahman, A.; Whitaker, V.; Dalid, C. Prediction of Strawberry Dry Biomass from UAV Multispectral Imagery Using Multiple Machine Learning Methods. Remote Sens. 2022, 14, 4511. [Google Scholar] [CrossRef]
  29. Johansen, K.; Morton, M.J.L.; Malbeteau, Y.; Aragon, B.; Al-Mashharawi, S.; Ziliani, M.G.; Angel, Y.; Fiene, G.; Negrão, S.; Mousa, M.A.A.; et al. Predicting Biomass and Yield in a Tomato Phenotyping Experiment Using UAV Imagery and Random Forest. Front. Artif. Intell. 2020, 3, 28. [Google Scholar] [CrossRef]
  30. Xiong, J.; Liu, Z.; Chen, S.; Liu, B.; Zheng, Z.; Zhong, Z.; Yang, Z.; Peng, H. Visual detection of green mangoes by an unmanned aerial vehicle in orchards based on a deep learning method. Biosyst. Eng. 2020, 194, 261–272. [Google Scholar] [CrossRef]
  31. Clemente, A.A.; Maciel, G.M.; Siquieroli, A.C.S.; Gallis, R.B.D.A.; Pereira, L.M.; Duarte, J.G. High-throughput phenotyping to detect anthocyanins, chlorophylls, and carotenoids in red lettuce germplasm. Int. J. Appl. Earth Obs. Geoinf. 2021, 103, 102533. [Google Scholar] [CrossRef]
  32. Kim, J.; Chung, Y.S. A short review of RGB sensor applications for accessible high-throughput phenotyping. J. Crop Sci. Biotechnol. 2021, 24, 495–499. [Google Scholar] [CrossRef]
  33. Sinde-González, I.; Gómez-López, J.P.; Tapia-Navarro, S.A.; Murgueitio, E.; Falconí, C.; Benítez, F.L.; Toulkeridis, T. Determining the Effects of Nanonutrient Application in Cabbage (Brassica oleracea var. capitate L.) Using Spectrometry and Biomass Estimation with UAV. Agronomy 2022, 12, 81. [Google Scholar] [CrossRef]
  34. Li, B.; Xu, X.; Zhang, L.; Han, J.; Bian, C.; Li, G.; Liu, J.; Jin, L. Above-ground biomass estimation and yield prediction in potato by using UAV-based RGB and hyperspectral imaging. ISPRS J. Photogramm. Remote Sens. 2020, 162, 161–172. [Google Scholar] [CrossRef]
  35. Liu, Y.; Feng, H.; Yue, J.; Jin, X.; Li, Z.; Yang, G. Estimation of potato above-ground biomass based on unmanned aerial vehicle red-green-blue images with different texture features and crop height. Front. Plant Sci. 2022, 13, 938216. [Google Scholar] [CrossRef]
  36. Johansen, K.; Morton, M.J.L.; Malbeteau, Y.M.; Aragon, B.; Al-Mashharawi, S.K.; Ziliani, M.G.; Angel, Y.; Fiene, G.M.; Negrão, S.S.C.; Mousa, M.A.A.; et al. Unmanned aerial vehicle-based phenotyping using morphometric and spectral analysis can quantify responses of wild tomato plants to salinity stress. Front. Plant Sci. 2019, 10, 370. [Google Scholar] [CrossRef]
  37. Laxman, R.H.; Hemamalini, P.; Bhatt, R.M.; Sadashiva, A.T. Non-invasive quantification of tomato (Solanum lycopersicum L.) plant biomass through digital imaging using phenomics platform. Indian J. Plant Physiol. 2018, 23, 369–375. [Google Scholar] [CrossRef]
  38. Alaguero-Cordovilla, A.; Gran-Gómez, F.J.; Tormos-Moltó, S.; Pérez-Pérez, J.M. Morphological characterization of root system architecture in diverse tomato genotypes during early growth. Int. J. Mol. Sci. 2018, 19, 3888. [Google Scholar] [CrossRef]
  39. Brainard, S.H.; Bustamante, J.A.; Dawson, J.C.; Spalding, E.P.; Goldman, I.L. A Digital Image-Based Phenotyping Platform for Analyzing Root Shape Attributes in Carrot. Front. Plant Sci. 2021, 12, 690031. [Google Scholar] [CrossRef]
  40. Hobart, M.; Pflanz, M.; Weltzien, C.; Schirrmann, M. Growth height determination of tree walls for precise monitoring in apple fruit production using UAV photogrammetry. Remote Sens. 2020, 12, 1656. [Google Scholar] [CrossRef]
  41. Kim, D.W.; Yun, H.S.; Jeong, S.J.; Kwon, Y.S.; Kim, S.G.; Lee, W.S.; Kim, H.J. Modeling and testing of growth status for Chinese cabbage and white radish with UAV-based RGB imagery. Remote Sens. 2018, 10, 563. [Google Scholar] [CrossRef]
  42. Wasonga, D.O.; Yaw, A.; Kleemola, J.; Alakukku, L.; Mäkelä, P.S.A. Red-green-blue and multispectral imaging as potential tools for estimating growth and nutritional performance of cassava under deficit irrigation and potassium fertigation. Remote Sens. 2021, 13, 598. [Google Scholar] [CrossRef]
  43. Gang, M.S.; Kim, H.J.; Kim, D.W. Estimation of Greenhouse Lettuce Growth Indices Based on a Two-Stage CNN Using RGB-D Images. Sensors 2022, 22, 5499. [Google Scholar] [CrossRef] [PubMed]
  44. Li, B.; Xu, X.; Han, J.; Zhang, L.; Bian, C.; Jin, L.; Liu, J. The estimation of crop emergence in potatoes by UAV RGB imagery. Plant Methods 2019, 15, 15. [Google Scholar] [CrossRef] [PubMed]
  45. Chen, R.; Zhang, C.; Xu, B.; Zhu, Y.; Zhao, F.; Han, S.; Yang, G.; Yang, H. Predicting individual apple tree yield using UAV multi-source remote sensing data and ensemble learning. Comput. Electron. Agric. 2022, 201, 107275. [Google Scholar] [CrossRef]
  46. Elsayed, S.; El-Hendawy, S.; Khadr, M.; Elsherbiny, O.; Al-Suhaibani, N.; Alotaibi, M.; Tahir, M.U.; Darwish, W. Combining thermal and RGB imaging indices with multivariate and data-driven modeling to estimate the growth, water status, and yield of potato under different drip irrigation regimes. Remote Sens. 2021, 13, 1679. [Google Scholar] [CrossRef]
  47. Kurtser, P.; Ringdahl, O.; Rotstein, N.; Berenstein, R.; Edan, Y. In-field grape cluster size assessment for vine yield estimation using a mobile robot and a consumer level RGB-D Camera. IEEE Robot. Autom. Lett. 2020, 5, 2031–2038. [Google Scholar] [CrossRef]
  48. Chandel, A.K.; Khot, L.R.; Sallato, B. Apple powdery mildew infestation detection and mapping using high-resolution visible and multispectral aerial imaging technique. Sci. Hortic. 2021, 287, 110228. [Google Scholar] [CrossRef]
  49. Gomez Selvaraj, M.; Vergara, A.; Montenegro, F.; Alonso Ruiz, H.; Safari, N.; Raymaekers, D.; Ocimati, W.; Ntamwira, J.; Tits, L.; Omondi, A.B.; et al. Detection of banana plants and their major diseases through aerial images and machine learning methods: A case study in DR Congo and Republic of Benin. ISPRS J. Photogramm. Remote Sens. 2020, 169, 110–124. [Google Scholar] [CrossRef]
  50. Kim, Y.; Oh, S.; Kim, K.; Jeong, H.W.; Kim, D. Bi-dimensional Image Analysis for the Phenotypic Evaluation of Russet in Asian Pear (Pyrus spp.). Hortic. Sci. Technol. 2022, 40, 192–198. [Google Scholar]
  51. Lee, U.; Silva, R.R.; Kim, C.; Kim, H.; Heo, S.; Park, I.S.; Kim, W.; Jansky, S.; Chung, Y.S. Image Analysis for Measuring Disease Symptom to Bacterial Soft Rot in Potato. Am. J. Potato Res. 2019, 90, 303–313. [Google Scholar] [CrossRef]
  52. Ahmadi, S.H.; Agharezaee, M.; Kamgar-Haghighi, A.A.; Sepaskhah, A.R. Comparing canopy temperature and leaf water potential as irrigation scheduling criteria of potato in water-saving irrigation strategies. Int. J. Plant Prod. 2017, 11, 333–348. [Google Scholar]
  53. Prashar, A.; Yildiz, J.; McNicol, J.W.; Bryan, G.J.; Jones, H.G. Infra-red Thermography for High Throughput Field Phenotyping in Solanum tuberosum. PLoS ONE 2013, 8, e65816. [Google Scholar] [CrossRef] [PubMed]
  54. Vieira, G.H.S.; Ferrarezi, R.S. Use of thermal imaging to assess water status in citrus plants in greenhouses. Horticulturae 2021, 7, 249. [Google Scholar] [CrossRef]
  55. Sarić, R.; Nguyen, V.D.; Burge, T.; Berkowitz, O.; Trtílek, M.; Whelan, J.; Lewsey, M.G.; Čustović, E. Applications of hyperspectral imaging in plant phenotyping. Trends Plant Sci. 2022, 27, 301–315. [Google Scholar] [CrossRef]
  56. Skoneczny, H.; Kubiak, K.; Spiralski, M.; Kotlarz, J. Fire blight disease detection for apple trees: Hyperspectral analysis of healthy, infected and dry leaves. Remote Sens. 2020, 12, 2101. [Google Scholar] [CrossRef]
  57. Abdulridha, J.; Batuman, O.; Ampatzidis, Y. UAV-based remote sensing technique to detect citrus canker disease utilizing hyperspectral imaging and machine learning. Remote Sens. 2019, 11, 1373. [Google Scholar] [CrossRef]
  58. Abdulridha, J.; Ampatzidis, Y.; Kakarla, S.C.; Roberts, P. Detection of target spot and bacterial spot diseases in tomato using UAV-based and benchtop-based hyperspectral imaging techniques. Precis. Agric. 2020, 21, 955–978. [Google Scholar] [CrossRef]
  59. Shao, Y.; Wang, Y.; Xuan, G.; Gao, Z.; Hu, Z.; Gao, C.; Wang, K. Assessment of Strawberry Ripeness Using Hyperspectral Imaging. Anal. Lett. 2020, 54, 1547–1560. [Google Scholar] [CrossRef]
  60. Gutiérrez, S.; Wendel, A.; Underwood, J. Spectral filter design based on in-field hyperspectral imaging and machine learning for mango ripeness estimation. Comput. Electron. Agric. 2019, 164, 104890. [Google Scholar] [CrossRef]
  61. Maxwell, K.; Johnson, G.N. Chlorophyll fluorescence—A practical guide. J. Exp. Bot. 2000, 51, 659–668. [Google Scholar] [CrossRef] [PubMed]
  62. Weng, H.Y.; Zeng, Y.B.; Cen, H.Y.; He, M.B.; Meng, Y.Q.; Liu, Y.; Wan, L.; Xu, H.X.; Li, H.Y.; Fang, H.; et al. Characterization and detection of leaf photosynthetic response to citrus huanglongbing from cool to hot seasons in two orchards. Trans. ASABE 2020, 63, 501–512. [Google Scholar] [CrossRef]
  63. Kumar, P.; Eriksen, R.L.; Simko, I.; Mou, B. Molecular Mapping of Water-Stress Responsive Genomic Loci in Lettuce (Lactuca spp.) Using Kinetics Chlorophyll Fluorescence, Hyperspectral Imaging and Machine Learning. Front. Genet. 2021, 12, 634554. [Google Scholar] [CrossRef] [PubMed]
  64. Adhikari, N.D.; Simko, I.; Mou, B. Phenomic and physiological analysis of salinity effects on lettuce. Sensors 2019, 19, 4814. [Google Scholar] [CrossRef] [PubMed]
  65. Dong, Z.; Men, Y.; Li, Z.; Zou, Q.; Ji, J. Chlorophyll fluorescence imaging as a tool for analyzing the effects of chilling injury on tomato seedlings. Sci. Hortic. 2019, 246, 490–497. [Google Scholar] [CrossRef]
  66. Metzner, R.; Eggert, A.; van Dusschoten, D.; Pflugfelder, D.; Gerth, S.; Schurr, U.; Uhlmann, N.; Jahnke, S. Direct comparison of MRI and X-ray CT technologies for 3D imaging of root systems in soil: Potential and challenges for root trait quantification. Plant Methods 2015, 11, 17. [Google Scholar] [CrossRef]
  67. Borisjuk, L.; Rolletschek, H.; Neuberger, T. Surveying the plant’s world by magnetic resonance imaging. Plant J. 2012, 70, 129–146. [Google Scholar] [CrossRef]
  68. Piovesan, A.; Vancauwenberghe, V.; Van De Looverbosch, T.; Verboven, P.; Nicolaï, B. X-ray computed tomography for 3D plant imaging. Trends Plant Sci. 2021, 26, 1171–1185. [Google Scholar] [CrossRef]
  69. Liu, W.; Liu, C.; Jin, J.; Li, D.; Fu, Y.; Yuan, X. High-Throughput Phenotyping of Morphological Seed and Fruit Characteristics Using X-Ray Computed Tomography. Front. Plant Sci. 2020, 11, 601475. [Google Scholar] [CrossRef]
  70. Ahmed, M.R.; Yasmin, J.; Park, E.; Kim, G.; Kim, M.S.; Wakholi, C.; Mo, C.; Cho, B.K. Classification of watermelon seeds using morphological patterns of x-ray imaging: A comparison of conventional machine learning and deep learning. Sensors 2020, 20, 6753. [Google Scholar] [CrossRef]
  71. Agostini, A.; Alenyà, G.; Fischbach, A.; Scharr, H.; Wörgötter, F.; Torras, C. A cognitive architecture for automatic gardening. Comput. Electron. Agric. 2017, 138, 69–79. [Google Scholar] [CrossRef]
  72. Kim, D.M.; Zhang, H.; Zhou, H.; Du, T.; Wu, Q.; Mockler, T.C.; Berezin, M.Y. Highly sensitive image-derived indices of water-stressed plants using hyperspectral imaging in SWIR and histogram analysis. Sci. Rep. 2015, 5, 15919. [Google Scholar] [CrossRef] [PubMed]
  73. Blonder, B.; De Carlo, F.; Moore, J.; Rivers, M.; Enquist, B.J. X-ray imaging of leaf venation networks. New Phytol. 2012, 196, 1274–1282. [Google Scholar] [CrossRef] [PubMed]
  74. Kim, J.Y. Roadmap to High Throughput Phenotyping for Plant Breeding. J. Biosyst. Eng. 2020, 45, 43–55. [Google Scholar] [CrossRef]
  75. Bian, L.; Zhang, H.; Ge, Y.; Čepl, J.; Stejskal, J.; El-Kassaby, Y.A. Closing the gap between phenotyping and genotyping: Review of advanced, image-based phenotyping technologies in forestry. Ann. For. Sci. 2022, 79, 22. [Google Scholar] [CrossRef]
  76. Chen, Y.; Lee, W.S.; Gan, H.; Peres, N.; Fraisse, C.; Zhang, Y.; He, Y. Strawberry yield prediction based on a deep neural network using high-resolution aerial orthoimages. Remote Sens. 2019, 11, 1584. [Google Scholar] [CrossRef]
  77. Zine-El-Abidine, M.; Dutagaci, H.; Galopin, G.; Rousseau, D. Assigning apples to individual trees in dense orchards using 3D colour point clouds. Biosyst. Eng. 2021, 209, 30–52. [Google Scholar] [CrossRef]
  78. Taria, S.; Alam, B.; Rane, J.; Kumar, M.; Babar, R.; Singh, N.P. Deciphering endurance capacity of mango tree (Mangifera indica L.) to desiccation stress using modern physiological tools. Sci. Hortic. 2022, 303, 111247. [Google Scholar] [CrossRef]
  79. Bendel, N.; Backhaus, A.; Kicherer, A.; Köckerling, J.; Maixner, M.; Jarausch, B.; Biancu, S.; Klück, H.C.; Seiffert, U.; Voegele, R.T.; et al. Detection of two different grapevine yellows in Vitis vinifera using hyperspectral imaging. Remote Sens. 2020, 12, 4151. [Google Scholar] [CrossRef]
  80. Aeberli, A.; Phinn, S.; Johansen, K.; Robson, A.; Lamb, D.W. Characterisation of Banana Plant Growth Using High-Spatiotemporal-Resolution Multispectral UAV Imagery. Remote Sens. 2023, 15, 679. [Google Scholar] [CrossRef]
  81. Ampatzidis, Y.; Partel, V. UAV-based high throughput phenotyping in citrus utilizing multispectral imaging and artificial intelligence. Remote Sens. 2019, 11, 410. [Google Scholar] [CrossRef]
  82. Mulugeta Aneley, G.; Haas, M.; Köhl, K. LIDAR-Based Phenotyping for Drought Response and Drought Tolerance in Potato. Potato Res. 2022. [Google Scholar] [CrossRef]
  83. Adams, T.; Bruton, R.; Ruiz, H.; Barrios-Perez, I.; Selvaraj, M.G.; Hays, D.B. Prediction of aboveground biomass of three cassava (Manihot esculenta) genotypes using a terrestrial laser scanner. Remote Sens. 2021, 13, 1272. [Google Scholar] [CrossRef]
  84. Nagamatsu, S.; Tsubone, M.; Wada, T.; Oku, K.; Mori, M.; Hirata, C.; Hayashi, A.; Tanabata, T.; Isobe, S.; Takata, K.; et al. Strawberry fruit shape: Quantification by image analysis and qtl detection by genome-wide association analysis. Breed Sci. 2021, 71, 167–175. [Google Scholar] [CrossRef]
  85. Alfatni, M.S.M.; Shariff, A.R.M.; Shafri, H.Z.M.; Saaed, O.M.B.; Eshanta, O.M. Oil palm fruit bunch grading system using red, green and blue digital number. J. Appl. Sci. 2008, 8, 1444–1452. [Google Scholar] [CrossRef]
  86. Rifna, E.J.; Dwivedi, M. Emerging nondestructive technologies for quality assessment of fruits, vegetables, and cereals. In Food Losses, Sustainable Postharvest and Food Technologies, 1st ed.; Galanakis, C.M., Ed.; Elsevier: Amsterdam, The Netherlands, 2021; pp. 219–253. [Google Scholar]
  87. Subhashree, S.N.; Sunoj, S.; Xue, J.; Bora, G.C. Quantification of browning in apples using colour and textural features by image analysis. Food Qual. Saf. 2017, 1, 221–226. [Google Scholar] [CrossRef]
  88. OIV. OIV Descriptor List for Grape Varieties and Vitis Species, 2nd ed.; International Organisation of Vine and Wine: Paris, France, 2009. [Google Scholar]
  89. Underhill, A.N.; Hirsch, C.D.; Clark, M.D. Evaluating and Mapping Grape Color Using Image-Based Phenotyping. Plant Phenomics 2020, 2020, 8086309. [Google Scholar] [CrossRef]
  90. Kuhl, F.P.; Giardina, C.R. Elliptic Fourier features of a closed contour. Comput. Graph. Image Process. 1982, 18, 236–258. [Google Scholar] [CrossRef]
  91. Kaur, H.; Sawhney, B.K.; Jawandha, S.K. Evaluation of plum fruit maturity by image processing techniques. J. Food Sci. Technol. 2018, 55, 3008–3015. [Google Scholar] [CrossRef]
  92. Surya Prabha, D.; Satheesh Kumar, J. Assessment of banana fruit maturity by image processing technique. J. Food Sci. Technol. 2015, 52, 1316–1327. [Google Scholar] [CrossRef]
  93. ElMasry, G.; Wang, N.; Vigneault, C. Detecting chilling injury in Red Delicious apple using hyperspectral imaging and neural networks. Postharvest Biol. Technol. 2009, 52, 1–8. [Google Scholar] [CrossRef]
  94. Pan, L.; Zhang, Q.; Zhang, W.; Sun, Y.; Hu, P.; Tu, K. Detection of cold injury in peaches by hyperspectral reflectance imaging and artificial neural network. Food Chem. 2016, 192, 134–141. [Google Scholar] [CrossRef] [PubMed]
  95. Ge, Y.; Tu, S. Identification of Chilling Injury in Kiwifruit Using Hyperspectral Structured-Illumination Reflectance Imaging System (SIRI) with Support Vector Machine (SVM) Modelling. Anal. Lett. 2022, 56, 2040–2052. [Google Scholar] [CrossRef]
  96. Lu, Y.; Lu, R. Detection of chilling injury in pickling cucumbers using dual-band chlorophyll fluorescence imaging. Foods 2021, 10, 1094. [Google Scholar] [CrossRef] [PubMed]
  97. De Carvalho, R.R.B.; Cortes, D.F.M.; e Sousa, M.B.; de Oliveira, L.A.; de Oliveira, E.J. Image-based phenotyping of cassava roots for diversity studies and carotenoids prediction. PLoS ONE 2022, 17, e0263326. [Google Scholar] [CrossRef] [PubMed]
  98. Sun, G.; Ding, Y.; Wang, X.; Lu, W.; Sun, Y.; Yu, H. Nondestructive determination of nitrogen, phosphorus and potassium contents in greenhouse tomato plants based on multispectral three-dimensional imaging. Sensors 2019, 19, 5295. [Google Scholar] [CrossRef]
  99. Sugiura, R.; Tsuda, S.; Tamiya, S.; Itoh, A.; Nishiwaki, K.; Murakami, N.; Shibuya, Y.; Hirafuji, M.; Nuske, S. Field phenotyping system for the assessment of potato late blight resistance using RGB imagery from an unmanned aerial vehicle. Biosyst. Eng. 2016, 148, 1–10. [Google Scholar] [CrossRef]
  100. Wu, G.; Fang, Y.; Jiang, Q.; Cui, M.; Li, N.; Ou, Y.; Diao, Z.; Zhang, B. Early identification of strawberry leaves disease utilizing hyperspectral imaging combing with spectral features, multiple vegetation indices and textural features. Comput. Electron. Agric. 2023, 204, 107553. [Google Scholar] [CrossRef]
  101. Belin, É.; Rousseau, D.; Boureau, T.; Caffier, V. Thermography versus chlorophyll fluorescence imaging for detection and quantification of apple scab. Comput. Electron. Agric. 2013, 90, 159–163. [Google Scholar] [CrossRef]
  102. Bleasdale, A.J.; Blackburn, G.A.; Whyatt, J.D. Feasibility of detecting apple scab infections using low-cost sensors and interpreting radiation interactions with scab lesions. Int. J. Remote Sens. 2022, 43, 4984–5005. [Google Scholar] [CrossRef]
  103. Jarolmasjed, S.; Sankaran, S.; Marzougui, A.; Kostick, S.; Si, Y.; Quirós Vargas, J.J.; Evans, K. High-throughput phenotyping of fire blight disease symptoms using sensing techniques in apple. Front. Plant Sci. 2019, 10, 576. [Google Scholar] [CrossRef] [PubMed]
  104. Qiu, T.; Underhill, A.; Sapkota, S.; Cadle-Davidson, L.; Jiang, Y. High throughput saliency-based quantification of grape powdery mildew at the microscopic level for disease resistance breeding. Hortic. Res. 2022, 9, uhac187. [Google Scholar] [CrossRef] [PubMed]
  105. Sultan Mahmud, M.; Zaman, Q.U.; Esau, T.J.; Price, G.W.; Prithiviraj, B. Development of an artificial cloud lighting condition system using machine vision for strawberry powdery mildew disease detection. Comput. Electron. Agric. 2019, 158, 219–225. [Google Scholar] [CrossRef]
  106. Tapia, R.; Abd-Elrahman, A.; Osorio, L.; Lee, S.; Whitaker, V.M. Combining canopy reflectance spectrometry and genome-wide prediction to increase response to selection for powdery mildew resistance in cultivated strawberry. J. Exp. Bot. 2022, 73, 5322–5335. [Google Scholar] [CrossRef] [PubMed]
  107. Elliott, K.; Berry, J.C.; Kim, H.; Bart, R.S. A comparison of ImageJ and machine learning based image analysis methods to measure cassava bacterial blight disease severity. Plant Methods 2022, 18, 86. [Google Scholar] [CrossRef]
  108. Kim, J.H.; Bhandari, S.R.; Chae, S.Y.; Cho, M.C.; Lee, J.G. Application of maximum quantum yield, a parameter of chlorophyll fluorescence, for early determination of bacterial wilt in tomato seedlings. Hortic. Environ. Biotechnol. 2019, 60, 821–829. [Google Scholar] [CrossRef]
  109. Kundu, R.; Dutta, D.; MK, N.; Chakrabarty, A. Near Real Time Monitoring of Potato Late Blight Disease Severity using Field Based Hyperspectral Observation. Smart Agric. Technol. 2021, 1, 100019. [Google Scholar] [CrossRef]
  110. Hou, C.; Zhuang, J.; Tang, Y.; He, Y.; Miao, A.; Huang, H.; Luo, S. Recognition of early blight and late blight diseases on potato leaves based on graph cut segmentation. J. Agric. Food Res. 2021, 5, 100154. [Google Scholar] [CrossRef]
  111. Franceschini, M.H.D.; Bartholomeus, H.; van Apeldoorn, D.F.; Suomalainen, J.; Kooistra, L. Feasibility of unmanned aerial vehicle optical imagery for early detection and severity assessment of late blight in Potato. Remote Sens. 2019, 11, 224. [Google Scholar] [CrossRef]
  112. Virlet, N.; Costes, E.; Martinez, S.; Kelner, J.J.; Regnard, J.L. Multispectral airborne imagery in the field reveals genetic determinisms of morphological and transpiration traits of an apple tree hybrid population in response to water deficit. J. Exp. Bot. 2015, 66, 5453–5465. [Google Scholar] [CrossRef]
  113. Briglia, N.; Montanaro, G.; Petrozza, A.; Summerer, S.; Cellini, F.; Nuzzo, V. Drought phenotyping in Vitis vinifera using RGB and NIR imaging. Sci. Hortic. 2019, 256, 108555. [Google Scholar] [CrossRef]
  114. Chen, S.; Guo, Y.; Sirault, X.; Stefanova, K.; Saradadevi, R.; Turner, N.C.; Nelson, M.N.; Furbank, R.T.; Siddique, K.H.M.; Cowling, W.A. Nondestructive phenomic tools for the prediction of heat and drought tolerance at anthesis in Brassica species. Plant Phenom. 2019, 2019, 3264872. [Google Scholar] [CrossRef] [PubMed]
  115. Faqeerzada, M.A.; Park, E.; Kim, T.; Kim, M.S.; Baek, I.; Joshi, R.; Kim, J.; Cho, B.K. Fluorescence Hyperspectral Imaging for Early Diagnosis of Heat-Stressed Ginseng Plants. Appl. Sci. 2023, 13, 31. [Google Scholar]
  116. Zea, M.; Souza, A.; Yang, Y.; Lee, L.; Nemali, K.; Hoagland, L. Leveraging high-throughput hyperspectral imaging technology to detect cadmium stress in two leafy green crops and accelerate soil remediation efforts. Environ. Pollut. 2022, 292, 118405. [Google Scholar] [CrossRef] [PubMed]
  117. Ropelewska, E.; Rutkowski, K.P. Cultivar discrimination of stored apple seeds based on geometric features determined using image analysis. J. Stored Prod. Res. 2021, 92, 101804. [Google Scholar] [CrossRef]
  118. Wu, J.; Yang, G.; Yang, H.; Zhu, Y.; Li, Z.; Lei, L.; Zhao, C. Extracting apple tree crown information from remote imagery using deep learning. Comput. Electron. Agric. 2020, 174, 105504. [Google Scholar] [CrossRef]
  119. Sun, X.; Fang, W.; Gao, C.; Fu, L.; Majeed, Y.; Liu, X.; Gao, F.; Yang, R.; Li, R. Remote estimation of grafted apple tree trunk diameter in modern orchard with RGB and point cloud based on SOLOv2. Comput. Electron. Agric. 2022, 199, 107209. [Google Scholar] [CrossRef]
  120. Fu, L.; Majeed, Y.; Zhang, X.; Karkee, M.; Zhang, Q. Faster R–CNN–based apple detection in dense-foliage fruiting-wall trees using RGB and depth features for robotic harvesting. Biosyst. Eng. 2020, 197, 245–256. [Google Scholar] [CrossRef]
  121. Chen, W.; Zhang, J.; Guo, B.; Wei, Q.; Zhu, Z. An Apple Detection Method Based on Des-YOLO v4 Algorithm for Harvesting Robots in Complex Environment. Math. Probl. Eng. 2021, 2021, 7351470. [Google Scholar] [CrossRef]
  122. Ge, L.; Zou, K.; Zhou, H.; Yu, X.; Tan, Y.; Zhang, C.; Li, W. Three dimensional apple tree organs classification and yield estimation algorithm based on multi-features fusion and support vector machine. Inf. Process. Agric. 2022, 9, 431–442. [Google Scholar] [CrossRef]
  123. Sabzi, S.; Abbaspour-Gilandeh, Y.; García-Mateos, G.; Ruiz-Canales, A.; Molina-Martínez, J.M.; Arribas, J.I. An automatic non-destructive method for the classification of the ripeness stage of red delicious apples in orchards using aerial video. Agronomy 2019, 9, 84. [Google Scholar] [CrossRef]
  124. Shurygin, B.; Konyukhov, I.; Khruschev, S.; Solovchenko, A. Non-Invasive Probing of Winter Dormancy via Time-Frequency Analysis of Induced Chlorophyll Fluorescence in Deciduous Plants as Exemplified by Apple (Malus × domestica Borkh.). Plants 2022, 11, 2811. [Google Scholar] [CrossRef] [PubMed]
  125. Schlie, T.P.; Dierend, W.; Köpcke, D.; Rath, T. Detecting low-oxygen stress of stored apples using chlorophyll fluorescence imaging and histogram division. Postharvest. Biol. Technol. 2022, 189, 111901. [Google Scholar] [CrossRef]
  126. Miao, Y.; Wang, L.; Peng, C.; Li, H.; Li, X.; Zhang, M. Banana plant counting and morphological parameters measurement based on terrestrial laser scanning. Plant Methods 2022, 18, 66. [Google Scholar] [CrossRef]
  127. Huang, K.Y.; Cheng, J.F. A novel auto-sorting system for Chinese cabbage seeds. Sensors 2017, 17, 886. [Google Scholar] [CrossRef]
  128. Turner, S.D.; Ellison, S.L.; Senalik, D.A.; Simon, P.W.; Spalding, E.P.; Miller, N.D. An automated image analysis pipeline enables genetic studies of shoot and root morphology in carrot (Daucus carota L.). Front. Plant Sci. 2018, 9, 1703. [Google Scholar] [CrossRef]
  129. Brainard, S.H.; Ellison, S.L.; Simon, P.W.; Dawson, J.C.; Goldman, I.L. Genetic characterization of carrot root shape and size using genome-wide association analysis and genomic-estimated breeding values. Theor. Appl. Genet. 2022, 135, 605–622. [Google Scholar] [CrossRef]
  130. Delgado, A.; Hays, D.B.; Bruton, R.K.; Ceballos, H.; Novo, A.; Boi, E.; Selvaraj, M.G. Ground penetrating radar: A case study for estimating root bulking rate in cassava (Manihot esculenta Crantz). Plant Methods 2017, 13, 65. [Google Scholar] [CrossRef]
  131. Yonis, B.O.; Pino del Carpio, D.; Wolfe, M.; Jannink, J.L.; Kulakow, P.; Rabbi, I. Improving root characterisation for genomic prediction in cassava. Sci. Rep. 2020, 10, 8003. [Google Scholar] [CrossRef]
  132. Atanbori, J.; Montoya, P.M.E.; Selvaraj, M.G.; French, A.P.; Pridmore, T.P. Convolutional Neural Net-Based Cassava Storage Root Counting Using Real and Synthetic Images. Front. Plant Sci. 2019, 10, 1516. [Google Scholar] [CrossRef]
  133. Agbona, A.; Teare, B.; Ruiz-Guzman, H.; Dobreva, I.D.; Everett, M.E.; Adams, T.; Montesinos-Lopez, O.A.; Kulakow, P.A.; Hays, D.B. Prediction of root biomass in cassava based on ground penetrating radar phenomics. Remote Sens. 2021, 13, 4908. [Google Scholar] [CrossRef]
  134. Nkouaya Mbanjo, E.G.; Hershberger, J.; Peteti, P.; Agbona, A.; Ikpan, A.; Ogunpaimo, K.; Kayondo, S.I.; Abioye, R.S.; Nafiu, K.; Alamu, E.O.; et al. Predicting starch content in cassava fresh roots using near-infrared spectroscopy. Front. Plant Sci. 2022, 13, 990250. [Google Scholar] [CrossRef] [PubMed]
  135. Selvaraj, M.G.; Valderrama, M.; Guzman, D.; Valencia, M.; Ruiz, H.; Acharjee, A. Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 2020, 16, 87. [Google Scholar] [CrossRef] [PubMed]
  136. Csillik, O.; Cherbini, J.; Johnson, R.; Lyons, A.; Kelly, M. Identification of citrus trees from unmanned aerial vehicle imagery using convolutional neural networks. Drones 2018, 2, 39. [Google Scholar] [CrossRef]
  137. Osco, L.P.; de Arruda, M.D.S.; Marcato Junior, J.; da Silva, N.B.; Ramos, A.P.M.; Moryia, É.A.S.; Imai, N.N.; Pereira, D.R.; Creste, J.E.; Matsubara, E.T.; et al. A convolutional neural network approach for counting and geolocating citrus-trees in UAV multispectral imagery. ISPRS J. Photogramm. Remote Sens. 2020, 160, 97–106. [Google Scholar] [CrossRef]
  138. Zhang, X.; Derival, M.; Albrecht, U.; Ampatzidis, Y. Evaluation of a ground penetrating radar to map the root architecture of HLB-infected citrus trees. Agronomy 2019, 9, 354. [Google Scholar] [CrossRef]
  139. Chang, A.; Yeom, J.; Jung, J.; Landivar, J. Comparison of canopy shape and vegetation indices of citrus trees derived from UAV multispectral images for characterization of citrus greening disease. Remote Sens. 2020, 12, 4122. [Google Scholar] [CrossRef]
  140. Rist, F.; Herzog, K.; Mack, J.; Richter, R.; Steinhage, V.; Töpfer, R. High-precision phenotyping of grape bunch architecture using fast 3d sensor and automation. Sensors 2018, 18, 763. [Google Scholar] [CrossRef] [PubMed]
  141. Rist, F.; Gabriel, D.; Mack, J.; Steinhage, V.; Töpfer, R.; Herzog, K. Combination of an automated 3D field phenotyping workflow and predictive modelling for high-throughput and non-invasive phenotyping of grape bunches. Remote Sens. 2019, 11, 2953. [Google Scholar] [CrossRef]
  142. Luo, L.; Liu, W.; Lu, Q.; Wang, J.; Wen, W.; Yan, D.; Tang, Y. Grape berry detection and size measurement based on edge image processing and geometric morphology. Machines 2021, 9, 233. [Google Scholar] [CrossRef]
  143. Liu, S.; Zeng, X.; Whitty, M. A vision-based robust grape berry counting algorithm for fast calibration-free bunch weight estimation in the field. Comput. Electron. Agric. 2020, 173, 105360. [Google Scholar] [CrossRef]
  144. Buayai, P.; Saikaew, K.R.; Mao, X. End-to-End Automatic Berry Counting for Table Grape Thinning. IEEE Access 2021, 9, 4829–4842. [Google Scholar] [CrossRef]
  145. Ramos, R.P.; Gomes, J.S.; Prates, R.M.; Simas Filho, E.F.; Teruel, B.J.; dos Santos Costa, D. Non-invasive setup for grape maturation classification using deep learning. J. Sci. Food Agric. 2021, 101, 2042–2051. [Google Scholar] [CrossRef] [PubMed]
  146. Anastasiou, E.; Balafoutis, A.; Darra, N.; Psiroukis, V.; Biniari, A.; Xanthopoulos, G.; Fountas, S. Satellite and proximal sensing to estimate the yield and quality of table grapes. Agriculture 2018, 8, 94. [Google Scholar] [CrossRef]
  147. Torres-Sánchez, J.; Mesas-Carrascosa, F.J.; Santesteban, L.G.; Jiménez-Brenes, F.M.; Oneka, O.; Villa-Llop, A.; Loidi, M.; López-Granados, F. Grape cluster detection using UAV photogrammetric point clouds as a low-cost tool for yield forecasting in vineyards. Sensors 2021, 21, 3083. [Google Scholar] [CrossRef]
  148. Olenskyj, A.G.; Sams, B.S.; Fei, Z.; Singh, V.; Raja, P.V.; Bornhorst, G.M.; Earles, J.M. End-to-end deep learning for directly estimating grape yield from ground-based imagery. Comput. Electron. Agric. 2022, 198, 107081. [Google Scholar] [CrossRef]
  149. Gao, Z.; Khot, L.R.; Naidu, R.A.; Zhang, Q. Early detection of grapevine leafroll disease in a red-berried wine grape cultivar using hyperspectral imaging. Comput. Electron. Agric. 2020, 179, 105807. [Google Scholar] [CrossRef]
  150. Seki, K.; Toda, Y. QTL mapping for seed morphology using the instance segmentation neural network in Lactuca spp. Front. Plant Sci. 2022, 13, 949470. [Google Scholar] [CrossRef]
  151. Du, J.; Li, B.; Lu, X.; Yang, X.; Guo, X.; Zhao, C. Quantitative phenotyping and evaluation for lettuce leaves of multiple semantic components. Plant Methods 2022, 18, 54. [Google Scholar] [CrossRef]
  152. Du, J.; Lu, X.; Fan, J.; Qin, Y.; Yang, X.; Guo, X. Image-Based High-Throughput Detection and Phenotype Evaluation Method for Multiple Lettuce Varieties. Front. Plant Sci. 2020, 11, 563386. [Google Scholar] [CrossRef]
  153. Zhang, L.; Xu, Z.; Xu, D.; Ma, J.; Chen, Y.; Fu, Z. Growth monitoring of greenhouse lettuce based on a convolutional neural network. Hortic. Res. 2020, 7, 124. [Google Scholar]
  154. Kim, C.; van Iersel, M.W. Morphological and Physiological Screening to Predict Lettuce Biomass Production in Controlled Environment Agriculture. Remote Sens. 2022, 14, 316. [Google Scholar] [CrossRef]
  155. Zhang, Y.; Li, M.; Li, G.; Li, J.; Zheng, L.; Zhang, M.; Wang, M. Multi-phenotypic parameters extraction and biomass estimation for lettuce based on point clouds. Measurement 2022, 204, 112094. [Google Scholar] [CrossRef]
  156. Maciel, G.M.; Gallis, R.B.A.; Barbosa, R.L.; Pereira, L.M.; Siquieroli, A.C.S.; Peixoto, J.V.M. Image phenotyping of lettuce germplasm with genetically diverse carotenoid levels. Bragantia 2020, 79, 224–235. [Google Scholar] [CrossRef]
  157. Osco, L.P.; Ramos, A.P.M.; Moriya, É.A.S.; Bavaresco, L.G.; de Lima, B.C.; Estrabis, N.; Pereira, D.R.; Creste, J.E.; Júnior, J.M.; Gonçalves, W.N.; et al. Modeling hyperspectral response of water-stress induced lettuce plants using artificial neural networks. Remote Sens. 2019, 11, 2797. [Google Scholar] [CrossRef]
  158. Sorrentino, M.; Colla, G.; Rouphael, Y.; Panzarová, K.; Trtílek, M. Lettuce reaction to drought stress: Automated high-throughput phenotyping of plant growth and photosynthetic performance. Acta Hortic. 2020, 1268, 133–141. [Google Scholar] [CrossRef]
  159. Wendel, A.; Underwood, J.; Walsh, K. Maturity estimation of mangoes using hyperspectral imaging from a ground based mobile platform. Comput. Electron. Agric. 2018, 155, 298–313. [Google Scholar] [CrossRef]
  160. Guo, Y.; Chen, S.; Wu, Z.; Wang, S.; Bryant, C.R.; Senthilnath, J.; Cunha, M.; Fu, Y.H. Integrating spectral and textural information for monitoring the growth of pear trees using optical images from the UAV platform. Remote Sens. 2021, 13, 1795. [Google Scholar] [CrossRef]
  161. Raju Ahmed, M.; Yasmin, J.; Wakholi, C.; Mukasa, P.; Cho, B.K. Classification of pepper seed quality based on internal structure using X-ray CT imaging. Comput. Electron. Agric. 2020, 179, 105839. [Google Scholar] [CrossRef]
  162. Horgan, G.W.; Song, Y.; Glasbey, C.A.; Van Der Heijden, G.W.A.M.; Polder, G.; Dieleman, J.A.; Bink, M.C.A.M.; Van Eeuwijk, F.A. Automated estimation of leaf area development in sweet pepper plants from image analysis. Funct. Plant Biol. 2015, 42, 486–492. [Google Scholar] [CrossRef]
  163. Musse, M.; Hajjar, G.; Ali, N.; Billiot, B.; Joly, G.; Pépin, J.; Quellec, S.; Challois, S.; Mariette, F.; Cambert, M.; et al. A global non-invasive methodology for the phenotyping of potato under water deficit conditions using imaging, physiological and molecular tools. Plant Methods 2021, 17, 81. [Google Scholar] [CrossRef] [PubMed]
  164. Van Harsselaar, J.K.; Claußen, J.; Lübeck, J.; Wörlein, N.; Uhlmann, N.; Sonnewald, U.; Gerth, S. X-ray CT Phenotyping Reveals Bi-Phasic Growth Phases of Potato Tubers Exposed to Combined Abiotic Stress. Front. Plant Sci. 2021, 12, 613108. [Google Scholar] [CrossRef] [PubMed]
  165. Caraza-Harter, M.V.; Endelman, J.B. Image-based phenotyping and genetic analysis of potato skin set and color. Crop Sci. 2020, 60, 202–210. [Google Scholar] [CrossRef]
  166. Si, Y.; Sankaran, S.; Knowles, N.R.; Pavek, M.J. Potato Tuber Length-Width Ratio Assessment Using Image Analysis. Am. J. Potato Res. 2017, 94, 88–93. [Google Scholar] [CrossRef]
  167. Yang, H.; Li, F.; Wang, W.; Yu, K. Estimating above-ground biomass of potato using random forest and optimized hyperspectral indices. Remote Sens. 2021, 13, 2339. [Google Scholar] [CrossRef]
  168. Liu, Y.; Feng, H.; Yue, J.; Fan, Y.; Jin, X.; Song, X.; Yang, H.; Yang, G. Estimation of Potato Above-Ground Biomass Based on Vegetation Indices and Green-Edge Parameters Obtained from UAVs. Remote Sens. 2022, 14, 5323. [Google Scholar] [CrossRef]
  169. Fan, Y.; Feng, H.; Jin, X.; Yue, J.; Liu, Y.; Li, Z.; Feng, Z.; Song, X.; Yang, G. Estimation of the nitrogen content of potato plants based on morphological parameters and visible light vegetation indices. Front. Plant Sci. 2022, 13, 1012070. [Google Scholar] [CrossRef]
  170. Muruganantham, P.; Samrat, N.H.; Islam, N.; Johnson, J.; Wibowo, S.; Grandhi, S. Rapid Estimation of Moisture Content in Unpeeled Potato Tubers Using Hyperspectral Imaging. Appl. Sci. 2023, 13, 53. [Google Scholar] [CrossRef]
  171. Sun, C.; Feng, L.; Zhang, Z.; Ma, Y.; Crosby, T.; Naber, M.; Wang, Y. Prediction of end-of-season tuber yield and tuber set in potatoes using in-season UAV-based hyperspectral imagery and machine learning. Sensors 2020, 20, 5293. [Google Scholar] [CrossRef]
  172. Van De Vijver, R.; Mertens, K.; Heungens, K.; Somers, B.; Nuyttens, D.; Borra-Serrano, I.; Lootens, P.; Roldán-Ruiz, I.; Vangeyte, J.; Saeys, W. In-field detection of Alternaria solani in potato crops using hyperspectral imaging. Comput. Electron. Agric. 2020, 168, 105106. [Google Scholar] [CrossRef]
  173. Duarte-Carvajalino, J.M.; Alzate, D.F.; Ramirez, A.A.; Santa-Sepulveda, J.D.; Fajardo-Rojas, A.E.; Soto-Suárez, M. Evaluating late blight severity in potato crops using unmanned aerial vehicles and machine learning algorithms. Remote Sens. 2018, 10, 1513. [Google Scholar] [CrossRef]
  174. Qi, C.; Sandroni, M.; Cairo Westergaard, J.; Høegh Riis Sundmark, E.; Bagge, M.; Alexandersson, E.; Gao, J. In-field classification of the asymptomatic biotrophic phase of potato late blight based on deep learning and proximal hyperspectral imaging. Comput. Electron. Agric. 2023, 205, 107585. [Google Scholar] [CrossRef]
  175. Saha, K.K.; Tsoulias, N.; Weltzien, C.; Zude-Sasse, M. Estimation of Vegetative Growth in Strawberry Plants Using Mobile LiDAR Laser Scanner. Horticulturae 2022, 8, 90. [Google Scholar] [CrossRef]
  176. Feldmann, M.J.; Hardigan, M.A.; Famula, R.A.; López, C.M.; Tabb, A.; Cole, G.S.; Knapp, S.J. Multi-dimensional machine learning approaches for fruit shape phenotyping in strawberry. GigaScience 2020, 9, giaa030. [Google Scholar] [CrossRef] [PubMed]
  177. Zingaretti, L.M.; Monfort, A.; Pérez-Enciso, M. Automatic fruit morphology phenome and genetic analysis: An application in the octoploid strawberry. Plant Phenom. 2021, 2021, 9812910. [Google Scholar] [CrossRef]
  178. Li, B.; Cockerton, H.M.; Johnson, A.W.; Karlström, A.; Stavridou, E.; Deakin, G.; Harrison, R.J. Defining strawberry shape uniformity using 3D imaging and genetic mapping. Hortic. Res. 2020, 7, 115. [Google Scholar] [CrossRef]
  179. Zheng, C.; Abd-Elrahman, A.; Whitaker, V.M.; Dalid, C. Deep Learning for Strawberry Canopy Delineation and Biomass Prediction from High-Resolution Images. Plant Phenom. 2022, 2022, 9850486. [Google Scholar] [CrossRef]
  180. Guan, Z.; Abd-Elrahman, A.; Fan, Z.; Whitaker, V.M.; Wilkinson, B. Modeling strawberry biomass and leaf area using object-based analysis of high-resolution images. ISPRS J. Photogramm. Remote Sens. 2020, 163, 171–186. [Google Scholar] [CrossRef]
  181. Cockerton, H.M.; Li, B.; Vickerstaff, R.J.; Eyre, C.A.; Sargent, D.J.; Armitage, A.D.; Marina-Montes, C.; Garcia-Cruz, A.; Passey, A.J.; Simpson, D.W.; et al. Identifying Verticillium dahliae Resistance in Strawberry Through Disease Screening of Multiple Populations and Image Based Phenotyping. Front. Plant Sci. 2019, 10, 924. [Google Scholar] [CrossRef]
  182. Poobalasubramanian, M.; Park, E.S.; Faqeerzada, M.A.; Kim, T.; Kim, M.S.; Baek, I.; Cho, B.K. Identification of Early Heat and Water Stress in Strawberry Plants Using Chlorophyll-Fluorescence Indices Extracted via Hyperspectral Images. Sensors 2022, 22, 8706. [Google Scholar] [CrossRef]
  183. Zhu, Y.; Gu, Q.; Zhao, Y.; Wan, H.; Wang, R.; Zhang, X.; Cheng, Y. Quantitative Extraction and Evaluation of Tomato Fruit Phenotypes Based on Image Recognition. Front. Plant Sci. 2022, 13, 859290. [Google Scholar] [CrossRef] [PubMed]
  184. Sun, G.; Wang, X.; Sun, Y.; Ding, Y.; Lu, W. Measurement method based on multispectral three-dimensional imaging for the chlorophyll contents of greenhouse tomato plants. Sensors 2019, 19, 3345. [Google Scholar] [CrossRef] [PubMed]
  185. Chang, A.; Jung, J.; Yeom, J.; Maeda, M.M.; Landivar, J.A.; Enciso, J.M.; Avila, C.A.; Anciso, J.R. Unmanned Aircraft System- (UAS-) Based High-Throughput Phenotyping (HTP) for Tomato Yield Estimation. J. Sens. 2021, 2021, 8875606. [Google Scholar] [CrossRef]
  186. Tatsumi, K.; Igarashi, N.; Mengxue, X. Prediction of plant-level tomato biomass and yield using machine learning with unmanned aerial vehicle imagery. Plant Methods 2021, 17, 77. [Google Scholar] [CrossRef]
  187. Méline, V.; Caldwell, D.L.; Kim, B.S.; Khangura, R.S.; Baireddy, S.; Yang, C.; Sparks, E.E.; Dilkes, B.; Delp, E.J.; Iyer-Pascuzzi, A.S. Image-based assessment of plant disease progression identifies new genetic loci for resistance to Ralstonia solanacearum in tomato. Plant J. 2023, 113, 887–903. [Google Scholar] [CrossRef] [PubMed]
  188. Žibrat, U.; Susič, N.; Knapič, M.; Širca, S.; Strajnar, P.; Razinger, J.; Vončina, A.; Urek, G.; Gerič Stare, B. Pipeline for imaging, extraction, pre-processing, and processing of time-series hyperspectral data for discriminating drought stress origin in tomatoes. MethodsX 2019, 6, 399–408. [Google Scholar] [CrossRef] [PubMed]
  189. Fullana-Pericàs, M.; Conesa, M.À.; Gago, J.; Ribas-Carbó, M.; Galmés, J. High-throughput phenotyping of a large tomato collection under water deficit: Combining UAVs’ remote sensing with conventional leaf-level physiologic and agronomic measurements. Agric. Water Manag. 2022, 260, 107283. [Google Scholar] [CrossRef]
  190. Tsaftaris, S.A.; Minervini, M.; Scharr, H. Machine Learning for Plant Phenotyping Needs Image Processing. Trends Plant Sci. 2016, 21, 989–991. [Google Scholar] [CrossRef]
  191. Mochida, K.; Koda, S.; Inoue, K.; Hirayama, T.; Tanaka, S.; Nishii, R.; Melgani, F. Computer vision-based phenotyping for improvement of plant productivity: A machine learning perspective. GigaScience 2019, 8, giy153. [Google Scholar] [CrossRef]
  192. Yoosefzadeh-Najafabadi, M.; Earl, H.J.; Tulpan, D.; Sulik, J.; Eskandari, M. Application of Machine Learning Algorithms in Plant Breeding: Predicting Yield from Hyperspectral Reflectance in Soybean. Front. Plant Sci. 2021, 11, 624273. [Google Scholar] [CrossRef]
  193. Khan, M.; Jan, B.; Farman, H. Deep Learning: Convergence to Big Data Analytics, 1st ed.; SpringerBriefs in Computer Science; Springer: Singapore, 2019. [Google Scholar]
  194. LeCun, Y.; Bengio, Y.; Hinton, G. Deep learning. Nature 2015, 521, 436–444. [Google Scholar] [CrossRef] [PubMed]
  195. Kim, Y.; Kwak, G.-H.; Lee, K.-D.; Na, S.-I.; Park, C.-W.; Park, N.-W. Performance Evaluation of Machine Learning and Deep Learning Algorithms in Crop Classification: Impact of Hyper-parameters and Training Sample Size. Korean J. Remote Sens. 2018, 34, 811–827. [Google Scholar]
  196. Toda, Y.; Okura, F.; Ito, J.; Okada, S.; Kinoshita, T.; Tsuji, H.; Saisho, D. Training instance segmentation neural network with synthetic datasets for crop seed phenotyping. Commun. Biol. 2020, 3, 173. [Google Scholar] [CrossRef] [PubMed]
  197. Singh, A.K.; Ganapathysubramanian, B.; Sarkar, S.; Singh, A. Deep Learning for Plant Stress Phenotyping: Trends and Future Perspectives. Trends Plant Sci. 2018, 23, 883–898. [Google Scholar] [CrossRef] [PubMed]
  198. Ubbens, J.R.; Stavness, I. Deep Plant Phenomics: A Deep Learning Platform for Complex Plant Phenotyping Tasks. Front. Plant Sci. 2017, 8, 1190. [Google Scholar] [CrossRef]
Figure 1. Schematic overview of image-based high-throughput phenotyping in horticultural crops. UAP, unmanned aerial platform; MAP, manned aerial platform; RGB, red–green–blue; LiDAR, light detection and ranging; X-ray CT, X-ray computed tomography; MRI, magnetic resonance imaging.
Figure 2. Examples of images from commonly used sensors in high-throughput phenotyping and their spectral range. (a) RGB; (b) NIR [71]; (c) SWIR [72]; (d) thermal (Qubit phenomics, Canada); (e) X-ray [73]; (f) MRI [66]; (g) fluorescence; (h) LiDAR and photogrammetry point cloud (Pix4D S.A., Prilly, Switzerland). RGB and fluorescence images were captured in our lab.
Figure 3. Statistics of image-based high-throughput phenotyping studies in horticultural crops during the past two decades. (a) The number of annual publications related to image-based phenotyping of horticultural crops. (b) Major areas of research using image-based high-throughput phenotyping. (c) Annual number of publications with different search keywords. (d) The type of publications related to image-based high-throughput phenotyping of horticultural crops. (e) Number of image-based high-throughput phenotyping studies by country (top 20). (f) Type of imaging techniques for high-throughput phenotyping of horticultural crops. Note: the data were obtained from the Scopus (https://www.scopus.com) database (Elsevier, The Netherlands) accessed on 21 February 2023. The publications were searched using keywords: horticultural crop, fruit, vegetable, ornamental, flower, and sensor types (RGB, thermal, hyperspectral, multispectral, X-ray CT, MRI, LiDAR, ToF, Raman, ChlF) within the search results of high-throughput phenotyping using image analysis.
Table 1. Summary of most common imaging techniques used in high-throughput plant phenotyping [11,75].
| Imaging Technique a | Phenotypic Traits | Advantages | Limitations | Potential Applications |
| --- | --- | --- | --- | --- |
| Visible light imaging | Shape, color, size, biomass, pigment content, disease and pest, stress responses, nutrient stress, vegetation indices | Cheap, easy operation and maintenance, provides color information, high resolution, fast data acquisition | Limited to three spectral bands (RGB), affected by light, provides only relative measurements | Growth monitoring, plant stress detection, fruit maturity and ripening estimation, grading and sorting, quality evaluation, yield prediction, 3D modeling, crop management, robotic harvesting |
| Thermal imaging | Leaf greenness, leaf color, leaf chlorophyll content, leaf/canopy temperature, disease and pest, phenology, photosynthetic status | Wide measurement range, background interference can be removed | Requires sensor calibration and atmospheric correction; through-time comparison is difficult because changing ambient conditions affect canopy temperature; needs a reference for comparison; difficult to separate soil and plant temperature in sparse canopies (limiting automation of image processing) | Plant stress detection, irrigation scheduling |
| Hyperspectral imaging | Leaf/canopy water status, canopy coverage and volume, leaf greenness, disease and pest, photosynthetic rate, nutrient stress, metabolites | High spectral resolution, background interference can be removed | Expensive; low spatial resolution; very large image data are challenging to store and analyze; affected by ambient light conditions | Growth monitoring, biotic and abiotic stress detection, fruit maturity and ripening estimation, quality evaluation, biomass estimation, metabolite prediction |
| Fluorescence imaging | Chlorophyll content, canopy coverage, disease and pest, photosynthetic status | Sensitive to fluorescence and water stress | Limited field application; difficult to measure at the canopy scale due to the small signal-to-noise ratio | Growth monitoring, early detection of biotic and abiotic stress |
| Multispectral imaging | Canopy coverage and volume, chlorophyll content, leaf greenness, plant diseases and pests, photosynthetic status, water content | Easy image processing; mature technology | Limited to several spectral bands; spectral data must be frequently calibrated against reference objects; data signal affected by camera geometry, illumination conditions, and sun angle | Growth monitoring, biotic and abiotic stress detection |
| LiDAR | Plant height, canopy volume, shoot biomass | Provides three-dimensional shape | Expensive; sensitive to small differences in path length; some laser scanning instruments require specific illumination; data processing is time-consuming | Growth monitoring, structure capture |
| 3D laser scanner | Geometrical plant traits such as shape, length, height, canopy structure and volume | Long measurement distance; high precision; good penetration | Expensive; affected by external factors such as wind and fog | Growth monitoring, organ morphogenesis |
| MRI | Internal structures, metabolites, development of root systems, water presence | Available for screening 3D structural information | Expensive, low throughput, slow data acquisition | Acquisition of 3D structures of the whole plant or plant parts |
| X-ray CT | Size and shape | Large penetration depth, scalable field of view, minimal sample preparation | Expensive, low throughput | Growth monitoring, seed and fruit development, organ morphogenesis, 3D visualization of plant organs and tissues |

a Imaging technique: LiDAR—light detection and ranging; MRI—magnetic resonance imaging; X-ray CT—X-ray computed tomography.
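Several rows in Table 1 mention vegetation indices derived from band reflectances. As a minimal illustration of how such an index is computed, the sketch below derives NDVI (normalized difference vegetation index) from NIR and red reflectance arrays with NumPy; the function name and toy reflectance values are ours for illustration, not taken from any cited study.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - red) / (NIR + red).

    Values near +1 indicate dense green vegetation; values near 0
    indicate soil, background, or senescent tissue.
    """
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    denom = nir + red
    # Treat zero-signal (background) pixels as NDVI = 0 instead of dividing by zero.
    safe = np.where(denom == 0.0, 1.0, denom)
    return np.where(denom == 0.0, 0.0, (nir - red) / safe)

# Toy 2 x 2 reflectance "images" for the NIR and red bands.
nir_band = np.array([[0.8, 0.6], [0.5, 0.0]])
red_band = np.array([[0.1, 0.2], [0.3, 0.0]])
print(ndvi(nir_band, red_band))  # high values over dense canopy, 0 for background
```

In practice the two bands come from co-registered multispectral image planes, so the same element-wise arithmetic applies per pixel across the whole scene.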
Table 2. Examples of sensors used for different imaging techniques in high-throughput plant phenotyping.
| Imaging Technique | Sensor (Manufacturer) | Resolution | Crop | Reference |
| --- | --- | --- | --- | --- |
| Visible light imaging | DJI Phantom 4 Pro (DJI Technology Co., Shenzhen, China) | 3000 × 4000 px | Strawberry | [76] |
| | Sony Cyber-shot DSC-H3 camera (Sony Corporation, Tokyo, Japan) | 3264 × 2448 px | Tomato | [38] |
| | Fujifilm X20 (Fujifilm Corporation, Tokyo, Japan) | 3000 × 4000 px | Apple | [77] |
| Thermal imaging | 3DR Solo quadcopter (3D Robotics, Berkeley, CA, USA) | 1280 × 960 px | Banana | [23] |
| | VarioCAM hr inspect 575 (Jenoptik, Jena, Germany) | 768 × 576 px | Mango | [78] |
| Hyperspectral imaging | Pika L 2.4 (Resonon Inc., Bozeman, MT, USA) | unknown | Tomato | [58] |
| | HySpex VNIR 1800, HySpex SWIR 384 (Norsk Elektro Optikk A/S, Skedsmokorset, Norway) | unknown | Grape | [79] |
| Fluorescence imaging | PlantScreen™ Transect XZ system (Photon Systems Instruments, Drásov, Czech Republic) | 1360 × 1024 px | Lettuce | [64] |
| Multispectral | Parrot Sequoia camera (Parrot Drone SAS, Paris, France) | 1280 × 960 px | Banana | [80] |
| | RedEdge-M (MicaSense, Seattle, WA, USA) | 1280 × 960 px | Citrus | [81] |
| LiDAR | PlantEye F400 (Phenospex, Heerlen, The Netherlands) | unknown | Potato | [82] |
| 3D laser scanner | FARO Focus 3D 120 terrestrial laser scanner (Faro Technologies Inc., Lake Mary, FL, USA) | 1/5 | Cassava | [83] |
| X-ray CT | X-ray imaging system (Xeye-5100F, SEC, Suwon, Republic of Korea) | 2304 × 1300 px | Watermelon | [70] |
Table 3. Applications of image-based high-throughput phenotyping in horticultural crops.
CropTraitSensor aEnvironmentReference
AppleSeed morphologyRGBLaboratory[117]
Plant growthRGBField[40,118,119]
Fruit detection and countingRGBField[77,120,121]
Yield predictionMS, RGBField[45,122]
Fruit ripeningAerial videoField[123]
Winter dormancyChlFField[124]
Low oxygen stressChlFLaboratory[125]
Apple scabThermal, MSControlled[101,102]
Fire blightMS, HSField[21,56,103]
Powdery mildewRGB, MSField[48]
Drought stressThermal, MSField[112]
BananaPlant growthLaser scanningField[80,126]
Plant countingMS, laser scanningField[23,126]
Fruit maturityRGBLaboratory[92]
Yellow SigatokaRGBField[17]
Multiple diseasesRGBField[49]
Fusarium wiltMSField[22]
CabbageSeed morphologyRGBLaboratory[127]
Plant growthRGBField[41]
Shoot biomassRGBField [33]
Heat stressMSField[41]
CarrotRoot morphologyRGBLaboratory[39,128,129]
CassavaRoot bulking rateGPRField[130]
Root morphologyRGBField and Laboratory[131,132]
Shoot biomassLiDARField[83]
Root biomassGPRField[133]
Carotenoid contentRGBLaboratory[97]
Starch contentThermalField[134]
           | Plant growth | RGB, MS | Controlled and field | [42,135]
           | Bacterial blight | RGB | Laboratory | [107]
Citrus     | Plant counting | RGB | Field | [24,81,136,137]
           | Plant water status | Thermal | Controlled | [54]
           | Citrus canker | HS | Field | [57]
           | Huanglongbing (HLB) | GPR, MS, ChlF | Field | [62,138,139]
Grape      | Bunch architecture | 3D scanner | Field | [140,141,142]
           | Berry counting | RGB | Field | [143,144]
           | Berry maturity | RGB | Laboratory | [145]
           | Yield prediction | RGB, HS | Field | [47,146,147,148]
           | Grape yellows | HS | Field | [79]
           | Grape leafroll | HS | Laboratory | [149]
           | Powdery mildew | RGB | Laboratory | [104]
           | Drought stress | RGB, Thermal | Field | [113]
Lettuce    | Seed morphology | RGB | Laboratory | [150]
           | Leaf semantic components | RGB | Laboratory | [151]
           | Plant growth | RGB | Controlled | [43,152,153]
           | Shoot biomass | ChlF | Controlled | [154]
           | Anthocyanin content | RGB | Field | [31,155]
           | Carotenoid content | RGB | Field | [31,156]
           | Chlorophyll content | RGB | Field | [31]
           | Drought stress | HS, ChlF | Controlled | [63,157,158]
           | Salt stress | ChlF | Controlled | [64]
Mango      | Fruit maturity | HS, LiDAR | Field | [159]
           | Fruit ripening | HS | Field | [60]
           | Fruit detection | RGB | Field | [30]
           | Drought stress | Thermal | Field | [78]
Pear       | Plant growth | RGB | Field | [160]
           | Fire blight | HS | Field | [20]
           | Russet | RGB | Laboratory | [50]
Pepper     | Seed quality | X-ray CT | Laboratory | [161]
           | Leaf area | RGB | Controlled | [162]
Potato     | Crop emergence | RGB | Field | [44]
           | Plant growth | RGB | Controlled | [163]
           | Tuber growth and development | X-ray CT | Controlled | [164]
           | Tuber skin color | RGB | Laboratory | [165]
           | Tuber size | RGB | Laboratory | [166]
           | Shoot biomass | RGB, HS | Field | [35,167,168]
           | Nitrogen content | RGB | Field | [169]
           | Tuber moisture content | HS | Laboratory | [170]
           | Stomatal conductance | Thermal | Field | [53]
           | Yield prediction | RGB, Thermal, HS | Field | [46,171]
           | Early blight | HS | Field | [172]
           | Late blight | RGB, MS | Field | [19,99,173,174]
           | Bacterial soft rot | RGB | Laboratory | [51]
           | Verticillium wilt | MS | Field | [18]
           | Drought stress | LiDAR | Controlled | [82]
Strawberry | Plant growth | LiDAR | Field | [175]
           | Fruit morphology | RGB | Laboratory | [84,176,177,178]
           | Shoot biomass | RGB, Thermal | Field | [28,179,180]
           | Yield prediction | RGB | Field | [76]
           | Leaf gray mold | HS | Laboratory | [100]
           | Verticillium wilt | RGB, MS | Field | [181]
           | Heat and drought stress | HS | Controlled | [182]
Tomato     | Root architecture | RGB | Controlled | [38]
           | Shoot biomass | RGB | Controlled | [29,37]
           | Fruit morphology | RGB | Laboratory | [183]
           | Nitrogen, phosphorus, potassium content | MS | Controlled | [98]
           | Chlorophyll content | RGB, MS | Controlled | [184]
           | Yield prediction | RGB, MS | Field | [185,186]
           | Bacterial wilt | ChlF | Controlled | [108,187]
           | TYLC, early blight, bacterial spot | HS | Field | [16,58]
           | Drought stress | RGB, Thermal, HS | Controlled, field | [188,189]
           | Chilling injury (seedling) | ChlF | Controlled | [65]
           | Salt stress | RGB, MS, Thermal | Field | [36]
a Sensor: RGB—red, green, blue; IR—infrared; HS—hyperspectral; MS—multispectral; NIR—near infrared; ChlF—chlorophyll fluorescence; LiDAR—light detection and ranging; GPR—ground penetrating radar; X-ray CT—X-ray computed tomography.
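Many of the RGB-based growth and biomass entries in the table above rest on the same basic step: segmenting plant pixels from the background and counting them as a projected-area proxy. The sketch below illustrates this with the excess-green index (ExG = 2G − R − B), a common baseline for green-canopy segmentation; the threshold, pixel size, and test image are illustrative and not taken from the reviewed studies.

```python
import numpy as np

def projected_plant_area(rgb, pixel_area_mm2=1.0, threshold=20):
    """Estimate projected shoot area from a top-view RGB image.

    Segments plant pixels with the excess-green index
    ExG = 2*G - R - B, then counts foreground pixels and
    scales by the area each pixel covers.
    """
    # Cast to a signed type so 2*G - R - B cannot overflow uint8.
    r, g, b = (rgb[..., i].astype(np.int32) for i in range(3))
    exg = 2 * g - r - b
    mask = exg > threshold              # plant vs. soil/background
    return mask.sum() * pixel_area_mm2, mask

# Toy image: a 10x10 soil-colored frame with a 4x4 green "plant".
img = np.full((10, 10, 3), (120, 100, 80), dtype=np.uint8)
img[3:7, 3:7] = (40, 160, 50)           # strongly green patch
area, mask = projected_plant_area(img)  # area == 16.0 mm^2
```

In practice such masks feed the growth-curve and shoot-biomass traits listed above, after calibration of the pixel-to-area scale and a more robust segmentation (e.g., machine-learning classifiers) under variable illumination.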
Table 4. Examples of machine learning and deep learning applications in high-throughput plant phenotyping.
Algorithm Application | Algorithm Type | Imaging Technique | Plant Species | Phenotypic Trait | Reference
Classification | Faster R-CNN | RGB | Strawberry | Yield prediction | [76]
Classification | Convolutional Network | X-ray CT | Watermelon | Seed quality | [70]
Classification | Linear Discriminant Model, Partial Least Squares, Multi-Layer Perceptron, Radial-Basis Function Network | Hyperspectral | Grape | Grape yellows | [79]
Classification | YOLOv3 | Multispectral | Hamlin citrus | Plant counting | [81]
Detection | 3D Point Clouds | Visible light | Apple | Fruit detection and counting | [77]
Detection | Convolutional Neural Network, Template Matching, Local Maximum Filter | Thermal | Banana | Plant counting | [23]
Identification | Neural Network Radial Basis Function, K-Nearest Neighbor | Hyperspectral | Tomato | Bacterial spot | [58]
Identification | PCA | Fluorescence | Lettuce | Salt stress | [64]
Identification | EVI2 Threshold | Multispectral | Banana | Plant growth | [80]
Identification | Logistic Regression | LiDAR | Potato | Drought stress | [82]
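To make the identification entries in Table 4 concrete, the sketch below shows the k-nearest-neighbor idea used, among others, for hyperspectral disease identification in tomato [58]: an unknown sample's spectral feature vector is assigned the majority label of its closest training samples. The two synthetic "band-ratio" features, cluster centers, and labels are entirely illustrative, not data from the cited study.

```python
import numpy as np

def knn_classify(train_X, train_y, query, k=3):
    """Classify a spectral feature vector by majority vote of its
    k nearest training samples (Euclidean distance)."""
    dists = np.linalg.norm(train_X - query, axis=1)
    nearest = train_y[np.argsort(dists)[:k]]
    labels, counts = np.unique(nearest, return_counts=True)
    return labels[np.argmax(counts)]

# Toy reflectance features: healthy leaves cluster near (0.8, 0.3),
# diseased leaves near (0.5, 0.6) in two synthetic band ratios.
rng = np.random.default_rng(0)
healthy = rng.normal((0.8, 0.3), 0.02, size=(20, 2))
diseased = rng.normal((0.5, 0.6), 0.02, size=(20, 2))
X = np.vstack([healthy, diseased])
y = np.array([0] * 20 + [1] * 20)   # 0 = healthy, 1 = diseased

pred = knn_classify(X, y, np.array([0.79, 0.31]))  # near healthy cluster
```

Real hyperspectral pipelines first reduce hundreds of bands to a few informative features (e.g., by PCA or band selection, as in several Table 4 entries) before such a classifier is applied.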
Abebe, A.M.; Kim, Y.; Kim, J.; Kim, S.L.; Baek, J. Image-Based High-Throughput Phenotyping in Horticultural Crops. Plants 2023, 12, 2061. https://doi.org/10.3390/plants12102061