Review

Remote Sensing and Machine Learning in Crop Phenotyping and Management, with an Emphasis on Applications in Strawberry Farming

1 Gulf Coast Research and Education Center, University of Florida, Wimauma, FL 33598, USA
2 School of Forest Resources and Conservation, University of Florida, Gainesville, FL 32603, USA
3 Department of Horticultural Sciences, University of Florida, Gainesville, FL 32611, USA
* Author to whom correspondence should be addressed.
Remote Sens. 2021, 13(3), 531; https://doi.org/10.3390/rs13030531
Submission received: 10 December 2020 / Revised: 18 January 2021 / Accepted: 27 January 2021 / Published: 2 February 2021
(This article belongs to the Special Issue Digital Agriculture with Remote Sensing)

Abstract:
Measurement of plant characteristics is still the primary bottleneck in both plant breeding and crop management. Rapid and accurate acquisition of information about large plant populations is critical for monitoring plant health and dissecting the underlying genetic traits. In recent years, high-throughput phenotyping technology has benefitted immensely from both remote sensing and machine learning. Simultaneous use of multiple sensors (e.g., high-resolution RGB, multispectral, hyperspectral, chlorophyll fluorescence, and light detection and ranging (LiDAR)) allows a range of spatial and spectral resolutions depending on the trait in question. Meanwhile, computer vision and machine learning methodology have emerged as powerful tools for extracting useful biological information from image data. Together, these tools allow the evaluation of various morphological, structural, biophysical, and biochemical traits. In this review, we focus on the recent development of phenomics approaches in strawberry farming, particularly those utilizing remote sensing and machine learning, with an eye toward future prospects for strawberries in precision agriculture. The research discussed is broadly categorized according to strawberry traits related to (1) fruit/flower detection, fruit maturity, fruit quality, internal fruit attributes, fruit shape, and yield prediction; (2) leaf and canopy attributes; (3) water stress; and (4) pest and disease detection. Finally, we present a synthesis of the potential research opportunities and directions that could further promote the use of remote sensing and machine learning in strawberry farming.

Graphical Abstract

1. Introduction

According to the Food and Agriculture Organization (FAO)’s Future of Food and Agriculture: Alternative Pathways to 2050 report, the global population will reach almost 10 billion in 2050 [1], which mandates a continued increase in crop production. Meanwhile, agriculture faces growing resource constraints, with water and land availability shrinking under climate change. Precision agriculture is an important approach to meeting this demand: an operation and management system, supported by information technology, that makes targeted measurements of plant growth, plant health, soil conditions, and other factors [2,3].
Through the integration of the Global Navigation Satellite System (GNSS), Geographic Information System (GIS), and remote sensing technologies, precision agriculture can help achieve a number of specific goals, such as (1) conducting farmland surveys; (2) applying fertilizers, pesticides, and irrigation in a site-specific manner; and (3) monitoring crop status, soil moisture, diseases, and pests at fine scales [2]. Implementing informed, science-based decision-making protocols can increase profits and productivity, environmental sustainability, crop quality, and ultimately food security [4]. Applications of precision agriculture have gradually spread throughout the world as the adoption of auto-guidance systems, yield monitoring technology, and variable rate technology (VRT) has increased in both developed and developing countries over the past 20 years [5].
Another important application of precision agriculture is in plant phenotyping, particularly within the context of breeding and genetic research. Phenotyping is broadly defined as the acquisition and evaluation of complex plant traits, such as geometric structure, abiotic stress tolerance, disease resistance, yield, and other physiological and biochemical characteristics. The measurement of economically important traits is essential to plant breeding [6]. With the combination of remote sensing, computer vision, and robotics, high-throughput plant phenotyping platforms have been developed. These systems usually use multiple sensors to measure various traits, such as color, texture, plant height, area, volume, degree of wilting, fresh weight, number of flowers/fruits, and quality of fruits [7]. This information enables scientists to establish a connection between genotype and phenotype, thus allowing them to select resilient varieties with high yield potential in the target environment. Of course, the same technologies and similar approaches are also valuable for crop management, specifically determination of nutrient needs, water, and pesticide requirements, as well as the detection of weeds, pathogens, and pests [8].
At present, plant phenotyping is the primary bottleneck in both plant breeding and crop management. Connecting phenotype to genotype in a set of target environments is the basic goal [9]. Next-generation advances in DNA sequencing technology and genome assembly methodology have dramatically increased the throughput and lowered the cost of genotyping. However, connecting this mountain of genomic information to the expression of traits is still a knotty problem [10]. The greatest challenge at present is to rapidly acquire large-scale plant phenotyping data with high dimensionality, density, and accuracy from single molecules to entire organisms. While new phenomics technology has significantly relieved some bottlenecks, many questions remain on how to efficiently define and extract complex traits as well as improve accuracy and throughput [11]. New advances in remote sensing and machine learning have the capacity to solve many of these problems.
Strawberry (Fragaria × ananassa) is a very popular fruit among consumers by virtue of its appealing appearance, flavor, and health benefits [12]. The latest statistics from the FAO show that the world’s strawberry yield and cultivated area from 1961 to 2018 grew at annual rates of about 1.82% and 2.44%, respectively (Figure 1). A large portion of the research conducted during this period focused on the medical benefits of strawberries to human health [6]. The high-value market for strawberries has significantly promoted the breeding of new varieties worldwide, including in Europe, Asia, and North America [13], which is now driving the need for high-throughput phenotyping techniques. Accurate and rapid acquisition of heritable traits of interest is critical to improving selection accuracy in strawberry breeding. Remote sensing and machine learning can greatly relieve the burden of manual strawberry phenotyping tasks, such as plant height measurement and fruit quality evaluation, and effectively improving their accuracy and throughput remains an active research theme. Strawberry is also a highly perishable and labor-intensive crop that can benefit greatly from precision agriculture approaches. The fruits pass through many developmental stages and, when ripe, are very sensitive to environmental and management conditions. Plant development and fruit production can continually cycle and change over a 6-month period, depending on the growing region. Therefore, real-time, intelligent monitoring of plant health and development, as well as fruit quality assessment, is essential for crop management and strategy formulation. The combination of remote sensing and machine learning holds enormous potential for broad application in these areas.
In this manuscript, we review the use of remote sensing and machine learning in agricultural applications, focusing on the latest advances in strawberry phenotyping and management. We also present a synthesis of potential research opportunities and directions that could further support strawberry farming. A rigorous two-step approach was adopted to search and screen the literature related to remote sensing and machine learning applications, with an emphasis on strawberry. Details of the adopted approach and the number of articles on each topic are shown in Appendix A and Figure A1, respectively.

2. Remote Sensing Platforms and Sensors

Remote sensing technology has developed rapidly in recent years, with sensors providing higher spatial, temporal, and spectral resolution images. Remotely sensed data are acquired by mounting sensors on multiple platforms, including satellites, unmanned aerial vehicles (drones), and ground-based vehicles. The unparalleled advantage of satellite observation lies in its large area of coverage, which allows for collecting various types of datasets, routinely on a global scale. A summary of agricultural data sources by Huang et al. [15] presented 28 optical and synthetic aperture radar (SAR) satellites for plant vegetation studies, with spatial resolutions varying from 0.3 m to 1 km. Research on the agriculture-related applications of satellite sensors focuses on several aspects, including crop type classification [16], soil property determination [17,18,19], crop mapping and spatial statistics [20], crop yield forecasting and canopy parameter estimation [21], and irrigation/drought evaluation [19,22]. For example, Sentinel-2 remote sensing imagery was used to retrieve various biophysical parameters of winter wheat, including the leaf area index, leaf chlorophyll content, and canopy chlorophyll content, utilizing vegetation indices and radiative transfer modeling [23]. A combination of two indices, enhanced vegetation index (EVI) and vegetation optical depth (VOD), derived from optical (MODIS) and microwave (Soil Moisture Active Passive Satellite, SMAP) remote sensors, respectively, was used to make a prediction of the corn, soybean, and wheat yield on the county scale, with an accuracy of over 76% [24].
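Vegetation indices such as EVI are simple band-arithmetic transforms of per-pixel reflectances. As a minimal sketch, using the standard MODIS EVI coefficients rather than the exact processing chain of [24]:

```python
import numpy as np

def evi(nir, red, blue, G=2.5, C1=6.0, C2=7.5, L=1.0):
    """MODIS-style Enhanced Vegetation Index from surface reflectances (0-1).

    G, C1, C2, and L are the standard MODIS gain, aerosol-resistance, and
    canopy-background coefficients.
    """
    nir, red, blue = (np.asarray(a, dtype=float) for a in (nir, red, blue))
    return G * (nir - red) / (nir + C1 * red - C2 * blue + L)

# Example: a healthy-vegetation pixel (high NIR, low red reflectance).
print(round(float(evi(0.45, 0.08, 0.04)), 3))  # -> 0.567
```

The function broadcasts over NumPy arrays, so it applies equally to single pixels or whole reflectance rasters.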
In contrast to satellites, unmanned aerial vehicles (UAVs), or drones, can carry low-cost sensors and can operate on a flexible on-demand schedule. Due to higher spatial resolution, low cost, and high maneuverability, drones have become one of the most widely used remote sensing platforms in agriculture. Yang et al. [7] investigated the current progress and future prospects of UAVs as a remote sensing platform by reviewing 96 articles. Radoglou-Grammatikis et al. [25] further made a comprehensive survey of UAV applications in precision agriculture. Currently, non-destructive crop monitoring and smart spraying are two of the primary UAV applications. Moreover, the integration of UAVs, wireless sensor networks (WSNs), and the Internet of Things (IoT) and the maturity of 5th-generation (5G) technology can make several applications such as pesticide application, irrigation, crop monitoring, and soil property analysis more precise, timely, and efficient [26,27,28,29]. Compared with satellites and drones, ground-based platforms enable close-range detection of plant characteristics and generally serve as ground truth information sources for sensor calibration and data quality control. Ground-based platforms can be categorized into sensors mounted on fixed platforms, such as towers or booms (fixed scanning systems); handheld field measuring instruments; and sensors mounted on mobile ground vehicles [30,31].
Currently, the main sensors used in remote sensing agricultural applications consist of passive multispectral, hyperspectral, visible RGB (VIS), and near-infrared (NIR) sensors, fluorescence spectroscopy and imaging sensors, light detection and ranging (LiDAR), and synthetic aperture radar (SAR) [32]. High-resolution RGB images are widely used in vegetation classification; identification of plant leaves, canopy, and fruits; and estimation of geometric attributes. Multispectral and hyperspectral imaging provides spectral information about various parameters related to physiological and biochemical attributes, such as the leaf area index (LAI), crop water content, leaf/canopy chlorophyll content, and nitrogen content [7,33]. These parameters are very useful for crop growth evaluation and yield prediction. Fluorescence remote sensing is efficient in retrieving the chlorophyll and nitrogen content, nitrogen-to-carbon ratio, and LAI [34]. LiDAR has the advantage of a high point-cloud density, which is useful for obtaining horizontal and vertical structural characteristics of plants [35]. A synthetic aperture radar can function in very low visibility weather conditions (e.g., cloud cover). It has been extensively explored in crop classification, crop growth monitoring, and soil moisture monitoring [36,37,38]. Specific uses of different sensor types in different agricultural applications are elaborated by Yang et al. [7].
Strawberry differs from most agronomic crops, such as corn, soybean, and wheat, in several respects. It is clonally propagated, and a single plant is relatively small but has a complex growth habit comprising the crown, leaves, runners, inflorescences, and fleshy fruits. Higher-spatial-resolution imagery is therefore needed to resolve the canopy structure and identify the fruits. Handheld sensors, as well as sensors mounted on UAVs and ground-based platforms, have been used to study various strawberry phenotypic traits. Some commonly used UAV types for agricultural applications are elaborated by Radoglou-Grammatikis et al. [25]. Ground-based platforms (e.g., tractors) have been used to collect high-quality images and generate 3D point clouds of strawberry plants [39]. Handheld non-imaging spectrometers that cover a wide spectral range (350–2500 nm) and provide continuous spectral reflectance can be used to study strawberry physiological characteristics [40]. Additionally, many researchers have designed various types of phenotyping platforms for strawberry disease detection and fruit quality evaluation. Multispectral or hyperspectral sensors mounted on various platforms have been used for specific purposes, such as powdery mildew detection, fruit grading, and 3D fruit reconstruction [41]. Some platforms and topics discussed in this review are shown in Figure 2.

3. Machine and Deep Learning Analysis Methods

Machine learning (ML) is one of the most effective ways to process and analyze the vast amounts of data obtained by today’s remote sensing techniques. In general, machine learning used in the agricultural field can be grouped into four categories: (1) crop monitoring, including yield estimation, disease and weed detection, species recognition, and crop quality assessment; (2) livestock management, such as animal welfare and livestock production; (3) water regulation, for example, plant evapotranspiration estimation; and (4) soil management, including the identification and prediction of soil temperature and moisture content [42].
Traditional machine learning methods, such as support vector machines (SVMs), artificial neural networks (ANNs), and random forests (RFs), require the extraction of key features from image or LiDAR datasets that sufficiently represent the characteristics of the studied objects or phenomena [43,44]. The quality of selected features is critical to the classification or prediction performance [45]. However, finding the best feature subset can be a time-consuming and subjective process, especially for highly dimensional datasets and in problems with a complex domain (e.g., crop yield estimation) [46]. For example, Sabanci et al. [47] extracted 12 features of wheat grains from high-resolution RGB images, including grain dimension (length, width, perimeter, and area), spectral band (red, green, and blue), and texture (contrast, correlation, energy, homogeneity, and entropy) information. These features were imported into an ANN model to classify the wheat into two types, bread and durum, with an accuracy higher than 97%.
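The feature-based workflow above can be sketched end to end. The toy example below uses synthetic, fabricated grain features and a simple nearest-centroid rule in place of the ANN of [47]; it illustrates the extract-features-then-classify pattern, not the cited study's actual pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for hand-crafted grain features
# (length, width, area, mean red intensity); values are illustrative only.
bread = rng.normal([5.8, 3.1, 14.0, 120.0], 0.3, size=(50, 4))
durum = rng.normal([7.0, 2.6, 15.5, 140.0], 0.3, size=(50, 4))

X = np.vstack([bread, durum])
y = np.array([0] * 50 + [1] * 50)  # 0 = bread, 1 = durum

# Standardize features, then classify by nearest class centroid.
mu, sd = X.mean(axis=0), X.std(axis=0)
Xs = (X - mu) / sd
centroids = np.stack([Xs[y == c].mean(axis=0) for c in (0, 1)])

def predict(x):
    xs = (x - mu) / sd
    return int(np.argmin(((centroids - xs) ** 2).sum(axis=1)))

accuracy = np.mean([predict(x) == t for x, t in zip(X, y)])
print(f"training accuracy: {accuracy:.2f}")
```

Swapping the nearest-centroid rule for an ANN, SVM, or RF changes only the final step; the feature engineering burden the paragraph describes stays the same.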
Deep learning (DL) has emerged as perhaps the most important branch of machine learning. Deep learning refers to the extension of ANNs to accommodate neural networks with a relatively large number of layers that enable hierarchical data representation [48,49]. Mainstream deep learning models at present include deep neural networks (DNNs) [50,51], recurrent/recursive neural networks (RNNs) for sequence or time data processing [52], convolutional neural networks (CNNs) for image analysis [53,54], deep generative models [55], and auto-encoder networks [56]. In contrast to conventional ML algorithms, DL models can achieve optimal discrimination features by determining a set of parameters during the training process; thus a specific step for feature extraction is not required [49,57]. The main disadvantages of DL methods are the need for massive training datasets, computing capacity, and training time [58].
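The hierarchical feature extraction that gives CNNs their power rests on a few simple operations. A minimal NumPy forward pass (convolution, ReLU, max-pooling) with a hand-set edge filter, standing in for learned weights, shows how an early layer responds to a local feature:

```python
import numpy as np

def conv2d(img, kernel):
    """Valid-mode 2D cross-correlation, the core operation of a CNN layer."""
    kh, kw = kernel.shape
    H, W = img.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    return np.maximum(x, 0.0)

def max_pool(x, k=2):
    H, W = x.shape
    H2, W2 = H // k, W // k
    return x[:H2 * k, :W2 * k].reshape(H2, k, W2, k).max(axis=(1, 3))

# A vertical-edge kernel: an early CNN layer learns filters much like this.
img = np.zeros((8, 8))
img[:, 4:] = 1.0                     # image with a vertical edge
edge_kernel = np.array([[-1., 0., 1.],
                        [-1., 0., 1.],
                        [-1., 0., 1.]])

feature_map = max_pool(relu(conv2d(img, edge_kernel)))
print(feature_map.shape)             # (3, 3): spatially downsampled map
```

In a trained network, many such filters run in parallel, and deeper layers compose their outputs into the shape and texture concepts mentioned above; training simply learns the kernel values instead of setting them by hand.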
Improving existing DL methods and creating novel algorithms have been the goals of numerous studies involving agricultural applications. Kamilaris et al. [59] reviewed the agricultural problems solved using DL, common DL models and frameworks, data sources and corresponding preprocessing procedures, and the overall performance of DL by summarizing 40 studies. They identified land cover classification, crop type estimation, crop phenology, fruit and weed detection, and fruit grading as the current main applications of DL in the agriculture field. DL may also have significant potential in seed identification, soil and leaf nitrogen content determination, and irrigation management. In addition, they identified the potential for long short-term memory (LSTM) or other RNN models in yield prediction, disease management, and water needs assessment based on consecutive observations. Overall, deep learning has experienced remarkable developmental progress and already has a number of operational applications in agriculture.

4. Fruit Traits

4.1. Fruit/Flower Detection

Automated counting of fruits and flowers from images is a critical step in autonomous robotic harvesting and yield prediction [60]. In recent years, numerous studies have been conducted on this topic, mainly aimed at developing new image-based object detection and localization algorithms to improve recognition accuracy. Traditional image segmentation methods use morphological operations to generate binary images and separate fruits from the background according to the similarity of color, spatial texture, and geometric shape. For example, Feng et al. [61] introduced a strawberry stem detection and fruit classification workflow, which used the OHTA color space to segment the fruit from black-and-white plastic sheets, extracted the principal inertia axis to define the stem position, judged strawberry ripeness in the hue, saturation, and intensity (HSI) color space, and then selectively harvested fruits according to ripeness and shape. However, such segmentation methods are not yet robust and stable enough for commercial settings with variable lighting conditions, observation angles, object orientations, relative positions, fruit clustering, and occlusion.
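A traditional color-based segmentation step can be sketched with a simplified redness-ratio threshold. This is a stand-in for the OHTA color space used in [61], and the threshold values are illustrative, not taken from that study:

```python
import numpy as np

def segment_red(rgb, ratio_thresh=0.5, min_red=80):
    """Binary fruit mask: pixels whose red channel dominates the RGB sum.

    A simplified stand-in for color-space segmentation; ratio_thresh and
    min_red are illustrative values, not thresholds from [61].
    """
    rgb = rgb.astype(float)
    total = rgb.sum(axis=-1) + 1e-9          # avoid division by zero
    redness = rgb[..., 0] / total
    return (redness > ratio_thresh) & (rgb[..., 0] > min_red)

# Tiny synthetic image: one red "fruit" pixel and one green "foliage" pixel.
img = np.array([[[200, 30, 30],    # ripe fruit  -> True
                 [40, 160, 40]]],  # leaf        -> False
               dtype=np.uint8)
print(segment_red(img))
```

The brittleness discussed above follows directly from this structure: fixed thresholds fail once illumination or background shifts the color distribution, which is what motivates the learned detectors in the next paragraphs.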
Recently, CNNs have evolved to be the most powerful approach for solving target identification and classification problems. The superiority of a CNN in image recognition lies in its ability to extract increasingly complex visual concepts and features through hierarchical structures. The first few layers can be used to learn simple local features, and the deeper hidden layers can capture more complicated semantic information, such as shape and texture. Koirala et al. [47] reviewed the use of deep learning in fruit detection and yield prediction. The authors elaborated on the applications of current state-of-the-art DL frameworks in target recognition, including the faster regional convolutional neural network (Faster RCNN), single-shot multibox detector (SSD), and you only look once (YOLO), and in feature-extraction backbones, including the Oxford Visual Geometry Group network (VGGNet), the Residual Network (ResNet), and the Zeiler and Fergus network (ZFNet). Fruit weight and yield estimation were also discussed, demonstrating the strength of deep learning in analyzing multi-dimensional remote sensing data.
With regard to strawberry flower/fruit counting, Lin et al. [62] applied RCNN, Fast RCNN, and Faster RCNN models for the identification of strawberry flowers from the image, with an accuracy of 63.4%, 76.7%, and 86.1%, respectively. The Faster RCNN framework demonstrated good performance even if strawberry flowers were occluded by foliage, under shadow, and overlapped by other flowers. Another DL framework (SSD) was implemented by Lamb et al. [63] for strawberry detection. The authors modified the training images and network structure to optimize the detection precision and execution speed. This system with a sparse CNN can run quickly on mobile low-power hardware with an average precision of 84.2%. Yu et al. [64] further adapted a Mask-RCNN model for mature strawberry detection in the RGB image and achieved an accuracy of 95.78% even in a non-structural environment, particularly for overlapping and hidden fruits and those under varying illumination. Zhou et al. [65] proposed a robust deep learning architecture named improved Faster-RCNN, which adopted a transfer training technology based on Faster RCNN and greatly reduced the number of strawberry images required for training the network. The average fruit extraction accuracy was more than 86%.
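Detection accuracies such as those above are typically computed by matching predicted boxes to ground-truth annotations via intersection over union (IoU); a prediction counts as a true positive when its IoU with a ground-truth box exceeds a threshold (commonly 0.5). A minimal implementation:

```python
def iou(box_a, box_b):
    """Intersection over union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

# A predicted berry box vs. its ground-truth annotation (pixel coordinates).
pred, truth = (10, 10, 50, 50), (20, 20, 60, 60)
print(round(iou(pred, truth), 3))  # -> 0.391
```

Average precision figures like the 84.2% and 86% quoted above aggregate such matches across confidence thresholds, so the IoU cutoff chosen directly affects the reported number.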

4.2. Fruit Maturity/Ripeness

During strawberry ripening, the fruit surface color typically goes through green, white, pink, and red stages, concurrent with the accelerated biosynthesis of pigments (e.g., carotenoids and anthocyanins) over a period of up to 30 days. Fruit ripening is a complicated process, with a variety of internal physical and chemical changes, which is mainly controlled by the synthesis and action of hormones [66]. Azodanlou et al. [67] found that as the fruit matures, there is an increase in volatile organic compounds (VOCs) and sugars, as well as a decrease in acidity. Meanwhile, structural changes in cell wall polysaccharides, especially the dissolution of pectin, contribute to fruit softening. The state of ripeness at harvest directly determines fruit quality and shelf life. Unripe fruits have lower nutrient values but are more resistant to physical injury. Overripe fruits are more susceptible to the external environment and fungal infection [68]. Rahman et al. [69] found that the shelf life of strawberry fruits picked at the 1/3rd maturity stage and the full maturity stage was about 7.8 and 2.4 days, respectively, regardless of the genotype. Therefore, early evaluation of fruit ripeness and the determination of optimal harvest time are crucial to reducing waste in the supply chain and improving fruit quality [70].
Traditional strawberry ripeness assessment is implemented visually and subjectively based on the appearance, aroma, color distribution and intensity, as well as texture [71,72]. Standard maturity evaluation methods are quantitative, measuring the contents of internal quality attributes, such as firmness, soluble solids content (SSC), titratable acidity (TA), and total anthocyanins [73]. However, these methods are destructive and slow, and they require expensive specialized devices and expertise [74]. Researchers have spent considerable effort developing simple, non-invasive, and high-throughput ways to estimate the ripeness stage of strawberry fruits. Most of the studies have focused on extracting spatial and spectral information from representative wavelength bands (usually R, G, and NIR bands) to discriminate between strawberries at different growth stages. Raut et al. [75] proposed a direct color-mapping method to evaluate the redness of strawberries based on RGB images and sort them into pre-mature, mature, and over-mature classes. Jiang et al. [76] selected the wavelengths 535, 675, and 980 nm and introduced eight spectral indices to automatically identify immature, nearly mature, and mature strawberries using the Fisher linear discriminant (FLD) model, with a prediction accuracy over 95%. Guo et al. [77] combined spectral reflectance and textural indicators (correlation, contrast, entropy, and homogeneity) of 11 optimal wavelengths from hyperspectral images and used the SVM algorithm to classify ripe, mid-ripe, and unripe fruits, with an accuracy higher than 85%.
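The Fisher linear discriminant used in [76] projects feature vectors onto the direction that best separates class means relative to within-class scatter. A self-contained two-class sketch on synthetic two-band index features (the feature values are fabricated stand-ins, not data from [76]):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic two-band "spectral index" features for immature vs. mature fruit,
# loosely standing in for indices built from the 535 and 675 nm bands.
immature = rng.normal([0.30, 0.70], 0.05, size=(60, 2))
mature   = rng.normal([0.65, 0.35], 0.05, size=(60, 2))

m1, m2 = immature.mean(axis=0), mature.mean(axis=0)
# Pooled within-class scatter matrix.
Sw = (np.cov(immature.T) * (len(immature) - 1)
      + np.cov(mature.T) * (len(mature) - 1))

# Fisher discriminant direction w = Sw^-1 (m1 - m2), midpoint threshold.
w = np.linalg.solve(Sw, m1 - m2)
threshold = w @ (m1 + m2) / 2

def classify(x):
    return "immature" if x @ w > threshold else "mature"

acc = np.mean([classify(x) == "immature" for x in immature]
              + [classify(x) == "mature" for x in mature])
print(f"accuracy: {acc:.2f}")
```

Extending to the three maturity classes of [76] would use multi-class LDA, but the core projection step is the same.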
Yue et al. [78] assessed strawberry maturity in the greenhouse using only a smartphone equipped with 535 and 670 nm optical filters, which were chosen to capture anthocyanin and chlorophyll contents, respectively. Absorptance data for the two wavelengths served as variables in three regression classification methods (multivariate linear, multivariate nonlinear, and SoftMax regression). The multivariate nonlinear model yielded an identification accuracy of over 94%. Gao et al. [79] further used the AlexNet CNN deep learning model to categorize strawberry fruits into ripe and early-ripe stages using hyperspectral datasets, achieving 98.6% classification accuracy. Recently, data collection, feature extraction, and classification algorithms have been integrated into a real-time strawberry ripeness evaluation and decision-making system developed for harvesting robots [80].

4.3. Fruit Quality and Postharvest Monitoring

Postharvest operations, including sorting, grading, and spoilage stage monitoring, are of great significance for price determination, fulfillment of orders with specific quality standards, and sales strategy formulation [81]. In terms of strawberry grading, manual selection is widely based on the shape, size, color, maturity, and imperfection of the fruits [82]. Compared with apples and citrus fruits, strawberries are more vulnerable to damage due to their high moisture content, lack of exocarp protection, and susceptibility to fungal infection [12]. From the moment of harvest, strawberry fruits begin to lose nutrition and generally spoil after three days without cold storage, potentially generating toxins harmful to human health [83]. Therefore, it is helpful to have a rapid and non-invasive inspection method for postharvest strawberry monitoring.
As with ripeness evaluation, emerging computer vision and machine learning technologies have enabled the development of automatic, real-time, and non-destructive fruit-grading systems. Liming et al. [84] designed an intelligent strawberry-grading system by integrating conveyer belts, cameras, and other auxiliary devices and employing multi-attribute decision-making theory to grade the strawberry fruit into three or four classes based on color, shape (13 feature parameters), and size. The final accuracy was above 90%. Mahendra et al. [85] compared seven types of features and used the SVM classifier to categorize the fruits into two groups: good and damaged. They found that the speeded-up robust features (SURF) were most effective in classification, with an accuracy of 90.73%. Sustika et al. [81] evaluated the capability of six CNN architectures (the baseline CNN, AlexNet, GoogLeNet, VGGNet, Xception, and MobileNet) for the binary classification (good or bad) of strawberry fruit and its classification into four grading levels (1–4 ranking) using RGB images. The study indicated that VGGNet’s performance was the best, producing 96.49% and 89.12% accuracy for the binary and the four-grade-level classification, respectively.
Péneau et al. [86] represented the consumer perception of freshness quantitatively by establishing the relationship between the fruit’s physiochemical parameters (appearance, odor, texture, and flavor) and consumer/expert ratings of freshness. Dong et al. [87] used long-path Fourier-transform infrared spectroscopy (FTIR) technology to capture the spectral characteristics of VOCs generated after different lengths of storage and then detect changes in VOC (esters, alcohols, ethylene, etc.) abundance. A principal component analysis (PCA) was implemented on the spectral data to distinguish fresh, slightly spoiled, and spoiled strawberries. As for storage time estimation, Weng et al. [12] collected the spectral data for strawberries from 0 to 60 h of storage with an interval of 6 h, using hyperspectral imaging technology. SVMs and RFs were then used to classify strawberry samples from different storage times with an accuracy of 100%. Partial least-squares regression (PLSR) [88] analysis has also been used to estimate the storage time with a prediction accuracy approaching 100%.
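PCA of the kind applied to the FTIR spectra in [87] can be sketched on synthetic data: mean-center the sample-by-band matrix, take an SVD, and project onto the leading components. The spectra below are entirely fabricated; the point is that a broad between-class difference dominates the first principal component:

```python
import numpy as np

rng = np.random.default_rng(2)
wavelengths = np.linspace(400, 1000, 120)   # nm, illustrative band grid

# Synthetic reflectance spectra: "fresh" vs. "spoiled" samples differ by a
# broad spectral feature near 550 nm; all values are fabricated.
base = 0.4 + 0.2 * np.exp(-((wavelengths - 700) / 120) ** 2)
fresh = base + rng.normal(0, 0.01, size=(30, 120))
spoiled = (base - 0.08 * np.exp(-((wavelengths - 550) / 60) ** 2)
           + rng.normal(0, 0.01, size=(30, 120)))

X = np.vstack([fresh, spoiled])
Xc = X - X.mean(axis=0)                     # mean-center before PCA

# PCA via singular value decomposition; scores = projections onto PCs.
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T                      # first two principal components

pc1_fresh, pc1_spoiled = scores[:30, 0], scores[30:, 0]
sep = abs(pc1_fresh.mean() - pc1_spoiled.mean())
print(f"PC1 class separation: {sep:.2f}")
```

Because the structured class difference far exceeds the per-band noise, the two storage classes separate cleanly along PC1, which is the behavior a PCA score plot of fresh versus spoiled samples exploits.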

4.4. Internal Fruit Attributes

As discussed in the previous section, fruit quality is broadly assessed by many parameters associated with the external attributes of the fruit, including appearance, texture, and flavor. However, the determination of internal fruit attributes (sugar content, juiciness, acidity, color, etc.) is also very important. NIR spectroscopy and multispectral/hyperspectral imaging technologies have been effective for evaluating internal fruit quality attributes in a non-contact manner. A difference between spectroscopy and imaging is that the former can only obtain single-point information, while the latter also provides the spatial distribution. The VIS/NIR spectral range is usually selected for internal fruit attribute studies because it provides information about O–H, C–H, and N–H absorptions [89].
Spectroscopy and hyperspectral image data are highly redundant and require preprocessing and analysis. Most research on the retrieval of internal fruit attributes adopts the following steps: data pretreatment (spectral correction and noise reduction), optimal sensitive wavelength selection, feature extraction, and prediction model construction. ElMasry et al. [90] estimated the moisture content (MC), SSC, and pH of strawberry fruits using hyperspectral images. Optimal wavelengths were selected for the MC, SSC, and pH, using β-coefficients from partial least-squares models. Multiple linear regression (MLR) models were then applied to retrieve fruit quality attributes using the spectral data of optimal wavelengths, with prediction accuracies of 87%, 80%, and 92% for the MC, SSC, and pH, respectively.
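The band-selection-plus-MLR step can be illustrated with ordinary least squares on a few chosen bands. The spectra, band indices, and coefficients below are synthetic, standing in for the β-coefficient-selected wavelengths of [90]:

```python
import numpy as np

rng = np.random.default_rng(3)
n_samples, n_bands = 80, 100

# Synthetic spectra in which three "optimal" bands carry the signal for a
# quality attribute (e.g., SSC); band indices and weights are made up.
X = rng.normal(0.5, 0.1, size=(n_samples, n_bands))
optimal = [12, 47, 83]
y = (4.0 + 6.0 * X[:, 12] - 3.0 * X[:, 47] + 2.0 * X[:, 83]
     + rng.normal(0, 0.05, n_samples))

# Multiple linear regression (intercept + selected bands) via least squares.
A = np.column_stack([np.ones(n_samples), X[:, optimal]])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ coef

ss_res = np.sum((y - pred) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(f"R^2 on selected bands: {r2:.2f}")
```

Restricting the regression to a handful of informative bands, rather than all 100, is what keeps the MLR model stable despite the redundancy of hyperspectral data.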
Unlike many researchers who only used the spectral information directly as input variables, Weng et al. [91] extracted the spectral information about optimal wavelengths, 9 color features obtained from color histograms and moments, and 36 textural features simultaneously from the hyperspectral images for the detection of SSC, pH, and vitamin C. Spectral and color features achieved the best prediction for SSC, with an R2 coefficient of 0.94. In terms of pH, optimal prediction was obtained using spectral features only, with an R2 of 0.85. A combination of spectral and textural features helped improve the estimation of vitamin C, with an R2 of 0.88. At present, the main parameters retrieved for the internal fruit quality of strawberry include firmness; vitamin C (VC); phenolic compounds; TA; total water-soluble sugar (TWSS) content; concentrations of glucose, fructose, and sucrose; SSC; pH; and MC. A detailed summary of the acquisition of these parameters using spectroscopy and imaging technology is shown in Table 1.

4.5. Fruit Shape

Fruit shape is a critical parameter affecting the aesthetic appeal and marketability of strawberries. Basic shape descriptors, such as length, width, and aspect ratio, can be measured manually with, for example, a vernier caliper. However, this approach is not only labor intensive and time consuming but also limited in capturing the complex, multi-dimensional aspects of shape and uniformity. Currently, most shape classification studies are based on 2D digital images. Ishikawa et al. [103] extracted four types of shape descriptors from RGB images taken by a digital camera: (1) measured values, including contour line length, fruit length and width, and fruit width/length ratio; (2) ellipse similarity (ES) indices, including the optimum ellipse area ratio and the boundary length ratio, which indicate how closely a fruit resembles an ellipse; (3) elliptic Fourier descriptors (EFDs); and (4) chain code subtraction (CCS). Random forest analysis was conducted to categorize strawberry fruit shape into nine types: reniform, conical, cordate, ovoid, cylindrical, rhomboid, obloid, globose, and wedged. The recall ratio was used for accuracy evaluation because the Kappa coefficient was not suitable when more than three types were classified; it ranged from 0.52 to 1, depending on shape type. Oo and Aung [104] proposed a simpler but efficient method for strawberry size estimation and shape classification based on RGB images. Only three parameters (diameter, length, and apex angle) were fed into a three-layer neural network for four shape classes. The estimation accuracy of diameter and length was 94% and 93%, respectively, for strawberries without calyx occlusion and 94% and 89% for those with calyx occlusion. Classification accuracy was between 94% and 97%.
Feldmann et al. [105] extracted 68 strawberry fruit shape features of four types (linear and geometric descriptors, outline-based descriptors, landmark-based descriptors, and binary-image-pixel-based descriptors) from digital images and introduced a method called principal progression of k clusters (PPKC), which can automatically discover latent shape classes. Relationships between the four feature types and the discovered shape classes were then built and used for classification, with accuracy varying from 68% to 99%. Zhou et al. [65] used the length-to-width ratio of the minimum external rectangle obtained from RGB images to assess the plumpness of strawberry fruits. Strawberries were classified into three types based on this ratio: plump (0–0.5), approximately plump (0.5–0.8), and fully plump (0.8–1). However, this approach applies only to strawberries of the globose type.
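The basic 2D descriptors and the three plumpness classes of Zhou et al. [65] can be illustrated on a synthetic binary fruit mask. This is a hedged sketch: the width-to-length direction of the ratio and the inclusive class bounds are assumptions, not taken from the cited paper.

```python
import numpy as np

# Hypothetical binary fruit mask: an ellipse 41 px tall, 31 px wide
h, w = 64, 64
yy, xx = np.mgrid[0:h, 0:w]
mask = (((yy - 32) / 20.0) ** 2 + ((xx - 32) / 15.0) ** 2) <= 1.0

ys, xs = np.nonzero(mask)
length = ys.max() - ys.min() + 1          # bounding-box height (fruit length)
width = xs.max() - xs.min() + 1           # bounding-box width
ratio = width / length                     # assumed width-to-length ratio in (0, 1]

# Plumpness classes in the spirit of Zhou et al. [65] (bounds assumed inclusive)
if ratio <= 0.5:
    plumpness = "plump"
elif ratio <= 0.8:
    plumpness = "approximately plump"
else:
    plumpness = "fully plump"
```

For this mask the ratio is 31/41 ≈ 0.76, which falls in the "approximately plump" class.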
Although 2D images capture shape characteristics in only a single plane, they are sufficient to determine shape properties to some extent. Imaging technology can now also obtain three-dimensional information. He et al. [106] designed a 3D-shape-measuring system equipped with cameras, a rotation table, and an operation system, which generated 3D point clouds of strawberry fruits from photos taken at various angles and heights using structure from motion (SfM) methods. The errors were less than 0.6 mm for 90% and less than 0.3 mm for 80% or more of the strawberry fruits [107]. Reconstruction of 3D strawberry architecture can provide information beyond basic descriptors. For example, uniformity is a key shape factor directly tied to fruit quality and sales volume. Li et al. [108] defined eight uniformity variables calculated from the 3D architecture of the strawberry fruit and evaluated the importance of each variable for manual uniformity assessment. The results showed that circularity of the maximum circumference had the closest predictive relationship with the manual uniformity score. A genetic locus for regular shape was detected and found to be related to three uniformity parameters.

4.6. Strawberry Yield Prediction

Strawberry harvesting can extend over many months depending on the growing system and environment, with dramatic variation in weekly yields. Forecasting strawberry yield ahead of time can help growers formulate labor and equipment allocation strategies for the harvesting period. Since weather fluctuation is a key factor, many studies have examined how weather parameters (e.g., solar radiation, wind, temperature) influence strawberry yield [109]. These influential factors have been combined with other yield-associated traits as input for statistical models and machine learning methods to predict strawberry yield. For example, Misaghi et al. [110] applied three neural network models (multilayer perceptron (MLP), generalized feedforward neural network (GFNN), and modular neural network) for strawberry yield prediction using vegetation indices (normalized difference vegetation index (NDVI) and soil-adjusted vegetation index (SAVI)) and soil characteristic parameters, with up to 94% final accuracy. MacKenzie and Chandler [111] built a relational expression between flower counts, temperature data, and strawberry total weight, with a coefficient of determination of 0.84. Various meteorological parameters (e.g., net radiation, vapor pressure, relative humidity) were examined by Pathak et al. [109] for strawberry yield forecasting using a principal component regression model, achieving 70% yield prediction accuracy.
Hassan et al. [112] used hyperspectral remote sensing imagery to obtain LAI and six vegetation indices (VIs) and explored the relationship between these parameters and yield under different growing conditions (organic and conventional). The prediction accuracy (R2) was higher than 0.7 except in the conventional system with black plastic mulch (<0.6). The results also showed that the six VIs worked better than LAI as yield estimators. Maskey et al. [113] utilized predictive principal component regression (PPCR), neural network (NN), and random forest (RF) models to forecast strawberry yield using 26 parameters related to leaf and canopy properties, soil characteristics, and weather conditions. Each of the selected weather parameters was highly correlated with strawberry yield, and the NN model provided the best prediction accuracy (95%). Nevertheless, these prediction models were generally spatially confined and need to be validated in field experiments. Another approach to strawberry yield forecasting is to count fruits and determine their size and maturity from remote sensing images, a task in which deep learning can play a crucial role. Using the Faster RCNN model, Chen et al. [114] predicted strawberry yield by identifying and counting strawberry flowers and immature and mature fruits in UAV high-resolution images obtained at two flight heights (2 m and 3 m). The results showed that the mean average precision was higher at 2 m (83%) than at 3 m (72%).
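As one concrete illustration of the statistical side of these forecasts, a minimal principal component regression (the model family used by Pathak et al. [109] and Maskey et al. [113]) can be run on simulated weather/VI records. Variable names, coefficients, and data are hypothetical, and this is not the cited authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 80
# Hypothetical weekly records: [net radiation, vapor pressure, RH, NDVI],
# with the first and last variables dominating both variance and yield
X = rng.normal(size=(n, 4)) * np.array([3.0, 1.0, 1.0, 2.0])
yield_kg = 1.2 * X[:, 0] + 0.8 * X[:, 3] + 0.1 * rng.normal(size=n)

# Principal component regression: project onto leading PCs, then regress
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T                      # keep the two leading components
A = np.column_stack([scores, np.ones(n)])
coef, *_ = np.linalg.lstsq(A, yield_kg, rcond=None)
pred = A @ coef
r2 = 1.0 - ((yield_kg - pred) ** 2).sum() / ((yield_kg - yield_kg.mean()) ** 2).sum()
```

Because the two leading components align with the two variables that drive the simulated yield, the reduced model retains most of the predictive power of the full regression.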

5. Leaf and Canopy Traits

Leaf and canopy traits are generally divided into two types: architectural and biophysical/biochemical characteristics. Architectural traits refer to external geometric morphology, such as leaf length and width; leaf area; leaf inclination angle; leaf azimuth; and canopy height, width, size, and shape. These parameters affect the penetration of light through the canopy, light use efficiency (LUE), and, ultimately, photosynthetic efficiency. Biophysical/biochemical parameters describe the internal physiological characteristics of leaves and are highly associated with crop growth dynamics, nutritional status, and photosynthetic capacity. These parameters include the green area index (GAI), green fraction (GF), above-ground biomass (AGBM), LAI, leaf/canopy water content, leaf/canopy chlorophyll content, leaf/canopy nitrogen content, and leaf/canopy temperature.
At present, SfM analysis and LiDAR are the two main methods used to generate 3D point-cloud data and obtain 3D structural properties of leaves and canopies with desirable accuracy. SfM is a computer vision technique that recovers the three-dimensional geometry of objects by analyzing overlapping images taken from different perspectives [115]. The workflow involves three steps: (1) image feature extraction and matching, in which matching algorithms detect conjugate features, or tie points, between overlapping images; (2) camera position and orientation estimation, in which a bundle-block adjustment uses the conjugate features to estimate the position and orientation of each camera at the moment of exposure; and (3) orthoimage and 3D point-cloud production [116], in which denser conjugate points detected through image matching form the point cloud, which can be rasterized into a digital surface model and used to generate ortho-rectified image mosaics. Light detection and ranging (LiDAR) is an active remote sensing method that uses laser pulses to generate 3D point-cloud datasets [35]. Most LiDAR systems emit laser pulses and compute the distance between the LiDAR source and the point where the pulse hits an object (e.g., a plant leaf) from the pulse travel time. Navigation sensors are then used together with the measured distance to compute the 3D location of the point, and dense 3D points created this way accurately depict the surveyed objects. One major difference between LiDAR- and SfM-based datasets is the significantly higher cost associated with LiDAR measurements. LiDAR, however, can produce 3D points along the laser path, which can reveal some under-canopy information. Both methods have been widely used to extract the height, size, and shape of various crop plants, such as blueberry, maize, and soybean [117,118,119].
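The last step of the workflow above, rasterizing a point cloud into a digital surface model and deriving simple canopy metrics, can be sketched on a synthetic point cloud. All numbers (plot size, grid resolution, canopy dome, noise level) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical point cloud (x, y, z) over a 1 m x 1 m plot: ground at z ~ 0,
# one dome-shaped "canopy" of height 0.30 m in the middle
n = 5000
x, y = rng.uniform(0, 1, n), rng.uniform(0, 1, n)
rad2 = (x - 0.5) ** 2 + (y - 0.5) ** 2
z = np.where(rad2 < 0.09, 0.30 * (1 - rad2 / 0.09), 0.0) + rng.normal(0, 0.005, n)

# Rasterize to a 20 x 20 digital surface model: max z per grid cell
cells = 20
ix = np.clip((x * cells).astype(int), 0, cells - 1)
iy = np.clip((y * cells).astype(int), 0, cells - 1)
dsm = np.full((cells, cells), -np.inf)
np.maximum.at(dsm, (iy, ix), z)         # unbuffered per-cell maximum

canopy_height = dsm.max()               # ~0.30 m for this synthetic plant
canopy_area = (dsm > 0.05).mean()       # fraction of cells above 5 cm
```

The same DSM could feed further metrics such as canopy volume (sum of cell heights times cell area) or surface smoothness.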
Two main approaches have been commonly used to retrieve biophysical characteristics and related parameters: statistical modeling and radiative transfer modeling (RTM). The former establishes relationships between features obtained from remote sensing and field measurements using traditional statistical modeling (e.g., regression analysis) and machine learning methods. Commonly used image features include spectral information (e.g., band values and vegetation indices) and textural information. More than a hundred VIs can be calculated from different spectral band combinations extracted from UAV hyperspectral imagery, supplying abundant information about vegetation vigor and health [120]. It is worth noting that the red edge region, defined as the wavelength position of the inflection point on the red-NIR reflectance slope, has attracted wide interest among researchers for LAI and chlorophyll content estimation [121]. As an alternative to statistical modeling, RTM considers the physical process of interaction between the vegetation canopy and solar radiation. Through simulation of canopy reflectance, canopy parameters can be retrieved by RTM as long as the other input parameters (radiation intensity, observation angle, soil conditions, etc.) are known [122,123]. Kattenborn et al. [124] revealed how canopy reflectance is linked with functional traits using the PROSAIL radiative transfer model (a combination of the PROSPECT leaf optical properties model and the SAIL canopy bidirectional reflectance model). Recently, several scholars have compared and combined these two approaches for leaf/canopy property retrieval [81,124,125].
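For example, NDVI and a red edge inflection point via the widely used Guyot–Baret four-band linear interpolation can be computed directly from band reflectances. The reflectance values below are hypothetical, chosen to resemble a healthy canopy.

```python
def ndvi(nir, red):
    """Normalized difference vegetation index from band reflectances."""
    return (nir - red) / (nir + red)

def reip_linear(r670, r700, r740, r780):
    """Red edge inflection point (nm) via the Guyot-Baret four-band
    linear-interpolation method using reflectances at 670/700/740/780 nm."""
    r_edge = (r670 + r780) / 2.0
    return 700.0 + 40.0 * (r_edge - r700) / (r740 - r700)

# Hypothetical reflectances for a healthy canopy
print(round(ndvi(0.45, 0.05), 3))                      # → 0.8
print(round(reip_linear(0.04, 0.10, 0.35, 0.45), 1))   # → 723.2
```

Stressed or chlorotic canopies shift the inflection point toward shorter wavelengths, which is why the red edge is so informative for chlorophyll and LAI estimation.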
Studies on phenotyping of strawberry leaves and canopies using remote sensing techniques are relatively rare. Luisa et al. [126] investigated the relationship between 11 spectral response indices and the nitrogen (N) content of young, mature, and old leaves. The results showed that only green reflectance (550 nm) was responsive to N fertilization for individual leaves. At the canopy level, green reflectance (550 nm), red reflectance (680 nm), VI, and NDVI were highly correlated with N content, with R2 values of 0.5, 0.6, 0.56, and 0.56, respectively. Sandino et al. [127] adopted a basic computer vision method to estimate strawberry leaf coverage from RGB images with 90% accuracy, using operations such as smoothing, dilation, contour detection, threshold segmentation, and edge detection. Similarly, a more complex algorithm was introduced by Jianlun et al. [128] to segment greenhouse strawberry leaf edges from background noise in images, integrating scale-space wavelet transformation, Canny edge detection, Otsu threshold segmentation, and morphological analysis.
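A leaf coverage estimate in the spirit of these segmentation pipelines can be sketched with a simple excess-green threshold (a standard color-index rule, not the cited authors' actual procedure; the tiny image and threshold are hypothetical).

```python
import numpy as np

# Hypothetical 4x4 RGB image: green canopy pixels over soil-coloured pixels
img = np.zeros((4, 4, 3), dtype=np.uint8)
img[:2, :, 1] = 180                      # top half: strongly green (leaves)
img[2:, :] = (120, 80, 40)               # bottom half: soil-coloured

# Excess-green rule (2G - R - B): a pixel is "leaf" if green dominates
r = img[..., 0].astype(int)
g = img[..., 1].astype(int)
b = img[..., 2].astype(int)
leaf_mask = (2 * g - r - b) > 20

coverage = leaf_mask.mean()              # fraction of leaf pixels → 0.5
```

Real pipelines would add the smoothing, morphological, and contour steps described above before computing coverage.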
Guan et al. [129] extracted planimetric canopy area, canopy surface area, canopy average height, standard deviation of canopy height, canopy volume, and canopy smoothness parameters from high-spatial-resolution RGB images (~0.5 mm) through SfM, object-based image analysis (OBIA), and GIS analysis. Three of the variables were used to predict leaf area (R2 = 0.79) and dry biomass (R2 = 0.84) throughout the strawberry-growing season using multiple linear regression. Abd-Elrahman et al. [130] built on this study by developing automated canopy delineation and canopy size metric extraction models to predict strawberry biomass at greater throughput. Takahashi et al. [131] applied a Kinect (the depth sensor used with the Microsoft Xbox console) to measure plant height and the leaf area receiving direct sunlight at different leaf layers over time under different environments; these parameters were compared to yield, dry weight, and leaf nitrogen content. Kokin et al. [132] used a thermal camera to examine the difference between strawberry leaf surface temperature and ambient air temperature under night frost conditions, which reached a maximum of ~8 °C.

6. Abiotic/Biotic Stress Detection

6.1. Water Stress

Water deficit stress refers to the inhibition of plant growth caused by soil water deficiency or by high evaporative demand in a low-humidity atmosphere. Detecting the plant response to water stress is critical for irrigation management. Current irrigation practices are generally based on indirect estimation of plant water demand or evaporation calculated from soil moisture content and meteorological data [133,134]. Gutiérrez et al. [135] developed an automated irrigation system equipped with a distributed wireless network of soil moisture and temperature sensors placed in the root zone of the plants. Through irrigation control based on soil moisture and temperature thresholds, a 90% reduction in water consumption was achieved compared to traditional irrigation practices. Morillo et al. [136] implemented precision drip irrigation for strawberries using crop water requirement estimates and optimum irrigation pulse design. The method incorporated soil water content and crop evapotranspiration data obtained from a local meteorological station.
In contrast, monitoring physiological changes in plants due to water stress provides a more direct and intuitive way to assess water demand. Under water stress, a plant's temperature increases due to stomatal closure and reduced transpiration. Severe water scarcity can lead to wilting and loss of key pigments such as chlorophyll, causing irreversible damage to the photosynthetic process. Multiple remote sensors have been used to detect such pre-symptomatic changes, including thermal infrared imagers (TIR; 8–14 µm), VIS/NIR/shortwave infrared reflectance sensors (400–2500 nm), and sun-induced fluorescence (SIF; 685 and 740 nm) sensors. Thermal infrared imaging has demonstrated advantages over other remote sensing spectral domains in crop water stress detection. Through the analysis of information in different spectral ranges, numerous water-stress-sensitive indices have been proposed, such as temperature-based indices (stress degree day (SDD), crop water stress index (CWSI), and water deficit index (WDI)) and leaf-water-content-related indices (water index (WI), leaf water index (LWI), moisture stress index (MSI), and normalized difference water index (NDWI)) [137]. These indicators can quantitatively reflect the water deficit of leaves or canopy to some extent.
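The CWSI, for instance, normalizes the measured canopy temperature between wet (fully transpiring) and dry (non-transpiring) reference temperatures; the temperatures below are hypothetical.

```python
def cwsi(t_canopy, t_wet, t_dry):
    """Crop water stress index: 0 = fully transpiring canopy (wet
    reference), 1 = no transpiration (dry reference)."""
    return (t_canopy - t_wet) / (t_dry - t_wet)

# Hypothetical canopy at 29 °C with 24 °C wet and 34 °C dry references
print(cwsi(29.0, 24.0, 34.0))   # → 0.5
```

The wet and dry references can be measured on artificial surfaces or estimated from energy-balance models, which is where most of the practical difficulty lies.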
As for strawberries, drought severely limits plant growth and reduces yield and fruit quality. Extensive research has been done to investigate responses of strawberries to water stress, including changes in yield and morphological, physiological, and biochemical properties. Drought-tolerant cultivars have been selected according to their adaptability to limited water supply [138]. Numerous parameters related to strawberry growth status, such as leaf area, leaf size, leaf longevity, dry mass, number of leaves per plant, leaf expansion rates, leaf chlorophyll content, chlorophyll stability index, leaf moisture content, stomatal conductance, photosynthetic rate, transpiration rate, root development, and plant height, exhibit a decreasing tendency under water stress [139,140,141]. Adak et al. [142] found that water deficit increased some biochemical features of fruits, such as total phenolics, total anthocyanins, antioxidant activity, and sugar content. Strawberry fruit weight and yield per unit declined by 59.72% and 63.62%, respectively, under water stress as compared to control conditions.
Real-time, accurate assessment of water status inside the strawberry plant is valuable for crop management. Peñuelas et al. [143] found that strawberry leaf temperature and the CWSI obtained with a handheld infrared thermometer were very useful in evaluating even mild water stress. Razavi et al. [144] used chlorophyll fluorescence to identify drought stress in strawberries. Delalieux et al. [145] compared differences in plant height, NDVI, red edge inflection point (REIP), and pigment-specific simple ratio for chlorophyll b (PSSRb) between strawberries under two irrigation scenarios (20% and 100%) using the COmpact hyperSpectral Imaging (COSI) system. The study indicated that the growth inhibition caused by water shortage could be detected from these spectral characteristics. Li et al. [146] measured strawberry plant temperature, dry surface temperature (Tdry), and wet surface temperature (Twet) for a single point and for the whole plant area using a TIR sensor and found the CWSI effective in detecting strawberry water stress. More indicators were examined by Gerhards et al. [147], including surface temperature (TS), CWSI, sun-induced fluorescence (F687, F780), and TIR indices, as well as visible and near infrared (VNIR)/short-wave infrared (SWIR) indices such as the photochemical reflectance index (PRI), NDVI, and MSI. These results illustrate the great potential of remote sensing for water stress detection.

6.2. Pest and Disease Detection

Strawberries are susceptible to many insect and mite pests and microorganisms (bacteria, fungi, and viruses) that regularly reduce total and marketable yield [148,149]. Early diagnosis and control of strawberry pests and diseases is critical to avoiding yield losses. The occurrence of plant disease is a process of pathological and physiological change: internal symptoms of diseased crops are eventually reflected in abnormal external morphology, such as necrosis, rot, and deformity of strawberry roots, stems, leaves, flowers, and fruits. Visual identification of pathogen signs and disease symptoms by trained experts is the common practice. Nevertheless, this process is post-symptomatic, and its accuracy depends on individual experience. Microscopic methods, in turn, are not feasible for large-scale commercial detection of pest and disease problems [150].
Numerous studies have utilized remote sensing to recognize various strawberry diseases, such as powdery mildew, anthracnose crown rot, verticillium wilt, and gray mold (Figure 3). Reflectance at various spectral bands carries significant information about plant biophysical and biochemical properties, such as leaf pigment content (VIS: 400–700 nm), leaf internal structure and water content (NIR: 700–1100 nm), and the composition of leaf chemicals and water content (SWIR: 1100–2500 nm) [139]. Consequently, remote-sensing-based disease detection methods focus on the optical differences between infected and healthy strawberry plants in images acquired by one or more sensors. The majority of current research in this area focuses on differentiating healthy strawberries from those affected by a single disease. Machine learning (particularly deep learning) plays an important role in analyzing images for disease detection. For example, Park et al. [151] applied a CNN to classify healthy and diseased strawberry plants using RGB images taken by a smartphone, with 89.7% accuracy. Chang et al. [149] extracted 40 textural indices from high-resolution RGB images and compared the performance of three supervised classifiers, ANNs, SVMs, and k-nearest neighbors (KNNs), in detecting strawberry powdery mildew. The overall classification accuracy was 93.8% for the ANN and 78.8% for the KNN classifier. More studies addressing strawberry disease detection are detailed in Table 2.
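A texture-feature-plus-KNN classifier of the kind compared by Chang et al. [149] can be sketched in a few lines. The two-dimensional feature values below are hypothetical stand-ins for GLCM-style texture indices, and this is not the cited implementation.

```python
import math

def knn_predict(train, labels, sample, k=3):
    """Minimal k-nearest-neighbours classifier (Euclidean distance,
    majority vote among the k closest training samples)."""
    dists = sorted((math.dist(x, sample), lab) for x, lab in zip(train, labels))
    votes = [lab for _, lab in dists[:k]]
    return max(set(votes), key=votes.count)

# Hypothetical 2D texture features (e.g., GLCM contrast, homogeneity)
features = [(0.20, 0.90), (0.25, 0.85), (0.30, 0.80),   # healthy leaves
            (0.70, 0.30), (0.75, 0.25), (0.80, 0.20)]   # powdery mildew
labels = ["healthy"] * 3 + ["mildew"] * 3

print(knn_predict(features, labels, (0.72, 0.28)))  # → mildew
```

In practice, dozens of texture indices and far larger training sets are used, and classifier choice (ANN vs. SVM vs. KNN) is what the cited study evaluates.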

7. Discussion and Outlook

The primary aim of this manuscript was to present an overview of how remote sensing and machine learning have been used in strawberry phenotyping and management. We reviewed studies that applied state-of-the-art machine and deep learning techniques to detect strawberry fruits and flowers in images with high accuracy; this work has contributed greatly to autonomous robotic harvesting and yield prediction applications. Statistical models and machine learning methods have been explored to evaluate strawberry fruit ripeness, estimate internal fruit attributes, and monitor postharvest fruit quality based on RGB, multispectral, and hyperspectral image datasets. Various image-based fruit shape descriptors have been suggested, such as fruit contour line length, uniformity, and ellipse similarity indices, and structure from motion (SfM) algorithms have been used to generate 3D point clouds of strawberry fruits. Canopy and leaf images have been analyzed to relate leaf biochemical content to spectral indices and to predict biophysical parameters, such as dry biomass and leaf area. Additionally, studies on the detection of abiotic and biotic stressors were reviewed. Table A1 provides a categorized summary of the reviewed studies not presented elsewhere in tabular form.
Although remote sensing data acquisition and machine learning analysis are already advancing strawberry precision agriculture and phenomics applications, there is still an urgent need for further exploration. For example, how to improve the robustness and transferability of the statistical and machine learning models connecting fruit quality to image-based spectral and geometric information remains an active research topic. Deep learning is also very promising for further advances in fruit quality assessment: a deep learning method may obtain multiple fruit quality parameters, such as shape, size, color, and internal attributes, simultaneously, which can help build a comprehensive evaluation system for strawberries and promote the automation of postharvest grading. Strawberry yield forecasting can be improved by integrating multiple variables, such as weather conditions, soil parameters, fruit/flower counts, canopy metrics, and various spectral indices from hyperspectral images, as input. Many of these parameters, such as fruit and flower counts and canopy size, can be extracted directly from images using deep learning networks, which may effectively increase prediction accuracy and reduce the manual work of feature extraction.
Continuous, real-time observations of leaf and canopy phenotyping traits are critical to monitoring the growth and nutritional status of the plants. With the advancement of remote sensing technology, UAVs and updated ground-based platforms are being used extensively in agriculture. Sensors that are expensive and hard to access, like LiDAR and hyperspectral cameras, are gradually becoming more affordable. Thus, an increasing number of studies are being conducted on using remote sensing and machine learning to obtain structural (e.g., leaf width/length, leaf inclination angle, and canopy height and width), biophysical (e.g., LAI and biomass), and biochemical (e.g., chlorophyll and nitrogen content) traits of agronomic crops, fruit trees, and vegetables. Strawberry fruit shape is mostly depicted and evaluated by features extracted from 2D and 3D information facilitated by SfM and LiDAR technologies. As those technologies become more ubiquitous, more fruit descriptors and novel assessment systems can be developed based on the 3D architecture of strawberries. Although SfM methods were applied to high-spatial-resolution RGB images [129,130] to calculate several strawberry canopy parameters (e.g., canopy area, average height, volume, and smoothness), LiDAR could be used to obtain detailed information about a strawberry plant’s structural properties. For example, Jiang et al. [166] analyzed LiDAR data and proposed various quantification factors for the bush architecture of blueberries, including bush morphology (height, width, and volume), crown size, and shape descriptors (path curve λ and five shape indices). This type of research can be readily transferred to the strawberry domain.
In addition, there is a great deal of progress to be made in predicting strawberry biophysical parameters and photosynthetic processes, and much to learn from other crops. Paul et al. [167] applied Gaussian process regression and an SVM model to estimate the canopy-averaged chlorophyll content of pear trees based on convolutional auto-encoder features of hyperspectral data. Li et al. [168] summarized the development of remote sensing imaging technologies for retrieving and analyzing various nutrition parameters, such as nitrogen, phosphorus, potassium, calcium, iron, and magnesium; the study showed that leaf or canopy nutrient distribution maps can be generated, with the coefficient of determination (R2) for nitrogen reaching 0.91. Lu et al. [169] found that total emitted solar-induced chlorophyll fluorescence (SIF) is more effective than top-of-canopy (TOC) SIF in predicting forest photosynthesis, and Dechant et al. [170] further revealed that canopy structure dominates the relationship between SIF and gross primary production (GPP) for rice, wheat, and corn. Few such studies exist for strawberry. Multispectral and hyperspectral datasets, radiative transfer modeling, and machine learning analysis could be comprehensively applied to study strawberry's biophysical properties and photosynthetic processes. Furthermore, in a general sense, there is still room for model and algorithm development and for the fusion and application of multiple types of remote sensing images.
At present, several studies have assessed the feasibility of different methods or parameters in the detection of biotic and abiotic strawberry stresses, focusing on single stressors at discrete time points. These works have tried to distinguish between healthy plants and those with a single disease, improving discrimination accuracies where possible, as shown in Table 2. For future high-throughput disease detection, there is a need to integrate multiple sensors and multiple time points to identify field areas and plants under stress automatically and rapidly, diagnose the stressor type and evaluate its severity, comprehensively assess plant health through time, and model and predict plant responses to management strategies.

8. Conclusions

Strawberry is different from most agronomic crops like corn, soybean, and wheat in various aspects. It is generally grown on raised-bed structures instead of flat ground and is also grown in hydroponic systems and under greenhouse and plastic tunnel structures. Strawberry is clonally propagated and has a complex growth habit that includes several plant parts, such as the crown, leaves, runners, inflorescences, and fleshy fruits. The fruits have many developmental stages and when ripe are very sensitive to environmental and management conditions. Plant development and fruit production can continually cycle and change over a six-month period, depending on the growing region. These characteristics make phenotyping considerations complex for strawberry. Therefore, the methods developed in major row crops must be creatively adapted to strawberry.
The development of ground-based devices, UAVs, and emerging field robotics is advancing the potential for monitoring strawberry growth throughout the entire growing cycle, from planting to final harvest. Remote sensing can provide massive amounts of data about crop condition and health via plant and fruit characteristics, deepening our knowledge of the crop itself and enabling more advanced management practices. Remote sensing may also be useful for postharvest evaluation of strawberry fruits: spectral and textural information obtained from multiple sensors can capture both external and internal fruit traits. Further, available artificial intelligence options include an expanding array of deep learning techniques and computer vision analysis methods. This combination of advances in sensors and in data extraction and analysis will continue to accelerate the use of precision agriculture in strawberry production and of phenomics technology in strawberry breeding and genetics.

Author Contributions

Conceptualization, A.A.-E. and C.Z.; methodology, C.Z.; writing—original draft preparation, C.Z.; writing—review and editing, A.A.-E., V.W. and C.Z.; visualization, C.Z.; supervision, A.A.-E. and V.W.; project administration, A.A.-E. and V.W.; and funding acquisition, A.A.-E. and V.W. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

No data were used in this work.

Conflicts of Interest

The authors declare that they have no conflict of interest.

Appendix A

A two-step approach was adopted to search and screen the literature related to remote sensing and machine learning applications, with an emphasis on strawberry. In the first step, refereed articles about remote sensing, machine learning, phenotyping, and strawberries were collected from the IEEE Xplore, ScienceDirect, Web of Science, and Google Scholar scientific database portals using the following keyword query: [“machine learning” OR “deep learning” OR “computer vision” OR “remote sensing” OR “phenotyping” OR “phenomics”] AND [“strawberry”]. In the second step, we screened a total of 79 papers resulting from the search, of which 60 dealt with strawberry phenotyping and 19 were related to strawberry management during growth and development. The number of articles on each topic is shown in Figure A1.
Figure A1. Number of articles included in this review by topic.

Appendix B

Table A1. Summary of research articles on strawberry phenotyping and management using remote sensing and machine learning.
Part of Interest | Phenotyping Traits | Data | Method and Model | Reference
--- | --- | --- | --- | ---
Fruit | Fruit/flower detection | Mostly RGB images with high spatial resolution | Traditional morphological segmentation; CNNs (SSD, RCNN, Fast RCNN, Faster RCNN, Mask-RCNN, etc.) | [60,61,62,63,64,65]
Fruit | Ripeness and postharvest quality evaluation | RGB, multispectral, and hyperspectral images, especially the R, G, and NIR bands | (1) Feature extraction (spectral and textural indexes) + classifier (FLD, SVM, multivariate linear, multivariate nonlinear, SoftMax regression, etc.); (2) CNN classifier (AlexNet, CNN, etc.) | [12,73,74,75,76,77,78,79,80,81,82,83,84,85,86,87]
Fruit | Internal attributes’ retrieval (SSC, MC, pH, TA, vitamin C, TWSS, MPC, etc.) | NIR, multispectral, and hyperspectral spectroscopy and images | Feature extraction (spectral and textural features) + prediction model (PLSR, SVR, LWR, MLR, SVM, BPNN, etc.) | [73,90,91,92,93,94,95,96,97,98,99,100,101,102]
Fruit | Shape description | Mostly RGB images | Shape descriptors extracted from 2D images; SfM used to generate 3D point clouds | [103,104,105,106,107,108]
Fruit | Yield prediction | RGB, multispectral, and hyperspectral images; weather parameters | (1) Feature extraction (fruit number, vegetation spectral indexes, LAI, weather condition parameters) + prediction model (MLP, GFNN, PPCR, NN, RF, etc.) for strawberry total weight; (2) Strawberry detection and fruit counting | [109,110,111,112,113,114]
Canopy and leaf | Structural properties (planimetric canopy area, canopy surface area, canopy average height, standard deviation of canopy height, canopy volume, and canopy smoothness parameters) | RGB images with high spatial resolution | SfM and ArcGIS analysis | [127,128,129,130,131]
Canopy and leaf | Biophysical features: dry biomass and leaf area of canopy | RGB and NIR images | Feature extraction (canopy geometric parameters, including canopy area, canopy average height, etc.) + prediction model (MLR) | [129,130]
Canopy and leaf | Biophysical features: nitrogen content of leaves | RGB and NIR images | Feature extraction (green and red reflectance (550 and 680 nm), VI, and NDVI) + regression analysis | [126]
Canopy and leaf | Leaf temperature | Thermal images | — | [132]
Water stress | — | Chlorophyll fluorescence, thermal, and hyperspectral images | Leaf temperature and spectral characteristics (CWSI, NDVI, REIP, PSSRb, PRI, MSI) extracted for water stress detection | [143,144,145,146,147]
Pest and disease stress | Powdery mildew, anthracnose crown rot, verticillium wilt, gray mold, etc. | RGB, multispectral, and hyperspectral images | Various types of color and texture features fed to supervised classifiers for disease detection | [41,149,152,153,154,155,156,158,160,161,164,165]
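The “feature extraction + classifier” pattern that recurs throughout Table A1 can be sketched in a few lines. This is a minimal, self-contained illustration: the reflectance values are synthetic, the NDVI-style index is one of many spectral features used in the cited studies, and the nearest-centroid rule is a stand-in for the FLD/SVM classifiers — none of these numbers or choices come from a specific paper.

```python
import statistics

def extract_features(red, nir):
    """Spectral index feature: normalized difference of NIR and red bands."""
    return (nir - red) / (nir + red)

def train_centroids(samples):
    """Mean feature value per class label -> nearest-centroid classifier."""
    by_class = {}
    for feat, label in samples:
        by_class.setdefault(label, []).append(feat)
    return {label: statistics.mean(vals) for label, vals in by_class.items()}

def classify(centroids, feat):
    """Assign the label whose centroid is closest to the feature value."""
    return min(centroids, key=lambda label: abs(centroids[label] - feat))

# Synthetic training data: ripe fruit reflects more red and shows less
# NIR/red contrast than unripe, leafy-green tissue (illustrative values).
train = [(extract_features(0.60, 0.45), "ripe"),
         (extract_features(0.55, 0.50), "ripe"),
         (extract_features(0.20, 0.70), "unripe"),
         (extract_features(0.25, 0.65), "unripe")]

centroids = train_centroids(train)
print(classify(centroids, extract_features(0.58, 0.47)))  # prints "ripe"
```

The same skeleton generalizes to the other table rows: swap the index for textural or geometric features, and the centroid rule for a regression model when the target is continuous (SSC, biomass, etc.).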

References

  1. FAO. The Future of Food and Agriculture—Alternative Pathways to 2050; Food and Agriculture Organization of the United Nations: Rome, Italy, 2018. [Google Scholar]
  2. Bongiovanni, R.; Lowenberg-DeBoer, J. Precision agriculture and sustainability. Precis. Agric. 2004, 5, 359–387. [Google Scholar] [CrossRef]
  3. Zhang, N.; Wang, M.; Wang, N. Precision agriculture—A worldwide overview. Comput. Electron. Agric. 2002, 36, 113–132. [Google Scholar] [CrossRef]
  4. Liaghat, S.; Balasundram, S.K. A review: The role of remote sensing in precision agriculture. Am. J. Agric. Biol. Sci. 2010, 5, 50–55. [Google Scholar] [CrossRef] [Green Version]
  5. Say, S.M.; Keskin, M.; Sehri, M.; Sekerli, Y.E. Adoption of precision agriculture technologies in developed and developing countries. Online J. Sci. Technol. 2018, 8, 7–15. [Google Scholar]
  6. Costa, C.; Schurr, U.; Loreto, F.; Menesatti, P.; Carpentier, S. Plant phenotyping research trends, a science mapping approach. Front. Plant Sci. 2019, 9, 1933. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  7. Yang, G.; Liu, J.; Zhao, C.; Li, Z.; Huang, Y.; Yu, H.; Xu, B.; Yang, X.; Zhu, D.; Zhang, X.; et al. Unmanned aerial vehicle remote sensing for field-based crop phenotyping: Current status and perspectives. Front. Plant Sci. 2017, 8, 1111. [Google Scholar] [CrossRef]
  8. Chawade, A.; van Ham, J.; Blomquist, H.; Bagge, O.; Alexandersson, E.; Ortiz, R. High-throughput field-phenotyping tools for plant breeding and precision agriculture. Agronomy 2019, 9, 258. [Google Scholar] [CrossRef] [Green Version]
  9. Pasala, R.; Pandey, B.B. Plant phenomics: High-throughput technology for accelerating genomics. J. Biosci. 2020, 45, 1–6. [Google Scholar] [CrossRef]
  10. Pauli, D.; Chapman, S.C.; Bart, R.; Topp, C.N.; Lawrence-Dill, C.J.; Poland, J.; Gore, M.A. The quest for understanding phenotypic variation via integrated approaches in the field environment. Plant Physiol. 2016, 172, 622–634. [Google Scholar] [CrossRef] [Green Version]
  11. Yang, W.; Feng, H.; Zhang, X.; Zhang, J.; Doonan, J.H.; Batchelor, W.D.; Xiong, L.; Yan, J. Crop phenomics and high-throughput phenotyping: Past decades, current challenges, and future perspectives. Mol. Plant 2020, 13, 187–214. [Google Scholar] [CrossRef] [Green Version]
  12. Weng, S.; Yu, S.; Dong, R.; Pan, F.; Liang, D. Nondestructive detection of storage time of strawberries using visible/near-infrared hyperspectral imaging. Int. J. Food Prop. 2020, 23, 269–281. [Google Scholar] [CrossRef] [Green Version]
  13. Mezzetti, B.; Giampieri, F.; Zhang, Y.-T.; Zhong, C.-F. Status of strawberry breeding programs and cultivation systems in Europe and the rest of the world. J. Berry Res. 2018, 8, 205–221. [Google Scholar] [CrossRef]
  14. Food and Agriculture Organization of the United Nations. FAOSTAT Database; 2018. Available online: http://www.fao.org/faostat/en/?#data/QC (accessed on 20 November 2020).
  15. Huang, Y.; Chen, Z.-X.; Tao, Y.; Huang, X.-Z.; Gu, X.-F. Agricultural remote sensing big data: Management and applications. J. Integr. Agric. 2018, 17, 1915–1931. [Google Scholar] [CrossRef]
  16. Sicre, C.M.; Fieuzal, R.; Baup, F. Contribution of multispectral (optical and radar) satellite images to the classification of agricultural surfaces. Int. J. Appl. Earth Obs. Geoinf. 2020, 84, 101972. [Google Scholar] [CrossRef]
  17. Xu, Y.; Smith, S.E.; Grunwald, S.; Abd-Elrahman, A.; Wani, S.P. Incorporation of satellite remote sensing pan-sharpened imagery into digital soil prediction and mapping models to characterize soil property variability in small agricultural fields. ISPRS J. Photogramm. Remote. Sens. 2017, 123, 1–19. [Google Scholar] [CrossRef] [Green Version]
  18. Zhu, Q.; Luo, Y.; Xu, Y.-P.; Tian, Y.; Yang, T. Satellite soil moisture for agricultural drought monitoring: Assessment of SMAP-derived soil water deficit index in Xiang River Basin, China. Remote. Sens. 2019, 11, 362. [Google Scholar] [CrossRef] [Green Version]
  19. Du, T.L.T.; Bui, D.D.; Nguyen, M.D.; Lee, H. Satellite-based, multi-indices for evaluation of agricultural droughts in a highly dynamic tropical catchment, Central Vietnam. Water 2018, 10, 659. [Google Scholar] [CrossRef] [Green Version]
  20. Estel, S.; Mader, S.; Levers, C.; Verburg, P.H.; Baumann, M.; Kuemmerle, T. Combining satellite data and agricultural statistics to map grassland management intensity in Europe. Environ. Res. Lett. 2018, 13, 074020. [Google Scholar] [CrossRef]
  21. Fieuzal, R.; Baup, F. Forecast of wheat yield throughout the agricultural season using optical and radar satellite images. Int. J. Appl. Earth Obs. Geoinf. 2017, 59, 147–156. [Google Scholar] [CrossRef]
  22. Sharma, A.K.; Hubert-Moy, L.; Buvaneshwari, S.; Sekhar, M.; Ruiz, L.; Bandyopadhyay, S.; Corgne, S. Irrigation history estimation using multitemporal landsat satellite images: Application to an intensive groundwater irrigated agricultural watershed in India. Remote. Sens. 2018, 10, 893. [Google Scholar] [CrossRef] [Green Version]
  23. Xie, Q.; Dash, J.; Huete, A.; Jiang, A.; Yin, G.; Ding, Y.; Peng, D.; Hall, C.C.; Brown, L.; Shi, Y. Retrieval of crop biophysical parameters from Sentinel-2 remote sensing imagery. Int. J. Appl. Earth Obs. Geoinf. 2019, 80, 187–195. [Google Scholar] [CrossRef]
  24. Mateo-Sanchis, A.; Piles, M.; Muñoz-Marí, J.; Adsuara, J.E.; Pérez-Suay, A.; Camps-Valls, G. Synergistic integration of optical and microwave satellite data for crop yield estimation. Remote. Sens. Environ. 2019, 234, 111460. [Google Scholar] [CrossRef] [PubMed]
  25. Radoglou-Grammatikis, P.; Sarigiannidis, P.; Lagkas, T.; Moscholios, I. A compilation of UAV applications for precision agriculture. Comput. Netw. 2020, 172, 107148. [Google Scholar] [CrossRef]
  26. Giles, D.; Billing, R. Deployment and Performance of a UAV for Crop Spraying. Chem. Eng. Trans. 2015, 44, 307–312. [Google Scholar]
  27. Faiçal, B.S.; Freitas, H.; Gomes, P.H.; Mano, L.Y.; Pessin, G.; de Carvalho, A.C.; Krishnamachari, B.; Ueyama, J. An adaptive approach for UAV-based pesticide spraying in dynamic environments. Comput. Electron. Agric. 2017, 138, 210–223. [Google Scholar] [CrossRef]
  28. Di Gennaro, S.F.; Matese, A.; Gioli, B.; Toscano, P.; Zaldei, A.; Palliotti, A.; Genesio, L. Multisensor approach to assess vineyard thermal dynamics combining high-resolution unmanned aerial vehicle (UAV) remote sensing and wireless sensor network (WSN) proximal sensing. Sci. Hortic. 2017, 221, 83–87. [Google Scholar] [CrossRef]
  29. Popescu, D.; Stoican, F.; Stamatescu, G.; Ichim, L.; Dragana, C. Advanced UAV–WSN System for Intelligent Monitoring in Precision Agriculture. Sensors 2020, 20, 817. [Google Scholar] [CrossRef] [Green Version]
  30. Abd-Elrahman, A.; Pande-Chhetri, R.; Vallad, G. Design and development of a multi-purpose low-cost hyperspectral imaging system. Remote. Sens. 2011, 3, 570–586. [Google Scholar] [CrossRef] [Green Version]
  31. Jin, X.; Li, Z.; Atzberger, C. Editorial for the Special Issue “Estimation of Crop Phenotyping Traits using Unmanned Ground Vehicle and Unmanned Aerial Vehicle Imagery”. Remote Sens. 2020, 12, 940. [Google Scholar] [CrossRef] [Green Version]
  32. Weiss, M.; Jacob, F.; Duveiller, G. Remote sensing for agricultural applications: A meta-review. Remote. Sens. Environ. 2020, 236, 111402. [Google Scholar] [CrossRef]
  33. Mishra, P.; Asaari, M.S.M.; Herrero-Langreo, A.; Lohumi, S.; Diezma, B.; Scheunders, P. Close range hyperspectral imaging of plants: A review. Biosyst. Eng. 2017, 164, 49–67. [Google Scholar] [CrossRef]
  34. Corp, L.A.; McMurtrey, J.E.; Middleton, E.M.; Mulchi, C.L.; Chappelle, E.W.; Daughtry, C.S. Fluorescence sensing systems: In vivo detection of biophysical variations in field corn due to nitrogen supply. Remote. Sens. Environ. 2003, 86, 470–479. [Google Scholar] [CrossRef]
  35. Wallace, L.; Lucieer, A.; Watson, C.; Turner, D. Development of a UAV-LiDAR system with application to forest inventory. Remote Sens. 2012, 4, 1519–1543. [Google Scholar] [CrossRef] [Green Version]
  36. Steele-Dunne, S.C.; McNairn, H.; Monsivais-Huertero, A.; Judge, J.; Liu, P.-W.; Papathanassiou, K. Radar remote sensing of agricultural canopies: A review. IEEE J. Sel. Top. Appl. Earth Obs. Remote. Sens. 2017, 10, 2249–2273. [Google Scholar] [CrossRef] [Green Version]
  37. McNairn, H.; Shang, J. A review of multitemporal synthetic aperture radar (SAR) for crop monitoring. In Multitemporal Remote Sensing. Remote Sensing and Digital Image Processing; Ban, Y., Ed.; Springer: Cham, Switzerland, 2016; Volume 20. [Google Scholar] [CrossRef]
  38. Liu, C.-A.; Chen, Z.-X.; Yun, S.; Chen, J.-S.; Hasi, T.; Pan, H.-Z. Research advances of SAR remote sensing for agriculture applications: A review. J. Integr. Agric. 2019, 18, 506–525. [Google Scholar] [CrossRef] [Green Version]
  39. Kuester, M.; Thome, K.; Krause, K.; Canham, K.; Whittington, E. Comparison of surface reflectance measurements from three ASD FieldSpec FR spectroradiometers and one ASD FieldSpec VNIR spectroradiometer. In Proceedings of the IGARSS 2001. Scanning the Present and Resolving the Future. Proceedings. IEEE 2001 International Geoscience and Remote Sensing Symposium (Cat. No.01CH37217), Sydney, NSW, Australia, 9–13 July 2001; pp. 72–74. [Google Scholar]
  40. Danner, M.; Locherer, M.; Hank, T.; Richter, K. Spectral Sampling with the ASD FieldSpec 4—Theory, Measurement, Problems, Interpretation; EnMAP Field Guides Technical Report; GFZ Data Services: Potsdam, Germany, 2015. [Google Scholar] [CrossRef]
  41. Mahmud, M.S.; Zaman, Q.U.; Esau, T.J.; Chang, Y.K.; Price, G.W.; Prithiviraj, B. Real-Time Detection of Strawberry Powdery Mildew Disease Using a Mobile Machine Vision System. Agronomy 2020, 10, 1027. [Google Scholar] [CrossRef]
  42. Liakos, K.G.; Busato, P.; Moshou, D.; Pearson, S.; Bochtis, D. Machine learning in agriculture: A review. Sensors 2018, 18, 2674. [Google Scholar] [CrossRef] [Green Version]
  43. Mochida, K.; Koda, S.; Inoue, K.; Hirayama, T.; Tanaka, S.; Nishii, R.; Melgani, F. Computer vision-based phenotyping for improvement of plant productivity: A machine learning perspective. GigaScience 2019, 8, giy153. [Google Scholar] [CrossRef] [Green Version]
  44. Cai, J.; Luo, J.; Wang, S.; Yang, S. Feature selection in machine learning: A new perspective. Neurocomputing 2018, 300, 70–79. [Google Scholar] [CrossRef]
  45. Miao, J.; Niu, L. A survey on feature selection. Procedia Comput. Sci. 2016, 91, 919–926. [Google Scholar] [CrossRef] [Green Version]
  46. Chlingaryan, A.; Sukkarieh, S.; Whelan, B. Machine learning approaches for crop yield prediction and nitrogen status estimation in precision agriculture: A review. Comput. Electron. Agric. 2018, 151, 61–69. [Google Scholar] [CrossRef]
  47. Sabanci, K.; Kayabasi, A.; Toktas, A. Computer vision-based method for classification of wheat grains using artificial neural network. J. Sci. Food Agric. 2017, 97, 2588–2593. [Google Scholar] [CrossRef] [PubMed]
  48. Koirala, A.; Walsh, K.B.; Wang, Z.; McCarthy, C. Deep learning–Method overview and review of use for fruit detection and yield estimation. Comput. Electron. Agric. 2019, 162, 219–234. [Google Scholar] [CrossRef]
  49. Jakhar, D.; Kaur, I. Artificial intelligence, machine learning and deep learning: Definitions and differences. Clin. Exp. Dermatol. 2020, 45, 131–132. [Google Scholar] [CrossRef]
  50. Miikkulainen, R.; Liang, J.; Meyerson, E.; Rawal, A.; Fink, D.; Francon, O.; Raju, B.; Shahrzad, H.; Navruzyan, A.; Duffy, N. Evolving deep neural networks. arXiv 2016, arXiv:1703.00548. [Google Scholar]
  51. Seifert, C.; Aamir, A.; Balagopalan, A.; Jain, D.; Sharma, A.; Grottel, S.; Gumhold, S. Visualizations of deep neural networks in computer vision: A survey. In Transparent Data Mining for Big and Small Data; Springer: Berlin/Heidelberg, Germany, 2017; pp. 123–144. [Google Scholar]
  52. Zhang, J.; Man, K.F. Time series prediction using RNN in multi-dimension embedding phase space. In Proceedings of the SMC’98 Conference Proceedings. 1998 IEEE International Conference on Systems, Man, and Cybernetics (Cat. No. 98CH36218), San Diego, CA, USA, 14 October 1998; pp. 1868–1873. [Google Scholar]
  53. Yu, S.; Jia, S.; Xu, C. Convolutional neural networks for hyperspectral image classification. Neurocomputing 2017, 219, 88–98. [Google Scholar] [CrossRef]
  54. Liu, T.; Abd-Elrahman, A. An object-based image analysis method for enhancing classification of land covers using fully convolutional networks and multi-view images of small unmanned aerial system. Remote. Sens. 2018, 10, 457. [Google Scholar] [CrossRef] [Green Version]
  55. Salakhutdinov, R. Learning deep generative models. Ann. Rev. Stat. Appl. 2015, 2, 361–385. [Google Scholar] [CrossRef] [Green Version]
  56. Pu, Y.; Gan, Z.; Henao, R.; Yuan, X.; Li, C.; Stevens, A.; Carin, L. Variational autoencoder for deep learning of images, labels and captions. arXiv 2016, arXiv:1609.08976. [Google Scholar]
  57. Bauer, A.; Bostrom, A.G.; Ball, J.; Applegate, C.; Cheng, T.; Laycock, S.; Rojas, S.M.; Kirwan, J.; Zhou, J. Combining computer vision and deep learning to enable ultra-scale aerial phenotyping and precision agriculture: A case study of lettuce production. Hortic. Res. 2019, 6, 1–12. [Google Scholar] [CrossRef] [Green Version]
  58. Zhang, L.; Zhang, L.; Du, B. Deep learning for remote sensing data: A technical tutorial on the state of the art. IEEE Geosci. Remote. Sens. Mag. 2016, 4, 22–40. [Google Scholar] [CrossRef]
  59. Kamilaris, A.; Prenafeta-Boldú, F.X. Deep learning in agriculture: A survey. Comput. Electron. Agric. 2018, 147, 70–90. [Google Scholar] [CrossRef] [Green Version]
  60. Puttemans, S.; Vanbrabant, Y.; Tits, L.; Goedemé, T. Automated visual fruit detection for harvest estimation and robotic harvesting. In Proceedings of the 2016 Sixth International Conference on Image Processing Theory, Tools and Applications (IPTA), Oulu, Finland, 12–15 December 2016; pp. 1–6. [Google Scholar] [CrossRef]
  61. Feng, G.; Qixin, C.; Masateru, N. Fruit detachment and classification method for strawberry harvesting robot. Int. J. Adv. Robot. Syst. 2008, 5, 4. [Google Scholar] [CrossRef]
  62. Lin, P.; Chen, Y. Detection of Strawberry Flowers in Outdoor Field by Deep Neural Network. In Proceedings of the 2018 IEEE 3rd International Conference on Image, Vision and Computing (ICIVC), Chongqing, China, 27–29 June 2018; pp. 482–486. [Google Scholar] [CrossRef]
  63. Lamb, N.; Chuah, M.C. A strawberry detection system using convolutional neural networks. In Proceedings of the 2018 IEEE International Conference on Big Data (Big Data), Seattle, WA, USA, 10–13 December 2018; pp. 2515–2520. [Google Scholar] [CrossRef]
  64. Yu, Y.; Zhang, K.; Yang, L.; Zhang, D. Fruit detection for strawberry harvesting robot in non-structural environment based on Mask-RCNN. Comput. Electron. Agric. 2019, 163, 104846. [Google Scholar] [CrossRef]
  65. Zhou, C.; Hu, J.; Xu, Z.; Yue, J.; Ye, H.; Yang, G. A novel greenhouse-based system for the detection and plumpness assessment of strawberry using an improved deep learning technique. Front. Plant Sci. 2020, 11, 559. [Google Scholar] [CrossRef]
  66. Kafkas, E.; Koşar, M.; Paydaş, S.; Kafkas, S.; Başer, K. Quality characteristics of strawberry genotypes at different maturation stages. Food Chem. 2007, 100, 1229–1236. [Google Scholar] [CrossRef]
  67. Azodanlou, R.; Darbellay, C.; Luisier, J.-L.; Villettaz, J.-C.; Amadò, R. Changes in flavour and texture during the ripening of strawberries. Eur. Food Res. Technol. 2004, 218, 167–172. [Google Scholar]
  68. Kader, A.A. Quality and its maintenance in relation to the postharvest physiology of strawberry. In The Strawberry into the 21st Century; Timber Press: Portland, OR, USA, 1991; pp. 145–152. [Google Scholar]
  69. Rahman, M.M.; Moniruzzaman, M.; Ahmad, M.R.; Sarker, B.; Alam, M.K. Maturity stages affect the postharvest quality and shelf-life of fruits of strawberry genotypes growing in subtropical regions. J. Saudi Soc. Agric. Sci. 2016, 15, 28–37. [Google Scholar] [CrossRef] [Green Version]
  70. Li, B.; Lecourt, J.; Bishop, G. Advances in non-destructive early assessment of fruit ripeness towards defining optimal time of harvest and yield prediction—A review. Plants 2018, 7, 3. [Google Scholar]
  71. Rico, D.; Martin-Diana, A.B.; Barat, J.; Barry-Ryan, C. Extending and measuring the quality of fresh-cut fruit and vegetables: A review. Trends Food Sci. Technol. 2007, 18, 373–386. [Google Scholar] [CrossRef] [Green Version]
  72. Kader, A.A. Quality parameters of fresh-cut fruit and vegetable products. In Fresh-Cut Fruits and Vegetables; CRC Press: Boca Raton, FL, USA, 2002; pp. 20–29. [Google Scholar]
  73. Liu, C.; Liu, W.; Lu, X.; Ma, F.; Chen, W.; Yang, J.; Zheng, L. Application of multispectral imaging to determine quality attributes and ripeness stage in strawberry fruit. PLoS ONE 2014, 9, e87818. [Google Scholar] [CrossRef] [PubMed]
  74. Bai, J.; Plotto, A.; Baldwin, E.; Whitaker, V.; Rouseff, R. Electronic nose for detecting strawberry fruit maturity. In Proceedings of the Florida State Horticultural Society, Crystal River, FL, USA, 6–8 June 2010; Volume 123, pp. 259–263. [Google Scholar]
  75. Raut, K.D.; Bora, V. Assessment of Fruit Maturity using Direct Color Mapping. Int. Res. J. Eng. Technol. 2016, 3, 1540–1543. [Google Scholar]
  76. Jiang, H.; Zhang, C.; Liu, F.; Zhu, H.; He, Y. Identification of strawberry ripeness based on multispectral indexes extracted from hyperspectral images. Guang Pu Xue Yu Guang Pu Fen Xi = Guang Pu 2016, 36, 1423–1427. [Google Scholar] [PubMed]
  77. Guo, C.; Liu, F.; Kong, W.; He, Y.; Lou, B. Hyperspectral imaging analysis for ripeness evaluation of strawberry with support vector machine. J. Food Eng. 2016, 179, 11–18. [Google Scholar]
  78. Yue, X.-Q.; Shang, Z.-Y.; Yang, J.-Y.; Huang, L.; Wang, Y.-Q. A smart data-driven rapid method to recognize the strawberry maturity. Inf. Process. Agric. 2019. [Google Scholar] [CrossRef]
  79. Gao, Z.; Shao, Y.; Xuan, G.; Wang, Y.; Liu, Y.; Han, X. Real-time hyperspectral imaging for the in-field estimation of strawberry ripeness with deep learning. Artif. Intell. Agric. 2020, 4, 31–38. [Google Scholar] [CrossRef]
  80. Xiong, Y.; Peng, C.; Grimstad, L.; From, P.J.; Isler, V. Development and field evaluation of a strawberry harvesting robot with a cable-driven gripper. Comput. Electron. Agric. 2019, 157, 392–402. [Google Scholar] [CrossRef]
  81. Sustika, R.; Subekti, A.; Pardede, H.F.; Suryawati, E.; Mahendra, O.; Yuwana, S. Evaluation of deep convolutional neural network architectures for strawberry quality inspection. Int. J. Eng. Technol. 2018, 7, 75–80. [Google Scholar]
  82. Usha, S.; Karthik, M.; Jenifer, R.; Scholar, P. Automated Sorting and Grading of Vegetables Using Image Processing. Int. J. Eng. Res. Gen. Sci. 2017, 5, 53–61. [Google Scholar]
  83. Shen, J.; Qi, H.-F.; Li, C.; Zeng, S.-M.; Deng, C. Experimental on storage and preservation of strawberry. Food Sci. Tech 2011, 36, 48–51. [Google Scholar]
  84. Liming, X.; Yanchao, Z. Automated strawberry grading system based on image processing. Comput. Electron. Agric. 2010, 71, S32–S39. [Google Scholar] [CrossRef]
  85. Mahendra, O.; Pardede, H.F.; Sustika, R.; Kusumo, R.B.S. Comparison of Features for Strawberry Grading Classification with Novel Dataset. In Proceedings of the 2018 International Conference on Computer, Control, Informatics and Its Applications (IC3INA), Tangerang, Indonesia, 1–2 November 2018; pp. 7–12. [Google Scholar] [CrossRef]
  86. Péneau, S.; Brockhoff, P.B.; Escher, F.; Nuessli, J. A comprehensive approach to evaluate the freshness of strawberries and carrots. Postharvest Biol. Technol. 2007, 45, 20–29. [Google Scholar] [CrossRef]
  87. Dong, D.; Zhao, C.; Zheng, W.; Wang, W.; Zhao, X.; Jiao, L. Analyzing strawberry spoilage via its volatile compounds using longpath fourier transform infrared spectroscopy. Sci. Rep. 2013, 3, 2585. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  88. Geladi, P.; Kowalski, B.R. Partial least-squares regression: A tutorial. Anal. Chim. Acta 1986, 185, 1–17. [Google Scholar] [CrossRef]
  89. Wang, H.; Peng, J.; Xie, C.; Bao, Y.; He, Y. Fruit quality evaluation using spectroscopy technology: A review. Sensors 2015, 15, 11889–11927. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  90. ElMasry, G.; Wang, N.; ElSayed, A.; Ngadi, M. Hyperspectral imaging for nondestructive determination of some quality attributes for strawberry. J. Food Eng. 2007, 81, 98–107. [Google Scholar] [CrossRef]
  91. Weng, S.; Yu, S.; Guo, B.; Tang, P.; Liang, D. Non-Destructive Detection of Strawberry Quality Using Multi-Features of Hyperspectral Imaging and Multivariate Methods. Sensors 2020, 20, 3074. [Google Scholar] [CrossRef]
  92. Liu, Q.; Wei, K.; Xiao, H.; Tu, S.; Sun, K.; Sun, Y.; Pan, L.; Tu, K. Near-infrared hyperspectral imaging rapidly detects the decay of postharvest strawberry based on water-soluble sugar analysis. Food Anal. Methods 2019, 12, 936–946. [Google Scholar] [CrossRef]
  93. Liu, S.; Xu, H.; Wen, J.; Zhong, W.; Zhou, J. Prediction and analysis of strawberry sugar content based on partial least squares prediction model. J. Anim. Plant Sci. 2019, 29, 1390–1395. [Google Scholar]
  94. Amodio, M.L.; Ceglie, F.; Chaudhry, M.M.A.; Piazzolla, F.; Colelli, G. Potential of NIR spectroscopy for predicting internal quality and discriminating among strawberry fruits from different production systems. Postharvest Biol. Technol. 2017, 125, 112–121. [Google Scholar] [CrossRef]
  95. Li, J.-B.; Guo, Z.-M.; Huang, W.-Q.; Zhang, B.-H.; Zhao, C.-J. Near-infrared spectra combining with CARS and SPA algorithms to screen the variables and samples for quantitatively determining the soluble solids content in strawberry. Spectrosc. Spectr. Anal. 2015, 35, 372–378. [Google Scholar]
  96. Ding, X.; Zhang, C.; Liu, F.; Song, X.; Kong, W.; He, Y. Determination of soluble solid content in strawberry using hyperspectral imaging combined with feature extraction methods. Guang Pu Xue Yu Guang Pu Fen Xi = Guang Pu 2015, 35, 1020–1024. [Google Scholar] [PubMed]
  97. Sánchez, M.-T.; De la Haba, M.J.; Benítez-López, M.; Fernández-Novales, J.; Garrido-Varo, A.; Pérez-Marín, D. Non-destructive characterization and quality control of intact strawberries based on NIR spectral data. J. Food Eng. 2012, 110, 102–108. [Google Scholar] [CrossRef]
  98. Nishizawa, T.; Mori, Y.; Fukushima, S.; Natsuga, M.; Maruyama, Y. Non-destructive analysis of soluble sugar components in strawberry fruits using near-infrared spectroscopy. Nippon Shokuhin Kagaku Kogaku Kaishi = J. Jpn. Soc. Food Sci. Technol. 2009, 56, 229–235. [Google Scholar] [CrossRef] [Green Version]
  99. Wulf, J.; Rühmann, S.; Rego, I.; Puhl, I.; Treutter, D.; Zude, M. Nondestructive application of laser-induced fluorescence spectroscopy for quantitative analyses of phenolic compounds in strawberry fruits (Fragaria × ananassa). J. Agric. Food Chem. 2008, 56, 2875–2882. [Google Scholar] [CrossRef]
  100. Tallada, J.G.; Nagata, M.; Kobayashi, T. Non-destructive estimation of firmness of strawberries (Fragaria × ananassa Duch.) using NIR hyperspectral imaging. Environ. Control. Biol. 2006, 44, 245–255. [Google Scholar] [CrossRef] [Green Version]
  101. Nagata, M.; Tallada, J.G.; Kobayashi, T.; Toyoda, H. NIR hyperspectral imaging for measurement of internal quality in strawberries. In Proceedings of the 2005 ASAE Annual Meeting, Tampa, FL, USA, 17–20 July 2005. ASAE Paper No. 053131. [Google Scholar]
  102. Nagata, M.; Tallada, J.G.; Kobayashi, T.; Cui, Y.; Gejima, Y. Predicting maturity quality parameters of strawberries using hyperspectral imaging. In Proceedings of the ASAE/CSAE Annual International Meeting, Ottawa, ON, Canada, 1–4 August 2004. Paper No. 043033. [Google Scholar]
  103. Ishikawa, T.; Hayashi, A.; Nagamatsu, S.; Kyutoku, Y.; Dan, I.; Wada, T.; Oku, K.; Saeki, Y.; Uto, T.; Tanabata, T.; et al. Classification of strawberry fruit shape by machine learning. Int. Arch. Photogramm. Remote. Sens. Spat. Inf. Sci. 2018, 42. [Google Scholar] [CrossRef] [Green Version]
  104. Oo, L.M.; Aung, N.Z. A simple and efficient method for automatic strawberry shape and size estimation and classification. Biosyst. Eng. 2018, 170, 96–107. [Google Scholar] [CrossRef]
  105. Feldmann, M.J.; Hardigan, M.A.; Famula, R.A.; López, C.M.; Tabb, A.; Cole, G.S.; Knapp, S.J. Multi-dimensional machine learning approaches for fruit shape phenotyping in strawberry. GigaScience 2020, 9, giaa030. [Google Scholar] [CrossRef]
  106. He, J.Q.; Harrison, R.J.; Li, B. A novel 3D imaging system for strawberry phenotyping. Plant Methods 2017, 13, 1–8. [Google Scholar] [CrossRef]
  107. Kochi, N.; Tanabata, T.; Hayashi, A.; Isobe, S. A 3D shape-measuring system for assessing strawberry fruits. Int. J. Autom. Technol. 2018, 12, 395–404. [Google Scholar] [CrossRef]
  108. Li, B.; Cockerton, H.M.; Johnson, A.W.; Karlström, A.; Stavridou, E.; Deakin, G.; Harrison, R.J. Defining Strawberry Uniformity using 3D Imaging and Genetic Mapping. bioRxiv 2020. [Google Scholar] [CrossRef] [PubMed]
  109. Pathak, T.B.; Dara, S.K.; Biscaro, A. Evaluating correlations and development of meteorology based yield forecasting model for strawberry. Adv. Meteorol. 2016, 2016, 1–7. [Google Scholar] [CrossRef] [Green Version]
  110. Misaghi, F.; Dayyanidardashti, S.; Mohammadi, K.; Ehsani, M. Application of Artificial Neural Network and Geostatistical Methods in Analyzing Strawberry Yield Data; American Society of Agricultural and Biological Engineers: Minneapolis, MN, USA, 2004; p. 1. [Google Scholar]
  111. MacKenzie, S.J.; Chandler, C.K. A method to predict weekly strawberry fruit yields from extended season production systems. Agron. J. 2009, 101, 278–287. [Google Scholar] [CrossRef]
  112. Hassan, H.A.; Taha, S.S.; Aboelghar, M.A.; Morsy, N.A. Comparative the impact of organic and conventional strawberry cultivation on growth and productivity using remote sensing techniques under Egypt climate conditions. Asian J. Agric. Biol. 2018, 6, 228–244. [Google Scholar]
  113. Maskey, M.L.; Pathak, T.B.; Dara, S.K. Weather Based Strawberry Yield Forecasts at Field Scale Using Statistical and Machine Learning Models. Atmosphere 2019, 10, 378. [Google Scholar] [CrossRef] [Green Version]
  114. Chen, Y.; Lee, W.S.; Gan, H.; Peres, N.; Fraisse, C.; Zhang, Y.; He, Y. Strawberry yield prediction based on a deep neural network using high-resolution aerial orthoimages. Remote. Sens. 2019, 11, 1584. [Google Scholar] [CrossRef] [Green Version]
  115. Fonstad, M.A.; Dietrich, J.T.; Courville, B.C.; Jensen, J.L.; Carbonneau, P.E. Topographic structure from motion: A new development in photogrammetric measurement. Earth Surf. Proc. Landf. 2013, 38, 421–430. [Google Scholar] [CrossRef] [Green Version]
  116. Ozyesil, O.; Voroninski, V.; Basri, R.; Singer, A. A survey of structure from motion. arXiv 2017, arXiv:1701.08493. [Google Scholar]
  117. Patrick, A.; Li, C. High throughput phenotyping of blueberry bush morphological traits using unmanned aerial systems. Remote. Sens. 2017, 9, 1250. [Google Scholar] [CrossRef] [Green Version]
  118. Makanza, R.; Zaman-Allah, M.; Cairns, J.E.; Magorokosho, C.; Tarekegne, A.; Olsen, M.; Prasanna, B.M. High-throughput phenotyping of canopy cover and senescence in maize field trials using aerial digital canopy imaging. Remote. Sens. 2018, 10, 330. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  119. Han, L.; Yang, G.; Dai, H.; Yang, H.; Xu, B.; Feng, H.; Li, Z.; Yang, X. Fuzzy Clustering of Maize Plant-Height Patterns Using Time Series of UAV Remote-Sensing Images and Variety Traits. Front. Plant Sci. 2019, 10, 926. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  120. Xue, J.; Su, B. Significant remote sensing vegetation indices: A review of developments and applications. J. Sens. 2017, 2017, 1–17. [Google Scholar] [CrossRef] [Green Version]
121. Hunt, E.R., Jr.; Doraiswamy, P.C.; McMurtrey, J.E.; Daughtry, C.S.; Perry, E.M.; Akhmedov, B. A visible band index for remote sensing leaf chlorophyll content at the canopy scale. Int. J. Appl. Earth Obs. Geoinf. 2013, 21, 103–112.
122. Clevers, J.G.; Kooistra, L. Using hyperspectral remote sensing data for retrieving canopy chlorophyll and nitrogen content. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2011, 5, 574–583.
123. Kattenborn, T.; Schmidtlein, S. Radiative transfer modelling reveals why canopy reflectance follows function. Sci. Rep. 2019, 9, 1–10.
124. Yuan, H.; Yang, G.; Li, C.; Wang, Y.; Liu, J.; Yu, H.; Feng, H.; Xu, B.; Zhao, X.; Yang, X. Retrieving soybean leaf area index from unmanned aerial vehicle hyperspectral remote sensing: Analysis of RF, ANN, and SVM regression models. Remote Sens. 2017, 9, 309.
125. Wolanin, A.; Camps-Valls, G.; Gómez-Chova, L.; Mateo-García, G.; van der Tol, C.; Zhang, Y.; Guanter, L. Estimating crop primary productivity with Sentinel-2 and Landsat 8 using machine learning methods trained with radiative transfer simulations. Remote Sens. Environ. 2019, 225, 441–457.
126. España-Boquera, M.L.; Cárdenas-Navarro, R.; López-Pérez, L.; Castellanos-Morales, V.; Lobit, P. Estimating the nitrogen concentration of strawberry plants from its spectral response. Commun. Soil Sci. Plant Anal. 2006, 37, 2447–2459.
127. Sandino, J.D.; Ramos-Sandoval, O.L.; Amaya-Hurtado, D. Method for estimating leaf coverage in strawberry plants using digital image processing. Rev. Bras. Eng. Agrícola Ambient. 2016, 20, 716–721.
128. Jianlun, W.; Yu, H.; Shuangshuang, Z.; Hongxu, Z.; Can, H.; Xiaoying, C.; Yun, X.; Jianshu, C.; Shuting, W. A new multi-scale analytic algorithm for edge extraction of strawberry leaf images in natural light. Int. J. Agric. Biol. Eng. 2016, 9, 99–108.
129. Guan, Z.; Abd-Elrahman, A.; Fan, Z.; Whitaker, V.M.; Wilkinson, B. Modeling strawberry biomass and leaf area using object-based analysis of high-resolution images. ISPRS J. Photogramm. Remote Sens. 2020, 163, 171–186.
130. Abd-Elrahman, A.; Guan, Z.; Dalid, C.; Whitaker, V.; Britt, K.; Wilkinson, B.; Gonzalez, A. Automated Canopy Delineation and Size Metrics Extraction for Strawberry Dry Weight Modeling Using Raster Analysis of High-Resolution Imagery. Remote Sens. 2020, 12, 3632.
131. Takahashi, M.; Takayama, S.; Umeda, H.; Yoshida, C.; Koike, O.; Iwasaki, Y.; Sugeno, W. Quantification of Strawberry Plant Growth and Amount of Light Received Using a Depth Sensor. Environ. Control Biol. 2020, 58, 31–36.
132. Kokin, E.; Palge, V.; Pennar, M.; Jürjenson, K. Strawberry leaf surface temperature dynamics measured by thermal camera in night frost conditions. Agron. Res. 2018, 16.
133. Touati, F.; Al-Hitmi, M.; Benhmed, K.; Tabish, R. A fuzzy logic based irrigation system enhanced with wireless data logging applied to the state of Qatar. Comput. Electron. Agric. 2013, 98, 233–241.
134. Avşar, E.; Buluş, K.; Saridaş, M.A.; Kapur, B. Development of a cloud-based automatic irrigation system: A case study on strawberry cultivation. In Proceedings of the 2018 7th International Conference on Modern Circuits and Systems Technologies (MOCAST), Thessaloniki, Greece, 7–9 May 2018; pp. 1–4.
135. Gutiérrez, J.; Villa-Medina, J.F.; Nieto-Garibay, A.; Porta-Gándara, M.Á. Automated irrigation system using a wireless sensor network and GPRS module. IEEE Trans. Instrum. Meas. 2013, 63, 166–176.
136. Morillo, J.G.; Martín, M.; Camacho, E.; Díaz, J.R.; Montesinos, P. Toward precision irrigation for intensive strawberry cultivation. Agric. Water Manag. 2015, 151, 43–51.
137. Gerhards, M.; Schlerf, M.; Mallick, K.; Udelhoven, T. Challenges and future perspectives of multi-/hyperspectral thermal infrared remote sensing for crop water-stress detection: A review. Remote Sens. 2019, 11, 1240.
138. Grant, O.M.; Davies, M.J.; Johnson, A.W.; Simpson, D.W. Physiological and growth responses to water deficits in cultivated strawberry (Fragaria × ananassa) and in one of its progenitors, Fragaria chiloensis. Environ. Exp. Bot. 2012, 83, 23–32.
139. Nezhadahmadi, A.; Faruq, G.; Rashid, K. The impact of drought stress on morphological and physiological parameters of three strawberry varieties in different growing conditions. Pak. J. Agric. Sci. 2015, 52, 79–92.
140. Grant, O.M.; Johnson, A.W.; Davies, M.J.; James, C.M.; Simpson, D.W. Physiological and morphological diversity of cultivated strawberry (Fragaria × ananassa) in response to water deficit. Environ. Exp. Bot. 2010, 68, 264–272.
141. Klamkowski, K.; Treder, W. Response to drought stress of three strawberry cultivars grown under greenhouse conditions. J. Fruit Ornam. Plant Res. 2008, 16, 179–188.
142. Adak, N.; Gubbuk, H.; Tetik, N. Yield, quality and biochemical properties of various strawberry cultivars under water stress. J. Sci. Food Agric. 2018, 98, 304–311.
143. Peñuelas, J.; Savé, R.; Marfà, O.; Serrano, L. Remotely measured canopy temperature of greenhouse strawberries as indicator of water status and yield under mild and very mild water stress conditions. Agric. For. Meteorol. 1992, 58, 63–77.
144. Razavi, F.; Pollet, B.; Steppe, K.; Van Labeke, M.-C. Chlorophyll fluorescence as a tool for evaluation of drought stress in strawberry. Photosynthetica 2008, 46, 631–633.
145. Delalieux, S.; Delauré, B.; Tits, L.; Boonen, M.; Sima, A.; Baeck, P. High resolution strawberry field monitoring using the compact hyperspectral imaging solution COSI. Adv. Anim. Biosci. 2017, 8, 156.
146. Li, H.; Yin, J.; Zhang, M.; Sigrimis, N.; Gao, Y.; Zheng, W. Automatic diagnosis of strawberry water stress status based on machine vision. Int. J. Agric. Biol. Eng. 2019, 12, 159–164.
147. Gerhards, M.; Schlerf, M.; Rascher, U.; Udelhoven, T.; Juszczak, R.; Alberti, G.; Miglietta, F.; Inoue, Y. Analysis of airborne optical and thermal imagery for detection of water stress symptoms. Remote Sens. 2018, 10, 1139.
148. Oliveira, M.S.; Peres, N.A. Common Strawberry Diseases in Florida. EDIS 2020, 2020.
149. Chang, Y.K.; Mahmud, M.; Shin, J.; Nguyen-Quang, T.; Price, G.W.; Prithiviraj, B. Comparison of Image Texture Based Supervised Learning Classifiers for Strawberry Powdery Mildew Detection. AgriEngineering 2019, 1, 434–452.
150. Mahlein, A.-K. Plant disease detection by imaging sensors–parallels and specific demands for precision agriculture and plant phenotyping. Plant Dis. 2016, 100, 241–251.
151. Park, H.; Eun, J.-S.; Kim, S.-H. Image-based disease diagnosing and predicting of the crops through the deep learning mechanism. In Proceedings of the 2017 International Conference on Information and Communication Technology Convergence (ICTC), Jeju, Korea, 18–20 October 2017; pp. 129–131.
152. Shin, J.; Chang, Y.K.; Heung, B.; Nguyen-Quang, T.; Price, G.W.; Al-Mallahi, A. Effect of directional augmentation using supervised machine learning technologies: A case study of strawberry powdery mildew detection. Biosyst. Eng. 2020, 194, 49–60.
153. De Lange, E.S.; Nansen, C. Early detection of arthropod-induced stress in strawberry using innovative remote sensing technology. In Proceedings of the GeoVet 2019: Novel Spatio-Temporal Approaches in the Era of Big Data, Davis, CA, USA, 8–10 October 2019.
154. Liu, Q.; Sun, K.; Zhao, N.; Yang, J.; Zhang, Y.; Ma, C.; Pan, L.; Tu, K. Information fusion of hyperspectral imaging and electronic nose for evaluation of fungal contamination in strawberries during decay. Postharvest Biol. Technol. 2019, 153, 152–160.
155. Cockerton, H.M.; Li, B.; Vickerstaff, R.; Eyre, C.A.; Sargent, D.J.; Armitage, A.D.; Marina-Montes, C.; Garcia, A.; Passey, A.J.; Simpson, D.W. Image-based phenotyping and disease screening of multiple populations for resistance to Verticillium dahliae in cultivated strawberry Fragaria × ananassa. bioRxiv 2018, 497107.
156. Altıparmak, H.; Al Shahadat, M.; Kiani, E.; Dimililer, K. Fuzzy classification for strawberry diseases-infection using machine vision and soft-computing techniques. In Proceedings of the Tenth International Conference on Machine Vision (ICMV 2017), Vienna, Austria, 13–15 November 2017; p. 106961N.
157. Hecht-Nielsen, R. Theory of the backpropagation neural network. In Proceedings of the International 1989 Joint Conference on Neural Networks, Washington, DC, USA, 1989; Volume 1, pp. 593–605.
158. Siedliska, A.; Baranowski, P.; Zubik, M.; Mazurek, W.; Sosnowska, B. Detection of fungal infections in strawberry fruit by VNIR/SWIR hyperspectral imaging. Postharvest Biol. Technol. 2018, 139, 115–126.
159. Thompson, B. Stepwise Regression and Stepwise Discriminant Analysis Need Not Apply Here: A Guidelines Editorial; Sage Publications: Thousand Oaks, CA, USA, 1995.
160. Lu, J.; Ehsani, R.; Shi, Y.; Abdulridha, J.; de Castro, A.I.; Xu, Y. Field detection of anthracnose crown rot in strawberry using spectroscopy technology. Comput. Electron. Agric. 2017, 135, 289–299.
161. Abdel Wahab, H.; Aboelghar, M.; Ali, A.; Yones, M. Spectral and molecular studies on gray mold in strawberry. Asian J. Plant Pathol. 2017, 11, 167–173.
162. Yuhas, R.H.; Goetz, A.F.H.; Boardman, J.W. Discrimination among semi-arid landscape endmembers using the Spectral Angle Mapper (SAM) algorithm. In Summaries of the Third Annual JPL Airborne Geoscience Workshop; AVIRIS Workshop: Pasadena, CA, USA, 1992; pp. 147–149.
163. Levine, M.F. Self-developed QWL measures. J. Occup. Behav. 1983, 4, 35–46.
164. Yeh, Y.-H.; Chung, W.-C.; Liao, J.-Y.; Chung, C.-L.; Kuo, Y.-F.; Lin, T.-T. Strawberry foliar anthracnose assessment by hyperspectral imaging. Comput. Electron. Agric. 2016, 122, 1–9.
165. Yeh, Y.-H.F.; Chung, W.-C.; Liao, J.-Y.; Chung, C.-L.; Kuo, Y.-F.; Lin, T.-T. A comparison of machine learning methods on hyperspectral plant disease assessments. IFAC Proc. Vol. 2013, 46, 361–365.
166. Jiang, Y.; Li, C.; Takeda, F.; Kramer, E.A.; Ashrafi, H.; Hunter, J. 3D point cloud data to quantitatively characterize size and shape of shrub crops. Hortic. Res. 2019, 6, 1–17.
167. Paul, S.; Poliyapram, V.; İmamoğlu, N.; Uto, K.; Nakamura, R.; Kumar, D.N. Canopy Averaged Chlorophyll Content Prediction of Pear Trees Using Convolutional Autoencoder on Hyperspectral Data. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2020, 13, 1426–1437.
168. Li, D.; Li, C.; Yao, Y.; Li, M.; Liu, L. Modern imaging techniques in plant nutrition analysis: A review. Comput. Electron. Agric. 2020, 174, 105459.
169. Lu, X.; Liu, Z.; Zhao, F.; Tang, J. Comparison of total emitted solar-induced chlorophyll fluorescence (SIF) and top-of-canopy (TOC) SIF in estimating photosynthesis. Remote Sens. Environ. 2020, 251, 112083.
170. Dechant, B.; Ryu, Y.; Badgley, G.; Zeng, Y.; Berry, J.A.; Zhang, Y.; Goulas, Y.; Li, Z.; Zhang, Q.; Kang, M.; et al. Canopy structure explains the relationship between photosynthesis and sun-induced chlorophyll fluorescence in crops. Remote Sens. Environ. 2020, 241, 111733.
Figure 1. Global trends in strawberry yield and harvested area from 1961 to 2018 [14].
Figure 2. Strawberry traits can be assessed using a variety of sensors mounted on multiple platforms.
Figure 3. Several common strawberry diseases. Used with permission from N. Peres [148].
Table 1. Summary of articles addressing the estimation of internal fruit quality attributes of strawberry based on remote sensing and machine learning.
| Author | Year | Parameters * | Data | Feature Extraction | Optimal Waveband Selection Method ** | Regression Model *** | Prediction Accuracy (R or R2) | Reference |
|---|---|---|---|---|---|---|---|---|
| Weng et al. | 2020 | SSC, pH, and vitamin C | Hyperspectral imaging (range: 374–1020 nm; spectral resolution: 2.31 nm) | All spectral information, 9 color features, and 36 textural features | CARS, UVE | PLSR, SVR, LWR | R2: 0.9370 for SSC, 0.8493 for pH, and 0.8769 for vitamin C | [91] |
| Liu et al. | 2019 | TWSS, glucose, fructose, and sucrose concentrations | Near-infrared hyperspectral imaging (range: 1000–2500 nm; spectral resolution: 6.8 nm) | Spectral information (range: 1085–1780 nm; 5 wavelengths for fructose, glucose, and sucrose; 7 wavelengths for TWSS) | SPA | SVR | R2: 0.589 for fructose, 0.503 for glucose, 0.724 for sucrose, and 0.807 for TWSS | [92] |
| Liu et al. | 2019 | Sugar content | Hyperspectral imaging (range: 391–1043 nm; spectral resolution: 2.8 nm) | Spectral information (range: 420–1007 nm; 76 wavelengths) | CCSD | PLSR | 0.7708–0.8053 | [93] |
| Amodio et al. | 2017 | SSC, pH, TA, ascorbic acid content, and phenolic content | Fourier-transform (FT)-NIR spectrometer (range: 12,500–3600 cm−1; spectral interval: 8 cm−1) | Spectral information (range: 9401–4597 cm−1, 7507–6094 cm−1, 5454–4597 cm−1, 6103–5446 cm−1, and 4428–4242 cm−1; 219 spectral features) | Bruker’s OPUS software | PLSR | R2: 0.85 for SSC, 0.86 for pH, and 0.58 for TA | [94] |
| Li et al. | 2015 | SSC | Near-infrared spectrometer (range: 12,000–3800 cm−1; spectral interval: 1.928 cm−1) | Spectral information (25 wavelengths) | CARS, SPA, MC-UVE | PLSR, MLR | R2: 0.9097 | [95] |
| Ding et al. | 2015 | SSC | Hyperspectral imaging (range: 874–1734 nm; spectral resolution: 5 nm) | Spectral information (range: 941–1612 nm; 14, 17, 24, and 25 wavelengths selected by four methods); 20 spectral features by PCA; 58 spectral features by wavelet transform (WT) | SPA, GAPLS & SPA, Bw, CARS | PLSR | R: >0.9 for SSC | [96] |
| Liu et al. | 2014 | Firmness and SSC | Multispectral imaging system (range: 405–970 nm; 19 wavelengths) | Spectral information (405, 435, 450, 470, 505, 525, 570, 590, 630, 645, 660, 700, 780, 850, 870, 890, 910, 940, and 970 nm; 19 wavelengths) | None | PLSR, SVM, BPNN | R: 0.94 for firmness and 0.83 for SSC | [73] |
| Sánchez et al. | 2012 | SSC and TA | Handheld MEMS-based NIR spectrophotometer (range: 1600–2400 nm; spectral interval: 12 nm) | Spectral information | MPLS, local algorithm | MPLS, local algorithm | R2: 0.48 for firmness, 0.62 for MPC, 0.69 for SSC, 0.65 for TA, and 0.40 for pH | [97] |
| Nishizawa et al. | 2009 | SSC and glucose, fructose, and sucrose concentrations | Near-infrared (NIR) spectroscopy | Spectral information (range: 700–925 nm) | SMLR | SMLR | R2: 0.86 for SSC, 0.74 for glucose, 0.50 for fructose, and 0.51 for sucrose | [98] |
| Wulf et al. | 2008 | Phenolic compound content | Laser-induced fluorescence spectroscopy (LIFS) (EX: 337 nm; EM: 400–820 nm; spectral interval: 2 nm) | Spectral information | None | PLSR | R2: 0.99 for p-coumaroyl-glucose and cinnamoyl-glucose | [99] |
| ElMasry et al. | 2007 | MC, SSC, and pH | Hyperspectral imaging in visible and near-infrared regions (range: 400–1000 nm; 826 wavelengths) | Spectral information (8, 6, and 8 wavelengths for MC, SSC, and pH, respectively) | β-coefficients from PLS models | MLR | R: 0.87 for MC, 0.80 for SSC, and 0.92 for pH | [90] |
| Tallada et al. | 2006 | Firmness | Near-infrared hyperspectral imaging (range: 650–1000 nm; spectral resolution: 5 nm) | Spectral information | SMLR | SMLR | R: 0.786 for firmness | [100] |
| Nagata et al. | 2005 | Firmness and SSC | Near-infrared hyperspectral imaging (range: 650–1000 nm; spectral resolution: 5 nm) | Spectral information (3 and 5 wavelengths for firmness and SSC, respectively) | SMLR | SMLR | R: 0.786 for firmness and 0.87 for SSC | [101] |
| Nagata et al. | 2004 | Firmness and SSC | Hyperspectral imaging in visible regions (range: 400–650 nm; spectral resolution: 2 nm) | Spectral information (5 wavelengths for firmness) | SMLR | SMLR | R: 0.784 for firmness | [102] |
* Parameters: SSC, soluble solid content; MC, moisture content; pH, acidity; TA, titratable acidity; TWSS, total water-soluble sugar; MPC, maximum penetration force. ** Optimal waveband selection method: CARS, competitive adaptive reweighted sampling; UVE, uninformative variable elimination; SPA, successive projection algorithm; MC-UVE, Monte Carlo–uninformative variable elimination; GAPLS & SPA, genetic algorithm partial least squares combined with SPA; Bw, weighted regression coefficient; MPLS, modified partial least-squares regression method; SMLR, stepwise multiple linear regression. *** Regression model: PLSR, partial least-squares regression; SVR, support vector regression; LWR, locally weighted regression; MLR, multiple linear regression; SVM, support vector machine; BPNN, back propagation neural network.
Table 2. Summary of recent articles investigating strawberry diseases using remote sensing and machine learning.
| Author | Year | Disease | Description | Reference |
|---|---|---|---|---|
| Mahmud et al. | 2020 | Powdery mildew | Designed a mobile machine vision system for strawberry powdery mildew detection, comprising a GPS unit, two cameras, a ruggedized laptop, and a custom image processing program combining color co-occurrence matrix-based texture analysis with an ANN classifier. The highest detection accuracy reached 98.49%. | [41] |
| Shin et al. | 2020 | Powdery mildew | Used three feature extraction methods (histogram of oriented gradients (HOG), speeded-up robust features (SURF), and gray-level co-occurrence matrix (GLCM)) and two supervised learning classifiers (ANN and SVM) to detect strawberry powdery mildew. The best classification accuracies were 94.34% for the ANN with SURF and 88.98% for the SVM with the GLCM. | [152] |
| Chang et al. | 2019 | Powdery mildew | Extracted 40 textural indices from high-resolution RGB images and compared three supervised learning classifiers (ANN, SVM, and KNN) for powdery mildew detection in strawberry. Overall classification accuracies were 93.81%, 91.66%, and 78.80% for the ANN, SVM, and KNN classifiers, respectively. | [149] |
| De Lange and Nansen | 2019 | Arthropod pest influence | Used hyperspectral imaging to detect stress-induced spectral changes in strawberry leaves caused by three arthropod pests. Large differences were observed in the reflectance data. | [153] |
| Liu et al. | 2019 | Fungal contamination | Combined spatial-spectral information from hyperspectral imaging with aroma information from an electronic nose (E-nose) to estimate external and internal compositions (total soluble solids, titratable acidity) of fungi-infected strawberries over various storage times. PCA was used to extract features from both data sources; the resulting parameters were highly correlated with microbial content. | [154] |
| Cockerton et al. | 2018 | Verticillium wilt | Collected high-resolution RGB and multispectral images of strawberry plants from a UAV platform to study Verticillium wilt resistance in multiple strawberry populations. The NDVI was linked to disease susceptibility. | [155] |
| Altiparmak et al. | 2018 | Iron deficiency or fungal infection | Proposed a strawberry leaf disease detection and classification method based only on RGB values. First, a color-processing detection algorithm (CPDA) computed red and green indices to segment the leaf from the background and delineate the infected area by thresholding. Second, a fuzzy logic classification algorithm (FLCA) determined the disease type, differentiating iron deficiency from fungal infection. | [156] |
| Siedliska et al. | 2018 | Fungal infection | Detected fungal infection in strawberry fruits using VNIR/SWIR hyperspectral imaging. Nineteen optimal wavelengths were selected from the second derivative of the original spectra, and a back-propagation neural network (BPNN) [157] differentiated healthy from infected fruits with an accuracy above 97%. Multiple linear regression models estimated total anthocyanin content (AC; 681 and 1292 nm) and soluble solid content (SSC; 705, 842, 1162, and 2239 nm) with R2 = 0.65 and R2 = 0.85, respectively. | [158] |
| Lu et al. | 2017 | Anthracnose crown rot | Collected in-field hyperspectral data from a mobile platform for three classes of strawberry plants: infected but asymptomatic, infected and symptomatic, and healthy. Thirty-two spectral vegetation indices were used to train models with stepwise discriminant analysis (SDA) [159], Fisher discriminant analysis (FDA), and KNN, achieving classification accuracies of 71.3%, 70.5%, and 73.6%, respectively. | [160] |
| Wahab et al. | 2017 | Gray mold | Compared qPCR and spectroradiometer measurements for detecting the gray mold pathogen Botrytis cinerea on infected and healthy strawberry fruits. Spectral analysis effectively detected gray mold infection: VNIR spectra distinguished healthy from infected fruits based on differences in cellular pigments, while SWIR spectra classified infection severity based on cellular structure and water content. | [161] |
| Yeh et al. | 2016 | Foliar anthracnose | Classified strawberry leaf images into healthy, incubation, and symptomatic stages of foliar anthracnose based on hyperspectral imaging, using spectral angle mapper (SAM) [162], SDA, and a self-developed correlation measure (CM) [163]. Partial least-squares regression (PLSR) [88], SDA, and CM were also used to select optimal wavelengths; 551, 706, 750, and 914 nm were chosen, and the classification accuracy was 80%. | [164] |
| Yeh et al. | 2013 | Foliar anthracnose | Applied three hyperspectral image analysis methods (SDA, SAM, and the proposed simple slope measure (SSM)) to classify strawberry plants as healthy, incubation, or symptomatic for foliar anthracnose. The classification accuracies were 82.0%, 80.7%, and 72.7%, respectively. | [165] |
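The spectral angle mapper (SAM) used in the anthracnose studies above [162] classifies a pixel by the angle between its spectrum and each reference spectrum, making it insensitive to overall brightness differences (e.g., illumination). A minimal sketch follows; the four-band reference signatures are hypothetical, not measured values from the cited work.

```python
# Hedged sketch: spectral angle mapper (SAM) classification on toy spectra.
import numpy as np

def spectral_angle(a: np.ndarray, b: np.ndarray) -> float:
    """Angle in radians between two spectra; 0 means identical shape."""
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

def sam_classify(pixel: np.ndarray, references: dict) -> str:
    """Return the label of the reference spectrum with the smallest angle."""
    return min(references, key=lambda label: spectral_angle(pixel, references[label]))

# Hypothetical 4-band reference signatures (not from the cited studies)
refs = {
    "healthy":     np.array([0.05, 0.08, 0.45, 0.50]),
    "symptomatic": np.array([0.10, 0.12, 0.25, 0.30]),
}
pixel = np.array([0.06, 0.09, 0.40, 0.48])  # brighter overall, but shaped like "healthy"
print(sam_classify(pixel, refs))            # → healthy
```

Because only the angle matters, a uniformly brighter or darker version of the same spectrum maps to the same class, which is why SAM is popular for field imagery with uneven illumination.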
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Share and Cite

Zheng, C.; Abd-Elrahman, A.; Whitaker, V. Remote Sensing and Machine Learning in Crop Phenotyping and Management, with an Emphasis on Applications in Strawberry Farming. Remote Sens. 2021, 13, 531. https://doi.org/10.3390/rs13030531