Image-Based, Organ-Level Plant Phenotyping for Wheat Improvement

Wheat was one of the first grain crops domesticated by humans and remains among the major contributors to the global calorie and protein budget. The rapidly expanding world population demands further enhancement of yield and performance of wheat. Phenotypic information has historically been instrumental in wheat breeding for improved traits. In the last two decades, a steadily growing collection of tools and imaging software have given us the ability to quantify shoot, root, and seed traits with progressively increasing accuracy and throughput. This review discusses challenges and advancements in image analysis platforms for wheat phenotyping at the organ level. Perspectives on how these collective phenotypes can inform basic research on understanding wheat physiology and breeding for wheat improvement are also provided.


Introduction
The rapid growth in world population calls for increased food production to meet the growing demand for calories. To this end, yields of wheat, a major staple food crop, will need to rise by 50% by 2050 as projected by CIMMYT [1]. The gains in yield will be brought on by basic research that explores wheat physiology and genetics, intensive breeding efforts, and innovative agronomic practices. Observing and quantifying plant phenotypes (i.e., phenotyping) is critical and integral to each of these efforts [2]. A phenotype comprises a collection of smaller measurable traits known as phenes [3]. The observed phenes and phenotypes of wheat at least partially dictate its agronomic performance while being indicative of its underlying physiology and genetics as well as its interaction with the environment. Despite its relevance, the acquisition of phenotypic information has been outpaced by that of genetic information, owing to recent breakthroughs in genomics, sequencing, and genetic marker development. This has led to what is dubbed the "phenotyping bottleneck": a constraint on the scale of plant breeding and research caused by current limitations to high-throughput and high-resolution phenotyping [2]. At the core of this bottleneck is how phenotyping is conducted. Quantitatively phenotyping wheat plants commonly involves time-consuming, manual measurements that are constrained in their throughput. This is further complicated by environmental effects on phenotypes, which necessitate replicated, multi-location trials and studies.
Many tools have been crafted to alleviate this phenotyping bottleneck. Such tools rely heavily on image analysis and often serve either to expedite manual measurements or to automate the measurements altogether [4]. For above-ground organs, the scale and the detail at which phenotypic information can be assessed have both expanded vastly due to advancements in different classes of tools. On the scale front, imaging from ground vehicles, drones, aircraft, and even satellites has proven immensely useful for quantitatively studying groups or populations of plants. Readers are directed to several excellent review articles on high-throughput phenotyping (phenomics) using sensor technologies [4][5][6]. The present review focuses on the imaging tools that have enabled greater resolution of phenotypes of individual wheat plants at the organ level. This is highly relevant to fundamental research that tackles questions pertinent to wheat physiology and stress response as well as field research that can provide invaluable data for wheat breeding. Special attention is given to cost-effective tools without a steep learning curve, so as to focus on tools that can be implemented in projects of a broad range of sizes. We aim to provide a point of reference for researchers who are new to phenotyping of wheat plants or are interested in exploring alternative phenotyping platforms. These image-based tools have been, and can continue to be, applied to other cereal grain crops or to crops in general.

Shoot Phenotyping in Wheat: An Overview
Shoot traits such as leaf size, shoot height, spikelets per spike, tiller number, and maturity are amenable to non-destructive sampling and have been available to farmers, breeders, and researchers for centuries. However, quantification of shoot phenotypes of individual wheat plants is not limited to these "traditional" measures. Rather, shoot morphological analysis in wheat has been expanded to a greater collection of traits by virtue of image analysis software. At the simplest level, ImageJ [7] has enabled manual measurements of leaf shape, color, and extent of disease development [8,9]. Beyond the capabilities and throughput of ImageJ, additional tools have emerged that quantify familiar phenotypes from images with higher throughput or that compute relevant, novel phenotypes otherwise unavailable to researchers. Phenotypes assayed by these software tools vary greatly in their scope, with some tools analyzing leaves specifically and others zooming out to analyze individual shoots or the canopies formed by groups of plants (Table 1). While proprietary tools come at an added expense, publicly available tools enable researchers and breeders to quantitatively phenotype their plants while minimizing cost [10]. Such tools are often validated against their proprietary counterparts or against measurements in ImageJ to ensure phenotyping accuracy. Publicly available tools are often open source as well, with their source code being freely available for technically advanced users to tailor to their specific needs [10] (Table 1).

Phenotyping of Individual Leaves
Multiple programs have been created that expedite quantification of leaf traits relative to manual quantification in ImageJ (Table 1). At the most basic level, there are easily implemented options for assaying leaf area phenotypes. Easy Leaf Area [11] enables batch processing of camera or scanner images by quantifying the green area for each plant in each image while using a square red scale marker to calibrate its area measurements. The mobile app LeafScan [12], while coming at only a nominal cost to users, also simplifies leaf area measurements by calibrating area measurements to a square of fixed area in the image background. Another mobile app, Plant Screen Mobile [13], leverages a calibration object in smartphone images to rapidly measure leaf size and dimensions while also providing more options for color thresholding of leaves against more heterogeneous backgrounds. More complex leaf phenotypes, such as lamina shape (shape of leaf blade), can be assayed from scanned images using the MatLab program Lamina2Shape [14]. These leaf shape phenotypes are indicative of wheat's response to changes in agronomic practices such as sowing date, sowing density, and nitrogen fertilizer application, as leaf growth patterns are modulated to accommodate nutrient and space limitations [15,16]. Finally, the proprietary CI-202 Leaf Area Meter (CID Bio-Science, Camas, WA, USA) has a built-in scanner that flattens leaves and measures leaf perimeter, area, width, and length. This device is joined by other proprietary scanners, including the LI-3000C (LI-COR, Lincoln, NE, USA) and the WinDIAS system (Delta-T Devices, Cambridge, UK), which offer advantages in portability and throughput, respectively (Table 1).
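The marker-based calibration these tools share reduces to simple arithmetic: the known physical area of the scale marker fixes a per-pixel scale, which converts a leaf's pixel count into physical area. A minimal sketch in Python (the function name and example values are illustrative, not taken from any of the tools above):

```python
def leaf_area_cm2(leaf_pixels, marker_pixels, marker_area_cm2=4.0):
    """Convert a leaf's pixel count to cm^2 using a scale marker of
    known physical area captured in the same image (e.g., a 2 cm x 2 cm
    red square, as used by Easy Leaf Area-style calibration)."""
    if marker_pixels <= 0:
        raise ValueError("scale marker not detected in image")
    cm2_per_pixel = marker_area_cm2 / marker_pixels
    return leaf_pixels * cm2_per_pixel

# A 2 cm x 2 cm marker covering 10,000 pixels and a leaf covering
# 62,500 pixels in the same image -> 25.0 cm^2
```

Because the marker and leaf are imaged together, the calibration holds regardless of camera distance, which is what makes casual smartphone or scanner imagery usable.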
Morphological traits such as shape are not the only relevant parameters in leaf phenotyping. Plants exist in environments where biotic stressors such as pests and diseases are present. Quantitatively assaying disease resistance and pest deterrence in leaves is important to research and breeding in wheat, as it provides valuable data for studies that elucidate key genes or linked molecular markers for biotic stress resistance. WinFOLIA (Regent Instruments, Quebec City, QC, Canada) is proprietary software that quantifies herbivory and disease extent in addition to leaf morphological parameters. WinFOLIA has been implemented in research on monocot crops, such as studies of agronomic practices in maize [17] and disease response in barley [18]. In contrast to WinFOLIA, several leaf phenotyping freeware tools have the added benefit of being mobile applications, enabling enhanced portability relative to flatbed scanner systems. BioLeaf [19] and LeafByte [20] are freely available smartphone applications that can rapidly quantify the proportion of damage to leaves from pests. These apps considerably lower the equipment requirement for imaging leaf tissue and lower the technical learning curve for researchers and breeders who wish to quantify pest damage and deterrence. Another mobile application, Leaf Doctor [21], measures the proportion of diseased area on a leaf. This allows researchers to have quantitative measures of disease susceptibility rather than relying on coarse scales for disease grading [22]. This suite of leaf-level tools collectively gives research groups several options for quantitative leaf phenotyping based on the traits in consideration as well as the size and budget of the research project.

Phenotyping of Individual Shoots
Compared to individual leaves, phenotyping of whole shoots is technologically challenged by the inability of common imaging devices to capture information on three-dimensional (3D) structures. As a result, many phenotypes commonly assayed on whole shoots rely on the measurement of 2D projections of shoots taken with monocular (single camera) imaging systems [23,24]. Some of the leaf phenotyping programs survey the green area in images, making them suited to measure projected areas of whole shoots as well. For example, Easy Leaf Area can be used to batch process images of shoots to generate projected area measurements from a collection of images [11]. Likewise, ImageJ is still suited for fully manual measurements of area, height, and diameter of the shoot in the image [23,25]. CoverageTool [25] and HTPheno [23] are more specialized tools for whole shoot phenotyping. CoverageTool allows users to partition shoot images into regions by color and quickly quantify the projected area occupied by each color [25]. This enables rapid measurements not only of projected area, but also of senescence or tip dieback. HTPheno can quickly batch process multiple images taken with the same specifications, enabling the user to rapidly measure shoot width, height, and projected area without the need for manual measurements [23]. The usage of CoverageTool, HTPheno, and other shoot phenotyping software (such as Canopy Reconstruction [26]) is naturally constrained by the user's ability to quickly image shoots with consistent lighting, scales, and camera angles. Conveyor-assisted imaging systems that can generate images suitable for analysis with high-throughput phenotyping software are constructed by companies such as Phenospex (Heerlen, The Netherlands), Lemnatech (Aachen, Germany), and Qubit Phenomics (Kingston, ON, Canada). Such systems have been implemented at various centers worldwide [27] and have been used in the validation of software tools including HTPheno.

Phenotyping of Canopy Cover
Wheat shoots invariably come into contact with each other in the field setting. Communities of individual plants form canopies in which the spatial distribution of leaves collectively determines the productivity of the entire plot [28]. A common metric used to quantitatively analyze these plant canopies is the fractional green canopy cover (FGCC): the proportion of a given two-dimensional, vertically viewed area that is taken up by green shoot tissue. As with shoot phenotyping, programs that quantify green area in images, such as ImageJ, CoverageTool, Easy Leaf Area, and Canopy Cover Free (a mobile app version of Easy Leaf Area), are appropriate for assaying the proportion of a plot image that is occupied by a plant canopy, effectively measuring FGCC. The mobile application Canopeo [29] is specialized for FGCC measurements and has been leveraged more extensively in wheat research than these general-purpose tools. The Canopeo app semi-automatically processes images of wheat canopies taken by users with a smartphone while allowing users to append additional metadata or notes to each image [29]. As FGCC is computed as a unitless proportion of green area to non-green area, scale markers are not needed for imaging, which increases throughput and lowers equipment needs. This app is compatible with a wide range of computer operating systems and is continually maintained, ensuring its longevity as new tools emerge [30]. The relatively shorter canopies of wheat benefit from being more accessible throughout the plant's life cycle than those of taller monocot crops such as sugarcane or maize, making the use of cost-effective, handheld imagery feasible for time-series measurements of wheat in the field or greenhouse. Each of the freeware tools mentioned here is able to robustly quantify FGCC from images taken in these settings irrespective of the imaging device used.
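At its core, FGCC estimation amounts to classifying each pixel as green tissue or background and reporting the green fraction. A minimal sketch using RGB ratio criteria of the general form these tools apply (the specific thresholds here are illustrative assumptions, not values published for Canopeo):

```python
def fgcc(pixels, p1=0.95, p2=0.95, p3=20):
    """Fractional green canopy cover: the share of pixels classified
    as green shoot tissue. `pixels` is an iterable of (R, G, B) tuples
    in 0-255. A pixel counts as green when red and blue are both small
    relative to green and the excess-green term 2G - R - B is large;
    the threshold values p1, p2, p3 are illustrative."""
    def is_green(r, g, b):
        if g == 0:
            return False
        return (r / g) < p1 and (b / g) < p2 and (2 * g - r - b) > p3

    pixels = list(pixels)
    green = sum(1 for r, g, b in pixels if is_green(r, g, b))
    return green / len(pixels)

# One canopy pixel and one soil-colored pixel -> FGCC of 0.5
print(fgcc([(40, 120, 30), (200, 190, 180)]))
```

Because the result is a unitless proportion, no scale marker is needed, which is precisely what makes rapid handheld imaging workable for this trait.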

Phenotyping of Shoot Chemical Content
Image-based shoot phenotyping in the visible wavelengths can be complemented by information acquired using infrared (IR) spectroscopy. This approach uses reflectance of wavelengths in the near infrared (NIR) or short-wave infrared (SWIR) spectrum (700 nm-2500 nm) relative to that of visible wavelengths to make inferences about the chemistry and overall health of the plant. The use of hyperspectral cameras capable of scanning large regions of the IR spectrum has given IR imagery its well-documented role in aerial remote sensing [31] but has also given it versatile uses in organ-level phenotyping. Commercially available hyperspectral cameras have enabled shoot chemical phenotyping through the analysis of false-color images. Many field-portable spectroradiometers circumvent the need for image analysis altogether by reporting spectral data for a targeted imaging field, though this is at the expense of measurement area. At the simplest level, some handheld devices feature a "point and shoot" measurement method that quickly and easily measures spectral reflectance at a target location, though the collection of traits measured is more limited. The broad spectral range and the high spectral resolution of many currently available tools have enabled the usage of numerical spectral vegetation indices (SVIs) whose relationships with plant chemical and physiological status can be modelled [32]. A commonly reported SVI is the Normalized Difference Vegetation Index (NDVI), which is computed using measured reflectance of the red and NIR bands [33]. NDVI measures are indicative of vigor and phenology in wheat [34], and each of the aforementioned tools can readily compute NDVI measures at varying scales. As NDVI serves as a suitable proxy for plant health, it has been used in exploration of nematode resistance [35], the effects of no-till practices [36], and the genetics of heat and drought resistance [37] in wheat.
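NDVI, like many SVIs, is a normalized difference of two band reflectances: the difference between NIR and red reflectance divided by their sum. A minimal sketch (the example reflectance values are illustrative):

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index from NIR and red band
    reflectances (each typically expressed as a fraction in [0, 1]).
    Healthy vegetation reflects strongly in NIR and absorbs red,
    pushing NDVI toward +1; bare soil yields values near zero."""
    if nir + red == 0:
        return 0.0
    return (nir - red) / (nir + red)

# Dense green canopy: high NIR, low red reflectance
# ndvi(0.50, 0.08) is roughly 0.72
```

Other normalized-difference indices follow the same (a - b) / (a + b) pattern with different band choices, which is why instruments covering a broad spectral range can report many SVIs from a single measurement.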
Other SVIs applicable to shoot chemical phenotyping in wheat have been developed beyond the NDVI. One such index is the Normalized Water Index (NWI), which leverages water absorption of certain NIR wavelengths to estimate shoot water content [38]. Beyond shoot water content, concentrations of pigments, such as anthocyanins, can be measured in NIR imagery using the anthocyanin reflectance index. This has proven useful in the detection of stress symptoms in wheat, particularly in response to yellow rust [39]. Finally, the Pigment Specific Simple Ratio for Chlorophyll a has proven to be an effective way to measure chlorophyll a in field-grown wheat [40]. Other indices that correlate with shoot nitrogen uptake, leaf nitrogen content, and yield components in wheat are being developed and explored as well [41]. Further work on modelling relationships between SVIs and mineral macronutrient (N, P, K, S) and micronutrient (such as Zn, Fe, and B) concentrations is underway [42]. Currently available hyperspectral devices are capable of assaying spectral regions that enable users to carry out research and breeding in wheat using these SVIs and others. These proximal sensing tools have no barrier to their implementation outside of the hardware cost, making them amenable to incorporation into experimental designs or breeding regimes without the need to substantially overhaul the experimental design or breeding pipeline.

Research Trajectories in Shoot Phenotyping
Aside from profound advances in aerial and satellite imagery for remote sensing [43], affordable, smaller-scale image-based phenotyping of shoot traits is becoming increasingly user-friendly and higher throughput. Field phenotyping at the scale of individual plants or small groups of plants benefits from field-portable or handheld imaging devices. The increased availability of software capable of analyzing images taken by these devices in the field enables the exploration of research questions without introducing a controlled environment as a confounding factor. The rapid expansion of mobile applications in this space that can generate valuable phenotypic information stands to benefit both research and breeding efforts in wheat. Indeed, the rapid growth of low-cost shoot phenotyping systems benefits breeding by enabling participatory breeding, in which growers can provide quantitative inputs to breeders without changing their growing operations [44]. Continued development is occurring in this space as deep learning approaches continue to be implemented in the development of wheat phenotyping software [45]. For example, faster and easier acquisition of shoot traits that have thus far been excluded from image-based shoot phenotyping, such as spike number and tiller count, is steadily becoming possible using deep learning-facilitated image analysis [46].

Barriers and Strategies for Root Phenotyping
Plants allocate a large percentage of their photosynthate produced in shoots to sink tissues, such as roots [47]. Roots are the interface with the soil that plants use for anchorage, water and nutrient uptake, and microbial symbiosis [48]. Compared to shoot phenotyping, quantifying the phenotypes and constituent phenes of roots poses additional challenges. Retrieving roots from the soil for visualization is destructive to the plant and is also likely to perturb the phenotypes that are being measured [48]. In addition, the complexity of root systems presents another issue for automated phenotyping efforts. Root intersections, branching, and the presence of fine roots beyond the researcher's imaging capacity pose barriers to root phenotyping that are not present when phenotyping aboveground tissues [49].
Simple excavation remains a relevant method for phenotyping roots of plants growing in the field ("shovelomics" [50]). However, researchers should take great care to maintain the integrity of the root system during excavation and subsequent washing. Although washed roots extracted from soil cores can be suspended in water and imaged with high contrast, soil coring fails to capture the whole root system in all but the younger plants. As such, root crowns, rather than the whole root system, are phenotyped for mature plants in shovelomics [50]. Hydroponic systems provide an alternative liquid medium for plant growth and non-destructive imaging. However, plants grown hydroponically will invariably differ in their root morphology relative to soil-grown plants [51]. Physical tools have also been devised that ameliorate the visualization constraint for root phenotyping. Glass-sided boxes of soil called rhizotrons (or rhizoboxes) enable a relatively coarse visualization of the root system by allowing the roots touching the glass wall to be observed and imaged [52]. Roots in the field have been studied with "minirhizotrons", which are long, periscope-equipped glass tubes inserted into the soil of agronomic plots. Like rhizotrons, minirhizotrons enable imaging of roots growing along the glass tube wall. Minirhizotrons are valuable for their ability to image roots continuously and non-destructively in situ, though they have limited ability to image entire root systems [53]. In recent years, X-ray computed tomography has seen improved image resolution and quality and has been applied to generating 3D images of roots in situ [54].
Following image acquisition, the task of quantifying root architectural traits remains. Manual quantification through tracing of root images, despite providing relatively accurate measurements, is laborious and limited to simple root systems. On the other hand, recent advances in computer vision have produced several software tools for root phenotyping that collectively accommodate a large collection of imaging methods, plant species, and phenotypes [10]. Many such tools are applicable to wheat phenotyping (Table 2) and have been employed to address a wide range of research questions as discussed below.

Software Applicable to Root Phenotyping in Wheat
The freeware ROOTEDGE [55] and the proprietary phenotyping platform WinRHIZO (Regent Instruments, Quebec City, QC, Canada) were the early imaging tools that emerged alongside ImageJ. ROOTEDGE, an early alternative to hand measurements of root systems, received some proof-of-concept work on wheat roots [56], but was written for early versions of MS-DOS and has since been overshadowed by more recently released tools that can better distinguish roots from image backgrounds and that are compatible with more operating systems and image specifications. WinRHIZO is a closed-source, hardware-associated tool that gained traction early on because of its ability to robustly image and quantify roots. The scanners associated with WinRHIZO are calibrated and illuminated in a manner that minimizes shadows, though generic flatbed scanner images can also be analyzed with the WinRHIZO software. Early applications of WinRHIZO in wheat enabled evaluation of many traits and processes, such as the modelling of root length density [57] and the investigation of crop rotation [58]. This proprietary platform has enjoyed sustained maintenance, something many publicly available freeware tools lack [30]. As a result, continued applications of WinRHIZO have been seen widely in wheat research. For example, this platform has been applied in genetic mapping of root traits [59][60][61], exploration of nutrient response [62][63][64], gauging response to water limitation [60,65,66], and evaluation of germplasm [67], among a plethora of other studies. However, the cost associated with the WinRHIZO image analysis system (scanner and software) may be considered prohibitive to new users in spite of its utility. To that end, several freely available tools have emerged that have overcome many of the limitations of ROOTEDGE while providing a workaround for the cost limitation of WinRHIZO [10] (Table 2).
Among the freely available root phenotyping tools, there are varying levels of automation for root trait quantification. Semi-automation has the advantage of enabling the user to improve phenotyping accuracy by manually correcting errors made in root identification, though this comes at the expense of throughput [10]. SmartRoot [68] and RootNav [69] are semi-automated tools that have been applied in wheat research. SmartRoot is an open source, operating system-independent plugin for ImageJ. Primary or lateral roots in whole root systems can be automatically identified by SmartRoot. The user has the option to review the automatic identification for each root segment to ensure an accurate measurement of the imaged root system [68]. SmartRoot has been used in wheat research to characterize germplasm [70,71], analyze plant-plant interactions [72], and study potential breeding targets for root architectural traits [73].
RootNav is another open source, semi-automated phenotyping tool that distinguishes primary and lateral roots [69]. RootNav users specify root tips and the source of the roots in the image. The program will generate a system of primary and lateral roots using this information and using the pixel intensities in the image. The user can proofread this root system after it has been generated to ensure robust identification of the root network. The user also has the ability to change thresholding parameters in different regions of the image to enable detection of roots in backgrounds with inconsistent lighting [69]. The ability to analyze lateral roots in complex root systems has given many uses for RootNav in wheat research, such as studies linking seedling traits to yield components [74] and nitrogen uptake [75], as well as to studies delving into the genetic components of root architecture [76][77][78]. RootNav is continually maintained, and a second version of the tool has been released as a command-line operated Python program that leverages deep learning to more effectively identify roots [79].
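The reconstruction step described above can be pictured as a shortest-path search over pixel intensities, in which bright (root) pixels are cheap to traverse and dark (background) pixels are expensive. The following is a minimal Dijkstra-based sketch of that idea only; it is not RootNav's actual algorithm, which uses a considerably more sophisticated model:

```python
import heapq

def trace_root(intensity, tip, source):
    """Trace a plausible root path from a marked root tip to the root
    source on a grayscale image (2D list of 0-255 values, bright
    pixels = root). Runs Dijkstra's algorithm where darker pixels cost
    more to cross, so the cheapest path follows the bright root.
    A hypothetical sketch of intensity-weighted path search."""
    rows, cols = len(intensity), len(intensity[0])
    dist, prev = {tip: 0.0}, {}
    heap = [(0.0, tip)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if (r, c) == source:
            break
        if d > dist.get((r, c), float("inf")):
            continue  # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                step = 256 - intensity[nr][nc]  # dark pixels cost more
                nd = d + step
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(heap, (nd, (nr, nc)))
    # Walk the predecessor chain back from source to tip
    path, node = [], source
    while node != tip:
        path.append(node)
        node = prev[node]
    path.append(tip)
    return path[::-1]
```

Semi-automation fits naturally on top of such a search: the user supplies the endpoints (tips and source), the machine fills in the path, and the user proofreads the result.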
Fully automated programs differ from their semi-automated counterparts in that the specifications for image processing and analysis are set at the start of individual runs, and these specifications are applied to all images in the batch. This considerably increases throughput, as the user is not tasked with proofreading each image in the dataset. However, accuracy may be compromised if the specifications are not set carefully or if the images are of inconsistent quality [4]. The now discontinued GiA Roots [80] was a fully automated program that had been leveraged in wheat research. GiA Roots identified root pixels from the background of batches of images using thresholding parameters that were set by the user prior to the analysis. The roots identified in this scheme were measured for traits such as length, length distribution, surface area, and convex area [80]. GiA Roots has been adopted to study components of seedling water deficit response [81][82][83] and linkage drag of genes underpinning seedling root system traits [84] in wheat. However, GiA Roots has slowly been rendered outdated by new software tools such as Digital Imaging of Root Traits (DIRT) [85] and the neural network driven programs SegRoot [86] and saRIA [87], described below.
DIRT is a unique web platform backed by a computing cluster for the processing of image data [85,88,89]. Furthermore, this platform is unique in that it hosts a growing collection of public image sets shared by researchers around the globe [85]. The DIRT platform quantifies root system architectural traits such as width accumulation, spatial distribution, and rooting angles in batches of images uploaded to the online interface. DIRT was designed to process images of excised roots taken with low-cost imaging systems (such as tripod-mounted smartphones) in a field setting. The use of scale markers in the images enables DIRT to correct for camera tilting and to set the scale for each image without user intervention [85,88,89]. DIRT offers promise to wheat research due to its unique ability to quantify excised root systems without the need for the added technical training required for installing and running standalone software. The DIRT platform has already been used to explore wheat's response to phosphorus deficiency [90]. To date, many wheat image collections exist on the DIRT platform, raising promise for its future usage in wheat research.

Recent Advances in Root Phenotyping Using Deep Learning
The semi-automated saRIA [87] and the fully automated SegRoot [86] are recently published tools that can quantify useful traits in wheat root systems. Each of these tools uses a convolutional neural network (CNN) trained to identify roots in visually noisy images [45]. The semi-automated saRIA was developed in MatLab (MathWorks, Natick, MA, USA) and computes a suite of traits from roots grown in agar, fine soil, or other media with shapes that differ sufficiently from the roots being studied [87]. Some room for machine error is given when using saRIA, as the user has the option to remove objects that have been misidentified as roots in the image. SegRoot is fully automated and limits its output traits to length. However, SegRoot has a very low image quality requirement in that it is amenable to analyzing roots from complex, visually heterogeneous soil backgrounds [86].
Both SegRoot and saRIA were tested for accuracy against other published tools, and both outperformed GiA Roots in the identification of roots in heterogeneous growth media, demonstrating clear benefits of machine learning-informed approaches in root phenotyping. These tools have promise for rapidly identifying wheat roots imaged in soil-bound systems, such as minirhizotrons and rhizotrons, without the need for manual image pre-processing with image manipulation software. This is especially important, as it opens the door to the analysis of more mature root systems, expanding the scope of wheat root phenotyping away from the limited architectures of seedlings to those of mature plants.

Research Trajectories in Root Phenotyping
Phenotyping under controlled conditions has yielded a notable number of studies that assay wheat seedling traits prior to transplantation [73,74,91], producing datasets of traits that correlate with yield components in wheat. Outdoor rhizotrons enable measurements of soil-bound roots in an environment that simulates field soil [52,92]. Minirhizotrons can be used to assay root turnover of plants in field soil, which can generate informative phenotypes of individual plants or of several individuals of the same genotype sown in close proximity [53,93]. Excavation-based shovelomics approaches can provide the throughput needed in a field setting to overcome the variability inherent in root phenotyping by virtue of their low equipment requirement and relatively simple procedure [50]. Shovelomics and its accompanying software tools (REST [94], DIRT [85]) collectively enable root phenotyping at scale in the field. Though full excavation becomes impossible for mature plants, the correlative traits derived from immature root systems or from mature root crowns avoid the confounding factor of a controlled environment, potentially making them more indicative of mature plant phenotypes for breeding or of underlying plant physiology for research.
Overall, the needs of the wheat research community for root phenotyping are increasingly being met with the collection of physical tools and the development of effective imaging tools [30].

Challenges and Software Applicable to Seed Phenotyping in Wheat
Wheat is grown almost exclusively for grain consumption. The yield of wheat has an obvious interplay with grain-centered phenotypes such as seed number, shape, and size. Grain color is also relevant as it may influence consumer preference of end-use products [95]. Despite the importance of seed phenotypes to wheat cultivation and marketing, phenotyping of wheat seed is limited by a pair of factors. Firstly, seeds are small and variable in size, requiring a high number of careful dimensional measurements to successfully capture the trends in seed characteristics across cultivars or populations [96]. Secondly, robust measurements of seed color are difficult or impossible to make by human estimation. To circumvent these issues of throughput and color phenotyping, several image-based tools have emerged in the last 20 years that enable rapid quantification of seed dimensions and colors. A pair of publicly available software tools, as well as a pair of proprietary, hardware-specific tools, are currently available to researchers for seed phenotyping (Table 3). SmartGrain [97] and GrainScan [96] are independent of specialized hardware and can be used with low-cost, conventional flatbed document scanners. SmartGrain analyzes seed size and dimensions for each grain in the image, effectively giving the seed count [97]. SmartGrain is semi-automated in the sense that the user needs to review and potentially correct the masking of the grains from the background to ensure that they were detected properly. GrainScan is fully automated and analyzes seed size and dimensions while also taking standardized color measurements [96]. Although GrainScan is equipped with batch processing and tunable thresholding parameters, it does not offer manual correction by the user. Both packages have been widely employed in wheat research.
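The core count-and-measure step these tools automate can be sketched as connected-component labelling of a thresholded scan: each connected blob of seed pixels is one grain, and its pixel count is its area. A minimal sketch (the function and test mask are illustrative; SmartGrain and GrainScan additionally fit lengths, widths, and, in GrainScan's case, color):

```python
from collections import deque

def seed_stats(mask):
    """Count seeds and measure each seed's pixel area in a binary scan
    image (2D list; 1 = seed pixel, 0 = background), using 4-connected
    component labelling via breadth-first search. Returns
    (seed_count, list_of_pixel_areas)."""
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    areas = []
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                # Flood-fill one seed, accumulating its pixel area
                area, queue = 0, deque([(r, c)])
                seen[r][c] = True
                while queue:
                    cr, cc = queue.popleft()
                    area += 1
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        nr, nc = cr + dr, cc + dc
                        if (0 <= nr < rows and 0 <= nc < cols
                                and mask[nr][nc] and not seen[nr][nc]):
                            seen[nr][nc] = True
                            queue.append((nr, nc))
                areas.append(area)
    return len(areas), areas
```

With a scale marker or known scanner resolution, pixel areas convert directly to physical units, and the per-seed areas across a sample capture the size distribution that manual measurement struggles to obtain at scale.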
SmartGrain has been leveraged to explore the genetics underlying yield components [98][99][100][101][102], to aid breeding and introgression [103][104][105], and to analyze drought response [106]. GrainScan has been used in the exploration of yield genetics [107,108], the evaluation of ancestral germplasm [109], and in the unraveling of wheat domestication [110]. Both of the phenotyping tools noted here could see broad implementation in academic laboratories, as they have low equipment requirements and require little technical expertise to operate.
The proprietary phenotyping platforms, WinSEEDLE (Regent Instruments, Quebec City, QC, Canada) and SeedCount (Next Instruments, Condell Park, Australia), are associated with specialized hardware that comes as an extra expense for researchers. In both cases, this specialized hardware functions to increase throughput and image quality relative to conventional flatbed scanners. Both WinSEEDLE and SeedCount are fully automated and provide information about seed shape, number, and color. These systems predate the freeware outlined above and have been applied in many studies since their release. Like GrainScan and SmartGrain, WinSEEDLE has been used in explorations of genetic and phenotypic underpinnings of yield components in wheat [111][112][113][114][115][116]. SeedCount has seen broad use in academic research. In addition to usage in genetic mapping studies for yield [117][118][119], SeedCount has been leveraged to explore wheat abiotic stress response [120][121][122], grain mineral accumulation [118,[123][124][125], and to evaluate germplasm [126,127]. Usage of these proprietary tools certainly extends beyond what is reported in academic literature. The marketing of these proprietary platforms is primarily directed toward seed industry professionals. As such, applications in the public domain represent only a subset of their overall usage. Nonetheless, there are clear avenues for usage of these proprietary seed phenotyping tools in academic research, provided that their cost is not prohibitive.

Research Trajectories in Seed Phenotyping
Compared to images of shoots and roots, images of seeds are less complex, both in the identification of individual seeds and in the number of informative phenotypes that can be garnered from them. Differences in seed size are readily detected by gravimetric measurements, whereas clear differences in seed shape across varieties often go unquantified. However, phenotypes that are difficult or arduous to assay by hand, such as color or seed count, are quickly measured by image analysis using the aforementioned tools. Improvements in this space could leverage recent advances in deep learning and image analysis to enable more reliable identification of seeds against a wider range of image backgrounds, namely light-colored or textured backgrounds with low contrast to the seeds [128]. Increasing the robustness of seed identification across many backgrounds could expand the collection of mobile applications for seed phenotyping, which are currently limited to seed counting [129], by supplying tools capable of measuring dimensions and shape from smartphone images. Such improvements have the potential to supplant the need for scanning devices when phenotyping wheat grains, much like the portable apps that are becoming increasingly germane to shoot phenotyping.

Concluding Remarks and Future Perspectives
A suite of accessible phenotyping tools has been developed that is applicable to both fundamental and applied wheat research. An integrated phenotyping approach that accounts for many aspects of plant growth can unravel the effects of genetics, environmental factors, and management practices on the physiology of wheat and ultimately translate this to enhancing its performance and productivity in the field. Rapid, comprehensive evaluation of wheat at varying stages in the breeding pipeline, from wild species to landraces to elite breeding lines, will be instrumental in increasing wheat yields to accommodate the increase in the demand for wheat that comes with a rapidly growing world population. Moreover, the increasing accessibility of these tools, particularly for shoot and seed phenotyping, has lowered the learning curves and imaging requirements, so that a wider range of breeders can rise to meet this global need without having to reinvent their breeding pipelines.
Research trajectories for shoot, root, and seed phenotyping in wheat, and in crops in general, are discussed in the above sections. Looking forward, several developments will continue to improve the throughput, accuracy, and accessibility of the phenotyping outlined here. From the biological sciences, advances in next-generation sequencing, genetic, and functional genomic analyses collectively improve our ability to leverage phenotypic observations to explore the idiosyncrasies of the wheat genome and to develop new cultivars. From the engineering and computational sciences, the software and hardware tools available for this effort will steadily improve as well; breakthroughs in deep learning, computer vision, and graphical user interfaces are being complemented by advances in image acquisition and computing clusters. The continued collaborative efforts among these disciplines hold promise to generate a wealth of phenotyping data readily accessible for analysis by researchers in basic and applied sciences, much like the wealth of genetic information that already exists publicly for wheat [130]. The expansion of phenotyping information will facilitate multidisciplinary research investigations in wheat that enable us to hit the needed yield benchmarks for decades to come.

Conflicts of Interest:
The authors declare no conflict of interest.