Article

Rapid Detection of Key Phenotypic Parameters in Wheat Grains Using Linear Array Camera

1 School of Agricultural Engineering, Jiangsu University, Zhenjiang 212013, China
2 Key Laboratory of Modern Agricultural Equipment and Technology, Ministry of Education, Jiangsu University, Zhenjiang 212013, China
3 School of Food and Biological Engineering, Jiangsu University, Zhenjiang 212013, China
* Author to whom correspondence should be addressed.
Appl. Sci. 2025, 15(10), 5484; https://doi.org/10.3390/app15105484
Submission received: 14 March 2025 / Revised: 9 May 2025 / Accepted: 10 May 2025 / Published: 14 May 2025

Abstract

To address the limitations of traditional manual phenotype measurement, this study investigated the rapid acquisition of phenotypic parameters in wheat grains. An index for bud-point determination was introduced, offering valuable insights for breeding selection, optimizing the sowing and growth direction of grains, and assessing seed vitality. A grain collection device based on a linear array camera was designed and constructed, accompanied by custom software to simplify its operation. Image processing was performed with the Halcon image processing software, enabling the development and implementation of a system for extracting key phenotypic parameters. The preprocessing method isolates wheat grain regions by combining S-channel extraction, global thresholding, and opening and closing operations to achieve robust segmentation. Based on this preprocessing method, a phenotypic analysis was performed by evaluating the geometric characteristics of wheat grains. The extraction of phenotypic characteristics and the determination of the bud point were verified experimentally, and the results demonstrated high correlation and accuracy. Compared with manual measurements, the extraction algorithm developed in this study estimated the combined grain length of the five wheat varieties with a determination coefficient (R²) of 0.986 and a root mean square error (RMSE) of 0.0887. For grain width, R² and RMSE were 0.9505 and 0.0541, respectively; for grain weight, they were 0.7635 and 3.8329. Furthermore, the algorithm determined the bud-point direction with an accuracy of 97.5%. The mean positioning error of the germination points was within 0.10 mm, and the mean angular error was within ±3 degrees. The phenotypic analysis of wheat using the image processing techniques presented in this study shows strong consistency with manual detection methods, providing valuable quantitative data to assist breeding selection and accelerate genetic improvement.

1. Introduction

As wheat is a staple food crop, its yield and quality are critically linked to global food security and human nutritional health [1,2,3]. With ongoing population growth and increasingly diverse dietary demands, the global need for wheat has been rising. However, challenges such as the limited area of cultivated land and yield losses caused by sub-optimal grains have made it increasingly necessary to cultivate improved wheat varieties and enhance yield per unit area. Consequently, improving wheat productivity has become a primary focus for breeders [4,5,6]. The main determinants of wheat yield include grain weight, the number of tillers, and the ear formation rate. This study analyzes the phenotypic indices of wheat grains, recognizing that traits such as grain length and width are strongly positively correlated with grain weight. By combining grain phenotypic measurements with bud-point determination, this study offers a comprehensive framework for evaluating germplasm resources [7].
The continuous improvement of agricultural informatization has promoted the extensive application of technologies such as machine vision, infrared spectroscopy, and image processing in agriculture [8]. These technologies have made substantial progress in areas including disease diagnosis [9], variety classification [10,11,12,13], grain detection [14,15], and yield prediction [16,17]. For instance, Xing et al. [18] used a visible–near-infrared hyperspectral imaging system to detect sprouting damage in red spring wheat, demonstrating that reflectance at 878 nm and 728 nm effectively differentiates unsprouted, sprouted, and severely sprouted wheat across the 400–1000 nm spectral range. Similarly, Zhang Tingting et al. [19] used a hyperspectral imaging system to obtain spectral information from wheat grain samples for seed viability discrimination. By applying multiple pretreatment techniques, characteristic band extraction methods, and coupled analyses, they developed a MUCS-PLS-DA model that achieved a germination rate prediction accuracy of 93.1%. These findings add to the extensive body of research applying near-infrared hyperspectral data to wheat grain analysis. However, hyperspectral data contain a large amount of redundant information and require feature extraction; the data analysis and processing steps are complex, and most breeding researchers cannot analyze spectral data themselves. Furthermore, real-time analysis remains difficult with hyperspectral technology due to its computational intensity and data volume, highlighting the need for simplified, rapid phenotyping methods to better support breeding programs.
As the foundation of machine vision technology, image processing technology enables quick and precise analyses of various grain characteristics through operations such as the digital processing of images and feature extraction [8,20]. For example, it can be applied to assess attributes such as the color and morphology of wheat grains, as well as structural features, including awns, spike shapes, and the number of wheat ears [21,22]. In traditional methods, detecting the appearance and quality of wheat grains mainly relies on manual inspection [23]. Researchers conduct relevant work based on their own experience, which is inefficient and subject to strong subjective factors, making it difficult to meet detection needs in scientific research scenarios. Consequently, obtaining quick and accurate wheat grain phenotypic information through image processing is of great significance. For example, Jia Weiding [24] improved the existing image processing methods and proposed an improved marker watershed algorithm for grain segmentation. Zhu Junsong [25] applied preprocessing techniques to raw wheat grain images that preserved grain shape, thereby facilitating the reliable extraction of color, shape, and texture features, which are essential for further analysis. Similarly, Zhang Xiaoyu et al. [26] used image processing to extract features from corn, building a mathematical model that accurately measured seedling height and germination rate, demonstrating the versatility of these techniques across crops. Furthermore, Bi Kun et al. [27] employed morphological image processing to extract parameters, such as spike length, awn length, number of awns, and spike type, enabling non-destructive variety identification based on wheat morphology. Li et al. [28] extracted spike area and length through basic image preprocessing, combined with vertical rachis rotation and stem segmentation techniques, establishing predictive models for phenotypic trait analysis. 
The ongoing development and application of machine vision across diverse agricultural domains offer valuable approaches for wheat grain detection and related industrial uses [29,30]. Wu Xudong [31] developed a machine vision-based germination device to assess seed viability, integrating morphology and machine learning methods to accurately detect wheat seed viability. Collectively, these studies validate the feasibility of using image processing and machine vision technologies for the detection of grain phenotypic traits and viability, providing a solid foundation for further research in this area.
Image quality is of great importance for accurate grain identification and segmentation and for acquiring phenotypic parameters. Area-array camera-based image acquisition has been widely adopted in applications such as wheat quality assessment and grain counting due to its ability to capture detailed and reliable images [32]. Furthermore, line-scan cameras offer significant advantages, including a wide dynamic range, high resolution, rapid scanning speeds, and the ability to produce clear images of moving objects. When paired with lenses characterized by low distortion and high optical resolution, line-scan cameras have exhibited great potential in advancing grain detection accuracy and efficiency [33,34].
This study focuses on five experimental wheat varieties provided by the Jiangsu Academy of Agricultural Sciences. The main research contents are as follows: (1) the design and construction of a wheat grain image acquisition system that integrates a high-precision linear array camera, complemented by custom upper-computer software to optimize the device’s operational performance and ensure smooth data capture; (2) the use of Halcon to develop effective methods for extracting image feature regions and quantifying wheat grain phenotypic traits. Critical parameters such as grain length, grain width, and grain weight are modeled through fitting relationships, with their accuracy validated against manual measurement data. Moreover, the precision of bud-point determination was investigated to ensure a comprehensive phenotypic characterization.

2. Materials and Methods

2.1. Experimental Materials

Five wheat varieties—Huaimai 20 (hm20), Huaimai 44 (hm44), Ninghongmai 237 (nhm237), Ninghongmai 432 (nhm432), and Xumai 35 (xm35)—were obtained from the Institute of Germplasm Resources and Agricultural Biotechnology of Jiangsu Academy of Agricultural Sciences and selected as the experimental materials. Before image acquisition, the wheat grains were placed in a desiccator containing desiccant to prevent moisture-induced degradation, ensuring high-quality images unaffected by climatic variations. During preliminary setup and debugging, the shooting effects of various bottom-plate materials were systematically evaluated under consistent lighting conditions. Factors such as image contrast, durability, and cost were carefully considered to optimize image clarity and practical applicability, and black hard cardboard was selected as the background material for capturing wheat grain images. The shooting effect is shown in Figure 1.

2.2. Detection System

This system comprises a lifting platform, a synchronous belt drive part, a tray loading part, and an image acquisition part. The specific installation positions and the physical objects are shown in Figure 2. The equipment's usage method is detailed in the system workflow below. Wheat grains are placed in the tray, and upon starting the system, images of the grains are captured on the platform and transmitted to a computer via an image acquisition card. Subsequently, the Halcon v24.11 image processing software is utilized to further process the wheat grain images.

2.2.1. Hardware Build

  • Visual unit
Linear array cameras have higher pixel precision than area-array cameras; can continuously acquire images, making it easier to achieve uniform lighting; can capture images of moving objects; and have better expandability. The size of wheat grains is relatively small, with a length of approximately 6–9 mm and a width of about 2–3 mm. When the horizontal detection accuracy of the device can reach 0.1 mm, the defects of wheat grains can be better presented. Generally, 2–5 pixels are required to distinguish a pattern with a length or width of 0.1 mm. Here, 3 pixels are selected for representation, and the single-pixel precision is approximately 0.034 mm. Since a 250 mm × 250 mm black hard plate is selected as the bottom plate in the material tray, the width of the field of view is set to 250 mm, and the camera resolution should reach 8K to meet the detection requirements. Therefore, the Hikvision MV-CL086-91CC series color CMOS linear array camera is selected, and the pixel calculation formula is as follows:
R = \frac{W_{FOV}}{P},
where R represents the required number of camera pixels; W_FOV represents the width of the field of view; and P represents the single-pixel accuracy.
The pixel size of the selected MV-CL086-91CC line-scan camera is 5 μm, and its resolution is 8192 pixels, which gives a magnification ratio, β, of 0.1638 according to the following formula:
\beta = \frac{R \times P_S}{W},
where β represents the magnification ratio; W represents the field width; and P_S represents the pixel size.
With a shooting distance of 160 mm and the magnification ratio of 0.1638 obtained above, the required focal length is 22.5 mm according to the focal length formula, so the MVL-LF2528M-F model lens is selected. The lens focal length is calculated as
f = \frac{\beta \times WD}{1 + \beta},
where f represents the focal length; β represents the magnification ratio; and WD represents the working distance.
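As a quick sanity check, the three sizing formulas above can be evaluated directly. The following is a minimal Python sketch: the function names are illustrative, and the numbers are the ones given in the text (a 250 mm field width, ~0.034 mm single-pixel accuracy, a 5 μm / 8192-pixel sensor, and a 160 mm working distance).

```python
# Minimal sketch of the sensor/optics sizing formulas from the text.
# Function names are illustrative; the numbers come from the paper.

def required_pixels(fov_width_mm: float, pixel_accuracy_mm: float) -> float:
    """R = W_FOV / P: pixels needed to cover the field of view."""
    return fov_width_mm / pixel_accuracy_mm

def magnification(resolution_px: int, pixel_size_mm: float, fov_width_mm: float) -> float:
    """beta = R * P_S / W: sensor width divided by field width."""
    return resolution_px * pixel_size_mm / fov_width_mm

def focal_length(beta: float, working_distance_mm: float) -> float:
    """f = beta * WD / (1 + beta)."""
    return beta * working_distance_mm / (1 + beta)

R = required_pixels(250, 0.034)         # ~7353 px, so an 8K sensor suffices
beta = magnification(8192, 0.005, 250)  # 8192 px * 5 um = 40.96 mm sensor width -> 0.1638
f = focal_length(beta, 160)             # ~22.5 mm
```

Running these confirms the paper's chain of reasoning: roughly 7353 pixels are needed across the 250 mm tray, an 8192-pixel sensor covers this, and the resulting magnification implies a 22.5 mm focal length.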
The visual inspection unit adopts a modular design. The dynamic adjustment of the camera and the light source is achieved using a lifting slide table and an adjustable fixture to meet the precision requirements of the application scenarios. Using the MVS software, parameters such as the exposure, bright and dark field correction, and frequency conversion of the trigger signal are optimized to improve the imaging quality. The KW-ELS375F-W strip light source and the KPD-5A1C-24V light source controller (Shanghai Kaiwei Optoelectronic Technology Co., Ltd., Shanghai, China) are selected as the light sources for image acquisition. For the motion module, the CCM-W40-10-L1100-5A double-rail synchronous belt conveying module is chosen. Its effective stroke is 850 mm, and the moving speed ranges from 0 to 1.5 m/s, which can meet the requirements for obtaining wheat images. The design of the visual inspection unit is shown in Figure 2a.
  • Electronic control unit
The overall layout of the electronic control unit is shown in Figure 2b, which consists of a power supply part, a motion control part, a servo control part, and a servo control terminal block part. The motion control scheme is shown in Figure 3. In the system, instructions are sent to the image acquisition card and the motion control card through the operation monitor’s host computer software. The motion control card is connected to the industrial computer via an Ethernet network interface, and the MODBUS_TCP communication protocol is adopted for data exchange. In accordance with the received instructions, the motion control card transmits signals to the light source controller to activate the bar light source. Additionally, it controls the servo driver by sending pulse signals, causing the servo motor to operate the double-rail synchronous belt module and facilitating the reciprocating motion of the material tray. Limit sensors are installed on the synchronous belt module, which perform positioning and limit control. The image acquisition card controls the camera to take images through the Camera Link interface to support subsequent image processing and analysis. During this process, the signals fed back by the motor encoder are transmitted to the servo controller to ensure the accuracy and stability of the motor operation. At the same time, the shaft encoder transmits the position information of the material tray to the industrial computer and saves it in the database such that, when the program is started next time, the accurate position of the material tray can be quickly obtained and restored. Through the coordinated operation of precise motion control and image acquisition, this system improves the automation level and operation efficiency of the production line and ensures the high precision and reliability of the production process.

2.2.2. Software Design

To enhance the efficiency of wheat grain image acquisition and simplify its operation process, upper-computer software supporting the detection system’s hardware was designed. The main functions of the software include real-time control of the detection system, visualizing the shooting results, adjusting and saving the motion control parameters, and setting the image detection parameters. The main interface of the software is shown in Figure 4, which includes a start button, a parameter adjustment interface, a camera calibration interface, and an ROI (Region of Interest) area selection interface.
  • Motion control interface
A motion control interface is developed for the ECI2608 motion control card (Shenzhen Zmotion Technology Co., Ltd., Shenzhen, China) and the servo control system, as shown in Figure 5. After the program is compiled and run, the motion control interface displays information such as the connection status, motion status, position, and speed of the servo, which facilitates the observation of relative displacement changes. Meanwhile, it also shows the status interface of input/output ports. When the system switches to the manual state, the motion control parameters on the interface can be adjusted, and the set operating parameters can be saved to the database. During the system operation, the database parameters set by the user will be called.
  • Camera calibration interface
When processing and analyzing images to obtain the actual parameters of wheat grains, a system is required to determine the actual value corresponding to each unit pixel in the collected image and then obtain the actual measured values of the phenotypic traits. Size calibration is necessary when the device is used for the first time or after the camera parameters are changed (the HG-150 honeycomb calibration plate is selected as the calibration plate, and the diameter of the circular dot is 5 mm). The user opens the calibration plate image taken by the MVS software in the camera calibration interface and checks whether the roundness and aspect ratio of the resulting representation meet the requirements (>0.95), as shown in Figure 6. In the camera calibration module, local images can be directly imported, and the pixel equivalent will be automatically calculated and displayed. At the same time, the data will be saved in the database. The resulting representation in the lower right corner allows the operator to observe whether the calibration image effect is feasible.
The steps to obtain the actual pixel value are as follows: (1) Perform preprocessing procedures such as threshold segmentation, region screening, and morphological processing on the calibration plate image through the Halcon software to obtain the feature regions of the circular dots. (2) Use the minimum bounding rectangle operator and the roundness operator to, respectively, obtain the mean values of the horizontal and vertical pixel lengths of the diameters of all the circular dots on the calibration plate, as well as the roundness values. Convert the unit into micrometers, and the ratio of the actual value of the diameter of the circular dot to the obtained number of pixels is the actual value of a single pixel. The specific formula is as follows:
P = \frac{D \times 1000}{2L},
where P represents the actual physical size of a single pixel (in μm); D represents the diameter of the calibration plate dots (in mm); and L represents half the side length of the minimum bounding rectangle (in pixels).
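The pixel-equivalent calculation can be sketched as follows. This is a hedged Python illustration, not the Halcon code used in the paper; the dot measurements are invented for the example, and averaging over all dots that pass the >0.95 roundness check mirrors the procedure described above.

```python
# Hedged sketch of the pixel-equivalent calculation (P = D*1000 / 2L).
# Not the paper's Halcon code; the dot measurements below are illustrative.

def pixel_equivalent_um(dot_diameter_mm: float, half_side_px: float) -> float:
    """Micrometres per pixel from one calibration dot: P = D*1000 / (2L)."""
    return dot_diameter_mm * 1000.0 / (2.0 * half_side_px)

def mean_pixel_equivalent(dot_diameter_mm: float, half_sides_px, roundness,
                          min_roundness: float = 0.95) -> float:
    """Average over all dots that pass the roundness check (>0.95 in the text)."""
    vals = [pixel_equivalent_um(dot_diameter_mm, h)
            for h, r in zip(half_sides_px, roundness) if r > min_roundness]
    return sum(vals) / len(vals)

# Example: 5 mm dots measuring 146 px and 148 px across (half sides 73 and 74)
p = mean_pixel_equivalent(5.0, [73.0, 74.0], [0.97, 0.99])  # ~34.0 um/px
```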
Since the camera has a certain degree of distortion, after adjusting its parameters to the maximum extent, it is necessary to determine the actual shooting effect. Therefore, pictures of the fixed plate are taken at different positions in the material tray, calibration is performed in the host computer software, and the quality of the results is judged according to the aspect ratio and roundness.
  • Region Selection Interface
To cover the shooting areas for materials of different sizes, design work on the ROI (Region of Interest) area selection interface is carried out. The required ROI area can be determined via frame selection within the display window, as shown in Figure 7.

2.2.3. System Workflow

The specific workflow of this detection system within one detection cycle is shown in Figure 8. The user starts the system, opens the host computer software, and logs into the account. After entering the main interface, the device first undergoes initialization, and the system automatically obtains the position of the shaft. If the shaft is not in the zero-return position, it will automatically return to zero (the position of the zero-limit sensor). Subsequently, the user clicks the start icon on the main interface and presses the start button at the same time. The synchronous belt-sliding table causes the material tray to start moving. When passing through the set scanning start position, the light source turns on, and the camera simultaneously starts to acquire images. This continues until the scanning end position is reached, at which point both the light source and the camera turn off. The synchronous belt-sliding table then returns to the starting position. Finally, the captured images are displayed on the main interface of the host computer software and saved to the specified folder path. The overall process is completed through coordination between threads, including the camera thread, the image acquisition thread, the motion control thread, and the operation monitoring thread.

2.3. Research on Image Processing Methods

Figure 9 shows the research process for wheat grain phenotypes designed in this study. It mainly includes image preprocessing, research on the method of obtaining phenotypic parameters, and conversion between pixel values and actual values in the camera coordinate system.

2.3.1. Research on Preprocessing Methods

The image preprocessing process is shown in Figure 10. The wheat images are processed using the Halcon machine vision software. First, the region of the color image is selected, and then, the color image is converted to grayscale. Subsequently, through image enhancement, feature extraction, and morphological processing, the influence of noise is reduced, and the original characteristic regions of the wheat grains are restored.
The device employs a color linear array camera that captures images containing a substantial amount of information. During the image preprocessing stage, however, much of this information is redundant. Therefore, following the processing flow shown in the figure above, the color image is first converted to a grayscale image to remove redundant information. The image is then smoothed to reduce the quality degradation caused by noise during acquisition. The image is also separated into its color channels, which allows the grayscale distribution at different levels to be observed intuitively through grayscale histograms. To better visualize the channel segmentation effects, the same threshold segmentation method is applied to each channel. The processing effect is shown in Figure 11. Finally, the method corresponding to the selected (framed) image is chosen for the preprocessing process.
When performing image enhancement, mean filtering, median filtering, Gaussian filtering, bilateral filtering, and guided filtering were adopted for effect comparison. When the operator parameters did not change significantly, guided filtering achieved relatively ideal noise removal. Moreover, after the image was converted into different channels, the gray histograms of each channel showed differences, as shown in Figure 11. The gray levels in the H channel were uniformly distributed, which was not conducive to feature extraction. The gray values of the background in the S channel were lower than those in other channels, making it convenient for segmenting the foreground and background. During the dynamic adjustment of the threshold segmentation values for different channels, when the low threshold was set to 23, the wheat area was well covered. Considering the integrity of the feature area and the impact of noise, the segmentation results for the S channel were superior. Consequently, this study conducted subsequent operations based on the S channel.
After the earlier processing steps, including grayscale conversion and image enhancement, the wheat regions should be extracted almost completely, with no serious holes or protrusions remaining, so only slight morphological processing is required. Applying opening and closing operations jointly to the wheat regions removes protrusions on the grain edges and fills holes inside the regions, restoring the original feature regions of the wheat grains as closely as possible. Simultaneously, to further eliminate noise and non-target objects, items smaller than preset criteria (based on predefined parameters such as area and major and minor axis lengths) are excluded. This process effectively screens out the wheat grains that meet the established requirements.

2.3.2. Research on the Extraction Method for Phenotypic Parameters

The phenotypic parameters of wheat grains are important reference indicators of their quality and morphological characteristics. The original images of wheat grains were obtained with the image acquisition device built in this study, and a method for extracting the wheat grain phenotypic parameters was designed based on the Halcon image processing software. Firstly, the characteristic regions of the wheat grains are extracted using the preprocessing method designed in the previous section, yielding a wheat grain mask, as shown in the final processing step in Figure 11. Secondly, since wheat grains are approximately elliptical, the pixel values of phenotypic parameters such as grain length and grain width can be obtained from the minimum circumscribed circle and the minimum circumscribed rectangle. The true values of the parameters are then obtained through the numerical conversion described in the camera calibration section. A schematic diagram of a wheat grain is shown in Figure 12.
On this basis, the bud points are located and determined. First, it is necessary to determine the coordinate position of the center point of the seed in the image. The seed contour is obtained through the preprocessing method. The position of the center point of the seed circle is determined according to the minimum circumscribed circle, which is the geometric center of the seed. Then, the size of the seed is determined through the best-fitting ellipse, which is subsequently used to calculate the direction and positioning of the bud points, as shown in Figure 12. The calculation formula based on the fitting ellipse is as follows:
r_1 = \sqrt{2\left(\mu_{2,0} + \mu_{0,2} + \sqrt{\left(\mu_{2,0} - \mu_{0,2}\right)^2 + 4\mu_{1,1}^2}\right)},
r_2 = \sqrt{2\left(\mu_{2,0} + \mu_{0,2} - \sqrt{\left(\mu_{2,0} - \mu_{0,2}\right)^2 + 4\mu_{1,1}^2}\right)},
\theta = \frac{1}{2}\arctan\frac{2\mu_{1,1}}{\mu_{0,2} - \mu_{2,0}},
where μ_{2,0} and μ_{0,2} reflect the grayscale variation of the image in the horizontal and vertical directions, respectively; μ_{1,1} reflects the joint distribution of the image grayscale in the two directions; r_1 and r_2 represent the major and minor semi-axes of the fitted ellipse, respectively; and θ represents the rotation angle of the fitted ellipse.
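The moment-based ellipse fit can be reproduced from a binary grain mask as follows. This is a NumPy sketch: the angle convention follows the paper's formula literally (with image rows treated as the vertical direction), and the moments are the normalized second-order central moments of the region.

```python
# NumPy sketch of the moment-based ellipse fit, following the formulas
# in the text (normalized second-order central moments of the region).
import numpy as np

def fit_ellipse_moments(mask: np.ndarray):
    """Return (r1, r2, theta): major/minor semi-axes and rotation angle."""
    ys, xs = np.nonzero(mask)
    xb, yb = xs.mean(), ys.mean()
    mu20 = ((xs - xb) ** 2).mean()         # horizontal-direction variation
    mu02 = ((ys - yb) ** 2).mean()         # vertical-direction variation
    mu11 = ((xs - xb) * (ys - yb)).mean()  # joint distribution term
    common = np.sqrt((mu20 - mu02) ** 2 + 4 * mu11 ** 2)
    r1 = np.sqrt(2 * (mu20 + mu02 + common))
    r2 = np.sqrt(2 * (mu20 + mu02 - common))
    # Angle per the paper's formula (assumes mu02 != mu20).
    theta = 0.5 * np.arctan(2 * mu11 / (mu02 - mu20))
    return r1, r2, theta
```

For a filled ellipse with semi-axes a and b, the normalized moments equal a²/4 and b²/4 along the principal directions, so the formulas recover r1 = a and r2 = b exactly.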
Based on the above, the bud point can be determined theoretically: the central coordinates of the wheat grain are determined through the minimum circumscribed circle, and the lengths of the major and minor axes and the remaining parameters are determined through the equivalent ellipse. As shown in Figure 12, point o is the center of the seed, and the length of the seed is 2r_1. The endpoint A of the endosperm is calculated by adding offsets in the x and y directions to the central coordinates of the seed, where the offset in the x direction is r_1 cos θ and the offset in the y direction is r_1 sin θ. During germination cultivation, however, the germination point does not lie at the endosperm endpoint A. Therefore, a coefficient, k, is introduced into the above formula, representing the relative distance from the central point to point A_1; when k = 1, the formula gives the coordinates of endpoint A. A suitable value of k is chosen so that kr_1 locates position A_1 of the germination point. The specific expressions are as follows:
A_x = o_x + kr_1 \cos\theta,
A_y = o_y - kr_1 \sin\theta,
Point A_1 is the germination point of the wheat grain in the two-dimensional image; the specific value of the coefficient k is determined through the experimental analysis in Section 2.4. Taking point A_1 as the foot of the perpendicular, a line perpendicular to Ao is drawn, intersecting the surface of the wheat grain at point A_2, the actual germination point on the grain surface. Because the grains are placed in random directions, the direction of the germination point may be misjudged if only the above steps are used. Therefore, a condition on the pixels in the endpoint area is added to determine the germination endpoint of the wheat grain.
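The bud-point coordinate formulas translate directly into code. In the sketch below, the value of k is illustrative only, since the paper determines it experimentally in Section 2.4:

```python
# Sketch of the bud-point coordinate formulas from the text.
import math

def bud_point(ox: float, oy: float, r1: float, theta: float, k: float):
    """A1 per the text: Ax = ox + k*r1*cos(theta), Ay = oy - k*r1*sin(theta).
    The y offset is subtracted because image rows increase downwards."""
    return ox + k * r1 * math.cos(theta), oy - k * r1 * math.sin(theta)

# k = 1 gives the endosperm endpoint A; the paper fixes k < 1 experimentally
# (the value below is purely illustrative).
ax, ay = bud_point(100.0, 100.0, 40.0, 0.0, 0.8)  # (132.0, 100.0)
```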

2.4. Verification Method and Indicator

A research experiment on the phenotypic parameters of wheat grains was conducted using the image acquisition device and image processing method. The specific details are as follows.
Experiment 1: We randomly selected 50 grains from each of the five types of wheat for image acquisition, resulting in 250 images. We compared and analyzed the phenotypic parameters (as shown in Figure 13) extracted by the Halcon software code with the results of manual measurement.
The parameter measurement error was used to evaluate the morphological parameters of intact wheat grains. The coefficient of determination (R²) and the root mean square error (RMSE) were selected as the evaluation indices. Additionally, the mean absolute error (MAE) and mean absolute percentage error (E_MAP) were incorporated for further analysis, and the calculation formulas are as follows:
R^2 = 1 - \frac{\sum_{i=1}^{n}\left(x_i - y_i\right)^2}{\sum_{i=1}^{n}\left(x_i - \bar{x}\right)^2},
where x_i reflects the true value of the phenotypic parameter of the wheat grain, y_i reflects the predicted value of the phenotypic parameter of the wheat grain, x̄ reflects the average of the true values of the phenotypic parameters of the wheat grains, and n reflects the number of wheat grains.
RMSE = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(y_i - \hat{y}_i\right)^2},
where y_i reflects the true value of the phenotypic parameter of the wheat grain, ŷ_i reflects the predicted value of the phenotypic parameter of the wheat grain, and n reflects the number of wheat grains.
MAE = \frac{1}{n}\sum_{i=1}^{n}\left|x_i - y_i\right|,
E_{MAP} = \frac{MAE}{\frac{1}{n}\sum_{i=1}^{n} x_i} \times 100\%,
where x_i reflects the true value of the phenotypic parameter of the wheat grain, y_i reflects the predicted value, and n reflects the number of wheat grains.
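The four evaluation indices can be computed together. The NumPy sketch below follows the definitions above, with x the manual (true) values and y the software-predicted values:

```python
# NumPy implementation of the four evaluation indices defined above.
import numpy as np

def metrics(x, y):
    """Return (R^2, RMSE, MAE, MAPE) for true values x and predictions y."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    ss_res = np.sum((x - y) ** 2)            # residual sum of squares
    ss_tot = np.sum((x - x.mean()) ** 2)     # total sum of squares
    r2 = 1.0 - ss_res / ss_tot
    rmse = np.sqrt(np.mean((x - y) ** 2))
    mae = np.mean(np.abs(x - y))
    mape = mae / np.mean(x) * 100.0          # E_MAP, in percent
    return r2, rmse, mae, mape
```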
The actual parameters were obtained by combining the real pixel values calibrated by the software. The weight-fitting formula was determined for these parameters, including grain length, grain width, and area, and a comparative experiment was conducted on grain length, grain width, and grain weight with the manual detection method (using a vernier caliper and an electronic scale). The manual detection equipment is shown in Figure 14.
Experiment 2: Based on the bud point determination method, 40 wheat grains of each of five varieties were randomly selected for image acquisition, resulting in 200 images. For the collected images, the bud-point determination code written in Halcon was used to determine the direction of the bud points. The algorithm determines the center coordinates of the wheat grains by identifying the morphological characteristics and edge information of the bud points, as well as their relative positional relationships with the overall shape, and then performs shape fitting. The direction of the bud points is determined by judging the grayscale means of the two ends. To verify the accuracy of the algorithm, the results of the code operation were manually judged. The wheat grains in the above experiment were germinated and cultivated, as shown in Figure 15. After soaking for 1 to 3 days, wheat grains with bud points that could be observed with the naked eye were photographed again. Problematic wheat grains during the germination process were removed, and 120 germination images were collected.
After observing the positions of the bud points, we manually marked the actual bud-point positions in the images in Halcon and compared them with the positions predicted by the software. Bud-point positioning accuracy was quantitatively evaluated by calculating indicators such as the distance error and angular error between the two (using the absolute mean error defined for the phenotypic parameters), as shown in Figure 16. The error data were statistically analyzed to study differences in bud-point positioning accuracy among the wheat grain varieties, as well as the factors affecting positioning accuracy, providing an experimental basis for improving bud-point positioning technology.
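The distance and angular errors between a predicted and a manually marked bud point, both measured relative to the grain center as described above, can be computed as in this sketch (the function name and coordinate convention are assumptions):

```python
import math

def bud_point_errors(center, predicted, actual):
    """Return (distance error, angular error in degrees) between a predicted
    and a manually marked bud point. The angular error is the angle between
    the two rays drawn from the grain center to each point."""
    dist_err = math.dist(predicted, actual)
    a_pred = math.atan2(predicted[1] - center[1], predicted[0] - center[0])
    a_act = math.atan2(actual[1] - center[1], actual[0] - center[0])
    diff = math.degrees(a_pred - a_act)
    # Wrap the difference into (-180, 180] so opposite rays give 180, not 540
    diff = (diff + 180.0) % 360.0 - 180.0
    return dist_err, diff
```

Averaging these two values per variety yields the mean distance and angular differences reported in Table 3.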

3. Results

Using the verification method and analytical indicators described in Section 2.4, the fitting results shown in Figure 17 were obtained. Comparing the software-based measurements with the manual measurements, the determination coefficients ($R^2$) for both grain length and grain width were above 0.9, and only the root mean square error of xm35 exceeded 0.1. Among the determination coefficients for grain weight, the fits for hm20 and nhm237 were relatively poor, while the other three varieties reached above 0.8. The root mean square errors for grain weight ranged from 2.3 to 3.9, larger than those for grain length and width. Over the pooled results, the $R^2$ values for grain length, width, and weight were 0.986, 0.9505, and 0.7635, respectively, with corresponding root mean square errors of 0.0887, 0.0541, and 3.8329.
To clearly present the comparison results, the mean errors and average error percentages for wheat grain length, width, and weight are summarized in Table 1. For grain length, the highest mean error was observed in xm35, reaching 0.0914 mm (an average error of 1.392%), while nhm237 exhibited the lowest mean error of 0.051 mm (0.675%). The overall mean error for grain length across all varieties was 0.066 mm, with an average error of 0.932%. For grain width, hm44 exhibited the largest mean error at 0.050 mm (1.508%), whereas nhm237 had the smallest mean error of 0.036 mm (1.098%); the overall mean grain width error was 0.043 mm (1.254%). For grain weight, nhm237 showed the highest mean error of 3.24 mg (7.606%), while xm35 showed the lowest at 1.67 mg (3.831%); the overall mean error for grain weight was 3.15 mg (7.142%). Collectively, the error patterns vary among varieties: nhm237 exhibited relatively low errors in grain length and width but a comparatively high error in grain weight, whereas xm35 showed larger errors in grain length and width but a smaller error in grain weight. The varieties hm20, hm44, and nhm432 showed moderate error levels across all measured parameters. These results indicate that the wheat grain data acquisition method proposed in this study is highly consistent with manual measurements and can reliably meet the accuracy requirements of wheat variety testing.
To determine the direction of the bud point, shape, endpoint roundness, texture, and pixel grayscale values were experimentally analyzed in the early image processing code. During the experiment, several grains were atypical in shape, endpoint roundness, or the texture features at the two ends. By comparison, determination based on the pixel mean value achieved the highest accuracy. Table 2 shows that the overall accuracy of determining the bud-point direction across the wheat varieties based on the pixel mean value is 97.5%, demonstrating the feasibility of the method. Further experimental analysis of bud-point positioning was therefore carried out on this basis.
Based on the bud-point direction determination, the bud-point coefficient k was determined using images of the photographed wheat grains. A small subset of grains was selected for germination cultivation, and images were captured both before and after germination. The actual germination-point positions were manually selected in the software based on the post-germination images. By traversing candidate values against the semi-major axis r1 of the fitted ellipse and the distance d from the center point to the marked germination point, we found that setting k to 0.72 provides an optimal fit for the wheat germination-point locations, as illustrated in Figure 16. To further validate the accuracy of the coefficient k, images of germinated grains were matched with their corresponding ungerminated seeds. The actual germination-point positions were manually marked in Halcon, and the deviations between these points and the germination points predicted after applying the correction coefficient were calculated. Specifically, both angular and distance differences were computed by connecting the annotated and predicted germination points to the center point of each wheat grain. The entire experimental processing workflow is detailed in Figure 14. The resulting deviations were compiled and analyzed in Excel, and the average angular and distance differences for each category were calculated. These summarized results are presented in Table 3.
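The traversal used to select k and the resulting bud-point prediction can be sketched as follows; the sample data, the grid of candidate k values, and the function names are illustrative assumptions:

```python
import numpy as np

def predicted_bud_point(center, r1, angle_rad, k=0.72):
    """Predicted bud point: k times the semi-major axis r1 away from the
    ellipse center along the bud-end direction (k = 0.72 per the traversal)."""
    return (center[0] + k * r1 * np.sin(angle_rad),
            center[1] + k * r1 * np.cos(angle_rad))

def best_k(samples, ks=np.arange(0.5, 1.001, 0.01)):
    """Traverse candidate k values and return the one minimizing the mean
    distance to the manually marked germination points.
    samples: iterable of (center, r1, angle_rad, marked_point)."""
    def mean_err(k):
        errs = []
        for center, r1, ang, marked in samples:
            p = predicted_bud_point(center, r1, ang, k)
            errs.append(np.hypot(p[0] - marked[0], p[1] - marked[1]))
        return np.mean(errs)
    return min(ks, key=mean_err)
```

Once k is fixed, only the fitted ellipse (center, semi-major axis, orientation) and the direction decision are needed to predict each grain's germination point.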
During image processing, the irregular shape of certain wheat grains can cause slight deviations in the calculated bud-point position within localized areas. While the bud point determined through ellipse fitting may exhibit minor positional errors, these deviations have minimal impact on the accuracy of bud-point angle calculations. As shown in Table 3, using the bud-point direction determination method proposed in Section 2.3.2, only two misjudgments occurred out of 120 grains, corresponding to an accuracy rate of 98.33%. The coefficient k was established during the research and validated through traversal analysis in Halcon. The data indicate that with k set to 0.72, the mean positional error between the predicted and actual bud points across the five wheat varieties remains within 0.10 mm, and the mean angular error between the predicted and actual bud-point directions is confined within ±3°. Altogether, the experimental results confirm that this method reliably and accurately localizes wheat grain bud points. The high accuracy and low error margins indicate its suitability for precise phenotypic analysis and related applications in wheat breeding and quality assessment.

4. Discussion

In this study, traditional image processing techniques, including morphological operations and geometric analysis, were employed instead of more recent deep learning approaches. Although deep learning methods have demonstrated good performance in image feature extraction [35], they typically require extensive labeled data during dataset construction and substantial computational resources. Conversely, traditional image processing methods [36,37,38] are more robust and reliable when working with small sample sizes and limited datasets. Moreover, images acquired using linear array cameras possess higher spatial resolution and generate larger data volumes than images captured by conventional area-array cameras or mobile devices, and therefore contain rich phenotypic information. While this enhances the detail and quality of the data, it also significantly increases the complexity and time required for dataset construction and preprocessing. Furthermore, deep learning models in agricultural applications often present interpretability challenges, particularly when applied across crop grains with variable morphology [39]. Traditional image processing methods, in contrast, are computationally less intensive, easier to tune, and more transparent in operation when dealing with moderately complex image features. This makes them particularly advantageous in agricultural vision systems where the target objects appear against relatively stable and uniform backgrounds. For example, Jia Weiding [24], Bi Kun [27], Li [28], and Li Jinqiong [40] have demonstrated the effectiveness of traditional image processing techniques for the rapid and accurate extraction of grain parameters.
Considering factors such as data accessibility, model interpretability, and computational resource constraints, traditional image processing methods offer a more practical and suitable approach for the objectives and experimental conditions of this study.
By combining a linear array camera with image processing technology, we achieved the rapid and accurate detection of the key phenotypic parameters of wheat grains. The proposed image processing method shows excellent agreement with manual measurements, yielding a determination coefficient of 0.986 for grain length and 0.9505 for grain width. Furthermore, the root mean square errors for these measurements are below 0.1, indicating a high level of precision in quantifying the geometric dimensions of wheat grains. Previous studies have demonstrated the robustness and applicability of traditional image processing techniques across various crops. For instance, Jia Weiding [24] enhanced wheat feature extraction through adaptive thresholding, Zhang Xiaoyu [26] used similar techniques for maize kernel detection, Liu Hao et al. [41] achieved a classification accuracy of over 0.90 for king oyster mushrooms, and Bi Kun et al. [27] extracted wheat ear features with an appearance recognition accuracy of over 0.88. These studies collectively validate the reliability and effectiveness of image processing techniques in agricultural phenotyping. Moreover, compared with deep learning-based phenotypic extraction methods, such as the ImCascade R-CNN model developed by Pan Weiting et al. [42], our method performed well: the coefficients of determination for grain length and width improved by 0.051 and 0.129, respectively. This shows that well-designed image processing methods can outperform complex deep learning models, especially when data and computing resources are limited, and confirms the feasibility of combining high-precision linear array cameras with image processing technology to accurately obtain crop phenotypic parameters.
However, grain weight estimation may be influenced by factors such as variations in grain density under different growth environments, genetic differences between varieties, and internal structural characteristics, including the depth of the ventral groove. The determination coefficient and root mean square error for grain weight were 0.7635 and 3.8329, respectively, reflecting weaker agreement than was observed for grain length and grain width. Nonetheless, the method still captures grain weight characteristics to a meaningful extent, providing valuable phenotypic information despite this inherent variability.
Whereas the phenotypic parameters directly describe grain morphology, the determination of wheat grain bud points is mainly applied to oriented sowing and seed vigor detection. For wheat seeds sown with oriented bud points, the root biomass is more developed, the root distribution is more uniform, the base of the stem is thicker and stronger, and the risk of lodging is reduced. The contact surface between the grain and the soil is also more stable, which can reduce displacement caused by rainfall or irrigation and ensure population uniformity. The bud-point determination method proposed in this study achieves an accuracy rate of 97.5%, the mean distance error between the predicted and actual germination points is within 0.10 mm, and the mean angular error is within ±3°. Accurately determining bud points is crucial for advancing wheat breeding and screening, as it provides key phenotypic reference data that support selection and genetic improvement. The methodology presented in this study for germ-point identification is theoretically supported by previous studies by Zhang Ruoyu et al. [43] and Wu Xudong [31], who extensively investigated grain orientation and its impact on phenotype characterization. Their findings validate the reliability and precision of germ-point detection techniques, highlighting their significance for enhancing trait analysis in wheat breeding programs. If bud-point information is applied to a seed-suction actuator for precise orientation adjustment, aligning the bud points of multiple adsorbed grains in the same direction and placing them directly into the holes would help improve the accuracy of directional seeding devices.
The five wheat varieties provided by the institute represent a relatively limited sample and may not fully capture the range of characteristics present across all varieties. Furthermore, the image acquisition device used in this study is currently restricted to controlled laboratory settings, limiting its applicability in the field or under variable environmental conditions. When developing the next-generation device, the complex environments of actual application scenarios, such as illumination conditions, temperature, and humidity, must be considered, as they may affect image quality and detection results.

5. Conclusions

With the development of modern agriculture, the requirements for precision and efficiency in breeding and screening work are increasing. In this study, an image acquisition device was built based on a linear array camera, and wheat grain images were processed using the Halcon software, thus acquiring grain phenotypic parameters and detecting seed viability. The following conclusions were drawn:
(1) In terms of phenotypic parameter measurement, compared with manual measurements, the extraction algorithm proposed in this study achieved a mean absolute error, mean absolute percentage error, determination coefficient, and root mean square error for grain length of 0.066 mm, 0.932%, 0.986, and 0.0887, respectively; for grain width, these values were 0.043 mm, 1.254%, 0.9505, and 0.0541; and for grain weight, 3.15 mg, 7.142%, 0.7635, and 3.8329. Moreover, the detection speed is significantly better than that of manual measurement. Although the error performance varies between varieties, the results are generally highly consistent with manual measurement and can meet the needs of wheat variety evaluation.
(2) In the determination of bud points for seed viability detection, the direction determination accuracy reached 97.5%. In terms of positioning accuracy, with the coefficient k determined, the mean distance error between the detected and actual bud points was within 0.10 mm, and the mean angular error was within ±3°. Although the irregular shape of wheat grains causes small deviations in bud-point position, this has little impact on the angle calculation. Among 120 grains, only 2 were misjudged, giving an accuracy rate of 98.33%.
This study provides an effective method for rapidly screening wheat grains and detecting seed viability, contributes to the development of the seed industry, and improves the efficiency and accuracy of seed quality detection.

Author Contributions

Conceptualization, K.D. and W.Z.; methodology, K.D.; software, K.D.; validation, K.D. and X.L.; formal analysis, C.S.; investigation, K.D.; resources, W.Z.; data curation, K.D.; writing—original draft preparation, K.D.; writing—review and editing, W.Z.; visualization, K.Y.; supervision, W.Z.; project administration, W.Z.; funding acquisition, W.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Key Laboratory Project of the Ministry of Education on Modern Agricultural Equipment and Technology (MAET202322) and the Youth Fund Project of the National Natural Science Foundation of China (61901194).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data provided in this study can be made available upon request to the corresponding author. As the wheat photos obtained in this study are still needed for subsequent research, only some will be provided as a reference (https://gitee.com/dkw98/wheat98.git (28 April 2025)). If necessary, you can contact the author to obtain the complete dataset (zwj0410@foxmail.com (W.Z.)).

Conflicts of Interest

The authors declare no conflicts of interest.

References

1. Wang, J.S. Development Direction and Optimization Strategies of Conventional Wheat Breeding Methods. Prim. Agric. Technol. Ext. 2024, 12, 85–87.
2. Zhang, X.T. Study on the Correlation between Wheat Breeding Technology and the Yield of High-quality Wheat. Heilongjiang Grain 2024, 08, 67–69.
3. Zareef, M.; Arslan, M.; Hassan, M.M.; Ahmad, W.; Ali, S.; Li, H.; Ouyang, Q.; Wu, X.; Hashim, M.M.; Chen, Q. Recent Advances in Assessing Qualitative and Quantitative Aspects of Cereals Using Nondestructive Techniques: A Review. Trends Food Sci. Technol. 2021, 116, 815–828.
4. Wang, Y.H. Research on the Traceability System of Seed Quality in China. Ph.D. Thesis, Chinese Academy of Agricultural Sciences, Beijing, China, 2016.
5. Liu, S. The Value Space of the Seed Industry. Beijing Agric. 2012, 14, 6–7.
6. Chen, J.; Lian, Y.; Zou, R.; Zhang, S.; Ning, X.B.; Han, M.N. Monitoring Method of Rice Grain Crushing Rate Based on Machine Vision Technology. Agric. Eng. Technol. 2020, 40, 94.
7. Gao, L.; Yang, J.; Song, S.; Xu, K.; Liu, H.; Zhang, S.; Yang, X.; Zhao, Y. Genome-Wide Association Study of Grain Morphology in Wheat. Euphytica 2021, 217, 170.
8. Lu, W.C.; Luo, B.; Pan, D.Y.; Zhao, Y.; Yu, C.H.; Wang, C. Synchronous Measurement of Wheat Spike Length and Number of Spikelets Based on Image Processing. J. Chin. Agric. Mech. 2016, 37, 210–215.
9. Lin, Z.Q.; Mu, S.M.; Shi, A.J.; Sun, X.X.; Li, L. Application of Support Vector Machine Based on Spark in Wheat Disease Image Recognition. J. Henan Agric. Sci. 2017, 46, 148–153.
10. Paliwal, J.; Visen, N.; Jayas, D. Evaluation of Neural Network Architectures for Cereal Grain Classification Using Morphological Features. J. Agric. Eng. Res. 2001, 79, 361–370.
11. Utku, H. Application of the Feature Selection Method to Discriminate Digitized Wheat Varieties. J. Food Eng. 2000, 46, 211–216.
12. He, S.M.; He, Z.H. Research on the Classification of Wheat Varieties Based on Image Recognition. Sci. Agric. Sin. 2005, 09, 1869–1875.
13. Meng, X.; Wang, K.J.; Han, X.Z. Identification of Wheat Varieties Based on Improved BP Network. Guizhou Agric. Sci. 2017, 45, 156–160.
14. Zhang, Y.R.; Chen, S.S.; Zhou, X.Q.; Wang, W.Y.; Wu, Q.; Wang, H.R. Research on the Recognition Method of Imperfect Grains of Wheat Based on Image Processing and Neural Network. Sci. Technol. Cereals Oils Foods 2014, 22, 59–63.
15. Chen, J.; Lian, Y.; Zou, R.; Zhang, S.; Ning, X.; Han, M. Real-Time Grain Breakage Sensing for Rice Combine Harvesters Using Machine Vision Technology. Int. J. Agric. Biol. Eng. 2020, 13, 194–199.
16. Wang, G.Q. Research on the Application of Image Processing Technology in Wheat Yield Prediction. Master’s Thesis, China Agricultural University, Beijing, China, 2004.
17. Zhang, Q.; Chen, Q.; Xu, W.; Xu, L.; Lu, E. Prediction of Feed Quantity for Wheat Combine Harvester Based on Improved YOLOv5s and Weight of Single Wheat Plant without Stubble. Agriculture 2024, 14, 1251.
18. Xing, J.; Symons, S.; Shahin, M.; Hatcher, D. Detection of Sprout Damage in Canada Western Red Spring Wheat with Multiple Wavebands Using Visible/Near-Infrared Hyperspectral Imaging. Biosyst. Eng. 2010, 106, 188–194.
19. Zhang, T.T.; Xiang, Y.Y.; Yang, L.M.; Wang, J.H.; Sun, Q. Study on the Screening Method of Characteristic Bands for Non-destructive Detection of the Viability of Single Wheat Seeds Using Hyperspectral Technology. Spectrosc. Spectr. Anal. 2019, 39, 1556–1562.
20. Manickavasagan, A.; Sathya, G.; Jayas, D.S.; White, N.D.G. Wheat Class Identification Using Monochrome Images. J. Cereal Sci. 2008, 47, 518–527.
21. Genaev, M.A.; Komyshev, E.G.; Smirnov, N.V.; Kruchinina, Y.V.; Goncharov, N.P.; Afonnikov, D.A. Morphometry of the Wheat Spike by Analyzing 2D Images. Agronomy 2019, 9, 390.
22. Majumdar, S. Classification of Cereal Grains Using Machine Vision. Ph.D. Thesis, The University of Manitoba, Winnipeg, MB, Canada, 1997.
23. Gao, H.; Zhen, T.; Li, Z. Detection of Wheat Unsound Kernels Based on Improved ResNet. IEEE Access 2022, 10, 20092–20101.
24. Jia, W.D. Construction of a Wheat Seed Measurement System Based on Image Processing Technology. Master’s Thesis, Shanxi Agricultural University, Jinzhong, China, 2022.
25. Zhu, J.S. Research on the Detection Technology of Wheat Appearance Quality Based on Machine Vision. Master’s Thesis, Jiangsu University, Zhenjiang, China, 2021.
26. Zhang, X.Y.; Yang, J.Z. Application Research of Digital Image Processing in Maize Germination Test. J. Maize Sci. 2005, 2, 109–111.
27. Bi, K.; Jiang, P.; Li, L.; Shi, B.Y.; Wang, C. Non-destructive Measurement of Wheat Ear Morphological Characteristics Based on Morphological Image Processing. Trans. Chin. Soc. Agric. Eng. 2010, 26, 212–216.
28. Li, Y.; Du, S.; Zhong, H.; Chen, Y.; Liu, Y.; He, R.; Ding, Q. A Grain Number Counting Method Based on Image Characteristic Parameters of Wheat Spikes. Agriculture 2024, 14, 982.
29. Zhu, W.; Feng, Z.; Dai, S.; Zhang, P.; Wei, X. Using UAV Multispectral Remote Sensing with Appropriate Spatial Resolution and Machine Learning to Monitor Wheat Scab. Agriculture 2022, 12, 1785.
30. Wang, Q.; Qin, W.; Liu, M.; Zhao, J.; Zhu, Q.; Yin, Y. Semantic Segmentation Model-Based Boundary Line Recognition Method for Wheat Harvesting. Agriculture 2024, 14, 1846.
31. Wu, X.D. Research on the Technical Method of Wheat Seed Vigor Detection Based on Machine Vision Technology. Master’s Thesis, North University of China, Taiyuan, China, 2022.
32. Yan, N. Research on the Development and Method of a Wheat Seed Variety Detection and Sorting Device Based on Machine Vision. Master’s Thesis, North University of China, Taiyuan, China, 2022.
33. Wang, S.; Brolly, R.; Carpenter, D.A.; DeJager, A.; Doran, J.E.; Fabinski, R.P.; Frank, T.; Kather, R.; Kosman, S.; Lum, A.; et al. 43- and 50-Mp High-Performance Interline CCD Image Sensors. IEEE Trans. Electron Devices 2019, 66, 1329–1337.
34. Cheng, Z.P. Research on Image Quality Improvement Based on Linear Array Camera. Master’s Thesis, Tianjin University of Commerce, Tianjin, China, 2022.
35. LeCun, Y.; Bengio, Y.; Hinton, G. Deep Learning. Nature 2015, 521, 436–444.
36. Alibabaei, K.; Gaspar, P.D.; Lima, T.M.; Campos, R.M.; Girão, I.; Monteiro, J.; Lopes, C.M. A Review of the Challenges of Using Deep Learning Algorithms to Support Decision-Making in Agricultural Activities. Remote Sens. 2022, 14, 638.
37. Albahar, M. A Survey on Deep Learning and Its Impact on Agriculture: Challenges and Opportunities. Agriculture 2023, 13, 540.
38. Kamilaris, A.; Prenafeta-Boldú, F.X. Deep Learning in Agriculture: A Survey. Comput. Electron. Agric. 2018, 147, 70–90.
39. Samek, W.; Wiegand, T.; Müller, K.-R. Explainable Artificial Intelligence: Understanding, Visualizing and Interpreting Deep Learning Models. arXiv 2017, arXiv:1708.08296.
40. Li, J.Q. Research on the Detection Device for the Percentage of Cracked Grains in Paddy Rice Based on Visual Recognition Technology. Master’s Thesis, Jiamusi University, Jiamusi, China, 2023.
41. Liu, H.; Lin, X.H.; Zhu, Y.N.; Zhou, Z.; Wang, M.; Chen, X.Y. Design of Appearance Quality Grading System for Pleurotus eryngii Based on Machine Vision. Food Mach. 2023, 39, 105–111.
42. Pan, W.T.; Sun, M.L.; Yuan, Y.; Liu, P. A Method for Wheat Grain Phenotype Identification Based on Deep Learning ImCascade R-CNN. Smart Agric. 2023, 5, 110–120.
43. Zhang, R.Y.; Ge, Y.Y.; Chen, D.; Chen, T.E.; Jiang, K. Geometric Feature Recognition Method of White Seed Pumpkin Seeds Based on Machine Vision. J. Chin. Agric. Mech. 2024, 45, 175–180.
Figure 1. Wheat grain images obtained using the line-scanning device.
Figure 2. Wheat grain image acquisition and detection system. (a) Schematic diagram of the wheat grain image acquisition and detection system: 1—industrial computer; 2—motion control card; 3—servo drive; 4—light source controller; 5—encoder; 6—servo motor; 7—limit sensor; 8—double-track timing belt; 9—bar light source; 10—feeder tray; 11—line-scan camera. (b) Physical view of the wheat grain image acquisition and detection system.
Figure 3. Control unit flow chart.
Figure 4. Main interface of wheat grain online detection.
Figure 5. Motion control parameter interface.
Figure 6. Camera calibration interface.
Figure 7. ROI setting interface.
Figure 8. Wheat grain image acquisition process.
Figure 9. Flow chart of wheat image processing.
Figure 10. Image preprocessing steps.
Figure 11. Comparison of image processing effects.
Figure 12. Schematic diagram of a wheat grain.
Figure 13. The Halcon software outputs phenotypic parameter results.
Figure 14. Manual testing equipment: (a) Vernier calipers; (b) electronic scales.
Figure 15. Wheat grain germination process.
Figure 16. Precise positioning of the germination point.
Figure 17. Comparison of software and manual measurements and related linear fitting: (a) hm20 scatter plot; (b) hm44 scatter plot; (c) nhm237 scatter plot; (d) nhm432 scatter plot; (e) xm35 scatter plot; (f) summary of scatter plots.
Table 1. Error analysis of wheat grain shape parameters.
Category | Mean Grain Length Error (mm) | Average Grain Length Error (%) | Mean Grain Width Error (mm) | Average Grain Width Error (%) | Mean Grain Weight Error (mg) | Average Grain Weight Error (%)
hm20 | 0.063 | 0.796 | 0.039 | | 2.42 |
hm44 | 0.065 | 0.988 | 0.050 | 1.508 | 2.24 |
nhm237 | 0.051 | 0.675 | 0.036 | 1.098 | 3.24 | 7.606
nhm432 | 0.058 | 0.808 | 0.043 | | 2.08 |
xm35 | 0.091 | 1.392 | 0.045 | | 1.67 | 3.831
all | 0.066 | 0.932 | 0.043 | 1.254 | 3.15 | 7.142
Table 2. Wheat bud-point direction determination test.
Category | Number of Grains (pcs) | Correctly Identified (pcs) | Accuracy (%)
hm20 | 40 | 38 | 95
hm44 | 40 | 38 | 95
nhm237 | 40 | 40 | 100
nhm432 | 40 | 40 | 100
xm35 | 40 | 39 | 97.5
all | 200 | 195 | 97.5
Table 3. Wheat bud point validation test.
Category | Mean Distance of Detected Bud Points (mm) | Mean Distance of Actual Bud Points (mm) | Mean Distance Difference (mm) | Mean Angular Difference (°) | Misjudged Grains (pcs)
hm20 | 2.559 | 2.528 | 0.031 | 1.15 | 2
hm44 | 2.099 | 2.052 | 0.047 | 0.09 | 0
nhm237 | 2.272 | 2.284 | 0.012 | 0.48 | 0
nhm432 | 2.188 | 2.168 | 0.020 | 1.73 | 0
xm35 | 2.261 | 2.162 | 0.099 | 2.73 | 0
all | | | 0.042 | 1.24 | 2