1. Introduction
Fruit size is a key characteristic in the breeding of small-fruit tree varieties (e.g., cherries, blueberries, apricots, and plums), as it directly influences market value and consumer demand—breeders have long prioritized developing large-fruit varieties for these species. To accurately evaluate fruit size, key parameters such as single-fruit weight, fruit length, and fruit diameter are measured; these parameters not only reflect fruit volume but also provide insights into appearance and quality. Currently, the measurement of longitudinal and transverse diameters of small fruits relies primarily on manual methods. Given the short shelf life of small fruits and the large volume of breeding materials, rapid measurements of numerous samples are essential. However, manual measurement is time consuming and labor intensive, and results are prone to significant uncertainty due to human error (e.g., inconsistent identification of the equatorial plane). Additionally, the diverse shapes and sizes of small fruits further reduce measurement efficiency, making manual methods inadequate for the high-throughput demands of modern agricultural breeding.
To address these limitations, there is an urgent need for an efficient, accurate automated measurement alternative. Machine vision technology, which enables precise size measurement via image capture, has been widely applied in breeding and phenotyping, supporting high-throughput analysis of fruits [1,2,3], seeds [4,5,6,7,8,9,10], and plant health [11,12]. Despite this progress, no specialized image acquisition and size measurement equipment has been developed for small fruits (e.g., cherries with stems, powdery blueberries, or wrinkled walnuts). Existing machine vision-based fruit measurement systems either depend on complex deep learning frameworks (requiring large annotated datasets for training [6,13,14]) or are limited to single fruit species [15,16], exhibiting poor compatibility with small/irregular fruits [1,3]. Moreover, open-source tools (e.g., Python + OpenCV) demand advanced programming skills for debugging [4,17], while commercial platforms (e.g., Keyence IM8000) are costly and lack customization for small-fruit traits.
This study proposes a non-contact, high-precision fruit size measurement system based on machine vision, integrated via LabVIEW. Compared with traditional manual methods, this system significantly improves measurement efficiency, reduces time costs, and minimizes result fluctuation. Its core innovations address gaps in existing solutions: (1) modular integration to lower operational barriers for breeders without programming experience; (2) an optimized image processing pipeline for high-precision contour extraction of small fruits; and (3) lightweight adaptation to multiple fruit species without hardware reconfiguration. The following sections detail the system’s design, implementation, and validation.
2. Materials and Methods
2.1. Materials and Procedure
2.1.1. Sample Preparation
All experimental samples were provided by the Shandong Institute of Pomology to ensure a consistent genetic background and uniform growth conditions.
Three small-to-medium fruit species with distinct morphological characteristics were selected, with 15 independent specimens analysed per species:
The cherry (‘Tieton’ variety) samples were uniformly mature, free from mechanical damage, and exhibited transverse diameters of 20–23 mm, corresponding to the Medium grade defined in GB/T 26906-2024 [18] (Sweet Cherry). This species was used as the primary model to demonstrate the measurement workflow, with the same methodological framework applied subsequently to the other fruits.
The blueberry (‘Northland’ variety) samples were harvested at commercial maturity and retained intact powdery bloom coatings, which posed challenges for edge detection owing to their blurred contour boundaries.
The walnut (‘Xiangling’ variety) samples were naturally dried and presented characteristically wrinkled surfaces, selected to assess the system’s performance when analysing irregular or non-smooth contours.
All samples were acclimated to the experimental environment for 2 h before testing to avoid size fluctuations caused by temperature- or humidity-induced shrinkage or expansion; preliminary tests confirmed this period reduced diameter variation to less than 0.05 mm, which is negligible for the measurements.
2.1.2. Experimental Equipment
Imaging device: A 5-megapixel color industrial camera (Hikvision Digital Technology Co., Ltd., Hangzhou, China; Effective Pixels: 2592 (H) × 1944 (V); Optical Format: 1/2.5″; Pixel Size: 2.2 µm × 2.2 µm; Sensitivity: 1.4 V/lux-sec at 550 nm; Dynamic Range: >70.1 dB; Signal-to-Noise Ratio: 38.1 dB) was used for high-resolution fruit image capture without perspective distortion. Similar camera configurations have been successfully applied in measuring morphological features of radiata pine seedlings [19], confirming the suitability of this hardware selection for plant phenotyping applications.
Lighting system: An adjustable LED backlight (Keyence Corporation, Osaka, Japan; wavelength range: 550–650 nm) with uniform intensity output, set to 1000 ± 50 lux, was used to enhance fruit-background contrast and minimize shadow interference.
Software platform: LabVIEW 2021 (National Instruments, Austin, TX, USA) integrated with the Vision Development Module 2021 and Machine Vision Module 2021 was used, enabling automated image acquisition, processing, and data analysis via reusable LabVIEW sub Virtual Instruments (sub-VIs), specifically the flat-field correction sub-VI, calibration sub-VI, and pattern matching and coordinate system establishment sub-VI.
Reference tools: A standard digital caliper (Mitutoyo Corporation, Kawasaki, Japan; model: CD-6CS, precision: ±0.01 mm) was used for gold-standard validation. A 10 mm first-grade standard gauge block (Mahr GmbH, Göttingen, Germany; accuracy: ±0.001 mm) was used for caliper calibration, and a standard ruler (Shanghai Measuring & Cutting Tool Works Co., Ltd., Shanghai, China; accuracy: ±0.01 mm) was used for pixel-to-real-size conversion.
2.1.3. Standardized Measurement Procedures
To ensure methodological rigour, the manual, standard, and system measurement procedures were harmonised according to the following three protocols: For manual measurement, three operators with basic training—representing typical laboratory conditions—measured five cherry fruits once, resulting in a total of fifteen replicates per species. No prior calliper calibration was performed, reflecting common on-site practices. For the gold-standard measurement, two experienced operators, each with at least three years of expertise in fruit measurement, independently measured every fruit three times, yielding six replicates per fruit. The mean of these six replicates was used as the reference value for subsequent accuracy assessment. For the system measurement, each fruit was imaged and analysed three times under fixed camera positioning, without moving the sample, in order to quantify random errors attributable to the image acquisition process, such as minor lighting variations.
Prior to size calculation, all samples underwent preliminary image processing, including flat-field correction, grayscale conversion, morphological processing, and calibration, as detailed in subsequent sections, to ensure data accuracy.
2.2. Preliminary Image Processing
After capturing the cherry images, grayscale processing was applied to reduce computational complexity. The original RGB images contained three colour channels (R, G, and B) with redundant chromatic information that increased processing time. Conversion to grayscale reduced the data to a single channel representing brightness only, thereby accelerating subsequent calculations. To ensure accurate grayscale rendering, the weighted average method was applied, assigning coefficients according to human visual sensitivity to colour (0.299 for R, 0.587 for G, and 0.114 for B). This weighting preserved the perceived luminance and key brightness details consistent with human vision. The specific calculation formula for grayscale conversion is provided in
Supplementary Equation (S1). Owing to lighting differences (e.g., edge darkening from the backlight), image brightness was non-uniform, which could introduce errors in contour detection. To address this issue, flat-field correction was performed in two sequential steps. In the PRNU correction (bright-field correction), a reference image (denoted F) was captured under uniform illumination to record the camera’s response to consistent light intensity. In the DSNU correction (dark-field correction), a noise image (denoted D) was captured in complete darkness to compensate for camera-specific dark current noise.
The correction process first normalizes the raw image (R) by subtracting dark-field noise and dividing by the bright-field reference (to eliminate lighting unevenness), then uses a magnification factor (calculated from the mean of F–D) to adjust the normalized image back to the 0–255 grayscale range. Detailed formulas for the normalization and magnification factor are provided in
Supplementary Equations (S2)–(S4).
The visual effects of grayscale conversion and flat-field correction are shown in
Figure 1, where (a) is the original RGB image, (b) is the grayscale image after weighted average processing, (c) is the image before flat-field correction (with obvious edge darkening), and (d) is the image after correction (with uniform brightness).
After correction, the standard deviation of grayscale values decreased from 25.14 (before correction) to 15.02 (after correction), as shown in
Figure 2. This reduction indicates grayscale values became more concentrated, image brightness uniformity improved, and lighting interference was significantly minimized.
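The two preprocessing steps described above (weighted grayscale conversion and two-step flat-field correction) can be sketched in a few lines of NumPy. This is an illustrative stand-in for the LabVIEW flat-field correction sub-VI, not the actual implementation; the variable names (`raw`, `dark`, `flat`) are our own:

```python
import numpy as np

def to_grayscale(rgb):
    """Weighted-average grayscale conversion; the coefficients follow human
    luminance sensitivity (0.299 R + 0.587 G + 0.114 B)."""
    return rgb @ np.array([0.299, 0.587, 0.114])

def flat_field_correct(raw, dark, flat):
    """Two-step flat-field correction: subtract the dark-field image (DSNU),
    normalize by the bright-field reference (PRNU), then rescale by the
    magnification factor m = mean(flat - dark) back toward the 0-255 range."""
    m = np.mean(flat - dark)
    corrected = (raw - dark) * m / (flat - dark)
    return np.clip(corrected, 0, 255)

# Synthetic demonstration: a uniform scene viewed through an uneven gain
# field (mimicking edge darkening from the backlight)
gain = np.linspace(0.5, 1.0, 100)[None, :] * np.ones((100, 1))
dark = np.zeros((100, 100))
flat = 255.0 * gain          # reference image under uniform illumination
raw = 100.0 * gain           # uniform scene, darkened toward one edge
fixed = flat_field_correct(raw, dark, flat)
# After correction the grayscale spread collapses, mirroring the reported
# drop in standard deviation from 25.14 to 15.02 in the real images.
```

On this toy input the corrected image is perfectly uniform; on real images the correction only reduces, rather than eliminates, the brightness spread.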
2.3. Edge Contour Detection
To identify the most suitable edge detection algorithm for this experiment, the commonly used algorithms Canny, Sobel, and Roberts were tested to determine the optimal choice. A circular image was used as the original image. To simulate an actual detection environment, a 5 × 5 convolution kernel was applied to create a blurred version. Contours were then extracted from the blurred images using the Canny, Sobel, and Roberts algorithms. The maximum contour distance in eight directions was measured for each contour-extracted image, and the results are shown in
Figure 3 and
Table 1.
The measurement results indicated that, for the original image, the maximum contour distance ranged between 379.22 and 380.36 pixels across the eight directions. After blurring, however, the edges became difficult to determine, and the contours extracted using the Canny, Sobel, and Roberts algorithms deviated from the true contour to varying degrees. To quantify this deviation, we calculated the absolute error of each algorithm in the eight detection directions, together with the mean absolute error (MAE) (Table 1): the Canny algorithm showed the lowest MAE (2.00 pixels), 12.3% lower than Sobel (2.28) and 20.2% lower than Roberts (2.49), and outperformed the other two algorithms in 6 of the 8 directions. Because the Canny contours matched the actual contour most closely, providing the most precise measurement results, the Canny algorithm was selected for contour extraction in this study.
Figure 4 further confirms this result: (a) and (b) show the original and blurred circular images (simulating real fruit edge ambiguity), while (c)–(e) present the contour extraction effects of Canny, Sobel, and Roberts algorithms, respectively. It is clear that the Canny algorithm retains the most complete and accurate contour, with minimal deviation from the original circular shape, whereas Sobel and Roberts algorithms produce fragmented or offset edges. (f) and (g) additionally show the Canny processing results for actual cherry images, verifying its applicability to real samples.
To facilitate subsequent stem separation and remove unnecessary curves after applying the Canny algorithm, the Otsu algorithm was applied to automatically threshold and binarize the image, thereby enhancing the speed of image processing.
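The automatic-threshold step can be illustrated with a pure-NumPy sketch of the Otsu criterion (maximizing between-class variance over all candidate thresholds). This is a conceptual stand-in for the NI Vision thresholding call, with our own function name:

```python
import numpy as np

def otsu_threshold(gray):
    """Return the intensity threshold that maximizes between-class variance
    (Otsu's method). `gray` holds 8-bit intensity values."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    total = hist.sum()
    cum = np.cumsum(hist)                        # class-0 pixel counts
    cum_mean = np.cumsum(hist * np.arange(256))  # class-0 intensity sums
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        n0, n1 = cum[t - 1], total - cum[t - 1]
        if n0 == 0 or n1 == 0:
            continue  # all pixels in one class: no separation
        mu0 = cum_mean[t - 1] / n0
        mu1 = (cum_mean[-1] - cum_mean[t - 1]) / n1
        var = (n0 / total) * (n1 / total) * (mu0 - mu1) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t

# Bimodal test data: dark background (~40) and bright fruit (~210)
img = np.concatenate([np.full(500, 40), np.full(300, 210)])
t = otsu_threshold(img)
binary = img > t   # binarized result for the subsequent morphological steps
```

On a bimodal histogram like this one, the returned threshold falls between the two modes, cleanly separating fruit from background.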
2.4. Morphological Processing
To refine the binarized image and separate the cherry fruit from stems/noises, four key morphological processing operations were used—each tailored to address specific image artifacts.
The mathematical expressions used for these operations are provided in
Supplementary Equations (S5)–(S8), and their specific roles in this study are summarised below. In the dilation operation, bright regions such as cherry contours are expanded by growing the edges of white pixels. This process connects small gaps in the fruit contour caused by image noise and ensures contour continuity, which is essential for accurate diameter measurement. In the erosion operation, bright regions are reduced by trimming edge pixels. This step removes small noise artefacts, such as dust particles, and smooths irregular protrusions on the cherry contour, thereby preventing overestimation of fruit size. In the closing operation, which consists of dilation followed by erosion, small holes within the cherry contour—such as dark spots arising from uneven illumination—are first filled during dilation, after which erosion restores the contour to its approximate original size. This results in a complete, smooth binarised image of the entire fruit (including stem), as illustrated in
Figure 5b,c. In the opening operation, which consists of erosion followed by dilation, narrow connections between the fruit and stem are first removed by erosion, since the stem is thinner and therefore entirely eliminated at this stage. Dilation then restores the fruit contour to its original dimensions, effectively isolating the fruit from the stem.
The specific processing workflow involved three sequential operations. First, the closing operation was applied to the binarized image to fill contour holes and obtain a complete binary image of the cherry including its stem. Next, the opening operation was applied to the same binarized image to remove the stem and generate a binary image containing only the fruit. Finally, the “fruit-only” image was subtracted from the “cherry + stem” image to produce a “stem-only” binary image (
Figure 5d,e), thus achieving complete separation of the stem from the fruit.
This workflow ensured that the final “fruit-only” image was free from holes, noise, and stem interference, providing a clean target for transverse diameter measurement in accordance with the GB/T 26906-2024 Sweet Cherry standard, which specifies that diameter measurements be taken on the fruit body excluding the stem.
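The closing/opening/subtraction sequence can be demonstrated with SciPy's binary morphology on a toy binarized image (a disk standing in for the fruit body, a one-pixel-wide line for the stem). This is a schematic sketch, not the NI Vision implementation:

```python
import numpy as np
from scipy import ndimage

# Toy binarized image: a disk (fruit) with a thin attached line (stem)
yy, xx = np.mgrid[0:40, 0:40]
fruit = (yy - 25) ** 2 + (xx - 20) ** 2 <= 100   # radius-10 disk
stem = (xx == 20) & (yy >= 5) & (yy < 16)        # 1-px-wide stem touching it
img = fruit | stem

# Closing (dilation then erosion) fills small contour holes: fruit + stem
closed = ndimage.binary_closing(img)
# Opening (erosion then dilation) deletes the thin stem entirely,
# because erosion removes the 1-px line before dilation restores the fruit
opened = ndimage.binary_opening(img)
# Subtracting the fruit-only image from the fruit+stem image isolates the stem
stem_only = closed & ~opened
```

After opening, only the fruit body remains, and the subtraction recovers a stem-only mask, matching the three-step workflow described above.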
Cherry size was measured using a binary image of the fruit. According to the newly revised GB/T 26906-2024 “Sweet Cherry” standard, the size specifications of the cherries were classified based on the transverse diameter (d) of the fruit, as outlined in the table below (
Table 2). Therefore, it is necessary to measure the transverse diameter of the cherry.
2.5. Pattern Matching and Coordinate System Establishment
Before performing size measurements, it is essential to determine the position and orientation of the cherry fruit in the 2D plane and to establish a coordinate system at that location so that the transverse diameter can be measured. This is achieved with a geometric matching algorithm that identifies position and orientation by learning the geometric features of a provided template and the spatial relationships between those features; together, these features form a set describing the image, against which the detected image is matched. In this study, an edge-based geometric matching algorithm was applied: gradient values were calculated at edge points along the contours found in the image, and matching was then performed from the template center using these gradient values and point positions. Because the cherry image had already been segmented from the background, the image was free from environmental interference. To increase robustness, the minimum matching score can be lowered through the algorithm's parameters, allowing greater tolerance of variation in cherry shape. The matching results are shown in
Figure 6.
Figure 6 shows that cherries of varying shapes, positions, and angles were accurately identified, and the corresponding coordinate systems were successfully established.
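The study uses NI Vision's edge-based geometric matching; as a simplified, conceptually related illustration, the sketch below implements exhaustive normalized cross-correlation (NCC) template matching in NumPy, with a tunable minimum score analogous to lowering the matching score for shape tolerance. The function name and score threshold are our own assumptions:

```python
import numpy as np

def match_template(image, template, min_score=0.7):
    """Exhaustive normalized cross-correlation template matching.
    Returns (best_score, (row, col)), or (None, None) if no window reaches
    min_score. Lowering min_score tolerates more shape variation."""
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    best_score, best_pos = -1.0, None
    for r in range(image.shape[0] - th + 1):
        for c in range(image.shape[1] - tw + 1):
            w = image[r:r + th, c:c + tw]
            wz = w - w.mean()
            denom = np.sqrt((wz ** 2).sum()) * t_norm
            score = (wz * t).sum() / denom if denom > 0 else 0.0
            if score > best_score:
                best_score, best_pos = score, (r, c)
    if best_score < min_score:
        return None, None
    return best_score, best_pos

# Plant a distinctive 5x5 template into a flat background and recover it
template = np.arange(25, dtype=float).reshape(5, 5)
scene = np.zeros((20, 20))
scene[7:12, 3:8] = template
score, pos = match_template(scene, template)
```

Unlike NCC, the edge-based geometric approach used in the actual system matches on contour gradients, which makes it robust to the uniform backgrounds produced by segmentation.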
2.6. Calibration
After establishing the coordinate system, calibration was required. Typically, the imaging model of the camera follows the pinhole imaging method, as shown in
Figure 7. This figure shows quadrilateral ABCD projected to its image A’B’C’D’ via the pinhole.
Calibration was necessary to convert the measured pixel distance from the image into the corresponding real-world fruit dimension. As a distortion-free industrial camera was used in this study, the calibration addressed only the spatial coordinate transformation, with no need for distortion correction. This approach is consistent with established calibration methods used in agricultural product measurement [20,21]. The calibration process converts information across three coordinate systems: the real-fruit coordinate system, the camera coordinate system, and the image (or pixel) coordinate system. The real-fruit coordinate system (Xw, Yw, Zw) represents the actual three-dimensional space of the cherry, for example, with the fruit centre positioned at Xw = 0, Yw = 0, Zw = 30 cm, corresponding to the fixed camera-to-object distance. The camera coordinate system (Xc, Yc, Zc) describes the three-dimensional space relative to the camera lens and is obtained from the real-fruit coordinates through the application of a rotation matrix R and a translation vector T. The image (or pixel) coordinate system (x, y or u, v) defines the two-dimensional position of the fruit on the camera sensor and is derived from the camera coordinates using the focal length f and the sensor pixel dimensions dx and dy. Detailed formulas for coordinate transformation (e.g., real fruit→camera, camera→pixel) are provided in
Supplementary Equations (S9)–(S14). The key output of calibration is the pixel equivalent (P): the real fruit distance corresponding to 1 pixel in the image (unit: mm/pixel).
To convert pixel distance in images to actual fruit size, the pixel equivalent (key for this conversion) was calibrated using a standard ruler (accuracy: ±0.01 mm, 1 mm intervals)—consistent with the fixed camera-sample distance (30 cm,
Section 2.1.3) to avoid perspective distortion. The calibration procedure consisted of three sequential operations. First, a ruler was positioned at the same height as the cherry, corresponding to a distance of 30 cm from the camera, and its image was captured. Next, the number of pixels (x) representing a known ruler length (X = 10 mm) was measured using the LabVIEW program. Finally, the pixel-to-length conversion factor was determined as P = X/x = 10 mm/500 pixels = 0.02 mm per pixel, indicating that 500 pixels in the image corresponded to 10 mm in real distance.
This pixel equivalent (P = 0.02 mm/pixel) was used to convert all image measurements to real size: if the transverse diameter of the cherry extends 1080 pixels in the image, the actual diameter is 1080 pixels × 0.02 mm/pixel = 21.6 mm. The proportional relationship between real distance and pixel distance, as well as the pixel equivalent formula, are provided in
Supplementary Equations (S15) and (S16).
To verify calibration accuracy and repeatability, five repeated measurements of the standard ruler were performed under fixed conditions (30 cm camera distance, 1000 ± 50 lux lighting). The pixel equivalent ranged from 19.8 to 20.2 µm per pixel, with a standard deviation of 0.15 µm per pixel, confirming that the calibration was repeatable and reliable. After calibration, the system determined the cherry’s transverse diameter by identifying the two farthest points on the fruit contour through edge detection, counting the pixels between them, and multiplying by the pixel equivalent to obtain the actual diameter.
Calibration repeatability over short-term use was further verified through five independent measurements performed on different days using the same standard ruler. The pixel equivalent remained consistent (mean ± SD: 20.0 ± 0.15 µm per pixel, CV = 0.75%), indicating negligible calibration drift (<1 µm for a 21.6 mm cherry). Error propagation analysis indicated a total measurement error of 320 µm, comprising calibration error (51%), imaging error (40%), and algorithm error (9%). After calibration, the cherry size was measured by defining a region of interest (ROI) and establishing its relationship with the coordinate system, enabling direct measurement of the cherry’s transverse diameter.
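The conversion from contour pixels to millimetres can be sketched as follows. The brute-force farthest-pair search is an illustrative equivalent of "identifying the two farthest points on the fruit contour", not the LabVIEW code itself, and the function name is our own:

```python
import numpy as np

PIXEL_EQUIV = 10.0 / 500.0   # mm per pixel: the 10 mm ruler spans 500 pixels

def transverse_diameter_mm(contour_pts, pixel_equiv=PIXEL_EQUIV):
    """Distance between the farthest pair of contour points, converted to mm.
    O(n^2) brute force is acceptable for a single-fruit contour."""
    pts = np.asarray(contour_pts, dtype=float)
    d2 = ((pts[:, None, :] - pts[None, :, :]) ** 2).sum(axis=-1)
    return float(np.sqrt(d2.max()) * pixel_equiv)

# A circular contour spanning 1080 pixels should measure 21.6 mm,
# matching the worked example in the text
theta = np.linspace(0, 2 * np.pi, 720, endpoint=False)
contour = np.stack([540 * np.cos(theta), 540 * np.sin(theta)], axis=1)
diameter = transverse_diameter_mm(contour)
```

For much denser contours, a convex-hull rotating-calipers search would replace the quadratic scan, but the principle (farthest pair × pixel equivalent) is the same.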
2.7. Measurement System Interface and Related Programs
The system was programmed using LabVIEW, incorporating Machine and Vision modules for machine vision algorithm processing. In addition to basic image acquisition and morphological processing, including the Canny algorithm, the program featured three sub-VIs that encapsulated the flat-field correction module, calibration module, and pattern matching and coordinate system establishment module.
The detailed program flowchart of the flat-field correction module (including steps for PRNU/DSNU correction and grayscale normalization) is provided in
Supplementary Figure S2, which illustrates how this module removes lighting interference and improves image uniformity.
The calibration module enables spatial coordinate transformation (real fruit→camera→pixel). Its detailed program flowchart (including steps for loading calibrated images and applying pixel equivalent conversion) is provided in
Supplementary Figure S3. The pattern-matching module identifies fruit position/orientation and establishes a coordinate system. Its detailed program flowchart (including parameter setting for geometric matching and ROI definition) is provided in
Supplementary Figure S4.
To clarify the logical connection between the aforementioned sub-modules, the cherry-size measurement system follows a sequential overall workflow. The process begins with image acquisition, where RGB images of cherries are captured using a distortion-free industrial camera. The images are then pre-processed through grayscale conversion, flat-field correction to reduce lighting interference, Canny edge detection, and morphological operations to separate the fruit from the stem. Calibration follows, converting pixel distances to real fruit dimensions using the pixel equivalent (0.02 mm per pixel) obtained from a standard ruler. Pattern matching is then used to locate the fruit, determine its orientation, and establish a coordinate system defining the measurement region of interest. Finally, the transverse diameter of the fruit contour is measured, and the result is output. This integrated workflow ensures that each module operates logically and cohesively within the overall measurement process.
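The sequential workflow above can be condensed into a compact end-to-end sketch, using NumPy/SciPy stand-ins for the LabVIEW modules and a synthetic bright disk in place of a calibrated, pre-processed fruit image:

```python
import numpy as np
from scipy import ndimage

PIXEL_EQUIV = 0.02   # mm per pixel, from the ruler calibration

def measure_transverse_diameter(gray, thresh=128, pixel_equiv=PIXEL_EQUIV):
    binary = gray > thresh                            # segment fruit from backlight
    binary = ndimage.binary_opening(binary)           # strip noise / thin stem
    edge = binary & ~ndimage.binary_erosion(binary)   # one-pixel contour
    pts = np.argwhere(edge).astype(float)
    d2 = ((pts[:, None] - pts[None, :]) ** 2).sum(axis=-1)
    return float(np.sqrt(d2.max()) * pixel_equiv)     # farthest pair -> mm

# Synthetic "fruit": a bright disk of radius 50 px on a dark background,
# so the expected diameter is 100 px * 0.02 mm/px = 2.0 mm
yy, xx = np.mgrid[0:200, 0:200]
gray = np.where((yy - 100) ** 2 + (xx - 100) ** 2 <= 50 ** 2, 200.0, 30.0)
result = measure_transverse_diameter(gray)
```

The fixed threshold here replaces the Otsu and flat-field stages, which are unnecessary for a noise-free synthetic image; in the real system those stages precede segmentation.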
Figure 8 shows the size measurement device’s physical setup and the LabVIEW system’s main interface with real-time transverse diameter display.
3. Results and Discussion
3.1. Preliminary Performance Validation: System vs. Manual Measurement
To validate the measurement performance of the proposed system, 15 “Tieton” cherries were selected as test samples. Manual measurement (traditional method control) and system measurement were conducted for each cherry (1 replicate per method per cherry) to compare precision and stability.
Table 3 presents the 15 sets of paired measurement data, along with key statistical indicators (mean, standard deviation [SD], coefficient of variation [CV], average absolute error, and average relative error).
As shown in
Table 3, two key findings confirm the equivalent performance between the system and manual measurements, supporting the system’s feasibility as a manual replacement. The system showed excellent consistency with manual measurements: the average absolute error between the two methods was only 0.054 mm, with an average relative error of 0.171%, well below the ±1 mm tolerance specified in GB/T 26906-2024 Sweet Cherry. This minimal deviation means the system’s results align closely with traditional manual methods, providing a reliable basis for applying it in workflows requiring consistency with conventional operations.
In terms of measurement repeatability, the system’s performance was comparable to manual measurements: the standard deviation (SD) of system measurements (1.67 mm) was nearly identical to that of manual measurements (1.69 mm), and the system’s coefficient of variation (CV = 5.12%) was marginally lower than the manual CV (5.18%). This difference falls within the range of random measurement fluctuation. Notably, the system eliminates subjective random errors in manual operations—such as inconsistent identification of the equatorial plane by operators or uncalibrated measuring tools—helping maintain stable, reproducible results even in high-throughput breeding scenarios (e.g., batch measurement of thousands of samples), where manual errors tend to accumulate.
3.2. Accuracy Verification via Standard Measurement
To eliminate interference from manual operation errors (e.g., uncalibrated tools, subjective equatorial plane judgment) identified in
Section 3.1 and verify the system’s intrinsic precision, we compared its measurements against a gold standard using the same 15 ‘Tieton’ cherries. This approach avoids sample variability and isolates the system’s inherent measurement error against a reliable reference.
3.2.1. Measurement Protocol
The gold-standard measurement, used for determining the true value, used a pre-calibrated Mitutoyo CD-6CS digital calliper. The instrument had been calibrated with a 10 mm first-grade gauge block, resulting in an error of less than 5 μm. Two experienced operators, each with a minimum of three years’ practice in dimensional measurement, measured every cherry three times, giving six replicates per specimen. The mean of these six measurements was taken as the true transverse diameter (Dtrue) to minimise operator bias.
The system measurement was performed under the same experimental conditions described in
Section 3.1, with a fixed camera height of 30 cm, LED backlight intensity of 1000 ± 50 lux, and a pixel equivalent of 0.02 mm per pixel. Each cherry was imaged and analysed three times, and the mean of these replicates was used as the system’s final value (Dsystem) to ensure consistency.
3.2.2. Results and Analysis
The gold-standard measurement (Dtrue) had a mean value of 32.63 mm, and the system measurement (Dsystem) averaged 32.61 mm. The difference of 0.02 mm indicates excellent consistency between the two datasets. The system achieved an average absolute error of 0.25 mm, an average relative error of 0.8%, and a root mean square error (RMSE) of 0.32 mm—all well within the ±1 mm tolerance specified in GB/T 26906-2024 Sweet Cherry, ensuring no misclassification of cherry grades (e.g., “Medium” [24–28 mm] versus “Large” [28–30 mm]) in breeding applications.
Statistical analysis further confirmed the system’s accuracy. A paired t-test yielded a p-value of 0.88 (above 0.05), showing no statistically significant difference between Dsystem and Dtrue, thus excluding systematic errors from hardware (e.g., camera distortion) or software (e.g., edge-detection algorithms). The coefficient of determination (R2 = 0.99) demonstrated that 99% of the variation in Dsystem was explained by Dtrue, confirming a strong linear relationship between the two measurements.
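The agreement statistics reported here (RMSE, R2, paired t-test) can be reproduced with SciPy. The sketch below uses made-up paired diameters, not the study's data:

```python
import numpy as np
from scipy import stats

def agreement_stats(system, reference):
    """RMSE, coefficient of determination, and paired t-test p-value
    between system measurements and a gold-standard reference."""
    s = np.asarray(system, dtype=float)
    r = np.asarray(reference, dtype=float)
    rmse = float(np.sqrt(np.mean((s - r) ** 2)))
    r2 = float(np.corrcoef(s, r)[0, 1] ** 2)
    p = float(stats.ttest_rel(s, r).pvalue)
    return rmse, r2, p

# Illustrative (fabricated) paired transverse diameters in mm
reference = [30.0, 31.0, 32.0, 33.0, 34.0, 35.0]
system = [30.1, 30.9, 32.05, 32.95, 34.1, 34.9]
rmse, r2, p = agreement_stats(system, reference)
# p > 0.05 -> no significant systematic bias; r2 near 1 -> strong linearity
```

A p-value above 0.05 in the paired test indicates no detectable systematic offset between the two methods, which is the criterion applied throughout this section.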
Regarding measurement repeatability, the standard deviation of Dtrue was 1.85 mm, while that of Dsystem was 1.82 mm. The slightly lower variability of the system indicates a reduced influence of human judgement, such as minor inconsistencies in identifying the equatorial plane during manual measurements.
3.3. Multi-Fruit Applicability Validation
To assess the system’s compatibility with small-to-medium fruits of varying morphology (e.g., smooth cherries, powdery-coated blueberries, and wrinkled walnuts), 15 samples of each fruit type were analysed. Manual measurements (MMV, reference) and system measurements (SMV) were performed under identical conditions (camera-to-object distance 30 cm, backlight intensity 1000 ± 50 lux). Statistical indicators, including the mean ± standard deviation (SD), root mean square error (RMSE), coefficient of determination (R2), 95% confidence interval (95% CI), and paired t-test (α = 0.05), were used to evaluate agreement between methods (Table 4).
For cherries with smooth and regular contours, system measurements closely matched the reference values. The mean of system measurements (32.61 ± 1.82 mm) was almost identical to that of manual measurements (32.63 ± 1.85 mm), with an RMSE of 0.32 mm and R2 = 0.99, indicating an excellent linear correlation. For blueberries, where the powdery surface often causes contour blurring, the system maintained high accuracy, achieving the lowest RMSE (0.29 mm) and an R2 of 0.99, demonstrating effective compensation for coating-related edge interference. For walnuts, despite their irregular, wrinkled surfaces, the system also performed well (RMSE = 0.34 mm, R2 = 0.98), confirming reliable contour recognition even for uneven fruit geometries. Across all fruit types, the 95% CIs of manual and system measurements overlapped substantially, and paired t-tests returned p-values > 0.05, confirming no statistically significant differences between the two methods.
The system’s adaptability to diverse fruit morphologies was achieved through two lightweight parameter adjustments requiring no hardware changes: (i) adaptive calibration of the pixel equivalent (0.02 mm/pixel for cherries and 0.018 mm/pixel for blueberries) to account for fruit size differences, and (ii) adjustment of the template matching score to 0.7 to improve tolerance to shape variability, such as walnut wrinkles. In terms of efficiency, the system required an average of only 0.4 s per fruit—six times faster than manual measurement (2.4 s)—a clear advantage for high-throughput breeding workflows involving thousands of samples.
Overall, the system achieved consistently low RMSE values (<0.35 mm), high R2 (>0.98), and no significant statistical bias across all fruit types, confirming strong accuracy and robustness. These results fully validate the system’s multi-fruit compatibility and suitability for practical breeding applications. The agreement between gold-standard and system measurements is visually illustrated in the Youden plot (Figure 9), where all data points cluster tightly around the ideal y = x line, in line with the statistical results (p > 0.05, R2 > 0.98).
Our system’s performance aligns with and improves on existing studies. Its efficiency (0.4 s per fruit) exceeds that of Neupane et al. [3]’s orchard systems (0.8–1.2 s per fruit), and it is easier to operate than Blasco et al. [17]’s open-source tools. In accuracy, it outperforms Bortolotti et al. [21]’s apple system (RMSE 0.32 vs. 0.41 mm) and resolves Miranda et al. [14]’s small-fruit contour issue. For multi-fruit use, it avoids Neupane et al. [16]’s single-species limitation, and its walnut RMSE (0.34 mm) is lower than Jana et al. [1]’s 0.52 mm, confirming its advantages over prior systems.
4. Conclusions
This study addressed the inherent limitations of manual size measurements for small-to-medium fruits, such as high labour demand, low throughput, and operator-induced variability, which restrict large-scale breeding and germplasm evaluation. To meet these challenges, we developed a LabVIEW-integrated machine-vision system for non-contact, high-precision fruit measurement, incorporating three core innovations that close existing technological gaps.
Experimental validation confirmed the system’s reliability and robustness. Tests on three fruit species (cherry, blueberry, and walnut) conducted under controlled conditions demonstrated excellent agreement with gold-standard measurements. No significant statistical differences were observed between the system and reference results, and overall measurement uncertainty remained minimal. The system maintained stable accuracy under varying lighting and distance conditions and achieved an average measurement time of only 0.4 s per fruit, making it several times faster than manual methods and ideally suited for high-throughput phenotyping.
The selected species represented distinct morphological challenges: stem interference in cherries, powdery coatings in blueberries, and surface irregularity in walnuts. The system’s consistently low measurement error and high correlation with reference data confirm its capability to handle such variability effectively.
Nevertheless, some limitations persist. The current validation covered a limited number of cultivars and maturity stages, which constrains generalisability. Moreover, the laboratory-based configuration lacks portability and is not yet optimised for field deployment, where fluctuating illumination, fruit motion, and varying sampling distances complicate on-site measurements.
Future work will expand validation to multiple cultivars and maturity stages and improve instance segmentation for clustered fruits such as grapes. Field applicability will be enhanced through hardware miniaturisation, integrating a Raspberry Pi for edge computing, a lightweight 12 V LED backlight for portable operation, and a rapid calibration module based on a 100 mm reference ruler. Incorporating GNSS functionality will also enable mapping of fruit-size distribution across orchards. These developments aim to balance measurement precision with portability, bridging the gap between laboratory validation and field application.