Article

HALF: Histogram of Angles in Linked Features for 3D Point Cloud Data Segmentation of Plants for Robust Sensing

by Hidenori Takauji 1,*, Naofumi Wada 2, Shun’ichi Kaneko 3 and Takanari Tanabata 4
1 Department of Electronics and Information Engineering, Faculty of Engineering, Hokkai-Gakuen University, Sapporo 064-0926, Japan
2 Department of Information Science and Technology, Faculty of Information Science and Technology, Hokkaido University of Science, Sapporo 006-8585, Japan
3 Graduate School of Information Science and Technology, Hokkaido University, Sapporo 064-0814, Japan
4 Facility for Genome Informatics, Kazusa DNA Research Institute, Chiba 292-0818, Japan
* Author to whom correspondence should be addressed.
Sensors 2025, 25(12), 3659; https://doi.org/10.3390/s25123659
Submission received: 28 April 2025 / Revised: 28 May 2025 / Accepted: 7 June 2025 / Published: 11 June 2025

Abstract: This paper presents a novel method, Histogram of Angles in Linked Features (HALF), designed for the segmentation of 3D point cloud data of plants for robust sensing. The proposed method leverages local angular features extracted from 3D measurements obtained via sensing technologies such as laser scanning, LiDAR, or photogrammetry. HALF enables efficient identification of plant structures—leaves, stems, and knots—without requiring large-scale labeled datasets, making it highly suitable for applications in plant phenotyping and structural analysis. To enhance robustness and interpretability, we extend HALF to a convolution-based mathematical framework and introduce the Sequential Competitive Segmentation Algorithm (SCSA) for phytomer-level classification. Experimental results using 3D point cloud data of soybean plants demonstrate the feasibility of our method in sensor-based plant monitoring systems. By providing a low-cost and efficient approach for plant structure analysis, HALF contributes to the advancement of sensor-driven plant phenotyping and precision agriculture.

1. Introduction

Plants survive in their environment by adapting to it. The knowledge obtained from analyzing gene functions related to environmental adaptation has been useful for breeding species that can adjust to their agricultural production locations and for improving species’ resistance to climate change [1]. To analyze gene functions, it is essential to study various genotypes under diverse growing conditions, clarify variations in individual growth, and quantify these differences using large-scale datasets. Recently, several studies have focused on the geometric analysis of plant parts, such as leaves and stems, and their structures in relation to size and shape [2,3].
Three-dimensional (3D) shapes, represented for instance by 3D point cloud data, are fundamentally important because of their generality and the wide availability of sensing technologies. With the rapid advancement of sensing technologies such as 3D laser scanning, LiDAR, and multi-view photogrammetry, large-scale 3D point cloud data acquisition has become increasingly feasible in plant science. These sensor-based measurements play a crucial role in plant phenotyping, enabling non-destructive and high-throughput analysis of plant structures.
However, effectively segmenting such 3D data into meaningful plant organs remains a significant challenge, particularly when the need for manually labeled training data must be minimized. Several machine learning methods, such as deep convolutional neural networks (DCNNs), pursue objectives similar to ours but require a substantial amount of labeled training data. Our application targets the analysis of living plants and requires continuously growing databases covering numerous individuals, so the cost of preparing such labels is a major obstacle; it is therefore preferable to avoid complicated processing wherever possible. For this reason, the novel method presented in this paper may be promising for reducing these costs. We develop this discussion in depth in the following section.
We designed a fundamental and practical randomized algorithm, named HALF (Histogram of Angles in Linked Features), which transforms 3D point cloud data (PCD) of plants into a labeled version using statistical features from an angular histogram based on a simple top-down definition of plant parts [4]. The algorithm is expected to be useful for the aforementioned purpose of phenotype analysis by examining the outer shape of plants. In this paper, we first extend the basic tool to a theoretical version formulated as a mathematical convolution operation. We then design a classification algorithm based on HALF analysis for the complete segmentation of the PCD of individual plants to facilitate phytomer extraction. Finally, we confirm the effectiveness of HALF through evaluation experiments using soybean PCD.

2. Related Works

Research on the segmentation of 3D point cloud data has been widely applied to the analysis and classification of plant structures. Conventional segmentation methods include approaches based on clustering and feature descriptors, which have been used extensively to analyze 3D point cloud data. For example, clustering-based methods can group data based on specific criteria, enabling the identification of plant geometries. However, clustering alone faces limitations in accurately distinguishing detailed differences in shapes and parts when applied to complex plant structures, thereby restricting segmentation accuracy.
Methods that combine handcrafted features and machine learning have also been utilized for plant segmentation using 3D point cloud data. Paulus et al. [5] proposed a method for classifying plant stems and leaves using PFH (Point Feature Histogram) [6] and FPFH (Fast Point Feature Histogram) [7] as feature descriptors. Similarly, Ziamtsov et al. [8] achieved high accuracy in distinguishing leaves and stems of tomatoes and tobacco by employing PFH, FPFH, SHOT (Signature of Histograms of Orientations) [9], and various machine learning methods. While these handcrafted features are effective for identifying basic plant structures, challenges remain regarding the preparation of labeled data and the high computational cost involved.
In recent years, semantic segmentation methods based on deep learning have advanced and are now being applied to the analysis of 3D point cloud data. Representative methods include PointNet [10] and PointNet++ [11], which directly extract features from input data and enable high-precision identification of plant organs such as leaves and stems. Patel et al. [12] and Luo et al. [13] applied PointNet and PointNet++ to plant phenotyping, achieving efficient organ-level segmentation. Additionally, Otobe et al. [14] successfully employed PointNet++ to separate leaves in the point cloud data of bell peppers, facilitating phenotype extraction.
Other methods have also been proposed. For instance, Boogaard et al. [15] introduced class-dependent sampling to address class imbalance, aiming to improve segmentation accuracy. Ghahremani et al. [16] combined a k-nearest neighbor algorithm with a deep learning model to identify the complex structures of wheat. Turgut et al. [17] used PointNet and PointNet++ to segment the organs of 3D models of roses, leveraging deep learning to identify plant organs. More recently, Xie et al. [18] applied Transformer-based methods such as Stratified Transformer and PAConv to plant phenotyping, utilizing self-attention mechanisms to enhance feature extraction. Research on point cloud deep learning remains active, and various segmentation techniques beyond PointNet and PointNet++ are being explored [19].
However, while these deep learning methods demonstrate high expressiveness in identifying plant organs, they require large amounts of labeled data, leading to significant costs associated with data preparation. The proposed HALF method differs from conventional segmentation techniques based on handcrafted features and deep learning. It focuses on angle histograms within 3D point cloud data to efficiently extract angle-specific characteristics unique to each plant part. This approach can identify plant parts without the need for large-scale labeled data, thus reducing data preparation costs. Additionally, HALF can be implemented through a simple algorithm based on randomness, requiring relatively low computational costs. This study demonstrates that the proposed HALF-based segmentation is effective for the efficient identification of plant structures. The contributions of this study are summarized as follows:
Low-cost shape identification: HALF does not require labeled data, allowing for easy application to large datasets at low cost.
Novel segmentation method using angle histograms: HALF efficiently extracts features useful for identifying plant parts.
Applicability to plant phenotyping: HALF offers a straightforward method for quantifying plant morphology, which is expected to benefit applications in plant phenotyping.

3. Fundamentals of HALF

3.1. Basic Definitions

To design effective features for phytomer segmentation, we can exploit natural characteristics such as shape, size, color, texture, and structure. In this paper, for three representative classes of plant parts (stems, knots, and leaves), we concentrate on shape as the target of investigation, because shape is robust to changes in illumination and can be captured at relatively low cost with standard 3D measurement tools, providing informative data for classification.
Histograms have a long history as an effective framework for representing and generating basic statistical features of various types of real-world data in many fields [20,21]. Their non-parametric nature has proven to be particularly effective in cases where sophisticated parametric statistical models cannot be utilized due to a lack of advanced knowledge about the targets or disturbances in observation.
The main problem this paper addresses is how to determine the type of quantity that should be represented in histograms for effective classification, alongside an efficient and robust computational process. To this end, we propose the use of simple angular values between any pair of three-dimensional points, $p_1 = (x_1, y_1, z_1)$ and $p_2 = (x_2, y_2, z_2)$, centered around a nodal point $p_0 = (x_0, y_0, z_0)$. Each point has three-dimensional coordinates contaminated by independently and identically distributed random noise and is selected from the point cloud data (PCD). Points should be chosen with relatively long arms on both sides so as to better capture essential characteristics of part shapes, such as stems, knots, and leaves. We assume these coordinates are disturbed by normally distributed noise with zero mean and variance $\sigma_0^2$ originating from the 3D measurement process. Figure 1 illustrates the conditional selection of a tuple $(p_1, p_2)$ around a center of interest $p_0$, where both elements lie in the gap or shell between two spheres of radii $r_{min}$ and $r_{max}$.
Figure 2 outlines the selection scheme proposed for typical shapes of stems and leaves.
After the initial selection process, two difference vectors, $\Delta p_1 = p_1 - p_0$ and $\Delta p_2 = p_2 - p_0$, are computed, doubling the noise variance to $2\sigma_0^2$. These vectors are then normalized to the unit vectors $\widetilde{\Delta p}_1 = \Delta p_1 / \|\Delta p_1\|$ and $\widetilde{\Delta p}_2 = \Delta p_2 / \|\Delta p_2\|$. In the final step, the angle $\theta$ between $\widetilde{\Delta p}_1$ and $\widetilde{\Delta p}_2$ is calculated by
$$\theta = \arccos\left(\widetilde{\Delta p}_1^{\mathsf T}\, \widetilde{\Delta p}_2\right)$$
For a fixed target point $p_0$ of interest, we aggregate the sampled angles $\theta$ for $N$ pairs of $(p_1, p_2)$ in the surrounding neighborhood into a histogram $H = \{f_j\}_{j = 1, \dots, p}$, where $f_j$ is the frequency of the $j$-th bin and $p$ is the number of bins. Figure 3 shows the HALFs for the three classes computed from real PCD, demonstrating a certain degree of representational stability even with real data.
In this study, the parameters $r_{min}$ and $r_{max}$ shown in Figure 1 were set to 5 mm and 20 mm, respectively. These values were determined based on the actual sizes of soybean organs, such as stem diameter and internode spacing, and the point cloud density obtained in our measurement environment, in consultation with domain experts.
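To make this sampling concrete, the following is a minimal sketch of a HALF computation at a single interest point, assuming the point cloud is stored as a NumPy array of 3D coordinates in millimetres; the function name compute_half, the number of sampled pairs, and the 36-bin resolution are our illustrative choices, not values taken from the paper.

```python
import numpy as np

def compute_half(points, p0, n_pairs=500, r_min=5.0, r_max=20.0, n_bins=36, rng=None):
    """Sketch of a HALF histogram at one interest point p0 (distances in mm, angles in degrees)."""
    rng = np.random.default_rng() if rng is None else rng
    # Keep only points lying in the shell r_min <= ||p - p0|| <= r_max (Figure 1).
    dist = np.linalg.norm(points - p0, axis=1)
    shell = points[(dist >= r_min) & (dist <= r_max)]
    if len(shell) < 2:
        return np.zeros(n_bins, dtype=int)
    # Randomly sample pairs (p1, p2) from the shell, discarding degenerate pairs.
    idx = rng.integers(0, len(shell), size=(n_pairs, 2))
    idx = idx[idx[:, 0] != idx[:, 1]]
    dp1 = shell[idx[:, 0]] - p0
    dp2 = shell[idx[:, 1]] - p0
    # Normalize the difference vectors and compute the intersection angle theta.
    dp1 /= np.linalg.norm(dp1, axis=1, keepdims=True)
    dp2 /= np.linalg.norm(dp2, axis=1, keepdims=True)
    cos_theta = np.clip(np.sum(dp1 * dp2, axis=1), -1.0, 1.0)
    theta = np.degrees(np.arccos(cos_theta))
    # Aggregate the sampled angles into the histogram H = {f_j}.
    hist, _ = np.histogram(theta, bins=n_bins, range=(0.0, 180.0))
    return hist
```

In practice, the same number of pairs would be sampled at every interest point (or the histogram normalized) so that HALFs remain comparable in the similarity evaluation of Section 3.3.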

3.2. Convolution-Based Mathematics of HALF

HALF is a statistic of the angles between 3D point pairs $(p_1, p_2)$ around an intentionally allocated interest point $p_0$ within the point set. We define the coordinate system at $p_0$, with an axis, or origin line, directed towards an arbitrary orientation from the origin, while assuming statistical independence in the phase angle distributions of $p_1$ and $p_2$ around the origin $p_0$. When our objects are thin stems or leaves, they can be approximated by a geometrical line or plane. In these cases, the angle between the two vectors $\Delta p_1 = p_1 - p_0$ and $\Delta p_2 = p_2 - p_0$ can always be represented by the difference between the respective phase angles of $p_1$ and $p_2$, as shown in Figure 4.
In these instances, the histogram of difference angles can be calculated by convolving the two histograms of phase angles [22].
Let us formalize the aforementioned concept. Figure 4a illustrates the definition of an interest point, a pair of selected points, and the angle between two vectors for a typical knot data set. The pair $(p_1, p_2)$ establishes the intersection angle $\theta \in [0°, 180°]$ for HALF analysis. In Figure 4b, the same angle is represented as the difference between two phase angles $\eta$. A histogram of the phase angles is represented as follows:
$$H_\eta = \{\, f_\eta(\eta) \mid 0 \le \eta < 360 \,\}$$
The phase angle $\eta$ is defined within the interval $0 \le \eta < 360$ degrees. The intersection angle $\theta$, which is essential for HALF analysis, is defined via the absolute value of the difference between two phase angles. Therefore, it can be calculated through the following three steps:
$$u = \eta_1 - \eta_2$$
$$w = |u|$$
$$z = \begin{cases} w & 0 \le w < 180 \\ 360 - w & 180 \le w < 360 \end{cases}$$
where the three variables $u$, $w$, and $z$ correspond to these three steps. The frequency function of $u$ can be defined by the convolution of the two frequency functions of $\eta_1$ and $\eta_2$ as follows:
$$f_u(u) = \sum_{\substack{0 \le \eta_1 < 360 \\ 0 \le \eta_2 < 360 \\ \eta_1 - \eta_2 = u}} f_{\eta_1}(\eta_1)\, f_{\eta_2}(\eta_2) = \sum_{\substack{0 \le \eta_1 < 360 \\ 0 \le \eta_1 - u < 360}} f_{\eta_1}(\eta_1)\, f_{\eta_2}(\eta_1 - u)$$
By taking the absolute value and then folding the angle at 180 degrees, the corresponding frequency functions are defined as follows:
$$f_w(w) = f_u(w) + f_u(-w)$$
$$f_z(z) = f_w(z) + f_w(360 - z)$$
The following are numerical examples of these mathematical expressions.
$$f_u(340) = f_{\eta_1}(340) f_{\eta_2}(0) + f_{\eta_1}(341) f_{\eta_2}(1) + \cdots + f_{\eta_1}(359) f_{\eta_2}(19)$$
$$f_w(200) = f_u(200) + f_u(-200)$$
$$f_z(42) = f_w(42) + f_w(318)$$
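As a sanity check on these formulas, the following is a small, direct (unoptimized) sketch of the three-step computation for two 360-bin phase-angle histograms; the function name is ours, and the explicit handling of the boundary bins (w = 0, z = 0, and z = 180), which the formulas above leave implicit, is our assumption.

```python
import numpy as np

def fold_to_intersection_angles(f_eta1, f_eta2):
    """Derive f_z(z), z = 0..180 degrees, from two 360-bin phase-angle histograms
    via the three steps u -> w -> z above (direct, unoptimized sketch)."""
    # Step 1: f_u(u) for the signed difference u = eta1 - eta2, u in -359..359.
    f_u = np.zeros(2 * 360 - 1)                 # index u + 359 holds f_u(u)
    for eta1 in range(360):
        for eta2 in range(360):
            f_u[(eta1 - eta2) + 359] += f_eta1[eta1] * f_eta2[eta2]
    # Step 2: f_w(w) = f_u(w) + f_u(-w) for w > 0; the w = 0 bin is counted once.
    f_w = np.zeros(360)
    f_w[0] = f_u[0 + 359]
    for w in range(1, 360):
        f_w[w] = f_u[w + 359] + f_u[-w + 359]
    # Step 3: f_z(z) = f_w(z) + f_w(360 - z); the z = 0 and z = 180 bins are counted once.
    f_z = np.zeros(181)
    f_z[0], f_z[180] = f_w[0], f_w[180]
    for z in range(1, 180):
        f_z[z] = f_w[z] + f_w[360 - z]
    return f_z
```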
Figure 5 shows two sets of simulation data for the convolution-based HALF calculation, one lying on a plane and the other on lines, used to validate the aforementioned model through mathematical convolution. Both sets are inclined at an angle of 70 degrees from the y-axis. A critical aspect of this model is that the thickness of any real object is neglected. In these data, each point $p = (x, y, z)^{\mathsf T}$ has a phase angle $\eta = \tan^{-1}\!\big(e_x^{\mathsf T} p / \|p\|,\ \mathrm{sign}(e_x \times p)\,\|e_x \times p\| / \|p\|\big)$ measured from the origin line $e_x = (1, 0, 0)^{\mathsf T}$, which is used to calculate the fundamental histogram of phase angles.
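The following sketch, under our own assumptions, shows how such simulation data and their phase angles could be generated; in particular, the in-plane basis used to give the cross-product term its sign is our reading of the formula above, and the point spacing is arbitrary.

```python
import numpy as np

def phase_angles_in_plane(points, normal, origin_axis=np.array([1.0, 0.0, 0.0])):
    """Phase angle eta in degrees, [0, 360), of each point around the origin,
    measured from the origin line e_x inside the plane with the given normal."""
    n = normal / np.linalg.norm(normal)
    e1 = origin_axis - np.dot(origin_axis, n) * n   # project e_x into the plane
    e1 = e1 / np.linalg.norm(e1)
    e2 = np.cross(n, e1)                            # completes an in-plane basis
    eta = np.degrees(np.arctan2(points @ e2, points @ e1))
    return np.mod(eta, 360.0)

# Simulated "stem": points on a line through the origin, inclined 70 degrees from the y-axis.
t = np.linspace(-20.0, 20.0, 200)
direction = np.array([np.sin(np.radians(70.0)), np.cos(np.radians(70.0)), 0.0])
stem_points = t[:, None] * direction
f_eta, _ = np.histogram(phase_angles_in_plane(stem_points, normal=np.array([0.0, 0.0, 1.0])),
                        bins=360, range=(0.0, 360.0))
```

Passing f_eta twice to fold_to_intersection_angles from the previous sketch concentrates the mass at 0 and 180 degrees, consistent with the bi-modal stem profile described below for Figure 6.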
Figure 6 presents clear profiles of histograms for three classes of plant parts: leaf, stem, and knot, created through the previously mentioned mathematical formalization. The histogram for the leaf class shows a uniform distribution, while the stem class exhibits a bi-modal histogram. The histogram for the knot class features multiple peaks, in this case four, with one peak splitting into two due to trivial numerical quantization around 80 or 90 degrees.
It is important to note that the condition for understanding our HALF calculation as an extension of convolution is not always satisfied for any real shapes of plants in PCD. However, the mathematical framework for constructing HALF is clear and effective for gaining insights into the HALF structure based on real data.

3.3. Similarity Evaluation in HALF

To design effective discrimination or classification methods, a similarity measure based on HALF is essential. The histogram intersection provides a simple similarity between any two histograms of the same size and has proven to be an effective approach [23,24]. In the HALF analysis described above, any HALF is represented as a $p$-tuple of ordered positive integers or real numbers, $H = \{f_j\}_{j = 1, \dots, p}$. In this paper, we define three classes of plant parts (leaf, stem, and knot), each with its own reference HALF, $G = \{g_i\}_{i = 1, \dots, p}$. For each pair $(H, G)$, the intersection is defined as follows:
$$\zeta = \sum_{i = 1}^{p} \min(f_i,\ g_i)$$
An ordinary discrimination scheme is then as follows: the interest point $p_0$ is classified into the class whose reference HALF gives the maximum value of $\zeta$, in a manner similar to linear discrimination rules.
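A minimal sketch of this intersection-based discrimination is given below; it assumes all histograms are built from the same number of samples (or are normalized) so that raw bin counts are comparable, and the function names and the dictionary of reference HALFs are our own illustration.

```python
import numpy as np

def histogram_intersection(h, g):
    """zeta = sum_i min(f_i, g_i): intersection similarity between two HALFs of equal size."""
    return np.minimum(h, g).sum()

def classify_interest_point(h, reference_halfs):
    """Assign the class whose reference HALF has the maximum intersection with h.
    `reference_halfs` is a dict such as {"leaf": G_leaf, "stem": G_stem, "knot": G_knot}."""
    return max(reference_halfs, key=lambda c: histogram_intersection(h, reference_halfs[c]))
```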
As shown in the previous section, the calculation of the distribution based on convolution is a mathematically rigorous scheme for HALF. For plant structure analysis, this scheme is most advantageous when the leaves and stems being analyzed are thin. Here, we examine the limitations of HALF analysis when applied to more realistic 3D objects. Figure 7 illustrates the results of a simulation experiment using a thicker leaf and stem, modified from the objects shown in Figure 5. The differences between them and the ideal HALFs were calculated using histogram intersection.
From this figure, we observe that leaf HALFs remain effective for thicknesses between 0 and 2 mm, whereas stem HALFs degrade in quality as their thickness increases. Notably, in this experiment, stems with diameters over approximately 1 mm lost their similarity to the ideal stem of zero diameter. This occurred because stem HALFs generally exhibit peaks in very narrow regions around 0 and 180 degrees, so the introduction of thicker diameters caused a significant increase in variance around these peaks. This phenomenon may also occur in knot HALFs. The robustness of leaf HALFs may be explained by the good uniformity of frequency within them; the introduction of thickness therefore does not have a severe effect on their HALFs. This consideration may be necessary for HALF-based algorithms handling more complicated shapes, such as roots, fruits, or flowers.

4. Sequential Competitive Segmentation Algorithm: SCSA

4.1. 3D Point Cloud Data for Soybean Plant

We planted the soybean variety GmJMC112 (FUKUYUTAKA), which was derived from the Genebank project at NARO (https://www.gene.affrc.go.jp, accessed on 15 April 2025). The seed was sown in a 1/5000a Wagner pot in 2018 using baked soil, and the plant was grown in the greenhouse at the Kazusa DNA Research Institute. A water tube was installed in the pot, with irrigation occurring twice a day. The 3D point cloud data were obtained by capturing all-around images in a studio in which the cultivation pot was rotated in front of the camera, and then reconstructing the 3D point cloud using photogrammetry.

4.2. Definition of Optimum Reference HALFs

A prominent feature of growing plants is the variation in the sizes and shapes of their parts as they grow. These variations can be observed even among individuals of the same age. For instance, the shapes of knots in the lower parts of a plant may differ significantly from those in the upper parts, particularly in thickness. This variability makes research on living plants challenging. In this paper, we explore an original approach for segmenting or classifying plant parts into classes using optimum reference HALFs, which are sufficiently distinct from each other to facilitate similarity evaluation.
Figure 8 shows the initial HALFs and the detected optimum reference HALFs for the three classes. The initial HALFs can be defined through the convolution-based approach described above.

4.3. SCSA and Elemental Procedures

Figure 9a shows a snapshot at the $(k+1)$-st iteration of the Bubble-based Selection Algorithm (BSA) used in sequential extensive merging, where a bubble of radius $r$ centered at the segmented point $c^k$ from the previous iteration, $B^k = \{p : \|p - c^k\| \le r\}$, is used to enclose PCD points for segmentation. The point $c^k$ is classified into the best class according to the reference HALFs at that point in time. In this example, the six points colored blue are labeled as belonging to the same class as $c^k$ and are then sorted by their distances from $c^k$. First, $p_1$ is segmented as $c_1^{k+1}$, and a new bubble $B_1^{k+1} = \{p : \|p - c_1^{k+1}\| \le r\}$ is created around it. Next, since $p_2$ is not included in $B_1^{k+1}$ and not enough points have been segmented yet, $p_2$ defines the next segmented point $c_2^{k+1}$ and creates a new bubble $B_2^{k+1}$. The points $p_3$, $p_5$, and $p_6$ do not contribute to segmentation because they are contained in $B_1^{k+1}$ or $B_2^{k+1}$. Lastly, $p_4$ satisfies the condition to become a new segmented point $c_3^{k+1}$. In this experiment, we limited the number of new segmented points around $c^{k+1}$ to four. Figure 9b illustrates a small-scale real example of SCSA processing. Notably, the uniform extension of bubbles for the two classes, leaves and stems, was maintained throughout the SCSA process.
Table 1 presents the BSA process from the perspective of distance evaluation between seed points, $d_{ij} = \|p_j - c_i^{k+1}\|$, where the matrix $D = [d_{ij}]$ represents the sequence of segmentation.
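The selection step just described can be sketched as follows; this is a minimal illustration under our own naming, with the cap of four new segmented points per iteration taken from the text above.

```python
import numpy as np

def bubble_based_selection(candidates, c_k, r=5.0, max_new=4):
    """One BSA iteration (Figure 9a): choose new segmented points c_i^{k+1} around c_k.

    `candidates` are the same-class points found inside the bubble B^k around c_k.
    They are visited in order of distance from c_k; each accepted point opens a new
    bubble of radius r, and points already covered by one of these bubbles are skipped.
    """
    order = np.argsort(np.linalg.norm(candidates - c_k, axis=1))
    new_centers = []
    for p in candidates[order]:
        if any(np.linalg.norm(p - c) <= r for c in new_centers):
            continue                              # p lies inside an existing new bubble
        new_centers.append(p)                     # p becomes c_i^{k+1}, opening B_i^{k+1}
        if len(new_centers) >= max_new:           # cap on new segmented points per iteration
            break
    return new_centers
```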

4.4. Stepwise Details of SCSA

The following steps constitute a segmentation process based on the BSA scheme described in Figure 9 in the previous section; a high-level sketch of the overall flow is given after the step list.
SCSA in steps:
Initialization:
Step 1. Set candidate seed points (CSPs) for leaf, stem, and knot classes that satisfy two conditions: a CSP must have over 85% similarity to the reference HALF, indicating good quality, and must be at least 24 mm distant from any other CSPs in the same class. The threshold of 85% was empirically chosen based on testing of nearby values.
Phytomer segmentation:
Step 2. Select four seed points (SPs) for the leaf, upper and lower stems, and knot from the CSPs defined in Step 1. These SPs should have connectivity as defined by the bubble-based selection algorithm shown in Figure 9a. The lower stem SP should be lower in height than the knot SP because of the shape of the phytomer structure.
Step 3. Using BSA, select and competitively compare SPs to define segmentation or classification into three classes through an alternate width-priority search.
Step 4. Return any already classified CSPs for leaves and stems, except for knots, back to the pool of normal points.
Repeat Steps 2 through 4 while any CSP for leaves remains.
Stem segmentation:
Step 5. Select the best SP of good quality for the stem. Then select another CSP for the knot that exhibits connectivity just as in Step 2.
Step 6. Using BSA, select and competitively compare SPs to define segmentation or classification into two classes through an alternate width-priority search.
Step 7. Return any already classified CSPs for stems, except for knots, back to the pool of normal points.
Repeat Steps 5 through 7 while any stem CSP remains.
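As referenced before the step list, the following is a high-level sketch of this control flow; the four callables stand in for the operations described in Steps 1 through 7 and are placeholders of our own, not functions from the authors' implementation.

```python
def scsa(points, reference_halfs,
         select_csps, pick_connected_sps, bsa_competitive_grow, release_csps):
    """Control flow of the SCSA steps above; the four callables are placeholders."""
    # Step 1: candidate seed points per class (similarity >= 85%, mutual distance >= 24 mm).
    csps = select_csps(points, reference_halfs, min_similarity=0.85, min_distance=24.0)
    labels = {}
    # Phytomer segmentation (Steps 2-4): repeat while a leaf CSP remains.
    while csps["leaf"]:
        sps = pick_connected_sps(csps, ("leaf", "upper_stem", "lower_stem", "knot"))  # Step 2
        labels.update(bsa_competitive_grow(points, sps))                              # Step 3
        release_csps(csps, labels, keep="knot")                                       # Step 4
    # Stem segmentation (Steps 5-7): repeat while a stem CSP remains.
    while csps["stem"]:
        sps = pick_connected_sps(csps, ("stem", "knot"))                              # Step 5
        labels.update(bsa_competitive_grow(points, sps))                              # Step 6
        release_csps(csps, labels, keep="knot")                                       # Step 7
    return labels
```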
Figure 10 shows the set of candidate seed points defined in Step 1 of the SCSA, consisting of 51, 11, and 2 candidate seed points for the three classes, respectively. The red points indicate the round number of phytomer segmentation and the corresponding class ID. The SPs S1 and S2 represent the upper and lower stem seeds, respectively.
Figure 11 shows the results of phytomer segmentation using the SCSA. In the first round of phytomer segmentation (Steps 2 through 4), the right-hand upper phytomer, which consists of the leaf-stem-knot-stem structure, was successfully segmented and extracted; subsequently, the left-hand lower phytomer was recognized, as shown in the figure. Finally, in the third round (Steps 5 through 7), the upper center region was identified as a stem.

4.5. Segmentation Performance Evaluation

The segmentation performance of the SCSA was evaluated by comparing it with the ground-truth segmentation. Figure 12 shows the ground-truth segmentation, which was created by manually labeling each point as either a leaf, a stem, or a knot. Table 2 presents the total number of point clouds in the ground truth, as well as the number of point clouds in each class. Determining the boundary between the knot and the stem is challenging when labeling the ground truth; therefore, the center of each knot was manually specified, and the group of points within a sphere with a radius of five times the stem diameter was classified as a knot. This setting also classifies the stipules as knots.
Recall, precision, and F-score were utilized as metrics for accuracy evaluation. Table 3 presents the evaluation results for each class. The leaf class showed better performance than the other classes, as the number of points in the leaf class is large and the effect of misclassification is minimal. The low recall of the stem class is attributed to misclassification at its boundary with the leaf class and the influence of unclassified point clouds at the stem tip. The lower evaluation results of the knot class compared to the other classes suggest that its boundary with the stem class is unclear. In particular, the low precision of the knot class may also be influenced by class imbalance, as the knot class accounts for only 1.6% of the dataset. The comparison with the ground truth confirms that SCSA is less prone to misclassification, except at the endpoints of the region, due to its use of a region-growing method.
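For reference, the per-class figures in Table 3 follow the standard definitions sketched below; this snippet is our illustration, not the authors' evaluation script, and it assumes per-point label arrays in which unclassified points carry a separate label (so they count only against recall).

```python
import numpy as np

def per_class_metrics(pred, truth, classes=("leaf", "stem", "knot")):
    """Per-class precision, recall, and F-score from per-point label arrays."""
    metrics = {}
    for c in classes:
        tp = np.sum((pred == c) & (truth == c))
        fp = np.sum((pred == c) & (truth != c))
        fn = np.sum((pred != c) & (truth == c))
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f_score = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
        metrics[c] = (precision, recall, f_score)
    return metrics
```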
All computations were performed on a machine running Windows 11 Home, equipped with an Intel(R) Core(TM) i9-12900H CPU (2.50 GHz) and 16.0 GB of RAM. The processor was manufactured by Intel Corporation, headquartered in Santa Clara, CA, USA. The HALF feature extraction process required approximately 27.0 s to process a point cloud with 136,502 points, resulting in an average computation time of approximately 0.198 milliseconds per point. The SCSA segmentation process took approximately 2.25 s. The total computation time, averaged over 100 runs, was approximately 29.3 s. Since further reductions in computation time may be necessary, the proposed HALF-SCSA method is currently considered suitable for offline use.

4.6. Comparison with Existing Method

To further evaluate the proposed HALF-SCSA method, we conducted an additional experiment using tomato 3D point cloud data and the Plant 3D (P3D) toolkit [8], which classifies points into leaf and stem categories based on FPFH [7] features and a trained neural network model. In this experiment, we used the pre-trained model provided by P3D without retraining. Figure 13 shows the visual comparison between the segmentation results of P3D and HALF-SCSA. As a result, P3D produced binary segmentation distinguishing only leaf and stem regions. In contrast, HALF-SCSA segmented the same data into three categories: leaf, stem, and knot. While the difference in the number of output classes reflects the design of the models used in this comparison, we emphasize that HALF-SCSA achieved this segmentation without any training or labeled data, highlighting its practical advantage as a learning-free and easily transferable method. However, as can be seen in Figure 13, although the lower knot was successfully segmented, the upper knot was not clearly identified. This suggests that further refinement may be needed to improve the method’s robustness in capturing diverse structural characteristics across different plant types.
Furthermore, we also investigated how the proposed method handles noise (including outliers), which is a common and significant issue in plant phenotyping. Figure 14 shows regions in the soybean and tomato datasets where noisy points were observed. In (a), HALF-SCSA treats outliers as unclassified (gray) points which are visually enhanced for clarity. In (b), HALF-SCSA excludes noisy points by labeling them as unclassified, whereas P3D assigns all points to the stem class. This behavior is attributed to the design of the SCSA, which locally evaluates spatial continuity and avoids assigning unreliable labels to ambiguous points, thereby reducing misclassification. Although this evaluation is qualitative, it demonstrates that the proposed method exhibits a certain degree of robustness to real-world noise.

4.7. Limitation

The proposed HALF-SCSA method was designed for plants with clearly distinguishable organ structures, such as leaves, stems, and knots, as typically found in soybean. Soybean is an agriculturally important crop, and detailed morphological analysis of its organs is critical for crop breeding and cultivation management. The success of HALF-SCSA in tomato, which exhibits similar morphological traits, suggests its potential applicability to other crops with comparable organ structures. However, for plants with significantly different overall architecture and organ shapes—such as rice—the current method may not be directly suitable. We recognize this as a limitation of our approach. Future work will explore algorithmic extensions and alternative feature representations to accommodate a broader range of plant morphologies.

5. Conclusions

In this study, we proposed a novel segmentation method, HALF (Histogram of Angles in Linked Features), for analyzing plant morphology using 3D point cloud data. This method utilizes local angular information within 3D point clouds as statistical features, enabling the identification of key plant structures (leaves, stems, and knots) without requiring labeled data. Furthermore, we introduced the Sequential Competitive Segmentation Algorithm (SCSA), which enables phytomer-level classification using HALF. The effectiveness of HALF was confirmed through evaluation experiments using soybean 3D point cloud data, demonstrating the feasibility of SCSA-based classification.
Since HALF directly processes 3D point cloud data acquired through sensing technologies such as LiDAR, laser scanning, and photogrammetry, it is well-suited for integration into sensor-based plant monitoring systems, making it highly applicable to plant phenotyping tasks. This capability also adds value to high-throughput phenotyping, precision agriculture, and the development of advanced sensing platforms for automated plant analysis.
Future research will focus on extending the applicability of HALF to enable more detailed plant morphological analysis. As part of this effort, we aim to develop a method for detailed leaf segmentation, distinguishing specific structures such as leaf edges and leaf tips. Additionally, we will expand the application of HALF beyond segmentation to the quantitative measurement of plant morphology. By utilizing HALF for precise growth analysis and cultivar evaluation, we aim to enhance the accuracy of 3D plant structure analysis and broaden its practical applications.

Author Contributions

Conceptualization, S.K.; methodology, H.T.; software, H.T.; validation, H.T.; formal analysis, S.K.; investigation, N.W.; resources, T.T.; data curation, T.T.; writing—original draft preparation, H.T. and S.K.; writing—review and editing, N.W. and T.T.; visualization, H.T.; supervision, S.K.; project administration, T.T.; funding acquisition, T.T. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by a CREST Grant (No. JPMJCR16O1) from the Japan Science and Technology Agency and the Kazusa DNA Research Institute Foundation.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

The authors would like to thank Shota Hasegawa and Kota Tsujimura, who were students at Hokkai-Gakuen University at the time of the experiment, for their valuable cooperation. They also thank Kota Takahashi of Hokkaido University of Science for his assistance with the literature survey. In addition, the authors also thank Kazusa DNA Research Institute and Sachiko Isobe (now at the University of Tokyo) for providing the 3D point cloud data of individual plants used in this paper.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Li, D.; Quan, C.; Song, Z.; Li, X.; Yu, G.; Li, C.; Muhammad, A. High-Throughput Plant Phenotyping Platform (HT3P) as a Novel Tool for Estimating Agronomic Traits From the Lab to the Field. Front. Bioeng. Biotechnol. 2021, 8, 623705. [Google Scholar] [CrossRef] [PubMed]
  2. Kochi, N.; Hayashi, A.; Shinohara, Y.; Tanabata, T.; Kodama, K.; Isobe, S. All-around 3D plant modeling system using multiple images and its composition. Breed. Sci. 2022, 72, 75–84. [Google Scholar] [CrossRef] [PubMed]
  3. Burgess, A.J.; Retkute, R.; Pound, M.P.; Foulkes, J.; Preston, S.P.; Jensen, O.E.; Pridmore, T.P.; Murchie, E.H. High-Resolution Three-Dimensional Structural Data Quantify the Impact of Photoinhibition on Long-Term Carbon Gain in Wheat Canopies in the Field. Plant Physiol. 2015, 169, 1192–1204. [Google Scholar] [CrossRef] [PubMed]
  4. Tanabata, T.; Takauji, H.; Kaneko, S.; Hasegawa, S.; Isobe, S.; Wada, N. HALF (Histogram in Angles of Longer arm Feature) Analysis—Application to Point-cloud Segmentation for Plant Morphology. In Proceedings of the SSII2024 (The 30th Symposium on Sensing via Image Information), Yokohama, Japan, 12–14 June 2024. [Google Scholar]
  5. Paulus, S.; Dupuis, J.; Mahlein, A.K.; Kuhlmann, H. Surface Feature Based Classification of Plant Organs from 3D Laserscanned Point Clouds for Plant Phenotyping. BMC Bioinform. 2013, 14, 238. [Google Scholar] [CrossRef] [PubMed]
  6. Rusu, R.B.; Blodow, N.; Marton, Z.C.; Beetz, M. Aligning Point Cloud Views using Persistent Feature Histograms. In Proceedings of the 2008 IEEE/RSJ International Conference on Intelligent Robots and Systems, Nice, France, 22–26 September 2008. [Google Scholar]
  7. Rusu, R.B.; Blodow, N.; Beetz, M. Fast Point Feature Histograms (FPFH) for 3D Registration. In Proceedings of the 2009 IEEE International Conference on Robotics and Automation, Kobe, Japan, 12–17 May 2009. [Google Scholar]
  8. Ziamtsov, I.; Navlakha, S. Machine Learning Approaches to Improve Three Basic Plant Phenotyping Tasks using Three-Dimensional Point Clouds. Plant Physiol. 2019, 181, 1425–1440. [Google Scholar] [CrossRef] [PubMed]
  9. Tombari, F.; Salti, S.; Stefano, L.D. Unique Signatures of Histograms for Local Surface Description. In Proceedings of the European Conference on Computer Vision, Heraklion, Greece, 5–11 September 2010. [Google Scholar]
  10. Qi, C.R.; Su, H.; Mo, K.; Guibas, L.J. PointNet: Deep Learning on Point Sets for 3D Classification and Segmentation. In Proceedings of the 2017 IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017. [Google Scholar]
  11. Qi, C.R.; Yi, L.; Su, H.; Guibas, L.J. PointNet++: Deep Hierarchical Feature Learning on Point Sets in a Metric Space. In Proceedings of the 31st International Conference on Neural Information Processing Systems, Long Beach, CA, USA, 4–9 December 2017. [Google Scholar]
  12. Patel, A.K.; Park, E.-S.; Lee, H.; Priya, G.G.L.; Kim, H.; Joshi, R.; Arief, M.A.A.; Kim, M.S.; Baek, I.; Cho, B.-K. Deep Learning-Based Plant Organ Segmentation and Phenotyping of Sorghum Plants Using LiDAR Point Cloud. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2023, 16, 8492–8507. [Google Scholar] [CrossRef]
  13. Luo, L.; Jiang, X.; Yang, Y.; Samy, E.R.A.; Lefsrud, M.; Hoyos-Villegas, V.; Sun, S. Eff-3DPSeg: 3D Organ-Level Plant Shoot Segmentation Using Annotation-Efficient Deep Learning. Plant Phenomics 2022, 5, 80. [Google Scholar] [CrossRef] [PubMed]
  14. Otobe, Y.; Unseok, L.; Usami, T.; Hayashi, A.; Kochi, N.; Shinohara, Y.; Itaka, S.; Suzuki, T. Part Segmentation in 3D Point Cloud of Paprika using Deep Learnings. In Proceedings of the 85th National Convention of Information Processing Society of Japan, Tokyo, Japan, 2–4 March 2023. (In Japanese). [Google Scholar]
  15. Boogaard, F.P.; van Henten, E.J.; Kootstra, G. Improved Point-Cloud Segmentation for Plant Phenotyping Through Class-Dependent Sampling of Training Data to Battle Class Imbalance. Front. Plant Sci. 2022, 13, 838190. [Google Scholar] [CrossRef] [PubMed]
  16. Ghahremani, M.; Williams, K.; Corke, F.M.K.; Tiddeman, B.; Liu, Y.; Doonan, J.H. Deep Segmentation of Point Clouds of Wheat. Front. Plant Sci. 2021, 12, 608732. [Google Scholar] [CrossRef] [PubMed]
  17. Turgut, K.; Dutagaci, H.; Galopin, G.; Rousseau, D. Segmentation of Structural Parts of Rosebush Plants with 3D Point-Based Deep Learning Methods. Plant Methods 2022, 18, 20. [Google Scholar] [CrossRef] [PubMed]
  18. Xie, K.; Zhu, J.; Ren, H.; Wang, Y.; Yang, W.; Chen, G.; Lin, C.; Zhai, R. Delving into the Potential of Deep Learning Algorithms for Point Cloud Segmentation at Organ Level in Plant Phenotyping. Remote Sens. 2024, 16, 3290. [Google Scholar] [CrossRef]
  19. He, Y.; Yu, H.; Liu, X.; Yang, Z.; Sun, W.; Anwar, S.; Mian, A. Deep Learning Based 3D Segmentation: A Survey. arXiv 2024, arXiv:2103.05423v5. [Google Scholar]
  20. Kashino, K.; Kurozumi, T.; Murase, H. A Quick Search Method for Audio and Video Signals based on Histogram Pruning. IEEE Trans. Multimed. 2003, 5, 348–357. [Google Scholar] [CrossRef]
  21. Agostinelli, C.; Basu, A.; Filzmoser, P.; Mukherjee, D. Recent Advances in Robust Statistics: Theory and Applications; Springer: Berlin/Heidelberg, Germany, 2016. [Google Scholar]
  22. Papoulis, A. Probability, Random Variables, and Stochastic Processes; McGraw-Hill Kohgaku-Sha: Tokyo, Japan, 1965. [Google Scholar]
  23. Wada, N.; Kaneko, S.; Takeguchi, T. Using color reach histogram for object search in color and/or depth scene. Pattern Recognit. 2006, 39, 881–888. [Google Scholar] [CrossRef]
  24. Tian, Y.; Fang, M.; Kaneko, S. Absent Color Indexing: Histogram-Based Identification Using Major and Minor Colors. Mathematics 2022, 10, 2196. [Google Scholar] [CrossRef]
Figure 1. Shell for sampling three PCD points.
Figure 2. Representative structures of the three classes (Leaf, Stem, and Knot) in HALF analysis.
Figure 3. Examples of HALF for Leaf, Stem, and Knot classes.
Figure 4. Intersection angle and phase angles. (a) Definition of an angle in HALF for the Knot class. (b) Definition of phase angles.
Figure 5. Two sets of 3D points for simulation.
Figure 6. Typical HALFs by convolution-based calculation.
Figure 7. Relationship between leaf and stem thickness and the HALF degradation trend.
Figure 8. Initial and optimum reference HALFs. (a) Initial HALFs. (b) Optimum reference HALFs detected using initial HALFs.
Figure 9. Small example of the SCSA process. (a) Bubble-based selection algorithm (BSA) used in sequential extensive merging. Bubbles define segmentation points for the selection process. (b) Example of applying BSA with uniformly distributed bubbles. For 131,641 data points, 61 iterations were completed (r = 5 mm).
Figure 10. Candidate seed points for leaf (blue), stem (green), and knot (red) classes.
Figure 11. SCSA segmentation of two phytomers. Gray points represent data that have not yet been classified.
Figure 12. Ground-truth segmentation.
Figure 13. Comparison of segmentation results on tomato point cloud data using Plant 3D (P3D) and the proposed HALF-SCSA. (a) P3D result using a pre-trained model: leaf and stem classification. (b) HALF-SCSA result: segmentation into leaf, stem, and knot.
Figure 14. Examples of noisy points. (a) HALF-SCSA result on soybean data (gray points are unclassified). (b) Comparison between P3D (top) and HALF-SCSA (bottom) on tomato data.
Table 1. BSA from the perspective of distance.
D              p_1         p_2         p_3       p_4         p_5       p_6
c_1^{k+1}      ●           d_{12}      d_{13}    d_{14}      d_{15}    d_{16}
c_2^{k+1}                  ●           d_{23}    d_{24}
c_3^{k+1}                                        ●
               B_1^{k+1}   B_2^{k+1}             B_3^{k+1}
● marks a segmented point at the center of its bubble.
Table 2. Number of point clouds in ground-truth segmentation.
Total      Knot     Stem     Leaf
136,502    2194     7191     127,117
Table 3. Segmentation performance of SCSA.
Table 3. Segmentation performance of SCSA.
Precision (%)Recall (%)F-Score (%)
Leaf99.699.999.8
Stem96.584.189.9
Knot78.789.883.9
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
