Article

Low-Overlap Bullet Point Cloud Registration Algorithm Based on Line Feature Detection

Qiwen Zhang, Zhiya Mu, Xin He, Zhonghui Wei, Ruidong Hao, Yi Liao and Hongyang Wang

1 Changchun Institute of Optics, Fine Mechanics and Physics, Chinese Academy of Sciences, Changchun 130033, China
2 University of Chinese Academy of Sciences, Beijing 100049, China
* Author to whom correspondence should be addressed.
Appl. Sci. 2024, 14(14), 6105; https://doi.org/10.3390/app14146105
Submission received: 23 May 2024 / Revised: 6 July 2024 / Accepted: 9 July 2024 / Published: 12 July 2024

Abstract

A low-overlap bullet point cloud registration algorithm based on line feature detection is proposed to address the difficulty and low efficiency of registering point clouds sampled from the bullet model when the overlap rate between them is low. In this paper, voxel downsampling is used to remove noise points and outliers from the bullet point cloud and to bring it to a specified resolution, which reduces the computational cost. The bullet point cloud is transformed to a better initial position by fitting the central axis using the geometric features of the bullet. Then, the direction vectors of the bullet's linear features are obtained using a discrete Hough transform based on icosahedral fitting, which simplifies the parameter space of the transformation search. Finally, the optimal rotation angle is searched for in the parameter space using the improved Cuckoo algorithm to realize the registration of bullet point clouds with low overlap rates. Simulation and experimental results show that the proposed registration method can accurately register bullet point clouds of different densities with low overlap rates. Compared with the commonly used ICP, TRICP, and GOICP algorithms, the registration error of the proposed algorithm is reduced by 92.68% on average at an overlap rate of 52.85%, by 98.87% at an overlap rate of 41.36%, by 99.52% at an overlap rate of 33.02%, and by 98.89% at an overlap rate of 22.75%.

1. Introduction

China has strict gun control, but the number of gun-related cases has increased year by year. The Ministry of Public Security announced at a press conference that, since 2023, a total of 13,000 gun- and explosive-related cases have been investigated and 43,000 guns have been confiscated. Because the marks on bullets and shell casings strongly reflect the characteristics of the firearm that fired them [1,2], the inspection and analysis of guns and bullets in gun-related cases yields information of great significance for matching and identifying the types of guns and bullets [3] and provides reliable evidence for solving cases. Bullet marks are small, usually at the micron level, so traditional bullet mark examination is performed by technicians operating microscope equipment, combined with image processing technology, to observe and compare the features of the bullet marks [4]. However, this comparison method lacks objective quantitative analysis and depends heavily on the examiner's subjective judgment, so the results are neither objective nor comprehensive.
In the early 2000s, with the development of optical imaging instruments capable of acquiring depth data, researchers began to apply such instruments to the examination of bullet marks. Two-dimensional images acquired by optical photography are affected by lighting conditions, the reflections of the bullet's metal surface, the material color, and other factors [5]; three-dimensional image acquisition technology can avoid these effects. At present, bullet mark inspection is gradually changing from 2D to 3D measurement, comparison, and identification to increase the useful information [6,7]. Bachrach [6] proposed in 2002 that 3D imaging could better characterize the surface topography of bullets. With the development and popularity of 3D scanning technologies, various scanning devices can quickly acquire 3D point clouds of the bullet surface, including confocal microscopy [8], coherent scanning interferometry [9], photometric stereo [10], structured light scanning [11], and focus variation [12]. To obtain the surface topography of the warhead, the warhead is usually scanned from different angles, and the point cloud data of all parts need to be registered into the same coordinate system using registration technology; that is, a complete three-dimensional point cloud model of the warhead is assembled. Point cloud registration technology therefore directly affects the quality of the three-dimensional point cloud model.
Researchers at home and abroad have devoted considerable research to point cloud registration technology. Although the registration problem has been extensively studied, most current methods are mainly based on feature matching between point cloud data, while the overlapping regions between point clouds receive relatively little consideration. Besl and McKay [13] proposed one of the most classical algorithms in the field of point cloud registration, the iterative closest point (ICP) algorithm. The ICP algorithm can produce accurate results when initialized near the optimal pose, but it is not reliable without such initialization. A series of ICP-based variants enhance the robustness of the ICP algorithm [14], such as TRICP [15] and GOICP [16]. Ji et al. [17] proposed a hybrid algorithm combining the genetic algorithm and the ICP algorithm. Meng [18] combined a kd-tree and extrapolation to improve the speed and accuracy of the ICP algorithm. Zhu et al. [19] deployed an improved iterative closest point (ICP) algorithm in which an adaptive threshold was introduced to accelerate the iterative convergence. Although the classical ICP registration problem has received extensive attention and improvement, most current methods rarely consider the overlapping region between point clouds. The overlap area of a point cloud refers to the area with the same structure between the data of two point clouds. Li et al. [20] analyzed the geometric structural features of point clouds for segmentation, created multidimensional shape descriptors for each block, and used these descriptors to identify overlapping regions between the point clouds. However, this approach can lead to false matches when there are multiple similar structures. In addition, when implementing a registration algorithm, if the point pairs matched by features are not selected from the overlapping region of the two point clouds, many incorrect corresponding point pairs will be generated, reducing the accuracy of the matching. When the overlap rate is lower than 60%, registration becomes more difficult, and point clouds with low overlap are often encountered in practical applications [21,22]. On the one hand, problems such as occlusion may be encountered during 3D scanning, so a high overlap rate cannot be guaranteed for every scan; on the other hand, given the relatively high cost of point cloud data acquisition, using point clouds with low overlap rates can effectively reduce the number of point cloud fragments that need to be processed, thus reducing the overall workload of the registration.
In our experiments, we found that the bullet, whose distinguishing features are fine bullet marks, poses a greater challenge in the low-overlap registration process. Compared with traditional images and depth images, the features of bullet point clouds are more difficult to extract, so it is difficult for existing point cloud registration algorithms to register bullet point clouds accurately. Especially when the overlapping areas are small, these algorithms suffer from poor accuracy, fall into local optima, or even fail to register and produce mismatches. To address the shortcomings of the existing registration algorithms, this paper proposes a low-overlap bullet point cloud registration method based on line feature detection. By fitting the central axis, a good initial position for the two bullet point clouds with large pose differences is found, which simplifies the search space of the subsequent rotation parameters. Then, the linear features of the bullet are extracted using a discrete Hough transform based on icosahedral fitting. Finally, an improved Cuckoo algorithm is used to find the optimal rotation angle, reducing the complexity of the matching algorithm and achieving low-overlap bullet point cloud registration.
The paper is organized as follows. Section 1 describes the research background and related work. Section 2 is devoted to our methodology: Section 2.1 introduces automatic cyclic voxel downsampling, Section 2.2 explains the principle of fitting the bullet's central axis, Section 2.3 introduces the use of the discrete Hough transform to extract the linear features of the bullet marks, Section 2.4 searches for the optimal rotation angle in the parameter space using the improved Cuckoo algorithm to achieve registration of bullet point clouds with low overlap rates, and Section 2.5 analyzes the computational complexity of the algorithm. In Section 3, we apply the algorithm to noisy bullet point cloud models and to actual collected bullet point cloud data, and compare the results with those of the ICP, TRICP, and GOICP algorithms; the computational error of the proposed algorithm is close to zero, which demonstrates its validity. Suggestions for possible further improvements of the algorithm are presented in Section 4. Finally, Section 5 summarizes the methodology and experimental results of this paper.

2. Bullet Low-Overlap Point Cloud Registration Based on Line Feature Detection

Generally speaking, registration seeks a transformation that maps the target point cloud onto the source point cloud in 3D space. Unlike the methods used by Hörner [23] or Pomerleau et al. [24], the algorithm in this paper does not require complex computation to generate and match local descriptors for key points during scanning. Instead, as shown in Figure 1, a good initial position is determined by coarse registration of the central axes, and then the optimal rotation angle is found with the improved Cuckoo algorithm. The core of the registration algorithm is to reduce the search space by finding a good initial position and then optimizing the transformation parameters, rather than searching the whole parameter space for the optimal transformation.

2.1. Voxelization Downsampling

Due to the large number of points on the surface of the captured warhead, the running time would be greatly increased, and the presence of noise reduces the data quality. Point cloud filtering can solve these problems well [25]: the filtered point cloud describes the warhead with fewer points than the original collection, as shown in Figure 2. Based on our investigation, bullet marks can be regarded as linear features [1]. To describe the same linear bullet mark, the original dense point cloud requires 10 points and contains noise points, while the filtered point cloud requires only 3 points to describe the linear feature.
In order to reduce the size of the point cloud, this paper uses voxel filtering to create a three-dimensional voxel grid and filters the point cloud to a uniform resolution of 0.003 mm. Automatic cyclic voxel filtering is adopted: the maximum and minimum values of the x-, y-, and z-axes of the input point cloud are calculated, a three-dimensional bounding box is established from these values, and the bounding box is divided into small cubes of the specified voxel size. Since the actually acquired bullet point cloud data are usually on the order of millions of points, reasonable filtering of the dense point cloud helps to improve the registration efficiency.
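As a minimal illustration of this voxel-grid filtering (not the exact implementation used in the paper; the centroid-per-voxel reduction, the nearest-neighbor resolution estimate, and the voxel-growth factor are assumptions of the sketch), the following Python code downsamples a point cloud and repeats the filtering until a target resolution is reached:

import numpy as np
from scipy.spatial import cKDTree

def voxel_downsample(points, voxel_size):
    """Replace all points falling into the same voxel by their centroid."""
    origin = points.min(axis=0)                       # corner of the bounding box
    idx = np.floor((points - origin) / voxel_size).astype(np.int64)
    _, inverse = np.unique(idx, axis=0, return_inverse=True)
    sums = np.zeros((inverse.max() + 1, 3))
    counts = np.zeros(inverse.max() + 1)
    np.add.at(sums, inverse, points)                  # accumulate points per voxel
    np.add.at(counts, inverse, 1)
    return sums / counts[:, None]

def cyclic_voxel_downsample(points, target_resolution, voxel_size, max_iters=5):
    """Repeat voxel filtering until the mean point spacing reaches the target."""
    for _ in range(max_iters):
        points = voxel_downsample(points, voxel_size)
        d, _ = cKDTree(points).query(points, k=2)     # distance to nearest neighbor
        if d[:, 1].mean() >= target_resolution:
            break
        voxel_size *= 1.5                             # enlarge the voxel and filter again
    return points

For the clouds listed in Table 1, two such filtering passes were enough to bring the source and target clouds to approximately the same resolution.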

2.2. Fitting the Registration Central Axis

Fitting the central axis is used to shorten the calculation time of the subsequent registration. By fitting the central axis of two sets of bullet point cloud data, the distance between two points in space can be effectively reduced. In this paper, a pre-registration method based on the geometric features of the warhead is proposed. Two bullet point clouds with large position differences in space are registered on the same central axis. The algorithm for fitting the central axis is as follows (see Algorithm 1).
Algorithm 1 Fit the registration central axis
Input: Point cloud data P1 and P2 of two bullets
Output: Pre-registered point clouds P1′ and P2′ along the same central axis
1:  Compute ymin and ymax for point cloud P1
2:  Divide the y-axis range [ymin, ymax] into n layers, each with height Δy = (ymax − ymin)/n
3:  for each layer k do
4:      layerk ← [ymin + (k − 1) × Δy, ymin + k × Δy]
5:      (xi, yi, zi) ← {(x, y, z) | ymin + (k − 1) × Δy ≤ y ≤ ymin + k × Δy}
6:  end for
7:  for each layer k do
8:      Randomly select four non-coplanar points: (x1, y1, z1), (x2, y2, z2), (x3, y3, z3), (x4, y4, z4)
9:      Verify that the four points are not coplanar
10:     Solve the circumsphere equation for the sphere center (h, l, k) using SVD
11:     Collect all sphere centers to form point set S
12: end for
13: Remove invalid points from S that exceed the point cloud’s y-axis range to obtain filtered point set S′
14: Center the sphere centers and apply Principal Component Analysis (PCA) to fit the central axis line L of P1
15: Fit the central axis line of P2 similarly to obtain line L′
16: Pre-register P1 and P2 by aligning their central axis lines L and L′
17: Output the pre-registered point clouds P1′ and P2′
18: end
  • Treat bullet point cloud data as a set of points P = {P1, P2, …, Pm}. Each of these points Pi is a three-dimensional coordinate (xi, yi, zi). First, compute the maximum and minimum values of the Y-axis, stratifying these points in the Y-axis direction, as follows:
ymin = min{ yi : i = 1, …, m }
ymax = max{ yi : i = 1, …, m }
Based on ymin and ymax, the whole range is divided into n layers. The height range of each layer Δy is as follows:
Δy = (ymax − ymin)/n
For each layer k, define its height range on the Y-axis as [ymin + (k − 1)Δy, ymin + kΔy], where k = 1, 2, …, n. Each layer contains points that meet the following conditions:
layerk = { Pi | ymin + (k − 1)·Δy ≤ yi ≤ ymin + k·Δy }
2.
For each layer k, randomly select four points, denoted (x1, y1, z1), (x2, y2, z2), (x3, y3, z3), and (x4, y4, z4), and fit the circumscribed sphere of the layer, whose center is denoted (h, l, k), as shown in Figure 3. First, verify that the four points are not coplanar; if they are coplanar, discard one of the coplanar points, select another point, and verify again. Then, substitute the four non-coplanar points into the equation of the circumscribed sphere to obtain the following four equations:
(x1 − h)² + (y1 − k)² + (z1 − l)² = r²
(x2 − h)² + (y2 − k)² + (z2 − l)² = r²
(x3 − h)² + (y3 − k)² + (z3 − l)² = r²
(x4 − h)² + (y4 − k)² + (z4 − l)² = r²
By constructing a linear system from these equations, SVD is applied to find the best solution in the least-squares sense, that is, the sphere center (h, l, k); the centers obtained from all layers form the coordinate point set X.
3.
From all of the obtained sphere center coordinates X, the invalid points whose y-coordinates exceed the y-axis range of the point cloud are eliminated to obtain the point set X″.
4.
The point set X″ is an n × 3 matrix, where n is the number of points. Calculate the central point p0 of the point set, that is, the average of the coordinates of all points in each dimension, as follows:
p0 = (1/n) · Σi=1..n xi
Centralize the coordinates of all points to form a new set of points X′, such that each point is x’ = xi − p0. Find a direction vector v that maximizes the projection variance of the point set X’ in that direction. This is achieved by solving the eigenvalues and eigenvectors of the covariance matrix C, as follows:
C = (1/(n − 1)) · X′ᵀ X′
After finding the first principal component direction v1, take a series of points along that direction, calculate the projection ti = xi · v1 of each point after centralization in the direction v1, and then evenly choose a series of t values between the largest and smallest ti, which corresponds to the point L(t) = p0 + tv1 on the line.
5.
A good coarse registration position of the two low-overlap point clouds on a common axis is obtained by aligning the central axes of the two point clouds (a minimal code sketch of these steps follows).
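As a minimal Python sketch of Algorithm 1 and the steps above (the number of layers, the single random four-point draw per layer, and the rank test used to reject coplanar picks are illustrative assumptions rather than choices stated in the paper), the code below slices the cloud along the y-axis, recovers one circumscribed sphere center per layer with an SVD-based least-squares solve, and fits the central axis through those centers by PCA:

import numpy as np

def circumsphere_center(p):
    """Center of the sphere through four non-coplanar points p of shape (4, 3).
    Subtracting the sphere equation of p[0] from those of p[1..3] gives a linear
    system; lstsq (SVD-based) returns its least-squares solution."""
    A = 2.0 * (p[1:] - p[0])
    b = (p[1:] ** 2).sum(axis=1) - (p[0] ** 2).sum()
    center, *_ = np.linalg.lstsq(A, b, rcond=None)
    return center

def fit_central_axis(points, n_layers=20, seed=0):
    """Slice along y, collect per-layer sphere centers, and fit the axis by PCA.
    Returns a point on the axis (p0) and the unit direction vector (v1)."""
    y = points[:, 1]
    edges = np.linspace(y.min(), y.max(), n_layers + 1)
    rng = np.random.default_rng(seed)
    centers = []
    for k in range(n_layers):
        layer = points[(y >= edges[k]) & (y <= edges[k + 1])]
        if len(layer) < 4:
            continue
        p = layer[rng.choice(len(layer), 4, replace=False)]
        if np.linalg.matrix_rank(p[1:] - p[0], tol=1e-9) < 3:
            continue                                   # reject coplanar picks
        c = circumsphere_center(p)
        if y.min() <= c[1] <= y.max():                 # drop invalid centers
            centers.append(c)
    S = np.asarray(centers)
    p0 = S.mean(axis=0)                                # centroid of the sphere centers
    _, _, vt = np.linalg.svd(S - p0, full_matrices=False)
    return p0, vt[0]                                   # first principal component = axis

Translating and rotating P2 so that its fitted axis coincides with that of P1 then yields the pre-registered clouds P1′ and P2′ used in the following steps.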

2.3. Extraction of Linear Features by the Discrete Hough Transform

Many existing registration algorithms, such as the ICP algorithm, place high demands on the initial position of the point clouds, and many incorrect feature point matches can occur at small overlap rates; both issues can produce excessive errors in the results. This paper addresses them in the following ways: (1) avoiding key point detection and treating the “line shapes” in the data as the features to be matched; (2) using a technique whose results do not depend on the initial configuration and that is usually not affected by locally spurious matches between features, because the algorithm takes into account the global relative positions of the detected features; (3) performing a geometrically constrained registration search using an idea similar to that in [26]. Drawing on this idea, we filter the entire set of points using the strongly linear character of the bullet marks, discard widely varying correspondences, and extract the marks as linear features. The detected bullet marks are shown in Figure 4.

2.3.1. Sharp Feature Detection

For each point q in the bullet point cloud after voxel downsampling, point q is assigned to the set GA or GB, and a value δ(q) is defined. The calculation formula is as follows [27]:
δ(q) = λ0 / (λ1 + λ2 + λ3)
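The section gives only the formula, so the following sketch shows one plausible way to compute δ(q), reading λ0 as the smallest eigenvalue of the covariance matrix of q's k-nearest-neighbor set and the denominator as the sum of the eigenvalues (this coincides with the surface-variation measure of [27]); the neighborhood size k and the sharpness threshold are assumptions, not values taken from the paper:

import numpy as np
from scipy.spatial import cKDTree

def sharpness(points, k=30):
    """delta(q) per point: smallest eigenvalue of the local covariance matrix
    divided by the sum of its eigenvalues (larger values = sharper features)."""
    _, nbr = cKDTree(points).query(points, k=k)   # indices of the k nearest neighbors
    delta = np.empty(len(points))
    for i, idx in enumerate(nbr):
        cov = np.cov(points[idx], rowvar=False)   # 3x3 covariance of the neighborhood
        lam = np.linalg.eigvalsh(cov)             # eigenvalues in ascending order
        delta[i] = lam[0] / lam.sum()
    return delta

# Hypothetical usage: keep points whose sharpness exceeds a chosen threshold.
# sharp_points = points[sharpness(points) > 0.05]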

2.3.2. Line Feature Detection

  • After identifying the subsets GA′ ⊂ GA and GB′ ⊂ GB that contain sharp feature points, we continue to look for linear features within these subsets.
  • The line features are found by discretizing the Hough parameter space into a regular icosahedron. This step uses the method in [28].
  • Use an improved version of the Hough transform algorithm, applied iteratively and corrected with least-squares line fitting to improve the accuracy (a simplified sketch of the voting step follows this list).
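As a simplified sketch of the voting step described in this list (the direction set is restricted to the six axes defined by icosahedron vertices and the accumulator is a coarse 2D histogram per direction, both far cruder than the iteratively refined, least-squares-corrected method of [28]), the code below scores candidate line directions for a set of sharp feature points and returns the best-voted direction and line anchor:

import numpy as np

def icosahedron_directions():
    """Six unit directions taken from icosahedron vertices (opposite vertices
    describe the same undirected line direction, so one of each pair suffices)."""
    phi = (1 + np.sqrt(5)) / 2
    verts = np.array([[0, 1, phi], [0, 1, -phi], [1, phi, 0],
                      [1, -phi, 0], [phi, 0, 1], [-phi, 0, 1]], dtype=float)
    return verts / np.linalg.norm(verts, axis=1, keepdims=True)

def hough_line_3d(points, n_bins=64):
    """Coarse discrete Hough transform for the dominant 3D line.
    points: (N, 3) array of feature points. Returns (direction, anchor)."""
    r = np.linalg.norm(points, axis=1).max()
    best = (None, None, -1.0)
    for b in icosahedron_directions():
        # Orthonormal basis (u, v) of the plane perpendicular to direction b.
        u = np.cross(b, [1.0, 0.0, 0.0])
        if np.linalg.norm(u) < 1e-6:
            u = np.cross(b, [0.0, 1.0, 0.0])
        u /= np.linalg.norm(u)
        v = np.cross(b, u)
        # Every point votes for the in-plane coordinates of its line anchor.
        xu, xv = points @ u, points @ v
        hist, ue, ve = np.histogram2d(xu, xv, bins=n_bins, range=[[-r, r], [-r, r]])
        i, j = np.unravel_index(hist.argmax(), hist.shape)
        if hist[i, j] > best[2]:
            anchor = 0.5 * (ue[i] + ue[i + 1]) * u + 0.5 * (ve[j] + ve[j + 1]) * v
            best = (b, anchor, hist[i, j])
    return best[0], best[1]

In the full method of [28], the winning cell is then refined iteratively with least-squares line fitting, as the last item above notes.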

2.4. Improved Cuckoo Algorithm to Find the Best Rotation Angle

Point cloud registration methods usually find local descriptors that match in two global scans for the registration. In this paper, a good initial position is found by fitting the registration axis, and the parameter space is simplified to reduce the computational complexity. After fitting the central axis of registration, it is sufficient to consider only the rotation of the central axis [29]. In order to determine the rotation relationship between the two point clouds, the Improved Cuckoo Search (ICS) algorithm [30] was adopted in this paper to find the optimal rotation angle in the parameter space to achieve the purpose of fine registration (see Algorithm 2).
Algorithm 2 Improved Cuckoo Search Algorithm for Bullet Point Cloud Registration
Input: Point clouds P1 and P2 of two bullets
       Population size n ← 20
       Maximum iterations MaxIter ← 30
Output: Optimal rotation matrix for aligning P1 to P2
1:  Initialize the population (nests) with random solutions
2:  Evaluate the fitness of each nest
3:  Identify the current best nest with the highest fitness
4:  for iter ← 1 to MaxIter do
5:      for each nest i do
6:          Randomly select three nests xi, xj, xk from the population
7:          xnew ← xi + r × (xj − xk), with r a random number in [0, 1] and α ← 0.01
8:          Apply a Lévy flight:
            σu ← (Γ(1 + β) × sin(π × β/2) / (Γ((1 + β)/2) × β × 2^((β − 1)/2)))^(1/β)
            u ← normal(0, σu²), v ← normal(0, σv²), step ← u/|v|^(1/β)
            xnew ← xnew + step × (xnew − best_nest)
9:          if fitness(xnew) > fitness(xi) then
10:             xi ← xnew
11:         end if
12:     end for
13:     for each nest i do
14:         With probability pa, discard nest i and generate a new nest:
            xnew ← xr1 + r × (xr2 − xr3), where xr1, xr2, xr3 are randomly chosen nests and r is a random number in [0, 1]
15:     end for
16:     Evaluate the fitness of all nests
17:     Identify the current best nest with the highest fitness
18: end for
19: Output the optimal rotation matrix from the best nest
20: end
The Cuckoo Search Algorithm is a heuristic optimization algorithm that mimics the brood-parasitic breeding behavior of cuckoos. The core of the improved Cuckoo Search Algorithm is to increase population diversity by exchanging information among different members of the group, and the periodic, oscillatory characteristics of the sine and cosine functions are used to improve convergence to the global optimum. The improvements mainly include the following aspects:
  • Parasitic breeding: Cuckoos lay their eggs in the nests of other birds; in the algorithm, this is simulated as finding new solutions in the solution space. First, three individuals are randomly selected from the entire nest population x and updated as follows:
b = 2 − 2t/maxiter
xi^(t+1) = xq3^t + b·sin(2πr)·α0·Lévy·(best − xq1^t),  r < 0.5
xi^(t+1) = xq1^t + b·cos(2πr)·(best − α0·Lévy·xq2^t),  r ≥ 0.5
where xq1^t, xq2^t, and xq3^t are the three randomly selected nests; α0 is a constant, usually set to 0.01; Lévy denotes the Lévy flight step; best is the current optimal nest location; maxiter is the maximum number of iterations; and r ∈ [0, 1] is a random number.
2.
Egg discovery and discarding: If the host bird discovers a foreign egg, it discards the nest with probability pa ∈ [0, 1], and the nest is then rebuilt in the following way:
xi^(t+1) = xi^t + r1·(xg^t − xk^t) + r2·(best − xi^t)
where xg^t and xk^t are two randomly selected nest locations in the t-th generation population.
3.
Lévy flight: The cuckoo's movement follows a Lévy flight, a random walk consisting of many small steps interspersed with occasional long jumps. Although this behavior helps the search escape local optima, the search accuracy is largely determined by the small-step phase [30]. To address this, the improved algorithm uses a formula that varies with the iteration number, so that the search range is wide in the early iterations and narrows in the later iterations, finally searching around the current optimum. The calculation formula is as follows:
Lévy = φ · μ / |ν|^(1/β)
φ = [ Γ(1 + β) · sin(π·β/2) / ( Γ((1 + β)/2) · β · 2^((β − 1)/2) ) ]^(1/β)
A = 2b·r1 − b
C = 2·r2
xi^(t+1) = best − A·(C·best − xi^t)
where Γ is the gamma function, μ and ν are independent random variables following a normal distribution with a mean of 0 and a variance of 1, β ∈ [1, 2] is a constant, and r1, r2 ∈ [0, 1] are random numbers. In the ideal case, the rotation transformation between the two bullet point clouds is obtained by the improved Cuckoo algorithm and the solving error of the transformation is 0 [31]. In practice, due to measurement errors and other factors, the distance between corresponding features cannot reach the ideal value through iteration, so the problem of seeking the best rotation angle becomes the following. To minimize the Euclidean distance between the target point cloud P = {P1, P2, …, Pm} and the source point cloud Q = {Q1, Q2, …, Qn}, the transformation parameters are first encoded and normalized; these represent the nest positions in the Cuckoo Search Algorithm. Then, the Cuckoo Search Algorithm is used to optimize the objective function of the point cloud model. The Cuckoo global optimization function based on pattern search and chemotactic behavior is as follows:
F(θ) = min Σi ‖Rv(θ)·Pi − Qi‖²
The parameters of the algorithm in this paper are set as follows: the population size is 20, the rotation angle range is [0°, 360°], pa = 0.25, and the maximum number of iterations is 30 (see Algorithm 2); the optimal rotation angle is obtained by running the algorithm independently and completely 30 times.
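To make the search concrete, the following is a compact sketch of a cuckoo-style search over a single rotation angle about the fitted central axis (taken here to be the y-axis after pre-registration, which is an assumption of this sketch); it uses Mantegna's Lévy step and the random three-nest perturbation described above, with the negative mean nearest-neighbor distance as the fitness, but omits the sine-cosine update and other refinements of the full ICS algorithm:

import numpy as np
from math import gamma, pi, sin
from scipy.spatial import cKDTree

def rot_y(theta):
    """Rotation matrix about the y-axis (assumed to be the fitted central axis)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def levy_step(beta=1.5):
    """Mantegna's algorithm for a Levy-distributed step length."""
    sigma_u = (gamma(1 + beta) * sin(pi * beta / 2)
               / (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u, v = np.random.normal(0.0, sigma_u), np.random.normal(0.0, 1.0)
    return u / abs(v) ** (1 / beta)

def cuckoo_rotation(P, Q, n_nests=20, max_iter=30, pa=0.25, alpha0=0.01):
    """Search the rotation angle about y that best aligns P to Q."""
    tree = cKDTree(Q)
    fitness = lambda t: -tree.query(P @ rot_y(t).T)[0].mean()  # higher is better
    nests = np.random.uniform(0.0, 2 * np.pi, n_nests)
    fit = np.array([fitness(t) for t in nests])
    for _ in range(max_iter):
        best = nests[fit.argmax()]
        for i in range(n_nests):
            q1, q2, q3 = np.random.choice(n_nests, 3, replace=False)
            new = nests[q1] + np.random.rand() * (nests[q2] - nests[q3])
            new = (new + alpha0 * levy_step() * (new - best)) % (2 * np.pi)
            fn = fitness(new)
            if fn > fit[i]:                    # greedy replacement of worse nests
                nests[i], fit[i] = new, fn
        for i in range(n_nests):               # abandon a fraction pa of the nests
            if np.random.rand() < pa:
                r1, r2 = np.random.choice(n_nests, 2, replace=False)
                nests[i] = (nests[i] + np.random.rand() * (nests[r1] - nests[r2])) % (2 * np.pi)
                fit[i] = fitness(nests[i])
    theta = nests[fit.argmax()]
    return rot_y(theta), theta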

2.5. Algorithm Complexity Analysis

This algorithm mainly includes the following four steps: voxel downsampling, fitting the central axis, extracting line features with the discrete Hough transform, and finding the optimal rotation angle with the improved Cuckoo algorithm to complete the matching. The total computational complexity is O(m + k³ + n + T·m), where m is the number of points in the point cloud, k is the number of voxels along each axis, n is the population size (set to 20), and T is the maximum number of iterations. Since k³ ≪ m and n ≪ m, the overall complexity is dominated by O(T·m). The ICP algorithm consists of initializing the alignment, iterating the nearest-point matching, calculating and applying the transformation matrix, and iterating until convergence, with a computational complexity of O(T·(m·log m + m)) = O(T·m·log m). Similarly, the computational complexities of TRICP and GOICP are O(T·(m·log m + m)) and O(T·(m·log m + m) + b^d), respectively, where the GOICP branch-and-bound uses a branching factor b = 3 and a tree depth d = 10. The algorithm in this paper therefore has some advantage in terms of complexity.

3. Experimental Results

In order to verify the performance of the proposed algorithm, the point cloud data of the three-dimensional morphology of different types of warheads were obtained, respectively. The actual bullet point cloud collected on the automatic zoom three-dimensional surface measuring instrument and the bullet point cloud obtained from the CAD model were taken as the experimental data. The proposed algorithms, ICP, TRICP, and GOICP, were used for the registration, and the experimental results were analyzed. The experimental platform was the Intel(R) Core (TM) i5-8250U CPU @ 1.60 GHz 1.80 GHz processor (Intel, Santa Clara, CA, USA), with 8 GB of memory, using Windows 10, a 64-bit operating system, and a Matlab 2022b coding environment.
The initial position of the bullet point cloud sampled by the CAD model is shown in Figure 5, where the 9 mm_source is the source point cloud, represented in green, and the 9 mm_target is the target point cloud, represented in blue.
The algorithm proposed in this paper can simplify the dense point cloud to the specified resolution. Table 1 shows the point cloud filtering results when the target size is specified as 0.003 mm. After two voxel filtering operations, the point cloud can be automatically adjusted to the specified size, and the resolution of the source point cloud is basically the same as that of the target cloud, with an error of about 1.4%.
The registration results of the bullet point clouds under different overlap rates are shown in Table 2. Five bullet point clouds were generated from the CAD model of the bullet by simulating the three-dimensional topography measurement from five angles, as shown in Figure 6, with different overlap rates among the point clouds. The overlap rate is defined by comparing the local densities of points in the two aligned point clouds to assess their degree of spatial overlap: the local density of each point within a given search radius is calculated, the density ratio of each corresponding pair of points is compared, and two points are considered to overlap when their local densities are similar. The ratio of the number of overlapping points to the total number of points in the point cloud is defined as the overlap rate.
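The following is a minimal sketch of this overlap-rate definition (the search radius and the density-ratio tolerance are illustrative assumptions; the paper does not state its exact values): a point counts as overlapping when its local density and that of its nearest counterpart in the other cloud agree within the tolerance.

import numpy as np
from scipy.spatial import cKDTree

def overlap_rate(A, B, radius=0.05, tol=0.5):
    """Fraction of points of A whose local density matches that of their nearest
    counterpart in B within a relative tolerance."""
    tree_a, tree_b = cKDTree(A), cKDTree(B)
    # Local density = number of neighbors within the search radius.
    dens_a = np.array([len(n) for n in tree_a.query_ball_point(A, radius)])
    dens_b = np.array([len(n) for n in tree_b.query_ball_point(B, radius)])
    _, nearest = tree_b.query(A)              # counterpart of each point of A in B
    ratio = dens_a / np.maximum(dens_b[nearest], 1)
    return float(np.mean(np.abs(ratio - 1.0) <= tol))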
Part of the CAD bullet point cloud obtained from the above five different angles was used for the registration experiments, and the proposed algorithm was compared with the ICP, TRICP, and GOICP algorithms. The ICP algorithm is a classical registration algorithm that is easy to implement and has a high efficiency, but is easily affected by the initial position. The TRICP algorithm provides a more robust registration method by improving the ability to deal with outliers and partial overlaps through a pruning mechanism; the GOICP algorithm is committed to solving the local optimal problem of the ICP algorithm and providing the global optimal solution, but the calculation cost is high. The experimental results are shown in Table 2. It can be seen that the proposed algorithm has a good registration effect when the overlap rate is greater than 20%, while the ICP algorithm fails when the overlap rate is lower than 40%.
At the same time, we use the mean distance error, nearest neighbor distance error, Hausdorff distance, and root mean square error of the overlapping regions as the registration errors, where a smaller registration error indicates a better registration effect. It can be seen from Table 3, Table 4, Table 5 and Table 6 that the registration algorithm proposed in this paper has a high speed and high registration accuracy. When the overlap rate is large, the registration accuracies of these algorithms are similar. As the overlap rate decreases, as long as it is not less than 20%, the proposed algorithm achieves a higher registration accuracy than the ICP, TRICP, and GOICP algorithms.
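For reference, the following is one plausible reading of these four metrics for an aligned cloud P and its reference Q (the paper does not spell out the correspondence rule, so nearest-neighbor pairing within the overlapping regions is assumed here):

import numpy as np
from scipy.spatial import cKDTree

def registration_errors(P, Q):
    """Registration error metrics between aligned cloud P and reference cloud Q,
    with correspondences taken as nearest neighbors."""
    d_pq, _ = cKDTree(Q).query(P)             # each point of P to its closest point of Q
    d_qp, _ = cKDTree(P).query(Q)             # and the reverse direction
    return {
        "MDE":  0.5 * (d_pq.mean() + d_qp.mean()),   # symmetric mean distance error
        "NNDE": d_pq.mean(),                         # one-sided nearest-neighbor error
        "HD":   max(d_pq.max(), d_qp.max()),         # Hausdorff distance
        "RMS":  np.sqrt((d_pq ** 2).mean()),         # root mean square error
    }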
Figure 7 shows the comparison of the mean distance error, nearest neighbor distance error, Hausdorff distance, and root mean square error curves of the four algorithms. The smaller the registration error, the better the registration effect. As can be seen from Figure 7, when the overlap rate is not less than 20%, the four error indicators of the proposed algorithm are far lower than those of the ICP algorithm and its variants, indicating that the registration accuracy of the proposed algorithm is superior to that of the other three algorithms.
As can be seen from the error curves of the ICP algorithm and its variants (RMS, NNED, HD, and MED) at different overlap rates of the bullet point cloud in Figure 7, it is difficult for these algorithms to find an accurate match, because the geometric structure and complexity of the overlapping regions of the point clouds differ; when the overlapping regions contain even smaller and more complex geometric features, it is still harder for the ICP, TRICP, and GOICP algorithms to match them accurately, which results in wave-like fluctuations of the error values. In contrast, the algorithm in this paper has a stable error value close to zero when the overlap rate is not less than 20%, which indicates that it handles the present point cloud data very effectively and can accurately find the best registration transformation.
According to the actual project requirements, an automatic zoom microscope was used in this paper to collect the bullet data, as shown in Figure 8. The autozoom microscope combines the function of a surface roughness measuring instrument with that of a topography measuring instrument based on focus variation technology. The focus variation technique combines the small depth of focus of an optical system with vertical scanning to provide topography and color information from the change in focus. The main part of the system is a precision optical device containing a variety of lens systems that can be equipped with different objectives, thus achieving measurements at different resolutions. The precision optics move vertically along the optical axis while continuously capturing data from the surface. This means that each area of the object can be brought into sharp focus for imaging, and the acquired sensor data are subsequently converted into 3D point cloud data. The test object of this experiment is a bullet, and two point clouds of the tested object under different viewing angles are obtained. The overlapping parts of the point clouds from the two viewing angles can be extracted pairwise using the point cloud overlapping-part extraction algorithm. The details are shown in Figure 9 and Figure 10.
An automatic zoom microscope was used to carry out morphological measurements of the marks on 9 mm pistol bullets. In this paper, the key parts of the bullet marks were selected for acquisition, since they can be captured with higher precision and are more conducive to achieving high-precision registration. The results are shown in Figure 9.
The two extracted pairs of point cloud data were registered pairwise in sequence, as shown in Figure 9. During the self-collection process, we separately acquired the topography of the overlapping regions of the two point clouds and used it as the ground truth for the registration of the overlapping region (Figure 10 and Figure 11). After the registration, we calculated the overlapping areas of the two point clouds pairwise, using the mean distance error, nearest neighbor distance error, and Hausdorff distance as registration errors to evaluate the registration accuracy of the overlapping regions. As can be seen from Table 7, the registration accuracy is high; as can be seen from the effect diagram in Figure 12, the point cloud stitching effect after registration by the proposed algorithm is remarkable, preserving the integrity of the original data while enhancing the overall shape recognition of the model, greatly reducing errors, providing reliable data fusion, and laying a solid foundation for subsequent analysis and application.

4. Discussions

In this section, we discuss the drawbacks of our approach and possible solutions. We believe that improving on these drawbacks will lead to more stable and robust results.

4.1. Initial Position Dependency

This algorithm relies on an initial alignment position determined by stratifying the point cloud along the y-axis and fitting sphere centers. If the initial position is estimated inaccurately, multiple local optima may exist in the search space. This dependency is especially noticeable in complex scenes or when the quality of the point cloud data is poor. To mitigate it, a multiple random initialization strategy can be introduced to reduce the effect of the initial position by running the algorithm several times and taking the best result. In addition, swarm intelligence algorithms such as Particle Swarm Optimization (PSO) or Differential Evolution (DE) can be used to optimize the initial positions and provide better starting solutions.

4.2. Sensitivity of Parameter Selection

The method's performance is sensitive to some parameters (e.g., the resolution of the voxel filtering and the population size and number of iterations of the ICS algorithm). Different parameter choices may lead to significant variations in computational complexity and alignment accuracy, making it difficult to maintain consistent performance in all scenes. To address this, adaptive parameter tuning methods such as the Genetic Algorithm (GA) or Bayesian Optimization (BO) can be used to automatically select the optimal parameters for specific data. Identifying the key parameters and performing targeted optimization also helps to improve the robustness of the algorithm.

4.3. Computing Resource Consumption

Although the computational complexity of the algorithms is not very high, the consumption of computational resources (e.g., memory and processor resources) is still a challenge when dealing with extremely large-scale point cloud data. Especially in real-time applications, computational resource consumption may become a bottleneck. To solve this problem, parallel computing techniques can be utilized to improve the computational efficiency by distributing computational tasks to be executed on multi-core processors. Parallelization is achieved using techniques such as OpenMP (https://www.openmp.org/) or CUDA (https://developer.nvidia.com/cuda-toolkit, accessed on 4 July 2024), as well as the use of distributed computing frameworks (e.g., Apache Spark (https://spark.apache.org/) or Hadoop (https://hadoop.apache.org/)) for the distributed processing of large-scale point cloud data on multiple machines to further increase the processing power.

5. Conclusions

Aiming at the problems of the large amount of point cloud data, long measurement times, and the low efficiency of stitching and registration algorithms, a low-overlap bullet point cloud registration algorithm based on line feature detection was proposed. The proposed registration method can accurately register bullet point clouds of different resolutions with a low overlap rate. The algorithm eliminates the noise points of the point cloud and realizes the registration of point clouds at different resolutions. The CAD bullet point cloud registration experiment and the self-collected bullet point cloud experiment verified that the proposed algorithm can effectively perform low-overlap point cloud registration. At the same time, compared with the ICP, TRICP, and GOICP algorithms, the registration error of the proposed algorithm is more than 95% lower than that of the ICP algorithm and its variants when the overlap rate is not less than 20%.

Author Contributions

Conceptualization, Q.Z.; methodology, Q.Z.; software, Q.Z.; validation, Q.Z., R.H. and Y.L.; formal analysis, Q.Z. and H.W.; investigation, Q.Z. and R.H.; resources, Y.L. and H.W.; data curation, Q.Z.; writing—original draft preparation, Q.Z. and Z.M.; writing—review and editing, X.H. and Z.W.; visualization, Z.M. and X.H.; supervision, Z.M. and Z.W.; project administration, X.H. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The original contributions presented in the study are included in the article, further inquiries can be directed to the corresponding author.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Fan, Y.; Mingkai, L. Application of identification of bullet marks in case investigation. Sci. Technol. 2015, 25, 227–228. [Google Scholar]
  2. Thompson, W.C.; Vuille, J.; Taroni, F.; Bidermann, A. After uniqueness: The evolution of forensic science opinions. Judicature 2018, 102, 18. [Google Scholar]
  3. Suapang, P.; Yimmun, S.; Chumnan, N. Tool and Firearm Identification System Based on Image Processing. In Proceedings of the 11th International Conference on Control, Automation and Systems (ICCAS)/Robot World Conference, Gyeonggi-do, Republic of Korea, 26–29 October 2011; pp. 178–182. [Google Scholar]
  4. Yanlin, T.; Xiaosen, L.; Yiteng, Z. Study on Automatic Identification Technology of Three-dimensional Bullet Marks. J. Yunnan Police Coll. 2021, 3, 122–128. [Google Scholar]
  5. Song, J.; Vorburger, T.V.; Ballou, S.; Thompson, R.M.; Yen, J.; Renegar, T.B.; Zheng, A.; Silver, R.M.; Ols, M. The national ballistics imaging comparison (NBIC) project. Forensic Sci. Int. 2012, 216, 168–182. [Google Scholar] [CrossRef] [PubMed]
  6. Bachrach, B. Development of a 3D-based automated firearms evidence comparison system. J. Forensic Sci. 2002, 47, 1253–1264. [Google Scholar] [CrossRef] [PubMed]
  7. Leon, F.P.; Beyerer, J. Automatic comparison of striation information on firearm bullets. In Proceedings of the Intelligent robots and computer vision XVIII: Algorithms, Techniques, and Active Vision, Boston, MA, USA, 20–21 September 1999; pp. 266–277. [Google Scholar]
  8. Brinck, T.B. Comparing the Performance of IBIS and BulletTRAX-3D Technology Using Bullets Fired through 10 Consecutively Rifled Barrels. J. Forensic Sci. 2008, 53, 677–682. [Google Scholar] [CrossRef]
  9. Senin, N.; Groppetti, R.; Garofano, L.; Fratini, P.; Pierni, M. Three-dimensional surface topography acquisition and analysis for firearm identification. J. Forensic Sci. 2006, 51, 282–295. [Google Scholar] [CrossRef] [PubMed]
  10. Sakarya, U.; Leloğlu, U.M.; Tunali, E. Three-dimensional surface reconstruction for cartridge cases using photometric stereo. Forensic Sci. Int. 2008, 175, 209–217. [Google Scholar] [CrossRef] [PubMed]
  11. Qiao, P.; He, X.; Wei, Z.; Wang, F.; Lin, W. Application of Gaussian Filter in Extracting Bullet Feature of 3-D Bullet Image. Chin. J. Liq. Cryst. Disp. 2012, 27, 708–712. [Google Scholar] [CrossRef]
  12. Newton, L.; Senin, N.; Gomez, C.; Danzl, R.; Helmli, F.; Blunt, L.; Leach, R. Areal topography measurement of metal additive surfaces using focus variation microscopy. Addit. Manuf. 2019, 25, 365–389. [Google Scholar] [CrossRef]
  13. Besl, P.J.; McKay, N.D. Method for registration of 3-D shapes. In Proceedings of the Sensor fusion IV: Control Paradigms and Data Structures, Boston, MA, USA, 12–15 November 1992; pp. 586–606. [Google Scholar]
  14. Zheng, W.; Lian, G.; Zhang, X.; Guo, F. Point cloud registration method based on principal component analysis and feature map matching. Chin. J. Intell. Sci. Technol. 2023, 5, 543–552. [Google Scholar]
  15. Chetverikov, D.; Svirko, D.; Stepanov, D.; Krsek, P. The trimmed iterative closest point algorithm. In Proceedings of the 2002 International Conference on Pattern Recognition, Quebec, QC, Canada, 11–15 August 2002; pp. 545–548. [Google Scholar]
  16. Yang, J.; Li, H.; Campbell, D.; Jia, Y. Go-ICP: A globally optimal solution to 3D ICP point-set registration. IEEE Trans. Pattern Anal. Mach. Intell. 2015, 38, 2241–2254. [Google Scholar] [CrossRef]
  17. Ji, S.; Ren, Y.; Ji, Z.; Liu, X.; Hong, G. An improved method for registration of point cloud. Optik 2017, 140, 451–458. [Google Scholar] [CrossRef]
  18. Meng, J.; Li, J.; Gao, X. An accelerated ICP registration algorithm for 3D point cloud data. In Proceedings of the 9th International Symposium on Advanced Optical Manufacturing and Testing Technologies: Optical Test, Measurement Technology, and Equipment, Chengdu, China, 26–29 June 2019; pp. 18–23. [Google Scholar]
  19. Zhu, Q.; Wu, J.; Hu, H.; Xiao, C.; Chen, W. LIDAR point cloud registration for sensing and reconstruction of unstructured terrain. Appl. Sci. 2018, 8, 2318. [Google Scholar] [CrossRef]
  20. Li, J.; Qian, F.; Chen, X. Point cloud registration algorithm based on overlapping region extraction. J. Phys. Conf. Ser. 2020, 1634, 012012. [Google Scholar] [CrossRef]
  21. Poiesi, F.; Boscaini, D. Learning General and Distinctive 3D Local Deep Descriptors for Point Cloud Registration. IEEE Trans. Pattern Anal. Mach. Intell. 2023, 45, 3979–3985. [Google Scholar] [CrossRef] [PubMed]
  22. Zhang, Y.; Li, X.; Han, X. Three-Dimensional Point Cloud Registration Method with Low Overlap Rate. Laser Optoelectron. Prog. 2021, 58, 0810014. [Google Scholar] [CrossRef]
  23. Hörner, J. Automatic Point Clouds Merging. Master’s Thesis, Univerzita Karlova, Prague, Czechia, 2018. [Google Scholar]
  24. Pomerleau, F.; Colas, F.; Siegwart, R. A review of point cloud registration algorithms for mobile robotics. Found. Trends Robot. 2015, 4, 1–104. [Google Scholar] [CrossRef]
  25. Lu, J.; Wang, Z.; Hua, B.; Chen, K. Automatic point cloud registration algorithm based on the feature histogram of local surface. PLoS ONE 2020, 15, e0238802. [Google Scholar] [CrossRef]
  26. Theiler, P.; Schindler, K. Automatic registration of terrestrial laser scanner point clouds using natural planar surfaces. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2012, 1, 173–178. [Google Scholar] [CrossRef]
  27. Pauly, M.; Gross, M.; Kobbelt, L.P. Efficient simplification of point-sampled surfaces. In Proceedings of the IEEE Visualization, VIS 2002, Boston, MA, USA, 27 October–1 November 2002; pp. 163–170. [Google Scholar]
  28. Dalitz, C.; Schramke, T.; Jeltsch, M. Iterative Hough Transform for Line Detection in 3D Point Clouds. Image Process. Line 2017, 7, 184–196. [Google Scholar] [CrossRef]
  29. Prokop, M.; Shaikh, S.A.; Kim, K.-S. Low Overlapping Point Cloud Registration Using Line Features Detection. Remote Sens. 2019, 12, 61. [Google Scholar] [CrossRef]
  30. Qin, G. Novel Cuckoo Search Algorithm and Its Application. Chin. J. Eng. Math. 2023, 40, 883–895. [Google Scholar]
  31. Wei, M. Three-dimensional point cloud registration algorithm based on cuckoo optimization. Comput. Appl. Softw. 2020, 37, 216–223. [Google Scholar]
Figure 1. Flow chart of bullet registration.
Figure 2. Point clouds of different densities representing the features.
Figure 3. Fitting to find the center of the sphere: (a) finding the center of the circle in a two-dimensional plane; (b) finding the center of the sphere in three-dimensional space.
Figure 4. Extraction of bullet marks by the discrete Hough transform: (a) linear feature information of the bullet marks based on icosahedral fitting; (b) raw input data of the bullet marks.
Figure 5. Initial location map of the bullet point clouds (the blue point cloud is the source point cloud, the green point cloud is the target point cloud): (a) sampled point cloud of the 9 mm bullet CAD model to be registered (overlap rate of 52.85%); (b) sampled point cloud of the 9 mm bullet CAD model to be registered (overlap rate of 33.02%); (c) initial location map of the 7 mm bullet point cloud acquired by the automatic zoom microscope, to be registered; (d) initial location map of the 9 mm bullet point cloud acquired by the autozoom microscope, to be registered.
Figure 6. The three-dimensional topography of the warhead collected from five different angles.
Figure 7. Error comparison between the proposed algorithm and the ICP algorithm and its variants: (a) RMS error curves; (b) NNED error curves; (c) HD error curves; (d) MED error curves.
Figure 8. Automatic zoom microscope: (a) system composition: stage and different objectives; (b) data are collected on the bullet head, which is placed on the magazine and rotated by a specified angle.
Figure 9. Comparison of the three-dimensional topography of the bullet: (a) 9 mm pistol bullet; (b) partial two-dimensional images of the 9 mm bullet acquired by the autozoom microscope; (c) partial three-dimensional bullet point cloud converted from the autozoom microscope sensor data.
Figure 10. Comparison of the three-dimensional morphological measurements and two-dimensional characteristics of the bullet marks.
Figure 11. Schematic diagram of the error calculation area of the bullet registration results: (a) three-dimensional measured point cloud of the bullet in the overlapping area; (b) three-dimensional point cloud of the overlapping area calculated by the algorithm.
Figure 12. Registration results of the self-collected three-dimensional topography data of the bullet: (a) point cloud data of the bullet collected from the initial angle; (b) point cloud data of the warhead collected after rotating to the next angle; (c) registration result of the bullet point clouds acquired from the first and second angles; (d) registration result of the point clouds collected from the third and fourth angles; (e) registration of three point clouds; (f) side view of the intact bullet, showing the registration result with the distribution of the bullet marks.
Table 1. Voxel downsampling filter results.
Number of Iterations | 9 mm_Source (Point Cloud Number / Resolution) | 9 mm_Target (Point Cloud Number / Resolution) | 5 mm_Source (Point Cloud Number / Resolution) | 5 mm_Target (Point Cloud Number / Resolution)
0 | 1,000,541 / 0.005 | 100,325 / 0.014 | 3,102,272 / 0.004 | 96,511 / 0.011
1 | 111,926 / 0.020 | 60,858 / 0.022 | 69,571 / 0.020 | 44,764 / 0.021
2 | 84,850 / 0.025 | 49,403 / 0.026 | 54,297 / 0.024 | 34,804 / 0.025
Time-consuming (s) | 12.3090 | 1.5058 | 8.5035 | 1.2243
Table 2. Comparison of the registration results between the proposed algorithm and the other algorithms.
(Registration result images of the proposed algorithm, ICP, TRICP, and GOICP at overlap rates of 52.85%, 41.36%, 33.02%, 22.75%, and 14.57%.)
Table 3. RMS error of each algorithm at different overlap rates.
RMS | 52.85% | 41.36% | 33.02% | 22.75% | 14.57%
Proposed algorithm | 0.0008990 | 0.0002030 | 0.0001060 | 0.0002960 | 0.0281376
ICP | 0.0122841 | 0.0180800 | 0.0224633 | 0.0268115 | 0.0340953
TRICP | 0.0305020 | 0.0325794 | 0.0304791 | 0.0404640 | 0.0322348
GOICP | 0.0326226 | 0.0371307 | 0.0304557 | 0.0472640 | 0.0361345
Table 4. NNDE error of each algorithm at different overlap rates.
NNDE | 52.85% | 41.36% | 33.02% | 22.75% | 14.57%
Proposed algorithm | 0.0002285 | 0.0001502 | 0.0001033 | 0.0001641 | 0.0134320
ICP | 0.0025104 | 0.0194090 | 0.0132300 | 0.0329230 | 0.0134900
TRICP | 0.0124460 | 0.0145920 | 0.0132330 | 0.0247380 | 0.0151170
GOICP | 0.0142850 | 0.0193380 | 0.0132030 | 0.0330500 | 0.0203130
Table 5. HD error of each algorithm at different overlap rates.
HD | 52.85% | 41.36% | 33.02% | 22.75% | 14.57%
Proposed algorithm | 0.0012991 | 0.0003859 | 0.0001435 | 0.0004716 | 0.0323400
ICP | 0.0137900 | 0.0384150 | 0.0314810 | 0.0484630 | 0.0322890
TRICP | 0.0315410 | 0.0336390 | 0.0314870 | 0.0425440 | 0.0337280
GOICP | 0.0337270 | 0.0383290 | 0.0314720 | 0.0487640 | 0.0399140
Table 6. MDE error of each algorithm at different overlap rates.
MDE | 52.85% | 41.36% | 33.02% | 22.75% | 14.57%
Proposed algorithm | 0.0008617 | 0.0001863 | 0.0010124 | 0.0002847 | 0.0231050
ICP | 0.0132430 | 0.0368970 | 0.0304120 | 0.0467370 | 0.0293120
TRICP | 0.0304570 | 0.0325400 | 0.0304370 | 0.0404010 | 0.0271220
GOICP | 0.0325720 | 0.0370810 | 0.0304140 | 0.0472150 | 0.0311400
Table 7. Error between the bullet point cloud data and the true value.
Proposed algorithm | Pair One | Pair Two | Pair Three
MDE (Mean Distance Error) | 0.002309 | 0.0027027 | 0.0023291
NNDE (Nearest Neighbor Distance Error) | 0.00019857 | 0.00043967 | 0.00014829
HD (Hausdorff Distance) | 0.0041536 | 0.0063935 | 0.0031696
RMS (Root Mean Square) | 0.0030156 | 0.0031878 | 0.0029457
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
