End Face Attitude Detection of Special Steel Bars Based on Improved DBSCAN

Abstract: An end face attitude detection system for special steel bars is designed to solve the problem of defect localization for steel bar grinding. A circle detection method based on improved Density-Based Spatial Clustering of Applications with Noise (DBSCAN) is proposed for calculating special steel bars' end face attitude. Firstly, the images are subjected to edge detection, connected region marking, and improved DBSCAN in accordance with the image characteristics. After that, the arcs belonging to the same circle are clustered into the same category to create virtual connected regions. Then, the circle parameters of the virtual connected regions are clustered using the improved DBSCAN algorithm, and the actual circle parameter is obtained by calculating the centroid of each category. Finally, a vector is generated under the set coordinate system, passing through the center of the circumcircle of the steel bar end face and one endpoint of the two-dimensional code, and the angle of this vector is calculated to determine the attitude of the special steel bar's end face. The experimental results demonstrate that the method achieves an attitude angle resolution of 0.2 degrees with an error range of ±0.1 degrees. This provides accurate defect localization support for the digitization and intelligent operation of the grinding platform on the special steel bar production line.


Introduction
Currently, the special steel bar production line has a relatively low level of intelligence and digitalization, especially in the grinding process for special steel bars. The grinding of special steel bars is separated from the instrument for detecting surface defects. However, the present defect detection instrument does not provide traceable information about the circumferential position of a defect. As soon as the bar leaves the detection instrument, the surface defect position can no longer be determined. Therefore, the current surface defect grinding process still relies on manual inspection of the defect position and manual grinding of the defect. This not only reduces production efficiency, but also worsens the working environment in the steel bar production workshop, and the debris generated by grinding may cause various degrees of harm to human health, which presents new challenges for scientists and technologists. Innovation in science and technology has made the digitization, networking, and intelligent construction of steel bar production lines a key development objective and primary direction in the steel industry, and the automatic grinding system for special steel bars is an essential component of the digitalization and intelligence of the special steel bar production line. Automatic grinding of special steel bars can not only improve production efficiency and product quality, but also deliver consistent grinding results and reduce hidden risks.
At present, steel bar surface defect detection methods include the eddy current detection method [1][2][3], the magnetic flux leakage detection method [4][5][6], the laser ultrasonic detection method [7][8][9], the infrared imaging detection method [10][11][12], the machine vision detection method [13][14][15], and so on. These detection methods, however, focus only on the presence of defects on the surface of the steel bar. They do not consider the problem that the defect needs to be relocated after the bar leaves the detection instrument. But to accomplish automatic surface defect grinding, off-line traceable surface defect location information must be obtained. The existing steel bar surface defect detection instruments can obtain the axial and circumferential position of a surface defect while the bar is inside the detection instrument. The axial position is represented by the length from the end of the steel bar, whereas the circumferential position is generally expressed as the counterclockwise expansion angle from a random reference zero position. Once the steel bar leaves the detection instrument, the circumferential position is impossible to locate again because of this random reference zero position. Consequently, special steel bar end face attitude detection system 1 was installed in the flaw detection workshop, as shown in Figure 1. The rotation attitude of the end face during defect detection is calculated based on the characteristics of the steel bar's end face. By recording the rotation attitude, the circumferential position of defect detection gains an absolute reference zero position. Once special steel bar end face attitude detection system 2 has also been installed at the position of the grinding platform, the surface defect location can be re-located for use in the steel bar automatic grinding system.
In order to calculate the end face attitude of special steel bars, the characteristics of the steel bars' end faces must be determined first. The end face's texture is a first consideration. Texture features can be used to extract defects [16] and perform identification [17]. However, collecting texture features requires high-definition acquisition equipment and demanding lighting conditions, which is difficult to implement on a steel bar production line. Therefore, it is pertinent to consider features added by human factors, based on the actual situation at the production site. Different enterprises treat the steel bars' end faces with different processes. Some companies mark the bar ID by spraying a code [18][19][20] on the end of the bar, while others mark the bar ID by spraying or sticking
a two-dimensional code on it. According to Figure 2, the end face features of the three cases are mainly numbers, text, two-dimensional codes, and some regular shapes. To represent the rotation attitude of a bar's end face more intuitively and make the angle of the steel bar's end face correspond to the rotation angle of the bar, we selected the center of the circumscribed circle of the bar's end face and a feature point away from the center to determine the angle of the rotation attitude. The second feature point can be the end point of a number, text, or QR code, and can be selected based on the actual situation on site. The feature in Figure 2c, which is the steel bar end face feature on the production line of the HBIS GROUP SHISTEEL COMPANY, is selected for investigation, so studying such a feature is of practical value. In Figure 2c, the end face of the special steel bar has two obvious characteristics: one is the QR code feature, and the other is its round shape. Generation and recognition of QR codes [21,22] is widely used in a variety of fields, making them ubiquitous in everyday life, and QR code recognition algorithms are generally considered stable. As for circle detection, it is now extensively used in spacecraft [23], industrial component measurement [24], iris localization [25], nanoparticle research in medicine [26], traffic signs [27], automated bar statistics [28], and other fields. Currently, the most used circle detection methods are the Hough transform (HT) [29], random circle detection (RCD) [30], etc. As pointed out in the literature, these methods give poor detection results in complex scenes, with occlusions, and for minimum and maximum circles. Therefore, it is necessary to improve circle detection methods to calculate steel bars' end face attitude. According to the characteristics of circle parameters, intelligent clustering algorithms can be applied to circle detection. DBSCAN [31,32] is an intelligent clustering algorithm
with a noise removal function, which can be used for better circle parameter clustering. Therefore, a circle detection method based on improved DBSCAN is proposed in this paper to solve the instability problems caused by occlusion, noise, and random point selection in traditional circle detection algorithms. Combined with the position information of the QR code, the resolution of the end face attitude angle reaches 0.2 degrees, and the average calculation time at a resolution of 2048 × 1536 does not exceed 2.3 s, which complies with the bar production line's timing requirements.
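Once the circumcircle center and a QR code endpoint are known, the attitude angle reduces to the angle of the vector between them. The following minimal sketch assumes both points have already been extracted and that the coordinate system has x to the right and y upward (the paper's exact coordinate convention is not stated, so this convention is an assumption):

```python
import math

def attitude_angle(center, feature_point):
    """Counterclockwise angle (degrees, in [0, 360)) of the vector from
    the circumcircle center of the bar end face to a feature point,
    e.g. one endpoint of the QR code."""
    dx = feature_point[0] - center[0]
    dy = feature_point[1] - center[1]
    return math.degrees(math.atan2(dy, dx)) % 360.0
```

With image coordinates (y downward), the sign of `dy` would simply be flipped before calling `atan2`.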
The major contributions of this paper are as follows:
1. A circle detection algorithm based on improved DBSCAN is proposed, which overcomes the problem that current circle detection algorithms do not detect circles accurately in complex scenes.
2. A method is proposed to represent the circumferential position of the current steel bar using the end face attitude angle, which provides a theoretical foundation for automatic surface defect location.
3. An end face attitude detection system is proposed to solve the problem that surface defects of special steel bars cannot be located during automatic grinding.


Related Work
At present, most research on the detection of steel bar end faces focuses on the automatic counting of steel bars, the spraying of a code, the spraying of paint, and the recognition of characters [33][34][35][36]. Zhang [33] mainly employed morphological methods to segment the end faces of different steel bars and counted the number of steel bars. Zhu [34] used an iterative training method to train an SVM classifier multiple times and counted steel bars. Feng [19] utilized HT to locate the centers of steel bars' end faces, which provided positioning data for a code-spraying robot. Xie et al. [35] used HT to locate a steel bar's end face and spray paint on it. Zhang et al. [36] used HT to segment the image of a steel bar's end face and identified the end face characters. As can be seen in Table 1a, most current methods for detecting and localizing steel bars' end faces focus only on quantity and relative position, not on the attitude of the end face, and most current research uses HT to recognize the steel bars' end faces. However, HT circle detection is time-consuming, sensitive to noise, and has low accuracy against complex backgrounds. It is important to note that there are many other circle detection methods besides HT, as shown in Table 1b. Yao et al. [37] proposed a curvature-aided HT algorithm for circle detection (CACD); by pre-estimating the curvature, the accumulation over all points and interruptions between different scales can be avoided, making circle detection faster and more accurate. However, appropriate edge detection information is required to calculate the curvature. Jiang et al.
[38] proposed a fast circle detection algorithm using differential region sampling (DRSCD), but it still needs a certain number of points on the candidate circle. Random sampling methods are limited by their random nature and have difficulty adapting to many noise points or complex scenes; when there are many false points, both the detection speed and the actual detection accuracy suffer. Chung et al. [39] proposed an improved RCD and used an effective refinement strategy to improve the accuracy of circle detection; however, due to random sampling, multiple attempts were required to obtain valid sampling points. Zhao et al. [40] proposed a new circle detection algorithm based on inscribed triangles (ITCD), which achieved good detection results, but it is difficult to ensure the accuracy of large-diameter circles with occlusions when multiple small arcs are reserved. González et al. [41] proposed an evolutionary algorithm to automatically detect circles in images (EACD), which performed well on synthetic and hand-drawn images but needs improvement on real images. Therefore, it is necessary to study circle detection methods to solve the problems of low detection accuracy and poor noise resistance of current circle detection methods in complex scenes.

Related Definitions

Definition of Circle Parameters of Connected Regions
It can be seen from Figure 3a that if any three points are taken to calculate circle parameters against a complex background, a large amount of invalid circle parameter data will be obtained. Therefore, different connected regions are marked with different colors in Figure 3d. Take any three points P1C(x1C, y1C), P2C(x2C, y2C), and P3C(x3C, y3C) that belong to the same connected region C in the image, such that the distance between any two of them is not less than GL, where GL generally takes 1/4 of the maximum length of the connected region. The circle parameter (x0, y0, r) of the connected region is then defined through the perpendicular bisectors of the chords P1CP2C and P1CP3C:

k12 = (y2C − y1C)/(x2C − x1C), k13 = (y3C − y1C)/(x3C − x1C) (1)

x0 = [k12 k13 (d′ − d) + k12 a′ − k13 a] / (k12 − k13) (2)

y0 = −(x0 − a)/k12 + d (3)

r = √((x0 − x1C)² + (y0 − y1C)²) (4)

in which a = (x1C + x2C)/2, a′ = (x1C + x3C)/2, d = (y1C + y2C)/2, and d′ = (y1C + y3C)/2. Obviously, when k12 is equal to k13, all three points lie on a straight line, so no effective circle parameter can be determined and the sample is discarded.
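The three-point circle parameter above can be sketched in a few lines. Instead of the slope form, this sketch solves the equivalent perpendicular-bisector linear system, which avoids division by zero for vertical chords while giving the same center and radius; the function name is illustrative:

```python
def circle_from_three_points(p1, p2, p3, eps=1e-9):
    """Circle parameter (x0, y0, r) through three edge points of one
    connected region; returns None for (near-)collinear points, which
    the definition discards."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Linear system from |P - P1| = |P - P2| and |P - P1| = |P - P3|
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = x2 * x2 + y2 * y2 - x1 * x1 - y1 * y1
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = x3 * x3 + y3 * y3 - x1 * x1 - y1 * y1
    det = a1 * b2 - a2 * b1
    if abs(det) < eps:
        return None                      # collinear: no valid circle
    x0 = (c1 * b2 - c2 * b1) / det
    y0 = (a1 * c2 - a2 * c1) / det
    r = ((x0 - x1) ** 2 + (y0 - y1) ** 2) ** 0.5
    return (x0, y0, r)
```

For example, the points (1, 0), (0, 1), and (−1, 0) yield the unit circle centered at the origin, while three points on a line return None, matching the discarded case.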


Definition of Circle Parameter Density
Three points are randomly selected multiple times from the same connected region to obtain multiple circle parameters for that region. If the connected region belongs to a part of a circle, the circle parameters are relatively concentrated. If the connected region is irregular in shape, the calculated circle parameters will be extremely scattered, and all these circle parameter data will be distributed in a three-dimensional space. In order to characterize the density of these data, the definition of a circle parameter density block is introduced.
Assuming that the circle parameter sample set is D = {d1, d2, · · · , dm}, the circle parameter density block is defined as follows: (1) Establish a coordinate system: each di is a three-dimensional datum (x, y, r), so a three-dimensional coordinate system (X, Y, Z) can be established and the circle parameter data (x, y, r) are mapped to the three-dimensional coordinate (x, y, z). (2) Division into cube blocks: in order to characterize the circle parameter data distribution, the maximum values MaxX, MaxY, MaxZ and the minimum values MinX, MinY, MinZ in the three directions X, Y, and Z are taken, respectively, and cutting planes are drawn with Equations (5)-(7):

x = kG, k an integer (5)

y = kG, k an integer (6)

z = kG, k an integer (7)

Then Nc cubes with a side length of G are established. (3) Definition of circle parameter density: any di ∈ D must fall in a certain cube j, which is denoted as di,j. If the number of di,j in cube j is denoted as Nj, the density of the cube is defined as

DYj = Nj (8)

since every cube shares the same side length G, the count itself serves as the density.
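The cube division and the density-threshold filtering used later (step (1) of the improved DBSCAN) can be sketched as follows. This is a minimal illustration, not the paper's implementation; the function names and the `>=` threshold orientation are assumptions:

```python
from collections import Counter

def cube_densities(params, G):
    """Map each circle parameter (x, y, r) to the side-G cube containing
    it (cut by the planes x = kG, y = kG, z = kG) and count points per
    cube; the per-cube count is the density DY_j."""
    return Counter((int(x // G), int(y // G), int(r // G))
                   for x, y, r in params)

def filter_by_density(params, G, th_cube):
    """Keep only parameters lying in cubes whose density reaches TH_cube."""
    counts = cube_densities(params, G)
    return [p for p in params
            if counts[(int(p[0] // G), int(p[1] // G), int(p[2] // G))] >= th_cube]
```

Dense cubes correspond to connected regions that really are arcs of a circle; scattered parameters from irregular regions land in sparse cubes and are dropped.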

Proposed End Face Attitude Detection Algorithm for Special Steel Bars
As previously mentioned, the bar end face attitude is determined primarily by two feature points. Since the two-dimensional code has a mature identification and positioning method, the accuracy of the bar end face attitude angle is primarily determined by the accuracy of the center of the steel bar's end face. The algorithm flow chart shown in Figure 4 illustrates the implementation of our detection algorithm. The flow chart shows not only the detection process of the steel bar's end face attitude but also the key steps involved in its implementation. The detection process mainly includes the following steps: steel bar end face image acquisition, connected region marking, circle parameter clustering based on improved DBSCAN, actual circle parameter detection, and calculation of the steel bar end face attitude angle. The specific implementation is as follows.

Modifying Canny Edge Detection
The filtering in the Canny edge detection [42] method generally adopts Gaussian filtering. In this paper, we mainly extract the rough outline information from the image of the steel bar's end face and do not pay attention to the detailed information. To this end, the filtering in Canny edge detection is changed to minimum filtering to highlight the edge information of the main contour, according to Equation (9):

ImageNew(i, j) = min(KN×N ⊙ ImageNi×Nj) (9)
In Equation (9), KN×N is the N × N all-one template, ImageNew(i, j) is the pixel value of any point after filtering, and ImageNi×Nj is the N × N neighborhood of the pixel position (i, j). Then, the edge information of the image is calculated, as shown in Figure 3a.
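A minimal sketch of the minimum filtering of Equation (9), written in pure Python for clarity rather than speed (a production pipeline would use a library routine such as a morphological erosion):

```python
def minimum_filter(image, n=3):
    """Replace each pixel with the minimum of its n-by-n neighborhood
    (clamped at the borders), i.e. Equation (9) with the all-one
    template: fine bright detail is swallowed by darker neighbors, so
    only the main contour edges survive the subsequent Canny step."""
    h, w = len(image), len(image[0])
    k = n // 2
    out = [[0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            lo_i, hi_i = max(0, i - k), min(h, i + k + 1)
            lo_j, hi_j = max(0, j - k), min(w, j + k + 1)
            out[i][j] = min(image[r][c]
                            for r in range(lo_i, hi_i)
                            for c in range(lo_j, hi_j))
    return out
```

Applied to a bright patch with a single dark pixel, the dark value spreads to the whole 3 × 3 neighborhood, which is exactly the contour-emphasizing behavior the section describes.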


Marking the Connected Regions
The edge image in Figure 3a is relatively complex, and the arc of the circle to be detected is often connected to or intersected with the background or with arcs of other circles. When circle parameters are calculated directly, not only is the computation large, but the accuracy also cannot be guaranteed. In order to detect the circle more accurately, the Harris corner detection method is used to detect the corner positions, and the detection result is shown in Figure 3b. The corner positions are then eroded, completely disconnecting the intersection points and breaking the lines at those positions in the original image. Next, the marking matrix ImageLabel is obtained using the 8-neighborhood connected region marking method, as shown in Figure 3c. Smaller connected regions are removed according to a threshold, and a new marking matrix FImageLabel is obtained, as shown in Figure 3d.
It is evident from Figure 3d that the final marking result not only retains the effective arcs but also removes a large amount of invalid edge information, eliminating extensive interference data for circle parameter calculation and thereby speeding up the computation.
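The 8-neighborhood marking and small-region removal can be sketched as below. The Harris corner detection and erosion steps are assumed to have already produced the binary edge image; a library routine (e.g. OpenCV's connected components) would normally be used, so this pure-Python BFS version is only an illustration:

```python
from collections import deque

def label_regions(edge, min_size=1):
    """8-neighborhood connected region marking of a binary edge image
    (list of lists of 0/1). Regions smaller than min_size are erased,
    mirroring the thresholding that yields FImageLabel."""
    h, w = len(edge), len(edge[0])
    labels = [[0] * w for _ in range(h)]
    next_label = 0
    for si in range(h):
        for sj in range(w):
            if edge[si][sj] and not labels[si][sj]:
                next_label += 1
                queue, region = deque([(si, sj)]), [(si, sj)]
                labels[si][sj] = next_label
                while queue:                      # BFS flood fill
                    i, j = queue.popleft()
                    for di in (-1, 0, 1):
                        for dj in (-1, 0, 1):
                            ni, nj = i + di, j + dj
                            if (0 <= ni < h and 0 <= nj < w
                                    and edge[ni][nj] and not labels[ni][nj]):
                                labels[ni][nj] = next_label
                                queue.append((ni, nj))
                                region.append((ni, nj))
                if len(region) < min_size:        # drop small regions
                    for i, j in region:
                        labels[i][j] = 0
    return labels
```

Diagonally touching pixels receive the same label (8-connectivity), and isolated fragments below the size threshold are zeroed out.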

Circle Parameter Clustering Based on Improved DBSCAN
The circle parameters of the marking results are calculated using the method described in Section 2.2.1, and all circle parameters are shown in Figure 5a. In order to obtain the circumcircle parameters of the steel bar's end face accurately, data filtering is added to DBSCAN as an improvement to reduce the large amount of useless data involved in clustering. The improved DBSCAN implementation steps are as follows: (1) Circle parameter filtering is completed according to the following steps: (a) The circle center coordinates are set as the X and Y axes, respectively, and the radius as the Z axis. All circle parameters (x, y, r) are thus mapped to a three-dimensional coordinate (x, y, z).

(b)
According to Equation (8), the number of data points Nj is counted in the j-th cube, whose side length is G, and the density values DYj of the cubes are acquired. (c) We set the cube density threshold THcube; when the density value of a cube satisfies DYj < THcube, all data points in the cube are deleted, otherwise they are reserved.
This completes the data filtering of the circle parameter data. The result of the filtering can be seen in Figure 5b, where the same color indicates that the filtered circle parameter data belong to the same connected region. After filtering, most of the invalid circle parameters have been removed.
(2) The core object set, clustering number, unvisited sample set, and cluster division are initialized to Ω = φ, k = 0, Γ = D, and C = φ, respectively. (3) For j = 1, 2, · · · , m, all core objects are found as follows: (a) The sample number Nε(dj) of the ε-neighborhood subsample set of dj is calculated using distance measurements. (b) If the number of samples in the subsample set satisfies Nε(dj) > MinPts, the sample dj is added to the core object sample set: Ω = Ω ∪ {dj}.
(4) If the core object set Ω = φ, the algorithm ends; otherwise, go to step (5).
(5) In the core object set Ω, a core object A is selected randomly; the current cluster core object queue, clustering number, current cluster sample set, and unvisited sample set are updated to Ωcur = {A}, k = k + 1, Ck = {A}, and Γ = Γ − {A}, respectively. (6) If the current cluster core object queue Ωcur = φ, the current cluster Ck is generated, and the cluster division and the core object set are updated to C = {C1, C2, · · · , Ck} and Ω = Ω − Ck, respectively; then go to step (4). (7) A core object A′ is fetched from the current cluster core object queue Ωcur, and the sample number Nε(A′) of its ε-neighborhood subsample set is found according to the distance measure; let ∆ = Nε(A′) ∩ Γ, and the current cluster sample set, the unvisited sample set, and the current cluster core object queue are updated to Ck = Ck ∪ ∆, Γ = Γ − ∆, and Ωcur = Ωcur ∪ (∆ ∩ Ω) − {A′}, respectively; then go to step (6). The output is the cluster division C = {C1, C2, · · · , Ck}; the final clustering results are shown in Figure 5c, where data of the same color belong to the same clustering result.
According to Figure 5, the DBSCAN clustering method with the dedicated filter significantly reduces interference from non-circular parameters. Due to the reduction in the amount of data involved in the calculation, the algorithm's clustering efficiency is also improved. In this instance, the improved DBSCAN clustering method can not only classify the data more accurately, but also allows a wider range of clustering parameters to be set.
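Steps (2)-(7) above are textbook DBSCAN; the paper's improvement lies in the density-cube prefilter that shrinks the input. A minimal sketch of the clustering stage, operating on already-filtered 3-D circle parameters (a simple O(n²) neighbor search is used here purely for illustration):

```python
def dbscan(points, eps, min_pts):
    """Plain DBSCAN over 3-D circle parameters (x, y, r) with Euclidean
    distance; returns a cluster id per point (-1 = noise). In the
    improved variant, points in sparse density cubes are discarded
    beforehand, so this routine only sees the filtered parameters."""
    n = len(points)

    def neighbours(i):
        xi, yi, ri = points[i]
        return [j for j, (x, y, r) in enumerate(points)
                if (x - xi) ** 2 + (y - yi) ** 2 + (r - ri) ** 2 <= eps ** 2]

    labels = [None] * n
    cluster = 0
    for i in range(n):
        if labels[i] is not None:
            continue
        nb = neighbours(i)
        if len(nb) < min_pts:
            labels[i] = -1                  # provisionally noise
            continue
        cluster += 1                        # i is a core object: new cluster
        labels[i] = cluster
        seeds = list(nb)
        k = 0
        while k < len(seeds):               # expand the current cluster
            j = seeds[k]
            k += 1
            if labels[j] == -1:
                labels[j] = cluster         # border point reclaimed
            if labels[j] is not None:
                continue
            labels[j] = cluster
            nb_j = neighbours(j)
            if len(nb_j) >= min_pts:        # j is also core: keep expanding
                seeds.extend(nb_j)
    return labels
```

Parameters drawn from the same circle form a tight ball in (x, y, r) space and end up in one cluster, while isolated parameters are labeled noise.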

Generation of Virtual Connected Regions
As shown in Figure 5c, several categories of circle parameters are determined. Circle parameters are clustered into the same category when they essentially come from one or more arcs of the same circle, which also shows that the DBSCAN clustering method is well suited to circle detection. However, there is still a lot of interference information in the clustering results, and for the same circle, the accuracy of a circle parameter obtained from a single arc is generally inferior to that of a circle parameter obtained when multiple independent arcs participate simultaneously. In order to obtain accurate circle parameters, the arcs related to the same cluster are marked as one connected region. This is not a real connected relationship and is therefore defined as a virtual connected region. The virtual connected region is generated as follows:

CirVCAj = ∪i CAji (10)

In Equation (10), CirVCAj represents the virtual connected region generated by the j-th type of data in the DBSCAN clustering result, and CAji is the i-th arc in the j-th type of data in the clustering result. The final generated virtual connected regions are shown in Figure 6a, where different colors represent different virtual connected regions.
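The union in Equation (10) amounts to relabeling the marking matrix so that all arcs of one cluster share a single id. A minimal sketch, assuming the labeled matrix and the region-to-cluster mapping from the previous steps are available (both argument names are illustrative):

```python
def virtual_regions(labels, cluster_of_region):
    """Merge all arcs (connected regions) whose circle parameters fell
    into the same DBSCAN cluster into one virtual connected region,
    i.e. CirVCA_j as the union of the CA_ji of Equation (10).
    `labels` is the marking matrix; `cluster_of_region` maps a region
    label to its cluster id."""
    h, w = len(labels), len(labels[0])
    virtual = [[0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            lab = labels[i][j]
            if lab and lab in cluster_of_region:
                virtual[i][j] = cluster_of_region[lab]
    return virtual
```

Regions whose labels do not appear in the mapping (noise clusters) are simply dropped from the virtual image.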

Circle Parameter Clustering Results of Virtual Connected Regions
Figure 6b shows the circle parameters of the virtual connected regions. Different colors in Figure 6b indicate that the data come from different virtual connected regions. Compared with Figure 5a, the circle parameters of the virtual connected regions are denser and the number of discrete circle parameters is significantly smaller, which shows that after the first clustering, the connected regions belonging to circles are essentially preserved while the connected regions that differ greatly from circles are deleted. The improved DBSCAN is performed as in Section 3.2, and the final clustering result is shown in Figure 6c, where data of the same color belong to the same clustering result. The circle parameters are concentrated in four unrelated regions, which represent the four circles on the steel bar's end face in Figure 2c.

Determination of Circle Parameters
As can be seen in Figure 6c, each category is a set of data representing a particular circle parameter. We need to determine which of these data best represent the actual circle parameter. In theory, the closer a candidate is to the actual circle parameter, the denser the circle parameters around it should be. Consequently, we take the position of the actual circle parameter to be the position for which the sum of the distances from all points in the class is smallest. The distance sum from the point Cd_i to all points can be calculated as follows:

SCd_i = Σ_{j=1}^{N} ‖Cd_i − Cd_j‖ (11)

In Equation (11), N is the number of elements in the current category; the circle parameter corresponding to the minimum value of SCd_i is then regarded as the actual circle parameter of the category.
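Equation (11) amounts to picking the medoid of each category. A short sketch, assuming each parameter is an (x, y, r) tuple:

```python
import math

def actual_circle_param(params):
    """Equation (11): for each candidate Cd_i, sum its distances to all
    points in the category and return the candidate with the smallest
    sum, i.e. the densest location (the medoid)."""
    def scd(i):
        return sum(math.dist(params[i], params[j]) for j in range(len(params)))
    best = min(range(len(params)), key=scd)
    return params[best]
```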

Verification of Circle Parameters
To ensure the validity of the circle parameters, it is necessary to verify them. SPN_i represents the number of pixels in the virtual connected region that participate in the calculation of the i-th circle. CPN_i represents the number of pixels on the circle's edge corresponding to the parameters detected in the image. A true circle can then be determined by comparing the two: when the ratio SPN_i/CPN_i is sufficiently large, the detected parameters are accepted as a true circle.
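The exact acceptance criterion is not reproduced in this text; a plausible sketch, assuming a simple coverage-ratio threshold (the 0.6 value is an illustrative assumption, not the paper's figure):

```python
def is_true_circle(spn, cpn, ratio=0.6):
    """Accept the i-th circle when enough of its ideal edge is covered
    by region pixels: SPN_i / CPN_i >= ratio. The threshold here is an
    assumption for illustration only."""
    return cpn > 0 and spn / cpn >= ratio
```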

The circle detection results are shown in Figure 7a, in which the red part represents the detected circles. It can be seen from the results that all circles in the steel bar's end face image were accurately detected. Among them, the circumcircle of the steel bar is the largest. Therefore, the largest circle is retained, which means that the steel bar's end face is completely located, as shown in Figure 7b.

Calculation Steps of the Special Steel Bar End Face Attitude
At this point, we have obtained the first feature point of the steel bar's end face: the center of the circumcircle. The steel bar's ends are sprayed or pasted with two-dimensional codes, so an existing mature two-dimensional code identification method is applied. The endpoint of the two-dimensional code farthest from the circle center is selected as the second feature point. These two feature points can be used to calculate the attitude angle of the steel bar's end face. The specific calculation steps are as follows:
(1) The circumcircle of the steel bar's end face is detected using our method, and the center and radius of the circumcircle are obtained. The results are shown in Figure 7b.
(2) No matter how the steel bar is rotated, the circumcircle center of its end face remains the same. Therefore, the circumcircle center is taken as the coordinate origin O. The horizontal, rightward direction through the origin is taken as the positive direction of the X-axis, and the vertical, upward direction through the origin as the positive direction of the Y-axis, resulting in a Cartesian coordinate system XOY.
(3) The two-dimensional code on the steel bar's end face is detected and identified, and the steel bar product information as well as the four endpoint positions of the two-dimensional code are obtained.
(4) The QR code endpoint P_QR farthest from the center O of the bar is selected, and the angle between the vector OP_QR and the positive direction of the X-axis is calculated, giving the current attitude angle of the steel bar's end face. The results are shown in Figure 7c; the attitude angle is 25.393 degrees.
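Step (4) is a standard vector-angle computation. A minimal sketch using atan2, assuming the y-up XOY frame defined in step (2) (for raw image coordinates, where y points down, dy would need to be negated):

```python
import math

def attitude_angle(center, p_qr):
    """Angle of the vector O -> P_QR against the +X axis, in [0, 360)
    degrees, in a y-up Cartesian frame."""
    dx = p_qr[0] - center[0]
    dy = p_qr[1] - center[1]
    return math.degrees(math.atan2(dy, dx)) % 360.0
```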


Fast Calculation Method of the Special Steel Bar End Face Attitude
From the circle detection results, it can be concluded that the proposed algorithm accurately detects all circular targets in the image, providing a new method for circle detection. However, only the circumcircle parameters are needed when calculating the end face attitude. Each time the acquisition system captures an image, the distance between the camera and the end face is essentially the same, so in the image the steel bar's circumcircle should fall within an annular ring. Based on the first set of accurate circumcircle parameters obtained, we establish a ring template centered on the circumcircle center with a radius range of ±30 pixels. As shown in Figure 4, this fast template is applied to the image after Canny edge detection, so that edge information within the template range is retained and all other edge information is removed. In this way, a vast amount of edge information unrelated to the steel bar's circumcircle is discarded, which significantly reduces the amount of calculation and improves the speed of end face attitude detection. This paper analyzes the calculation time for 1800 images: the average time per image is 2.254 s, and the maximum is 3.656 s, which fully meets the timing requirements of the current production line.
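The ring template is a radial band mask over the edge map. A minimal sketch, assuming a single-channel edge image and an (x, y) center:

```python
import numpy as np

def ring_mask_edges(edges, center, radius, band=30):
    """Keep only edge pixels whose distance from `center` lies within
    `band` pixels of the expected circumcircle `radius`; zero the rest."""
    h, w = edges.shape
    yy, xx = np.ogrid[:h, :w]                      # broadcastable grids
    dist = np.hypot(xx - center[0], yy - center[1])
    mask = np.abs(dist - radius) <= band
    return np.where(mask, edges, 0)
```

Only pixels inside the annulus survive, so later clustering operates on a fraction of the original edge data.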

Construction of Experimental Devices
Because verifying and debugging algorithms on the bar production line is inconvenient, we built an automated platform for detecting and grinding surface defects, as shown in Figure 8. The platform mainly consists of an axial surface defect detection system, a steel bar end face attitude detection system, and a surface defect grinding system. The axial surface defect detection system consists of a line-scan camera, a bar rotation device, a light source, etc. The steel bar end face attitude detection system includes a three-coordinate device, an industrial area-array camera, a laser ranging device, a light source, etc. The surface grinding system includes an industrial metallurgy manipulator, a six-dimensional force sensor, a grinding head device, etc. The platform is designed to detect and grind surface defects, and its end face attitude detection system can be used to verify the attitude detection algorithm in this paper. When photographing a steel bar's end face, the distance between the camera and the end face is adjusted to the optimal shooting distance using the laser ranging sensor, and the end face image is captured as shown in Figure 2c. To verify the accuracy of end face attitude detection, the steel bar was rotated at a constant speed and 1800 pictures were taken at regular intervals, yielding a database with a resolution of 0.2 degrees for algorithm verification.


Analysis of Steel Bar End Face Attitude Detection Results
The experimental device captured more than 1800 images of the steel bar's end face. The attitude angle of each image was calculated using the method in this paper. At the same time, the commonly used Hough transform (HT) circle detection method was employed to extract the circumcircle parameters, and the attitude angle of each image was also calculated. Figure 9a compares the attitude angles calculated using our method and the HT method with the ideal data. The yellow line represents the ideal curve with an equal interval of 0.2 degrees, the green line the attitude angle curve calculated using the HT method, and the red line the attitude angle curve calculated using our method. Because the ideal curve covers the other curves, the yellow and green lines can be seen clearly, whereas the red line is visible in only a few places, indicating that the results of this paper are closer to the ideal values than those of the HT method. For a further quantitative evaluation of the accuracy of the two methods, the difference between the attitude angle calculated using each method and the ideal value was computed; the results are shown in Figure 9b,c. From Figure 9b, the error between the attitude angle calculated using our method and the ideal value is within 0.1 degrees, while in Figure 9c the error of the HT method is within 1.1 degrees. The error data of our method are distributed relatively uniformly, whereas the error data of the HT method are scattered. In addition, the HT errors are not centered at zero: the first half is biased below zero, the second half above zero, with no specific rule to follow. These results show that the method presented in this paper performs significantly better than the HT detection method.
Researchers have studied the detection of steel bars' end faces in recent years. Zhang [33] and Zhu [34] counted steel bars; their methods focus on segmenting the bars' end faces for statistical counting. Zhang [36] and Xie [35] used the HT method to locate steel bars' end faces in order to identify end face characters and spray paint, and effective positioning results are given in the literature. Feng [19] used HT to locate steel bars' end faces to realize automatic inkjet coding, and analyzed the accuracy of the distance between two adjacent bars. These studies do not address the attitude angle of the steel bars' end faces, and we could not find any existing study on calculating steel bar end face attitude. However, end face attitude detection is a key step in the circumferential positioning of surface defects. To illustrate the accuracy of the algorithm, the HT method was also used to calculate the steel bar end face attitude and compared with the method proposed in this paper. The comparison results are presented in Table 2. With our method, the attitude error of 1220 images falls within ±0.05 degrees, accounting for about 67.78% of the data, while with the HT method only 174 images fall within ±0.05 degrees, accounting for 9.67%. All attitude errors of our method are distributed within ±0.1 degrees, whereas only 18.89% of the HT attitude data fall within this range. The final error range of the attitude data calculated using the HT method is ±1.1 degrees, while that of our method is ±0.1 degrees, an order of magnitude smaller. These error data show that the method in this paper is far superior to the HT method.
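The percentages in Table 2 are simple within-bound ratios; for example, 1220 of 1800 images is 1220/1800 ≈ 67.78%. A small sketch of the statistic:

```python
def within_ratio(errors, bound):
    """Fraction of attitude-angle errors whose magnitude is <= bound,
    as used for the within-±0.05 and within-±0.1 degree counts."""
    hits = sum(1 for e in errors if abs(e) <= bound)
    return hits / len(errors)
```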

Estimation of Algorithm Execution Time
Currently, the running speed of the steel bar surface defect detection line of the HBIS GROUP SHISTEEL COMPANY is 1 m/s, and the length of a bar is between 6 m and 12 m. This means that the end face attitude detection system must complete the attitude calculation within 6 s to meet the production line's timing requirements.

Our algorithm and the HT method were both implemented on the VS2019 platform, and the calculation time for 1800 images was statistically analyzed. The statistical results are shown in Figure 10a. The same timing statistics were also computed for the fast attitude detection algorithm, with the results shown in Figure 10b. As shown in Table 3, the maximum, minimum, and average time consumption were calculated under the different conditions. Combining Figure 10 and Table 3, it is apparent that both methods produce calculation times above 6 s before the fast template is used. Although our method has more data above 6 s, any calculation exceeding 6 s fails the minimum timing requirement of the production line. After using the fast template, the calculation time of both methods is greatly reduced: the maximum time consumption of our method and the HT method is 3.656 s and 3.359 s, respectively, and the average is 2.254 s and 1.402 s, respectively. Both methods can therefore meet the production line's timing requirements when the template is used. Although the HT method has some advantage in computing time, the advantage in both maximum and average computing time is limited relative to the 6 s budget of the production line, and in terms of detection accuracy our method is much better than the HT method. Considering overall performance, our method can provide more accurate end face attitude data for bar surface defect grinding stations and can be applied more effectively to industrial production lines.

Conclusions
To address the issue of defect relocation during surface defect grinding of special steel bars, this paper proposes a method for calculating the attitude of the end face based on the circumcircle of a steel bar's end face and its two-dimensional code. The feasibility and effectiveness of the proposed method were verified on a detection and grinding platform for bar surface defects. Using this platform, 1800 images of the steel bar's end face were collected at intervals of 0.2 degrees for experimental verification. The proposed method was compared with the HT method, the most commonly used method for steel bar end face detection. The results show that, at a circumferential resolution of 0.2 degrees, the error of our method is within ±0.1 degrees, and the error data are evenly distributed within this range, whereas the error of the HT method is within ±1.1 degrees with an irregular distribution. The execution time of the algorithm was also analyzed: with the template participating in the calculation, the maximum calculation time of the proposed method is approximately 3.656 s, slightly longer than the HT method's 3.359 s, which fully meets the production line's timing requirements. Moreover, our method is more accurate and stable, establishing a foundation for locating and grinding surface defects over the entire special steel bar.

Figure 1. Layout of the flaw detection workshop and grinding station in the special steel bar production line.


Figure 2. Steel bar end face identification patterns: (a) Numerical symbols. (b) QR code symbols. (c) Label with product information and QR code.


Figure 3. Simulation results of marking connected regions: (a) Results of Canny edge detection. (b) Results of corner detection. (c) Marking results. (d) Filtering results.

Figure 4. Flow chart of steel bar end face attitude detection.

Figure 5. Simulation results of improved DBSCAN: (a) Circle parameters of connected regions. (b) Filtering results of circle parameters. (c) Clustering results based on improved DBSCAN.

In Equation (10), CirVCA_j represents the virtual connected region generated by the j-th class of data in the DBSCAN clustering result, and CA_ji is the i-th arc in the j-th class. The final generated virtual connected regions are shown in Figure 6a; different colors in Figure 6a represent different virtual connected regions.

Figure 6. Virtual connected regions and their circle parameters: (a) Generated virtual connected regions. (b) Circle parameters of virtual connected regions. (c) Clustering results of circle parameters.


Figure 7. Simulation results of steel bar end face attitude detection: (a) Results of circle detection. (b) Extraction results of the steel bar end face. (c) Calculation results of the end face attitude.


3.4. Acquisition of the Special Steel Bar End Face Attitude
3.4.1. Calculation Steps of the Special Steel Bar End Face Attitude

Figure 8. The platform for detecting and grinding surface defects.


Figure 9. Comparison of steel bar end face attitudes calculated using different methods: (a) Comparative display of attitude data. (b) The difference between the attitude data calculated using our method and the ideal data. (c) The difference between the attitude data calculated using the HT method and the ideal data.


Figure 10. Calculation time of steel bar end face attitude angle: (a) Calculation time without template. (b) Calculation time with template.


Table 1. Summary of the current research status.

Table 2. Statistics of error data.

Table 3. Statistics of end face attitude calculation time.