Article

Irregular Workpiece Template-Matching Algorithm Using Contour Phase

Shaohui Su, Jiadong Wang and Dongyang Zhang
1 School of Mechanical Engineering, Hangzhou Dianzi University, Hangzhou 310018, China
2 Anji Intelligent Manufacturing Technology Research Institute, Hangzhou Dianzi University, Huzhou 313300, China
* Author to whom correspondence should be addressed.
Algorithms 2022, 15(9), 331; https://doi.org/10.3390/a15090331
Submission received: 1 August 2022 / Revised: 8 September 2022 / Accepted: 14 September 2022 / Published: 16 September 2022

Abstract

Current template-matching algorithms can match a target workpiece but cannot give the position and orientation of an irregular workpiece. To address this problem, this paper proposes a template-matching algorithm for irregular workpieces based on the contour phase difference. The algorithm first obtains the profile curve of the irregular workpiece by measuring its radii in an ordered sweep, then calculates the similarity between the template profile and the target profile by phase shifting, and finally computes the rotation between the two from the number of phase shifts. The experimental results show that this approach not only matches the shaped workpiece in the template base accurately, but also accurately calculates the rotation angle of the irregular workpiece relative to the template, with the maximum error kept within 1%.

1. Introduction

At present, the world is going through the transformation of Industry 4.0. In order to make this transition successfully, companies must develop and implement certain key technologies of Industry 4.0 [1]. Machine vision is one of these key technologies; it has gradually been added to many machines and production lines and is playing an increasingly important role. It can be used for product quality inspection and target positioning, such as appearance defect detection and accurate positioning of precision workpieces. Machine vision can also take over hazardous tasks in environments that are not suitable for direct human operation. Additionally, in many workpiece inspection tasks, machine vision systems can greatly improve inspection accuracy and efficiency compared with inspection by the human eye. Because machine vision plays such an important role today, research on it has become popular. In machine vision, the technique of judging an object input to the computer by comparing it with templates in a sample database is called template matching [2,3]. Identifying and locating objects is an important research topic, and template matching plays an important role in it. With the continuous development of template-matching technology [4], it is widely used in industry and agriculture. In recent years, the shortage of labor and the increasing demands of workers on the working environment have made the spraying industry more and more automated. Regular workpieces can usually rely on a robot teaching program to automate the glue-spraying process. However, due to their irregular shape, it is difficult to locate the position and posture of irregular workpieces directly. In the field of image processing, template matching searches another image for the presence of a known object image (i.e., the template). This method is well suited to localizing an irregular target workpiece whose position must be determined in order to guide the robot for painting [5]. Therefore, the theory and practical application of template matching have great research value [6].
Template-matching algorithms are a common feature acquisition method, which has been well applied in various fields of production and life. However, the accuracy of template matching is affected to some extent by light, background, and occlusion. When the size and rotation angle of the object change, the number of templates will increase, which will increase the time of template matching and affect the efficiency of matching. In recent years, research on the efficiency and robustness of template matching has made some progress [7,8,9,10,11,12,13].
Template-matching algorithms can be divided into two general categories. The first category comprises template-matching algorithms based on grayscale information [14,15]. These complete the matching by calculating the similarity between the template image and the target image. This kind of template matching is relatively simple in principle, easy to implement, and can meet real-time requirements; in well-lit environments it can also produce satisfactory results. However, in real environments the illumination is often uneven, which leads to poor matching results. The second category comprises template-matching methods based on geometric features [16,17]. The features used for template matching can generally be divided into point features, surface features, and line features. Compared with surface and line features, which are difficult to extract and often indistinct, point features are widely used and studied because their extraction is simple, clear, and robust.
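To make the first, grayscale-based category concrete, the sketch below performs normalized cross-correlation template matching with OpenCV. It is only an illustration of that category, not the method of this paper; the file names ("scene.png", "template.png") and the 0.8 acceptance threshold are assumed placeholders.

```python
import cv2

# Minimal sketch of the grayscale-based category: normalized cross-correlation
# template matching. File names and the acceptance threshold are placeholders.
scene = cv2.imread("scene.png", cv2.IMREAD_GRAYSCALE)
template = cv2.imread("template.png", cv2.IMREAD_GRAYSCALE)

# Slide the template over the scene; every position gets a score in [-1, 1].
scores = cv2.matchTemplate(scene, template, cv2.TM_CCOEFF_NORMED)
_, best_score, _, best_loc = cv2.minMaxLoc(scores)

h, w = template.shape
if best_score > 0.8:  # placeholder acceptance threshold
    center = (best_loc[0] + w // 2, best_loc[1] + h // 2)
    print(f"Match at center {center}, score {best_score:.3f}")
else:
    print("No sufficiently similar region found")
```

Note that such a match reports only a location and a similarity score, which is precisely the limitation discussed next.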
However, no matter which method is used, it only achieves a successful match and, at best, gives the center position of the target workpiece; it cannot measure the rotation angle of the workpiece well. In order to use a robot to grasp and spray the workpiece automatically, the rotation angle of the workpiece relative to the template must also be known. The classic template-matching algorithm needs to traverse all candidate locations in the reference image [18]. This exhaustive search is computationally intensive, slow, and of limited practical value [19]. Given these disadvantages, researchers have proposed reducing the computation with fast template-matching algorithms, simplified correlation metric functions, or non-exhaustive search strategies [20]. For regular workpieces, the literature [21,22,23] proposed representing the rotation of the workpiece by the rotation of its external (bounding) rectangle, which calculates the rotation of a regular workpiece with respect to the template workpiece fairly accurately. Reference [18] defines a saliency-constrained similarity measure for the automatic recognition of planar objects with irregular shapes and low heights in complex backgrounds, so as to achieve accurate matching of planar object contours. Reference [24] used the Hausdorff distance to register the template image and a support vector machine classifier to detect and classify defects on PCB boards. In [25], using five white circular features around the charging port (CP), the distance to the center of the ellipse is obtained according to the principle that the center is the farthest point from the contour, and the pose of the CP is then obtained by a geometric solution. Under 4 klux light intensity, the positioning error of this method is 1.4 mm, the angle error is 1.6°, and the insertion success rate is 98.9%. The cluster template-matching algorithm (CTMA) proposed in [26] uses contour matching and logarithmic evaluation metrics to obtain matching positions; based on the image deformation rate and zoom rate, a matching template is established to achieve fast and accurate matching of textureless circular features under complex lighting. Reference [27] combines pixel statistics with template matching and proposes a new fastener positioning method called local unidirectional template matching (LUTM); according to its experimental results, the method can effectively realize precise positioning of fasteners on both ballastless and ballasted track. To address the slow speed and large computational cost of traditional template matching in power image recognition, reference [28] proposes a second template-matching algorithm for fast identification of target images, which can accurately identify and locate electric power equipment, detect equipment faults, and improve matching speed. Reference [29] proposed an improved template-matching algorithm for tube-to-tubesheet welding positioning, which automatically creates a template for each image instead of using one template to match all tube images; the matching result can accurately locate the target tube.
Reference [30] proposed a template-matching method based on orientation codes, which only roughly estimates the rotation angle and therefore struggles to be useful in practical applications.
Combining the above research, this paper proposes a template-matching algorithm based on the contour phase difference to obtain accurate pose information for irregular workpieces. Section 2 briefly describes template matching based on geometric features and Hu invariant moments and its shortcomings, which motivates the algorithm proposed in this paper. Section 3 then describes the proposed algorithm in detail, Section 4 discusses the experimental validation, and Section 5 summarizes the results.

2. Template Matching Based on Geometric Features and Hu Invariant Moments

2.1. Construction of the Base Feature Vector

The most frequently employed image invariant moments were first presented by Hu, M.K. [31] in 1962. They are a series of invariant moments with image characteristics obtained through a nonlinear combination of regular moments. No matter how the image is translated, rotated, or scaled, these invariant moments stay the same. For a discrete two-dimensional image $f(x,y)$, the $(p+q)$-order moments can be represented as follows, in accordance with the definition of Hu invariant moments in the literature [31]:
$$m_{pq} = \sum_{x}\sum_{y} x^{p} y^{q} f(x,y) \qquad (1)$$
The corresponding $(p+q)$-order central moments are shown below.
$$\mu_{pq} = \sum_{x}\sum_{y} (x-\bar{x})^{p} (y-\bar{y})^{q} f(x,y) \qquad (2)$$
where $p, q = 0, 1, 2, \ldots$; $\bar{x} = m_{10}/m_{00}$ and $\bar{y} = m_{01}/m_{00}$ denote the gray centroid position of the image. According to Equation (1), the geometric moments are the projections of the image onto the function $x^{p} y^{q}$. The normalized $(p+q)$-order central moments are given below [32].
$$\eta_{pq} = \frac{\mu_{pq}}{\mu_{00}^{\,r}} \qquad (3)$$
where $r = (p+q)/2 + 1$. From algebraic theory, Hu, M.K. generated seven image moments with invariance by a nonlinear combination of the second-order and third-order normalized central moments. These seven image moments are shown in Equation (4).
$$
\begin{aligned}
M_1 &= \eta_{20} + \eta_{02} \\
M_2 &= (\eta_{20} - \eta_{02})^2 + 4\eta_{11}^2 \\
M_3 &= (\eta_{30} - 3\eta_{12})^2 + (3\eta_{21} - \eta_{03})^2 \\
M_4 &= (\eta_{30} + \eta_{12})^2 + (\eta_{21} + \eta_{03})^2 \\
M_5 &= (\eta_{30} - 3\eta_{12})(\eta_{30} + \eta_{12})\left[(\eta_{30} + \eta_{12})^2 - 3(\eta_{21} + \eta_{03})^2\right] + (3\eta_{21} - \eta_{03})(\eta_{21} + \eta_{03})\left[3(\eta_{30} + \eta_{12})^2 - (\eta_{21} + \eta_{03})^2\right] \\
M_6 &= (\eta_{20} - \eta_{02})\left[(\eta_{30} + \eta_{12})^2 - (\eta_{21} + \eta_{03})^2\right] + 4\eta_{11}(\eta_{30} + \eta_{12})(\eta_{21} + \eta_{03}) \\
M_7 &= (3\eta_{21} - \eta_{03})(\eta_{30} + \eta_{12})\left[(\eta_{30} + \eta_{12})^2 - 3(\eta_{21} + \eta_{03})^2\right] - (\eta_{30} - 3\eta_{12})(\eta_{21} + \eta_{03})\left[3(\eta_{30} + \eta_{12})^2 - (\eta_{21} + \eta_{03})^2\right]
\end{aligned}
\qquad (4)
$$
It has been shown that the invariant moments based on second-order moments (i.e., $M_1$ and $M_2$) remain stable under rotation, translation, and scaling of two-dimensional objects during the recognition of target objects in images, whereas the higher-order invariant moments introduce larger errors and contribute little to object recognition. Therefore, in this paper, the invariant moments $M_1$ and $M_2$ are selected as components of the workpiece base feature vector. To enhance the discriminative power of the base feature vector, additional constraints are added: area, circumference (number of contour points), the ratio of the long to the short radius, the mean radius, and the radius variance are selected as the remaining constrained feature values, which together with $M_1$ and $M_2$ form the base feature vector of the workpiece image.
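The following is a minimal sketch of how such a base feature vector could be assembled, assuming a pre-binarized workpiece image; the helper name, the binarization convention, and the units are assumptions rather than details given in the paper.

```python
import cv2
import numpy as np

def base_feature_vector(workpiece_img: np.ndarray) -> dict:
    """Sketch of the Section 2.1 base feature vector: M1, M2 plus area,
    circumference (contour point count), long/short radius ratio, mean radius
    and radius variance. Binarization and units are assumptions here."""
    mask = (workpiece_img > 0).astype(np.uint8)   # assumed: foreground = nonzero
    f = mask.astype(np.float64)
    y, x = np.mgrid[0:f.shape[0], 0:f.shape[1]]

    m = lambda p, q: np.sum((x ** p) * (y ** q) * f)                  # Eq. (1)
    m00 = m(0, 0)
    xc, yc = m(1, 0) / m00, m(0, 1) / m00                             # Eq. (6)

    mu = lambda p, q: np.sum(((x - xc) ** p) * ((y - yc) ** q) * f)   # Eq. (2)
    eta = lambda p, q: mu(p, q) / m00 ** ((p + q) / 2 + 1)            # Eq. (3)

    M1 = eta(2, 0) + eta(0, 2)                                        # Eq. (4)
    M2 = (eta(2, 0) - eta(0, 2)) ** 2 + 4.0 * eta(1, 1) ** 2

    # Contour-based constraint features, taken from the outer contour.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    contour = max(contours, key=cv2.contourArea).reshape(-1, 2)
    radii = np.hypot(contour[:, 0] - xc, contour[:, 1] - yc)

    return {
        "M1": M1,
        "M2": M2,
        "area": m00,                              # foreground pixel count
        "circumference": len(contour),            # number of contour points
        "radius_ratio": float(radii.max() / radii.min()),
        "radius_mean": float(radii.mean()),
        "radius_var": float(radii.var()),
    }
```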

2.2. Create Template Base Feature Vectors

The objects of this paper are shaped (irregular) workpieces. Since the image acquisition camera has a field of view of 1280 × 1080 and a mounting height of 800 mm, the maximum size of the selected shaped workpiece is 800 mm × 800 mm and the maximum thickness is 100 mm, which ensures that a complete product image is captured. New templates can be added to the template library for any product that meets these criteria. In this section, several workpieces from Figure 1 are selected for testing, where workpieces B and B-Mirror are mirror images of each other, and D and D-Mirror are mirror images of each other.
The basic feature vectors of the above workpieces are shown in Table 1.
As can be seen from Table 1, there is a clear distinction between the base feature vectors of different types of workpieces. However, the feature vectors of workpieces that are mirror images of each other are almost identical. These two kinds of workpieces are like left and right hands: they cannot be made to coincide in the plane and cannot be classified as the same type of workpiece. Therefore, this paper proposes a template-matching algorithm based on the contour phase difference to solve this problem and obtain more accurate results.
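To see why mirror-image workpieces defeat the base feature vector, the short check below, an illustration rather than an experiment from the paper, compares the first two Hu moments of a synthetic binary shape and its horizontal flip using OpenCV's built-in Hu moments.

```python
import cv2
import numpy as np

# Illustrative check (not from the paper): an asymmetric binary shape and its
# mirror image yield the same M1 and M2, so the base feature vector of
# Section 2.1 cannot distinguish mirror-image workpieces.
shape_img = np.zeros((200, 200), dtype=np.uint8)
polygon = np.array([[20, 30], [180, 60], [120, 170], [40, 120]], dtype=np.int32)
cv2.fillPoly(shape_img, [polygon], 255)
mirror_img = cv2.flip(shape_img, 1)   # flip around the vertical axis

for name, img in (("original", shape_img), ("mirror", mirror_img)):
    hu = cv2.HuMoments(cv2.moments(img, binaryImage=True)).flatten()
    print(f"{name}: M1 = {hu[0]:.8f}, M2 = {hu[1]:.8f}")
# Both lines print the same M1 and M2 (only the sign of the seventh Hu moment
# changes under reflection), mirroring the B / B-Mirror rows of Table 1.
```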

3. Algorithm Principle

3.1. Scheme Description

The proposed algorithm works as follows. First, the center of mass of the shaped workpiece is calculated using the Hu invariant moments. Then, a starting point is selected on the contour and the radii of the workpiece are unfolded in order over a 360-degree rotation to obtain the contour curve of the workpiece. After that, the similarity and the rotation angle between the contours of the template and the target workpiece are calculated. The experiments show that the efficiency and accuracy of template matching for irregular workpieces are greatly improved. Detailed descriptions are given in the following sections.

3.2. Method of Calculating the Center of Mass Based on Hu Invariant Moments

Obtaining the center of mass of the workpiece image is a key step for template matching. If the gray value of each pixel is regarded as the mass at that location, then the center of mass of a regular workpiece image is the geometric center of the image outline, while the center of mass of an irregular workpiece must be calculated from the geometric moments of the image. According to the definition of image invariant moments proposed by Hu, M.K. [31], the $(p+q)$-order moments of a discrete two-dimensional image $f(x,y)$ are as shown in Equation (1).
The zeroth-order moment ($p = 0, q = 0$) represents the mass of the discrete two-dimensional image $f(x,y)$, and is calculated as:
$$m_{00} = \sum_{x}\sum_{y} f(x,y) \qquad (5)$$
The first-order moments ($p = 1, q = 0$ or $p = 0, q = 1$) determine the centroid of the discrete two-dimensional image $f(x,y)$, calculated as:
$$\bar{x} = \frac{m_{10}}{m_{00}}, \qquad \bar{y} = \frac{m_{01}}{m_{00}} \qquad (6)$$
In Equation (6), $\bar{x}$ and $\bar{y}$ represent the gray centroid position of the image. After the center of mass of the workpiece is obtained, the contour can be unfolded and the contour phase-difference matching can be computed in the next steps.
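A minimal sketch of this centroid computation, assuming a binarized workpiece image and using OpenCV's moment routine (equivalent to evaluating Equations (5) and (6) directly):

```python
import cv2
import numpy as np

def workpiece_centroid(workpiece_img: np.ndarray) -> tuple:
    """Centroid of the workpiece image from its zeroth- and first-order
    moments, Eqs. (5) and (6). A minimal sketch; binarization of the captured
    image is assumed to happen upstream."""
    mask = (workpiece_img > 0).astype(np.uint8)
    M = cv2.moments(mask, binaryImage=True)       # provides m00, m10, m01, ...
    if M["m00"] == 0:
        raise ValueError("empty image: no foreground pixels")
    return M["m10"] / M["m00"], M["m01"] / M["m00"]   # (x_bar, y_bar)
```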

3.3. Contour Feature Vector Computing Based on Rotating Radius Cluster

The rotating radius cluster of a workpiece is the set of ordered distances from the edge points on the workpiece contour to its center point. If the workpiece is circular, as shown in Figure 2a, $O(x,y)$ is the center of mass of the circular workpiece, $A(x,y)$ is a point on the contour, and $|OA|$ denotes the radius of the workpiece. With point $O$ as the center of rotation, starting from point $A$ and rotating through 360 degrees, each radius is unfolded in order, and the resulting workpiece profile is a flat straight line, as shown in Figure 2b.
For an irregular workpiece, as shown in Figure 3a, $O(x,y)$ is the center of mass of the irregular workpiece, $A(x,y)$ is a point on the contour, and $|OA|$ is one radius of the workpiece. With point $O$ as the center of rotation, starting from point $A$ and rotating through 360 degrees, each radius is unfolded in order, and the resulting workpiece contour is a curve with interlaced peaks and valleys, as shown in Figure 3b.
It can be seen that when the workpiece is circular, the only basis for distinguishing different workpieces is the difference in radius. Extending this to irregular workpieces, with the same initial starting point $A(x,y)$, the cumulative difference between the unfolded profile curves at each rotation angle can be used as the criterion for deciding whether the target workpiece matches the template. Since the contour curves of different irregular workpieces unfolded by radius are necessarily different, the unfolded contour curve of the shaped workpiece image can serve as the basis for template matching. Usually the target workpiece is rotated relative to the template, so their initial points $A(x,y)$ do not coincide; in other words, the outline curve of the target workpiece and that of the template have a certain phase difference, as Figure 4 shows.
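The sketch below illustrates how such a rotating radius cluster can be extracted in practice, assuming a binarized workpiece image; the uniform resampling to a fixed number of samples is an assumption made so that template and target curves are directly comparable. A circular workpiece then yields a nearly flat curve, while an irregular workpiece yields the peak-and-valley profile of Figure 3b.

```python
import cv2
import numpy as np

def radius_curve(workpiece_img: np.ndarray, n_samples: int = 360) -> np.ndarray:
    """Unfold the outer contour into an ordered radius curve (Section 3.3).
    A sketch: the curve is resampled to n_samples values so curves from
    different images are comparable; the paper only requires the two curves
    to be made equal in length, so this resampling is an assumption."""
    mask = (workpiece_img > 0).astype(np.uint8)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    pts = max(contours, key=cv2.contourArea).reshape(-1, 2).astype(np.float64)

    M = cv2.moments(mask, binaryImage=True)
    xc, yc = M["m10"] / M["m00"], M["m01"] / M["m00"]   # center of mass O

    # Ordered edge-point-to-centroid distances, starting from the first
    # contour point returned by findContours (the starting point A).
    radii = np.hypot(pts[:, 0] - xc, pts[:, 1] - yc)

    # Resample to a fixed length by uniform index interpolation.
    idx = np.linspace(0, len(radii) - 1, n_samples)
    return np.interp(idx, np.arange(len(radii)), radii)
```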
To sum up, the details of the irregular workpiece template-matching algorithm based on the contour phase difference are as follows:
  • Phase difference matching calculation
Assume that there is a template workpiece X, and that the distances from all of its contour points to its center of mass are recorded as an array $X[i]$, $i = 0, 1, 2, \ldots, n$. Under the same conditions, an image of the target workpiece Y is captured, and the distances from all of its contour points to its center of mass are recorded as an array $Y[j]$, $j = 0, 1, 2, \ldots, m$. Here $n$ and $m$ are the numbers of pixels on the profiles of the template workpiece and the target workpiece, respectively; if $m \neq n$, the array with more points is thinned by equal-interval (isometric sparse) sampling so that the two arrays have the same length. The relationship between $X[i]$ and $Y[j]$ is shown in Figure 5.
If $\varepsilon$ is the coincidence degree of the two curves, in the discrete case it can be expressed as:
$$\varepsilon = \sqrt{\sum_{i=0}^{n} (X_i - Y_i)^2} \qquad (7)$$
Equation (7) is a Euclidean distance, which is relatively expensive to compute. It can be replaced here by the Manhattan distance, that is:
$$\varepsilon = \sum_{i=0}^{n} \left| X_i - Y_i \right| \qquad (8)$$
If the two workpieces belong to the same product, there is only translation and rotation between them, so the workpiece profile curve in the figure is shifted by one contour point at a time, i.e.:
$$Y_i = Y_{i-1} \qquad (9)$$
where $i = 1, 2, 3, \ldots$, and the values whose indices exceed the array bounds wrap around to the end of the new array in turn. The overlap degree $\varepsilon_k$ is then calculated after each shift, $k = 0, 1, 2, \ldots$; as shown in Figure 5, the two curves essentially coincide after shifting $k$ times.
$$\varepsilon_k = \begin{cases} \sqrt{\sum\limits_{i=0}^{n} \left( X_i - Y_{k+i} \right)^2}, & k + i < m \\[6pt] \sqrt{\sum\limits_{i=0}^{n} \left( X_i - Y_{k+i-m} \right)^2}, & k + i \ge m \end{cases} \qquad (10)$$
$$\varepsilon_k < \Gamma \qquad (11)$$
where $\Gamma$ represents the threshold for a successful match of the workpiece; the smaller its value, the stricter and more reliable the match. A code sketch of this shift-and-compare procedure is given after this list.
  • Rotation angle calculation
As can be seen above, the workpiece profile curve moves by one contour point at a time, i.e., the workpiece image rotates by one unit of the step angle $\lambda$ at a time:
$$\lambda = \frac{2\pi}{N}, \qquad N = \min(m, n) \qquad (12)$$
So, after shifting $k$ times such that Equation (11) is satisfied, the rotation angle $\theta$ of the target workpiece with respect to the template workpiece can be expressed as:
$$\theta = \lambda k \qquad (13)$$
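The sketch below combines the phase-shift matching of Equations (8)-(11) with the rotation-angle computation of Equations (12) and (13). It assumes the two radius curves have already been made equal in length (for instance with the radius_curve sketch above); the threshold value passed as gamma corresponds to Γ and is not specified in the paper, so it must be chosen experimentally.

```python
import numpy as np

def match_by_phase_difference(template_curve, target_curve, gamma: float):
    """Sketch of contour phase-difference matching (Eqs. (8)-(13)). Both
    inputs are radius curves of equal length N; gamma is the threshold Γ of
    Eq. (11), whose value is an assumption here."""
    X = np.asarray(template_curve, dtype=np.float64)
    Y = np.asarray(target_curve, dtype=np.float64)
    if X.shape != Y.shape:
        raise ValueError("curves must be equalized in length before matching")

    N = len(X)
    # Overlap degree for every circular shift k, using the Manhattan distance
    # of Eq. (8); np.roll implements the wrap-around indexing of Eq. (10).
    eps = np.array([np.sum(np.abs(X - np.roll(Y, -k))) for k in range(N)])

    k_best = int(np.argmin(eps))
    matched = bool(eps[k_best] < gamma)      # Eq. (11)

    lam = 2.0 * np.pi / N                    # step angle λ, Eq. (12)
    theta_deg = np.degrees(lam * k_best)     # rotation angle θ, Eq. (13)
    return matched, theta_deg, float(eps[k_best])

# Hypothetical usage with the radius_curve sketch above:
# ok, theta, score = match_by_phase_difference(
#     radius_curve(template_img), radius_curve(target_img), gamma=50.0)
```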

4. Experimental Analysis and Verification

The objects of this experiment are the backrests and cushions of children's seats; the irregular workpieces to which the proposed algorithm is mainly applied are based on these products. During the mixed glue-spraying process for backrests and cushions, the incoming product must be identified, that is, compared against the template database, which contains the backrest and cushion data of all of the company's models. To verify the validity of the algorithm, several products are taken from the template database to form a sub-template database. As Figure 6 shows, KB-Left and KB-Right are mirror-symmetric, and the contour curves of the four template workpieces are shown in Figure 7.
One target product is selected from each of the KB-Left, KB-Right, ZD631, and ZD632 models, with rotation angles relative to the templates of −18°, 22°, 180°, and 90°, respectively. Matching experiments are then conducted using the algorithm proposed in this paper and the external rectangle algorithm proposed in the literature [21,22,23]. The matching results of the proposed algorithm are shown in Figure 8.
The specific matching results, compared with the external rectangle algorithm, are shown in Table 2.
To further verify the effectiveness of the algorithm, some of the shaped workpieces mentioned in the previous sections are selected for comparison. The target workpiece models are B and B-Mirror, with rotation angles relative to the templates of −70° and −180°, respectively. The matching results are shown in Figure 9.
The specific matching results, compared with the external rectangle algorithm, are shown in Table 3.
From the above results, the phase-difference algorithm proposed in this paper adapts well to matching various types of irregular workpieces, and the maximum error of the rotation angle of the target workpiece relative to the template calculated by the algorithm is within 1%. When the long and short axes of the workpiece differ greatly, the rotation angle calculated by the external rectangle algorithm proposed in [21,22,23] is close to the actual angle, but its error is slightly larger than that of the algorithm in this paper. When the long and short axes differ only slightly, the external rectangle algorithm fails, whereas the algorithm proposed in this paper remains highly accurate. As long as the target workpiece has a position or angle deviation from the template workpiece, the phase-difference algorithm works. In further matching experiments on other workpieces, the phase-difference algorithm achieved a success rate of 100%, and the angle error was kept within 1%, which is much more accurate than the methods proposed in [5,14,15]. Therefore, it meets the requirements of product positioning.

5. Conclusions

In this paper, an irregular workpiece recognition algorithm based on the contour phase difference is proposed. It not only matches template workpieces with slight differences accurately, but also calculates the rotation of the target workpiece relative to the template workpiece, with the error kept within 1%. Compared with the algorithms in the references [5,14,15,21,22,23], the algorithm in this paper has higher accuracy. It also helps solve practical problems such as the positioning, sorting, and spraying of irregular workpieces, and therefore has good application value. However, the algorithm does not yet handle matching under scale changes, so the camera must capture images at a fixed height.

Author Contributions

Conceptualization, S.S., J.W. and D.Z.; methodology, D.Z. and S.S.; software, J.W.; validation, S.S. and D.Z.; formal analysis, D.Z. and J.W.; investigation, S.S.; resources, S.S.; data curation, D.Z.; writing—original draft preparation, S.S., D.Z. and J.W.; writing—review and editing, S.S. and J.W.; visualization, D.Z. and J.W.; supervision, S.S.; project administration, D.Z. and S.S.; funding acquisition, S.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Key R & D plan of Zhejiang Province, application and demonstration of “intelligent generation” technology for small and medium-sized enterprises—R & D and demonstration application of chair industry internet innovation service platform based on Artificial Intelligence (grant number 2020C01061), and funded by Open Foundation of the Key Laboratory of Advanced Manufacturing Technology of Zhejiang Province. The authors would like to thank the above funds for their support.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Benfriha, K.; El-Zant, C.; Charrier, Q.; Bouzid, A.-H.; Wardle, P.; Belaidi, I.; Loubère, S.; Ghodsian, N.; Aoussat, A. Development of an advanced MES for the simulation and optimization of industry 4.0 process. Int. J. Simul. Multidiscip. Des. Optim. 2021, 12, 23. [Google Scholar] [CrossRef]
  2. Won, D.J.; Moon, J.H. Advanced template matching prediction using a motion boundary. In Proceedings of the 2nd International Conference on Image and Graphics Processing (ICIGP), Singapore, 23–25 February 2019; pp. 110–113. [Google Scholar] [CrossRef]
  3. Zhu, M.; Chen, C.; Sun, D. Improved Template Matching Algorithm Based on Feature Point Matching. Comput. Digit. Eng. 2022, 50, 377–382. [Google Scholar]
  4. Ma, H. Research on Automatic Focusing and Mosaic Algorithms of Pathological Microscopic Images; North University of China: Taiyuan, China, 2019. [Google Scholar]
  5. Wu, X.; Zou, G. High performance template matching algorithm based on edge geometric features. Chin. J. Sci. Instrum. 2013, 34, 1462–1469. [Google Scholar] [CrossRef]
  6. Rao, T. Video Image Processing Method And Apparatus Thereof, Display Device, Computer Readable Storage Medium And Computer Program Product (USPTO 20200134793). 2020. Available online: https://www.freepatentsonline.com/y2020/0134793.html (accessed on 10 June 2022).
  7. Jin, S.; Li, X.; Yang, X.; Zhang, J.A.; Shen, D. Identification of Tropical Cyclone Centers in SAR Imagery Based on Template Matching and Particle Swarm Optimization Algorithms. IEEE Trans. Geosci. Remote Sens. 2018, 57, 598–608. [Google Scholar] [CrossRef]
  8. Lu, Y.; Zhang, X.; Pang, S.; Li, H.; Zhu, B. A robust edge-based template matching algorithm for displacement measurement of compliant mechanisms under scanning electron microscope. Rev. Sci. Instrum. 2021, 92, 033703. [Google Scholar] [CrossRef]
  9. Cui, Z.; Qi, W.; Liu, Y. A Fast Image Template Matching Algorithm Based on Normalized Cross Correlation. J. Phys. Conf. Ser. 2020, 1693, 012163. [Google Scholar] [CrossRef]
  10. Yang, Y.A.; Zhuo, C.A.; Xl, B.; Wg, C.; Dz, D.; Mx, E. Robust template matching with large angle localization. Neurocomputing 2020, 398, 495–504. [Google Scholar] [CrossRef]
  11. Tsai, C.Y.; Yu, C.C. Real-time Textureless Object Detection and Recognition Based on an Edge-based Hierarchical Template Matching Algorithm. J. Appl. Sci. Eng. 2018, 21, 229–240. [Google Scholar] [CrossRef]
  12. He, Z.; Jiang, Z.; Zhao, X.; Zhang, S.; Wu, C. Sparse Template-Based 6-D Pose Estimation of Metal Parts Using a Monocular Camera. IEEE Trans. Ind. Electron. 2020, 67, 390–401. [Google Scholar] [CrossRef]
  13. Han, Y. Reliable Template Matching for Image Detection in Vision Sensor Systems. Sensors 2021, 21, 8176. [Google Scholar] [CrossRef]
  14. Gharavi-Alkhansari, M. A fast globally optimal algorithm for template matching using low-resolution pruning. IEEE Trans. Image Process. 2001, 10, 526–533. [Google Scholar] [CrossRef] [PubMed]
  15. Zheng, J.; Zheng, L.; Zhu, J. A Fast Template Matching Method Based on Gray Level. Mod. Comput. 2018, 26, 52–56. [Google Scholar]
  16. Hsu, C.T.; Shih, M.C. Content-based image retrieval by interest-point matching and geometric hashing. In Proceedings of the SPIE—The International Society for Optical Engineering, Shanghai, China, 15–17 October 2002; pp. 80–90. [Google Scholar] [CrossRef]
  17. Zhan, H. Design and Implementation of Image Recognition Algorithm Based on Improved Moment Invariants and Principal Component Analysis. J. Jilin Inst. Chem. Technol. 2017, 34, 27–30. [Google Scholar] [CrossRef]
  18. Zhou, L. Research and Application of Vision Positioning Technology Based on Template Matching; Dalian University of Technology: Dalian, China, 2012. [Google Scholar]
  19. Yang, H.; Zhang, G. Design and realization of a new correlation tracker algorithm. J. Infrared Millim. Waves 2000, 5, 377–380. [Google Scholar]
  20. Zhang, K.; Hu, B.; Xu, X.; Li, X. A fast correlation matching algorithm based on log-polar transform. Flight Dyn. 2020, 38, 61–65. [Google Scholar] [CrossRef]
  21. Jiang, S.; Li, C.; Yang, R.; Zhou, X. Regular planar geometric workpiece sorting method based on machine vision. Ind. Control Comput. 2015, 28, 102–104. [Google Scholar]
  22. Mei, Y.; Xiong, C.; Fan, S. Design and implementation of an intelligent sorting system for parts of 3C products. Mod. Manuf. Technol. Equip. 2018, 10, 16–17,20. [Google Scholar] [CrossRef]
  23. Li, B.; Tan, F.; Han, Y. Method for recognizing and positioning ceramic substrate. Ind. Control Comput. 2019, 32, 95–97. [Google Scholar]
  24. Zhang, L.; Wu, X. An edge-guided image interpolation algorithm via directional filtering and data fusion. IEEE Trans. Image Process. 2006, 15, 2226–2238. [Google Scholar] [CrossRef]
  25. Sun, C.; Pan, Q.; Liu, Z.; Wang, J. Automatic recognition and location system for electric vehicle charging port in complex environment. IET Image Process. 2020, 14, 2263–2272. [Google Scholar] [CrossRef]
  26. Quan, P.; Lou, Y.n.; Lin, H.; Liang, Z.; Wei, D.; Di, S. Research on Fast Recognition and Localization of an Electric Vehicle Charging Port Based on a Cluster Template Matching Algorithm. Sensors 2022, 22, 3599. [Google Scholar] [CrossRef] [PubMed]
  27. Li, L.; Lv, Z.; Chen, X.; Qiu, Y.; Li, L.; Ma, A.; Zheng, S.; Chai, X. Research on track fastener positioning method based on local unidirectional template matching. Sci. Prog. 2021, 104, 1–15. [Google Scholar] [CrossRef] [PubMed]
  28. Wu, G.; Yu, M.; Shi, W.; Li, S.; Bao, J. Image recognition in online monitoring of power equipment. Int. J. Adv. Robot. Syst. 2020, 17, 1729881419900836. [Google Scholar] [CrossRef]
  29. Cai, J.; Lei, T. An Autonomous Positioning Method of Tube-to-Tubesheet Welding Robot Based on Coordinate Transformation and Template Matching. IEEE Robot. Autom. Lett. 2021, 6, 787–794. [Google Scholar] [CrossRef]
  30. Huang, J.; Cai, X.; Yu, N. Orientation codes-based image template matching method on workpiece inspection. Electron. Des. Eng. 2011, 19, 128–130,133. [Google Scholar] [CrossRef]
  31. Hu, M.K. Visual Pattern Recognition by Moment Invariants. IRE Trans. Inf. Theory 1962, 8, 179–187. [Google Scholar] [CrossRef]
  32. Chen, C.C. Improved moment invariants for shape discrimination. Proc. SPIE Int. Soc. Opt. Eng. 1993, 26, 683–686. [Google Scholar] [CrossRef]
Figure 1. Template shaped workpieces. A–D are different types of workpieces; B-Mirror is the mirror image of B, and D-Mirror is the mirror image of D.
Figure 2. Graphical radius of round workpiece. (a) Round piece, (b) Curve graph.
Figure 3. Graphical radius of irregular workpiece. (a) Irregular piece, (b) Curve graph.
Figure 4. Periodogram comparison chart of the template workpiece and the target one.
Figure 5. The template coincides with the target workpiece profile after the phase shift.
Figure 6. Sub-template work piece database.
Figure 7. Sub-template workpiece contour curve.
Figure 8. Target workpiece phase-difference match. (a) −18° rotation of the KB-Left; (b) 22° rotation of the KB-Right; (c) 180° rotation of the ZD631; (d) 90° rotation of the ZD632.
Figure 9. B, B-Mirror phase-difference algorithm matching results. (a) −70° rotation of B; (b) −180° rotation of B-Mirror.
Table 1. Base feature vectors for the template shaped workpieces.

Category   M1           M2           Area      Circumference   Ratio of Long to Short Radius   Radius Mean   Radius Variance
A          0.30242455   0.00825412   125,959   2292            4.429                           232.49        87.62
B          0.17034508   0.00001640   362,931   2368            1.763                           341.89        44.11
B-Mirror   0.17034508   0.00001640   362,931   2368            1.763                           341.89        44.11
C          0.18838271   0.00157881   248,166   1983            2.684                           290.29        62.25
D          0.25531456   0.03709374   212,668   1902            3.585                           278.40        94.46
D-Mirror   0.25531456   0.03709374   212,668   1902            3.585                           278.40        94.46
Table 2. Target workpiece phase-difference match.

Target Workpiece   Rotation Angle (Degree)   Template   Proposed Algorithm (Degree)   Error   Exterior Rectangle [21,22,23] (Degree)   Error
KB-Left            −18                       KB-Left    −17.9275                      0.40%   −18.0098                                 0.05%
KB-Right           22                        KB-Right   22.0169                       0.07%   22.2304                                  1.04%
ZD631              180                       ZD631      180                           0       0                                        100%
ZD632              90                        ZD632      90                            0       0                                        100%
Table 3. B, B-Mirror phase-difference match results.

Target Workpiece   Rotation Angle (Degree)   Template   Proposed Algorithm (Degree)   Error     Exterior Rectangle [21,22,23] (Degree)   Error
B-R70              −70                       B          −70.0068                      0.0097%   −69.8874                                 0.05%
B-Mirror-R180      −180                      B-Mirror   −179.8874                     0.0623%   −0.1646                                  99.91%

