
Using the Machine Vision Method to Develop an On-machine Insert Condition Monitoring System for Computer Numerical Control Turning Machine Tools

1 Institution of Mechatronic Engineering, National Taipei University of Technology, Taipei 10608, Taiwan
2 Department of Mechanical Engineering, National Taipei University of Technology, Taipei 10608, Taiwan
* Author to whom correspondence should be addressed.
Materials 2018, 11(10), 1977; https://doi.org/10.3390/ma11101977
Submission received: 21 August 2018 / Revised: 11 October 2018 / Accepted: 11 October 2018 / Published: 14 October 2018
(This article belongs to the Special Issue Machining—Recent Advances, Applications and Challenges)

Abstract
This study uses the machine vision method to develop an on-machine turning tool insert condition monitoring system for tool condition monitoring in the cutting processes of computer numerical control (CNC) machines. The system can identify four external turning tool insert conditions, namely fracture, built-up edge (BUE), chipping, and flank wear. This study also designs a visual inspection system for the tip of an insert using a surrounding light source and a fill-light, which can be mounted on the turning machine tool, to overcome the environmental effect on the captured insert image for subsequent image processing. During image capture, the intensity of the light source is changed to ensure that the test insert has appropriate surface and tip features. This study implements outer profile construction, insert status region capture, and insert wear region judgment and calculation to monitor and classify insert conditions. The insert image is trimmed according to the vertical flank, horizontal blade, and vertical blade lines. The image of the insert wear region is captured to identify flank or chipping wear using a grayscale value histogram. The amount of wear is calculated using the wear region image as the evaluation index to judge normal wear or over-wear conditions. On-machine insert condition monitoring is tested to confirm that the proposed system can judge insert fracture, BUE, chipping, and wear. The results demonstrate that the standard deviations of the chipping rate and wear amount account for 0.67% and 0.62% of their respective average values, thus confirming the stability of system operation.


1. Introduction

The quality of mechanical parts depends on the accuracy of the machining tools and the abrasion conditions of the cutting tools. For instance, Fernández-Valdivielso et al. [1] analyzed the effects of the geometrical features of inserts on workpiece surface integrity and developed an indirect method for determining the geometrical features of inserts that achieve the best performance in machining difficult-to-cut alloys. Pereira et al. [2] considered the abrasion conditions at the interface between an insert and a workpiece, and proposed a coolant structure that combines cryogenic cooling and minimum quantity lubrication to improve tool life and workpiece surface integrity. Thus, to improve product quality, mechanical part manufacturers must be aware of the service behaviors of cutting tools in the actual machining process, as determined from an on-machine cutting tool condition monitoring system, to be able to analyze tool life and decide whether the cutting tool needs to be changed [3,4]. The insert wear formation mechanism in the turning process comprises abrasion, diffusion, oxidation, fatigue, and adhesion wear. As shown in Figure 1, flank wear, chipping, fracture, and built-up edge (BUE) occur most frequently in general cutting processes and are mostly concentrated at the tool tip and tool flank [5,6,7,8]. Therefore, these four conditions are classified in this study, and the insert condition is reviewed by visual inspection. Flank wear gradually occurs on a cutting insert owing to erosion between the portions of the insert in contact with the workpiece, whereas excessive cutting force usually leads to brittle fracture of a cutting insert.
Moreover, owing to the high temperature at the contact area between the workpiece and the insert during machining, a BUE (machined material that builds up on the insert edge) can form; when the BUE breaks away from the insert edge, it can carry a portion of material from the insert with it, thereby causing fracture and chipping.
There are two types of insert condition inspection in turning processes: indirect inspection, where external sensors feed back machine data for analysis [9,10], and direct inspection, where the cutting tool status is measured directly [11,12]. Indirect inspection analyzes data to estimate the cutting tool status; some machine states are analyzed against a reference, which means that the cutting status is evaluated systematically, thus replacing the judgment of experienced operators to reduce human errors and enhance production automation [13]. For example, the cutting tool wear condition is analyzed based on differences in cutting noise or vibration [14,15], the cutting tool is monitored by measuring changes in cutting temperature and cutting forces [16,17], and the cutting status is analyzed using the machine power or current variation signal [18]. All these methods use sensing signals for inspection analysis. Lately, indirect inspection by a charge-coupled device (CCD) camera has become popular: the cutting tool is assessed by capturing the workpiece surface texture in images, and wear is judged according to changes in the workpiece surface texture and surface roughness [19,20,21,22,23]. Some studies have fused multiple sensors with visual image information for further tool status monitoring [24,25] or used different algorithm models to implement more accurate monitoring and evaluation [26,27,28,29]. According to the aforementioned references, the status of cutting tools can be obtained by analyzing variations in machine information; however, such indirect inspection sometimes loses accuracy under the effect of the external sensing environment [30]. Therefore, a direct inspection method is required to analyze changes in the status of cutting tools.
Direct inspection analyzes machining problems by directly observing the actual condition of the cutting tool. Some studies use sound, light, or a probe to build a cutting tool model and observe the tool status [25,31,32]; however, such measurement equipment is relatively complicated and unsuitable for onsite inspection. Another method uses a CCD camera to capture tool images and analyze the status of cutting tools. There are two types of such analysis. The first analyzes the wear condition by outer contour and profile inspection [33], which is generally used to monitor the outer profile wear status and judge whether the cutting tool is still workable. The second, in contrast with the indirect methods that inspect the surface texture of the machined workpiece, judges the status of the cutting tool by surface texture or surface roughness analysis of the tool edge after machining [34]; it is applied for more detailed inspection of the cutting tool and machine states, as it provides detailed machining information. Visual inspection with a CCD camera can analyze different locations on a cutting tool; for example, some studies have implemented analysis according to crater wear [35,36], whereas others have considered the flank wear condition [37]. A majority of the status information regarding a cutting tool can be gathered by visual inspection; in other words, changes in the outer profile can be obtained from the images. Giusti et al. [38] proposed a visual inspection method for cutting tool wear, Rangwala and Dornfeld [39] proposed using a neural network to analyze wear status, and many scholars have since proposed other related inspection methods [26].
Regarding methods for optimizing wear features, Kurada and Bradley [40] proposed using gradient operators to calculate texture features, where the wear region boundary feature search was performed using an octagonal-shaped matrix and the slope was established from the brightness difference and radial distance from the matrix center to determine the location of optimized wear features. The original image was smoothed during preprocessing to reduce the interference of irregularities. For feature calculation, the pixel values were converted by image thresholding to obtain the actual wear intensity and determine the change in wear amount. Yuan et al. [41] proposed a new filtering method to obtain averaged images and a new edge detection method based on the wavelet transform, in which a wavelet function is selected and generated to describe the gray-level change of the image. In other words, noise interference can be avoided to obtain better edge features, and the width, length, and center location of the abrasion region can be measured. Wang et al. [42] proposed an image processing procedure that differs from the traditional method based on constant thresholding. In this method, a rough-to-fine strategy is considered. First, thresholded images are obtained to search for candidate wear bottom-edge points. Then, a threshold-independent edge detection method based on moment invariance is used to determine the wear edge. To shorten the computing time, a critical area is defined first and only this area is taken as the region of interest in subsequent processes, thus avoiding a threshold-dependent wear feature detection method. Li et al.
[43] used the pulse-coupled neural networks (PCNN) of bionics in cutting tool wear monitoring and used the spatial neighborhood and similar gray clusters of pixels to segment the binary image of tool wear, exploiting the condition that the gray intensity in the tool wear region is higher than that of the tool body and background. Shahabi and Ratnam [44] used the external profile of the original image to test the alignment of the tool image and then used median filtering, morphological operations, and thresholding algorithms to reduce the system errors resulting from cutting tool misalignment, the presence of micro-dust particles, vibrations, and the intensity variations of ambient light. The aim was to determine the tool holder position and positioning error to ensure that cutting tool wear could be inspected without precise tool alignment. Pfeifer and Wiegers [45] used light source changes to determine wear-edge features under different light sources. While light changes can influence the shadows at the cutting tool wear edge, the actual edge location does not vary with the light source. Thus, cutting tool wear image information under different light sources can be obtained using high-pass filtering and thresholded images, and the recurrent edge locations can be overlapped to determine the location of a strong edge, filtering out misrecognition due to contaminants and reducing the effects of contaminants and shadow changes on the inspection system. Barreiro et al. [46] used different moments as descriptors to describe the tool wear images and then used a finite mixture MCLUST model to classify tool wear conditions into low-, medium-, and high-wear classes. Furthermore, the monitoring results were validated through linear and quadratic discriminant analyses. Based on the image processing results of the cutting edge, Alegre et al.
[47] developed a procedure to determine the time for tool replacement through the use of k-nearest neighbors and a multilayer neural network. D’Addona and Teti [48] used an image standardization process to obtain images with standard size and pixel density during cutting tests; then, the back-propagation neural network was optimized and used to estimate tool wear conditions with standardized cutting tool images.
Differing from existing research findings, this study analyzes insert statuses and uses fused contour and texture inspection methods to build a more accurate evaluation and judgment system, which is applicable to on-machine automatic inspection and eliminates the environmental problems during inspection. A visual inspection system that can be used in CNC turning machine tools is constructed, consisting of a CCD camera and a lens for capturing insert images, and a protection box that shields the photographic equipment, peripheral circuit, and components from scrap splashes and cutting fluids during the cutting processes. The visual inspection system has a cleaning air tube, which jets air toward the inspected insert to clean its surface, thus reducing the problems of subsequent image processing and increasing the accuracy of the insert condition judgment. The visual inspection system designed in this study has a surrounding light source and a fill-light for the tip of the insert to ensure that the insert condition can be analyzed under changing lighting conditions. Because the light source can be adjusted to locate the blade and the tip of the insert, the system remains applicable to insert condition monitoring even when onsite tool alignment is not accurate, thus enhancing the feasibility of image recognition in the machine. The light source intensity is adjusted and the insert image is captured under varying intensities for inspection analysis. The effect of any external environment changes on the insert condition monitoring result can thereby be reduced, and the system designed in this study can obtain accurate results in different environments. Image underexposure or solarization problems that generally accompany changes in the insert condition are also mitigated. This study analyzes captured insert images with different features so that the common insert conditions of an external turning tool, including fracture, BUE, chipping, and wear, can be inspected.
The analysis results can be quantified according to the texture feature distribution. This study conducts on-machine insert condition monitoring experiments with inserts in different states, and the results show that the insert condition monitoring system designed in this study is applicable to computer numerical control (CNC) turning machine tools for correct and stable identification of insert fracture, BUE, chipping, and wear conditions. The contributions of this study therefore include:
  • development of an on-machine insert condition monitoring system that can identify, in a single inspection, the four insert conditions of fracture, BUE, chipping, and flank wear;
  • development of a mountable visual system with different light sources to capture good-quality insert images on-machine that can be accurately analyzed under different lighting conditions;
  • development of a contour and texture fusion inspection method that reduces environmental problems during inspection and accurately identifies insert conditions.
The structure of this paper is as follows. Section 2 describes the experimental system and related equipment used in this study, along with the hardware architecture design of the machine vision inspection system. Section 3 describes the insert image capture process designed in this study and the usage of the surrounding light source and fill-light for the insert tip. Section 4 describes the insert condition monitoring classification process designed in this study, including the insert outer profile construction, insert status region capture, and wear region judgment and calculation. Section 5 describes the experimental process and results of insert condition monitoring. The experiment on the on-machine insert condition monitoring by a CNC turning machine tool validates the feasibility and stability of this system. Section 6 summarizes this paper.

2. Introduction to the Experimental System and Equipment

The CNC turning machine tool used in this study, shown in Figure 2, is tested using an external turning tool. The test external turning tool is mounted on the turning machine tool turret, and the turret is moved by the computer numerical controller to the visual inspection system placed above the turning machine tool spindle for insert condition monitoring. Here, each insert position is adjusted by moving the turret such that the region of interest is in focus, in order to reduce blurring of the captured images. Moreover, during the experiments, the security door that usually protects operators was closed so that the turning zone was shielded from the influence of the external environment. A GigE DFK 23GP031 color industrial camera, with an image resolution of 2592 × 1944 (15 fps), is used in this study. Figure 3a shows the camera hardware combination; the lens is a Myutron HS3514J CCTV lens, combined with a double lens to capture the feature image, and a 90-degree reflecting mirror adjusts the angle of the camera. Owing to the space constraints of the internal structure of the machine, and considering the potential contamination in the actual machining environment, this study designs a visual inspection system that can be mounted in turning machine tools, as shown in Figure 3b. The protection box for the camera hardware, shown in Figure 3a, prevents cutting scrap in the machine from splashing onto the lens and reduces contamination of the lens by cutting fluid. To capture sharp insert images, the cleaning air tube jets air toward the insert for cleaning. The protection box is equipped with a surrounding LED light source with adjustable brightness, which is covered with epoxy resin for protection. The protection box also extends a fill-light toward the tip of the insert to be inspected (tip light source).
Two magnetic bases are set up at the protection box base to fix the protection box in the machine tool for on-machine insert condition monitoring.

3. Insert Image Capture Process

During image capture, the fill-light illuminates the inspected insert and the light source intensity is changed to ensure that different inserts have appropriate feature strength. This study uses two light sources in different positions, as shown in Figure 3b. The surrounding light source irradiates the test insert with strong light to obtain its surface shape and area features, while the fill-light for the tip of the insert enhances the tip status feature to facilitate later processing and analysis of the captured image. In the insert image capture process designed in this study, the insert is first shot under a high-intensity surrounding light source to capture the tool flank exposure image, as shown in Figure 4a. Then, the insert image is captured using the high-intensity surrounding light source together with the fill-light for the insert tip, as shown in Figure 4b. These exposure images are used sequentially to confirm the insert position, enhance geometry features, and strengthen wear features. The feature images are captured after the exposure images. First, the fill-light for the tip of the insert is turned off and the surrounding light source intensity is adjusted to obtain appropriate flank feature images, as shown in Figure 5a; then, the intensity of the fill-light for the tip of the insert is adjusted to obtain appropriate tip feature images, as shown in Figure 5b. Here, the adjustment of light source intensity is performed automatically depending on the average grayscale value of the captured images. Together with the exposure images shown in Figure 4, the feature images are used to analyze different insert conditions and can be utilized in the classification process of the insert conditions, including insert profile construction, status region capture, and wear judgment and calculation.
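The automatic intensity adjustment described above can be sketched as a simple feedback loop. This is a hypothetical illustration, not the paper's implementation: the `capture` callback, target brightness, tolerance, and step size are all assumed values.

```python
import numpy as np

def adjust_intensity(capture, target=128, tol=10, step=5, level=50, max_iter=40):
    """Raise or lower the light level until the mean grayscale value of the
    captured image falls within [target - tol, target + tol].
    `capture` is a hypothetical callback (level -> 2-D grayscale array);
    the target, tolerance, and step values are illustrative assumptions."""
    for _ in range(max_iter):
        mean = capture(level).mean()
        if abs(mean - target) <= tol:
            break
        level += step if mean < target else -step
    return level

# Simulated camera whose image brightness scales linearly with the light level.
cap = lambda lvl: np.full((8, 8), min(255.0, 2.0 * lvl))
chosen = adjust_intensity(cap)
```

Starting from level 50, the loop brightens in steps of 5 until the mean grayscale value enters the acceptance band, then returns the chosen level.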

4. Insert Condition Monitoring Classification Process

4.1. Insert Outer Profile Construction

First, the flank profile feature is determined using the flank exposure image in Figure 4a. This study uses grayscale image thresholding to determine the flank profile feature, as shown in Figure 6a. Similarly, the insert profile feature in Figure 4b and grayscale image thresholding are used to determine the insert profile feature, as shown in Figure 6b. Here, the thresholding value is 250. The lines in the thresholding images in Figure 6 are determined using straight-line Hough transform, as shown in Figure 7. The flank profile exposure thresholding images determine the vertical flank line and horizontal blade line, while the insert profile exposure thresholding images determine the vertical blade line. The thresholding image can be trimmed and rotated along the horizontal blade line (Figure 7a) and vertical blade line (Figure 7b) in Figure 7 to construct a complete insert outer profile thresholding image, as shown in Figure 8a. According to the vertical flank line in Figure 7a, the complete insert outer profile thresholding image is divided into two blocks, as shown in Figure 8b: tip front-end underside (block B) and insert backend underside (block A) for subsequent insert condition feature recognition. Figure 9 shows the results of the trimmed insert images by referring to the completed insert outer profile thresholding image.
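The thresholding and line-finding steps above can be sketched in numpy. As a simplifying assumption, the Hough transform is restricted here to axis-aligned lines (the vertical flank line and horizontal blade line), for which the vote accumulator degenerates to per-column or per-row pixel counts; the synthetic image and line positions are illustrative only.

```python
import numpy as np

def threshold_image(gray, t=250):
    """Binarize a grayscale image; the paper uses a thresholding value of 250."""
    return (gray >= t).astype(np.uint8)

def dominant_axis_line(binary, vertical=True):
    """Degenerate Hough vote for axis-aligned lines: with theta fixed at 0
    (vertical) or 90 degrees (horizontal), the accumulator reduces to
    counting foreground pixels per column or per row."""
    votes = binary.sum(axis=0 if vertical else 1)
    return int(np.argmax(votes))

# Synthetic exposure image: a bright vertical flank line at column 30 and a
# horizontal blade line at row 70 (illustrative positions).
img = np.zeros((100, 100), dtype=np.uint8)
img[:, 30] = 255
img[70, :] = 255
b = threshold_image(img)
col = dominant_axis_line(b, vertical=True)    # vertical flank line
row = dominant_axis_line(b, vertical=False)   # horizontal blade line
```

A full straight-line Hough transform, as named in the paper, would additionally sweep theta to recover lines at arbitrary angles; the degenerate version shown suffices when the turret alignment keeps the blade lines near-axis-aligned.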

4.2. Insert Status Region Capture

The horizontal blade line in Figure 7a can be used to judge whether the insert has a fracture or BUE. The insert thresholding image in Figure 10a is obtained after the grayscale image thresholding process of Figure 9. Here, the erosion and dilation operations with the 11 × 11 diamond-shaped structuring element are used to clear the geometry features. The insert thresholding image is segmented along the horizontal blade line to obtain the insert fracture zone in Figure 10b and the insert BUE zone in Figure 10c, where the pixel areas of the fracture zone and BUE zone are calculated to judge the insert fracture or BUE status.
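The morphological cleanup and pixel-area count above can be sketched with a numpy-only erosion and dilation. This is an illustrative sketch: the paper's structuring element is 11 × 11 (radius 5), while the toy example below uses radius 1 and an invented mask so the effect is easy to trace.

```python
import numpy as np

def diamond(r):
    """Diamond-shaped structuring element of radius r (the paper's 11 x 11
    element corresponds to r = 5)."""
    y, x = np.ogrid[-r:r + 1, -r:r + 1]
    return (np.abs(y) + np.abs(x) <= r).astype(np.uint8)

def _shifted_windows(img, se):
    """Yield one shifted copy of img per foreground pixel of the SE."""
    r = se.shape[0] // 2
    pad = np.pad(img, r, constant_values=0)
    h, w = img.shape
    for dy, dx in np.argwhere(se == 1) - r:
        yield pad[r + dy:r + dy + h, r + dx:r + dx + w]

def erode(img, se):
    out = np.ones_like(img)
    for win in _shifted_windows(img, se):
        out &= win
    return out

def dilate(img, se):
    # The diamond is symmetric, so the same offsets serve both operations.
    out = np.zeros_like(img)
    for win in _shifted_windows(img, se):
        out |= win
    return out

# Morphological opening clears small speckle before the pixel areas of the
# fracture and BUE zones are counted.
mask = np.zeros((20, 20), dtype=np.uint8)
mask[5:15, 5:15] = 1     # insert region
mask[0, 0] = 1           # isolated noise pixel
se = diamond(1)
opened = dilate(erode(mask, se), se)
area = int(opened.sum())  # pixel area used to judge fracture/BUE status
```

The opening removes the isolated pixel while largely preserving the block, so the subsequent area count reflects only the genuine region.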
If the insert condition, as identified by the insert condition monitoring system designed in this study, is not classified as fracture or BUE, the flank wear judgment process begins. First, grayscale transformation is applied to the trimmed insert outer profile image in Figure 9; this study averages the image RGB values for grayscale conversion. After the insert outer profile image is converted into a grayscale image, the Sobel operator is used for insert edge detection to obtain a good insert edge feature. The insert outer profile blocks are then segmented, as shown in Figure 8b, and the lower region at the backend of the insert (block A) is removed to segment the location of the flank wear feature, as shown in Figure 11a. To facilitate trimming of the flank wear part for subsequent judgment and calculation, noise removal, contrast stretching, and erosion and dilation operations are applied to Figure 9, and the flank wear zone image is obtained, as shown in Figure 11b. Here, a 3 × 3 box-pattern low-pass filter and a 21 × 21 box-pattern median filter are used for noise suppression. Finally, the trimming operation is applied to the insert outer profile image in Figure 9 according to Figure 11b, and the flank wear zone image in Figure 11c is obtained.
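The average-RGB grayscale conversion and Sobel edge detection named above can be sketched as follows; the step-edge test image is an invented example, not data from the paper.

```python
import numpy as np

def to_gray(rgb):
    """Average-RGB grayscale conversion, as used in the paper."""
    return rgb.mean(axis=2)

def sobel_magnitude(gray):
    """Sobel gradient magnitude for edge detection on the insert image."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    pad = np.pad(gray.astype(float), 1, mode="edge")
    h, w = gray.shape
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for i in range(3):
        for j in range(3):
            win = pad[i:i + h, j:j + w]
            gx += kx[i, j] * win
            gy += ky[i, j] * win
    return np.hypot(gx, gy)

# Synthetic vertical step edge: the Sobel response peaks at the transition
# and vanishes in the flat regions.
img = np.zeros((10, 10))
img[:, 5:] = 255.0
mag = sobel_magnitude(img)
```

On the step image, the gradient magnitude is zero away from the edge and maximal in the two columns adjacent to the intensity jump, which is the behavior exploited when extracting the insert edge feature.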

4.3. Wear Region Judgment and Calculation

The flank wear or chipping wear status can be classified according to the trimmed flank wear zone image, as shown in Figure 11c. Figure 12 shows that there is a significant difference between flank wear and chipping wear. Flank wear is the tear resulting from rubbing between the cutting blade and workpiece in the machining process; thus, the flank wear surface features are mostly continuous and even. However, as chipping wear is tip breakage resulting from abnormal machining processes, the chipping surface is relatively rough. This study analyzes the continuity of surface features in the actual image of a wear region, as shown in Figure 11c, to identify the insert wear region as flank or chipping wear. The grayscale value histogram of all pixels can be obtained after Figure 11c is converted into a grayscale image, as shown in Figure 13a. For a chipping image, the number of high-grayscale pixels is obviously larger than in the grayscale histogram distribution of the flank wear image, as shown in Figure 13b. Therefore, the number of pixels whose grayscale values exceed a preset threshold is divided by the total number of pixels in the overall wear region, and the resulting percentage (the chipping rate) is taken as the basis for identifying the wear region as flank or chipping wear. Moreover, this study uses the pixel distance between the upper and lower boundaries of the wear region image to calculate the wear amount. The pixel unit is converted using the wear region image, as shown in Figure 14, where each pixel of the wear region image corresponds to a length of 0.007 mm; since the distance between the upper and lower boundaries of the wear region image is 184 pixels, the converted wear amount is 1.288 mm.
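The chipping rate and wear amount calculations above reduce to a few lines of numpy. The 0.007 mm/pixel conversion and the 184-pixel worked example come from the paper; the grayscale threshold t = 200 and the synthetic wear region are assumptions for illustration.

```python
import numpy as np

MM_PER_PIXEL = 0.007   # pixel-to-length conversion stated in the paper

def chipping_rate(wear_gray, t=200):
    """Percentage of wear-region pixels whose grayscale value exceeds a
    preset threshold t. The paper classifies the region as chipping when
    this rate exceeds 50%; the value t = 200 is an illustrative assumption."""
    return 100.0 * float((wear_gray > t).sum()) / wear_gray.size

def wear_amount(boundary_pixels, mm_per_pixel=MM_PER_PIXEL):
    """Convert the pixel distance between the upper and lower boundaries of
    the wear region into millimetres."""
    return boundary_pixels * mm_per_pixel

# Worked example from the paper: 184 boundary pixels -> 1.288 mm of wear.
w = wear_amount(184)

# Synthetic wear region: 60 of 100 pixels are brighter than the threshold,
# so the 60% chipping rate would be classified as chipping wear (> 50%).
region = np.concatenate([np.full(60, 255), np.full(40, 100)]).reshape(10, 10)
rate = chipping_rate(region)
```

The wear amount is then compared against the over-wear limit (0.3 mm in the experiments) to decide between normal wear and over-wear.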

5. Experiment Monitoring Insert Condition

To validate the feasibility of the on-machine insert condition monitoring system proposed in this study, the visual inspection system is mounted on the turning machine tool for insert condition monitoring experiments, as shown in Figure 15. This study uses twenty used inserts in various states for experimentation, and the results are presented in Table 1 and Figure 16. The used inserts were collected after turning at a cutting speed of 130–150 m/min, a feed rate of 0.2–0.3 mm/rev, and a depth of cut of 2–3 mm. The workpiece material is medium carbon steel and the insert material is tungsten carbide. A laptop computer with an Intel Core i7-4720HQ 2.6-GHz CPU and the 64-bit Microsoft Windows 10 operating system was utilized to implement the whole system; the time required for each monitoring task is approximately 13 s, of which 2.75 s, on average, are required for the identification of insert conditions. To further reduce the time required for each monitoring task, a computer with a faster CPU could be used. Table 1 shows the judgment results for the inserts in different states. The chipping rate threshold is set at 50% for monitoring and the wear amount threshold is set at 0.3 mm for identifying over-wear. Based on the results, the system developed in this study can correctly identify the various insert conditions of the test inserts. Table 1 presents three types of BUE inserts, two of which have slight BUE; thus, this study identifies the BUE status accurately according to the preset BUE threshold.
The insert condition monitoring system can identify different insert conditions, and its operational stability is a key point of evaluation. Because of changes in the external environment and light source intensity, insert condition tests and calculations can yield different results. This study repeatedly tests the same insert to validate the stability of the insert condition monitoring system; the experimental results are shown in Table 2, where the wear amount of the insert wear region is calculated for comparison analysis. The experiment is repeated 10 times, the chipping rate and wear amount of each run are recorded, and system stability is checked using the calculated mean value and standard deviation. The experimental results show that the chipping rate analysis has the larger standard deviation, signifying a larger variation in the results. The chipping rate is calculated from the pixel grayscale value histogram distribution of the flank wear zone image; even though the algorithm and light source operating procedure are identical, each image capture is affected by light source changes, so the grayscale value histogram distribution of the wear images varies. Despite this, the standard deviation of the chipping rate is only 0.67% of the average value, which confirms the stability of the chipping rate calculation of this insert condition monitoring system. In terms of the wear amount results, the standard deviation of the wear amount is only 0.62% of the average value; in other words, the standard deviation is lower than two pixels. Hence, the stability of this insert condition monitoring system in calculating the wear amount is also confirmed. Therefore, the aforementioned experimental results validate the feasibility and stability of the insert condition monitoring system and calculation method designed in this study.
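The stability metric used above (standard deviation as a percentage of the mean, i.e., a coefficient of variation) can be computed with the standard library. The sample values below are hypothetical repeated measurements, not the paper's data, and the choice of population standard deviation is an assumption since the paper does not state which form it uses.

```python
import statistics

def stability_percent(samples):
    """Standard deviation expressed as a percentage of the mean (a
    coefficient of variation). The paper reports 0.67% for the chipping
    rate and 0.62% for the wear amount over 10 repeated measurements;
    the population standard deviation is assumed here."""
    return 100.0 * statistics.pstdev(samples) / statistics.mean(samples)

# Hypothetical repeated wear-amount measurements (mm).
runs = [1.288, 1.281, 1.295, 1.288, 1.288]
cv = stability_percent(runs)
```

A small percentage indicates that repeated captures of the same insert produce nearly identical results despite lighting fluctuations, which is the sense in which the paper claims operational stability.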
This study developed an on-machine insert condition monitoring system to identify four external turning tool insert conditions: fracture, BUE, chipping, and flank wear. The experimental results demonstrate that the developed monitoring system can successfully identify the four insert conditions. Moreover, as shown in Figure 16, the developed system can be used to identify the insert conditions even when it is difficult to measure the wear amount precisely using standard wear measurement methods. However, because the view angle of the developed visual inspection system mounted inside the turning machine tool differs from that of standard wear measurement devices, the calculated wear amount, which is used to indicate the degree of wear, cannot be compared directly with the measurement results obtained using standard wear measurement devices.

6. Conclusions

The status of cutting tools used in the cutting processes of machine tools will obviously influence the manufacturing quality of machine parts. Therefore, this study develops an on-machine insert condition monitoring system for the turning tool insert of CNC turning machine tools and uses the machine vision method to inspect the common flank wear, chipping, fracture, and BUE statuses of turning tool inserts. This study differs from the existing research methods and outcomes as it fuses the machine vision method with contour and texture inspections to analyze the insert status. This eliminates the environmental problems in the insert inspection process to build a more accurate on-machine turning tool insert condition monitoring system.
To mount the CCD camera and lens in the CNC turning machine tool and carry out the on-machine insert condition visual inspection process, a visual inspection system with a protection box, a cleaning air tube, and two light sources is designed. The protection box prevents scrap splashes and cutting fluid from contaminating the lens during the cutting processes, while the cleaning air tube jets air toward the insert to clean off surface contaminants. A surrounding light source and a fill-light for the tip of the insert with variable light intensities are employed to analyze the effect of changing lighting conditions on the visual inspection of the insert status. In the insert image capture process, the intensities of the surrounding light source and fill-light are changed to ensure that the test insert has appropriate feature strength. The surrounding light source uses strong light to irradiate the insert surface to obtain the surface shape and area features, while the fill-light enhances the tip status feature to facilitate subsequent captured image processing and analysis. The insert condition monitoring classification process designed in this study includes insert outer profile construction, insert status region capture, and wear region judgment and calculation. The insert outer profile construction uses the exposure images to determine the outer profile feature, and then the vertical flank line, horizontal blade line, and vertical blade line are established according to this outer profile feature. The insert image can then be trimmed for subsequent insert condition feature recognition. In terms of insert status region capture, the insert fracture zone and BUE zone are identified according to the outer profile feature lines, and the insert outer profile image is trimmed to obtain the actual image of the insert wear region.
For wear region judgment and calculation, flank wear or chipping wear is identified from the grayscale value histogram of all pixels in the trimmed flank wear zone image. The wear amount is calculated from the pixel lengths of the upper and lower boundaries of the wear region image, which serve as the reference index for identifying the normal wear or over-wear status of the insert. Finally, inserts in different states are used in on-machine insert condition monitoring experiments to confirm that the proposed system can identify insert fracture, BUE, chipping, and wear statuses. In addition, because changes in the external environment and light source can influence the image processing result, the operational stability of the on-machine insert condition monitoring system is also tested. The experiment is repeated, and the average value and standard deviation of the chipping rate and wear amount are recorded as the basis for evaluating the operational stability of the system. The experimental results show that light source variation does influence the calculated chipping rate and wear amount; however, the standard deviation of the chipping rate is only 0.67% of the average value, and the standard deviation of the wear amount is 0.62% of the average value (standard deviation below 2 pixels), thus validating the stability of system operation.
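The histogram-based judgment can be illustrated in the same spirit: chipping leaves dark cavities in the trimmed wear-zone image, so a large share of low-grayscale pixels separates chipping from the comparatively uniform band of flank wear. In the sketch below, the dark-pixel threshold, the chipping ratio, and the mm-per-pixel calibration are illustrative assumptions, not the paper's calibrated parameters.

```python
import numpy as np

def classify_wear(zone, dark_thresh=60, chip_ratio=0.2):
    """Classify a trimmed wear-zone image via its grayscale histogram:
    if the fraction of dark pixels exceeds chip_ratio, call it chipping;
    otherwise treat it as uniform flank wear (illustrative parameters)."""
    hist, _ = np.histogram(zone, bins=256, range=(0, 256))
    dark_fraction = hist[:dark_thresh].sum() / zone.size
    return "chipping" if dark_fraction >= chip_ratio else "flank wear"

def wear_amount_mm(mask, mm_per_pixel=0.005):
    """Wear amount from the pixel distance between the upper and lower
    boundaries of the wear region (mm_per_pixel is an assumed
    calibration factor, not the paper's)."""
    rows = np.where(mask.any(axis=1))[0]
    return (rows.max() - rows.min() + 1) * mm_per_pixel

# Synthetic wear-zone images: a uniform bright band vs a band with a dark pit.
flank = np.full((40, 120), 180, dtype=np.uint8)
chipped = flank.copy()
chipped[5:35, 10:80] = 20        # dark chipped cavity

wear_mask = np.zeros((200, 120), dtype=bool)
wear_mask[50:150, :] = True       # 100-pixel-tall wear band
```

A usage note: with these synthetic inputs, `classify_wear` labels the uniform band as flank wear and the pitted band as chipping, and the 100-pixel band maps to 0.5 mm of wear under the assumed calibration.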

Author Contributions

Investigation, W.-H.S. and S.-S.Y.; Supervision, S.-S.Y.

Funding

This research was funded in part by the Ministry of Science and Technology, Taiwan, R.O.C., under Contract MOST 104-2221-E-027-132 and MOST 103-2218-E-009-027-MY2.

Acknowledgments

The authors would like to thank representatives from the SRAM Taiwan Company for their helpful discussions with the research team. The authors especially thank Meng-Hui Lin (SRAM Taiwan Company) for his valuable input.

Conflicts of Interest

The authors declare no conflict of interest.

Figure 1. Four insert status forms. (a) Flank wear; (b) Chipping; (c) Fracture; (d) BUE.
Figure 2. CNC turning machine tool for experiment. (a) Turning zone; (b) Turret structure.
Figure 3. Visual inspection system mountable inside turning machine tools designed in this study. (a) Industrial camera and lens related components; (b) Visual inspection system protection box.
Figure 4. Captured flank and insert exposure images. (a) Flank exposure image; (b) Insert exposure image.
Figure 5. Captured flank and tip feature images. (a) Flank feature image; (b) Tip feature image.
Figure 6. Thresholding operation result of captured exposure insert grayscale image. (a) Flank exposure thresholding image; (b) Insert exposure thresholding image.
Figure 7. Lines of thresholding images. (a) Vertical flank line and horizontal blade line of flank profile feature; (b) Vertical blade line in insert exposure image.
Figure 8. Completed insert outer profile thresholding images and block division. (a) Completed insert outer profile thresholding image; (b) Insert outer profile block division.
Figure 9. Trimmed insert outer profile image.
Figure 10. Judgment of insert fracture and BUE statuses. (a) Insert thresholding image; (b) Insert fracture zone; (c) Insert BUE zone.
Figure 11. Actual image of trimmed insert wear region. (a) Flank wear feature zone; (b) Range of flank wear zone; (c) Flank wear zone image.
Figure 12. Comparison between flank wear and chipping wear. (a) Flank wear; (b) Chipping wear.
Figure 13. Histogram of grayscale image. (a) Grayscale image of Figure 11c; (b) Grayscale image of flank wear.
Figure 14. Wear amount result of wear region image.
Figure 15. Experimental system setup for insert condition monitoring.
Figure 16. Captured images of different insert conditions (corresponding to insert numbers in Table 1).
Table 1. Experimental results of condition monitoring of different inserts.
No. | Chipping Rate (%) | Wear Amount (mm) | Status Determination
1 | 45.76 | 1.530 | over-wear
2 | 0.00 | 0.000 | BUE
3 | 35.14 | 0.735 | over-wear
4 | 20.65 | 0.658 | over-wear
5 | 26.04 | 0.238 | normal wear
6 | 33.75 | 0.287 | normal wear
7 | 12.20 | 0.105 | normal wear
8 | 26.34 | 0.252 | normal wear
9 | 0.00 | 0.000 | BUE
10 | 29.26 | 0.686 | over-wear
11 | 62.47 | 0.651 | chipping
12 | 0.00 | 0.000 | BUE
13 | 52.42 | 0.875 | chipping
14 | 19.26 | 0.427 | over-wear
15 | 37.09 | 0.532 | over-wear
16 | 31.35 | 0.903 | over-wear
17 | 29.48 | 0.161 | normal wear
18 | 0.00 | 0.000 | normal wear
19 | 0.00 | 0.000 | fracture
20 | 63.06 | 1.250 | chipping
Table 2. Experimental results of computational stability of insert wear.
No. | Chipping Rate (%) | Wear Amount (mm)
1 | 52.886 | 1.302
2 | 52.993 | 1.288
3 | 52.922 | 1.288
4 | 52.831 | 1.302
5 | 53.335 | 1.288
6 | 53.003 | 1.288
7 | 52.820 | 1.302
8 | 52.746 | 1.295
9 | 51.878 | 1.281
10 | 52.760 | 1.302
Average value | 52.817 | 1.294
Standard deviation | 0.352 | 0.008
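As a consistency check, the stability figures in Table 2 can be recomputed directly from the ten repeated measurements. The sketch below reproduces the listed averages and standard deviations (the reported values match the population standard deviation, i.e., `ddof=0`).

```python
import numpy as np

# Ten repeated measurements from Table 2.
chipping_rate = np.array([52.886, 52.993, 52.922, 52.831, 53.335,
                          53.003, 52.820, 52.746, 51.878, 52.760])
wear_amount = np.array([1.302, 1.288, 1.288, 1.302, 1.288,
                        1.288, 1.302, 1.295, 1.281, 1.302])

# Population standard deviation (NumPy's default, ddof=0).
chip_mean, chip_std = chipping_rate.mean(), chipping_rate.std()
wear_mean, wear_std = wear_amount.mean(), wear_amount.std()

# Relative variation used as the stability index.
chip_rel = chip_std / chip_mean * 100   # ~0.67%
wear_rel = wear_std / wear_mean * 100   # ~0.59% (0.62% from the rounded Table 2 values)
```

The recomputed means (52.817%, 1.294 mm) and standard deviations (0.352%, 0.008 mm) agree with Table 2, and the chipping-rate ratio matches the 0.67% reported in the text.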

Sun, W.-H.; Yeh, S.-S. Using the Machine Vision Method to Develop an On-machine Insert Condition Monitoring System for Computer Numerical Control Turning Machine Tools. Materials 2018, 11, 1977. https://doi.org/10.3390/ma11101977
