Automatic Reclaimed Wafer Classification Using Deep Learning Neural Networks
Abstract
1. Introduction
2. Review
3. Research Method
3.1. Ten Typical Defective Maps in Reclaimed Wafers
1. Surface with line watermarks: residues remain on the surface of reclaimed wafers, caused by the cleaning and rotary-drying process. The representative pattern is shown in Figure 3a;
2. Surface with scratches: scratches on the surface of reclaimed wafers, caused by manufacturing processes such as grinding. The representative pattern is shown in Figure 3b;
3. Surface with scattered watermarks: radiating defect marks on the surface of reclaimed wafers, caused by the cleaning and rotary-drying process. Figure 3c shows the representative pattern;
4. Surface with over-etched spots: blue and green spots on the surface of reclaimed wafers, caused by over-etching. The representative pattern is shown in Figure 3d;
5. Surface with water jet: parallel straight defect marks on the surface of reclaimed wafers, caused by an abnormal cleaning process. Figure 3e shows the representative pattern;
6. Surface with clustered watermarks: marginal defective watermarks on the surface of reclaimed wafers, caused by damaged parts of the cleaning machine. The representative pattern is shown in Figure 3f;
7. Surface with half-sided watermarks: formed by the same cause as (6) above, but with different defect characteristics. The representative pattern is shown in Figure 3g;
8. Surface with Eridite residuals: residues of polishing liquid on the surface of reclaimed wafers, caused by an abnormal machine. The representative pattern is shown in Figure 3h;
9. Surface with banded particles: ribbon- or band-shaped defect marks on the surface of reclaimed wafers, caused by damaged parts of the cleaning machine. The representative pattern is shown in Figure 3i;
10. Crystal growth defect: reclaimed wafers of this kind can no longer be used in the manufacturing process because of abnormal material. Any wafer that cannot be classified into one of the first nine categories is recognized as a No-Good wafer; consequently, the total number of No-Good categories cannot be counted exactly. Figure 4 shows six example images of No-Good wafers.
3.2. Proposed Method
3.2.1. Data Acquisition
3.2.2. Data Transformation
3.2.3. Image Preprocessing
- Step 1: Read in the image;
- Step 2: Resize the image to a predefined size (e.g., 134 × 134);
- Step 3: Crop the image border by a fixed pixel width (e.g., Width = 3);
- Step 4: For a given pixel (r, c), if sqrt((r − center_r)² + (c − center_c)²) > radius, set pixel(r, c) = white (e.g., radius = 60);
- Step 5: Save the image.
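The steps above (resize, border crop, circular masking) can be sketched in Python with NumPy. The function name, the nearest-neighbour resize, and the white value 255 are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def preprocess(image: np.ndarray, size: int = 134,
               crop_width: int = 3, radius: int = 60) -> np.ndarray:
    """Sketch of Steps 2-4 for a single-channel wafer image."""
    # Step 2: resize to size x size (nearest-neighbour sampling for simplicity)
    rows = np.linspace(0, image.shape[0] - 1, size).astype(int)
    cols = np.linspace(0, image.shape[1] - 1, size).astype(int)
    img = image[np.ix_(rows, cols)].astype(float)

    # Step 3: crop a border of `crop_width` pixels on every side
    img = img[crop_width:-crop_width, crop_width:-crop_width]

    # Step 4: whiten every pixel whose distance from the image center
    # exceeds `radius`, keeping only the circular wafer region
    center = (img.shape[0] - 1) / 2.0
    r, c = np.ogrid[:img.shape[0], :img.shape[1]]
    outside = np.sqrt((r - center) ** 2 + (c - center) ** 2) > radius
    img[outside] = 255.0  # white background
    return img
```

With the example parameters, a 200 × 200 input becomes a 128 × 128 image whose corners are masked white while the wafer center is untouched.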
3.2.4. Network Structure Determination
Reclaimed Wafer Classification Using MLP
Reclaimed Wafer Classification Using CNN
Reclaimed Wafer Classification Using ResNets
4. Implementation
4.1. Experiment Results of MLP
4.2. Experiment Results of CNN
4.3. Experiment Results of ResNet
4.4. Comparison of MLP, CNN and ResNet
5. Discussion and Conclusions
Author Contributions
Funding
Conflicts of Interest
References
Type | Description | G/NG | Number of Images |
---|---|---|---|
T1 | Surface with line watermarks | G (Re-polishable) | 78 |
T2 | Surface with scratches | G (Re-polishable) | 638 |
T3 | Surface with scattered watermarks | G (Re-polishable) | 39 |
T4 | Surface with over-etched spots | G (Re-polishable) | 40 |
T5 | Surface with water jet | G (Re-polishable) | 33 |
T6 | Surface with clustered watermarks | G (Re-polishable) | 121 |
T7 | Surface with half-sided watermarks | G (Re-polishable) | 77 |
T8 | Surface with Eridite residuals | G (Re-polishable) | 139 |
T9 | Surface with banded particles | G (Re-polishable) | 235 |
T10 | Crystal growth defect | NG | 1061 |
Data Split Ratio (Training/Validating) | 80/20 |
---|---|
Learning rate/momentum/decay | 0.0010/0.9000/1.0000 × 10⁻⁶ |
Loss threshold | 0.0100 |
Accuracy threshold | 0.9975 |
Patience | 30 (no improvement in 30 consecutive epochs) |
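The "patience" rule in the table above means training stops once the monitored metric fails to improve for 30 consecutive epochs. A minimal sketch of that logic in plain Python (function name and monitoring of validation loss are illustrative assumptions):

```python
def should_stop(val_losses, patience=30):
    """Return True if the last `patience` epochs set no new minimum loss."""
    if len(val_losses) <= patience:
        return False  # not enough history yet
    best_before = min(val_losses[:-patience])
    # Stop when none of the last `patience` epochs beat the earlier best
    return min(val_losses[-patience:]) >= best_before
```

In practice a framework callback (such as early stopping in Keras) implements the same check internally.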
Structure | Training Loss/Accuracy | Val Loss/Accuracy |
---|---|---|
49152-256-10 | 0.1123/0.9680 | 0.2992/0.9285 |
49152-256-256-10 | 0.0250/0.9964 | 0.1600/0.9687 |
49152-256-256-256-10 | 0.0113/0.9983 | 0.0905/0.9777 |
MLP Structure | Training Loss/Accuracy | Validation Loss/Accuracy |
---|---|---|
49152-256-256-256-10 | 0.0113/0.9983 | 0.0905/0.9777 |
49152-256-256-512-10 | 0.0082/0.9982 | 0.0988/0.9771 |
49152-256-512-256-10 | 0.0255/0.9975 | 0.0865/0.9810 |
49152-512-256-256-10 | 0.0079/0.9982 | 0.0604/0.9830 |
49152-256-512-512-10 | 0.0129/0.9981 | 0.0837/0.9806 |
49152-512-256-512-10 | 0.0173/0.9982 | 0.0635/0.9849 |
49152-512-512-256-10 | 0.0097/0.9982 | 0.0818/0.9806 |
49152-512-512-512-10 | 0.0089/0.9982 | 0.0641/0.9833 |
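The structure strings above read input-hidden₁-…-output: each fully connected layer from a layer of `i` units to one of `o` units contributes i × o weights plus o biases. A hedged sketch of the count (the exact totals a framework reports may differ slightly depending on what it includes):

```python
def dense_param_count(layer_sizes):
    """Total weights + biases for a fully connected net,
    e.g. [49152, 512, 256, 512, 10] for structure 49152-512-256-512-10."""
    return sum(i * o + o for i, o in zip(layer_sizes, layer_sizes[1:]))
```

For example, `dense_param_count([49152, 512, 256, 512, 10])` shows why the first layer dominates: over 25 million of the parameters sit between the 49,152-dimensional input and the first hidden layer.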
No | CNN Structure | Training Loss/Accuracy | Validation Loss/Accuracy |
---|---|---|---|
1 | C8(K3_ReLU_MP3)_F32(ReLU)_F10(SM) | 9.8379/0.1226 | 9.7326/0.1270 |
2 | C8(K3_ReLU_MP3)_C16(K3_ReLU_MP3)_F32(ReLU)_F10(SM) | 0.0148/0.9980 | 0.0533/0.9878 |
3 | C8(K5_ReLU_MP3)_C16(K3_ReLU_MP3)_F32(ReLU)_F10(SM) | 0.0096/0.9982 | 0.0332/0.9936 |
No | CNN Structure | Training Loss/Accuracy | Validation Loss/Accuracy |
---|---|---|---|
1 | C8(K3_ReLU_MP3)_C16(K3_ReLU_MP3)_F32(ReLU)_F10(SM) | 0.0148/0.9980 | 0.0533/0.9878 |
2 | C8(K3_ReLU_MP3)_C16(K3_ReLU_MP3)_F64(ReLU)_F10(SM) | 0.0229/0.9982 | 0.0656/0.9893 |
3 | C16(K3_ReLU_MP3)_C32(K3_ReLU_MP3)_F32(ReLU)_F10(SM) | 0.0227/0.9960 | 0.0677/0.9824 |
4 | C16(K3_ReLU_MP3)_C32(K3_ReLU_MP3)_F64(ReLU)_F10(SM) | 0.0596/0.9807 | 0.1548/0.9632 |
5 | C8(K5_ReLU_MP3)_C16(K3_ReLU_MP3)_F32(ReLU)_F10(SM) | 0.0096/0.9982 | 0.0332/0.9936 |
6 | C8(K5_ReLU_MP3)_C16(K3_ReLU_MP3)_F64(ReLU)_F10(SM) | 0.0100/0.9981 | 0.0278/0.9932 |
7 | C16(K5_ReLU_MP3)_C32(K3_ReLU_MP3)_F32(ReLU)_F10(SM) | 0.0175/0.9981 | 0.0224/0.9976 |
8 | C16(K5_ReLU_MP3)_C32(K3_ReLU_MP3)_F64(ReLU)_F10(SM) | 0.0084/0.9981 | 0.0313/0.9929 |
No | ResNet Structure | Training Loss/Accuracy | Validation Loss/Accuracy |
---|---|---|---|
1 | ResNet 18 | 0.8864/0.9980 | 0.8930/0.9958 |
2 | ResNet 34 | 1.6018/0.9976 | 1.6090/0.9956 |
3 | ResNet 50 | 4.8866/0.9981 | 4.9844/0.9958 |
4 | ResNet 101 | 9.7607/0.9983 | 9.7590/0.9962 |
5 | ResNet 152 | 13.7344/0.9982 | 13.7266/0.9968 |
Performance | MLP | CNN_1 | CNN_2 | ResNet 18 |
---|---|---|---|---|
Loss (Training/Validation) | 0.0173/0.0635 | 0.0229/0.0656 | 0.0175/0.0224 | 0.8864/0.8930 |
Accuracy (Training/Validation) | 0.9982/0.9849 | 0.9982/0.9893 | 0.9981/0.9976 | 0.9980/0.9958 |
Training time (secs per epoch) | 47.2 | 43.0 | 30.2 | 51.1 |
Total number of weights and biases | 25,439,498 | 235,978 | 119,514 | 11,200,458 |
Actual \ Predicted | Type 1 | Type 2 | Type 3 | Type 4 | Type 5 | Type 6 | Type 7 | Type 8 | Type 9 | Type 10 |
---|---|---|---|---|---|---|---|---|---|---|
Type 1 | 2636 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
Type 2 | 0 | 3005 | 0 | 51 | 0 | 0 | 0 | 0 | 0 | 0 |
Type 3 | 0 | 0 | 2554 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
Type 4 | 0 | 0 | 0 | 2587 | 0 | 0 | 0 | 0 | 0 | 0 |
Type 5 | 0 | 0 | 0 | 0 | 2501 | 0 | 0 | 0 | 0 | 0 |
Type 6 | 0 | 0 | 0 | 0 | 0 | 2602 | 0 | 0 | 0 | 0 |
Type 7 | 0 | 0 | 0 | 0 | 0 | 0 | 2565 | 0 | 0 | 0 |
Type 8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2676 | 0 | 0 |
Type 9 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2821 | 0 |
Type 10 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2573 |
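In the confusion matrix above, correct predictions sit on the diagonal; the only off-diagonal entries are the 51 Type 2 wafers predicted as Type 4. Overall accuracy follows directly, as this small sketch shows (function name illustrative):

```python
def accuracy(confusion):
    """Overall accuracy from a square confusion matrix (rows = actual)."""
    total = sum(sum(row) for row in confusion)
    correct = sum(confusion[i][i] for i in range(len(confusion)))
    return correct / total
```

Per-class recall can be read off the same way by dividing each diagonal entry by its row sum.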
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Shih, P.-C.; Hsu, C.-C.; Tien, F.-C. Automatic Reclaimed Wafer Classification Using Deep Learning Neural Networks. Symmetry 2020, 12, 705. https://doi.org/10.3390/sym12050705