Research on Low-Cost Non-Contact Vision-Based Wheel Arch Detection for End-of-Line Stage
Abstract
1. Introduction
1.1. Background and Motivation
1.2. Existing Approaches and Limitations
1.3. Proposed Method and Contributions
2. Key Technical Theory Principles
2.1. Monocular Camera Imaging Model
- World coordinate system (Ow): Its origin is the reference point on the workshop floor, directly beneath the vehicle’s front-left wheel. The Z-axis is perpendicular to the ground (pointing upward) and aligned with the wheel arch measurement direction; the X-axis is parallel to the vehicle’s forward direction; the Y-axis is horizontal and perpendicular to the X-axis. This system describes the actual spatial position of the apex point beneath the wheel arch.
- Camera coordinate system (Oc): Its origin is at the optical center of the camera. The Z-axis points along the optical axis toward the measured vehicle, while the X- and Y-axes are parallel to those of the world coordinate system; it acts as the transition between the world and image coordinate systems.
- Image coordinate system (Ouv): Its origin is the geometric center of the imaging plane, with the X- and Y-axes parallel to the corresponding axes of the camera coordinate system (unit: mm), reflecting the target’s physical position on the imaging plane. Note that this model departs from the standard image processing convention: here the X-axis points left and the Y-axis points downward, whereas the standard convention defines the X-axis as pointing right (with the Y-axis still pointing downward).
- Pixel coordinate system (U-V): Its origin is the top-right corner of the image, with the U-axis pointing horizontally left and the V-axis pointing vertically downward (unit: pixel). This system is the direct coordinate reference for computer image processing, as illustrated in Figure 1.
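Up to the left-pointing axis convention noted above, the chain of transforms from world to pixel coordinates is the standard pinhole projection. In textbook form (the symbols \(f_x, f_y, u_0, v_0, \mathbf{R}, \mathbf{t}\) are the usual generic intrinsics and extrinsics, not values from this paper):

```latex
% Standard pinhole projection chaining the four coordinate systems:
% s   -- projective scale factor (depth along the optical axis)
% f_x, f_y -- focal lengths in pixel units; (u_0, v_0) -- principal point
% R, t -- rotation and translation from world to camera coordinates
s \begin{bmatrix} u \\ v \\ 1 \end{bmatrix}
  = \underbrace{\begin{bmatrix} f_x & 0 & u_0 \\ 0 & f_y & v_0 \\ 0 & 0 & 1 \end{bmatrix}}_{\text{intrinsics}}
    \underbrace{\begin{bmatrix} \mathbf{R} & \mathbf{t} \end{bmatrix}}_{\text{extrinsics}}
    \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix}
```

The intrinsic matrix is what the calibration of Section 2.2 estimates; the extrinsics fix the camera's pose relative to the workshop-floor reference point.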
2.2. Low-Cost Camera Calibration
2.3. Camera Distortion Correction
2.4. Wheel Arch Calculation Model
3. System Design and Implementation
3.1. Hardware System Design
3.1.1. Core Perception Module
3.1.2. Fixing and Positioning Module
3.1.3. Computation and Control Module
3.2. Wheel Arch Measurement Software System
3.2.1. Wheel Hub Detection Based on YOLOv5s
3.2.2. Image Preprocessing for Robust Wheel Arch Detection
3.2.3. Wheel Hub Center Feature Point Extraction and Error Compensation
- Geometric Calibration: Accurate calibration of the camera and system ensures that even small deviations in hub center position do not result in significant errors in the final measurements.
- Error Compensation: A compensation algorithm is employed to correct small errors caused by wheel hub center localization. This ensures that the final wheel arch measurement remains within the required ±1.0 mm precision.
- Segmentation: Building on the watershed concept, the MSER algorithm, which offers affine invariance and grayscale robustness, computes regional stability according to Equation (11) to segment the initial hub region (Figure 14).
- Optimization: The cv2.bitwise_or() function merges the discrete hub regions, and cv2.floodFill() then fills holes within the merged region, eliminating interference from the hub spokes and yielding the complete hub region (Figure 15).
- Fitting: Finally, cv2.minEnclosingCircle() fits the minimum enclosing circle to the complete region and outputs the hub center coordinates and radius (Figure 16); the final center positioning error is ≤0.39 mm, meeting the baseline accuracy requirement.
3.2.4. Wheel Arch Edge Contour Extraction for Height Calculation
- Edge Detection: The grayscale image obtained in the previous section is used as input. The Canny algorithm is applied with double thresholds of 60/180 and an aperture size of 3 to extract all edges in the image, resulting in an edge distribution map (Figure 17a).
- Region Filtering: A wheel arch-specific Region of Interest (ROI) is defined based on the wheel hub center position to filter out irrelevant edges, such as those from car doors and bumpers. This leaves only the wheel arch edges (Figure 17b).
- Contour Segmentation: The filtered wheel arch edges are segmented into line segments. Noise segments shorter than 50 pixels are discarded, while the approximately vertical main edges of the wheel arch, with orientation angles of 80–100°, are retained. This yields the wheel arch edge contour segments (Figure 17c).
- Result Output: The final wheel arch edges are marked with blue contour lines (Figure 17d), providing clear feature boundaries for subsequent height calculation.
3.2.5. Wheel Arch Calculation and Measurement Algorithm
4. Experimental Results and Analysis
4.1. On-Site Platform Construction
4.2. Experimental Design and Data Collection
4.2.1. Laboratory Simulation Scenario
- Vehicle Positioning and Calibration: Seven vehicle models (A to G) are tested. After the vehicle is positioned by the guide-wheel device at the side of the roller bed, the camera is calibrated with a calibration board to ensure that its imaging center is vertically aligned with the wheel hub center; the alignment deviation is held to ≤±0.2 mm.
- Lighting Adjustment: The light source power is dynamically adjusted according to real-time light intensity. It is set to 15 W at 500 lux to compensate for low light and reduced to 8 W at 1500 lux to suppress reflection. The camera parameters, such as exposure time (500 μs) and gain (1 dB), are fixed to avoid image quality differences caused by parameter fluctuations.
- Image Clarity Selection: For each vehicle model, three frames of the front wheel are captured in a single acquisition. Image sharpness is scored as the variance of the Laplacian [42]; frames with a variance > 300 are kept as sharp, while blurry frames caused by vibration are discarded.
- Reference True Value Measurement: The wheel arch reference values are measured using the CMM (Coordinate Measuring Machine), with an accuracy of ±0.02 mm. The reference values for the models are as follows: Model A—765 mm, Model B—795 mm, Model C—845 mm, Model D—875 mm, Model E—825 mm, Model F—710 mm, Model G—805 mm.
4.2.2. On-Site Production Line Scenario
- Lighting: The lighting system combines workshop ceiling lights with LED fill lights, holding the intensity at a constant 1000 ± 50 lux and shielding out natural light, so that illumination fluctuations cannot affect image feature extraction.
- Camera Setup: The camera is mounted horizontally at a preset angle relative to the production line roller track, ensuring that the wheel arch area appears in a standard projection form in the image.
- Ground Setup: The ground is the standard roller track of the production line, calibrated with a laser level to ensure a flatness of ≤0.3 mm/m. This eliminates any vehicle attitude deviations caused by ground tilt, ensuring the consistency of the measurement reference.
4.3. Error Quantification Analysis
4.4. Conclusions of Experimental Results
5. Conclusions and Future Outlook
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- GB/T 39263-2020; Road Vehicles—Advanced Driver Assistance Systems (ADAS)—Terms and Definitions. China Standards Press: Beijing, China, 2020.
- Zhang, X.; Sun, H. Analysis of GB/T 40429—2021 “Automotive driving automation classification”. China Auto 2022, 3–5+7. [Google Scholar]
- Yao, Y.; Jin, S.; Sun, X. Design of non-contact vehicle wheel arch gap measurement system based on image processing. Automot. Eng. 2024, 14–20. [Google Scholar] [CrossRef]
- Ding, Z.; Jiang, J.; Zheng, J.; Kong, L. Machine Vision-Based Method for Reconstructing the Vehicle Coordinate System in End-of-Line ADAS Calibration. Electronics 2024, 13, 3405. [Google Scholar] [CrossRef]
- Du, P.; Duan, Z.; Zhang, J.; Zhao, W.; Lai, E. High-Precision Measurement Method for Large Diameter Part Angles Based on Binocular Vision. J. Instrum. 2025, 46, 35–43. [Google Scholar]
- Steinemann, P.; Klappstein, J.; Dickmann, J.; Wünsche, H.-J.; Hundelshausen, F.V. Determining the outline contour of vehicles in 3D-LIDAR-measurements. In Proceedings of the 2011 IEEE Intelligent Vehicles Symposium, Baden-Baden, Germany, 5–9 June 2011; pp. 479–484. [Google Scholar]
- Bai, Q.; Zhang, N.; Huang, B. Research on Measuring Vehicle Outline Dimensions by Fusing Vision and Laser Point Clouds. In Proceedings of the 2024 5th International Conference on Computer Engineering and Application (ICCEA), Hangzhou, China, 19–21 April 2024; pp. 1152–1155. [Google Scholar]
- Wu, Z.; Huang, Y.; Huang, K.; Yan, K.; Chen, H. A Review of Non-Contact Water Level Measurement Based on Computer Vision and Radar Technology. Water 2023, 15, 3233. [Google Scholar] [CrossRef]
- Qian, W.; Wang, P.; Wang, H.; Wu, S.; Hao, Y.; Zhang, X.; Wang, X.; Sun, W.; Guo, H.; Guo, X. Research on a Method for Measuring the Pile Height of Materials in Agricultural Product Transport Vehicles Based on Binocular Vision. Sensors 2024, 24, 7204. [Google Scholar] [CrossRef]
- Li, X.; Shi, J.; Liu, P.; Zhang, M. Efficient Measurement Technology for Complex Parts Based on Optical CMM. Tool Technol. 2025, 59, 151–156. [Google Scholar]
- Wen, T.; He, J.; Zhang, C.; He, J. The Wheel–Rail Contact Force for a Heavy-Load Train Can Be Measured Using a Collaborative Calibration Algorithm. Information 2024, 15, 535. [Google Scholar] [CrossRef]
- Xu, W. Vehicle Load Identification Using Machine Vision and Displacement Influence Lines. Buildings 2024, 14, 392. [Google Scholar] [CrossRef]
- Zhu, C.; Yang, Y. Online Detection Algorithm of Automobile Wheel Surface Defects Based on Improved Faster-RCNN Model. Surf. Technol. 2020, 49, 359–365. [Google Scholar]
- Liu, B.; Wang, H.; Wang, Y.; Zhou, C.; Cai, L. Lane Line Type Recognition Based on Improved YOLOv5. Appl. Sci. 2023, 13, 10537. [Google Scholar] [CrossRef]
- Liu, Y.; Liu, L.; Zhang, S. Research on External Thread Measurement Technology Combining Contour Extraction Algorithm and Machine Vision. Autom. Instrum. 2025, 159–163. [Google Scholar] [CrossRef]
- Jia, Y.; Mei, F. Simulation of Image Blurred Edge Contour Extraction under Histogram Equalization. Comput. Simul. 2025, 42, 395–398+404. [Google Scholar]
- Huang, Q.; Du, Y. Research on Clothing Contour Extraction Algorithm Based on HSV + Canny Model and HED Network Model. Autom. Inf. Eng. 2024, 45, 1–9+17. [Google Scholar]
- Guo, K.; Wang, J.; Liu, X.; Wang, K. Research on U-Net-Based Workpiece Contour Extraction Method. J. Comb. Mach. Tools Autom. Technol. 2025, 8–12. [Google Scholar] [CrossRef]
- Lin, T. Design and Implementation of Multi-Size Wheel Hub Weld Detection and Positioning System Based on Deep Learning. Master’s Thesis, Chengdu University of Technology, Chengdu, China, 2023. [Google Scholar]
- Liu, Z.; Ren, C. A method of detecting the cross-section of ring components based on line structured light. In Proceedings of the 2024 6th International Conference on Intelligent Control, Measurement and Signal Processing (ICMSP), Xi’an, China, 10–12 May 2024; pp. 288–291. [Google Scholar]
- Pondech, W.; Saenthon, A.; Konghuayrob, P. The Development of Adaptive Gray Level Mapping Combined Particle Swarm Optimization for Measuring the Diameter Size of Automotive Nut. In Proceedings of the 2020 IEEE 7th International Conference on Industrial Engineering and Applications (ICIEA), Bangkok, Thailand, 16–19 April 2020; pp. 240–244. [Google Scholar]
- Yang, Y.; Shi, J. Research on Automotive Wheel Hub Detection Based on Machine Vision System. Electr. Switchg. 2021, 59, 34–36. [Google Scholar]
- Chen, J.S. Research on Automotive Wheel Hub Aperture Size Online Measurement System Based on Machine Vision. Master’s Thesis, Xi’an University of Electronic Science and Technology, Xi’an, China, 2024. [Google Scholar] [CrossRef]
- Chen, Y.; Mao, C.; Cai, E.; Liu, C. Design of Precision Measurement System for Part Dimensions Based on Vision Technology. Instrum. Technol. Sens. 2025, 49–52. [Google Scholar]
- Yu, H.; Yan, W.; Sun, J.; Wang, H.; Zhang, L. Design of Intelligent Measurement System of Vehicle Dimensions Based on Structured Light Imaging and Machine Vision. In Proceedings of the 2019 6th International Conference on Systems and Informatics (ICSAI), Shanghai, China, 2–4 November 2019; pp. 1238–1243. [Google Scholar]
- Maślanka, M.; Jancarczyk, D.; Rysiński, J. Integration of Machine Vision and PLC-Based Control for Scalable Quality Inspection in Industry 4.0. Sensors 2025, 25, 6383. [Google Scholar] [CrossRef]
- Yuan, C.; Huo, C.; Tong, Z.; Men, G.; Wang, Y. Research on Vehicle Detection Algorithm of Driver Assistance System Based on Vision. In Proceedings of the 2019 Chinese Control And Decision Conference (CCDC), Nanchang, China, 3–5 June 2019; pp. 1024–1027. [Google Scholar]
- Kong, X.; Zhang, J.; Wang, T.; Huang, Q.; Deng, L. Non-contact vehicle weighing method based on tire deformation recognition by image. China J. Highw. Transp. 2022, 35, 186–193. [Google Scholar] [CrossRef]
- Ye, N.; Huang, Y.; Zheng, W.; Yu, Q. Study on accuracy comparison of coordinate measuring machines. Metrol. Meas. Technol. 2025, 51, 12–16+22. [Google Scholar] [CrossRef]
- Bao, Y.; Liu, J.; Yang, H.; Jiang, S.; Yuan, B.; Yang, P. Non-contact rail profile detection system based on laser profiler. Electron. Sci. Technol. 2020, 33, 28–33+86. [Google Scholar] [CrossRef]
- Wang, Y.K. Research on Visual Detection Method of Automobile Wheel Alignment Parameters Based on Binocular Vision and Light Field Network. Ph.D. Thesis, Jilin University, Changchun, China, 2025. [Google Scholar] [CrossRef]
- Yang, X. Research on Parameter Calibration Method of Monocular Structured Light System Based on Telecentric Lens. Ph.D. Thesis, Shenzhen University, Shenzhen, China, 2023. [Google Scholar] [CrossRef]
- Sun, C.; Shi, J.; Wang, M.; Ding, J. Research on calibration method of monocular line structured light 3D reconstruction system. Opt. Tech. 2024, 50, 698–705. [Google Scholar] [CrossRef]
- Zeng, Z.Q.; Feng, P.P.; Li, Z.H. Research on accurate correction of distortion of large field camera in image measurement. Mach. Tool Hydraul. 2022, 50, 21–25. [Google Scholar] [CrossRef]
- Yang, G.; Zhang, Z.; Sang, H.; Tang, H.; Wang, M. TMSS-YOLO: An industrial tool detection method based on improved YOLOv11. Signal Image Video Process. 2025, 19, 1217. [Google Scholar] [CrossRef]
- Chen, J.; Chen, X. Research on optical coherence tomography image enhancement based on multi-scale Retinex. Laser J. 2023, 44, 110–114. [Google Scholar] [CrossRef]
- Yang, Y.; Song, C.H. Gaussian adaptive multi-scale weighted filtering dehazing algorithm. Comput. Appl. Softw. 2022, 39, 187–192+265. [Google Scholar] [CrossRef]
- He, Z.; Li, Y.; Yan, S.; Zhang, Z. Research on digital image edge detection algorithm. Appl. Electron. Tech. 2025, 51, 70–73. [Google Scholar] [CrossRef]
- Zhang, C. A vision-based lane line detection method. Automob. Manuf. Ind. 2024, 4, 20–25. [Google Scholar]
- Wang, J.; Yang, G.; Wang, Y.; Mao, X. Target detection in vehicle-mounted millimeter-wave radar SAR images based on MSER. Mod. Radar 2025, 47, 1–7. [Google Scholar] [CrossRef]
- Liu, H.; Zhu, D.; Zhu, Y.; Xie, X.; Zhao, H. Identification of flight area identification plate based on an improved MSER algorithm. Int. J. Aerosp. Eng. 2022, 2022, 8374300. [Google Scholar] [CrossRef]
- Yuan, S.; Zhao, W.; Deng, J.D.; Xia, S.; Li, X. Quantum image edge detection based on Laplacian of Gaussian operator. Quantum Inf. Process. 2024, 23, 178. [Google Scholar] [CrossRef]

| Method | Row | Column | Radius | Measured Value/mm | Reference Value/mm |
|---|---|---|---|---|---|
| Threshold method | 1151.28 | 620.446 | 506.697 | 498.705 | 498 |
| MSER method | 1149.01 | 621.921 | 505.586 | 497.609 | 498 |
| Canny method | 1149.39 | 622.056 | 506.498 | 498.508 | 498 |
| No. | A Measurement Value/mm | B Measurement Value/mm | C Measurement Value/mm | D Measurement Value/mm | E Measurement Value/mm | F Measurement Value/mm | G Measurement Value/mm |
|---|---|---|---|---|---|---|---|
| 1 | 764.08 | 795.466 | 845.798 | 874.125 | 825.747 | 709.841 | 805.462 |
| 2 | 764.477 | 794.301 | 844.373 | 875.42 | 825.344 | 710.036 | 804.428 |
| 3 | 764.113 | 795.502 | 845.274 | 874.416 | 824.896 | 709.021 | 804.789 |
| 4 | 765.858 | 794.832 | 845.692 | 875.508 | 824.867 | 710.483 | 805.323 |
| 5 | 765.094 | 794.606 | 844.887 | 874.594 | 825.518 | 710.397 | 804.887 |
| 6 | 764.731 | 795.589 | 844.338 | 874.789 | 825.192 | 709.71 | 804.128 |
| 7 | 764.502 | 794.318 | 845.289 | 874.213 | 825.409 | 709.048 | 804.568 |
| 8 | 765.694 | 795.499 | 844.449 | 875.567 | 824.697 | 709.589 | 804.267 |
| 9 | 765.338 | 794.253 | 844.651 | 874.538 | 824.632 | 710.86 | 805.326 |
| 10 | 764.439 | 795.348 | 845.872 | 875.656 | 825.801 | 710.688 | 805.629 |
| Average value/mm | 764.833 | 794.971 | 845.062 | 874.883 | 825.210 | 709.967 | 804.881 |
| No. | A Measurement Value/mm | B Measurement Value/mm | C Measurement Value/mm | D Measurement Value/mm | E Measurement Value/mm | F Measurement Value/mm | G Measurement Value/mm |
|---|---|---|---|---|---|---|---|
| 1 | 764.851 | 794.794 | 844.884 | 875.005 | 824.836 | 709.988 | 804.819 |
| 2 | 764.868 | 794.891 | 844.905 | 875.148 | 825.174 | 709.995 | 804.805 |
| 3 | 764.840 | 794.841 | 844.802 | 874.973 | 824.946 | 709.989 | 804.837 |
| 4 | 764.903 | 794.788 | 845.103 | 874.982 | 825.036 | 709.977 | 804.798 |
| 5 | 764.842 | 795.010 | 845.061 | 874.849 | 825.099 | 709.975 | 804.808 |
| 6 | 764.783 | 794.992 | 844.999 | 874.913 | 824.985 | 709.981 | 804.809 |
| 7 | 764.787 | 794.455 | 844.997 | 875.029 | 824.878 | 709.980 | 804.832 |
| 8 | 764.946 | 794.949 | 844.940 | 875.164 | 825.133 | 709.987 | 804.827 |
| 9 | 764.766 | 794.936 | 844.988 | 875.051 | 825.122 | 709.976 | 804.820 |
| 10 | 764.886 | 794.817 | 844.869 | 875.062 | 824.863 | 709.961 | 804.829 |
| Average value/mm | 764.8472 | 794.8473 | 844.9548 | 875.0176 | 825.0072 | 709.9809 | 804.8184 |
| Model | Laboratory MAE (mm) | Laboratory Max Error (mm) | Laboratory SD (mm) | On-Site MAE (mm) | On-Site Max Error (mm) | On-Site SD (mm) |
|---|---|---|---|---|---|---|
| A | 0.388 | ±0.92 | 0.612 | 0.026 | ±0.234 | 0.071 |
| B | 0.288 | ±0.747 | 0.589 | 0.046 | ±0.548 | 0.082 |
| C | 0.33 | ±0.872 | 0.595 | 0.01 | ±0.198 | 0.069 |
| D | 0.334 | ±0.875 | 0.603 | 0.009 | ±0.164 | 0.067 |
| E | 0.205 | ±0.801 | 0.578 | 0.014 | ±0.174 | 0.073 |
| F | 0.375 | ±0.979 | 0.621 | 0.0004 | ±0.039 | 0.065 |
| G | 0.269 | ±0.872 | 0.591 | 0.033 | ±0.202 | 0.07 |
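The indicators tabulated above can be reproduced from the per-frame measurement tables with a short routine. This is a generic sketch, not the authors' script; in particular, the paper does not state whether SD uses the sample or population convention, so the sample form (ddof=1) is assumed here.

```python
import numpy as np

def error_stats(measurements, reference):
    """Per-model error indicators: mean absolute error (MAE), largest
    absolute deviation (reported as a +/- bound in the table), and the
    sample standard deviation of the repeated measurements."""
    m = np.asarray(measurements, dtype=float)
    dev = m - reference
    mae = float(np.mean(np.abs(dev)))
    max_err = float(np.max(np.abs(dev)))  # worst-case deviation magnitude
    sd = float(np.std(m, ddof=1))         # sample SD (assumed convention)
    return mae, max_err, sd
```

For example, ten repeated measurements of one model against its CMM reference value yield the three numbers in the model's table row.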
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license.
Share and Cite
Ding, Z.; Lin, M.; Ding, Y.; Li, Y.; Zhang, Q. Research on Low-Cost Non-Contact Vision-Based Wheel Arch Detection for End-of-Line Stage. Sensors 2026, 26, 234. https://doi.org/10.3390/s26010234
