A Scaffolding Assembly Deficiency Detection System with Deep Learning and Augmented Reality
Abstract
1. Introduction
- Visual Inspection
- Measurement Inspection
- Strain Monitoring Inspection
2. Materials and Methods
2.1. Problem Statement and Materials
2.2. Scaffolding Assembly Deficiency Detection System
2.3. Implementation Methods
2.3.1. Deficiency-Recognition Module
2.3.2. AR Visualization Module
Step 1: Marker Setup
Step 2: Implementation and Initialization of Markers in Unity
- Navigate to the Universal Windows Platform setup page by selecting Build Settings/Platform/Universal Windows Platform.
- Configure initialization parameters by choosing Project Settings/XR Plug-in Management/Windows and activating Initialize XR on Startup.
- Specify the XR device by selecting Project Settings/XR Plug-in Management/Windows and activating OpenXR and the Microsoft HoloLens feature group.
- Integrate the mobile control module by adding the Object Manipulator Script.
- Incorporate the hand gesture recognition module by accessing Project Settings/OpenXR/Interaction Profiles. Choose “Eye Gaze Interaction Profile,” “Microsoft Hand Interaction Profile,” and “Microsoft Motion Controller Profile” (Figure 6).
- Enable the hand recognition module by selecting Project Settings/OpenXR/OpenXR Feature Groups and activating Microsoft HoloLens’ “Hand Tracking” and “Motion Controller Model”.
2.3.3. HL2 Visualization Module
3. Results
4. Field Test and Discussion
- The camera shooting angle should be as orthogonal to the target wall face as possible. Even at oblique angles, the recognition module eventually recognized the deficiencies in some frames as the wearer approached them. However, because the attached alert frames are always orthogonal squares, an oblique view may cause the wearer to associate an alert with the wrong frame. This problem is avoided as long as the camera shooting angle is orthogonal to the target wall face.
- When shooting at an oblique angle with respect to the target wall face, far-away frames may not be recognized by the module owing to self-occlusion. This problem is understandable because even humans cannot evaluate those frames in the same situation, and those frames will be evaluated correctly once the camera moves toward them in the absence of occlusions.
- Considering the practical use case, the tests were performed from the ground in front of the scaffolds, without climbing onto the scaffolding boards, both to enhance work efficiency and the inspector's safety and to capture multiple frames at a glance. As long as the shooting angle was near orthogonal to the target wall face, an image containing 20–50 frames posed no problem for SADDS, making it more efficient than inspection by the human eye. Nevertheless, in double-frame scaffold systems, most inner frames will not be recognized by the system owing to occlusion by the outer frames. Although one may stand on a scaffolding board to shoot the inner frames without occlusion, the number of frames covered in an image would be very limited, and the frames would need to be checked one by one; in such cases, direct inspection by the human eye is more convenient.
- Before the field test, we were concerned about the system's ability to recognize missing cross-tie rods, which had the lowest precision among the three types of target deficiencies (0.82, compared with, for example, 0.90 for missing lower-tie rods). However, this was not a problem during the field test. A possible explanation is that in training and testing, precision was calculated per image, so every misidentification counted against the score, whereas in the field the images were processed as a stream: as the HL2 wearer moved, SADDS had many chances to identify a deficiency and eventually alert the wearer.
- The scaffolds at both test sites were enclosed by safety nets (e.g., anti-fall or dustproof nets), which did not affect the recognition accuracy of SADDS so long as human eyes could see through the net. Indeed, in the presence of safety nets, it was more difficult for humans to recognize unqualified assemblies from a distance than it was for SADDS.
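The stream-based behavior noted above can be approximated with a simple temporal persistence filter: an alert fires only once the same deficiency class has been detected in at least k of the last n frames, so a single-frame misidentification never reaches the wearer. The following is a minimal sketch of this idea; the class names, window size, and threshold are illustrative assumptions, not part of SADDS as published:

```python
from collections import deque

class PersistenceFilter:
    """Raise an alert only when a deficiency class appears in
    at least `k` of the last `n` per-frame detection results."""

    def __init__(self, n=10, k=4):
        self.n, self.k = n, k
        self.history = deque(maxlen=n)  # one set of detected classes per frame

    def update(self, detected_classes):
        """detected_classes: set of class names found in the current frame.
        Returns the set of classes persistent enough to alert on."""
        self.history.append(set(detected_classes))
        counts = {}
        for frame in self.history:
            for cls in frame:
                counts[cls] = counts.get(cls, 0) + 1
        return {cls for cls, c in counts.items() if c >= self.k}

# Example: a sporadic detection never fires; a persistent finding does.
f = PersistenceFilter(n=5, k=3)
stream = [
    {"missing_cross_tie"},               # frame 1
    set(),                               # frame 2 (missed detection)
    {"missing_cross_tie"},               # frame 3
    {"missing_cross_tie", "qualified"},  # frame 4
]
alerts = [f.update(frame) for frame in stream]
```

A filter like this trades a short alert latency for robustness against the per-frame misidentifications that lowered the cross-tie rod precision in testing.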
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
| Qualified | Missing Cross-Tie Rod | Missing Lower-Tie Rod | Missing Footboard |
|---|---|---|---|
| 0: yellow (763) | 1: magenta (245) | 2: purple (575) | 3: red (643) |
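The instance counts above are noticeably imbalanced: the missing cross-tie rod class has far fewer examples than the others. One common mitigation, shown here as an illustrative sketch rather than a step the authors describe, is to weight each class inversely to its frequency during training:

```python
# Per-class instance counts from the table above.
counts = {
    "qualified": 763,
    "missing_cross_tie_rod": 245,
    "missing_lower_tie_rod": 575,
    "missing_footboard": 643,
}

total = sum(counts.values())
n_classes = len(counts)

# Inverse-frequency weights, normalized so a perfectly balanced
# dataset would give every class a weight of 1.0.
weights = {cls: total / (n_classes * c) for cls, c in counts.items()}

for cls, w in sorted(weights.items(), key=lambda kv: -kv[1]):
    print(f"{cls}: {w:.2f}")
```

Under this scheme the rarest class (missing cross-tie rod, 245 instances) receives the largest weight, which is consistent with it also being the class with the lowest test precision.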
| | mAP | Qualified | Missing Cross-Tie Rod | Missing Lower-Tie Rod | Missing Footboard |
|---|---|---|---|---|---|
| Validation | 0.94 | 0.97 | 0.96 | 0.937 | 0.93 |
| Test | 0.88 | 0.95 | 0.80 | 0.90 | 0.88 |
| | mAP | Qualified | Missing Cross-Tie Rod | Missing Lower-Tie Rod | Missing Footboard |
|---|---|---|---|---|---|
| Validation | 0.96 | 0.98 | 0.979 | 0.90 | 0.96 |
| Test | 0.89 | 0.96 | 0.82 | 0.90 | 0.89 |
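As a sanity check on the tables above, the reported mAP is the arithmetic mean of the four per-class average precisions; for example, for the test row of the second table:

```python
# Per-class average precision (test set, second table above).
ap = {
    "qualified": 0.96,
    "missing_cross_tie_rod": 0.82,
    "missing_lower_tie_rod": 0.90,
    "missing_footboard": 0.89,
}

# mAP = mean of the per-class average precisions.
m_ap = sum(ap.values()) / len(ap)
print(round(m_ap, 2))  # matches the reported test mAP of 0.89
```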
| | Box Loss | Object Loss | Class Loss |
|---|---|---|---|
| Validation | 0.0020 | 0.0032 | 0.0030 |
| Test | 0.0021 | 0.0041 | 0.0037 |
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Dzeng, R.-J.; Cheng, C.-W.; Cheng, C.-Y. A Scaffolding Assembly Deficiency Detection System with Deep Learning and Augmented Reality. Buildings 2024, 14, 385. https://doi.org/10.3390/buildings14020385