Workers’ Unsafe Actions When Working at Heights: Detecting from Images
Abstract
1. Introduction
2. Related Works
2.1. Safety Management in High Places
2.2. Computer Vision in Construction
3. Methods
3.1. Research Framework
- (1) Before building the automatic recognition model, the workers’ unsafe actions to be detected must be defined. This research lists five types of worker actions that are likely to cause safety accidents when working at heights.
- (2) The second stage is data acquisition: collecting images that contain features of workers’ unsafe actions for deep learning training, validation, and testing. In this step, Red-Green-Blue (RGB) and contrast-enhanced images are combined to strengthen the dataset (a minimal enhancement sketch follows this list).
- (3) The model development stage then covers deep learning training and testing.
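The contrast-enhancement step in stage (2) could, for illustration, be implemented as below. This is a minimal sketch assuming OpenCV is used; the enhancement method (CLAHE), the folder names, and the parameters are assumptions for illustration, not the authors' documented procedure.

```python
# Minimal sketch: derive contrast-enhanced copies of the RGB images so both
# versions can be pooled into the training dataset. Assumes OpenCV; the CLAHE
# settings and folder names are illustrative, not the authors' documented setup.
import os
import cv2

def contrast_enhance(image_bgr):
    """Apply CLAHE to the luminance channel of a BGR image."""
    lab = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2LAB)
    l, a, b = cv2.split(lab)
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    return cv2.cvtColor(cv2.merge((clahe.apply(l), a, b)), cv2.COLOR_LAB2BGR)

def build_enhanced_copies(src_dir="images_rgb", dst_dir="images_enhanced"):
    """Write a contrast-enhanced copy of every readable image in src_dir."""
    os.makedirs(dst_dir, exist_ok=True)
    for name in os.listdir(src_dir):
        img = cv2.imread(os.path.join(src_dir, name))
        if img is None:  # skip files OpenCV cannot decode
            continue
        cv2.imwrite(os.path.join(dst_dir, name), contrast_enhance(img))

if __name__ == "__main__":
    build_enhanced_copies()
```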
3.2. Definition of Workers’ Unsafe Actions
3.3. Data Acquisition
3.4. Model Development
4. Experiment and Results
4.1. Experiment
- (1) Each student performed the five actions, i.e., throwing, lying, relying, jumping, and working without a helmet, in their usual manner.
- (2) Each type of unsafe action was captured in different scenarios, with different shooting angles and lighting conditions.
- (3) For each class of unsafe action, 3–5 sequential images were collected as a group to reflect a continuously varying action.
- (4) Images of poor quality were filtered out and deleted, such as indistinct images and images in which the target occupied only a small proportion of the frame (see the preprocessing sketch after this list).
- (5)
- (6) Samples in the dataset were resized to a resolution of 375 × 500.
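A possible implementation of the filtering and resizing in steps (4) and (6) is sketched below. It assumes Pillow and a simple bounding-box annotation; the quality threshold, the orientation of the 375 × 500 target size, and the helper names are illustrative assumptions rather than the authors' actual pipeline.

```python
# Sketch of steps (4) and (6): discard samples whose annotated target covers too
# small a share of the frame, then resize the kept images to 375 x 500.
# Pillow, the 2% threshold, the (width, height) orientation, and the bounding-box
# format are assumptions for illustration only.
from PIL import Image

TARGET_SIZE = (375, 500)   # assumed (width, height)
MIN_TARGET_RATIO = 0.02    # assumed threshold for "target too small"

def keep_sample(image_path, bbox):
    """Return True if the annotated target (x_min, y_min, x_max, y_max) is large enough."""
    with Image.open(image_path) as img:
        width, height = img.size
    x_min, y_min, x_max, y_max = bbox
    target_area = max(0, x_max - x_min) * max(0, y_max - y_min)
    return target_area / float(width * height) >= MIN_TARGET_RATIO

def resize_sample(image_path, out_path):
    """Resize a kept sample to the dataset resolution."""
    with Image.open(image_path) as img:
        img.convert("RGB").resize(TARGET_SIZE, Image.BILINEAR).save(out_path)
```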
4.2. Results
4.2.1. Sample Source Analysis
4.2.2. False Positives (FPs) Analysis
4.2.3. Helmet Detection Analysis
5. Discussion
- (1) Whether a deep learning algorithm can fully extract image features.
- (2) Whether the dataset adequately represents the detection object. Deep learning learns the features contained in a dataset, so the characteristics captured by the dataset determine the final performance of the model.
- (3) The quality of the image used for recognition also affects robustness. This includes the quality of the data acquisition equipment, the shooting angle, occlusion of the target, and environmental factors such as lighting, rain, and fog that degrade image quality; a simple illustration of such degradations follows this list.
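To make point (3) concrete, the sketch below simulates low-light and fog conditions so that a trained detector's robustness could be spot-checked on degraded inputs. The paper does not report such an experiment; the functions and parameters here are assumptions for illustration.

```python
# Illustrative only: simulate the low-light and fog conditions mentioned above
# so a trained detector's robustness could be spot-checked on degraded inputs.
# The paper does not report such an experiment; parameters are arbitrary.
import cv2
import numpy as np

def darken(image_bgr, gamma=2.0):
    """Gamma correction with gamma > 1 approximates poor lighting."""
    lut = ((np.arange(256) / 255.0) ** gamma * 255).astype(np.uint8)
    return cv2.LUT(image_bgr, lut)

def add_fog(image_bgr, strength=0.5):
    """Blend the image toward white to mimic haze or fog."""
    white = np.full_like(image_bgr, 255)
    return cv2.addWeighted(image_bgr, 1.0 - strength, white, strength, 0)
```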
6. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
| No. | Categories | Unsafe Action Descriptions |
|---|---|---|
| 1 | Throwing | 1.1 Throw waste and leftover materials at will. |
|   |          | 1.2 Throw fragments down when working on building exterior walls. |
|   |          | 1.3 Throw tools and materials up and down. |
|   |          | 1.4 Throw rubbish from windows. |
|   |          | 1.5 Throw dismantled objects and remaining materials arbitrarily. |
|   |          | 1.6 Throw broken glass downwards when installing skylights. |
| 2 | Relying | 2.1 Rely on the protective railing. |
|   |         | 2.2 Rely or ride on the window rails when painting windows. |
| 3 | Lying | 3.1 Lie on scaffold boards and operating platforms. |
| 4 | Jumping | 4.1 Jump up and down shelves. |
| 5 | With no helmet | 5.1 Workers fail to use safety protection equipment correctly when entering an area with a risk of falling objects. |
| Unsafe Actions | Distribution |
|---|---|
| Jumping | 11.12% |
| Throwing | 21.41% |
| Relying | 22.12% |
| Lying | 15.82% |
| With no helmet | 29.53% |
| Indicators | Accuracy | Precision | Recall | F1-Measure |
|---|---|---|---|---|
| Original | 93.46% | 99.71% | 93.72% | 96.63% |
| Comparison | 83.38% | 87.79% | 63.79% | 73.89% |
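For clarity, the indicators reported above follow the standard definitions in terms of true/false positives and negatives (TP, FP, TN, FN):

$$
\mathrm{Accuracy}=\frac{TP+TN}{TP+TN+FP+FN},\quad
\mathrm{Precision}=\frac{TP}{TP+FP},\quad
\mathrm{Recall}=\frac{TP}{TP+FN},\quad
\mathrm{F1}=\frac{2\cdot\mathrm{Precision}\cdot\mathrm{Recall}}{\mathrm{Precision}+\mathrm{Recall}}
$$

For example, the Original row's F1 follows from its precision and recall: 2 × 0.9971 × 0.9372 / (0.9971 + 0.9372) ≈ 0.966.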