Review

A Synthetic Review of UAS-Based Facility Condition Monitoring

1 Department of Architectural Engineering, Hanbat National University, Daejeon 34158, Republic of Korea
2 Department of Building and Plant Engineering, Hanbat National University, Daejeon 34158, Republic of Korea
* Author to whom correspondence should be addressed.
Drones 2022, 6(12), 420; https://doi.org/10.3390/drones6120420
Submission received: 13 November 2022 / Revised: 5 December 2022 / Accepted: 12 December 2022 / Published: 15 December 2022
(This article belongs to the Special Issue Application of UAS in Construction)

Abstract

Facility inspections are mainly carried out through manual visual inspections. However, it is difficult to determine the extent of damage to facilities in this way, and the result depends on the subjective opinion of the manager in charge of the monitoring. Additionally, when inspectors inspect facilities that cannot be safely accessed, such as high-rise buildings, there is a high risk of fatal accidents. For these reasons, the construction industry is conducting research into unmanned aircraft system (UAS)-based facility inspections. These studies have focused on developing technologies or processes for using UAS in facility condition monitoring, ranging from infrastructure systems to commercial buildings. This study conducted an extensive and synthetic review of recent studies in UAS-based facility monitoring using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) method. A total of 32 papers were selected and classified by the types of facilities and the technologies addressed in the studies. This paper analyzes the trends of recent studies by synthesizing the selected papers and consolidates further directions for UAS applications and studies in facility monitoring domains.

1. Introduction

The global outbreak of COVID-19 slowed the construction industry's economy, resulting in reduced short-term construction demand and a negative impact on long-term investments [1]. McKinsey & Company recently emphasized the need to accelerate the digitalization of the construction industry to improve its productivity [2]. With digitalization, the construction industry can expect a reconstruction of the supply chain that can improve resilience in the short term and, in the long term, increase investments in information technology integration, expand off-site construction, and improve environmental sustainability [3]. The construction industry is improving both productivity and safety by using fourth industrial revolution technologies, such as unmanned aircraft systems (UAS), artificial intelligence (AI), cloud computing, virtual reality (VR), and augmented reality (AR) [4]. Among the various fields that can use these technologies, construction is expected to be one of the sectors that can benefit most from them [5]. Facilities require regular and precise inspections, thus requiring the production and management of inspection data. For example, case studies by the Teal Group (2016) and EuroConsult (2020) found that the facility inspection and construction industry has high potential for the use and commercialization of UAS in the next 10 years [6].
Facility inspections are currently conducted as visual inspections. A second inspection with testing devices might be required when the first, visually conducted inspection is not sufficient to determine the condition of the facility [4,7,8]. Since visual inspection relies on the subjective perspective of the inspector, there are limitations in obtaining objective visual data on damage to facilities [4]. Therefore, it is difficult to conduct objective analysis and make decisions based on the data. Moreover, when inspecting areas that are difficult for people to access, there is a risk of human casualties [4,7]. To address these challenges, UAS-based facility inspection has been introduced. For example, FLYABILITY uses UAS to inspect the narrow spaces under bridges. This resulted in a significant drop in costs compared to past visual inspections while reducing inspection time by at least 25% [9]. Another example is the deployment of DRONEID to the site where a Russian cargo ship crashed into the Gwangan Bridge in Busan on 28 February 2019 to investigate the damage and continue monitoring the crash site [10].
UAS can fly near facilities using various sensors to capture digital image data, and they can improve the efficiency and safety of facility maintenance inspections. Therefore, many studies have been published on developing UAS-based facility condition monitoring by adopting deep learning [11,12] or image processing [13,14]. Likewise, there is a large body of studies on UAS applications in the facility monitoring domain, which contribute new applications as well as a better understanding of how those UAS applications are used to collect and analyze visual data. Additionally, the studies address various types of facilities (e.g., bridges or buildings) in adopting the technologies. However, it is necessary to conduct a secondary study (Level 2 in Figure 1) that can synthesize the context of the various UAS applications according to the type of technology used and facilities, as well as consolidate further directions to address the current challenges. Therefore, the main goal of this study is to systematically synthesize and review the UAS-related primary studies (Level 1 in Figure 1) in facility monitoring contexts and to provide an overview of future directions. This study adopted the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) method to examine the current state of UAS-based facility inspection applications and further research directions.

2. Facility Condition Monitoring

Currently, most facility condition inspections are conducted by an inspector who performs visual inspections [7]. This method requires considerable time and expense, and it is challenging to access skyscrapers and the undersides of bridges. In addition, buildings are becoming taller and larger than in the past, and, with time, the number of old facilities requiring periodic condition monitoring has increased [8]. Therefore, it is necessary to find efficient and economical schemes beyond the current visual inspections, and UAS-based facility condition monitoring has recently been introduced to solve these issues.
Research has explored UAS-integrated monitoring systems for various facilities, such as airports [13], bridges [7,15], and buildings [16,17]. Furthermore, advanced data processing techniques, such as photogrammetry, image processing [13,14], and deep learning [11,12], are used to identify damage to facilities. Among them, research was conducted to visually identify damage to bridges, such as cracks, efflorescence, and detachment, based on images captured using UAS [18]. The most common research focus was visually identifying damage to facilities using UAS, and surveys have also been conducted on various facilities other than bridges [19,20]. Additionally, there has been research on identifying damage to facilities using point cloud data collected by UAS and conducting 3D modeling with it [21]. 3D modeling of facilities through point cloud data is being used more intensively as a part of UAS-based facility inspections. Deep learning-based studies were also introduced to identify cracks automatically [22], which can be very effective for resolving problems that occur during manual facility inspections. In addition to examining facility damage and inspection, various other studies have been conducted, such as economic analysis, establishing optimal flight paths for autonomous UAS, and predicting risk factors during UAV inspection [16,22,23,24]. Likewise, there are numerous studies on developing UAS applications in facility monitoring to overcome the limitations of the manual, labor-intensive inspection method, and it has been shown that UAS can be utilized as an efficient, accurate, and economical tool for this purpose.

3. Research Methodology

This study used the PRISMA method to identify recent research on UAS-based facility inspections. The method can systematically document the extensive knowledge presented and helps to reduce the time and duplication of comprehensive reviews [25]. Following the PRISMA method, this study had three main steps: (1) selection of the studies; (2) synthesis of current knowledge; and (3) suggestion of further research directions.
To select the studies, the Google Scholar search engine was used for automated searches using Boolean operators and the identified search strings. The keywords include abbreviations, such as 'UAS', 'UAV', and 'drone', which can be used synonymously with 'unmanned aerial vehicle'. In addition, words used in facility inspection, such as 'inspection', 'maintenance', 'building', and 'facility', were selected as keywords. A total of 534 papers were retrieved by combining the aforementioned keywords. In this study, inclusion and exclusion criteria were established to select the articles (as shown in Table 1).
The PRISMA-based article selection process is shown in Figure 2. First, to analyze the state of recent research, papers published in the past five years, from 2017 to 2021, were selected, and earlier papers were excluded. Second, among these, only journal articles written in English were selected; articles presented at academic conferences, books, diploma theses, research reports, etc., were excluded. Third, primary research articles, referring to papers that scientifically describe technology development and its validation, were included, and secondary research articles were excluded. Finally, papers that were not UAS-based or were unrelated to facility-level studies were excluded. Through this selection process, 32 papers were finally selected.
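To make the screening reproducible, the four steps above can be expressed as simple programmatic filters. The following is a minimal sketch, not part of the reviewed methodology: the record structure and field names are assumptions, and the actual screening in this study was performed on the Google Scholar results rather than with this script.

```python
# Minimal sketch of the four-step PRISMA screening described above.
# The record fields are hypothetical; the study screened records manually.
from dataclasses import dataclass

@dataclass
class Record:
    title: str
    year: int
    language: str
    doc_type: str        # "journal", "conference", "book", "thesis", "report"
    is_primary: bool     # primary vs. secondary research
    uas_facility: bool   # UAS-based and facility-level study

def passes_screening(r: Record) -> bool:
    """Apply the Table 1 criteria in the order of the four screening steps."""
    if not (2017 <= r.year <= 2021):                        # step 1: past five years
        return False
    if r.language != "English" or r.doc_type != "journal":  # step 2: English journal articles
        return False
    if not r.is_primary:                                    # step 3: primary research only
        return False
    return r.uas_facility                                   # step 4: UAS + facility related

# Hypothetical records; the actual corpus of 534 search results is not reproduced here.
records = [
    Record("UAV bridge inspection study", 2019, "English", "journal", True, True),
    Record("Drone surveying textbook", 2018, "English", "book", False, False),
]
selected = [r for r in records if passes_screening(r)]
print(len(selected))  # the study's full screening retained 32 of 534 records
```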

4. Synthesis of Research in UAS-Based Facility Monitoring

A total of 32 papers were selected in this study, and the type of facility and technology addressed in each were categorized and coded for synthesis. The facilities were categorized as 'bridges (BR)', 'buildings (BU)', and 'other facilities (OT)', such as transmission towers and airports, and the types of applied technologies were categorized as 'visual identification (VI)', '3D modeling (3D)', and 'automated identification (AI)'. VI was defined as technology to identify damage to facilities using digital images captured by UAS, and AI was defined as technology to automatically detect damage to facilities by applying deep learning to captured images. 3D refers to technology that enables 3D modeling, including point clouds of buildings based on captured UAS images, to identify damage to facilities. Papers that described other technical issues (e.g., process, factor analysis) were coded as unclassified (UN). Table 2 describes the result of the article selection through the PRISMA process.
Based on the type of facility, a total of 16 (50.0%) studies on BU inspection using UAS, 11 (34.4%) on BR inspection, and 5 (15.6%) on OT, including airports and transmission towers, were identified. Based on the type of technology adopted, a total of 7 studies (21.9%) on VI using aerial images, 11 (34.4%) on 3D technology, and 7 (21.9%) on the AI technique were identified. In addition, 7 (21.9%) were not classified, since they mainly described economic analysis, video processing, process proposals, autonomous UAS flight frameworks, and environmental impact analysis when using UAS. Figure 3 describes the synthesis map composed of the selected articles for the secondary review in this study.
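The shares reported above can be reproduced from the category totals alone. The brief sketch below tallies the reported counts and converts them to percentages; the per-paper code assignments in Table 2 are not reproduced here, only the aggregate totals stated in the text.

```python
# Tally sketch for the coding scheme in Section 4; the counts are the totals
# reported in the text (32 selected papers in total).
from collections import Counter
from decimal import Decimal, ROUND_HALF_UP

facility_counts = Counter({"BU": 16, "BR": 11, "OT": 5})
technology_counts = Counter({"3D": 11, "VI": 7, "AI": 7, "UN": 7})

def shares(counts: Counter) -> dict:
    """Percentage of each code, rounded half-up to one decimal place."""
    total = sum(counts.values())
    return {
        k: float((Decimal(100 * v) / Decimal(total)).quantize(
            Decimal("0.1"), rounding=ROUND_HALF_UP))
        for k, v in counts.items()
    }

print(shares(facility_counts))    # {'BU': 50.0, 'BR': 34.4, 'OT': 15.6}
print(shares(technology_counts))  # {'3D': 34.4, 'VI': 21.9, 'AI': 21.9, 'UN': 21.9}
```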

4.1. Synthesis of Contributions of Current UAS-Based Facility Monitoring Studies

Based on the review in this study, 3D technology tends to be the most common for inspecting facilities. A total of five studies out of 11 (45.5%) focusing on BR adopted 3D techniques to use a point cloud model generated through the photogrammetry process. For BU, a total of six out of 16 studies (37.5%) used 3D for detecting and analyzing damage to buildings. However, this technique was not used for OT facilities. The 3D point cloud data can be processed from 2D UAS aerial images of bridges, and they can successfully help to inspect the damaged parts of the bridges [21,39]. Furthermore, this model can be integrated with BIM to establish an AR environment for rapid inspection [7]. Han et al., 2018, documented how camera angle changes at the corners of facilities might have significant effects on the accuracy of the 3D model due to the time lag during the change. Additionally, it is important to establish a well-organized and optimized UAS flight path to collect 2D images for processing the 3D model and analysis [27,34]. Another contribution to building inspection (BU) after disasters was also documented: a study proposed an emergency risk assessment system using UAS to assess the condition of buildings after earthquakes [28]. In addition to the 3D model, a recent study proposed a 4D map based on the 3D point cloud model for providing information about the execution of the facility monitoring plan [31]. The review also revealed that a more optimal path is critical to avoid collisions with obstacles (e.g., buildings or bridges) as well as to generate accurate 3D models for detecting damage to facilities [27,30]. It was also pointed out that there is no study on the use of 3D models for monitoring OT facilities, such as airport runways, transmission towers, or billboards.
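As a concrete illustration of the 3D workflow discussed above, the sketch below shows typical post-processing of a photogrammetry-derived point cloud before any damage assessment. It assumes the Open3D library and an already-generated PLY file; the file name and parameter values are hypothetical and do not come from any of the reviewed studies.

```python
# Post-processing sketch for a UAS-derived point cloud, assuming the
# photogrammetry step (2D images -> dense cloud) already produced
# "bridge_deck.ply". Parameter values are illustrative only.
import open3d as o3d

pcd = o3d.io.read_point_cloud("bridge_deck.ply")   # hypothetical file name

# Downsample to a uniform density so noise filtering behaves predictably.
pcd = pcd.voxel_down_sample(voxel_size=0.02)       # 2 cm voxels (assumed metric scale)

# Statistical outlier removal reduces the "high noise level" limitation
# discussed in Section 4.2 before any damage measurement is attempted.
pcd, kept_idx = pcd.remove_statistical_outlier(nb_neighbors=20, std_ratio=2.0)

# Estimate normals; deviations from the dominant surface normal are one simple
# cue for spalling/detachment candidates on a planar deck or facade.
pcd.estimate_normals(
    search_param=o3d.geometry.KDTreeSearchParamHybrid(radius=0.1, max_nn=30))

o3d.io.write_point_cloud("bridge_deck_clean.ply", pcd)
print(f"{len(pcd.points)} points retained after filtering")
```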
For VI, the studies contributed to how UAS-based aerial images can be used efficiently for verifying various types of damage, such as bridge deck deterioration, leaks, stains, and detachment [6,32]. Another contribution was proving the effectiveness, safety, and economic benefits of UAS-based digital images for facility monitoring through case studies [20,23,32,36]. It was also revealed that UAS images can be combined into mosaic images for airport runway inspection; the mosaic photo can be generated by stitching together the individual aerial images with their geo-information [13]. Even though it is hard to detect cracks or damage in such mosaics, they might be useful for recognizing the design compliance of the facility. Automated damage identification (AI) is also very efficient for identifying various defects of facilities, such as cracks, detachment, and efflorescence, by using deep learning-based object detection [11,29,33,39]. Deep learning-based AI includes image segmentation and transfer learning to train the recognition of cracks using UAS images or 3D point cloud models [12,35]. This AI technique, commonly called computer vision, detects objects and can be used for identifying cracks and calculating their width with high accuracy of about 90% [34,39].
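The transfer-learning approach described for the AI studies can be illustrated with a short sketch. The example below fine-tunes an ImageNet-pretrained backbone to classify UAS image patches as crack or no crack using PyTorch and torchvision; the dataset folder, class layout, and hyperparameters are assumptions for illustration and do not reproduce any specific model from the reviewed papers.

```python
# Transfer-learning sketch in the spirit of the AI studies reviewed above:
# fine-tune an ImageNet-pretrained backbone on hypothetical UAS image patches.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

device = "cuda" if torch.cuda.is_available() else "cpu"

tfm = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])
# Expects uas_patches/{crack,no_crack}/*.jpg  (hypothetical directory layout)
train_set = datasets.ImageFolder("uas_patches", transform=tfm)
loader = torch.utils.data.DataLoader(train_set, batch_size=16, shuffle=True)

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)     # 2 classes: crack / no crack
model = model.to(device)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):                            # short fine-tuning run
    for images, labels in loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.4f}")
```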
In addition to technology development, it was revealed that researchers are also interested in other technical studies (a total of five studies were identified). Based on the economic feasibility studies, UAS can reduce costs by 4–18% compared to the traditional approach [18]. Furthermore, to enhance the usability of UAS in facility monitoring, factor analysis was conducted to deduce risk levels and to predict and manage risk factors (low temperature, malfunctions, or collisions) during UAS flights for condition monitoring [21,22], and a decision-making process for inspectors was proposed [16].
Table 3 describes the synthesized review consolidating the contribution to UAS-based facility monitoring according to the technologies adopted (VI, 3D, and AI).

4.2. Synthesis of Limitations of Current UAS-Based Facility Monitoring Studies

Based on the review, this study synthesized the main limitations of the studies by type of technology. The studies commonly indicated the low quality and high noise level of 3D point cloud models processed from UAS aerial images, which makes it very challenging to identify the type and extent of damage accurately. It is also difficult to obtain 2D aerial images during flight because of the risk of collision with nearby buildings and surrounding facilities, which can reduce the accuracy of the 3D model. Therefore, it is often necessary to manually collect 2D images to generate a highly accurate 3D model of the facilities. For this reason, the 3D technique can only be used to identify the existence and type of damage. In addition, this method is not automated monitoring, even though the UAS can fly over the facility autonomously. The depth of damage could not be accurately pinpointed and had to be reexamined visually. Another limitation is that there are no studies exploring how the 3D model can be maintained and operated after being used for monitoring, even though the studies pointed out that this is critical for the continuous and sustainable monitoring of constructed facilities in the coming decades.
For the deep learning-based AI technique, the studies pointed out that the deep-learning models are still limited in terms of updating and continuously enhancing accuracy because they relied on existing datasets. Furthermore, training still requires a huge amount of data to achieve high accuracy of automated recognition. VI using aerial images is useful for identifying the location of cracks or damage at a glance rather than detecting the size or type of defect. Since the damage is determined visually by looking at the captured images, its extent cannot be accurately identified; one of the studies indicated that there may be large margins of error of about 10 mm. Also, it is difficult to take pictures when the standoff distance is narrow or there are obstacles in the flight path. Therefore, VI on its own is very limited for facility monitoring, and the studies pointed out that it should be combined with the 3D or AI techniques for more effective and efficient use. The UN papers are limited to introducing technological and economic benefits and process development without detailed verification.
Table 4 describes the synthesis of current limitations.

4.3. Consolidation of Further Research Directions

Based on the review, this study consolidated further research directions. Studies that used VI and 3D methods are very limited in automatically identifying damage to facilities. Therefore, it is necessary to explore how deep-learning models can be adopted for 3D point cloud modeling for automated object recognition. Rather than focusing only on improving the accuracy of the 3D model for identifying defects efficiently, it should be established how the UAS can collect aerial images with geo-information (x-, y-, and z-coordinates) to enhance the 3D model's accuracy through the photogrammetry process. This could be connected to autonomous UAS flights and optimal flight planning. Therefore, it is also necessary to propose a technical manual for generating the optimal flight plan before the UAS flight.
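One way to operationalize the geo-information and flight-planning direction above is to pre-compute capture waypoints with local (x, y, z) coordinates. The following is a minimal sketch of a simple boustrophedon ("lawn-mower") grid over a facade; the spacing values are placeholders, and a practical planner would also account for obstacles, image overlap, and ground sample distance as discussed in the text.

```python
# Sketch of a boustrophedon waypoint grid that records (x, y, z) geo-information
# for each capture point in a local metric frame. Values are illustrative only.
from dataclasses import dataclass
from typing import List

@dataclass
class Waypoint:
    x: float  # along the facade (m)
    y: float  # standoff axis (m), fixed offset from the structure
    z: float  # height (m)

def facade_grid(width_m: float, height_m: float, standoff_m: float,
                dx: float = 3.0, dz: float = 2.0) -> List[Waypoint]:
    """Generate capture points over a width x height facade, sweeping left-right
    then right-left on alternating rows to minimise travel between rows."""
    plan: List[Waypoint] = []
    n_rows = int(height_m // dz) + 1
    n_cols = int(width_m // dx) + 1
    for row in range(n_rows):
        cols = range(n_cols) if row % 2 == 0 else reversed(range(n_cols))
        for col in cols:
            plan.append(Waypoint(x=col * dx, y=standoff_m, z=row * dz))
    return plan

waypoints = facade_grid(width_m=30.0, height_m=12.0, standoff_m=5.0)
print(len(waypoints), "capture points; first:", waypoints[0])
```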
In addition, it is necessary to develop deep-learning models based on training datasets specific to particular constructed facilities surveyed by UAS. It is also very important to address database management for the continuous and sustainable use of the deep-learning model to enhance its accuracy. For AI, it is critical to prepare a dataset from which all different types of damage to facilities can be learned. Lastly, the UN studies require further case studies for validating the proposed processes and autonomous UAS inspection systems, as well as verifying the economic and environmental factor analyses through actual UAS use in facility condition monitoring cases.
Table 5 describes the consolidation and synthesis of further research directions.

5. Discussion and Conclusions

This section summarizes the status of UAS-based facility inspection research based on the classification and analysis results of the studies examined in Section 4 and presents directions for future research. This study used the PRISMA scientific literature review methodology to identify research contributions to UAS-based facility inspections, their limitations, and further research directions. A total of 32 studies were selected and categorized by type of facility (BR, BU, OT) and type of technology applied (VI, 3D, AI). The studies mainly focus on recent bridge and building inspections.
Based on the review, 3D is the most frequently used technology among the types of technology used for each facility studied. Additionally, technologies for the automatic identification of damage to facilities require the highest-level techniques (e.g., deep learning). These two techniques are very actively employed in the facility monitoring domain for detecting cracks or defects. Since manual facility inspections have many problems, such as costs, inspection periods, accuracy, and efficiency, much research has focused on AI/3D-based inspection, mainly of buildings and bridges. Meanwhile, seven papers were not categorized, as they did not correspond to the three technical categories. These papers carried out research in various fields, such as economic analysis, proposals of facility inspection processes, analysis of optimal routes for autonomous UAS, and risk analysis of the surrounding environment when using UAS. Different perspectives and continued follow-up research are needed beyond the three technology categories.
There were very few studies on other facilities (OT), but they were similar to the trends in the types of technology used for bridges; notably, the use of the AI method was very low compared to the other facility types inspected. Finally, the limitations of each technology category were deduced, and the direction of future research is presented accordingly. In the case of studies on visual identification, the analysis results show a need for research to improve technologies for accurate identification, as facility damage is visually identified based on captured images. In the case of studies using 3D modeling, the accuracy of point cloud data collected by UAS was low, and damage to facilities was not identified directly from the 3D models. Therefore, it is necessary to develop technologies that can improve the accuracy of point cloud data and detect facility damage directly and accurately. Lastly, in the case of research on automatic identification, the deep-learning models had the limitation of identifying only one type of damage, because they require a high level of transfer learning and a large number of training data. Accordingly, follow-up research is needed to develop a deep-learning dataset composed of UAS aerial images not present in the currently existing datasets. This study contributed to gathering information about the recent state of knowledge in UAS-based facility inspection, documenting the common challenges in the domain, and consolidating further research directions.

Author Contributions

Conceptualization, J.K. and K.J.; methodology, J.K. and S.K.; literature review & analysis, K.J., J.K., D.L. and S.L.D.; writing—original draft preparation, K.J.; writing—review and editing, S.K.; project administration, S.K. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the National Research Foundation of Korea (NRF) grant funded by the Korean government (MSIT) (No. 2021R1F1A1064109). Also, this research was supported by the research fund of Hanbat National University in 2021 (No. 202103310001).

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Lee, J.H. Need to Prepare in Advance for the Environment that has changed since COVID-19. In Construction Trend Briefing; CERIK: Seoul, Republic of Korea, 2020; Volume 759, p. 8. [Google Scholar]
  2. Rajat, A.; Shankar, C.; Mukund, S. Imagining Construction’s Digital Future; Mckinsey & Company: Tokyo, Japan, 2016. [Google Scholar]
  3. Jonas, B.; Jose, L.B.; Jan, M.; Maria, J.R.; David, R.; Erik, J.; Gernot, S. How Construction Can Emerge Stronger after Coronavirus; McKinsey & Company: Tokyo, Japan, 2020. [Google Scholar]
  4. Siyuan, C.; Debra, F.L.; Eleni, M.; SM Iman, Z.; Jonathan, B. UAV Bridge Inspection through Evaluated 3D Reconstructions. J. Bridge Eng. 2019, 24, 05019001. [Google Scholar]
  5. Jang, J.H. BIM Based Infrastructure Maintenance. KSCE Mag. 2020, 68, 38–47. [Google Scholar]
  6. Hanita, Y.; Mustaffa, A.A.; Aadam, M.T.A. Historical Building Inspection using the Unmanned Aerial Vehicle(UAV). IJSCET 2020, 11, 12–20. [Google Scholar]
  7. Donghai, L.; Xietian, X.; Junjie, C.; Shuai, L. Integrating Building Information Model and Augmented Reality for Drone-based Building Inspection. J. Comput. Civ. Eng. 2021, 35, 04020073. [Google Scholar]
  8. Kim, H.Y.; Choi, K.A.; Lee, I.P. Drone image-based facility inspection-focusing on automatic process using reference images. J. Korean Soc. Geospat. Inf. Sci. 2018, 26, 21–32. [Google Scholar]
  9. FLYABILITY. Available online: https://www.flyability.com/casestudies/indoor-drones-in-bridge-inspection-between-beams-and-inside-box-girder (accessed on 24 October 2022).
  10. DRONEID. Available online: http://www.droneid.co.kr./usecase/01_view.php?type=&num=13 (accessed on 24 October 2022).
  11. Lee, K.W.; Park, J.K. Modeling and Management of Bridge Structures Using Unmanned Aerial Vehicle in Korea. Sens. Mater. 2019, 31, 3765–3772. [Google Scholar] [CrossRef] [Green Version]
  12. Chen, K.; Reichard, G.; Akanmu, A.; Xu, X. Geo-registering UAV-captured close-range images to GIS-based spatial model for building facade inspections. Autom. Constr. 2021, 122, 103503. [Google Scholar] [CrossRef]
  13. Jeong, D.M.; Lee, J.H.; Lee, D.H.; Ju, Y.K. Rapid Structural Safety Evaluation Method of Buildings using Unmanned Aerial Vehicle (SMART SKY EYE). J. Archit. Inst. Korea Struct. Constr. 2019, 35, 3–11. [Google Scholar]
  14. Kim, S.J.; Yu, G.; Javier, I. Framework for UAS-Integrated Airport Runway Design Code Compliance Using Incremental Mosaic Imagery. J. Comput. Civ. Eng. 2021, 35, 04020070. [Google Scholar] [CrossRef]
  15. Besada, J.A.; Bergesio, L.; Campana, I.; Vaquero-Melchor, D.; Lopez-Araquistain, J.; Bernardos, A.M.; Casar, J.R. Drone Mission Definition and Implementation for Automated Infrastructure Inspection using Airborne Sensors. Sensors 2018, 18, 1170. [Google Scholar] [CrossRef] [Green Version]
  16. Aliyari, M.; Ashrafi, B.; Ayele, Y.Z. Hazards identification and risk assessment for UAV-assisted bridge inspection. Struct. Infrastruct. Eng. 2021, 18, 412–428. [Google Scholar] [CrossRef]
  17. Han, D.Y.; Park, J.B.; Huh, J.W. Orientation Analysis between UAV Video and Photos for 3D Measurement of Bridges. J. Korean Soc. Surv. Geod. Photogramm. Cart. 2018, 36, 451–456. [Google Scholar]
  18. Ruiz, R.D.B.; Junior, A.C.L.; Rocha, J.H.A. Inspection of facades with Unmanned Aerial Vehicles(UAV): An exploratory study. Rev. ALCONPAT 2021, 11, 88–104. [Google Scholar]
  19. Akbar, M.A.; Qidwai, U.; Jahanshahi, M.R. An evaluation of image-based structural health monitoring using integrated unmanned aerial vehicle platform. Struct. Control Health Monit. 2019, 26, e2276. [Google Scholar] [CrossRef] [Green Version]
  20. Mattar, R.A.; Kalai, R. Development of a Wall-Sticking Drone for Non-Destructive Ultrasonic and Corrosion Testing. Drones 2018, 2, 8. [Google Scholar] [CrossRef] [Green Version]
  21. Peng, X.; Zhong, X.; Zhao, C.; Chen, A.; Zhang, T. A UAV-based machine vision method for bridge crack recognition and width quantification through hybrid feature learning. Constr. Build. Mater. 2021, 299, 123896. [Google Scholar] [CrossRef]
  22. Grosso, R.; Mecca, U.; Moglia, G.; Prizzon, F.; Rebaudengo, M. Collecting Built Environment Information Using UAVs: Time and Applicability in Building Inspection Activities. Sustainability 2020, 12, 4731. [Google Scholar] [CrossRef]
  23. Aliyari, M.; Ashrafi, B.; Ayele, A.Z. Drone-based bridge inspection in harsh operating environment: Risks and safeguards. Int. J. Transp. Dev. Integr. 2021, 5, 118–135. [Google Scholar] [CrossRef]
  24. Sauti, N.S.; Yusoff, N.M.; Abu Bakar, N.A.; Akbar, Z.A. Visual Inspection in Dilapidation Study of Heritage Structure Using Unmanned Aerial Vehicle(UAV): Case Study AS-Solihin Mosque, Melaka. Politek. Kolej Komuniti J. Life Long Learn. 2018, 2, 26+28–38. [Google Scholar]
  25. Page, M.J.; McKenzie, J.E.; Bossuyt, P.M.; Boutron, I.; Hoffmann, T.C.; Mulrow, C.D.; Shamseer, L.; Tetzlaff, J.M.; Akl, E.A.; Brennan, S.E.; et al. The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. Syst. Rev. 2021, 10, 89. [Google Scholar] [CrossRef]
  26. Kung, R.Y.; Pan, N.H.; Wang, C.C.; Lee, P.C. Application of Deep Learning and Unmanned Aerial Vehicle on Building Maintenance. Adv. Civ. Eng. 2021, 2021, 5598690. [Google Scholar] [CrossRef]
  27. Ayele, Y.Z.; Aliyari, M.; Griffiths, D.; Droguett, E.L. Automatic Crack Segmentation for UAV-Assisted Bridge Inspection. Energies 2020, 13, 6250. [Google Scholar] [CrossRef]
  28. Cho, J.; Shin, H.; Ahn, Y.; Lee, S. Proposal of Regular Safety Inspection Process in the Apartment Housing Using a Drone. Korea Inst. Ecol. Archit. Environ. 2019, 19, 121–127. [Google Scholar]
  29. Freimuth, H.; Konig, M. A Framework for Automated Acquisition and Processing of As-Built Data with Autonomous Unmanned Aerial Vehicles. Sensors 2019, 19, 4513. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  30. Lee, K.; Hog, H.; Sael, L.; Kim, H.Y. MultiDefectNet: Multi-Class Defect Detection on Building Facade Based on Deep Convolutional Neural Network. Sustainability 2020, 12, 9785. [Google Scholar] [CrossRef]
  31. Yang, L.; Muqing, C.; Shenghai, Y.; Lihua, X. Vision Based Autonomous UAV Plane Estimation And Following for Building Inspection. arXiv 2021, arXiv:2102.01423. [Google Scholar]
  32. Jeong, H.J.; Lee, S.H.; Chang, H.J. Research on the Safety Inspection Plan of Outdoor Advertising Safety Using Drone. J. Korea Inst. Inf. Electron. Commun. Technol. 2020, 13, 530–538. [Google Scholar]
  33. Kim, Y.G.; Kwon, J.W. The Maintenance and Management Method of Deteriorated Facilities Using 4D map Based on UAV and 3D Point Cloud. J. Korea Inst. Build. Constr. 2019, 19, 239–246. [Google Scholar]
  34. Falorca, J.F.; Miraldes, J.P.N.D.; Lanzinha, J.C.G. New trends in visual inspection of buildings and structures: Study for the use of drones. Open Eng. 2021, 11, 734–743. [Google Scholar] [CrossRef]
  35. Bolourian, N.; Hammad, A. LiDAR-equipped UAV path planning considering potential locations of defects for bridge inspection. Autom. Constr. 2020, 117, 103250. [Google Scholar] [CrossRef]
  36. Bae, J.H.; Lee, J.H.; Jang, A.; Ju, Y.G. SMART SKY EYE System for structural Safety Assessment Using Drones and Thermal Images. J. Korean Assoc. Spat. Struct. 2019, 19, 4–8. [Google Scholar] [CrossRef]
  37. Falorca, J.F.; Lanzinha, J.C.G. Facade inspections with drones-theoretical analysis and exploratory tests. Int. J. Build. Pathol. Adapt. 2020, 39, 235–258. [Google Scholar] [CrossRef]
  38. Kaiwen, C.; George, R.; Xin, X. Opportunities for Applying Camera-Equipped Drones towards Performance Inspections of Building Facades. Comput. Civ. Eng. 2021, 2019, 113–120. [Google Scholar]
  39. Seo, J.W.; Duque, L.; Wacker, J.P. Field Application of UAS-Based Bridge Inspection. Transp. Res. Rec. 2018, 2672, 72–81. [Google Scholar] [CrossRef]
Figure 1. A secondary review study procedure.
Figure 2. PRISMA-based article selection process.
Figure 3. Synthesis map.
Table 1. Inclusion and exclusion criteria.
Inclusion criteria: papers drafted in the past 5 years; written in English; primary research papers; papers related to UAS-based facility inspections.
Exclusion criteria: papers drafted prior to the past 5 years; conference papers, books, theses, and reports; secondary research papers; papers not related to facility inspections; papers on monitoring building energy and heat.
Table 2. Classification of research by type of facility and technology. Each selected study is coded by type of facility, Bridge (BR), Building (BU), or Other Facility ** (OT), and by type of technology *, Visual Identification (VI), 3D Modeling Technology (3D), Automatic Identification (AI), or Unclassified *** (UN).
Identifier (Reference):
J01 (Liu et al., 2021) [7]
J02 (Sauti et al., 2018) [24]
J03 (Han et al., 2018) [17]
J04 (Lee et al., 2019) [11]
J05 (Grosso et al., 2020) [22]
J06 (Chen et al., 2021) [12]
J07 (Kung et al., 2021) [26]
J08 (Ayele et al., 2020) [27]
J09 (Kim et al., 2018) [8]
J10 (Cho et al., 2019) [28]
J11 (Freimuth et al., 2019) [29]
J12 (Jeong et al., 2019) [13]
J13 (Lee et al., 2019) [30]
J14 (Kim et al., 2021) [14]
J15 (Yang et al., 2021) [31]
J16 (Yusof et al., 2020) [6]
J17 (Jeong et al., 2020) [32]
J18 (Kim et al., 2019) [33]
J19 (Falorca et al., 2021) [34]
J20 (Akbar et al., 2018) [19]
J21 (Bolourian et al., 2020) [35]
J22 (Ruiz et al., 2021) [18]
J23 (Peng et al., 2021) [21]
J24 (Aliyari et al., 2021) [16]
J25 (Mattar et al., 2018) [20]
J26 (Bae et al., 2019) [36]
J27 (Aliyari et al., 2021) [23]
J28 (Falorca et al., 2021) [37]
J29 (Kaiwen et al., 2019) [38]
J30 (Seo et al., 2018) [39]
J31 (Siyuan et al., 2019) [4]
J32 (Besada et al., 2018) [15]
* AI is listed first where it overlapped with 3D or VI in the classification. ** OT includes airports, transmission towers, outdoor billboards, tanks, and smokestacks. *** UN studies include economic feasibility, impact or factor analysis, and process definition studies.
Table 3. Synthesis of current state of knowledge.
VI: UAS-based facility condition monitoring can collect image data of points of interest and help to identify various defects with high efficiency, safety, and economic benefits.
3D: 3D point cloud-based crack detection and estimation frameworks for constructed-facility condition monitoring were commonly developed. One of the papers proposed an integrated system between UAS and BIM-based AR for facility condition monitoring.
AI: The selected articles developed frameworks using deep learning-based object recognition models for detecting defects of facilities. They commonly used existing datasets (e.g., the COCO dataset) for transfer learning and image segmentation.
Table 4. Synthesis of current limitations.
VI (BR and BU): It is possible to identify damage to facilities, but the method cannot pinpoint damage such as cracks, efflorescence, and detachment, nor the level of such damage.
3D (BR and BU): The insufficient accuracy of point cloud data collected through UAS makes it challenging to identify damage to facilities accurately; current and previous studies focused on how to improve the accuracy of the 3D modeling data without identifying damage to facilities. 3D (OT): not addressed.
AI (BR and BU): The deep-learning models can only identify single damage types of facilities and require large amounts of data for generalization. AI (OT): The deep-learning models can only identify single damage types; 3D modeling research has not yet been performed much, so it is hard to identify where the damage occurred in the facility.
UN: The process development and economic benefit studies were not conducted with actual UAS flight case studies.
Table 5. Synthesis of further research directions.
VI (BR and BU): It is necessary to develop technologies that can automatically identify damage to facilities.
3D (BR and BU): It is necessary to develop data processing technologies that improve data collection and accuracy to acquire high-quality point clouds, as well as technologies that can accurately identify damage to facilities through 3D modeling.
AI (BR and BU): It is necessary to develop a deep-learning model that can simultaneously identify various types of damage to facilities, a system that can efficiently collect and manage large amounts of data, and technologies that can improve the accuracy of the deep-learning model. AI (OT): It is necessary to develop technologies that visually identify the locations where damage occurred, linked with 3D modeling technologies, and technologies that can improve the accuracy of the deep-learning model.
UN: The proposed processes and developed autonomous flight systems should be validated and verified through case studies.
